How do I use pymongo behind a proxy that requires authentication - python

How do I use pymongo behind a proxy that requires authentication?
I am able to find settings for ssh tunnel servers, such as
How to connect remote mongodb with pymongo
But I am working in an environment that is behind a firewall and needs to use proxy authentication. How do I configure that? In an OSX terminal I use something similar to this:
export http_proxy="username:password@ip_address:port_number"
I found this new feature for SOCKS5 proxy authentication (https://jira.mongodb.org/browse/CSHARP-734), but I am just looking for Basic or NTLM authentication methods; is that supported?

PyMongo does not use the HTTP protocol, so you cannot use http_proxy.
See https://jira.mongodb.org/browse/PYTHON-1182 for details.
You can, however, use a SOCKS5 proxy. For example, I use v2rayN + Proxifier on my Windows machine.
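If you want to handle this in code rather than with an external tool like Proxifier, one common approach is to route PyMongo's TCP connections through the SOCKS5 proxy by monkey-patching the socket module with PySocks (pip install pysocks). A minimal sketch, assuming a hypothetical proxy at proxy.example.com:1080 with username/password authentication:

import socket
import socks  # PySocks
from pymongo import MongoClient

# Hypothetical proxy details -- replace with your own.
socks.set_default_proxy(
    socks.SOCKS5, "proxy.example.com", 1080,
    username="user", password="secret",
)
socket.socket = socks.socksocket  # every new socket now goes via the proxy

client = MongoClient("mongodb://db.example.com:27017/")
print(client.admin.command("ping"))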

Related

Testing SSL v3 Support in Python

I’m extracting the SSL certificate from a website using the socket and ssl libraries in Python. My understanding is that it connects using the preferred method used by the server.
Using this method I’m able to identify which version of SSL/TLS is used to connect, but I also need to identify whether the website supports SSL v3, in the case where the default connection is TLS.
Is there a way to identify this information without manually testing multiple SSL connections?
I don't think sites advertise what they support. Rather, it's negotiated between client and server.
You could use the excellent server tester at www.ssllabs.com. It will try lots of configurations and report what the server in question supports. (Hopefully the site doesn't support SSL v3!)
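If you do want to probe it yourself from Python, one approach is to pin a handshake to a single protocol version and see whether it succeeds. A hedged sketch (note that most modern OpenSSL builds have SSLv3 disabled entirely, so a failed handshake may reflect your local library rather than the server):

import socket
import ssl

def supports_version(host, version, port=443):
    try:
        ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
        ctx.check_hostname = False
        ctx.verify_mode = ssl.CERT_NONE  # only the handshake matters here
        # Pin both ends of the allowed range to a single protocol version.
        ctx.minimum_version = version
        ctx.maximum_version = version
        with socket.create_connection((host, port), timeout=5) as sock:
            with ctx.wrap_socket(sock, server_hostname=host) as tls:
                return tls.version()  # negotiated protocol, e.g. "TLSv1.2"
    except (ssl.SSLError, OSError, ValueError):
        return None  # refused by the server, or unsupported locally

print(supports_version("example.com", ssl.TLSVersion.TLSv1_2))
print(supports_version("example.com", ssl.TLSVersion.SSLv3))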

PythonAnywhere - Are sockets allowed?

I have a beginner PythonAnywhere account which, the account comparison page notes, has "Access to External Internet Sites: Specific Sites via HTTP(S) Only."
So I know only certain hosts can be accessed through HTTP protocols, but are there restrictions on use of the socket module? In particular, can I set up a Python server using socket?
PythonAnywhere dev here. Short answer: you can't run a socket server on PythonAnywhere, no.
Longer answer: the socket module is supported, and from paid accounts you can use it for outbound connections just like you could on your normal machine. On a free account, you could also create a socket connection to the proxy server that handles free accounts' Internet access, and then use the HTTP protocol to request a whitelisted site from it (though that would be hard work, and it would be easier to use requests or something like that).
What you can't do on PythonAnywhere is run a socket server that can be accessed from outside our system.
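For completeness, a rough sketch of the "socket connection to the proxy server" approach described above, assuming PythonAnywhere's documented proxy address proxy.server:3128 and an illustrative whitelisted host (in practice, requests is far easier):

import socket

PROXY_HOST, PROXY_PORT = "proxy.server", 3128  # PythonAnywhere's HTTP proxy

with socket.create_connection((PROXY_HOST, PROXY_PORT)) as sock:
    # When talking to a forward proxy, request the absolute URL.
    sock.sendall(
        b"GET http://www.example.com/ HTTP/1.1\r\n"
        b"Host: www.example.com\r\n"
        b"Connection: close\r\n\r\n"
    )
    response = b""
    while True:
        chunk = sock.recv(4096)
        if not chunk:
            break
        response += chunk

print(response.split(b"\r\n", 1)[0].decode())  # the status line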
Nope. PythonAnywhere doesn't support the socket module.

How to make a valid SSL Certificate / Keyfile to use with Flask SSL WSGI?

I am writing a Flask web application and use eventlet as the networking library for that application (eventlet is wrapped by Flask-SocketIO to allow asynchronous operation).
Following this guide, I successfully created an SSL key file and certificate file, which I pass to the WSGI server:
socket_io.run(app,
host=APP_HOST,
port=APP_PORT,
keyfile='ia.key',
certfile='ia.crt')
This works fine, but unfortunately Safari / Chrome say that my SSL certificate is not trustworthy when I access the page for the first time.
The Chrome failure is the following:
NET::ERR_CERT_COMMON_NAME_INVALID
How do I generate a valid SSL certificate so that browsers don't show that error when a user connects to the web application for the first time?
That is because it is something called a "self-signed certificate", which is not issued by any trusted certificate authority, so any modern browser auto-detects the site as untrusted. If you are using a UNIX-based operating system (Linux, macOS, Fedora, and others), you can use what I am using: get a new certificate from a trusted authority.
This is what I use to get a TRUSTED certificate that most browsers can use: https://certbot.eff.org/instructions.
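Once certbot has issued a certificate for your domain, you can point the same run call at the files it writes out. A sketch assuming certbot's default output paths and a placeholder domain:

# Paths are certbot's defaults; "example.com" is a placeholder domain.
socket_io.run(app,
              host=APP_HOST,
              port=APP_PORT,
              keyfile='/etc/letsencrypt/live/example.com/privkey.pem',
              certfile='/etc/letsencrypt/live/example.com/fullchain.pem')

Note that a certificate is only valid for the hostname it was issued for; a mismatch between that name and the address users type in is exactly what NET::ERR_CERT_COMMON_NAME_INVALID complains about.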

Allowing remote access to Elasticsearch

I have a default installation of Elasticsearch which I am trying to query from a third party server. However, it seems that by default this is blocked.
Is anyone please able to tell me how I can configure Elasticsearch so that I can query it from a different server?
When Elasticsearch is installed and run without any configuration changes, it binds to localhost only by default. To access the Elasticsearch REST API endpoint remotely, the changes below have to be made on the server where Elasticsearch is installed.
Elasticsearch Configuration Change
Update the network.host property in elasticsearch.yml as per the guidelines in the Elasticsearch documentation.
For example, to bind to all IPv4 addresses on the local machine, change it as below:
network.host: 0.0.0.0
Firewall Rules Update
Update the Linux firewall to allow access to port 9200. Please refer to your Linux documentation for adding rules to the firewall.
For example, to allow access from all (public) servers on CentOS, use firewall-cmd:
sudo firewall-cmd --zone=public --permanent --add-port=9200/tcp
sudo firewall-cmd --reload
Note: In a production environment, public access is discouraged; restricted access should be preferred.
In config/elasticsearch.yml, put network.host: 0.0.0.0, and also add an inbound rule in your firewall for your Elasticsearch port (9200 by default).
This worked in Elasticsearch version 2.3.0.
Edit: As Sisso mentions in his comment below, Elasticsearch as of 2.0 at least binds to localhost by default. See https://www.elastic.co/guide/en/elasticsearch/reference/2.0/modules-network.html for more information.
As Damien mentions in his answer, by default ES allows all access to port 9200. In fact, you need to use external tools to provide authentication to the ES resource - something like a webapp frontend or just simple nginx with Basic Auth turned on.
Things that can prevent you from accessing a remote system (you probably know these):
network configuration problems
ES host firewall blocks incoming requests on port 9200
remote host firewall blocks outgoing requests to ES host and/or port 9200
ES is configured to bind to the wrong IP address (by default however, it binds to all available IPs)
Best guess? Check that you can connect from remote host to ES host, then check firewall on both systems. If you can't diagnose further, maybe someone on the ES mailing list (https://groups.google.com/forum/#!forum/elasticsearch) or IRC channel (#elasticsearch on Freenode) can help.
There is no restriction by default; Elasticsearch exposes a standard HTTP API on port 9200.
From your third-party server, are you able to run: curl http://es_hostname:9200/ ?
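The same check from Python, in case curl isn't available on that server (es_hostname is a placeholder):

import requests

resp = requests.get("http://es_hostname:9200/", timeout=5)
print(resp.status_code, resp.json())  # cluster name/version banner on success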
To allow remote access with one default node, config/elasticsearch.yml should have:
network.host: 0.0.0.0
http.port: 9200
In my case I need three instances, so for each instance it is also necessary to declare the port range used:
network.host: 0.0.0.0
http.port: 9200-9202
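A quick way to verify that all three instances are reachable remotely, given the 9200-9202 range above (es_hostname is a placeholder):

import requests

for port in (9200, 9201, 9202):
    resp = requests.get(f"http://es_hostname:{port}/", timeout=5)
    print(port, resp.status_code)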

How can I access App Engine through a Corporate proxy?

I have a corporate proxy that supports HTTPS but not HTTP CONNECT (even after authentication). It just gives 403 Forbidden in response to anything but HTTP or HTTPS URLs. It uses HTTP authentication, not NTLM. It is well documented that urllib2 does not work with HTTPS through a proxy. App Engine tries to connect to an HTTPS URL using urllib2 to update the app.
On *nix, urllib2 expects proxies to be set using environment variables:
export http_proxy="http://mycorporateproxy:8080"
export https_proxy="https://mycorporateproxy:8080"
This is cited as a workaround: http://code.activestate.com/recipes/456195/. Also see http://code.google.com/p/googleappengine/issues/detail?id=126.
None of these fixes have worked for me. They seem to rely on the proxy server supporting HTTP CONNECT. Does anyone have any other workarounds? I'm sure I am not the only one behind a restrictive corporate proxy.
Do you mean it uses HTTP basic auth before allowing proxying, and does it then allow CONNECT?
If so, you should be able to tunnel over it using http-tunnel or proxytunnel.
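In case it helps, the core of what those tools do is send an HTTP CONNECT with basic-auth credentials to open a raw tunnel. A minimal sketch with placeholder hostnames and credentials; this only works if the proxy actually honors CONNECT, which the question suggests this one does not:

import base64
import socket

creds = base64.b64encode(b"username:password").decode()
with socket.create_connection(("mycorporateproxy", 8080)) as sock:
    request = (
        "CONNECT appengine.google.com:443 HTTP/1.1\r\n"
        "Host: appengine.google.com:443\r\n"
        "Proxy-Authorization: Basic %s\r\n\r\n" % creds
    )
    sock.sendall(request.encode())
    reply = sock.recv(4096)

print(reply.split(b"\r\n", 1)[0].decode())  # "HTTP/1.1 200 ..." means the tunnel is open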
