I'm writing a Flask app that connects to an external SOAP service that uses TLS v1.2.
I'm using Python 2.7 and the requests library, version 2.18.1.
I contacted the server's owner, and he told me that I need to include multiple client certificates in the TLS connection. It's a chain of 3 certificates, which I have in separate .pem files (root + intermediate + my client certificate).
The server won't let me in if I present just the last one.
I've tested this with SoapUI and Wireshark, and it's true: I receive a response only when I provide the whole chain of 3 certificates.
I get an error from the server when passing just my client certificate.
The requests documentation says you can pass a single client certificate like this:
session = requests.session()
session.cert = ('/path/client_cert.pem', '/path/private_key.pem')
response = session.post(SERVICE_URL, data=XML_CONTENT, headers=HEADERS)
I get an error even if my "client_cert.pem" file is a bundle of 3 certificates (just like you would do with CA certs in session.verify). I can see in Wireshark that only the first one is used in the TLS connection.
Is there any way to include multiple certificates in the TLS connection with Python's requests library?
Maybe I should use a different library or override some of its code?
I've got it!
I had some legacy library versions installed.
It seems that this issue was fixed by the requests library developers in version 1.23. I also had to update urllib3.
My current requirements.txt is:
requests==2.22.0
urllib3==1.25.2 # compatible with requests 2.22
With the following spec, everything works perfectly. I've checked the TLS connection in Wireshark; all certificates from the "client_cert.pem" chain are passed.
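For reference, the working setup boils down to the standard session.cert pattern, now with a bundled chain file. The paths and URL below are placeholders; "chain.pem" is simply the three certificates concatenated, client certificate first:

```python
import requests

SERVICE_URL = "https://example.com/soap"  # placeholder endpoint
HEADERS = {"Content-Type": "text/xml; charset=utf-8"}

session = requests.Session()
# chain.pem = client cert + intermediate + root, concatenated in that order
session.cert = ("/path/chain.pem", "/path/private_key.pem")
# response = session.post(SERVICE_URL, data=XML_CONTENT, headers=HEADERS)
```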
If you have problems like this in the future, remember to check whether your requests and urllib3 library versions are compatible.
Thank you guys!
I'm extracting the SSL certificate from a website using the socket + ssl libraries in Python. My understanding is that it connects using the method preferred by the server.
Using this method I can identify which version of SSL/TLS was used to connect, but I also need to identify whether the website supports SSL v3, in the case where the default connection is TLS.
Is there a way to identify this information without manually testing multiple SSL connections?
I don't think sites advertise what they support. Rather, it's negotiated between client and server.
You could use the excellent server tester at www.ssllabs.com. It will try lots of configurations and report what the server in question supports. (Hopefully the site doesn't support SSL v3!)
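That said, if you do want to probe a single protocol version yourself, you can pin a handshake to that version and see whether it completes. Here is a sketch using the stdlib ssl module (Python 3.7+; the host and port are whatever server you are testing). Note that modern OpenSSL builds usually have SSLv3 compiled out, so from such a client you can only probe the versions your local library supports:

```python
import socket
import ssl

def supports_tls_version(host, port, version, timeout=5):
    """Return True if the server completes a handshake pinned to `version`."""
    context = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
    context.check_hostname = False       # probing only; don't validate the cert
    context.verify_mode = ssl.CERT_NONE
    context.minimum_version = version    # pin both ends of the allowed range
    context.maximum_version = version
    try:
        with socket.create_connection((host, port), timeout=timeout) as sock:
            with context.wrap_socket(sock, server_hostname=host):
                return True
    except (ssl.SSLError, OSError):
        return False

# Example (hypothetical target):
# supports_tls_version("example.com", 443, ssl.TLSVersion.TLSv1_2)
```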
I'm trying to parse the data from this url:
https://www.chemeo.com/search?q=show%3Ahfus+tf%3A275%3B283
But I think this is failing because the website uses TLS 1.3. How can I enable my Python script, below, to connect using SSL in urllib.request?
I've tried using an SSL context but this doesn't seem to work.
This is the Python 3.6 code I have:
import urllib.request
import ssl
from bs4 import BeautifulSoup
scontext = ssl.SSLContext(ssl.PROTOCOL_SSLv23)
chemeo_search_url = "https://www.chemeo.com/search?q=show%3Ahfus+tf%3A275%3B283"
print(chemeo_search_url)
with urllib.request.urlopen(chemeo_search_url, context=scontext) as f:
    print(f.read(200))
Try:
ssl.PROTOCOL_TLS
From the docs on "PROTOCOL_SSLv23":
Deprecated since version 2.7.13: Use PROTOCOL_TLS instead.
Note:
Be sure to have the CA certificate bundles installed; on a minimal build of Alpine Linux with BusyBox, the certs have to be installed separately. If Python wasn't compiled with SSL support, it may be necessary to rebuild it with SSL enabled. Also, the version of OpenSSL that Python was compiled against determines which SSL features are usable.
Also note that the chemeo site doesn't use TLSv1.3, which is still experimental at the time of this writing; the site currently supports TLS 1.0, 1.1, and 1.2, using Let's Encrypt as its cert provider.
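Putting the answer together, a version of the original snippet using the module-level default context might look like this. This is my suggestion rather than the asker's exact code, and the try/except is only there so the sketch fails gracefully when there is no network access:

```python
import ssl
import urllib.request

# create_default_context() picks the best protocol available and verifies
# server certificates against the system CA bundle.
scontext = ssl.create_default_context()

chemeo_search_url = "https://www.chemeo.com/search?q=show%3Ahfus+tf%3A275%3B283"
try:
    with urllib.request.urlopen(chemeo_search_url, context=scontext) as f:
        print(f.read(200))
except OSError as exc:  # URLError and SSLError are OSError subclasses
    print("request failed:", exc)
```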
When I try a Django view that loads an external URL using requests, I get a 'module' object has no attribute 'create_connection' error. However, when I use urllib2, or the same requests code from the interactive shell, it works.
My environment:
Python 2.5.2
Requests 0.10.0 (I am using a 3rd party api which requires this version)
Apache with WSGI inside a virtualenv for my django site
Django 1.4.1
Debian Linux 5
I do not have SELinux (or any similar security)
I actually use 2 different APIs for completely different functions. They both require requests and they both give this error:
Exception Type: AttributeError
Exception Value: 'module' object has no attribute 'create_connection'
Exception Location: /my/virtualenv/dir/lib/python2.5/site-packages/requests/packages/urllib3/connectionpool.py in connect, line 67
The mentioned line in the exception is:
sock = socket.create_connection((self.host, self.port), self.timeout)
It looks like the version of the socket module that ships with Python 2.5 does not have the create_connection method (it was added in 2.6). However, I tried running the exact same code from the Python interactive shell within the virtualenv, and everything works. Also, requests 0.10.0 is supposed to work with Python 2.5.
I created the following 2 test views because I suspected requests to be part of the problem:
def get_requests(request):
    import requests
    r = requests.get("https://google.ca")
    return HttpResponse(r.text)

def get_urllib(request):
    import urllib2
    r = urllib2.urlopen('https://google.ca')
    return HttpResponse(r.read())
The urllib view works, and the requests view gives me the same error as above.
The fact that urllib works indicates to me that Apache has permission to connect to the internet (it's not a firewall issue).
I've done a tcpdump when trying the views, and requests never even attempts to connect out.
Any ideas? Please don't suggest using something other than requests because I am using 2 different 3rd party APIs which require it.
Thanks.
It looks like you've run into a bug in requests 0.10.0 (or, really, in urllib3) with HTTPS in Python 2.5 with the ssl module installed.
If you trace through the 0.10.0 source, you can see that if ssl is installed and you make an HTTPS request, you are going to end up in the VerifiedHTTPSConnection.connect method. This is also explained in comments in the HTTPSConnectionPool source. But you don't really even need to trace through the source, because you already saw that from your traceback.
And if you look at the source to that method, it unconditionally calls socket.create_connection, which is guaranteed to fail in 2.5.
The odds that anyone is ever going to fix this bug are pretty minimal. It looks like it was introduced in 0.10.0, and 0.10.1 resolved it by just dropping 2.5 support. (I'm not positive about that, because I can't find it in the bug tracker.)
So, what can you do about it?
First, note that while create_connection is "higher level" than connect, its only real advantage is that it does the name lookup before deciding what kind of socket to create. If you know you only ever care about IPv4, you can replace it with this:
self.sock = socket.socket()
self.sock.settimeout(self.timeout)
self.sock.connect((self.host, self.port))
If you care about IPv6, you can just borrow the 2.6 code for create_connection instead.
So, you have a few options:
Fork the source and patch urllib3.connectionpool.VerifiedHTTPSConnection.connect to use the workaround instead of create_connection.
Monkeypatch urllib3.connectionpool.VerifiedHTTPSConnection.connect at runtime.
Monkeypatch socket at runtime to add a create_connection implementation.
However, I wouldn't want to guarantee that 0.10.0 won't have further problems with Python 2.5, given the history.
From 0.10.1, requests dropped support for Python 2.5.
Upgrade your Python, or use a version of requests older than 0.10.0.
0.10.1 (2012-01-23)
PYTHON 3 SUPPORT! Dropped 2.5 Support. (Backwards Incompatible)
Use requests 0.9.3. That works for sure with Python 2.5. We needed Python 2.5 support, and either Kenneth told us to use that version or it was the newest version we found that actually worked in practice. Just watch out for any API differences if you go back to that version.
As abarnert pointed out, this is caused by a bug in requests 0.10.0 and it won't be fixed since python 2.5 support was dropped in version 0.10.1.
So I edited this file requests/packages/urllib3/connectionpool.py (on line 67).
The original line:
sock = socket.create_connection((self.host, self.port), self.timeout)
I replaced it with:
try:
    sock = socket.create_connection((self.host, self.port), self.timeout)
except AttributeError:
    # Python 2.5 fix: socket.create_connection was added in 2.6
    sock = socket.socket()
    if self.timeout is not None:
        sock.settimeout(self.timeout)
    sock.connect((self.host, self.port))
With this change, everything is working.
I am using the instructions found here, to try to inspect the HTTP commands being sent to my webserver.
However, I am not seeing the HTTP commands being printed on the console as suggested in the tutorial. Does anyone know how to display/debug the HTTP commands at the CLI?
I am running Python 2.6.5 on Ubuntu Linux.
The tutorial information seems to be outdated.
The correct way to debug with urllib2 nowadays is:
import urllib2
request = urllib2.Request('http://diveintomark.org/xml/atom.xml')
opener = urllib2.build_opener(urllib2.HTTPHandler(debuglevel=1))
feeddata = opener.open(request).read()
Debugging with urllib works the old way though.
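For what it's worth, the same debuglevel trick carries over to Python 3's urllib.request (an aside, since the question is about Python 2):

```python
import urllib.request

# debuglevel=1 makes the underlying http.client connection print the
# request and response headers to stdout.
handler = urllib.request.HTTPHandler(debuglevel=1)
https_handler = urllib.request.HTTPSHandler(debuglevel=1)
opener = urllib.request.build_opener(handler, https_handler)
# feeddata = opener.open('http://diveintomark.org/xml/atom.xml').read()
```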