Does Python requests certificate verification require OpenSSL on Windows?

I am using the requests library for my Python client, which talks to app servers. I do not want to put verify=False in the production version, as that blindly trusts any certificate. I know that the requests API supports certificate verification: http://docs.python-requests.org/en/latest/user/advanced/#ssl-cert-verification. But I am not able to find the required dependencies for it. Is installing OpenSSL separately required on Windows?

You do not need to install OpenSSL separately on Windows to get certificate verification with requests.
OpenSSL is required, but it is statically linked into the official Python Windows binaries, so as long as the ssl module is present in your Python installation, everything will work fine.
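A quick way to confirm this in your environment (a minimal sketch; example.com is just a placeholder URL):

import ssl
import requests

# If the ssl module imports, the OpenSSL bundled with Python is available.
print(ssl.OPENSSL_VERSION)

# verify=True is the default; requests ships its own CA bundle (certifi).
response = requests.get("https://example.com", verify=True)
print(response.status_code)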

Related

Python requests: Passing multiple client certificates to session.cert

I'm writing a Flask app that connects to an external SOAP service that uses TLS v1.2.
I'm using Python 2.7 and the requests library, version 2.18.1.
I've contacted the server owner and he told me that I need to include multiple client certificates in the TLS connection. It's a chain of 3 certificates, which I have in separate .pem files (root + intermediate + my client certificate).
The server won't let me in if I provide just the last one.
I've tested this with SoapUI and Wireshark and it's true. I receive a response only when I provide the whole chain of 3 certificates.
I get an error from the server when passing just my client certificate.
From the requests documentation you can read that you can pass a single client certificate using:
import requests

session = requests.Session()
session.cert = ('/path/client_cert.pem', '/path/private_key.pem')
response = session.post(SERVICE_URL, data=XML_CONTENT, headers=HEADERS)
I get an error even if my "client_cert.pem" file is a bundle of 3 certificates (just like you would do in session.verify with CA certs). I can see in Wireshark that only the first one is used in the TLS connection.
Is there any way to include multiple certificates in the TLS connection with Python's requests library?
Maybe I should use a different library or override some of its code?
I've got it!
I had some legacy library versions installed.
It seems that this issue was fixed by requests library developers in version 1.23. I also had to update urllib3.
My current requirements.txt is:
requests==2.22.0
urllib3==1.25.2 # compatible with requests 2.22
With these versions everything works perfectly. I've checked the TLS connection in Wireshark; all certificates from the "client_cert.pem" chain are passed.
If you have problems like this in the future, remember to check that your requests and urllib3 library versions are compatible.
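For example, a quick way to check which versions are actually installed (a minimal sketch):

import requests
import urllib3

# Print the installed versions to confirm they are a compatible pair.
print("requests", requests.__version__)
print("urllib3", urllib3.__version__)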
Thank you guys!

SSL Error certificate verification failed when using requests module on alpine linux but works fine on my windows machine

I am using the Python requests module to make a few API calls. I have to provide certificates to validate them. I have passed a PEM file to "cert" and a CA bundle to "verify", and it works fine on my Windows machine.
When the same code is run in a Docker container it gives me a
certificate verification failed error.
OS - Alpine Linux.
Any thoughts on this would be appreciated.
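For reference, a minimal sketch of the kind of call described above (the URL and file paths are placeholders):

import requests

response = requests.get(
    "https://api.example.com/resource",               # placeholder endpoint
    cert=("/certs/client.pem", "/certs/client.key"),  # client certificate and key
    verify="/certs/ca_bundle.pem",                    # CA bundle used to verify the server
)
print(response.status_code)

On a minimal image such as Alpine, the system CA store is often missing, so the certificates typically have to be installed (for example via the ca-certificates package) or provided explicitly through verify.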

urllib.request SSL Connection Python 3

I'm trying to parse the data from this url:
https://www.chemeo.com/search?q=show%3Ahfus+tf%3A275%3B283
But I think this is failing because the website uses TLS 1.3. How can I enable my Python script, below, to connect using SSL in urllib.request?
I've tried using an SSL context but this doesn't seem to work.
This is the Python 3.6 code I have:
import urllib.request
import ssl
from bs4 import BeautifulSoup

scontext = ssl.SSLContext(ssl.PROTOCOL_SSLv23)
chemeo_search_url = "https://www.chemeo.com/search?q=show%3Ahfus+tf%3A275%3B283"
print(chemeo_search_url)
with urllib.request.urlopen(chemeo_search_url, context=scontext) as f:
    print(f.read(200))
Try:
ssl.PROTOCOL_TLS
From the docs on "PROTOCOL_SSLv23":
Deprecated since version 2.7.13: Use PROTOCOL_TLS instead.
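A minimal sketch of the suggested change, assuming a Python built with SSL support and an installed CA bundle:

import ssl
import urllib.request

# PROTOCOL_TLS lets OpenSSL negotiate the highest protocol version both sides support.
scontext = ssl.SSLContext(ssl.PROTOCOL_TLS)
scontext.verify_mode = ssl.CERT_REQUIRED
scontext.check_hostname = True
scontext.load_default_certs()  # needs the system CA certificates to be installed

chemeo_search_url = "https://www.chemeo.com/search?q=show%3Ahfus+tf%3A275%3B283"
with urllib.request.urlopen(chemeo_search_url, context=scontext) as f:
    print(f.read(200))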
Note:
Be sure the CA certificate bundles are installed; on a minimal build such as Alpine Linux or BusyBox the certificates have to be installed separately. If Python wasn't compiled with SSL support, it may need to be rebuilt with it. Which version of OpenSSL it was compiled against also determines which SSL/TLS features are usable.
Also note the chemeo site doesn't use TLSv1.3; it is still experimental at the time of this writing. They currently support TLS 1.0, 1.1 and 1.2, using "letsencrypt" as their cert provider.

Accessing SPNEGO authenticated web service from Python on Windows

I am trying to access a REST service that uses Kerberos authentication (company internal) from a Python app on Windows. However, it seems that the service is configured to expect SPNEGO only, because when I try to use requests-kerberos to connect, as in:
import requests
from requests_kerberos import HTTPKerberosAuth

requests.get('servicename', auth=HTTPKerberosAuth())
it produces a 500 Error from the server with:
javax.servlet.ServletException: GSSException: No credential found for: 1.2.840.113554.1.2.2
My guess is that the server is configured to expect SPNEGO only and the Python client supports only Kerberos.
I have tried installing PyKerberos but that fails as it expects krb5 on the system and I am doing this under Windows. Are there any libraries available that could help me do a SPNEGO call from Python in Windows?
In case anyone else is having a similar problem: resolved by using pycurl with the pycurl.HTTPAUTH_GSSNEGOTIATE option set.
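A hedged sketch of that pycurl approach (the URL is a placeholder; the current Kerberos ticket supplies the credentials):

import pycurl
from io import BytesIO

buffer = BytesIO()
curl = pycurl.Curl()
curl.setopt(pycurl.URL, "https://service.example.com/rest/resource")  # placeholder URL
curl.setopt(pycurl.HTTPAUTH, pycurl.HTTPAUTH_GSSNEGOTIATE)  # SPNEGO / Negotiate
curl.setopt(pycurl.USERPWD, ":")       # empty user/password; the Kerberos ticket is used
curl.setopt(pycurl.WRITEDATA, buffer)  # collect the response body
curl.perform()
print(curl.getinfo(pycurl.RESPONSE_CODE))
curl.close()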

Is there a Python API that will return precise cipher information about a TLS connection?

I'm trying to write a python (or Java) program that makes an https connection to a website and then returns properties of the https connection. I've been using python's ssl (http://docs.python.org/2/library/ssl.html), specifically the .cipher() method. My main issue with the output is that it isn't very specific:
('RC4-SHA', 'TLSv1/SSLv3', 128)
This is the output for www.amazon.com. But when I go into my browser and manually examine the connection, I can see that it is actually RC4 with SHA1 message authentication, RSA key exchange and TLS v1.0. In fact, the .cipher() method outputs TLSv1/SSLv3 for TLS versions 1.0, 1.1 and 1.2, and outputs SHA for both SHA1 and SHA256.
Is there any Python (or Java) API that will give me more information about the https connection?
The Python ssl module is just a wrapper around OpenSSL, and it can't provide any more information than the library provides. But really, I don't think you're missing any information. It is being specific. You're just always getting TLSv1.0 and SHA1.
First, TLSv1/SSLv3 in OpenSSL 0.9 means TLSv1.0. It cannot mean 1.1 or 1.2, because OpenSSL 0.9 does not support those protocols. (And you are probably using OpenSSL 0.9. For example, the 3.3.0 64-bit Mac binary installer I just got off Python.org uses 0.9.8r.) You can check this from Python with ssl.OPENSSL_VERSION.
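As a quick check (this only reports the OpenSSL the ssl module was linked against):

import ssl

print(ssl.OPENSSL_VERSION)       # e.g. 'OpenSSL 0.9.8r 8 Feb 2011'
print(ssl.OPENSSL_VERSION_INFO)  # the same information as a tuple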
Second, RC4-SHA means SHA1, not SHA256. RC4-SHA is just the OpenSSL name for the TLSv1.0 cipher suite TLS_RSA_WITH_RC4_128_SHA. That's a complete specification of the cipher; there are different ciphers with SHA256 on the end of their names. You can see the list of cipher suites specified in RFC 2246 and its addenda for TLS 1.0 (it's RFC 4346 for 1.1 and RFC 5246 for 1.2). I don't think the mapping from OpenSSL names to RFC names is specified anywhere except inside the code, but if you have the command-line OpenSSL tools you can type openssl ciphers to dump out the list of OpenSSL names for all suggested cipher suites it will send, and then you can match them up to the values sent in the handshake. (To see the handshake, openssl s_client -connect www.amazon.com:443 -msg, or try -debug or other flags instead of/in addition to -msg.)
So, why does your browser show TLSv1.2 or SHA256 for some of those same sites? Because your browser has a completely different SSL library (or a newer OpenSSL), and therefore does a completely different handshake with the server, and ends up agreeing on different cipher suites, and therefore it reports different information.
So, it's not that Python is negotiating TLSv1.2 or SHA256 and just hiding that from you; it's negotiating TLSv1.0 and SHA1 with the same server, and telling you what it's done.
If you want to use a different library that can handle things OpenSSL 0.9 can't, there are lots of choices. If you install OpenSSL 1.0.1 or later and build PyOpenSSL against that, I believe (I haven't tested) that you should be able to negotiate newer protocols and ciphers and find out that you've done so. It might even work if you rebuild Python, or the standalone ssl module, against it. (If neither of those works, there are a zillion more OpenSSL wrappers and other SSL or TLS modules on PyPI, or you can wrap your favorite library yourself with ctypes.)
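For illustration, a sketch of reading the negotiated protocol and cipher suite with PyOpenSSL built against a newer OpenSSL (the reporting methods assume a reasonably recent pyOpenSSL, roughly 16.2 or later):

import socket
from OpenSSL import SSL

ctx = SSL.Context(SSL.SSLv23_METHOD)  # negotiate the best protocol both sides support
sock = socket.create_connection(("www.amazon.com", 443))
conn = SSL.Connection(ctx, sock)
conn.set_tlsext_host_name(b"www.amazon.com")  # send SNI
conn.set_connect_state()
conn.do_handshake()
print(conn.get_protocol_version_name())  # e.g. 'TLSv1.2'
print(conn.get_cipher_name())            # e.g. 'ECDHE-RSA-AES128-GCM-SHA256'
sock.close()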
