Use multiple CA certs with python requests

I'm on a corporate network and need to use a self-signed certificate with the requests library in a Docker image.
I installed it by putting it in /usr/local/share/ca-certificates and calling update-ca-certificates, like this:
COPY EDAG_Bundle.crt /usr/local/share/ca-certificates/my_cert.crt
RUN update-ca-certificates
ENV REQUESTS_CA_BUNDLE /usr/local/share/ca-certificates/my_cert.crt
Now I am able to access files on a server in our corporate network without running into a certificate error.
Unfortunately, this change caused pip to stop working. As pip uses requests too, it now also uses the self-signed certificate instead of the bundle from certifi.
The requests documentation states the following:
You can pass verify the path to a CA_BUNDLE file with certificates of trusted CAs:
requests.get('https://github.com', verify='/path/to/certfile')
This list of trusted CAs can also be specified through the REQUESTS_CA_BUNDLE environment variable.
As I understand this, I can define a list of trusted CAs, not just a single certificate.
How can I configure requests to use both CAs (my self-signed one and the certifi bundle located in
/site-packages/certifi/cacert.pem)?
Setting both in the environment variable by separating the paths with a colon does not work.

Use /etc/ssl/certs/ca-certificates.crt as your REQUESTS_CA_BUNDLE.
requests.get('https://github.com', verify='/etc/ssl/certs/ca-certificates.crt')
When you put a self-issued CA certificate into /usr/local/share/ca-certificates and then run update-ca-certificates, it reads those certificates in and appends them to the global CA trust file (ca-certificates.crt). That file then holds trust for both the publicly trusted CAs and your self-installed ones.
Note: this is the Debian/Ubuntu location; CentOS and Alpine probably keep it in a different place (ref).
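With that in place, pointing requests at the combined system bundle covers both your corporate server and public sites. A minimal sketch, assuming the Debian/Ubuntu path above:
import requests

# The merged trust file maintained by update-ca-certificates on Debian/Ubuntu;
# it contains the public roots plus any CA dropped into /usr/local/share/ca-certificates.
SYSTEM_BUNDLE = "/etc/ssl/certs/ca-certificates.crt"

session = requests.Session()
session.verify = SYSTEM_BUNDLE  # applies to every request made through this session

print(session.get('https://github.com').status_code)
Setting ENV REQUESTS_CA_BUNDLE /etc/ssl/certs/ca-certificates.crt in the Dockerfile instead achieves the same thing without touching the code.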

Related

In Python ssl.create_default_context won't retrieve new certificate from the Local Computer Cert Store

I am trying to retrieve certificate data using Python 3.11.0 on Windows with the ssl.create_default_context function.
While it does retrieve some certificates, it doesn't retrieve all of them. For example, I added a new GoDaddy SSL certificate to the Certificates snap-in of the MMC, specifically to the "Trusted Root Certification Authorities" section, since I saw that ssl.create_default_context pulls existing certificates from there.
After adding said certificate and running the following lines of code:
import ssl

# Dump the default context's trusted roots (DER), then convert each to PEM
certs = ssl.create_default_context().get_ca_certs(binary_form=True)
pem_certs = [ssl.DER_cert_to_PEM_cert(der) for der in certs]
I'm getting a list of certificates, but it doesn't include the new certificate I manually added.
I tried to read the command configuration but didn't find any flag that could help me retrieve any extra certificates with this command.
Any help would be greatly appreciated, thanks!
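One way to narrow this down (a diagnostic sketch, assuming a Windows machine, where ssl.enum_certificates is available) is to list what the "ROOT" system store itself reports and check whether the new certificate appears there at all:
import ssl

# Windows-only: enumerate the system "ROOT" store directly.
# Each entry is a (cert_bytes, encoding_type, trust) tuple.
for cert_bytes, encoding, trust in ssl.enum_certificates("ROOT"):
    if encoding == "x509_asn":  # DER-encoded certificate
        pem = ssl.DER_cert_to_PEM_cert(cert_bytes)
        print(pem.splitlines()[1][:40], trust)  # start of each cert body, plus trust info
If the new certificate doesn't appear in this listing either, the problem is likely upstream of create_default_context, i.e. in which store the certificate was actually added to.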

Pymongo unable to read Certificate Authority file

I am trying to set up TLS-encrypted connections to a MongoDB database using PyMongo. I have two Python installations at two different locations, both of version 3.6.8, and both have PyMongo version 4.1.1 installed. I have completed the process of generating the CA keys and the server private keys. I then added the ca.pem to '/etc/pki/ca-trust/source/anchors/' and ran 'sudo update-ca-trust' to add the certificate authority to the operating system certificate store. Then I updated the mongod.conf file and restarted the mongod instance. I am able to connect to the mongo shell using this command:
mongo --tls --host='server-host-name'
But the main issue is that I am able to connect to the database using one Python installation, while the other gives this error:
[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:852)
error=AutoReconnect('SSL handshake failed:....)]
The output of the below command is:
openssl version -d
OPENSSLDIR: "/etc/pki/tls"
One workaround to make the other Python installation work was to explicitly export the path in an environment variable:
export SSL_CERT_FILE=/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem
But I want the other Python installation to also look for the CAs in the appropriate directory automatically.
All these tests are performed locally and not through remote connections (which would require the certificate paths to be specified explicitly). I want to understand the internal workings of pymongo.MongoClient for TLS connections, specifically how it fetches the CA files from the operating system certificate store.
Also, how do I increase the logging for pymongo? Can someone help me debug this? I can add additional information if required. Thank you.
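When no CA file is passed explicitly, PyMongo builds its TLS context with the standard ssl module, which falls back to the default verify paths compiled into that interpreter's OpenSSL. A quick diagnostic sketch to run with each of the two Python binaries:
import ssl

# Where this interpreter's OpenSSL looks for CAs by default.
# A mismatch between the two installations here would explain
# why only one of them trusts the new CA.
paths = ssl.get_default_verify_paths()
print("cafile:", paths.cafile)               # bundle file actually found (or None)
print("capath:", paths.capath)               # hashed cert directory (or None)
print("compiled-in:", paths.openssl_cafile)  # OpenSSL's built-in default
This also explains why exporting SSL_CERT_FILE worked: that variable overrides the compiled-in default for the OpenSSL build that honours it.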

Make Ansible trust my custom CA on macOS and CentOS

I am setting up a web service that is reachable over HTTPS and it uses an internal CA. I want Ansible to verify the certificate so I need to make it trust this CA. I am running Ansible on both macOS and CentOS, so I need to make it trust my custom CA on both these types of operating systems.
I have tried to put my CA certificate inside /etc/ssl/certs/ and to append it to /usr/local/etc/openssl/cert.pem using blockinfile, but neither of those has worked. I would prefer a way that is easy to clean up, like adding the CA file to a directory instead of appending it to a file.
I am running Ansible 2.8 and have figured out that it uses urllib to make the HTTP requests. But I cannot find any information on where it looks for CA certs on different operating systems.
Any ideas? Thanks!
Partial answer for CentOS (other distros use different paths/binaries, and I have no clue how this is managed on macOS).
Add your CA certificate file with a .pem extension to the /etc/pki/ca-trust/source/anchors/ folder
Run update-ca-trust as root (update-ca-certificates is the Debian/Ubuntu equivalent)
The cert should now be recognized by Ansible (and all other environments/software using OpenSSL, e.g. curl)
If you still get validation errors, you can check the default path used by OpenSSL:
openssl version -a | grep -i openssldir
In this directory, there should be a symbolic link cert.pem pointing to /etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem (i.e. the CA bundle file updated by update-ca-trust). If it is missing, try to create it.
Usually there are no fewer than three environment variables that you need to set to the path of the custom CA. Look at https://github.com/tox-dev/tox/pull/1439 for their exact names. Also be sure they are set on the machine that actually runs the HTTP requests!
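As a sketch of that advice: SSL_CERT_FILE and REQUESTS_CA_BUNDLE appear elsewhere on this page, and CURL_CA_BUNDLE is a common third one (treat the exact set as an assumption and check the PR above). Set them before the process that makes the requests starts:
import os

# CentOS path to the extracted system bundle (see the answer above)
bundle = "/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem"

# Commonly consulted variables; CURL_CA_BUNDLE is an assumption here
for var in ("SSL_CERT_FILE", "REQUESTS_CA_BUNDLE", "CURL_CA_BUNDLE"):
    os.environ.setdefault(var, bundle)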

Adding custom root cert to pip SSL settings?

At work we have a MITM SSL root certificate for all web traffic. I've gotten pip to work by creating a pip.ini file and adding the cert=\path\to\cert.pem option to it. However, now it only works when I'm at work, and fails to download when I'm anywhere else. Is there any way to add the root certificate to the available list of root certs rather than replacing them? Or some other way to have pip work easily no matter which network I'm on?
Let's figure out why it succeeds at the office and what goes wrong elsewhere.
I suspect the config succeeds at work because you are within a VPN, and the VPN insists on that certificate file in order to allow pip to communicate. There are other ways besides the pip.ini file to make the certificate available.
Please report back: when you leave work, does pip succeed if you move pip.ini out of the way? The most likely fix is to change the way you are using the certificate.
There are some posts to study. The approach you are using is emphasized in the first two:
https://superuser.com/questions/665069/can-i-get-around-using-pip-install-cert
pip: cert failed, but curl works
The alternative solution is to add your cert to the cert bundle. There is an excellent thread about this:
How to add a custom CA Root certificate to the CA Store used by pip in Windows?
In there, look for the part about creating a cert bundle:
https://stackoverflow.com/a/52961564/1086346
I believe that if you do that, then the cert will be available if pip needs it at your office, but it will not be in the way when you are elsewhere.
Good luck, let us know what happens.
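For reference, the bundle-creation approach from the linked answer boils down to appending the corporate root to a copy of certifi's public bundle and pointing pip's cert option at the result. A minimal sketch (the file paths are placeholders):
import shutil
import certifi

# Placeholder paths: your corporate MITM root and a writable output location
corporate_root = r"C:\certs\corp-root.pem"
combined_bundle = r"C:\certs\combined-cacert.pem"

# Start from certifi's public bundle, then append the corporate root,
# so both intercepted (work) and direct (home) connections verify.
shutil.copyfile(certifi.where(), combined_bundle)
with open(combined_bundle, "ab") as out, open(corporate_root, "rb") as extra:
    out.write(b"\n" + extra.read())
Point the cert option in pip.ini at the combined file and it should work on both networks.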

Let's encrypt certificate, Python and Windows

I changed my web server from HTTP to HTTPS with Let's Encrypt.
The web server exposes an API, and I have a Python application which uses the API.
Under Linux everything is fine, but under Windows I receive the error below when logging in.
[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:590)
My thought was that the SSL certificate isn't installed.
So I downloaded "isrgrootx1.der" and "lets-encrypt-x1-cross-signed.der" and renamed both to the ".cer" extension.
Then I opened the Windows console, and run this:
certutil -addstore "Root" "isrgrootx1.cer".
certutil -addstore "Root" "lets-encrypt-x1-cross-signed.cer".
The second command failed, because it isn't a root certificate.
My question is: into which store does "lets-encrypt-x1-cross-signed.cer" have to be installed?
You shouldn't need to add "lets-encrypt-x1-cross-signed.cer" to your Windows machine, since it's only an intermediate certificate. And you shouldn't need to add "isrgrootx1.cer" either, since Let's Encrypt certificates chain to "DST Root X3", which is already included with Windows.
Most likely your web server was not configured to send the intermediate certificate. If you're using Certbot, for instance, you'll want to configure your web server using "fullchain.pem" rather than "cert.pem".
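To confirm that diagnosis from the client side, a small sketch (the hostname is a placeholder for your API server) that reproduces the verification outside the application:
import socket
import ssl

HOST = "api.example.com"  # placeholder for your server

ctx = ssl.create_default_context()
try:
    with socket.create_connection((HOST, 443)) as sock:
        with ctx.wrap_socket(sock, server_hostname=HOST) as tls:
            print("verified OK:", tls.version())
except ssl.SSLError as exc:
    # A server that omits the intermediate typically fails here with
    # "unable to get local issuer certificate"
    print("verification failed:", exc)
If this fails on Windows but succeeds on Linux, the Linux machine most likely has the intermediate shipped in its CA bundle, which matches the answer above.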
