Make Ansible trust my custom CA on macOS and CentOS

I am setting up a web service that is reachable over HTTPS and uses an internal CA. I want Ansible to verify the certificate, so I need to make it trust this CA. I run Ansible on both macOS and CentOS, so the CA has to be trusted on both operating systems.
I have tried putting my CA certificate inside /etc/ssl/certs/ and appending it to /usr/local/etc/openssl/cert.pem using blockinfile, but neither has worked. I would prefer a way that is easy to clean up, like dropping the CA file into a directory instead of appending it to a file.
I am running Ansible 2.8 and have figured out that it uses urllib to make the HTTP requests. But I cannot find any information on where it looks for CA certs on different operating systems.
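A quick way to see where a given interpreter looks is to ask its ssl module (which urllib sits on top of) directly. Running this under the same Python that Ansible uses on each machine shows the paths to target:

import ssl

# Compile-time default CA file/dir for this Python build, plus the
# environment variables (usually SSL_CERT_FILE / SSL_CERT_DIR) that override them.
paths = ssl.get_default_verify_paths()
print("cafile:", paths.cafile)
print("capath:", paths.capath)
print("overrides:", paths.openssl_cafile_env, paths.openssl_capath_env)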
Any ideas? Thanks!

Partial answer for CentOS (other distros use different paths/binaries, and I have no idea how this is managed on macOS).
Add your CA certificate file with a .pem extension to the /etc/pki/ca-trust/source/anchors/ directory.
Run update-ca-trust extract as root (the CentOS equivalent of Debian's update-ca-certificates).
The cert should now be recognized by Ansible (and by any other software that uses OpenSSL, e.g. curl).
If you still get validation errors, you can check the default path used by OpenSSL:
openssl version -a | grep -i openssldir
In this directory, there should be a symbolic link cert.pem pointing to /etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem (i.e. the CA bundle file regenerated by update-ca-trust). If it is missing, try creating it.
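To confirm that the interpreter Ansible runs under now trusts the CA, here is a minimal check (https://your-internal-host is a placeholder for your service):

import urllib.request

# Certificate verification is on by default; this raises
# CERTIFICATE_VERIFY_FAILED if the CA is still not trusted.
with urllib.request.urlopen("https://your-internal-host") as resp:
    print(resp.status)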

Usually there are no fewer than three environment variables that you need to point at the custom CA bundle. Look at https://github.com/tox-dev/tox/pull/1439 for their exact names. Also be sure they are set on the machine that actually runs the HTTP requests!
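As a sketch, assuming the CentOS bundle path from the answer above (the three variable names below are the commonly used ones; check the PR for the authoritative list):

import os

bundle = "/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem"
# Set these before any HTTP client library builds its SSL context.
for var in ("SSL_CERT_FILE", "REQUESTS_CA_BUNDLE", "CURL_CA_BUNDLE"):
    os.environ[var] = bundle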

Related

Pymongo unable to read Certificate Authority file

I am trying to set up TLS-encrypted connections to a MongoDB database using PyMongo. I have two Python installations at two different locations, both version 3.6.8, and PyMongo 4.1.1 installed for both. I have completed the process of generating the CA keys and server private keys, added ca.pem to /etc/pki/ca-trust/source/anchors/, and ran sudo update-ca-trust to add the certificate authority to the operating system certificate store. Then I updated the mongod.conf file and restarted the mongod instance. I am able to connect to the mongo shell using this command:
mongo --tls --host='server-host-name'
But the main issue is that I am able to connect to the database using one Python installation, while the other gives this error:
[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:852)
error=AutoReconnect('SSL handshake failed:....)]
The output of the below command is:
openssl version -d
OPENSSLDIR: "/etc/pki/tls"
One workaround that made the other Python binary work was to explicitly export the path in an environment variable:
export SSL_CERT_FILE=/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem
But, I want the other python binary to also look for the CAs in the appropriate directory automatically.
All these tests are performed locally, not through remote connections (which would require the certificate paths to be specified explicitly). I want to understand the internal workings of pymongo.MongoClient for TLS connections; basically, how does it fetch the CA files from the operating system certificate store?
Also, how do I increase logging for PyMongo? Can someone help me debug this? I can add more information if required. Thank you.
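Not a full answer, but each Python build is compiled against its own OpenSSL configuration and therefore has its own default CA locations, so comparing them between your two binaries usually explains why only one verifies. Run this under both interpreters:

import ssl

# Shows the default CA file/dir this build uses and the env vars that override them.
print(ssl.get_default_verify_paths())

As a sketch of the explicit workaround (the host name is your placeholder), PyMongo also accepts the CA file directly, which sidesteps the interpreter defaults entirely:

from pymongo import MongoClient

client = MongoClient("server-host-name", tls=True,
                     tlsCAFile="/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem")
print(client.admin.command("ping"))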

Use multiple CA certs with python requests

I'm in a corporate network and need to use a self-signed certificate for the requests library in a Docker image.
I installed it by putting it in /usr/local/share/ca-certificates and calling update-ca-certificates like this:
COPY EDAG_Bundle.crt /usr/local/share/ca-certificates/my_cert.crt
RUN update-ca-certificates
ENV REQUESTS_CA_BUNDLE /usr/local/share/ca-certificates/my_cert.crt
Now I am able to access files on a Server in our corporate network without running in a certificate error.
Unfortunately this change caused pip to stop working. As pip uses requests too, it now also uses the self-signed certificate instead of the bundle from certifi.
The requests documentation states the following:
You can pass verify the path to a CA_BUNDLE file with certificates of trusted CAs:
requests.get('https://github.com', verify='/path/to/certfile')
This list of trusted CAs can also be specified through the REQUESTS_CA_BUNDLE environment variable.
As I understand this, I can define a list of trusted CAs, not just one.
How can I configure requests to use both CAs? (my self signed one and the one of certifi located in
/site-packages/certifi/cacert.pem).
Setting both in the environment variable, separating the paths with a colon, does not work.
Use /etc/ssl/certs/ca-certificates.crt as your REQUESTS_CA_BUNDLE.
requests.get('https://github.com', verify='/etc/ssl/certs/ca-certificates.crt')
When you put a self-issued CA certificate into /usr/local/share/ca-certificates and then run update-ca-certificates, it reads those in and appends them to the global CA trust file (ca-certificates.crt). That file then holds both the publicly trusted CAs and your self-installed ones.
Note: that path applies to Debian/Ubuntu systems; CentOS/Alpine probably have this in a different location (ref).
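If you would rather not depend on the distro's bundle path, another common approach is to build a combined bundle out of certifi's file plus your corporate CA and point REQUESTS_CA_BUNDLE (or verify=) at that. A sketch, with the output path assumed:

import shutil
import certifi

# Concatenate certifi's public roots and the corporate CA into one bundle.
with open("/etc/ssl/combined-ca.pem", "wb") as out:
    for src in (certifi.where(), "/usr/local/share/ca-certificates/my_cert.crt"):
        with open(src, "rb") as f:
            shutil.copyfileobj(f, out)
# Then: ENV REQUESTS_CA_BUNDLE /etc/ssl/combined-ca.pem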

Adding custom root cert to pip SSL settings?

At work we have a MITM SSL root certificate for all web traffic. I've gotten pip to work by creating a pip.ini file and adding the cert=\path\to\cert.pem option to it. However, now it only works when I'm at work, and fails to download when I'm anywhere else. Is there any way to add the root certificate to the available list of root certs rather than replacing them? Or some other way to have pip work easily no matter which network I'm on?
Let's figure out why it succeeds at the office and what goes wrong elsewhere.
I suspect the config succeeds at work because you are within a VPN and the VPN insists on the certificate file in order to allow pip communication. There are other ways besides the pip.ini file to make the certificate available.
Please report back: when you leave work, does pip succeed if you move pip.ini out of the way? The most likely answer is to change the way you are using the certificate.
There are some posts to study. The approach you use is emphasized in the first two:
https://superuser.com/questions/665069/can-i-get-around-using-pip-install-cert
pip: cert failed, but curl works
The alternative solution is to add your cert to the cert bundle. There is an excellent thread about this:
How to add a custom CA Root certificate to the CA Store used by pip in Windows?
In there, look for the part about creating a cert bundle:
https://stackoverflow.com/a/52961564/1086346
I believe that if you do that, then the cert will be available if pip needs it at your office, but it will not be in the way when you are elsewhere.
Good luck, let us know what happens.
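A sketch of that bundle approach (the Windows paths are placeholders): copy certifi's bundle, append your MITM root, and point pip's cert option, or the PIP_CERT environment variable, at the result. Because the public roots stay in the file, the same pip.ini keeps working off the corporate network:

import shutil
import certifi

combined = r"C:\certs\combined.pem"
shutil.copyfile(certifi.where(), combined)          # start from the public roots
with open(combined, "ab") as out, open(r"\path\to\cert.pem", "rb") as mitm:
    shutil.copyfileobj(mitm, out)                   # append the corporate root
# pip.ini:
#   [global]
#   cert = C:\certs\combined.pem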

Adding a custom CA Root certificate to GCloud utility (or Python generally) on Windows

I'm using gcloud on Windows to develop GAE stuff. The network here has a MITM root certificate by design so all SSL traffic can be snooped; I can install the root cert easily into a browser or Windows certificate store, but can't successfully get this work for Python, or more specifically, gcloud (which has its own Python bundled). The answers at How to add a custom CA Root certificate to the CA Store used by Python in Windows? don't work - I've tried setting SSL_CERT_DIR and SSL_CERT_FILE environment variables to no avail, and the pip.ini solution isn't applicable as I'm not using pip.
Assuming all your credential setup is in order, for MITM you likely also need to set proxy settings, for instance
gcloud config set proxy/address 127.0.0.1
gcloud config set proxy/port 8080
gcloud config set proxy/type http
replacing the address/port with those of your MITM proxy, and then either one of these:
gcloud config set auth/disable_ssl_validation True
or
gcloud config set core/custom_ca_certs_file cert.pem
Test by running some command, for example
gcloud projects list
You can use the additional --log-http gcloud flag and/or tools like Burp to further debug which certs/proxies are being used.
The previous answer works for gcloud, but not for gsutil. gsutil currently ignores whatever CA certificate value you have in the gcloud config, so you must add it to your boto config file (on a GCP instance it's /etc/boto.cfg). Add these lines:
[Boto]
ca_certificates_file = /path/to/cert.pem
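If you are not sure which certificate the interception proxy actually presents (and therefore what belongs in cert.pem), a small probe with any Python can show it; the host is just an example, and this assumes the MITM also intercepts direct connections:

import ssl

# Fetch whatever certificate is presented for this host, without verifying it.
pem = ssl.get_server_certificate(("www.googleapis.com", 443))
print(pem)  # inspect the issuer; it should match your corporate root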

easy_install/curl fails because of "SSL3_GET_SERVER_CERTIFICATE:certificate verify failed"

I recently installed a new wildcard SHA256 certificate from Comodo. It's a "Premium SSL" cert.
I am now unable to fetch files from the server with curl/wget or anything else that uses those common SSL libraries. I usually get the following message back:
SSL3_GET_SERVER_CERTIFICATE:certificate verify failed
This is particularly an issue when I run easy_install and use our custom cheeseshop as the index, which is using the new cert.
I've tried running sudo update-ca-certificates but that didn't resolve the issue.
My Apache config has the following:
SSLEngine on
SSLCertificateFile /etc/ssl/localcerts/domain.net.crt
SSLCertificateKeyFile /etc/ssl/localcerts/domain.net.key
SSLCertificateChainFile /etc/ssl/localcerts/domain.net.ca-bundle
When I view the site in Chrome or Firefox, I get no errors. I've used online SSL Analysers as well which seem to pass fine.
If I pass the ca-bundle file given to me by Comodo directly to curl, it works; otherwise it doesn't.
My understanding is that this is because it's not in /etc/ssl/certs/cacerts.pem. I tried adding the bundled certs in as well, but it didn't work.
What's the best way to resolve this? We're using easy_install with Chef when deploying, so I'd like to avoid having to point to the ca bundle if at all possible.
Thanks.
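One way to reproduce the failure outside easy_install/curl and see what the verifier objects to (a minimal probe; the hostname is a placeholder for your cheeseshop):

import socket
import ssl

host = "cheeseshop.domain.net"
ctx = ssl.create_default_context()  # uses the OpenSSL/system default store
try:
    with socket.create_connection((host, 443)) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            print("issuer:", tls.getpeercert()["issuer"])
except ssl.SSLError as exc:
    print("verify failed:", exc)  # the same failure curl/easy_install hit

If this fails while browsers succeed, the server is typically not sending the full intermediate chain (browsers fetch missing intermediates on their own), which would point back at how Apache serves SSLCertificateChainFile.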
