At work we have a MITM SSL root certificate for all web traffic. I've gotten pip to work by creating a pip.ini file and adding the cert=\path\to\cert.pem option to it. However, now it only works when I'm at work, and fails to download when I'm anywhere else. Is there any way to add the root certificate to the available list of root certs rather than replacing them? Or some other way to have pip work easily no matter which network I'm on?
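For reference, my pip.ini is essentially just this (path as above):

[global]
cert = \path\to\cert.pem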
Let's figure out why it succeeds at the office and what goes wrong elsewhere.
I suspect the config succeeds at work because the office network's MITM proxy re-signs all traffic with that certificate, so pip has to trust it there; elsewhere the cert= option replaces pip's default bundle, so ordinary certificates fail to verify. There are other ways besides the pip.ini file to make the certificate available.
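For example, you can supply it per invocation with pip's --cert flag, or through the PIP_CERT environment variable (paths are placeholders):

pip install --cert \path\to\cert.pem SomePackage
set PIP_CERT=\path\to\cert.pem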
Please report back: when you leave work, does pip succeed if you move pip.ini out of the way? The most likely fix is to change the way you are using the certificate.
There are some posts to study. The approach you are using is emphasized in the first two:
https://superuser.com/questions/665069/can-i-get-around-using-pip-install-cert
pip: cert failed, but curl works
The alternative solution is to add your cert to the cert bundle. There is an excellent thread about this:
How to add a custom CA Root certificate to the CA Store used by pip in Windows?
In there, look for the part about creating a cert bundle:
https://stackoverflow.com/a/52961564/1086346
I believe that if you do that, then the cert will be available if pip needs it at your office, but it will not be in the way when you are elsewhere.
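A minimal sketch of that approach, assuming the corporate root cert sits at C:\certs\corporate-root.pem (an illustrative path) and that the certifi package is installed:

# combine_bundle.py - append a corporate root CA to a copy of certifi's bundle
import shutil
import certifi

CORP_CERT = r"C:\certs\corporate-root.pem"  # illustrative path to the MITM root cert
COMBINED = r"C:\certs\combined-cacert.pem"  # the bundle pip will point at

shutil.copyfile(certifi.where(), COMBINED)  # start from the public CA bundle
with open(COMBINED, "ab") as bundle, open(CORP_CERT, "rb") as corp:
    bundle.write(b"\n")
    bundle.write(corp.read())               # append the corporate root

Point cert= in pip.ini at the combined file: the public roots are still present, so pip should keep working away from the office too.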
Good luck, let us know what happens.
Hoping someone can prevent me from having a mental breakdown ...
I am using the python requests module to consume a web API written in .NET (Windows). I have set myself up as my own CA and signed a server CSR (to do this I followed a good tutorial at https://realpython.com/python-https/#reader-comments). I package the resulting signed certificate with the server.
What happened next:
I try to access the API in Chrome (running through localhost). Doesn't work. As expected.
I add the CA certificate (which is self-signed) to my trusted root CAs in Chrome. Works. As expected.
Now I go to python:
I call requests.get with the 'verify' parameter pointing to the CA certificate. Does not work. Complains that the certificate is self-signed.
If I point 'verify' to the actual signed server certificate however ... it works.
Perhaps everything is fine, but my question is this:
From much of what I've read, especially the aforementioned tutorial, you should be able to point 'verify' at the CA certificate alone (rather than the signed server certificate). Is this right? Or is the behaviour I am seeing expected?
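For concreteness, the pattern the tutorial describes is simply this (URL and file names are placeholders for my setup):

import requests

# verify the server's chain against the CA certificate that signed its CSR
response = requests.get("https://localhost:5000/api", verify="ca.pem")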
Thanks for any help you might be able to give. I realise I have posted none of my own code; that can follow if there is indeed an issue with the way that I am doing things.
Update
So, on Ubuntu, requests can verify the connection with the CA certificate alone. The tutorial I was following was written for Linux users. The 'issue' I attempted to describe above is specific to running on Windows.
I am trying to set up TLS-encrypted connections to a MongoDB database using PyMongo. I have two Python binaries installed at two different locations, both version 3.6.8, and for both of them I have installed PyMongo 4.1.1. I have completed the process of generating the CA keys and server private keys. I then added the ca.pem to /etc/pki/ca-trust/source/anchors/ and ran 'sudo update-ca-trust' to add the certificate authority to the operating system certificate store. Then I updated the mongod.conf file and restarted the mongod instance. I am able to connect to the mongo shell using this command:
mongo --tls --host='server-host-name'
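The equivalent connection from Python is essentially this (a sketch; host name as above):

from pymongo import MongoClient

# no tlsCAFile given, so PyMongo falls back to the OS trust store via the ssl module
client = MongoClient("server-host-name", tls=True)
client.admin.command("ping")  # forces the TLS handshake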
The main issue: I am able to connect to the database using one Python binary, but the other gives this error:
[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:852)
error=AutoReconnect('SSL handshake failed:....)]
The output of the below command is:
openssl version -d
OPENSSLDIR: "/etc/pki/tls"
One workaround to make the other Python binary work was to explicitly export the path in an environment variable:
export SSL_CERT_FILE=/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem
But I want the other Python binary to look for the CAs in the appropriate directory automatically.
All these tests are performed locally, not through remote connections (which would require the certificate paths to be specified explicitly). I want to understand the internal workings of pymongo.MongoClient for TLS connections, specifically how it fetches the CA files from the operating system certificate store.
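For debugging, the standard library can report which locations each interpreter actually consults; this is what the ssl module (and therefore PyMongo, when no tlsCAFile is given) falls back to:

import ssl

# prints the default cafile/capath plus the env vars that override them
print(ssl.get_default_verify_paths())
# e.g. DefaultVerifyPaths(cafile='/etc/pki/tls/cert.pem', capath='/etc/pki/tls/certs',
#      openssl_cafile_env='SSL_CERT_FILE', ...)

Running this under each of the two binaries should show whether they were built against different OpenSSL directories.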
Also, how do I increase the logging for PyMongo? Is there any workaround for this? Can someone help me debug it? I can add additional information if required. Thank you.
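Update: one partial workaround for the logging question, in case it helps others, is PyMongo's monitoring API; a heartbeat listener at least surfaces handshake failures (a sketch; host name as above):

import logging
from pymongo import MongoClient, monitoring

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("mongo-debug")

class HeartbeatLogger(monitoring.ServerHeartbeatListener):
    """Log every server heartbeat, including TLS handshake failures."""
    def started(self, event):
        log.info("heartbeat started: %s", event.connection_id)
    def succeeded(self, event):
        log.info("heartbeat succeeded: %s", event.connection_id)
    def failed(self, event):
        log.warning("heartbeat failed: %s -> %s", event.connection_id, event.reply)

client = MongoClient("server-host-name", tls=True,
                     event_listeners=[HeartbeatLogger()])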
I'm on a corporate network and need to use a self-signed certificate with the requests library in a Docker image.
I installed it by putting it in /usr/local/share/ca-certificates and calling update-ca-certificates, like this:
COPY EDAG_Bundle.crt /usr/local/share/ca-certificates/my_cert.crt
RUN update-ca-certificates
ENV REQUESTS_CA_BUNDLE /usr/local/share/ca-certificates/my_cert.crt
Now I am able to access files on a server in our corporate network without running into a certificate error.
Unfortunately, this change caused pip to stop working. As pip uses requests too, it now also uses the self-signed certificate instead of the bundle from certifi.
The requests documentation states the following:
You can pass verify the path to a CA_BUNDLE file with certificates of trusted CAs:
requests.get('https://github.com', verify='/path/to/certfile')
This list of trusted CAs can also be specified through the REQUESTS_CA_BUNDLE environment variable.
As I understand this, I can define a list of trusted CAs, not just a single certificate.
How can I configure requests to use both CAs: my self-signed one and certifi's, located at /site-packages/certifi/cacert.pem?
Setting both in the environment variable, separating the paths with a colon, does not work.
Use /etc/ssl/certs/ca-certificates.crt as your REQUESTS_CA_BUNDLE.
requests.get('https://github.com', verify='/etc/ssl/certs/ca-certificates.crt')
When you put a self-issued CA certificate into /usr/local/share/ca-certificates and then run update-ca-certificates, it reads those certificates in and appends them to the global CA trust file (ca-certificates.crt). That file then holds both the publicly trusted CAs and your self-installed ones.
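Applied to the Dockerfile above, keep the copy/update steps and just point the variable at the merged bundle (assuming a Debian/Ubuntu base image):

COPY EDAG_Bundle.crt /usr/local/share/ca-certificates/my_cert.crt
RUN update-ca-certificates
ENV REQUESTS_CA_BUNDLE /etc/ssl/certs/ca-certificates.crt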
Note: this applies to Debian/Ubuntu systems; CentOS and Alpine probably have this in a different location.
I am setting up a web service that is reachable over HTTPS and it uses an internal CA. I want Ansible to verify the certificate so I need to make it trust this CA. I am running Ansible on both macOS and CentOS, so I need to make it trust my custom CA on both these types of operating systems.
I have tried putting my CA certificate inside /etc/ssl/certs/ and appending it to /usr/local/etc/openssl/cert.pem using blockinfile, but neither has worked. I would prefer an approach that is easy to clean up, like adding the CA file to a directory instead of appending it to a file.
I am running Ansible 2.8 and have figured out that it uses urllib to make the HTTP requests. But I cannot find any information on where it looks for CA certs on different operating systems.
Any ideas? Thanks!
Partial answer for CentOS (other distros use different paths/binaries, and I have no clue how this is managed on macOS).
Add your CA certificate file with a .pem extension to the /etc/pki/ca-trust/source/anchors/ folder
Run, as root, the command update-ca-trust (update-ca-certificates is the Debian/Ubuntu equivalent)
The cert should now be recognized by Ansible (and by any other environment/software that uses OpenSSL, e.g. curl)
If you still get validation errors, you can check the default path used by openssl:
openssl version -a | grep -i openssldir
In this directory, there should be a symbolic link cert.pem pointing to /etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem (i.e. the CA bundle file updated by update-ca-trust). If it is missing, try to create it, as shown below.
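If you need to create it (using the OPENSSLDIR reported above; adjust if yours differs):

sudo ln -s /etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem /etc/pki/tls/cert.pem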
Usually there are no fewer than three environment variables that you need to set to the path of the custom CA. Look at https://github.com/tox-dev/tox/pull/1439 for their exact names. Also, be sure they are set on the machine that actually runs the HTTP requests!
I recently installed a new wildcard SHA256 certificate from Comodo. It's a "Premium SSL" cert.
I am now unable to fetch files from the server with curl, wget, or anything else that uses those common SSL libraries. I usually get the following message back:
SSL3_GET_SERVER_CERTIFICATE:certificate verify failed
This is particularly an issue when I run easy_install and use our custom cheeseshop as the index, which is using the new cert.
I've tried running sudo update-ca-certificates but that didn't resolve the issue.
My Apache config has the following:
SSLEngine on
SSLCertificateFile /etc/ssl/localcerts/domain.net.crt
SSLCertificateKeyFile /etc/ssl/localcerts/domain.net.key
SSLCertificateChainFile /etc/ssl/localcerts/domain.net.ca-bundle
When I view the site in Chrome or Firefox, I get no errors. I've also used online SSL analysers, which pass fine.
If I pass the ca-bundle file given to me by Comodo directly into curl, it works, but otherwise it doesn't.
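For reference, this is the form that works (domain anonymized as in the Apache config above):

curl --cacert /etc/ssl/localcerts/domain.net.ca-bundle https://domain.net/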
My understanding is that this is because it's not in /etc/ssl/certs/cacerts.pem. I tried adding the bundled certs in as well, but it didn't work.
What's the best way to resolve this? We're using easy_install with Chef when deploying, so I'd like to avoid having to point to the ca bundle if at all possible.
Thanks.