I changed my web server from HTTP to HTTPS with Let's Encrypt.
The web server exposes an API, and I have a Python application that uses the API.
Under Linux everything is fine, but under Windows I receive the error below when I log in.
[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:590)
My thought was that the SSL certificate isn't installed.
So I downloaded "isrgrootx1.der" and "lets-encrypt-x1-cross-signed.der" and renamed both to the ".cer" extension.
Then I opened the Windows console and ran:
certutil -addstore "Root" "isrgrootx1.cer"
certutil -addstore "Root" "lets-encrypt-x1-cross-signed.cer"
The second command failed because it isn't a root certificate.
My question is: into which certificate store does "lets-encrypt-x1-cross-signed.cer" have to be installed?
You shouldn't need to add "lets-encrypt-x1-cross-signed.cer" to your Windows machine, since it's only an intermediate certificate. And you shouldn't need to add "isrgrootx1.cer" either, since Let's Encrypt certificates chain to "DST Root X3", which is already included with Windows.
Most likely your web server was not configured to send the intermediate certificate. If you're using Certbot, for instance, you'll want to configure your web server using "fullchain.pem" rather than "cert.pem".
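A quick way to verify the chain from Python itself (browsers often fetch missing intermediates on their own, so they can hide the problem) is to attempt a fully verified handshake; a sketch, with the hostname left as a placeholder:

```python
import socket
import ssl

def check_tls(hostname: str, port: int = 443) -> str:
    """Attempt a fully verified TLS handshake against hostname:port.

    Returns the negotiated protocol version; on Python 3.7+ it raises
    ssl.SSLCertVerificationError if the server's chain is incomplete
    or untrusted (the same failure the client application sees).
    """
    ctx = ssl.create_default_context()
    with socket.create_connection((hostname, port), timeout=5) as sock:
        with ctx.wrap_socket(sock, server_hostname=hostname) as tls:
            return tls.version()

# check_tls("myserver.example")  # placeholder hostname
```

If this raises a verification error on a machine with stock root certificates, the server is almost certainly not sending the intermediate.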
I am trying to set up TLS-encrypted connections to a MongoDB database using PyMongo. I have two Python binaries installed at two different locations, but both are version 3.6.8, and for both of them I have installed PyMongo version 4.1.1. I have completed the process of generating the CA keys and server private keys. I then added ca.pem to '/etc/pki/ca-trust/source/anchors/' and ran 'sudo update-ca-trust' to add the certificate authority to the operating-system certificate store. Then I updated the mongod.conf file and restarted the mongod instance. I am able to connect to the mongo shell using this command:
mongo --tls --host='server-host-name'
The main issue is that I am able to connect to the database using one Python binary, but the other gives this error:
[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:852)
error=AutoReconnect('SSL handshake failed:....)]
The output of the command below is:
openssl version -d
OPENSSLDIR: "/etc/pki/tls"
One workaround to make the other Python binary work was to explicitly export the path in an environment variable:
export SSL_CERT_FILE=/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem
But I want the other Python binary to look for the CAs in the appropriate directory automatically.
All these tests are performed locally, not through remote connections (which would require the certificate paths to be specified explicitly). I wanted to understand the internal workings of pymongo.MongoClient for TLS connections in detail: basically, how does it fetch the CA files from the operating-system certificate store?
Also, how do I increase the logging for PyMongo? Can someone help me debug this? I can add additional information if required. Thank you.
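For the internals question: when no CA file is given explicitly, PyMongo effectively falls back on the ssl module's defaults, which come from OpenSSL's compiled-in paths (the OPENSSLDIR printed above) plus the SSL_CERT_FILE / SSL_CERT_DIR environment variables. A minimal probe, worth running under each of the two Python binaries:

```python
import ssl

# Where this interpreter's OpenSSL looks for trusted CAs by default.
paths = ssl.get_default_verify_paths()
print("env var consulted :", paths.openssl_cafile_env)  # usually SSL_CERT_FILE
print("compiled-in cafile:", paths.openssl_cafile)
print("effective cafile  :", paths.cafile)  # None if the file does not exist
print("effective capath  :", paths.capath)
```

If the two binaries were built against different OpenSSL installations, these paths will differ, which would explain why only one of them finds the anchors installed by update-ca-trust.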
I am working on a Python script on Windows 10 that connects to and consumes a Kafka topic. The SSL certificate is installed on the Windows server in .jks format, and the SSL connection to Kafka is possible only with this certificate.
I wanted to know whether there is a way to tell Python to get the certificate from a specific location. Will Python accept a .jks certificate? If not, what options do I have?
Python isn't Java; JKS files really only work within the context of a JVM.
You can use keytool commands to export a PEM certificate from a JKS file for non-Java purposes.
How to convert trust certificate from .jks to .pem?
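Once keytool has exported the certificate (its -rfc flag writes PEM instead of binary DER), the file can be handed to any Python TLS client. A small sanity check, with the exported file name assumed:

```python
def looks_like_pem(path: str) -> bool:
    """Rough check that an exported file is PEM text, not raw DER/JKS bytes."""
    with open(path, "rb") as f:
        return f.read().lstrip().startswith(b"-----BEGIN")

# Once the check passes, the file can be used as a CA bundle, e.g.:
#   ssl.create_default_context(cafile="exported-ca.pem")
```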
I have been trying to connect to a RabbitMQ instance (created through AWS's messaging service, if it matters) via Celery 5.0.5. The connection URL starts as follows: amqps://user:password@.../. I receive the following error when running my Python script:
consumer: Cannot connect to amqps://sessionstackadmin:**@b-0482d011-0cca-40bd-968e-c19d6c85e2a9.mq.eu-central-1.amazonaws.com:5671//: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed
I am running the script from a Docker container with Python 3.6.12. The container has access to the endpoint (at least it can telnet to it). I have the feeling that the Python process does not respect the distro's certificate chain and simply fails to verify the certificate.
I solved it! Celery uses Kombu, which uses py-amqp, and it happens that the latest version of py-amqp, 5.0.3 from Jan 19, is broken.
My GH ticket https://github.com/celery/py-amqp/issues/349
Solution: add amqp==5.0.2 as a hard dependency in your project requirements.
Fix at: git+git://github.com/celery/py-amqp.git#0b8a832d32179d33152d886acd6f081f25ea4bf2
I am leaving the workaround that "fixes" this. For some reason, when handling SSL connections, the kombu library does not respect the default CA certs that come with your distribution. Those defaults are normally applied by https://docs.python.org/3/library/ssl.html#ssl.create_default_context, which the library does not use. Sadly, it does not allow passing in a custom SSLContext, only a set of options that are later passed down to the context. One such option is broker_use_ssl. By setting it to {'ca_certs': '/etc/ssl/certs/ca-certificates.crt'}, it will respect the CA certs from the distribution. (Keep in mind that I am using an Ubuntu/Debian-based image and the CA-certs bundle resides at that path; if you are using another distro, check the proper location for your CA certs.)
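Concretely, the workaround is a few lines of Celery configuration; a sketch (the bundle path is the Debian/Ubuntu location, and cert_reqs is another key kombu forwards to the underlying SSL handshake):

```python
import ssl

# Dict passed as Celery's broker_use_ssl setting in place of a full
# SSLContext; kombu forwards these keys when wrapping the broker socket.
broker_use_ssl = {
    "ca_certs": "/etc/ssl/certs/ca-certificates.crt",  # distro CA bundle
    "cert_reqs": ssl.CERT_REQUIRED,                    # keep verification on
}
```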
My Python application is running locally on http://0.0.0.0:80/.
Later, to enable SSL connections, I generated MyCert.crt and Mycert.key files.
After providing the locations of the certificate and key files, my application works perfectly and runs over an HTTPS connection, as expected.
code snippet:
from flask import Flask

app = Flask(__name__)
context = ("C:/myCert.crt", "C:/Mycert.key")  # (certificate, key) pair
app.run(host="0.0.0.0", port=80, ssl_context=context)
Now, I have imported the same certificate into the Windows certificate manager -> Trusted Root Certification Authorities. It shows the certificate name as localhost.
My goal now is to have the same Python application access and use the certificate from the Windows certificate store.
I looked at a couple of libraries (requests, wincertstore), but I am unable to understand them, as I am new to this domain.
How do I modify my Python code to access this certificate?
You need to change your port first; 443 would be best (as far as we know, HTTPS goes over 443).
Then replace your line with this one:
app.run(host='0.0.0.0', port=443, debug=True, ssl_context=('/home/ubuntu/cert/myCert.pem', '/home/ubuntu/cert/myCert2.pem'))
Read this article; it will help you:
https://blog.miguelgrinberg.com/post/running-your-flask-application-over-https
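If you later need more control than the (cert, key) tuple gives, Flask also accepts a real ssl.SSLContext as ssl_context; a sketch with the same placeholder paths (the load_cert_chain call is commented out because the files must exist):

```python
import ssl

# Server-side context; load_cert_chain would take the cert and key files.
ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
# ctx.load_cert_chain("/home/ubuntu/cert/myCert.pem", "/home/ubuntu/cert/myCert2.pem")
# app.run(host="0.0.0.0", port=443, ssl_context=ctx)
```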
I am running a Docker container with Ubuntu as the base image and am trying to add a new Certificate Authority to the project.
I'm not entirely sure what's failing, but I cannot seem to make it work. I followed the directions on this page: http://manpages.ubuntu.com/manpages/zesty/man8/update-ca-certificates.8.html, adding the CA file to a directory under /usr/share/ca-certificates, specifying the CA files in /etc/ca-certificates.conf, and then running update-ca-certificates, which completes with a message saying that it added 3 new certificates.
However, aiohttp still prints the error:
aiohttp.errors.ClientOSError: [Errno 1] Cannot connect to host www.myserver.com:443 ssl:True [Can not connect to www.myserver.com:443 [[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:719)]]
I was informed that aiohttp doesn't access a certificate store itself, but rather relies on asyncio, which I think was absorbed into Python itself recently. So I don't know whether somewhere along the chain something is using a different certificate store; I would just like to know where I can add my CA files so that they will work with aiohttp.
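One way to rule out a store mismatch is to build the SSLContext yourself, point it at the bundle that update-ca-certificates regenerates, and hand it to aiohttp explicitly. A sketch (the cafile path is the Debian/Ubuntu location, and the connector argument name has varied across aiohttp releases, so both spellings are shown commented out):

```python
import ssl

# Python's default client settings: hostname checking and verification on.
ctx = ssl.create_default_context()
# Add the bundle that update-ca-certificates writes on Debian/Ubuntu:
# ctx.load_verify_locations(cafile="/etc/ssl/certs/ca-certificates.crt")

# Hand the context to aiohttp explicitly:
# connector = aiohttp.TCPConnector(ssl=ctx)          # recent aiohttp
# connector = aiohttp.TCPConnector(ssl_context=ctx)  # older releases
# session = aiohttp.ClientSession(connector=connector)
```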