Certificate check fails with Python requests

I'm attempting to load a website using the Python requests package via an HTTP/HTTPS proxy:

import requests

url = 'https://booster-magazine.ch'  # Example of a failing URL
proxies = {
    'http': 'proxy.domain.internal:4321',
    'https': 'proxy.domain.internal:4321',
}
r = requests.get(url, proxies=proxies)
Running this triggers the following error message:
requests.exceptions.SSLError: ("bad handshake: Error([('SSL routines', 'ssl3_get_server_certificate', 'certificate verify failed')],)",)
Web browsers indicate that the site's certificate is trusted, and cURL on the command line using the same web proxy as the Python code works just fine:
curl -x proxy.domain.internal:4321 https://booster-magazine.ch # No error
Some sites I have tested fail, while others work fine. As far as I can tell, the SSL Labs report for this failing site doesn't show any issues.
Package versions I'm using:
requests==2.23.0
certifi==2020.4.5.1
I'm aware of the verify=False option in the requests library. Disabling verification is generally bad practice and opens the door to man-in-the-middle (MITM) attacks. The goal is to have working SSL validation.
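When a site verifies in browsers and cURL but fails in requests, a common culprit is a server that omits an intermediate certificate: browsers (and sometimes the certificate store cURL uses) can fill the gap, while requests, which relies on certifi's bundle alone, cannot. A minimal sketch of the usual fix, assuming you export the site's full chain yourself (chain.pem is a hypothetical file, e.g. saved from a browser or via openssl s_client -showcerts):

import requests

url = 'https://booster-magazine.ch'
proxies = {
    'http': 'proxy.domain.internal:4321',
    'https': 'proxy.domain.internal:4321',
}
# chain.pem is a hypothetical bundle holding the site's full chain,
# including any intermediate the server fails to send.
r = requests.get(url, proxies=proxies, verify='chain.pem')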

Related

can connect to URL with curl but not with requests (i.e. requests ignoring my CA bundle?)

I am able to connect to a certain URL with cURL, after I installed the corresponding SSL certificates:
$ export MY_URL=https://www.infosubvenciones.es/bdnstrans/GE/es/convocatoria/616783
$ curl -vvvv $MY_URL # Fails
$ sudo openssl x509 -inform pem -outform pem -in /tmp/custom-cert.pem -out /usr/local/share/ca-certificates/custom-cert.crt
$ sudo update-ca-certificates
$ curl -vvvv $MY_URL # OK
However, requests (or httpx, or any other Python library I try) refuses to connect:
In [1]: import os
   ...: import requests
   ...: requests.get(os.environ["MY_URL"])
---------------------------------------------------------------------------
SSLCertVerificationError Traceback (most recent call last)
...
SSLCertVerificationError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:997)
My understanding is that requests uses certifi, and as such these custom certificates are not available here:
In [1]: import certifi
In [2]: certifi.where()
Out[2]: '/tmp/test_ca/.venv/lib/python3.10/site-packages/certifi/cacert.pem'
I have already tried a number of things, such as using the system CA bundle:

export REQUESTS_CA_BUNDLE=/etc/ssl/certs/ca-certificates.crt (same error)
requests.get(..., verify="/etc/ssl/certs/ca-certificates.crt") (same error)
switching to httpx plus a custom SSL context as explained in the docs (same error)
attempting truststore as discussed in this httpx issue (same error)
How can I make Python (requests, httpx, raw ssl, anything) use the same certificates that cURL is successfully using?
The only thing that worked so far, inspired by this hackish SO answer, is to do verify=False. But I don't want to do that.
In [9]: requests.get(
   ...:     my_url,
   ...:     verify=False,
   ...: )
/tmp/test_ca/.venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1043: InsecureRequestWarning: Unverified HTTPS request is being made to host 'xxx'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings
I tried your code on my system (Manjaro Linux, Python 3.10) and I can make a connection. I downloaded the complete certificate chain from the website (with my browser). After that, I can use it with:
r = requests.get(url=URL, verify=<path to pem file>)
and with
export REQUESTS_CA_BUNDLE=<path to pem>
r = requests.get(url=URL)
I tried the export within PyCharm.
So the Python side is working, and you may have a problem with your certificates. Without this step I get the SSL error too (of course), because, as you correctly noted, Python does not use the system certs. My PEM file contains 3 certificates. Maybe yours has only 1 and the others are in the global store, so that curl does not need the complete chain while Python does. You should try to download the complete chain with your browser and try again.
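If you would rather keep certifi's default roots and simply add the custom certificate, a minimal sketch of the same idea in Python (the file paths here are examples, not from the original question):

import os
import certifi
import requests

# Build a combined bundle: certifi's default roots plus the custom cert.
with open(certifi.where(), 'rb') as f:
    bundle = f.read()
with open('/tmp/custom-cert.pem', 'rb') as f:  # example path to the custom cert
    bundle += b'\n' + f.read()
with open('/tmp/combined.pem', 'wb') as f:
    f.write(bundle)

# Point requests at the combined bundle instead of the default one.
r = requests.get(os.environ["MY_URL"], verify='/tmp/combined.pem')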

SSLError while running python script from console

I have a problem with following script:
import requests
path = 'https://www.google.com/'
r = requests.get(path)
print(r.status_code)
When I run this code in the Spyder IDE, it works properly.
When I run it from the console with the commands:
activate my_env
python script.py
It also works.
But when I run it from the console with the command:
C:\Users\user\AppData\Local\conda\conda\envs\my_env\python.exe script.py
It gives me this error:
requests.exceptions.SSLError: HTTPSConnectionPool(host='www.google.com', port=443): Max retries exceeded with url: / (Caused by SSLError("Can't connect to HTTPS URL because the SSL module is not available."))
I tried adding a proxies parameter and passing a certificate location to the verify parameter, but nothing seems to help.
I am using Python 3.7 and my working environment is Windows Server 2012.
I was able to find the answer; I can't believe I hadn't seen it before.
In short, copy the files libcrypto-1_1-x64.* and libssl-1_1-x64.* from D:\Anaconda3\Library\bin to D:\Anaconda3\DLLs.
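For reference, a sketch of that fix as console commands (assuming Anaconda lives in D:\Anaconda3, as in the answer):

copy D:\Anaconda3\Library\bin\libcrypto-1_1-x64.* D:\Anaconda3\DLLs\
copy D:\Anaconda3\Library\bin\libssl-1_1-x64.* D:\Anaconda3\DLLs\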

Ubuntu Verifies SSL Cert, but Python does not: requests.exceptions.SSLError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:727)

I am hosting a site using SSL / HTTPS, and am attempting to make a request to it from a Python 2.7 script on the server (Ubuntu 18.04).
When running the script, I get this error:
requests.exceptions.SSLError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:727)
However, when I run curl --verbose -X GET -I <url> on the same server, it says the certificate was verified.
I do know that the cert is in fact valid and is not a self-signed cert.
Any ideas on what I can do to get python to accept that cert?
Edit: here's the code to trigger the issue. Note that I'm not including the URL as it is not accessible to the general public:
import requests
r = requests.get('https://www.example.org')
print r.status_code
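A hedged first check (my suggestion, not from the original thread): point requests at Ubuntu's system CA bundle instead of the bundled certifi one. If this succeeds, the certificate that curl trusts lives in the system store but not in the bundle Python was reading:

import requests

# /etc/ssl/certs/ca-certificates.crt is Ubuntu's system CA bundle.
r = requests.get('https://www.example.org',
                 verify='/etc/ssl/certs/ca-certificates.crt')
print r.status_code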

Python Requests 'Permission denied' but works with sudo

I am using the Python Requests library to connect to a REST server, using .pem certificates to authenticate myself and establish a connection so I can start collecting data, parsing it, etc. When I run my program via Eclipse or the terminal, I get this error:
[('system library', 'fopen', 'Permission denied'), ('BIO routines', 'FILE_CTRL', 'system lib'), ('SSL routines', 'SSL_CTX_use_certificate_file', 'system lib')]
However, if I run as sudo, the requests library works as intended and I am able to retrieve the data. Unfortunately, running as sudo has the side effect that the default Python interpreter becomes root's interpreter, which is Python 2, while a lot of the library dependencies I need come from Anaconda.
For context, here is the function I am using to establish a connection:
import os
import json

PEM_FILE = os.path.expanduser("path/to/pem/file.pem")  # Path is to a folder at root level

def set_up_connection(self):
    # URLs to request a connection with
    rest_auth = 'https://www.restwebsite.com/get/data'
    ip_address = self.get_ip_address()
    body = json.dumps({'userid': 'user', 'password': 'pass', 'ip_address': ip_address})
    try:
        resp = self.session.post(rest_auth, data=body, cert=PEM_FILE, verify=False)
        values = resp.json()
        token = values['token']
    except Exception as e:
        print(e)
        token = None
    return token, ip_address
TL;DR: Running 'python rest_connector.py' raises the error; running the same command with sudo works.
Context for the certificates: the .pem cert's permissions are set to 600 (rw-------).
To work around running the whole script as sudo, I opened a terminal and ran 'sudo -Es', which makes the terminal run as root while keeping Anaconda as my default interpreter, but then I end up with a handshake error:
[('SSL routines', 'ssl3_read_bytes', 'tlsv1 alert unknown ca'), ('SSL routines', 'ssl3_read_bytes', 'ssl handshake failure')]
If someone can help me solve it this way, that would be a nice temporary fix, but I still need to be able to run this without sudo.
Thanks in advance.
The user running the script needs to be able to read the file. You can check with ls -l path_to_file.pem.
If you have changed ownership of the file, you might still be missing execute permission on the directories containing it.
You can potentially fix that with chmod -R +x ~/path/to_directory_containing_pem
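A quick way to spot which component along the path lacks permissions (namei is part of util-linux; the path is a placeholder):

namei -l ~/path/to/file.pem

This prints the owner, group, and mode of every directory leading to the file, so a missing execute bit stands out immediately.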
OK, so I managed to solve this, and I'll post what I did in case anyone else stumbles upon a similar problem.
With permissions set to 600 for the certs and the .pem file, ownership set to root, and the openssl hashing step done, the only remaining problem was where the certs were placed.
I had placed the certs in '/etc/pki/tls/certs', but they actually belonged in '/etc/ssl/certs'. The same goes for the .pem file, except that it belongs in the restricted 'private' folder. After moving the files to the correct folders, I was able to set the verify param of the request to the cert path, and everything worked like I needed it to.
resp = self.session.post(rest_auth, data=body, cert=PEM_FILE, verify=cert_path)
'/etc/pki/tls/certs' is the directory on the Fedora distribution of Linux; '/etc/ssl/certs' is the directory on the Ubuntu distribution.
I hope this helps someone.

Curl command in Requests Python

I am trying to write a Python equivalent of the cURL command below using the Requests library. I am not able to find the relevant flags to disable SSL verification and to set no proxy.
curl -v -k -T debug.zip https://url-to-no-ssl-server/index.aspx --noproxy url-to-no-ssl-server -X POST -H "filename: debug.zip"
How do I convert this command to python-requests?
This SO Answer shows how to disable proxies:
session = requests.Session()
session.trust_env = False
The requests documentation covers disabling SSL verification:
Requests can also ignore verifying the SSL certificate if you set verify to False:
requests.get('https://kennethreitz.com', verify=False)
<Response [200]>
By default, verify is set to True. The verify option only applies to host certs.
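Putting both pieces together, a minimal sketch of the whole cURL command in requests (the URL and header come from the question; curl's -T combined with -X POST is approximated here by streaming the file as the POST body):

import requests

session = requests.Session()
session.trust_env = False  # like --noproxy: ignore proxy environment variables
with open('debug.zip', 'rb') as f:
    resp = session.post(
        'https://url-to-no-ssl-server/index.aspx',
        data=f,                             # upload the file as the request body
        headers={'filename': 'debug.zip'},  # -H "filename: debug.zip"
        verify=False,                       # -k: skip SSL verification
    )
print(resp.status_code)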
