cURL command in Python Requests

I am trying to write a Python equivalent for the cURL command below using the Requests library. I am not able to find the relevant flags to disable SSL verification and set no proxy.
curl -v -k -T debug.zip https://url-to-no-ssl-server/index.aspx --noproxy url-to-no-ssl-server -X POST -H "filename: debug.zip"
How do I convert this command to python-requests?

This SO Answer shows how to disable proxies:
session = requests.Session()
session.trust_env = False
The requests documentation covers disabling SSL verification:
Requests can also ignore verifying the SSL certificate if you set verify to False:
requests.get('https://kennethreitz.com', verify=False)
<Response [200]>
By default, verify is set to True. The verify option only applies to host certificates.
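Putting the two together, a minimal sketch of the full equivalent (the URL, filename, and the "filename" header are taken from the curl command above):

import requests

session = requests.Session()
session.trust_env = False  # --noproxy: ignore proxy settings from the environment

with open("debug.zip", "rb") as f:
    resp = session.post(
        "https://url-to-no-ssl-server/index.aspx",
        data=f,  # -T: send the raw file as the request body
        headers={"filename": "debug.zip"},
        verify=False,  # -k: skip SSL certificate verification
    )
print(resp.status_code)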

Related

Getting SSLCertVerificationError, but CURL is working

I'm trying to replicate a CURL command using aiohttp in Python.
curl -k -X POST https://XXXXXXXXXXXXXXX.com \
--cert file.cer:pass \
--key file.key \
-u "XXXXXXXXXXX:XXXXXXXXXXX" \
-d "username=XXXXXXXXXXX" \
-d "password=XXXXXXXXXXX" \
-H "Content-Type: application/x-www-form-urlencoded" \
-H "Authorization: Basic XXXXXXXXXXX"
That works fine.
Reading the docs here, https://docs.aiohttp.org/en/stable/client_advanced.html#ssl-control-for-tcp-sockets, I create an SSL context like so, using Python's ssl module:
import ssl
ssl_context = ssl.create_default_context()
ssl_context.load_cert_chain(
    certfile=CERT_PATH,
    keyfile=KEY_PATH,
    password=KEY_PASS
)
Then, with aiohttp, when executing a request:
async with session.post(url, json=body, ssl=ssl_context) as r:
    ...
And I get this:
ssl.SSLCertVerificationError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: self signed certificate in certificate chain (_ssl.c:997)
Any ideas why?
Update:
Solved this by following the answer here: How can I create a Python SSL Client/Server pair where only the server authenticates the client
Basically, I replaced ssl.create_default_context() with ssl.SSLContext().
The only thing is that now I get warnings:
DeprecationWarning: ssl.SSLContext() without protocol argument is deprecated.
DeprecationWarning: ssl.PROTOCOL_TLS is deprecated
But that is a problem for another day. I'm open to hearing more suggestions, though.
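A sketch of a non-deprecated alternative, assuming the goal is the same as curl -k (accept the self-signed chain): keep ssl.create_default_context() but switch verification off explicitly. CERT_PATH, KEY_PATH, and KEY_PASS are the same names as in the snippet above.

import ssl

ssl_context = ssl.create_default_context()
ssl_context.check_hostname = False        # curl -k: do not check the hostname
ssl_context.verify_mode = ssl.CERT_NONE   # curl -k: do not verify the server cert
ssl_context.load_cert_chain(
    certfile=CERT_PATH,
    keyfile=KEY_PATH,
    password=KEY_PASS,
)

Note that check_hostname must be set to False before verify_mode, since a context with hostname checking enabled refuses CERT_NONE.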

can connect to URL with curl but not with requests (i.e. requests ignoring my CA bundle?)

I am able to connect to a certain URL with cURL, after I installed the corresponding SSL certificates:
$ export MY_URL=https://www.infosubvenciones.es/bdnstrans/GE/es/convocatoria/616783
$ curl -vvvv $MY_URL # Fails
$ sudo openssl x509 -inform pem -outform pem -in /tmp/custom-cert.pem -out /usr/local/share/ca-certificates/custom-cert.crt
$ sudo update-ca-certificates
$ curl -vvvv $MY_URL # OK
However, requests (or httpx, or any other library I use) refuses to do so:
In [1]: import os
...: import requests
...: requests.get(os.environ["MY_URL"])
---------------------------------------------------------------------------
SSLCertVerificationError Traceback (most recent call last)
...
SSLCertVerificationError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:997)
My understanding is that requests uses certifi and as such these custom certificates are not available here:
In [1]: import certifi
In [2]: certifi.where()
Out[2]: '/tmp/test_ca/.venv/lib/python3.10/site-packages/certifi/cacert.pem'
I have already tried a number of things, like trying to use the system CA bundle:
- export REQUESTS_CA_BUNDLE=/etc/ssl/certs/ca-certificates.crt (same error)
- requests.get(..., verify="/etc/ssl/certs/ca-certificates.crt") (same error)
- switching to httpx + a custom SSL context as explained in the docs (same error)
- attempting truststore as discussed in this httpx issue (same error)
How can I make Python (requests, httpx, raw ssl, anything) use the same certificates that cURL is successfully using?
The only thing that worked so far, inspired by this hackish SO answer, is to do verify=False. But I don't want to do that.
In [9]: requests.get(
...: my_url,
...: verify=False,
...: )
/tmp/test_ca/.venv/lib/python3.10/site-packages/urllib3/connectionpool.py:1043: InsecureRequestWarning: Unverified HTTPS request is being made to host 'xxx'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings
I tried your stuff on my system (Manjaro Linux, Python 3.10) and I can make a connection. I downloaded the complete certificate chain from the website (with my browser). After that I can use it with:
r = requests.get(url=URL, verify=<path to pem file>)
and with
export REQUESTS_CA_BUNDLE=<path to pem>
r = requests.get(url=URL)
I tried the export within PyCharm.
So the Python side is working, and you may have a problem in your certificates. Without this I get the SSL error (of course), because Python does not use the system certs, as you correctly mentioned. In my PEM file I have 3 certificates. Maybe you have only 1 and the others are in the global store, so curl does not need the complete chain but Python does. You should try to download the complete chain with your browser and try again.
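A quick way to check whether a given bundle can validate the server at all, independent of requests and certifi, is to hand it straight to Python's ssl module. A sketch, with the host taken from the failing URL and the bundle path as an example:

import socket
import ssl

HOST = "www.infosubvenciones.es"               # host from the failing URL
BUNDLE = "/etc/ssl/certs/ca-certificates.crt"  # bundle under test

ctx = ssl.create_default_context(cafile=BUNDLE)
with socket.create_connection((HOST, 443)) as sock:
    with ctx.wrap_socket(sock, server_hostname=HOST) as tls:
        # If this prints, the bundle contains everything needed to build the chain.
        print(tls.version(), tls.getpeercert()["subject"])

If this raises the same SSLCertVerificationError, the bundle itself is missing a certificate in the chain, which points back at the incomplete-chain explanation above.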

ERROR: Proxy URL had no scheme. However, URL & Proxies are properly setup

I'm getting the error:
urllib3.exceptions.ProxySchemeUnknown: Proxy URL had no scheme, should start with http:// or https://
but the proxies are fine & so is the URL.
URL = f"https://google.com/search?q={query2}&num=100"
mysite = self.listbox.get(0)
headers = {"user-agent": USER_AGENT}
while True:
    proxy = next(proxy_cycle)
    print(proxy)
    proxies = {"http": proxy, "https": proxy}
    print(proxies)
    resp = requests.get(URL, proxies=proxies, headers=headers)
    if resp.status_code == 200:
        break
Print results:
41.139.253.91:8080
{'http': '41.139.253.91:8080', 'https': '41.139.253.91:8080'}
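The printout itself shows the problem: the proxy strings are bare host:port pairs, and newer urllib3 versions require an explicit scheme on the proxy URL. A minimal sketch, reusing URL, headers, and proxy_cycle from the snippet above:

proxy = next(proxy_cycle)  # e.g. "41.139.253.91:8080"
proxies = {
    "http": f"http://{proxy}",
    "https": f"http://{proxy}",  # HTTPS requests still tunnel through an http:// proxy URL
}
resp = requests.get(URL, proxies=proxies, headers=headers)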
On Linux, unset http_proxy and https_proxy in the terminal, at the current location of your project:
unset http_proxy
unset https_proxy
I had the same problem, and setting the https_proxy variable in my terminal really helped me. You can set it as follows:
set HTTPS_PROXY=http://username:password@proxy.example.com:8080
set https_proxy=http://username:password@proxy.example.com:8080
Where proxy.example.com is the proxy address (in my case it is "localhost") and 8080 is the port.
You can figure out your username by typing echo %username% in your command line. As for the proxy server, on Windows, go to "Internet Options" -> "Connections" -> "LAN Settings" and tick "Use a proxy server for your LAN". There you can find your proxy address and port.
An important note here: if you're using PyCharm, try running your script from the terminal first. You may get the same error if you just run the file with the Run button, but using the terminal may help you get rid of this error.
P.S. Also, you can try to downgrade your pip to 20.2.3 as it may help you too.
I was having the same issue. I resolved it by upgrading the requests library in Python 3:
pip3 install --upgrade requests
I think it was related to a lower version of the requests library conflicting with a higher version of Python 3.

Certificate check fails with Python requests

I'm attempting to load a website using the Python requests package via an http/https proxy:
url = 'https://booster-magazine.ch'  # Example of failing URL
proxies = {
    'http': 'proxy.domain.internal:4321',
    'https': 'proxy.domain.internal:4321',
}
r = requests.get(url, proxies=proxies)
Running this triggers the following error message
requests.exceptions.SSLError: ("bad handshake: Error([('SSL Routine', 'ssl3_get_server_certificate', 'certificate verify failed')],)",)
Web browsers indicate that the site's certificate is trusted, and cURL on the command line, using the same web proxy as the Python code, works just fine.
curl -x proxy.domain.internal:4321 https://booster-magazine.ch # No error
Some sites I have tested fail, while others work fine. As far as I can tell, the SSL Labs report for this failing site doesn't have any issues.
Package versions I'm using:
requests==2.23.0
certifi==2020.4.5.1
I'm aware of the verify=False option in the requests library. That is generally bad practice and opens the door to MITM attacks. The goal is to have working SSL validation.
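One avenue worth trying, given the pinned requests==2.23.0 and certifi==2020.4.5.1: servers that fail only from Python while working in browsers and cURL often omit an intermediate certificate that the system store happens to have but certifi's bundle does not. A sketch, not a confirmed fix; the bundle path is hypothetical:

import requests

proxies = {
    "http": "http://proxy.domain.internal:4321",
    "https": "http://proxy.domain.internal:4321",
}

# Hypothetical bundle: certifi's cacert.pem plus the site's intermediate
# certificate(s), concatenated into a single PEM file.
r = requests.get(
    "https://booster-magazine.ch",
    proxies=proxies,
    verify="/path/to/bundle-with-intermediates.pem",
)
print(r.status_code)

Upgrading requests and certifi first is the cheaper experiment, since a newer bundle may already contain the missing certificate.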

How do I send authenticated https requests using Python?

What is the equivalent Python to the following curl command?
curl https://www.streak.com/api/v1/boxes/{Keyhere}/ -u passwordhere:
I tried this but I get 400 errors:
requests.get('https://www.streak.com/api/v1/boxes/{Key}/threads -u PASSWORD:', auth=HTTPBasicAuth('USER', 'pass'))
This type of authentication is very common and the requests documentation covers it.
requests.get('https://www.streak.com/api/v1/boxes/{Key}/threads', auth=(YOUR_API_KEY, ''))
You need to remove the -u PASSWORD: from your URL.
An alternative is to use the streak_client package on PyPI or GitHub.
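Since the question already imports HTTPBasicAuth, here is the same call written with it explicitly (a sketch; YOUR_API_KEY stands in for the real key, and {Key} stays as the placeholder from the question):

import requests
from requests.auth import HTTPBasicAuth

# curl's `-u key:` means: username = API key, password = empty string.
resp = requests.get(
    "https://www.streak.com/api/v1/boxes/{Key}/threads",
    auth=HTTPBasicAuth("YOUR_API_KEY", ""),
)
print(resp.status_code)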
