I'm trying to pull data from an API that is secured by SSL, using a Python script. Beforehand I have to convert a .p12 file to an OpenSSL (PEM) certificate. With the following code it works just fine:
# ----- SCRIPT 1 -----
import contextlib
import json
import tempfile
import OpenSSL.crypto
import requests

@contextlib.contextmanager
def pfx_to_pem(pfx_path, pfx_password):
    ''' Decrypts the .pfx file to be used with requests. '''
    with tempfile.NamedTemporaryFile(suffix='.pem') as t_pem:
        f_pem = open(t_pem.name, 'wb')
        pfx = open(pfx_path, 'rb').read()
        p12 = OpenSSL.crypto.load_pkcs12(pfx, pfx_password)
        f_pem.write(OpenSSL.crypto.dump_privatekey(OpenSSL.crypto.FILETYPE_PEM, p12.get_privatekey()))
        f_pem.write(OpenSSL.crypto.dump_certificate(OpenSSL.crypto.FILETYPE_PEM, p12.get_certificate()))
        ca = p12.get_ca_certificates()
        if ca is not None:
            for cert in ca:
                f_pem.write(OpenSSL.crypto.dump_certificate(OpenSSL.crypto.FILETYPE_PEM, cert))
        f_pem.close()
        yield t_pem.name
# read some config
with open('config.json') as config_json:
    config = json.load(config_json)

api_url = config['api_url']
cert = config['cert']
cert_pem_path = cert['file']
cert_key_file = cert['pass']

# make the request
with pfx_to_pem(cert_pem_path, cert_key_file) as cert:
    r = requests.get(api_url, cert=cert)
Because I also use the same certificate to authenticate my Flask web service towards the server, I split the PEM into three separate files:
# ----- SCRIPT 1 -----
# get certificate
f_pem.write(OpenSSL.crypto.dump_certificate(
    OpenSSL.crypto.FILETYPE_PEM, p12.get_certificate())
)
# get keyfile
f_key.write(OpenSSL.crypto.dump_privatekey(
    OpenSSL.crypto.FILETYPE_PEM, p12.get_privatekey())
)
# get CA_BUNDLE
ca = p12.get_ca_certificates()
if ca is not None:
    for cert in ca:
        f_ca.write(
            OpenSSL.crypto.dump_certificate(
                OpenSSL.crypto.FILETYPE_PEM, cert
            ))
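Roughly, the surrounding code looks like this (condensed sketch; the function name and output file names here are just placeholders, not the ones from my config):

import OpenSSL.crypto

def pfx_to_pem_files(pfx_path, pfx_password):
    ''' Sketch: split a .pfx into separate cert, key and CA bundle files. '''
    pfx = open(pfx_path, 'rb').read()
    p12 = OpenSSL.crypto.load_pkcs12(pfx, pfx_password)
    with open('cert.pem', 'wb') as f_pem, \
         open('cert.key', 'wb') as f_key, \
         open('ca_bundle.pem', 'wb') as f_ca:
        # get certificate
        f_pem.write(OpenSSL.crypto.dump_certificate(
            OpenSSL.crypto.FILETYPE_PEM, p12.get_certificate()))
        # get keyfile
        f_key.write(OpenSSL.crypto.dump_privatekey(
            OpenSSL.crypto.FILETYPE_PEM, p12.get_privatekey()))
        # get CA_BUNDLE
        ca = p12.get_ca_certificates()
        if ca is not None:
            for cert in ca:
                f_ca.write(OpenSSL.crypto.dump_certificate(
                    OpenSSL.crypto.FILETYPE_PEM, cert))
    return 'cert.pem', 'cert.key', 'ca_bundle.pem'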
Then I'm running the web service with the following code:
# ----- SCRIPT 2 -----
context = ssl.SSLContext(ssl.PROTOCOL_TLSv1_2)
context.verify_mode = ssl.CERT_REQUIRED        # require a client certificate
context.load_verify_locations(cert_ca)         # CA bundle used to verify client certs
context.load_cert_chain(cert_pem, cert_key)    # the service's own certificate and key
app.run(ssl_context=context, host='0.0.0.0')
and changed the requests call to
# ----- SCRIPT 1 -----
r = requests.get(api_url, cert=(cert_pem, cert_key), verify=cert_ca)
When trying to pull data from the API I get the error
requests.exceptions.SSLError: HTTPSConnectionPool(host='some.host', port=443): Max retries exceeded with url: /some/path/var?ID=xxxxxx (Caused by SSLError(SSLError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:847)'),))
Question 1: What am I doing wrong creating the CA_BUNDLE?
Question 2: Am I handling the creation of the web service correctly? My goal is to verify my server against the server holding the data to eventually be able to receive the data by push request.
EDIT: when connecting to my web service (in a browser) I receive the warning that the connection is not secure, because the certificate is not valid, despite the fact that I imported the .p12 certificate into my browser.
I'm using the requests and json libraries to call the API. In my case I can set up the request to ignore the certificate, and this quickly solved my issue:
requests.get(url, headers=headers, verify=False)
The argument verify=False skips certificate verification, but when you run your code it will print a warning saying the certificate was not verified. You can add this other piece of code so the warning is not shown:
import urllib3
urllib3.disable_warnings(urllib3.exceptions.InsecureRequestWarning)
I know this doesn't answer your question directly, but maybe you can try it to see whether you are able to get the information without the certificate check.
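Put together, a minimal version of that workaround (debugging only, since it disables verification entirely) would be:

import requests
import urllib3

# Skip certificate verification just to check that the endpoint itself responds.
urllib3.disable_warnings(urllib3.exceptions.InsecureRequestWarning)

r = requests.get('https://some.host/some/path', verify=False)
print(r.status_code)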
Related
I am trying to create a little Python script that sends data to a server using the requests module. To make it a bit more secure I want to use self-signed certificates made in a program called XCA. When using the certificates in the browser everything works and is secure, and sending a request with the certificates from Postman works as well. But the Python script does not work, or cannot get to the certificates.
I have tried to include the CA with the verify argument as well as adding the certificates with the cert argument.
When using the verify argument (as seen below) with the CA, I get the error message 'Remote end closed connection without response'. This message seems to appear every time I add the CA to the script.
When I use only the cert argument I get this error message: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:997). I searched for a solution and it seemed that the CA should be included in certifi's cacert.pem file. I did that, but then I also get the 'Remote end closed connection' message.
Code with 'verify':
import requests
import json

url = "https://ip/tapi"
payload = json.dumps({"command": "GetUserList"})
headers = {'content-type': 'application/json',
           'X-TAPI': '',
           'Authorization': 'Basic ',
           'connection': 'keep-alive'}

r = requests.request("POST", url, headers=headers, data=payload, verify='Tbox_CA.crt')
print(r.text)
Code with 'cert':
import requests
import json

cert_file_path = 'HTTPS_client.crt'
key_file_path = 'HTTPS_client_key.pem'

url = "https://ip/tapi"
payload = json.dumps({"command": "GetUserList"})
headers = {'content-type': 'application/json',
           'X-TAPI': '',
           'Authorization': 'Basic ',
           'connection': 'keep-alive'}

cert = cert_file_path, key_file_path
r = requests.request("POST", url, headers=headers, data=payload, cert=cert)
print(r.text)
When using the verify argument (as seen below) with the CA I get the error message 'Remote end closed connection without response'. This message seems to appear every time I add the CA to the script.
Which means that SSL works, the HTTP request was sent, but then the server closes the connection without sending a response. SSL is not the problem here and the certificate validation worked.
When I use the cert argument I get this error message: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:997).
The cert argument is not useful here, since it is about providing a client certificate in case the server requests one. Your server does not request a client certificate, so providing this argument is basically the same as not providing it. And since no useful CA is given, the certificate validation fails. Because SSL has thus already failed, the real HTTP request inside the SSL connection is never sent, i.e. the 'Remote end closed connection' error you get when giving a CA is simply masked, because this error comes earlier.
Why the request failed from Python but not from the browser or Postman is unknown. There might be some server side bot protection implemented, but this is just a guess since there is nothing known about the actual URL you access.
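For reference, this is how the two arguments are usually combined; just a sketch reusing the file names from your snippets, and it only becomes meaningful once the server actually requests a client certificate:

import requests
import json

# verify: the CA used to validate the *server's* certificate
# cert:   the *client* certificate and key, sent only if the server asks for one
r = requests.request(
    "POST",
    "https://ip/tapi",
    headers={'content-type': 'application/json'},
    data=json.dumps({"command": "GetUserList"}),
    verify='Tbox_CA.crt',
    cert=('HTTPS_client.crt', 'HTTPS_client_key.pem'),
)
print(r.status_code, r.text)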
The program I am developing posts documents to a bank's DMS server. They provided me a server certificate in .cer format, which I pass as the verify argument in my code. They also provided a client id and secret, which I have to embed in the headers. I generated a self-signed client certificate and private key, and gave them the client certificate (in .cer format) and the public key. In the code I pass the paths of the client certificate and private key in the cert tuple.
Upon executing the code, I am getting this error:
HTTPSConnectionPool(host='apimuat.xxxbank.com', port=9095): Max retries exceeded with url: /doc-mgmt/v1/uploadDoc (Caused by NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7fb01bd8a160>: Failed to establish a new connection: [Errno 60] Operation timed out'))
File "/Users/fpl_mayank/Documents/FPL/python-virtual-env/uploadDocApi/server.py", line 164, in main
result = requests.post(url,
File "/Users/fpl_mayank/Documents/FPL/python-virtual-env/uploadDocApi/server.py", line 189, in <module>
main()
I have tested it with 'https://postman-echo.com/post', without specifying cert or verify, just to check whether my request goes through at all; it works fine there.
This is the code snippet where I use the requests functions.
url = 'https://apimuat.xxxbank.com:9095/doc-mgmt/v1/uploadDoc'
headers = {"Content-Type": "application/json",
           "client_id": "af197b22539647fba4db8b971b43e38",
           "client_secret": "c1AA406e24074d8887954472C78a924"}
data = req

result = requests.post(url,
                       data=data,
                       headers=headers,
                       cert=('/Users/fpl_mayank/Documents/FPL/python-virtual-env/uploadDocApi/keystore/dms_csr_certificate_self.cer',
                             '/Users/fpl_mayank/Documents/FPL/python-virtual-env/uploadDocApi/keystore/dms_private_key.key'),
                       verify='/Users/fpl_mayank/Documents/FPL/python-virtual-env/uploadDocApi/truststore/APIM-UAT.cer')
res = result.json()
The API documentation mentions that 2-way SSL authentication is implemented between client and server. I have also made a virtualenv for this program. Please help; I am the first one in my company to call an API from Python, so the only way to get my issue resolved is through good ol' Stack Overflow.
I solved this, though I don't know exactly which change did it. When working with APIs, make sure the endpoint's IP is whitelisted from your network (and yours from theirs) if that is required. Also, I was sending a formatted JSON request with indentation and spaces, so make sure to keep the JSON on one line.
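On the last point, json.dumps controls that formatting directly; a small illustration (the payload content here is made up):

import json

payload = {"command": "uploadDoc", "items": [1, 2, 3]}

pretty = json.dumps(payload, indent=4)                  # multi-line, indented
compact = json.dumps(payload, separators=(',', ':'))    # single line, no extra spaces

print(pretty)
print(compact)   # {"command":"uploadDoc","items":[1,2,3]}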
Using curl I can connect to a server that needs a specific client certificate:
curl -E ./file.crt.pem --key ./file.key.pem -k https://server.url
curl version: 7.29.0
But when using Python's requests library, I get an error:
import requests
cert_file_path = "file.crt.pem"
key_file_path = "file.key.pem"
cert = (cert_file_path, key_file_path)
url = 'https://server.url'
r = requests.post(url, cert=cert, verify=False)
Error:
SSLError(SSLError("bad handshake: Error([('SSL routines', 'ssl3_read_bytes', 'tlsv1 alert unknown ca')])"))
Python version: v3.7
What am I missing?
A comment on this answer helped me figure this out.
Update your code as such:
import requests
cert_file_path = "file.crt.pem"
key_file_path = "file.key.pem"
cert = (cert_file_path, key_file_path)
url = 'https://server.url'
r = requests.post(url, cert=cert, verify="path/to/ca_public_keys.pem") # replace with your file
I'm assuming you're using a self-signed certificate, so you need to specify the .pem file containing the public certificates of the CA that issued your self-signed certificate. Make sure to include the intermediate certificates, otherwise the requests library will throw the tlsv1 alert unknown ca error.
You can check the issuer of your client certificate by typing openssl x509 -noout -in file.crt.pem -issuer in a terminal.
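If you need to build that bundle yourself, concatenating the PEM files is usually enough; a sketch, where the input file names are placeholders for your CA and any intermediates:

# Concatenate the issuing CA and any intermediate certificates into one
# bundle that can be passed to requests via verify=.
ca_parts = ['root_ca.pem', 'intermediate_ca.pem']   # placeholder file names

with open('ca_public_keys.pem', 'wb') as bundle:
    for part in ca_parts:
        with open(part, 'rb') as f:
            bundle.write(f.read())
            bundle.write(b'\n')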
The requests module checks the environment variable REQUESTS_CA_BUNDLE for a CA bundle file. So just do this:
export REQUESTS_CA_BUNDLE=/absolute/path/to/your/file.crt.pem
Your Python code will then simply be:
import requests
url = 'https://server.url'
r = requests.post(url)
print(r.text)
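If you would rather not depend on the shell environment, the same variable can be set from inside the script before the request is made (a sketch under the same assumption about the file path):

import os
import requests

# requests consults REQUESTS_CA_BUNDLE when the request is made, so setting
# it here has the same effect as exporting it in the shell.
os.environ['REQUESTS_CA_BUNDLE'] = '/absolute/path/to/your/file.crt.pem'

r = requests.post('https://server.url')
print(r.text)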
I have generated the following self-signed certificates for my server and client.
I created ca.crt and ca.key. Using ca.crt and ca.key, I created server.crt and server.key for the server, and client.crt and client.key for the client.
I am using the Python requests library as the client. Below is the code snippet:
import json
import requests

cert = ("/home/tests/certs/client.crt",
        "/home/tests/certs/client.key")


class TestCart():
    def test_cart(self, **kwargs):
        url = "https://192.168.X.Y/cart"
        cart_data = {
            'id': kwargs.get('id'),
            'items': kwargs.get('items')
        }
        req_data = json.dumps(cart_data)
        resp = requests.post(url,
                             data=req_data,
                             verify="/home/certs/ca.cert",
                             cert=cert)
        print resp.text


if __name__ == '__main__':
    t_cart = TestCart()
    data = {'id': 'ba396e79-0f0f-4952-a931-5a528c9ff72c', 'items': []}
    t_cart.test_cart(**data)
This gives the exception:
requests.exceptions.SSLError: HTTPSConnectionPool(host='192.168.X.Y',
port=443): Max retries exceeded with url: /cart (Caused by
SSLError(SSLError(1, u'[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify
failed (_ssl.c:590)'),))
If I use verify=False the code works, but I want to verify. What should the value of verify be in my request?
It is highly recommended to have a deeper look at the excellent documentation for requests. It has a special chapter about SSL Cert Validation which explains:
You can pass verify the path to a CA_BUNDLE file or directory with certificates of trusted CAs:
>>> requests.get('https://github.com', verify='/path/to/certfile')
Assuming that your server certificate was signed by your ca.crt, you should pass that file as the verify parameter.
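Applied to the code in the question, that would look roughly like this (the exact location of ca.crt is an assumption):

import requests
import json

cert = ("/home/tests/certs/client.crt",
        "/home/tests/certs/client.key")

resp = requests.post(
    "https://192.168.X.Y/cart",
    data=json.dumps({'id': 'ba396e79-0f0f-4952-a931-5a528c9ff72c', 'items': []}),
    verify="/home/tests/certs/ca.crt",   # the CA that signed the server certificate
    cert=cert,
)
print(resp.text)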
EDIT: based on the discussion it looks like the CA and the server certificate used the same subject. This means that certificate validation treats the server certificate as self-signed, which results in a certificate validation error.
I need to make an API call (of sorts) in Django as a part of the custom authentication system we require. A username and password is sent to a specific URL over SSL (using GET for those parameters) and the response should be an HTTP 200 "OK" response with the body containing XML with the user's info.
On an unsuccessful auth, it will return an HTTP 401 "Unauthorized" response.
For security reasons, I need to check:
The request was sent over an HTTPS connection
The server certificate's public key matches an expected value (I use 'certificate pinning' to defend against broken CAs)
Is this possible in python/django using pycurl/urllib2 or any other method?
Using M2Crypto:
from M2Crypto import SSL
ctx = SSL.Context('sslv3')
ctx.set_verify(SSL.verify_peer | SSL.verify_fail_if_no_peer_cert, depth=9)
if ctx.load_verify_locations('ca.pem') != 1:
raise Exception('No CA certs')
c = SSL.Connection(ctx)
c.connect(('www.google.com', 443)) # automatically checks cert matches host
c.send('GET / \n')
c.close()
Using urllib2_ssl (it goes without saying but to be explicit: use it at your own risk):
import urllib2, urllib2_ssl
opener = urllib2.build_opener(urllib2_ssl.HTTPSHandler(ca_certs='ca.pem'))
xml = opener.open('https://example.com/').read()
Related: Making HTTPS Requests secure in Python.
Using pycurl:
c = pycurl.Curl()
c.setopt(pycurl.URL, "https://example.com?param1=val1&param2=val2")
c.setopt(pycurl.HTTPGET, 1)
c.setopt(pycurl.CAINFO, 'ca.pem')
c.setopt(pycurl.SSL_VERIFYPEER, 1)
c.setopt(pycurl.SSL_VERIFYHOST, 2)
c.setopt(pycurl.SSLVERSION, 3)
c.setopt(pycurl.NOBODY, 1)
c.setopt(pycurl.NOSIGNAL, 1)
c.perform()
c.close()
To implement 'certificate pinning' provide different 'ca.pem' for different domains.
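One way to express that pinning is a small host-to-CA lookup; this is only a sketch with made-up host names and file names:

import pycurl
from io import BytesIO

# Each host is only trusted against its own, pinned CA file.
PINNED_CA = {
    'example.com': 'ca_example.pem',
    'other.example.org': 'ca_other.pem',
}

def fetch(host, path='/'):
    buf = BytesIO()
    c = pycurl.Curl()
    c.setopt(pycurl.URL, 'https://' + host + path)
    c.setopt(pycurl.CAINFO, PINNED_CA[host])   # pin: only this CA is accepted
    c.setopt(pycurl.SSL_VERIFYPEER, 1)
    c.setopt(pycurl.SSL_VERIFYHOST, 2)
    c.setopt(pycurl.WRITEDATA, buf)
    c.perform()
    c.close()
    return buf.getvalue()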
httplib2 can do https requests with certificate validation:
import httplib2
http = httplib2.Http(ca_certs='/path/to/cert.pem')
try:
    http.request('https://...')
except httplib2.SSLHandshakeError, e:
    # do something
    pass
Just make sure that your httplib2 is up to date. The one shipped with my distribution (Ubuntu 10.04) does not have the ca_certs parameter.
Also, in a similar question to yours there is an example of certificate validation with pycurl.