"key values mismatch" when using context.use_certificate_chain_file - python

When using context.use_certificate_chain_file (on an OpenSSL.SSL.Context in Python), I get a key mismatch error. The error is:
Traceback (most recent call last):
File "/home/user/public_html/application.py", line 363, in <module>
context.use_privatekey_file('/etc/ssl/private/' + HOSTNAME + '.key')
OpenSSL.SSL.Error: [('x509 certificate routines', 'X509_check_private_key', 'key values mismatch')]
It says the key values mismatch, but I wouldn't think the chain would affect that.
If I comment out the context.use_certificate_chain_file line, it works perfectly (but gives an SSL verification error in the browser).
Here is the snippet of my code:
context = openssl.Context(openssl.SSLv23_METHOD)
context.set_options(openssl.OP_NO_SSLv2)
context.set_options(openssl.OP_NO_SSLv3)
context.use_certificate_file('/etc/ssl/certs/' + HOSTNAME + '.crt')
context.use_certificate_chain_file('/etc/ssl/certs/' + HOSTNAME + '.cabundle')
context.use_privatekey_file('/etc/ssl/private/' + HOSTNAME + '.key')
context.set_cipher_list(':'.join(supported_ciphers))
Any ideas why it's giving the error?

Any ideas why it's giving the error?
The error is propagated up from OpenSSL. It's error 0x0B080074:
$ openssl errstr 0x0B080074
error:0B080074:x509 certificate routines:X509_check_private_key:key values mismatch
Based on SSL install problem - “key value mismatch” (but they do match?), you have one of two problems.
First, the private key does not match the public key in the certificate. Second, your certificate_chain_file is missing the intermediate certificates required to build a valid path from the server's certificate to a root. Here, the root would be the CA that signed your certificate.
So your fix is to either (1) ensure the public/private key pair is in fact a pair, or (2) include the necessary intermediate certificates in the chain file.
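To check (1) programmatically before building the context, here is a minimal pyOpenSSL sketch; the paths are hypothetical stand-ins for your real files:
import OpenSSL.crypto as crypto

# Hypothetical paths; substitute your real certificate and key files.
with open('/etc/ssl/certs/example.crt', 'rb') as f:
    cert = crypto.load_certificate(crypto.FILETYPE_PEM, f.read())
with open('/etc/ssl/private/example.key', 'rb') as f:
    key = crypto.load_privatekey(crypto.FILETYPE_PEM, f.read())

# If the pair matches, the PEM dumps of both public keys are identical.
cert_pub = crypto.dump_publickey(crypto.FILETYPE_PEM, cert.get_pubkey())
key_pub = crypto.dump_publickey(crypto.FILETYPE_PEM, key)
print('match' if cert_pub == key_pub else 'MISMATCH')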
Without knowing the private key ('/etc/ssl/private/' + HOSTNAME + '.key'), the server certificate ('/etc/ssl/certs/' + HOSTNAME + '.crt') or the contents of the chain file ('/etc/ssl/certs/' + HOSTNAME + '.cabundle'), we really can't give you more details on how to fix it.
You can provide us with the server's certificate with:
cat '/etc/ssl/certs/' + HOSTNAME + '.crt' | openssl x509 -text -noout
You can provide us with the chain file by just cat'ing. It will be 3 or 4 PEM encoded certificates concatenated together:
cat `'/etc/ssl/certs/' + HOSTNAME + '.cabundle'`

It's working now; the chain had to be appended to the .crt file.
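For reference, that resolution matches the usual pyOpenSSL pattern of loading one combined file: use_certificate_chain_file expects a single PEM file with the server certificate first, followed by the intermediates. A minimal sketch under that assumption (paths hypothetical):
from OpenSSL import SSL

context = SSL.Context(SSL.SSLv23_METHOD)
context.set_options(SSL.OP_NO_SSLv2 | SSL.OP_NO_SSLv3)
# One file: server certificate first, then the CA bundle appended after it.
context.use_certificate_chain_file('/etc/ssl/certs/example.fullchain.crt')
context.use_privatekey_file('/etc/ssl/private/example.key')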

Related

SSLCertVerification Error when connecting to OECD

Here is my function. It was working yesterday, but not any more.
from cif import cif

def make_oecd_request():
    countries = ['AUS', 'AUT']
    dsname = 'B1_GE'
    measure = ['GPSA']
    frequency = 'Q'
    startDate = '1947-Q1'
    endDate = '2021-Q3'
    data, subjects, measures = cif.createDataFrameFromOECD(countries=countries,
        dsname=dsname, measure=measure,
        frequency=frequency, startDate=startDate, endDate=endDate)
Here is the error:
requests.exceptions.SSLError: HTTPSConnectionPool(host='stats.oecd.org', port=443):
Max retries exceeded with url: /SDMX-JSON/data/B1_GE/AUS..GPSA.Q/all?startTime=1947-Q1&endTime=2021-Q3&dimensionAtObservation=AllDimensions
(Caused by SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED]
certificate verify failed: unable to get local issuer certificate (_ssl.c:997)')))
Has anyone faced this issue before?
It has nothing to do with API limits in this case. Usually when you hit a limit you will get a more explicit message. This is purely an SSL certificate issue: it could have expired, or there could be an issue with the intermediate certificate.
First thing you could do is try to hit the endpoint with a browser and look at the certificate (it's typically a padlock icon in the address bar depending on your browser).
If the certificate looks fine, try pip install --upgrade certifi
Still doesn't work? You need to get out the big guns. Note that it would be much better if you do this in a virtualized environment.
First step, go to https://www.digicert.com/help/ and search for stats.oecd.org. Note that it will tell you that the server is misconfigured and is not providing the intermediate certificate. Note the name of the certificate in question: DigiCert TLS RSA SHA256 2020 CA1
Now go to https://www.digicert.com/kb/digicert-root-certificates.htm and search for DigiCert TLS RSA SHA256 2020 CA1. When you find it, download the pem file. Open the file in your favorite editor and copy everything.
Now modify your code like so:
import certifi
from cif import cif
def make_oecd_request():
countries = ["AUS", "AUT"]
dsname = "B1_GE"
measure = ["GPSA"]
frequency = "Q"
startDate = "1947-Q1"
endDate = "2021-Q3"
data, subjects, measures = cif.createDataFrameFromOECD(
countries=countries,
dsname=dsname,
measure=measure,
frequency=frequency,
startDate=startDate,
endDate=endDate,
)
print(certifi.where())
make_oecd_request()
It will still fail, but now it will tell you where the certifi certificate was installed. Open that file and paste the certificate you previously copied at the top. Make sure you include all of it.
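If you'd rather script that step, here is a hedged sketch. The filename of the downloaded PEM is an assumption, and appending should be equivalent to pasting at the top, since order within a trust bundle does not matter for verification:
import certifi

# Assumed filename for the intermediate certificate downloaded above.
with open('DigiCertTLSRSASHA2562020CA1.crt.pem') as f:
    intermediate = f.read()

# Append the intermediate to certifi's bundle.
with open(certifi.where(), 'a') as bundle:
    bundle.write('\n' + intermediate)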
You'll find the certificate error is resolved. However, the request is now returning a 400 which means there is an issue with the parameters provided.

How to authenticate HashiCorp Vault without certificate?

Below is my code:
import hvac

client = hvac.Client(
    url='https://vault-abc.net', token='s.d0AGS4FE3o6UxUpVTQ0h0RRd', verify='False'
)
print(client.is_authenticated())
ERROR in output:
in cert_verify
    raise IOError("Could not find a suitable TLS CA certificate bundle, "
OSError: Could not find a suitable TLS CA certificate bundle, invalid path: False
I was given only a token and a URL to log in to the console; the client shared no certificates! In other (Java) applications, authentication works without using any certificate, but Python's hvac module, curl, and the Vault CLI all expect certificates to be passed. Is there any way I can handle this and fix the above error?
Is there an option to skip the certificate check?
The goal is to authenticate and fetch Vault secrets from a Python program using just the token and Vault URL, without any certificates.
You can disable certificate checks, but for something like Vault that's generally a bad idea (disabling security checks on a security service).
In any case, your problem is simple: You are passing 'False' (a string) where you should be passing False (a boolean) as the verify argument.
Passing a string causes the library to look for a certificate at that path; since there is no certificate at the path 'False', you get the error that you are seeing.
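Applied to the snippet above, a corrected sketch (same URL and token as the question):
import hvac

client = hvac.Client(
    url='https://vault-abc.net',
    token='s.d0AGS4FE3o6UxUpVTQ0h0RRd',
    verify=False,  # boolean disables verification; a string is treated as a CA bundle path
)
print(client.is_authenticated())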

Getting ERROR while uploading certificate of .pfx format

I have a certificate called "MyCert.pfx" with a passphrase, say "buggy", and two different working servers, S1 and S2. With S1, uploading and using this cert is absolutely fine, but while uploading this certificate to S2, I get the error below:
['asn1 encoding routines', 'ASN1_CHECK_TLEN', 'wrong tag'], ['asn1 encoding routines', 'asn1_item_embed_d2i', 'nested asn1 error']
Traceback:
  File "/opt/aruba/central/apps/configuration/ENV/local/lib/python2.7/site-packages/OpenSSL/crypto.py", line 3046, in load_pkcs12
    _raise_current_error()
  File "/opt/aruba/central/apps/configuration/ENV/local/lib/python2.7/site-packages/OpenSSL/_util.py", line 54, in exception_from_error_queue
    raise exception_type(errors)
Error: [('asn1 encoding routines', 'asn1_check_tlen', 'wrong tag'), ('asn1 encoding routines', 'asn1_item_embed_d2i', 'nested asn1 error')]
Any idea why the same certificate works in one place but not the other? When I converted it to .pem, it worked fine in both places.
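Incidentally, the PEM conversion the question mentions can be scripted with pyOpenSSL itself. A sketch using the question's filename and passphrase (note that load_pkcs12 is deprecated in newer pyOpenSSL releases):
from OpenSSL import crypto

with open('MyCert.pfx', 'rb') as f:
    p12 = crypto.load_pkcs12(f.read(), b'buggy')

with open('MyCert.pem', 'wb') as f:
    f.write(crypto.dump_privatekey(crypto.FILETYPE_PEM, p12.get_privatekey()))
    f.write(crypto.dump_certificate(crypto.FILETYPE_PEM, p12.get_certificate()))
    # Include any CA certificates bundled in the PKCS#12 file.
    for ca in p12.get_ca_certificates() or ():
        f.write(crypto.dump_certificate(crypto.FILETYPE_PEM, ca))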
In my case it was due to a new OpenSSL version clashing with an old node.js installation on the server.
I found the solution here:
Run the following command to fix the key:
openssl rsa -in key.txt -out key.txt
Where key.txt is the private key file.

"SSL: certificate_verify_failed" error when scraping https://www.thenewboston.com/

So I started learning Python recently using "The New Boston's" videos on YouTube, and everything was going great until I got to his tutorial on making a simple web crawler. While I understood it with no problem, when I run the code I get errors all seemingly based around "SSL: CERTIFICATE_VERIFY_FAILED." I've been searching for an answer since last night trying to figure out how to fix it; it seems no one else in the comments on the video or on his website is having the same problem, and even using someone else's code from his website I get the same results. I'll post the code I got from the website, as it gives me the same error, and the one I coded is a mess right now.
import requests
from bs4 import BeautifulSoup

def trade_spider(max_pages):
    page = 1
    while page <= max_pages:
        url = "https://www.thenewboston.com/forum/category.php?id=15&orderby=recent&page=" + str(page)  # this is the page of popular posts
        source_code = requests.get(url)
        # just get the code, no headers or anything
        plain_text = source_code.text
        # BeautifulSoup objects can be sorted through easily
        soup = BeautifulSoup(plain_text)
        for link in soup.findAll('a', {'class': 'index_singleListingTitles'}):  # all links with class='index_singleListingTitles'
            href = "https://www.thenewboston.com/" + link.get('href')
            title = link.string  # just the text, not the HTML
            print(href)
            print(title)
            # get_single_item_data(href)
        page += 1

trade_spider(1)
The full error is: ssl.SSLError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:645)
I apologize if this is a dumb question; I'm still new to programming, but I seriously can't figure this out. I was thinking about just skipping this tutorial, but it's bothering me not being able to fix this. Thanks!
The problem is not in your code but in the web site you are trying to access. When looking at the analysis by SSLLabs you will note:
This server's certificate chain is incomplete. Grade capped to B.
This means that the server configuration is wrong and that not only Python but several other clients will have problems with this site. Some desktop browsers work around this configuration problem by trying to load the missing certificates from the internet or by filling in with cached certificates. But other browsers or applications will fail too, similar to Python.
To work around the broken server configuration, you might explicitly extract the missing certificates and add them to your trust store, or you might pass the certificate as trusted via the verify argument. From the documentation:
You can pass verify the path to a CA_BUNDLE file or directory with
certificates of trusted CAs:
>>> requests.get('https://github.com', verify='/path/to/certfile')
This list of trusted CAs can also be specified through the
REQUESTS_CA_BUNDLE environment variable.
You can tell requests not to verify the SSL certificate:
>>> url = "https://www.thenewboston.com/forum/category.php?id=15&orderby=recent&page=1"
>>> response = requests.get(url, verify=False)
>>> response.status_code
200
See more in the requests doc
You are probably missing the stock certificates in your system. E.g. if running on Ubuntu, check that the ca-certificates package is installed.
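To see which stock locations Python's ssl module actually consults, this quick check may help:
import ssl

# Shows the default CA file/path compiled into your Python's OpenSSL.
print(ssl.get_default_verify_paths())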
If you want to use the Python .dmg installer (on macOS), you also have to read Python 3's ReadMe and run the bash command to get new certificates.
Try running
/Applications/Python\ 3.6/Install\ Certificates.command
It's worth shedding a bit more "hands-on" light on what happens here, adding to @Steffen Ullrich's answer here and elsewhere:
urllib and “SSL: CERTIFICATE_VERIFY_FAILED” Error
Python Urllib2 SSL error (a very detailed answer)
Notes:
I'll use a different website than the OP's, because the OP's website currently has no issues.
I used Ubuntu to run the following commands (curl and openssl). I tried running curl on my Windows 10, but got different, unhelpful output.
The error experienced by the OP can be "reproduced" by using the following curl command:
curl -vvI https://www.vimmi.net
Which outputs (note the last line):
* TCP_NODELAY set
* Connected to www.vimmi.net (82.80.192.7) port 443 (#0)
* ALPN, offering h2
* ALPN, offering http/1.1
* successfully set certificate verify locations:
* CAfile: /etc/ssl/certs/ca-certificates.crt
CApath: /etc/ssl/certs
* TLSv1.3 (OUT), TLS handshake, Client hello (1):
* TLSv1.3 (IN), TLS handshake, Server hello (2):
* TLSv1.2 (IN), TLS handshake, Certificate (11):
* TLSv1.2 (OUT), TLS alert, Server hello (2):
* SSL certificate problem: unable to get local issuer certificate
* stopped the pause stream!
* Closing connection 0
curl: (60) SSL certificate problem: unable to get local issuer certificate
Now let's run it with the --insecure flag, which will display the problematic certificate:
curl --insecure -vvI https://www.vimmi.net
Outputs (note the last two lines):
* Rebuilt URL to: https://www.vimmi.net/
* Trying 82.80.192.7...
* TCP_NODELAY set
* Connected to www.vimmi.net (82.80.192.7) port 443 (#0)
* ALPN, offering h2
* ALPN, offering http/1.1
* successfully set certificate verify locations:
* CAfile: /etc/ssl/certs/ca-certificates.crt
CApath: /etc/ssl/certs
* [...]
* Server certificate:
* subject: OU=Domain Control Validated; CN=vimmi.net
* start date: Aug 5 15:43:45 2019 GMT
* expire date: Oct 4 16:16:12 2020 GMT
* issuer: C=US; ST=Arizona; L=Scottsdale; O=GoDaddy.com, Inc.; OU=http://certs.godaddy.com/repository/; CN=Go Daddy Secure Certificate Authority - G2
* SSL certificate verify result: unable to get local issuer certificate (20), continuing anyway.
The same result can be seen using openssl, which is worth mentioning because it's used internally by Python:
echo | openssl s_client -connect vimmi.net:443
Outputs:
CONNECTED(00000005)
depth=0 OU = Domain Control Validated, CN = vimmi.net
verify error:num=20:unable to get local issuer certificate
verify return:1
depth=0 OU = Domain Control Validated, CN = vimmi.net
verify error:num=21:unable to verify the first certificate
verify return:1
---
Certificate chain
0 s:OU = Domain Control Validated, CN = vimmi.net
i:C = US, ST = Arizona, L = Scottsdale, O = "GoDaddy.com, Inc.", OU = http://certs.godaddy.com/repository/, CN = Go Daddy Secure Certificate Authority - G2
---
Server certificate
-----BEGIN CERTIFICATE-----
[...]
-----END CERTIFICATE-----
[...]
---
DONE
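Incidentally, the same leaf certificate can be fetched from Python itself; a small sketch (no verification is performed by this call):
import ssl

# Grab the PEM-encoded certificate the server presents, unverified.
pem = ssl.get_server_certificate(('vimmi.net', 443))
print(pem)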
So why can't either curl or openssl verify the certificate Go Daddy issued for that website?
Well, to "verify a certificate" (to use openssl's error message terminology) means to verify that the certificate contains a trusted source signature (put differently: the certificate was signed by a trusted source), thus verifying vimmi.net identity ("identity" here strictly means that "the public key contained in the certificate belongs to the person, organization, server or other entity noted in the certificate").
A source is "trusted" if we can establish its "chain of trust", with the following properties:
1. The issuer of each certificate (except the last one) matches the subject of the next certificate in the list.
2. Each certificate (except the last one) is signed by the secret key corresponding to the next certificate in the chain (i.e. the signature of one certificate can be verified using the public key contained in the following certificate).
3. The last certificate in the list is a trust anchor: a certificate that you trust because it was delivered to you by some trustworthy procedure.
In our case, the issuer is "Go Daddy Secure Certificate Authority - G2". That is, the entity named "Go Daddy Secure Certificate Authority - G2" signed the certificate, so it's supposed to be a trusted source.
To establish this entity's trustworthiness, we have 2 options:
Assume that "Go Daddy Secure Certificate Authority - G2" is a "trust anchor" (see property 3 above). Well, it turns out that curl and openssl try to act on this assumption: they search for that entity's certificate in their default paths (called CA paths), which are:
for curl, it's /etc/ssl/certs.
for openssl, it's /usr/lib/ssl (run openssl version -a to see that).
But that certificate wasn't found, leaving us with a second option:
Follow properties 1 and 2 listed above; in order to do that, we need to get the certificate issued for that entity.
This can be achieved by downloading it from its source, or by using the browser:
for example, go to vimmi.net using Chrome, click the padlock > "Certificate" > "Certification Path" tab, select the entity > "View Certificate", then in the opened window go to the "Details" tab > "Copy to File" > Base-64 encoded > save the file.
Great! Now that we have that certificate (which can be in whatever file format: cer, pem, etc.; you can even save it as a txt file), let's tell curl to use it:
curl --cacert test.cer https://vimmi.net
Going back to Python
Once we have:
"Go Daddy Secure Certificate Authority - G2" certificate
"Go Daddy Root Certificate Authority - G2" certificate (wasn't mentioned above, but can be achieved in a similar way).
We need to copy their contents into a single file, let's call it combined.cer, and let's put it in the current directory. Then, simply:
import requests

res = requests.get("https://vimmi.net", verify="./combined.cer")
print(res.status_code)  # 200
BTW, "Go Daddy Root Certificate Authority - G2" is listed as a trusted authority by browsers and various tools; that's why we didn't have to specify it for curl.
Further reading:
how are ssl certificates verified, especially @ychaouche's image.
The First Few Milliseconds of an HTTPS Connection
Wikipedia: Public key certificate, Certificate authority
Nice video: Basics of Certificate Chain Validation.
Helpful SE answers that focus on certificate signature terminology: 1, 2, 3.
Certificates in relation to Man-In-The-Middle attack: 1, 2.
The most dangerous code in the world: validating SSL certificates in non-browser software
I'm posting this as an answer because I've gotten past your issue thus far, but there are still issues in your code (which, when fixed, I can update).
So long story short: you could be using an old version of requests, or the SSL certificate may be invalid. There's more information in this SO question: Python requests "certificate verify failed"
I've updated the code into my own bsoup.py file:
#!/usr/bin/env python3
import requests
from bs4 import BeautifulSoup

def trade_spider(max_pages):
    page = 1
    while page <= max_pages:
        url = "https://www.thenewboston.com/forum/category.php?id=15&orderby=recent&page=" + str(page)  # this is the page of popular posts
        source_code = requests.get(url, timeout=5, verify=False)
        # just get the code, no headers or anything
        plain_text = source_code.text
        # BeautifulSoup objects can be sorted through easily
        for link in BeautifulSoup.findAll('a', {'class': 'index_singleListingTitles'}):  # all links with class='index_singleListingTitles'
            href = "https://www.thenewboston.com/" + link.get('href')
            title = link.string  # just the text, not the HTML
            print(href)
            print(title)
            # get_single_item_data(href)
        page += 1

if __name__ == "__main__":
    trade_spider(1)
When I run the script, it gives me this error:
https://www.thenewboston.com/forum/category.php?id=15&orderby=recent&page=1
Traceback (most recent call last):
File "./bsoup.py", line 26, in <module>
trade_spider(1)
File "./bsoup.py", line 16, in trade_spider
for link in BeautifulSoup.findAll('a', {'class': 'index_singleListingTitles'}): #all links, which contains "" class='index_singleListingTitles' "" in it.
File "/usr/local/lib/python3.4/dist-packages/bs4/element.py", line 1256, in find_all
generator = self.descendants
AttributeError: 'str' object has no attribute 'descendants'
There's an issue somewhere with your findAll method. I've used both python3 and python2, wherein python2 reports this:
TypeError: unbound method find_all() must be called with BeautifulSoup instance as first argument (got str instance instead)
So it looks like you'll need to fix up that method before you can continue.
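For what it's worth, the usual fix is to parse the fetched HTML into a BeautifulSoup instance first and call findAll on that instance; a sketch, not the OP's final code:
# Parse the response text into an instance, then search it.
soup = BeautifulSoup(plain_text, 'html.parser')
for link in soup.findAll('a', {'class': 'index_singleListingTitles'}):
    print(link.get('href'))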
I spent several hours trying to fix some Python and update certs on a VM. In my case I was working against a server that someone else had set up. It turned out that the wrong cert had been uploaded to the server. I found this command on another SO answer.
root@ubuntu:~/cloud-tools# openssl s_client -connect abc.def.com:443
CONNECTED(00000005)
depth=0 OU = Domain Control Validated, CN = abc.def.com
verify error:num=20:unable to get local issuer certificate
verify return:1
depth=0 OU = Domain Control Validated, CN = abc.def.com
verify error:num=21:unable to verify the first certificate
verify return:1
---
Certificate chain
0 s:OU = Domain Control Validated, CN = abc.def.com
i:C = US, ST = Arizona, L = Scottsdale, O = "GoDaddy.com, Inc.", OU = http://certs.godaddy.com/repository/, CN = Go Daddy Secure Certificate Authority - G2

Requests: what is the difference between cert and verify?

What is the difference between cert and verify?
From Documentation:
verify – (optional) if True, the SSL cert will be verified. A CA_BUNDLE path can also be provided.
cert – (optional) if String, path to ssl client cert file (.pem). If Tuple, (‘cert’, ‘key’) pair.
Does this mean I can do the following:
CA_BUNDLE = 'path/to/.pem'
requests.get(url='https://google.com', verify=CA_BUNDLE)
or
Cert = 'path/to/.pem'
requests.get(url='https://google.com', cert=Cert)
They both look like they do the same thing, except that verify can also disable SSL verification.
I am trying to compile my code to an exe using PyInstaller. I am using the certifi module, which I see already has a cacert.pem file, but I guess I still have to bundle it with my code.
In my code, do I modify verify or cert? And with a path to cacert.pem, or just 'cacert.pem'?
I think it is clearly stated in the documentation: SSL Cert Verification
The option cert is for sending your own certificate, e.g. to authenticate yourself against the server using a client certificate. It needs a certificate file and, if the key is not in the same file as the certificate, also the key file.
The option verify is used to enable (default) or disable verification of the server's certificate. It can take True or False, or the name of a file which contains the trusted CAs. If not given, I think (not documented?) it will take the default CA path/file from OpenSSL, which usually works on UNIX (except maybe OS X) but not on Windows.
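Concretely, the two options might be used like this (all paths are placeholders):
import requests

# verify: trust anchors used to check the *server's* certificate.
requests.get('https://example.com', verify='/path/to/ca_bundle.pem')

# cert: a *client* certificate presented to the server; either one combined
# PEM file, or a (certificate, key) tuple when the key is in a separate file.
requests.get('https://example.com',
             cert=('/path/to/client.crt', '/path/to/client.key'))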
If the *.pem file has this section:
-----BEGIN PRIVATE KEY-----
....
-----END PRIVATE KEY-----
then use cert; if not, use verify.
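As a throwaway illustration of that rule of thumb (a sketch, not a library function):
def pem_role(path):
    # A PEM containing a private key is client material (cert=...);
    # otherwise treat it as a CA bundle (verify=...).
    with open(path) as f:
        text = f.read()
    return 'cert' if 'PRIVATE KEY' in text else 'verify'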
