So I was trying to load test a Kerberos-authenticated endpoint using the locustfile below (details removed):
from locust import HttpUser, TaskSet, task
from requests_kerberos import HTTPKerberosAuth

class UserBehaviour(TaskSet):
    @task
    def method1(self):
        self.client.post("/method1", auth=HTTPKerberosAuth(force_preemptive=True), json={})

class FilterAndPrioritiseUser(HttpUser):
    tasks = [UserBehaviour]
Then I continually get an error saying SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1076)'), even though I can manually hit the endpoint fine.
However, if I add another task:
@task(1)
def method2(self):
    self.client.get("/endpoint2", verify=False)
Then the results look like this:
Type   Name         # Requests   # Fails
GET    /endpoint2   6            6
POST   /method1     19           4
Where the errors are:
# Fails   Method   Name         Error
6         GET      /endpoint2   HTTPError('401 Client Error: Unauthorized for url')
4         POST     /method1     SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1076)')
This makes no sense to me: why does hitting this other endpoint, and having it fail as unauthorised, cause the original endpoint to stop failing after a few retries?
Any help would be very appreciated as I'm very confused!
Turns out I needed to add the certs to the default cert file used by Python, which I found out via another post. After doing that, the requests all passed authentication as expected!
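For reference, a minimal sketch of finding that default cert file and of the per-request alternative; the bundle path my-ca-bundle.pem and the URL are hypothetical, not from the original post:

import certifi
import requests

# This is the CA bundle that requests trusts by default; appending the missing
# issuer certificate to this file is what fixed the Locust run above.
print(certifi.where())

# Alternatively, leave the default bundle untouched and point an individual
# request at a custom bundle that already contains the internal CA chain.
requests.post("https://example.internal/method1", json={}, verify="my-ca-bundle.pem")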
Here is my function. It was working yesterday, but not any more.
from cif import cif

def make_oecd_request():
    countries = ['AUS', 'AUT']
    dsname = 'B1_GE'
    measure = ['GPSA']
    frequency = 'Q'
    startDate = '1947-Q1'
    endDate = '2021-Q3'
    data, subjects, measures = cif.createDataFrameFromOECD(countries=countries,
                                                           dsname=dsname, measure=measure,
                                                           frequency=frequency, startDate=startDate, endDate=endDate)
Here is the error:
requests.exceptions.SSLError: HTTPSConnectionPool(host='stats.oecd.org', port=443):
Max retries exceeded with url: /SDMX-JSON/data/B1_GE/AUS..GPSA.Q/all?startTime=1947-Q1&endTime=2021-Q3&dimensionAtObservation=AllDimensions
(Caused by SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:997)')))
Has anyone faced this issue before?
It has nothing to do with API limits in this case; usually when you hit a limit you get a more explicit message. This is purely an SSL certificate issue: the certificate could have expired, or there could be an issue with the intermediate certificate.
The first thing you could do is hit the endpoint with a browser and look at the certificate (typically via the padlock icon in the address bar, depending on your browser).
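If you would rather check from Python than from a browser, here is a minimal sketch using the standard library's ssl module to dump the server's leaf certificate for inspection; the host and port are just the ones from the error above:

import ssl

# Fetch the server's certificate without verifying it (so this works even when
# the chain is incomplete) and print it in PEM form.
pem = ssl.get_server_certificate(("stats.oecd.org", 443))
print(pem)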
If the certificate looks fine, try pip install --upgrade certifi
Still doesn't work? You need to get out the big guns. Note that it would be much better to do this in a virtual environment.
First step: go to https://www.digicert.com/help/ and search for stats.oecd.org. Note that it will tell you the server is misconfigured and is not providing the intermediate certificate. Note the name of the certificate in question: DigiCert TLS RSA SHA256 2020 CA1
Now go to https://www.digicert.com/kb/digicert-root-certificates.htm and search for DigiCert TLS RSA SHA256 2020 CA1. When you find it, download the pem file. Open the file in your favorite editor and copy everything.
Now modify your code like so:
import certifi
from cif import cif

def make_oecd_request():
    countries = ["AUS", "AUT"]
    dsname = "B1_GE"
    measure = ["GPSA"]
    frequency = "Q"
    startDate = "1947-Q1"
    endDate = "2021-Q3"
    data, subjects, measures = cif.createDataFrameFromOECD(
        countries=countries,
        dsname=dsname,
        measure=measure,
        frequency=frequency,
        startDate=startDate,
        endDate=endDate,
    )

print(certifi.where())
make_oecd_request()
It will still fail, but now it will tell you where the certifi certificate bundle is installed. Open that file and paste the certificate you previously copied at the top. Make sure you include all of it.
You'll find the certificate error is resolved. However, the request now returns a 400, which means there is an issue with the parameters provided.
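As an alternative to editing the certifi file by hand, here is a minimal sketch of appending the downloaded certificate programmatically; the file name DigiCertTLSRSASHA2562020CA1.crt.pem is an assumption based on the certificate named above:

import certifi

# Read the intermediate certificate downloaded from DigiCert and append it to
# the CA bundle that requests/certifi uses, so the incomplete chain verifies.
with open("DigiCertTLSRSASHA2562020CA1.crt.pem") as intermediate:
    pem = intermediate.read()

with open(certifi.where(), "a") as bundle:
    bundle.write("\n" + pem)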
I often need to manually check whether some links are live. To avoid this, I would like to write a small script that just returns the HTTP status of all the links so I can immediately see any 404s. Problem is, I have no idea what I'm doing, I just know it should be possible :D
I tried to do this with a test page using Python's requests, and I get the following error:
import requests
requests.get('https://exp04.zih.tu-dresden.de/')
HTTPSConnectionPool(host='exp04.zih.tu-dresden.de', port=443): Max retries exceeded with url: / (Caused by SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1125)')))
I understand the individual words of the error message, but I know nothing about server/client stuff. How do I return the status code for this page?
import requests
requests.get('https://exp04.zih.tu-dresden.de/', verify=False)
This just skips SSL certificate verification.
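Building on that, a minimal sketch of the kind of link-checker script the question describes; the URL list is just an example and the warning suppression is optional:

import requests
import urllib3

# Silence the InsecureRequestWarning that requests emits when verify=False.
urllib3.disable_warnings(urllib3.exceptions.InsecureRequestWarning)

links = [
    "https://exp04.zih.tu-dresden.de/",
    "https://example.com/",
]

for url in links:
    try:
        status = requests.get(url, verify=False, timeout=10).status_code
    except requests.exceptions.RequestException as exc:
        status = f"error: {exc}"
    print(url, status)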
I have a small piece of code in Python 2.7.17 where I'm trying to reach the Yahoo! Finance API to get information about a stock, but when I execute it I get an error that I don't know how to fix.
This is the code:
import urllib
urlStock = 'http://finance.yahoo.com/d/quotes.csv?s=aapl&f=nagh'
response = urllib.urlopen(urlStock).read()
print response
And this is the error:
Exception has occurred: IOError
[Errno socket error] [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:727)
File "/Users/ivanparra/Dropbox/Aprendizaje Python/InternetTests.py", line 4, in <module>
response = urllib.urlopen(urlStock).read()
That site has been discontinued since 2018, unfortunately (more discussion here). However, there is an alternative, as pointed out on the linked GitHub issue thread. The URL is:
https://query1.finance.yahoo.com/v7/finance/quote?lang=en-US&region=US&corsDomain=finance.yahoo.com&symbols=AAPL&fields=regularMarketPrice
Personally, I tend to prefer using the requests library (which you can easily install with pip) whenever possible because of its simple syntax. If you have SSL issues, see my comment in the example code.
Here's how I'd query it:
import requests
res = requests.get("https://query1.finance.yahoo.com/v7/finance/quote?lang=en-US&region=US&corsDomain=finance.yahoo.com&symbols=AAPL&fields=regularMarketPrice")
# If you need to work around SSL issues, set the verify kw arg to False. For example:
# requests.get("URL_HERE", verify=False)
stock_data = res.json()
price = stock_data['quoteResponse']['result'][0]['regularMarketPrice']
print(price)
self.host = "KibanaProxy"
self.port = "443"
self.user = "test"
self.password = "test"
I need to suppress certificate validation. This works with curl using the -k option on the command line, but when connecting with elasticsearch.Elasticsearch from the Elasticsearch Python module, it throws an error.
_es2 = Elasticsearch([self.host], port=self.port, scheme="https", http_auth=(self.user, self.password), use_ssl=True, verify_certs=False)
_es2.info()
Error:
raise SSLError('N/A', str(e), e)
elasticsearch.exceptions.SSLError: ConnectionError([SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:590)) caused by: SSLError([SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:590))
Found it.
While reading this post https://github.com/elastic/elasticsearch-py/issues/275, I learned about connection_class. I then looked for a standard or predefined class for it and found https://elasticsearch-py.readthedocs.io/en/master/transports.html
Solution:
from elasticsearch import RequestsHttpConnection
.....
_es2 = Elasticsearch([self.host], port=self.port, connection_class=RequestsHttpConnection, http_auth=(self.user, self.password), use_ssl=True, verify_certs=False)
print(_es2.ping())
$ ./sn.py
True
In addition to Sonali's answer: as of September 2022, the Python Elasticsearch client no longer accepts the use_ssl=True parameter.
Adding only the verify_certs=False parameter fixed my case:
Elasticsearch(hosts=[address], basic_auth=[user, password], verify_certs=False)
Also, es.ping() does not tell you what the error is; it just returns False. To see the error details, you can use this single line of code:
print(es.info())
TLS error caused by: TlsError(TLS error caused by: SSLError([SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: self signed certificate in certificate chain (_ssl.c:997)))
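For completeness, a minimal sketch of what that looks like put together against a recent (8.x) client; the host, user and password are just the placeholder values from the question, and ssl_show_warn=False is an optional extra to silence the client's warning:

from elasticsearch import Elasticsearch

# verify_certs=False skips certificate validation; ssl_show_warn=False also
# silences the warning the client prints when verification is disabled.
es = Elasticsearch(
    hosts=["https://KibanaProxy:443"],
    basic_auth=("test", "test"),
    verify_certs=False,
    ssl_show_warn=False,
)

# info() raises a descriptive error on failure, unlike ping(), which only
# returns False.
print(es.info())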
I am following these instructions, but when I run the given sample:
from shade import *
simple_logging(debug=True)
conn = openstack_cloud(cloud='myopenstack')
images = conn.list_images()
for image in images:
    print(image)
I get:
keystoneauth1.exceptions.connection.SSLError: SSL exception connecting to MY-URL/auth/tokens: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:645)
I then remembered that the folks providing this cloud to us told us to use the "--insecure" parameter when using the Python OpenStack client. I did some more searching and changed one line:
conn = openstack_cloud(cloud='myopenstack', verify='False')
Now I get:
keystoneauth1.exceptions.connection.SSLError: SSL exception connecting to MY-AUTH_URL/auth/tokens: [Errno 2] No such file or directory
But now I am kinda lost - any ideas?
I did not find a way to make the call work with verify, but the people providing this OpenStack instance gave me a special certificate, and with that I do:
os.environ['REQUESTS_CA_BUNDLE'] = './special.pem'
And everything works!
(So the answer is to work around the certificate validation issue by fixing the certificate ;-)
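Put together, a minimal sketch of that workaround; the cloud name and certificate path are the ones from the posts above, and the note about verify='False' is my own reading rather than something from the original answer:

import os
from shade import simple_logging, openstack_cloud

# Point requests (which keystoneauth/shade use under the hood) at the special
# CA certificate before any connection is made.
os.environ['REQUESTS_CA_BUNDLE'] = './special.pem'

simple_logging(debug=True)
conn = openstack_cloud(cloud='myopenstack')

for image in conn.list_images():
    print(image)

# Side note: verify='False' passes the literal string 'False', which requests
# treats as a path to a CA bundle -- the likely cause of the earlier
# "No such file or directory" error.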