We have an instance of MinIO running with a certificate signed by our corporate CA. Accessing it with S3 Browser works perfectly. Now I am trying to write a Python script to upload files, and I am trying to use the Windows certificate store to get my CA certs:
import ssl
import urllib3
from minio import Minio

myssl = ssl.create_default_context()
myhttpclient = urllib3.PoolManager(
    cert_reqs='CERT_REQUIRED',
    ca_certs=myssl.get_ca_certs()
)
s3dev = Minio("s3dev.mycorp.com:9000",
              access_key="myAccessKey",
              secret_key="mySecretKey",
              secure=True,
              http_client=myhttpclient)
I get the error "TypeError: unhashable type: 'list'".
Getting the CA certs from the Windows cert store with ssl.get_ca_certs() returns a list of all the certs, which seems logical to me. What am I missing here to get something this simple to work?
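One approach that may work, sketched below: urllib3's ca_certs must be a path to a PEM bundle file (a string), not the list of dicts that SSLContext.get_ca_certs() returns (urllib3 also hashes its pool keyword arguments internally, which is likely where the unhashable list error comes from). Instead of extracting the certs, you can hand the SSLContext itself to the PoolManager; on Windows, ssl.create_default_context() already loads the system certificate store:

import ssl
import urllib3
from minio import Minio

# The default context on Windows pulls trust roots from the system store,
# so there is no need to extract them with get_ca_certs().
myssl = ssl.create_default_context()
myhttpclient = urllib3.PoolManager(
    cert_reqs='CERT_REQUIRED',
    ssl_context=myssl
)
s3dev = Minio("s3dev.mycorp.com:9000",
              access_key="myAccessKey",
              secret_key="mySecretKey",
              secure=True,
              http_client=myhttpclient)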
I am trying to connect to SharePoint Online from Python with certificate authentication, using https://github.com/vgrem/Office365-REST-Python-Client
If I connect to the root of my SharePoint, it works, but if I try to connect to any specific site, I get an error. Both work through PowerShell, so I don't think this is a setup issue.
Example - this is PowerShell using the cert in a .pfx file, connecting to the root (https://mytenant.sharepoint.com)
Connect-PnPOnline -Url https://mytenant.sharepoint.com -Tenant mytenant.onmicrosoft.com -ClientId 5fa2148c-d484-444a-bcf1-db632a0fed71 -CertificatePath 'PowershellPnp.pfx' -CertificatePassword $(ConvertTo-SecureString -String "MyCertPassword" -AsPlainText -Force)
Now I change it to connect to a specific site: https://mytenant.sharepoint.com/sites/MySite
Connect-PnPOnline -Url https://mytenant.sharepoint.com/sites/MySite -Tenant mytenant.onmicrosoft.com -ClientId 5fa2148c-d484-444a-bcf1-db632a0fed71 -CertificatePath 'PowershellPnp.pfx' -CertificatePassword $(ConvertTo-SecureString -String "MyCertPassword" -AsPlainText -Force)
Still works, no errors.
Now I try to do the same thing through Python. I use openssl to convert the .pfx to a .pem file, which is what the Python library wants.
First I try the root https://mytenant.sharepoint.com
from office365.sharepoint.client_context import ClientContext

site_url = "https://mytenant.sharepoint.com"
cert_settings = {
    'client_id': '5fa2148c-d484-444a-bcf1-db632a0fed71',
    'thumbprint': "D1656C4AAC5CFBB971477230A5FBACCD356829D3",
    'cert_path': 'PowershellPnP.pem'
}
ctx = ClientContext(site_url).with_client_certificate('mytenant.onmicrosoft.com', **cert_settings)
This connects without error.
However, if I change the site to https://mytenant.sharepoint.com/sites/MySite:
site_url = "https://mytenant.sharepoint.com/sites/MySite"
cert_settings = {
    'client_id': '5fa2148c-d484-444a-bcf1-db632a0fed71',
    'thumbprint': "D1656C4AAC5CFBB971477230A5FBACCD356829D3",
    'cert_path': 'PowershellPnP.pem'
}
ctx = ClientContext(site_url).with_client_certificate('mytenant.onmicrosoft.com', **cert_settings)
I get this error:
ValueError: {'error': 'invalid_resource', 'error_description': 'AADSTS500011:
The resource principal named https://mytenant.sharepoint.com/sites/MySite was
not found in the tenant named mytenant. This can happen if the application has
not been installed by the administrator of the tenant or consented to by any
user in the tenant. You might have sent your authentication request to the
wrong tenant.'}
I take the point of what that error says, but I can connect to that site using the certificate method through PowerShell, so there should be no problem or extra requirement for connecting through Python, no?
It's not the wrong tenant, and Azure shows everything is consented. And it all works in PowerShell.
Turns out this was a bug in the library.
A workaround until it is fixed is to add a value for the scopes parameter:
'scopes': ['https://{tenant}.sharepoint.com/.default']
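In context, the workaround slots into the earlier cert_settings dict like this (a sketch; replace the {tenant} placeholder with your tenant name, here mytenant):

cert_settings = {
    'client_id': '5fa2148c-d484-444a-bcf1-db632a0fed71',
    'thumbprint': "D1656C4AAC5CFBB971477230A5FBACCD356829D3",
    'cert_path': 'PowershellPnP.pem',
    # Workaround: add the scope explicitly until the library bug is fixed.
    'scopes': ['https://mytenant.sharepoint.com/.default']
}
ctx = ClientContext(site_url).with_client_certificate('mytenant.onmicrosoft.com', **cert_settings)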
Below is my code:
import hvac

client = hvac.Client(
    url='https://vault-abc.net',
    token='s.d0AGS4FE3o6UxUpVTQ0h0RRd',
    verify='False'
)
print(client.is_authenticated())
ERROR in output:
in cert_verify
    raise IOError("Could not find a suitable TLS CA certificate bundle, "
OSError: Could not find a suitable TLS CA certificate bundle, invalid path: False
I was given only a token and a URL to log in on the console; the client shared no certificates with me! Other Java applications work without any certificate authentication, but Python's hvac module, curl, and the Vault CLI all expect certificates to be passed. Is there any way I can handle this and fix the error above?
Is there an option to skip the certificate check?
The goal is to authenticate and fetch Vault secrets from a Python program using just the token and the Vault URL, without needing any certificates.
You can disable certificate checks, but for something like Vault that's generally a bad idea (disabling security checks on a security service).
In any case, your problem is simple: You are passing 'False' (a string) where you should be passing False (a boolean) as the verify argument.
Passing a string causes the library to look for a certificate at that path; since there is no certificate at the path 'False', you get the error that you are seeing.
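Here is the corrected snippet with a boolean (and, as noted, disabling verification for Vault is risky):

import hvac

client = hvac.Client(
    url='https://vault-abc.net',
    token='s.d0AGS4FE3o6UxUpVTQ0h0RRd',
    verify=False  # boolean False disables TLS verification; the string 'False' is treated as a path
)
print(client.is_authenticated())

If you can obtain the CA certificate for the Vault server, prefer pointing verify at it instead, e.g. verify='/path/to/ca.pem'.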
I'm trying to connect to one of my internal services at: https://myservice.my-alternative-domain.com through Python Requests. I'm using Python 3.6
I'm using a custom CA bundle to verify the request, and I'm getting the following error:
SSLError: hostname 'myservice.my-domain.com' doesn't match either of 'my-domain.com', 'my-alternative-domain.com'
The SSL certificate the internal service uses has my-domain.com as its CN, and 'my-domain.com' and 'my-alternative-domain.com' as SANs (Subject Alternative Names).
So I'm trying to access the service through one of the alternative names (it has to be this way and it's not under my control).
I think the error is correct, and that the certificate would also need '*.my-alternative-domain.com' as a SAN in order for the request to work.
The only thing that puzzles me is that I can access the service through the browser.
Can somebody confirm the behavior of Python Requests is correct?
This is how I call the service:
response = requests.get('https://myservice.my-alternative-domain.com', params=params, headers=headers, verify=ca_bundle)
Thanks
Passing verify=False might work, though note that it disables certificate verification entirely:

response = requests.get('https://myservice.my-alternative-domain.com', verify=False)
I am running a REST API (the Search API) with Tweepy in Python. The program worked fine at home, but now I am working on a different network and I get this error message:
SSLError: ("bad handshake: Error([('SSL routines', 'ssl3_get_server_certificate', 'certificate verify failed')],)",)
My code is like this:

import tweepy

auth = tweepy.AppAuthHandler(consumer_key, consumer_secret)
api = tweepy.API(auth, wait_on_rate_limit=True, wait_on_rate_limit_notify=True)
I found the post Python Requests throwing up SSLError, which suggests that setting verify=False may be a quick fix. Does anyone know how to do that, or another way, in Tweepy? Thank you.
In streaming.py, adding verify=False at line 105 did the trick for me, as shown below. This approach is not advisable, though, as it makes the connection unsafe; I haven't been able to come up with a workaround yet.

stream = Stream(auth, listener, verify=False)
I ran into the same problem and unfortunately the only thing that worked was setting verify=False in auth.py in Tweepy (for me Tweepy is located in /anaconda3/lib/python3.6/site-packages/tweepy on my Mac):
resp = requests.post(self._get_oauth_url('token'),
                     auth=(self.consumer_key,
                           self.consumer_secret),
                     data={'grant_type': 'client_credentials'},
                     verify=False)
Edit:
Behind a corporate firewall there can be a certificate issue. In Chrome, go to Settings --> Advanced --> Certificates and download your corporate CA certificate. Then, in Tweepy's binder.py, right under session = requests.session(), add:
session.verify = 'path_to_corporate_certificate.cer'
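A less invasive variant of the same idea, assuming you would rather not edit Tweepy's source: requests honors the standard REQUESTS_CA_BUNDLE environment variable, so pointing it at the corporate bundle has the same effect (the path below is a placeholder):

import os

# Make requests (and therefore Tweepy) trust the corporate CA bundle
# without modifying any library files.
os.environ['REQUESTS_CA_BUNDLE'] = 'path_to_corporate_certificate.cer'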
First, verify whether you can access Twitter using just a proxy configuration. If so, you can modify this line of your code to include a proxy URL:
self.api = tweepy.API(self.auth)
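For example (a sketch; Tweepy's API constructor accepts a proxy argument, and the proxy URL below is a placeholder for your network's proxy):

# Hypothetical proxy URL; replace with your actual proxy host and port.
self.api = tweepy.API(self.auth, proxy='https://your-proxy.example.com:3128')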
Adding verify=False skips the certificate validation that should be made: the traffic is still encrypted with TLS, but the server's identity is no longer checked, leaving the connection open to man-in-the-middle interception.
pip install certifi

Installing (or upgrading) certifi refreshes the CA bundle that requests uses; the installation above fixed the bad handshake and SSL error for me.
For anybody that might stumble on this like I did, I had a similar problem because my company was using a proxy, and the SSL check failed while trying to verify the proxy's certificate.
The solution was to export the proxy's root certificate as a .pem file. Then you can add this certificate to certifi's trust store by doing:
import certifi

cafile = certifi.where()
with open(r'<path to pem file>', 'rb') as infile:
    customca = infile.read()
with open(cafile, 'ab') as outfile:
    outfile.write(customca)
You'll have to replace <path to pem file> with the path to the exported file. This should allow requests (and tweepy) to successfully validate the certificates.
Using python-requests, how can I pin a self-signed .pem certificate for a specific server directly, without using CA root bundles?
Is this currently possible? If yes, can you please provide an example?
I read https://2.python-requests.org/en/v2.8.1/user/advanced/#ssl-cert-verification but am not sure if this applies to what I'm trying to do:

You can also specify a local cert to use as client side certificate, as a single file (containing the private key and the certificate) or as a tuple of both files' paths:

requests.get('https://kennethreitz.com', cert=('/path/server.crt', '/path/key'))
<Response [200]>
Because the certificate file is self-signed, this works just as you would do it normally with requests. Below is a step-by-step procedure:
Obtain the self-signed certificate, ideally in some secure, out-of-band manner. For example, I run a webserver that offers HTTPS access via a self-signed certificate, so I downloaded the certificate using scp:
scp <username>@<server>:/path/to/certfile.crt .
Because I use nginx this is already in PEM format, but if it's not you'll need to convert it. That's outside the scope of this answer.
Use the certificate file from inside requests:
r = requests.get('https://yoursite.com/', verify='certfile.crt')
That's all you need to do.
If you can't obtain the certificate in an out-of-band manner you trust, you can obtain the certificate using your browser. All browsers will let you export the certificate via their UIs. This is less-secure: if someone is going to MITM you then they may well have already started, and can offer you their MITM root CA instead of your self-signed cert.
You can also verify certificates against their fingerprints.
For this you need a custom transport adapter for requests.
An example of a simple one can be found here:
https://github.com/untitaker/vdirsyncer/blob/9d3a9611b2db2e92f933df30dd98c341a50c6211/vdirsyncer/utils/__init__.py#L198
import requests
from requests.packages.urllib3.poolmanager import PoolManager


class _FingerprintAdapter(requests.adapters.HTTPAdapter):
    def __init__(self, fingerprint=None, **kwargs):
        self.fingerprint = str(fingerprint)
        super(_FingerprintAdapter, self).__init__(**kwargs)

    def init_poolmanager(self, connections, maxsize, block=False):
        self.poolmanager = PoolManager(num_pools=connections,
                                       maxsize=maxsize,
                                       block=block,
                                       assert_fingerprint=self.fingerprint)
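A sketch of how the adapter might be used (the URL and fingerprint are placeholders; urllib3's assert_fingerprint takes a colon-separated hex digest of the server certificate, which you can print with a command like openssl x509 -noout -fingerprint -sha1 -in certfile.pem):

import requests

# Hypothetical SHA-1 fingerprint; replace with your certificate's digest.
FINGERPRINT = '4A:DE:F4:9B:52:4B:B9:A8:2B:7D:A7:17:7E:80:2A:6F:B5:77:8B:79'

session = requests.Session()
session.mount('https://yoursite.com/', _FingerprintAdapter(fingerprint=FINGERPRINT))
response = session.get('https://yoursite.com/')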