I'm trying to parse the data from this URL:
https://www.chemeo.com/search?q=show%3Ahfus+tf%3A275%3B283
But I think this is failing because the website uses TLS 1.3. How can I enable my Python script, below, to connect using SSL in urllib.request?
I've tried using an SSL context, but this doesn't seem to work.
This is the Python 3.6 code I have:
import urllib.request
import ssl
from bs4 import BeautifulSoup
scontext = ssl.SSLContext(ssl.PROTOCOL_SSLv23)
chemeo_search_url = "https://www.chemeo.com/search?q=show%3Ahfus+tf%3A275%3B283"
print(chemeo_search_url)
with urllib.request.urlopen(chemeo_search_url, context=scontext) as f:
    print(f.read(200))
Try:
ssl.PROTOCOL_TLS
From the docs on "PROTOCOL_SSLv23":
Deprecated since version 2.7.13: Use PROTOCOL_TLS instead.
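For example, a minimal sketch of the original snippet with PROTOCOL_TLS swapped in (same URL as the question; ssl.create_default_context() would be even better, since it also loads CA certificates and enables hostname checking):

import urllib.request
import ssl

# PROTOCOL_TLS negotiates the highest TLS version both sides support
scontext = ssl.SSLContext(ssl.PROTOCOL_TLS)

chemeo_search_url = "https://www.chemeo.com/search?q=show%3Ahfus+tf%3A275%3B283"
with urllib.request.urlopen(chemeo_search_url, context=scontext) as f:
    print(f.read(200))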
Note: be sure the CA certificate bundles are installed; on a minimal build of Alpine Linux or BusyBox, for example, the certificates have to be installed separately. Also, if Python wasn't compiled with SSL support, it may need to be rebuilt with it. Finally, the version of OpenSSL that Python was compiled against determines which SSL/TLS features are available.
Also note that the chemeo site doesn't use TLSv1.3 ... it was still experimental and not widely deployed at the time of this writing. They currently support TLS 1.0, 1.1 and 1.2, using Let's Encrypt as their certificate provider.
I'm writing a Flask app that connects to an external SOAP service that uses TLS v1.2.
I'm using Python 2.7 and the requests library, version 2.18.1.
I've contacted the server owner and he told me that I need to include multiple client certificates in the TLS connection. It's a chain of 3 certificates, which I have in separate .pem files (root + intermediate + my client certificate).
The server won't let me in if I present just the last one.
I've tested this with SoapUI and Wireshark and it's true: I receive a response only when I provide the whole chain of 3 certificates.
I get an error from the server when passing just my client certificate.
The requests documentation says that as the client certificate you can pass just one cert, using:
session = requests.session()
session.cert = ('/path/client_cert.pem', '/path/private_key.pem')
response = session.post(SERVICE_URL, data=XML_CONTENT, headers=HEADERS)
I get an error even if my "client_cert.pem" file is a bundle of 3 certificates (just like a CA bundle passed to session.verify). I can see in Wireshark that only the first one is used in the TLS connection.
Is there any way to include multiple certificates in a TLS connection with Python's requests library?
Maybe I should use a different library or override some of its code?
I've got it!
I had some legacy library versions installed.
It seems that this issue was fixed by the requests library developers in version 1.23. I also had to update urllib3.
My current requirements.txt is:
requests==2.22.0
urllib3==1.25.2 # compatible with requests 2.22
With the above spec everything works perfectly. I've checked the TLS connection in Wireshark; all certificates from the "client_cert.pem" chain are passed.
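For reference, a rough sketch of the kind of call that worked; the URL, headers, body and file paths are placeholders, and the cert file is assumed to be the client, intermediate and root certificates concatenated into one .pem:

import requests

SERVICE_URL = "https://soap.example.com/service"           # placeholder
HEADERS = {"Content-Type": "text/xml; charset=utf-8"}
XML_CONTENT = "<soapenv:Envelope>...</soapenv:Envelope>"   # placeholder SOAP body

session = requests.Session()
# client_cert_chain.pem = client + intermediate + root certificates in one file
session.cert = ('/path/client_cert_chain.pem', '/path/private_key.pem')
response = session.post(SERVICE_URL, data=XML_CONTENT, headers=HEADERS)
print(response.status_code)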
If you run into problems like this in the future, remember to check whether your requests and urllib3 versions are compatible.
Thank you guys!
I have some Python code that makes an HTTPS POST request (an API call) under Python 3, and I need to convert it to Python 2.6.6.
I've searched online for solutions but I'm still unclear about the Python 2.6.6 equivalents of the SSL functions I currently use.
import ssl
import http.client
context = ssl.create_default_context(ssl.Purpose.SERVER_AUTH, cafile='mycer.cer')
context.verify_mode = ssl.CERT_REQUIRED
context.load_cert_chain(certfile='mycrt.crt', keyfile='mykey.key',
                        password='mypass')
conn = http.client.HTTPSConnection("example.com", context=context)
The snippet above lets me connect to the page with SSL certificate authentication, and then I can proceed to make my GET/POST requests.
I know http.client is called httplib in Python 2.x; however, the SSL functions I use need at least Python 2.7.9, and 2.6.6 doesn't have the create_default_context function.
Any ideas how to create the equivalent in Python 2.6.6?
Thanks, guys, for your help.
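For what it's worth, a rough Python 2.6 sketch (an assumption, not a tested answer): httplib.HTTPSConnection accepts key_file and cert_file, but on 2.6 it does not verify the server certificate against mycer.cer and cannot decrypt a password-protected key, so the key would have to be stored unencrypted and the CA check handled some other way:

import httplib  # the Python 2 name for http.client

# Note: Python 2.6's httplib does NOT verify the server certificate,
# and the key file must not be password-protected.
conn = httplib.HTTPSConnection("example.com",
                               key_file="mykey.key",
                               cert_file="mycrt.crt")
conn.request("GET", "/")
resp = conn.getresponse()
print(resp.status)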
I am using Python's requests library to download a file of approximately 40 MB, but with my code I only get a 14 MB file. It doesn't show any error (though there are a few warnings before the file downloads).
Here is my code:
import requests
file_url = "https://file_url.tar"
user='username'
passw='password'
r = requests.get(file_url, auth=(user, passw), verify=False, stream=True)
with open("c.tar", "wb") as code:
    for chunk in r.iter_content(chunk_size=1024):
        if chunk:
            code.write(chunk)
I tried it without 'stream=True' too, but that also doesn't work.
When I put this URL in a browser I get the complete 40 MB file.
I tried this script on another machine and it works fine there (and I get those warnings there too).
These are the warnings I am getting:
SNIMissingWarning: An HTTPS request has been made, but the SNI (Subject Name Indication) extension to TLS is not available on this platform. This may cause the server to present an incorrect TLS certificate, which can cause validation failures. For more information, see https://urllib3.readthedocs.org/en/latest/security.html#snimissingwarning.
SNIMissingWarning
InsecurePlatformWarning: A true SSLContext object is not available. This prevents urllib3 from configuring SSL appropriately and may cause certain SSL connections to fail. For more information, see https://urllib3.readthedocs.org/en/latest/security.html#insecureplatformwarning.
InsecurePlatformWarning
InsecureRequestWarning: Unverified HTTPS request is being made. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.org/en/latest/security.html
InsecureRequestWarning)
But I don't think these warnings are the problem, because when I run this script on another system I get the same warnings and the script works fine.
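One thing worth checking (a diagnostic sketch based on the code above; the URL and credentials are the placeholders from the question): raise on HTTP errors and compare the server's Content-Length header with the number of bytes actually written, to see whether the transfer is being cut short.

import requests

file_url = "https://file_url.tar"   # placeholder
r = requests.get(file_url, auth=('username', 'password'), verify=False, stream=True)
r.raise_for_status()                # fail loudly on HTTP errors

expected = int(r.headers.get('Content-Length', 0))
written = 0
with open("c.tar", "wb") as code:
    for chunk in r.iter_content(chunk_size=1024):
        if chunk:
            written += len(chunk)
            code.write(chunk)

print("expected %d bytes, wrote %d bytes" % (expected, written))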
I use urllib instead of requests:
import urllib
url = "http://62.138.7.10/downloads/Bh2g2m.Bh2g.06.DR.M0viesC0unter.mp4?st=6MVZyTUL7X22v7ILOtB2XA&e=1502823147"
file_name = 'trial_video.mp4'
response = urllib.urlopen(url)
with open(file_name, 'wb') as f:
    f.write(response.read())
Hope this helps.
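Note that urllib.urlopen only exists in Python 2; a rough Python 3 equivalent (same URL and filename, and, like the snippet above, it does no HTTP auth) would be:

from urllib.request import urlopen

url = "http://62.138.7.10/downloads/Bh2g2m.Bh2g.06.DR.M0viesC0unter.mp4?st=6MVZyTUL7X22v7ILOtB2XA&e=1502823147"
file_name = 'trial_video.mp4'

# read the whole response into memory, then write it out
with urlopen(url) as response, open(file_name, 'wb') as f:
    f.write(response.read())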
I have experienced similar problems with Requests. Requests is great for fancy JSON API POST requests and the like, but for ordinary file downloads pycurl is a much better tool. Because of the complicated dependency on libcurl, you shouldn't try installing pycurl with pip; instead, install it from your distribution's packages or use one of the prebuilt win32 modules from their site.
For what it's worth, when I was using requests for file downloads, I also set up logging, and I got some "broken pipe" errors. Maybe Requests disconnects early for performance reasons or something? I didn't have the patience to figure it out when I knew there was an alternative solution that works reliably.
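A minimal pycurl download sketch for comparison (the URL, credentials and output filename are placeholders mirroring the question; disabling certificate verification is only there to match the original verify=False and should be avoided in production):

import pycurl

file_url = "https://file_url.tar"               # placeholder

c = pycurl.Curl()
c.setopt(pycurl.URL, file_url)
c.setopt(pycurl.USERPWD, "username:password")   # HTTP basic auth, placeholder
c.setopt(pycurl.FOLLOWLOCATION, True)           # follow redirects
c.setopt(pycurl.SSL_VERIFYPEER, 0)              # mirrors verify=False
with open("c.tar", "wb") as f:
    c.setopt(pycurl.WRITEDATA, f)
    c.perform()
print("downloaded %d bytes" % c.getinfo(pycurl.SIZE_DOWNLOAD))
c.close()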
I am attempting to connect to MongoDB hosted on an AWS instance with a key file. I am able to ssh into the instance and connect to the database with no issues. When I try to connect to the database from a remote location with pymongo I receive this error:
ServerSelectionTimeoutError: SSL handshake failed: EOF occurred in violation of protocol
Port 27017 is open and the source is set to 0.0.0.0/0.
from pymongo import MongoClient
client = MongoClient('mongodb://ec2-123-45-678-910.compute-1.amazonaws.com',
                     27017,
                     ssl=True,
                     ssl_keyfile='/path_to/mykey.pem')
db = client.test
coll = db.foo
coll.insert_many(records)
ServerSelectionTimeoutError: SSL handshake failed: EOF occurred in violation of protocol (_ssl.c:645)
This question is nearly identical to mine, however the error is different and the solution posted there does not apply to my issue.
The address and key here have been changed. I have been going in circles on this for hours with no luck; any help would be appreciated.
This issue can be caused by the following:
1. The version of pymongo (I suggest 3.3.0, which worked for me).
2. A DNS or connectivity issue; you can check basic connectivity with:
telnet xx.xx.xx.xx port
3. A firewall issue.
4. An issue with the SSL key. Try the following to test:
import pymongo
import ssl

# connect without verifying the server certificate, just to test the handshake
URL = "url:port/db?ssl=true"
client = pymongo.MongoClient(URL, ssl_cert_reqs=ssl.CERT_NONE)
db = client.get_default_database()
print(db)
print(db.collection_names())
I had the same problem (SSL handshake failure) using the pymongo module to connect to Azure DocumentDB.
The error:
ServerSelectionTimeoutError: SSL handshake failed: EOF occurred in violation of protocol (_ssl.c:590)
I was using pymongo==3.4.0
To resolve this:
Change the pymongo version by installing 3.3.0:
pip install pymongo==3.3.0
To check the installed version, try:
import pymongo
pymongo.__version__
For me, the problem was that my Python setup only supported TLS 1.0 – not TLS 1.1 or above.
You can check it like this:
Python 3
> from urllib.request import urlopen
> urlopen('https://www.howsmyssl.com/a/check').read()
Python 2
> from urllib2 import urlopen
> urlopen('https://www.howsmyssl.com/a/check').read()
Check the output for the key tls_version. If it says TLS 1.0 and not TLS 1.1 or TLS 1.2, that could be the problem.
If you're using a virtualenv, be sure to run the command inside.
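For example, a small Python 3 sketch that pulls out just the tls_version field from the howsmyssl response:

import json
from urllib.request import urlopen

info = json.loads(urlopen('https://www.howsmyssl.com/a/check').read().decode())
print(info['tls_version'])   # e.g. "TLS 1.2"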
Solution: Install Python with a newer version of OpenSSL
In order to support TLS 1.1 or above, you may need to install a newer version of OpenSSL and reinstall Python afterwards. This should give you a Python that supports at least TLS 1.1.
The process depends on your operating system – here's a guide for OS X.
virtualenv users
For me, the Python outside of my virtualenv had TLS 1.2 support, so I just removed my old virtualenv and created a new one with the same packages, and then it worked. Easy peasy!
See also:
The warning about TLS 1.0 in the Python 3 section of the PyMongo documentation. Although it's under the Python 3 section, it also applies to Python 2.
I had the same issue and spent 30 minutes talking with MongoDB Atlas support (deployed over AWS). I ran the following terminal command:
/Applications/Python\ 3.6/Install\ Certificates.command
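That command ships with the python.org macOS installer and installs the certifi CA bundle for that Python. If it isn't available, an alternative sketch (not what Atlas support suggested; the host is a placeholder) is to point pymongo at certifi's bundle explicitly:

import certifi
from pymongo import MongoClient

# ssl_ca_certs makes pymongo verify the server against certifi's CA bundle
client = MongoClient('mongodb://ec2-123-45-678-910.compute-1.amazonaws.com',
                     27017,
                     ssl=True,
                     ssl_ca_certs=certifi.where())
print(client.server_info()['version'])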
I had the same issue. Check whether you are connected via a VPN; when I disconnected, it resolved my problem.
I am using the requests library for my Python client, which talks to app servers. I do not want to put verify=False in the production version, as that trusts blindly. I know the requests API supports certificate verification: http://docs.python-requests.org/en/latest/user/advanced/#ssl-cert-verification. But I am not able to find the required dependencies for it. Is installing OpenSSL separately required on Windows?
You do not need to install OpenSSL on Windows to get certificate verification with requests.
Yes, OpenSSL is required, but OpenSSL is statically linked with the Python Windows binaries, so as long as the ssl module is present in your Python install everything will work fine.
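As a quick sanity check (a sketch; the URL is a placeholder), confirm the ssl module and its bundled OpenSSL are present, then make a normally verified request:

import ssl
import requests

print(ssl.OPENSSL_VERSION)     # the OpenSSL version Python was built with

# verify=True is the default; the CA bundle comes from certifi, which is
# installed together with requests
r = requests.get("https://example.com")
print(r.status_code)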