I am trying to access this site with Python's httplib2:
https://www.talkmore.no/talkmore3/servlet/Login
But I get this error:
httplib2.SSLHandshakeError: [Errno 1] _ssl.c:510: error:140770FC:SSL routines:SSL23_GET_SERVER_HELLO:unknown protocol
This is the Python code I use (with the headers actually passed to the request, which the original snippet defined but never used):
from urllib import urlencode  # Python 3: from urllib.parse import urlencode
import httplib2

login = "user"
pwd = "pass"
headers = {'Content-type': 'application/x-www-form-urlencoded'}
data = {'username': login, 'password': pwd}
h = httplib2.Http(".cache", disable_ssl_certificate_validation=True)
resp, content = h.request("https://www.talkmore.no/talkmore3/servlet/Login", "POST", urlencode(data), headers=headers)
I have tried other libraries, but the same error occurs.
The server itself is fine and supports TLS 1.0 through TLS 1.2 (but no SSL 3.0). It also supports commonly used ciphers, and running your Python code gives no errors for me. This means that either you have an old and buggy version of Python/OpenSSL installed (version details are missing from the question), or there is some middlebox in between that stops the connection (e.g. a firewall).
Please try to access the same HTTPS site with a normal browser from the same machine to see if you get the same problem. If so, some middlebox is blocking the data. If the browser succeeds, make a packet capture (with tcpdump or similar) to look at the differences between the data sent by the browser and by your test program, and thus narrow down the underlying problem.
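As a first check for the "old and buggy version" case, it may help to print which interpreter and OpenSSL build are actually in use, since TLS 1.1/1.2 require OpenSSL 1.0.1 or newer; a minimal sketch:

```python
import ssl
import sys

# The interpreter version and the OpenSSL library Python was linked against.
# If this shows an OpenSSL older than 1.0.1, the client cannot speak TLS 1.1/1.2.
print(sys.version)
print(ssl.OPENSSL_VERSION)
```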
Related
I'm trying to have a simple function collect certificates from servers. I'm using Python 3.10.8 and my code looks something like this:
import ssl

def certgrab(dom):
    address = (dom, 443)
    try:
        f = ssl.get_server_certificate(address)
    except Exception as clanger:
        return {'clanger': clanger}
    print(f)
This is fine when I try it against 'google.com' or 'microsoft.com'. But most websites return the following error: {'clanger': ConnectionRefusedError(10061, 'No connection could be made because the target machine actively refused it', None, 10061, None)}.
I was wondering if it was a rejection because the sites don't like the user-agent (requests works fine with everything I test against, but obviously cannot grab the cert, unless it secretly can, which would be great!). But I cannot find a way of specifying one in the ssl library.
I'm at a bit of a loss as it works against 'google.com' and 'microsoft.com' (but then I suppose they may have set their sites to be generous / forgiving regarding what types of connections they support).
ConnectionRefusedError(10061, 'No connection could be made because the target machine actively refused it'
This has nothing to do with certificates, not even with TLS. This is a connection error at the TCP level, i.e. even before any TLS and certificates are in effect.
But most websites return the following error ...
If this is really "most websites" then you might have serious problems in your infrastructure which limit access to large parts of the internet. Or, you might need to use a proxy - but ssl.get_server_certificate does not support a proxy.
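To confirm the failure really happens at the TCP layer, you can try a plain socket connect before involving TLS at all; a small sketch (no certificates in play here, the helper name is mine):

```python
import socket

def tcp_reachable(host, port, timeout=5):
    """Return True if a plain TCP connection to (host, port) succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # covers ConnectionRefusedError, timeouts, DNS failures
        return False
```

If this returns False for the same hosts, the problem is below TLS entirely: a firewall, a required proxy, or routing.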
I have the following code:
import requests
requests.get('URL WITH PARAMS HERE', auth=('MY USERNAME', 'MY PASSWORD'))
It is used to hit an API, but it returns the following error:
"socket.error: [Errno 104] Connection reset by peer"
I am able to retrieve results using my browser. I am also able to cURL it and get results. The same problem happens when using urllib2, but for some reason pycurl seems to retrieve results.
Is there any solution to make it work or any idea as to the problem?
Your code is correct. The error likely means that the server on the other end is unhappy with what you're sending. You have to make sure you send it an appropriate request. To do that, you can:
Read the documentation about the host
Contact its owner
Check what your browser is sending when you successfully access your data
For the third option, use the built-in developer tools in Firefox, Chrome, Safari, or your favorite browser (e.g. for Firefox, read this).
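Once you have copied the headers the browser sends (from the developer tools' network tab), you can replay them in your request; a standard-library sketch (the header values below are placeholders, not the ones your browser actually sends):

```python
import urllib.request

# Placeholder headers, copied from the browser's network tab in practice.
headers = {
    'User-Agent': 'Mozilla/5.0 (example)',
    'Accept': 'text/html,application/xhtml+xml',
}
req = urllib.request.Request('https://example.com/', headers=headers)

# urllib normalizes header names, e.g. 'User-Agent' is stored as 'User-agent'.
print(req.get_header('User-agent'))
```

The same headers dict can be passed to requests.get(url, headers=headers) if you are using requests, as in the question.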
I'm using mitmproxy to capture requests/responses using Python inline scripts. In case of a certificate verification error, mitmproxy writes a log line to standard output like
127.0.0.1:34390: SSL verification failed for upstream server at depth 0 with error: 18
Is it possible to capture these certificate validation errors in an inline script function, and how? I am only able to capture HTTP requests and responses in these scripts.
If it's impossible with an inline script, maybe I can do it with libmproxy?
I ran into the same issue. Thankfully mitmproxy's code is very clean and well organized. You'll be able to retrieve your error like this:
def request(context, flow):
    print flow.server_conn.ssl_verification_error
I am trying to connect to a website with requests that requires using a client certificate.
import requests
r = requests.get(url, cert='path to cert')
print(r.status_code)
This works for one site that uses the same client cert. That server uses TLS_RSA_WITH_AES_128_CBC_SHA on TLS 1.0. My target site, however, uses TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA on TLS 1.1. So basically TLS 1.0 works and TLS 1.1 doesn't. Everything works fine in the browser, so it must have something to do with Python's SSL.
I am using requests version 2.7.0 and I have requests[security] installed as well. pip freeze:
cffi==0.9.2
cryptography==0.8.1
ndg-httpsclient==0.3.3
pyasn1==0.1.7
pycparser==2.10
pyOpenSSL==0.15.1
requests==2.7.0
six==1.9.0
The specific error I am getting is requests.exceptions.SSLError: [SSL: TLSV1_ALERT_INTERNAL_ERROR] tlsv1 alert internal error (_ssl.c:600). This is on Windows 7 with Python 3.4.3. Unfortunately this is on an internal machine so I am stuck with Windows and our internal mirror of PyPi does not have the latest versions of everything. It seems to me like this has something to do with ssl failing and not necessarily requests.
Google does not give back promising results. There is this StackOverflow post that describes the same problem, but the solution provided (using a custom adapter) does not work for me.
Hopefully someone else has run into this before and can give me some tips on how to fix it. Please and thanks.
EDIT: I did a Wireshark capture of the interaction. The SSL alert sent back is "Level: Fatal (2), Description: Internal Error (80)". After the TCP connection starts, my machine sends a Client Hello.
Content Type: Handshake (22)
Version: TLS 1.0 (0x0301)
Length: 512
Then the handshake protocol segment of that packet is
Handshake Type: Client Hello (1)
Length: 508
Version: TLS 1.2 (0x0303)
followed by a list of the supported cipher suites, etc. I looked in the list of cipher suites sent by my client and TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA is listed. The server ACKs this message then sends the Alert packet.
I got rid of an identical SSLError by removing the first entry, ECDH+AESGCM, from requests.packages.urllib3.util.ssl_.DEFAULT_CIPHERS, which the server seemed to have problems with. The line
requests.packages.urllib3.util.ssl_.DEFAULT_CIPHERS = 'DH+AESGCM:ECDH+AES256:DH+AES256:ECDH+AES128:DH+AES:ECDH+HIGH:DH+HIGH:ECDH+3DES:DH+3DES:RSA+AESGCM:RSA+AES:RSA+HIGH:RSA+3DES:!aNULL:!eNULL:!MD5'
solved the problem for me.
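If you would rather drop the offending first entry than hard-code the whole replacement string, you can trim it programmatically; a hypothetical sketch (the helper name is mine, and the attribute path is the urllib3 copy bundled with requests 2.x as used above):

```python
def drop_first_cipher(cipher_string):
    """Remove the first colon-separated entry from an OpenSSL cipher string."""
    return ':'.join(cipher_string.split(':')[1:])

# Applied to requests' default cipher list it would look like:
# import requests
# ssl_ = requests.packages.urllib3.util.ssl_
# ssl_.DEFAULT_CIPHERS = drop_first_cipher(ssl_.DEFAULT_CIPHERS)
print(drop_first_cipher('ECDH+AESGCM:DH+AESGCM:RSA+AES'))
```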
For me, requests.request('GET', ... instead of requests.get(... works.
And I got rid of the above SSLError by removing almost all of the leading entries:
requests.packages.urllib3.util.ssl_.DEFAULT_CIPHERS = 'RSA+AESGCM:RSA+AES:RSA+HIGH:RSA+3DES:!aNULL:!eNULL:!MD5'
Our client wants a client script that will be installed on their customers' computers to be as trivial to install as possible. This means no extra-install packages, in this case PyCurl.
We need to be able to connect to a website over SSL that expects a client certificate. Currently this is done by calling curl via os.system(), but to get the HTTP status code that way it looks like we'll have to use curl's '-v' option and comb through the output. Not difficult, just a bit icky.
Is there some other way to do this using the standard library that comes with Python 2.6?
I read everything I could find on this and I couldn't see a non-Curl way of doing it.
Thanks in advance for any guidance on this subject whatsoever!
This will do the trick. Note that Verisign doesn't require a client certificate; it's just a randomly chosen HTTPS site.
import httplib

conn = httplib.HTTPSConnection('verisign.com', key_file='./my-key.pem', cert_file='./my-cert.pem')
conn.set_debuglevel(20)  # set before the request so the exchange is logged
conn.connect()
conn.request('GET', '/')
response = conn.getresponse()
print('HTTP status', response.status)
EDIT: Just for posterity, Bruno's comment below is valid, and here's an article on how to roll it using the stdlib's ssl and socket modules in case it's needed.
EDIT2: Seems I cannot post links - just do a web search for 'Validating SSL server certificate with Python 2.x another day'
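For posterity, on Python 3 the same pattern uses http.client together with an ssl.SSLContext, which also verifies the server certificate by default; a sketch (the .pem paths are the hypothetical ones from the answer above, so loading them and making the request are commented out):

```python
import http.client
import ssl

# A default context verifies the server's certificate and hostname.
ctx = ssl.create_default_context()
# ctx.load_cert_chain('./my-cert.pem', './my-key.pem')  # attach the client cert

# Constructing the connection does no network I/O yet.
conn = http.client.HTTPSConnection('verisign.com', context=ctx)
# conn.request('GET', '/')
# print('HTTP status', conn.getresponse().status)
```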