Python urllib2 timeout; can ping server fine, and wget works fine - python

I'm trying to use Python's urllib2 to access the Dreamhost API documented here: http://wiki.dreamhost.com/API
Here is my code:
import urllib2

request = urllib2.Request('https://api.dreamhost.com/?key=<key>')
response = urllib2.urlopen(request)
page = response.read()
print(page)
This invariably fails with the error:
urllib2.URLError: <urlopen error [Errno 104] Connection reset by peer>
I'm absolutely stumped, because I can ping api.dreamhost.com just fine, and wget https://api.dreamhost.com/?key= works fine, too.
Any ideas?

I know it's an old question, but I ran into the same problem and found the solution through two other questions.
This one shows that the problem is with the handshake when SSLv3 is used:
OpenSSL issues in Debian Wheezy
And this one gives some possible solutions:
Python HTTPS requests (urllib2) to some sites fail on Ubuntu 12.04 without proxy
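The common thread in those linked answers is forcing a newer protocol version instead of letting OpenSSL fall back to the broken SSLv3 handshake. As a rough sketch of the same idea on Python 3 (the original answers targeted Python 2's urllib2, where this is done with a custom HTTPSHandler subclass; the URL below is the placeholder from the question):

```python
import ssl
import urllib.request

# Build an SSL context that refuses the broken SSLv3 handshake
# and insists on TLS 1.2 or newer.
ctx = ssl.create_default_context()
ctx.minimum_version = ssl.TLSVersion.TLSv1_2

# An opener that uses this context for all HTTPS requests.
opener = urllib.request.build_opener(
    urllib.request.HTTPSHandler(context=ctx))
# response = opener.open('https://api.dreamhost.com/?key=<key>')  # placeholder key
```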

Related

Python3 + vagrant ubuntu 16.04 + ssl request = [Errno 104] Connection reset by peer

I'm using Vagrant on my Mac with the "bento/ubuntu-16.04" box. I'm trying to use the Google AdWords API via its Python library but get the error [Errno 104] Connection reset by peer.
I made a sample script to check whether I can send requests at all:
import urllib.request
url ="https://adwords.google.com/api/adwords/mcm/v201609/ManagedCustomerService?wsdl"
f = urllib.request.urlopen(url)
print(f.read())
If I make this request via python3, I get [Errno 104] Connection reset by peer.
But if I send the request via curl - curl https://adwords.google.com/api/adwords/mcm/v201609/ManagedCustomerService?wsdl - I get some response (even if it's a 500) with a body.
If I run the same Python script from my host Mac machine, I also receive a text response.
I also tried the script from a VDS server with Ubuntu 16.04 - it worked there too.
So I assume the problem is somewhere between Vagrant and the Mac.
Maybe you can help me?
Thanks.
I found the solution. It looks like a bug in VirtualBox 5.1.8. You can read about it here.
So you can fix it by downgrading VirtualBox to a version below 5.1.6.

SSL error using python(2.7) requests

I am unable to issue a request to piratebay using requests with Python 2.7. I did the same with Python 3.4 and it worked fine. The line I'm trying to execute:
r = requests.get("http://thepiratebay.se/browse/201", verify=False)
I added verify=False to try to sidestep all the SSL issues, to no avail. It's a small personal project anyway.
I also tried changing the SSL version using this link, but it still gives me:
requests.exceptions.SSLError: [Errno 1] _ssl.c:510: error:14077438:SSL routines:SSL23_GET_SERVER_HELLO:tlsv1 alert internal error
Thanks
The site thepiratebay.se requires Server Name Indication (SNI) and will send a TLS alert if the client does not support it. While Python 3 has supported SNI for a while, in Python 2.7 SNI was only added in version 2.7.9. My guess is that you are using an older 2.7 release, which is why you run into this error.
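A quick way to check whether your interpreter is affected: the stdlib ssl module exposes a HAS_SNI flag, which on CPython is False before 2.7.9:

```python
import ssl
import sys

# ssl.HAS_SNI reports whether this build's ssl module can send the
# Server Name Indication extension during the TLS handshake.
print(sys.version.split()[0])
print(ssl.HAS_SNI)
```

If it prints False, the tlsv1 alert from SNI-only sites is expected.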

<urlopen error [Errno 1] _ssl.c:510: error:14077417:SSL

Does anyone know why I am getting this error?
SSLError: [Errno 1] _ssl.c:510: error:14077438:SSL routines:SSL23_GET_SERVER_HELLO:tlsv1
I get the error when using requests or urllib2; I'm running the code on Kodi. The same code runs fine in Visual Studio on my PC.
I am trying to scrape a website that is blocked by my ISP, so I'm using a proxy version of the site.
import requests
url = 'https://kickass.unblocked.pe/'
r = requests.get(url)
The site is hosted behind Cloudflare Free SSL and requires support for Server Name Indication (SNI). Python 2.7 only supports SNI since version 2.7.9, so my guess is that you are using an older version.
verify=False (which is usually a bad idea anyway) will not help here: without SNI the handshake fails because the server does not know which certificate is being requested, and so it sends an alert instead of a certificate.
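To make the mechanism concrete: SNI simply means the client names the desired host inside the TLS ClientHello, so a server fronting many certificates (as Cloudflare does) knows which one to present. In the stdlib this is the server_hostname argument (the hostname below is purely illustrative; no connection is made):

```python
import socket
import ssl

ctx = ssl.create_default_context()
hostname = 'example.org'  # illustrative hostname

# server_hostname is what puts the SNI extension into the
# ClientHello; without it, a multi-certificate server cannot
# pick the right certificate and aborts with an alert.
sock = ctx.wrap_socket(socket.socket(), server_hostname=hostname)
```

On an SNI-capable Python, requests sets this up automatically; on pre-2.7.9 interpreters there is simply no way to send it through the stdlib.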

Python requests - get request to secured resource - SSL3_GET_RECORD:decryption failed or bad record mac

I am using python-requests to perform GET requests to some resources.
In the staging and production environments things work fine, but in the test environment, which has a slightly different setup, I get the message below when performing the request:
requests.exceptions.SSLError: [Errno 1] _ssl.c:510: error:1408F119:SSL routines:SSL3_GET_RECORD:decryption failed or bad record mac
I have tried using an adapter as specified here: SSL Error on Python GET Request
I have tried all the protocols in the list. Still no luck.
I have tried mounting both the complete url and the domain url. No difference.
Any ideas on what to try next?
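For anyone attempting the adapter approach mentioned above: a minimal sketch of a transport adapter that pins the TLS version looks roughly like this. The adapter class name and the mounted URL are hypothetical, and this assumes a reasonably recent requests/urllib3 where PoolManager accepts an ssl_context:

```python
import ssl
import requests
from requests.adapters import HTTPAdapter
from urllib3.poolmanager import PoolManager

class TLS12Adapter(HTTPAdapter):
    """Hypothetical adapter that pins connections to TLS 1.2+."""
    def init_poolmanager(self, connections, maxsize, block=False, **kwargs):
        ctx = ssl.create_default_context()
        ctx.minimum_version = ssl.TLSVersion.TLSv1_2
        self.poolmanager = PoolManager(
            num_pools=connections, maxsize=maxsize,
            block=block, ssl_context=ctx, **kwargs)

session = requests.Session()
session.mount('https://test.example.com/', TLS12Adapter())  # hypothetical URL
# response = session.get('https://test.example.com/resource')
```

Mounting on the domain prefix (rather than the full resource URL) is usually enough, since requests matches adapters by longest prefix.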

Requests library not properly directing HTTP requests through proxies

I know how to use requests very well, yet for some reason I am not succeeding in getting the proxies working. I am making the following request:
r = requests.get('http://whatismyip.com', proxies={'http': 'http://148.236.5.92:8080'})
I get the following:
requests.exceptions.ConnectionError: [Errno 10060] A connection attempt failed because the connected party did not properly respond after a period of time, or established connection failed because connected host has failed to respond
Yet, I know the proxy works, because using node:
request.get({uri: 'http://www.whatismyip.com', proxy: 'http://148.236.5.92:8080'},
function (err, response, body) {var $ = cheerio.load(body); console.log($('#greenip').text());});
I get the following (correct) response:
148.236.5.92
Furthermore, when I change the requests call at all (say, by omitting the http:// in front of the proxy address), the request simply goes through normally without using the proxy and without returning an error.
What am I doing wrong in Python?
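For reference, requests expects a proxies dict with one key per target-URL scheme, and each value must itself carry a scheme; leaving the scheme off is what produces the silent bypass described above. A sketch using the address from the question:

```python
import requests

# One entry per scheme of the *target* URL; each value must itself
# be a scheme-prefixed URL, otherwise requests ignores it silently.
proxies = {
    'http': 'http://148.236.5.92:8080',
    'https': 'http://148.236.5.92:8080',
}
# r = requests.get('http://whatismyip.com', proxies=proxies)
```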
It's a known issue: https://github.com/kennethreitz/requests/issues/1074
I'm not sure exactly why it's taking so long to fix. To answer your question: you're doing nothing wrong.
As sigmavirus24 says, this is a known issue, which has been fixed, but hasn't yet been packaged up into a new version and pushed to PyPI.
So, if you need this in a hurry, you can upgrade from the git repo's master.
If you're using pip, this is simple. Instead of this:
pip install -U requests
Do this:
pip install -U git+https://github.com/kennethreitz/requests
If you're not using pip, you'll have to git clone the repo explicitly and then run easy_install . or python setup.py install from your local copy.