Python requests / urllib2 socket.error: [Errno 104] Connection reset by peer

I have the following code:
import requests
requests.get('URL WITH PARAMS HERE', auth=('MY USERNAME', 'MY PASSWORD'))
It is used to hit an API, but it returns the following error:
"socket.error: [Errno 104] Connection reset by peer"
I am able to retrieve results using my browser. I am also able to cURL it and get results. The same problem happens when using urllib2, but for some reason pycurl seems to retrieve results.
Is there any way to make this work, or any idea what the problem is?

Your code is correct. The error might mean that the server on the other end is unhappy about what you're sending. You have to make sure you send it an appropriate request. To do that, you can:
Read the documentation about the host
Contact its owner
Check what your browser is sending when you successfully access your data
For the third option, use the built-in developer tools in Firefox, Chrome, Safari, or your favorite browser (e.g. for Firefox, read this)
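A common cause of a reset like this is a server that rejects requests lacking browser-like headers. A minimal sketch of copying headers into the requests call (the header values below are placeholders; take the real ones from your browser's developer tools, Network tab):
import requests
# Hypothetical browser-like headers; substitute what your browser actually sends
headers = {
    'User-Agent': 'Mozilla/5.0 (X11; Linux x86_64)',
    'Accept': 'application/json',
}
r = requests.get('URL WITH PARAMS HERE', auth=('MY USERNAME', 'MY PASSWORD'), headers=headers, timeout=10)
print(r.status_code)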

Related

Selenium Python, reset tcp connection after each request

So I'm using Python and Selenium to repeat requests to a specific website. I am using a rotating proxy that is supposed to give me a new IP after each request. The issue is that when I make a request, for example to whatsmyip.org, in the Chrome window, I don't always get a fresh IP.
If my requests are made every 2 seconds I keep the same IP, but if they are 10-15 seconds apart then my IP changes.
If you have any idea how I could fix this, that would be great; a Chrome option, for example? Or maybe a capability? I really don't know.
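One thing worth trying (assuming the proxy rotates IPs per TCP connection, so a kept-alive browser connection pins the same exit IP): tear the browser down and recreate it between requests, so no connection is reused. A rough sketch, with the proxy address as a placeholder:
import time
from selenium import webdriver

PROXY = 'host:port'  # placeholder for your rotating proxy endpoint

def fetch_once(url):
    options = webdriver.ChromeOptions()
    options.add_argument('--proxy-server=' + PROXY)
    driver = webdriver.Chrome(options=options)
    try:
        driver.get(url)
        return driver.page_source
    finally:
        driver.quit()  # closes every connection, so the next run starts fresh

for _ in range(3):
    fetch_once('https://whatsmyip.org')
    time.sleep(2)
This is much slower than reusing one browser, and whether it helps depends entirely on how the proxy provider rotates IPs.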

<urlopen error [Errno -3] Temporary failure in name resolution>

The problem here is that whenever I try to make a urlopen request with urllib, this error pops up. I have no idea what could be causing it. Could an outside source have IP-banned me from sending urllib requests? Possibly, since when I execute the same line of code on my laptop it runs perfectly.
Keep in mind that my laptop is not on the same network as the server where the code is erroring out.
I have tried this on all of my other servers, and urllib works perfectly there. The thing is that this error just appeared one day when I ran the code on the server. No code was changed; it just started showing the error.
import urllib.request
urllib.request.urlopen("http://example.com/").read()
I expect the urlopen request to return readable html data but instead, it returns me this error:
urllib.error.URLError: <urlopen error [Errno -3] Temporary failure in name resolution>
Problem solved. The /etc/resolv.conf file was missing; the fix was to create it and add "nameserver 8.8.8.8" to it. After that and a reboot of the server, everything came back to normal.
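For anyone debugging the same error: [Errno -3] comes from getaddrinfo, i.e. DNS resolution, so a quick way to confirm DNS is the culprit (rather than an IP ban) is to resolve a name directly:
import socket
try:
    print(socket.gethostbyname('example.com'))
except socket.gaierror as e:
    # gaierror matches the "Temporary failure in name resolution" above
    print('DNS resolution failed:', e)
If this fails on the broken server but works on the laptop, the server's resolver configuration (e.g. /etc/resolv.conf, as in the fix above) is the place to look.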

How to send HTTP request in SSH?

I'm trying to make a simple HTTP request in Python in an SSH terminal:
from requests import get
r = get("https://www.google.com")
However, this call just stalls indefinitely. This does not happen when I am not in an SSH session.
Is there any way to send the request such that it goes through?
Thanks ahead of time.
EDIT: Running the logging in Joran's link yields only the following line:
INFO:requests.packages.urllib3.connectionpool:Starting new HTTPS connection (1): www.google.com
First, check that you can reach the URL with a system-wide tool such as curl: curl -I "https://www.google.com". If that returns a successful response without timing out, my answer is not for you :)
Your code can run forever simply because no timeout is defined for the socket connection. If for some reason your system cannot read from the socket (at a low level), you will wait for a very long time.
http://docs.python-requests.org/en/latest/user/quickstart/#timeouts
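For example, a minimal sketch (the timeout values here are arbitrary):
from requests import get
# Raises requests.exceptions.Timeout instead of hanging forever;
# the first value is the connect timeout, the second the read timeout
r = get("https://www.google.com", timeout=(3.05, 10))
print(r.status_code)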
Try this (assuming you are using Python 3; the timeout keeps it from hanging forever):
from urllib.request import urlopen
r = urlopen('https://www.google.com', timeout=10).read()

SSL Handshake Error: [Errno 1]

I am trying to access this site with Python's httplib2:
https://www.talkmore.no/talkmore3/servlet/Login
But I get this error:
httplib2.SSLHandshakeError: [Errno 1] _ssl.c:510: error:140770FC:SSL routines:SSL23_GET_SERVER_HELLO:unknown protocol
This is the Python code I use:
import httplib2
from urllib import urlencode  # Python 2; on Python 3 use urllib.parse.urlencode

login = "user"
pwd = "pass"
headers = {'Content-type': 'application/x-www-form-urlencoded'}
data = {'username': login, 'password': pwd}
h = httplib2.Http(".cache", disable_ssl_certificate_validation=True)
resp, content = h.request("https://www.talkmore.no/talkmore3/servlet/Login", "POST", urlencode(data), headers=headers)
I have tried other libraries, but the same error occurs.
The server itself is fine and supports TLS 1.0 through TLS 1.2 (but no SSL 3.0). It also supports commonly used ciphers, and running your Python code gives no errors for me. This means that either you have some old and buggy version of Python/OpenSSL installed (version details are missing from the question) or there is some middlebox in between that stops the connection (a firewall or similar).
Please try to access the same https site with a normal browser from the same machine to see if you get the same problem. If so, some middlebox is blocking the data. If the browser succeeds, make a packet capture (with tcpdump or similar) and look at the differences between the data sent by the browser and by your test program, to narrow down what the underlying problem might be.
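To rule out the old-version possibility, print the exact Python and OpenSSL versions in use (a quick check, not a fix):
import ssl
import sys
print(sys.version)          # Python interpreter version
print(ssl.OPENSSL_VERSION)  # OpenSSL version Python was built against
An old OpenSSL build (e.g. 0.9.8) lacks TLS 1.1/1.2 support and would be the first thing to upgrade.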

urllib.urlopen to open page on same port just hangs

I am trying to use urllib.urlopen to open a web page running on the same host and port as the page I am loading it from, and it just hangs.
For example I have a page at: "http://mydevserver.com:8001/readpage.html" and I have the following code in it:
data = urllib.urlopen("http://mydevserver.com:8001/testpage.html")
When I try to load the page, it just hangs. However, if I move testpage.html to a different port on the same host, it works fine, e.g.:
data = urllib.urlopen("http://mydevserver.com:8002/testpage.html")
Does anyone know why this might be and how I can solve the problem?
A firewall, perhaps? Try opening the page from the command line with wget/curl (assuming you're on Linux) or in a browser, against both ports. You could also use a packet sniffer to find out what's going on and where the connection gets stuck. Also, if testpage.html is dynamically generated, see whether it is actually hit: check whether the request shows up in the web server logs.
Maybe something is already running on port 8001. Does the page open properly with a browser?
You seem to be implying that the page doing the fetching is itself scripted in Python. That suggests the same Python process is handling the incoming connections, which could mean that while it is busy serving the page that makes the urllib call, it is not available to handle the connection that call opens, so the request hangs.
Show the code (or tell us what software) you're using to serve these Python scripts.
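To illustrate the kind of deadlock described above, here is a contrived Python 3 sketch (not the asker's actual setup): a single-threaded server whose handler fetches another page from itself. The inner urlopen hangs forever because the only thread is busy inside the outer handler:
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == '/testpage.html':
            body = b'inner page\n'
        else:
            # Deadlock: HTTPServer serves one request at a time, so this
            # request to ourselves queues behind the one being handled
            # and is never answered
            body = urlopen('http://localhost:8001/testpage.html').read()
        self.send_response(200)
        self.send_header('Content-Length', str(len(body)))
        self.end_headers()
        self.wfile.write(body)

HTTPServer(('localhost', 8001), Handler).serve_forever()
Serving the second page from a different port or process, or switching to http.server.ThreadingHTTPServer, avoids the deadlock, which matches the observation that port 8002 works fine.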
