What causes a Python socket timeout?

I have simple Python code running on Linux (Raspbian) that connects to a server using urlopen (which uses a Python socket under the hood):
import urllib.request

req = urllib.request.Request('myServer', data=params, headers=head)
try:
    response = urllib.request.urlopen(req, timeout=20)
except Exception:
    pass  # handle the timeout / connection error here
From the socket timeout docs:
- timeout=None acts in blocking mode; this is not what I want, as it will hang forever if I have no internet connection
- timeout=0 acts in non-blocking mode, but with it I always get error 115 (Operation now in progress)
- timeout=20 acts in timeout mode, blocking for up to 20 s and giving up if the connection cannot be created
My questions:
Why does non-blocking mode always fail? (It may be a misconception, but I thought it should work sometimes rather than fail every time.)
What causes the 20 s timeout to occur sometimes? (In about 80% of cases the urlopen executes in 1-2 s; in the other 20% it times out.)
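For the first question, here is a minimal sketch of the underlying mechanism, assuming a Linux host (the hostname is hypothetical): a non-blocking connect() returns before the TCP handshake finishes, so the OS reports EINPROGRESS (errno 115) every time, rather than a completed connection.
import socket

s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
s.setblocking(False)  # equivalent to settimeout(0)
try:
    s.connect(('example.com', 80))  # hypothetical host
except BlockingIOError as e:
    # On Linux this carries errno 115 (EINPROGRESS): the TCP
    # handshake is still in flight when connect() returns.
    print('connect in progress:', e)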

Related

httplib.HTTPConnection timeout: connect vs other blocking calls

I'm trying to use Python's HTTPConnection to make some long-running remote procedure calls (~30 seconds).
httplib.HTTPConnection(..., timeout=45)
solves this. However, it means that failed connection attempts cause a painfully long wait. I can independently control the read and connect timeouts for a socket; can I do this when using HTTPConnection?
I understand you are not waiting to send a request, but if you deal with the connection failure first, you won't have to wait. That is, don't include the timeout parameter with the first request; do something like the following:
conn = httplib.HTTPSConnection('url')  # no timeout here
headers = {'Connection': 'Keep-Alive'}
conn.request('GET', '/ping', headers=headers)  # dummy request to open a conn and keep it alive
dummy_response = conn.getresponse()
dummy_response.read()  # drain the body so the connection can be reused
if dummy_response.status in (200, 201):  # if ok, do work with conn
    conn.sock.settimeout(25)  # request() has no timeout arg; set it on the socket
    conn.request('GET', '/real')  # the real, long-running request
else:
    pass  # reconnect: restart the dummy request
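To address the separate connect/read timeouts more directly, here is a hedged sketch (host and path are hypothetical): the timeout passed to the constructor governs the initial connect(); once the socket exists, calling settimeout() on it changes the deadline for subsequent reads and writes.
import httplib  # http.client in Python 3

conn = httplib.HTTPSConnection('example.com', timeout=5)  # 5 s applies to connect()
conn.connect()  # fails fast if the server is unreachable
conn.sock.settimeout(45)  # now allow up to 45 s for the slow response
conn.request('GET', '/slow-endpoint')  # hypothetical path
response = conn.getresponse()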
Just a suggestion, you can always bump the question or have it answered again.

Python requests module connection timeout

I'm looking at http://docs.python-requests.org/en/latest/ and "Connection Timeouts" is listed as a feature. However, when I read further, it states:
timeout is not a time limit on the entire response download; rather, an exception is raised if the server has not issued a response for timeout seconds (more precisely, if no bytes have been received on the underlying socket for timeout seconds).
That doesn't sound like a description of a connection timeout. What I'm seeing is that the connection succeeds, a big file is uploaded, and then the client waits for a response. However, the response takes a while, and the request times out.
How can I set a connection timeout, but still wait for slow responses once a connection has been successful? Thanks a lot.
The requests ("HTTP for humans") library has connection timeouts; see
- https://requests.kennethreitz.org/en/master/user/advanced/#timeouts
import requests

r = requests.get('https://github.com', timeout=(3.05, 27))

# or, more explicitly:
conn_timeout = 6
read_timeout = 60
timeouts = (conn_timeout, read_timeout)
r = requests.get('https://github.com', timeout=timeouts)
The docs are not exactly explicit about which value is which in the tuple, but it might be safe to assume that it's (connect, read) timeouts.
The timeout applies to both the socket connect stage and the response reading stage. The one exception is streamed requests: with stream=True, the timeout cannot be applied to the body-reading portion. The timeout really does just cover waiting for the socket to connect or for data to be received.
If you need an overall timeout, use another technique, such as interrupts or eventlets; see: Timeout for python requests.get entire response
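As one illustration of such a technique (a sketch, not necessarily the linked answer's approach): run the request in a worker thread and cap how long the caller waits for the result. Note the worker thread itself keeps running past the deadline; this only bounds the caller's wait.
import concurrent.futures
import requests

pool = concurrent.futures.ThreadPoolExecutor(max_workers=1)
future = pool.submit(requests.get, 'https://github.com', timeout=(3.05, 27))
try:
    r = future.result(timeout=60)  # hard cap on the caller's total wait
    print(r.status_code)
except concurrent.futures.TimeoutError:
    print('request exceeded the overall deadline')
finally:
    pool.shutdown(wait=False)  # don't block on the abandoned worker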

How to set the redis timeout waiting for the response with pipeline in redis-py?

In the code below, is the pipeline timeout 2 seconds?
import redis

client = redis.StrictRedis(host=host, port=port, db=0, socket_timeout=2)
pipe = client.pipeline(transaction=False)
for name in namelist:
    key = "%s-%s-%s-%s" % (key_sub1, key_sub2, name, key_sub3)
    pipe.smembers(key)
pipe.execute()
In Redis, the sets behind these keys have a lot of members. The code above always fails with the error below:
Error while reading from socket: ('timed out',)
If I raise the socket_timeout value to 10, it works. Doesn't the parameter socket_timeout mean a connection timeout? It looks more like a response timeout.
The redis-py version is 2.6.7.
I asked andymccurdy, the author of redis-py, on GitHub, and the answer is below:
If you're using redis-py<=2.9.1, socket_timeout is both the timeout
for socket connection and the timeout for reading/writing to the
socket. I pushed a change recently (465e74d) that introduces a new
option, socket_connect_timeout. This allows you to specify different
timeout values for socket.connect() differently from
socket.send/socket.recv(). This change will be included in 2.10 which
is set to be released later this week.
Since the redis-py version here is 2.6.7, socket_timeout is both the timeout for the socket connection and the timeout for reading/writing to the socket.
It is not a connection timeout; it is an operation timeout. Internally, the socket_timeout argument to StrictRedis() is passed to the socket's settimeout method.
See here for details: https://docs.python.org/2/library/socket.html#socket.socket.settimeout
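Based on the maintainer's comment above, once redis-py 2.10+ is available, the two timeouts can be set separately. A sketch, with an illustrative host and values:
import redis

client = redis.StrictRedis(
    host='localhost', port=6379, db=0,
    socket_connect_timeout=2,  # applies only to socket.connect()
    socket_timeout=10,         # applies to each send()/recv()
)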

Close lingering connection

I'm using python-requests for a client tool. It makes repeated requests to servers at an interval. However, if the server disconnects, the client fails with a socket error on its next request. It appears the client keeps the connection open on its side rather than reconnecting. The requests can be hours apart, so it is unlikely the server wouldn't have disconnected by then.
Is there a way to override keep alive and force it to close? Is there something similar to:
with requests.get(url) as r:
    doStuff(r)
# r is cleaned up, the socket is closed
that would force the connection to clean up after I'm done?
As written that doesn't work, because requests.Response doesn't have an __exit__ method.
How about this?
I haven't tested it; this is based only on the API docs:
s = requests.Session()
r = s.get(url)
doStuff(r)
s.close()
Or, to make sure that the close is always called, even if there's an exception, here's how to emulate the with-statement using a try/finally:
s = requests.Session()
try:
    r = s.get(url)
    doStuff(r)
finally:
    s.close()
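In newer versions of requests, Session itself supports the with-statement, so the same cleanup can be written directly. A sketch; the URL is hypothetical and doStuff stands in for your own handling:
import requests

def doStuff(response):  # placeholder for the question's handler
    print(response.status_code)

with requests.Session() as s:
    r = s.get('https://example.com')  # hypothetical URL
    doStuff(r)
# the session's connection pool is closed on exit, leaving no lingering socket
Alternatively, sending headers={'Connection': 'close'} with a plain requests.get() asks the server to close the connection after each response, opting out of keep-alive entirely.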

Python: How to shutdown a threaded HTTP server with persistent connections (how to kill readline() from another thread)?

I'm using Python 2.6 with HTTPServer and the ThreadingMixIn, which handles each request in a separate thread. I'm also using HTTP/1.1 persistent connections ('Connection: keep-alive'), so neither the server nor the client closes a connection after a request.
Here's roughly what the request handler looks like:
request, client_address = sock.accept()
rfile = request.makefile('rb', rbufsize)
wfile = request.makefile('wb', wbufsize)

global server_stopping
while not server_stopping:
    request_line = rfile.readline()  # 'GET / HTTP/1.1'
    # etc. - parse the full request, write the server response to wfile, etc.

wfile.close()
rfile.close()
request.close()
The problem is that if I stop the server, there will still be a few threads waiting on rfile.readline().
I would put a select([rfile, closefile], [], []) above the readline() and write to closefile when I want to shut down the server, but I don't think that would work on Windows, because select only works with sockets there.
My other idea is to keep track of all the running requests and call rfile.close() on them, but then I get Broken pipe errors.
Ideas?
You're almost there: the correct approach is to call rfile.close(), catch the resulting broken-pipe errors, and exit your loop when they occur.
If you set daemon_threads to true in your HTTPServer subclass, the threads' activity will not prevent the server from exiting.
from BaseHTTPServer import HTTPServer  # http.server in Python 3
from SocketServer import ThreadingMixIn  # socketserver in Python 3

class ThreadedHTTPServer(ThreadingMixIn, HTTPServer):
    daemon_threads = True
You could work around the Windows problem by making closefile a socket, too. After all, since it's presumably something your main thread opens, it's up to you whether to open it as a socket or as a file ;-)
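A hedged sketch of that workaround (the names are illustrative): socket.socketpair() isn't available on Windows in this Python version, but a loopback connection yields an equivalent pair of sockets, which select() accepts on every platform.
import select
import socket

def make_signal_pair():
    # Emulate socket.socketpair() with a loopback connection.
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind(('127.0.0.1', 0))
    srv.listen(1)
    w = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    w.connect(srv.getsockname())
    r, _ = srv.accept()
    srv.close()
    return r, w

close_r, close_w = make_signal_pair()

# In each handler thread, before calling rfile.readline():
#     ready, _, _ = select.select([request, close_r], [], [])
#     if close_r in ready:
#         break  # the server is shutting down
#
# To shut the server down, write a byte from the main thread:
#     close_w.send(b'x')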
