How to catch the original exception - python

I'm using the requests module with max_retries option. I would like to catch the exceptions only related to timeouts and slow replies:
import requests
from requests.exceptions import ConnectTimeout, Timeout

URL = 'http://example.com/sleep'  # sleeps for 5 seconds before replying

with requests.Session() as s:
    try:
        a = requests.adapters.HTTPAdapter(max_retries=2)
        s.mount('http://', a)
        r = s.get(URL, timeout=1)
    except (ConnectTimeout, Timeout) as err:
        print('# {} - timeout'.format(URL))
But it looks like the underlying urllib3 library throws ReadTimeoutError and requests doesn't catch it and throws ConnectionError instead:
requests.exceptions.ConnectionError: HTTPConnectionPool(host='example.com', port=80): Max retries exceeded with url: /sleep (Caused by ReadTimeoutError("HTTPConnectionPool(host='example.com', port=80): Read timed out. (read timeout=1)"))
I don't want to add ConnectionError to the list because there are other exceptions that inherit from it so it would also catch those.
Is there a way to catch the original exception, or perhaps all exceptions in the chain, using the traceback module?

Ideally, you should catch those more specific exceptions before ConnectionError and re-raise them if you want your program to fail on them:
class OtherException(requests.exceptions.ConnectionError):
    pass

try:
    raise OtherException('This is other exception.')
except OtherException as oe:
    raise oe
except requests.exceptions.ConnectionError:
    print('The error you want to catch')

You can use a similar construct:
import traceback
import logging

try:
    whatever()
except Exception as e:
    logging.error(traceback.format_exc())
    # Your actions here
This will catch almost everything except, for example, KeyboardInterrupt and SystemExit.
Catching those would make the script quite hard to quit.
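To get at the original exception in the chain, you can also walk the links Python keeps on every exception via __cause__ and __context__. A minimal sketch; the chained exceptions here are constructed by hand to stand in for what urllib3 and requests actually raise:

```python
import requests

def exception_chain(exc):
    """Yield exc and every exception that caused it, innermost last."""
    while exc is not None:
        yield exc
        exc = exc.__cause__ or exc.__context__

# Build a chain by hand to mimic requests wrapping a timeout.
try:
    try:
        raise TimeoutError("read timed out")  # the "original" error
    except TimeoutError as inner:
        raise requests.exceptions.ConnectionError("Max retries exceeded") from inner
except requests.exceptions.ConnectionError as err:
    chain = list(exception_chain(err))

# The innermost exception is the one the question wants to inspect.
print(any(isinstance(e, TimeoutError) for e in chain))  # True
```

This lets you catch the broad ConnectionError but still decide what to do based on the root cause, without catching its other subclasses by accident.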

Related

Create a Connection Timeout using urllib2.urlopen()

I want to create a connection timeout exception using urlopen.
try:
    urllib2.urlopen("http://example.com", timeout=5)
except urllib2.URLError, e:
    raise MyException("There was an error: %r" % e)
This is the code. I want to create a timeout so that this code raises an exception. Thank you in advance.
You need to catch the socket.timeout exception; see the example below.
import urllib2
import socket

class MyException(Exception):
    pass

try:
    urllib2.urlopen("http://example.com", timeout=1)
except socket.timeout, e:
    # For Python 2.7
    raise MyException("There was an error: %r" % e)
I strongly recommend using the Requests library for making requests; it will make your life easier.
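As a follow-up to that recommendation, the same pattern with Requests might look like this sketch (the `fetch` helper and its 5-second default are illustrative, mirroring the urllib2 snippet above):

```python
import requests

def fetch(url, timeout=5):
    """Return the response body, or None if the request times out.

    Hypothetical helper mirroring the urllib2 snippet above.
    """
    try:
        return requests.get(url, timeout=timeout).text
    except requests.exceptions.Timeout:
        return None
```

Note that requests.exceptions.Timeout covers both connect and read timeouts, so you don't need to catch socket.timeout yourself.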

Python requests detailed ConnectionError handling

I just wrote this:
try:
    r = requests.get('http://example.com')
except requests.exceptions.ConnectionError as e:
    print(e)
And I got this output:
('Connection aborted.', RemoteDisconnected('Remote end closed connection without response',))
Does anyone know how I could get the different types of connection errors, like 'connection aborted', 'connection refused' and 'connection reset', from this exception and handle them?
If your goal is to get the response message and then handle it, you can try this code:
import requests
response = requests.get("http://www.example.com")
print(response.status_code)
print(response.reason)
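If you do need to tell connection errors apart, one heuristic (not an official requests API) is to walk the exception's cause chain looking for an underlying OSError and match its errno. A sketch, demonstrated offline with a hand-made chained exception in place of a real failed request:

```python
import errno
import requests

def classify(err):
    """Roughly categorize a requests ConnectionError by the errno of the
    underlying OSError in its cause chain (a heuristic, not an official API)."""
    exc = err
    while exc is not None:
        if isinstance(exc, OSError) and exc.errno == errno.ECONNREFUSED:
            return "refused"
        if isinstance(exc, OSError) and exc.errno == errno.ECONNRESET:
            return "reset"
        exc = exc.__cause__ or exc.__context__
    return "other"

# Offline demo: wrap a hand-made OSError the way requests chains errors.
inner = OSError(errno.ECONNREFUSED, "connection refused")
try:
    raise requests.exceptions.ConnectionError("connection failed") from inner
except requests.exceptions.ConnectionError as e:
    category = classify(e)

print(category)  # refused
```

In real traffic the chain typically runs ConnectionError -> urllib3 exception -> OSError, so the same walk applies.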

Bypassing the IncompleteRead exception

I am writing a Twitter stream listener in Python3 using Tweepy. I get this error after streaming for a while:
urllib3.exceptions.ProtocolError: ('Connection broken: IncompleteRead(0 bytes read)', IncompleteRead(0 bytes read))
How can I just bypass this, reconnect and keep going?
I have done:
from requests.packages.urllib3.exceptions import ReadTimeoutError, IncompleteRead
And:
while True:
    try:
        twitter_stream.filter(track=keywordlist, follow=userlist)
    except IncompleteRead:
        continue
But still getting the error.
The exception you're getting is a urllib3.exceptions.ProtocolError exception.
Try:
from urllib3.exceptions import ProtocolError

while True:
    try:
        twitter_stream.filter(track=keywordlist, follow=userlist)
    except ProtocolError:
        continue
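A bare `continue` reconnects immediately, which can hammer the endpoint. A sketch of the same loop with a short exponential backoff, demonstrated with a stub stream in place of `twitter_stream` (the stub and its behavior are placeholders for illustration):

```python
import time
from urllib3.exceptions import ProtocolError

class StubStream:
    """Stand-in for twitter_stream: fails twice, then succeeds."""
    def __init__(self):
        self.calls = 0

    def filter(self):
        self.calls += 1
        if self.calls < 3:
            raise ProtocolError("Connection broken: IncompleteRead(0 bytes read)")

stream = StubStream()
attempts = 0
while True:
    try:
        stream.filter()
        break  # stream ended normally
    except ProtocolError:
        attempts += 1
        # Exponential backoff, capped; scaled down so the demo runs fast.
        time.sleep(min(2 ** attempts, 30) * 0.01)

print(attempts)  # 2
```

With a real stream you would drop the scaling factor and sleep full seconds between reconnects.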

Checking for Timeout Error in python

So I have a pretty generic logging statement after a request:
try:
    r = requests.get(testUrl, timeout=10.0)
except Exception, err:
    logger.error({"message": err.message})
This works great for everything I've thrown at it except TimeoutError. When the request times out the err I get back is a tuple that it tries and fails to serialize.
My question is how do I catch just this one type of error? For starters TimeoutError is not something I have access to. I have tried adding from exceptions import * but with no luck. I've also tried importing OSError because the docs say TimeoutError is a subclass, but I was unable to access TimeoutError after importing OSError.
TimeoutError docs
I plan to either list my exceptions in order:
except TimeoutError, err:
    # handle this specific error
except Exception, err:
    # handle all other errors
or just check for type:
except Exception, err:
    if isinstance(err, TimeoutError):
        # handle specific error
    # handle all other errors
Python 2.7.3 & Django 1.5
You can handle the requests.Timeout exception:
try:
    r = requests.get(testUrl, timeout=10.0)
except requests.Timeout as err:
    logger.error({"message": err.message})
except requests.RequestException as err:
    # handle other errors
    pass
Example:
>>> import requests
>>> url = "http://httpbin.org/delay/2"
>>> try:
...     r = requests.get(url, timeout=1)
... except requests.Timeout as err:
...     print(err.message)
...
HTTPConnectionPool(host='httpbin.org', port=80): Read timed out. (read timeout=1)

Checking a Python FTP connection

I have a FTP connection from which I am downloading many files and processing them in between. I'd like to be able to check that my FTP connection hasn't timed out in between. So the code looks something like:
conn = FTP(host='blah')
conn.connect()

for item in list_of_items:
    myfile = open('filename', 'w')
    conn.retrbinary('stuff', myfile)
    ### do some parsing ###
How can I check my FTP connection in case it timed out during the ### do some parsing ### line?
Send a NOOP command. This does nothing but check that the connection is still going and if you do it periodically it can keep the connection alive.
For example:
conn.voidcmd("NOOP")
If there is a problem with the connection then the FTP object will throw an exception. You can see from the documentation that exceptions are thrown if there is an error:
socket.error and IOError: These are raised by the socket connection and are most likely the ones you are interested in.
exception ftplib.error_reply: Exception raised when an unexpected reply is received from the server.
exception ftplib.error_temp: Exception raised when an error code signifying a temporary error (response codes in the range 400–499) is received.
exception ftplib.error_perm: Exception raised when an error code signifying a permanent error (response codes in the range 500–599) is received.
exception ftplib.error_proto: Exception raised when a reply is received from the server that does not fit the response specifications of the File Transfer Protocol, i.e. begin with a digit in the range 1–5.
Therefore you can use a try-catch block to detect the error and handle it accordingly.
For example, this sample of code will catch an IOError, tell you about it, and then retry the operation:
retry = True
while retry:
    try:
        conn = FTP('blah')
        conn.connect()
        for item in list_of_items:
            myfile = open('filename', 'w')
            conn.retrbinary('stuff', myfile)
            ### do some parsing ###
        retry = False
    except IOError as e:
        print "I/O error({0}): {1}".format(e.errno, e.strerror)
        print "Retrying..."
        retry = True
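The NOOP check can also be wrapped in a small helper that reports whether the connection is still usable, using ftplib.all_errors to cover every exception ftplib can raise. A sketch, demonstrated with stubs in place of a live FTP connection (the stub classes are placeholders for illustration):

```python
import ftplib

def connection_alive(conn):
    """Return True if the FTP connection still responds to NOOP."""
    try:
        conn.voidcmd("NOOP")
        return True
    except ftplib.all_errors:
        return False

# Stubs standing in for ftplib.FTP objects.
class LiveConn:
    def voidcmd(self, cmd):
        return "200 OK"

class DeadConn:
    def voidcmd(self, cmd):
        raise EOFError  # ftplib raises this when the server goes away

print(connection_alive(LiveConn()))  # True
print(connection_alive(DeadConn()))  # False
```

Calling this between downloads lets you reconnect before the next retrbinary instead of failing mid-transfer.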
