Can someone explain to me how the generator and the try/except work in this code:
from contextlib import contextmanager

@contextmanager
def file_open(path):
    try:
        f_obj = open(path, 'w')
        yield f_obj
    except OSError:
        print("We had an error!")
    finally:
        print('Closing file')
        f_obj.close()

if __name__ == '__main__':
    with file_open('test.txt') as fobj:
        fobj.write('Testing context managers')
As I understand it, finally is always executed regardless of whether the try block raises. So in my opinion the code should work like this: if there are no exceptions, we open the file, reach the yield in the generator, then go to the finally block and return from the function. But I can't understand how the generator works in this code. We use it only once, which is why I would expect that we can't write all the text to the file. But I think my reasoning is wrong. Why?
So, first, your implementation is incorrect: the finally block will try to close the file object even when open() itself failed and f_obj was never assigned, which is a problem. What you need in this case is:
@contextmanager
def file_open(path):
    try:
        f_obj = open(path, 'w')
        try:
            yield f_obj
        finally:
            print('Closing file')
            f_obj.close()
    except OSError:
        print("We had an error!")
or more simply:
@contextmanager
def file_open(path):
    try:
        with open(path, 'w') as f_obj:
            yield f_obj
            print('Closing file')
    except OSError:
        print("We had an error!")
To "how do generators in general work?" I'll refer you to the existing question on that topic. This specific case is complicated because using the #contextlib.contextmanager decorator repurposes generators for a largely unrelated purpose, using the fact that they innately pause in two cases:
On creation (until the first value is requested)
On each yield (when each subsequent value is requested)
to implement context management.
contextmanager just abuses this to make a class like this (actual source code is rather more complicated to cover edge cases):
class contextmanager:
    def __init__(self, gen):
        self.gen = gen  # Receives generator in initial state

    def __enter__(self):
        return next(self.gen)  # Advances to first yield, returning the value it yields

    def __exit__(self, *args):
        if args[0] is not None:
            self.gen.throw(*args)  # Plus some complicated handling to ensure it did the right thing
        else:
            try:
                next(self.gen)  # Check if it yielded more than once
            except StopIteration:
                pass  # Expected to only yield once
            else:
                raise RuntimeError(...)  # Oops, it yielded more than once; that's not supposed to happen
allowing the coroutine elements of generators to back a simpler way to write simple context managers.
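To make the two pause points concrete, here is a small sketch (not from the answer above, just to illustrate) that copies the body of the simplified file_open into an undecorated generator and drives it by hand with next(), which is roughly what the decorator's __enter__ and __exit__ do for you:

def file_open_gen(path):
    # same body as the simplified file_open above, but left undecorated
    try:
        with open(path, 'w') as f_obj:
            yield f_obj
            print('Closing file')
    except OSError:
        print("We had an error!")

gen = file_open_gen('test.txt')  # paused on creation: nothing in the body has run yet
f = next(gen)                    # runs up to `yield f_obj`, pauses, hands back the file
f.write('Testing context managers')
try:
    next(gen)                    # resumes after the yield; prints, then the with block closes the file
except StopIteration:
    pass                         # the generator finishes after its single yield, as expected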
I am working with a class in Python that is part of a bigger program. The class calls different methods.
If there is an error in one of the methods, I would like the code to keep running afterwards, but once the program has finished, I want to be able to see which methods had potential errors in them.
Below is roughly how I am structuring it at the moment, and this solution doesn't scale very well with more methods. Is there a better way to provide feedback (after the code has fully run) as to which of the methods had a potential error?
class Class():
    def __init__(self):
        try:
            self.method_1()
        except:
            self.error_method1 = "Yes"
        try:
            self.method_2()
        except:
            self.error_method2 = "Yes"
        try:
            self.method_3()
        except:
            self.error_method3 = "Yes"
Although you could use sys.exc_info() to retrieve information about an exception when one occurs, as I mentioned in a comment, doing so may not be required since Python's standard try/except mechanism seems adequate.
Below is a runnable example showing how to do so in order to provide "feedback" later about the execution of several methods of a class. This approach uses a decorator function, so it should scale well since the same decorator can be applied to as many of the class's methods as desired.
from contextlib import contextmanager
from functools import wraps
import sys
from textwrap import indent


def provide_feedback(method):
    """ Decorator to trap exceptions and add messages to feedback. """
    @wraps(method)
    def wrapped_method(self, *args, **kwargs):
        try:
            return method(self, *args, **kwargs)
        except Exception as exc:
            self._feedback.append(
                '{!r} exception occurred in {}()'.format(exc, method.__qualname__))
    return wrapped_method


class Class():
    def __init__(self):
        with self.feedback():
            self.method_1()
            self.method_2()
            self.method_3()

    @contextmanager
    def feedback(self):
        self._feedback = []
        try:
            yield
        finally:
            # Example of what could be done with any exception messages.
            # They could instead be appended to some higher-level container.
            if self._feedback:
                print('Feedback:')
                print(indent('\n'.join(self._feedback), ' '))

    @provide_feedback
    def method_1(self):
        raise RuntimeError('bogus')

    @provide_feedback
    def method_2(self):
        pass

    @provide_feedback
    def method_3(self):
        raise StopIteration('Not enough foobar to go around')


inst = Class()
Output:
Feedback:
RuntimeError('bogus') exception occurred in Class.method_1()
StopIteration('Not enough foobar to go around') exception occurred in Class.method_3()
I have a script executing several independent functions in turn. I would like to collect the errors/exceptions happening along the way, in order to send an email with a summary of the errors.
What is the best way to raise these errors/exceptions and collect them, while allowing the script to complete and go through all the steps? They are independent, so it does not matter if one crashes. The remaining ones can still run.
def step_1():
    # Code that can raise errors/exceptions
    ...

def step_2():
    # Code that can raise errors/exceptions
    ...

def step_3():
    # Code that can raise errors/exceptions
    ...

def main():
    step_1()
    step_2()
    step_3()
    send_email_with_collected_errors()

if __name__ == '__main__':
    main()
Should I wrap each step in a try..except block in the main() function? Should I use a decorator on each step function, in addition to an error collector?
You could wrap each function in try/except, usually better for small simple scripts.
def step_1():
    # Code that can raise errors/exceptions
    ...

def step_2():
    # Code that can raise errors/exceptions
    ...

def step_3():
    # Code that can raise errors/exceptions
    ...

def main():
    try:
        step_1_result = step_1()
        log.info('Result of step_1 was {}'.format(step_1_result))
    except Exception as e:
        log.error('Exception raised. {}'.format(e))
        step_1_result = e
    try:
        step_2_result = step_2()
        log.info('Result of step_2 was {}'.format(step_2_result))
    except Exception as e:
        log.error('Exception raised. {}'.format(e))
        step_2_result = e
    try:
        step_3_result = step_3()
        log.info('Result of step_3 was {}'.format(step_3_result))
    except Exception as e:
        log.error('Exception raised. {}'.format(e))
        step_3_result = e
    send_email_with_collected_errors(
        step_1_result,
        step_2_result,
        step_3_result
    )

if __name__ == '__main__':
    main()
For something more elaborate you could use a decorator that'd construct a list of errors/exceptions caught. For example
class ErrorIgnore(object):
    def __init__(self, errors, errorreturn=None, errorcall=None):
        self.errors = errors
        self.errorreturn = errorreturn
        self.errorcall = errorcall

    def __call__(self, function):
        def returnfunction(*args, **kwargs):
            try:
                return function(*args, **kwargs)
            except Exception as E:
                if type(E) not in self.errors:
                    raise E
                if self.errorcall is not None:
                    self.errorcall(E, *args, **kwargs)
                return self.errorreturn
        return returnfunction
Then you could use it like this:
exceptions = []

def errorcall(E, *args):
    print('Exception raised {}'.format(E))
    exceptions.append(E)

@ErrorIgnore(errors=[ZeroDivisionError, ValueError], errorreturn=None, errorcall=errorcall)
def step_1():
    # Code that can raise errors/exceptions
    ...

def main():
    step_1()
    step_2()
    step_3()
    send_email_with_collected_errors(exceptions)

if __name__ == '__main__':
    main()
Use simple try/except statements and log the exceptions; that would be the standard way to collect all your errors.
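For example, a minimal sketch of that approach using the standard logging module (step_1 through step_3 are the functions from the question):

import logging

logging.basicConfig(filename='run.log', level=logging.ERROR)

for step in (step_1, step_2, step_3):
    try:
        step()
    except Exception:
        # logging.exception records the message plus the full traceback
        logging.exception('%s failed', step.__name__)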
There are two options:
Use a decorator in which you catch all exceptions and save them somewhere.
Add try/except everywhere.
Using a decorator might be much better and cleaner, and the code will be easier to maintain.
How to store the errors? Your decision. You can add them to some list, or create a logging class that receives exceptions and lets you retrieve them after everything is done. It depends on your project and the size of the code.
Simple logging class:
class LoggingClass(object):
    def __init__(self):
        self.exceptions = []

    def add_exception(self, exception):
        self.exceptions.append(exception)

    def get_all(self):
        return self.exceptions
Create an instance of the class in your script, catch exceptions in the decorator, and add them to the instance (though a global variable might also be OK).
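A rough sketch of how the decorator and the logging class could fit together (names like collect_exceptions are illustrative, not from any library):

logger = LoggingClass()  # module-level instance the decorator appends to

def collect_exceptions(func):
    """Run func, store any exception in logger instead of propagating it."""
    def wrapper(*args, **kwargs):
        try:
            return func(*args, **kwargs)
        except Exception as exc:
            logger.add_exception(exc)
    return wrapper

@collect_exceptions
def step_1():
    raise ValueError('something went wrong')

step_1()
print(logger.get_all())  # [ValueError('something went wrong')]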
I have a script that processes some data and, if a database/file is present, writes some data into it. I specify the database or file as a configargparse (argparse) argument. I need to clean up (close the file, the DB) in an organized way in case exceptions occur.
Here is my init:
import sqlite3
import configargparse
import sys

parser = configargparse.ArgParser(...)
parser.add('--database', dest='database',
           help='path to database with grabbers', metavar='FILE',
           type=lambda x: arghelper.is_valid_file(parser, x))
parser.add('-f', '--file', type=configargparse.FileType(mode='r'))
args = parser.parse_args()
I did it using if and try:
if args.database:
    conn = sqlite3.connect(args.database)
    c = conn.cursor()
# same init for file

try:
    while True:  # do something, it might be moved to some main() function
        result = foo()
        if args.database:
            c.execute('Write to database {}'.format(result))
        # same
        # for file
finally:
    if args.database:
        conn.close()
    # same
    # for file
except KeyboardInterrupt:
    print 'keyboard interrupt'
Can it be done with the with statement? Something like this (borrowing the ()?():() ternary from C):
with ((args.database)?
(conn = sqlite3.connect(args.database)):
(None)) as db, same for file:
and then refer to the db inside the with clause and check if they exist?
To answer your question first: it can be done, using contextlib, but I'm not sure how much you would gain from it.
from contextlib import contextmanager

@contextmanager
def uncertain_conn(args):
    yield sqlite3.connect(args.database) if args.database else None

# Then you use it like this
with uncertain_conn(args) as conn:
    # conn will be the value yielded by uncertain_conn(args)
    if conn is not None:
        try:
            # ...
But as I said, while turning a generator function into a context manager is cool (and personally I really like the contextmanager decorator), and it does give you the functionality you say you want, I don't know that it really helps you much here. If I were you I'd probably just be happy with if:
if args.database:
    conn = sqlite3.connect(args.database)
    try:
        # ...
There are a couple of things you can simplify with with, though. Check out closing, also from contextlib (really simple, I'll just quote the documentation):
contextlib.closing(thing)
Return a context manager that closes thing upon completion of the block. This is basically equivalent to:

from contextlib import contextmanager

@contextmanager
def closing(thing):
    try:
        yield thing
    finally:
        thing.close()
So the above code can become:
if args.database:
    conn = sqlite3.connect(args.database)
    with closing(conn):
        # do something; conn.close() will be called no matter what
But this won't print a nice message for KeyboardInterrupt. If you really need that, then I guess you still have to write out the try/except/finally yourself, as in the sketch below. Doing anything more fanciful is probably not worth it. (And note that except must precede finally, otherwise you get a syntax error.)
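For reference, a hand-written version of the question's loop with the clauses in a legal order (a sketch only, keeping the question's foo() and database write as placeholders) might look like this:

if args.database:
    conn = sqlite3.connect(args.database)
    c = conn.cursor()
try:
    while True:
        result = foo()  # placeholder from the question
        if args.database:
            c.execute('Write to database {}'.format(result))
except KeyboardInterrupt:
    print('keyboard interrupt')
finally:
    if args.database:
        conn.close()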
And you can even do this with suppress (but it requires a bit of caution; see below):
from contextlib import suppress

with suppress(TypeError):
    conn = sqlite3.connect(args.database or None)
    with closing(conn):
        # do business
with suppress(error): do_thing is equivalent to
try:
    do_thing
except error:
    pass
So if args.database evaluates to False, the second line is effectively connect(None), which raises a TypeError, which is caught by the context manager, and the code below it is skipped. But the risk is that it will suppress all TypeErrors in its scope, and you may not want that; one way to narrow the scope is sketched below.
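If you want the convenience of suppress without silencing unrelated TypeErrors, one option (a sketch, not from the answer above) is to guard only the connect call and branch on the result:

from contextlib import closing, suppress

conn = None
with suppress(TypeError):
    # only the connect call is guarded; connect(None) raises TypeError
    conn = sqlite3.connect(args.database or None)

if conn is not None:
    with closing(conn):
        pass  # do business; TypeErrors raised in here still propagate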
You can create your own context manager in such cases: create one which handles both connections. A context manager is a class which has the methods __enter__() and __exit__(); the first is called before entering the with block, the second when it is left (however that happens).
Here's an example of how to do this in your case:
def f(cond1, cond2):
    class MultiConnectionContextManager(object):
        def __init__(self, cond1, cond2):
            self.cond1 = cond1
            self.cond2 = cond2

        def __enter__(self):
            print "entering ..."
            if self.cond1:
                # self.connection1 = open(...)
                print "opening connection1"
            if self.cond2:
                # self.connection2 = open(...)
                print "opening connection2"
            return self

        def __exit__(self, exc_type, exc_value, traceback):
            print "exiting ..."
            if self.cond1:
                # self.connection1.close()
                print "closing connection1"
            if self.cond2:
                # self.connection2.close()
                print "closing connection2"

    with MultiConnectionContextManager(cond1, cond2) as handle:
        if cond1:
            # handle.connection1.read()
            print "using handle.connection1"
        if cond2:
            # handle.connection2.read()
            print "using handle.connection2"

for cond1 in (False, True):
    for cond2 in (False, True):
        print "=====", cond1, cond2
        f(cond1, cond2)
You can call this directly to see the outcome. Replace the prints with your real statements for opening, using, and closing the connections.
I have a requirement to execute multiple Python statements, and a few of them might fail during execution. Even after a failure, I want the rest of them to be executed.
Currently, I am doing:
try:
    wx.StaticBox.Destroy()
    wx.CheckBox.Disable()
    wx.RadioButton.Enable()
except:
    pass
If any one of the statements fails, the except gets executed and the rest of the try block is skipped. But what I need is that even if one statement fails, all three should still run.
How can I do this in Python?
Use a for loop over the methods you wish to call, eg:
for f in (wx.StaticBox.Destroy, wx.CheckBox.Disable, wx.RadioButton.Enable):
try:
f()
except Exception:
pass
Note that we're using except Exception here - that's generally much more likely what you want than a bare except.
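To see why that matters, a small illustration (my addition): KeyboardInterrupt and SystemExit inherit from BaseException rather than Exception, so except Exception lets Ctrl-C and sys.exit() propagate, while a bare except would swallow them.

try:
    raise KeyboardInterrupt
except Exception:
    print('not reached: KeyboardInterrupt is not an Exception subclass')
except BaseException:
    print('a bare except (or except BaseException) would catch it')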
If an exception occurs during a try block, the rest of the block is skipped. You should use three separate try clauses for your three separate statements.
Added in response to comment:
Since you apparently want to handle many statements, you could use a wrapper method to check for exceptions:
def mytry(functionname):
    try:
        functionname()
    except Exception:
        pass
Then call the method with the name of your function as input:
mytry(wx.StaticBox.Destroy)
I would recommend creating a context manager class that suppresses any exception and logs the exceptions.
Please look at the code below; I would encourage any improvement to it.
import sys

class catch_exception:
    def __init__(self, raising=True):
        self.raising = raising

    def __enter__(self):
        pass

    def __exit__(self, type, value, traceback):
        if type is not None and issubclass(type, Exception):
            self.raising = False
            print("Type: ", type, " Log me to error log file")
        return not self.raising

def staticBox_destroy():
    print("staticBox_destroy")
    raise TypeError("Passing through")

def checkbox_disable():
    print("checkbox_disable")
    raise ValueError("Passing through")

def radioButton_enable():
    print("radioButton_enable")
    raise ValueError("Passing through")

if __name__ == "__main__":
    with catch_exception() as cm:
        staticBox_destroy()
    with catch_exception() as cm:
        checkbox_disable()
    with catch_exception() as cm:
        radioButton_enable()
Trying to get the try/except statement working but having problems. This code takes a txt file and copies the file at the location in row 0 to the location in row 1. It works; however, if I change one of the paths to an invalid one, it generates an ftplib.error_perm error, but the except clause is not picking it up and everything stops. What am I doing wrong? Python 2.4.
import csv
import operator
import sys
import os
import shutil
import logging
import ftplib
import tldftp

def docopy(filename):
    ftp = tldftp.dev()
    inf = csv.reader(open(filename, 'r'))
    sortedlist = sorted(inf, key=operator.itemgetter(2), reverse=True)
    for row in sortedlist:
        src = row[0]
        dst = row[1]
        tldftp.textXfer(ftp, "RETR " + src, dst)

def hmm(haha):
    result = docopy(haha)
    try:
        it = iter(result)
    except ftplib.error_perm:
        print "Error Getting File"

if __name__ == "__main__":
    c = sys.argv[1]
    if (c == ''):
        raise Exception, "missing first parameter - row"
    hmm(c)
The except clause will only catch exceptions that are raised inside of their corresponding try block. Try putting the docopy function call inside of the try block as well:
def hmm(haha):
    try:
        result = docopy(haha)
        it = iter(result)
    except ftplib.error_perm:
        print "Error Getting File"
The point in the code which raises the error must be inside the try block. In this case, it's likely that the error is raised inside the docopy function, but that isn't enclosed in a try block.
Note that docopy returns None. As such, you will raise an exception when you try to make an iter out of None -- but it won't be a ftplib.error_perm exception, it'll be a TypeError
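If the goal is for the remaining rows to still be copied when one transfer fails, one option (a sketch only, reusing the question's tldftp helpers and Python 2 style) is to catch the error per row inside docopy instead:

def docopy(filename):
    ftp = tldftp.dev()
    inf = csv.reader(open(filename, 'r'))
    sortedlist = sorted(inf, key=operator.itemgetter(2), reverse=True)
    failed = []
    for row in sortedlist:
        src = row[0]
        dst = row[1]
        try:
            tldftp.textXfer(ftp, "RETR " + src, dst)
        except ftplib.error_perm:
            print "Error Getting File", src
            failed.append(src)
    return failed  # the caller now gets a real (iterable) result back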
If you are not sure which exception will occur, use the code below, because if you specify, for example, except StandardError: and the raised error is not of that type, the exception will not be handled.
try:
    # some code
except Exception:  # or only except:
    print "Error"  # Python 3: print("Error")
I know the OP is ancient, but for folks desperate for answers on this question: I had a similar issue. Depending on your IDE, if you have a breakpoint on any of the lines with specific exceptions, etc., this can conflict and stop the try/except from executing.
I noticed a global exception handler may not work in some cases. For example, pressing Ctrl+C while the epub.py module performs a urllib3 connection triggers a KeyboardInterrupt that cannot be caught in the main thread; the workaround is to put the clean-up code inside finally, e.g.:
try:
    main()
except Exception as e:
    clean_up_stuff()  # this one is never called if the keyboard interrupt happens in the urllib3 thread
finally:              # but this works
    clean_up_stuff()
This example is generic for Python 3.3+. When you decorate a generator function with an ordinary wrapper, the call simply returns a generator object successfully, so exceptions raised during iteration never enter the decorator's except; the magic happens with yield from f, which wraps the iteration itself inside the decorator:
from types import GeneratorType

def generic_exception_catcher(some_kwarg: int = 3):
    def catch_errors(func):
        def func_wrapper(*args, **kwargs):
            try:
                f = func(*args, **kwargs)
                if type(f) == GeneratorType:
                    yield from f
                else:
                    return f
            except Exception as e:
                raise e
        return func_wrapper
    return catch_errors
Usage:
@generic_exception_catcher(some_kwarg=4)
def test_gen():
    for x in range(0, 10):
        raise Exception('uhoh')
        yield x

for y in test_gen():
    print('should catch in the decorator')