Skipping execution of a with block - Python

I am defining a context manager class and I would like to be able to skip the block of code without raising an exception if certain conditions are met during instantiation. For example,
class My_Context(object):
    def __init__(self, mode=0):
        """
        if mode = 0, proceed as normal
        if mode = 1, do not execute block
        """
        self.mode = mode

    def __enter__(self):
        if self.mode == 1:
            print 'Exiting...'
            CODE TO EXIT PREMATURELY

    def __exit__(self, type, value, traceback):
        print 'Exiting...'

with My_Context(mode=1):
    print 'Executing block of codes...'

According to PEP 343, a with statement translates from:

with EXPR as VAR:
    BLOCK

to:

mgr = (EXPR)
exit = type(mgr).__exit__  # Not calling it yet
value = type(mgr).__enter__(mgr)
exc = True
try:
    try:
        VAR = value  # Only if "as VAR" is present
        BLOCK
    except:
        # The exceptional case is handled here
        exc = False
        if not exit(mgr, *sys.exc_info()):
            raise
        # The exception is swallowed if exit() returns true
finally:
    # The normal and non-local-goto cases are handled here
    if exc:
        exit(mgr, None, None, None)
As you can see, there is nothing obvious you can do from the call to the __enter__() method of the context manager that can skip the body ("BLOCK") of the with statement.
People have done Python-implementation-specific things, such as manipulating the call stack inside __enter__(), in projects such as withhacks. I recall Alex Martelli posting a very interesting with-hack on Stack Overflow a year or two back (I don't recall enough of the post off-hand to search for it).
But the simple answer to your question / problem is that you cannot do what you're asking, skipping the body of the with statement, without resorting to so-called "deep magic" (which is not necessarily portable between Python implementations). With deep magic you might be able to do it, but I recommend only doing such things as an exercise in seeing how it might be done, never in production code.
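To see why a plain exception in __enter__ does not help, here is a minimal sketch (the class name and error are hypothetical, not from the question): the exception surfaces at the with statement itself, __exit__ is never called, and the caller still has to wrap the with statement in a try/except.

class GuardedContext(object):  # hypothetical name for illustration
    def __init__(self, skip=False):
        self.skip = skip

    def __enter__(self):
        if self.skip:
            # Raised at the `with` statement; the body is skipped,
            # but the exception propagates to the caller.
            raise RuntimeError('skipping block')
        return self

    def __exit__(self, exc_type, exc_value, traceback):
        return False  # never even called when __enter__ raises

try:
    with GuardedContext(skip=True):
        print('never reached')
except RuntimeError:
    print('block skipped, but only because the caller caught the error')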

If you want an ad-hoc solution that uses the ideas from withhacks (specifically from AnonymousBlocksInPython), this will work:
import sys
import inspect

class My_Context(object):
    def __init__(self, mode=0):
        """
        if mode = 0, proceed as normal
        if mode = 1, do not execute block
        """
        self.mode = mode

    def __enter__(self):
        if self.mode == 1:
            print 'Met block-skipping criterion ...'
            # Do some magic
            sys.settrace(lambda *args, **keys: None)
            frame = inspect.currentframe(1)
            frame.f_trace = self.trace

    def trace(self, frame, event, arg):
        raise

    def __exit__(self, type, value, traceback):
        print 'Exiting context ...'
        return True
Compare the following:

with My_Context(mode=1):
    print 'Executing block of code ...'

against:

with My_Context(mode=0):
    print 'Executing block of code ... '

A Python 3 update of the hack mentioned in other answers, taken from withhacks (specifically from AnonymousBlocksInPython):
import sys

class SkipWithBlock(Exception):
    pass

class SkipContextManager:
    def __init__(self, skip):
        self.skip = skip

    def __enter__(self):
        if self.skip:
            sys.settrace(lambda *args, **keys: None)
            frame = sys._getframe(1)
            frame.f_trace = self.trace

    def trace(self, frame, event, arg):
        raise SkipWithBlock()

    def __exit__(self, type, value, traceback):
        if type is None:
            return  # No exception
        if issubclass(type, SkipWithBlock):
            return True  # Suppress special SkipWithBlock exception

with SkipContextManager(skip=True):
    print('In the with block')  # Won't be called

print('Out of the with block')
As mentioned before by joe, this is a hack that should be avoided:

The method trace() is called when a new local scope is entered, i.e. right when the code in your with block begins. When an exception is raised there, it gets caught by __exit__(). That's how this hack works. I should add that this is very much a hack and should not be relied upon. The magical sys.settrace() is not actually a part of the language definition, it just happens to be in CPython. Also, debuggers rely on sys.settrace() to do their job, so using it yourself interferes with that. There are many reasons why you shouldn't use this code. Just FYI.

Based on @Peter's answer, here's a version that uses no string manipulations but should work the same way otherwise:
from contextlib import contextmanager

@contextmanager
def skippable_context(skip):
    skip_error = ValueError("Skipping Context Exception")
    prev_entered = getattr(skippable_context, "entered", False)
    skippable_context.entered = False

    def command():
        skippable_context.entered = True
        if skip:
            raise skip_error

    try:
        yield command
    except ValueError as err:
        if err is not skip_error:
            raise
    finally:
        assert skippable_context.entered, "Need to call returned command at least once."
        skippable_context.entered = prev_entered
print("=== Running with skip disabled ===")
with skippable_context(skip=False) as command:
command()
print("Entering this block")
print("... Done")
print("=== Running with skip enabled ===")
with skippable_context(skip=True) as command:
command()
raise NotImplementedError("... But this will never be printed")
print("... Done")

What you're trying to do isn't possible, unfortunately. If __enter__ raises an exception, that exception is raised at the with statement (__exit__ isn't called). If it doesn't raise an exception, then the return value is fed to the block and the block executes.
Closest thing I could think of is a flag checked explicitly by the block:
class Break(Exception):
    pass

class MyContext(object):
    def __init__(self, mode=0):
        """
        if mode = 0, proceed as normal
        if mode = 1, do not execute block
        """
        self.mode = mode

    def __enter__(self):
        if self.mode == 1:
            print 'Exiting...'
        return self.mode

    def __exit__(self, type, value, traceback):
        if type is None:
            print 'Normal exit...'
            return  # no exception
        if issubclass(type, Break):
            return True  # suppress exception
        print 'Exception exit...'

with MyContext(mode=1) as skip:
    if skip: raise Break()
    print 'Executing block of codes...'
This also lets you raise Break() in the middle of a with block to simulate a normal break statement.
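A quick sketch of that mid-block use (some_condition is a hypothetical flag, not from the original answer):

with MyContext(mode=0) as skip:
    print 'First part of the block...'
    if some_condition:  # hypothetical flag defined elsewhere
        raise Break()   # acts like `break`: __exit__ suppresses it
    print 'Not reached when some_condition is true'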

Context managers are not the right construct for this. You're asking for the body to be executed n times, in this case zero or one. If you look at the general case, n where n >= 0, you end up with a for loop:
def do_squares(n):
    for i in range(n):
        yield i ** 2

for x in do_squares(3):
    print('square: ', x)

for x in do_squares(0):
    print('this does not print')
In your case, which is more special-purpose and doesn't require binding the loop variable:
def should_execute(mode=0):
    if mode == 0:
        yield

for _ in should_execute(0):
    print('this prints')

for _ in should_execute(1):
    print('this does not')

Another slightly hacky option makes use of exec. This is handy because it can be modified to do arbitrary things (e.g. memoization of context-blocks):
from contextlib import contextmanager

@contextmanager
def skippable_context_exec(skip):
    SKIP_STRING = 'Skipping Context Exception'
    old_value = skippable_context_exec.is_execed if hasattr(skippable_context_exec, 'is_execed') else False
    skippable_context_exec.is_execed = False
    command = "skippable_context_exec.is_execed=True; " + ("raise ValueError('{}')".format(SKIP_STRING) if skip else '')
    try:
        yield command
    except ValueError as err:
        if SKIP_STRING not in str(err):
            raise
    finally:
        assert skippable_context_exec.is_execed, "You never called exec in your context block."
        skippable_context_exec.is_execed = old_value
print('=== Running with skip disabled ===')
with skippable_context_exec(skip=False) as command:
    exec(command)
    print('Entering this block')

print('... Done')

print('=== Running with skip enabled ===')
with skippable_context_exec(skip=True) as command:
    exec(command)
    print('... But this will never be printed')

print('... Done')
It would be nice to have something that gets rid of the exec without weird side effects, so if you can think of a way, I'm all ears. The current lead answer to this question appears to do that, but it has some issues.


Returning value when exiting python context manager

Maybe this is a stupid (and indeed not very practical) question but I'm asking it because I can't wrap my head around it.
While researching if a return statement inside a call to a context manager would prevent __exit__ from being called (no it doesn't), I found that it seems common to make an analogy between __exit__ and finally in a try/finally block (for example here: https://stackoverflow.com/a/9885287/3471881) because:
def test():
    try:
        return True
    finally:
        print("Good bye")
Would execute the same as:
class MyContextManager:
    def __enter__(self):
        return self

    def __exit__(self, *args):
        print('Good bye')

def test():
    with MyContextManager():
        return True
This really helped me understand how context managers work, but after playing around a bit I realised that this analogy won't work if we are returning something rather than printing.
def test():
    try:
        return True
    finally:
        return False

test()
--> False
While __exit__ seemingly won't return at all:

class MyContextManager:
    def __enter__(self):
        return self

    def __exit__(self, *args):
        return False

def test():
    with MyContextManager():
        return True

test()
--> True
This led me to think that perhaps you can't actually return anything from inside __exit__, but you can:
class MyContextManager:
    def __enter__(self):
        return self

    def __exit__(self, *args):
        return self.last_goodbye()

    def last_goodbye(self):
        print('Good bye')

def test():
    with MyContextManager():
        return True

test()
--> Good bye
--> True
Note that it doesn't matter if we don't return anything inside the test() function.
This leads me to my question:
Is it impossible to return a value from inside __exit__ and if so, why?
Yes, it is impossible to alter the return value of the context from inside __exit__.
If the context is exited with a return statement, you cannot alter the return value with your context manager's __exit__. This is different from a try ... finally ... clause, because the code in finally still belongs to the parent function, while __exit__ runs in its own scope.
In fact, __exit__ can return a boolean value (True or False) and Python will honour it: it tells Python whether the exception that exits the context (if any) should be suppressed, i.e. not propagate outside the context.
See this example of the meaning of the return value of __exit__:
>>> class MyContextManager:
...     def __init__(self, suppress):
...         self.suppress = suppress
...
...     def __enter__(self):
...         return self
...
...     def __exit__(self, exc_type, exc_obj, exc_tb):
...         return self.suppress
...
>>> with MyContextManager(True):  # suppress exception
...     raise ValueError
...
>>> with MyContextManager(False):  # let exception pass through
...     raise ValueError
...
Traceback (most recent call last):
  File "<stdin>", line 2, in <module>
ValueError
>>>
In the above example, both ValueErrors cause control to jump out of the context. In the first block, the __exit__ method of the context manager returns True, so Python suppresses the exception and it is not reflected in the REPL. In the second block, the context manager returns False, so Python lets the outer code handle the exception, which gets printed out by the REPL.
The workaround is to store the result in an attribute instead of returning it, and access it later. That is, if you intend to use that value for more than a print.
For example, take this simple context manager:
import time

class time_this_scope():
    """Context manager to measure how much time was spent in the target scope."""

    def __init__(self, allow_print=False):
        self.t0 = None
        self.dt = None
        self.allow_print = allow_print

    def __enter__(self):
        self.t0 = time.perf_counter()

    def __exit__(self, type=None, value=None, traceback=None):
        self.dt = time.perf_counter() - self.t0  # Store the desired value.
        if self.allow_print is True:
            print(f"Scope took {self.dt*1000: 0.1f} milliseconds.")
It could be used this way:

with time_this_scope(allow_print=True):
    time.sleep(0.100)

>>> Scope took 100 milliseconds.
or like so:

timer = time_this_scope()
with timer:
    time.sleep(0.100)
dt = timer.dt
It would not work as shown below, however, because __enter__ does not return the instance, so timer is bound to None inside the with statement. We would need to modify the class and add return self to __enter__ (a sketch of that modification follows the error below). Before that modification, you would get an error:
with time_this_scope() as timer:
    time.sleep(0.100)
dt = timer.dt

>>> AttributeError: 'NoneType' object has no attribute 'dt'
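A minimal sketch of that modification, with only __enter__ changed so that it returns the instance:

class time_this_scope():
    """Same as above, but usable as `with time_this_scope() as timer:`."""

    def __init__(self, allow_print=False):
        self.t0 = None
        self.dt = None
        self.allow_print = allow_print

    def __enter__(self):
        self.t0 = time.perf_counter()
        return self  # lets `as timer` bind the instance

    def __exit__(self, type=None, value=None, traceback=None):
        self.dt = time.perf_counter() - self.t0
        if self.allow_print is True:
            print(f"Scope took {self.dt*1000: 0.1f} milliseconds.")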
Finally, here is a simple use example:

"""Calculate the average time spent sleeping."""
import numpy as np
import time

N = 100
dt_mean = 0
for n in range(N):
    timer = time_this_scope()
    with timer:
        time.sleep(0.001 + np.random.rand()/1000)  # 1-2 ms per loop.
    dt = timer.dt
    dt_mean += dt/N
    print(f"Loop {n+1}/{N} took {dt}s.")

print(f"All loops took {dt_mean}s on average.")

Short form to return method result if condition passed

I'm wondering if there's any pythonic or short-form method to achieve the following:
error_response = self.check_conditions(request)

# If we have an error response, return it, otherwise continue as normal.
if error_response:
    return error_response
Something like:
(return self.check_conditions(request)) or pass
Alternatively, is it possible for a function to return the calling method, such as:
self.check_conditions(request)
def check_conditions(self, request):
error_response = do_stuff()
if error_response:
return_parent error_response
I get the feeling the second concept is breaking a ton of programming laws to prevent chaos and the apocalypse, just a thought though :)
No, there is no short form for a conditional return.
But, to get to the second part of your question:
There are exceptions in Python. You can write something like this:
class MyErrorResponse(Exception): pass

class MyClass:
    ...

    def check_conditions(self, request):
        error_response = do_stuff()
        if error_response:
            raise MyErrorResponse(error_response)

    def do_the_main_stuff(self):
        try:
            self.check_conditions()
            ...
        except MyErrorResponse as e:
            return e.args[0]
That depends a lot on what check_conditions does under the hood. It's likely that you can move error handling down a level of abstraction and handle things directly:
Compare:
error = False

def foo(request):
    global error
    try:
        result = do_something_with(request)
    except SomeWellDefinedError:
        error = True

def check_conditions(request):
    foo(request)
    return error

def main():
    error_response = check_conditions(some_request)
    if error_response:
        pass  # freak out!
With
def foo(request):
    try:
        result = do_something_with(request)
    except SomeWellDefinedError:
        # you can try to handle the error here, or...
        raise  # uh oh!

def main():
    try:
        foo(some_request)
    except SomeWellDefinedError:
        pass  # handle the error here, instead!

Threading with exception... why does it keep ignoring the pass keyword?

I am stuck with a Python issue related to threading.
import threading
import time
import random
import sys
import echo

class presence(threading.Thread):
    def __init__(self, cb):
        threading.Thread.__init__(self)
        self.callback = cb

    def run(self):
        minValue = 0
        maxValue = 3
        try:
            while True:
                time.sleep(1)
                if random.randint(minValue, maxValue) == 1:
                    self.callback(1)
                elif random.randint(minValue, maxValue) == 2:
                    raise Exception('An error')
                else:
                    self.callback(0)
        except:
            print 'Exception caught!'
            pass

def showAlert():
    echo.echo('Someone is behind the door!')

def count(x):
    if x == 1:
        showAlert()
        sys.stdout.flush()
This is how I call it:

t2 = presence.presence(presence.count)
t2.start()

I eventually get an "Exception caught!", but then the thread stops and no longer returns any alerts.
What did I do wrong here?
The try/except block should be inside the loop. For example:
while True:
    ...
    elif random.randint(minValue, maxValue) == 2:
        try:
            raise Exception('An error')
        except Exception:
            print 'Exception caught!'
Otherwise, the loop will be exited when the exception is raised and Python jumps to the except: block in order to handle it.
You'll notice too that I selectively placed the try/except block in my example to only cover the code that might actually raise the exception. This is a best practice and I recommend it for your code. Having a try/except block enclose large portions of code decreases readability and also wastes space (lots of lines are unnecessarily indented).
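Applying that advice to the original run() method might look roughly like this (a sketch, keeping the question's Python 2 style; only the raising branch is wrapped, and the loop keeps going afterwards):

def run(self):
    minValue = 0
    maxValue = 3
    while True:
        time.sleep(1)
        if random.randint(minValue, maxValue) == 1:
            self.callback(1)
        elif random.randint(minValue, maxValue) == 2:
            try:
                raise Exception('An error')
            except Exception:
                # Only this iteration is affected; the loop continues.
                print 'Exception caught!'
        else:
            self.callback(0)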

Catch exception and continue try block in Python

Can I return to executing the try block after exception occurs?
For example:
try:
    do_smth1()
except:
    pass

try:
    do_smth2()
except:
    pass

vs.

try:
    do_smth1()
    do_smth2()
except:
    ???  # magic word to proceed to do_smth2() if there was exception in do_smth1
No, you cannot do that. That's just the way Python has its syntax. Once you exit a try-block because of an exception, there is no way back in.
What about a for-loop though?
funcs = do_smth1, do_smth2

for func in funcs:
    try:
        func()
    except Exception:
        pass  # or you could use 'continue'
Note, however, that it is considered bad practice to have a bare except. You should catch a specific exception instead. I caught Exception because that's as good as I can do without knowing what exceptions the methods might throw.
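For instance, if you knew these functions could only fail with, say, a ValueError (a hypothetical choice here), you would narrow the handler accordingly:

for func in (do_smth1, do_smth2):
    try:
        func()
    except ValueError:  # hypothetical: catch only the error you actually expect
        pass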
While the other answers and the accepted one are correct and should be followed in real code, just for completeness and humor, you can try the fuckitpy ( https://github.com/ajalt/fuckitpy ) module.
Your code can be changed to the following:

@fuckitpy
def myfunc():
    do_smth1()
    do_smth2()

Then calling myfunc() would call do_smth2() even if there is an exception in do_smth1().
Note: Please do not try it in any real code, it is blasphemy
You can achieve what you want, but with a different syntax. You can use a "finally" block after the try/except. Done this way, Python will execute the block of code regardless of whether an exception was thrown.
Like this:
try:
    do_smth1()
except:
    pass
finally:
    do_smth2()
But if you want to execute do_smth2() only if the exception was not thrown, use an "else" block:
try:
    do_smth1()
except:
    pass
else:
    do_smth2()
You can mix them too, in a try/except/else/finally clause.
Have fun!
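For instance, a combined sketch of all four clauses (the exception type and cleanup() are hypothetical placeholders):

try:
    do_smth1()
except ValueError:  # hypothetical: whatever do_smth1() may raise
    pass            # runs only if do_smth1() failed
else:
    do_smth2()      # runs only if do_smth1() succeeded
finally:
    cleanup()       # hypothetical: always runs, exception or not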
'continue' is allowed within an 'except' or 'finally' block only if the try block is in a loop; 'continue' will cause the next iteration of the loop to start. So you can put your two or more functions in a list and use a loop to call them, like this:
Like this:
funcs = [f, g]

for func in funcs:
    try:
        func()
    except:
        continue
For full information you can go to this link
You could iterate through your methods...

for m in [do_smth1, do_smth2]:
    try:
        m()
    except:
        pass
One way you could handle this is with a generator: instead of calling the function, yield it; then whatever is consuming the generator can send the result of calling it back into the generator, or a sentinel if the call failed. The trampoline that accomplishes this might look like so:
def consume_exceptions(gen):
    action = next(gen)
    while True:
        try:
            result = action()
        except Exception:
            # if the action fails, send a sentinel
            result = None
        try:
            action = gen.send(result)
        except StopIteration:
            # if the generator is all used up, result is the return value.
            return result
A generator that would be compatible with this would look like this:

def do_smth1():
    1 / 0

def do_smth2():
    print "YAY"

def do_many_things():
    a = yield do_smth1
    b = yield do_smth2
    yield "Done"

>>> consume_exceptions(do_many_things())
YAY
Note that do_many_things() does not call do_smth*, it just yields them, and consume_exceptions calls them on its behalf
I don't think you want to do this. The correct way to use a try statement in general is as precisely as possible. I think it would be better to do:
try:
    do_smth1()
except Stmnh1Exception:
    pass  # handle Stmnh1Exception

try:
    do_smth2()
except Stmnh2Exception:
    pass  # handle Stmnh2Exception
Depending on where and how often you need to do this, you could also write a function that does it for you:
def live_dangerously(fn, *args, **kw):
    try:
        return fn(*args, **kw)
    except Exception:
        pass

live_dangerously(do_smth1)
live_dangerously(do_smth2)
But as other answers have noted, having a null except is generally a sign something else is wrong with your code.
This can be done with exec() in a custom function, a list of strings, and a for loop.
The function with exec():
def try_it(string):
    try:
        exec(string)
        print(f'Done: {string}')
    except:
        print(f'Error. Could not do: {string}')
More on exec():

exec(object)
    This function supports dynamic execution of Python code. object must be either a string or a code object.
Example list of strings and for loop:

do_smth_list = ['do_smth1()', 'do_smth2()', 'do_smth3()']

for smth in do_smth_list:
    try_it(smth)
This definitely isn't the cleanest way of doing it, but you can put it in a while loop with a variable set to True: when the functions run successfully, the variable is set to False, whereas if they fail, it stays True.
x = True
while x == True:
    try:
        do_smth1()
        do_smth2()
        x = False
    except Exception:
        x = True
This way, the while loop keeps retrying the try/except section again and again until it works, at which point x is set to False and the loop stops.
Also, you can implement a break in the while loop instead of basing it on a variable, for example:
while True:
    try:
        do_smth1()
        do_smth2()
        break
    except Exception:
        pass
P.S. It is good etiquette to name a specific exception in the except clause instead of catching any exception. It makes the code cleaner and is more sensible when managing errors, especially in bigger projects.
special_func to avoid try-except repetition:
def special_func(test_case_dict):
    final_dict = {}
    exception_dict = {}

    def try_except_avoider(test_case_dict):
        try:
            for k, v in test_case_dict.items():
                final_dict[k] = eval(v)  # If no exception, evaluate the function and add it to final_dict
        except Exception as e:
            exception_dict[k] = e  # extract exception
            test_case_dict.pop(k)
            try_except_avoider(test_case_dict)  # recursive function to handle remaining functions
        finally:  # cleanup
            final_dict.update(exception_dict)
            return final_dict  # combine exception dict and final dict

    return try_except_avoider(test_case_dict)
Run code:

def add(a, b):
    return (a + b)

def sub(a, b):
    return (a - b)

def mul(a, b):
    return (a * b)

case = {"AddFunc": "add(8,8)", "SubFunc": "sub(p,5)", "MulFunc": "mul(9,6)"}
solution = special_func(case)
Output looks like:
{'AddFunc': 16, 'MulFunc': 54, 'SubFunc': NameError("name 'p' is not defined")}
To convert to variables:
locals().update(solution)
Variables would look like:
AddFunc = 16, MulFunc = 54, SubFunc = NameError("name 'p' is not defined")
If do_smth1() worked, then do_smth2() will not be tried.
try:
    x = do_smth1()
except:
    try:
        x = do_smth2()
    except:
        x = "Both Failed"

print(x)

How to use a python context manager inside a generator

In Python, should with statements be used inside a generator? To be clear, I am not asking about using a decorator to create a context manager from a generator function. I am asking whether there is an inherent issue with using a with statement as a context manager inside a generator, since it will catch StopIteration and GeneratorExit exceptions in at least some cases. Two examples follow.
A good example of the issue is raised by Beazley's example (page 106). I have modified it to use a with statement so that the files are explicitly closed after the yield in the opener method. I have also added two ways that an exception can be thrown while iterating the results.
import os
import fnmatch

def find_files(topdir, pattern):
    for path, dirname, filelist in os.walk(topdir):
        for name in filelist:
            if fnmatch.fnmatch(name, pattern):
                yield os.path.join(path, name)

def opener(filenames):
    f = None
    for name in filenames:
        print "F before open: '%s'" % f
        #f = open(name,'r')
        with open(name, 'r') as f:
            print "Fname: %s, F#: %d" % (name, f.fileno())
            yield f
        print "F after yield: '%s'" % f

def cat(filelist):
    for i, f in enumerate(filelist):
        if i == 20:
            # Cause an exception
            f.write('foobar')
        for line in f:
            yield line

def grep(pattern, lines):
    for line in lines:
        if pattern in line:
            yield line

pylogs = find_files("/var/log", "*.log*")
files = opener(pylogs)
lines = cat(files)
pylines = grep("python", lines)

i = 0
for line in pylines:
    i += 1
    if i == 10:
        raise RuntimeError("You're hosed!")

print 'Counted %d lines\n' % i
In this example, the context manager successfully closes the files in the opener function. When an exception is raised, I see the traceback from the exception, but the generator stops silently. If the with statement catches the exception, why doesn't the generator continue?
When I define my own context managers for use inside a generator, I get runtime errors saying that I have ignored a GeneratorExit. For example:
class CManager(object):
    def __enter__(self):
        print " __enter__"
        return self

    def __exit__(self, exctype, value, tb):
        print " __exit__; excptype: '%s'; value: '%s'" % (exctype, value)
        return True

def foo(n):
    for i in xrange(n):
        with CManager() as cman:
            cman.val = i
            yield cman

# Case 1
for item in foo(10):
    print 'Pass - val: %d' % item.val

# Case 2
for item in foo(10):
    print 'Fail - val: %d' % item.val
    item.not_an_attribute
This little demo works fine in case 1 with no exceptions raised, but fails in case 2 where an attribute error is raised. Here I see a RuntimeError raised because the with statement has caught and ignored a GeneratorExit exception.
Can someone help clarify the rules for this tricky use case? I suspect it is something I am doing, or not doing in my __exit__ method. I tried adding code to re-raise GeneratorExit, but that did not help.
From the Data model entry for object.__exit__:
If an exception is supplied, and the method wishes to suppress the exception (i.e., prevent it from being propagated), it should return a true value. Otherwise, the exception will be processed normally upon exit from this method.
In your __exit__ function, you're returning True which will suppress all exceptions. If you change it to return False, the exceptions will continue to be raised as normal (with the only difference being that you guarantee that your __exit__ function gets called and you can make sure to clean up after yourself)
For example, changing the code to:
def __exit__(self, exctype, value, tb):
    print " __exit__; excptype: '%s'; value: '%s'" % (exctype, value)
    if exctype is GeneratorExit:
        return False
    return True
allows you to do the right thing and not suppress the GeneratorExit. Now you only see the attribute error. Maybe the rule of thumb should be the same as with any exception handling: only intercept exceptions you know how to handle. Having __exit__ return True is on a par with (maybe slightly worse than!) having a bare except:
try:
    something()
except:  # Uh-Oh
    pass
Note that when the AttributeError is raised (and not caught), I believe that causes the reference count on your generator object to drop to 0 which then triggers a GeneratorExit exception within the generator so that it can clean itself up. Using my __exit__, play around with the following two cases and hopefully you'll see what I mean:
try:
    for item in foo(10):
        print 'Fail - val: %d' % item.val
        item.not_an_attribute
except AttributeError:
    pass

print "Here"  # No reference to the generator left.
              # Should see __exit__ before "Here"
and
g = foo(10)
try:
    for item in g:
        print 'Fail - val: %d' % item.val
        item.not_an_attribute
except AttributeError:
    pass

print "Here"
b = g  # keep a reference to prevent the reference counter from cleaning this up.
       # Now we see __exit__ *after* "Here"
import sys

class CManager(object):
    def __enter__(self):
        print " __enter__"
        return self

    def __exit__(self, exctype, value, tb):
        print " __exit__; excptype: '%s'; value: '%s'" % (exctype, value)
        if exctype is None:
            return
        # only re-raise if it's *not* the exception that was
        # passed to throw(), because __exit__() must not raise
        # an exception unless __exit__() itself failed. But throw()
        # has to raise the exception to signal propagation, so this
        # fixes the impedance mismatch between the throw() protocol
        # and the __exit__() protocol.
        #
        if sys.exc_info()[1] is not (value or exctype()):
            raise
