Python decorator - Trying to understand a simple example

I am trying to understand Python decorators.
I devised this simple example where I want the decorator to act as a custom log that just prints 'error!' if, for instance, I try to sum an int and a str:
def log(fun):
    try:
        return fun(*args)
    except:
        print('error!')

@log
def sum_(a, b):
    return a + b
This prints "error!" as soon as I simply define the function. I suspect there are multiple things wrong with what I did... I tried to look into the other questions on this topic, but I find them all too intricate to understand how such a simple example should be drafted, especially how to pass the arguments from the original function.
All help and pointers appreciated.

That's because you're not forwarding the args from the decorated function to your decorator, and the catch-all except catches the NameError raised for the undefined args; that's one of the reasons to always specify the exception class.
Here's a modified version of your code with the try/except removed and the function arguments correctly forwarded:
def log(fun):
    def wrapper(*args):
        print('in decorator!')
        return fun(*args)
    return wrapper

@log
def sum_(a, b):
    return a + b

print(sum_(1, 2))

The reason you're getting an error is simply that args is undefined in your decorator. This isn't anything special about decorators, just a regular NameError. For this reason you probably want to restrict your except clause to just TypeError, so that you're not silencing other errors. A full implementation would be:
import functools

def log(fun):
    @functools.wraps(fun)
    def inner(*args):
        try:
            return fun(*args)
        except TypeError:
            print('error!')
    return inner

@log
def sum_(a, b):
    return a + b
It's also a good idea to decorate your inner functions with the functools.wraps decorator, which transfers the name and docstring from your original function to the decorated one.
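To see the difference functools.wraps makes, compare the metadata on the decorated function (a minimal sketch; the docstring is added here purely for illustration):
import functools

def log(fun):
    @functools.wraps(fun)   # copies __name__, __doc__, etc. from fun onto inner
    def inner(*args):
        return fun(*args)
    return inner

@log
def sum_(a, b):
    "Add two values."
    return a + b

print(sum_.__name__)  # 'sum_' rather than 'inner'
print(sum_.__doc__)   # 'Add two values.'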

The log decorator, in this case, does not return a function but a value. This may stem from an assumption that the decorator function replaces the original function, when in fact it is called to create a replacement function.
A fix that may represent the intention:
def log(fun):
    def my_func(*args):
        try:
            return fun(*args)
        except:
            print('error!')
    return my_func
In this case, my_func is the function that actually runs when you call sum_(1, 2); internally, it calls the original function (the original sum_) that the decorator received as an argument.
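A quick check of the behaviour, using the log decorator defined just above (the expected results are noted in the comments):
@log
def sum_(a, b):
    return a + b

print(sum_(1, 2))    # my_func(1, 2) calls the original sum_ -> prints 3
print(sum_(1, 'x'))  # int + str raises TypeError inside sum_; my_func prints
                     # 'error!' and falls off the end, so this prints None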
A trivial example that illustrates the order of the actions:
def my_decorator(fun):
    print('This will be printed first, during module load')
    def my_wrapper(*args):
        print('This will be printed during call, before the original func')
        return fun(*args)
    return my_wrapper

@my_decorator
def func():
    print('This will be printed in the original func')
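If you now call func(), the order of the output looks like this (a minimal sketch; the expected prints are shown as comments):
func()
# Defining func already ran my_decorator and printed:
#   This will be printed first, during module load
# The call above then prints:
#   This will be printed during call, before the original func
#   This will be printed in the original func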

Related

Please assist in developing a Python decorator

DESCRIPTION
Implement a Python decorator that takes whatever the decorated function returns and writes it to a file on a new line. For the sake of this problem, let us assume that the decorated functions always return a string. The decorator should be named log_message and should write to the file /tmp/decorator_logs.txt.
Implement the following design:
@log_message
def a_function_that_returns_a_string():
    return "A string"

@log_message
def a_function_that_returns_a_string_with_newline(s):
    return "{}\n".format(s)

@log_message
def a_function_that_returns_another_string(string=""):
    return "Another string"
Here is the decorator:
def log_message(func):
    def wrap(*args, **kwargs):
        res = func(*args, **kwargs)
        # append so each result lands on its own line instead of overwriting the file
        with open('/tmp/decorator_logs.txt', 'a') as f:
            f.write(res + '\n')
        return res
    return wrap
But from your description of what you did in your comments, it seems that you don't fully understand the decorator concept.
You can think of a decorator as a special function that takes another function as input and returns a decorated function that does something slightly different.
In the code above, log_message takes whatever function it decorates and defines a new function called wrap. This wrap function takes whatever inputs it is given, passes those arguments to func, writes the returned result of func to the /tmp/decorator_logs.txt file, then returns the same result.
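For example, calling two of the decorated functions and then reading the log back might look like this (a minimal sketch; it assumes the append-mode version above and that /tmp is writable):
a_function_that_returns_a_string()
a_function_that_returns_another_string()

with open('/tmp/decorator_logs.txt') as f:
    print(f.read())
# Expected contents, one result per line:
# A string
# Another string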
Another thing you need to understand is that
@log_message
def decorated_function():
    ...
is the same as:
decorated_function = log_message(decorated_function)
Hope this helps with your understanding of decorators.

Python decorators with arguments

I'm having trouble understanding the concept of decorators. Basically, if I understood correctly, decorators are used to extend the behavior of a function without modifying the function's code. The basic example:
I have the decorator function, which takes another function as a parameter and then changes the functionality of the function given as an argument:
def decorator(f):
    def wrapper(*args):
        return "Hello " + str(f(*args))
    return wrapper
And here I have the function that I want to decorate:
@decorator
def text(txt):
    '''function that returns the txt argument'''
    return txt
So if I understand correctly, what actually happens "behind the scenes" is:
d = decorator(text)
d('some argument')
My question is, what happens in this case, when we have three nested functions in the decorator:
def my_function(argument):
    def decorator(f):
        def wrapper(*args):
            return "Hello " + str(argument) + str(f(*args))
        return wrapper
    return decorator

@my_function("Name ")
def text(txt):
    return txt
The functionality is awesome, I can pass an argument to the decorator, but I do not understand what actually happens behind this call:
@my_function("Name ")
Thank you,
It is just another level of indirection; basically, the code is equivalent to:
decorator = my_function("Name ")
decorated = decorator(text)
text = decorated
Without arguments, you already have the decorator, so
decorated = my_function(text)
text = decorated
my_function is used to create a closure here. The argument is local to my_function, but due to the closure, the returned decorator keeps a reference to it the whole time. So when you apply decorator to text, the decorator adds extra functionality, as expected. It can also embed argument in its extra functionality, since it has access to the environment in which it was defined.
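To make the closure concrete, here is the same expansion written out against an undecorated copy of text (plain_text is just an illustrative name, not from the original code):
def plain_text(txt):                # same body as text, but left undecorated
    return txt

decorator = my_function("Name ")    # the closure captures argument="Name "
decorated = decorator(plain_text)   # decorator wraps plain_text
print(decorated("Bob"))             # -> Hello Name Bob

# The text defined with @my_function("Name ") behaves the same way:
print(text("Bob"))                  # -> Hello Name Bob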

Decorators in Python: why the inner defined function?

I'm just starting with Python and I have just been exposed to decorators. I wrote the following code, mimicking what I am seeing, and it works:
def decorator_function(passed_function):
    def inner_decorator():
        print('this happens before')
        passed_function()
        print('this happens after')
    return inner_decorator

@decorator_function
def what_we_call():
    print('The actual function we called.')

what_we_call()
But then I wrote this, which throws errors:
def decorator_function(passed_function):
    print('this happens before')
    passed_function()
    print('this happens after')

@decorator_function
def what_we_call():
    print('The actual function we called.')

what_we_call()
So, why do we need to have that inner nested function inside the decorator function? What purpose does it serve? Wouldn't it be simpler to just use the syntax of the second? What am I not getting?
The funny thing is that BOTH have the same (correct) output, but the second one has error text as well, saying "TypeError: 'NoneType' object is not callable".
Please use language and examples suitable for someone just starting with Python, his first programming language - and also new to OOP as well! :) Thanks.
The reason is that when you wrap what_we_call in decorator_function by doing:
@decorator_function
def what_we_call():
    ...
What you're doing is:
what_we_call = decorator_function(what_we_call)
In your first example it works because you don't actually run inner_decorator, you only define it, and then you return the new inner_decorator back (which you will call later, when you call the decorated what_we_call):
def decorator_function(passed_function):
    def inner_decorator():
        print('this happens before')
        passed_function()
        print('this happens after')
    return inner_decorator
By contrast, in your second example you immediately run the two print statements, with the passed_function (what_we_call in our case) in between:
def decorator_function(passed_function):
    print('this happens before')
    passed_function()
    print('this happens after')
In other words, in the second example you don't return a function from:
what_we_call = decorator_function(what_we_call)
You run the code (and you see the output), but then decorator_function returns None, which gets bound to what_we_call (overwriting the original function), and when you call None as if it were a function, Python complains.
Python decorators are basically just syntactic sugar. This:
@decorator
def fn(arg1, arg2):
    return arg1 + arg2
Becomes this:
def fn(arg1, arg2):
    return arg1 + arg2

fn = decorator(fn)
That is, a decorator basically accepts a function as an argument, and returns "something"; that "something" is bound to the name of the decorated function.
In nearly all cases, that "something" should be another function, because it is expected that fn will be a function (and will probably be called as though it is).
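A minimal sketch of what goes wrong when that "something" isn't a function (bad_decorator and greet are illustrative names; this mirrors the TypeError from the question above):
def bad_decorator(fn):
    fn()              # runs the function once, at decoration time
    return None       # None is then bound to the decorated name

@bad_decorator
def greet():
    print('hello')    # printed once, while the decorator runs

greet()               # TypeError: 'NoneType' object is not callable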

Calling functions as if in own caller?

I'm trying to replicate an error-checking pattern that I often use when programming in C, in Python. I have a function check as follows:
def check(exceptions, msg, handler):
    def wrapped(func, *args, **kwargs):
        try:
            return func(*args, **kwargs)
        except exceptions as err:
            log_err(msg)
            # Do something with handler
    return wrapped
By calling check with appropriate arguments and then calling the result with a function and its arguments, it's possible to reduce a try-except statement to as little as two lines of code without really sacrificing clarity (in my opinion, anyway).
For example
def caller():
    try:
        files = listdir(directory)
    except OSError as err:
        log_err(msg)
        return []
    # Do something with files
becomes
def caller():
    c = check(OSError, msg, handler)
    files = c(listdir, directory)  # caller may return [] here
    # Do something with files
The issue is that in order for this transformation to be transparent to the rest of the program it's necessary for handler to execute exactly as if it were written in the scope of the caller of wrapped. (handler need not be a function object. I'm after an effect, not a method.)
In C I would just use macros and expand everything inline (since that's where I would be writing the code anyway), but Python doesn't have macros. Is it possible to achieve this effect in some other way?
There is no way to write Python code to create an object c so that, when you call c, the function that called it returns. You can only return from a function by literally typing a return statement directly in that function's body (or falling off the end).
You could easily make it so that your "checked" function simply returns the default value. The caller can then use it as normal, but it can't make the caller itself return. You could also write a decorator for caller that catches your specified errors and returns [] instead, but this would catch all OSErrors raised anywhere in caller, not just ones raised by calling a particular function (e.g., listdir). For instance:
def check(exceptions, msg, handler):
    def deco(func):
        def wrapped(*args, **kwargs):
            try:
                return func(*args, **kwargs)
            except exceptions as err:
                print("Logged error:", msg)
                return handler(err)
        return wrapped
    return deco
def handler(err):
    return []

@check(ZeroDivisionError, "divide by zero", handler)
def func(x):
    1/0
>>> func(1)
Logged error: divide by zero
[]
The situation you describe seems somewhat unusual. In most cases where it would be worth it to factor out the handling into a "handler" function, that handler function either couldn't know what value it wants the caller to return, or it could know what to return just based on the error, without needing to know what particular line raised the error.
For instance, in your example, you apparently have a function caller that might raise an OSError at many different points. If you only have one place where you need to catch OSError and return [], just write one try/except and it's no big deal. If you want to catch any OSError in the function and return [], decorate it as shown above. What you describe would seem to only be useful in cases where you want to catch more-than-one-but-not-all possible OSErrors raised in caller, and yet in all those cases you want to return the same particular value, which seems rather unusual.
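For comparison, here is a minimal sketch of the "just return a default value" variant mentioned above; checked, default and the caller body are illustrative, not from the original post:
from os import listdir

def checked(exceptions, msg, default):
    def call(func, *args, **kwargs):
        try:
            return func(*args, **kwargs)
        except exceptions as err:
            print("Logged error: {}: {}".format(msg, err))
            return default
    return call

def caller(directory):
    c = checked(OSError, "could not list directory", default=[])
    files = c(listdir, directory)  # [] on failure; the caller itself keeps running
    # Do something with files
    return files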
Forgive me if I don't quite understand, but can't you just put the handler inside the calling function?
def caller():
    def handler():
        print("Handler was called")
    # ...
A better way...
Or, if you want to simplify the way you call it, you can use with statements to achieve the desired effect. This will be a lot easier, I think, and it's a lot less of a "hack":
class LogOSError(object):
    def __init__(self, loc):
        self.loc = loc

    def __enter__(self):
        return None

    def __exit__(self, exc_type, exc_value, traceback):
        if isinstance(exc_value, OSError):
            print("{}: OSError: {}".format(self.loc, exc_value))
            return True
You can use it like this:
with LogOSError('example_func'):
    os.unlink('/does/not/exist')
And the output is:
>>> with LogOSError('some_location'):
...     os.unlink('/does/not/exist')
...
some_location: OSError: [Errno 2] No such file or directory: '/does/not/exist'
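If you'd rather not write a full class, a generator-based context manager from contextlib does the same job; a minimal sketch (log_oserror is an illustrative name):
import contextlib
import os

@contextlib.contextmanager
def log_oserror(loc):
    try:
        yield
    except OSError as exc:
        # Not re-raising here suppresses the exception,
        # just like returning True from __exit__.
        print("{}: OSError: {}".format(loc, exc))

with log_oserror('example_func'):
    os.unlink('/does/not/exist')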

catch wrong-arguments exception, in the general case

I want to catch an exception, but only if it comes from the very next level of logic.
The intent is to handle errors caused by the act of calling the function with the wrong number of arguments, without masking errors generated by the function implementation.
How can I implement the wrong_arguments function below?
Example:
try:
    return myfunc(*args)
except TypeError, error:
    # possibly wrong number of arguments
    # we know how to proceed if the error occurred when calling myfunc(),
    # but we shouldn't interfere with errors in the implementation of myfunc
    if wrong_arguments(error, myfunc):
        return fixit()
    else:
        raise
Addendum:
There are several solutions that work nicely in the simple case, but none of the current answers will work in the real-world case of decorated functions.
Consider that these are possible values of myfunc above:
def decorator(func):
    "The most trivial (and common) decorator"
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs)

def myfunc1(a, b, c='ok'):
    return (a, b, c)

myfunc2 = decorator(myfunc1)
myfunc3 = decorator(myfunc2)
Even the conservative look-before-you-leap method (inspecting the function argument spec) fails here, since most decorators will have an argspec of *args, **kwargs regardless of the decorated function. Exception inspection also seems unreliable, since myfunc.__name__ will be simply "wrapper" for most decorators, regardless of the core function's name.
Is there any good solution if the function may or may not have decorators?
You can do:
import sys

try:
    myfunc()
except IndexError:
    trace = sys.exc_info()[2]
    if trace.tb_next.tb_next is None:
        pass
    else:
        raise
Although it is kinda ugly and would seem to violate encapsulation.
Stylistically, wanting to catch having passed too many arguments seems strange. I suspect that a more general rethink of what you are doing may resolve the problem, but without more details I can't be sure.
EDIT
Possible approach: check whether the function you are calling has the arguments *args, **kwargs. If it does, assume it's a decorator and adjust the code above to check whether the exception was raised one layer further in. If not, check as above.
Still, I think you need to rethink your solution.
I am not a fan of doing magic this way. I suspect you have an underlying design problem rather.
--original answer and code which was too unspecific to the problem removed--
Edit after understanding the specific problem:
from inspect import getargspec

def can_call_effectively(f, args):
    (fargs, varargs, _kw, df) = getattr(f, 'effective_argspec',
                                        getargspec(f))
    df = df or ()  # getargspec gives None when there are no defaults
    fargslen = len(fargs)
    argslen = len(args)
    minargslen = fargslen - len(df)
    return (varargs and argslen >= minargslen) or minargslen <= argslen <= fargslen

if can_call_effectively(myfunc, args):
    myfunc(*args)
else:
    fixit()
All your decorators, or at least those you want to be transparent in regard to
calling via the above code, need to set 'effective_argspec' on the returned callable.
Very explicit, no magic. To achieve this, you could decorate your decorators with the appropriate code...
Edit: more code, the decorator for transparent decorators.
def transparent_decorator(decorator):
    def wrapper(f):
        wrapped = decorator(f)
        wrapped.__doc__ = f.__doc__
        wrapped.effective_argspec = getattr(f, 'effective_argspec', getargspec(f))
        return wrapped
    return wrapper
Use this on your decorator:
@transparent_decorator
def decorator(func):
    "The most trivial (and common) decorator"
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs)
    return wrapper  # line missing in example above
Now if you create myfunc1 - myfunc3 as above, they work exactly as expected.
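For instance, checking a couple of argument counts against myfunc3 (a minimal sketch using the definitions above; expected results are noted in the comments):
print(can_call_effectively(myfunc3, (1, 2)))  # True: a and b supplied, c has a default
print(can_call_effectively(myfunc3, (1,)))    # False: b is missing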
Ugh, unfortunately not really. Your best bet is to introspect the error object that is raised and see whether myfunc and the number of arguments are mentioned.
So you'd do something like:
except TypeError, err:
    if err.has_some_property or 'myfunc' in str(err):
        fixit()
    else:
        raise
You can do it with something like:
>>> def f(x, y, z):
...     print(f(0))
...
>>> try:
...     f(0)
... except TypeError as e:
...     print(e.__traceback__.tb_next is None)
...
True
>>> try:
...     f(0, 1, 2)
... except TypeError as e:
...     print(e.__traceback__.tb_next is None)
...
False
but a better way would be to count the number of arguments the function expects and compare it with the number of arguments passed:
len(inspect.getargspec(f).args) != len(args)
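For instance, a minimal sketch of that check (note that getargspec is deprecated in newer Pythons in favour of inspect.signature, and this simple comparison ignores default values and *args):
import inspect

def f(x, y, z):
    pass

args = (0,)
# True means calling f(*args) would fail with a wrong-number-of-arguments TypeError
print(len(inspect.getargspec(f).args) != len(args))  # True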
You can retrieve the traceback and look at its length. Try:
import traceback as tb
import sys
def a():
    1/0

def b():
    a()

def c():
    b()

try:
    a()
except:
    print len(tb.extract_tb(sys.exc_traceback))

try:
    b()
except:
    print len(tb.extract_tb(sys.exc_traceback))

try:
    c()
except:
    print len(tb.extract_tb(sys.exc_traceback))
This prints
2
3
4
Well-written wrappers will preserve the function name, signature, etc, of the functions they wrap; however, if you have to support wrappers that don't, or if you have situations where you want to catch an error in a wrapper (not just the final wrapped function), then there is no general solution that will work.
I know this is an old post, but I stumbled upon this question and later found a better answer. This answer depends on a newer feature in Python 3, Signature objects.
With that feature you can write:
import inspect

sig = inspect.signature(myfunc)
try:
    sig.bind(*args)
except TypeError:
    return fixit()
else:
    return myfunc(*args)
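This also plays reasonably well with decorated functions, provided the decorators use functools.wraps: inspect.signature follows the __wrapped__ attribute back to the original function (a minimal sketch, assuming Python 3.4+):
import functools
import inspect

def decorator(func):
    @functools.wraps(func)          # sets wrapper.__wrapped__ = func
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs)
    return wrapper

@decorator
@decorator
def myfunc1(a, b, c='ok'):
    return (a, b, c)

print(inspect.signature(myfunc1))   # (a, b, c='ok'), not (*args, **kwargs)

sig = inspect.signature(myfunc1)
try:
    sig.bind(1)                     # too few arguments for the real signature
except TypeError:
    print('wrong arguments')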
Seems to me what you're trying to do is exactly the problem that exceptions are supposed to solve, i.e. an exception will be caught somewhere up the call stack, so that there's no need to propagate errors upwards manually.
Instead, it sounds like you are trying to do error handling the C (non-exception handling) way, where the return value of a function indicates either no error (typically 0) or an error (non 0 value). So, I'd try just writing your function to return a value, and have the caller check for the return value.
