I'm trying to write a decorator that takes a few arguments, and can decorate arbitrary functions. After reading a few code examples, and stepping through the debugger I've figured out how to write it. But I don't fully understand why it works.
def bar(arg1):
    def inner_bar(f):
        def inner_inner_bar(*args, **kwargs):
            new_args = (x + arg1 for x in args)
            return f(*new_args, **kwargs)
        return inner_inner_bar
    return inner_bar
@bar(4)
def foo(x, y):
    print("Sum is {0}".format(x+y))

if __name__ == "__main__":
    foo(1, 2)

Output: Sum is 11
What I don't fully grasp is how/why the function f exists in the scope of inner_bar but not bar. And similarly that args and kwargs exist in the scope of inner_inner_bar but not inner_bar.
What is Python doing when I use @bar that makes the different variables available in the different methods of my decorator?
Note that you're not just using @bar, you're using @bar(4). It works like this:
bar(4) returns a function (inner_bar)
Using #bar(4) to decorate foo calls inner_bar(foo).
inner_bar(foo) returns a function (inner_inner_bar)
This function (inner_inner_bar) is assigned back to the name foo
When you call foo, you are calling inner_inner_bar, so whatever arguments you pass are passed as the *args and **kwargs
What Python is "doing" is calling the functions involved. All of the variables you're asking about (f, args and kwargs) are just function arguments, which, as usual, become available when their function is called. f becomes available when inner_bar is called, namely when you apply the decorator. *args and **kwargs become available when inner_inner_bar is called, namely when you call the decorated function. The only thing that is available when you write bar(4) is arg1, because the other functions haven't been called yet.
Related
Suppose we want to create a callback function which adds two numbers arg1 and arg2. This could be done by returning a function reference like so:
def add(arg1, arg2):
    return arg1 + arg2

def partial(f, arg1, arg2):
    def wrapper():
        return f(arg1, arg2)
    return wrapper

callback = partial(add, 2, 4)
print(callback())
Output: 6
How does wrapper remember arg1 and arg2? What do we really return with return wrapper? It seems like not only is part of the function code returned but also its surroundings (in this case variables defined before wrapper). Otherwise there would be a NameError.
Answering your main question: this happens because of something called a closure in Python.
When you nest a function inside an enclosing function and return the nested function, the variables passed to the enclosing function are still in scope for the nested function.
You will often see closures used to create decorators in Python.
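As a minimal illustration of that (reusing the partial and add functions from the question, plus the standard __code__ and __closure__ attributes every function object has), you can see that wrapper carries references to the enclosing variables along with it:

def add(arg1, arg2):
    return arg1 + arg2

def partial(f, arg1, arg2):
    def wrapper():
        return f(arg1, arg2)
    return wrapper

callback = partial(add, 2, 4)

# The returned function object records which outer names it uses and
# keeps "cells" holding their current values: f, arg1 and arg2.
print(callback.__code__.co_freevars)  # names of the captured variables
print(callback.__closure__)           # the cell objects holding their values
print(callback())                     # 6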
I have a project in which I need to do some things, call one of a few functions, and then do some other things:
def doCommand(function, parameter_for_function):
    # do a thing
    function(parameter_for_function)
    # do a thing
My problem is that I don't know if the function I will be passing will require a parameter of its own!
How can I allow my function to call both functions that have no parameters, and functions that have one?
The preferred method of handling this is probably along these lines:
def doCommand(function, *args, **kwargs):
    # do a thing
    function(*args, **kwargs)
    # do a thing
*args and **kwargs allow arbitrary arguments to be passed along to a function. As the names imply, they let you call doCommand with an arbitrary number of positional and keyword arguments, which are then passed on to function.
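For illustration, a short usage sketch (no_arg_task and one_arg_task are hypothetical functions invented here, not part of the question):

def doCommand(function, *args, **kwargs):
    # do a thing
    function(*args, **kwargs)
    # do a thing

def no_arg_task():
    # hypothetical function that takes no parameters
    print("no arguments needed")

def one_arg_task(path):
    # hypothetical function that takes one parameter
    print("working on", path)

doCommand(no_arg_task)                # *args and **kwargs simply stay empty
doCommand(one_arg_task, "/tmp/file")  # the extra argument is forwarded to the function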
I suggest explicitly saying that the function you take is one that's called with no parameters:
from typing import Callable

def doCommand(function: Callable[[], None]) -> None:
    # do a thing
    function()
    # do a thing
If the function needs parameters, it's then explicitly the responsibility of the doCommand caller to provide them, e.g. within a lambda expression:
def foo(param) -> None:
    print("foo: ", param)

doCommand(lambda: foo("bar"))
or a named wrapper function:
def foobar() -> None:
    foo("bar")

doCommand(foobar)
I am new to the more advanced features of Python like decorators.
I am unable to understand how the Python interpreter actually understands where to put the original function object in a decorator.
Let's look at an example (examples taken from here).
Simple decorator with no arguments:
def call_counter(func):
    def helper(*args, **kwargs):
        helper.calls += 1
        return func(*args, **kwargs)
    helper.calls = 0
    return helper

@call_counter
def succ(x):
    return x + 1
This makes perfect sense if we can assume that the first/only argument to the decorator call_counter(func) is the function object that needs to be wrapped, i.e. in this case the succ() function.
But things become inconsistent when you are talking about "decorators with parameters". Look at the example below:
Decorator with one argument:
def greeting(expr):  # Shouldn't expr be the function here? Or at least, isn't there supposed to be another parameter?
    def greeting_decorator(func):  # How does Python know to pass the function down here?
        def function_wrapper(x):
            print(expr + ", " + func.__name__ + " returns:")
            func(x)
        return function_wrapper
    return greeting_decorator

@greeting("Hello")
def foo(x):
    print(42)

foo("Hi")
Now, Python function parameters carry no type declarations, so they give no information about what type of object they will contain.
Am I correct ?
Having said that, let's look at this line from the example above:
def greeting(expr):
If for decorators the first argument is the function to be wrapped, then by that logic expr should point to foo(), right? Otherwise there should be at least two parameters in greeting(), like:
def greeting(func, expr):
But instead Python can "magically" understand that the inner function needs to be passed the function reference:
def greeting(expr):
    def greeting_decorator(func):  # How is it correctly put one level down?
The code has no datatypes or type information specified, so how is it that for decorators without arguments the function is passed as the first argument and for decorators with arguments the function is passed to the inner function ?
How can the interpreter detect that ?
What is going on here ?
This seems like "magic" to me.
What happens if I have 5 or 6 levels of nested functions ?
I am pretty sure I am missing something pretty basic here.
Thanks.
Python evaluates the expression after the @ and uses the result as the decorator.
Python then calls that object's __call__ method with the function being decorated as the argument.
Using
@call_counter
def succ(x):
    return x + 1
call_counter is the object whose __call__ is invoked, and succ is passed in as the argument func.
If you use
@greeting("Hello")
def foo(x):
    print(42)
then greeting("Hello") is evaluated first, and its result is the object whose __call__ Python invokes with foo as the func argument.
I was wondering why the following works:
def wrapper():
    def wrap(p=10):
        def f():
            print(p)
        f()
    return wrap

f2 = wrapper()
f2()
But this doesn't:
def f():
    print(p)

def enhance(f):
    def wrap(p=10):
        f()
    return wrap

f2 = enhance(f)
f2()  # NameError: name 'p' is not defined
Is there a way I can modify the second scenario so that the variable p is defined? I was playing around with function decorators but couldn't figure out how to expose the variables to the function I'm passing into the decorators.
I think I understand what you are really asking. You're talking about decorators, not variable scope. You say you can't figure out how to "expose the variables to the function I'm passing to the decorators." In your case 2, the function you are passing to enhance doesn't have any variables (arguments). Suppose we give it an argument, like this:
def f(p):
    print(p)

def enhance(f):
    def wrap(p=10):
        f(p)  # pass the argument to f
    return wrap

f2 = enhance(f)
f2()
Now you have a function, named enhance, which can be used as a decorator. The function to be decorated takes one argument. The decorator will replace this function with a new function, which can be called with one or zero arguments. If called with no arguments it will get the value "10" as a default.
Decorators replace one function with another function. In general it isn't the decorator's job to supply the arguments, except in the case of default arguments as you are trying to do. The arguments come from the code that calls the function.
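For illustration, a short usage sketch of that corrected version, showing both a call that relies on the default and a call where the caller supplies the argument:

def f(p):
    print(p)

def enhance(f):
    def wrap(p=10):
        f(p)  # forward the argument to the wrapped function
    return wrap

f2 = enhance(f)
f2()    # no argument given: the default p=10 is used, prints 10
f2(42)  # the caller supplies the argument, prints 42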
Because in example 2 you're referencing p inside f, where it is not defined; it only exists as a parameter of wrap, and each function is defined in its own scope.
In example 1, a function defined within the scope of another (i.e. a nested function) has access to the outer function's scope, and therefore to its variables.
Some decorators should only be used in the outermost layer.
A decorator that augments the original function and adds a configuration parameter is one example.
from functools import wraps

def special_case(f):
    @wraps(f)
    def _(a, b, config_x=False):
        if config_x:
            print("Special case here")
            return
        return f(a, b)
    return _
How can I avoid decorators like this getting decorated by another decorator?
EDIT
It is really ugly to make everyone who wants to apply a new decorator worry about the application order.
So, is it possible to avoid this kind of situation? Is it possible to add a config option without introducing a new parameter?
There isn't any way to stop it from being decorated. You just have to document that it needs to apply last and tell people not to use it inside another decorator.
Edit responding to your edit: In Python 3 you can give your function a keyword-only argument. This drastically reduces the impact that the change will have on existing uses of the function. Unfortunately this only works in Python 3.
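A minimal sketch of that idea, reworking the special_case example so that config_x is keyword-only (the bare * in the signature forces callers to pass it by name), which keeps existing positional calls working unchanged:

from functools import wraps

def special_case(f):
    @wraps(f)
    def wrapper(a, b, *, config_x=False):  # config_x can only be passed by keyword
        if config_x:
            print("Special case here")
            return
        return f(a, b)
    return wrapper

@special_case
def add(a, b):
    return a + b

print(add(1, 2))          # existing calls are unaffected by the new parameter
add(1, 2, config_x=True)  # callers opt in explicitly by keyword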
Ultimately, applying a decorator to a function just means passing the decorated function as an argument to another function. There's no way for a function (or any object) to even know that it's being passed as an argument, let alone what it's being passed to. The reason you can't know about later decorators is the same reason that in an ordinary function call like f(g(x)), the function g can't know that it will later be called by f.
This is one reason writing decorators is tricky. Code that relies on heavy use of decorators that pass explicit arguments to their wrapped functions (as yours passes a and b) is inherently going to be fragile. Fortunately, a lot of the time you can write a decorator that uses *args and **kwargs so it can pass all the arguments it doesn't use along to the decorated function.
If someone takes the code you provide, and writes another decorator that explicitly accepts only a and b as arguments, and then calls the decorated function as f(a, b, True), it's their own fault if it fails. They should have known that other decorators they used might have changed the function signature.
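To make the f(g(x)) analogy concrete, here is a small sketch (deco_a and deco_b are trivial decorators invented for illustration) showing that stacking decorators is just nested function calls, applied bottom-up, and that the inner decorator has no way of knowing about the outer one:

def deco_a(func):
    def wrapper(*args, **kwargs):
        print("deco_a")
        return func(*args, **kwargs)
    return wrapper

def deco_b(func):
    def wrapper(*args, **kwargs):
        print("deco_b")
        return func(*args, **kwargs)
    return wrapper

@deco_a
@deco_b
def hello():
    print("hello")

# The stacked form above is equivalent to:
#     hello = deco_a(deco_b(hello))
# deco_b is applied first and cannot know that deco_a will wrap it later.

hello()  # prints deco_a, deco_b, hello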
Normally, when one writes a decorator to be used generically, one does not restrict the number or names of the arguments of the function it wraps.
Most decorators out there accept a list of positional arguments and a mapping of keyword arguments as parameters for their wrapper, and pass those on, as received, to the decorated function:
def deco(func):
    def wrapper(*args, **kwargs):
        # ... decorator stuff here ...
        return func(*args, **kwargs)
    return wrapper
Therefore, if the decorator is to receive a parameter that it should "consume" - like the config_x you mention - all you have to do is document it, take it as a keyword parameter, and pick it out of kwargs. To avoid name clashes between parameters, one can, for example, prefix this parameter's name with the decorator's own name or another distinct prefix:
def deco(func):
    def wrapper(*args, **kwargs):
        # consume the decorator's own keyword parameter, if present
        config_x = kwargs.pop("deco_config_x", False)
        # ... decorator stuff here ...
        return func(*args, **kwargs)
    return wrapper
This way, the decorator may be put anywhere in a "decorator stack": it will pick up the parameter(s) addressed to it, and the decorators below it won't see any stray parameters. The only requirement is that your functions and decorators, as a whole, just let keyword parameters they don't know about pass through.
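For illustration, a small usage sketch (other_deco is a hypothetical second decorator added here) showing the deco_config_x keyword being consumed by deco and never reaching the wrapped function:

def deco(func):
    def wrapper(*args, **kwargs):
        config_x = kwargs.pop("deco_config_x", False)  # consumed here
        if config_x:
            print("deco: special case")
        return func(*args, **kwargs)
    return wrapper

def other_deco(func):
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs)  # just passes everything through
    return wrapper

@other_deco
@deco
def add(a, b):
    return a + b

print(add(1, 2))                      # 3
print(add(1, 2, deco_config_x=True))  # prints "deco: special case", then 3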