I was working through a decorator design pattern tutorial (credit to Jungwoo Ryoo).
I'm curious why I can swap the lines return decorator and print(hello_world())
with return decorator() and print(hello_world):
from functools import wraps

def make_blink(function):
    """Defines the decorator"""
    @wraps(function)
    # Define the inner function
    def decorator():
        # Grab the return value of the function being decorated
        ret = function()
        # Add new functionality to the function being decorated
        return "<blink>" + ret + "</blink>"
    return decorator  # return decorator()  # <THIS LINE HERE SWAPPED

# Apply the decorator here!
@make_blink
def hello_world():
    """Original function! """
    return "Hello, World!"

# Check the result of decorating
print(hello_world())  # print(hello_world)  # <THIS LINE HERE SWAPPED
Wouldn't the interpreter be doing something different each time? I'm just looking for some insight to better understand what's going on.
Decorators are just functions really, and functions are just objects.
The lines
@make_blink
def hello_world():
    # ...
are essentially the same as
def hello_world():
    # ...

hello_world = make_blink(hello_world)
except that the function object is never assigned to hello_world first (it sits on the stack, waiting to be passed to the decorator).
So whatever you return from make_blink() is assigned back to hello_world. That can be a function object, but it can also be something entirely different.
So when you use return decorator, you tell Python to set hello_world to the nested function object. When you use return decorator(), you tell Python to use the result of the decorator() function. Here, that's a string value. It's as if you did this:
def hello_world():
    """Original function! """
    return "Hello, World!"

hello_world = "<blink>" + hello_world() + "</blink>"
And that is fine for this specific example, because the body of the hello_world() function returns the same string every time.
But what if you changed the original hello_world() function body to return something different each time you called it? What if you had:
import random

@make_blink
def random_greeting():
    return 'Hello ' + random.choice(['DonAr', 'Martijn Pieters', 'Guido van Rossum']) + '!'
Now it makes a big difference what you return from the make_blink() call! At the top level of a module, decorators are executed only once, when the module is imported. If you used return decorator(), you'd run random.choice() just once, and you'd have fixed random_greeting to a single, static string result.
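A quick way to see the difference (a sketch, reusing random_greeting as defined above with the list form of random.choice):

# With `return decorator`, random_greeting stays callable and re-runs
# random.choice() on every call:
print(random_greeting())   # e.g. <blink>Hello Guido van Rossum!</blink>
print(random_greeting())   # may well print a different name

# With `return decorator()`, random_greeting is no longer a function at all;
# it is the single string that was produced when the module was imported:
print(random_greeting)     # e.g. <blink>Hello DonAr!</blink>, fixed forever
print(random_greeting())   # TypeError: 'str' object is not callable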
Generally speaking, decorators are expected to return a callable object again. That can be the original function (where the decorator just updates some kind of registration), a wrapper function (which does extra things before or after calling the original), or even something different entirely. But that's not set in stone anywhere, and the interpreter doesn't care either way.
Decorators are just reusable things to use in your program, a tool. If you have a specific use for a decorator that returns the result of the original function, then you are free to do so.
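For instance, here is a minimal sketch of the "registration" style mentioned above; the register name and the registry dict are made up for illustration:

registry = {}

def register(function):
    # record the function, then hand it back untouched
    registry[function.__name__] = function
    return function

@register
def hello_world():
    return "Hello, World!"

print(hello_world())   # works exactly as before
print(registry)        # {'hello_world': <function hello_world at 0x...>}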
For example, I want check_args to be able to see the arguments that get_val is called with at runtime:
def get_val(n):
    return n

def check_args(func):
    # gets the arguments of a function at runtime
    pass

get_val(1)
get_val(2)
Note: This is probably bad practice, but I want to understand more about how Python works.
Python decorators allow you to do this with minimal effort:
import functools

def check_args(func):
    @functools.wraps(func)  # Copies documentation and other metadata from the wrapped func to the wrapper, making it look as much like the wrapped func as possible
    def wrapper(*args, **kwargs):
        # do stuff to inspect arguments
        return func(*args, **kwargs)
    return wrapper

@check_args
def get_val(n):
    return n
Using @check_args is equivalent to fully defining get_val, then doing:
get_val = check_args(get_val)
which means get_val gets replaced with the wrapper function, which now gets called first, can perform checks, and then delegate to the wrapped function (in this case, get_val). Obviously, for just one function it's kind of pointless (you could just put the checking in get_val), but if you want to check other functions, prefixing their definition with @check_args is a single line of code that doesn't get intermingled with the rest of the code and keeps boilerplate down.
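As an illustration (a sketch, not the only way to do it), the wrapper could use inspect.signature to bind the incoming arguments to parameter names before delegating:

import functools
import inspect

def check_args(func):
    sig = inspect.signature(func)

    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        # bind() raises TypeError if the arguments don't match the signature
        bound = sig.bind(*args, **kwargs)
        print('calling {0} with {1}'.format(func.__name__, dict(bound.arguments)))
        return func(*args, **kwargs)
    return wrapper

@check_args
def get_val(n):
    return n

get_val(1)   # prints: calling get_val with {'n': 1}
get_val(2)   # prints: calling get_val with {'n': 2}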
I can't figure out how to do this, and frankly, I don't know if it's possible.
I want to write a decorator that changes the way a function is called. It's easiest to see with example code:
def my_print(*args, **kwargs):
    print(args[0].upper())

@reroute_decorator('print', my_print)
def my_func():
    print('normally this print function is just a print function...')
    print('but since my_func is decorated with a special reroute_decorator...')
    print('it is replaced with a different function, and its args sent there.')

my_func()
# NORMALLY THIS PRINT FUNCTION IS JUST A PRINT FUNCTION...
# BUT SINCE MY_FUNC IS DECORATED WITH A SPECIAL REROUTE_DECORATOR...
# IT IS REPLACED WITH A DIFFERENT FUNCTION, AND ITS ARGS SENT THERE.
Is a decorator with this kind of functionality even possible in Python?
Now, I don't really need this if it's too complex; I just can't figure out how to do it in a simple way.
Is this kind of problem trivial, or is it really complex?
You can create a new function with an updated globals dictionary so that to that function it appears that the global was bound to the desired value.
Note that this is weaker than actual dynamic scope as any functions called by the function will see the original bindings and not the modified one.
See namespaced_function referenced in How does Python's types.FunctionType create dynamic Functions?
To elaborate on @Dan D.'s answer, you would create a new function object to replace the original, something like this:
from types import FunctionType

def reroute_decorator(**kwargs):
    def actual_decorator(func):
        globals = func.__globals__.copy()
        globals.update(kwargs)
        new_func = FunctionType(
            func.__code__, globals, name=func.__name__,
            argdefs=func.__defaults__, closure=func.__closure__)
        new_func.__dict__.update(func.__dict__)
        return new_func
    return actual_decorator
The only catch here is that the updated function object is the only one that will see whatever kwargs you passed in, since they will be spoofed into globals. Additionally, any modifications you make to the module after calling the decorator function will not be visible to the decorated function, but that should not be an issue. You can go a layer deeper and create a proxy dictionary that would allow you to interact normally with the original, except for keys you explicitly defined, like print, but that's a bit out of scope here.
I've updated your print implementation to be a bit more general, and made the input to the decorator function more pythonic (less MATLABy):
def my_print(*args, **kwargs):
    print(*(str(x).upper() for x in args), **kwargs)

@reroute_decorator(print=my_print)
def my_func():
    print('normally this print function is just a print function...')
    print('but since my_func is decorated with a special reroute_decorator...')
    print('it is replaced with a different function, and its args sent there.')
Which results in:
>>> my_func()
NORMALLY THIS PRINT FUNCTION IS JUST A PRINT FUNCTION...
BUT SINCE MY_FUNC IS DECORATED WITH A SPECIAL REROUTE_DECORATOR...
IT IS REPLACED WITH A DIFFERENT FUNCTION, AND ITS ARGS SENT THERE.
I'm having trouble understanding the concept of decorators. If I understood correctly, decorators are used to extend the behavior of a function without modifying the function's code. The basic example:
I have the decorator function, which takes another function as a parameter and then changes the functionality of the function given as an argument:
def decorator(f):
    def wrapper(*args):
        return "Hello " + str(f(*args))
    return wrapper
And here I have the function which I want to decorate:

@decorator
def text(txt):
    '''function that returns the txt argument'''
    return txt
So if I understand correctly, what actually happens "behind" is:

d = decorator(text)
d('some argument')
My question is, what happens in this case, when we have three nested functions in the decorator:
def my_function(argument):
    def decorator(f):
        def wrapper(*args):
            return "Hello " + str(argument) + str(f(*args))
        return wrapper
    return decorator

@my_function("Name ")
def text(txt):
    return txt
The functionality is awesome because I can pass an argument to the decorator, but I do not understand what actually happens behind this call:

@my_function("Name ")

Thank you,
It is just another level of indirection; basically, the code is equivalent to:
decorator = my_function("Name ")
decorated = decorator(text)
text = decorated
Without arguments, you already have the decorator, so
decorated = my_function(text)
text = decorated
my_function is used to create a closure here. The argument is local to my_function, but due to the closure, the returned decorator function keeps a reference to it. So when you apply decorator to text, the decorator adds extra functionality, as expected, and it can also embed argument in that extra functionality, since it has access to the environment in which it was defined.
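To see the closure at work, here is a small sketch; plain_text is just an undecorated copy of text, introduced for illustration:

print(text("World"))              # Hello Name World

# The same thing, spelled out without the @ syntax:
def plain_text(txt):
    return txt

decorator = my_function("Name ")  # "Name " is captured in the closure
wrapped = decorator(plain_text)   # wrapper still sees "Name " later on
print(wrapped("World"))           # Hello Name World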
I have the decorator demonstration code below. If I execute it without explicitly calling the greet function, it still executes the print statement inside the decorator function and outputs Inside decorator.
I am unable to understand this behavior of the decorator. How is time_decorator called even though I didn't call the greet function?
I am using Python 3.
import time

def time_decorator(original_func):
    print('Inside decorator')

    def wrapper(*args, **kwargs):
        start = time.clock()
        result = original_func(*args, **kwargs)
        end = time.clock()
        print('{0} is executed in {1}'.format(original_func.__name__, end - start))
        return result
    return wrapper

@time_decorator
def greet(name):
    return 'Hello {0}'.format(name)
Decorators are called at definition time (when the Python interpreter executes the decorated def statement, typically as the module loads), not at call time (when the decorated function is actually called).
When greet() is called, it is the wrapper function that runs; it calls the decorated function and returns its result.
So it is totally normal that the print line gets executed.
If, for example, you decorate 10 functions, you will see the print output 10 times. There is no need to even call the decorated functions for this to happen.
Move the print inside wrapper and this won't happen anymore.
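For example, a sketch of the same decorator with the print moved into wrapper (only the placement of the print changes):

def time_decorator(original_func):
    def wrapper(*args, **kwargs):
        print('Inside decorator')   # now runs only when the decorated function is called
        start = time.clock()
        result = original_func(*args, **kwargs)
        end = time.clock()
        print('{0} is executed in {1}'.format(original_func.__name__, end - start))
        return result
    return wrapper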
Decorators, as well as metaclasses, are part of what is called meta-programming (modifying / creating code from existing code). This is a really fascinating aspect of programming which takes time to understand but offers amazing possibilities.
time_decorator is executed during function decoration; wrapper is not (it is the function invoked when the decorated greet() is called).
@ is just syntactic sugar. The following code snippets are equivalent.
Decorator syntax:
@time_decorator
def greet(name):
    return 'Hello {0}'.format(name)
Explicit decoration process: a decorator is a function that returns a new function based on another one.
def greet(name):
    return 'Hello {0}'.format(name)

greet = time_decorator(greet)
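Either way, "Inside decorator" is printed at the moment the decoration happens; wrapper (and the timing code inside it) only runs when you actually call the decorated function, for example:

greet('World')   # only now does wrapper run; prints something like:
                 # greet is executed in 1.5e-06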
In this blog article they use the construct:
@measured
def some_func():
    # ...

# Presumably outputs something like "some_func() is finished in 121.333 s" somewhere
This @measured directive doesn't seem to work with raw Python. What is it?
UPDATE: I see from Triptych that @something is valid, but where can I find @measured? Is it in a library somewhere, or is the author of this blog using something from his own private code base?
@measured decorates the some_func() function, using a function or class named measured. The @ is the decorator syntax; measured is the decorator function name.
Decorators can be a bit hard to understand, but they are basically used to either wrap code around a function, or inject code into one.
For example, the measured function (used as a decorator) is probably implemented like this...
import time

def measured(orig_function):
    # When you decorate a function, the decorator function is called
    # with the original function as its first argument.
    # You return a new, modified function. This returned function
    # is what the to-be-decorated function becomes.
    print("INFO: This is from the decorator function")
    print("INFO: I am about to decorate %s" % (orig_function,))

    # This is what some_func will become:
    def newfunc(*args, **kwargs):
        print("INFO: This is the decorated function being called")
        start = time.time()
        # Execute the old function, passing arguments
        orig_func_return = orig_function(*args, **kwargs)
        end = time.time()
        print("Function took %s seconds to execute" % (end - start))
        return orig_func_return  # return the output of the original function

    # Return the modified function, which is what some_func becomes
    return newfunc

@measured
def some_func(arg1):
    print("This is my original function! Argument was %s" % arg1)

# We call the now decorated function..
some_func(123)

# .. and we should get (minus the INFO messages):
# This is my original function! Argument was 123
# Function took 7.86781311035e-06 seconds to execute
The decorator syntax is just a shorter and neater way of doing the following:
def some_func():
    print("This is my original function!")

some_func = measured(some_func)
There are some decorators included with Python, for example staticmethod - but measured is not one of them:
>>> type(measured)
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
NameError: name 'measured' is not defined
Check the project's import statements to see where the function or class is coming from. If it uses from blah import * you'll need to check all of those files (which is why import * is discouraged), or you could just do something like grep -R "def measured" *
Yes, it's real. It's a function decorator.
Function decorators in Python are functions that take a function as their single argument and return a new function in its place.
@classmethod and @staticmethod are two built-in function decorators.
measured is the name of a function that must be defined before that code will work.
In general, any function used as a decorator must accept a function and return a function. The decorated function will be replaced with the result of passing it to the decorator, measured() in this case.
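A minimal (made-up) definition that satisfies that shape could be as small as this; it does nothing except forward the call:

def measured(func):
    # accept a function...
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs)
    # ...and return a function in its place
    return wrapper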