Python wrappers - Confusion about variable scope

I thought I would be clever and write a wrapper that fetched the session variables (there are many) and added them to the (django) views requiring session variables. However, I seem not to be understanding the scope of the variables, or am writing this incorrectly.
The wrapper I have is:
from functools import wraps

def s_vars(func_to_decorate):
    @wraps(func_to_decorate)
    def wrapper(request, *args, **kwargs):
        # add all the session variables to kwargs, accessible to the decorated function
        user_obj = request.user
        datepicker = request.session['datepicker']
        date_format = request.session['date_format']
        # .........
        country = request.session['country']
        metric = request.session['metric']
        qrydtm = request.session.get("qrydtm", date.today())
        result = func_to_decorate(request, *args, **kwargs)
        # any post-view checks to be done go here
        # return the result of the decorated function
        return result
    return wrapper
Then for the view I have something like:
@s_vars
def main(request, template_name='placeholder.html'):
    return render_to_response(template_name, RequestContext(request, {
        'user': user_obj
    }))
But this leads to the error that user_obj is not accessible inside the method "main". My understanding was that this is an inner function and therefore the variables in the list under the "wrapper" method would be accessible to this inner function "main". What am I missing here?

The syntax
@spam
def ham():
    pass
is precisely equivalent to the syntax
def ham():
    pass
ham = spam(ham)
Does that clarify why what you are doing doesn't work?
If you want to pass stuff to a function from a decorator, the usual idiom is to send extra arguments to the function. This can be a little icky, because it means that the argspec that looks right is actually not.
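For instance, here is a minimal sketch of that idiom applied to the question's s_vars (which session values to forward, and the extra parameters on main, are assumptions for illustration):

from functools import wraps

def s_vars(func_to_decorate):
    @wraps(func_to_decorate)
    def wrapper(request, *args, **kwargs):
        # inject the session values as extra keyword arguments
        kwargs['user_obj'] = request.user
        kwargs['country'] = request.session['country']
        return func_to_decorate(request, *args, **kwargs)
    return wrapper

@s_vars
def main(request, user_obj=None, country=None, template_name='placeholder.html'):
    # user_obj arrives as an argument, not as a scoped variable
    ...

This is the "icky" part: main's def line now suggests callers may pass user_obj, when in practice the decorator always supplies it.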

Nested functions only take scoped variables from the scope where they are defined, and the binding takes place at compile time.
You cannot add scoped variables later on, and certainly not with a simple wrapper.

At the point where the inner function is called, it is just called by the outer scope function, not defined in it.
That distinction (compile time vs. run time) is definitely important in scoping. Take a look at the dis (disassembly) of the s_vars wrapper (or a reduced, simple example of the same behaviour). The code is not reinterpreted for different values of func_to_decorate (which is just a value here).
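For instance (a reduced sketch; the exact bytecode varies across CPython versions):

import dis

def s_vars(func_to_decorate):
    def wrapper(request):
        return func_to_decorate(request)
    return wrapper

dis.dis(s_vars(len))
# the call shows LOAD_DEREF (func_to_decorate): a closure cell bound when
# wrapper was defined, not a name looked up in the caller at run time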
If you want to make a list of variables available to the inner function, an object passed in would make more sense. The wrapper could keep that object out of the view's external API.
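A sketch of that suggestion, using types.SimpleNamespace as the carrier object (the choice of carrier and of session keys is illustrative):

from functools import wraps
from types import SimpleNamespace

def s_vars(func_to_decorate):
    @wraps(func_to_decorate)
    def wrapper(request, *args, **kwargs):
        # bundle the session values into one object and pass it along explicitly
        s = SimpleNamespace(
            user_obj=request.user,
            country=request.session['country'],
            metric=request.session['metric'],
        )
        return func_to_decorate(request, s, *args, **kwargs)
    return wrapper

@s_vars
def main(request, s, template_name='placeholder.html'):
    return render_to_response(template_name, RequestContext(request, {
        'user': s.user_obj
    }))

The URL-facing signature stays (request, ...); only the wrapper is responsible for building and passing s.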

Related

how can a python decorator change calls in decorated function?

I can't figure out how to do this, and frankly, I don't know if it's possible.
I want to write a decorator that changes the way a function is called. It's easiest to see with example code:
def my_print(*args, **kwargs):
    print(args[0].upper())

@reroute_decorator('print', my_print)
def my_func():
    print('normally this print function is just a print function...')
    print('but since my_func is decorated with a special reroute_decorator...')
    print('it is replaced with a different function, and its args sent there.')

my_func()
# NORMALLY THIS PRINT FUNCTION IS JUST A PRINT FUNCTION...
# BUT SINCE MY_FUNC IS DECORATED WITH A SPECIAL REROUTE_DECORATOR...
# IT IS REPLACED WITH A DIFFERENT FUNCTION, AND ITS ARGS SENT THERE.
Is a decorator with this kind of functionality even possible in python?
Now, I don't really need this if it's too complex, I just can't figure out how to do it in a simple way.
Is this kind of a problem trivial? Or is it really complex?
You can create a new function with an updated globals dictionary so that to that function it appears that the global was bound to the desired value.
Note that this is weaker than actual dynamic scope as any functions called by the function will see the original bindings and not the modified one.
See namespaced_function referenced in How does Python's types.FunctionType create dynamic Functions?
To elaborate on @Dan D.'s answer, you would create a new function object to replace the original, something like this:
from types import FunctionType

def reroute_decorator(**kwargs):
    def actual_decorator(func):
        # copy the function's globals and overlay the rerouted names
        globals = func.__globals__.copy()
        globals.update(kwargs)
        new_func = FunctionType(
            func.__code__, globals, name=func.__name__,
            argdefs=func.__defaults__, closure=func.__closure__)
        new_func.__dict__.update(func.__dict__)
        return new_func
    return actual_decorator
The only catch here is that the updated function object is the only one that will see whatever kwargs you passed in, since they will be spoofed into globals. Additionally, any modifications you make to the module after calling the decorator function will not be visible to the decorated function, but that should not be an issue. You can go a layer deeper and create a proxy dictionary that would allow you to interact normally with the original, except for keys you explicitly defined, like print, but that's a bit out of scope here.
I've updated your print implementation to be a bit more general, and made the input to the decorator function more pythonic (less MATLABy):
def my_print(*args, **kwargs):
    print(*(str(x).upper() for x in args), **kwargs)

@reroute_decorator(print=my_print)
def my_func():
    print('normally this print function is just a print function...')
    print('but since my_func is decorated with a special reroute_decorator...')
    print('it is replaced with a different function, and its args sent there.')
Which results in:
>>> my_func()
NORMALLY THIS PRINT FUNCTION IS JUST A PRINT FUNCTION...
BUT SINCE MY_FUNC IS DECORATED WITH A SPECIAL REROUTE_DECORATOR...
IT IS REPLACED WITH A DIFFERENT FUNCTION, AND ITS ARGS SENT THERE.

Decorator with parameters

Can you explain me how the following decorator works:
def set_ev_cls(ev_cls, dispatchers=None):
    def _set_ev_cls_dec(handler):
        if 'callers' not in dir(handler):
            handler.callers = {}
        for e in _listify(ev_cls):
            handler.callers[e] = _Caller(_listify(dispatchers), e.__module__)
        return handler
    return _set_ev_cls_dec

@set_ev_cls(ofp_event.EventOFPSwitchFeatures, CONFIG_DISPATCHER)
def _switch_features_handler(self, ev):
    datapath = ev.msg.datapath
    ....
Please, don't go into details on what's going on inside the function. I'm interested in how the decorator with parameters wrap methods here. By the way, it's a code snippet from Ryu (event registration mechanism).
Thank you in advance
First, a decorator is just a function that gets called with a function. In particular, the following are (almost) the same thing:
@spam
def eggs(arg): pass

def eggs(arg): pass
eggs = spam(eggs)
So, what happens when the decorator takes parameters? Same thing:
@spam(arg2)
def eggs(arg): pass

def eggs(arg): pass
eggs = spam(arg2)(eggs)
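A minimal runnable sketch of that two-step call (spam and the printed message are stand-ins, as above):

def spam(arg2):
    def decorator(func):                   # this is what spam(arg2) returns
        def wrapper(*args, **kwargs):
            print('decorated with', arg2)  # arg2 is closed over from spam
            return func(*args, **kwargs)
        return wrapper
    return decorator

@spam('extra')
def eggs(arg):
    return arg

eggs(1)  # prints: decorated with extra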
Now, notice that the function _set_ev_cls_dec, which is ultimately returned and used in place of _switch_features_handler, is a local function, defined inside the decorator. That means it can be a closure over variables from the outer function—including the parameters of the outer function. So, it can use the handler argument at call time, plus the ev_cls and dispatchers arguments that it got at decoration time.
So:
set_ev_cls creates a local function that closes over its ev_cls and dispatchers arguments, and returns that function.
That closure gets called with _switch_features_handler as its parameter, and it modifies and returns that parameter by adding a callers attribute, which is a dict of _Caller objects built from that closed-over dispatchers parameter and keyed off that closed-over ev_cls parameter.
Explain how it works without detailing what's going on inside? That kind of sounds like "explain without explaining," but here's a rough walkthrough:
Think of set_ev_cls as a factory for decorators. It's there to catch the arguments at the time the decorator is invoked:
@set_ev_cls(ofp_event.EventOFPSwitchFeatures, CONFIG_DISPATCHER)
And return a function, _set_ev_cls_dec, that has its variables bound to:
ev_cls = ofp_event.EventOFPSwitchFeatures
dispatchers = CONFIG_DISPATCHER
Or put another way, you now have a 'customized' or 'parametrized' decorator that's logically equivalent to:
def custom_decorator(handler):
    if 'callers' not in dir(handler):
        handler.callers = {}
    for e in _listify(ofp_event.EventOFPSwitchFeatures):
        handler.callers[e] = _Caller(_listify(CONFIG_DISPATCHER), e.__module__)
    return handler
(If you captured the values of ofp_event.EventOFPSwitchFeatures and CONFIG_DISPATCHER at the moment the @set_ev_cls(...) was called.)
The custom_decorator of step 1 is applied to _switch_features_handler as a more traditional unparameterized decorator.

daisy-chaining Python/Django custom decorators

Is it good style to daisy-chain Python/Django custom decorators? And pass different arguments than received?
Many of my Django view functions start off with the exact same code:
@login_required
def myView(request, myObjectID):
    try:
        myObj = MyObject.objects.get(pk=myObjectID)
    except:
        return myErrorPage(request)
    try:
        requester = Profile.objects.get(user=request.user)
    except:
        return myErrorPage(request)
    # Do something interesting with requester and myObj here
FYI, this is what the corresponding entry in the urls.py file looks like:
url(r'^object/(?P<myObjectID>\d+)/?$', views.myView, ),
Repeating the same code in many different view functions is not DRY at all. I would like to improve it by creating a decorator that would do this repetitive work for me and make the new view functions much cleaner and look like this:
@login_required
@my_decorator
def myView(request, requester, myObj):
    # Do something interesting with requester and myObj here
So here are my questions:
1. Is this a valid thing to do? Is it good style? Notice that I will be changing the signature of the myView() function. That feels a bit strange and risky to me, but I'm not sure why.
2. If I make multiple such decorators that do some common work but each call the wrapped function with different arguments than the decorator received, is it OK to daisy-chain them together?
3. If #1 and #2 above are OK, what is the best way to indicate to users of myView which arguments they should pass in (since just looking at the parameters in the function definition is no longer really valid)?
That's a very interesting question! Another one has already been answered in depth on the basic usage of decorators, but it does not provide much insight on modifying arguments.
Stackable decorators
You can find in that other question an example of stacked decorators, with the following piece of explanation hidden in a very, very long and detailed answer:
Yes, that's all, it's that simple. @decorator is just a shortcut to:
another_stand_alone_function = my_shiny_new_decorator(another_stand_alone_function)
And that's the magic. As the Python documentation states: a decorator is a function returning another function.
That means you can do:
from functools import wraps

def decorator1(f):
    @wraps(f)
    def wrapper(*args, **kwargs):
        do_something()
        f(*args, **kwargs)
    return wrapper

def decorator2(f):
    @wraps(f)
    def wrapper(*args, **kwargs):
        do_something_else()
        f(*args, **kwargs)
    return wrapper

@decorator1
@decorator2
def myfunc(n):
    print("." * n)

# is equivalent to
def myfunc(n):
    print("." * n)
myfunc = decorator1(decorator2(myfunc))
Decorators are not Decorators
Python decorators might be puzzling for developers who learned OOP with a language where the GoF patterns (which have already used up half the dictionary naming fixes for the language's failings) are the de facto design-pattern shop.
GoF's decorators are subclasses of the component (interface) they're decorating, therefore sharing this interface with any other subclass of that component.
Python decorators are functions returning functions (or classes).
Functions all the way down
A python decorator is a function returning a function, any function.
Most decorators out there are designed to extend the decorated function without getting in the way of its expected behavior. They are shaped after GoF's definition of the Decorator pattern, which describes a way to extend an object while keeping its interface.
But GoF's Decorator is a pattern, whereas python's decorator is a feature.
Python decorators are functions, these functions are expected to return functions (when provided a function).
Adapters
Let's take another GoF pattern : Adapter
An adapter helps two incompatible interfaces to work together.
This is the real world definition for an adapter.
[An Object] adapter contains an instance of the class it wraps.
In this situation, the adapter makes calls to the instance of the wrapped
object.
Take for example an object, say a dispatcher, that calls a function taking some defined parameters, and a function that would do the job but expects a different set of parameters. The parameters for the second function can be derived from those of the first.
A function (a first-class object in Python) that takes the parameters of the first, derives the second's parameters from them, calls the second, and returns a value derived from its result would be an adapter.
A function returning an adapter for the function it is passed would be an adapter factory.
Python decorators are functions returning functions. Including adapters.
def my_adapter(f):
    def wrapper(*args, **kwargs):
        # derive the wrapped function's parameters from the ones received
        newargs, newkwargs = adapt(args, kwargs)
        return f(*newargs, **newkwargs)
    return wrapper

@my_adapter  # This is the contract provider
def myfunc(*args, **kwargs):
    return something()
Oooooh, I see what you did there… is it good style?
I'd say, hell yeah, yet another built-in pattern! But you'd have to forget about GoF Decorators and simply remember that python decorators are functions which return functions. Therefore, the interface you're dealing with is the one of the wrapper function, not the decorated one.
Once you decorate a function, the decorator defines the contract, either promising to keep the interface of the decorated function or abstracting it away. You no longer call the decorated function (it's even tricky to try), you call the wrapper.
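Applied to the Django views from the question, such an adapter might look like this (with_requester_and_obj is a hypothetical name, and the error handling reuses the question's myErrorPage):

from functools import wraps

def with_requester_and_obj(view):
    @wraps(view)
    def wrapper(request, myObjectID, *args, **kwargs):
        # adapt (request, myObjectID) into (request, requester, myObj)
        try:
            myObj = MyObject.objects.get(pk=myObjectID)
            requester = Profile.objects.get(user=request.user)
        except Exception:
            return myErrorPage(request)
        return view(request, requester, myObj, *args, **kwargs)
    return wrapper

@with_requester_and_obj
def myView(request, requester, myObj):
    # Do something interesting with requester and myObj here
    ...

The URL pattern keeps calling with (request, myObjectID); only the wrapper knows about the adapted signature.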
First of all, this block of code:
try:
    myObj = MyObject.objects.get(pk=myObjectID)
except:
    return myErrorPage(request)
can be replaced with:
from django.shortcuts import get_object_or_404
myObj = get_object_or_404(MyObject, pk=myObjectID)
The same applies with the second block of code you have.
That in and of itself makes this a lot more elegant.
If you'd like to go further and implement your own decorator, your best bet is to build on top of @login_required. If you're passing different arguments or don't want to do that, then you can indeed make your own decorator, and it wouldn't be wrong.
1) Yes, chaining decorators is valid as other answers have already pointed out. Good style is subjective, but personally I think it would make your code much harder to read for others. Someone familiar with Django but not your application would need to keep extra context in their head while working with your code. I think it's very important to stick to framework conventions to make your code as maintainable as possible.
2) The answer is yes, it is technically okay to pass in different arguments to the wrapped function, but consider a simple code example of how this would work:
def decorator1(func):
    def wrapper1(a1):
        a2 = "hello from decorator 1"
        func(a1, a2)
    return wrapper1

def decorator2(func):
    def wrapper2(a1, a2):
        a3 = "hello from decorator 2"
        func(a1, a2, a3)
    return wrapper2

@decorator1
@decorator2
def my_func(a1, a2, a3):
    print(a1, a2, a3)

my_func("who's there?")
# Prints: who's there? hello from decorator 1 hello from decorator 2
In my opinion, any person reading this would need to be a mental gymnast to keep context of the method signatures at each level of the decorator stack.
3) I would use a class-based view and override the dispatch() method to set instance variables like this:
class MyView(View):
    @method_decorator(login_required)
    def dispatch(self, *args, **kwargs):
        self.myObj = ...
        self.requester = ...
        return super(MyView, self).dispatch(*args, **kwargs)
The dispatch method is what calls your get()/post() methods. From the django docs:
The as_view entry point creates an instance of your class and calls its dispatch() method. dispatch looks at the request to determine whether it is a GET, POST, etc, and relays the request to a matching method if one is defined
Then you could access these instance variables in your get() and/or post() view methods. The advantage of this approach is that you could extract this out to a base class and use it in any number of View subclasses. It is also a lot more traceable in an IDE because this is standard inheritance.
An example of what a get() request would look like:
class MyView(View):
    def get(self, request, id):
        print('requester is {}'.format(self.requester))
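A sketch of that base-class extraction (ProfileObjectMixin is a hypothetical name; it assumes the URL pattern still captures myObjectID):

from django.contrib.auth.decorators import login_required
from django.shortcuts import get_object_or_404
from django.utils.decorators import method_decorator
from django.views.generic import View

class ProfileObjectMixin(View):
    @method_decorator(login_required)
    def dispatch(self, request, *args, **kwargs):
        # fetch once here; every get()/post() in subclasses can reuse them
        self.myObj = get_object_or_404(MyObject, pk=kwargs['myObjectID'])
        self.requester = get_object_or_404(Profile, user=request.user)
        return super(ProfileObjectMixin, self).dispatch(request, *args, **kwargs)

class MyView(ProfileObjectMixin):
    def get(self, request, myObjectID):
        print('requester is {}'.format(self.requester))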

understanding functools.wraps

I am using Flask-RESTful and trying to have my REST endpoints use the technique shown here.
The main code is
from functools import wraps

def authenticate(func):
    @wraps(func)
    def wrapper(*args, **kwargs):
        if not getattr(func, 'authenticated', True):
            return func(*args, **kwargs)
        acct = basic_authentication()  # custom account lookup function
        if acct:
            return func(*args, **kwargs)
        restful.abort(401)
    return wrapper

class Resource(restful.Resource):
    method_decorators = [authenticate]  # applies to all inherited resources
I do it the same way and it seems to work, but I am not sure what happens with @wraps.
It seems like magic to me at the moment, and I did not understand the following:
a.) The function wrapped with @wraps is passed to the wrapper, but then what is the wrapper returning?
Possible answer: everything that was passed to the function initially?
If yes, how can I pass more information, like the acct object, so that my function receives the account object and I don't have to do a database fetch for it?
UPDATE
Based on the example, my rest endpoint looks like
class UserResource(RestResource):
    def get(self, uuid):
        return {'method': 'get user -> ' + uuid}
and I call it like
curl -X GET http://127.0.0.1:5000/users/validUUID
Now, when every request is authenticated, I check whether a valid acct object exists, and if it does, I delegate control to the endpoint.
Question:
Since I am actually making one database call to find the acct object, is it possible to pass it to the endpoint when a valid acct is located?
This way two things happen
a.) I know the call is authenticated
b.) I reuse the acct object which I can use for my further work, rather than making the DB call again and get the acct object from validUUID again
How can I achieve this ?
authenticate is a decorator -- it takes a function and returns a modified version of that function (usually implemented by defining a wrapper around the original and returning that wrapper).
Now, the problem with wrappers is that they often don't act exactly like the original function in some ways -- they may be missing docstrings, have the wrong __name__ (wrapper instead of what it should be called), and other blemishes. This might be important if some other code is using that extra information. functools.wraps is a simple function that adds this information from the original function (here, func) to the wrapper function, so it behaves more like the original function. (Technically, it is itself a decorator, which is the confusing part, but you don't have to worry about that detail. Just know that it is a nice tool that copies attributes from a wrapped function to a wrapper function).
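A quick way to see what @wraps buys you (a minimal sketch):

from functools import wraps

def deco(func):
    @wraps(func)  # remove this line and greet.__name__ becomes 'wrapper'
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs)
    return wrapper

@deco
def greet():
    """Say hello."""

print(greet.__name__)  # greet
print(greet.__doc__)   # Say hello.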
Thus, when you write
new_function = authenticate(old_function)
or more commonly
@authenticate
def function(...)
new_function will look more like old_function.
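As for reusing the acct object: one approach (a sketch, not Flask-RESTful's own mechanism; it assumes your endpoint methods accept an acct keyword argument) is to inject it into kwargs from the wrapper:

from functools import wraps

def authenticate(func):
    @wraps(func)
    def wrapper(*args, **kwargs):
        if not getattr(func, 'authenticated', True):
            return func(*args, **kwargs)
        acct = basic_authentication()  # the one database call
        if acct:
            kwargs['acct'] = acct      # hand the object to the endpoint
            return func(*args, **kwargs)
        restful.abort(401)
    return wrapper

class UserResource(RestResource):
    def get(self, uuid, acct=None):
        # acct is the already-fetched account; no second DB lookup needed
        return {'method': 'get user -> ' + uuid}

Alternatively, the wrapper could stash the object on flask.g and the endpoint could read it from there, which avoids changing method signatures.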

"self" inside plain function?

I've got a bunch of functions (outside of any class) where I've set attributes on them, like funcname.fields = 'xxx'. I was hoping I could then access these variables from inside the function with self.fields, but of course it tells me:
global name 'self' is not defined
So... what can I do? Is there some magic variable I can access? Like __this__.fields?
A few people have asked "why?". You will probably disagree with my reasoning, but I have a set of functions that all must share the same signature (accept only one argument). For the most part, this one argument is enough to do the required computation. However, in a few limited cases, some additional information is needed. Rather than forcing every function to accept a long list of mostly unused variables, I've decided to just set them on the function so that they can easily be ignored.
Although, it occurs to me now that you could just use **kwargs as the last argument if you don't care about the additional args. Oh well...
Edit: Actually, some of the functions I didn't write, and would rather not modify to accept the extra args. By "passing in" the additional args as attributes, my code can work both with my custom functions that take advantage of the extra args, and with third party code that don't require the extra args.
Thanks for the speedy answers :)
self isn't a keyword in Python, it's just a normal variable name. When creating instance methods, you can name the first parameter whatever you want; self is just a convention.
You should almost always prefer passing arguments to functions over setting attributes for input, but if you must, you can do so using the function's own name to access variables within it:
def a():
    if a.foo:
        pass  # blah

a.foo = False
a()
see python function attributes - uses and abuses for when this comes in handy. :D
def foo():
    print(foo.fields)

foo.fields = [1, 2, 3]
foo()
# [1, 2, 3]
There is nothing wrong with adding attributes to functions. Many memoizers use this to cache results in the function itself.
For example, notice the use of func.cache:
from decorator import decorator

@decorator
def memoize(func, *args, **kw):
    # Author: Michele Simoniato
    # Source: http://pypi.python.org/pypi/decorator
    if not hasattr(func, 'cache'):
        func.cache = {}
    if kw:  # frozenset is used to ensure hashability
        key = args, frozenset(kw.items())
    else:
        key = args
    cache = func.cache  # attribute added by memoize
    if key in cache:
        return cache[key]
    else:
        cache[key] = result = func(*args, **kw)
        return result
You can't make "a function accessing its own attributes" work correctly in all situations - see how can python function access its own attributes? for details - but here is a quick demonstration:
>>> def f(): return f.x
...
>>> f.x = 7
>>> f()
7
>>> g = f
>>> g()
7
>>> del f
>>> g()
Traceback (most recent call last):
File "<interactive input>", line 1, in <module>
File "<interactive input>", line 1, in f
NameError: global name 'f' is not defined
Basically, most methods directly or indirectly rely on accessing the function object through a lookup by name in globals; if the original function name is deleted, this stops working. There are other kludgey ways of accomplishing this, like defining a class or a factory, but thanks to your explanation it is clear you don't really need that.
Just use the keyword catch-all argument mentioned above, like so:
def fn1(oneArg):
    pass  # do the due

def fn2(oneArg, **kw):
    if 'option1' in kw:
        print('called with option1=', kw['option1'])
    # do the rest

fn2(42)
fn2(42, option1='something')
Not sure what you mean in your comment about handling TypeError - that won't arise when using **kw. This approach works very well for some Python built-in functions - check min(), max(), sorted(). Recently sorted(dct, key=dct.get, reverse=True) came in very handy to me in a CodeGolf challenge :)
Example:
>>> def x(): pass
...
>>> x
<function x at 0x100451050>
>>> x.hello = "World"
>>> x.hello
'World'
You can set attributes on functions, as these are just plain objects, but I have actually never seen something like this in real code.
Plus, self is not a keyword, just another variable name, which happens to be bound to the particular instance of the class. self is passed implicitly, but received explicitly.
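For example, this works, even though self is the near-universal convention:

class Greeter:
    def greet(this, name):           # any name can receive the instance
        return "Hello " + name

print(Greeter().greet("World"))      # the instance is passed implicitly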
If you want globally set parameters for a callable 'thing', you could always create a class and implement the __call__ method.
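A sketch of that idea (the class name and attributes are illustrative):

class Fn(object):
    def __init__(self, fields):
        self.fields = fields         # the 'globally set' parameters live here

    def __call__(self, arg):
        # callable like a plain function, but with a real self available
        return (arg, self.fields)

fn = Fn(fields='xxx')
fn(42)  # (42, 'xxx')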
There is no special way, within a function's body, to refer to the function object whose code is executing. Simplest is just to use funcname.field (with funcname being the function's name within the namespace it's in, which you indicate is the case -- it would be harder otherwise).
This isn't something you should do. I can't think of any way to do what you're asking except some walking around on the call stack and some weird introspection -- which isn't something that should happen in production code.
That said, I think this actually does what you asked:
import inspect

_code_to_func = dict()

def enable_function_self(f):
    _code_to_func[f.__code__] = f
    return f

def get_function_self():
    f = inspect.currentframe()
    code_obj = f.f_back.f_code
    return _code_to_func[code_obj]

@enable_function_self
def foo():
    me = get_function_self()
    print(me)

foo()
While I agree with the rest that this is probably not good design, the question did intrigue me. Here's my first solution, which I may update once I get decorators working. As it stands, it relies pretty heavily on being able to read the stack, which may not be possible in all implementations (something about sys._getframe() not necessarily being present...).
import sys, inspect

def cute():
    this = sys.modules[__name__].__dict__.get(inspect.stack()[0][3])
    print("My face is..." + this.face)

cute.face = "very cute"
cute()
What do you think? :3
You could use the following (hideously ugly) code:
class Generic_Object(object):
    pass

def foo(a1, a2, self=Generic_Object()):
    self.args = (a1, a2)
    print("len(self.args):", len(self.args))
    return None
... as you can see it would allow you to use "self" as you described. You can't use an "object()" directly because you can't "monkey patch(*)" values into an object() instance. However, normal subclasses of object (such as the Generic_Object() I've shown here) can be "monkey patched"
If you wanted to always call your function with a reference to some object as the first argument that would be possible. You could put the defaulted argument first, followed by a *args and optional **kwargs parameters (through which any other arguments or dictionaries of options could be passed during calls to this function).
This is, as I said hideously ugly. Please don't ever publish any code like this or share it with anyone in the Python community. I'm only showing it here as a sort of strange educational exercise.
An instance method is like a function in Python. However, it exists within the namespace of a class (thus it must be accessed via an instance, myobject.foo() for example) and it is called with a reference to "self" (analogous to the "this" pointer in C++) as the first argument. Also, there's a method resolution process which causes the interpreter to search the namespace of the instance, then its class, and then each of the parent classes and so on, up through the inheritance tree.
An unbound function is called with whatever arguments you pass to it. There can't be any sort of automatically pre-pended object/instance reference in the argument list. Thus, writing a function with an initial argument named "self" is meaningless. (It's legal because Python doesn't place any special meaning on the name "self." But it's meaningless because callers to your function would have to manually supply some sort of object reference in the argument list, and it's not at all clear what that should be. Just some bizarre "Generic_Object" which then floats around in the global variable space?)
I hope that clarifies things a bit. It sounds like you're suffering from some very fundamental misconceptions about how Python and other object-oriented systems work.
("Monkey patching" is a term used to describe the direct manipulation of an objects attributes -- or "instance variables" by code that is not part of the class hierarchy of which the object is an instance).
As another alternative, you can make the functions into bound class methods like so:
class _FooImpl(object):
    a = "Hello "

    @classmethod
    def foo(cls, param):
        return cls.a + param

foo = _FooImpl.foo

# later...
print(foo("World"))  # yes, Hello World

# and if you have to change an attribute:
foo.__self__.a = "Goodbye "
If you want functions to share an attribute namespace, just make them part of the same class. If not, give each its own class.
What exactly are you hoping "self" would point to, if the function is defined outside of any class? If your function needs some global information to execute properly, you need to send this information to the function in the form of an argument.
If you want your function to be context aware, you need to declare it within the scope of an object.
