Passing an object to a decorator - python

I'm trying to create a Python file containing all of the decorators which I need to use in the rest of the program. These decorators are stored inside a class, which I called Decorators. Then I tried to add a decorator that checks whether the arguments of a decorated function match the argument types passed to the decorator itself (I took this kind of decorator from example 4 at https://www.python.org/dev/peps/pep-0318/#examples, but I changed it a bit to better fit my coding style). The syntax is like this:
class Decorators(object):
    """ Decorators class: contains all the decorators """
    @classmethod
    def argument_consistency(cls, *function_arguments_type):
        """ check the consistency of the decorated function's arguments and their types """
        def check_arguments(function):
            """ check if the number of passed types differs from the number of accepted arguments """
            # check if the number of passed types is different from the number of accepted arguments
            if len(function_arguments_type) != function.__code__.co_argcount:
                raise Exception("the number of passed argument types is different "
                                "from the number of accepted arguments")
            def inner_function(*args, **kwds):
                """ check if the types of the passed arguments match the requested ones """
                # {args} is the whole tuple of arguments passed to the function and
                # {function_arguments_type} is the whole tuple of types, so zip pairs
                # the n-th argument with the n-th type; for example,
                # zip((1, 'a'), (int, str)) yields (1, int) and ('a', str)
                for (arguments, argument_types) in zip(args, function_arguments_type):
                    # check if the n-th argument's type matches the requested one
                    if type(arguments) != argument_types:
                        raise Exception(f"The argument {arguments} doesn't match the type, "
                                        f"which must be {argument_types}")
                # call the passed function with the passed arguments
                return function(*args, **kwds)
            # change the name of inner_function to {function}'s name
            inner_function.__name__ = function.__name__
            # return the inner function
            return inner_function
        # return the check_arguments function
        return check_arguments
To test the previous decorator I created a simple class A with a function a:
class A():
    def __init__(self):
        pass

    @Decorators.argument_consistency(str, str)
    def a(self, str1, str2):
        print(f"{str1} AND {str2}")

a = A()
a.a("ciao", "ciao2")
Obviously, when I decorated the function a I got an error (raised by the argument_consistency decorator itself), because the length of the list of argument types is different from the length of the list of accepted arguments: I hadn't accounted for the self parameter. Having understood this, I tried to pass self to the decorator, but I got NameError: name 'self' is not defined (this happens even if I pass type(self)); then I tried to pass the class A itself, but I still got the same error. So I tried to fix this by adding one line between the for loop and the type check:
if not (args.index(arguments) == 0 and argument_types is None):
    # check if the n-th argument's type matches the requested one
    if type(arguments) != argument_types:
        # the rest of the code
        pass
These lines check whether the type supplied for the first argument is None: if it is, the first parameter of the function is taken to be self, and the function skips checking whether None equals the type of self (which it obviously doesn't). This is very cumbersome and the opposite of elegant, so I wonder whether there is a way to avoid this fix and directly pass the type of self to the decorator.

You can make a stub class for the self argument of object/class methods:
class selftype:
    pass
and pass it to the decorator:
@Decorators.argument_consistency(selftype, str, str)
def a(self, str1, str2):
    print(f"{str1} AND {str2}")
Then check in inner_function if the first type in decorator is your stub type:
def inner_function(*args, **kwds):
    for (i, (argument, argument_type)) in enumerate(zip(args, function_arguments_type)):
        if argument_type == selftype and i == 0:
            pass
        # check if the n-th argument's type matches the requested one
        elif type(argument) != argument_type:
            raise Exception(f"The argument {argument} doesn't match the type, "
                            f"which must be {argument_type}")
    # call the passed function with the passed arguments
    return function(*args, **kwds)
Not very elegant, but this works for both object/class methods and plain functions:
class A():
    def __init__(self):
        pass

    @Decorators.argument_consistency(selftype, str, str)
    def a(self, str1, str2):
        print(f"{str1} AND {str2}")

a = A()
a.a("ciao", "ciao2")

@Decorators.argument_consistency(str)
def b(str1):
    print(f"{str1}")

b("a")
Also, if you want to use your decorator together with @classmethod or @staticmethod, make sure to apply your decorator first; otherwise it won't be able to access the function.__code__ attribute.
I quite liked the solution proposed by @go2nirvana in the comments, but unfortunately it didn't work for me: inspect.ismethod(function) returns False inside the decorator. (At decoration time the function is still a plain function, not yet bound to an instance, which is presumably why.)
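Building on the same idea, here is a sketch (not from the original answer) that skips self by inspecting the parameter names with inspect.signature, so no stub type is needed. Treating a first parameter named self or cls as the instance/class is a heuristic assumption, and this sketch returns a value instead of printing:

```python
import inspect
from functools import wraps

def argument_consistency(*expected_types):
    """Sketch of a type-checking decorator that skips 'self'/'cls' by name."""
    def check_arguments(function):
        # Parameter names, in declaration order, of the undecorated function.
        param_names = list(inspect.signature(function).parameters)
        # Heuristic: ignore a leading 'self' or 'cls' parameter when matching.
        skip = 1 if param_names and param_names[0] in ("self", "cls") else 0
        if len(expected_types) != len(param_names) - skip:
            raise TypeError("the number of expected types does not match "
                            "the number of accepted arguments")
        @wraps(function)
        def inner_function(*args, **kwds):
            # Pair each non-self argument with its expected type.
            for argument, expected in zip(args[skip:], expected_types):
                if type(argument) is not expected:
                    raise TypeError(f"argument {argument!r} should be of type "
                                    f"{expected.__name__}")
            return function(*args, **kwds)
        return inner_function
    return check_arguments

class A:
    @argument_consistency(str, str)
    def a(self, str1, str2):
        return f"{str1} AND {str2}"
```

With this, `A().a("ciao", "ciao2")` passes the check, while `A().a("ciao", 2)` raises TypeError.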

Related

Default argument for a multiple argument function only when no argument supplied

Let's consider a function that accepts multiple arguments as under:
def my_function(*args):
    my_list = []
    for _ in args:
        my_list.append(_)
    another_function(my_list)
Now, the issue that I face in my scenario is that: I need my_list to contain at least one value, any argument, but at least one.
A user can do my_function(arg1), my_function(arg1, arg2), my_function(arg1, arg2, arg3) and so on. But if a user does my_function(), I need to provide a default argument (say arg1) to the function.
If I add a required parameter, then the argument defaultarg will be compulsory and has to be supplied with some value:
def my_function(defaultarg, *args):
    #function
If I put it as optional argument, then I will have to provide a default value to the argument:
def my_function(optionalarg = defaultvalue, *args):
    #function
Both these ways do not work for me as I can't make it compulsory and I can't give a default value for it.
How do I create this function so that if no arguments are passed, the function assumes one argument to have been passed?
As of now, I am handling this by putting a if...else in the function as under:
def my_function(*args):
    my_list = []
    if len(args) == 0:
        my_list.append('defaultarg')
    else:
        for _ in args:
            my_list.append(_)
    another_function(my_list)
Is there any better way to do this?
I can't give a default value for it
I don't understand this part of your question. You are providing the default value 'defaultarg' in your own solution?! My answer assumes that you can get the default value from somewhere.
Anyway, if you want to keep the f(*args) signature, you can check whether the args tuple is empty and supply a default value.
def my_function(*args):
    if not args:
        args = (default_value,)
    # more code
or
def my_function(*args):
    if not args:
        return my_function(default_value)
    # more code
You won't get around explicitly checking whether args is empty, but maybe you'll like one of these proposals better than your version.
edit:
You could also write a parameterized decorator.
def with_default_value(default_value):
    def with_default_value_decorator(f):
        def f_new(*args):
            if not args:
                return f(default_value)
            return f(*args)
        return f_new
    return with_default_value_decorator
Example:
@with_default_value('hi')
def my_print(*args):
    print(' '.join(args))
Demo:
>>> my_print('hello', 'world')
hello world
>>> my_print()
hi

How instance attributes are passed to the decorator inner function?

I recently studied how decorators work in python, and found an example which integrates decorators with nested functions.
The code is here :
def integer_check(method):
    def inner(ref):
        if not isinstance(ref._val1, int) or not isinstance(ref._val2, int):
            raise TypeError('val1 and val2 must be integers')
        else:
            return method(ref)
    return inner
class NumericalOps(object):
    def __init__(self, val1, val2):
        self._val1 = val1
        self._val2 = val2

    @integer_check
    def multiply_together(self):
        return self._val1 * self._val2

    def power(self, exponent):
        return self.multiply_together() ** exponent
y = NumericalOps(1, 2)
print(y.multiply_together())
print(y.power(3))
My question is: how does the inner function's argument ("ref") access the instance attributes (ref._val1 and ref._val2)?
It seems like ref equals the instance, but I have no idea how that happens.
Let's first recall how a decorator works:
Decorating the method multiply_together with the decorator @integer_check is equivalent to adding the line multiply_together = integer_check(multiply_together), and by the definition of integer_check, this is equivalent to multiply_together = inner.
Now, when you call the method multiply_together, since this is an instance method, Python implicitly adds the class instance used to invoke the method as its first (and only, in this case) argument. But multiply_together is, actually, inner, so in fact inner is invoked with the class instance as an argument. This instance is mapped to the parameter ref, and through this parameter the function gets access to the required instance attributes.
One explanation I found some time ago about the self argument is that this:
y.multiply_together()
is roughly the same as
NumericalOps.multiply_together(y)
So now that you use that decorator, it returns the function inner, which requires the ref argument, so roughly this happens (on a lower level):
NumericalOps.inner(y)
because inner "substitutes" multiply_together while also adding the extra functionality.
inner replaces the original function as the value of the class attribute.
@integer_check
def multiply_together(self):
    return self._val1 * self._val2

# def multiply_together(self):
#     ...
#
# multiply_together = integer_check(multiply_together)
first defines a function and binds it to the name multiply_together. That function is then passed as the argument to integer_check, and then the return value of integer_check is bound to the name multiply_together. The original function is now only referenced by the name method that is local to integer_check.
The definition of inner implies that integer_check can only be applied to functions whose first argument will have attributes named _val1 and _val2.
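To make the mechanism concrete, here is a minimal runnable variant of the example (show_first_arg is a hypothetical decorator name, not from the question):

```python
def show_first_arg(method):
    def inner(ref):
        # 'ref' receives the instance, exactly as 'self' would.
        assert isinstance(ref, NumericalOps)
        return method(ref)
    return inner

class NumericalOps:
    def __init__(self, val1, val2):
        self._val1 = val1
        self._val2 = val2

    @show_first_arg
    def multiply_together(self):
        return self._val1 * self._val2

y = NumericalOps(2, 3)
# The attribute lookup finds 'inner'; calling it binds y as the first argument,
# so these two calls are equivalent.
assert y.multiply_together() == 6
assert NumericalOps.multiply_together(y) == 6
# The class attribute really is the decorator's inner function now.
assert y.multiply_together.__name__ == "inner"
```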

Python closure and function attributes

I tried to reimplement something like partial (which later will have more behavior). Now in the following example lazycall1 seems to work just as fine as lazycall2, so I don't understand why the documentation of partial suggests using the longer second version. Any suggestions? Can it get me in trouble?
def lazycall1(func, *args, **kwargs):
    def f():
        func(*args, **kwargs)
    return f

def lazycall2(func, *args, **kwargs):
    def f():
        func(*args, **kwargs)
    f.func = func  # why do I need that?
    f.args = args
    f.kwargs = kwargs
    return f
def A(x):
    print("A", x)

def B(x):
    print("B", x)

a1 = lazycall1(A, 1)
b1 = lazycall1(B, 2)
a1()
b1()
a2 = lazycall2(A, 3)
b2 = lazycall2(B, 4)
a2()
b2()
EDIT: Actually the answers given so far aren't quite right. Even with extra call-time arguments it would work. Is there another reason?
def lazycall(func, *args):
    def f(*args2):
        return func(*(args + args2))
    return f

def sum_up(a, b):
    return a + b

plusone = lazycall(sum_up, 1)
plustwo = lazycall(sum_up, 2)
print(plusone(6))  # 7
print(plustwo(9))  # 11
The only extra thing the second form has, are some extra properties. This might be helpful if you start passing around the functions returned by lazycall2, so that the receiving function may make decisions based on these values.
functools.partial can accept additional arguments - or overridden arguments - in the inner, returned function. Your inner f() functions don't, so there's no need for what you're doing in lazycall2. However, if you wanted to do something like this:
def sum(a, b):
    return a + b

plusone = lazycall3(sum, 1)
plusone(6)  # 7
You'd need to do what is shown in those docs.
Look closer at the argument names of the inner function newfunc on the Python documentation page you link to: they differ from the outer function's parameter names, args vs. fargs, keywords vs. fkeywords. Their implementation of partial saves the arguments that the outer function was given and adds them to the arguments given to the inner function.
Since you reuse the exact same argument names in your inner function, the original arguments to the outer function won't be accessible in there.
As for setting func, args, and kwargs attributes on the returned function: a function is an object in Python, and you can set attributes on it. These attributes allow you to get access to the original function and arguments after you have passed them into your lazycall functions. So a1.func will be A and a1.args will be (1,).
If you don't need to keep track of the original function and arguments, you should be fine
with your lazycall1.
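Putting the two points together (distinct inner/outer argument names, plus the introspection attributes), a partial-style version might be sketched like this (lazycall3 is a hypothetical name, following the one used above):

```python
def lazycall3(func, *args, **kwargs):
    # 'args'/'kwargs' are the arguments saved at creation time;
    # 'fargs'/'fkwargs' are the ones supplied at call time. Using
    # different names keeps both accessible, as in the documented
    # implementation of functools.partial.
    def f(*fargs, **fkwargs):
        merged = dict(kwargs)
        merged.update(fkwargs)  # call-time keywords override saved ones
        return func(*args, *fargs, **merged)
    # Expose the originals for introspection, like functools.partial does.
    f.func = func
    f.args = args
    f.kwargs = kwargs
    return f

def sum_up(a, b):
    return a + b

plusone = lazycall3(sum_up, 1)
assert plusone(6) == 7
assert plusone.func is sum_up and plusone.args == (1,)
```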

python check if function accepts **kwargs

is there a way to check if a function accepts **kwargs before calling it e.g.
def FuncA(**kwargs):
    print 'ok'

def FuncB(id = None):
    print 'ok'

def FuncC():
    print 'ok'

args = {'id': '1'}
FuncA(**args)
FuncB(**args)
FuncC(**args)
When I run this, FuncA and FuncB are okay, but FuncC fails with got an unexpected keyword argument 'id', as it doesn't accept any arguments.
try:
    f(**kwargs)
except TypeError:
    # do stuff
    pass
It's easier to ask forgiveness than permission.
import inspect

def foo(a, b, **kwargs):
    pass

args, varargs, varkw, defaults = inspect.getargspec(foo)
assert varkw == 'kwargs'
This only works for Python functions. Functions defined in C extensions (and built-ins) may be tricky and sometimes interpret their arguments in quite creative ways. There's no way to reliably detect which arguments such functions expect. Refer to function's docstring and other human-readable documentation.
func is the function in question.
With Python 2, it's:
inspect.getargspec(func).keywords is not None
Python 3 is a bit trickier; following https://www.python.org/dev/peps/pep-0362/, the kind of the parameter must be VAR_KEYWORD:
Parameter.VAR_KEYWORD - a dict of keyword arguments that aren't bound to any other parameter. This corresponds to a "**kwargs" parameter in a Python function definition.
any(param for param in inspect.signature(func).parameters.values() if param.kind == param.VAR_KEYWORD)
For Python 3 you should use inspect.getfullargspec.
import inspect

def foo(**bar):
    pass

arg_spec = inspect.getfullargspec(foo)
assert arg_spec.varkw and arg_spec.varkw == 'bar'
Seeing that there are a multitude of different answers in this thread, I thought I would give my two cents, using inspect.signature().
Suppose you have this method:
def foo(**kwargs):
    pass
You can test if **kwargs are in this method's signature:
import inspect
sig = inspect.signature(foo)
params = sig.parameters.values()
has_kwargs = any(p.kind == p.VAR_KEYWORD for p in params)
More
Getting the parameters in which a method takes is also possible:
import inspect
sig = inspect.signature(foo)
params = sig.parameters.values()
for param in params:
    print(param.kind)
You can also store them in a variable like so:
kinds = [param.kind for param in params]
# [<_ParameterKind.VAR_KEYWORD: 4>]
Other than just keyword arguments, there are 5 parameter kinds in total, which are as follows:
POSITIONAL_ONLY # parameters must be positional
POSITIONAL_OR_KEYWORD # parameters can be positional or keyworded (default)
VAR_POSITIONAL # *args
KEYWORD_ONLY # parameters must be keyworded
VAR_KEYWORD # **kwargs
Descriptions in the official documentation can be found here.
Examples
POSITIONAL_ONLY
def foo(a, /):
    # the '/' enforces that all preceding parameters must be positional
    ...

foo(1)    # valid
foo(a=1)  # invalid
POSITIONAL_OR_KEYWORD
def foo(a):
    # 'a' can be passed by position or by keyword;
    # this is the default and most common parameter kind
    ...
VAR_POSITIONAL
def foo(*args):
    ...
KEYWORD_ONLY
def foo(*, a):
    # the '*' enforces that all following parameters must be keyworded
    ...

foo(a=1)  # valid
foo(1)    # invalid
VAR_KEYWORD
def foo(**kwargs):
    ...
It appears that you want to check whether the function accepts an 'id' keyword argument. You can't really do that by inspection, because the function might not be a normal function, or you might have a situation like this:
def f(*args, **kwargs):
    return h(*args, **kwargs)

g = lambda *a, **kw: h(*a, **kw)

def h(arg1=0, arg2=2):
    pass
f(id=3) still fails
Catching TypeError as suggested is the best way to do that, but you can't really figure out what caused the TypeError. For example, this would still raise a TypeError:
def f(id=None):
    return "%d" % id

f(**{'id': '5'})
And that might be an error that you want to debug. And if you're doing the check to avoid some side effects of the function, they might still be present if you catch it. For example:
class A(object):
    def __init__(self):
        self._items = set([1, 2, 3])

    def f(self, id):
        return self._items.pop() + id

a = A()
a.f(**{'id': '5'})
My suggestion is to try to identify the functions by another mechanism. For example, pass objects with methods instead of functions, and call only the objects that have a specific method. Or add a flag to the object or the function itself.
According to https://docs.python.org/2/reference/datamodel.html
you should be able to test for use of **kwargs using co_flags:
>>> def blah(a, b, kwargs):
...     pass
>>> def blah2(a, b, **kwargs):
...     pass
>>> (blah.func_code.co_flags & 0x08) != 0
False
>>> (blah2.func_code.co_flags & 0x08) != 0
True
Though, as noted in the reference, this may change in the future, so I would advise being extra careful; add some unit tests to check that this behaviour is still in place.
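On Python 3, the same flag check can use the named constant inspect.CO_VARKEYWORDS instead of the magic number 0x08 (a sketch; accepts_kwargs is a hypothetical helper name, and this only works on plain Python functions, not built-ins):

```python
import inspect

def accepts_kwargs(func):
    # CO_VARKEYWORDS is the named constant behind the 0x08 magic number:
    # it is set on the code object when the function takes **kwargs.
    return bool(func.__code__.co_flags & inspect.CO_VARKEYWORDS)

def blah(a, b, kwargs):
    pass

def blah2(a, b, **kwargs):
    pass

assert accepts_kwargs(blah) is False
assert accepts_kwargs(blah2) is True
```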

Named keywords in decorators?

I've been playing around in depth with attempting to write my own version of a memoizing decorator before I go looking at other people's code. It's more of an exercise in fun, honestly. However, in the course of playing around I've found I can't do something I want with decorators.
def addValue( func, val ):
    def add( x ):
        return func( x ) + val
    return add
@addValue( val=4 )
def computeSomething( x ):
    #function gets defined
If I want to do that I have to do this:
def addTwo( func ):
    return addValue( func, 2 )

@addTwo
def computeSomething( x ):
    #function gets defined
Why can't I use keyword arguments with decorators in this manner? What am I doing wrong and can you show me how I should be doing it?
You need to define a function that returns a decorator:
def addValue(val):
    def decorator(func):
        def add(x):
            return func(x) + val
        return add
    return decorator
When you write @addTwo, the value of addTwo is directly used as a decorator. However, when you write @addValue(4), addValue(4) is first evaluated by calling the addValue function, and the result is then used as the decorator.
You want to partially apply the function addValue - give the val argument, but not func. There are generally two ways to do this:
The first one is called currying and used in interjay's answer: instead of a function with two arguments, f(a,b) -> res, you write a function of the first arg that returns another function that takes the 2nd arg g(a) -> (h(b) -> res)
The other way is a functools.partial object. It stores the function together with the arguments you give at creation time (val in your case); when you later call the partial, it combines the stored arguments with the new ones and calls the function.
from functools import partial

@partial(addValue, val=2)  # you can call this addTwo
def computeSomething( x ):
    return x
Partials are usually a much simpler solution for this partial application problem, especially with more than one argument.
Decorators with any kind of arguments -- named/keyword ones, unnamed/positional ones, or some of each -- essentially, ones you call on the @name line rather than just mention there -- need a double level of nesting (while the decorators you just mention have a single level of nesting). That goes even for argument-less ones if you want to call them on the @ line -- here's the simplest, do-nothing, double-nested decorator:
def double():
    def middling(f):
        return f
    return middling
You'd use this as
@double()
def whatever ...
note the parentheses (empty in this case, since no arguments are needed or wanted): they mean you're calling double, which returns middling, which decorates whatever.
Once you've seen the difference between "calling" and "just mentioning", adding (e.g. optional) named args is not hard:
def doublet(foo=23):
    def middling(f):
        return f
    return middling
usable either as:
@doublet()
def whatever ...
or as:
@doublet(foo=45)
def whatever ...
or equivalently as:
@doublet(45)
def whatever ...
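To tie the two answers together, here is the factory version end to end; computeSomething's body is purely illustrative:

```python
def addValue(val):
    # Called on the @ line: returns the actual decorator.
    def decorator(func):
        def add(x):
            return func(x) + val
        return add
    return decorator

@addValue(4)            # addValue(4) runs first; its result decorates the function
def computeSomething(x):
    return x * 2

assert computeSomething(3) == 10  # (3 * 2) + 4
```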
