Decorators with parameters - Python

I have a collection of functions with (mostly) shared parameters but different processes. I'd like to use a decorator to add the description for each parameter to a function's headline-level docstring.
I've tried to mimic the structure found in this answer by incorporating a nested function within appender but failed. I've also tried functools.partial but something is slightly off.
My attempt:
def appender(func, *args):
    """Appends additional parameter descriptions to func's __doc__."""
    def _doc(func):
        params = ''.join([defaultdocs[arg] for arg in args])
        func.__doc__ += '\n' + params
        return func
    return _doc
defaultdocs = {
    'a':
    """
    a : int, default 0
        the first parameter
    """,
    'b':
    """
    b : int, default 1
        the second parameter
    """
}
@appender('a')
def f(a):
    """Title-level docstring."""
    return a

@appender('a', 'b')
def g(a, b):
    """Title-level docstring."""
    return a + b
This fails, I believe, because the first argument passed to appender is interpreted as func. So when I view the resulting docstring for g I get:

print(g.__doc__)
Title-level docstring.

b : int, default 1
    the second parameter

because, again, 'a' is interpreted as func when I want it to be the first element of *args. How can I correct this?
Desired result:
print(g.__doc__)
Title-level docstring.

a : int, default 0
    the first parameter

b : int, default 1
    the second parameter

This happens because the first value you pass actually gets captured as the func argument.
To write decorators that take arguments in Python, you need two levels of functions: an external function that accepts the decorator arguments and an internal function that accepts the original function. Callable decorators are just higher-order functions that return other decorators. For example:
def appender(*args):  # This is called when the decorator is invoked,
                      # e.g. @appender('a', 'b')
    """Appends additional parameter descriptions to func's __doc__."""
    def _doc(func):  # This is called when the function is about
                     # to be decorated
        params = ''.join([defaultdocs[arg] for arg in args])
        func.__doc__ += '\n' + params
        return func
    return _doc
The external (appender) function acts as a factory for the new decorator, while the _doc function is the actual decorator. Always pass things this way:
- Pass the decorator arguments to the external function
- Pass the original function to the internal function
Once Python sees this:

@appender('a', 'b')
def foo(): pass
...it will do something like this under the hood:
foo = appender('a', 'b')(foo)
...which expands to this:
decorator = appender('a', 'b')
foo = decorator(foo)
Because of how scopes work in Python, each newly returned _doc function instance has its own local args value captured from the external function.
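With the corrected factory in place, the decorator behaves as the question intends. A quick check, reusing the defaultdocs dict from the question (the exact whitespace in the output depends on how the dict strings are indented):

@appender('a', 'b')
def g(a, b):
    """Title-level docstring."""
    return a + b

print(g.__doc__)   # Title-level docstring, then the 'a' and 'b' descriptions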

An alternative solution that uses inspect.signature to collect the decorated function's parameters.
import inspect
import textwrap

def appender(defaultdocs):
    def _doc(func):
        params = inspect.signature(func).parameters
        params = [param.name for param in params.values()]
        params = ''.join([textwrap.dedent(defaultdocs[param])
                          for param in params])
        func.__doc__ += '\n\nParameters\n' + 10 * '=' + params
        return func
    return _doc
Example:
# default docstrings for parameters that are re-used often;
# a class implementation is not a good alternative in my specific case
defaultdocs = {
    'a':
    """
    a : int, default 0
        the first parameter""",
    'b':
    """
    b : int, default 1
        the second parameter"""
}
@appender(defaultdocs)
def f(a):
    """Title-level docstring."""
    return a

@appender(defaultdocs)
def g(a, b):
    """Title-level docstring."""
    return a + b
This appends the descriptions for a and b to g.__doc__ without needing to specify them in the decorator:
help(g)

Help on function g in module __main__:

g(a, b)
    Title-level docstring.

    Parameters
    ==========
    a : int, default 0
        the first parameter
    b : int, default 1
        the second parameter

Related

What is the point of passing keywords equal to `None` as parameters in a function?

I am trying to translate a Python Voronoi generator to Java, but I am not very familiar with keyword arguments, and I am confused by the keyword argument written as a parameter in a function definition.
I've read this question, but it only covers calling arguments, not defining them.
It seems like the function is just setting a and b to None. I don't understand why that is necessary. Why not just type None instead of using parameters like that?
What is the point of using keyword parameters equal to None in function definitions?
The function:
class Arc:
    p = None
    pprev = None
    pnext = None
    e = None
    s0 = None
    s1 = None

    def __init__(self, p, a=None, b=None):
        self.p = p
        self.pprev = a
        self.pnext = b
        self.e = None
        self.s0 = None
        self.s1 = None
Keyword arguments can have a default value. In this case, the default value of both a and b is None. This means that when these arguments aren't passed into the function, a and b are set to None.
Example:
def func(a=None):
    print(a)

func()   # prints None
But if you do something like this:
def func(a):
    print(a)

func()   # TypeError: func() missing 1 required positional argument: 'a'
def __init__(self, p, a=None, b=None):
That line is defining the __init__ method, which you can think of as a constructor for the class. It's not calling the method.
Setting a and b equal to None in the method definition makes None their default value. This makes them optional when you call the method. (So you can call the method with 1, 2, or 3 arguments.) If you don't pass in values for a and b, None will be used. It's common in Python to use optional parameters in place of overloading methods.
This is a concept called default values.
Using this syntax, the caller can choose whether to pass a value for each such argument or use the default.
Suppose you have a method with some default behavior, but you want to keep the option of changing that behavior when needed. Default values give you that power.
Consider the following example:
def custom_method(msg, prefix=None):
    tag = prefix if prefix else ''
    print(tag + msg)
Now, as a user of this API, you can choose to use a prefix, or avoid it:
custom_method('Dude', 'Hi')
# Prints 'HiDude'
Or:
custom_method('Dude')
# Prints 'Dude'
This is the Python syntax for default arguments, or optional arguments.
Arguments declared with a default value receive that default if they are not passed in the function call, and receive whatever is passed otherwise.
Here Arc(p) is the same as Arc(p, None, None), but you could use:
a1 = Arc(p)
a2 = Arc(p, a1)
and then a2.pprev will be a1.
But there are some caveats with mutable default arguments... be careful with them, or better, avoid them until you are more proficient at Python.
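For completeness, here is a minimal sketch of that caveat: a mutable default is created once, at function definition time, and shared between calls.

def append_item(item, items=[]):   # the [] is created once, not per call
    items.append(item)
    return items

print(append_item(1))  # [1]
print(append_item(2))  # [1, 2] -- the same list is reused!

# The usual idiom: default to None and create the list inside the function
def append_item_safe(item, items=None):
    if items is None:
        items = []
    items.append(item)
    return items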

Function parameter type declaration doesn't work?

Why doesn't this give back '12'? The '+' sign should concatenate the two strings, not add them.

def foo(a: str, b: str):
    print(a + b)

foo(1, 2)
3
That's not what annotations are for. Annotations are metadata, not an instruction to Python to convert data.
From the Function definitions reference documentation:
Parameters may have annotations of the form “: expression” following the parameter name. Any parameter may have an annotation even those of the form *identifier or **identifier. Functions may have “return” annotation of the form “-> expression” after the parameter list. These annotations can be any valid Python expression and are evaluated when the function definition is executed. Annotations may be evaluated in a different order than they appear in the source code. The presence of annotations does not change the semantics of a function.
(Bold emphasis mine.)
For example, the Python type hinting framework uses annotations to attach type information to functions for static analysis, validating that code actually passes in the types that are expected to be passed in.
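A quick way to convince yourself that annotations are only stored, never enforced, is to look at the function's __annotations__ mapping:

def foo(a: str, b: str) -> int:
    return a + b

print(foo.__annotations__)
# {'a': <class 'str'>, 'b': <class 'str'>, 'return': <class 'int'>}

print(foo(1, 2))  # 3 -- the int arguments are neither converted nor rejected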
Just convert your values explicitly; in the call:
foo(str(1), str(2))
or in the function itself:
def foo(a, b):
    print(str(a) + str(b))
or in a decorator:
import functools
import inspect

def typeconversion(f):
    """Converts arguments with a callable attached in the parameter annotation"""
    sig = inspect.signature(f)

    @functools.wraps(f)
    def wrapper(*args, **kwargs):
        # convert any argument (including defaults), for which there is a
        # callable annotation
        bound = sig.bind(*args, **kwargs)
        bound.apply_defaults()
        args = bound.arguments
        for param in sig.parameters.values():
            if param.annotation is not param.empty and callable(param.annotation):
                args[param.name] = param.annotation(args[param.name])
        # call the function with the converted arguments
        result = f(*bound.args, **bound.kwargs)
        # convert the return value
        if sig.return_annotation is not sig.empty and callable(sig.return_annotation):
            result = sig.return_annotation(result)
        return result
    return wrapper
Demo:
>>> @typeconversion
... def foo(a: str, b: str) -> int:
...     return a + b
...
>>> foo(42, 101)
42101

When passing a function as a parameter in Python, how do you specify one parameter of the passed function?

def func_a(func):
    return func(1)

def func_b(a, b=2):
    return a + b
Given the two Python functions above, func_a(func_b) returns 1 + 2 = 3. Is there any way to assign the value of parameter b when calling func_a, something like func_a(func_b, b=3), so that I can change the default value of b in func_b without adding an additional parameter to func_a?
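One common way to do this without changing func_a is functools.partial, which pre-binds b before the function is passed along. A minimal sketch:

from functools import partial

def func_a(func):
    return func(1)

def func_b(a, b=2):
    return a + b

print(func_a(func_b))                 # 3 -- uses the default b=2
print(func_a(partial(func_b, b=3)))   # 4 -- b is pre-bound to 3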

Applying functools.wraps to nested wrappers

I have a base decorator that takes arguments but that is also built upon by other decorators. I can't seem to figure out where to put functools.wraps in order to preserve the full signature of the decorated function.
import inspect
from functools import wraps

# Base decorator
def _process_arguments(func, *indices):
    """ Apply the pre-processing function to each selected parameter """
    @wraps(func)
    def wrap(f):
        @wraps(f)
        def wrapped_f(*args):
            params = inspect.getargspec(f)[0]
            args_out = list()
            for ind, arg in enumerate(args):
                if ind in indices:
                    args_out.append(func(arg))
                else:
                    args_out.append(arg)
            return f(*args_out)
        return wrapped_f
    return wrap
# Function that will be used to process each parameter
def double(x):
    return x * 2

# Decorator called by end user
def double_selected(*args):
    return _process_arguments(double, *args)

# End-user's function
@double_selected(2, 0)
def say_hello(a1, a2, a3):
    """ doc string for say_hello """
    print('{} {} {}'.format(a1, a2, a3))

say_hello('say', 'hello', 'arguments')
The result of this code should be and is:
saysay hello argumentsarguments
However, running help on say_hello gives me:

say_hello(*args, **kwargs)
    doc string for say_hello
Everything is preserved except the parameter names.
It seems like I just need to add another #wraps() somewhere, but where?
I experimented with this:
>>> from functools import wraps
>>> def x(): print(1)
...
>>> @wraps(x)
... def xyz(a, b, c): return x
...
>>> xyz.__name__
'x'
>>> help(xyz)
Help on function x in module __main__:

x(a, b, c)
AFAIK, this has nothing to do with wraps itself; it's an issue related to help. help inspects your objects to provide its information, including __doc__ and other attributes, which is why you get this behavior even though your wrapped function has a different argument list. wraps doesn't update the argument list automatically; what it really updates are the attributes listed in this tuple, plus the __dict__, which is technically the object's namespace:

WRAPPER_ASSIGNMENTS = ('__module__', '__name__', '__qualname__', '__doc__',
                       '__annotations__')
WRAPPER_UPDATES = ('__dict__',)

If you aren't sure how wraps works, it will probably help to read its source code in the standard library: functools.py.
It seems like I just need to add another #wraps() somewhere, but where?
No, you don't need to add another wraps to your code; help, as stated above, works by inspecting your objects. A function's arguments are associated with its code object (__code__), because that's where the arguments are stored/represented. wraps has no way to update the wrapper's argument list to match the wrapped function's (continuing with the above example):

>>> xyz.__code__.co_varnames
('a', 'b', 'c')
>>> xyz.__code__.co_varnames = x.__code__.co_varnames
AttributeError: readonly attribute
If help displayed that function xyz has the argument list () instead of (a, b, c), that would clearly be wrong! The same applies to wraps: changing the wrapper's argument list to the wrapped function's would be cumbersome. So this should not be a concern at all.
>>> @wraps(x, ("__code__",))
... def xyz(a, b, c): pass
...
>>> help(xyz)
Help on function xyz in module __main__:

xyz()

But now calling xyz() runs x's code:

>>> xyz()
1
For other references, take a look at this question or the Python documentation: What does functools.wraps do?
direprobs was correct in that no amount of functools.wraps would get me there. bravosierra99 pointed me to somewhat related examples. However, I couldn't find a single example of signature preservation on nested decorators in which the outer decorator takes arguments.
The comments on Bruce Eckel's post on decorators with arguments gave me the biggest hints in achieving my desired result.
The key was in removing the middle function from within my _process_arguments function and placing its parameter in the next, nested function. It kind of makes sense to me now...but it works:
import inspect
from decorator import decorator

# Base decorator
def _process_arguments(func, *indices):
    """ Apply the pre-processing function to each selected parameter """
    @decorator
    def wrapped_f(f, *args):
        params = inspect.getargspec(f)[0]
        args_out = list()
        for ind, arg in enumerate(args):
            if ind in indices:
                args_out.append(func(arg))
            else:
                args_out.append(arg)
        return f(*args_out)
    return wrapped_f

# Function that will be used to process each parameter
def double(x):
    return x * 2

# Decorator called by end user
def double_selected(*args):
    return _process_arguments(double, *args)

# End-user's function
@double_selected(2, 0)
def say_hello(a1, a2, a3):
    """ doc string for say_hello """
    print('{} {} {}'.format(a1, a2, a3))

say_hello('say', 'hello', 'arguments')
print(help(say_hello))
And the result:
saysay hello argumentsarguments
Help on function say_hello in module __main__:

say_hello(a1, a2, a3)
    doc string for say_hello
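A side note for readers on current interpreters: functools.wraps also sets a __wrapped__ attribute on the wrapper, and inspect.signature (which help uses) follows it by default on Python 3.4+, so a plain @wraps often preserves the reported parameter names without the decorator package. A minimal sketch, assuming a recent Python 3:

import functools

def logged(f):
    @functools.wraps(f)
    def wrapper(*args, **kwargs):
        print('calling', f.__name__)
        return f(*args, **kwargs)
    return wrapper

@logged
def say_hello(a1, a2, a3):
    """ doc string for say_hello """
    print('{} {} {}'.format(a1, a2, a3))

help(say_hello)   # on Python 3.4+ this shows: say_hello(a1, a2, a3)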

Python: check if a function accepts **kwargs

Is there a way to check if a function accepts **kwargs before calling it? E.g.:
def FuncA(**kwargs):
    print 'ok'

def FuncB(id=None):
    print 'ok'

def FuncC():
    print 'ok'

args = {'id': '1'}
FuncA(**args)
FuncB(**args)
FuncC(**args)
When I run this, FuncA and FuncB are okay, but FuncC errors with "got an unexpected keyword argument 'id'", as it doesn't accept any arguments.
try:
    f(**kwargs)
except TypeError:
    pass  # do stuff
It's easier to ask forgiveness than permission.
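Applied to the question's functions, that might look like the following sketch (call_with_kwargs is a made-up helper name; note the caveat discussed further down that this also swallows TypeErrors raised inside the function):

def call_with_kwargs(func, kwargs):
    # Try the keyword call first; fall back to a bare call on TypeError.
    try:
        return func(**kwargs)
    except TypeError:
        return func()

args = {'id': '1'}
call_with_kwargs(FuncA, args)  # ok
call_with_kwargs(FuncB, args)  # ok
call_with_kwargs(FuncC, args)  # falls back to FuncC()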
def foo(a, b, **kwargs):
    pass

import inspect
args, varargs, varkw, defaults = inspect.getargspec(foo)
assert varkw == 'kwargs'
This only works for Python functions. Functions defined in C extensions (and built-ins) may be tricky and sometimes interpret their arguments in quite creative ways. There's no way to reliably detect which arguments such functions expect. Refer to function's docstring and other human-readable documentation.
func is the function in question.
With Python 2, it's:

inspect.getargspec(func).keywords is not None

Python 3 is a bit trickier; following PEP 362 (https://www.python.org/dev/peps/pep-0362/), the kind of the parameter must be VAR_KEYWORD:

Parameter.VAR_KEYWORD - a dict of keyword arguments that aren't bound to any other parameter. This corresponds to a "**kwargs" parameter in a Python function definition.

any(param for param in inspect.signature(func).parameters.values() if param.kind == param.VAR_KEYWORD)
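Wrapped up as a small reusable helper (accepts_kwargs is just an illustrative name):

import inspect

def accepts_kwargs(func):
    # True if func declares a **kwargs-style parameter
    return any(param.kind == param.VAR_KEYWORD
               for param in inspect.signature(func).parameters.values())

def f(**kwargs): pass
def g(id=None): pass

print(accepts_kwargs(f))  # True
print(accepts_kwargs(g))  # False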
For Python 3 and later, use inspect.getfullargspec:
import inspect

def foo(**bar):
    pass

arg_spec = inspect.getfullargspec(foo)
assert arg_spec.varkw and arg_spec.varkw == 'bar'
Seeing that there are a multitude of different answers in this thread, I thought I would give my two cents, using inspect.signature().
Suppose you have this method:

def foo(**kwargs):
    ...
You can test if **kwargs are in this method's signature:
import inspect

sig = inspect.signature(foo)
params = sig.parameters.values()
has_kwargs = any(p.kind == p.VAR_KEYWORD for p in params)
More
Getting the parameters which a method takes is also possible:

import inspect

sig = inspect.signature(foo)
params = sig.parameters.values()
for param in params:
    print(param.kind)
You can also store them in a variable like so:
kinds = [param.kind for param in params]
# [<_ParameterKind.VAR_KEYWORD: 4>]
Other than keyword arguments, there are 5 parameter kinds in total, as follows:

POSITIONAL_ONLY          # parameters must be positional
POSITIONAL_OR_KEYWORD    # parameters can be positional or keyword (the default)
VAR_POSITIONAL           # *args
KEYWORD_ONLY             # parameters must be passed by keyword
VAR_KEYWORD              # **kwargs
Descriptions in the official documentation can be found here.
Examples

POSITIONAL_ONLY

def foo(a, /):
    # the '/' enforces that all preceding parameters must be positional
    ...

foo(1)    # valid
foo(a=1)  # invalid

POSITIONAL_OR_KEYWORD

def foo(a):
    # 'a' can be passed by position or by keyword;
    # this is the default and most common parameter kind
    ...

VAR_POSITIONAL

def foo(*args):
    ...

KEYWORD_ONLY

def foo(*, a):
    # the '*' enforces that all following parameters must be passed by keyword
    ...

foo(a=1)  # valid
foo(1)    # invalid

VAR_KEYWORD

def foo(**kwargs):
    ...
It appears that you want to check whether the function accepts an 'id' keyword argument. You can't really do that by inspection, because the function might not be a normal function, or you might have a situation like this:
def f(*args, **kwargs):
    return h(*args, **kwargs)

g = lambda *a, **kw: h(*a, **kw)

def h(arg1=0, arg2=2):
    pass

f(id=3)  # still fails
Catching TypeError as suggested is the best way to do that, but you can't really figure out what caused the TypeError. For example, this would still raise a TypeError:
def f(id=None):
    return "%d" % id

f(**{'id': '5'})
That might be an error you want to debug. And if you're doing the check to avoid side effects of the function, those side effects might still occur if you catch the error. For example:
class A(object):
    def __init__(self):
        self._items = set([1, 2, 3])

    def f(self, id):
        return self._items.pop() + id

a = A()
a.f(**{'id': '5'})
My suggestion is to try to identify the functions by another mechanism. For example, pass objects with methods instead of functions, and call only the objects that have a specific method. Or add a flag to the object or the function itself.
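A minimal sketch of the flag idea (the accepts_id attribute is made up for illustration):

def handler_a(**kwargs):
    print('ok')
handler_a.accepts_id = True

def handler_b():
    print('ok')
handler_b.accepts_id = False

def dispatch(func, **kwargs):
    # Decide from an explicit flag rather than by inspecting the signature
    if getattr(func, 'accepts_id', False):
        return func(**kwargs)
    return func()

dispatch(handler_a, id='1')  # calls handler_a(id='1')
dispatch(handler_b, id='1')  # calls handler_b()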
According to https://docs.python.org/2/reference/datamodel.html, you should be able to test for use of **kwargs using co_flags:
>>> def blah(a, b, kwargs):
...     pass
>>> def blah2(a, b, **kwargs):
...     pass
>>> (blah.func_code.co_flags & 0x08) != 0
False
>>> (blah2.func_code.co_flags & 0x08) != 0
True
Though, as noted in the reference, this may change in the future, so I would definitely advise being extra careful. Definitely add some unit tests to check that this feature is still in place.
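On Python 3 the same check reads func.__code__ instead of func.func_code, and the inspect module exposes the flag as a named constant, which avoids the magic number. A sketch:

import inspect

def blah(a, b, kwargs):
    pass

def blah2(a, b, **kwargs):
    pass

print(bool(blah.__code__.co_flags & inspect.CO_VARKEYWORDS))   # False
print(bool(blah2.__code__.co_flags & inspect.CO_VARKEYWORDS))  # True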
