I am trying to examine the types of a function's arguments before the call (in this example, foo). I am using a Python decorator to achieve this, but I don't see how to get the arguments in the same order as they are visible to foo. In the following example, I get two different orderings for what is essentially the same function call.
def wrapper(func):
    def f(*args, **kwargs):
        print([type(x) for x in args] + [type(v) for v in kwargs.values()])
        return func(*args, **kwargs)
    return f

@wrapper
def foo(a, b, c, d):
    print(f"{a} {b} {c} {d}")

foo(10, 12.5, 14, 5.2)      # all good: int, float, int, float
foo(10, 12.5, d=5.2, c=14)  # not what I want: int, float, float, int
Is it possible to get arguments in a consistent order? If not, then is it at least possible to get them all keyed by argument name? Something that looks like this:
def wrapper(func):
    def f(**kwargs):
        # kwargs = {'a': 10, 'b': 12.5, 'c': 14, 'd': 5.2}
        print([type(v) for v in kwargs.values()])
        return func(**kwargs)
    return f

foo(10, 12.5, 14, 5.2)  # obviously doesn't work: positionals are not collected by **kwargs
Type-checking via annotations is somewhat weak: it only works as long as the code is actually annotated. A more robust way uses inspect from the standard library, which provides full access to frames, signatures, and everything else you may need. In this case inspect.signature can fetch the signature of the original function to recover the original order of the parameters. Then just regroup the parameters and pass the final group back to the original function. More details in the comments.
from inspect import signature

def wrapper(func):
    def f(*args, **kwargs):
        # signature object
        sign = signature(func)
        # use the order of the function's signature as reference
        order = dict.fromkeys(sign.parameters)
        # update with the keyword arguments first
        order.update(**kwargs)
        # then fill the remaining slots with the positionals
        free_pars = (k for k, v in order.items() if v is None)
        order.update(zip(free_pars, args))
        return func(**order)
    return f
@wrapper
def foo(a, b, c, d):
    print(f"{a} {b} {c} {d}")

foo(10, 12.5, 14, 5.2)
# 10 12.5 14 5.2
foo(10, 12.5, d=5.2, c=14)
# 10 12.5 14 5.2
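The same regrouping can be done more compactly with Signature.bind, which maps positionals and keywords onto the parameters in declaration order for you (a sketch of an alternative, not part of the answer above):

```python
from inspect import signature

def wrapper(func):
    def f(*args, **kwargs):
        # bind() arranges args/kwargs in the order the parameters are declared
        bound = signature(func).bind(*args, **kwargs)
        bound.apply_defaults()
        print([type(v) for v in bound.arguments.values()])
        return func(*bound.args, **bound.kwargs)
    return f

@wrapper
def foo(a, b, c, d):
    return f"{a} {b} {c} {d}"

foo(10, 12.5, d=5.2, c=14)  # types now print in declaration order: int, float, int, float
```

bind() also raises a TypeError on missing or extra arguments, so you get argument validation for free.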
The code is also annotation-compatible:
@wrapper
def foo(a: int, b: float, c: int, d: float) -> None:
    print(f"{a} {b} {c} {d}")
The annotations way, no imports required:
It is a copy-paste of the code above, but it uses the __annotations__ attribute to get the signature. Remember that annotations may or may not include an annotation for the return value.
def wrapper(func):
    def f(*args, **kwargs):
        if not func.__annotations__:
            raise Exception('No clue... inspect or annotate properly')
        # copy, so the function's own __annotations__ stays untouched
        params = dict(func.__annotations__)
        # remove the possible return-value annotation
        params.pop('return', None)
        order = dict.fromkeys(params)
        order.update(**kwargs)
        free_pars = (k for k, v in order.items() if v is None)
        order.update(zip(free_pars, args))
        return func(**order)
    return f
@wrapper
def foo(a: int, b: float, c: int, d: float) -> None:
    print(f"{a} {b} {c} {d}")
The first thing to be careful of is that keyword arguments exist precisely because order does not matter for them: they map a value to a specific parameter by name at call time. So enforcing any specific order on kwargs does not make much sense (or at least would be confusing to anyone trying to use your decorator). You will probably want to check which kwargs are specified and remove the corresponding argument types instead.
Next, if you want to be able to check the argument types, you need a way to tell your decorator what types you expect, by passing it an argument (you can see more about this here). One way to do this is to pass a dictionary mapping each argument name to the expected type:
@wrapper({'a': int, 'b': int, 'c': float, 'd': int})
def f(a, b, c=6.0, d=5):
    pass
def wrapper(types):
    def inner(func):
        def wrapped_func(*args, **kwargs):
            # be careful here: this relies on dictionaries preserving
            # insertion order, so it will not work on Python < 3.6
            expected_types = [v for k, v in types.items() if k not in kwargs]
            actual_types = [type(arg) for arg in args]
            # substitute these if you are dead set on checking keyword arguments as well
            # expected_types = list(types.values())
            # actual_types = [type(arg) for arg in args] + [type(v) for v in kwargs.values()]
            if expected_types != actual_types:
                raise TypeError(f"bad argument types:\n\tE: {expected_types}\n\tA: {actual_types}")
            return func(*args, **kwargs)
        return wrapped_func
    return inner
@wrapper({'a': int, 'b': float, 'c': int})
def f(a, b, c):
    print('good')

f(10, 2.0, 10)
f(10, 2.0, c=10)
f(10, c=10, b=2.0)
f(10, 2.0, 10.0)  # will raise TypeError
Now after all of this, I want to point out that this functionality is probably largely unwanted and unnecessary in Python code. Python was designed to be dynamically typed, so anything resembling strong typing in Python goes against the grain and won't be expected by most.
Next, since Python 3.5 we have had access to the built-in typing module. It lets you specify the types you expect to receive in a function call:
def f(a: int, b: float, c: int) -> int:
    return a + int(b) + c
Now, this won't actually do any type assertions for you, but it makes it plainly obvious what types you expect, and most (if not all) IDEs will give you visual warnings when you pass the wrong type to a function.
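If you do want runtime enforcement driven by those same annotations, a small sketch is possible with typing.get_type_hints plus inspect.signature (my own illustration; note that isinstance only works here for plain classes like int or float, not for generics such as List[int]):

```python
import inspect
import typing

def enforce_types(func):
    hints = typing.get_type_hints(func)  # resolved annotations, including 'return'
    sig = inspect.signature(func)
    def wrapped(*args, **kwargs):
        bound = sig.bind(*args, **kwargs)
        for name, value in bound.arguments.items():
            expected = hints.get(name)  # skip parameters without annotations
            if expected is not None and not isinstance(value, expected):
                raise TypeError(f"{name}: expected {expected.__name__}, "
                                f"got {type(value).__name__}")
        return func(*args, **kwargs)
    return wrapped

@enforce_types
def f(a: int, b: float, c: int) -> int:
    return a + int(b) + c

print(f(10, 2.0, 10))  # 22
```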
I am trying to add some customized logic on top of an existing function. Here is an example:
# existing function that I cannot change
def sum(a, b, c, d):
    return a + b + c + d

# the function I want to build
def sumMultiply(a, b, c, d, multiplier):
    return multiplier * sum(a, b, c, d)
This is a stupid example, but essentially I want to build a new function that takes all the parameters of the existing function and adds a few new arguments.
The above solution is problematic when the existing function changes its definition. For example:
# in some update the original function dropped one parameter
def sum(a, b, c):
    return a + b + c

# the new function will give an error since there is no parameter "d"
def sumMultiply(a, b, c, d, multiplier):
    return multiplier * sum(a, b, c, d)  # error
How can I specify the new function so that I do not need to worry about changing the new function definition when the existing function definition changes?
One way would be to use arbitrary positional or keyword arguments:
def sumMultiply(multiplier, *numbers):
    return multiplier * sum(*numbers)

def sumMultiply(multiplier, *args, **kwargs):
    return multiplier * sum(*args, **kwargs)
However, if you see yourself passing the same set of data around, consider making a parameter object. In your case, it can simply be a list:
def sum(numbers):
    ...

def sumMultiply(multiplier, numbers):
    return multiplier * sum(numbers)
There are some additional downsides to using arbitrary arguments:
the arguments are implicit: you might need to dig through several layers to see what you actually need to provide
they don't play well with type annotations and other static analysers (e.g. PyCharm's refactorings)
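If the parameter set grows beyond a simple list, a dataclass makes the parameter object explicit and keeps annotations useful (a sketch; SumInput and my_sum are invented names):

```python
from dataclasses import dataclass, field

@dataclass
class SumInput:
    numbers: list = field(default_factory=list)

def my_sum(data: SumInput) -> float:
    return sum(data.numbers)

def sum_multiply(multiplier: float, data: SumInput) -> float:
    # new arguments go on the wrapper; the shared data travels as one object
    return multiplier * my_sum(data)

print(sum_multiply(2, SumInput([1, 2, 3])))  # 12
```

Adding a field to SumInput later does not change either function's signature, which addresses the "definition changed" problem from the question.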
I would create a decorator function
def create_fun_multiplier(fun, multiplier=1):
    def multiplier_fun(*args):
        return multiplier * fun(*args)
    return multiplier_fun

def my_sum(a, b, c):
    return a + b + c

sumMultiply = create_fun_multiplier(my_sum, multiplier=2)
print(sumMultiply(3, 4, 7))
I would look at using keyword args for this problem.
eg.
def sum(a, b, c):
    return a + b + c

def sumMultiply(*args, multiplier=1):
    return multiplier * sum(*args)
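Since multiplier follows *args it is keyword-only, so callers must name it, which keeps every positional slot free for the summed values. A quick usage sketch (with the builtin-shadowing sum renamed to my_sum for clarity):

```python
def my_sum(a, b, c):
    return a + b + c

def sumMultiply(*args, multiplier=1):
    return multiplier * my_sum(*args)

print(sumMultiply(1, 2, 3))                # 6  (default multiplier of 1)
print(sumMultiply(1, 2, 3, multiplier=2))  # 12
```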
I would like to change the input arguments of a function. This will also lead to changes within the function body.
What's a pythonic way to mark an input argument "deprecated" and maintain backward compatibility at the same time? Here's a toy example:
from typing import List

# original function
def sum_numbers(numbers: List[int]):
    return sum(numbers)

# function with changed input arguments and function body
def sum_numbers(a: int, b: int) -> int:
    return a + b
The user should be able to call sum_numbers either with the numbers: List[int] argument or with a: int, b: int. However, I want to issue a DeprecationWarning when the user uses the original call style.
An option is to overload the function using multipledispatch module:
from multipledispatch import dispatch

@dispatch(int, int)
def sum_numbers(a, b):
    return a + b

@dispatch(list)
def sum_numbers(numbers):
    print("Deprecated")
    return sum(numbers)
An alternative to multipledispatch is to take *args or an optional argument and dispatch internally:
import warnings

# original signature kept via an optional second argument
def sum_numbers(a, b=None):
    if isinstance(a, list):
        warnings.warn("...", DeprecationWarning, stacklevel=2)
        return sum(a)
    return a + b
then for typing purposes you can use typing.overload:
@typing.overload
def sum_numbers(numbers: list[int]) -> int:
    """deprecated"""

@typing.overload
def sum_numbers(a: int, b: int) -> int:
    ...
(note that as documented the overloads should come first and the actual implementation last)
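Putting the runtime dispatch and the typing.overload declarations together, a complete sketch might look like this (the warning message is my own wording):

```python
import typing
import warnings

@typing.overload
def sum_numbers(numbers: typing.List[int]) -> int:
    """deprecated"""

@typing.overload
def sum_numbers(a: int, b: int) -> int: ...

# the real implementation comes last and handles both call styles
def sum_numbers(a, b=None):
    if isinstance(a, list):
        warnings.warn("passing a list is deprecated; pass two ints instead",
                      DeprecationWarning, stacklevel=2)
        return sum(a)
    return a + b
```

Static checkers see only the two overloads, so the deprecated list form can later be removed by deleting one overload and the isinstance branch.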
I have a function -
def add(a, b):
    return a + b
Now I want it to add 100 to the result using a decorator such that -
@decorator(100)
def add(a, b):
    return (a + b)

class decorator():
    def __init__(self, c):
        self.c = c

    def __call__(self, f):
        def _wrapped_f(a, b):
            return self.c + f(a, b)
        return _wrapped_f
The problem is when I want to keep the extra argument variable and not fixed to 100.
Something like -
@decorator(c)
def add(a, b):
    return (a + b)
Can I somehow assign the value of this variable during the function call (when add is called)?
The reason I want to do this using a decorator is because I don't want to modify my function add.
I cannot afford an extra parameter in the function itself, something like:
def add(a, b, c=0):
    return (a + b + c)
Hence, the need of the decorator.
Sure. All a decorator does is replace the original function with a wrapped version; that wrapper can take as many parameters as you like.
I'm not sure why you are using a class - this would be clearer as a standard decorator.
def decorator(func):
    def wrapped(a, b, c):
        return c + func(a, b)
    return wrapped

@decorator
def add(a, b):
    return (a + b)
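A self-contained usage sketch of that wrapper; c is now chosen at each call site rather than fixed at decoration time:

```python
def decorator(func):
    def wrapped(a, b, c):
        # c arrives with the call, not with the decoration
        return c + func(a, b)
    return wrapped

@decorator
def add(a, b):
    return a + b

print(add(2, 3, 100))  # 105
```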
Not 100% sure what you want to do with this decorator, so this might not be exactly what you want, but even then, others might find it useful.
You can add another parameter to the function, just by adding more parameters to the _wrapped function, or even by passing *args and **kwargs to that function. But how do you handle those parameters? Surely, just adding those parameters to the result only makes sense for the add function.
In the general case, you could use the decorated function itself to process the additional parameters. Then, just have the decorated function reduce all the parameters using the original function:
from functools import reduce  # Python 3

def vararg(f):
    """Change a two-arg function into a vararg function."""
    def _f(*args):
        return reduce(f, args)
    return _f

@vararg
def add(a, b):
    return a + b

print(add(1, 2))           # -> 3
print(add(1, 2, 3))        # -> 6
print(add(1, 2, 3, 4))     # -> 10
print(add(1, 2, 3, 4, 5))  # -> 15
I have a function with one optional argument, like this:
def funA(x, a, b=1):
    return a + b*x
I want to write a new function that calls funA and also has an optional argument, but if no argument is passed, I want to keep the default in funA.
I was thinking something like this:
def funB(x, a, b=None):
    if b:
        return funA(x, a, b)
    else:
        return funA(x, a)
Is there a more pythonic way of doing this?
I would replace if b with if b is not None, so that if you pass b=0 (or any other "falsy" value) as argument to funB it will be passed to funA.
Apart from that it seems pretty pythonic to me: clear and explicit. (albeit maybe a bit useless, depending on what you're trying to do!)
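Concretely, the suggested is not None check (assuming funA as defined in the question):

```python
def funA(x, a, b=1):
    return a + b*x

def funB(x, a, b=None):
    # forward b only when the caller actually supplied one,
    # so falsy values like 0 are still passed through
    if b is not None:
        return funA(x, a, b)
    return funA(x, a)

print(funB(2, 3))       # 5  (funA's default b=1: 3 + 1*2)
print(funB(2, 3, b=0))  # 3  (b=0 is forwarded: 3 + 0*2)
```

With the plain if b check, funB(2, 3, b=0) would have silently fallen back to b=1 and returned 5.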
A little more cryptic way that relies on calling funB with the correct keyword arguments (e.g. funB(3, 2, b=4)):
def funB(x, a, **kwargs):
    return funA(x, a, **kwargs)
def funA(x, a, b=1):
    return a + b*x

def funB(x, a, b=1):
    return funA(x, a, b)
Make the default value b=1 in funB() as well, and then always pass it through to funA().
The way you did it is fine. Another way is for funB to have the same defaults as funA, so you can pass the same parameters right through. E.g., if you do def funB(x, a, b=1), then you can always call return funA(x, a, b) just like that.
For simple cases, the above will work fine. For more complex cases, you may want to use *args and **kwargs (explained here and here). Specifically, you can pass in all your keyword arguments as a dictionary (conventionally called kwargs). In this case, each function would set its own independent defaults, and you would just pass the whole dictionary through:
def funA(x, a, **kwargs):
    b = kwargs.get("b", 1)
    return a + b*x

def funB(x, a, **kwargs):
    return funA(x, a, **kwargs)
If kwargs is empty when passed to funB (b is not specified), it will be set to the default in funA by the statement b = kwargs.get("b", 1). If b is specified, it will be passed through as-is. Note that in funB, you can access b with its own, independent default value and still get the behavior you are looking for.
While this may seem like overkill for your example, extracting a couple of arguments at the beginning of a function is not a big deal if the function is complex enough. It also gives you a lot more flexibility (such as avoiding many of the common gotchas).
Using inspect.getargspec (deprecated, and removed in Python 3.11), you can get the default values (the fourth item of the returned tuple, defaults):
import inspect

def funA(x, a, b=1):
    return a + b * x

# inspect.getargspec(funA) =>
# ArgSpec(args=['x', 'a', 'b'], varargs=None, keywords=None, defaults=(1,))

def funcB(x, a, b=inspect.getargspec(funA)[3][0]):
    return funA(x, a, b)
OR (in Python 2.7+)
def funcB(x, a, b=inspect.getargspec(funA).defaults[0]):
    return funA(x, a, b)
In Python 3.5+, it's recommended to use inspect.signature instead:
def funcB(x, a, b=inspect.signature(funA).parameters['b'].default):
    return funA(x, a, b)
Using FunctionType from types, you can take a function and create a new one, specifying the defaults at runtime. Putting all this in a decorator keeps the call site tidy while still giving the reader a clue about what you are trying to accomplish. It also gives funB exactly the same call signature as funA: all arguments can be positional, all can be keywords, or any valid mix thereof, and any argument with a default value is optional. It should play nice with positional arguments (*args) and keyword arguments (**kwargs) too.
import inspect
from types import FunctionType

def copy_defaults(source_function):
    def decorator(destination_function):
        """Creates a wrapper for the destination function with the exact same
        signature as source_function (including defaults)."""
        # check that the signatures match
        src_sig = inspect.signature(source_function)
        dst_sig = inspect.signature(destination_function)
        if list(src_sig.parameters) != list(dst_sig.parameters):
            raise ValueError("src func and dst func do not have matching "
                             "parameter names / order")
        return FunctionType(
            destination_function.__code__,
            destination_function.__globals__,
            destination_function.__name__,
            source_function.__defaults__,  # use defaults from src
            destination_function.__closure__
        )
    return decorator
return decorator
def funA(x, a, b=1):
    return a + b*x

@copy_defaults(funA)
def funB(x, a, b):
    """this is fun B"""
    return funA(x, a, b)

assert funA(1, 2) == funB(1, 2)
assert funB.__name__ == "funB"
assert funB.__doc__ == "this is fun B"
You can also use:
def funA(x, a, b=1):
    return a + b*x

def funB(x, a, b=None):
    return funA(*filter(lambda o: o is not None, [x, a, b]))
A version that will not fail if x or a are None:
def funB(x, a, b=None):
    return funA(*([x, a] + list(filter(lambda o: o is not None, [b]))))