How can I check if a function can always be called with the same arguments as another function? For example, b can be called with all arguments provided to a.
def a(a, b, c=None):
    pass
def b(a, *args, d=4, **kwargs):
    pass
The reason I want this is that I have a base function:
def f(a, b):
    print('f', a, b)
and a list of callbacks:
def g(b, a):
    print('g', a, b)
def h(*args, **kwargs):
    print('h', args, kwargs)
funcs = [g, h]
and a wrapper function that accepts anything:
def call(*args, **kwargs):
    f(*args, **kwargs)
    for func in funcs:
        func(*args, **kwargs)
Now I want to check if all callbacks will accept the arguments provided to call(), assuming they're valid for f().
For performance reasons, I don't want to check the arguments every time call() is called, but rather check each callback before adding it to the list of callbacks.
For example, those calls are ok:
call(1, 2)
call(a=1, b=3)
But this one should fail because g has its arguments in the wrong order:
call(1, b=3)
This took a bit of fun research, but I think I've covered the corner cases. A number of them arise from keeping things compatible with Python 2 while new syntax was being added.
The most problematic part is that some named (keyword) parameters can be passed in as positional arguments, or become required, depending on the order they are passed in.
For more, see the comments.
The code below will ensure that function b can be called using any possible combination of valid arguments to function a (it does not imply the inverse).
Uncomment the try/except block to get a True/False result instead of an AssertionError.
import inspect
def check_arg_spec(a, b):
    """
    Attrs of the FullArgSpec object:
    sp.args = positional or legacy keyword arguments, with keyword ones at the back
    sp.varargs = *args
    sp.varkw = **kwargs
    sp.defaults = default values for the legacy keyword arguments in sp.args
    sp.kwonlyargs = keyword-only arguments following *args or *, must be passed by name
    sp.kwonlydefaults = {name: default_val, ...}
    sp.annotations -> currently not in use, except as a standard flag for outside applications

    Consume order:
    (1) positional arguments
    (2) legacy keyword arguments = default (can be filled by either keyword or positional parameters)
    [
    (3) *args
        [
        (4) keyword-only arguments [=default]
        ]
    ]
    (5) **kwargs
    """
    a_sp = inspect.getfullargspec(a)
    b_sp = inspect.getfullargspec(b)

    kwdfb = b_sp.kwonlydefaults or {}
    kwdfa = a_sp.kwonlydefaults or {}

    kwddefb = b_sp.defaults or []
    kwddefa = a_sp.defaults or []

    # try:
    akwd = a_sp.kwonlyargs
    if len(kwddefa):
        akwd += a_sp.args[-len(kwddefa):]

    bkwd = b_sp.kwonlyargs
    if len(kwddefb):
        bkwd += b_sp.args[-len(kwddefb):]

    # all required arguments in b must have a name match in a's spec.
    for bkey in (set(b_sp.args) ^ set(bkwd)) & set(b_sp.args):
        assert bkey in a_sp.args

    # all required positional arguments in b can be met by a
    assert (len(a_sp.args) - len(kwddefb)) >= (len(b_sp.args) - len(kwddefb))

    # if a has a *args spec, so must b
    assert not (a_sp.varargs and b_sp.varargs is None)

    # if a does not take *args, the max number of positional args passed to a is
    # len(a_sp.args). b must accept at least this many positional args unless it
    # can consume *args.
    if b_sp.varargs is None:
        # if neither a nor b accepts *args, check that the total number of positional
        # plus py2-style keyword arguments in b's signature is at least what a can
        # send its way.
        assert len(a_sp.args) <= len(b_sp.args)

    # keyword-only arguments of b that are required must be present in a.
    akws = set(a_sp.kwonlyargs) | set(a_sp.args[-len(kwddefa):])
    for nmreq in (set(b_sp.kwonlyargs) ^ set(kwdfb)) & set(b_sp.kwonlyargs):
        assert nmreq in akws

    # if a and b both accept an arbitrary number of positional arguments, or if b can
    # but a cannot, no more checks are necessary here.

    # if a accepts arbitrary **kwargs, then so must b
    assert not (a_sp.varkw and b_sp.varkw is None)

    if b_sp.varkw is None:
        # neither a nor b can consume arbitrary keyword arguments, so b must be able
        # to consume every keyword that a can be called with.
        for akw in akwd:
            assert akw in bkwd
    # if b accepts **kwargs but a does not, there is no need to check further
    # if both accept **kwargs, there is also no need to check further

    # return True
    #
    # except AssertionError:
    #
    #     return False
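For example, using f and h from the question (a quick sanity check, assuming the try/except block is left commented out so a failing check raises an AssertionError; the extra function k below is just a hypothetical callback that is missing a parameter):

def f(a, b):
    print('f', a, b)

def h(*args, **kwargs):
    print('h', args, kwargs)

def k(a):   # hypothetical callback that cannot take f's second argument
    pass

check_arg_spec(f, h)   # passes: h can consume anything f can be called with
check_arg_spec(f, k)   # raises AssertionError: k accepts fewer positional args than f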
Not sure what you are really looking for and I'm pretty sure your issue could be solved in a better way, but anyway:
from inspect import getargspec
def foo(a, b, c=None):
    pass

def bar(a, d=4, *args, **kwargs):
    pass

def same_args(func1, func2):
    return list(set(getargspec(func1)[0]).intersection(set(getargspec(func2)[0])))
print same_args(foo, bar)
# => ['a']
same_args just checks the arguments of func1 and func2 and returns a new list containing only the arguments that appear in both.
Related
Consider the following:
from functools import partial
def add(a, b, c):
    return 100 * a + 10 * b + c
add_part = partial(add, c = 2, b = 1)
add_part(3)
312
Works fine. However:
def foo(x, y, z):
    return x + y + z
bar = partial(foo, y=3)
bar(1, 2)
barfs:
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
TypeError: foo() got multiple values for argument 'y'
Obviously I am missing something obvious, but what?
The definition of partial() from the official functools documentation is:
def partial(func, /, *args, **keywords):
    def newfunc(*fargs, **fkeywords):
        newkeywords = {**keywords, **fkeywords}
        return func(*args, *fargs, **newkeywords)
    newfunc.func = func
    newfunc.args = args
    newfunc.keywords = keywords
    return newfunc
That means that in your case partial() returns the foo() function with its signature modified as follows:
>>> from inspect import signature
>>> signature(bar)
<Signature (x, *, y=3, z)>
To solve your error, you could provide keyword arguments to the bar() function:
def foo(x, y, z):
    return x + y + z
bar = partial(foo, y=3)
bar(x=1, z=2)
From the Python 3.9 documentation:
Return a new partial object which when called will behave like func called with the positional arguments args and keyword arguments keywords. If more arguments are supplied to the call, they are appended to args. If additional keyword arguments are supplied, they extend and override keywords.
def foo(x, y, z):
    return x + y + z
bar = partial(foo, y=3)
print(bar.args) # ()
print(bar.keywords) # {'y': 3}
When bar(1, 2) is called, bar.args is still () and bar.keywords is still {'y': 3}; the call arguments (1, 2) are appended to the stored positional arguments at call time, so foo is effectively called as foo(1, 2, y=3).
Taking note of this, the next thing is to refer to "Function Calling Behavior" as specified in PEP 3102
When a function is called, the input arguments are assigned to formal parameters as follows:
For each formal parameter, there is a slot which will be used to contain the value of the argument assigned to that parameter.
Slots which have had values assigned to them are marked as 'filled'. Slots which have no value assigned to them yet are considered 'empty'.
Initially, all slots are marked as empty.
Positional arguments are assigned first, followed by keyword arguments.
For each positional argument:
Attempt to bind the argument to the first unfilled parameter slot. If the slot is not a vararg slot, then mark the slot as 'filled'.
If the next unfilled slot is a vararg slot, and it does not have a name, then it is an error.
Otherwise, if the next unfilled slot is a vararg slot then all remaining non-keyword arguments are placed into the vararg slot.
For each keyword argument:
If there is a parameter with the same name as the keyword, then the argument value is assigned to that parameter slot. However, if the parameter slot is already filled, then that is an error.
Otherwise, if there is a 'keyword dictionary' argument, the argument is added to the dictionary using the keyword name as the dictionary key, unless there is already an entry with that key, in which case it is an error.
Otherwise, if there is no keyword dictionary, and no matching named parameter, then it is an error.
Finally:
If the vararg slot is not yet filled, assign an empty tuple as its value.
For each remaining empty slot: if there is a default value for that slot, then fill the slot with the default value. If there is no default value, then it is an error.
I'm not quite sure though if the above is up to date or how to apply it in this situation because I'm not sure if bar has positional parameters or varargs.
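To see where the TypeError comes from, it may help to spell out the call that newfunc (from the pure-Python definition of partial quoted above) ends up making; a minimal reconstruction of the failing case:

from functools import partial

def foo(x, y, z):
    return x + y + z

bar = partial(foo, y=3)
# bar(1, 2) forwards to func(*args, *fargs, **newkeywords), i.e. roughly
# foo(*(), *(1, 2), **{'y': 3}) == foo(1, 2, y=3):
# x=1 and y=2 are bound positionally, then y=3 arrives again by keyword.
foo(1, 2, y=3)   # TypeError: foo() got multiple values for argument 'y'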
What is the most Pythonic way to write a function that accepts either separate arguments or a single tuple/list of arguments?
For example, a function add could be called either as add(1, 2) or as add((1, 2)), and both would output 3.
What I have so far: (it works, but does not look nice)
def add(*args):
    if len(args) == 1:
        return (args[0][0] + args[0][1])
    if len(args) == 2:
        return args[0] + args[1]
    else:
        print "error: add takes in one or two arguments"
What I don't like about it is:
I have to print the error about passing in one or two arguments
The args[0][0] looks very unreadable
This way, it is hard to tell what the arguments passed in represent (they don't have names)
I don't know if this is the most "pythonic" way, but it will do what you want:
def add(a, b=None):
    return a + b if b is not None else sum(a)
If your function takes a specific number of arguments, then the most pythonic way to do this is to not do it. Rather if the user has a tuple with the arguments, you make them unpack them when they call the function. E.g.
def add(a, b):
    return a + b
Then the caller can do
add(1,2)
or
t = (1,2)
add(*t)
The only time you want to accept either a sequence of params or individual params is when you can have any arbitrary (non-zero) number of arguments (e.g. the max and min builtin functions), in which case you'd just use *args.
If you can only take a finite number of arguments, it makes more sense to ask for those specifically. If you can accept an arbitrary number of arguments, then the *args paradigm works well if you loop through it. Mixing and matching those two isn't very elegant.
def add(*args):
    total = 0
    for i in args:
        total += i
    return total
>>> add(1, 2, 3)
6
(I know we could just use sum() there, but I'm trying to make it look a bit more general)
In the spirit of Python duck typing, if you see 1 argument, assume it's something that expands to 2 arguments. If it's then 2, assume it's two things that add together. If it violates your rules, raise an exception like Python would do on a function call.
def add(*args):
    if len(args) == 1:
        args = args[0]
    if len(args) != 2:
        raise TypeError("add takes 2 arguments or a tuple of 2 arguments")
    return args[0] + args[1]
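A quick check of the behaviour, using the duck-typed add above:

print(add(1, 2))     # 3
print(add((1, 2)))   # 3
add(1, 2, 3)         # raises TypeError: add takes 2 arguments or a tuple of 2 arguments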
A decorator would be best suited for this job.
from functools import wraps
def tupled_arguments(f):
    @wraps(f)  # keeps name, docstring etc. of f
    def accepts_tuple(tup, *args):
        if not args:  # only one argument given
            return f(*tup)
        return f(tup, *args)
    return accepts_tuple

@tupled_arguments
def add(a, b):
    return a + b
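With the decorator applied, both calling styles end up in the same two-argument add (a quick usage check of the code above):

print(add(1, 2))     # 3 -- two separate arguments, passed through as (tup, *args)
print(add((1, 2)))   # 3 -- a single tuple, unpacked into f(*tup)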
I am using a single decorator for two separate functions: one specifying a decorator argument and another one without it.
When the optional argument is not passed, the return type is a function (specifically, the inner_function in the decorator). However, when the optional argument is passed it works as expected.
Can you explain what is going on here and why it acts differently in these cases?
def cache_disk(cache_folder="./cache"):
    def wrapper(f):
        def inner_function(*args, **kwargs):
            result = f(*args, **kwargs)
            return result
        return inner_function
    return wrapper

@cache_disk
def func1(data):
    return [d for d in data]

@cache_disk(cache_folder='./cache/')
def func2(data):
    return [d for d in data]
data = [1,2,3]
print(func1(data))
print(func2(data))
Result:
<function inner_function at 0x7f1f283d5c08>
[1, 2, 3]
Note that:
@decorator  # no arguments
def func(...):
    ...
is equivalent to:
def func(...):
    ...
func = decorator(func)  # one 'level' of calls
and that:
@decorator(...)  # arguments
def func(...):
    ...
is equivalent to:
def func(...):
    ...
func = decorator(...)(func)  # two 'levels' of calls
In the first case, there is a single argument to the decorator, the func itself. In the second case, the arguments to the decorator are the ... from the @ line, and it's the function returned by the decorator that is called with func as an argument.
In your example,
@cache_disk
def func1(data):
    ...
the decorator cache_disk gets a single, callable argument (func1, which becomes cache_folder) and returns the wrapper. Then when you call:
print(func1(data))
wrapper gets a single argument (data, which becomes f) and returns inner_function.
Therefore, you have three choices:
Decorate func1 with @cache_disk() (note the parentheses), passing no arguments to cache_disk itself and func1 to wrapper;
Alter cache_disk to behave differently depending on whether it's passed a single, callable argument or something else (see the sketch after this list); or
As @o11c pointed out in the comments, use e.g. cache_disk.wrapper = cache_disk() to provide a convenient alias for the parameter-less version, then decorate with @cache_disk.wrapper.
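As a rough illustration of the second option, here is a minimal sketch (not the original code) of a decorator that inspects its first argument and works both with and without parentheses:

def cache_disk(arg=None, cache_folder="./cache"):
    def wrapper(f):
        def inner_function(*args, **kwargs):
            return f(*args, **kwargs)
        return inner_function
    if callable(arg):   # used bare, as @cache_disk: arg is the decorated function itself
        return wrapper(arg)
    return wrapper      # used as @cache_disk(...): return the real decorator

@cache_disk
def func1(data):
    return [d for d in data]

@cache_disk(cache_folder='./cache/')
def func2(data):
    return [d for d in data]

print(func1([1, 2, 3]))   # [1, 2, 3]
print(func2([1, 2, 3]))   # [1, 2, 3]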
If you want default values, you need to call the function, which returns a decorator:
@cache_disk()
def func1(data):
    return [d for d in data]
This example is taken from Beazley, Python Essential Reference 4e, pg. 101.
How is he doing:
func(*args, **kwargs)
where 'func' is the square function, which takes 1 argument. Earlier in the chapter he squawks about how the position and number of arguments must match in a call/definition or a TypeError would be raised.
Also,
@trace
def square(x):
    ...
square = trace(square)
trace returns 'callf', so this is equivalent to writing: square = callf, which is fine because, since square now refers to a new function object, you can call it with *args, **kwargs. But then in callf he does func(*args, ...). Given that we just made 'square' refer to some other object, how is the original square accessible inside? What mechanism is coming into play?
@trace
def square(x):
    return x * x

enable_tracing = True
if enable_tracing:
    debug_log = open("debug.log", "w")

def trace(func):
    if enable_tracing:
        def callf(*args, **kwargs):
            debug_log.write("Calling %s: %s, %s\n" %
                            (func.__name__, args, kwargs))
            r = func(*args, **kwargs)  # ????????
            debug_log.write("%s returned %s\n" % (func.__name__, r))
            return r
        return callf
    else:
        return func
The *-prefix means, "Use this sequence of values as the positional parameters to the function." The **-prefix means, "Use this dictionary as the named parameters to the function." If the sequence is empty, then no positional parameters are passed. If the dictionary is empty, then no named parameters are passed.
When you define a function with those prefixes, then the unaccounted for positional parameters go into the *-prefixed argument, and the unaccounted for named parameters go into the **-prefixed argument. So if you define a function like this:
def wrapper(*args, **kwargs):
then the function can be invoked with any arguments whatsoever. If that function then calls another function with those arguments, then it will be called however the wrapper was called.
Note that you can call a function with (*args, **kwargs) even if it wasn't defined that way:
>>> def square(x):
... return x*x
...
>>> args = (10,)
>>> kwargs = {}
>>> square(*args, **kwargs)
100
Because kwargs is empty, there are no named parameters passed to the function. It gets only the one positional argument in args.
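The same forwarding works inside a wrapper; a tiny sketch (not from the book) that passes everything through to square unchanged:

def square(x):
    return x * x

def wrapper(*args, **kwargs):
    return square(*args, **kwargs)

print(wrapper(5))     # 25 -- positional argument collected into args, forwarded
print(wrapper(x=5))   # 25 -- keyword argument collected into kwargs, forwarded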
For a decorator I am writing I would like to manipulate a specific named parameter of a function. Consider the following decorator:
def square_param(param):
    def func_decorator(func):
        def func_caller(*args, **kwargs):
            kwargs[param] = kwargs[param] * kwargs[param]
            return func(*args, **kwargs)
        return func_caller
    return func_decorator
Applied on the next function:
@square_param('dividend')
def quotient(divisor=1, dividend=0):
    return dividend / divisor
This will work if dividend is passed as a keyword argument, e.g.:
>>> quotient(dividend=2)
4
However, when given as a positional argument this will fail.
>>> quotient(3,4)
TypeError: quotient() got multiple values for keyword argument 'dividend'
With Python 3 I could solve this by forcing the parameter to be always given as a keyword:
@square_param('dividend')
def quotient(divisor=1, *, dividend=0):
    return dividend / divisor
but I would like to support Python 2 and also I would like to put as little restrictions on the function.
Is there a way that I can fix this behaviour in my decorator?
Firstly, your square_param decorator doesn't work because it doesn't return the functions. It needs to be:
def square_param(param):
    def func_decorator(func):
        def func_caller(*args, **kwargs):
            kwargs[param] = kwargs[param] * kwargs[param]
            return func(*args, **kwargs)
        return func_caller
    return func_decorator
Now I took @Dirk's advice and looked into the inspect module. You can do it by checking first if the parameter is one of the function's positional arguments, and second if that positional argument has been specified, and then modifying that argument position. You also need to make sure you only modify kwargs if the parameter was supplied as a keyword argument.
import inspect

def square_param(param):
    def func_decorator(func):
        def func_caller(*args, **kwargs):
            funparams = inspect.getargspec(func).args
            if param in funparams:
                i = funparams.index(param)
                if len(args) > i:
                    args = list(args)  # Make mutable
                    args[i] = args[i] * args[i]
            if param in kwargs:
                kwargs[param] = kwargs[param] * kwargs[param]
            return func(*args, **kwargs)
        return func_caller
    return func_decorator
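With this version both calling styles work; a quick check using the inspect-based square_param above (output values assume Python 3 division; note that inspect.getargspec was removed in Python 3.11, so on newer versions you would substitute inspect.getfullargspec):

@square_param('dividend')
def quotient(divisor=1, dividend=0):
    return dividend / divisor

print(quotient(dividend=2))   # 4.0 -- dividend squared via kwargs
print(quotient(3, 4))         # 5.333... -- dividend squared via its positional slot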
Even without using inspect we can get a function's parameters:
>>> func = lambda x, y, args: (x, y, {})
>>> func.func_code.co_argcount
3
>>> func.func_code.co_varnames
('x', 'y', 'args')
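That example uses the Python 2 func_code attribute; in Python 3 the same code-object attributes are reached via __code__:

func = lambda x, y, args: (x, y, {})
print(func.__code__.co_argcount)   # 3
print(func.__code__.co_varnames)   # ('x', 'y', 'args')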
This may only be tangentially related, but I found it useful for solving a similar problem. I wanted to merge *args and **kwargs into a single dictionary so that the code that followed could process them without regard to how the arguments came in, and I didn't want to mutate the existing kwargs variable, otherwise I would have just used kwargs.update().
all_args = {**kwargs, **{k: v for k, v in zip(list(inspect.signature(func).parameters), args)}}
# optionally delete `self`
del (all_args['self'])
Update: While this works, this answer has a better technique. In part:
bound_args = inspect.signature(f).bind(*args, **kwargs)
bound_args.apply_defaults()
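Filled out a little (a sketch assuming an ordinary function f), the bound arguments then give you the merged mapping directly:

import inspect

def f(a, b, c=3):
    pass

bound_args = inspect.signature(f).bind(1, b=2)
bound_args.apply_defaults()
print(bound_args.arguments)   # an ordered mapping: {'a': 1, 'b': 2, 'c': 3}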