Call another function and optionally keep default arguments - python

I have a function with one optional argument, like this:
def funA(x, a, b=1):
    return a+b*x
I want to write a new function that calls funA and also has an optional argument, but if no argument is passed, I want to keep the default in funA.
I was thinking something like this:
def funB(x, a, b=None):
    if b:
        return funA(x, a, b)
    else:
        return funA(x, a)
Is there a more pythonic way of doing this?

I would replace if b with if b is not None, so that if you pass b=0 (or any other "falsy" value) as an argument to funB, it still gets passed on to funA.
Apart from that it seems pretty pythonic to me: clear and explicit. (albeit maybe a bit useless, depending on what you're trying to do!)
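For concreteness, a minimal sketch of that suggestion (same funA as above, only the test changes):

def funB(x, a, b=None):
    if b is not None:
        return funA(x, a, b)
    else:
        return funA(x, a)

funB(2, 3)       # no b given, funA's default b=1 applies -> 3 + 1*2 = 5
funB(2, 3, b=0)  # b=0 is no longer swallowed -> 3 + 0*2 = 3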
A little more cryptic way that relies on calling funB with the correct keyword arguments (e.g. funB(3, 2, b=4)):
def funB(x, a, **kwargs):
    return funA(x, a, **kwargs)

def funA(x, a, b=1):
    return a+b*x

def funB(x, a, b=1):
    return funA(x, a, b)
Make the default value b=1 in funB() as well, and then always pass it through to funA().

The way you did it is fine. Another way is for funB to have the same defaults as funA, so you can pass the same parameters right through. E.g., if you do def funB(x, a, b=1), then you can always call return funA(x, a, b) just like that.
For simple cases, the above will work fine. For more complex cases, you may want to use *args and **kwargs (explained here and here). Specifically, you can pass in all your keyword arguments as a dictionary (conventionally called kwargs). In this case, each function would set its own independent defaults, and you would just pass the whole dictionary through:
def funA(x, a, **kwargs):
    b = kwargs.get("b", 1)
    return a+b*x

def funB(x, a, **kwargs):
    return funA(x, a, **kwargs)
If kwargs is empty when funB is called (i.e., b is not specified), b falls back to the default in funA via the statement b = kwargs.get("b", 1). If b is specified, it is passed through as-is. Note that funB could also read b from kwargs with its own, independent default value and still get the behavior you are looking for.
While this may seem like overkill for your example, extracting a couple of arguments at the beginning of a function is not a big deal if the function is complex enough. It also gives you a lot more flexibility (such as avoiding many of the common gotchas).
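To make the behavior concrete, here is a small usage sketch of the kwargs version above (my numbers, not from the original answer):

funB(2, 3)        # kwargs is empty, so funA falls back to b = 1 -> 3 + 1*2 = 5
funB(2, 3, b=0)   # b=0 travels through kwargs untouched -> 3 + 0*2 = 3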

Using inspect.getargspec (deprecated, and removed in Python 3.11), you can get the default values (the fourth item of the returned tuple is defaults):
import inspect

def funA(x, a, b=1):
    return a + b * x

# inspect.getargspec(funA) =>
# ArgSpec(args=['x', 'a', 'b'], varargs=None, keywords=None, defaults=(1,))

def funcB(x, a, b=inspect.getargspec(funA)[3][0]):
    return funA(x, a, b)
OR (in Python 2.7+)
def funcB(x, a, b=inspect.getargspec(funA).defaults[0]):
    return funA(x, a, b)
In Python 3.5+, it's recommended to use inspect.signature instead:
def funcB(x, a, b=inspect.signature(funA).parameters['b'].default):
    return funA(x, a, b)
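As a quick sanity check (my addition), you can confirm what inspect.signature extracts:

import inspect

def funA(x, a, b=1):
    return a + b * x

print(inspect.signature(funA).parameters['b'].default)  # -> 1
print(inspect.signature(funA))                          # -> (x, a, b=1)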

Using FunctionType from types, you can just take a function and create a new one specifying the defaults at runtime. You can put all this in a decorator so that at the point of where you write your code it will keep things tidy, whilst still giving the reader a clue about what you are trying to accomplish. It also allows the exact same call signature for funB as funA -- all arguments can be positional, or all arguments can be keywords, or any valid mix thereof, and any arguments with default values are optional. Should play nice with positional arguments (*args) and keyword arguments (**kwargs) too.
import inspect
from types import FunctionType

def copy_defaults(source_function):
    def decorator(destination_function):
        """Creates a wrapper for the destination function with the exact same
        signature as source_function (including defaults)."""
        # check signature matches
        src_sig = inspect.signature(source_function)
        dst_sig = inspect.signature(destination_function)
        if list(src_sig.parameters) != list(dst_sig.parameters):
            raise ValueError("src func and dst func do not have matching "
                             "parameter names / order")
        return FunctionType(
            destination_function.__code__,
            destination_function.__globals__,
            destination_function.__name__,
            source_function.__defaults__,  # use defaults from src
            destination_function.__closure__
        )
    return decorator

def funA(x, a, b=1):
    return a+b*x

@copy_defaults(funA)
def funB(x, a, b):
    """this is fun B"""
    return funA(x, a, b)

assert funA(1, 2) == funB(1, 2)
assert funB.__name__ == "funB"
assert funB.__doc__ == "this is fun B"

You can also use:
def funA(x, a, b=1):
    return a+b*x

def funB(x, a, b=None):
    return funA(*filter(lambda o: o is not None, [x, a, b]))
A version which will not fail if x or a is None (note the list() around filter, needed in Python 3 so it can be concatenated with a list):
def funB(x, a, b=None):
    return funA(*([x, a] + list(filter(lambda o: o is not None, [b]))))

Related

Multiple dispatch with a not-required value

I had a method like this in Python:
def method(a, b, c: int=0):
    return a+b+c
When I call method(5, 2), it returns 7.
However when I want to use multiple dispatching:
from multipledispatch import dispatch

@dispatch(int, int, int)
def method(a, b, c=0):
    return a+b+c
method(5,2) understandably gives an error. Is there any way to make one of the values in dispatch not required, like a ref statement in C#?
This will work (you need to specify the names of args with default values when using @dispatch).
@dispatch(int, int, c=int)
def method(a, b, c=0):
    return a+b+c

method(2,7)
# Out[58]: 9
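An alternative sketch (my assumption, not from the original answer) is to register two overloads and let the shorter one forward a default value to the longer one:

from multipledispatch import dispatch

@dispatch(int, int)
def method(a, b):
    return method(a, b, 0)  # forwards to the three-argument overload

@dispatch(int, int, int)
def method(a, b, c):
    return a + b + c

method(5, 2)     # -> 7
method(5, 2, 3)  # -> 10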

Add a function using another function's parameter declaration

I am trying to add some customized logic outside of an existing function. Here is an example:
# existing function that I cannot change
def sum(a, b, c, d):
    return a+b+c+d

# the function I want to build
def sumMultiply(a, b, c, d, multiplier):
    return multiplier * sum(a, b, c, d)
This is a stupid example, but essentially I want to build a new function that takes all the parameters of the existing function and adds a few new arguments.
The above solution is problematic when the existing function changes its definition. For example:
# in some updates the original function dropped one parameter
def sum(a, b, c):
    return a+b+c

# the new function will give an error since there is no parameter "d"
def sumMultiply(a, b, c, d, multiplier):
    return multiplier * sum(a, b, c, d)  # error
How can I specify the new function so that I do not need to worry about changing the new function definition when the existing function definition changes?
One way would be to use arbitrary positional or keyword arguments:
def sumMultiply(multiplier, *numbers):
    return multiplier * sum(*numbers)

def sumMultiply(multiplier, *args, **kwargs):
    return multiplier * sum(*args, **kwargs)
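A quick usage sketch of the *numbers variant (my own numbers, assuming the four-argument sum from the question):

def sum(a, b, c, d):  # the existing function from the question
    return a+b+c+d

def sumMultiply(multiplier, *numbers):
    return multiplier * sum(*numbers)

print(sumMultiply(2, 1, 2, 3, 4))  # -> 2 * (1+2+3+4) = 20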
However, if you see yourself passing around the same set of data around, consider making a parameter object. In your case, it can simply be a list:
def sum(numbers):
    ...

def sumMultiply(multiplier, numbers):
    return multiplier * sum(numbers)
There are some additional downsides to using arbitrary arguments:
the arguments are implicit: you might need to dig through several layers to see what you actually need to provide
they don't play well with type annotations and other static analysers (e.g. PyCharm's refactorings)
I would create a decorator function
def create_fun_multiplier(fun, multiplier=1):
    def multiplier_fun(*args):
        return multiplier * fun(*args)
    return multiplier_fun

def my_sum(a, b, c):
    return a + b + c

sumMultiply = create_fun_multiplier(my_sum, multiplier=2)
print(sumMultiply(3, 4, 7))
I would look at using keyword args for this problem.
E.g.
def sum(a, b, c):
    return a + b + c

def sumMultiply(*args, multiplier=1):
    return multiplier * sum(*args)
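For example (my own numbers), given the two functions above, the multiplier then has to be passed by keyword:

print(sumMultiply(1, 2, 3, multiplier=2))  # -> 2 * (1+2+3) = 12
print(sumMultiply(1, 2, 3))                # multiplier defaults to 1 -> 6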

Default values for iterable unpacking

I've often been frustrated by the lack of flexibility in Python's iterable unpacking.
Take the following example:
a, b = range(2)
Works fine. a contains 0 and b contains 1, just as expected. Now let's try this:
a, b = range(1)
Now, we get a ValueError:
ValueError: not enough values to unpack (expected 2, got 1)
Not ideal, when the desired result was 0 in a, and None in b.
There are a number of hacks to get around this. The most elegant I've seen is this:
a, *b = function_with_variable_number_of_return_values()
b = b[0] if b else None
Not pretty, and could be confusing to Python newcomers.
So what's the most Pythonic way to do this? Store the return value in a variable and use an if block? The *varname hack? Something else?
As mentioned in the comments, the best way to do this is to simply have your function return a constant number of values and if your use case is actually more complicated (like argument parsing), use a library for it.
However, your question explicitly asked for a Pythonic way of handling functions that return a variable number of arguments, and I believe it can be cleanly accomplished with decorators. They're not super common, and most people tend to use them more than create them, so here's a down-to-earth tutorial on creating decorators to learn more about them.
Below is a decorated function that does what you're looking for. The function returns an iterable with a variable number of values, and it is padded up to a certain length to better accommodate iterable unpacking.
def variable_return(max_values, default=None):
    # This decorator is somewhat more complicated because the decorator
    # itself needs to take arguments.
    def decorator(f):
        def wrapper(*args, **kwargs):
            actual_values = f(*args, **kwargs)
            try:
                # This will fail if `actual_values` is a single value.
                # Such as a single integer or just `None`.
                actual_values = list(actual_values)
            except:
                actual_values = [actual_values]
            extra = [default] * (max_values - len(actual_values))
            actual_values.extend(extra)
            return actual_values
        return wrapper
    return decorator

@variable_return(max_values=3)
# This would be a function that actually does something.
# It should not return more values than `max_values`.
def ret_n(n):
    return list(range(n))

a, b, c = ret_n(1)
print(a, b, c)
a, b, c = ret_n(2)
print(a, b, c)
a, b, c = ret_n(3)
print(a, b, c)
Which outputs what you're looking for:
0 None None
0 1 None
0 1 2
The decorator basically takes the decorated function and returns its output along with enough extra values to fill in max_values. The caller can then assume that the function always returns exactly max_values number of arguments and can use fancy unpacking like normal.
Here's an alternative version of the decorator solution by @supersam654, using iterators rather than lists for efficiency:
def variable_return(max_values, default=None):
    def decorator(f):
        def wrapper(*args, **kwargs):
            actual_values = f(*args, **kwargs)
            try:
                for count, value in enumerate(actual_values, 1):
                    yield value
            except TypeError:
                count = 1
                yield actual_values
            yield from [default] * (max_values - count)
        return wrapper
    return decorator
It's used in the same way:
@variable_return(3)
def ret_n(n):
    return tuple(range(n))
a, b, c = ret_n(2)
This could also be used with non-user-defined functions like so:
a, b, c = variable_return(3)(range)(2)
Shortest version known to me (thanks to @KellyBundy in the comments below):
a, b, c, d, e, *_ = *my_list_or_iterable, *[None]*5
Obviously it's possible to use a default value other than None if necessary.
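For example (my own illustration), padding with zeros instead of None:

values = [10]
a, b, c, *_ = *values, *[0]*3
# a == 10, b == 0, c == 0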
Also, there is one nice feature in Python 3.10 (structural pattern matching) which comes in handy here when we know the possible numbers of arguments upfront, like when unpacking sys.argv.
Previous method:
import sys

_, x, y, z, *_ = *sys.argv, *[None]*3
New method:
import sys
match sys.argv[1:]:  # slice needed to drop the first value of sys.argv
    case [x]:
        print(f'x={x}')
    case [x, y]:
        print(f'x={x}, y={y}')
    case [x, y, z]:
        print(f'x={x}, y={y}, z={z}')
    case _:
        print('No arguments')

How to force one keyword in python function

I am implementing a function with three keyword arguments. The default value for each keyword is None, but I need to force the user to pass at least one of them. The reason why I want to use keywords is that the names a, b and c are descriptive, and will help the user figure out what they need to pass to the method. How do I achieve this?
def method(a=None, b=None, c=None):
    if a != None:
        func_a(a)
    elif b != None:
        func_b(b)
    elif c != None:
        func_c(c)
    else:
        raise MyError('Don\'t be silly, user - please!')
In the above example, assume of course that a, b and c have different attributes. The obvious solution would be:
def method(x):
    if isinstance(x, A):
        func_a(x)
    elif isinstance(x, B):
        func_b(x)
    [...]
But the problem is that, as I said, I want to use the keyword names a, b and c to help the user understand what they need to pass to the method!
Is there a more pythonic way to achieve the result?
You could use all() to raise an error early:
def foo(a=None, b=None, c=None):
    if all(x is None for x in (a, b, c)):
        raise ValueError('You need to set at least *one* of a, b, or c')
    if a is not None:
        func_a(a)
    # etc.
You can use a decorator, simulating the design-by-contract programming paradigm.
def check_params(func):
    def wrapper(*args, **kwargs):
        a = kwargs.get('a', None)
        b = kwargs.get('b', None)
        c = kwargs.get('c', None)
        if a == b == c == None:
            raise Exception("Setting one of a, b or c is mandatory.")
        else:
            return func(*args, **kwargs)
    return wrapper

@check_params
def foo(a=None, b=None, c=None):
    print("Ok")

foo(a=4)  # This will print "Ok".
foo()     # This will raise an exception.
Note that a call such as method(b="something", a="otherthing") would return func_a(a), and not func_b(b), which the user might expect. In fact, it would be better to make sure one and only one keyword is not None (see e.g. here), for which it would probably make more sense for the user to simply directly call the respective methods (though you might want to call them method_from_a etc. then).
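A small sketch of that stricter check (my addition, reusing func_a/func_b/func_c from the question): count how many keywords were set and require exactly one:

def method(a=None, b=None, c=None):
    provided = sum(x is not None for x in (a, b, c))
    if provided != 1:
        raise ValueError('Pass exactly one of a, b or c')
    if a is not None:
        return func_a(a)
    if b is not None:
        return func_b(b)
    return func_c(c)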
Try this expression to check that at least one argument was passed:
if not (a or b or c):
    raise MyError

Setting the default value of a function input to equal another input in Python

Consider the following function, which does not work in Python, but which I will use to explain what I need to do.
def exampleFunction(a, b, c = a):
    ...function body...
That is I want to assign to variable c the same value that variable a would take, unless an alternative value is specified. The above code does not work in python. Is there a way to do this?
def example(a, b, c=None):
    if c is None:
        c = a
    ...
The default value for a keyword argument can't refer to another argument (defaults are evaluated once, to a fixed value, when the function is defined). This pattern is commonly used to pass arguments to a main function:
def main(argv=None):
    if argv is None:
        argv = sys.argv
If None could be a valid value, the solution is to either use *args/**kwargs magic as in carl's answer, or use a sentinel object. Libraries that do this include attrs and Marshmallow, and in my opinion it's much cleaner and likely faster.
missing = object()

def example(a, b, c=missing):
    if c is missing:
        c = a
    ...
The only way for c is missing to be true is for c to be exactly that dummy object you created there.
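To see why the sentinel helps (my own illustration), compare passing nothing with passing None explicitly:

missing = object()

def example(a, b, c=missing):
    if c is missing:
        c = a
    return (a, b, c)

print(example(1, 2))        # -> (1, 2, 1): c fell back to a
print(example(1, 2, None))  # -> (1, 2, None): None is kept as a real value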
This general pattern is probably the best and most readable:
def exampleFunction(a, b, c = None):
    if c is None:
        c = a
    ...
You have to be careful that None is not a valid state for c.
If you want to support 'None' values, you can do something like this:
def example(a, b, *args, **kwargs):
    if 'c' in kwargs:
        c = kwargs['c']
    elif len(args) > 0:
        c = args[0]
    else:
        c = a
One approach is something like:
def foo(a, b, c=None):
    c = a if c is None else c
    # do something
