Suppose I have this function:
def f(x, y):
    return x + y
If I use inspect.getargspec(f).args I get ['x','y'] as a result. Great.
Now suppose I want to create another function g(a,b) at runtime, where I don't know the argument names a and b until runtime:
def g(a, b):
    return f(a, b)
Is there a way to do this? Lambdas are almost right, except I can only assign argument names at compile time.
g = lambda *p: f(*p)
Somehow I want to create the function dynamically at run time based on a list L (for example L = ['a', 'b']), so that inspect.getargspec(g).args == L.
Here's a somewhat hacky way to do it which first creates a new function from an existing one with the modification, and then replaces the original's code with it. It's lengthy mostly because the types.CodeType() call has so many arguments. The Python 3 version is somewhat different because a number of the function attributes (such as func_code) were renamed and the calling sequence of types.CodeType() was changed slightly.
I got the idea from this answer by @aaronasterling (who says he got the idea from Michael Foord's Voidspace blog entry #583 titled Selfless Python). It could easily be made into a decorator, but I don't see that as being helpful based on what you've told us of the intended usage.
import sys
import types

def change_func_args(function, new_args):
    """ Create a new function with its arguments renamed to new_args. """
    if sys.version_info[0] < 3:  # Python 2?
        code_obj = function.func_code
        assert(0 <= len(new_args) <= code_obj.co_argcount)
        # The arguments are just the first co_argcount co_varnames.
        # Replace them with the new argument names in new_args.
        new_varnames = tuple(new_args[:code_obj.co_argcount] +
                             list(code_obj.co_varnames[code_obj.co_argcount:]))
        new_code_obj = types.CodeType(code_obj.co_argcount,
                                      code_obj.co_nlocals,
                                      code_obj.co_stacksize,
                                      code_obj.co_flags,
                                      code_obj.co_code,
                                      code_obj.co_consts,
                                      code_obj.co_names,
                                      new_varnames,
                                      code_obj.co_filename,
                                      code_obj.co_name,
                                      code_obj.co_firstlineno,
                                      code_obj.co_lnotab,
                                      code_obj.co_freevars,
                                      code_obj.co_cellvars)
        modified = types.FunctionType(new_code_obj, function.func_globals)
    else:  # Python 3
        code_obj = function.__code__
        assert(0 <= len(new_args) <= code_obj.co_argcount)
        # The arguments are just the first co_argcount co_varnames.
        # Replace them with the new argument names in new_args.
        new_varnames = tuple(new_args[:code_obj.co_argcount] +
                             list(code_obj.co_varnames[code_obj.co_argcount:]))
        new_code_obj = types.CodeType(code_obj.co_argcount,
                                      code_obj.co_posonlyargcount,
                                      code_obj.co_kwonlyargcount,
                                      code_obj.co_nlocals,
                                      code_obj.co_stacksize,
                                      code_obj.co_flags,
                                      code_obj.co_code,
                                      code_obj.co_consts,
                                      code_obj.co_names,
                                      new_varnames,
                                      code_obj.co_filename,
                                      code_obj.co_name,
                                      code_obj.co_firstlineno,
                                      code_obj.co_lnotab)
        modified = types.FunctionType(new_code_obj, function.__globals__)

    function.__code__ = modified.__code__  # replace code portion of original
if __name__ == '__main__':

    import inspect

    def f(x, y):
        return x + y

    def g(a, b):
        return f(a, b)

    print('Before:')
    print('inspect.getargspec(g).args: {}'.format(inspect.getargspec(g).args))
    print('g(1, 2): {}'.format(g(1, 2)))

    change_func_args(g, ['p', 'q'])

    print('')
    print('After:')
    print('inspect.getargspec(g).args: {}'.format(inspect.getargspec(g).args))
    print('g(1, 2): {}'.format(g(1, 2)))
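Side note (not part of the answer above): on Python 3.8+ the same renaming can be done more compactly with the code object's replace() method, which avoids spelling out every types.CodeType() argument. A minimal sketch; change_func_args_38 is just an illustrative name:

import inspect

def change_func_args_38(function, new_args):
    """Rename the parameters of `function` to new_args (Python 3.8+ only)."""
    code_obj = function.__code__
    assert 0 <= len(new_args) <= code_obj.co_argcount
    new_varnames = (tuple(new_args[:code_obj.co_argcount]) +
                    code_obj.co_varnames[code_obj.co_argcount:])
    # replace() returns a copy of the code object with the given fields swapped out
    function.__code__ = code_obj.replace(co_varnames=new_varnames)

def g(a, b):
    return a + b

change_func_args_38(g, ['p', 'q'])
print(inspect.signature(g))  # (p, q)
print(g(1, 2))               # 3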
I have a feeling you want something like this:
import inspect
import math
def multiply(x, y):
    return x * y

def add(a, b):
    return a + b

def cube(x):
    return x**3

def pythagorean_theorum(a, b, c):
    return math.sqrt(a**2 + b**2 + c**2)

def rpc_command(fname, *args, **kwargs):
    # Get function by name
    f = globals().get(fname)

    # Make sure function exists
    if not f:
        raise NotImplementedError("function not found: %s" % fname)

    # Make a dict of argname: argvalue
    arg_names = inspect.getargspec(f).args
    f_kwargs = dict(zip(arg_names, args))

    # Add kwargs to the function's kwargs
    f_kwargs.update(kwargs)

    return f(**f_kwargs)
Usage:
>>> # Positional args
... rpc_command('add', 1, 2)
3
>>>
>>> # Keyword args
... rpc_command('multiply', x=20, y=6)
120
>>> # Keyword args passed as kwargs
... rpc_command('add', **{"a": 1, "b": 2})
3
>>>
>>> # Mixed args
... rpc_command('multiply', 5, y=6)
30
>>>
>>> # Different arg lengths
... rpc_command('cube', 3)
27
>>>
>>> # Pass in a list as positional args
... rpc_command('pythagorean_theorum', *[1, 2, 3])
3.7416573867739413
>>>
>>> # Try a non-existent function
... rpc_command('doesntexist', 5, 6)
Traceback (most recent call last):
File "<stdin>", line 2, in <module>
File "<stdin>", line 6, in rpc_command
NotImplementedError: function not found: doesntexist
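A possible variation (my sketch, not part of the answer above): inspect.signature(...).bind(...) can do the same name-matching and also validates the call, raising TypeError on missing or surplus arguments. rpc_command_v2 is an illustrative name and it assumes the same module-level functions as above:

import inspect

def rpc_command_v2(fname, *args, **kwargs):
    f = globals().get(fname)
    if not f:
        raise NotImplementedError("function not found: %s" % fname)
    # bind() pairs the positional and keyword arguments with the parameter
    # names, raising TypeError if they don't fit the signature
    bound = inspect.signature(f).bind(*args, **kwargs)
    return f(*bound.args, **bound.kwargs)

print(rpc_command_v2('add', 1, 2))         # 3
print(rpc_command_v2('multiply', 5, y=6))  # 30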
How about using keyword arguments?
>>> g = lambda **kwargs: kwargs
>>> g(x=1, y=2)
{'y': 2, 'x': 1}
>>> g(a='a', b='b')
{'a': 'a', 'b': 'b'}
Something like:
g = lambda **kwargs: f(kwargs.get('a', 0), kwargs['b'])
or let's say you want to use just the values:
>>> g = lambda **kwargs: f(*kwargs.values())
>>> def f(*args): print(sum(args))
...
>>> g(a=1, b=2, c=3)
6
In any case, using the **kwargs syntax results in kwargs being a dictionary of all the arguments passed by name.
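For reference (my addition, not from the answer), you can see both the behaviour described above and its limitation with respect to the original question by inspecting such a function:

import inspect

def g(**kwargs):
    return kwargs

print(g(a=1, b=2))                 # {'a': 1, 'b': 2}
spec = inspect.getfullargspec(g)
print(spec.args)                   # [] -- no named positional parameters
print(spec.varkw)                  # 'kwargs', so getargspec-style introspection won't show a and b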
You may use *args and **kwargs.
Let's say you generate a dynamic function at runtime:
def func():
    def dyn_func(*args, **kwargs):
        print(args, kwargs)
    return dyn_func
It is then possible to use the args in your generated function:
f = func()
f(test=1)
would give:
() {'test': 1}
You can then manage the args as you wish.
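Building on that, here is one hedged sketch of how the captured arguments could be mapped back onto the runtime name list L from the question (make_g is an illustrative helper, not from the answer):

def make_g(f, names):
    # names is the runtime list of argument names, e.g. ['a', 'b']
    def g(*args, **kwargs):
        bound = dict(zip(names, args))   # positional values paired with names
        bound.update(kwargs)             # keyword values win on collision
        return f(*(bound[n] for n in names))
    return g

L = ['a', 'b']
g = make_g(lambda a, b: a + b, L)
print(g(1, 2))       # 3
print(g(b=2, a=1))   # 3

Note that inspect.getargspec(g).args would still be empty here, so this does not satisfy the introspection requirement from the question; it only handles the call-by-name part.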
Something of the following sort. Imagine this case:
def some_function(a, b):
    return a + b

some_magical_workaround({"a": 1, "b": 2, "c": 3})  # returns 3
I can't modify some_function to add a **kwargs parameter. How could I create a wrapper function some_magical_workaround which calls some_function as shown?
Also, some_magical_workaround may be used with other functions, and I don't know beforehand what args are defined in the functions being used.
So, you cannot do this in general if the function isn't written in Python (e.g. many built-ins, or functions from third-party libraries written as extensions in C), but you can use the inspect module to introspect the signature. Here is a quick-and-dirty proof of concept; I haven't really considered edge cases, but this should get you going:
import inspect

def bind_exact_args(func, kwargs):
    sig = inspect.signature(func)  # will fail with functions not written in Python, e.g. many built-ins
    common_keys = sig.parameters.keys() & kwargs.keys()
    return func(**{k: kwargs[k] for k in common_keys})

def some_function(a, b):
    return a + b
So, a demonstration:
>>> import inspect
>>>
>>> def bind_exact_args(func, kwargs):
... sig = inspect.signature(func) # will fail with functions not written in Python, e.g. many built-ins
... return func(**{k:kwargs[k] for k in sig.parameters.keys() & kwargs.keys()})
...
>>> def some_function(a, b):
... return a + b
...
>>> bind_exact_args(some_function, {"a": 1, "b": 2, "c": 3})
3
But note how it can fail with built-ins:
>>> bind_exact_args(max, {"a": 1, "b": 2, "c": 3})
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "<stdin>", line 2, in bind_exact_args
File "/usr/local/Cellar/python#3.9/3.9.13_1/Frameworks/Python.framework/Versions/3.9/lib/python3.9/inspect.py", line 3113, in signature
return Signature.from_callable(obj, follow_wrapped=follow_wrapped)
File "/usr/local/Cellar/python#3.9/3.9.13_1/Frameworks/Python.framework/Versions/3.9/lib/python3.9/inspect.py", line 2862, in from_callable
return _signature_from_callable(obj, sigcls=cls,
File "/usr/local/Cellar/python#3.9/3.9.13_1/Frameworks/Python.framework/Versions/3.9/lib/python3.9/inspect.py", line 2329, in _signature_from_callable
return _signature_from_builtin(sigcls, obj,
File "/usr/local/Cellar/python#3.9/3.9.13_1/Frameworks/Python.framework/Versions/3.9/lib/python3.9/inspect.py", line 2147, in _signature_from_builtin
raise ValueError("no signature found for builtin {!r}".format(func))
ValueError: no signature found for builtin <built-in function max>
As noted in @JamieDoornbos's answer, another example that will not work is a function with positional-only parameters:
E.g.:
def some_function(a, b, /, c):
    return a + b + c
Although, you can introspect this:
>>> def some_function(a, b, /, c):
... return a + b + c
...
>>> sig = inspect.signature(some_function)
>>> sig.parameters['a'].kind
<_ParameterKind.POSITIONAL_ONLY: 0>
>>> sig.parameters['b'].kind
<_ParameterKind.POSITIONAL_ONLY: 0>
>>> sig.parameters['c'].kind
<_ParameterKind.POSITIONAL_OR_KEYWORD: 1>
If you need to handle this case, it is certainly possible, but I leave that as an exercise for the reader :)
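For completeness, here is one rough way the positional-only case could be handled (my sketch, assuming values for all leading positional-only parameters are present in the dict; bind_exact_args2 is an illustrative name):

import inspect

def bind_exact_args2(func, kwargs):
    sig = inspect.signature(func)
    pos, kw = [], {}
    for name, param in sig.parameters.items():
        if name not in kwargs:
            continue
        if param.kind is inspect.Parameter.POSITIONAL_ONLY:
            pos.append(kwargs[name])   # must appear in declaration order
        else:
            kw[name] = kwargs[name]
    return func(*pos, **kw)

def some_function(a, b, /, c):
    return a + b + c

print(bind_exact_args2(some_function, {"a": 1, "b": 2, "c": 3, "d": 9}))  # 6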
You can use the inspect module to find out the argument names for the original function and filter your dictionary:
import inspect

def some_function(a, b):
    return a + b

def some_magical_workaround(d):
    args = inspect.getfullargspec(some_function)[0]
    return some_function(**{k: v for k, v in d.items() if k in args})

print(some_magical_workaround({"a": 1, "b": 2, "c": 3}))
This will print:
3
It can even be made more general by making the function itself an argument:
def some_magical_workaround(func, d):
    args = inspect.getfullargspec(func)[0]
    ...

print(some_magical_workaround(some_function, {"a": 1, "b": 2, "c": 3}))
If you want to pass the entire dict to a wrapper function, you can do so, read the keys internally, and pass the values along to the wrapped function:
def wrapper_for_some_function(source_dict):
    # two possible choices to get the keys from the dict
    a = source_dict["a"]         # indexing: exactly "a" or KeyError
    b = source_dict.get("b", 7)  # .get method: 7 if there's no "b"
    # now pass the values from the dict to the wrapped function
    return some_function(a, b)
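A quick (assumed) usage check, relying on the two-argument some_function(a, b) returning a + b as shown earlier in the thread:

print(wrapper_for_some_function({"a": 1, "b": 2}))  # 3
print(wrapper_for_some_function({"a": 1}))          # 8, because b falls back to 7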
Original answer:
If instead you can unpack the dict when you define your function, you can add a **kwargs argument to eat up the unknown args:
def some_function(a, b, **kwargs):
    return a + b
>>> def some_function(a, b, **kwargs):
... return a + b
...
>>> d = {"a":1,"b":2,"c":3}
>>> some_function(**d) # dictionary unpacking
3
For basic use cases, you can do something like this:
import inspect

def some_magical_workaround(fn, params):
    return fn(**{
        name: value for name, value in params.items()
        if name in set(inspect.getfullargspec(fn)[0])
    })

some_magical_workaround(some_function, {"a": 1, "b": 2, "c": 3})
However, be warned that calling functions this way circumvents the explicit coupling of parameters, which is designed to expose errors earlier in the development process.
And there are some constraints on the values of fn that will work as expected. some_function is fine, but here's an alternative that won't work, because its parameters are positional-only and cannot be passed by keyword:
>>> def some_function2(a, b, /):
... return a+b
...
>>> some_magical_workaround(some_function2, {"a":1,"b":2,"c":3})
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "<stdin>", line 2, in some_magical_workaround
TypeError: some_function2() got some positional-only arguments passed as keyword arguments: 'a, b'
Let's imagine I have a dict:
d = {'a': 3, 'b': 4}
I want to create a function f that does the exact same thing as this function:
def f(x, a=d['a'], b=d['b']):
    print(x, a, b)
(Not necessarily print, but do some stuff with the variables, referring to them directly by name.)
But I would like to create this function directly from the dict, that is to say, I would like to have something that looks like
def f(x, **d=d):
    print(x, a, b)
and that behaves like the previously defined function. The idea is that I have a large dictionary that contains default values for the arguments of my function, and I would like not to have to write
def f(a=d['a'], b=d['b'], ...)
I don't know if it's possible at all in Python. Any insight is appreciated!
Edit: The idea is to be able to call f(5, a=3).
Edit 2: The question is not about passing arguments stored in a dict to a function, but about defining a function whose argument names and default values are stored in a dict.
You cannot achieve this at function definition time because Python determines the scope of a function statically. However, it is possible to write a decorator that adds in default keyword arguments.
from functools import wraps

def kwargs_decorator(dict_kwargs):
    def wrapper(f):
        @wraps(f)
        def inner_wrapper(*args, **kwargs):
            new_kwargs = {**dict_kwargs, **kwargs}
            return f(*args, **new_kwargs)
        return inner_wrapper
    return wrapper
Usage
@kwargs_decorator({'bar': 1})
def foo(**kwargs):
    print(kwargs['bar'])

foo()  # prints 1
Or alternatively if you know the variable names but not their default values...
@kwargs_decorator({'bar': 1})
def foo(bar):
    print(bar)

foo()  # prints 1
Caveat
The above can be used, for example, to dynamically generate multiple functions with different default arguments. However, if the parameters you want to pass are the same for every function, it would be simpler and more idiomatic to simply pass in a dict of parameters.
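A tiny sketch of the simpler alternative the caveat refers to (the names here are illustrative, not from the answer):

defaults = {'bar': 1}

def foo(params):
    # take the whole dict of parameters and look values up by key
    print(params['bar'])

foo(defaults)  # prints 1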
Python is designed such that the local variables of any function can be determined unambiguously by looking at the source code of the function. So your proposed syntax
def f(x, **d=d):
    print(x, a, b)
is a nonstarter because there's nothing that indicates whether a and b are local to f or not; it depends on the runtime value of the dictionary, whose value could change across runs.
If you can resign yourself to explicitly listing the names of all of your parameters, you can automatically set their default values at runtime; this has already been well covered in other answers. Listing the parameter names is probably good documentation anyway.
If you really want to synthesize the whole parameter list at run time from the contents of d, you would have to build a string representation of the function definition and pass it to exec. This is how collections.namedtuple works, for example.
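A rough sketch of that exec-based route (make_f is an illustrative helper of mine; collections.namedtuple builds its class source in a similar way):

d = {'a': 3, 'b': 4}

def make_f(defaults):
    # build a parameter list such as "a=3, b=4" from the dict
    params = ", ".join("{}={!r}".format(k, v) for k, v in defaults.items())
    body = ", ".join(defaults)
    src = "def f(x, {}):\n    print(x, {})\n".format(params, body)
    namespace = {}
    exec(src, namespace)
    return namespace['f']

f = make_f(d)
f(5, a=3)   # prints: 5 3 4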
Variables in module and class scopes are looked up dynamically, so this is technically valid:
def f(x, **kwargs):
    class C:
        vars().update(kwargs)  # don't do this, please
        print(x, a, b)
But please don't do it except in an IOPCC entry.
Try this:
# Store the default values in a dictionary
>>> defaults = {
...     'a': 1,
...     'b': 2,
... }
>>> def f(x, **kwa):
...     # Each time the function is called, merge the default values
...     # and the provided arguments (Python >= 3.5):
...     args = {**defaults, **kwa}
...     # For Python < 3.5, copy the defaults and update the copy instead:
...     #     args = defaults.copy()
...     #     args.update(kwa)
...     print(args)
...
>>> f(1, f=2)
{'a': 1, 'b': 2, 'f': 2}
>>> f(1, f=2, b=8)
{'a': 1, 'b': 8, 'f': 2}
>>> f(5, a=3)
{'a': 3, 'b': 2}
Thanks to Olvin Roght for pointing out how to nicely merge dictionaries in Python >= 3.5.
How about the **kwargs trick?
def function(arg0, **kwargs):
    print("arg is", arg0, "a is", kwargs["a"], "b is", kwargs["b"])

d = {"a": 1, "b": 2}
function(0., **d)
outcome:
arg is 0.0 a is 1 b is 2
This question is very interesting, and it seems different people have their own guesses about what the question really wants.
I have my own too. Here is my code, which expresses it:
# python3 only
from collections import defaultdict

# only set once when function definition is executed
def kwdefault_decorator(default_dict):
    def wrapper(f):
        f.__kwdefaults__ = {}
        f_code = f.__code__
        po_arg_count = f_code.co_argcount
        keys = f_code.co_varnames[po_arg_count : po_arg_count + f_code.co_kwonlyargcount]
        for k in keys:
            f.__kwdefaults__[k] = default_dict[k]
        return f
    return wrapper

default_dict = defaultdict(lambda: "default_value")
default_dict["a"] = "a"
default_dict["m"] = "m"

@kwdefault_decorator(default_dict)
def foo(x, *, a, b):
    foo_local = "foo"
    print(x, a, b, foo_local)

@kwdefault_decorator(default_dict)
def bar(x, *, m, n):
    bar_local = "bar"
    print(x, m, n, bar_local)

foo(1)
bar(1)
# only kw_arg permitted
foo(1, a=100, b=100)
bar(1, m=100, n=100)
output:
1 a default_value foo
1 m default_value bar
1 100 100 foo
1 100 100 bar
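As a quick check (my addition, not part of the original answer), the decorator leaves the resolved defaults on the function object itself:

print(foo.__kwdefaults__)   # {'a': 'a', 'b': 'default_value'}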
Posting this as an answer because it would be too long for a comment.
Be careful with this answer. If you try
@kwargs_decorator({'a': 'a', 'b': 'b'})
def f(x, a, b):
    print(f'x = {x}')
    print(f'a = {a}')
    print(f'b = {b}')

f(1, 2)
it will issue an error:
TypeError: f() got multiple values for argument 'a'
because you are defining a as a positional argument (equal to 2).
I implemented a workaround, even though I'm not sure if this is the best solution:
def default_kwargs(**default):
    from functools import wraps

    def decorator(f):
        @wraps(f)
        def wrapper(*args, **kwargs):
            from inspect import getfullargspec
            f_args = getfullargspec(f)[0]
            used_args = f_args[:len(args)]
            final_kwargs = {
                key: value
                for key, value in {**default, **kwargs}.items()
                if key not in used_args
            }
            return f(*args, **final_kwargs)
        return wrapper
    return decorator
In this solution, f_args is a list containing the names of all named positional arguments of f. Then used_args is the list of all parameters that have effectively been passed as positional arguments. Therefore final_kwargs is defined almost exactly like before, except that it checks if the argument (in the case above, a) was already passed as a positional argument.
For instance, this solution works beautifully with functions such as the following.
@default_kwargs(a='a', b='b', d='d')
def f(x, a, b, *args, c='c', d='not d', **kwargs):
    print(f'x = {x}')
    print(f'a = {a}')
    print(f'b = {b}')
    for idx, arg in enumerate(args):
        print(f'arg{idx} = {arg}')
    print(f'c = {c}')
    for key, value in kwargs.items():
        print(f'{key} = {value}')

f(1)
f(1, 2)
f(1, b=3)
f(1, 2, 3, 4)
f(1, 2, 3, 4, 5, c=6, g=7)
Note also that the default values passed in default_kwargs have higher precedence than the ones defined in f. For example, the default value for d in this case is actually 'd' (defined in default_kwargs), and not 'not d' (defined in f).
You can unpack the values of the dict:
from collections import OrderedDict

def f(x, a, b):
    print(x, a, b)

d = OrderedDict({'a': 3, 'b': 4})
f(10, *d.values())
UPD.
Yes, it's possible to implement this mad idea of modifying the local scope by creating a decorator which returns a class with an overridden __call__() and stores your defaults in the class scope, BUT IT'S MASSIVE OVERKILL.
Your problem is that you're trying to hide problems in your architecture behind these tricks. If you store your default values in a dict, then access them by key. If you want to use keywords, define a class.
P.S. I still don't understand why this question collected so many upvotes.
Sure...
Hope this helps:
def funcc(x, **kwargs):
    # Note: updating locals() inside a function does not actually create
    # local variables in CPython, so a, b, c and d are looked up as globals
    # here and this raises NameError unless they exist at module level.
    locals().update(kwargs)
    print(x, a, b, c, d)

kwargs = {'a': 1, 'b': 2, 'c': 1, 'd': 1}
x = 1
funcc(x, **kwargs)
While applying an external module's method to a class, I need to be able to pass different arg='value' pairs to the function, like:
Ad.nodes.get(id_='11974312')
How do I pass dicts or tuples to the function, so that it recognises 'id_' (string) as id_ (argument) in ('id_', '11974312') (tuple) or {'id_': '11974312'} (dictionary)?
Basically, I just need to get id_ out of 'id_'.
For your reference, I am trying to use neomodel module for neo4j graph db.
To pass multiple arguments to a function, you use the * operator and the ** operator as shown below.
def myfoo(*arg, **karg):
    print(arg, karg)
The * operator packs all ordered arguments into it, and the ** operator packs all unmatched keyword arguments into it.
For example:
def myfoo(a, *arg):
    return a, arg

myfoo(11, 22, 33)
>> (11, (22, 33))
myfoo(11)
>> (11, ())
For keyword arguments it works the same way:
def myfoo(a=11, b=22, **kargs):
    return a, b, kargs

myfoo(22, c=100, b=233)
>> (22, 233, {'c': 100})
(ref.)
You can unpack the positional arguments of a function with a single asterisk * and unpack dictionaries as key/value pairs with two asterisks **. For example
def get(a, b, c=0):
    print(a, b, c)

args = (1, 2)
kwargs = {'c': 3}
get(*args, **kwargs)
The Python Reference has details on this.
A more specific example for the OP
If you have a function get(id_=None) with the keyword argument id_, you can use **some_dict to unpack the key/value pairs into keyword arguments. For example
In [1]: def get(id_=None):
...: print(id_)
...: # do something with id_ ...
...:
In [2]: get(**{'id_': 1})
1
In [3]: get(**{'id_': 2})
2
If instead you have a function get(id_) with the positional argument id_, you can use *some_iterable to unpack the values into positional arguments. You can also use **some_dict to unpack the key/value pairs as long as the keys exactly match the positional arguments. For example
In [4]: def get(id_):
...: print(id_)
...: # do something with id_ ...
...:
In [5]: get(*(1,))
1
In [6]: get(*(2,))
2
In [7]: get(**{'id_': 3})
3
In [8]: # this will fail because `id` is not the argument, `id_` is
In [9]: get(**{'id': 4})
---------------------------------------------------------------------------
TypeError Traceback (most recent call last)
<ipython-input-9-922e10531f8a> in <module>
----> 1 get(**{'id': 4})
TypeError: get() got an unexpected keyword argument 'id'
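In the question's terms, the data can therefore be reshaped before unpacking. A small sketch, reusing the get(id_=None) function defined just above; the same pattern applies to the Ad.nodes.get call the question mentions:

pair = ('id_', '11974312')       # ('argument name', 'value') tuple
get(**{pair[0]: pair[1]})        # same as get(id_='11974312')

params = {'id_': '11974312'}     # or a dictionary
get(**params)
# likewise for the neomodel call from the question:
# Ad.nodes.get(**params)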
I don't want to use *args or **kwargs since I can't change the function declaration.
For example:
def foo(a, b, c):
    """Let's say the values passed to a, b and c are 1, 2 and 3 respectively."""
    ...
    ...
    # I would like to generate an object, preferably a dictionary, such as {'a': 1, 'b': 2, 'c': 3}
    ...
    ...
Can anyone suggest a way to do this?
Thanks in advance.
If you can't change the function "declaration" (why not?) but you can change the contents of the function, then just create the dictionary as you want it:
def foo(a, b, c):
    mydict = {'a': a, 'b': b, 'c': c}
If that doesn't work, I think you need a better explanation of what you want and what the constraints are in your case.
This is also going to give similar results in the above case (where you don't show any local variables other than the arguments), but be warned that you should not try to modify locals():
def foo(a, b, c):
    mydict = locals()
@Rohit, we do not understand what you mean when you say "the function declaration". If you mean you don't want to change the API of the function (the documented way the function is called), perhaps because you have existing code already calling an existing function, then you can still use the **kwargs notation, and the callers will never know:
def foo(a, b, c):
    return a + b + c

def foo(**kwargs):
    total = 0
    for x in ("a", "b", "c"):
        assert x in kwargs
        total += kwargs[x]
    return total

def bar():
    foo(3, 5, 7)
bar() cannot tell which version of foo() it is calling, and does not care.
Perhaps you are looking for a "wrapper" you can wrap around existing function objects, without changing the actual source code of the function object?
def make_wrapper(fn, *arg_names):
    def wrapped_fn(*args):
        mydict = dict(tup for tup in zip(arg_names, args))
        print("TEST: mydict: %s" % str(mydict))
        return fn(*args)
    return wrapped_fn

def foo(a, b, c):
    return a + b + c

foo = make_wrapper(foo, "a", "b", "c")
foo(3, 5, 7)
The new wrapped function gathers the arguments into mydict and prints mydict before calling the function.
By diligent searching of StackOverflow, I found out how to do this. You use the inspect module.
import inspect

def make_wrapper(fn):
    arg_names = inspect.getargspec(fn)[0]
    def wrapped_fn(*args, **kwargs):
        # mydict now gets all expected positional arguments:
        mydict = dict(tup for tup in zip(arg_names, args))
        # special name "__args" gets list of all positional arguments
        mydict["__args"] = args
        # mydict now updated with all keyword arguments
        mydict.update(kwargs)
        # mydict now has full information on all arguments of any sort
        print("TEST: mydict: %s" % str(mydict))
        return fn(*args, **kwargs)
    return wrapped_fn

def foo(a, b, c, *args, **kwargs):
    # a, b, and c must be set; extra, unexpected args will go in args list
    return a + b + c

foo = make_wrapper(foo)
foo(3, 5, 7, 1, 2)
# prints: TEST: mydict: {'a': 3, 'c': 7, 'b': 5, '__args': (3, 5, 7, 1, 2)}
# returns: 15
There you go, a perfect solution to the problem you stated. It is a wrapper, you don't need to pass in the arguments, and it should work for any function. If you need it to work with class objects or something you can read the docs for inspect and see how to do it.
Note, of course order is not preserved in dictionaries, so you may not see the exact order I saw when I tested this. But the same values should be in the dict.
def foo(a, b, c):
    args = {"a": a, "b": b, "c": c}