Python function dictionary with a variable number of parameters

I am trying to create a dictionary which maps strings to functions. The problem is, the functions can take different numbers of parameters. Is there a way to handle this?
For ex:
myFuncDict = {
    'A': a,  # def a(param1, param2)
    'B': b,  # def b(param1)
    'C': c,  # def c(param1, param2, param3)
}
I want to call functions like:
def test(k):
    myFuncDict[k](params)
How can I achieve this?
*args or **kwargs is one way to go, but I'm not sure how to handle the above using those without sending extra parameters.

Python actually makes this quite simple: unpack your container of arguments into the function call using the unpacking operator *. Here is an example:
def a(x):
    print(x)

def b(x, y):
    print(x, y)

dic = {'a': a, 'b': b}

def call_func(func, params):
    dic[func](*params)  # *params is the magic.

call_func('a', (1,))  # note the trailing comma: (1,) is a one-element tuple
call_func('b', (1, 2))
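If the per-function arguments are naturally keyed by name rather than by position, the same unpacking idea works with ** and a dict of keyword arguments. A minimal sketch along the same lines (the names mirror the answer above; `call_func` taking a dict is an assumption, not part of the original answer):

```python
def a(x):
    return x

def b(x, y):
    return x + y

dic = {'a': a, 'b': b}

def call_func(name, kwargs):
    # **kwargs unpacks the dict into keyword arguments,
    # so each function receives exactly the names it declares.
    return dic[name](**kwargs)

call_func('a', {'x': 1})          # returns 1
call_func('b', {'x': 1, 'y': 2})  # returns 3
```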

Related

Pass dictionary directly as arguments of function [duplicate]

Let's imagine I have a dict:
d = {'a': 3, 'b': 4}
I want to create a function f that does exactly the same thing as this function:
def f(x, a=d['a'], b=d['b']):
    print(x, a, b)
(Not necessarily print, but do some stuff with the variables, calling them directly by name.)
But I would like to create this function directly from the dict; that is to say, I would like to have something that looks like
def f(x, **d=d):
    print(x, a, b)
and that behaves like the previously defined function. The idea is that I have a large dictionary that contains default values for the arguments of my function, and I would like not to have to write
def f(a=d['a'], b=d['b'], ...)
I don't know if this is possible at all in Python. Any insight is appreciated!
Edit: the idea is to be able to call f(5, a=3).
Edit 2: the question is not about passing arguments stored in a dict to a function, but about defining a function whose argument names and default values are stored in a dict.
You cannot achieve this at function definition time because Python determines the scope of a function statically. However, it is possible to write a decorator that adds in default keyword arguments.
from functools import wraps

def kwargs_decorator(dict_kwargs):
    def wrapper(f):
        @wraps(f)
        def inner_wrapper(*args, **kwargs):
            new_kwargs = {**dict_kwargs, **kwargs}
            return f(*args, **new_kwargs)
        return inner_wrapper
    return wrapper
Usage:
@kwargs_decorator({'bar': 1})
def foo(**kwargs):
    print(kwargs['bar'])

foo()  # prints 1
Or alternatively, if you know the variable names but not their default values:
@kwargs_decorator({'bar': 1})
def foo(bar):
    print(bar)

foo()  # prints 1
Caveat
The above can be used, for example, to dynamically generate multiple functions with different default arguments. However, if the parameters you want to pass are the same for every function, it would be simpler and more idiomatic to simply pass in a dict of parameters.
Python is designed such that the local variables of any function can be determined unambiguously by looking at the source code of the function. So your proposed syntax
def f(x, **d=d):
    print(x, a, b)
is a nonstarter because there's nothing that indicates whether a and b are local to f or not; it depends on the runtime value of the dictionary, whose value could change across runs.
If you can resign yourself to explicitly listing the names of all of your parameters, you can automatically set their default values at runtime; this has already been well covered in other answers. Listing the parameter names is probably good documentation anyway.
If you really want to synthesize the whole parameter list at run time from the contents of d, you would have to build a string representation of the function definition and pass it to exec. This is how collections.namedtuple works, for example.
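A minimal sketch of that exec approach, with the function body hard-coded for illustration (the name `f` and the body are assumptions for this example, not what collections.namedtuple actually generates):

```python
d = {'a': 3, 'b': 4}

# Build a parameter list like "a=3, b=4" from the dict...
params = ', '.join('{}={!r}'.format(k, v) for k, v in d.items())

# ...then synthesize the function definition as source text and exec it.
src = 'def f(x, {}):\n    return (x, a, b)\n'.format(params)
namespace = {}
exec(src, namespace)
f = namespace['f']

f(1)       # returns (1, 3, 4)
f(5, a=3)  # returns (5, 3, 4)
```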
Variables in module and class scopes are looked up dynamically, so this is technically valid:
def f(x, **kwargs):
    class C:
        vars().update(kwargs)  # don't do this, please
        print(x, a, b)
But please don't do it except in an IOPCC entry.
Try this:

# Store the default values in a dictionary
>>> defaults = {
...     'a': 1,
...     'b': 2,
... }
>>> def f(x, **kwa):
...     # Each time the function is called, merge the default values
...     # and the provided arguments (Python >= 3.5 dict merging):
...     args = {**defaults, **kwa}
...     # For Python < 3.5, copy the defaults and update the copy instead:
...     # args = defaults.copy()
...     # args.update(kwa)
...     print(args)
...
>>> f(1, f=2)
{'a': 1, 'b': 2, 'f': 2}
>>> f(1, f=2, b=8)
{'a': 1, 'b': 8, 'f': 2}
>>> f(5, a=3)
{'a': 3, 'b': 2}
Thanks to Olvin Roght for pointing out how to nicely merge dictionaries in Python >= 3.5.
How about the **kwargs trick?
def function(arg0, **kwargs):
    print("arg is", arg0, "a is", kwargs["a"], "b is", kwargs["b"])

d = {"a": 1, "b": 2}
function(0., **d)
outcome:
arg is 0.0 a is 1 b is 2
This question is very interesting, and different people seem to have their own guess about what it really wants.
I have my own too. Here is my code, which expresses my guess:
# Python 3 only
from collections import defaultdict

# runs only once, when the function definition is executed
def kwdefault_decorator(default_dict):
    def wrapper(f):
        f.__kwdefaults__ = {}
        f_code = f.__code__
        po_arg_count = f_code.co_argcount
        keys = f_code.co_varnames[po_arg_count : po_arg_count + f_code.co_kwonlyargcount]
        for k in keys:
            f.__kwdefaults__[k] = default_dict[k]
        return f
    return wrapper

default_dict = defaultdict(lambda: "default_value")
default_dict["a"] = "a"
default_dict["m"] = "m"

@kwdefault_decorator(default_dict)
def foo(x, *, a, b):
    foo_local = "foo"
    print(x, a, b, foo_local)

@kwdefault_decorator(default_dict)
def bar(x, *, m, n):
    bar_local = "bar"
    print(x, m, n, bar_local)

foo(1)
bar(1)
# only keyword arguments are permitted for a/b and m/n
foo(1, a=100, b=100)
bar(1, m=100, n=100)
output:
1 a default_value foo
1 m default_value bar
1 100 100 foo
1 100 100 bar
Posting this as an answer because it would be too long for a comment.
Be careful with this answer. If you try
@kwargs_decorator({'a': 'a', 'b': 'b'})
def f(x, a, b):
    print(f'x = {x}')
    print(f'a = {a}')
    print(f'b = {b}')

f(1, 2)
it will issue an error:
TypeError: f() got multiple values for argument 'a'
because a is also being passed as a positional argument (equal to 2).
I implemented a workaround, even though I'm not sure if this is the best solution:
from functools import wraps
from inspect import getfullargspec

def default_kwargs(**default):
    def decorator(f):
        @wraps(f)
        def wrapper(*args, **kwargs):
            f_args = getfullargspec(f)[0]
            used_args = f_args[:len(args)]
            final_kwargs = {
                key: value
                for key, value in {**default, **kwargs}.items()
                if key not in used_args
            }
            return f(*args, **final_kwargs)
        return wrapper
    return decorator
In this solution, f_args is a list containing the names of all named positional arguments of f. Then used_args is the list of all parameters that have effectively been passed as positional arguments. Therefore final_kwargs is defined almost exactly like before, except that it checks if the argument (in the case above, a) was already passed as a positional argument.
For instance, this solution works beautifully with functions such as the following.
@default_kwargs(a='a', b='b', d='d')
def f(x, a, b, *args, c='c', d='not d', **kwargs):
    print(f'x = {x}')
    print(f'a = {a}')
    print(f'b = {b}')
    for idx, arg in enumerate(args):
        print(f'arg{idx} = {arg}')
    print(f'c = {c}')
    for key, value in kwargs.items():
        print(f'{key} = {value}')

f(1)
f(1, 2)
f(1, b=3)
f(1, 2, 3, 4)
f(1, 2, 3, 4, 5, c=6, g=7)
Note also that the default values passed in default_kwargs have higher precedence than the ones defined in f. For example, the default value for d in this case is actually 'd' (defined in default_kwargs), and not 'not d' (defined in f).
You can unpack the values of the dict:
from collections import OrderedDict

def f(x, a, b):
    print(x, a, b)

d = OrderedDict({'a': 3, 'b': 4})
f(10, *d.values())
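Passing by keyword instead of position avoids relying on dict ordering entirely (and on Python 3.7+ plain dicts preserve insertion order anyway, so the OrderedDict is optional). A sketch:

```python
def f(x, a, b):
    return (x, a, b)

d = {'a': 3, 'b': 4}
# **d matches values to parameters by name, so the dict's order is irrelevant.
f(10, **d)  # returns (10, 3, 4)
```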
UPD.
Yes, it's possible to implement this mad idea of modifying the local scope by creating a decorator which returns a class with an overridden __call__() and stores your defaults in the class scope, BUT IT'S MASSIVE OVERKILL.
Your problem is that you're trying to hide problems in your architecture behind these tricks. If you store your default values in a dict, then access them by key. If you want to use keywords, define a class.
P.S. I still don't understand why this question collected so many upvotes.
Sure... hope this helps:
def funcc(x, **kwargs):
    locals().update(kwargs)  # caution: in CPython this does not actually create local variables
    print(x, a, b, c, d)     # so this line raises NameError under CPython

kwargs = {'a': 1, 'b': 2, 'c': 1, 'd': 1}
x = 1
funcc(x, **kwargs)


How to get the order of named arguments after a call to the function

I have a function with named arguments:
def sample(a=None, b=None, c=None):
    pass
How can I get the order of these arguments after a call to the function?
sample(b=1, a=1, c=1)
out: ['b', 'a', 'c']
sample(c=1, a=1)
out: ['c', 'a']
sample(a=1, b=1)
out: ['a', 'b']
sample(b=1, a=1)
out: ['b', 'a']
sample(a=1, b=1, c=1)
out: ['a', 'b', 'c']
maybe this can be done using a decorator or some other way?
Updated:
I want to make a wrapper for Elasticsearch filters in the style of SQLAlchemy's filter method, but through named arguments:
class ESQuery(object):
    def __init__(self, url, index):
        pass

    def filter(self, _bool=None, _range=None, _not=None, _and=None, _or=None, _type=None, exists=None, query=None):
        return self.query_body
After calling the function, I need to get the correct order of the query, like this: http://www.elastic.co/guide/en/elasticsearch/reference/1.5/query-dsl-and-filter.html
I just can't build the query so as to keep a strict order:
es = Someclass(url, index)
es.filter()
I want to do it using named arguments because people find them convenient to use as a tooltip.
Update 2:
I want to find another way; one line is not enough, as there can be long calls:
from api.search_api import ESQuery
es = ESQuery(index='lot', place='etsy_spider', match_all=True)
print es.\
    filter(query='{ "tweet": "full text search" }').\
    filter(_range='{ "created": { "gte": "now - 1d / d" }}').\
    filter(should='{ "term": { "featured": true }},')
maybe some ideas on how to simplify long queries in elasticsearch?
I cannot think of a reason why it would be useful. That being said, you can use the inspect module (https://docs.python.org/2/library/inspect.html):
import inspect

def f(a=2, b=3):
    call_string = inspect.stack()[1][4]  # a list with the string that represents how the function was called
    print call_string
    args, kwargs = get_function_args(call_string[0])  # to return what you want
    ...

f(b=3, a=1)  # prints [u'f(b=3, a=1)\n']
Then, you would parse the call_string with regular expressions.
Note that this method only works for single-line calls.
And here is a simple regex parser that will return a list of arguments and keyword arguments by order of appearance. Note that this is very basic and will not work with strings that contain ",".
import re

VALID_VAR = "([_A-Za-z][_a-zA-Z0-9]*)"
LEFT_PAR = '\('
RIGHT_PAR = '\)'

def get_function_args(line):
    args = []
    keywords = []
    res = re.search(VALID_VAR + LEFT_PAR + '(.*?)' + RIGHT_PAR + '$', line)
    if res:
        allargs = res.group(2)
        allargs = allargs.split(',')  # does not work if you have strings containing ","
        for arg in allargs:
            # positional arguments
            res2 = re.search('^{0}$'.format(VALID_VAR), arg.strip())
            if res2:
                args.append(res2.group(1))
            # keyword arguments
            res2 = re.search('^{0} *= *(.*)$'.format(VALID_VAR), arg.strip())
            if res2:
                keywords.append(res2.group(1))
    return args, keywords
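For anything beyond toy input (strings containing commas, nested calls), the ast module parses the call string robustly instead of relying on regexes. A sketch of the same extraction (the function name `get_function_args_ast` is mine, not from the answer):

```python
import ast

def get_function_args_ast(line):
    # Parse the call string into an AST and inspect the Call node.
    call = ast.parse(line.strip()).body[0].value
    args = [a.id for a in call.args if isinstance(a, ast.Name)]
    keywords = [kw.arg for kw in call.keywords]
    return args, keywords

get_function_args_ast('f(b=3, a=1)')    # returns ([], ['b', 'a'])
get_function_args_ast('f(x, b="a,b")')  # returns (['x'], ['b'])
```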
Yes, it can be done - read on.
how to get the order of these arguments after a call the function?
Named arguments are passed as a dictionary. Before Python 3.6 its .items() were in arbitrary order; since Python 3.6 (PEP 468), **kwargs preserves the call order:
# this...
def foo(a=None, b=None, c=None):
    ...

# ... is the equivalent of:
def foo(**kwargs):
    a = kwargs.get('a')
    b = kwargs.get('b')
    c = kwargs.get('c')
    ...
how to get the order of these arguments after a call the function?
Regardless of the above, you can achieve it by using an OrderedDict:
from collections import OrderedDict

def bar(sorted_kwargs):
    for k in sorted_kwargs.keys():
        print 'index of %s => %s' % (k, sorted_kwargs.keys().index(k))

bar(OrderedDict((('c', 1), ('a', 2), ('b', 3))))

# results
index of c => 0
index of a => 1
index of b => 2
maybe this can be done using a decorator or some other way?
The order of the arguments is something for the caller to decide -- a decorator cannot change that.
Btw. there is a draft PEP about this
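That draft became PEP 468, accepted in Python 3.6: **kwargs is now an order-preserving mapping, so the asked-for behavior is a one-liner on any modern interpreter (at the cost of losing the explicit parameter names in the signature; the name `sample` follows the question):

```python
def sample(**kwargs):
    # Since Python 3.6 (PEP 468), kwargs preserves the call order.
    return list(kwargs)

sample(b=1, a=1, c=1)  # returns ['b', 'a', 'c']
sample(c=1, a=1)       # returns ['c', 'a']
```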

How to create a function at runtime with specified argument names?

Suppose I have this function:
def f(x, y):
    return x + y
If I use inspect.getargspec(f).args I get ['x','y'] as a result. Great.
Now suppose I want to create another function g(a,b) at runtime, where I don't know the argument names a and b until runtime:
def g(a, b):
    return f(a, b)
Is there a way to do this? Lambdas are almost right, except I can only assign argument names at compile time.
g = lambda *p: f(*p)
Somehow I want to create the function dynamically at run time based on a list L (for example L = ['a', 'b']), so that inspect.getargspec(g).args == L.
Here's a somewhat hacky way to do it which first creates a new function from an existing one with the modification, and then replaces the original's code with it. It's lengthy mostly because the types.CodeType() call has so many arguments. The Python 3 version is somewhat different because a number of the function.func_code attributes were renamed and the calling sequence of types.CodeType() was changed slightly.
I got the idea from this answer by @aaronasterling (who says he got the idea from Michael Foord's Voidspace blog entry #583, titled Selfless Python). It could easily be made into a decorator, but I don't see that as being helpful based on what you've told us of the intended usage.
import sys
import types

def change_func_args(function, new_args):
    """Create a new function with its arguments renamed to new_args."""
    if sys.version_info[0] < 3:  # Python 2?
        code_obj = function.func_code
        assert 0 <= len(new_args) <= code_obj.co_argcount
        # The arguments are just the first co_argcount co_varnames.
        # Replace them with the new argument names in new_args.
        new_varnames = tuple(new_args[:code_obj.co_argcount] +
                             list(code_obj.co_varnames[code_obj.co_argcount:]))
        new_code_obj = types.CodeType(code_obj.co_argcount,
                                      code_obj.co_nlocals,
                                      code_obj.co_stacksize,
                                      code_obj.co_flags,
                                      code_obj.co_code,
                                      code_obj.co_consts,
                                      code_obj.co_names,
                                      new_varnames,
                                      code_obj.co_filename,
                                      code_obj.co_name,
                                      code_obj.co_firstlineno,
                                      code_obj.co_lnotab,
                                      code_obj.co_freevars,
                                      code_obj.co_cellvars)
        modified = types.FunctionType(new_code_obj, function.func_globals)
    else:  # Python 3 (3.8+, which added co_posonlyargcount)
        code_obj = function.__code__
        assert 0 <= len(new_args) <= code_obj.co_argcount
        # The arguments are just the first co_argcount co_varnames.
        # Replace them with the new argument names in new_args.
        new_varnames = tuple(new_args[:code_obj.co_argcount] +
                             list(code_obj.co_varnames[code_obj.co_argcount:]))
        new_code_obj = types.CodeType(code_obj.co_argcount,
                                      code_obj.co_posonlyargcount,
                                      code_obj.co_kwonlyargcount,
                                      code_obj.co_nlocals,
                                      code_obj.co_stacksize,
                                      code_obj.co_flags,
                                      code_obj.co_code,
                                      code_obj.co_consts,
                                      code_obj.co_names,
                                      new_varnames,
                                      code_obj.co_filename,
                                      code_obj.co_name,
                                      code_obj.co_firstlineno,
                                      code_obj.co_lnotab,
                                      code_obj.co_freevars,
                                      code_obj.co_cellvars)
        modified = types.FunctionType(new_code_obj, function.__globals__)

    function.__code__ = modified.__code__  # replace code portion of original
if __name__ == '__main__':
    import inspect

    def f(x, y):
        return x + y

    def g(a, b):
        return f(a, b)

    print('Before:')
    print('inspect.getargspec(g).args: {}'.format(inspect.getargspec(g).args))
    print('g(1, 2): {}'.format(g(1, 2)))

    change_func_args(g, ['p', 'q'])

    print('')
    print('After:')
    print('inspect.getargspec(g).args: {}'.format(inspect.getargspec(g).args))
    print('g(1, 2): {}'.format(g(1, 2)))
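On Python 3.8+ the same rename is less fragile with CodeType.replace(), which fills in every other code-object field automatically and so survives the version-to-version changes in the CodeType() calling sequence. A sketch, not part of the original answer:

```python
import inspect

def rename_func_args(function, new_args):
    """Rename the positional parameters of function to new_args (Python 3.8+)."""
    code_obj = function.__code__
    new_varnames = tuple(list(new_args[:code_obj.co_argcount]) +
                         list(code_obj.co_varnames[code_obj.co_argcount:]))
    # replace() copies the code object, changing only the fields we name.
    function.__code__ = code_obj.replace(co_varnames=new_varnames)

def g(a, b):
    return a + b

rename_func_args(g, ['p', 'q'])
print(list(inspect.signature(g).parameters))  # ['p', 'q']
print(g(p=1, q=2))  # 3
```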
I have a feeling you want something like this:
import inspect
import math

def multiply(x, y):
    return x * y

def add(a, b):
    return a + b

def cube(x):
    return x**3

def pythagorean_theorum(a, b, c):
    return math.sqrt(a**2 + b**2 + c**2)

def rpc_command(fname, *args, **kwargs):
    # Get the function by name
    f = globals().get(fname)
    # Make sure the function exists
    if not f:
        raise NotImplementedError("function not found: %s" % fname)
    # Make a dict of argname: argvalue
    arg_names = inspect.getargspec(f).args
    f_kwargs = dict(zip(arg_names, args))
    # Add kwargs to the function's kwargs
    f_kwargs.update(kwargs)
    return f(**f_kwargs)
Usage:
>>> # Positional args
... rpc_command('add', 1, 2)
3
>>>
>>> # Keyword args
... rpc_command('multiply', x=20, y=6)
120
>>> # Keyword args passed as kwargs
... rpc_command('add', **{"a": 1, "b": 2})
3
>>>
>>> # Mixed args
... rpc_command('multiply', 5, y=6)
30
>>>
>>> # Different arg lengths
... rpc_command('cube', 3)
27
>>>
>>> # Pass in a list as positional args
... rpc_command('pythagorean_theorum', *[1, 2, 3])
3.7416573867739413
>>>
>>> # Try a non-existent function
... rpc_command('doesntexist', 5, 6)
Traceback (most recent call last):
File "<stdin>", line 2, in <module>
File "<stdin>", line 6, in rpc_command
NotImplementedError: function not found: doesntexist
How about using keyword arguments?
>>> g = lambda **kwargs: kwargs
>>> g(x=1, y=2)
{'y': 2, 'x': 1}
>>> g(a='a', b='b')
{'a': 'a', 'b': 'b'}
Something like:
g = lambda **kwargs: f(kwargs.get('a', 0), kwargs['b'])
or let's say you want to use just the values:
>>> g = lambda **kwargs: f(*kwargs.values())
>>> def f(*args): print sum(args)
...
>>> g(a=1, b=2, c=3)
6
In any case, using the **kwargs syntax results in kwargs being a dictionary of all the arguments passed by name.
You may use *args and **kwargs.
Let's say you generate a dynamic function at runtime:
def func():
    def dyn_func(*args, **kwargs):
        print args, kwargs
    return dyn_func
It is then possible to use args in your generated function:
f = func()
f(test=1)
would give:
() {'test': 1}
Then it is possible to manage args as you wish.

In Python, is there any way I can obtain the arguments passed to a function as an object?

I don't want to use *args or **kwargs since I can't change the function declaration.
For example:
def foo(a, b, c):
    """Let's say the values passed to a, b and c are 1, 2 and 3 respectively."""
    ...
    # I would like to generate an object, preferably a dictionary,
    # such as {'a': 1, 'b': 2, 'c': 3}
    ...
Can anyone suggest a way to do this?
Thanks in advance.
If you can't change the function "declaration" (why not?) but you can change the contents of the function, then just create the dictionary as you want it:
def foo(a, b, c):
    mydict = {'a': a, 'b': b, 'c': c}
If that doesn't work, I think you need a better explanation of what you want and what the constraints are in your case.
This is also going to give similar results in the above case (where you don't show any local variables other than the arguments), but be warned that you should not try to modify locals():
def foo(a, b, c):
    mydict = locals()
@Rohit, we do not understand what you mean when you say "the function declaration". If you mean you don't want to change the API of the function (the documented way the function is called), perhaps because you have existing code already calling an existing function, then you can still use the **kwargs notation, and the callers will never know:
def foo(a, b, c):
    return a + b + c

def foo(**kwargs):
    total = 0
    for x in ("a", "b", "c"):
        assert x in kwargs
        total += kwargs[x]
    return total

def bar():
    foo(3, 5, 7)
bar() cannot tell which version of foo() it is calling, and does not care.
Perhaps you are looking for a "wrapper" you can wrap around existing function objects, without changing the actual source code of the function object?
def make_wrapper(fn, *arg_names):
    def wrapped_fn(*args):
        mydict = dict(zip(arg_names, args))
        print("TEST: mydict: %s" % str(mydict))
        return fn(*args)
    return wrapped_fn

def foo(a, b, c):
    return a + b + c

foo = make_wrapper(foo, "a", "b", "c")
foo(3, 5, 7)
The new wrapped function gathers the arguments into mydict and prints mydict before calling the function.
By diligent searching of StackOverflow, I found out how to do this. You use the inspect module.
import inspect

def make_wrapper(fn):
    arg_names = inspect.getargspec(fn)[0]
    def wrapped_fn(*args, **kwargs):
        # mydict now gets all expected positional arguments:
        mydict = dict(zip(arg_names, args))
        # the special name "__args" gets the tuple of all positional arguments
        mydict["__args"] = args
        # mydict now updated with all keyword arguments
        mydict.update(kwargs)
        # mydict now has full information on all arguments of any sort
        print("TEST: mydict: %s" % str(mydict))
        return fn(*args, **kwargs)
    return wrapped_fn

def foo(a, b, c, *args, **kwargs):
    # a, b, and c must be set; extra, unexpected args will go in the args list
    return a + b + c

foo = make_wrapper(foo)
foo(3, 5, 7, 1, 2)
# prints: TEST: mydict: {'a': 3, 'c': 7, 'b': 5, '__args': (3, 5, 7, 1, 2)}
# returns: 15
There you go, a perfect solution to the problem you stated. It is a wrapper, you don't need to pass in the arguments, and it should work for any function. If you need it to work with class objects or something you can read the docs for inspect and see how to do it.
Note, of course, that order is not preserved in dictionaries before Python 3.7, so you may not see the exact order I saw when I tested this. But the same values should be in the dict.
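On Python 3, inspect.signature().bind() produces the same name-to-value mapping without any manual zipping, and handles positional, keyword, and *args values uniformly. A sketch, not from the original answer:

```python
import inspect

def foo(a, b, c, *args, **kwargs):
    return a + b + c

# Bind the actual call values to the declared parameter names.
bound = inspect.signature(foo).bind(3, 5, 7, 1, 2)
print(dict(bound.arguments))  # {'a': 3, 'b': 5, 'c': 7, 'args': (1, 2)}
```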
def foo(a, b, c):
    args = {"a": a, "b": b, "c": c}
