I have two functions that accept different parameters:
def foo(name, age):
    pass

def bar(color, shape):
    pass
Now, I have a master function that I want to be able to call with the function I want to execute and its parameters. Since it's a master function that might call either foo or bar, it's called with only two params (the function to be executed and the parameters for that function):
function is a string telling what function to execute
params will be a dictionary of parameters (like **kwargs)
I can do this to make it work:
def master(function, params):
    if function == 'foo':
        foo(params['name'], params['age'])
    elif function == 'bar':
        bar(params['color'], params['shape'])
And then I call master like:
master('foo',{'name': 'John', 'age': 99})
However, if master has a lot of sub-functions to call, there are too many conditions and too much picking of the right parameters for each function.
So I basically have two questions:
1) Instead of calling master with the name of the function and then checking this name in a condition, can I directly pass the function to be executed? If so, how do I execute the function then?
Something like calling master like this:
master(foo(), {'name': 'John', 'age': 99})
2) Functions foo and bar don't have **kwargs; however, it would be very convenient if I could call them passing just a dictionary and have them assign to each variable its corresponding value from the dict.
So basically, could I do:
params = {'name': 'John', 'age': 99, 'color': 'red', 'shape': 'circle'}
foo(params)  # I would like to do this even if foo doesn't have **kwargs
bar(params)  # same for bar
So at the end my ideal call of master would be:
params = {'name': 'John', 'age': 99, 'color': 'red', 'shape': 'circle'}
master(foo(), params) # to execute foo
master(bar(), params) # to execute bar
You can pass functions as arguments:
def master(func, arguments: dict):
    if func is foo:
        args = arguments["name"], arguments["age"]
    elif func is bar:
        args = arguments["color"], arguments["shape"]
    return func(*args)
This can be done even more simply if you don't need to know the names of the functions' arguments:
def master(func, arguments: list):
    return func(*arguments)
A much more generic version is the following:
def master(function, *positional_arguments, **keyword_arguments):
    function(*positional_arguments, **keyword_arguments)
master(foo, 'John', 56)
master(foo, **{'name': 'John', 'age': 56})
master(foo, name='John', age=56)
master(foo, *['John', 56])
Functions in Python are objects, so yes, you can pass them as a parameter (but do not use parentheses).
def master(function, **kwargs):
    function(**kwargs)
Then you call the master:
master(foo, name='John', age=99...)
You can execute a function through a master function by passing the function's definition (its name without parentheses); inside the master, adding parentheses after it makes it execute. Please follow the example below:
def foo(name, age):
    pass

def bar(color, shape):
    pass
Now consider a master function:
The func parameter will receive the function's definition; adding parentheses after it inside masterFunc makes it execute:
def masterFunc(func, params):
    return func(**params)  # adding parentheses after the passed-in function makes it execute
You will use this master function to execute the passed function definition as below:
masterFunc(foo, {'name': 'John', 'age': 99})  # sample execution
A function is a first-class object in Python and can therefore be assigned to an identifier, passed as an argument, or returned by a function.
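For instance, a minimal sketch showing all three (the greet, apply_func and make_greeter names are just illustrative):
def greet(name):
    return "hello " + name

alias = greet                  # assigned to an identifier

def apply_func(func, value):   # a function passed as an argument
    return func(value)

def make_greeter():            # a function returned by a function
    return greet

print(apply_func(alias, "John"))   # hello John
print(make_greeter()("Jane"))      # hello Jane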
A unified way, when you don't want to bother with the exact keyword arguments of a particular function, is to use inspect.getfullargspec:
import inspect

def foo(name, age):
    print(name, age)

def bar(color, shape):
    print(color, shape)

def master(func, params):
    arg_names = inspect.getfullargspec(func).args
    func(**{k: v for k, v in params.items() if k in arg_names})

params = {'name': 'John', 'age': 99, 'color': 'red', 'shape': 'circle'}
master(foo, params)
master(bar, params)
Sample output:
John 99
red circle
In Python, functions are objects, like everything else. You can pass functions as parameters and execute them very simply.
In your case,
def say_hello(name):
    print "hi ", name

def say_bye(name, time):
    print name, ", have a good ", time
say_bye("john", "night")
john , have a good night
def master(fun, **kwargs):
    fun(**kwargs)
master(say_bye, name='john', time='night')
john , have a good night
params1 = {'name': 'jay', 'time': 'morning'}
params2 = {'name': 'cole'}
master(say_hello, **params2)
hi cole
master(say_bye, **params1)
jay , have a good morning
This should work.
You can try the approach below.
Code
class A:
    def a(self, *args, **kwargs):
        print('A')

    def b(self, *args, **kwargs):
        print('B', args, kwargs)

def main(func, *args, **kwargs):
    x = A()
    print(hasattr(x, func))  # shown as True in the sample output below
    if hasattr(x, func):
        func = getattr(x, func)
        func(*args, **kwargs)
    else:
        print('No function name {} defined under class'.format(func))

if __name__ == '__main__':
    func = raw_input('Func_name: a/b')
    main(func, (1), {1: 1})
Output
~/del$ python test.py
Func_name: a/ba
True
A
~/del$ python test.py
Func_name: a/bb
True
('B', (1, {1: 1}), {})
~/del$
Here, we're using the getattr function from __builtin__.
* First it checks whether there's a function/method named a or b on the instance of class A.
* Only if there is does it getattr it and execute it, passing along the args and kwargs.
* If somebody passes a function name that isn't defined, it will not crash; it prints the message above instead.
Hope this helps!
I need to write a method that takes in 3 arguments:
a string with the name of a function
an ordered list of arguments to that function. This includes arguments with default values and *varargs, but does not include **kwargs
a dict representing any additional keyword arguments, or None if there are none
And I need to use this input to retrieve a function and call it. For example:
def dispatch(name, args, kwargs=None):
    do_magic_here(name, args, kwargs)

def meth1():
    print "meth1"

def meth2(a, b):
    print "meth2: %s %s" % (a, b)

def meth3(a, **kwargs):
    print "meth3: " + a
    for k, v in kwargs.iteritems():
        print "%s: %s" % (k, v)
And I need to be able to call things like this:
>>> dispatch("meth1", [])
meth1
>>> dispatch("meth2", [1, 3])
meth2: 1 3
>>> dispatch("meth3", [1], {"hello":2, "there":3})
meth3: 1
hello: 2
there: 3
I could do this:
def do_magic_here(name, args, kwargs=None):
    if name == "meth1":
        meth1()
    if name == "meth2":
        meth2(args[0], args[1])
    if name == "meth3":
        meth3(args[0], **kwargs)
But I'm trying to dispatch like 40 methods, and that number may expand, so I'm hoping there's a more programmatic way to do it. I'm looking at something with getattr, but I can't quite figure it out.
I would just use
def dispatch(name, *args, **kwargs):
    func_name_dict[name](*args, **kwargs)
with
func_name_dict = {'meth1': meth1,
                  'meth2': meth2,
                  ...}
Allowing you to pass args and kwargs through more naturally and transparently:
>>> dispatch("meth2", 1, 3)
meth2: 1 3
You can of course use globals() or locals() in place of the dict, but you might need to be careful about which functions in each namespace you do or don't want to expose to the caller.
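For example, a minimal sketch of a globals()-based dispatcher with an explicit allowlist (the ALLOWED set and the error message are assumptions for illustration):
ALLOWED = {'meth1', 'meth2', 'meth3'}  # hypothetical allowlist of dispatchable names

def dispatch(name, *args, **kwargs):
    if name not in ALLOWED:
        raise ValueError("not a dispatchable function: %s" % name)
    return globals()[name](*args, **kwargs)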
Indeed, getattr will get you there.
class X:
    def a(self):
        print('a called')

    def b(self, arg):
        print('b called with ' + arg)
x = X()
getattr(x, 'a')()
# a called
getattr(x, 'b')('foo')
# b called with foo
Just as getattr handles methods and fields the same way, you can handle functions and variables not associated with a class by looking them up in locals() or globals().
If you want to refer to a function in the global scope:
globals()['meth'](args)
For example:
def dispatch(name, *args, **kwargs):
    globals()[name](*args, **kwargs)
dispatch('meth3', 'hello', foo='bar')
# meth3: hello
# foo: bar
Remember that in Python you can always unpack a list of arguments or a dict of keyword arguments using * and **:
dispatch('meth3', *['hello'], **{'foo':'bar'})
If you truly prefer to pass arguments as list/dict to dispatch:
def dispatch(name, args, kwargs):
    globals()[name](*args, **kwargs)
dispatch('meth3', ['hello'], {'foo': 'bar'})
I need to be able to dynamically invoke a method on a class that accepts various parameters, based on the string name and a dictionary of variables. I know how to find the signature with the inspect module, and I can get the method with getattr, but I do not know how to assign the parameters in the correct order to invoke it in a purely dynamic way.
class MyClass():
    def call_me(self, a, b, *args, foo='bar', **kwargs):
        print('Hey, I got called!')

command = {
    'action': 'call_me',
    'parameters': {
        'a': 'Apple',
        'b': 'Banana',
        'args': ['one', 'two', 'three', 'four'],
        'foo': 'spam',
        'clowns': 'bad',
        'chickens': 'good'
    }
}
me = MyClass()
action = getattr(me,command['action'])
... now what?
I need to be able to dynamically call this function as if this code were used, without any foreknowledge of the actual parameters for the method:
a = command['parameters']['a']
b = command['parameters']['b']
args = command['parameters']['args']
foo = command['parameters']['foo']
kwargs = {
    'clowns': command['parameters']['clowns'],
    'chickens': command['parameters']['chickens']
}
value = action(a, b, *args, foo=foo, **kwargs)
Surely there is a good pythonic way to do this.
Edit: Fixed getattr to call instance of MyClass instead of MyClass directly.
This is the best way I have found so far to capture every possible combination of normal args, *args, keyword args and **kwargs without getting any errors:
import inspect

class MyClass():
    def a(self):
        pass
    def b(self, foo):
        pass
    def c(self, foo, *extras):
        pass
    def d(self, foo, food='spam'):
        pass
    def e(self, foo, **kwargs):
        pass
    def f(self, foo, *extras, food='spam'):
        pass
    def g(self, foo, *extras, **kwargs):
        pass
    def h(self, foo, *extras, food='spam', **kwargs):
        pass
    def i(self, *extras):
        pass
    def j(self, *extras, food='spam'):
        pass
    def k(self, *extras, **kwargs):
        pass
    def l(self, *extras, food='spam', **kwargs):
        pass
    def m(self, food='spam'):
        pass
    def n(self, food='spam', **kwargs):
        pass
    def o(self, **kwargs):
        pass

def dynamic_invoke(obj, name, parameters):
    action = getattr(obj, name)
    spec = inspect.getfullargspec(action)

    used = []
    args = ()
    kwargs = {}

    # skip the "self" argument since we are bound to a class
    for a in spec.args[1:]:
        args += (parameters[a], )
        used.append(a)

    if spec.varargs:
        args += tuple(parameters[spec.varargs])
        used.append(spec.varargs)

    for kw in spec.kwonlyargs:
        try:
            kwargs[kw] = parameters[kw]
            used.append(kw)
        except KeyError:
            pass

    # pass remaining parameters to kwargs, if allowed
    if spec.varkw:
        for k, v in parameters.items():
            if k not in used:
                kwargs[k] = v

    return action(*args, **kwargs)
me = MyClass()
params = {
    'foo': 'bar',
    'extras': ['one', 'two', 'three', 'four'],
    'food': 'eggs',
    'parrot': 'blue'
}
dynamic_invoke(me,'a',params)
dynamic_invoke(me,'b',params)
dynamic_invoke(me,'c',params)
dynamic_invoke(me,'d',params)
dynamic_invoke(me,'e',params)
dynamic_invoke(me,'f',params)
dynamic_invoke(me,'g',params)
dynamic_invoke(me,'h',params)
dynamic_invoke(me,'i',params)
dynamic_invoke(me,'j',params)
dynamic_invoke(me,'k',params)
dynamic_invoke(me,'l',params)
dynamic_invoke(me,'m',params)
dynamic_invoke(me,'n',params)
dynamic_invoke(me,'o',params)
print('done!')
Try it like this:
action = getattr(me, command['action'])
action(**{'a': 'Apple',
          'b': 'Banana',
          'args': ['one', 'two', 'three', 'four'],
          'foo': 'spam',
          'clowns': 'bad',
          'chickens': 'good'
          })
Is there a way to pass a variable between two Python decorators applied to the same function? The goal is for one of the decorators to know that the other was also applied. I need something like decobar_present() from the example below:
def decobar(f):
    def wrap():
        return f() + "bar"
    return wrap

def decofu(f):
    def wrap():
        print decobar_present()  # Tells me whether decobar was also applied
        return f() + "fu"
    return wrap

@decofu
@decobar
def important_task():
    return "abc"
More generally I would like to be able to modify the behavior of decofu depending on whether decobar was also applied.
You can add the function to a "registry" when decobar is applied to it, then later check the registry to determine whether decobar was applied to the function or not. This approach requires preserving the original function's __module__ and __name__ properties intact (use functools.wraps on the wrapper function for that).
import functools

class decobar(object):
    registry = set()

    @classmethod
    def _func_key(cls, f):
        return '.'.join((f.__module__, f.func_name))

    @classmethod
    def present(cls, f):
        return cls._func_key(f) in cls.registry

    def __call__(self, f):
        self.registry.add(self._func_key(f))

        @functools.wraps(f)
        def wrap():
            return f() + "bar"
        return wrap

# Make the decorator singleton
decobar = decobar()

def decofu(f):
    @functools.wraps(f)
    def wrap():
        print decobar.present(f)  # Tells me whether decobar was also applied
        return f() + "fu"
    return wrap

@decofu
@decobar
def important_task():
    return "abc"
Used a class to implement decobar, as it keeps registry and present() in a single namespace (which feels slightly cleaner, IMO).
To pass a variable between two Python decorators you can use the decorated function's keyword-arguments dictionary. Just don't forget to pop the added argument from there before calling the function from within the second decorator.
def decorator1(func):
    def wrap(*args, **kwargs):
        kwargs['cat_says'] = 'meow'
        return func(*args, **kwargs)
    return wrap

def decorator2(func):
    def wrap(*args, **kwargs):
        print(kwargs.pop('cat_says'))
        return func(*args, **kwargs)
    return wrap

class C:
    @decorator1
    @decorator2
    def spam(self, a, b, c, d=0):
        print("Hello, cat! What's your favourite number?")
        return a + b + c + d

x = C()
print(x.spam(1, 2, 3, d=7))
While it is possible to do things like manipulating the stack trace, you're better off, I think, simply creating a combined decorator decofubar and incorporating as much of both "fu" and "bar" as possible. At a minimum, it will make your code cleaner and more obvious.
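A minimal sketch of what such a combined decorator could look like, assuming the decofu/decobar behaviour from the question (decofubar is an assumed name):
def decofubar(f):
    def wrap():
        # do the work of both decorators in one place,
        # so neither needs to detect the other
        return f() + "bar" + "fu"
    return wrap

@decofubar
def important_task():
    return "abc"

print(important_task())  # abcbarfu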
Each decorator gets to wrap another function. The function passed to decofu() is the result of the decobar() decorator.
Just test for specific traits of the decobar wrapper, provided you make the wrapper recognisable:
def decobar(f):
    def wrap():
        return f() + "bar"
    wrap.decobar = True
    return wrap

def decofu(f):
    def wrap():
        print 'decobar!' if getattr(f, 'decobar', False) else 'not decobar'
        return f() + "fu"
    return wrap
I used an arbitrary attribute on the wrapper function, but you could instead test for the name (not unambiguous), for the signature (using inspect.getargspec() perhaps), etc.
This is limited to direct wrapping only.
Generally speaking, you don't want to couple decorators as tightly as all this. Work out a different solution and only depend on function signature or return values.
You can set a flag on f (or rather on wrap) in decobar, just like this:
def decobar(f):
    def wrap():
        return f() + "bar"
    wrap.decobar_applied = True
    return wrap

def decofu(f):
    def wrap():
        if hasattr(f, 'decobar_applied') and f.decobar_applied:
            print 'decobar was also applied'
        return f() + "fu"
    return wrap

@decofu
@decobar
def important_task():
    return "abc"
I'm currently creating an object like this:
class Obj(object):
    def __init__(self, **kwargs):
        params = ['val1', 'val2', 'val3', 'val4', ...]
        for p in params:
            setattr(self, p, kwargs.get(p, None))
I'm doing this so I don't have to do this:
class Obj(object):
    def __init__(self, val1=None, val2=None, val3=None, val4=None, ...):
        self.val1 = val1
        self.val2 = val2
        self.val3 = val3
        self.val4 = val4
        ...
My question is, can you do a mix of the two? Where I can define the expected parameters yet still loop the parameters to set the attributes? I like the idea of defining the expected parameters because it is self documenting and other developers don't have to search for what kwargs are used.
I know it seems pretty petty but I'm creating an object from some XML so I'll be passing in many parameters, it just clutters the code and bugs me.
I did google this but couldn't find anything, probably because dictionary and kwargs together point to kwarg examples.
UPDATE: To be more specific, is it possible to get a dictionary of passed in parameters so I don't have to use kwargs at all?
Pseudocode:
class Obj(object):
    def __init__(self, val1=None, val2=None, val3=None, val4=None, ...):
        for k, v in dictionary_of_parameters.iteritems():
            setattr(self, k, v)
You can use the inspect module:
import inspect

def myargs(val1, val2, val3=None, val4=5):
    print inspect.currentframe().f_locals
It shows all the locals available in the current stack frame.
myargs('a','b')
==> {'val3': None, 'val2': 'b', 'val1': 'a', 'val4': 5}
(note: it's not guaranteed to be implemented on all Python interpreters)
Edit: I concur that it's not a pretty solution. What I would do is more like:
def _yourargs(*names):
    "returns a dict with your named local vars"
    alllocs = inspect.stack()[1][0].f_locals
    return {n: alllocs[n] for n in names}

def askformine(val1, val2, val3=None, val4=5):
    "example to show just those args i'm interested in"
    print _yourargs('val1', 'val2', 'val3', 'val4')

class Obj(object):
    "example inserting some named args as instance attributes"
    def __init__(self, arg1, arg2=4):
        self.__dict__.update(_yourargs('arg1', 'arg2'))
Edit 2, slightly better:
def pickdict(d, *names):
    "picks some values from a dict"
    return {n: d[n] for n in names}

class Obj(object):
    "example inserting some named args as instance attributes"
    def __init__(self, arg1, arg2=4):
        self.__dict__.update(pickdict(locals(), 'arg1', 'arg2'))
There is no nice way to get a dictionary of all the arguments to a function. The **kwargs syntax only collects up the extra keyword arguments, not the ones that match explicit parameters in the function definition.
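A quick illustration of that point (a throwaway example):
def f(a, **kwargs):
    print(a, kwargs)

f(a=1, b=2, c=3)
# 1 {'b': 2, 'c': 3}   <- 'a' matched the explicit parameter, so it does not appear in kwargs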
Although you won't be able to get the parameters without using kwargs or the inspect module (see other answers), you can do something like this...
class Obj(object):
    def __init__(self, **kwargs):
        self.__dict__.update(**kwargs)
Every object has a dictionary that stores all of the attributes, which you can access via self.__dict__. Then you're just using update to set all of the attributes in that object's internal dictionary.
See this question for some discussion of this method.
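A quick usage sketch of that pattern (the attribute names are arbitrary examples):
obj = Obj(val1='spam', val4=42)
print(obj.val1)  # spam
print(obj.val4)  # 42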
If you want to obtain the args dict at the very top of your method, before you define any locals, this is as simple as:
class Obj(object):
    def __init__(self, val1=None, val2=None, val3=None, val4=None):
        kwargs = dict(locals())
To read this dict later on, some introspection magic is required:
import sys

class Obj(object):
    def __init__(self, val1=None, val2=None, val3=None, val4=None):
        # feel free to add more locals
        loc = dict(locals())
        fun = sys._getframe().f_code
        kwargs = {x: loc[x] for x in fun.co_varnames[:fun.co_argcount]}
You can also make the latter reusable by adding a function like this:
def getargs():
    f = sys._getframe(1)
    return {x: f.f_locals[x] for x in f.f_code.co_varnames[:f.f_code.co_argcount]}
and then:
class Obj(object):
    def __init__(self, val1=None, val2=None, val3=None, val4=None):
        # feel free to add more locals
        kwargs = getargs()
This is CPython-specific, I guess.
Yes, you can mix the two.
See below:
def method(a, b=1, *args, **kwargs):
    '''some code'''
This is valid. Here:
'a' is a required argument,
'b' is an argument with a default value,
'args' will collect any extra positional arguments, and
'kwargs' will collect any extra keyword arguments.
Example:
method(1,2,3,4,5,test=6,test1=7)
This call will have:
a=1
b=2
args=(3,4,5)
kwargs={'test':6,'test1':7}
A kind of ugly workaround: inject the extra arguments into kwargs and use them where you want to loop over all keyword arguments (PS: this is an example usage of the inspect module, but not recommended for production use):
#!/usr/bin/env python
import inspect

def inject_defaults(func):
    """ injects a '__defaults' key into kwargs,
    so it can be merged with kwargs in the decorated method """
    args, varargs, varkwargs, defaults = inspect.getargspec(func)
    have_defaults = args[-len(defaults):]
    defaults_dict = dict(zip(have_defaults, defaults))

    def fun(*args, **kwargs):
        kwargs['__defaults'] = defaults_dict
        return func(*args, **kwargs)
    return fun

@inject_defaults
def f1(a, b, c, x=1, **kwargs):
    kwargs.update(kwargs['__defaults'])
    del kwargs['__defaults']
    for k, v in kwargs.items():
        # here, x, y and z will appear
        print(k, v)
f1(1, 2, 3, y=3, z=2)
# prints
# ('y', 3)
# ('x', 1)
# ('z', 2)