I have two functions that accept different parameters:
def foo(name, age):
    pass

def bar(color, shape):
    pass
Now, I have a master function that I want to be able to call with the function I want to execute and its parameters. Since it's a master function that might call either foo or bar, it's called with only two params (the function to be executed and the parameters for that function):
* function is a string naming which function to execute
* params is a dictionary of parameters (like **kwargs)
I can do this to make it work:
def master(function, params):
    if function == 'foo':
        foo(params['name'], params['age'])
    elif function == 'bar':
        bar(params['color'], params['shape'])
And then I call master like:
master('foo',{'name': 'John', 'age': 99})
However, if master has a lot of sub-functions to call, there are too many conditions and too much picking of the right parameters for each function.
So I basically have two questions:
1) Instead of calling master with the name of the function and then checking this name in a condition, can I directly pass the function to be executed? If so, how do I execute the function then?
Something like calling master like this:
master(foo(), {'name': 'John', 'age': 99})
2) functions foo and bar don't have **kwargs; however, it would be very convenient if I could call them with just a dictionary and have each parameter pick up its corresponding value from the dict.
So basically, could I do:
params = {'name': 'John', 'age': 99, 'color': 'red', 'shape': 'circle'}
foo(params) # I would like to do this even if foo doesn't have **kwargs
bar(params) # same for bar
So at the end my ideal call of master would be:
params = {'name': 'John', 'age': 99, 'color': 'red', 'shape': 'circle'}
master(foo(), params) # to execute foo
master(bar(), params) # to execute bar
You can pass functions as arguments:
def master(func, arguments: dict):
    if func is foo:
        args = arguments["name"], arguments["age"]
    elif func is bar:
        args = arguments["color"], arguments["shape"]
    return func(*args)
This can be done even more simply if you don't need the names of the functions' arguments at all:
def master(func, arguments: list):
    return func(*arguments)
A much more generic version is the following:
def master(function, *positional_arguments, **keyword_arguments):
    function(*positional_arguments, **keyword_arguments)
master(foo, 'John', 56)
master(foo, **{'name': 'John', 'age': 56})
master(foo, name='John', age=56)
master(foo, *['John', 56])
Functions in Python are objects, so yes, you can pass them as parameters (but do not use parentheses).
def master(function, **kwargs):
    function(**kwargs)
Then you call the master:
master(foo, name='John', age=99...)
You can execute a function through a master function by passing the function object itself (its name, without parentheses) and letting the master function add the parentheses, which makes it execute. Please follow the example below:
def foo(name, age):
    pass

def bar(color, shape):
    pass
Now consider a master function:
The func parameter holds the function object; adding parentheses after it inside masterFunc makes it execute.
def masterFunc(func, params):
    return func(**params)  # the parentheses after func, with params unpacked, make it execute
You will use this master function to execute the passed function as below:
masterFunc(foo, {'name': 'John', 'age': 99})  # sample execution
Functions are first-class objects in Python and can therefore be assigned to an identifier, passed as an argument, or returned by another function.
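For example, a quick illustration of that (the names below are only placeholders, not from the question):
def greet(name):
    return 'hello ' + name

alias = greet                 # assigned to another identifier

def call_with(func, arg):     # passed as an argument
    return func(arg)

print(call_with(alias, 'John'))  # hello John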
A unified way, when you don't want to bother matching keyword args to a particular function, is the inspect.getfullargspec feature:
import inspect

def foo(name, age):
    print(name, age)

def bar(color, shape):
    print(color, shape)

def master(func, params):
    arg_names = inspect.getfullargspec(func).args
    func(**{k: v for k, v in params.items() if k in arg_names})
params = {'name':'John', 'age':99, 'color':'red', 'shape':'circle'}
master(foo, params)
master(bar, params)
Sample output:
John 99
red circle
In Python, functions are objects, like everything else. You can pass functions as parameters and execute them very simply.
In your case,
def say_hello(name):
    print "hi ", name

def say_bye(name, time):
    print name, ", have a good ", time
say_bye("john", "night")
john , have a good night
def master(fun, **kwargs):
    fun(**kwargs)
master(say_bye, name='john', time='night')
john , have a good night
params1 = {'name': 'jay', 'time': 'morning'}
params2 = {'name': 'cole'}
master(say_hello, **params2)
hi cole
master(say_bye, **params1)
jay , have a good morning
this should work.
You can try the approach below.
Code
class A:
    def a(self, *args, **kwargs):
        print('A')

    def b(self, *args, **kwargs):
        print('B', args, kwargs)

def main(func, *args, **kwargs):
    x = A()
    if hasattr(x, func):
        func = getattr(x, func)
        func(*args, **kwargs)
    else:
        print('No function name {} defined under class'.format(func))

if __name__ == '__main__':
    func = raw_input('Func_name: a/b')
    main(func, (1), {1: 1})
Output
~/del$ python test.py
Func_name: a/ba
A
~/del$ python test.py
Func_name: a/bb
('B', (1, {1: 1}), {})
~/del$
Here, we're using the built-in getattr function.
* First it checks whether there's a function/method named a or b on the instance of class A.
* Only if there is does it getattr it and execute it, passing along the args and kwargs.
* If somebody passes a function name that isn't defined, it won't crash; it just prints the message saying so and returns.
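As a side note, getattr also accepts a default value, so the hasattr check can be folded into a single lookup; a small sketch of that variant (not part of the original answer, reusing class A from above):
def main(func_name, *args, **kwargs):
    x = A()
    func = getattr(x, func_name, None)  # None when no such attribute exists
    if callable(func):
        func(*args, **kwargs)
    else:
        print('No function named {} defined under class'.format(func_name))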
Hope this helps!
Related
I need to write a method that takes in 3 arguments:
* a string with the name of a function
* an ordered list of arguments to that function. This includes arguments with default values and *varargs, but does not include **kwargs
* a dict representing any additional keyword arguments, or None if there are none
And I need to use this input to retrieve a function and call it. For example:
def dispatch(name, args, kwargs=None):
    do_magic_here(name, args, kwargs)

def meth1():
    print "meth1"

def meth2(a, b):
    print "meth2: %s %s" % (a, b)

def meth3(a, **kwargs):
    print "meth3: " + a
    for k, v in kwargs.iteritems():
        print "%s: %s" % (k, v)
And I need to be able to call things like this:
>>> dispatch("meth1", [])
meth1
>>> dispatch("meth2", [1, 3])
meth2: 1 3
>>> dispatch("meth3", [1], {"hello":2, "there":3})
meth3: 1
hello: 2
there: 3
I could do this:
def do_magic_here(name, args, kwargs=None):
    if name == "meth1":
        meth1()
    if name == "meth2":
        meth2(args[0], args[1])
    if name == "meth3":
        meth3(args[0], **kwargs)
But I'm trying to dispatch like 40 methods, and that number may expand, so I'm hoping there's a more programmatic way to do it. I'm looking at something with getattr, but I can't quite figure it out.
I would just use
def dispatch(name, *args, **kwargs):
    func_name_dict[name](*args, **kwargs)
with
func_name_dict = {'meth1': meth1,
                  'meth2': meth2,
                  ...}
Allowing you to pass args and kwargs through more naturally and transparently:
>>> dispatch("meth2", 1, 3)
meth2: 1 3
You can of course use globals() or locals() in place of the dict, but you might need to be careful about which functions in each namespace you do or don't want to expose to the caller.
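For example, a minimal sketch of such a whitelist (the ALLOWED set here is an assumption about which functions you want exposed):
ALLOWED = {'meth1', 'meth2', 'meth3'}  # names the caller is allowed to dispatch to

def dispatch(name, *args, **kwargs):
    if name not in ALLOWED:
        raise ValueError('unknown function: %s' % name)
    return globals()[name](*args, **kwargs)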
Indeed, getattr will get you there.
class X:
    def a(self):
        print('a called')

    def b(self, arg):
        print('b called with ' + arg)

x = X()
getattr(x, 'a')()
# a called
getattr(x, 'b')('foo')
# b called with foo
Just as getattr handles methods and fields uniformly, you can handle functions and variables that aren't associated with a class by referencing locals() or globals().
If you want to refer to a function in the global scope:
globals()['meth'](args)
For example:
def dispatch(name, *args, **kwargs):
    globals()[name](*args, **kwargs)

dispatch('meth3', 'hello', foo='bar')
# meth3: hello
# foo: bar
Remember that in Python you can always pass a list of positional arguments or a dict of keyword arguments using * and **:
dispatch('meth3', *['hello'], **{'foo':'bar'})
If you truly prefer to pass arguments as list/dict to dispatch:
def dispatch(name, args, kwargs):
    globals()[name](*args, **kwargs)
dispatch('meth3', ['hello'], {'foo': 'bar'})
def a(**akwargs):
    def b(bkwargs=akwargs):
        # how do I not only use akwargs by default, but also let myself
        # define bkwargs?
        print bkwargs
    return b
If I want the following calls to work as shown, how could I do that with the code above?
>>> a(u='test')()
{'u': 'test'}
>>> a(u='test')(u='test2')
{'u': 'test2'}
It's a little unclear what you want, but I think this is the trick:
def a(**akwargs):
    def b(bkwargs=akwargs, **kwargs):
        if not bkwargs:
            bkwargs = kwargs
        else:
            # it depends what you want here (merge or replace?), but probably
            # something like bkwargs.update(kwargs) or kwargs.update(bkwargs);
            # here we merge, letting the explicitly passed kwargs win
            bkwargs = dict(bkwargs, **kwargs)
        print bkwargs
    return b
This uses the kwargs from the outer function as defaults and updates them with any kwargs passed to the inner call:
from functools import partial

def outer(**kwargs):
    def inner(**kwargs):
        return kwargs
    return partial(inner, **kwargs)
This next one uses the kwargs from the outer function only if no kwargs were passed to the inner call:
def outer(**outer_kwargs):
    def inner(**inner_kwargs):
        kwargs = inner_kwargs or outer_kwargs
        return kwargs
    return inner
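For what it's worth, both versions of outer above reproduce the calls from the question (results shown as comments):
print(outer(u='test')())           # {'u': 'test'}
print(outer(u='test')(u='test2'))  # {'u': 'test2'}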
Why not do the following?
def a(**akwargs):
    def b(**bkwargs):
        allkwargs = {}
        allkwargs.update(akwargs)
        allkwargs.update(bkwargs)
        print allkwargs
    return b
This uses the values from a but allows you to overwrite them by passing more to b. So:
>>> a(u='test')()
{'u': 'test'}
>>> a(u='test')(u='test2')
{'u': 'test2'}
I have code that uses the pattern "function(a)(**kwargs)". I only know the pattern "function(a, **kwargs)". What does it mean when there is a second set of arguments in a separate set of parentheses?
Shortened to what I think is relevant, the code looks like this:
myprog1.py
def factory(cid, classes=CLASS_CACHE):
    # some code ...
myprog2.py
from myprog1 import factory
...
class Client(object):
    def __init__(self, operations, factory):
        self.factory = factory

    def some_function(self):
        chk = self.factory(test)(**kwargs)
factory is a function, test is a string (naming an object).
function(a)(**kwargs) calls the value returned by function(a) with keyword arguments unpacked from kwargs. E.g. the code below
def f():
    def inner(**ka):
        print(ka)  # print received keyword arguments
    return inner   # return a callable function object

f()(argument='here')
outputs
{'argument': 'here'}
** is a syntax construction that turns a dictionary into keyword arguments. For example:
def a(b, c):
    print b + c

args = {'b': 1, 'c': 2}
a(**args)  # will print 3
In your code, search for the definition of kwargs. I bet some_function has **kwargs in its argument list, like so:
def some_function(**kwargs):
So, your code chk = self.factory(test)(**kwargs) does the following:
1. It calls the self.factory method with the test argument.
2. self.factory returns a function.
3. The returned function is called with the keyword arguments that were passed into some_function as **kwargs.
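The factory itself isn't shown, but as a purely hypothetical sketch of the pattern (the CLASS_CACHE contents and the returned class are invented for illustration):
CLASS_CACHE = {'test': dict}  # hypothetical registry mapping names to classes

def factory(cid, classes=CLASS_CACHE):
    return classes[cid]  # returns a callable (here a class), not an instance

obj = factory('test')(a=1, b=2)  # the second pair of parentheses calls the returned class
print(obj)  # {'a': 1, 'b': 2}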
I want to give users of my library an API with an easier way to distinguish the different groups of parameters passed to a function. All groups of arguments are defined up front (for now I have 3 groups), but their attributes need to be constructed at run time. I can do this in Django ORM style, where a double underscore separates the two parts of a parameter, but it is very unreadable. Example:
def api_function(**kwargs):
    """ Separate passed arguments """

api_function(post__arg1='foo', api__arg1='bar', post__arg2='foo2')
SQLAlchemy does this in a nicer way, but only to compare attributes, and all args are defined earlier. Example:
class API(object):
    arg1 = Arg()
    arg2 = Arg()

class Post(object): #...

def api_function(*args):
    """ Separate passed arguments """

api_function(POST.arg1=='foo', API.arg1=='bar', POST.arg2=='foo2')
What I would like to achive is behaviour like this:
class API(object): # Magic

class POST(object): # Magic

def api_function(*args):
    """ Separate passed arguments """

api_function(POST.arg1='foo', API.arg1='bar', POST.arg2='foo2')
What I have tried:
* declaring a metaclass with __setattr__ defined, but on evaluation it raises SyntaxError: keyword can't be an expression
* declaring __set__, but it is designed for known attributes
My questions are:
1. Is it even possible in Python to make it work like the third snippet?
2. If not, is there any solution that comes really close to the third snippet? The best option would use the assignment operator, API.arg1='foo'; the worst, API(arg1='foo').
Requirements: it should work on at least Python 2.7; working on Python 3.2 would be good too.
EDIT1
My first test, which uses the equality operator (though it should NEVER be used this way):
class APIMeta(type):
    def __getattr__(cls, item):
        return API(item, None)

class API(object):
    __metaclass__ = APIMeta

    def __init__(self, key, value):
        self.key = key
        self.value = value

    def __str__(self):
        return "{0}={1}".format(self.key, self.value)

    def __eq__(self, other):
        self.value = other
        return self

def print_api(*api_data):
    for a in api_data:
        print(str(a))

print_api(API.page=='3', API=='bar')
It works, but using == suggests that I want to compare something, whereas I actually want to assign a value.
NOTE: I don't know how much I like this schema you want. But I know one annoying thing will be all the imports needed to call api_function, e.g. from api import POST, API, api_function.
As I said in the comments, the first way is not possible, because assignment (=) is a statement, not an expression, so it can't return a value.
But the other way you asked for certainly is:
class POST(object):
    def __init__(self, **kwargs):
        self.args = kwargs

    # You'll also probably want to make this function a little safer.
    def __getattr__(self, name):
        return self.args[name]

def api_function(*args):
    # Update this to how complicated the handling needs to be,
    # but you get the general idea...
    post_data = None
    for a in args:
        if isinstance(a, POST):
            post_data = a.args
    if post_data is None:
        raise Exception('This function needs a POST object passed.')
    print post_data
Using it:
>>> api_function('foo')
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "<stdin>", line 7, in api_function
Exception: This function needs a POST object passed.
>>> api_function(POST(arg1='foo'))
{'arg1': 'foo'}
>>> api_function(POST(arg1='foo',
... arg2='bar'
... )
... )
{'arg1': 'foo', 'arg2': 'bar'}
Here's my solution. It's not the best in design, as the structure of the argument grouper is nested quite deep, so I'd appreciate feedback on it:
class ArgumentGrouper(object):
    """Transforms a function so that you can apply arguments in named groups.

    This system isn't tested as thoroughly as something with so many moving
    parts should be. Use at own risk.

    Usage:
        @ArgumentGrouper("foo", "bar")
        def method(regular_arg, foo__arg1, bar__arg2):
            print(regular_arg + foo__arg1 + bar__arg2)
        method.foo(arg1=", ").bar(arg2="world!")("Hello")  # Prints "Hello, world!"
    """

    def __call__(self, func):
        """Decorate the function."""
        return self.Wrapper(func, self.argument_values)

    def __init__(self, *argument_groups):
        """Constructor.

        argument_groups -- The names of argument groups in the function.
        """
        self.argument_values = {i: {} for i in argument_groups}

    class Wrapper(object):
        """This is the result of decorating the function. You can call group
        names as functions to supply their keyword arguments.
        """

        def __call__(self, *args):
            """Execute the decorated function by passing any given arguments
            and predefined group arguments.
            """
            kwargs = {}
            for group, values in self.argument_values.items():
                for name, value in values.items():
                    # Add a new argument in the form foo__arg1 to kwargs, as
                    # per the supplied arguments.
                    new_name = "{}__{}".format(
                        group,
                        name
                    )
                    kwargs[new_name] = value
            # Invoke the function with the determined arguments.
            return self.func(*args, **kwargs)

        def __init__(self, func, argument_values):
            """Constructor.

            func -- The decorated function.
            argument_values -- A dict with the current values for group
                arguments. Must be a reference to the actual dict, since each
                WrappedMethod uses it.
            """
            self.func = func
            self.argument_values = argument_values

        def __getattr__(self, name):
            """When trying to call `func.foo(arg1="bar")`, provide `foo`. TODO:
            This would be better handled at initialization time.
            """
            if name in self.argument_values:
                return self.WrappedMethod(name, self, self.argument_values)
            else:
                return self.__dict__[name]

        class WrappedMethod(object):
            """For `func.foo(arg1="bar")`, this is `foo`. Pretends to be a
            function that takes the keyword arguments to be supplied to the
            decorated function.
            """

            def __call__(self, **kwargs):
                """`foo` has been called, record the arguments passed."""
                for k, v in kwargs.items():
                    self.argument_values[self.name][k] = v
                return self.wrapper

            def __init__(self, name, wrapper, argument_values):
                """Constructor.

                name -- The name of the argument group. (This is the string
                    "foo".)
                wrapper -- The decorator. We need this so that we can return it
                    to chain calls.
                argument_values -- A dict with the current values for group
                    arguments. Must be a reference to the actual dict, since
                    each WrappedMethod uses it.
                """
                self.name = name
                self.wrapper = wrapper
                self.argument_values = argument_values
Then, usage:
#ArgumentGrouper("post", "api")
def api_function(regular_arg, post__arg1, post__arg2, api__arg3):
print("Got regular args {}".format(regular_arg))
print("Got API args {}, {}, {}".format(post__arg1, post__arg2, api__arg3))
api_function.post(
arg1="foo", arg2="bar"
).api(
arg3="baz"
)
api_function("foo")
Output:
Got regular args foo
Got API args foo, bar, baz
It should be simple to scrape argument group names by introspection.
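For instance, a small sketch of that introspection, assuming the double-underscore convention above (group_names is an invented helper, not part of the code):
def group_names(func):
    """Collect the text before '__' in each positional parameter name of func."""
    arg_names = func.__code__.co_varnames[:func.__code__.co_argcount]
    return {name.split('__', 1)[0] for name in arg_names if '__' in name}

# For the decorated api_function above, inspect the wrapped function:
# group_names(api_function.func) -> {'post', 'api'}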
You'll notice the argument naming convention is hardcoded into the WrappedMethod, so you'll have to make sure you're okay with that.
You can also invoke it in one statement:
api_function.post(
    arg1="foo", arg2="bar"
).api(
    arg3="baz"
)("foo")
Or you could add a dedicated run method which would invoke it, which would just take the place of Wrapper.__call__.
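A rough sketch of such a run method (not part of the code above; it simply mirrors the logic of Wrapper.__call__):
    # inside Wrapper, replacing (or supplementing) __call__:
    def run(self, *args):
        """Invoke the decorated function with the collected group arguments."""
        kwargs = {}
        for group, values in self.argument_values.items():
            for name, value in values.items():
                kwargs["{}__{}".format(group, name)] = value
        return self.func(*args, **kwargs)

# usage would then read:
# api_function.post(arg1="foo", arg2="bar").api(arg3="baz").run("foo")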
Python doesn't allow using the assignment operator inside other code, so:
(a=1)
func((a=1))
will raise a SyntaxError. This means it is not possible to use it this way. Moreover:
func(API.arg1=3)
will be treated as an assignment whose left side is the argument name API.arg1, which is not a valid Python identifier. The only solution is to do it SQLAlchemy style:
func({
    API.arg1: 'foo',
    API.arg2: 'bar',
    DATA.arg1: 'foo1',
})
or
func(**{
    API.arg1: 'foo',
    API.arg2: 'bar',
    DATA.arg1: 'foo1',
})
or simply:
func( API(arg1='foo', arg2='bar'), POST(arg1='foo1'), POST(arg2='bar1'))
Thank you for your interest and answers.
I'm currently creating an object like this:
class Obj(object):
    def __init__(self, **kwargs):
        params = ['val1', 'val2', 'val3', 'val4', ...]
        for p in params:
            setattr(self, p, kwargs.get(p, None))
I'm doing this so I don't have to do this:
class Obj(object):
    def __init__(self, val1=None, val2=None, val3=None, val4=None, ...):
        self.val1 = val1
        self.val2 = val2
        self.val3 = val3
        self.val4 = val4
        ...
My question is, can you do a mix of the two, where I define the expected parameters yet still loop over them to set the attributes? I like the idea of defining the expected parameters because it is self-documenting and other developers don't have to search for which kwargs are used.
I know it seems pretty petty but I'm creating an object from some XML so I'll be passing in many parameters, it just clutters the code and bugs me.
I did google this but couldn't find anything, probably because dictionary and kwargs together point to kwarg examples.
UPDATE: To be more specific, is it possible to get a dictionary of passed in parameters so I don't have to use kwargs at all?
Pseudocode:
class Obj(object):
    def __init__(self, val1=None, val2=None, val3=None, val4=None, ...):
        for k, v in dictionary_of_parameters.iteritems():
            setattr(self, k, v)
You can use the inspect module:
import inspect

def myargs(val1, val2, val3=None, val4=5):
    print inspect.currentframe().f_locals
It shows all the locals available in the current stack frame.
myargs('a','b')
==> {'val3': None, 'val2': 'b', 'val1': 'a', 'val4': 5}
(note: it's not guaranteed to be implemented on all Python interpreters)
Edit: I concur that it's not a pretty solution. What I would do is more like:
def _yourargs(*names):
    "returns a dict with your named local vars"
    alllocs = inspect.stack()[1][0].f_locals
    return {n: alllocs[n] for n in names}

def askformine(val1, val2, val3=None, val4=5):
    "example to show just those args i'm interested in"
    print _yourargs('val1', 'val2', 'val3', 'val4')

class Obj(object):
    "example inserting some named args as instance attributes"
    def __init__(self, arg1, arg2=4):
        self.__dict__.update(_yourargs('arg1', 'arg2'))
Edit 2, slightly better:
def pickdict(d, *names):
    "picks some values from a dict"
    return {n: d[n] for n in names}

class Obj(object):
    "example inserting some named args as instance attributes"
    def __init__(self, arg1, arg2=4):
        self.__dict__.update(pickdict(locals(), 'arg1', 'arg2'))
There is no nice way to get a dictionary of all the arguments to a function. The **kwargs syntax only collects up the extra keyword arguments, not the ones that match explicit parameters in the function definition.
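A quick illustration of that point (this snippet is only an example):
def f(a, b=2, **kwargs):
    print(kwargs)

f(1, b=3, c=4)  # prints {'c': 4}; a and b are bound to named parameters, not collected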
Although you won't be able to get the parameters without using kwargs or the inspect module (see other answers), you can do something like this...
class Obj(object):
    def __init__(self, **kwargs):
        self.__dict__.update(**kwargs)
Every object has a dictionary that stores all of the attributes, which you can access via self.__dict__. Then you're just using update to set all of the attributes in that object's internal dictionary.
See this question for some discussion of this method.
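Used like this (just a demonstration call):
o = Obj(val1='a', val2='b')
print(o.val1)  # a
Note that with this approach any keyword becomes an attribute, so you lose the fixed, self-documenting parameter list the question asked about.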
If you want to obtain the args dict at the very top of your method, before you define any locals, this is as simple as:
class Obj(object):
    def __init__(self, val1=None, val2=None, val3=None, val4=None):
        kwargs = dict(locals())
To read this dict later on, some introspection magic is required:
import sys

class Obj(object):
    def __init__(self, val1=None, val2=None, val3=None, val4=None):
        # feel free to add more locals
        loc = dict(locals())
        fun = sys._getframe().f_code
        kwargs = {x: loc[x] for x in fun.co_varnames[:fun.co_argcount]}
You can also make the latter reusable by adding a function like this:
def getargs():
    f = sys._getframe(1)
    return {x: f.f_locals[x] for x in f.f_code.co_varnames[:f.f_code.co_argcount]}
and then:
class Obj(object):
    def __init__(self, val1=None, val2=None, val3=None, val4=None):
        # feel free to add more locals
        kwargs = getargs()
This is CPython-specific, I guess.
Yes, you can mix the two.
See below:
def method(a, b=1, *args, **kwargs):
    '''some code'''
This is valid. Here:
* 'a' is a required argument
* 'b' is a default argument
* 'args' will have all the non-keyword arguments
* 'kwargs' will have all the keyword arguments
Example:
method(1,2,3,4,5,test=6,test1=7)
This call will have:
a=1
b=2
args=(3,4,5)
kwargs={'test':6,'test1':7}
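As a runnable check of that breakdown (illustrative only):
def method(a, b=1, *args, **kwargs):
    print(a, b, args, kwargs)

method(1, 2, 3, 4, 5, test=6, test1=7)
# 1 2 (3, 4, 5) {'test': 6, 'test1': 7}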
A kind of ugly workaround: inject the defaults as an extra key into kwargs and use it where you want to loop over all keyword arguments (PS: this is an example usage of the inspect module, but not recommended for production use):
#!/usr/bin/env python
import inspect

def inject_defaults(func):
    """ injects a '__defaults' key into kwargs,
        so it can be merged with kwargs in the decorated method """
    args, varargs, varkwargs, defaults = inspect.getargspec(func)
    have_defaults = args[-len(defaults):]
    defaults_dict = dict(zip(have_defaults, defaults))

    def fun(*args, **kwargs):
        kwargs['__defaults'] = defaults_dict
        return func(*args, **kwargs)
    return fun
@inject_defaults
def f1(a, b, c, x=1, **kwargs):
    kwargs.update(kwargs['__defaults'])
    del kwargs['__defaults']
    for k, v in kwargs.items():
        # here, x, y and z will appear
        print(k, v)

f1(1, 2, 3, y=3, z=2)
# prints
# ('y', 3)
# ('x', 1)
# ('z', 2)