In Python we can assign a function to a variable. For example, the math.sin function:
import math

sin = math.sin
rad = math.radians
print(sin(rad(my_number_in_degrees)))
Is there any easy way of assigning a combination of functions (i.e., a function of a function) to a variable? For example:
sin = math.sin(math.radians)  # I cannot use this with brackets
print(sin(my_number_in_degrees))
Just create a wrapper function:
def sin_rad(degrees):
    return math.sin(math.radians(degrees))
Call your wrapper function as normal:
print(sin_rad(my_number_in_degrees))
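If a separate def feels heavy, the same wrapper can be written as a one-line lambda (equivalent to the sin_rad function above):

```python
import math

# Compose the two calls by hand inside a lambda.
sin_rad = lambda degrees: math.sin(math.radians(degrees))

print(sin_rad(90))  # -> 1.0
```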
I think what the author wants is some form of function chaining. In general this is difficult, but it may be possible for functions that
take a single argument,
return a single value,
and whose return value is of the same type as the input type of the next function in the list.
Let us say that there is a list of functions that we need to chain, all of which take a single argument and return a single value, with consistent types throughout. Something like this ...
functions = [np.sin, np.cos, np.abs]
Would it be possible to write a general function that chains all of these together? Well, we can use reduce, although Guido doesn't particularly like map and reduce, and reduce was moved into functools in Python 3 ...
Something like this ...
>>> reduce(lambda m, n: n(m), functions, 3)
0.99005908575986534
Now how do we create a function that does this? Well, just create a function that takes a value and returns a function:
import numpy as np
from functools import reduce  # reduce lives in functools in Python 3

def chainFunctions(functions):
    def innerFunction(y):
        return reduce(lambda m, n: n(m), functions, y)
    return innerFunction

if __name__ == '__main__':
    functions = [np.sin, np.cos, np.abs]
    ch = chainFunctions(functions)
    print(ch(3))
You could write a helper function to perform the function composition for you and use it to create the kind of variable you want. A nice feature is that it can combine a variable number of functions, each of which may accept a variable number of arguments.
import math

try:
    reduce
except NameError:  # Python 3
    from functools import reduce

def compose(*funcs):
    """Compose a group of functions (f(g(h(...)))) into a single composite function."""
    return reduce(lambda f, g: lambda *args, **kwargs: f(g(*args, **kwargs)), funcs)
sindeg = compose(math.sin, math.radians)
print(sindeg(90)) # -> 1.0
I am using a module for a project. I have to pass a function to the module, and the module does something like:
class module:
    def __init__(self, function, dictionary):
        # dictionary is {'x':2, 'y':4, 'z':23}
        function(**dictionary)
And my function is something like:
def function(*foo):
    return sum(foo)
The problem is that the module expects named arguments and passes them to the function as an unpacked dictionary. The number of elements in the dictionary can vary, so I cannot pre-write the function as def function(x, y, z): return sum((x, y, z)); this raises an error. I do not wish to modify the module, because then the code would no longer be universal. How can I solve this problem by changing only my code?
EDIT: I need foo as a list to use in the function
The module that you can't change is calling your function with:
function(**dictionary)
You won't be able to write your function so that the argument is a list, because it's not being passed a list. You can accept the keywords as a dict and then easily make a list. Your function just needs to be prepared to be called that way:
def f(**foo):
This leads to:
class module:
    def __init__(self, function, dictionary):
        # dictionary is {'x':2, 'y':4, 'z':23}
        function(**dictionary)

def f(**foo):
    print(sum(foo.values()))

module(f, {'x':2, 'y':4, 'z':23})
# prints 29 as expected
def function(*args, **kwargs):
    # Sum whichever kind of arguments was actually passed.
    if args:
        return sum(args)
    return sum(kwargs.values())
The double ** collects keyword arguments (an unpacked dictionary) into a dict, and the single * collects positional arguments (anything except keywords) into a tuple.
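A quick demonstration of that difference (a minimal sketch, separate from the asker's module):

```python
def collect(*args, **kwargs):
    # *args gathers positional arguments into a tuple,
    # **kwargs gathers keyword arguments into a dict.
    return args, kwargs

args, kwargs = collect(1, 2, x=3, y=4)
print(args)                              # (1, 2)
print(kwargs)                            # {'x': 3, 'y': 4}
print(sum(args) + sum(kwargs.values()))  # 10
```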
The number and type of arguments are determined by the code of the function __init__.
In your case this is a single argument of type dictionary, so you always have to pass a function f(x) where x is a dictionary.
It is the function f that has to deal with the argument.
E.g.
def fsum(x): return sum(x.values())
...
__init__(fsum, {'a': 1, 'b': 2, 'c': 3})
It seems you want the sum of the values:
def __init__(self, function, dictionary):
    # dictionary is {'x':2, 'y':4, 'z':23}
    function(dictionary.values())
The dictionary.values() call will give the values [2, 4, 23] for your example (a view object in Python 3, which sum() accepts).
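For example, summing those values directly (a quick check, using the question's dictionary):

```python
dictionary = {'x': 2, 'y': 4, 'z': 23}

# .values() returns a view object in Python 3; sum() accepts any iterable.
total = sum(dictionary.values())
print(total)  # 29
```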
I'm not sure how verbose I should go so please ask for elaboration if this is too terse.
Is there a way to pack the for a,b,c in product(d['a'],d['b'],d['c']): into some syntactic sugar, so that I would only need to type the dummy variables a, b, c once for this loop itself?
Something like this may be?
my_for_loop('a','b','c'):
    API_Call1(a)
    API_Call2(b,c)
instead of
for a,b,c in product(d['a'],d['b'],d['c']):
    API_Call1(a)
    API_Call2(b,c)
What could my_for_loop look like? I am a bit lost as to how to approach this conceptually.
More detail:
I have an API that requires calling it for each cell of a cartesian product of some lists. I am using the product function to avoid nested loops. Let's say we have a list1 and a list2; it can be done in the following way:
from itertools import product
for a,b in product(list1,list2):
    API_Call(a,b)
I have created a dictionary_of_lists={'a':list1,'b':list2,'c':list3...}
to be able to write it like this
for a,b in product(dictionary_of_lists['a'],dictionary_of_lists['b']):
    API_Call(a,b)

for c,b in product(dictionary_of_lists['c'],dictionary_of_lists['b']):
    API_Call(c,b)

for e,f,g,h in product(dictionary_of_lists['e'],dictionary_of_lists['f'],
                       dictionary_of_lists['g'],dictionary_of_lists['h']):
    API_Call1(e,f,g,h)
    API_Call2(e,h)
...
So basically the variables that the loop creates are used in those API calls and are dummies otherwise; their names don't matter. There are many of these calls, and there is some convoluted logic around them. So I would like to keep the loop itself simple, and should I need to change the variables, I won't have to change them in three places for each such loop.
my_for_loop('a','b'):
    API_Call(a,b)

my_for_loop('b','c'):
    API_Call(c,b)

my_for_loop('e','f','g','h'):
    API_Call1(e,f,g,h)
    API_Call2(e,h)
...
ADDITION:
I have simplified a few things, but was taken by surprise by where exactly the ambiguity was lurking :-)
Thanks for all the answers so far!
It's a good suggestion to have the dproduct wrapper. I have one indeed; I just did not want to preempt your suggestions.
The variable names are dummies as far as the code logic goes, but they do have some meaning for the sake of maintaining the code, so they cannot consist of a single letter each.
In an attempt to clarify further: I would like to avoid using the variable names three times: in the "for ..." part, in the call to the dproduct wrapper, and in the API calls. Two times, in the call to the wrapper and in the API calls, is OK because it reflects the logic.
Below is a more elaborated example of the code I have now.
def dproduct(d, keys):
    subset_d = dict((k, d[k]) for k in keys if k in d)
    return product(*subset_d.values())
for foo, bar in dproduct(d, ['foo','bar']):
    # some logic here
    if API_Call1(foo,bar) == 123:
        # some other stuff
        API_Call6(quux,baz)
    # some more stuff and a call to another dproduct
    for quux, sl, baz in dproduct(d, ['quux','sl','baz']):
        # blah blah
        API_Call2(quux,sl,baz)

# other stuff
for pa, mf in dproduct(d, ['pa','mf']):
    API_Call4(pa,mf)

for quux, sl, baz in dproduct(d, ['quux','sl','baz']):
    # further logic
    if API_Call1(quux, sl, baz) == 342:
        pass  # some other stuff
    # some more stuff and a call to another dproduct
    for pa,mf in dproduct(d, ['pa','mf']):
        API_Call3(pa,mf)
First, you can write a product wrapper like this:
def dproduct(d, keys):
    return product(*(d[key] for key in keys))

for a, b, c in dproduct(d, 'abc'):
    API_Call1(a)
    API_Call2(b, c)
(Notice that you can write the same thing with operator.itemgetter instead of a genexpr; it depends on which you find more readable.)
Of course I'm taking advantage of the fact that all of your keys have single-character names, and a string is an iterable of its characters. If your real names aren't single-character, you have to be slightly more verbose:
for a, b, c in dproduct(d, ('spam', 'eggs', 'beans')):
… at which point you might want to consider taking *args instead of args in the parameters—or, if you don't mind being hacky:
def dproduct(d, keys):
    return product(*(d[key] for key in keys.split()))

for a, b, c in dproduct(d, 'spam eggs beans'):
You can then go farther and write wrappers around your calls, too. For example, if you yield dicts of values instead of tuples of values, you can use those dicts the same way we used the original dict of lists:
def dproduct(d, keys):
    for vals in product(*(d[key] for key in keys)):
        yield dict(zip(keys, vals))

def call1(d, keys):
    return API_Call1(*(d[key] for key in keys))

def call2(d, keys):
    return API_Call2(*(d[key] for key in keys))

for vals in dproduct(d, 'abc'):
    call1(vals, 'a')
    call2(vals, 'bc')
Or, if your API_Call* functions can take keyword instead of positional arguments, you can make things a lot simpler and cleaner:
def dproduct(d, keys):
    for vals in product(*(d[key] for key in keys)):
        yield dict(zip(keys, vals))

def dselect(d, keys):
    return {key: d[key] for key in keys}

for vals in dproduct(d, 'abc'):
    API_Call1(**dselect(vals, 'ab'))
    API_Call2(**dselect(vals, 'c'))
If you can't use keyword arguments, and you have a lot of those API_Call* functions, you can go farther and generate the wrappers dynamically:
def apiwrap(apicall):
    @functools.wraps(apicall)
    def wrapper(d, keys):
        return apicall(*(d[key] for key in keys))
    return wrapper

apicall1 = apiwrap(API_Call1)
apicall2 = apiwrap(API_Call2)
# etc.
Although if you have lots of these, you probably want to stick them in a list or a dict in the first place…
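For instance, the wrappers could be collected in a dict keyed by function name (a sketch; API_Call1 and API_Call2 here are hypothetical stand-ins for the real API functions):

```python
def apiwrap(apicall):
    def wrapper(d, keys):
        return apicall(*(d[key] for key in keys))
    return wrapper

# Hypothetical stand-ins for the real API calls.
def API_Call1(a):
    return ('one', a)

def API_Call2(b, c):
    return ('two', b, c)

# Build all the wrappers in one place, keyed by name.
calls = {fn.__name__: apiwrap(fn) for fn in (API_Call1, API_Call2)}

vals = {'a': 1, 'b': 2, 'c': 3}
print(calls['API_Call1'](vals, 'a'))   # ('one', 1)
print(calls['API_Call2'](vals, 'bc'))  # ('two', 2, 3)
```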
If you want to get way too clever, you can even split the tuples up dynamically based on the signatures of the API functions:
from inspect import signature
from itertools import islice, product

def dispatch(d, keys, *calls):
    for vals in product(*(d[key] for key in keys)):
        it = iter(vals)
        for call in calls:
            args = islice(it, len(signature(call).parameters))
            call(*args)

dispatch(d, 'abc', API_Call1, API_Call2)
(If your function bodies are pretty minimal, you probably want to speed things up by doing argcounts = [len(signature(call).parameters) for call in calls] at the top of the function and then using zip(calls, argcounts), rather than using inspect each time in the inner loop.)
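A sketch of that precomputed variant (the API_Call* functions here are hypothetical stand-ins that just record their arguments):

```python
from inspect import signature
from itertools import islice, product

def dispatch(d, keys, *calls):
    # Compute each call's arity once, outside the product loop.
    argcounts = [len(signature(call).parameters) for call in calls]
    for vals in product(*(d[key] for key in keys)):
        it = iter(vals)
        for call, n in zip(calls, argcounts):
            call(*islice(it, n))

# Hypothetical stand-ins that record how they were called.
seen = []
def API_Call1(a):
    seen.append(('call1', a))
def API_Call2(b, c):
    seen.append(('call2', b, c))

d = {'a': [1], 'b': [2], 'c': [3]}
dispatch(d, 'abc', API_Call1, API_Call2)
print(seen)  # [('call1', 1), ('call2', 2, 3)]
```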
Anyway, without knowing more about your program, it's hard to say exactly what you can do, and what you should do—most of these ideas are not very Pythonic in general, even if they might be useful in some particular unusual case.
But either way, they should serve as examples of the kinds of things you can do pretty easily without having to get into horrible hacks involving locals and globals or getframe.
You can't easily insert variables into the local namespace (you probably can, but shouldn't without good cause). So use a dictionary to hold your named values.
I've used the operator.itemgetter function to grab the ones you want for the various API calls, and wrapped your product function in a generator.
from operator import itemgetter
from itertools import product

class DictCaller(dict):
    def __call__(self, fn, *keys):
        # itemgetter(*keys) returns a tuple of values for two or more keys
        fn(*itemgetter(*keys)(self))

def my_product(d, *keys):
    for xs in product(*itemgetter(*keys)(d)):
        yield DictCaller(zip(keys, xs))

for caller in my_product(d, *'abc'):
    caller(API_CALL, *'ab')
    caller(API_CALL1, *'bc')
How can I dynamically get the names and values of all arguments to a class method? (For debugging).
The following code works, but it would need to be repeated a few dozen times (one for each method). Is there a simpler, more Pythonic way to do this?
class Foo:
    def foo(self, a, b):
        myself = getattr(self, inspect.stack()[0][3])
        argnames = inspect.getfullargspec(myself).args[1:]
        d = {}
        for argname in argnames:
            d[argname] = locals()[argname]
        log.debug(d)
That's six lines of code for something that should be a lot simpler.
Sure, I can hardcode the debugging code separately for each method, but it seems easier to use copy/paste. Besides, it's way too easy to leave out an argument or two when hardcoding, which could make the debugging more confusing.
I would also prefer to assign local variables instead of accessing the values using a kwargs dict, because the rest of the code (not shown) could get clunky real fast, and is partially copied/pasted.
What is the simplest way to do this?
An alternative:
from collections import OrderedDict

class Foo:
    def foo(self, *args):
        argnames = 'a b'.split()
        kwargs = OrderedDict(zip(argnames, args))
        log.debug(kwargs)
        for argname, argval in kwargs.items():
            # caveat: assigning into locals() does not create real local
            # variables inside a function in CPython
            locals()[argname] = argval
This saves one line per method, but at the expense of IDE autocompete/intellisense when calling the method.
As wpercy wrote, you can reduce the last three lines to a single line using a dict comprehension. The caveat is that this only works in some versions of Python.
In Python 3, a dict comprehension has its own namespace, so locals() inside it wouldn't work. A workaround is to put the locals() call after the in:
from itertools import repeat

class Foo:
    def foo(self, a, b):
        myname = inspect.stack()[0][3]
        argnames = inspect.getfullargspec(getattr(self, myname)).args[1:]
        args = [(x, parent[x]) for x, parent in zip(argnames, repeat(locals()))]
        log.debug('{}: {!s}'.format(myname, args))
This saves two lines per method.
Is it possible to add values to a dictionary in a lambda expression?
That is, to implement a lambda that behaves like the method below.
def add_value(dict_x):
    dict_x['a+b'] = dict_x['a'] + dict_x['b']
    return dict_x
Technically, you may use a side effect to update it, exploiting the fact that the None returned from .update is falsy in order to return the dict via boolean operations:
add_value = lambda d: d.update({'a+b': d['a'] + d['b']}) or d
I just don't see any reason for doing this in real code, though, whether with a lambda or with a function like the one in your question.
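To see the trick in action (it relies on dict.update returning None, which is falsy, so or falls through to d):

```python
add_value = lambda d: d.update({'a+b': d['a'] + d['b']}) or d

result = add_value({'a': 1, 'b': 2})
print(result)  # {'a': 1, 'b': 2, 'a+b': 3}
```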
You could build a custom dict class, inheriting from dict and overriding its __setitem__ method. See the Python documentation.
class MyCustomDict(dict):
    def __setitem__(self, key, item):
        # your method here
        super().__setitem__(key, item)
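One way that override could look (a hypothetical SumDict; note that dict's constructor and update bypass __setitem__, so the keys must be assigned item by item):

```python
class SumDict(dict):
    """Keeps 'a+b' up to date whenever 'a' or 'b' is assigned."""
    def __setitem__(self, key, item):
        super().__setitem__(key, item)
        if key in ('a', 'b') and 'a' in self and 'b' in self:
            super().__setitem__('a+b', self['a'] + self['b'])

d = SumDict()
d['a'] = 1
d['b'] = 2
print(d['a+b'])  # 3
```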