Suppose I have some code like this:
def f(a, b):
    # some code (no return)

def g(c, d, e):
    # some code (no return)
Then I want to write a function "merge_functions" so that
h = merge_functions(f, g)
behaves the same as:
def h(a, b, c, d, e):
    f(a, b)
    g(c, d, e)
f and g could each have more or fewer parameters, and I want to keep the parameter names the same. None of the parameters has a default value.
I have tried:
from inspect import signature

getarg = lambda func: list(dict(signature(func).parameters).keys())
to_arg_form = lambda tup: ', '.join(tup)

def merge_functions(f, g):
    fl = getarg(f)
    gl = getarg(g)
    result = eval(f"lambda {to_arg_form(fl+gl)}: {f.__name__}({to_arg_form(fl)}) and False or {g.__name__}({to_arg_form(gl)})")
    return result
However, I can only use this function in the same file where f and g are defined, not from an imported module.
How can I write merge_functions so that it also works when used as a module?
You can try something like this, which creates a third function that you can use anywhere:
def f1(x1, x2, **args):
    print(f"{x1} {x2}")

def f2(x1, x3, x4, **args):
    print(f"{x1} {x3} {x4}")

def merge(f1, f2):
    return lambda **args: (f1(**args), f2(**args))

f3 = merge(f1, f2)
f3(x1=1, x2=2, x3=3, x4=4)
OK, so I can write a function that calls the functions passed to it, in order, each with the right number of parameters. I'm not sure that preserving the parameter names is possible in the general case: the functions you want to compose may share parameter names.
from inspect import signature

def merge_functions(*funcs):
    def composite(*args):
        new_args = args[:]
        for f in funcs:
            sigs = signature(f).parameters
            # give each function as many positional args as it declares
            fargs = new_args[:len(sigs)]
            new_args = new_args[len(sigs):]
            f(*fargs)
    return composite
def f(a, b):
    print(f'Inside f({a},{b})')

def g(c, d, e):
    print(f'Inside g({c},{d},{e})')

h = merge_functions(f, g)
h(1, 2, 3, 4, 5)
Output:
Inside f(1,2)
Inside g(3,4,5)
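If you also want the combined function to expose the original parameter names (the part the eval approach was trying to achieve) without using eval, one option is to attach a combined __signature__ to the wrapper. This is only a minimal sketch, assuming the parameter names of f and g do not collide; because it never refers to names by string, it behaves the same whether merge_functions lives in the calling file or in an imported module:
from inspect import Parameter, Signature, signature

def merge_functions(f, g):
    f_params = list(signature(f).parameters)
    g_params = list(signature(g).parameters)

    def h(*args):
        # split the positional arguments between f and g by arity
        f(*args[:len(f_params)])
        g(*args[len(f_params):])

    # expose the original parameter names to help() / inspect.signature()
    h.__signature__ = Signature(
        [Parameter(name, Parameter.POSITIONAL_OR_KEYWORD)
         for name in f_params + g_params]
    )
    return h
help(h) and inspect.signature(h) will then report (a, b, c, d, e), although the wrapper itself still only accepts positional arguments.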
I am trying to add some customized logic on top of an existing function. Here is an example:
# existing function that I cannot change
def sum(a, b, c, d):
    return a+b+c+d

# the function I want to build
def sumMultiply(a, b, c, d, multiplier):
    return multiplier * sum(a, b, c, d)
This is a contrived example, but essentially I want to build a new function that takes all the parameters of the existing function and adds a few new ones.
The above solution is problematic when the existing function changes its definition. For example:
# in some updates the original function dropped one parameter
def sum(a, b, c):
    return a+b+c

# the new function will give an error since there is no parameter "d"
def sumMultiply(a, b, c, d, multiplier):
    return multiplier * sum(a, b, c, d)  # error
How can I specify the new function so that I do not need to worry about changing the new function definition when the existing function definition changes?
One way would be to use arbitrary positional or keyword arguments:
def sumMultiply(multiplier, *numbers):
    return multiplier * sum(*numbers)

def sumMultiply(multiplier, *args, **kwargs):
    return multiplier * sum(*args, **kwargs)
However, if you find yourself passing the same set of data around, consider making a parameter object. In your case, it can simply be a list:
def sum(numbers):
    ...

def sumMultiply(multiplier, numbers):
    return multiplier * sum(numbers)
There are some additional downsides to using arbitrary arguments:
- the arguments are implicit: you might need to dig through several layers to see what you actually need to provide
- they don't play well with type annotations and other static analysers (e.g. PyCharm's refactorings)
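To make the parameter-object idea above a bit more concrete, here is a minimal sketch; the Numbers dataclass and its total() method are illustrative names, not part of the original code:
from dataclasses import dataclass

@dataclass
class Numbers:
    values: list

    def total(self):
        return sum(self.values)  # builtin sum over the stored list

def sumMultiply(multiplier, numbers):
    return multiplier * numbers.total()

print(sumMultiply(2, Numbers([1, 2, 3])))  # 12
If the set of numbers ever changes shape, only Numbers needs to change, not every function signature that passes it along.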
I would create a decorator-style wrapper function:
def create_fun_multiplier(fun, multiplier=1):
    def multiplier_fun(*args):
        return multiplier * fun(*args)
    return multiplier_fun

def my_sum(a, b, c):
    return a + b + c

sumMultiply = create_fun_multiplier(my_sum, multiplier=2)
print(sumMultiply(3, 4, 7))
I would look at using keyword arguments for this problem, e.g.:
def sum(a, b, c):
    return a + b + c

def sumMultiply(*args, multiplier=1):
    return multiplier * sum(*args)
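With this definition multiplier is keyword-only, so any number of positional arguments flows straight through to sum. A quick illustrative usage:
print(sumMultiply(1, 2, 3))                # 6
print(sumMultiply(1, 2, 3, multiplier=2))  # 12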
I want to do something like this:
lst = []
lst.append( func1(a, b, c, d) )  # do NOT execute function here/now
lst.append( func2(e, f) )        # do NOT execute function here/now
lst.append( func3(w, x, y) )     # do NOT execute function here/now

# ... later
for f in lst:
    result = f  # execute function here/now
What is the simplest way to accomplish this in Python?
You can store each function together with a tuple of its positional arguments, and use the func(*args) syntax to call it later:
lst = []
lst.append( (func1, a, b, c, d) )
lst.append( (func2, e, f) )
lst.append( (func3, w, x, y) )

for func, *args in lst:
    result = func(*args)
Alternatively, just wrap each function call in a lambda:
lst = []
lst.append(lambda: func1(a, b, c, d))
lst.append(lambda: func2(e, f))
lst.append(lambda: func3(w, x, y))

for f in lst:
    result = f()
The second version is subject to some subtle bugs, because the argument names (e.g. a) are not evaluated when the lambda is created; they are evaluated when the lambda is called. This means that if a changes between the append and the call, the two solutions give different results, and the first one is probably what you intended. So I'd recommend avoiding the second one unless the arguments are just constants.
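To make the late-binding pitfall concrete, here is a small illustration (func1 is just a stand-in here); the a=a default-argument trick is a common way to capture the current value at creation time:
def func1(x):
    return x * 10

a = 1
late = lambda: func1(a)       # looks up a when called
early = lambda a=a: func1(a)  # captures the value a has right now

a = 5
print(late())   # 50 -- uses the new value of a
print(early())  # 10 -- uses the value a had when the lambda was created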
Try this:
import functools

lst.append( functools.partial(func1, a, b, c, d) )
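Like the tuple-based approach, functools.partial freezes the argument values at the moment the partial is created, so it avoids the late-binding issue described above. A small sketch (func1 is again a placeholder):
import functools

def func1(x):
    return x * 10

a = 1
p = functools.partial(func1, a)  # the current value of a (1) is stored in p.args
a = 5
print(p())  # 10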
I am looking for a clear and efficient way to change a parameter value that has already been set in a functools.partial.
Let's see a simple example:
from functools import partial

def fn(a, b, c, d, e):
    print(a, b, c, d, e)

fn12 = partial(fn, 1, 2)
Later, I want to have something like:
fn12[0] = 7
to replace the value at a specific position without creating a new partial, because the code there is pretty heavy.
Addition: I am asking about the general possibility of changing a value stored in a partial.
A naive implementation would look like this:
def printme(a, b, c, d, e):
    print(a, b, c, d, e)

class my_partial:
    def __init__(self, fn, *args):
        self.__func__ = fn
        self.args = list(args)

    def __call__(self, *next_args):
        call = self.args + list(next_args)
        return self.__func__(*call)

fn12 = my_partial(printme, 1, 2)
fn12(3, 4, 5)
fn12.args[1] = 7
fn12(3, 4, 5)
I need this, for example, for widgets, where the action callback is bound like:
rb.config(command=partial(...))
but later I'd like to change some of the parameters given to the partial. I could simply build a new partial each time, but that looks kind of messy.
If it is permissible to look into the implementation of partial, then using __reduce__ and __setstate__ you can replace the args wholesale:
from functools import partial

def fn(a, b, c, d, e):
    print(a, b, c, d, e)

fn12 = partial(fn, 1, 2)

def replace_args(part, new_args):
    _, _, state = part.__reduce__()
    f, _, k, n = state
    part.__setstate__((f, new_args, k, n))

fn12('c', 'd', 'e')
replace_args(fn12, (7, 2))
fn12('c', 'd', 'e')
Output:
1 2 c d e
7 2 c d e
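If you would rather not rely on partial's pickling protocol, the documented func, args and keywords attributes are enough to rebuild an equivalent partial with one value swapped out. A sketch (note that this does create a new partial object, which the question was hoping to avoid; with_arg_replaced is just an illustrative name):
from functools import partial

def with_arg_replaced(part, index, value):
    new_args = list(part.args)
    new_args[index] = value
    return partial(part.func, *new_args, **(part.keywords or {}))

fn12 = with_arg_replaced(fn12, 0, 7)
fn12('c', 'd', 'e')  # 7 2 c d e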
You can override a partial's keyword parameters when you call it. For example, if you have a function like this:
from functools import partial

def f(a, b):
    return a * b

func = partial(f, b=2)
func(1)  # result: 1*2=2
Now, you can override the partial's b parameter like this:
func(1, b=7)  # result: 1*7=7
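This only works for arguments the partial stores as keywords. Arguments frozen positionally cannot be overridden at call time; a small sketch of the difference:
from functools import partial

def f(a, b):
    return a * b

by_kw = partial(f, b=2)
by_pos = partial(f, 2)  # freezes a=2 positionally

print(by_kw(3, b=7))  # 21 -- the keyword can be overridden
print(by_pos(3))      # 6  -- the 3 becomes b; the frozen 2 stays bound to a
# by_pos(3, a=7)      # would raise TypeError: got multiple values for 'a'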
I have defined three functions.
def evaluate1(a, b):
    pass

def evaluate2(a, b):
    pass

def evaluate3(a, b, c):
    pass
What I want to do is use a function pointer to record which evaluate function to use, depending on the test inputs. The logic looks like this:
def test(a, b, c, d):
    # let evaluate_function record which evaluate function I will use
    if c > 1:
        evaluate_function = evaluate3  # not sure
    else:
        if d:
            evaluate_function = evaluate1
        else:
            evaluate_function = evaluate2
    # execute the evaluate function
    evaluate_function(a, b, ?)
However, evaluate3 takes different arguments than evaluate1 and evaluate2. How should I handle this? Thanks!
You have come up with a good idea in using a 'function pointer' to select the function. But since you know which function you are selecting at that point, you can also bind up the params:
def test(a, b, c, d):
    # let evaluate_function record which evaluate function I will use
    if c > 1:
        evaluate_function = evaluate3  # not sure
        params = a, b, d
    else:
        if d:
            evaluate_function = evaluate1
            params = a, b
        else:
            evaluate_function = evaluate2
            params = a, c
    # execute the evaluate function
    evaluate_function(*params)
I'll leave it to you to properly select the params.
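An equivalent way to bind the arguments at selection time is functools.partial; this is only a sketch, and the particular parameter choices are as much a placeholder as in the code above:
from functools import partial

def test(a, b, c, d):
    if c > 1:
        evaluate = partial(evaluate3, a, b, c)
    elif d:
        evaluate = partial(evaluate1, a, b)
    else:
        evaluate = partial(evaluate2, a, b)
    # execute the selected evaluate function with its bound arguments
    return evaluate()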
Why not just call the evaluate functions directly instead of assigning them to a variable first? It makes the code more readable:
def evaluate1(a, b):
    print('evaluate1')

def evaluate2(a, b):
    print('evaluate2')

def evaluate3(a, b, c):
    print('evaluate3')

def test(a, b, c=None, d=None):
    # decide which evaluate function to use
    if c and c > 1:
        evaluate3(a, b, c)
    else:
        if d:
            evaluate1(a, b)
        else:
            evaluate2(a, c)
test(1,2,c=0.1,d=1)
#evaluate1
test(1,2)
#evaluate2
test(1,2,3)
#evaluate3
I have a list of functions that I want to execute, like in a pipeline.
pipeline = [f1, f2, f3, f4]
(Basically I compose the functions in the list; see: https://mathieularose.com/function-composition-in-python/):
import functools

def run(pipeline):
    return functools.reduce(lambda f, g: lambda x: f(g(x)), pipeline, lambda x: x)
I want to observe when each function is called, when it finishes, whether it fails, etc., and log that to a file, a database, MongoDB, and so on. Each of these functions returns some Python object, so I also want to use the return value and log its attributes, e.g. if f1 returns a list I want to log "f1 was completed at 23:00 on 04/22/2018. It returned a list of length 5", and so on.
My problem is not about executing the functions; it is about observing their behaviour. I want the functions to stay agnostic to how I code the pipeline.
I was wondering how to implement the observer pattern here. I know that "observer pattern" sounds very object-oriented, so my first idea was to use decorators, but while searching for a guide on this I found RxPy.
So, I am looking for guidance on how to solve this problem.
If you like decorators, you can use the following approach:
import time

def log_finish_time(f):
    def g(*args):
        ret = f(*args)
        print('{} completed on {}'.format(f, time.time()))
        return ret
    return g

def log_return_value(f):
    def g(*args):
        ret = f(*args)
        print('{} returned {}'.format(f, ret))
        return ret
    return g

@log_finish_time
@log_return_value
def f1(x):
    return x+1

@log_finish_time
@log_return_value
def f2(x):
    return x*x
without changing your composition code.
If you want your decorators to take arguments, you have to add another level of nesting (basically a function which returns a decorator):
def log_finish_time(log_prefix):
    def h(f):
        def g(*args):
            ret = f(*args)
            print('{}: {} completed on {}'.format(log_prefix, f, time.time()))
            return ret
        return g
    return h

@log_finish_time('A')
def f1(x):
    return x+1
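Since a decorator is just a callable that takes a function and returns a wrapped one, you can also apply it to functions that are already defined, which is handy when the pipeline stages live elsewhere. A sketch, assuming plain undecorated stage functions (stage1 and stage2 are placeholder names) and the unparameterized decorators from the first snippet:
def stage1(x):
    return x + 1

def stage2(x):
    return x * x

# wrap every stage so calls are logged, without editing the functions themselves
pipeline = [log_finish_time(log_return_value(f)) for f in [stage1, stage2]]

x = 10
for f in pipeline:
    x = f(x)  # each call prints the return-value and finish-time log lines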
Something like:
import time

def run(pipeline):
    def composed(x):
        for f in pipeline:
            y = f(x)
            print('{} completed on {}. it returned {}'.format(f, time.time(), y))
            x = y
        return x
    return composed
The observer pattern does not fit well with a functional style.
Edit: callback based:
def run(pipeline, callback):
    def composed(x):
        for f in pipeline:
            y = f(x)
            callback(f, time.time(), y)
            x = y
        return x
    return composed
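A quick usage sketch for the callback version; the log_step callback and the stage functions are just placeholders:
import time

def inc(x):
    return x + 1

def square(x):
    return x * x

def log_step(func, finished_at, result):
    print('{} completed on {}. It returned {!r}'.format(func.__name__, finished_at, result))

composed = run([inc, square], log_step)
print(composed(3))  # logs each step, then prints 16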