How to optimize these Python functions?

"""
I have three python functions:a(), b(), c(), which have almost the same process flow.The difference place is just 'x', 'y', 'z', the three are just a part of a function name or a variable name.
How can I write these functions friendly and elegantly?
"""
def a():
    do_x()
    some = var1['x_id']
    var2['x_result'] = x_process()
    ...

def b():
    do_y()
    some = var1['y_id']
    var2['y_result'] = y_process()
    ...

def c():
    do_z()
    some = var1['z_id']
    var2['z_result'] = z_process()
    ...

These three functions are basically identical, except for which functions they call, and which map indices they use. So you can just pass the called functions and map indices as arguments to a unified function:
def do_something(do_fun, id1, id2, process_fun):
    do_fun()
    some = var1[id1]
    var2[id2] = process_fun()

do_something(do_x, 'x_id', 'x_result', x_process)

The best way is not just to rewrite those functions, but to restructure the various items they make use of, by changing them from variables with x, y, or z in the names to items stored in a structure that maps the strings "x", "y", and "z" to the appropriate stuff. Something like this:
do_funcs = {'x': do_x, 'y': do_y, 'z': do_z}
# make ids whatever is in var1,
# but with the keys as "x", "y", "z" instead of "x_id", "y_id", "z_id"
ids = {}
# make results whatever is in var2,
# but with the keys as "x", "y", "z" instead of "x_result", "y_result", "z_result"
results = {}
processes = {'x': x_process, 'y': y_process, 'z': z_process}
Then you can write one function:
def do_stuff(which):
    do_funcs[which]()
    some = ids[which]
    results[which] = processes[which]()
And then you call do_stuff('x'), do_stuff('y'), or do_stuff('z').
That is just a sketch, because exactly how to do it best depends on where those other things are defined and how they're used. The basic idea, though, is to not use parts of variable names as parameters. If you find yourself with a bunch of things called x_blah, y_blah, and z_blah, or even dict keys like "x_id" and "y_id", you should try to restructure things so that they are data structures directly parameterized by a single value (e.g., a string "x", "y", or "z").
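A minimal runnable sketch of that restructuring, with hypothetical stand-ins for do_x(), x_process(), var1, and var2 (which aren't shown in the question):

```python
# Hypothetical stand-ins for the question's do_x()/x_process() etc.
def do_x(): pass
def do_y(): pass
def x_process(): return 'x done'
def y_process(): return 'y done'

# One registry per kind of item, keyed by a single string,
# instead of mangled names like x_process or keys like 'x_id'.
do_funcs = {'x': do_x, 'y': do_y}
processes = {'x': x_process, 'y': y_process}
ids = {'x': 101, 'y': 102}   # stands in for var1['x_id'] etc.
results = {}                 # stands in for var2

def do_stuff(which):
    do_funcs[which]()
    some = ids[which]
    results[which] = processes[which]()
    return some

do_stuff('x')
do_stuff('y')
print(results)  # {'x': 'x done', 'y': 'y done'}
```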

You can use lambda functions:
def a():
    print('a')

def b():
    print('b')

def c():
    print('c')

def a_process():
    print('a')

def b_process():
    print('b')

def c_process():
    print('c')

def func(x):
    do = {'a': lambda: a(), 'b': lambda: b(), 'c': lambda: c()}
    do[x]()
    some = var1['{}_id'.format(x)]
    process = {'a': lambda: a_process(), 'b': lambda: b_process(), 'c': lambda: c_process()}
    var2['{}_result'.format(x)] = process[x]()

func('c')
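As a side note (not in the original answer): the lambdas above only forward to existing functions, and functions are first-class objects in Python, so the dict can hold the function objects directly:

```python
def a_process(): return 'a'
def b_process(): return 'b'

# No lambda wrapper needed; store the functions themselves.
process = {'a': a_process, 'b': b_process}
print(process['b']())  # b
```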


Pass dictionary directly as arguments of function [duplicate]

Let's imagine I have a dict:
d = {'a': 3, 'b': 4}
I want to create a function f that does the exact same thing as this function:
def f(x, a=d['a'], b=d['b']):
    print(x, a, b)
(Not necessarily print, but do some stuff with the variables, accessing them directly by name.)
But I would like to create this function directly from the dict. That is to say, I would like to have something that looks like
def f(x, **d=d):
    print(x, a, b)
and that behaves like the previously defined function. The idea is that I have a large dictionary that contains default values for arguments of my function, and I would like not to have to do
def f(a=d['a'], b=d['b'] ...)
I don't know if it's possible at all in Python. Any insight is appreciated!
Edit: The idea is to be able to call f(5, a=3).
Edit 2: The question is not about passing arguments stored in a dict to a function, but about defining a function whose argument names and default values are stored in a dict.
You cannot achieve this at function definition because Python determines the scope of a function statically. However, it is possible to write a decorator that adds in default keyword arguments.
from functools import wraps

def kwargs_decorator(dict_kwargs):
    def wrapper(f):
        @wraps(f)
        def inner_wrapper(*args, **kwargs):
            new_kwargs = {**dict_kwargs, **kwargs}
            return f(*args, **new_kwargs)
        return inner_wrapper
    return wrapper
Usage
@kwargs_decorator({'bar': 1})
def foo(**kwargs):
    print(kwargs['bar'])

foo()  # prints 1
Or alternatively if you know the variable names but not their default values...
@kwargs_decorator({'bar': 1})
def foo(bar):
    print(bar)

foo()  # prints 1
Caveat
The above can be used, for example, to dynamically generate multiple functions with different default arguments. However, if the parameters you want to pass are the same for every function, it would be simpler and more idiomatic to simply pass in a dict of parameters.
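Putting the decorator and its usage together as one runnable sketch (returning values instead of printing, so the behavior is easy to check):

```python
from functools import wraps

def kwargs_decorator(dict_kwargs):
    def wrapper(f):
        @wraps(f)
        def inner_wrapper(*args, **kwargs):
            # Caller-supplied keywords override the injected defaults.
            new_kwargs = {**dict_kwargs, **kwargs}
            return f(*args, **new_kwargs)
        return inner_wrapper
    return wrapper

@kwargs_decorator({'bar': 1})
def foo(bar):
    return bar

print(foo())       # 1
print(foo(bar=5))  # 5
```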
Python is designed such that the local variables of any function can be determined unambiguously by looking at the source code of the function. So your proposed syntax
def f(x, **d=d):
print(x, a, b)
is a nonstarter because there's nothing that indicates whether a and b are local to f or not; it depends on the runtime value of the dictionary, whose value could change across runs.
If you can resign yourself to explicitly listing the names of all of your parameters, you can automatically set their default values at runtime; this has already been well covered in other answers. Listing the parameter names is probably good documentation anyway.
If you really want to synthesize the whole parameter list at run time from the contents of d, you would have to build a string representation of the function definition and pass it to exec. This is how collections.namedtuple works, for example.
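A sketch of that exec approach, assuming simple repr-able default values (illustration only; exec-built functions are hard to debug, and the body here is hard-coded to the question's a/b example):

```python
d = {'a': 3, 'b': 4}

# Build the parameter list "a=3, b=4" from the dict and exec a def.
params = ", ".join("{}={!r}".format(k, v) for k, v in d.items())
src = "def f(x, {}):\n    return (x, a, b)\n".format(params)
namespace = {}
exec(src, namespace)
f = namespace['f']

print(f(5))       # (5, 3, 4)
print(f(5, a=1))  # (5, 1, 4)
```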
Variables in module and class scopes are looked up dynamically, so this is technically valid:
def f(x, **kwargs):
    class C:
        vars().update(kwargs)  # don't do this, please
        print(x, a, b)
But please don't do it except in an IOPCC entry.
try this:
# Store the default values in a dictionary
defaults = {
    'a': 1,
    'b': 2,
}

def f(x, **kwa):
    # Each time the function is called, merge the default values
    # and the provided arguments (Python >= 3.5):
    args = {**defaults, **kwa}
    # For Python < 3.5: copy the defaults, then merge in the
    # provided arguments:
    # args = defaults.copy()
    # args.update(kwa)
    print(args)

f(1, f=2)       # {'a': 1, 'b': 2, 'f': 2}
f(1, f=2, b=8)  # {'a': 1, 'b': 8, 'f': 2}
f(5, a=3)       # {'a': 3, 'b': 2}
Thanks Olvin Roght for pointing out how to nicely merge dictionaries in Python >= 3.5.
How about the **kwargs trick?
def function(arg0, **kwargs):
    print("arg is", arg0, "a is", kwargs["a"], "b is", kwargs["b"])

d = {"a": 1, "b": 2}
function(0., **d)
outcome:
arg is 0.0 a is 1 b is 2
This question is very interesting, and it seems different people have their own guess about what the question really wants.
I have my own too. Here is my code, which expresses it:
# Python 3 only
from collections import defaultdict

# runs only once, when the function definition is executed
def kwdefault_decorator(default_dict):
    def wrapper(f):
        f.__kwdefaults__ = {}
        f_code = f.__code__
        po_arg_count = f_code.co_argcount
        keys = f_code.co_varnames[po_arg_count : po_arg_count + f_code.co_kwonlyargcount]
        for k in keys:
            f.__kwdefaults__[k] = default_dict[k]
        return f
    return wrapper

default_dict = defaultdict(lambda: "default_value")
default_dict["a"] = "a"
default_dict["m"] = "m"

@kwdefault_decorator(default_dict)
def foo(x, *, a, b):
    foo_local = "foo"
    print(x, a, b, foo_local)

@kwdefault_decorator(default_dict)
def bar(x, *, m, n):
    bar_local = "bar"
    print(x, m, n, bar_local)

foo(1)
bar(1)
# a, b, m, n may only be passed as keyword arguments
foo(1, a=100, b=100)
bar(1, m=100, n=100)
output:
1 a default_value foo
1 m default_value bar
1 100 100 foo
1 100 100 bar
Posting this as an answer because it would be too long for a comment.
Be careful with this answer. If you try
@kwargs_decorator(a='a', b='b')
def f(x, a, b):
    print(f'x = {x}')
    print(f'a = {a}')
    print(f'b = {b}')

f(1, 2)
it will raise an error:
TypeError: f() got multiple values for argument 'a'
because you are passing a as a positional argument (equal to 2) while the decorator also injects it as a keyword argument.
I implemented a workaround, even though I'm not sure if this is the best solution:
def default_kwargs(**default):
    from functools import wraps

    def decorator(f):
        @wraps(f)
        def wrapper(*args, **kwargs):
            from inspect import getfullargspec
            f_args = getfullargspec(f)[0]
            used_args = f_args[:len(args)]
            final_kwargs = {
                key: value
                for key, value in {**default, **kwargs}.items()
                if key not in used_args
            }
            return f(*args, **final_kwargs)
        return wrapper
    return decorator
In this solution, f_args is a list containing the names of all named positional arguments of f. Then used_args is the list of all parameters that have effectively been passed as positional arguments. Therefore final_kwargs is defined almost exactly like before, except that it checks if the argument (in the case above, a) was already passed as a positional argument.
For instance, this solution works beautifully with functions such as the following.
@default_kwargs(a='a', b='b', d='d')
def f(x, a, b, *args, c='c', d='not d', **kwargs):
    print(f'x = {x}')
    print(f'a = {a}')
    print(f'b = {b}')
    for idx, arg in enumerate(args):
        print(f'arg{idx} = {arg}')
    print(f'c = {c}')
    for key, value in kwargs.items():
        print(f'{key} = {value}')

f(1)
f(1, 2)
f(1, b=3)
f(1, 2, 3, 4)
f(1, 2, 3, 4, 5, c=6, g=7)
Note also that the default values passed in default_kwargs have higher precedence than the ones defined in f. For example, the default value for d in this case is actually 'd' (defined in default_kwargs), and not 'not d' (defined in f).
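The positional-override behavior can be checked with a compact version of the same workaround that returns its arguments instead of printing them:

```python
from functools import wraps
from inspect import getfullargspec

def default_kwargs(**default):
    def decorator(f):
        @wraps(f)
        def wrapper(*args, **kwargs):
            f_args = getfullargspec(f)[0]
            used_args = f_args[:len(args)]  # names already bound positionally
            final_kwargs = {k: v for k, v in {**default, **kwargs}.items()
                            if k not in used_args}
            return f(*args, **final_kwargs)
        return wrapper
    return decorator

@default_kwargs(a='a', b='b')
def g(x, a, b):
    return (x, a, b)

print(g(1))       # (1, 'a', 'b')
print(g(1, 2))    # (1, 2, 'b') - no TypeError: the positional a wins
print(g(1, b=3))  # (1, 'a', 3)
```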
You can unpack the values of the dict:
from collections import OrderedDict

def f(x, a, b):
    print(x, a, b)

d = OrderedDict({'a': 3, 'b': 4})
f(10, *d.values())
UPD.
Yes, it's possible to implement this mad idea of modifying the local scope by creating a decorator which returns a class with an overridden __call__() and stores your defaults in the class scope, BUT IT'S MASSIVE OVERKILL.
Your problem is that you're trying to hide problems in your architecture behind these tricks. If you store your default values in a dict, then access them by key. If you want to use keywords, define a class.
P.S. I still don't understand why this question collected so many upvotes.
Sure...
hope this helps
def funcc(x, **kwargs):
    locals().update(kwargs)
    print(x, a, b, c, d)

kwargs = {'a': 1, 'b': 2, 'c': 1, 'd': 1}
x = 1
funcc(x, **kwargs)
(Note: in CPython, updating locals() inside a function does not actually create local variables, so this raises a NameError.)


python function dictionary with variable number of parameters

I am trying to create a dictionary which maps strings to functions. The problem is that the functions can have different numbers of parameters. Is there a way to handle this?
For example:
myFuncDict = {
    'A': a,  # def a(param1, param2)
    'B': b,  # def b(param1)
    'C': c,  # def c(param1, param2, param3)
}
I want to call the functions like:
def test(k):
    myFuncDict[k](params)
How can I achieve this?
*args or **kwargs is one way to go, but I am not sure how to handle the above using those without sending extra parameters.
Python actually makes this quite simple. You can simply unpack your container of arguments into the function call using the unpacking operator *. Here is an example:
def a(x):
    print(x)

def b(x, y):
    print(x, y)

dic = {'a': a, 'b': b}

def call_func(func, params):
    dic[func](*params)  # *params is the magic.

call_func('a', (1,))  # note the comma: (1) is just the int 1, not a tuple
call_func('b', (1, 2))
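A variant (not from the original answer) that forwards both positional and keyword arguments, so callers don't need to pre-pack a tuple:

```python
def a(x):
    return x

def b(x, y=0):
    return x + y

dispatch = {'a': a, 'b': b}

def call_func(name, *args, **kwargs):
    # Forward whatever the caller provides to the chosen function.
    return dispatch[name](*args, **kwargs)

print(call_func('a', 1))       # 1
print(call_func('b', 1, y=2))  # 3
```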

How to reset some function variables in Python in a loop

I have a script where I have to change some functions and reset the changes I made to them. I currently do it like this:
def a():
    pass

def b():
    pass

def c():
    pass

def d():
    pass

previous_a = a
previous_b = b
previous_c = c

a = d
b = d
c = d

# I want to make the following code block shorter.
a = previous_a
b = previous_b
c = previous_c
Instead of enumerating all the functions to reset, I would like to have a loop that iterates over a data structure (a dictionary, perhaps) and resets the function variables to their previous values. In the previous example, the current approach is fine for 3 functions, but doing that for 15+ functions would produce a big chunk of code that I would like to reduce.
Unfortunately, I have been unable to find a viable solution. I thought of weakrefs, but my experiments with them failed.
Just store the old functions in a dictionary:
old = {'a': a, 'b': b, 'c': c}
then use the globals() dictionary to restore them:
globals().update(old)
This only works if a, b and c were globals to begin with.
You can use the same trick to assign d to all those names:
globals().update(dict.fromkeys(old.keys(), d))
This sets the keys a, b and c to the same value d.
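A runnable sketch of the whole save/patch/restore cycle (module-level functions only, as noted above):

```python
def a(): return 'a'
def b(): return 'b'
def d(): return 'd'

old = {'a': a, 'b': b}                   # save the current bindings
globals().update(dict.fromkeys(old, d))  # point both names at d
patched = (a(), b())
globals().update(old)                    # restore the originals
restored = (a(), b())
print(patched, restored)  # ('d', 'd') ('a', 'b')
```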
Function definitions are stored in the "global" scope of the module where they are declared. The global scope is a dictionary. As such, you could access/modify its values by key.
See this example:
>>> def a():
...     print("a")
...
>>> def b():
...     print("b")
...
>>> def x():
...     print("x")
...
>>> for i in ('a', 'b'):
...     globals()[i] = x
...
>>> a()
x

Handling names and values of attributes

I am probably approaching this wrong, but would appreciate being straightened out.
I would like to be able to use both the values and the names of some attributes of a class.
Sample:
class DoStuff(object):
    def __init__(self):
        self.a = "Alpha"
        self.b = "Beta"
        self.c = "Gamma"
    def printStuff(self):
        for thing in [self.a, self.b, self.c]:
            print(NAMEOFTHING, thing)
What I want is:
a Alpha
b Beta
c Gamma
How can I get that?
Edit: Some confusion because my example showed me printing ALL the values. Instead I want this:
a Alpha
c Gamma
with the list in my print method containing just 'a' and 'c'.
The way your class and for loop are set up, there is nothing you can put in place of NAMEOFTHING to get to the names of those variables. Here are a few alternatives on how you can modify your approach:
Use a dictionary instead of individual attributes, and then provide a list of keys in your for loop:
class DoStuff(object):
    def __init__(self):
        self.names = {"a": "Alpha",
                      "b": "Beta",
                      "c": "Gamma"}
    def printStuff(self):
        for name in ['a', 'b', 'c']:
            print(name, self.names[name])
Use the attribute names in your list and then use getattr():
class DoStuff(object):
    def __init__(self):
        self.a = "Alpha"
        self.b = "Beta"
        self.c = "Gamma"
    def printStuff(self):
        for name in ['a', 'b', 'c']:
            print(name, getattr(self, name))
The closest you could get is:
for thing in ['a', 'b', 'c']:
    print(thing, getattr(self, thing))
Variables can have multiple names and aren't aware of their own name, so if you know it's 'a', then you can use getattr to resolve the lookup.
Another option (although not greatly different from the above):
to_get = ['a', 'b', 'c']
from operator import attrgetter
blah = zip(to_get, attrgetter(*to_get)(self))
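A self-contained version of the attrgetter variant, using the question's DoStuff class and the reduced ['a', 'c'] list:

```python
from operator import attrgetter

class DoStuff(object):
    def __init__(self):
        self.a = "Alpha"
        self.b = "Beta"
        self.c = "Gamma"

to_get = ['a', 'c']
obj = DoStuff()
# attrgetter('a', 'c')(obj) returns the tuple ('Alpha', 'Gamma').
pairs = list(zip(to_get, attrgetter(*to_get)(obj)))
print(pairs)  # [('a', 'Alpha'), ('c', 'Gamma')]
```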
Following on Jon's answer, you might also find it helpful to set the list of attributes you want to include in the output as an optional argument:
def printStuff(self, included=['a', 'c']):
    for thing in included:
        print(thing, getattr(self, thing))
which makes it easy to generate both outputs, by saying DoStuff().printStuff() to get just the values of a and c, or DoStuff().printStuff(['a', 'b', 'c']) to get all three. Of course, this allows for varying output; if it's an explicit design goal that the set of fields being printed is invariant, this would be counterproductive.
# You can use __dict__
class x:
    def __init__(self):
        self.a = 1
        self.b = 2
        self.c = 3
        self.d = 4
    def prnt(self):
        limit = ("b", "c")
        return {k: v for (k, v) in self.__dict__.items() if k in limit}

r = x()
print(r.prnt())
# {'b': 2, 'c': 3}

# __dict__ can also be used outside the class:
limit = ("b", "c")
print({k: v for (k, v) in r.__dict__.items() if k in limit})
