How can I get a list of all the names that point to a Python object?
from example import my_function

a = my_function
b = my_function

get_names(my_function)  # desired: ['a', 'b']
Edit: The goal is to help figure out how to monkey-patch an object that is loaded in an unknown way.
Search the global namespace for objects matching via identity, and report the keys (names).
def my_func():
    pass

a = my_func
b = my_func

def get_names(x):
    for k, v in globals().items():
        if v is x:
            yield k

print(list(get_names(my_func)))  # prints ['my_func', 'a', 'b']
See below (use globals() and make sure you do not return the function's own name)
from example import my_function

def get_names(func):
    result = []
    for k, v in globals().items():
        # str(v) looks like "<function my_function at 0x...>", so the
        # split() check filters out the function's own name
        if v == func and k not in str(v).split():
            result.append(k)
    return result

def foo():
    pass

a = my_function
b = my_function
c = foo

print(get_names(my_function))
example.py

def my_function():
    pass

output

['a', 'b']
Let's imagine I have a dict:

d = {'a': 3, 'b': 4}

I want to create a function f that does exactly the same thing as this function:

def f(x, a=d['a'], b=d['b']):
    print(x, a, b)
(Not necessarily print, but do some work with the variables, referring to them directly by name).
But I would like to create this function directly from the dict; that is, I would like to have something that looks like

def f(x, **d=d):
    print(x, a, b)
and that behaves like the previously defined function. The idea is that I have a large dictionary containing default values for the arguments of my function, and I would like not to have to write

def f(a=d['a'], b=d['b'], ...):
I don't know if it's possible at all in Python. Any insight is appreciated!
Edit: The idea is to be able to call f(5, a=3).
Edit 2: The question is not about passing arguments stored in a dict to a function, but about defining a function whose argument names and default values are stored in a dict.
You cannot achieve this at function definition because Python determines the scope of a function statically. However, it is possible to write a decorator that adds default keyword arguments.
from functools import wraps

def kwargs_decorator(dict_kwargs):
    def wrapper(f):
        @wraps(f)
        def inner_wrapper(*args, **kwargs):
            # explicitly passed keyword arguments override the defaults
            new_kwargs = {**dict_kwargs, **kwargs}
            return f(*args, **new_kwargs)
        return inner_wrapper
    return wrapper
Usage

@kwargs_decorator({'bar': 1})
def foo(**kwargs):
    print(kwargs['bar'])

foo()  # prints 1
Or alternatively, if you know the variable names but not their default values...

@kwargs_decorator({'bar': 1})
def foo(bar):
    print(bar)

foo()  # prints 1
Caveat
The above can be used, for example, to dynamically generate multiple functions with different default arguments. However, if the parameters you want to pass are the same for every function, it would be simpler and more idiomatic to simply pass in a dict of parameters.
Python is designed such that the local variables of any function can be determined unambiguously by looking at the source code of the function. So your proposed syntax
def f(x, **d=d):
    print(x, a, b)
is a nonstarter because there's nothing that indicates whether a and b are local to f or not; it depends on the runtime value of the dictionary, whose value could change across runs.
If you can resign yourself to explicitly listing the names of all of your parameters, you can automatically set their default values at runtime; this has already been well covered in other answers. Listing the parameter names is probably good documentation anyway.
If you really want to synthesize the whole parameter list at run time from the contents of d, you would have to build a string representation of the function definition and pass it to exec. This is how collections.namedtuple works, for example.
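As a rough illustration of the exec approach, here is a minimal sketch; the helper name `make_func` is invented for this example, and the body still hard-codes the names `a` and `b` as in the question:

```python
d = {'a': 3, 'b': 4}

def make_func(defaults):
    # Build the parameter list string, e.g. "a=3, b=4", from the dict
    params = ", ".join(f"{k}={v!r}" for k, v in defaults.items())
    src = f"def f(x, {params}):\n    print(x, a, b)\n"
    ns = {}
    exec(src, ns)  # compile the synthesized definition
    return ns["f"]

f = make_func(d)
f(5, a=3)  # prints: 5 3 4
```

This is essentially what collections.namedtuple does internally: it formats a class definition string from the field names and passes it to exec.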
Variables in module and class scopes are looked up dynamically, so this is technically valid:
def f(x, **kwargs):
    class C:
        vars().update(kwargs)  # don't do this, please
        print(x, a, b)
But please don't do it except in an IOPCC entry.
try this:

# Store the default values in a dictionary
defaults = {
    'a': 1,
    'b': 2,
}

def f(x, **kwa):
    # Each time the function is called, merge the default values and
    # the provided arguments (Python >= 3.5):
    args = {**defaults, **kwa}
    # For Python < 3.5, copy the defaults and merge the provided
    # arguments into the copy instead:
    # args = defaults.copy()
    # args.update(kwa)
    print(args)

f(1, f=2)       # {'a': 1, 'b': 2, 'f': 2}
f(1, f=2, b=8)  # {'a': 1, 'b': 8, 'f': 2}
f(5, a=3)       # {'a': 3, 'b': 2}
Thanks Olvin Roght for pointing out how to nicely merge dictionaries in python >= 3.5
How about the **kwargs trick?

def function(arg0, **kwargs):
    print("arg is", arg0, "a is", kwargs["a"], "b is", kwargs["b"])

d = {"a": 1, "b": 2}
function(0., **d)
outcome:
arg is 0.0 a is 1 b is 2
This question is very interesting, and different people seem to have their own guess about what it is really asking.
I have my own too. Here is my code, which expresses my interpretation:
# Python 3 only
from collections import defaultdict

# runs only once, when the function definition is executed
def kwdefault_decorator(default_dict):
    def wrapper(f):
        f.__kwdefaults__ = {}
        f_code = f.__code__
        po_arg_count = f_code.co_argcount
        keys = f_code.co_varnames[po_arg_count : po_arg_count + f_code.co_kwonlyargcount]
        for k in keys:
            f.__kwdefaults__[k] = default_dict[k]
        return f
    return wrapper

default_dict = defaultdict(lambda: "default_value")
default_dict["a"] = "a"
default_dict["m"] = "m"

@kwdefault_decorator(default_dict)
def foo(x, *, a, b):
    foo_local = "foo"
    print(x, a, b, foo_local)

@kwdefault_decorator(default_dict)
def bar(x, *, m, n):
    bar_local = "bar"
    print(x, m, n, bar_local)

foo(1)
bar(1)
# only keyword arguments permitted for a, b, m and n
foo(1, a=100, b=100)
bar(1, m=100, n=100)
output:

1 a default_value foo
1 m default_value bar
1 100 100 foo
1 100 100 bar
Posting this as an answer because it would be too long for a comment.
Be careful with this answer. If you try
@kwargs_decorator({'a': 'a', 'b': 'b'})
def f(x, a, b):
    print(f'x = {x}')
    print(f'a = {a}')
    print(f'b = {b}')

f(1, 2)
it will issue an error:
TypeError: f() got multiple values for argument 'a'
because you are passing a as a positional argument (equal to 2) while the decorator also supplies it as a keyword.
I implemented a workaround, even though I'm not sure if this is the best solution:
def default_kwargs(**default):
    from functools import wraps
    from inspect import getfullargspec

    def decorator(f):
        @wraps(f)
        def wrapper(*args, **kwargs):
            f_args = getfullargspec(f)[0]
            used_args = f_args[:len(args)]
            final_kwargs = {
                key: value
                for key, value in {**default, **kwargs}.items()
                if key not in used_args
            }
            return f(*args, **final_kwargs)
        return wrapper
    return decorator
In this solution, f_args is a list containing the names of all named positional arguments of f. Then used_args is the list of all parameters that have effectively been passed as positional arguments. Therefore final_kwargs is defined almost exactly like before, except that it checks if the argument (in the case above, a) was already passed as a positional argument.
For instance, this solution works beautifully with functions such as the following.
@default_kwargs(a='a', b='b', d='d')
def f(x, a, b, *args, c='c', d='not d', **kwargs):
    print(f'x = {x}')
    print(f'a = {a}')
    print(f'b = {b}')
    for idx, arg in enumerate(args):
        print(f'arg{idx} = {arg}')
    print(f'c = {c}')
    for key, value in kwargs.items():
        print(f'{key} = {value}')

f(1)
f(1, 2)
f(1, b=3)
f(1, 2, 3, 4)
f(1, 2, 3, 4, 5, c=6, g=7)
Note also that the default values passed in default_kwargs have higher precedence than the ones defined in f. For example, the default value for d in this case is actually 'd' (defined in default_kwargs), and not 'not d' (defined in f).
You can unpack the values of the dict:

from collections import OrderedDict

def f(x, a, b):
    print(x, a, b)

d = OrderedDict({'a': 3, 'b': 4})
f(10, *d.values())
UPD.
Yes, it's possible to implement this mad idea of modifying the local scope by creating a decorator which returns a class with an overridden __call__() and stores your defaults in the class scope, BUT IT'S MASSIVE OVERKILL.
Your problem is that you're trying to hide problems of your architecture behind those tricks. If you store your default values in a dict, then access them by key. If you want to use keywords, define a class.
P.S. I still don't understand why this question collected so many upvotes.
Sure... hope this helps

def funcc(x, **kwargs):
    locals().update(kwargs)
    print(x, a, b, c, d)

kwargs = {'a': 1, 'b': 2, 'c': 1, 'd': 1}
x = 1
funcc(x, **kwargs)

Be warned, though: in CPython, locals() inside a function returns a snapshot, so updating it does not actually create local variables, and the print line raises a NameError unless the names happen to exist as globals.
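Since updating locals() inside a function does not reliably create local variables in CPython, a working variant is to bundle the values into a namespace object and use attribute access instead of bare names (a sketch; the variable name `ns` is arbitrary):

```python
from types import SimpleNamespace

def funcc(x, **kwargs):
    ns = SimpleNamespace(**kwargs)  # expose the keyword arguments as attributes
    print(x, ns.a, ns.b, ns.c, ns.d)

kwargs = {'a': 1, 'b': 2, 'c': 1, 'd': 1}
funcc(1, **kwargs)  # prints: 1 1 2 1 1
```

This keeps a single dict-driven call site while still letting the body refer to each value by a short name.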
Why does the following code work while the code after it breaks?
I'm not sure how to articulate my question in english, so I attached the smallest code I could come up with to highlight my problem.
(Context: I'm trying to create a terminal environment for python, but for some reason the namespaces seem to be messed up, and the below code seems to be the essence of my problem)
No errors:

d = {}
exec('def a():b', d)
exec('b=None', d)
exec('a()', d)

Errors:

d = {}
exec('def a():b', d)
d = d.copy()
exec('b=None', d)
d = d.copy()
exec('a()', d)
It is because the function a does not use the globals dict you later pass to exec; it uses the mapping to which it stored a reference in the first exec. While you set 'b' in the new dictionary, you never set b in the globals of that function.
>>> d={}
>>> exec('def a():b',d)
>>> exec('b=None',d)
>>> d['a'].__globals__ is d
True
>>> 'b' in d['a'].__globals__
True
vs
>>> d={}
>>> exec('def a():b',d)
>>> d = d.copy()
>>> exec('b=None',d)
>>> d['a'].__globals__ is d
False
>>> 'b' in d['a'].__globals__
False
If exec didn't work this way, then this too would fail:
mod.py

b = None

def d():
    b

main.py

from mod import d
d()
A function will remember the environment where it was first created.
It is not possible to change the dictionary that an existing function points to. You can either modify its globals explicitly, or you can make another function object altogether:
from types import FunctionType

def rebind_globals(func, new_globals):
    f = FunctionType(
        code=func.__code__,
        globals=new_globals,
        name=func.__name__,
        argdefs=func.__defaults__,
        closure=func.__closure__,
    )
    f.__kwdefaults__ = func.__kwdefaults__
    return f

def foo(a, b=1, *, c=2):
    print(a, b, c, d)

# add __builtins__ so that `print` is found...
new_globals = {'d': 3, '__builtins__': __builtins__}
new_foo = rebind_globals(foo, new_globals)
new_foo(a=0)  # prints: 0 1 2 3
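The first option, mutating the existing function's globals in place, can be sketched like this (note that it affects every function sharing that module namespace, since __globals__ is the module's own dict):

```python
def foo():
    # 'd' is looked up in foo's globals at call time, not at definition time
    print(d)

# inject the name directly into the function's own globals mapping
foo.__globals__['d'] = 3
foo()  # prints: 3
```

This avoids creating a new function object, at the cost of polluting the shared namespace.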
I have a script where I have to change some functions and reset the changes I made to them. I currently do it like this:
def a():
    pass

def b():
    pass

def c():
    pass

def d():
    pass

previous_a = a
previous_b = b
previous_c = c

a = d
b = d
c = d

# I want to make the following code block shorter.
a = previous_a
b = previous_b
c = previous_c
Instead of enumerating all the functions to reset, I would like to have a loop that iterates over a data structure (a dictionary, perhaps) and resets the function variables to their previous values. In the example above, the current approach is fine for 3 functions, but doing that for 15+ functions would produce a big chunk of code that I would like to reduce.
Unfortunately, I have been unable to find a viable solution. I thought of weakrefs, but my experiments with them failed.
Just store the old functions in a dictionary:
old = {'a': a, 'b': b, 'c': c}
then use the globals() dictionary to restore them:
globals().update(old)
This only works if a, b and c were globals to begin with.
You can use the same trick to assign d to all those names:
globals().update(dict.fromkeys(old.keys(), d))
This sets the keys a, b and c to the same value d.
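Putting the two steps together, a complete save/patch/restore cycle might look like this (a sketch using toy functions that return their names, so the swap is visible):

```python
def a(): return "a"
def b(): return "b"
def c(): return "c"
def d(): return "d"

# save the originals
old = {'a': a, 'b': b, 'c': c}

# point a, b and c at d
globals().update(dict.fromkeys(old, d))
print(a(), b(), c())  # prints: d d d

# restore the originals
globals().update(old)
print(a(), b(), c())  # prints: a b c
```

As noted above, this only works when the functions live in the module's global namespace.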
Function definitions are stored in the "global" scope of the module where they are declared. The global scope is a dictionary. As such, you could access/modify its values by key.
See this example:
>>> def a():
...     print("a")
...
>>> def b():
...     print("b")
...
>>> def x():
...     print("x")
...
>>> for i in ('a', 'b'):
...     globals()[i] = x
...
>>> a()
x
I'm having problems using copy.copy() and copy.deepcopy() and Python's scope. I call a function and a dictionary is passed as an argument. The dictionary copies a local dictionary but the dictionary does not retain the values that were copied.
import copy

def foo(A, B):
    localDict = {}
    localDict['name'] = "Simon"
    localDict['age'] = 55
    localDict['timestamp'] = "2011-05-13 15:13:22"
    localDict['phone'] = {'work': '555-123-1234', 'home': '555-771-2190', 'mobile': '213-601-9100'}
    A = copy.deepcopy(localDict)
    B['me'] = 'John Doe'
    return

def qua(A, B):
    print("qua(A): ", A)
    print("qua(B): ", B)
    return

# *** MAIN ***
#
# Test
#
A = {}
B = {}
print("initial A: ", A)
print("initial B: ", B)
foo(A, B)
print("after foo(A): ", A)
print("after foo(B): ", B)
qua(A, B)
The copy.deepcopy() works, and within function foo, dict A has the contents of localDict. But outside the scope of foo, dict A is empty. Meanwhile, dict B retains the key and value assigned to it inside foo after the call returns.
How do I maintain the values that copy.deepcopy() copies outside of function foo?
Ponder this:
>>> def foo(d):
... d = {1: 2}
...
>>> d = {3: 4}
>>> d
{3: 4}
>>> foo(d)
>>> d
{3: 4}
>>>
Inside foo, d = {1: 2} binds some object to the name d. This name is local; the assignment does not modify the object that d used to point to. On the other hand:
>>> def bar(d):
... d[1] = 2
...
>>> bar(d)
>>> d
{1: 2, 3: 4}
>>>
So this has nothing to do with your use of (deep)copy, it's just the way "variables" in Python work.
What's happening is that inside foo() you create a copy of localDict and assign it to the name A, shadowing the empty dict you passed as an argument by rebinding a new object to the same name. Now inside the function you have a new dict called A, completely unrelated to the A in the global scope; it gets garbage collected when the function ends, so effectively nothing happens except the 'me' key being added to B.
If instead of:
A = copy.deepcopy(localDict)
You do something like this, it would work as you expect:
C = copy.deepcopy(localDict)
A.update(C)
But it seems like what you really want has nothing to do with the copy module and would be something like this:
def foo(A, B):
    A['name'] = "Simon"
    A['age'] = 55
    A['timestamp'] = "2011-05-13 15:13:22"
    A['phone'] = {'work': '555-123-1234', 'home': '555-771-2190', 'mobile': '213-601-9100'}
    B['me'] = 'John Doe'
The behavior you are seeing isn't related to deepcopy(): you are reassigning the name A to a new value, and that assignment will not carry over unless you use the global keyword. The reason the changes to B persist is that you are modifying a mutable object in place. Here are two options for how you could get the behavior you want:
Instead of using localDict, just modify A:
def foo(A, B):
    A['name'] = "Simon"
    A['age'] = 55
    A['timestamp'] = "2011-05-13 15:13:22"
    A['phone'] = {'work': '555-123-1234', 'home': '555-771-2190', 'mobile': '213-601-9100'}
    B['me'] = 'John Doe'
    return
Use A.update(copy.deepcopy(localDict)) instead of A = copy.deepcopy(localDict):
def foo(A, B):
    localDict = {}
    localDict['name'] = "Simon"
    localDict['age'] = 55
    localDict['timestamp'] = "2011-05-13 15:13:22"
    localDict['phone'] = {'work': '555-123-1234', 'home': '555-771-2190', 'mobile': '213-601-9100'}
    A.update(copy.deepcopy(localDict))
    B['me'] = 'John Doe'
    return