Scope of a lambda function when passed to another module - python

I'm trying to wrap my head around the scope of lambda functions. I noticed that I can create a lambda function in one module A and pass it to a function in another module B, and it is still able to call functions from module A.
Is it bad practice to pass lambda functions around like this, or is there a preferred (best-practice) way of handling it?
target.py
class TestLambda():
    def __init__(self, name):
        self.name = name

    def call(self, func):
        func(self.name)
source.py
from target import TestLambda
def sayHello(name):
    print("Hello {}".format(name))
func = lambda n: sayHello(n)
l = TestLambda("John")
l.call(func)
output
➜ lambda-test python3 source.py
Hello John

The key here is that every function object keeps a reference to the scope in which the function was defined.
>>> func.__globals__['sayHello']
<function sayHello at 0x1085f2680>
This is what lets func still call sayHello even when it is invoked from a function with a different global scope. Name lookups are resolved lexically: they depend only on the (static) context in which a name appears, not on the (dynamic) runtime context. Some languages (Emacs Lisp, shell, etc.) use dynamic scoping instead; lexical, or static, scoping is widely regarded as easier to reason about, though.
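A minimal sketch that makes that lookup chain explicit (building on the target.py and source.py files above; the extra prints are just illustrative):
import target
from target import TestLambda

def sayHello(name):
    print("Hello {}".format(name))

func = lambda n: sayHello(n)

# func carries source.py's namespace around with it, not target.py's:
print(func.__globals__ is vars(target))          # False
print(func.__globals__['sayHello'] is sayHello)  # True

TestLambda("John").call(func)                    # Hello John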

I don't think there are any issues with creating and passing lambda functions around, as opposed to using
def func(name):
but I don't see the point of defining a function as a lambda expression if you are going to use it in a different module.
The result is the same; the only difference is that you're being inconsistent with your function definitions.
The Python docs specifically discourage this:
Always use a def statement instead of an assignment statement that
binds a lambda expression directly to an identifier.
Yes:
def f(x): return 2*x
No:
f = lambda x: 2*x
The first form means that the name of the resulting
function object is specifically 'f' instead of the generic '<lambda>'.
This is more useful for tracebacks and string representations in
general. The use of the assignment statement eliminates the sole
benefit a lambda expression can offer over an explicit def statement
(i.e. that it can be embedded inside a larger expression)
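A quick sketch of the difference the style guide is pointing at:
def f(x): return 2*x
g = lambda x: 2*x

print(f.__name__)   # 'f'
print(g.__name__)   # '<lambda>' -- the generic name that tracebacks and repr() will show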

Related

__closure__ attribute of function object always be 'None' when defining func inside exec()

EDIT2:
A minimal demonstration is:
code = """\
a=1
def f1():
print(a)
print(f1.__closure__)
f1()
"""
def foo():
exec(code)
foo()
Which gives:
None
Traceback (most recent call last):
File "D:/workfiles/test_eval_rec.py", line 221, in <module>
foo()
File "D:/workfiles//test_eval_rec.py", line 219, in foo
exec(code)
File "<string>", line 5, in <module>
File "<string>", line 3, in f1
NameError: name 'a' is not defined
As can be seen, the __closure__ attribute of a function defined inside the code string passed to exec() is None, which makes calling the function fail.
Why does this happen and how can I define a function successfully?
I found several questions that may be related.
Closure lost during callback defined in exec()
Using exec() with recursive functions
Why exec() works differently when invoked inside of function and how to avoid it
Why are closures broken within exec?
NameError: name 'self' is not defined IN EXEC/EVAL
These questions are all related to "defining a function inside exec()". I think the fourth question here is closest to the essence of these problems. The common cause is that when a function is defined in exec(), the __closure__ attribute of the function object cannot be set correctly and will always be None. However, many of the existing answers to these questions don't recognize this point.
Why these questions are caused by a wrong __closure__:
When a function is defined, its __closure__ attribute is set to a tuple of cells, one for each variable of the enclosing scope (the scope where the def keyword appears) that is used inside the newly defined function. When the function is called, those free variables are looked up through __closure__. Since __closure__ is None here, that lookup cannot happen as expected, and the function call fails.
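A quick illustration of that, outside exec(), where everything works as expected:
def outer():
    a = 1
    def f1():
        print(a)                             # 'a' is a free variable of f1
    return f1

f1 = outer()
print(f1.__closure__)                        # (<cell at ...: int object at ...>,)
print(f1.__closure__[0].cell_contents)       # 1
f1()                                         # 1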
How the existing answers make a __closure__ of None correct:
The existing solutions to the questions listed above avoid the need for a closure altogether, i.e. they make the local symbols that the function uses (variables, function definitions) global, either by passing globals() as the locals of exec() or by using the global keyword explicitly in the code string. With no free variables left, a __closure__ of None is in fact correct.
Why I find the existing solutions unsatisfying:
In my view they sidestep the core problem of setting __closure__ correctly when defining a function inside exec(). And because the symbols used in the function definition are made global, these solutions produce redundant global symbols which I don't want.
Original Questions:
(You may ignore this section. I have figured some things out, and what I currently want to ask is described in the EDIT2 section. The original question can be viewed as a special case of the question described there.)
The original title of this question was: Wrapping a class function into a new function with exec() raises NameError: 'self' is not defined.
I want to wrap an existing member function into a new function on the class. However, exec() fails with a NameError saying 'self' is not defined.
I did some experiments with the following code. I called globals() and locals() in the exec()ed string, and it turns out that locals() differs between the top level of the string and the body of the function defined inside it: "self" is in locals() at the top level of the exec(), but inside the function definition it is not.
class test_wrapper_function():
    def __init__(self):
        # first wrapper
        def temp_func():
            print("locals() inside the function definition without exec:")
            print(locals())
            return self.func()
        print("locals() outside the function definition without exec:")
        print(locals())
        self.wrappered_func1 = temp_func

        # third wrapper using eval
        define_function_str = '''def temp_func():
    print("locals() inside the function definition:")
    print(locals())
    print("globals() inside the function definition:")
    print(globals())
    return self.func()
print("locals() outside the function definition:")
print(locals())
print("globals() outside the function definition:")
print(globals())
self.wrappered_func2 = temp_func'''
        exec(define_function_str)
        # call locals() here, it will contain temp_func

    def func(self):
        print("hi!")

t = test_wrapper_function()
print("**********************************************")
t.wrappered_func1()
t.wrappered_func2()
I have read this link. At the top level of the exec(), member functions and attributes of "self" can be accessed without problem, while inside the function definition in the exec(), "self" is not available any more. Why does this happen?
Why I want to do this:
I am building a PyQt program. I want to create several similar slots, all of which can be generated by calling one member function with different arguments, so I decided to generate these slots using Python's exec() function. I also searched for "nested name scope in python exec" and found this question, which may be related, but it has no useful answer.
To be more specific: I want to define a family of slots like func_X (X can be 'a', 'b', 'c'...), each doing something like self.do_something(X), where do_something is a member function of my QWidget. So I use a for loop to create these slot functions, with code like this:
class MyWidget():
    def __init__(self):
        self.create_slots_family()

    def do_something(self, character):
        # in fact, this function is much more complex. Do some simplification.
        print(character)

    def create_slots_i(self, character):
        # want to define a function like this:
        # if character is 'C', define self.func_C such that self.func_C() works like self.do_something('C')
        create_slot_command_str = "self.func_" + character + " = lambda:self.do_something('" + character + "')"
        print(create_slot_command_str)
        exec(create_slot_command_str)

    def create_slots_family(self):
        for c in ["A", "B", "C", "D"]:
            self.create_slots_i(c)

my_widget = MyWidget()
my_widget.func_A()
Note that, as far as I know, Qt slots should not accept any parameters, so I have to wrap self.do_something(character) into a series of functions self.func_A, self.func_C and so on for all the possible characters.
So the above is what I originally wanted to do.
EDIT1:
(You may ignore this section. I have figured some things out, and what I currently want to ask is described in the EDIT2 section. This simplified version of the original question can also be viewed as a special case of the question described there.)
As @Mad Physicist suggested, I provide a simplified version here, with the experimental code removed.
class test_wrapper_function():
    def __init__(self):
        define_function_str = '''\
def temp_func():
    return self.func()
self.wrappered_func2 = temp_func'''
        exec(define_function_str)

    def func(self):
        print("hi!")

t = test_wrapper_function()
t.wrappered_func2()
I expected this to print a "hi". However, I got the following exception:
Traceback (most recent call last):
File "D:/workfiles/test_eval_class4.py", line 12, in <module>
t.wrappered_func2()
File "<string>", line 2, in temp_func
NameError: name 'self' is not defined
Using Exec
You've already covered most of the problems and workarounds with exec, but I feel that there is still value in adding a summary.
The key issue is that exec only knows about globals and locals, but not about free variables and the non-local namespace. That is why the docs say
If exec gets two separate objects as globals and locals, the code will be executed as if it were embedded in a class definition.
There is no way to make it run as though it were in a method body. But, as you've already noted, you can make exec create a closure and use that instead of the internal namespace by adding a function body to your snippet. There are still a couple of subtle restrictions there, though.
Your example of what you are trying to do showcases the issues perfectly, so I will use a modified version of that. The goal is to make a method that binds to self and has a variable argument in the exec string.
class Test:
    def create_slots_i(self, c):
        create_slot_command_str = f"self.func_{c} = lambda: self.do_something('{c}')"
        exec(create_slot_command_str)

    def do_something(self, c):
        print(f'I did {c}!')
There are different ways of getting exec to "see" variables: literals, globals, and internal closures.
Literals. This works robustly, but only for simple types that can be easily instantiated from a string. The usage of c above is a perfect example. This will not help you with a complex object like self:
>>> t = Test()
>>> t.create_slots_i('a')
>>> t.func_a()
...
NameError: name 'self' is not defined
This happens exactly because exec has no concept of free variables. Since self is passed to it via the default locals(), it does not bind the reference to a closure.
globals. You can pass in the name self to exec via globals. There are a couple of ways of doing this, each with its own issues. Remember that globals are accessed by a function through its __globals__ attribute (look at the table under "Callable types"). Normally __globals__ refers to the __dict__ of the module in which a function is defined. In exec, this is the case by default as well, since that's what globals() returns.
Add to globals: You can create a global variable named self, which will make your problem go away, sort of:
>>> self = t
>>> t.func_a()
I did a!
But of course this is a house of cards that falls apart as soon as you delete self, modify it, or try to run this on multiple instances:
>>> del self
>>> t.func_a()
...
NameError: name 'self' is not defined
Copy globals. A much more versatile solution, on the surface of it, is to copy globals() when you run exec in create_slots_i:
def create_slots_i(self, c):
    create_slot_command_str = f"self.func_{c} = lambda: self.do_something('{c}')"
    g = globals().copy()
    g['self'] = self
    exec(create_slot_command_str, g)
This appears to work normally, and for a very limited set of cases, it actually does:
>>> t = Test()
>>> t.create_slots_i('a')
>>> t.func_a()
I did a!
But now, your function's __globals__ attribute is no longer bound to the module you created it in. If it uses any other global values, especially ones that might change, you will not be able to see the changes. For limited functionality, this is OK, but in the general case, it can be a severe handicap.
Internal Closures. This is the solution you already hit upon, where you create a closure within the exec string to let it know that you have a free variable by artificial means. For example:
class Test:
    def create_slots_i(self, c):
        create_slot_command_str = f"""def make_func(self):
    def func_{c}():
        self.do_something('{c}')
    return func_{c}
self.func_{c} = make_func(self)"""
        g = globals().copy()
        g['self'] = self
        exec(create_slot_command_str, g)

    def do_something(self, c):
        print(f'I did {c}!')
This approach works completely:
>>> t = Test()
>>> t.create_slots_i('a')
>>> t.func_a()
I did a!
The only real drawbacks here are security, which is always a problem with exec, and the sheer awkwardness of this monstrosity.
A Better Way
Since you are already creating closures, there is really no need to use exec at all. In fact, the only thing you are really doing is creating methods so that self.func_... will bind the method for you, since you need a function with the signature of your slot and access to self. You can write a simple method that will generate functions that you can assign to your slots directly. The advantage of doing it this way is that (a) you avoid calling exec entirely, and (b) you don't need to have a bunch of similarly named auto-generated methods polluting your class namespace. The slot generator would look something like this:
def create_slots_i(self, c):
    def slot_func():
        self.do_something(c)  # This is a real closure now
    slot_func.__name__ = f'func_{c}'
    return slot_func
Since you will not be referring to these function objects anywhere except your slots, __name__ is the only way to get the "name" under which they were stored. That is the same thing that def does for you under the hood.
You can now assign slots directly:
some_widget.some_signal.connect(self.create_slots_i('a'))
Note
I originally had a more complex approach in mind for you, since I thought you cared about generating bound methods, instead of just setting __name__. In case you have a sufficiently complex scenario where it still applies, here is my original blurb:
A quick recap of the descriptor protocol: when you bind a function with the dot operator, e.g., t.func_a, Python looks at the class for descriptors with that name. If your class has a data descriptor (like property, but not functions), then that descriptor will shadow anything you may have placed in the instance __dict__. However, if you have a non-data descriptor (one with a __get__ method but no __set__ method, like a function object), then it will only be bound if an instance attribute does not shadow it. Once this decision has been made, actually invoking the descriptor protocol involves calling type(t).func_a.__get__(t). That's how a bound method knows about self.
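Sketched out, assuming the Test class from earlier:
t = Test()
plain = type(t).do_something   # the plain function stored on the class
bound = plain.__get__(t)       # what the dot operator does for you behind the scenes
bound('a')                     # same as t.do_something('a') -> prints "I did a!"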
Now you can return a bound method from within your generator:
def create_slots_i(self, c):
    def slot_func(self):
        self.do_something(c)  # This is a closure on `c`, but not on `self` until you bind it
    slot_func.__name__ = f'func_{c}'
    return slot_func.__get__(self)
Why this phenomenon happens:
Actually the answer to question 4 listed above answers this one as well.
When exec() is called on a code string, the string is first compiled. During compilation the provided globals and locals are not taken into account; names at the top level of the exec()ed code string are compiled as globals. So a function defined in the code string is treated as using only global variables, and thus its __closure__ is set to None.
Refer to this answer for more information about what the func exec does.
How to deal with this phenomenon:
Imitating the solutions provided in the questions above, the minimal demonstration in the question can be modified as follows to work:
a=1 # moving out of the variable 'code'
code = """\
def f1():
    print(a)
print(f1.__closure__)
f1()
"""
def foo():
    exec(code)
foo()
Although __closure__ is still None, the exception is avoided because now only a global symbol is needed, and in that case a __closure__ of None is in fact correct. See the explanation in the question body of why the existing answers work for more information.
This was originally added in Revision 4 of the question.
TL;DR
To set the __closure__ attribute of a function defined in a code string passed to exec() correctly, just wrap the whole code string in a helper function definition.
I provide an example here to demonstrate all the possible situations. Suppose you want to define a function named foo inside a code string used by exec(), and foo uses functions and variables defined both inside and outside the code string:
def f1():
    outside_local_variable = "this is local variable defined outside code str"
    def outside_local_function():
        print("this is function defined outside code str")
    code = """\
local_variable = "this is local variable defined inside code str"
def local_function():
    print("this is function defined inside code str")
def foo():
    print(local_variable)
    local_function()
    print(outside_local_variable)
    outside_local_function()
foo()
"""
    exec(code)
f1()
It can be wrapped like this:
def f1():
    outside_local_variable = "this is local variable defined outside code str"
    def outside_local_function():
        print("this is function defined outside code str")
    code = """\
def closure_helper_func(outside_local_variable, outside_local_function):
    local_variable = "this is local variable defined inside code str"
    def local_function():
        print("this is function defined inside code str")
    def foo():
        print(local_variable)
        local_function()
        print(outside_local_variable)
        outside_local_function()
    foo()
closure_helper_func(outside_local_variable, outside_local_function)
"""
    exec(code)
f1()
Detailed explanation:
Why the __closure__ attribute is not correctly set:
Please refer to the community wiki answer.
How to set the __closure__ attribute to what's expected:
Just wrap the whole code string in a helper function definition and call the helper function once; then, during compilation, the variables are treated as locals of the helper and end up stored in the __closure__ attribute.
For the minimal demonstration in the question, it can be modified to following:
code = """\
def closure_helper_func():
a=1
def f1():
print(a)
print(f1.__closure__)
f1()
closure_helper_func()
"""
def foo():
exec(code)
foo()
This outputs, as expected:
(<cell at 0x0000019CE6239A98: int object at 0x00007FFF42BFA1A0>,)
1
The example above provides a way to add symbols that are defined inside the code string to the __closure__. For example, in the minimal demo, a=1 is defined inside the code string. But what if one wants to add local symbols defined outside the code string? For example, in the code snippet in the EDIT1 section, the self symbol needs to be added to the __closure__, and that symbol is provided in locals() when exec() is called. Just add the names of these symbols to the arguments of the helper function and you can handle this situation too.
The following shows how to fix the problem in the EDIT1 section.
class test_wrapper_function():
    def __init__(self):
        define_function_str = '''\
def closure_helper_func(self):
    def temp_func():
        return self.func()
    self.wrappered_func2 = temp_func
closure_helper_func(self)
'''
        exec(define_function_str)

    def func(self):
        print("hi!")

t = test_wrapper_function()
t.wrappered_func2()
The following shows how to fix the code in the section "Why I want to do this":
class MyWidget():
    def __init__(self):
        self.create_slots_family()

    def do_something(self, character):
        # in fact, this function is much more complex. Do some simplification.
        print(character)

    def create_slots_i(self, character):
        # want to define a function like this:
        # if character is 'C', define self.func_C such that self.func_C() works like self.do_something('C')
        # create_slot_command_str = "self.func_" + character + " = lambda:self.do_something('" + character + "')"
        create_slot_command_str = """
def closure_helper_func(self):
    self.func_""" + character + " = lambda:self.do_something('" + character + """')
closure_helper_func(self)
"""
        # print(create_slot_command_str)
        exec(create_slot_command_str)

    def create_slots_family(self):
        for c in ["A", "B", "C", "D"]:
            self.create_slots_i(c)

my_widget = MyWidget()
my_widget.func_A()
This solution seems rather tricky. However, I cannot find a more elegant way to declare that certain symbols should be treated as locals during compilation.

Function Decorators in Python

I am a beginner in Python and I am trying to wrap my head around function decorators, and I cannot figure out how functions return functions.
I mean, in what order does the interpreter interpret this function:
def decorator(another_func):
    def wrapper():
        print('before actual function')
        return another_func()
    print('pos')
    return wrapper
And what is the difference between these 2 statements:-
return wrapper
AND
return wrapper()
I am using Head First Python, but this topic I feel is not described very well in there, please suggest any video or a good resource so that I can understand it.
The key to understanding the difference is understanding that everything is an object in Python, including functions. When you use the name of the function without parentheses (return wrapper), you are returning the actual function itself. When you use parentheses, you're calling the function. Take a look at the following example code:
def foo():
    return 2

bar = foo
baz = foo()
qux = bar()
bar()
If you print baz or qux, it will print 2. If you print bar, it will give you the memory address referencing the function, not a number. But if you call the function, you are now printing the result of the function call.
I cannot figure out how functions return functions.
As already explained by LTheriault, in Python everything is an object. Not only that, but also everything happens at runtime - the def statement is an executable statement, which creates a function object from the code within the def block and binds this object to the function's name in the current namespace - IOW it's mostly syntactic sugar for operations you could code manually (a very welcome syntactic sugar though - building a function object "by hand" is quite a lot of work).
Note that having functions as "first-class citizens" is not Python specific - that's the basis of functional programming.
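For the curious, here is a rough sketch of what that "by hand" construction looks like; the names template and manual_double are made up for illustration:
import types

def template(x):          # only used to borrow an already-compiled code object
    return x * 2

# roughly what `def manual_double(x): return x * 2` does for you:
manual_double = types.FunctionType(template.__code__, globals(), name='manual_double')
print(manual_double(21))  # 42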
I mean in what order does interpreter interprets this function:
def decorator(another_func):
    def wrapper():
        print('before actual function')
        return another_func()
    print('pos')
    return wrapper
Assuming the decorator function is declared at the module top-level: the runtime first takes the code block that follows the def statement, compiles it into a code object, creates a function object (instance of type 'function') from this code object and a couple other things (the arguments list etc), and finally binds this function object to the declared name (nb: 'binds' => "assigns to").
The inner def statement is actually only executed when the outer function is called, and it's executed anew each time the outer function is called - IOW, each call to decorator returns a new function instance.
The above explanation is of course quite simplified (hence partially inexact), but it's enough to understand the basic principle.
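You can check the "new function instance on each call" part directly; a tiny sketch using the decorator function above:
def do_nothing():
    pass

w1 = decorator(do_nothing)   # prints 'pos'
w2 = decorator(do_nothing)   # prints 'pos' again
print(w1 is w2)              # False: each call builds a fresh `wrapper` object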

Pythonic, elegant way of dynamically defining a list of statically defined functions?

I have only started learning Python recently. Let me explain what I am trying to accomplish. I have this .py script that basically has several functions (hard-coded into the script) that all need to be added to a single list, so that I can get the function I require by simply using the index operator as follows:
needed_function = function_list[needed_function_index]
My first attempt at implementing this resulted in the following code structure:
(imports)
function_list = []
(other global variables)
def function_0(...):
    ...
function_list.append(function_0)

def function_1(...):
    ...
function_list.append(function_1)

def function_2(...):
    ...
function_list.append(function_2)
(rest of code)
But I don't like that solution since it isn't very elegant. My goal is to be able to simply add the function definition to the script (without the append call) and the script will automatically add it to the list of functions.
I've thought of defining all the functions within another function, but I don't think I'd get anywhere with those. I thought of maybe "tagging" each function with a decorator but I realized that decorators (if I understand them correctly) are called every time a function is called, and not just once.
After some time I came up with this solution:
(imports)
(global variables)
def function_0(...):
    ...

def function_1(...):
    ...

def function_2(...):
    ...
function_list= [globals()[x] for x in globals() if re.match('^function_[0-9]+$', x)]
(rest of code)
I like it a bit more as a solution, but my only qualm with it is that I would prefer, for cleanliness purposes, to completely define function_list at the top of the script. However, I cannot do that since an invocation of globals() at the top of the script would not contain the functions since they have not been defined yet.
Perhaps I should simply settle for a less elegant solution, or maybe I am not writing my script in an idiomatic way. Whatever the case, any input and suggestions are appreciated.
You are mistaken about decorators. They are invoked once when the function is defined, and the function they return is then the value assigned to the function name, and it is that function that is invoked each time. You can do what you want in a decorator without incurring runtime overhead.
my_functions = []

def put_in_list(fn):
    my_functions.append(fn)
    return fn

@put_in_list
def function1():
    pass

@put_in_list
def function2():
    pass
PS: You probably don't need to worry about runtime overhead anyway.
PPS: You are also trying to optimize odd things, you might be better off simply maintaining a list in your file. How often are you adding functions, and with how little thought? A list is not difficult to update in the source file.
Example of using a decorator that does not add any overhead to the function call:
my_list = []

def add_to_my_list(func):
    print 'decorator called'
    my_list.append(func)
    return func

@add_to_my_list
def foo():
    print 'foo called'

@add_to_my_list
def bar():
    print 'bar called'

print '-- done defining functions --'
my_list[0]()
my_list[1]()
One way to solve this problem would be to put all those functions into a single container, then extract the functions from the container to build your list.
The most Pythonic container would be a class. I'm not saying to make them member functions of the class; just define them in the class.
class MyFunctions(object):
    def func0():
        pass
    def func1():
        pass

lst_funcs = [x for x in MyFunctions.__dict__ if not x.startswith('_')]
But I like the decorator approach even better; that's probably the most Pythonic solution.

python: alternative to anonymous functions

Python doesn't support complicated anonymous functions. What's a good alternative? For example:
class Calculation:
    def __init__(self, func):
        self.func = func

    def __call__(self, data):
        try:
            # check if the value has already been calculated
            # if it has, it would be cached under key = self.func
            return data[self.func]
        except KeyError:
            pass  # first-time call; calculate and cache the values
        data[self.func] = self.func(data)
        return data[self.func]

# with a simple function, which can be represented using lambda, this works great
f1 = Calculation(lambda data: data['a'] * data['b'])

# with a complicated function, I can do this:
def f2_aux(data):
    ...  # some complicated calculation, which isn't suitable for a lambda one-liner

f2 = Calculation(f2_aux)
Is this a reasonable design to begin with?
If so, is there a way to avoid the ugliness of f*_aux for each f* that I define in the module?
UPDATE:
Example of use:
d = {'a' : 3, 'b' : 6}
# computes 3 * 6
# stores 18 in d under a key <function <lambda> at ...>
# returns 18
f1(d)
# retrieves 18 from d[<function <lambda> at ...>]
# returns 18, without having to recalculate it
f1(d)
UPDATE:
Just for my understanding, I added a version that uses the inner function.
def memoize(func):
    def new_func(data):
        try:
            # check if the value has already been calculated
            # if it has, it would be cached under key = func
            return data[func]
        except KeyError:
            pass  # first-time call; calculate and cache the values
        data[func] = func(data)
        return data[func]
    return new_func

@memoize
def f1(data):
    return data['a'] * data['b']
You don't need anonymous functions. Also, memoization has been done better than this, there's probably no reason for you to roll your own.
But to answer the question: You can use your class as a decorator.
@Calculation
def f2(data):
    ...
This simply defines the function, wraps it in Calculation, and stores the result of that as f2.
The decorator syntax is defined to be equivalent to:
_decorator = Calculation  # a fresh identifier
# not needed here, but in other cases (think properties) it's useful
def f2(data):
    ...
f2 = _decorator(f2)
The alternative to an anonymous function is a non-anonymous function. An anonymous function is only anonymous in the context where it was defined. But it is not truly anonymous, because then you could not use it.
In Python you make anonymous functions with a lambda expression. You can for example do this:
output = mysort(input, lambda x: x.lastname)
The lambda will create a function, but that function has no name in the local namespace, and its own name for itself is just '<lambda>'. But if we look at mysort, it would have to be defined something like this:
def mysort(input, getterfunc):
    blahblahblah
As we see here, in this context the function isn't anonymous at all. It has a name, getterfunc. From the viewpoint of this function it does not matter if the function passed in are anonymous or not. This works just as well, and is exactly equivalent in all significant ways:
def get_lastname(x):
    return x.lastname

output = mysort(input, get_lastname)
Sure, it uses more code, but it is not slower or anything like that. In Python, therefore, anonymous functions are nothing but syntactic sugar for ordinary functions.
A truly anonymous function would be
lambda x: x.lastname
But as we don't assign the resulting function to anything, we do not get a name for the function, and then we can't use it. All truly anonymous functions are unusable.
For that reason, if you need a function that can't be a lambda, make it an ordinary function. It can never be anonymous in any meaningful way, so why bother making it anonymous at all? Lambdas are useful when you want a small one-line function and you don't want to waste space by defining a full function. That they are anonymous is irrelevant.
A closure can be a succinct alternative to writing a class such as the one in your example. The technique involves putting a def inside another def. The inner function has access to variables in the enclosing function. In Python 3, the nonlocal keyword gives you write access to such a variable. In Python 2, you need to use a mutable value for the nonlocal variable in order to be able to update it from the inner function.
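For instance, a small sketch of the Python 3 form:
def make_counter():
    count = 0
    def increment():
        nonlocal count            # write access to the variable in the enclosing function
        count += 1
        return count
    return increment

counter = make_counter()
print(counter(), counter())       # 1 2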
About the question regarding anonymous functions, the language intentionally pushes you back to use def for anything more complicated than a lambda can handle.

"self" inside plain function?

I've got a bunch of functions (outside of any class) where I've set attributes on them, like funcname.fields = 'xxx'. I was hoping I could then access these variables from inside the function with self.fields, but of course it tells me:
global name 'self' is not defined
So... what can I do? Is there some magic variable I can access? Like __this__.fields?
A few people have asked "why?". You will probably disagree with my reasoning, but I have a set of functions that all must share the same signature (accept only one argument). For the most part, this one argument is enough to do the required computation. However, in a few limited cases, some additional information is needed. Rather than forcing every function to accept a long list of mostly unused variables, I've decided to just set them on the function so that they can easily be ignored.
Although, it occurs to me now that you could just use **kwargs as the last argument if you don't care about the additional args. Oh well...
Edit: Actually, some of the functions I didn't write, and would rather not modify to accept the extra args. By "passing in" the additional args as attributes, my code can work both with my custom functions that take advantage of the extra args, and with third party code that don't require the extra args.
Thanks for the speedy answers :)
self isn't a keyword in Python, it's just a normal variable name. When creating instance methods, you can name the first parameter whatever you want; self is just a convention.
You should almost always prefer passing arguments to functions over setting properties for input, but if you must, you can do so using the actual functions name to access variables within it:
def a():
    if a.foo:
        # blah
        pass

a.foo = False
a()
see python function attributes - uses and abuses for when this comes in handy. :D
def foo():
    print(foo.fields)

foo.fields = [1, 2, 3]

foo()
# [1, 2, 3]
There is nothing wrong with adding attributes to functions. Many memoizers use this to cache results in the function itself.
For example, notice the use of func.cache:
from decorator import decorator

@decorator
def memoize(func, *args, **kw):
    # Author: Michele Simoniato
    # Source: http://pypi.python.org/pypi/decorator
    if not hasattr(func, 'cache'):
        func.cache = {}
    if kw:  # frozenset is used to ensure hashability
        key = args, frozenset(kw.iteritems())
    else:
        key = args
    cache = func.cache  # attribute added by memoize
    if key in cache:
        return cache[key]
    else:
        cache[key] = result = func(*args, **kw)
        return result
You can't make "a function accessing its own attributes" work correctly in all situations - see how can python function access its own attributes? for details - but here is a quick demonstration:
>>> def f(): return f.x
...
>>> f.x = 7
>>> f()
7
>>> g = f
>>> g()
7
>>> del f
>>> g()
Traceback (most recent call last):
File "<interactive input>", line 1, in <module>
File "<interactive input>", line 1, in f
NameError: global name 'f' is not defined
Basically, most methods directly or indirectly rely on accessing the function object through a lookup by name in globals, and if the original function name is deleted, this stops working. There are other kludgey ways of accomplishing it, like defining a class or a factory - but thanks to your explanation it is clear you don't really need that.
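(For completeness, a rough sketch of the factory variant just mentioned: the inner function closes over itself, so it keeps working even if the global name is deleted.)
def make_f():
    def f():
        return f.x        # resolved through the closure, not through globals
    return f

f = make_f()
f.x = 7
g = f
del f                     # deleting the global name no longer breaks it
print(g())                # 7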
Just do the mentioned keyword catch-all argument, like so:
def fn1(oneArg):
    # do the due
    pass

def fn2(oneArg, **kw):
    if 'option1' in kw:
        print 'called with option1=', kw['option1']
    # do the rest

fn2(42)
fn2(42, option1='something')
Not sure what you mean in your comment of handling TypeError - that won't arise when using **kw. This approach works very well for some python system functions - check min(), max(), sort(). Recently sorted(dct,key=dct.get,reverse=True) came very handy to me in CodeGolf challenge :)
Example:
>>> def x(): pass
>>> x
<function x at 0x100451050>
>>> x.hello = "World"
>>> x.hello
"World"
You can set attributes on functions, as these are just plain objects, but I actually never saw something like this in real code.
Plus, self is not a keyword, just another variable name, which happens to be bound to the particular instance of the class. self is passed implicitly, but received explicitly.
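A tiny sketch of that last point (the name this is arbitrary):
class Greeter:
    def hello(this, name):        # 'this' instead of 'self' -- the name is just a convention
        print("Hello", name)

Greeter().hello("John")           # the instance is passed implicitly as 'this'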
If you want globally set parameters for a callable 'thing', you could always create a class and implement the __call__ method.
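Something along these lines (the names are illustrative):
class Fn(object):
    def __init__(self, fields):
        self.fields = fields      # the "globally set parameters"

    def __call__(self, arg):
        print(self.fields, arg)

fn = Fn('xxx')
fn(42)                            # prints: xxx 42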
There is no special way, within a function's body, to refer to the function object whose code is executing. Simplest is just to use funcname.field (with funcname being the function's name within the namespace it's in, which you indicate is the case -- it would be harder otherwise).
This isn't something you should do. I can't think of any way to do what you're asking except some walking around on the call stack and some weird introspection -- which isn't something that should happen in production code.
That said, I think this actually does what you asked:
import inspect

_code_to_func = dict()

def enable_function_self(f):
    _code_to_func[f.func_code] = f
    return f

def get_function_self():
    f = inspect.currentframe()
    code_obj = f.f_back.f_code
    return _code_to_func[code_obj]

@enable_function_self
def foo():
    me = get_function_self()
    print me

foo()
While I agree with the rest that this is probably not good design, the question did intrigue me. Here's my first solution, which I may update once I get decorators working. As it stands, it relies pretty heavily on being able to read the stack, which may not be possible in all implementations (something about sys._getframe() not necessarily being present...)
import sys, inspect

def cute():
    this = sys.modules[__name__].__dict__.get(inspect.stack()[0][3])
    print "My face is..." + this.face

cute.face = "very cute"
cute()
What do you think? :3
You could use the following (hideously ugly) code:
class Generic_Object(object):
    pass

def foo(a1, a2, self=Generic_Object()):
    self.args = (a1, a2)
    print "len(self.args):", len(self.args)
    return None
... as you can see it would allow you to use "self" as you described. You can't use an "object()" directly because you can't "monkey patch(*)" values into an object() instance. However, normal subclasses of object (such as the Generic_Object() I've shown here) can be "monkey patched"
If you wanted to always call your function with a reference to some object as the first argument that would be possible. You could put the defaulted argument first, followed by a *args and optional **kwargs parameters (through which any other arguments or dictionaries of options could be passed during calls to this function).
This is, as I said hideously ugly. Please don't ever publish any code like this or share it with anyone in the Python community. I'm only showing it here as a sort of strange educational exercise.
An instance method is like a function in Python. However, it exists within the namespace of a class (thus it must be accessed via an instance, e.g. myobject.foo()) and it is called with a reference to "self" (analogous to the "this" pointer in C++) as the first argument. Also, there's a method resolution process which causes the interpreter to search the namespace of the instance, then its class, and then each of the parent classes and so on ... up through the inheritance tree.
An unbound function is called with whatever arguments you pass to it. There can't be any sort of automatically pre-pended object/instance reference in the argument list. Thus, writing a function with an initial argument named "self" is meaningless. (It's legal because Python doesn't place any special meaning on the name "self." But it is meaningless because callers of your function would have to manually supply some sort of object reference in the argument list, and it's not at all clear what that should be. Just some bizarre "Generic_Object" which then floats around in the global variable space?)
I hope that clarifies things a bit. It sounds like you're suffering from some very fundamental misconceptions about how Python and other object-oriented systems work.
("Monkey patching" is a term used to describe the direct manipulation of an objects attributes -- or "instance variables" by code that is not part of the class hierarchy of which the object is an instance).
As another alternative, you can make the functions into bound class methods like so:
class _FooImpl(object):
    a = "Hello "

    @classmethod
    def foo(cls, param):
        return cls.a + param

foo = _FooImpl.foo

# later...
print foo("World")  # yes, Hello World

# and if you have to change an attribute:
foo.im_self.a = "Goodbye "
If you want functions to share an attribute namespace, you just make them part of the same class. If not, give each its own class.
What exactly are you hoping "self" would point to, if the function is defined outside of any class? If your function needs some global information to execute properly, you need to send this information to the function in the form of an argument.
If you want your function to be context aware, you need to declare it within the scope of an object.
