There's something that's been bugging me (ha!) in Python (I use 2.7): I get NameError: global name 'x' is not defined when I run this code:
def function1():
    x = 1
    return 0

def function2():
    function1()
    print(x)
    return 0

function2()
It's not a serious problem for me, but I am genuinely curious why this doesn't print 1. It makes sense in my mind: the function that defines the variable x is defined before it is called, and that function is called before print(x). I seriously am not seeing why this code doesn't work. Maybe the way I am thinking about this is flawed. Either way, why doesn't this code print 1? Thanks in advance for the help!
Yes, your thinking is flawed. The variable x defined in function1 is local to that function; it doesn't exist anywhere else. When you call one function from another, the variables of the called function do not get dumped into the calling function. Only the return value is passed back. If you want to use x in the second function, you should return it from function1. (Even then, it won't create a variable called x in function2. It will only return the value, which you can then assign to a variable in function2 if you want, or print, or whatever.)
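A minimal sketch of that return-based fix:

def function1():
    x = 1
    return x  # hand the value back to the caller

def function2():
    x = function1()  # bind the returned value to a local name
    print(x)         # prints 1
    return 0

function2()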
Use global x as the first line in both functions; this will make your code work. But, anyway, passing values between functions via globals is not good practice; returning values and using parameters is.
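For completeness, a sketch of the global-based version (note that merely reading a global does not strictly require the declaration; only rebinding it does):

def function1():
    global x  # rebinding the module-level name requires the declaration
    x = 1
    return 0

def function2():
    function1()
    print(x)  # now prints 1
    return 0

function2()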
If you are experimenting with variable sharing like this, you're using global variables. For that, unlike JavaScript, you NEED to declare global x before assigning to it, as in PHP (except that in PHP you don't get an error unless you use strict mode).
The scope of x is only inside function1(); that's why you can't print x inside function2(), even if you call function1() first (when function1() returns, its local x is destroyed).
Because x is local to function1's scope, function2 doesn't "see" this variable.
I am developing a Python function whose intermediate results should conditionally be accessible outside of the function, for plotting. It concerns > 20 variables of large size. To preserve memory, I aim to make the variables globally available only when the plotting is needed.
See the following code snippet, which I initially expected to work as desired:
def x():
    if True:
        global a
        a = 10
    return

x()
print(a)
Expected result: variable 'a' can be printed only if the if statement is executed.
Actual result: variable 'a' can always be printed.
I tried many things but cannot seem to achieve the desired behaviour. Proper programming would be to pass the variables needed back to the caller using return. However, I do not know how to do that conditionally in a pretty way with this many variables. Can anybody explain the behaviour I see, and suggest how to achieve the desired behaviour?
The determination of whether a variable is local or global is made when the function is being compiled, not when it runs. So this can't be dynamic.
You can solve it by using the locals() and globals() functions to access the different variable environments.
def x():
    env = locals()   # writes to this dict are discarded when the function returns
    if <condition>:
        env = globals()
    env['a'] = 10
    return

x()
print(a)
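If you'd rather avoid globals entirely, here is a sketch of the return-based alternative the question alluded to (keep_intermediates and the variable names are just illustrations): collect the intermediates in a dict and return it only when plotting is requested.

def x(keep_intermediates=False):
    a = 10
    b = 20
    # ... the other large intermediate results ...
    if keep_intermediates:
        return {'a': a, 'b': b}  # only built and returned when asked for
    return None

results = x(keep_intermediates=True)
if results is not None:
    print(results['a'])  # 10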
Is there a way to limit a function so that it only has access to local variables and passed arguments?
For example, consider this code
a = 1
def my_fun(x):
print(x)
print(a)
my_fun(2)
Normally the output will be
2
1
However, I want to limit my_fun to local scope so that print(x) would work but throw an error on print(a). Is that possible?
I feel like I should preface this with: Do not actually do this.
You (sort of) can with functions, but you will also disable access to all other global functions and variables, which I imagine you would not like to do.
You can use the following decorator to have the function act like there are no variables in the global namespace:
import types
noglobal = lambda f: types.FunctionType(f.__code__, {})
And then call your function:
a = 1

@noglobal
def my_fun(x):
    print(x)
    print(a)

my_fun(2)
However, this actually results in a different error than the one you want:
NameError: name 'print' is not defined
By not allowing globals to be used, you cannot use print() either.
Now, you could pass in the functions that you want to use as parameters, which would allow you to use them inside the function, but this is not a good approach; it is much better to just keep your globals clean.
a = 1

@noglobal
def my_fun(x, p):
    p(x)
    p(a)

my_fun(2, print)
Output:
2
NameError: name 'a' is not defined
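A variation worth noting (my addition, not part of the original answer): copying __builtins__ into the replacement globals dict keeps built-ins like print resolvable, while the module's own globals stay hidden.

import types

# Hide module globals, but keep the built-in namespace available.
noglobal = lambda f: types.FunctionType(f.__code__, {'__builtins__': __builtins__})

a = 1

@noglobal
def my_fun(x):
    print(x)  # works: print is found via __builtins__
    print(a)  # still raises NameError: name 'a' is not defined

my_fun(2)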
Nope. The scoping rules are part of a language's basic definition. To change this, you'd have to alter the compiler to exclude items higher on the context stack, but still within the user space. You obviously don't want to limit all symbols outside the function's context, as you've used one in your example: the external function print. :-)
In Fortran there is a statement implicit none that throws a compilation error when a local variable is used without being declared. I understand that Python is a dynamically typed language, and the scope of a variable may be determined at runtime.
But I would like to avoid certain unintended errors that happen when I forget to initialize a local variable but use it in the main code. For example, the variable x in the following code is global, even though I did not intend that:
def test():
    y = x + 2  # intended this x to be a local variable but forgot
               # x was not initialized
    print y

x = 3
test()
So my question is: Is there any way to ensure that all variables used in test() are local to it and that there are no side effects? I am using Python 2.7.x. Ideally, an error would be raised whenever a variable used in test() is not local.
So my question is: Is there any way to ensure all variables used in test() are local to it and that there are no side effects?
There is a technique to validate that globals aren't accessed.
Here's a decorator that scans a function's opcodes for a LOAD_GLOBAL.
import dis, sys, re, StringIO

def check_external(func):
    'Validate that a function does not have global lookups'
    saved_stdout = sys.stdout
    sys.stdout = f = StringIO.StringIO()
    try:
        dis.dis(func)
        result = f.getvalue()
    finally:
        sys.stdout = saved_stdout
    externals = re.findall('^.*LOAD_GLOBAL.*$', result, re.MULTILINE)
    if externals:
        raise RuntimeError('Found globals: %r' % externals)
    return func
@check_external
def test():
    y = x + 2  # intended this x to be a local variable but forgot
               # x was not initialized
    print y
To make this practical, you will want a stop list of acceptable global references (i.e. modules). The technique can be extended to cover other opcodes such as STORE_GLOBAL and DELETE_GLOBAL.
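For instance, here is a sketch of how the findall line inside check_external could be broadened, with an illustrative stop list (the name ALLOWED is my own, not from the original):

ALLOWED = set(['len', 'range'])  # acceptable global references

# dis puts the symbol name in parentheses, e.g. "LOAD_GLOBAL 0 (x)"
names = re.findall(r'(?:LOAD|STORE|DELETE)_GLOBAL.*\((\w+)\)', result)
externals = [name for name in names if name not in ALLOWED]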
All that said, I don't see a straightforward way to detect side effects.
There is no implicit none in the sense you mean. Assignment creates a new variable, so a typo can silently introduce a new name into your scope.
One way to get the effect you want is to use the following ugly-ish hack:
def no_globals(func):
    if func.func_code.co_names:
        raise TypeError(
            'Function "%s" uses the following globals: %s' %
            (func.__name__, ', '.join(func.func_code.co_names)))
    return func
So when you define your function test with the no_globals decorator, you'll get an error, like so:
>>> @no_globals
... def test():
...     y = x + 2  # intended this x to be a local variable but forgot
...                # x was not initialized
...     print y
...
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "<stdin>", line 5, in no_globals
TypeError: Function "test" uses the following globals: x
>>>
>>> x = 3
>>> test()
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
NameError: name 'test' is not defined
Just avoid using globally-scoped variables at all. And if you must, prefix their names with something you'll never use in a local variable name.
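A tiny sketch of that prefix convention (the g_ prefix is just an illustration):

g_record_count = 0  # obviously global at every use site

def process():
    record_count = 10  # a typo can no longer silently pick up the global
    print g_record_count + record_count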
If you were really worried about this, you could try the following:
def test():
    try:
        x
    except NameError:
        pass      # no global x exists; fall through and fail loudly below
    else:
        return    # a global x exists; bail out rather than use it silently
    y = x + 2
    print y
But I'd recommend simply being mindful when writing a function that you don't try to reference things before assigning them. If possible, try to test each function separately, with a variety of carefully-defined inputs and intended outputs. There are a variety of testing suites and strategies, not to mention the simple assert keyword.
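As a minimal illustration of the assert approach (add_two is a hypothetical helper, not from the question):

def add_two(x):
    return x + 2

# Each assert documents an intended input/output pair and fails loudly otherwise.
assert add_two(3) == 5
assert add_two(0) == 2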
In Python, this is quite simply entirely legal; in fact, it is a strength of the language! This lack of an error is the reason you can do something like this:
def function1():
    # stuff here
    function2()

def function2():
    pass
Whereas in C, you would need to "forward declare" function2.
There are static syntax checkers (like flake8) for Python that do plenty of work to catch errors and bad style, but this is not an error, and it is not caught by such a checker. Otherwise, something like this would be an error:
FILENAME = '/path/to/file'
HOSTNAME = 'example.com'

def main():
    with open(FILENAME, 'w') as f:
        f.write(HOSTNAME)
Or, something even more basic like this would be an error:
import sys

def main():
    sys.stdout.write('blah')
The best thing you can do is use a different naming convention (like ALL_CAPS) for module-level variable declarations. Also, make it a habit to put all of your code within a function (no module-level logic) to prevent variables from leaking into the global namespace.
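A sketch combining both habits (the names are illustrative):

FILENAME = '/path/to/file'  # ALL_CAPS signals a module-level name

def main():
    count = 0  # clearly local; cannot be mistaken for a module-level name
    print count

if __name__ == '__main__':
    main()  # no logic at module level, so nothing leaks into the global namespace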
Is there any way to ensure all variables used in test() are local to it and that there are no side effects.
No. The language offers no such functionality.
There is the built-in locals() function, so you could write:
y = locals()['x'] + 2
but I cannot imagine anyone considering that to be an improvement.
To make sure the correct variable is used, you need to limit the scope of the lookup. Inside a function, Python resolves a name in the local scope first (including the function's parameters), then in any enclosing function scopes, then in the global scope, and finally among the built-ins. This can cause annoying bugs if the function depends on a global variable that gets changed elsewhere.
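A small demonstration of that lookup order (a sketch; the string values are just labels):

x = 'global'

def outer():
    x = 'enclosing'
    def inner():
        x = 'local'
        print x  # 'local': the innermost binding wins
    inner()

outer()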
To avoid using a global variable by accident, you can define the function with a keyword argument for each variable you're going to use:
def test(x=None):
    y = x + 2  # intended this x to be a local variable but forgot
               # x was not initialized
    print y

x = 3
test()
I'm guessing you don't want to do this for lots of variables. However, it will stop the function from using globals.
Actually, even if you want to use a global variable in the function, I think it's best to make it explicit:
x = 2

def test(x=x):
    y = x + 2  # intended this x to be a local variable but forgot
               # x was not initialized
    print y

x = 3
test()
This example will use x=2 in the function no matter what happens to the global value of x afterwards: the default value is evaluated once, when the def statement runs, so inside the function x is fixed to the value the global had at definition time.
I started passing global variables as keyword arguments after getting burned a couple times. I think this is generally considered good practice?
The offered solutions are interesting, especially the one using dis.dis, but you are really thinking in the wrong direction: you don't want to write such cumbersome code.
Are you afraid that you will reach a global accidentally? Then don't write globals. The purpose of module globals is mostly to be reached. (In a comment I read that you have 50 globals in scope, which suggests to me some design errors.)
If you still DO have to have globals, then use a naming convention (UPPER_CASE is recommended for constants, which could cover your cases).
If a naming convention is not an option either, just put the functions that shouldn't reach any global into a separate module, and do not define globals there. For instance, create a module pure_funcs, write your "pure" functions there, and then import it. Since Python has lexical scoping, functions can only reach variables defined in the enclosing scopes of the module in which they were written (plus locals and built-ins, of course). Something like this:
# Define no globals here, just the functions (which are globals btw)
def pure1(arg1, arg2):
    print x  # This will raise an error, no way you can mix things up.
I have a function in a program that I'm working on, and I named a variable inside this function that I wanted to make global. For example:
def test():
    a = 1
    return a

test()
print(a)
And I just can't access a, because it keeps saying that a is not defined.
Any help would be great, thanks.
I have made some changes to your function.
def test():
    # Here I'm making the variable global
    global a
    a = 1
    return a
Now if you call test() and then do
print(a)
it outputs
1
As Vaibhav Mule answered, you can create a global variable inside a function but the question is why would you?
First of all, you should be careful with using any kind of global variable, as it is often considered bad practice. Creating a global from inside a function is even worse: it makes the code extremely unreadable and hard to understand. Imagine you are reading code where some random a is used. You first have to find where that thing was created, and then try to find out what happens to it during the program's execution. If the code is not small, you will be doomed.
So the answer to your question is: simply use global a before the assignment, but you shouldn't.
BTW, if you want a feature like C++'s static variables, check this question out.
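For reference, one common idiom for such a static-like variable (a sketch of my own, not from the linked question) stores the state on the function object itself:

def counter():
    # Function objects can carry attributes; this value survives between calls.
    counter.count = getattr(counter, 'count', 0) + 1
    return counter.count

print(counter())  # 1
print(counter())  # 2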
First, it is important to ask why one would want that. What a function returns is (normally) the result of a local computation. Having said that, if I have to use the return value of a function in the global scope, it's simply easier to assign it to a global variable. For example, in your case:
def test():
    a = 1  # valid, this 'a' is local
    return a

a = test()  # valid, this 'a' is global
print(a)
Still, it's important to ask why I would want to do that, normally.
I'm just learning how Python works, and after reading for a while I'm still confused about globals and proper function arguments. Consider the case where globals are not modified inside functions, only referenced.
Can globals be used instead of function arguments?
I've heard that using globals is considered bad practice. Would it be so in this case?
Calling the function without arguments:
def myfunc():
    print myvalue

myvalue = 1
myfunc()
Calling the function with arguments:
def myfunc(arg):
    print arg

myvalue = 1
myfunc(myvalue)
I've heard that using globals is considered bad practice. Would it be so in this case?
It depends on what you're trying to achieve. If myfunc() is supposed to print any value, then...
def myfunc(arg):
    print arg

myfunc(1)
...is better, but if myfunc() should always print the same value, then...
myvalue = 1

def myfunc():
    print myvalue

myfunc()
...is better, although with an example so simple, you may as well factor out the global, and just use...
def myfunc():
    print 1

myfunc()
Yes, making a variable global works in these cases instead of passing it in as a function argument. But the problem is that as soon as you start writing bigger functions, you quickly run out of names, and it becomes hard to keep track of variables that are defined globally. If you only want to read a variable and not modify it, there is no need to declare it global in the function.
Read about the cons of global variables here: Are global variables bad?
There are several reasons why using function arguments is better than using globals:
It eliminates possible confusion: once your program gets large, it will become really hard to keep track of which global is used where. Passing function arguments lets you be much more clear about which values the function uses.
There's a particular mistake you WILL make eventually if you use globals, which will look very strange until you understand what's going on. It has to do with both modifying and reading a global variable in the same function. More on this later.
Global variables all live in the same namespace, so you will quickly run into the problem of overlapping names. What if you want two different variables named "index"? Calling them index1 and index2 is going to get real confusing, real fast. Using local variables, or function parameters, means that they all live in different namespaces, and the potential for confusion is greatly reduced.
Now, I mentioned modifying and reading a global variable in the same function, and a confusing error that can result. Here's what it looks like:
record_count = 0  # Global variable

def func():
    print "Record count:", record_count
    # Do something, maybe read a record from a database
    record_count = record_count + 1  # Would normally use += 1 here, but it's easier to see what's happening with the "n = n + 1" syntax
This will FAIL: UnboundLocalError: local variable 'record_count' referenced before assignment
Wait, what? Why is record_count being treated as a local variable, when it's clearly global? Well, if you never assigned to record_count in your function, then Python would use the global variable. But when you assign a value to record_count, Python has to guess what you mean: whether you want to modify the global variable, or whether you want to create a new local variable that shadows (hides) the global variable, and deal only with the local variable. And Python will default to assume that you're being smart with globals (i.e., not modifying them without knowing exactly what you're doing and why), and assume that you meant to create a new local variable named record_count.
But if you're accessing a local variable named record_count inside your function, Python won't let you access the global variable with the same name inside the function. This is to spare you some really nasty, hard-to-track-down bugs. Which means that if this function has a local variable named record_count -- and it does, because of the assignment statement -- then all access to record_count is considered to be accessing the local variable. Including the access in the print statement, before the local variable's value is defined. Thus, the UnboundLocalError exception.
Now, an exercise for the reader. Remove the print statement and notice that the UnboundLocalError exception is still thrown. Can you figure out why? (Hint: before assigning to a variable, the value on the right-hand side of the assignment has to be calculated.)
Now: if you really want to use the global record_count variable in your function, the way to do it is with Python's global statement, which says "Hey, this variable name I'm about to specify? Don't ever make it a local variable, even if I assign to it. Assign to the global variable instead." The way it works is just global record_count (or any other variable name), at the start of your function. Thus:
record_count = 0  # Global variable

def func():
    global record_count
    print "Record count:", record_count
    # Do something, maybe read a record from a database
    record_count = record_count + 1  # Again, you would normally use += 1 here
This will do what you expected in the first place. But hopefully now you understand why it will work, and the other version won't.
It depends on what you want to do.
If you need to change the value of a variable that is declared outside of the function, then you can't just pass it as an argument, since that would create a "copy" of that name inside the function's scope.
However, if you only want to work with the value of a variable, you should pass it as an argument. The advantage of this is that you can't mess up the global variable by accident.
Also, you should define global variables before they are used.
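A sketch contrasting the two approaches described above (the names are illustrative):

total = 0  # global

def add_with_global(n):
    global total       # needed because we rebind the module-level name
    total = total + n

def add_pure(total, n):
    return total + n   # works on the passed-in value; the caller rebinds

add_with_global(5)
total = add_pure(total, 5)
print total  # 10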