How to use Tkinter's 'file' variable across different functions in Python?

I'm using Python 2.7. I'm also using a library known as id3reader to get metadata from mp3 files. If I use this code:
import tkFileDialog
import id3reader
file = tkFileDialog.askopenfile()
id3r = id3reader.Reader(file)
print(id3r.getValue('performer'))
everything works just fine, and the name of the song's artist is printed to the console.
However, I am trying to do this across different functions. So if I use this code:
import tkFileDialog
import id3reader
def Load(self):
    file = tkFileDialog.askopenfile()

def Display(self):
    id3r = id3reader.Reader(file)
    print(id3r.getValue('performer'))
I get an error coming from within the id3reader script. If I use:
self.file
or
fileName = file
global fileName
I get a global variable not defined error.
How would I be able to use the built-in 'file' variable across different functions?

You're confusing a bunch of different things.
First, the built-in file variable is the actual type of file objects. You don't want to use that; you're shadowing it with the file object you got back from askopenfile().
And file is not a Tkinter variable: neither the builtin nor the one you're creating has anything to do with Tkinter.
The reason your code isn't working is that, inside the Load function, when you write file = tkFileDialog.askopenfile(), you're creating a local variable. That local variable shadows the global variable of the same name until the function exits, at which point the local goes away.
Your attempt to use self.file is a great solution—except you don't have any classes. If you want to learn about how to use classes in general, and the idiomatic way to use them with Tkinter in particular, that's a great thing to learn, but it's too much to teach in a StackOverflow answer, and Python already comes with a great tutorial.
If you want to use a global variable, you can do that, but (a) you have to use global file, not global fileName, if you want file to be global, and (b) you have to put that inside the Load function, not at the top level. If you do both of those, then that file = tkFileDialog.askopenfile() will now reassign the global variable file, instead of hiding it with a local variable, so the value will still be available once you're done, to any other function that wants to access it.
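For example, a minimal sketch of that global-variable version (dropping the self parameters, since there is no class here) could look like this:
import tkFileDialog
import id3reader

def Load():
    global file                       # rebind the module-level name, not a local
    file = tkFileDialog.askopenfile()

def Display():
    id3r = id3reader.Reader(file)     # reads the module-level 'file' set by Load()
    print(id3r.getValue('performer'))

Load()
Display()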
However, a better solution is to not try to share a global variable. Just return the value, and have the caller hold onto it and pass it into Display. Since I can't see the code you're using to call those functions, I'll have to make something up, but hopefully you can understand it and apply it to your own code:
def Load():
    return tkFileDialog.askopenfile()

def Display(file):
    id3r = id3reader.Reader(file)
    print(id3r.getValue('performer'))

f = Load()
Display(f)

Related

accessing and changing module level variable [duplicate]

I've run into a bit of a wall importing modules in a Python script. I'll do my best to describe the error, why I run into it, and why I'm trying this particular approach to solve my problem (which I will describe in a second):
Let's suppose I have a module in which I've defined some utility functions/classes, which refer to entities defined in the namespace into which this auxiliary module will be imported (let "a" be such an entity):
module1:
def f():
    print a
And then I have the main program, where "a" is defined, into which I want to import those utilities:
import module1
a=3
module1.f()
Executing the program will trigger the following error:
Traceback (most recent call last):
File "Z:\Python\main.py", line 10, in <module>
module1.f()
File "Z:\Python\module1.py", line 3, in f
print a
NameError: global name 'a' is not defined
Similar questions have been asked in the past (two days ago, d'uh) and several solutions have been suggested; however, I don't really think these fit my requirements. Here's my particular context:
I'm trying to make a Python program which connects to a MySQL database server and displays/modifies data with a GUI. For cleanliness' sake, I've defined the bunch of auxiliary/utility MySQL-related functions in a separate file. However, they all have a common variable, which I had originally defined inside the utilities module, and which is the cursor object from the MySQLdb module.
I later realised that the cursor object (which is used to communicate with the db server) should be defined in the main module, so that both the main module and anything that is imported into it can access that object.
End result would be something like this:
utilities_module.py:
def utility_1(args):
    pass  # code which references a variable named "cur"

def utility_n(args):
    pass  # etcetera
And my main module:
program.py:
import MySQLdb, Tkinter
db=MySQLdb.connect(#blahblah) ; cur=db.cursor() #cur is defined!
from utilities_module import *
And then, as soon as I try to call any of the utilities functions, it triggers the aforementioned "global name not defined" error.
A particular suggestion was to have a "from program import cur" statement in the utilities file, such as this:
utilities_module.py:
from program import cur
#rest of function definitions
program.py:
import Tkinter, MySQLdb
db=MySQLdb.connect(#blahblah) ; cur=db.cursor() #cur is defined!
from utilities_module import *
But that's cyclic import or something like that and, bottom line, it crashes too. So my question is:
How in hell can I make the "cur" object, defined in the main module, visible to those auxiliary functions which are imported into it?
Thanks for your time and my deepest apologies if the solution has been posted elsewhere. I just can't find the answer myself and I've got no more tricks in my book.
Globals in Python are global to a module, not across all modules. (Many people are confused by this, because in, say, C, a global is the same across all implementation files unless you explicitly make it static.)
There are different ways to solve this, depending on your actual use case.
Before even going down this path, ask yourself whether this really needs to be global. Maybe you really want a class, with f as an instance method, rather than just a free function? Then you could do something like this:
import module1
thingy1 = module1.Thingy(a=3)
thingy1.f()
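For instance, module1 might look roughly like this (Thingy is just a placeholder name for the sketch):
# module1.py
class Thingy(object):
    def __init__(self, a):
        self.a = a      # store the value on the instance instead of in a global

    def f(self):
        print self.a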
If you really do want a global, but it's just there to be used by module1, set it in that module.
import module1
module1.a=3
module1.f()
On the other hand, if a is shared by a whole lot of modules, put it somewhere else, and have everyone import it:
import shared_stuff
import module1
shared_stuff.a = 3
module1.f()
… and, in module1.py:
import shared_stuff
def f():
    print shared_stuff.a
Don't use a from import unless the variable is intended to be a constant. from shared_stuff import a would create a new a variable initialized to whatever shared_stuff.a referred to at the time of the import, and this new a variable would not be affected by assignments to shared_stuff.a.
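A quick illustration of that pitfall, assuming shared_stuff.py gives a an initial value such as None:
# shared_stuff.py
a = None

# some other module
import shared_stuff
from shared_stuff import a      # binds a new name to the current value of shared_stuff.a

shared_stuff.a = 3
print shared_stuff.a            # 3
print a                         # still None; the from-imported copy never changes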
Or, in the rare case that you really do need it to be truly global everywhere, like a builtin, add it to the builtin module. The exact details differ between Python 2.x and 3.x. In 3.x, it works like this:
import builtins
import module1
builtins.a = 3
module1.f()
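In 2.x, which the original question targets, the module is named __builtin__ instead, so the equivalent would be:
import __builtin__
import module1
__builtin__.a = 3
module1.f()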
As a workaround, you could consider setting environment variables in the outer layer, like this.
main.py:
import os
os.environ['MYVAL'] = str(myintvariable)
mymodule.py:
import os
myval = None
if 'MYVAL' in os.environ:
    myval = os.environ['MYVAL']
As an extra precaution, handle the case when MYVAL is not defined inside the module.
This post is just an observation about Python behaviour I encountered. Maybe the advice above doesn't work for you if you did the same thing I did below.
Namely, I have a module which contains global/shared variables (as suggested above):
#sharedstuff.py
globaltimes_randomnode=[]
globalist_randomnode=[]
Then I had the main module which imports the shared stuff with:
import sharedstuff as shared
and some other modules that actually populate these arrays. These are called by the main module. When exiting these other modules, I could clearly see that the arrays were populated, but when reading them back in the main module they were empty. This was rather strange to me (well, I am new to Python). However, when I changed the way I import sharedstuff.py in the main module to:
from sharedstuff import *
it worked (the arrays were populated).
Just sayin'
A function uses the globals of the module it's defined in. Instead of setting a = 3, for example, you should be setting module1.a = 3. So, if you want cur available as a global in utilities_module, set utilities_module.cur.
A better solution: don't use globals. Pass the variables you need into the functions that need it, or create a class to bundle all the data together, and pass it when initializing the instance.
The easiest solution to this particular problem would have been to add another function within the module that would have stored the cursor in a variable global to the module. Then all the other functions could use it as well.
module1:
cursor = None

def setCursor(cur):
    global cursor
    cursor = cur

def method(some, args):
    global cursor
    do_stuff(cursor, some, args)
main program:
import module1
cursor = get_a_cursor()
module1.setCursor(cursor)
module1.method("some", "args")
Since globals are module-specific, you can add the following function to all imported modules, and then use it to:
Add individual variables (passed as a dictionary) as globals for those modules
Transfer your main module's globals to them
addglobals = lambda x: globals().update(x)
Then all you need to do to pass on the current globals is:
import module
module.addglobals(globals())
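For illustration, the receiving module might hypothetically look like this:
# module.py (hypothetical)
addglobals = lambda x: globals().update(x)

def show():
    print a          # 'a' only exists here after addglobals() has injected it

# main.py
import module

a = 42
module.addglobals(globals())
module.show()        # prints 42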
Since I haven't seen it in the answers above, I thought I would add my simple workaround, which is just to add a global_dict argument to the function requiring the calling module's globals, and then pass the dict into the function when calling; e.g.:
# external_module
def imported_function(global_dict=None):
    print(global_dict["a"])
# calling_module
a = 12
from external_module import imported_function
imported_function(global_dict=globals())
>>> 12
The OOP way of doing this would be to make your module a class instead of a set of unbound methods. Then you could use __init__ or a setter method to set the variables from the caller for use in the module methods.
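A minimal sketch of that object-oriented layout, with made-up names (Utilities, utility_1), might be:
# utilities.py
class Utilities(object):
    def __init__(self, cur=None):
        self.cur = cur              # shared state lives on the instance

    def set_cursor(self, cur):      # setter alternative to passing it to __init__
        self.cur = cur

    def utility_1(self, query):
        return self.cur.execute(query)

# caller
# util = Utilities(cur)             # or: util = Utilities(); util.set_cursor(cur)
# util.utility_1("SELECT 1")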
Update
To test the theory, I created a module and put it on pypi. It all worked perfectly.
pip install superglobals
Short answer
This works fine in Python 2 or 3:
import inspect
def superglobals():
    _globals = dict(inspect.getmembers(
        inspect.stack()[len(inspect.stack()) - 1][0]))["f_globals"]
    return _globals
save as superglobals.py and employ in another module thusly:
from superglobals import *
superglobals()['var'] = value
Extended Answer
You can add some extra functions to make things more attractive.
def superglobals():
    _globals = dict(inspect.getmembers(
        inspect.stack()[len(inspect.stack()) - 1][0]))["f_globals"]
    return _globals

def getglobal(key, default=None):
    """
    getglobal(key[, default]) -> value

    Return the value for key if key is in the global dictionary, else default.
    """
    _globals = dict(inspect.getmembers(
        inspect.stack()[len(inspect.stack()) - 1][0]))["f_globals"]
    return _globals.get(key, default)

def setglobal(key, value):
    _globals = superglobals()
    _globals[key] = value

def defaultglobal(key, value):
    """
    defaultglobal(key, value)

    Set the value of global variable `key` if it is not otherwise set.
    """
    _globals = superglobals()
    if key not in _globals:
        _globals[key] = value
Then use thusly:
from superglobals import *
setglobal('test', 123)
defaultglobal('test', 456)
assert(getglobal('test') == 123)
Justification
The "python purity league" answers that litter this question are perfectly correct, but in some environments (such as IDAPython) which is basically single threaded with a large globally instantiated API, it just doesn't matter as much.
It's still bad form and a bad practice to encourage, but sometimes it's just easier. Especially when the code you are writing isn't going to have a very long life.

Passing variables between two files both ways in python

Okay, so I know this may seem stupid, but I am currently making a game using multiple files: one main file that receives all the variables from the other files and uses them in various ways. I'm using
from SPRITES import *
to get these variables over. However, now I need a variable that can only be defined in MAIN to be available in SPRITES (as the platform the player is standing on is located in main, and this needs to change the controls defined in sprites). However, if I just do a
from MAIN import *
this seems to break the connection completely. Help please.
EDIT: Okay, my file is probably too large to post all the code here, but I'll try to post what's relevant (first time here). This is the start of the main 'titleMAIN' file:
import pygame as pg
import random
from titleSETTING import *
from titleSPRITE import *
cont = ("green")
class Game:
    def __init__(self):
        # initialize game window, etc
        pg.init()
and so on
calling upon the Player class in the SPRITES file from the Game class - I need to be able to use the 'cont' variable in the Player class:
def new(self):
    # start a new game
    cont = ("green")
    ...
    self.player = Player(self)
    self.all_sprites.add(self.player)
And here is where I tried to call upon the MAIN file from the SPRITES file:
from titleSETTING import *
...
class Player(pg.sprite.Sprite):
    def __init__(self, game):
Sorry that I was vague; first time here and kind of a novice at coding, no matter how much I enjoy it. BTW, by files I mean different Python (.py) files in the same folder - using IDLE as an IDE, which sucks but it's what I've got.
EDIT 2: Thanks for the responses - I realize that it's probably better to just try to make this all one file instead of two, so as not to overcomplicate the code, so I'll work with that mindset now.
The main reason this wasn't working for you is that circular imports are problematic. If some file A imports some other file B, then you do not want B to import A (even indirectly, via some other files).
So to make it work, I split main into two.
As an aside, using global variables (and import *) tends to make programs harder to read. Instead of a bunch of globals, consider perhaps a single global that has the values you need as fields. Instead of import *, consider explicit imports (just import sprites, and then sprites.foo).
main.py:
from sprites import *
from config import *
print "Hello from main. main_value is: ", main_value
print "sprite value is: ", sprite_value
do_sprite()
sprites.py:
from config import *
sprite_value=10
def do_sprite():
    print "main value is: ", main_value
config.py:
main_value=5
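As a variation on the aside above, the same three files could drop import * in favour of explicit imports, e.g.:
main.py:
import config
import sprites

print "Hello from main. main_value is: ", config.main_value
print "sprite value is: ", sprites.sprite_value
sprites.do_sprite()

sprites.py:
import config

sprite_value = 10

def do_sprite():
    print "main value is: ", config.main_value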
While technically possible (using some obscure Python features), what you're trying to achieve is neither easy, nor actually a good idea.
Note that the fact that you did from moduleX import * doesn't make the variables defined in moduleX magically available in main (or wherever you put the import statement). What it does is create new variables with the same names in your current module and make them point to the same objects as those in moduleX at the moment the import is executed. Let's say there's A in some module named X, and it was initialized to "foo". You did from X import * and now print(A) will show foo. If you now call a function from X and it changes A to bar, it won't affect what you have in main - that is still the object foo. Likewise, if you do A = "baz" in main, functions from X that refer to A will still see X's copy of that variable.
If you need some data to be available to more than one module, it may be best to arrange for all that shared data to be stored in some common object and have an easily-accessible reference to that object in all the modules that need the shared data. Here are possible choices for this object, pick what suits your taste:
it can be a module that's just meant to keep common variables, let's say it is called data or common (have an empty file data.py). Import it everywhere you need to and set variables as data.A = "something" or use them as needed, e.g. print (data.A).
it can be an instance of a class that you define yourself,
e.g.:
class data_class(object):
    # set initial values
    var1 = 10
    A = "foo"
Now, create an instance of it with data = data_class() and pass it to every module that needs it. E.g., define it in one module and import it from everywhere else.
you can also use a Python dictionary (and, like with the class instance, have a reference to it in all modules). You will then refer to your common data items as data["A"], etc.
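A tiny sketch of that last option, keeping the dictionary in a shared module (the file names are illustrative):
common.py:
data = {}

helpers.py:
import common

def show_a():
    print common.data["A"]

main.py:
import common
import helpers

common.data["A"] = "foo"
helpers.show_a()            # prints foo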

executing python code from string loaded into a module

I found the following code snippet that I can't seem to make work for my scenario (or any scenario at all):
def load(code):
    # Delete all local variables
    globals()['code'] = code
    del locals()['code']

    # Run the code
    exec(globals()['code'])

    # Delete any global variables we've added
    del globals()['load']
    del globals()['code']

    # Copy k so we can use it
    if 'k' in locals():
        globals()['k'] = locals()['k']
        del locals()['k']

    # Copy the rest of the variables
    for k in locals().keys():
        globals()[k] = locals()[k]
I created a file called "dynamic_module" and put this code in it, which I then used to try to execute the following code which is a placeholder for some dynamically created string I would like to execute.
import random
import datetime
class MyClass(object):
    def main(self, a, b):
        r = random.Random(datetime.datetime.now().microsecond)
        a = r.randint(a, b)
        return a
Then I tried executing the following:
import dynamic_module
dynamic_module.load(code_string)
return_value = dynamic_module.MyClass().main(1,100)
When this runs it should return a random number between 1 and 100. However, I can't seem to get the initial snippet I found to work for even the simplest of code strings. I think part of my confusion in doing this is that I may misunderstand how globals and locals work and therefore how to properly fix the problems I'm encountering. I need the code string to use its own imports and variables and not have access to the ones where it is being run from, which is the reason I am going through this somewhat over-complicated method.
You should not be using the code you found. It has several big problems, not least that most of it doesn't actually do anything (locals() is a proxy; deleting from it has no effect on the actual locals; it puts any code you execute in the same shared globals; etc.).
Use the accepted answer in that post instead; recast as a function, it becomes:
import imp

def load_module_from_string(code, name='dynamic_module'):
    module = imp.new_module(name)
    exec(code, module.__dict__)
    return module
then just use that:
dynamic_module = load_module_from_string(code_string)
return_value = dynamic_module.MyClass().main(1, 100)
The function produces a new, clean module object.
In general, this is not how you should dynamically import and use external modules. You should be using __import__ within your function to do this. Here's a simple example that worked for me:
import numpy as np  # needed for the arrays used below

plt = __import__('matplotlib.pyplot', fromlist=['plt'])
plt.plot(np.arange(5), np.arange(5))
plt.show()
I imagine that for your specific application (loading from code string) it would be much easier to save the dynamically generated code string to a file (in a folder containing an __init__.py file) and then to call it using __import__. Then you could access all variables and functions of the code as parts of the imported module.
Unless I'm missing something?
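If it helps, that file-based suggestion might look roughly like this (the package and file names are made up, and code_string is the dynamically generated source from the question):
import os

# write the generated source into a small package directory
pkg_dir = 'generated_pkg'
if not os.path.isdir(pkg_dir):
    os.mkdir(pkg_dir)
    open(os.path.join(pkg_dir, '__init__.py'), 'w').close()

with open(os.path.join(pkg_dir, 'dynamic_module.py'), 'w') as f:
    f.write(code_string)

# now import it like any other module
dynamic_module = __import__('generated_pkg.dynamic_module',
                            fromlist=['dynamic_module'])
return_value = dynamic_module.MyClass().main(1, 100)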

python stdout and stderr: Am confused how my file object get passed to when i didn't declare global

I am having a problem understanding sys.stdout and sys.stderr.
The problem is: how did the variable put get into the cmd function?
My aim is to write a single function that accepts a string; basically I am using it to write exceptions caught in my application to the screen and to a log file. I saw similar code somewhere, so I decided to learn more by using the same example I saw, which was a little different from mine, as the other person was writing to two files simultaneously with just one function call.
According to my understanding:
The cmd function receives a string, which it then passes to the output function.
output takes two arguments (one of them must evaluate to a standard output object and the second is a string) - fine.
However, output then calls the logs function, which does the actual printing, or rather combines the parameters by calling the write method of the standard output object, passing it the string or text to be written.
Please bear with me if my explanation is not clear; it means I am truly not understanding the whole process.
My question is: how does the put variable, assigned outside the functions, get into the cmd function (or any other function) when I have commented out the global declarations and never passed it in?
Please find the code below:
import sys

def logs(prints, message):
    #global put
    #print prints
    prints.write(message)
    prints.flush()

def output(prints, message):
    #global put
    #logs(prints, content)
    logs(prints, message)
    #logs(put, via)

''' This is where the confusion is: how did put get into this function when I did
not declare it... '''
def cmd(message):
    #global put
    output(put, message)
    output(sys.stderr, message)

put = open('think.txt', 'w')
#print put, '000000000'
cmd('Write me out to screen/file')
put.close()
It's because of the way that Python handles scopes. When you execute the script, the logs, output and cmd functions are defined in the module namespace. Then put = open('think.txt', 'w') creates a variable called put in the module namespace.
When you call cmd, you are now executing in the function's local namespace. It is created when the function is called and destroyed when the function exits. When Python hits the expression output(put, message), it needs to resolve the names output, put and message to see what to do with them. The rule for a function is that Python looks for a name in the local function namespace and then falls back to the global module namespace if the variable is not found there.
So, Python checks the function namespace for output, doesn't find anything, looks at the module namespace and finds that output refers to a function object. It then checks the function namespace for put, doesn't find anything, looks at the module namespace and finds that put refers to an open file object. Finally, it looks up message, finds it in the function namespace (the function parameters go into the function namespace), and off it goes.
put is defined as a global (module-level) variable, so when you access it from within cmd, you are accessing that global variable without needing to declare it inside the function.
For example, this code prints bar: 5 for the same reason:
def foo():
    print "bar: {0}".format(bar)

bar = 5
foo()

lazy load dictionary

I have a dictionary called fsdata at module level (like a global variable).
The content gets read from the file system. It should load its data once, on first access. Up to now it loads the data while the module is being imported; this should be optimized.
If no code accesses fsdata, the content should not be read from the file system (to save CPU/IO).
Loading should also happen if you check its boolean value:
if mymodule.fsdata:
    ... do_something()
Update: Some code already uses mymodule.fsdata. I don't want to change the other places. It should be a variable, not a function. And "mymodule" needs to stay a module, since it is already used in a lot of code.
I think you should use a Future/Promise, like this: https://gist.github.com/2935416
The main point: you create not the object itself, but a 'promise' of an object, which behaves like the object.
You can replace your module with an object that has descriptor semantics:
class FooModule(object):
    @property
    def bar(self):
        print "get"

import sys
sys.modules[__name__] = FooModule()
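Extending that idea to the question's fsdata, a sketch (the loading itself is just a placeholder here) could be:
import sys

class _LazyFsModule(object):
    _fsdata = None

    @property
    def fsdata(self):
        if self._fsdata is None:
            # placeholder for the real file-system read
            self._fsdata = {'example': 'data'}
        return self._fsdata

_replacement = _LazyFsModule()
_replacement._original_module = sys.modules[__name__]   # keep the real module alive
sys.modules[__name__] = _replacement
To callers, mymodule.fsdata still looks like a plain attribute; nothing is read until the first access, and if mymodule.fsdata: triggers the same lazy load.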
Take a look at http://pypi.python.org/pypi/apipkg for a packaged approach.
You could just create a simple function that memoizes the data:
fsdata = []

def get_fsdata():
    if not fsdata:
        fsdata.append(load_fsdata_from_file())
    return fsdata[0]
(I'm using a list as that's an easy way to make a variable global without mucking around with the global keyword).
Now instead of referring to module.fsdata you can just call module.get_fsdata().
