This question already has answers here: How to change a module variable from another module? (3 answers). Closed 1 year ago.
I have read that the way to share globals across files is to create a module which holds the globals and import it in every Python file that needs access to them. However, it doesn't seem to work as expected for me (Python 3.6+).
Simple directory structure:
run.py
mypack/
-- globals.py
-- stuff.py
-- __init__.py
I have a global variable in globals.py which I need to modify in the main file (run.py) and finally print out when the program exits. It does not seem to work:
__init__.py:
from .stuff import *
globals.py:
test = 'FAIL'
stuff.py:
import atexit
from .globals import test

def cya():
    print("EXIT: test = " + test)

atexit.register(cya)

def hi():
    print('HI')
run.py:
import mypack
import mypack.globals as globals
mypack.hi()
globals.test = 'PASS'
print ("MAIN: test = " + globals.test)
Output on script execution:
HI
MAIN: test = PASS
EXIT: test = FAIL
Clearly the exit routine (cya) did not see the new value of the global that was modified in run.py. I'm not sure what I am doing wrong.
The Python documentation might help you on this one:
https://docs.python.org/3/faq/programming.html#how-do-i-share-global-variables-across-modules
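For reference, here is a minimal sketch of the pattern that FAQ entry describes (the file names are hypothetical): keep the shared state in one module and always refer to it through the module object, so every importer sees the same attribute.
cfg.py:
# the one place that owns the shared state
x = 'FAIL'
worker.py:
import cfg

def report():
    # looked up through the module object, so it always sees the current value
    print("worker sees: " + cfg.x)
main.py:
import cfg
import worker

cfg.x = 'PASS'    # rebind the attribute on the shared module
worker.report()   # prints: worker sees: PASS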
Thanks to @PeterTrcka for pointing out the issue. Also thanks to @buran for noting that globals is a bad name for a module, since it shadows the built-in globals() function. Here is the working solution:
directory structure:
run.py
mypack/
-- universal.py
-- stuff.py
-- __init__.py
__init__.py:
from .stuff import *
universal.py:
class GlobalVars:
    test = 'FAIL'
stuff.py:
import atexit
from .universal import GlobalVars

def cya():
    print("EXIT: test = " + GlobalVars.test)

atexit.register(cya)

def hi():
    print('HI')
run.py:
import mypack
from mypack.universal import GlobalVars
mypack.hi()
GlobalVars.test = 'PASS'
print ("MAIN: test = " + GlobalVars.test)
Output on script execution:
HI
MAIN: test = PASS
EXIT: test = PASS
The issue was that the imported name is bound only once, at import time, so later reassignments of the module attribute are not seen through that copied name. Use a singleton object:
universal.py
import logging

class GlobVars:
    _instances = {}

    def __new__(cls, logger, counter_start=0):
        if cls not in cls._instances:
            print("Creating Instance")
            cls._instances[cls] = super(GlobVars, cls).__new__(cls)
        return cls._instances[cls]

    def __init__(self, logger, counter_start=0):
        self.logger = logger
        self.counter = counter_start

glob_vars = GlobVars(logger=logging.getLogger("basic_logger"))
run.py
from universal import glob_vars
glob_vars.logger.info("One logger rules them all")
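To see why the original version printed FAIL: from .globals import test copies the binding of test into stuff.py at import time, while run.py only rebinds the attribute on the module object. A small stand-alone sketch of that difference (hypothetical module name state):
state.py:
test = 'FAIL'
demo.py:
import state
from state import test   # copies the current binding into this namespace

state.test = 'PASS'
print(test)         # FAIL - the copied name still refers to the old object
print(state.test)   # PASS - attribute lookup sees the rebinding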
Related
I'm having trouble setting the value of a global variable in a function that I'm writing for unit tests.
The function is probably not ready to be tested, or at least not tested in an easy manner, but I'm trying to work around that.
Here is an example of the function I'm trying to test:
def my_func_with_globals(filepath):
    spos = filepath.find(__my_global_var1)
    new_path = filepath[0:spos] + __my_global_var2
    return new_path

def some_function():
    ...
    my_func_with_globals(filepath)
    ...

if __name__ == '__main__':
    global __my_global_var1
    __my_global_var1 = 'value1'
    global __my_global_var2
    __my_global_var2 = 'value2'
    ...
    some_function()
And here is an example of my test:
import unittest
from my_module import *

class UnitTestMyModule(unittest.TestCase):
    def test_my_func_with_globals(self):
        self.assertEqual(my_func_with_globals('arbitrary/file/path'), 'valid output')
Another example of my test using @kdopen's suggestion (gives me the same error):
import unittest
import my_module

class UnitTestMyModule(unittest.TestCase):
    def test_my_func_with_globals(self):
        my_module.__my_global_var1 = 'some/value'
        my_module.__my_global_var2 = 'second_val'
        self.assertEqual(my_module.my_func_with_globals('arbitrary/file/path'), 'valid output')
I keep getting the error:
NameError: global name '__my_global_var1' is not defined.
I've tried a few different things, but I can't get anything to work. Using unittest.mock.patch looks like it would work perfectly, but I'm stuck with what I currently have with v2.6.4.
The globals are defined with a double leading underscore, so they are not imported by the from my_module import * statement.
You can make them accessible with the following:
from my_module import __my_global_var1, __my_global_var2
Alternatively, if you used import my_module you can access them as my_module.__my_global_var1 etc.
But I don't see any reference to the global variables in your sample test case.
Here's a simple example:
a.py
__global1 = 1

def foo():
    return __global1
b.py:
import a
print "global1: %d" % a.__global1
print "foo: %d" % a.foo()
a.__global1 = 2
print "foo: %d" % a.foo()
And running b.py
$ python2.6 b.py
global1: 1
foo: 1
foo: 2
UPDATE:
Dang it, missed the obvious
You declare the variables within the if test. That code doesn't run on import - only when you execute python my_module.py from the command line.
During import, __name__ will be set to 'my_module', not '__main__'.
So, yes - they are undefined when you call your unit test.
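Building on that diagnosis, one way to make the function testable is to give the globals module-level defaults, so they exist as soon as the module is imported. A minimal sketch with hypothetical values (setattr is used in the test to sidestep class-private name mangling of double-underscore names inside the test class):
my_module.py:
# module-level defaults exist at import time, not only under the __main__ guard
__my_global_var1 = '/base'
__my_global_var2 = '/other'

def my_func_with_globals(filepath):
    spos = filepath.find(__my_global_var1)
    return filepath[0:spos] + __my_global_var2
test_my_module.py:
import unittest
import my_module

class UnitTestMyModule(unittest.TestCase):
    def test_my_func_with_globals(self):
        setattr(my_module, '__my_global_var2', '/patched')
        self.assertEqual(my_module.my_func_with_globals('/root/base/file'),
                         '/root/patched')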
I am writing a Python module and I have a function that defines a new variable in the module. I want to set a variable that can be accessed in the file that imports the module. If that is confusing, here is my code:
# main.py
import other_module
other_module.set_variable("var1")
print(other_module.var1) # This is not a NameError
print(var1) # NameError
However, if I do something slightly different:
# main.py
from other_module import *
set_variable("var1")
print(var1) # NameError
print(other_module.var1) # NameError
And other_module.py:
# other_module.py
def set_variable(name):
    exec("""
global %s
%s = 5
""" % (name, name))
I have no control over main.py; that is the consumer's code. I want to be able to access and change main.py's globals. I want this to work:
# main.py
from other_module import *
set_variable("var")
print(var) # This should print 5
What you are doing sounds like class-method behaviour to me.
A class will be safer to use than the global namespace, so try a class.
This works:
# other.py
class Other(object):
    @classmethod
    def set_variable(cls, name):
        exec('Other.%s = 5' % name)
# main.py
from other import Other
Other.set_variable('x')
print Other.x
# output
% ./main.py
5
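As a side note, the exec call isn't strictly necessary here: setattr performs the same assignment without compiling a string, which is a little safer if name ever comes from outside input. A sketch of that variant (same hypothetical Other class):
# other.py
class Other(object):
    @classmethod
    def set_variable(cls, name):
        setattr(cls, name, 5)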
In my problem I have a Python script that is started by a user, like:
# file main.py
import sys
import mymodule
config = sys.argv[1]
which imports another module containing functions, classes etc. like
# module.py
def check_config():
    # do something with the content of 'config'

class Module(object):
    # does something with the content of 'config'
How can I access the value of 'config' from within module.py? Should I use a 'global' variable here? Or is there a more sophisticated, pythonic way to solve this problem?
Also, I do not want to define an argument 'config' to each function and class I use in other modules...
Further remark: main.py imports other modules, not the other way around...
Instead of trying to wrangle global into doing this, you should pass config as a parameter.
file main.py
import sys
import mymodule
config = sys.argv[1]
checked = mymodule.check_config(config)
mod = mymodule.Module(config)
module.py
def check_config(config):
    # do something with the content of 'config'

class Module(object):
    # does something with the content of 'config'
    def __init__(self, config):
        # initialise with config
Always avoid using global when you can. If you need to modify config, just have a module function return it:
config = change_config(config)
module.py
def change_config(config):
    ...
    return config
However, an alternative method is to define a name within module.py that holds nothing by default. Then, as soon as main.py has imported module.py and the config data is ready, you can assign the data to module.py's config name. Like this:
file main.py
import sys
import mymodule
config = sys.argv[1]
mymodule.config = config
mymodule.test_config()
mymodule.check_config()
mymodule.Module()
module.py
config = None

def test_config():
    print config
    # this will refer to the value supplied from file main.py
Note however that the values in the module and the main file are not linked. If you reassign config in main.py for any reason, you have to pass that value to the module again. However, if you pass a mutable value like a dict or list, then you can modify it in main.py and the changes will be shared.
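To make that last point concrete, here is a small sketch (hypothetical names) where the shared value is a dict, so in-place changes made in main.py are visible inside the module:
module.py:
config = {}

def show():
    print(config)   # sees whatever main.py has put into the dict
main.py:
import module

module.config['path'] = '/tmp/data'   # mutate in place, don't rebind
module.show()                         # prints: {'path': '/tmp/data'}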
I don't recommend using a global variable, but here's the design you should use if you do. config needs to be defined in mymodule; after you import the module, you can set the value of mymodule.config the way you are currently setting config.
# file main.py
import sys
import mymodule
mymodule.config = sys.argv[1]
# module.py
# The exact value doesn't matter, as long as we create the name.
# None is good as it conveys the lack of a value; it's part of your
# module's contract, presumably, that a proper value must be assigned
# before you can use the rest of the module.
config = None
def check_config():
    # do something with the content of 'config'

class Module(object):
    # does something with the content of 'config'
A global variable is almost never the answer. Just allow the functions and classes in your "library" (module.py or mymodule.py, you seem to use both) to accept arguments. So:
mymodule.py
def check_config(configuration):
    pass

class Module(object):
    def __init__(self, configuration):
        self.config = configuration

class ConfigError(Exception):
    pass
Then when you want to use them in your "application" code:
main.py
import sys
import mymodule

config = sys.argv[1]
if mymodule.check_config(config):
    myobject = mymodule.Module(config)
else:
    raise mymodule.ConfigError('Unrecognized configuration format.')
Could you describe what your app should do? Right now it's not clear why you want this.
Maybe an environment variable could help you?
Btw, you can read the config file in one place (a module) and import everything you need from it.
config.py
import os

if os.environ['sys'] == 'load_1':
    import load_1 as load
    i = 12
else:
    import load_2 as load
    i = 13
main.py
import config
config.load("some_data")
print config.i
Is there any way to make an implicit initializer for modules (not packages)?
Something like:
#file: mymodule.py
def __init__(val):
    global value
    value = 5
And when you import it:
#file: mainmodule.py
import mymodule(5)
The import statement uses the builtin __import__ function.
Therefore it's not possible to have a module __init__ function.
You'll have to call it yourself:
import mymodule
mymodule.__init__(5)
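For completeness, a minimal sketch of that pattern put together (hypothetical, based on the snippets above), with the initialiser actually storing the value it receives:
#file: mymodule.py
value = None   # default until the importer initialises the module

def __init__(val):
    global value
    value = val

#file: mainmodule.py
import mymodule

mymodule.__init__(5)
print(mymodule.value)   # 5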
These things often are not closed as duplicates, so here's a really nice solution from Pass Variable On Import. TL;DR: use a config module, configure that before importing your module.
[...] A cleaner way to do it which is very useful for multiple configuration items in your project is to create a separate Configuration module that is imported by your wrapping code first, and the items set at runtime, before your functional module imports it. This pattern is often used in other projects.
myconfig/__init__.py :
PATH_TO_R_SOURCE = '/default/R/source/path'
OTHER_CONFIG_ITEM = 'DEFAULT'
PI = 3.14
mymodule/__init__.py :
import myconfig

PATH_TO_R_SOURCE = myconfig.PATH_TO_R_SOURCE
robjects.r.source(PATH_TO_R_SOURCE, chdir=True)  ## this takes time

class SomeClass:
    def __init__(self, aCurve):
        self._curve = aCurve

if myconfig.VERSION is not None:
    version = myconfig.VERSION
else:
    version = "UNDEFINED"

two_pi = myconfig.PI * 2
And you can change the behaviour of your module at runtime from the wrapper:
run.py :
import myconfig
myconfig.PATH_TO_R_SOURCE = 'actual/path/to/R/source'
myconfig.PI = 3.14159
# we can even add a new configuration item that isn't present in the original myconfig:
myconfig.VERSION="1.0"
import mymodule
print "Mymodule.two_pi = %r" % mymodule.two_pi
print "Mymodule.version is %s" % mymodule.version
Output:
> Mymodule.two_pi = 6.28318
> Mymodule.version is 1.0
I have 3 files: a.py, b.py and c.py. I am trying to dynamically import a class called "C" defined in c.py from within a.py, and have the evaluated name available in b.py.
python a.py is currently catching the NameError. I'm trying to avoid this and create an instance in b.py which calls C.do_int(10).
a.py
import b

# older
# services = __import__('services')
# interface = eval('services.MyRestInterface')

# python2.7
import importlib
module = importlib.import_module('c')
interface = eval('module.C')

# will work
i = interface()
print i.do_int(10)

# interface isn't defined in b.py after call to eval
try:
    print b.call_eval('interface')
except NameError:
    print "b.call_eval('interface'): interface is not defined in b.py"
b.py
def call_eval(name):
    interface = eval(name)
    i = interface()
    return i.do_int(10)
c.py
class C(object):
    my_int = 32

    def do_int(self, number):
        self.my_int += number
        return self.my_int
How can I achieve this?
interface only exists in a's namespace. You can put a reference to the interface into b's namespace like this
b.interface = interface
try:
    print b.call_eval('interface')
except NameError:
    print "b.call_eval('interface'): interface is not defined in b.py"
I'm not sure why you're not just passing the interface to call_eval, though.
I'm sure there is a better solution that avoids this entirely, but this could do the trick:
a.py:
shared_variables = {}
import b
import c
shared_variables['C'] = c.C
b.do_something_with('C')
b.py:
from __main__ import shared_variables
def do_something_with(name):
    print(shared_variables[name])
If a.py already loads the class, I fail to see the reason to pass it by name. Instead, do
# b.py
def call_eval(klass):
    i = klass()
    return i.do_int(10)
and, in a.py, do
import importlib
module = importlib.import_module('c')
interface = getattr(module, 'C')
b.call_eval(interface)