How to edit global variable in module? - python

I am writing a Python module with a function that defines a new variable in the module. I want to set a variable that can be accessed in the file that imports the module. If that is confusing, here is my code:
# main.py
import other_module
other_module.set_variable("var1")
print(other_module.var1) # This is not a NameError
print(var1) # NameError
However, if I do something slightly different:
# main.py
from other_module import *
set_variable("var1")
print(var1) # NameError
print(other_module.var1) # NameError
And other_module.py:
# other_module.py
def set_variable(name):
    exec("""
global %s
%s = 5
""" % (name, name))
I have no control over main.py; that is the consumer's code. I want to be able to access and change main.py's globals. I want this to work:
# main.py
from other_module import *
set_variable("var")
print(var) # This should print 5

What you are doing sounds like classmethod behavior to me. A class will be safer to use than the global namespace, so why not try one?
This works:
# other.py
class Other(object):
    @classmethod
    def set_variable(cls, name):
        exec('Other.%s = 5' % name)  # setattr(cls, name, 5) would avoid exec
# main.py
from other import Other
Other.set_variable('x')
print Other.x
# output
% ./main.py
5
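If writing into the importing module's namespace really is required, here is a sketch that relies on sys._getframe, a CPython implementation detail, to write into the caller's globals. This is fragile by design; the class above is the safer route.

```python
# other_module.py (sketch): inject a name into the *caller's* globals.
# sys._getframe is a CPython implementation detail, not portable Python.
import sys

def set_variable(name, value=5):
    # f_globals of the calling frame is the importer's module namespace
    sys._getframe(1).f_globals[name] = value

# a consumer's main.py would do:
#   from other_module import set_variable
#   set_variable("var")
set_variable("var")
print(var)  # prints 5 when called at module level
```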


python not able to share variable across files [duplicate]

This question already has answers here:
How to change a module variable from another module?
(3 answers)
Closed 1 year ago.
So I have read that the way to share globals across files is to create a module that holds the globals and import it in every Python file that needs access. However, it doesn't seem to work as expected for me (Python 3.6+).
Simple directory structure:
run.py
mypack/
-- globals.py
-- stuff.py
-- __init__.py
I have a global var in globals.py which I need to modify in main file (run.py) and finally print out while exiting the program. It does not seem to work:
__init__.py:
from .stuff import *
globals.py:
test = 'FAIL'
stuff.py:
import atexit
from .globals import test
def cya():
    print("EXIT: test = " + test)

atexit.register(cya)

def hi():
    print('HI')
run.py:
import mypack
import mypack.globals as globals
mypack.hi()
globals.test = 'PASS'
print ("MAIN: test = " + globals.test)
Output on script execution:
HI
MAIN: test = PASS
EXIT: test = FAIL
Clearly the exit routine (cya) did not see the value of the global that was modified in run.py. Not sure what I am doing wrong.
the python documentation might help you on this one.
https://docs.python.org/3/faq/programming.html#how-do-i-share-global-variables-across-modules
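The FAQ's point can be shown without the package layout; the cfg module below is a stand-in built with types.ModuleType. A from-import binds the value at import time, while attribute access sees later changes, which is exactly why cya() printed the stale value:

```python
import types

# stand-in for globals.py from the question
cfg = types.ModuleType("cfg")
cfg.test = "FAIL"

snapshot = cfg.test   # like: from .globals import test
cfg.test = "PASS"     # like: globals.test = 'PASS' in run.py

print(snapshot)       # FAIL -- the name was bound to the old value
print(cfg.test)       # PASS -- attribute lookup reads the module each time
```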
Thanks to @PeterTrcka for pointing out the issue. Also thanks to @buran for noting that globals is a bad name for a module, since it shadows the built-in globals() function. Here is the working solution:
directory structure:
run.py
mypack/
-- universal.py
-- stuff.py
-- __init__.py
__init__.py:
from .stuff import *
universal.py:
class GlobalVars:
    test = 'FAIL'
stuff.py:
import atexit
from .universal import GlobalVars
def cya():
    print("EXIT: test = " + GlobalVars.test)

atexit.register(cya)

def hi():
    print('HI')
run.py:
import mypack
from mypack.universal import GlobalVars
mypack.hi()
GlobalVars.test = 'PASS'
print ("MAIN: test = " + GlobalVars.test)
Output on script execution:
HI
MAIN: test = PASS
EXIT: test = PASS
The issue was that from .globals import test binds the name to the value at import time, so a later reassignment of the module attribute is not seen. Use a singleton object:
universal.py
import logging

class GlobVars:
    _instances = {}
    def __new__(cls, logger, counter_start=0):
        if cls not in cls._instances:
            print("Creating Instance")
            cls._instances[cls] = super(GlobVars, cls).__new__(cls)
        return cls._instances[cls]
    def __init__(self, logger, counter_start=0):
        self.logger = logger
        self.counter = counter_start

glob_vars = GlobVars(logger=logging.getLogger("basic_logger"))
run.py
from universal import glob_vars
glob_vars.logger.info("One logger rules them all")
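A quick check that the __new__-based singleton above hands back the same instance every time (same class, with the logger argument dropped for brevity):

```python
class GlobVars:
    _instances = {}
    def __new__(cls, counter_start=0):
        if cls not in cls._instances:
            cls._instances[cls] = super(GlobVars, cls).__new__(cls)
        return cls._instances[cls]
    def __init__(self, counter_start=0):
        # caveat: __init__ still runs on *every* call, re-binding counter
        self.counter = counter_start

a = GlobVars(counter_start=1)
b = GlobVars()
print(a is b)     # True: both names refer to the one shared instance
print(b.counter)  # 0: the second call's __init__ re-ran
```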

Python 2.6 unittest - how to set a value to use for a global variable in a function that you're testing

I'm having trouble setting the value of a global variable in a function that I'm writing for unit tests.
The function is probably not ready to be used in a test. Or at least to be used to test in an easy manner, but I'm trying to work around that.
Here is an example of the function I'm trying to test:
def my_func_with_globals(filepath):
    spos = filepath.find(__my_global_var1)
    new_path = filepath[0:spos] + __my_global_var2
    return new_path

def some_function():
    ...
    my_func_with_globals(filepath)
    ...

if __name__ == '__main__':
    global __my_global_var1
    __my_global_var1 = 'value1'
    global __my_global_var2
    __my_global_var2 = 'value2'
    ...
    some_function()
And here is an example of my test:
import unittest
from my_module import *
class UnitTestMyModule(unittest.TestCase):
    def test_my_func_with_globals(self):
        self.assertEqual(my_func_with_globals('arbitrary/file/path'), 'valid output')
Another example of my test using @kdopen's suggestion (gives me the same error):
import unittest
import my_module
class UnitTestMyModule(unittest.TestCase):
    def test_my_func_with_globals(self):
        my_module.__my_global_var1 = 'some/value'
        my_module.__my_global_var2 = 'second_val'
        self.assertEqual(my_module.my_func_with_globals('arbitrary/file/path'), 'valid output')
I keep getting the error:
NameError: global name '__my_global_var1' is not defined.
I've tried a few different things, but I can't get anything to work. Using unittest.mock.patch looks like it would work perfectly, but I'm stuck with what I currently have with v2.6.4.
The globals are defined with a double leading underscore, so they are not imported by the from my_module import * statement.
You can make them accessible with the following:
from my_module import __my_global_var1, __my_global_var2
Alternatively, if you used import my_module you can access them as my_module.__my_global_var1 etc.
But I don't see any reference to the global variables in your sample test case.
Here's a simple example
a.py
__global1 = 1

def foo():
    return __global1
b.py:
import a
print "global1: %d" % a.__global1
print "foo: %d" % a.foo()
a.__global1 = 2
print "foo: %d" % a.foo()
And running b.py
$ python2.6 b.py
global1: 1
foo: 1
foo: 2
UPDATE:
Dang it, missed the obvious
You declare the variables inside the if __name__ == '__main__': test. That code doesn't run on import, only when you execute python my_module.py from the command line.
On import, __name__ will be set to 'my_module', not '__main__'.
So, yes - they are undefined when you call your unit test.

python module __init__ function

Is there any way to make an implicit initializer for modules (not packages)?
Something like:
#file: mymodule.py
def __init__(val):
    global value
    value = 5
And when you import it:
#file: mainmodule.py
import mymodule(5)
The import statement uses the builtin __import__ function.
Therefore it's not possible to have a module __init__ function.
You'll have to call it yourself:
import mymodule
mymodule.__init__(5)
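Rather than reusing the __init__ name, an explicit module-level configure() function is the usual idiom (the names here are illustrative, not from the question):

```python
# mymodule.py (sketch): module state set up by an explicit call
_value = None

def configure(val):
    # module-level "constructor": the consumer calls this once
    global _value
    _value = val

def get_value():
    return _value

# mainmodule.py would then do:
#   import mymodule
#   mymodule.configure(5)
configure(5)
print(get_value())  # 5
```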
These things often are not closed as duplicates, so here's a really nice solution from Pass Variable On Import. TL;DR: use a config module, configure that before importing your module.
[...] A cleaner way to do it which is very useful for multiple configuration
items in your project is to create a separate Configuration module
that is imported by your wrapping code first, and the items set at
runtime, before your functional module imports it. This pattern is
often used in other projects.
myconfig/__init__.py :
PATH_TO_R_SOURCE = '/default/R/source/path'
OTHER_CONFIG_ITEM = 'DEFAULT'
PI = 3.14
mymodule/__init__.py :
import myconfig
PATH_TO_R_SOURCE = myconfig.PATH_TO_R_SOURCE
robjects.r.source(PATH_TO_R_SOURCE, chdir = True) ## this takes time
class SomeClass:
    def __init__(self, aCurve):
        self._curve = aCurve

if myconfig.VERSION is not None:
    version = myconfig.VERSION
else:
    version = "UNDEFINED"
two_pi = myconfig.PI * 2
And you can change the behaviour of your module at runtime from the
wrapper:
run.py :
import myconfig
myconfig.PATH_TO_R_SOURCE = 'actual/path/to/R/source'
myconfig.PI = 3.14159
# we can even add a new configuration item that isn't present in the original myconfig:
myconfig.VERSION="1.0"
import mymodule
print "Mymodule.two_pi = %r" % mymodule.two_pi
print "Mymodule.version is %s" % mymodule.version
Output:
> Mymodule.two_pi = 6.28318
> Mymodule.version is 1.0

how do I make a dynamically imported module available in another module or file?

I have 3 files a.py, b.py, c.py
I am trying to dynamically import a class called "C" defined in c.py from within a.py
and have the evaluated name available in b.py
python a.py currently catches the NameError. I'm trying to avoid this and create an instance in b.py which calls C.do_int(10).
a.py
import b
#older
#services = __import__('services')
#interface = eval('services.MyRestInterface')
# python2.7
import importlib
module = importlib.import_module('c')
interface = eval('module.C')
# will work
i = interface()
print i.do_int(10)
# interface isn't defined in b.py after call to eval
try:
    print b.call_eval('interface')
except NameError:
    print "b.call_eval('interface'): interface is not defined in b.py"
b.py
def call_eval(name):
    interface = eval(name)
    i = interface()
    return i.do_int(10)
c.py
class C(object):
    my_int = 32
    def do_int(self, number):
        self.my_int += number
        return self.my_int
How can I achieve this?
interface only exists in a's namespace. You can put a reference to the interface into b's namespace like this
b.interface = interface
try:
    print b.call_eval('interface')
except NameError:
    print "b.call_eval('interface'): interface is not defined in b.py"
I'm not sure why you're not just passing the interface to call_eval though
I'm sure there should be a better solution by totally avoiding this.
But this could do the trick:
a.py:
shared_variables = {}
import b
import c
shared_variables['C'] = c.C
b.do_something_with('C')
b.py:
from __main__ import shared_variables
def do_something_with(name):
    print(shared_variables[name])
If a.py already loads the class, I fail to see the reason to pass it by name. Instead, do
# b.py
def call_eval(klass):
    i = klass()
    return i.do_int(10)
and, in a.py, do
import importlib
module = importlib.import_module('c')
interface = getattr(module, 'C')
b.call_eval(interface)
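The import_module/getattr pair can be checked against a stdlib module, with fractions standing in for the question's c.py:

```python
import importlib

# dynamic import by name, then attribute lookup -- no eval needed
module = importlib.import_module("fractions")
interface = getattr(module, "Fraction")

i = interface(1, 2)
print(i + interface(1, 3))  # 5/6
```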

Importing classes contained in a module

I have the following files in my directory:
foo/
    foo.py
    foolib/
        __init__.py
        bar.py
Within __init__.py:
__all__ = ["bar"]
Within bar.py:
class Bar:
    def __init__(self):
        None

    def hello(self):
        print("Hello World")
        return

def hi():
    print("Hi World")
Now if I have the following code within foo.py:
from foolib import *
bar.hi()
foobar = Bar()
foobar.hello()
"Hi World" prints, but I get a NameError for Bar(). If I explicitly import the module:
from foolib.bar import *
I get the expected output "Hello World".
Is there a way for me to import classes from the modules, without explicitly calling them? I feel like I am missing something in the __init__ file. Either that or I am flagrantly violating some Python best practice.
To use the class, you must import it somewhere. When you do from foolib import *, your __init__.py's __all__ imports the module bar; it does not pull the names inside that module into scope.
If you want everything in bar to be available from the foolib package without having to import bar explicitly, you could put this in __init__.py:
from .bar import *
(The leading dot is required for the relative import in Python 3.) This makes everything in bar available directly in foolib.
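To see the re-export work end to end, the layout from the question can be built in a temporary directory; this is a self-contained sketch, with the class bodies shortened to return strings:

```python
import os
import sys
import tempfile
import textwrap

# build foolib/ with bar.py and an __init__.py that re-exports it
root = tempfile.mkdtemp()
pkg = os.path.join(root, "foolib")
os.makedirs(pkg)

with open(os.path.join(pkg, "bar.py"), "w") as f:
    f.write(textwrap.dedent("""\
        class Bar:
            def hello(self):
                return "Hello World"

        def hi():
            return "Hi World"
    """))

# the leading dot makes the import relative to the package
with open(os.path.join(pkg, "__init__.py"), "w") as f:
    f.write("from .bar import *\n")

sys.path.insert(0, root)
import foolib

print(foolib.Bar().hello())  # Hello World
print(foolib.hi())           # Hi World
```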
