I know that if I import a module by name, e.g. import(moduleName), then I can reload it with reload(moduleName).
But, I am importing a bunch of modules with a Kleene star:
from proj import *
How can I reload them in this case?
I think there's a way to reload all Python modules. The code below is for Python 2.7. Instead of importing the math module with an asterisk, you can import whatever you need.
from math import *
from sys import *

Alfa = list(modules.keys())  # snapshot of every loaded module name
modules.clear()              # forget them all so they get re-imported
for elem in Alfa:
    stmt = 'from ' + elem + ' import *'
    try:
        exec(stmt)
    except:
        pass
This is a complicated and confusing issue. The method below reloads the module and refreshes the variables in the given context. However, it will fall over if several modules use a starred import on the same module: the others will keep their original values instead of being updated. In general, having to reload a module at all is something you should avoid, except when working in a REPL. Modules aren't meant to be dynamic. Consider other ways of delivering the updates you need.
import sys

def reload_starred(module_name, context):
    # context may be a module name (string) or a namespace dict such as globals()
    if isinstance(context, str):
        context = vars(sys.modules[context])
    module = sys.modules[module_name]
    for name in get_public_module_variables(module):
        try:
            del context[name]
        except KeyError:
            pass
    module = reload(module)
    context.update(get_public_module_variables(module))

def get_public_module_variables(module):
    if hasattr(module, "__all__"):
        return dict((k, v) for (k, v) in vars(module).items()
                    if k in module.__all__)
    else:
        return dict((k, v) for (k, v) in vars(module).items()
                    if not k.startswith("_"))
Usage:
reload_starred("my_module", __name__)
reload_starred("my_module", globals())
reload_starred("my_module", "another_module")
def function():
    from my_module import *
    ...
    reload_starred("my_module", locals())
Related
Does the interpreter somehow keep a timestamp of when a module is imported? Or is there an easy way of hooking into the import machinery to do this?
The scenario is a long-running Python process that at various points imports user-provided modules. I would like the process to be able to check "should I restart to load the latest code changes?" by checking the module file's timestamps against the time the module was imported.
Here's a way to automatically have an attribute (named _loadtime in the example code below) added to modules when they're imported. The code is based on Recipe 10.12 titled "Patching Modules on Import" in the book Python Cookbook, by David Beazley and Brian Jones, O'Reilly, 2013, which shows a technique that I adapted to do what you want.
For testing purposes I created this trivial target_module.py file:
print('in target_module')
Here's the example code:
import importlib
import sys
import time

class PostImportFinder:
    def __init__(self):
        self._skip = set()  # To prevent recursion.

    def find_module(self, fullname, path=None):
        if fullname in self._skip:  # Prevent recursion.
            return None
        self._skip.add(fullname)
        return PostImportLoader(self)

class PostImportLoader:
    def __init__(self, finder):
        self._finder = finder

    def load_module(self, fullname):
        importlib.import_module(fullname)
        module = sys.modules[fullname]
        # Add a custom attribute to the module object.
        module._loadtime = time.time()
        self._finder._skip.remove(fullname)
        return module

sys.meta_path.insert(0, PostImportFinder())

if __name__ == '__main__':
    import time
    try:
        print('importing target_module')
        import target_module
    except Exception as e:
        print('Import failed:', e)
        raise

    loadtime = time.localtime(target_module._loadtime)
    print('module loadtime: {} ({})'.format(
        target_module._loadtime,
        time.strftime('%Y-%b-%d %H:%M:%S', loadtime)))
Sample output:
importing target_module
in target_module
module loadtime: 1604683023.2491636 (2020-Nov-06 09:17:03)
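With _loadtime stamped onto each module, the long-running process can answer its "should I restart?" question by comparing that timestamp against the module file's mtime. A minimal sketch, assuming the modules of interest are ordinary .py files with a __file__ attribute:

import os

def is_stale(module):
    # A module is stale if its source file changed after it was imported.
    source = getattr(module, '__file__', None)
    loadtime = getattr(module, '_loadtime', None)
    if source is None or loadtime is None:
        return False  # built-in modules, or modules imported before the finder was installed
    return os.path.getmtime(source) > loadtime

# e.g. is_stale(target_module) becomes True once target_module.py is edited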
I don't think there's any way to get around how hacky this is, but how about something like this every time you import? (I don't know exactly how you're importing):
import time
from types import ModuleType

# Create a dictionary to keep track of import times.
# Filter globals to exclude things that aren't modules and aren't builtins.
MODULE_TIMES = {k: None for k, v in globals().items()
                if not k.startswith("__") and not k.endswith("__")
                and type(v) == ModuleType}

for module_name in user_module_list:
    MODULE_TIMES[module_name] = time.time()
    exec("import {0}".format(module_name))  # exec, not eval: import is a statement
And then you can reference this dictionary in a similar way later.
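For example, a later check could compare each recorded time against the module's file on disk; a rough sketch, assuming the tracked modules live in regular .py files:

import os
import sys

def changed_since_import(module_times):
    # Yield names of tracked modules whose source file is newer than the recorded import time.
    for name, imported_at in module_times.items():
        source = getattr(sys.modules.get(name), '__file__', None)
        if imported_at and source and os.path.getmtime(source) > imported_at:
            yield name

# e.g. if any(changed_since_import(MODULE_TIMES)): schedule a restart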
Hello, I ran into a circular dependency that I cannot refactor away without duplicating code.
I have something like this (only much more complex):
myParser.py:
import sys
import main  # comment this to make it runnable

def parseEvnt():
    sys.stdout.write("evnt:")
    main.parseCmd(1)  # comment this to make it runnable
tbl.py:
import myParser
tblx = {
    1: ("cmd",),
    2: ("evnt", myParser.parseEvnt),
}
main.py:
import tbl

def parseCmd(d):
    print(tbl.tblx[d][0])

data = [1, 2]
for d in data:
    if d < 2:
        parseCmd(d)
    else:
        fce = tbl.tblx[d][1]
        fce()
The obvious error I'm getting is:
  File "D:\Data\vbe\workspace\marsPython\testCircular2\main.py", line 1, in <module>
    import tbl
  File "D:\Data\vbe\workspace\marsPython\testCircular2\tbl.py", line 1, in <module>
    import myParser
  File "D:\Data\vbe\workspace\marsPython\testCircular2\myParser.py", line 2, in <module>
    import main
  File "D:\Data\vbe\workspace\marsPython\testCircular2\main.py", line 7, in <module>
    parseCmd(d)
  File "D:\Data\vbe\workspace\marsPython\testCircular2\main.py", line 3, in parseCmd
    print(tbl.tblx[d][0])
AttributeError: module 'tbl' has no attribute 'tblx'
In C, I think I would simply declare in tbl.py that a function parseEvnt() exists. I would not need to include myParser, and there would be no circular include.
In Python I do not know how to do that.
I have read a few threads, and there is always some wise guy recommending refactoring. But in this case parseCmd() needs to see tblx, which needs to see parseEvnt() (short of a forward declaration for the function), and parseEvnt() needs to call parseCmd() (because the event contains the triggering cmd and I do not want to duplicate the cmd-decoding code).
Is there a way to make this work in Python?
You can frequently get away with circular dependencies as long as the modules don't try to use each other's data until all importing is complete. As a practical matter, that means always referencing through the module namespace (from module import something is forbidden) and only using the other module inside functions and methods (no mystuff = module.mystuff in the global scope). That's because when importing starts, Python puts the module name in sys.modules and won't try to import that module again.
You ran into trouble because when you run main.py, Python adds __main__ to sys.modules. When the code finally came around to import main, there was no "main" in the module list, so main.py was imported again... and its top-level code tried to run.
Let's rearrange your test case and throw in a few print statements to show when each import happens.
myParser.py
print(' + importing myParser')
import sys

print('import parsecmd')
import parsecmd

def parseEvnt():
    sys.stdout.write("evnt:")
    parsecmd.parseCmd(1)
tbl.py
print(' + importing tbl')

print('import myParser')
import myParser

tblx = {
    1: ("cmd",),
    2: ("evnt", myParser.parseEvnt),
}
parsecmd.py (new)
print(' + importing parsecmd')

print('import tbl')
import tbl

def parseCmd(d):
    print(tbl.tblx[d][0])
main.py
print('running main.py')
print('import parsecmd')
import parsecmd

if __name__ == "__main__":
    data = [1, 2]
    for d in data:
        if d < 2:
            parsecmd.parseCmd(d)
        else:
            fce = parsecmd.tbl.tblx[d][1]
            fce()
When I run it I get
running main.py
import parsecmd
+ importing parsecmd
import tbl
+ importing tbl
import myParser
+ importing myParser
import parsecmd <-- didn't reimport parsecmd
cmd
evnt:cmd
If you're insistent on not refactoring (which is the real solution to this - not being a wise guy), you could move your problematic import into your function in myParser.py
import sys

def parseEvnt():
    import main  ## import moved into the function
    sys.stdout.write("evnt:")
    main.parseCmd(1)
Again, see if you can redesign your code so such interdependencies are avoided.
The above solution is sort of a hack and won't solve future problems you might run into due to this dependency.
Circular imports should be avoided. Refactoring is required; any workaround that still relies on a circular import is not a good solution.
That being said, the refactoring doesn't have to be extensive. There are at least a couple of fairly simple solutions.
Solution 1: move shared functions to a shared module
Since you want to use parseCmd from more than one place, move it to a separate file. That way both main.py and myParser.py can import the function.
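A minimal sketch of that layout, along the lines of the parsecmd.py rearrangement shown in the earlier answer (file names are only suggestions):

# parsecmd.py -- shared module holding the function both sides need
import tbl

def parseCmd(d):
    print(tbl.tblx[d][0])

# myParser.py -- imports the shared module instead of main
import sys
import parsecmd

def parseEvnt():
    sys.stdout.write("evnt:")
    parsecmd.parseCmd(1)

main.py then imports parsecmd as well and calls parsecmd.parseCmd directly, so nothing ever has to import main.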
Solution 2: have main pass parseCmd to parseEvnt.
First, make parseEvnt accept an argument to tell it which function to run:
# myParser.py
import sys

def parseEvnt(parseCmd):
    sys.stdout.write("evnt:")
    parseCmd(1)
Next, when you call myParser.parseEvnt, pass in a reference to main.parseCmd:
# main.py:
...
else:
    fce = tbl.tblx[d][1]
    fce(parseCmd)
There are other ways to accomplish the same thing. For example, you could add a "configure" method in myParser, and then have main.py call the configure method and pass in a reference to its parseCmd. The configure method can then store this reference in a global variable.
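A rough sketch of that configure-style approach (the configure name and the _parseCmd global are illustrative, not part of the original code):

# myParser.py
import sys

_parseCmd = None  # filled in by configure() before parseEvnt() is used

def configure(parse_cmd):
    global _parseCmd
    _parseCmd = parse_cmd

def parseEvnt():
    sys.stdout.write("evnt:")
    _parseCmd(1)

# main.py
import tbl
import myParser

def parseCmd(d):
    print(tbl.tblx[d][0])

myParser.configure(parseCmd)  # hand over the reference once, at startup

With this, myParser never imports main, so the cycle disappears.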
Another choice is to import main in the function that uses it:
myParser.py
import sys

def parseEvnt():
    import main
    sys.stdout.write("evnt:")
    main.parseCmd(1)
Is there any way to make an implicit initializer for modules (not packages)?
Something like:
#file: mymodule.py
def __init__(val):
    global value
    value = val
And when you import it:
#file: mainmodule.py
import mymodule(5)
The import statement uses the builtin __import__ function.
Therefore it's not possible to have a module __init__ function.
You'll have to call it yourself:
import mymodule
mymodule.__init__(5)
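A slightly less magical variant of the same idea is to give the module an ordinary initialisation function (init here is just a suggested name) and call it right after importing:

# mymodule.py
value = None

def init(val):
    global value
    value = val

# mainmodule.py
import mymodule
mymodule.init(5)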
These things often are not closed as duplicates, so here's a really nice solution from Pass Variable On Import. TL;DR: use a config module, configure that before importing your module.
[...] A cleaner way to do it which is very useful for multiple configuration
items in your project is to create a separate Configuration module
that is imported by your wrapping code first, and the items set at
runtime, before your functional module imports it. This pattern is
often used in other projects.
myconfig/__init__.py :
PATH_TO_R_SOURCE = '/default/R/source/path'
OTHER_CONFIG_ITEM = 'DEFAULT'
PI = 3.14
mymodule/__init__.py :
import myconfig
import rpy2.robjects as robjects  # assumed: robjects comes from rpy2 in the original example

PATH_TO_R_SOURCE = myconfig.PATH_TO_R_SOURCE
robjects.r.source(PATH_TO_R_SOURCE, chdir=True)  ## this takes time

class SomeClass:
    def __init__(self, aCurve):
        self._curve = aCurve

# VERSION is injected by the wrapping code (run.py below) before this module is imported
if myconfig.VERSION is not None:
    version = myconfig.VERSION
else:
    version = "UNDEFINED"

two_pi = myconfig.PI * 2
And you can change the behaviour of your module at runtime from the
wrapper:
run.py :
import myconfig
myconfig.PATH_TO_R_SOURCE = 'actual/path/to/R/source'
myconfig.PI = 3.14159
# we can even add a new configuration item that isn't present in the original myconfig:
myconfig.VERSION="1.0"
import mymodule
print "Mymodule.two_pi = %r" % mymodule.two_pi
print "Mymodule.version is %s" % mymodule.version
Output:
> Mymodule.two_pi = 6.28318
> Mymodule.version is 1.0
I'm writing a piece of reusable code to import where I need it, but it needs some info about what is importing it. I have a workaround that does what I want, but it's a bit ugly. Is there a better way?
Here is a simplified version of what I'm doing.
What I want: Import a method and use it, but look at f in mod2. It needs some info from the importing module.
mod1:
from mod2 import f
f(...)
mod2:
from things_i_want import parent_module, importing_module

def f(*args, **kwargs):
    from importing_module.parent_module import models
    # ... do some stuff with it, including populating v with a string
    v = 'some_string'
    m = getattr(importing_module, v, None)
    if callable(m):
        return m(*args, **kwargs)
My ugly workaround:
mod1:
from mod2 import f as _f

def f(*a, **k):
    return _f(__name__, globals(), *a, **k)

f(...)
mod2:
def f(module_name, globs, *args, **kwargs):
    # find the parent module's path
    parent_module_path = module_name.split('.')[0:-1]
    # find the models module's path
    models_path = parent_module_path + ['models']
    # import it
    models = __import__('.'.join(models_path), {}, {}, [''])
    # ... do some stuff with it, including populating v with a string
    v = 'some_string'
    if v in globs:
        return globs[v](*args, **kwargs)
That's a bad idea, because modules are cached.
So if another module, say mod3.py, also imports mod2, it will get the same mod2 object as the first time; the module is not re-imported.
And maybe you imported some other module that itself imported mod2 before you imported it yourself - then you're not the one importing mod2 anymore. Modules are imported only once.
So instead of trying to figure out who imported the module, use another, reusable approach. Perhaps using classes and passing the instance around?
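For instance, rather than mod2 guessing who imported it, the caller can hand over the pieces it needs explicitly; a minimal sketch with hypothetical names (Helper, models):

# mod2.py
class Helper:
    def __init__(self, models, namespace):
        self._models = models        # the caller's models module
        self._namespace = namespace  # e.g. the caller's globals()

    def f(self, *args, **kwargs):
        # ... do some stuff with self._models, including populating v with a string
        v = 'some_string'
        m = self._namespace.get(v)
        if callable(m):
            return m(*args, **kwargs)

# mod1.py
# from myapp import models      # whatever the caller wants mod2 to work with
# from mod2 import Helper
# helper = Helper(models, globals())
# helper.f(...)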
Say I have a package "mylibrary".
I want to make "mylibrary.config" available for import, either as a dynamically created module, or a module imported from an entirely different place that would then basically be "mounted" inside the "mylibrary" namespace.
I.e., I do:
import sys, types
sys.modules['mylibrary.config'] = types.ModuleType('config')
Given that setup:
>>> import mylibrary.config # -> works
>>> from mylibrary import config
<type 'exceptions.ImportError'>: cannot import name config
Even stranger:
>>> import mylibrary.config as X
<type 'exceptions.ImportError'>: cannot import name config
So it seems that using the direct import works, the other forms do not. Is it possible to make those work as well?
You need to monkey-patch the module not only into sys.modules, but also into its parent module:
>>> import sys,types,xml
>>> xml.config = sys.modules['xml.config'] = types.ModuleType('xml.config')
>>> import xml.config
>>> from xml import config
>>> from xml import config as x
>>> x
<module 'xml.config' (built-in)>
In addition to the following:
import sys, types
config = types.ModuleType('config')
sys.modules['mylibrary.config'] = config
You also need to do:
import mylibrary
mylibrary.config = config
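If you do this in more than one place, both steps can be wrapped in a small helper; a rough sketch (mount_module is a made-up name, not part of the standard library):

import sys
import types

def mount_module(dotted_name, module=None):
    # Register a module under a dotted name in sys.modules AND as an
    # attribute of its parent package, so every import form works.
    parent_name, _, child_name = dotted_name.rpartition('.')
    if module is None:
        module = types.ModuleType(dotted_name)
    sys.modules[dotted_name] = module
    if parent_name:
        __import__(parent_name)  # make sure the parent package is loaded
        setattr(sys.modules[parent_name], child_name, module)
    return module

# mount_module('mylibrary.config')
# from mylibrary import config   # now works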
You can try something like this:
class VirtualModule(object):
    def __init__(self, modname, subModules):
        try:
            import sys
            self._mod = __import__(modname)
            sys.modules[modname] = self
            __import__(modname)
            self._modname = modname
            self._subModules = subModules
        except ImportError as err:
            pass  # please signal the error in some useful way :-)

    def __repr__(self):
        return "Virtual module for " + self._modname

    def __getattr__(self, attrname):
        if attrname in self._subModules.keys():
            import sys
            __import__(self._subModules[attrname])
            return sys.modules[self._subModules[attrname]]
        else:
            return self._mod.__dict__[attrname]

VirtualModule('mylibrary', {'config': 'actual_module_for_config'})

import mylibrary
mylibrary.config
mylibrary.some_function