Python error importing a child module

parent/__init__.py:

favorite_numbers = [1]

def my_favorite_numbers():
    for num in favorite_numbers:
        num

my_favorite_numbers()
from .child import *
my_favorite_numbers()

parent/child.py:

print favorite_numbers
favorite_numbers.append(7)
I then created a file named tst.py, one directory up from the parent directory:

import parent

So the directory structure looks like this:

parent (directory)
    __init__.py (file)
    child.py (file)
tst.py (file)
And I get this error upon execution:
NameError: name 'favorite_numbers' is not defined
How can I add a value to favorite_numbers within child.py so that when I execute the my_favorite_numbers() function, I get 1 and 7?

In Python, each module has its own separate globals. That's actually the whole point of modules (as opposed to, say, C preprocessor-style text inserts).
When you do from .child import *, that imports .child, then copies all of its globals into the current module's globals. They're still separate modules, with their own globals.
If you want to pass values between code in different modules, you probably want to wrap that code up in functions, then pass the values as function arguments and return values. For example:
parent/__init__.py:

from .child import *

favorite_numbers = [1]

def my_favorite_numbers():
    for num in favorite_numbers:
        num

my_favorite_numbers()
child_stuff(favorite_numbers)
my_favorite_numbers()

parent/child.py:

def child_stuff(favorite_numbers):
    print favorite_numbers
    favorite_numbers.append(7)
In fact, you almost always want to wrap up any code besides initialization (defining functions and classes, creating constants and other singletons, etc.) in a function anyway. When you import a module (including from … import), that only runs its top-level code the first time. If you import again, the module object already exists in memory (inside sys.modules), so Python will just use that, instead of running the code to build it again.
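As a small illustration of that caching (a sketch, assuming the fixed parent package above is importable):

import sys
import parent            # runs parent/__init__.py; the top-level code executes only once
import parent as again   # no re-execution; the cached module object is reused
print(again is sys.modules['parent'])  # True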
If you really want to push a value into another module's namespace, you can, but you have to do it explicitly. And this means you have to have the module object available by importing it, not just importing from it:
from . import child
child.favorite_numbers = favorite_numbers
But this is rarely a good idea.

Did you ever run setup.py, or otherwise "build" your library?
I would create a setup.py file and likely run it in develop mode. See Python setup.py develop vs install.
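A minimal setup.py could look roughly like this (the package name is an assumption, so treat it as a sketch rather than your exact configuration):

## setup.py
from setuptools import setup, find_packages

setup(
    name='parent',
    version='0.1',
    packages=find_packages(),
)

After that, python setup.py develop (or the equivalent pip install -e .) installs the package in editable mode, so edits are picked up without reinstalling.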

Related

Call a function for every script inside a folder

Is there a way (using only Python, i.e. without a bash script or code in another language) to call a specific function in every script inside a folder, without needing to import all of them explicitly?
For example, let's say that this is my structure:
main.py
modules/
    module1.py
    module2.py
    module3.py
    module4.py
and every moduleX.py has this code:
import os

def generic_function(caller):
    print('{} was called by {}'.format(os.path.basename(__file__), caller))

def internal_function():
    print('ERROR: Someone called an internal function')
while main.py has this code:
import modules
import os

for module in modules.some_magic_function():
    module.generic_function(os.path.basename(__file__))
So if I run main.py, I should get this output:
module1.py was called by main.py
module2.py was called by main.py
module3.py was called by main.py
module4.py was called by main.py
Please note that internal_function() shouldn't be called (unlike this question). Also, I don't want to declare every module file explicitly, not even in an __init__.py.
By the way, I don't mind using classes for this. In fact, it could be even better.
You can use exec or eval to do that. So it would go roughly this way (for exec):
def magic_execute():
    import os
    import glob
    for pyfl in glob.glob(os.path.join(MYPATH, '*.py')):
        with open(pyfl, 'rt') as fh:
            pycode = fh.read()
        pycode += '\ngeneric_function({!r})'.format(__file__)
        exec(pycode)
The assumption here is that you are not going to import the modules at all.
Please note that there are numerous security issues related to using exec in such an unrestricted manner. You can increase security a bit, for example as sketched below.
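One way to contain the snippets a little is to execute them against an explicit, throwaway namespace rather than the caller's globals (a rough sketch, not a real sandbox; the names are illustrative):

snippet_globals = {'__builtins__': __builtins__}  # or a whitelisted subset of builtins
exec(pycode, snippet_globals)  # names defined by the snippet stay out of the caller's namespace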
While sophros' approach is quick and sufficient for implicitly importing the modules, you could have issues controlling every module or making more complex calls (like having conditions for each call). So I went with another approach:
First I created a class with the function(s) (now methods) declared. With this I can avoid checking whether the method exists, since I can fall back on the default one if I didn't override it:
# main.py
class BaseModule:
    def __init__(self):
        # Any code
        pass

    def generic_function(self, caller):
        # This could be a print (or default return value) or an exception
        raise Exception("generic_function wasn't overridden or it was used with super")
Then I created another class that extends BaseModule. Sadly I wasn't able to find a good way of checking inheritance without knowing the name of the child class, so I used the same class name in every module:
# modules/moduleX.py
import os

from main import BaseModule

class GenericModule(BaseModule):
    def __init__(self):
        BaseModule.__init__(self)
        # Any code

    def generic_function(self, caller):
        print('{} was called by {}'.format(os.path.basename(__file__), caller))
Finally, in my main.py, I used importlib to import the modules dynamically and create an instance of each one so I can use them later (for the sake of simplicity I don't save the instances in the following code, but it's as easy as appending each one to a list):
# main.py
import importlib
import os

if __name__ == '__main__':
    relPath = 'modules'  # this has to be relative to the working directory
    for pyFile in os.listdir('./' + relPath):
        # only load Python (.py) files, skipping __init__.py and the like
        if pyFile.endswith('.py') and not pyFile.startswith('__'):
            # modules have to be loaded with dots instead of slashes in the path and without
            # the extension; the modules folder also needs an __init__.py file
            module = importlib.import_module('{}.{}'.format(relPath, pyFile[:-3]))
            # test whether there is actually a suitable class defined in the module (see [1])
            try:
                moduleInstance = module.GenericModule()
                # you can do whatever you want here: save moduleInstance in a list and call
                # the method later, or keep its return value
                moduleInstance.generic_function(os.path.basename(__file__))
            except AttributeError:
                # NOTE: this fires on ANY AttributeError, including those caused by a typo,
                # so print or raise something here for diagnosis
                print('WARN:', pyFile, "doesn't have a GenericModule class or there was a typo in its content")
References:
[1] Check for class existence
[2] Import module dynamically
[3] Method Overriding in Python

Module namespace initialisation before execution

I'm trying to dynamically update code during runtime by reloading modules using importlib.reload. However, I need a specific module variable to be set before the module's code is executed. I could easily set it as an attribute after reloading but each module would have already executed its code (e.g., defined its default arguments).
A simple example:
# module.py
def do():
    try:
        print(a)
    except NameError:
        print('failed')

# main.py
import module

module.do()  # prints failed
module.a = 'succeeded'
module.do()  # prints succeeded
The desired pseudocode:
import_module_without_executing_code module
module.initialise(a = 'succeeded')
module.do()
Is there a way to control module namespace initialisation (like with classes using metaclasses)?
It's not usually a good idea to use reload other than for interactive debugging. For example, it can easily create situations where two objects of type module.A are not the same type.
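For instance, a minimal sketch of how reload can break type checks (module.A is hypothetical here):

import importlib
import module

obj = module.A()
importlib.reload(module)
isinstance(obj, module.A)  # False: obj still refers to the old class object, module.A is a new one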
What you want is execfile. Pass a globals dictionary (you don't need an explicit locals dictionary) to keep each execution isolated; anything you store in it ahead of time acts exactly like the "pre-set" variables you want. If you do want to have a "real" module interface change, you can have a wrapper module that calls (or just holds as an attribute) the most recently loaded function from your changing file.
Of course, since you're using Python 3, you'll have to use one of the replacements for execfile.
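For example, a common Python 3 stand-in for execfile, using module.py from the question (a sketch; the pre-seeded a is exactly the "pre-set" variable discussed above):

namespace = {'a': 'succeeded'}  # the values you want visible before the code runs
with open('module.py') as f:
    code = compile(f.read(), 'module.py', 'exec')
exec(code, namespace)
namespace['do']()  # prints succeeded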
Strictly speaking, I don't believe there is a way to do what you're describing in Python natively. However, assuming you own the module you're trying to import, a common approach with Python modules that need some initializing input is to use an init function.
If all you need is some internal variables to be set, like a in your example above, that's easy: just declare some module-global variables and set them in your init function:
Demo: https://repl.it/MyK0
Module:
## mymodule.py
a = None

def do():
    print(a)

def init(_a):
    global a
    a = _a
Main:
## main.py
import mymodule

mymodule.init(123)
mymodule.do()
mymodule.init('foo')
mymodule.do()
Output:
123
foo
Where things can get trickier is if you need to actually redefine some functions because some dynamic internal something is dependent on the input you give. Here's one solution, borrowed from https://stackoverflow.com/a/1676860. Basically, the idea is to grab a reference to the current module by using the magic variable __name__ to index into the system module dictionary, sys.modules, and then define or overwrite the functions that need it. We can define the functions locally as inner functions, then add them to the module:
Demo: https://repl.it/MyHT/2
Module:
## mymodule.py
import sys

def init(a):
    current_module = sys.modules[__name__]

    def _do():
        try:
            print(a)
        except NameError:
            print('failed')

    current_module.do = _do

How does importing a function from a module work in Python?

I have a module some_module.py which contains the following code:
def testf():
    print(os.listdir())
Now, in a file named test.py, I have this code:
import os
from some_module import testf

testf()
But executing test.py gives me NameError: name 'os' is not defined. I've already imported os in test.py, and testf is in the namespace of test.py. So why does this error occur?
import is not the same as including the content of the file as if you had typed it directly in place of the import statement. You might think it works this way if you're coming from a C background, where the #include preprocessor directive does this, but Python is different.
The import statement in Python reads the content of the file being imported and evaluates it in its own separate context - so, in your example, the code in some_module.py has no access to or knowledge of anything that exists in test.py or any other file. It starts with a "blank slate", so to speak. If some_module.py's code wants to access the os module, you have to import it at the top of some_module.py.
When a module is imported in Python, it becomes an object. That is, when you write
import some_module
one of the first things Python does is to create a new object of type module to represent the module being imported. As the interpreter goes through the code in some_module.py, it assigns any variables, functions, classes, etc. that are defined in that file to be attributes of this new module object. So in your example, the module object will have one attribute, testf. When the code in the function testf wants to access the variable os, it looks in the function itself (local scope) and sees that os is not defined there, so it then looks at the attributes of the module object which testf belongs to (this is the "global" scope, although it's not truly global). In your example, it will not see os there, so you get an error. If you add
import os
to some_module.py, then that will create an attribute of the module under the name os, and your code will find what it needs to.
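So the corrected module would look roughly like this:

## some_module.py
import os  # some_module needs its own import; the one in test.py doesn't carry over

def testf():
    print(os.listdir())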
You may also be interested in some other answers I've written that may help you understand Python's import statement:
Why import when you need to use the full name?
Does Python import statement also import dependencies automatically?
The name testf is in the namespace of test. The contents of the testf function are still in some_module, and don't have access to anything in test.
If you have code that needs a module, you need to import that module in the same file where that code is. Importing a module only imports it into the one file where you import it. (Multiple imports of the same module, in different files, won't incur a meaningful performance penalty; the actual loading of the module only happens once, and later imports of the same module just get a reference to the already-imported module.)
Importing a module adds its name as an attribute of the current scope. Since different modules have independent scopes, any code in some_module cannot use names in __main__ (the executed script) without having imported it first.
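A tiny demonstration of that point (a sketch, reusing the files from the question):

## test.py
import os
import some_module

print('os' in globals())          # True: the name was added to test.py's own namespace
print('os' in vars(some_module))  # False, unless some_module imports os itself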

Dynamic module imports from external function, (or - editing globals() outside of module), in Python

I have a project in which I want to repeatedly change code in a class and then run other modules to test the changes (verification..). Currently, after each edit I have to reload the code and the testing modules which run it, and then run the test. I want to reduce this cycle to one line. Moreover, I will later want to test different classes, so I want to be able to receive the name of the tested class as a parameter, which means I need dynamic imports.
I wrote a function for clean imports of any module, it seems to work:
def build_module_clean(module_string, attr_strings):
    module = import_module(module_string)
    module = reload(module)
    for f in attr_strings:
        globals()[f] = getattr(module, f)
Now, in the name of cleanliness, I want to keep this function in a wrapper module (which will also contain the one-liner I want, so I can rebuild and test all the code each time), and call it from the various modules, i.e. among the import statements of my ModelChecker module I would place the lines
from wrapper import build_module_clean
build_module_clean('test_class_module',['test_class_name'])
However, when I do this, it seems the test class is added to the globals in the wrapper module, but not in the ModelChecker module (attempting to access globals()['test_class_name'] in ModelChecker gives a KeyError). I have tried passing globals or globals() as further parameters to build_module_clean, but globals is a function (so the test module is still loaded into the wrapper's globals), and passing and then using globals() gives the error
TypeError: 'builtin_function_or_method' object does not support item assignment
So I need some way to edit one module's globals() from another module.
Alternatively, (ideally?) I would like to import the test_class module in the wrapper, in a manner that would make it visible to all the modules that use it (e.g. ModelChecker). How can I do that?
Your function should look like:
def build_module_clean(globals, module_string, attr_strings):
    module = import_module(module_string)
    module = reload(module)
    globals[module_string] = module
    for f in attr_strings:
        globals[f] = getattr(module, f)
and call it like so:
build_module_clean(globals(), 'test_class_module', ['test_class_name'])
Explanation:
Calling globals() at the call site (build_module_clean(globals(), ...)) grabs the calling module's __dict__ while still in the correct module and passes that to your function.
The function is then able to (re)assign the names to the newly-loaded module and its current attributes.
Note that I also (re)assigned the newly-loaded module itself to the globals (you may not want that part).

Injecting Locals into Dynamically Loaded Modules Before Execution

I'm trying to build a sort of script system in python that will allow small snippets of code to be selected and executed at runtime inside python.
Essentially I want to be able to load a small python file like
for i in Foo:  # not in a function
    print i
where somewhere else in the program I assign what Foo will be, as if Foo served as a function argument to the entire loaded Python file instead of to a single function.
So somewhere else
FooToPass = GetAFoo()
TempModule = __import__("TheSnippit", <somehow put {'Foo': FooToPass} in the locals>)
It is considered bad style to have code with side effects at module level. If you want your module to do something, put that code in a function, make Foo a parameter of this function and call it with the desired value.
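For example (a sketch; the_snippet and run are hypothetical names):

## the_snippet.py
def run(foo):
    for i in foo:
        print(i)

## elsewhere in the program
from the_snippet import run
run(GetAFoo())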
Python's import mechanism does not allow you to preinitialise a module namespace. If you want to do this anyway (which is, in my opinion, confusing and unnecessary), you have to fiddle around with details of the import mechanism. Example implementation (untested):
import imp
import sys

def my_import(module_name, globals):
    if module_name in sys.modules:
        return sys.modules[module_name]
    module = imp.new_module(module_name)
    vars(module).update(globals)
    f, module.__file__, options = imp.find_module(module_name)
    exec f.read() in vars(module)
    f.close()
    sys.modules[module_name] = module
    return module
