Python: Issue with modules in package importing each other

Path structure:
base.py
my_module/
    __init__.py
    mod1.py
    mod2.py

base.py:
from my_module.mod2 import *
mod2_func2()  # This call will work

my_module/mod1.py:
from mod2 import *
def mod1_func():
    mod2_func1("Hello World")  # This call will not work.

my_module/mod2.py:
from mod1 import *
def mod2_func1(input_text):
    print(input_text)

def mod2_func2():
    mod1_func()  # This call will work
This code errors out because mod2_func1 is not defined. If I run mod1.py directly (i.e. by adding code to an if __name__ == "__main__": block), it works fine.
This is a reduced example of my real problem, which involves a bunch of modules inside a package all needing to talk to each other like this. The main thing is ensuring that mod1 and mod2 can access each other's contents in the local namespace irrespective of which is called from base. I've looked at a ton of documentation on Python namespaces, importing, and packages, and while I'm relieved to find that it seems frustrating to everyone, I haven't found a solution. Any advice?
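One common fix, sketched below: import the sibling module instead of star-importing its names. A star import copies names at import time, while the modules are still half-initialized; importing the module and qualifying names defers the lookup to call time, after both modules have finished loading. The snippet rebuilds the package from the question in a temporary directory with that change applied (the functions return instead of print purely so the result is easy to check; this is an illustration, not the only possible fix).

```python
import os
import sys
import tempfile

# Recreate the package from the question in a temp dir, with the fix applied:
# each sibling does `import my_module.modX` and qualifies names at call time,
# instead of `from modX import *`, which copies names at import time and
# therefore fails while the modules are still half-initialized.
root = tempfile.mkdtemp()
pkg = os.path.join(root, "my_module")
os.makedirs(pkg)
sources = {
    "__init__.py": "",
    "mod1.py": """\
import my_module.mod2

def mod1_func():
    return my_module.mod2.mod2_func1("Hello World")
""",
    "mod2.py": """\
import my_module.mod1

def mod2_func1(input_text):
    return input_text

def mod2_func2():
    return my_module.mod1.mod1_func()
""",
}
for name, body in sources.items():
    with open(os.path.join(pkg, name), "w") as f:
        f.write(body)

sys.path.insert(0, root)
from my_module import mod2

print(mod2.mod2_func2())  # Hello World
```

The key detail is that a plain `import my_module.mod1` only binds the top-level name my_module; the attribute my_module.mod1.mod1_func is resolved when the function is called, by which point both modules are fully initialized, so the cycle is harmless.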

Related

Is specific way of importing submodules at all possible?

I am working in the following directory tree:
src/
    __init__.py
    train.py
    modules/
        __init__.py
        encoders/
            __init__.py
            rnn_encoder.py
My pwd is the top-level directory and my __init__.py files are all empty. I am executing train.py, which contains the following code snippet.
import modules
# RNNEncoder is a class in rnn_encoder.py
encoder = modules.encoders.rnn_encoder.RNNEncoder(**params)
When I execute train.py, I get an error saying that
AttributeError: module 'modules' has no attribute 'encoders'
I am wondering if there is any clean way to make this work. Note that I am not looking for alternative methods of importing, I am well-aware that this can be done in other ways. What I'd like to know is whether it is possible to keep the code in train.py as is while maintaining the given directory structure.
Putting an __init__.py file in a folder allows that folder to act as an import target, even when it's empty. The way you currently have things set up, the following should work:
from modules.encoders import rnn_encoder
encoder = rnn_encoder.RNNEncoder(**params)
Here, python essentially treats modules.encoders as a path to the package directory, and then actually imports the code inside rnn_encoder.
However, this won't work:
import modules
encoder = modules.encoders.rnn_encoder.RNNEncoder(**params)
The reason is that, when you do import modules, what python is doing behind the scenes is importing __init__.py from the modules folder, and nothing else. It doesn't run into an error, since __init__.py exists, but since it's empty it doesn't actually do much of anything.
You can put code in __init__.py to fill out your module's namespace and allow people to access that namespace from outside your module. To solve your problem, make the following changes:
modules/encoders/__init__.py
from . import rnn_encoder
modules/__init__.py
from . import encoders
This imports rnn_encoder and binds it in the namespace of encoders, allowing you to import encoders and then access encoders.rnn_encoder. The same applies to modules and modules.encoders, one level up.
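Here is a runnable sketch of that fix: it rebuilds the tree from the question in a temporary directory with the two suggested `from . import ...` lines in place, then checks that the original attribute-chain line from train.py works unchanged. The RNNEncoder body and the hidden_size parameter are stand-ins invented for the demo.

```python
import os
import sys
import tempfile

# Recreate the tree from the answer in a temp dir, with both __init__.py
# files filled in as suggested.
root = tempfile.mkdtemp()
enc = os.path.join(root, "modules", "encoders")
os.makedirs(enc)
with open(os.path.join(root, "modules", "__init__.py"), "w") as f:
    f.write("from . import encoders\n")
with open(os.path.join(enc, "__init__.py"), "w") as f:
    f.write("from . import rnn_encoder\n")
with open(os.path.join(enc, "rnn_encoder.py"), "w") as f:
    # Stand-in class body; the real rnn_encoder.py contents are unknown.
    f.write(
        "class RNNEncoder:\n"
        "    def __init__(self, **params):\n"
        "        self.params = params\n"
    )

sys.path.insert(0, root)
import modules

# The exact line from train.py now works (hidden_size is a made-up param).
encoder = modules.encoders.rnn_encoder.RNNEncoder(hidden_size=8)
print(type(encoder).__name__)  # RNNEncoder
```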

If a Python package is just a module, can that module import from sister modules and expose their functions?

Suppose I have a package structured like:
root
+-- package_as_a_module.py
+-- setup.py
+-- requirements.py
In my case, package_as_a_module.py has grown much larger than initially anticipated, and it's becoming difficult to manage. Is it possible to add a new module to the root directory, say utils.py:
# utils.py
def func_a():
    return "Hi!"
and then expose the functions of utils.py via an import statement into package_as_a_module.py like:
# package_as_a_module.py
from utils import func_a
So that after installation, I can use from package_as_a_module import func_a?
I attempted this without success, here. The modules from which the main module imports are not recognized, and the import fails. I suspect this might be possible with a true package structure, maybe in the __init__.py file or something. I'll give that a whirl next.
If no one proposes an alternative, I'll answer my own question in the negative.
You can.
In your file main_module.py you did not call the print_all() function; that is why nothing happened.
from module_a import print_hello_world as phw_a
from module_a import print_hello as ph
from module_b import print_hello_world as phw_b
def print_all():
    ph()
    phw_a()
    phw_b()

print_all()  # You need to call a function for it to do something
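On the packaging side of the original question: with a flat layout like this, a usual reason `from utils import func_a` works locally but fails after installation is that only package_as_a_module.py gets installed. A sketch of a setup.py that ships both top-level modules; the module names come from the question, everything else (version, install command) is assumed for illustration:

```python
# setup.py -- sketch; assumes a flat layout with package_as_a_module.py
# and utils.py sitting next to this file.
from setuptools import setup

setup(
    name="package_as_a_module",
    version="0.1",  # made-up version for the sketch
    py_modules=["package_as_a_module", "utils"],  # install utils alongside
)
```

After something like `pip install .`, both modules land on the import path, so `from package_as_a_module import func_a` re-exports the function as intended. Caveat: `utils` then also becomes importable at top level under that generic name, which can collide with other installed code; a real package directory with an __init__.py avoids that.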

Import module defined in another module

I have the following setup. The first file (config.py) defines a path, that tells file1.py which module to import.
#file:config.py
moduleToBeImported="/here/is/some/path/file2.py"
import file1
Then in file1.py, I import the module that is defined in config.py
#file:file1.py
import imp  # note: imp is deprecated; importlib is the modern replacement
foo = imp.load_source('module.name', moduleToBeImported)
Is it possible to pass the variable moduleToBeImported from config.py to file1.py?
In the current setup I get the expected error: NameError: name 'moduleToBeImported' is not defined
Short Answer - NO
Long Answer - Maybe you can. And NO, you shouldn't, because importing file1.py from config.py while file1.py imports config.py creates an import cycle. And that is bad.
For example, let's say you imported config.py in file1.py. As soon as you run file1.py and it calls up config.py, the code in config.py runs just like any other python file. At this point, you would end up trying to import file1.py from file1.py.
Python may or may not detect this cycle before it wreaks havoc on your system.
Generally speaking, circular imports are a very bad coding practice.
What you can do instead - Your config.py should contain only minimal runnable code: keep configuration variables, settings, and small general utilities in it. In short, if file1.py contains critical code, it shouldn't be imported into config.py. You can import config.py in file1.py, though.
More reading here: Python circular importing?
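Following that advice, the NameError goes away if file1.py imports config and reads the path as an attribute; and since imp is deprecated (and removed in Python 3.12), importlib is the safer tool. A sketch below: the file2.py contents are a stand-in written to a temp path, because the question's real /here/is/some/path/file2.py is unknown.

```python
import importlib.util
import os
import tempfile

# Stand-in for /here/is/some/path/file2.py (its real contents are unknown).
path = os.path.join(tempfile.mkdtemp(), "file2.py")
with open(path, "w") as f:
    f.write("def greet():\n    return 'hello from file2'\n")

# In the real layout, file1.py would do `import config` and use
# config.moduleToBeImported here, instead of expecting the bare name
# to exist in file1's own namespace.
spec = importlib.util.spec_from_file_location("module.name", path)
foo = importlib.util.module_from_spec(spec)
spec.loader.exec_module(foo)

print(foo.greet())  # hello from file2
```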

Import local packages in python

I've run through many posts about this, but it still doesn't seem to work. The situation is pretty clear-cut. I've got the following hierarchy.
main.py
DirA/
    __init__.py
    hello.py
DirB/
    __init__.py
    foo.py
    bla.py
    lol.py
The __init__.py at DirA is empty. The one at DirB just lists the foo module:
__all__ = ["foo"]
The main.py has the following code
import DirA
import DirB
hey() #Def written at hello.py
foolish1() #Def written at foo.py
foolish2() #Def written at foo.py
Long story short, I got NameError: name 'foo' is not defined. Any ideas? Thanks in advance.
You only get what you import. Therefore, in your main, you only get DirA and DirB. You would use them in one of these ways:
import DirA
DirA.something_in_init_py()
# Importing hello:
import DirA.hello
DirA.hello.something_in_hello_py()
# Using a named import:
from DirA.hello import something_in_hello_py
something_in_hello_py()
And in DirB, just make the __init__.py empty as well. The only use of __all__ is for when you want to import *, which you don't want because, as they say, explicit is better than implicit.
But in case you are curious, it would work this way:
from DirB import *
something_in_dirb()
By default the import * will import everything it can find that does not start with an underscore. Specifying __all__ restricts what is imported to the names defined in __all__. See this question for more details.
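A small runnable sketch of that behavior, using a module built in memory so no files are needed (the module name demo_mod and both functions are made up for the demo):

```python
import sys
import types

# Fabricate a module with two public-looking functions, but an __all__
# that exposes only one of them to `import *`.
mod = types.ModuleType("demo_mod")
mod.__all__ = ["foo"]
exec("def foo():\n    return 1\n\ndef bar():\n    return 2\n", mod.__dict__)
sys.modules["demo_mod"] = mod

# Star-import into a fresh namespace: only the names in __all__ arrive.
ns = {}
exec("from demo_mod import *", ns)
print(sorted(k for k in ns if not k.startswith("__")))  # ['foo']
```

Deleting the `mod.__all__ = ["foo"]` line makes both foo and bar appear, since neither starts with an underscore.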
Edit: about __init__.py.
The __init__.py is not really connected to the importing stuff. It is just a special file with the following properties:
Its existence means the directory is a python package, with several modules in it. If it does not exist, python will refuse to import anything from the directory.
It will always be loaded before loading anything else in the directory.
Its content will be available as the package itself.
Just try it: put this in DirA/__init__.py:
foo = 42
Now, in your main:
from DirA import foo
print(foo) # 42
It can be useful, because you can import some of your submodules in the __init__.py to hide the inner structure of your package. Suppose you build an application with classes Author, Book and Review. To make it easier to read, you give each class its own file in a package. Now in your main, you have to import the full path:
from myapp.author import Author
from myapp.book import Book
from myapp.review import Review
Clearly not optimal. Now suppose you put those exact lines above in your __init__.py; you may simplify your main like this:
from myapp import Author, Book, Review
Python will load the __init__.py, which will in turn load all submodules and import the classes, making them available on the package. Now your main does not need to know where the classes are actually implemented.
Have you tried something like this:
One way
from DirA import hello
Another way
from DirA.hello import hey
If those don't work, then append the directory to sys.path
You need to import the function itself:
How to call a function from another file in Python?
In your case:
from DirA import foolish1, foolish2

How to organize code with __init__.py?

I am just starting off using google app engine, and have been looking around for good practices and code organization. Most of my problems stem from confusion about __init__.py.
My current test structure looks like
/website
    main.py
    /pages
        __init__.py #1
        blog.py
        hello2.py
        hello.py
        /sub
            __init__.py #2
            base.py
I am trying to use main.py as a file that simply points to everything in /pages and /pages/sub. Most modules in /pages share almost all the same imports (e.g. import urllib); is there a way to have everything in /pages import what I want, rather than adding it in every individual module?
Currently in __init__.py #1 I have
from sub.base import *
Yet my module blog.py says BaseHandler (a function in base.py) is not defined.
My end goal is to have something like ...
main.py
from pages import *
#be able to call any function in /pages without having to do blog.func1() or hello.func2()
#rather just func1() and func2()
And I'd like modules in /pages to share common imports via __init__.py, so that they all get, for example, urllib and all functions from base.py. Thank you for taking the time to read this post, I look forward to your insight.
Sounds like you think __init__.py is an initializer for the other modules in the package. It is not. It turns pages into a package (allowing its files and subdirectories to be modules), and it is executed, like a normal module would be, when your program calls import pages. Imagine that it's named pages.py instead.
So if you really want to dump everything into the same namespace, init #2 can contain from base import * (which will import everything in base to the namespace of sub), and blog.py can contain from sub import *. Got it?
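A runnable sketch of that flattening, built in a temp directory. One caveat worth flagging: the bare `from base import *` / `from sub import *` forms in the answer are implicit relative imports, which only work on Python 2 (app engine's runtime at the time); on Python 3 you need the explicit relative spelling used below. BaseHandler's body is a stand-in, since base.py's real contents are unknown.

```python
import os
import sys
import tempfile

# Recreate the /pages tree from the question in a temp dir.
root = tempfile.mkdtemp()
sub = os.path.join(root, "pages", "sub")
os.makedirs(sub)
open(os.path.join(sub, "__init__.py"), "w").close()
with open(os.path.join(sub, "base.py"), "w") as f:
    f.write("class BaseHandler:\n    pass\n")  # stand-in body
with open(os.path.join(root, "pages", "__init__.py"), "w") as f:
    # Python 3 spelling; plain `from sub.base import *` is Python 2 only.
    f.write("from .sub.base import *\n")

sys.path.insert(0, root)
from pages import *  # pulls BaseHandler straight into this namespace

print(BaseHandler.__name__)  # BaseHandler
```

So main.py can call names from base.py with no prefix at all, which is the "from pages import *" end goal described in the question.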