I'm creating a module in a Python app. I have my primary code file, and I want to import some helper methods/classes from a helpers folder. This is my folder structure:
module/
    __init__.py
    helpers/
        __init__.py
        some_class.py
This is the module/helpers/__init__.py file:
from .some_class import SomeClass

def helper_method_1():
    # code

def helper_method_2():
    # code
So my question is: is importing SomeClass inside module/helpers/__init__.py enough to then import it from my main module/__init__.py file?
This is what I'm trying in my module/__init__.py:
from .helpers import (SomeClass, helper_method_1, helper_method_2)
I'm in the middle of doing a bunch of things, so I can't test it for errors at the moment.
Yes, it is enough.
Every name bound in helpers/__init__.py, including names imported from other modules, becomes an attribute of the helpers package, so your explicit import in module/__init__.py will work. (__all__ only matters for star imports: unless a module defines __all__, from .helpers import * exports every public name.)
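For example, a minimal sketch of the two files involved (the pass bodies stand in for your real code):

# module/helpers/__init__.py
from .some_class import SomeClass

def helper_method_1():
    pass

def helper_method_2():
    pass

# module/__init__.py
from .helpers import SomeClass, helper_method_1, helper_method_2

Because SomeClass is bound inside helpers/__init__.py, it is re-exported from the helpers package exactly like the two functions defined there.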
My project is in Java, and I am using some Python scripts for other tasks. My Python scripts' directory structure is as follows:
Parent Dir/
- Scripts/ (driver.py)
- utils/ (common.py, helper.py)
I want to use functions and classes from common.py and helper.py in driver.py. I am importing them in driver.py like this:
import sys
sys.path.append("..")

from utils import *

configs = load_config(filepath)
But when I use some functions from common.py (let's assume common.py imports the class TestClass from helper.py and also has the function load_config(filepath)), it throws this error:
NameError: name 'load_config' is not defined
You cannot import an entire directory like that; you need to import the files individually. You can add an __init__.py file to your utils folder, which will be executed when you import the directory the way you are doing. Then you can choose whatever behavior you would like to be carried out in that __init__.py.
For instance, your __init__.py could look like this:
__all__ = ["common", "helper"]
This means that when you call from utils import *, the * will import the modules listed in __all__.
Keep in mind you will still need to call common.function_name() if you want to use a function from within common.
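For example, a sketch of how driver.py could look once that __init__.py is in place (names are the question's own; the sys.path tweak assumes the script is run from the Scripts directory):

import sys
sys.path.append("..")              # make Parent Dir importable

from utils import *                # imports the modules listed in __all__: common and helper

configs = common.load_config(filepath)   # qualify the function with its module
obj = helper.TestClass()                 # same for classes defined in helper.py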
Currently, I have a package (let's say DummyPackage). DummyPackage contains three modules with functions, classes, etc. So the directory structure looks like this:
project_dir/
    __init__.py
    DummyPackage/
        __init__.py
        Module1/
            __init__.py
            module_x.py
            module_y.py
        Module2/
            __init__.py
            module_z.py
So importing methods from modules looks like this:
from DummyPackage.Module1.module_x import method_x
We are adding new stuff to the project, and I would like to create a module named DummyProject, which should be importable like this:
from DummyProject import new_method
I assumed that just adding a DummyProject.py file would be enough, but apparently it's not. I tried adding it to the project_dir/ directory and to the DummyPackage/ directory, but neither works.
Is it because of a name conflict? Is it possible to have code like this?
import DummyPackage
from DummyPackage.Module1.module_x import method_x
DummyPackage.new_method
method_x
To put my three comments into an answer:
First, let me explain relative imports using the modules you already have. If you wanted to import module_x from module_y, you could do this:
module_y.py
from .module_x import method_x
or similarly in module_z.py
from ..Module1.module_x import method_x
So, depending on the location of your DummyProject in the package, Intra-package References may be all you need.
As for the second part: yes, it is possible to have (runnable) code like this:
import DummyPackage
from DummyPackage.Module1.module_x import method_x
DummyPackage.new_method
method_x
In this case it looks like you want new_method to be a package-level name. To quote this great answer:
In addition to labeling a directory as a Python package and defining __all__, __init__.py allows you to define any variable at the package level.
I highly recommend taking a look at the source code for json/__init__.py in the standard library if you want a real-world example.
Or, as an example with your setup: to be able to import method_x right from the package, you would just need to add this to the top-level __init__.py:
from .Module1.module_x import method_x
then from any file importing the package you could do this:
import DummyPackage
DummyPackage.method_x
(Although obviously you would do the same for new_method, according to where you place DummyProject.)
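For completeness, a sketch of the new_method case, assuming it lives in a hypothetical dummy_project.py module inside DummyPackage (that file name is made up for illustration):

# DummyPackage/__init__.py
from .Module1.module_x import method_x
from .dummy_project import new_method   # hypothetical module holding new_method

# anywhere else in the project
import DummyPackage
DummyPackage.new_method()
DummyPackage.method_x()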
I've run through many posts about this, but it still doesn't seem to work. The deal is pretty clear-cut. I've got the following hierarchy:
main.py
DirA/
    __init__.py
    hello.py
DirB/
    __init__.py
    foo.py
    bla.py
    lol.py
The __init__.py at DirA is empty. The respective one at DirB just lists the foo module:
__all__ = ["foo"]
main.py has the following code:
import DirA
import DirB
hey()       # def written in hello.py
foolish1()  # def written in foo.py
foolish2()  # def written in foo.py
Long story short, I got NameError: name 'foo' is not defined. Any ideas? Thanks in advance.
You only get what you import. Therefore, in your main, you only get DirA and DirB. You would use them in one of these ways:
import DirA
DirA.something_in_init_py()
# Importing hello:
import DirA.hello
DirA.hello.something_in_hello_py()
# Using a named import:
from DirA.hello import something_in_hello_py
something_in_hello_py()
And in DirB, just make the __init__.py empty as well. The only use of __all__ is for when you want to import *, which you don't want because, as they say, explicit is better than implicit.
But in case you are curious, it would work this way:
from DirB import *
something_in_dirb()
By default, import * will import every name it can find that does not start with an underscore. Specifying __all__ restricts what is imported to the names listed in __all__. See this question for more details.
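As a sketch of how __all__ changes that star import, using the files from the question (foolish1 is assumed to be defined in foo.py, as the question says):

# DirB/__init__.py
__all__ = ["foo"]        # only the foo submodule is exported by a star import

# main.py
from DirB import *
foo.foolish1()           # works: foo was listed in __all__, but you still qualify the call
# bla and lol are not available here, because they were not listed in __all__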
Edit: about __init__.py.
The __init__.py is not really connected to the importing machinery. It is just a special file with the following properties:
- Its existence means the directory is a Python package, with several modules in it. If it does not exist, Python will refuse to import anything from the directory.
- It will always be loaded before loading anything else in the directory.
- Its content will be available as the package itself.
Just try it: put this in DirA/__init__.py:
foo = 42
Now, in your main:
from DirA import foo
print(foo) # 42
It can be useful, because you can import some of your submodules in the __init__.py to hide the inner structure of your package. Suppose you build an application with the classes Author, Book and Review. To make it easier to read, you give each class its own file in a package. Now in your main, you have to spell out the full path in your imports:
from myapp.author import Author
from myapp.book import Book
from myapp.review import Review
Clearly not optimal. Now, if you put those exact lines from above in your __init__.py, you can simplify your main like this:
from myapp import Author, Book, Review
Python will load the __init__.py, which will in turn load all submodules and import the classes, making them available on the package. Now your main does not need to know where the classes are actually implemented.
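Concretely, that __init__.py contains nothing more than those imports (a sketch, written with relative imports, assuming the modules are named author.py, book.py and review.py):

# myapp/__init__.py
from .author import Author     # relative form of "from myapp.author import Author"
from .book import Book
from .review import Review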
Have you tried something like this?
One way:
from DirA import hello
Another way:
from DirA.hello import hey
If those don't work, then add the directory to sys.path.
You need to import the function itself:
How to call a function from another file in Python?
In your case:
from DirB.foo import foolish1, foolish2
I'm creating a Python program, and I want to split it up into separate files. I'm using import to do this, but it's not working (specifically, a variable is stored in one Python file, and it's not being read by the main one).
program/
    main.py
    lib/
        __init__.py
        config.py
        functions.py
I have in main.py:
import lib.config
import lib.functions
print(main)
and config.py has
main = "hello"
I should be getting the output "hello" when I execute main.py, but I'm not. I have the same problem with functions stored in functions.py.
Any help would be great.
Importing the module with a simple import statement does not copy the names from that module into your own global namespace.
Either refer to the main name through attribute access:
print(lib.config.main)
or use the from ... import ... syntax:
from lib.config import main
instead.
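Putting the two options side by side, a fixed main.py could look like this (using the lib/config.py from the question):

# Option 1: keep the plain import and qualify the name
import lib.config
print(lib.config.main)    # prints "hello"

# Option 2: bind the name into main.py's own namespace
from lib.config import main
print(main)               # prints "hello"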
You can learn more about how importing works in the Modules section of the Python tutorial.
What would be the best (read: cleanest) way to tell Python to import all modules from some folder?
I want to allow people to put their "mods" (modules) in a folder in my app which my code should check on each startup and import any module put there.
I also don't want an extra scope added to the imported stuff (not "myfolder.mymodule.something", but "something")
If turning the folder itself into a package, by adding an __init__.py file and using from <foldername> import *, suits you, you can iterate over the folder contents with os.listdir or glob.glob and import each file ending in ".py" with the __import__ built-in function:
import os

for name in os.listdir("plugins"):
    if name.endswith(".py") and name != "__init__.py":
        # strip the ".py" extension
        module = name[:-3]
        # set the module name in the current global namespace:
        globals()[module] = __import__("plugins." + module, fromlist=[module])
The benefit of this approach is that it lets you pass the module names to __import__ dynamically, while the import statement needs the module names to be hardcoded; it also lets you check other things about the files (maybe their size, or whether they import certain required modules) before importing them.
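A variant of the same idea using importlib.import_module, which some find cleaner than calling __import__ directly (this sketch assumes plugins/ is a package, i.e. it contains an __init__.py, and that its parent directory is on sys.path):

import os
import importlib

for name in os.listdir("plugins"):
    if name.endswith(".py") and name != "__init__.py":
        module_name = name[:-3]                                    # strip the ".py" extension
        module = importlib.import_module("plugins." + module_name)
        globals()[module_name] = module                            # expose it as a bare name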
Create a file named
__init__.py
inside the folder and import the folder name like this:
>>> from <folder_name> import *  # try to avoid importing everything when you can
>>> from <folder_name> import module1, module2, module3  # and so on
You might want to try this project: https://gitlab.com/aurelien-lourot/importdir
With this module, you only need to write two lines to import all plugins from your directory, and you don't need an extra __init__.py (or any other extra file):
import importdir
importdir.do("plugins/", globals())