I wish to split my code into multiple files in Python 3.
I have the following files:
/hello
    __init__.py
    first.py
    second.py
Where the contents of the above files are:
first.py
from hello.second import say_hello
say_hello()
second.py
def say_hello():
    print("Hello World!")
But when I run:
python3 first.py
while in the hello directory I get the following error:
Traceback (most recent call last):
File "first.py", line 1, in <module>
from hello.second import say_hello
ImportError: No module named 'hello'
Swap out
from hello.second import say_hello
for
from second import say_hello
When you run first.py directly, Python puts the script's directory (here, hello) on the module search path, so importing second directly will work. You don't even need the __init__.py file for this. You do, however, need the __init__.py file if you wish to import from outside of the package:
$ python3
>>> from hello.second import say_hello
>>> # Works ok!
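For reference, a minimal sketch of first.py after that swap, if you keep running it from inside the hello directory:
# first.py -- run with: python3 first.py (from inside hello/)
from second import say_hello

say_hello()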
You shouldn't run python3 in the hello directory.
You should start the interpreter outside the hello directory and run
python3
>>> import hello.first
By the way, __init__.py is no longer strictly required in Python 3, thanks to namespace packages. See PEP 420.
Packages are not meant to be imported from the current directory.
It is possible to make it work using if/else tests or try/except handlers, but it's more work than it is worth.
Just cd .. so you aren't in the package's directory and it will work fine.
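As a sketch of that approach, keep the package-qualified import in first.py and run it as a module from the parent of hello (the -m form here is my suggestion, not part of the original answer):
# hello/first.py
from hello.second import say_hello

if __name__ == "__main__":
    say_hello()

$ cd ..                    # the directory that contains hello/
$ python3 -m hello.first
Hello World!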
Related
I created a python project, using Pycharm if it matters, that looks like this:
outer_dir/
    outside_main.py
    module_example/
        module_main.py
        tests/
            test_1.py
            …
        level_1/
            foo_1.py
            level_2/
                foo_2.py
module_main.py calls functions from foo_1.py
foo_1.py calls functions from foo_2.py
The general use case I'm trying to emulate is a Python "package" or repo called module_example that has been downloaded and is used by outside_main.py, which sits at the same level as the top-level package directory.
Here's how module_main.py looks:
# module_main.py
from level_1.foo_1 import foo_function_1
from level_1.level_2.foo_2 import foo_function_2

def main():
    print(foo_function_1())
    print(foo_function_2())

if __name__ == "__main__":
    main()
And I can call this top level script from test_1.py just fine like this:
# test_1.py
from module_main import main

if __name__ == "__main__":
    main()
However, when I try to use main() from outside the module_example directory, with code like this:
# outside_main.py
from module_example.module_main import main

if __name__ == "__main__":
    main()
I get this error:
Traceback (most recent call last):
File "/path/to/outside_main.py", line 1, in <module>
from module_example.module_main import main
File "/path/to/module_example/module_main.py", line 1, in <module>
from level_1.foo_1 import foo_function_1
ModuleNotFoundError: No module named 'level_1'
What seems to be happening is that, regardless of the directory of the Python script being imported, its own imports are searched for relative to the directory from which the main script is executed.
I'm looking for a method of importing that works equally whether I'm calling outside_main.py or test_1.py, and regardless of what directory the terminal is currently pointing at.
I've read this post about how Python imports work, and it basically suggests I'm out of luck, but that makes no sense to me. All I'm trying to do is emulate using someone's open source library from GitHub: they usually have a tests folder within their repo, and that works fine, but I can also just unzip the entire repo into my project and import their .py files without issue. How do they do it?
Before entering the hell of sys.path patching, I would suggest wrapping the code that currently lives in module_example into a package_example directory, adding a setup.py, and throwing in a number of __init__.py files.
Make package_example a git repo.
In every other project where you want to use module_example.[...], you create a venv and run pip install -e /path/to/package_example.
outer_dir/
    outside_main.py
    package_example/
        .git/
            ...
        setup.py
        module_example/
            __init__.py
            module_main.py
            tests/
                ...
where the __init__.py files (one in each subdirectory that contains modules to be imported) can just be empty, and setup.py has to be filled in according to the distutils documentation.
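For illustration, a minimal setup.py sketch (using setuptools here; the name and version are placeholders, adjust to taste):
# setup.py -- minimal sketch; name and version are placeholders
from setuptools import setup, find_packages

setup(
    name="module_example",
    version="0.1.0",
    packages=find_packages(),
)
With that in place, pip install -e /path/to/package_example in each consuming venv makes import module_example.module_main work no matter which directory you run from.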
Solution 1:
Use explicit relative imports (relative to the files inside the module_example directory). Note that implicit relative imports have been removed in Python 3:
Implicit relative imports should never be used and have been removed in Python 3.
outer_dir/module_example/module_main.py
# Note the "." in the beginning signifying it is located in the current folder (use ".." to go to parent folder)
from .level_1.foo_1 import foo_function_1
from .level_1.level_2.foo_2 import foo_function_2
# All the rest the same
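One caveat worth adding (my note, not from the original answer): once module_main.py uses relative imports, it can no longer be run directly as python3 module_example/module_main.py; it has to be reached through the package, for example:
$ cd outer_dir
$ python3 outside_main.py                  # works, as shown in the output below
$ python3 -m module_example.module_main    # should also work, since outer_dir is on the search path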
Solution 2:
Include the path to module_example in your PYTHONPATH environment variable (or you can also choose to update sys.path directly)
export PYTHONPATH=$(pwd):$(pwd)/module_example
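If you would rather not touch the environment, a rough sys.path equivalent (a sketch placed at the top of outside_main.py, deriving the path from the layout in the question) would be:
# outside_main.py -- sys.path variant of Solution 2 (a sketch)
import os
import sys

# make level_1 importable by adding module_example/ itself to the search path
sys.path.insert(0, os.path.join(os.path.dirname(os.path.abspath(__file__)), "module_example"))

from module_example.module_main import main

if __name__ == "__main__":
    main()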
Output:
Both solutions above were successful for me.
Before the fix:
$ python3 outside_main.py
Traceback (most recent call last):
File "outside_main.py", line 1, in <module>
from module_example.module_main import main
File "/home/nponcian/Documents/PearlPay/Program/2021_07Jul_31_StackOverflow/1/module_example/module_main.py", line 1, in <module>
from level_1.foo_1 import foo_function_1
ModuleNotFoundError: No module named 'level_1'
After the fix:
$ python3 outside_main.py
This is foo_function_1
This is foo_function_2
I have an import.py script. I want to extract some code into a separate file, say, m1.py:
$ ls
import.py m1.py
$ cat import.py
from .m1 import a
a()
$ cat m1.py
def a():
    print('it works')
$ python import.py
Traceback (most recent call last):
File "import.py", line 1, in <module>
from .m1 import a
ModuleNotFoundError: No module named '__main__.m1'; '__main__' is not a package
When I switch to an absolute import, it works. But I don't want to accidentally import some other module; I want to be sure the module from the script's directory is imported. How do I make it work? Or what am I doing wrong?
Provided you're not overriding one of the built-in modules: by default, Python looks first in the current directory for the name you want to import, so if there is another script with the same name in another directory, the one in the current directory is the one that gets imported.
So you can import it using an absolute import:
from m1 import a
a()
You can check this post out for more information about importing in Python.
To make sure that the one you're importing isn't the built-in one, you can create your own package in the current directory, for example my_package, and move your module m1 into it. Then you can import it with:
from my_package import m1
m1.a()
Add an __init__.py in the directory where m1.py is.
EDIT: Then run it as a package from the parent directory: cd .. && python -m prev_dir.import
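To make the relative import from the question work as written, the directory has to be treated as a package. A sketch of that arrangement (prev_dir here is just a stand-in for whatever the script's directory is actually called):
prev_dir/
    __init__.py    # can be empty
    import.py      # contains: from .m1 import a
    m1.py

$ cd ..                       # one level above prev_dir
$ python -m prev_dir.import
it works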
I'm having some trouble importing my own packages in my programs, so I made a test folder to try and understand what I'm doing wrong.
It's the simplest of things, but I still can't get it to work.
This is my folder structure:
test
    > pack1
        > __init__.py
        > mod1.py
    > pack2
        > __init__.py
        > mod2.py
Both init-files are empty.
mod1 looks like this:
def foo():
print "hello"
and mod2 looks like this
from pack1.mod1 import *
foo()
When running the code in PyCharm, everything works fine! But when trying to execute from cmd I get ImportError: No module named pack1.mod1
Is sys.path.insert(0, "../pack1") my only option, or is there another reason why cmd will not cooperate?
Regardless of version, Python has to know where to look for packages. Manipulating sys.path is a quick and dirty option that is likely to break later as your code grows more complex. Try making a proper package and installing it via pip install -e or python setup.py develop
(look for this in the nice distutils introduction).
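A rough sketch of what that looks like for this layout (the setup.py contents follow the usual minimal setuptools/distutils pattern and are not shown here):
test/
    setup.py
    pack1/
        __init__.py
        mod1.py
    pack2/
        __init__.py
        mod2.py

$ cd test
$ pip install -e .        # or: python setup.py develop
$ python pack2/mod2.py    # pack1 is now importable from anywhere in this environment
hello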
In regular Python, only certain folders are checked when importing packages, and your test folder doesn't seem to be one of them. To change this, edit sys.path in mod2.py before importing pack1.mod1.
mod2.py
import sys
# Add test folder to sys.path
sys.path.append("../")
from pack1.mod1 import *
# Prints "hello"!
foo()
Also, instead of editing sys.path, you could add the pack1 folder into the Lib folder within your Python directory. This will work because this is, by default, one of the folders in sys.path.
Python 2.7
> Lib
    > pack1
        > __init__.py
        > mod1.py
mod2.py
from pack1.mod1 import *
# Prints "hello"!
foo()
You say you execute it via: (Documents)/test/pack2> python mod2.py
The problem is that pack2/mod2.py doesn't know where pack1 is.
Execute it as module:
(Documents)/test> python -m pack2.mod2
If you do not want to modify the scripts or the directory layout, you can use the PYTHONPATH environment variable.
Example
vagrant@h:/tmp/test/pack2$ python mod2.py
Traceback (most recent call last):
File "mod2.py", line 1, in <module>
from pack1.mod1 import *
ImportError: No module named pack1.mod1
vagrant@h:/tmp/test/pack2$ export PYTHONPATH="${PYTHONPATH}:/tmp/test"
vagrant@h:/tmp/test/pack2$ python mod2.py
hello
vagrant@h:/tmp/test/pack2$
More about searching modules - https://docs.python.org/2/tutorial/modules.html#the-module-search-path
I have a package of the following form:
$ ls folder
entry_point.py hello.py __init__.py utils.py
This is a package, and I can treat it as such:
$ python2.7
>>> import folder.utils
>>>
I want to use relative imports between these Python modules.
$ cat folder/entry_point.py
from hello import say_hello
if __name__ == "__main__":
    say_hello()
$ cat folder/hello.py
from .utils import say
def say_hello():
    say()
$ cat folder/utils.py
def say():
    print "hello world"
I know I can't use relative imports at the entry point, where I call the interpreter. However, I still get an error from my imported files:
$ python2.7 folder/entry_point.py
Traceback (most recent call last):
File "folder/entry_point.py", line 1, in <module>
from hello import say_hello
File "/tmp/folder/hello.py", line 1, in <module>
from .utils import say
ValueError: Attempted relative import in non-package
This is rather counterintuitive: it is a package, it's just not treated as one because entry_point.py has its __name__ set to __main__ (in line with PEP 328).
I'm surprised that hello.py has a __name__ of hello rather than folder.hello. This stops me from using relative imports in hello.py.
How do I use relative imports in this package? Am I forced to move hello.py and utils.py to a libs subpackage?
If you want folder to be a package inside a bigger project and you want to be able to run entry_point.py to use your folder package, move entry_point.py one level up:
from folder.hello import say_hello

if __name__ == "__main__":
    say_hello()
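The layout then looks like this (a sketch, assuming hello.py and utils.py keep the contents shown above), and you run entry_point.py from the directory that contains folder:
.
├── entry_point.py
└── folder
    ├── __init__.py
    ├── hello.py
    └── utils.py

$ python2.7 entry_point.py
hello world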
Import paths - the right way?
What is the __main__.py file for, what sort of code should I put into it, and when should I have one?
Often, a Python program is run by naming a .py file on the command line:
$ python my_program.py
You can also create a directory or zipfile full of code, and include a __main__.py. Then you can simply name the directory or zipfile on the command line, and it executes the __main__.py automatically:
$ python my_program_dir
$ python my_program.zip
# Or, if the program is accessible as a module
$ python -m my_program
You'll have to decide for yourself whether your application could benefit from being executed like this.
Note that a __main__ module usually doesn't come from a __main__.py file. It can, but it usually doesn't. When you run a script like python my_program.py, the script will run as the __main__ module instead of the my_program module. This also happens for modules run as python -m my_module, or in several other ways.
If you saw the name __main__ in an error message, that doesn't necessarily mean you should be looking for a __main__.py file.
What is the __main__.py file for?
When creating a Python module, it is common to make the module execute some functionality (usually contained in a main function) when run as the entry point of the program. This is typically done with the following common idiom placed at the bottom of most Python files:
if __name__ == '__main__':
    # execute only if run as the entry point into the program
    main()
You can get the same semantics for a Python package with __main__.py, which might have the following structure:
.
└── demo
    ├── __init__.py
    └── __main__.py
To see this, paste the below into a Python 3 shell:
from pathlib import Path
demo = Path.cwd() / 'demo'
demo.mkdir()
(demo / '__init__.py').write_text("""
print('demo/__init__.py executed')
def main():
    print('main() executed')
""")
(demo / '__main__.py').write_text("""
print('demo/__main__.py executed')
from demo import main
main()
""")
We can treat demo as a package and actually import it, which executes the top-level code in the __init__.py (but not the main function):
>>> import demo
demo/__init__.py executed
When we use the package as the entry point to the program, we run the code in __main__.py, which imports the __init__.py first:
$ python -m demo
demo/__init__.py executed
demo/__main__.py executed
main() executed
You can derive this from the documentation. The documentation says:
__main__ — Top-level script environment
'__main__' is the name of the scope in which top-level code executes.
A module’s __name__ is set equal to '__main__' when read from standard
input, a script, or from an interactive prompt.
A module can discover whether or not it is running in the main scope
by checking its own __name__, which allows a common idiom for
conditionally executing code in a module when it is run as a script or
with python -m but not when it is imported:
if __name__ == '__main__':
    # execute only if run as a script
    main()
For a package, the same effect can be achieved by including a
__main__.py module, the contents of which will be executed when the module is run with -m.
Zipped
You can also zip up this directory, including the __main__.py, into a single file and run it from the command line like this - but note that zipped packages can't execute sub-packages or submodules as the entry point:
from pathlib import Path
demo = Path.cwd() / 'demo2'
demo.mkdir()
(demo / '__init__.py').write_text("""
print('demo2/__init__.py executed')
def main():
    print('main() executed')
""")
(demo / '__main__.py').write_text("""
print('demo2/__main__.py executed')
from __init__ import main
main()
""")
Note the subtle change - we are importing main from __init__ instead of demo2 - this zipped directory is not being treated as a package, but as a directory of scripts. So it must be used without the -m flag.
Particularly relevant to the question - zipapp causes the zipped directory to execute the __main__.py by default - and it is executed first, before __init__.py:
$ python -m zipapp demo2 -o demo2zip
$ python demo2zip
demo2/__main__.py executed
demo2/__init__.py executed
main() executed
Note again, this zipped directory is not a package - you cannot import it either.
Some of the answers here imply that given a "package" directory (with or without an explicit __init__.py file), containing a __main__.py file, there is no difference between running that directory with the -m switch or without.
The big difference is that without the -m switch, the "package" directory is first added to the path (i.e. sys.path), and then the files are run normally, without package semantics.
Whereas with the -m switch, package semantics (including relative imports) are honoured, and the package directory itself is never added to the system path.
This is a very important distinction, both in terms of whether relative imports will work, and, more importantly, in terms of what gets imported when a system module is unintentionally shadowed.
Example:
Consider a directory called PkgTest with the following structure
:~/PkgTest$ tree
.
├── pkgname
│   ├── __main__.py
│   ├── secondtest.py
│   └── testmodule.py
└── testmodule.py
where the __main__.py file has the following contents:
:~/PkgTest$ cat pkgname/__main__.py
import os
print( "Hello from pkgname.__main__.py. I am the file", os.path.abspath( __file__ ) )
print( "I am being accessed from", os.path.abspath( os.curdir ) )
from testmodule import main as firstmain; firstmain()
from .secondtest import main as secondmain; secondmain()
(with the other files defined similarly with similar printouts).
If you run this without the -m switch, this is what you'll get. Note that the relative import fails, but more importantly note that the wrong testmodule has been chosen (the one inside the package directory, because that directory was put on the path):
:~/PkgTest$ python3 pkgname
Hello from pkgname.__main__.py. I am the file ~/PkgTest/pkgname/__main__.py
I am being accessed from ~/PkgTest
Hello from testmodule.py. I am the file ~/PkgTest/pkgname/testmodule.py
I am being accessed from ~/PkgTest
Traceback (most recent call last):
File "/usr/lib/python3.6/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/usr/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "pkgname/__main__.py", line 10, in <module>
from .secondtest import main as secondmain
ImportError: attempted relative import with no known parent package
Whereas with the -m switch, you get what you (hopefully) expected:
:~/PkgTest$ python3 -m pkgname
Hello from pkgname.__main__.py. I am the file ~/PkgTest/pkgname/__main__.py
I am being accessed from ~/PkgTest
Hello from testmodule.py. I am the file ~/PkgTest/testmodule.py
I am being accessed from ~/PkgTest
Hello from secondtest.py. I am the file ~/PkgTest/pkgname/secondtest.py
I am being accessed from ~/PkgTest
Note: In my honest opinion, running without -m should be avoided. In fact I would go further and say that I would create any executable packages in such a way that they would fail unless run via the -m switch.
In other words, I would only import from 'in-package' modules explicitly via 'relative imports', assuming that all other imports represent system modules. If someone attempts to run your package without the -m switch, the relative import statements will throw an error, instead of silently running the wrong module.
You create __main__.py in yourpackage to make it executable as:
$ python -m yourpackage
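A minimal sketch of that (yourpackage is just a placeholder name):
yourpackage/
    __init__.py
    __main__.py    # e.g. just: print("running yourpackage")

$ python -m yourpackage
running yourpackage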
__main__.py is used for Python programs in zip files. The __main__.py file will be executed when the zip file is run. For example, if the zip file was as such:
test.zip
    __main__.py
and the contents of __main__.py was
import sys
print "hello %s" % sys.argv[1]
Then if we were to run python test.zip world we would get hello world out.
So the __main__.py file runs when Python is called on a zip file.
If your script is a directory or ZIP file rather than a single python file, __main__.py will be executed when the "script" is passed as an argument to the python interpreter.