Python - import from sibling directory while running as script

I have the following folder structure:
PROJECT_DIR
├── helpers
│   └── utils.py
└── stuff
    └── script.py
I need to run script.py as a script, and from it, I need to use a function from helpers/utils.py.
I tried a relative import, from ..helpers.utils import func, but it says
ImportError: attempted relative import with no known parent package
so I added an empty __init__.py file to each folder, including PROJECT_DIR.
Then I read that when running a file as a script, the Python interpreter treats it as the top-level main module, so it doesn't see any package above it and relative imports cannot be used.
But what should I do if I need to use that function? It's a fairly simple use case, and I can't get my head around why it's so hard to import a function from a file outside the current directory. Though I'm not really interested in the whys, I'd just like to know how people usually solve this.

root_project
└── proj
    ├── __init__.py
    ├── helpers
    │   ├── __init__.py
    │   └── utils.py
    └── stuff
        ├── __init__.py
        └── script.py
With this structure just cd to root_project and use this command:
python -m proj.stuff.script
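Inside script.py itself, use an absolute import from the package root. A minimal sketch, assuming func is the function you need from helpers/utils.py:
# proj/stuff/script.py -- run with `python -m proj.stuff.script` from root_project
from proj.helpers.utils import func

if __name__ == "__main__":
    func()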

Related

How can I use relative imports in Python to import a function in another directory

I have a directory structure with 2 basic Python files inside separate directories:
├── package
│   ├── subpackage1
│   │   └── module1.py
│   └── subpackage2
│       └── module2.py
module1.py:
def module1():
    print('hello world')
module2.py:
from ..subpackage1.module1 import module1
module1()
When running python3 module2.py I get the error: ImportError: attempted relative import with no known parent package
However, when I run it with the imports changed to use sys.path.append(), it runs successfully:
import sys
sys.path.append('../subpackage1/')
from module1 import module1
module1()
Can anyone help me understand why this is and how to correct my code so that I can do this with relative imports?
To be considered a package, a Python directory has to include an __init__.py file. Since your module2.py file is not below a directory that contains an __init__.py file, it isn't considered to be part of a package. Relative imports only work inside packages.
UPDATE:
I only gave part of the answer you needed. Sorry about that. This business of running a file inside a package as a script is a bit of a can of worms. It's discussed pretty well in this SO question:
Relative imports in Python 3
The main take-away is that you're better off (and you're doing what Guido wants you to) if you don't do this at all, but rather move directly executable code outside of any module. You can usually do this by adding an extra file next to your package root dir that just imports the module you want to run.
Here's how to do that with your setup:
.
├── package
│   ├── __init__.py
│   ├── subpackage1
│   │   └── module1.py
│   └── subpackage2
│       └── module2.py
└── test.py
test.py:
import package.subpackage2.module2
You then run test.py directly. Because the directory containing the executed script is included in sys.path, this will work regardless of what the working directory is when you run the script.
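For completeness, module2.py keeps the relative import from the question; since it is now loaded as part of the package (via test.py or -m) rather than as a script, the import resolves. A sketch, assuming package/ has its __init__.py as shown (older Python versions also want them in the subpackages):
# package/subpackage2/module2.py
from ..subpackage1.module1 import module1

module1()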
You can also do basically this same thing without changing any code (you don't need test.py) by running the "script" as a module.
python3 -m package.subpackage2.module2
If you have to make what you're trying to do work, I think I'd take this approach:
import os, sys
# Compute the parent package directory (package/) relative to this file, so it
# works from any working directory, and add it to the import search path.
sys.path.append(os.path.join(os.path.dirname(__file__), '..'))
from subpackage1.module1 import module1
module1()
So you compute in a relative way where the root of the enclosing package is in the filesystem, you add that to the Python path, and then you use an absolute import rather than a relative import.
There are other solutions that involve extra tools and/or installation steps. I can't think why you could possibly prefer those solutions to the last solution I show.
By default, Python just considers a directory with code in it to be a directory with code in it, not a package or subpackage. In order to turn it into a package, you'll need to add an __init__.py file to each subdirectory, as well as an __init__.py file inside the main package directory.
Adding the __init__.py files alone won't be enough, but you should still do it. You should also create a setup.py file next to your package directory. Your file tree would look like this:
├── setup.py
└── package
    ├── __init__.py
    ├── subpackage1
    │   ├── __init__.py
    │   └── module1.py
    └── subpackage2
        ├── __init__.py
        └── module2.py
This setup.py file could start off like this:
from setuptools import setup

setup(
    name='package',
    packages=['package'],
)
These configurations are enough to get you started. Then, from the root of your directory (the parent folder of package and setup.py), run pip install -e . in your terminal to install your package, named package, in development mode. After that you'll be able to navigate to package/subpackage2/, execute python module2.py, and get the expected result. You could even execute python package/subpackage2/module2.py and it works.
The thing is, modules and packages don't work the same way they do in other programming languages. Even without creating setup.py, a program in your root directory, named main.py for example, could import modules from inside the package folder tree. But that doesn't help if you want to execute package/subpackage2/module2.py directly.
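Once the editable install is in place, module2.py can use an absolute import from the installed package. A minimal sketch under that assumption:
# package/subpackage2/module2.py -- works after `pip install -e .`
from package.subpackage1.module1 import module1

module1()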
If you want relative imports without changing your directory structure and without adding a lot of boilerplate you could use my import library: ultraimport
It gives the programmer more control over their imports and lets you do file system based relative or absolute imports.
Your module2.py could then look like this:
import ultraimport
module1 = ultraimport('__dir__/../subpackage1/module1.py')
This will always work, no matter how you run your code, whether or not you have any __init__.py files, and independently of sys.path.

Failed to import python module from different directory

I have this code structure in python3:
- datalake
    __init__.py
    utils
        __init__.py
        utils.py
    lambdas
        __init__.py
        my-lambdas.py
- tests
    __init__.py
    demo.py
All __init__.py files are empty.
My problem is how I can import datalake module from tests/demo.py?
I tried from datalake.utils import utils in demo.py but when I run python tests/demo.py from command line, I get this error ModuleNotFoundError: No module named 'datalake'.
If I use this code:
from ..datalake.utils import utils
I will get error ValueError: attempted relative import beyond top-level package.
I also tried to import the module utils from the my-lambdas.py file, which also failed. The code in my-lambdas.py is from datalake.utils import utils, but I get the ModuleNotFoundError: No module named 'datalake' error when I run python datalake/lambdas/my-lambdas.py from the command line.
How can I import the module?
When you run a command like python tests/demo.py, the folder you are in does not get added to the PYTHONPATH, the script folder does. So a top-level import like import datalake will fail. To get around this you can run your tests as a module:
python -m tests.demo
(the -m switch takes a dotted module path rather than a file path; this form works in both Python 2 and Python 3)
and any datalake imports in demo.py will work.
It sounds like what you really want to do is have a folder with tests separate from your main application and run them. For this I recommend pytest; for your case, read "Tests Outside Application Code" in its documentation for how to do it. TL;DR: run your tests from your top-level project folder with python -m pytest and it will work.
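As a rough sketch of that layout (renaming demo.py to test_demo.py so pytest's default discovery picks it up; the test body is only a placeholder):
# tests/test_demo.py
from datalake.utils import utils

def test_datalake_utils_imports():
    assert utils is not None

Run python -m pytest from the project root; the -m form puts the current directory on sys.path, so the datalake import resolves.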
First of all, my-lambdas.py is not importable with the import statement as hyphens are not valid in Python identifiers. Try to follow PEP-8's naming conventions, such as mylambdas.py.
Otherwise the package structure looks good, and it should be importable as long as you are at the level above datalake/, e.g., if you were in the directory myproject/ below:
myproject
├── datalake
│   ├── __init__.py
│   ├── utils
│   │   ├── __init__.py
│   │   └── utils.py
│   └── lambdas
│       ├── __init__.py
│       └── mylambdas.py
└── tests
    ├── __init__.py
    └── demo.py
Then this should work:
~/myproject$ python -c 'from datalake import utils'
Otherwise, setting the environment variable PYTHONPATH to the path above datalake/ or modifying sys.path are both ways of changing where Python can import from. See the official tutorial on modules for more information.
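For example, a one-off PYTHONPATH assignment on the command line (assuming the project lives at ~/myproject as above) is enough to run the test file directly:
~/myproject$ PYTHONPATH=~/myproject python tests/demo.py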
Also some general advice: I've found it useful to stick with simple modules rather than packages (directories) until there is a need to expand. Then you can change foo.py into a foo/ directory with an __init__.py file and import foo will work as before, although you may need to add some imports to the __init__.py to maintain API compatibility. This would leave you with a simpler structure:
myproject
├── datalake
│   ├── __init__.py
│   ├── utils.py
│   └── lambdas.py
└── tests
    ├── __init__.py
    └── demo.py
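If utils.py later grows into a utils/ directory (with the old code moving to datalake/utils/utils.py, as in the fuller tree earlier in this answer), a re-export in its __init__.py keeps the old import surface working. A hypothetical sketch:
# datalake/utils/__init__.py -- hypothetical: keeps `from datalake import utils`
# call sites working after utils.py becomes a utils/ package
from .utils import *  # or re-export the specific names you need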
You can add the module directory into your sys.path:
import sys
sys.path.append("your/own/modules/folder") # like sys.path.append("../tests")
but this is a one-shot method: it is only valid for the current run. The added path is not permanent and disappears once the program finishes executing.
One way is to import the file directly instead of using from, like import utils.
You can try running:
python -m datalake.lambdas.mylambdas
(after renaming my-lambdas.py to mylambdas.py, since a module name containing a hyphen can't be used with -m)
follow: https://docs.python.org/3.7/using/cmdline.html#cmdoption-m

Python Import modules 1 level above. Without using Sys.path

Update: I have changed my file directory
I have a directory structure as follows and I would like to import a module in a parent directory.
project/
    __init__.py
    main.py
    APP_NAME/
        parser/
            __init__.py
            parser.py
        test/
            __init__.py
            parser_test.py
parser.py
class Parser(object):
    pass
main.py (Works fine)
from APP_NAME.parser.parser import Parser
parser_test.py (Throws error)
from ..APP_NAME.parser.parser import Parser
Throws the following error at parser_test.py
Parent module '' not loaded, cannot perform relative import
I know I can fix it using sys.path.append(), but I want to import it like a package the way I did it in main.py.
Any help is appreciated. Thanks.
I had to check back at one of my projects for a reference.
To test files in the tests folder you must first create setup.py, so that you can install your project for Python to use it.
If on Linux, use the command sudo python setup.py install to install the package. When changes have been made to the project, you must install again for the changes to take effect.
These folders will be created in your root project directory after installing: build, dist, and project.egg-info.
You may need to clean the build directory before re-installing to update.
python setup.py clean
python setup.py build
python setup.py install
Project Structure
project
├── setup.py
├── tests
│   └── parser_test.py
│
└── project
    ├── __init__.py
    ├── __init__.pyc
    ├── main.py
    └── parser
        ├── __init__.py
        ├── __init__.pyc
        ├── parser.py
        └── parser.pyc
project/setup.py
from setuptools import setup

# Make sure the project name will not conflict with other libraries.
# For example, do not name the project 'os', 'sys', etc.
setup(
    name='project',
    description='My project description',
    author='your_online_name',
    license='MIT',  # Check out software licenses
    packages=['project', 'tests']
)
project/tests/parser_test.py
from project.parser import Parser
parser = Parser()
project/project/__init__.py
from . import parser
project/project/parser/__init__.py
from .parser import Parser
project/project/parser/parser.py
class Parser(object):
    pass
You shouldn't be using absolute imports within your package. In-package imports should be done with relative imports, this way:
parser_test.py
from ..parser.parser import Parser
With relative imports in Python, the first dot refers to the file's own directory and each extra dot refers to one more parent directory.
In this case, you would be pointing to the parser/parser.py file, which from parser_test.py's standpoint is ../parser/parser.py.
If you are using Python 2, you should add the following line at the top of all the files in your parser package
from __future__ import absolute_import
This will avoid that you use absolute imports inside you package files by mistake.
Still assuming you are working with Python 2, you should also import unicode_literals for native unicode support and print_function to replace the print statement with the print() function.
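For instance, the top of each module in the package could carry this line (Python 2 only; it is redundant under Python 3):
# Opt in to Python 3 style imports, print() and unicode string literals
from __future__ import absolute_import, print_function, unicode_literals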
However, I would rather have my tests in the top folder of the package, which, assuming the package is called project and not parser, would give the following directory structure:
project/                 # top project directory
├── main.py
└── project              # top package directory
    ├── __init__.py      # this file is required even if it is empty
    ├── parser
    │   ├── __init__.py
    │   └── parser.py
    └── tests
        └── test_parser.py
Also, the project/project/parser/__init__.py could contain the following:
from .parser import Parser
So that your main.py file could import the Parser class like this:
from project.parser import Parser
instead of the more tedious:
from project.parser.parser import Parser
Your test_parser.py file, however, will still have to import the Parser class like this:
from ..parser.parser import Parser
because the classes exposed in an __init__.py file are not available to relative imports.
Finally, if you are starting a new independent project, you should do it in Python 3 (that's a PEP recommendation), where all the above rules apply, except the from __future__ imports which are unnecessary.
Sources: https://axialcorps.wordpress.com/2013/08/29/5-simple-rules-for-building-great-python-packages/

How do you organise a python project that contains multiple packages so that each file in a package can still be run individually?

TL;DR
Here's an example repository that is set up as described in the first diagram (below): https://github.com/Poddster/package_problems
If you could please make it look like the second diagram in terms of project organisation and can still run the following commands, then you've answered the question:
$ git clone https://github.com/Poddster/package_problems.git
$ cd package_problems
<do your magic here>
$ nosetests
$ ./my_tool/my_tool.py
$ ./my_tool/t.py
$ ./my_tool/d.py
(or for the above commands, $ cd ./my_tool/ && ./my_tool.py is also acceptable)
Alternatively: Give me a different project structure that allows me to group together related files ('package'), run all of the files individually, import the files into other files in the same package, and import the packages/files into other package's files.
Current situation
I have a bunch of python files. Most of them are useful when callable from the command line i.e. they all use argparse and if __name__ == "__main__" to do useful things.
Currently I have this directory structure, and everything is working fine:
.
├── config.txt
├── docs/
│   ├── ...
├── my_tool.py
├── a.py
├── b.py
├── c.py
├── d.py
├── e.py
├── README.md
├── tests
│   ├── __init__.py
│   ├── a.py
│   ├── b.py
│   ├── c.py
│   ├── d.py
│   └── e.py
└── resources
    ├── ...
Some of the scripts import things from other scripts to do their work. But no script is merely a library; they are all invokable. e.g. I could invoke ./my_tool.py, ./a.py, ./b.py, ./c.py etc. and they would do useful things for the user.
"my_tool.py" is the main script that leverages all of the other scripts.
What I want to happen
However I want to change the way the project is organised. The project itself represents an entire program useable by the user, and will be distributed as such, but I know that parts of it will be useful in different projects later so I want to try and encapsulate the current files into a package. In the immediate future I will also add other packages to this same project.
To facilitate this I've decided to re-organise the project to something like the following:
.
├── config.txt
├── docs/
│   ├── ...
├── my_tool
│   ├── __init__.py
│   ├── my_tool.py
│   ├── a.py
│   ├── b.py
│   ├── c.py
│   ├── d.py
│   ├── e.py
│   └── tests
│       ├── __init__.py
│       ├── a.py
│       ├── b.py
│       ├── c.py
│       ├── d.py
│       └── e.py
├── package2
│   ├── __init__.py
│   ├── my_second_package.py
│   ├── ...
├── README.md
└── resources
    ├── ...
However, I can't figure out a project organisation that satisfies the following criteria:
1. All of the scripts are invokable on the command line (either as ./my_tool/a.py or cd my_tool && ./a.py)
2. The tests actually run :)
3. Files in package2 can do import my_tool
The main problem is with the import statements used by the packages and the tests.
Currently, all of the packages, including the tests, simply do import <module> and it's resolved correctly. But when jiggering things around it doesn't work.
Note that supporting py2.7 is a requirement so all of the files have from __future__ import absolute_import, ... at the top.
What I've tried, and the disastrous results
1
If I move the files around as shown above, but leave all of the import statements as they currently are:
$ ./my_tool/*.py works and they all run properly
$ nosetests run from the top directory doesn't work. The tests fail to import the package's scripts.
pycharm highlights import statements in red when editing those files :(
2
If I then change the test scripts to do:
from my_tool import x
$ ./my_tool/*.py still works and they all run properly
$ nosetests run from the top directory doesn't work. The tests can import the correct scripts, but the imports inside those scripts themselves fail when the test scripts import them.
pycharm highlights import statements in red in the main scripts still :(
3
If I keep the same structure and change everything to be from my_tool import then:
$ ./my_tool/*.py results in ImportErrors
$ nosetests runs everything ok.
pycharm doesn't complain about anything
e.g. of 1.:
Traceback (most recent call last):
  File "./my_tool/a.py", line 34, in <module>
    from my_tool import b
ImportError: cannot import name b
4
I also tried from . import x but that just ends up with ValueError: Attempted relative import in non-package for the direct running of scripts.
Looking at some other SO answers:
I can't just use python -m pkg.tests.core_test as
a) I don't have main.py. I guess I could have one?
b) I want to be able to run all of the scripts, not just main?
I've tried:
if __name__ == '__main__' and __package__ is None:
    from os import sys, path
    sys.path.append(path.dirname(path.dirname(path.abspath(__file__))))
but it didn't help.
I also tried:
__package__ = "my_tool"
from . import b
But received:
SystemError: Parent module 'loading_tool' not loaded, cannot perform relative import
adding import my_tool before from . import b just ends up back with ImportError: cannot import name b
Fix?
What's the correct set of magical incantations and directory layout to make all of this work?
Once you move to your desired configuration, the absolute imports you are using to load the modules that are specific to my_tool no longer work.
You need three modifications after you create the my_tool subdirectory and move the files into it:
Create my_tool/__init__.py. (You seem to already do this but I wanted to mention it for completeness.)
In the files directly under my_tool: change the import statements to load the modules from the current package. So in my_tool.py change:
import c
import d
import k
import s
to:
from . import c
from . import d
from . import k
from . import s
You need to make a similar change to all your other files. (You mention having tried setting __package__ and then doing a relative import but setting __package__ is not needed.)
In the files located in my_tool/tests: change the import statements that import the code you want to test to relative imports that load from one package up in the hierarchy. So in test_my_tool.py change:
import my_tool
to:
from .. import my_tool
Similarly for all the other test files.
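As a sketch of what one of those test files ends up looking like (the test body here is just a placeholder, not the repository's real test):
# my_tool/tests/test_my_tool.py
from .. import my_tool

def test_my_tool_imports():
    assert my_tool is not None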
With the modifications above, I can run modules directly:
$ python -m my_tool.my_tool
C!
D!
F!
V!
K!
T!
S!
my_tool!
my_tool main!
|main tool!||detected||tar edit!||installed||keys||LOL||ssl connect||parse ASN.1||config|
$ python -m my_tool.k
F!
V!
K!
K main!
|keys||LOL||ssl connect||parse ASN.1|
and I can run tests:
$ nosetests
........
----------------------------------------------------------------------
Ran 8 tests in 0.006s
OK
Note that I can run the above both with Python 2.7 and Python 3.
Rather than make the various modules under my_tool be directly executable, I suggest using a proper setup.py file to declare entry points and let setup.py create these entry points when the package is installed. Since you intend to distribute this code, you should use a setup.py to formally package it anyway.
Modify the modules that can be invoked from the command line so that, taking my_tool/my_tool.py as example, instead of this:
if __name__ == "__main__":
print("my_tool main!")
print(do_something())
You have:
def main():
    print("my_tool main!")
    print(do_something())

if __name__ == "__main__":
    main()
Create a setup.py file that contains the proper entry_points. For instance:
from setuptools import setup, find_packages

setup(
    name="my_tool",
    version="0.1.0",
    packages=find_packages(),
    entry_points={
        'console_scripts': [
            'my_tool = my_tool.my_tool:main'
        ],
    },
    author="",
    author_email="",
    description="Does stuff.",
    license="MIT",
    keywords=[],
    url="",
    classifiers=[],
)
The file above instructs setup.py to create a script named my_tool that will invoke the main method in the module my_tool.my_tool. On my system, once the package is installed, there is a script located at /usr/local/bin/my_tool that invokes the main method in my_tool.my_tool. It produces the same output as running python -m my_tool.my_tool, which I've shown above.
Point 1
I believe it's working, so I don't comment on it.
Point 2
I always used tests at the same level as my_tool, not below it, but they should work if you do this at the top of each test file (before importing my_tool or any other .py file in the same directory):
import os
import sys
# Strip the last two path components (this file and its folder) from the
# absolute path, and put the resulting parent directory on sys.path.
sys.path.insert(0, os.path.abspath(__file__).rsplit(os.sep, 2)[0])
Point 3
In my_second_package.py do this at the top (before importing my_tool)
import os
import sys
# Point sys.path at the my_tool directory itself, so that `import my_tool`
# from package2 picks up my_tool/my_tool.py.
sys.path.insert(0, os.path.abspath(__file__).rsplit(os.sep, 2)[0]
                + os.sep + 'my_tool')
Best regards,
JM
To run it from the command line and have it act like a library, while allowing nosetests to operate in a standard manner, I believe you will have to take a doubled-up approach to the imports.
For example, the Python files will require:
try:
    import f
except ImportError:
    import tools.f as f
I went through and made a PR off the github you linked with all test cases working.
https://github.com/Poddster/package_problems/pull/1
Edit: I forgot the imports in __init__.py that are needed for it to be properly usable from other packages; they have been added. Now you should be able to do:
import tools
tools.c.do_something()
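Those __init__.py re-exports would look something like this (a sketch; the exact module list depends on the package contents):
# tools/__init__.py -- hypothetical: expose the submodules on the package
from . import c
from . import f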

Python can't import my package

I have the following directory structure:
myapp
├── a
│   ├── amodule.py
│   └── __init__.py
├── b
│   ├── bmodule.py
│   └── __init__.py
└── __init__.py
In a/amodule.py
I have this snippet which calls a simple function in b/bmodule.py
from myapp.b import bmodule
b.myfunc()
But when i run python a/amodule.py I get this error:
File "a/amodule.py", line 1, in <module>
from myapp.b import bmodule
ImportError: No module named 'myapp'
What am I doing wrong?
You need to put your project root onto your Python path. You can:
- set the PYTHONPATH environment variable, or
- alter sys.path before importing, or
- use an IDE like PyCharm that will do this kind of thing for you (although it will probably then be from b import blah).
There are likely other ways to resolve this issue as well. Watch out for circular imports...
In Python 3 you can also do relative imports (although I am not a big fan of this feature):
from ..b import blah
The best way to allow
from myapp.b import whatever
would be to edit your .bashrc file to always add your parent path to the PYTHONPATH:
export PYTHONPATH=$PYTHONPATH:/home/lee/Code
Now every time you log into the system, Python will treat your Code folder as a default place to look for importable modules, regardless of where the file is executed from.
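With that in place (and assuming /home/lee/Code is the directory that contains myapp), amodule.py can keep its absolute import. A minimal sketch, calling through the name that was actually imported (myfunc is the function from the question):
# myapp/a/amodule.py
from myapp.b import bmodule

bmodule.myfunc()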
