I'm creating my first package and I cannot figure out how to create a simple test.py file to debug my code.
Consider the following file structure.
MyPackage
|-- test.py
|-- PackageName
|   |-- __init__.py
|   |-- module.py
Inside test.py I have: from .PackageName.module import SomeClass
And of course this gives me the dreaded "attempted relative import with no known parent package" message.
I know I could just install the package and then import it with from PackageName.module import SomeClass but then I'm using the code installed on my system and not the code that I am actively editing.
There has got to be some kind of standard way of testing and debugging a package, right? Despite all the searching I've done, I can't seem to find any kind of solution.
I'm running test.py with python3 test.py
I have a layout like this:
src
    __init__.py
    main.py
    examples
        __init__.py
        example_project.py
    library
        __init__.py
        some_library_code.py
example_project.py uses code from some_library_code.py
I run example_project.py like this:
***\src>: python examples\example_project.py and get ImportError: attempted relative import with no known parent package
I've read through some answers on SO and found that I need a construction like
sys.path.append(os.path.normpath(os.path.join(os.path.dirname(os.path.abspath(__file__)), os.pardir)))
to be present in example_project.py
My example_project.py import section looks like this:
import os, sys
sys.path.append(os.path.normpath(os.path.join(os.path.dirname(os.path.abspath(__file__)), os.pardir)))
from ..library import some_library_code
but that doesn't work and shows the same ImportError
UPD:
if I change from ..library ***** to from library ***** then it works, but the IDE doesn't recognize the imported types and shows an error around the import clause
Revert the sys.path hacks and instead run your code as:
***\src>: python -m examples.example_project
This tells Python to run the module example_project that lives in the package examples. Otherwise (when you run the script file directly) Python has no way to know that the script is part of a package, hence the error. The sys.path hacks will fail in subtle ways: the IDE can't really follow those dynamic sys.path additions (there are settings for this, but then things start to become complicated, hence "hack"), and there are worse consequences, even undefined behavior. Running your script with the -m switch from the parent directory of your root package is the recommended way of running scripts.
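To see the difference concretely, here is a self-contained sketch that rebuilds a simplified version of this layout in a temporary directory and runs the script both ways. The module contents (GREETING) are invented for the demo, and example_project.py uses an absolute import (from library import ...), which is what works together with -m here:

```python
import os
import subprocess
import sys
import tempfile

with tempfile.TemporaryDirectory() as src:
    # Recreate the examples/ and library/ packages from the question.
    for d in ("examples", "library"):
        os.makedirs(os.path.join(src, d))
        open(os.path.join(src, d, "__init__.py"), "w").close()
    with open(os.path.join(src, "library", "some_library_code.py"), "w") as f:
        f.write("GREETING = 'hello'\n")
    with open(os.path.join(src, "examples", "example_project.py"), "w") as f:
        f.write("from library import some_library_code\n"
                "print(some_library_code.GREETING)\n")

    # Direct run: sys.path[0] is the examples/ directory, so `library`
    # is not importable and the script fails.
    direct = subprocess.run(
        [sys.executable, os.path.join("examples", "example_project.py")],
        cwd=src, capture_output=True, text=True)

    # -m run from src: src itself is on sys.path, so the absolute
    # import of `library` resolves.
    with_m = subprocess.run(
        [sys.executable, "-m", "examples.example_project"],
        cwd=src, capture_output=True, text=True)

print(direct.returncode != 0)       # the direct run fails
print(with_m.stdout.strip())        # the -m run prints the greeting
```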
You forgot to add an __init__.py in the library folder. Here is the documentation: https://docs.python.org/3/reference/import.html#regular-packages
Try from library.some_library_code import Something.
I am building on AWS CodeBuild using Python 2.7, but I believe this is much more a generic Python import problem. I have the directory setup shown below. I am running test.py inside the test folder, and I would like to import its dependency mainScript.py as part of this testing. However, I cannot seem to get the relative dependencies right, and I am having a lot of difficulty importing mainScript from within the test folder. Below is a layout of my directory:
main
    src
        mainScript.py
    test
        test.py
If for example my directory setup was something like
main
    test
        test.py
        mainScript.py
I could have my import be done the following way
from mainScript import *
I have confirmed this works. But I like it in its own src folder. These are the relative path attempts I have tried:
from ..src/mainScript import * #SyntaxError: invalid syntax
from ..src.mainScript import * #ValueError: attempted relative import beyond top-level package
from mainScript import * #ModuleNotFoundError: No module named 'mainScript'
from src.mainScript import * #ModuleNotFoundError: No module named 'src'
from src/mainScript import * #SyntaxError: invalid syntax
I have been struggling for a bit and I couldn't quite find a question with someone asking about accessing a brother/sister folder script. Thank you in advance for your help.
Python treats directories as packages if they contain an __init__.py file. Updating your structure to the following should do the trick:
__init__.py
src
    __init__.py
    mainScript.py
test
    __init__.py
    test.py
Now, from test.py, you could do from ..src import *. For more details on __init__.py, you can look here: What is __init__.py for?
In addition to adding the __init__.py files, it ended up being that I had to run python with the -m argument in my command, which was added in Python 2.4.
PEP 338
Python 2.4 adds the command line switch -m to allow modules to be
located using the Python module namespace for execution as scripts.
The motivating examples were standard library modules such as pdb and
profile, and the Python 2.4 implementation is fine for this limited
purpose.
So the command to launch from the top directory is:
python -m test.test
This seems to work and get the right namespace. Then in your test.py file you would import the mainScript the following way
from src.mainScript import *
I have a python project structured like this:
project/
----project/
--------__init__.py
--------process.py
--------config.py
----tests/
--------test_process.py
__init__.py is empty
config.py looks like this:
name = 'brian'
USAGE
I use the library by running python process.py from the project/project/ directory, or by specifying the python file path absolutely. I'm running Python 2.7 on Amazon EC2 Linux.
When process.py looks like below, everything works fine and process.py prints brian.
import config
print config.name
When process.py looks like below, I get the error ImportError: No module named project.config.
import project.config
print config.name
When process.py looks like below, I get the error ImportError: No module named project. This makes sense as the same behavior from the previous example should be expected.
from project import config
print config.name
If I add these lines to process.py to include the library root in sys.path, all the configurations above work fine.
import os
import sys
sys.path.append(os.path.abspath(os.path.join(os.path.dirname(__file__), '..')))
MY CONFUSION
Many resources suggest setting up python libraries to import modules using project.module_name, but it doesn't seem like sys.path appending is standard, and seems weird that I need it. I can see that the sys.path append added my library root as a path in sys, but I thought that's what the __init__.py in my library root was supposed to do. What gives? What am I missing? I know Python importing creates lots of headaches so I've tried to simplify this as much as possible to wrap my head around it. I'm going crazy and it's Friday before a holiday. I'm bummed. Please help!!
QUESTIONS
How should I set up my libraries? How should I import packages? Where should I have __init__.py files? Do I need to append my library root to sys.path in every project? Why is this so confusing?
Your project setup is alright. I renamed the directories just for clarity
in this example, but the structure is the same as yours:
repo_dir/
    project_package/
        __init__.py
        process.py
        config.py
    # Declare your project in a setup.py file, so that
    # it will be installable, both by users and by you.
    setup.py
When you have a module that wants to import from another module in
the same project, the best approach is to use relative imports. For example:
# In process.py
from .config import name
...
While working on the code on your dev box, do your work in a Python virtualenv,
and pip install your project in "editable" mode.
# From the root of your repo:
pip install -e .
With that approach, you'll never need to muck around with sys.path -- which
is almost always the wrong approach.
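For reference, a minimal setup.py for this layout might look like the following sketch (the name and version are placeholders, not taken from the question):

```python
# setup.py -- minimal sketch; name and version are placeholders.
from setuptools import find_packages, setup

setup(
    name="project_package",
    version="0.1.0",
    packages=find_packages(exclude=["tests", "tests.*"]),
)
```

After pip install -e ., edits under project_package/ are picked up immediately, because the "editable" install points back at your source tree rather than copying it into site-packages.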
I think the problem is how you're running your script. If you want the script to be living in a package (the inner project folder), you should run it with python -m project.process, rather than by filename. Then you can make absolute or explicit relative imports to get config from process.
An absolute import would be from project import config or import project.config.
An explicit relative import would be from . import config.
Python 2 also allows implicit relative imports, but they're a really bad misfeature that you should never use. With implicit relative imports, internal package modules can shadow top-level modules. For instance, a project/json.py file would hide the standard library's json module from all the other modules in the package. You can tell Python you want to forbid implicit relative imports by putting from __future__ import absolute_import at the top of the file. It's the standard behavior in Python 3.
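The shadowing point can be checked directly. This sketch (package and marker names invented) builds a package containing its own json.py and shows that, under Python 3's absolute-import default, a plain import json inside the package still gets the standard library module, while the sibling has to be addressed explicitly as pkg.json:

```python
import os
import subprocess
import sys
import tempfile

with tempfile.TemporaryDirectory() as root:
    pkg = os.path.join(root, "pkg")
    os.makedirs(pkg)
    open(os.path.join(pkg, "__init__.py"), "w").close()
    with open(os.path.join(pkg, "json.py"), "w") as f:
        f.write("MARKER = 'local'\n")
    with open(os.path.join(pkg, "main.py"), "w") as f:
        f.write("import json\n"        # absolute import: the stdlib json
                "import pkg.json\n"    # the sibling module, named explicitly
                "print(hasattr(json, 'loads'))\n"
                "print(pkg.json.MARKER)\n")
    out = subprocess.run([sys.executable, "-m", "pkg.main"],
                         cwd=root, capture_output=True, text=True)

print(out.stdout)   # stdlib json has loads(); the sibling is still reachable
```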
so I have this directory structure:
pkg/
    __init__.py
    script1.py
    dir1/
        __init__.py
        file.json
        dir2/
            __init__.py
            script2.py
As you can see, I have two script files, script1.py and script2.py. What I'm trying to do is import script1.py from script2.py, so I did
import pkg.script1
but it is telling me
ModuleNotFoundError: No module named 'pkg'
I also tried relative imports and did
from ... import script1
but I get this error
ValueError: attempted relative import beyond top-level package
Anyone got any idea?
Without seeing your commandline (the important missing information) this is only a guess. You can see a writeup I did on this here.
You're probably running python pkg/dir1/dir2/script2.py, which is going to put pkg/dir1/dir2 on the Python path (and not ., as you want). This leads to the error messages you see, because the script is in fact not part of a package at that depth (and pkg isn't importable from any of the sys.path roots). You'd also see the same problem in Python 2.
You should almost always use the -m approach when calling scripts that are modules. In your case it would be python -m pkg.dir1.dir2.script2
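This is a sketch of that advice, using the exact layout from the question (the module contents, VALUE = 42, are made up for the demo). It rebuilds pkg/dir1/dir2 in a temporary directory, runs script2.py directly to reproduce the error, and then runs it with -m from the directory containing pkg/:

```python
import os
import subprocess
import sys
import tempfile

with tempfile.TemporaryDirectory() as root:
    d2 = os.path.join(root, "pkg", "dir1", "dir2")
    os.makedirs(d2)
    for d in (os.path.join(root, "pkg"),
              os.path.join(root, "pkg", "dir1"),
              d2):
        open(os.path.join(d, "__init__.py"), "w").close()
    with open(os.path.join(root, "pkg", "script1.py"), "w") as f:
        f.write("VALUE = 42\n")
    with open(os.path.join(d2, "script2.py"), "w") as f:
        f.write("import pkg.script1\nprint(pkg.script1.VALUE)\n")

    # Direct invocation puts pkg/dir1/dir2 on sys.path, so `pkg` is invisible.
    direct = subprocess.run(
        [sys.executable, os.path.join("pkg", "dir1", "dir2", "script2.py")],
        cwd=root, capture_output=True, text=True)

    # -m from the directory that contains pkg/ makes the import resolve.
    with_m = subprocess.run(
        [sys.executable, "-m", "pkg.dir1.dir2.script2"],
        cwd=root, capture_output=True, text=True)

print("No module named" in direct.stderr)   # reproduces the question's error
print(with_m.stdout.strip())                # the -m run prints the value
```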
I developed a solution with the following structure:
my_package/
my_test_data/
test.py
In test.py I can easily import my_package (from my_package import my_class). This is very useful in my IDE of choice, where I can write test cases, execute them, and eventually set breakpoints in the code being tested where needed.
The final structure, ready for distribution, changed to:
my_package/
tests/
    my_test_data/
    test.py
This is ok if someone wants to test that what's been installed works fine: the test references the installed version of my_package. The problem is that during development I need to reference my_package from the development folder, so that I can test the live version I'm developing and eventually step into it for debugging purposes. I tried to solve the problem with relative imports (from .my_package import my_class, from ..my_package import my_class, and other combinations), but I get this exception:
ValueError: Attempted relative import in non-package
Any help?
I'll assume that the development structure is under /dev and the distribution structure is under /install.
Note that, by default, sys.path has the script directory as its first entry. So, if test.py has an import my_package statement and you run /dev/test.py, it should find my_package, even if your $PYTHONPATH is empty. If /install is the first entry in $PYTHONPATH, then running /install/tests/test.py should find /install/my_package for import my_package.
Summary: have you tried using import my_package in test.py and including /install in your $PYTHONPATH?
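A quick sketch of the first point (package name and contents are from the question; the NAME attribute is invented for the demo): because sys.path[0] is the directory of the script being run, a test.py sitting next to my_package/ can import it with a plain absolute import, with no sys.path manipulation:

```python
import os
import subprocess
import sys
import tempfile

with tempfile.TemporaryDirectory() as dev:
    pkg = os.path.join(dev, "my_package")
    os.makedirs(pkg)
    with open(os.path.join(pkg, "__init__.py"), "w") as f:
        f.write("NAME = 'my_package'\n")
    with open(os.path.join(dev, "test.py"), "w") as f:
        f.write("import my_package\nprint(my_package.NAME)\n")

    # Running dev/test.py puts dev/ at sys.path[0], so my_package resolves.
    out = subprocess.run([sys.executable, os.path.join(dev, "test.py")],
                         capture_output=True, text=True)

print(out.stdout.strip())
```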
Relative imports are only allowed within packages. You mustn't confuse directory references with package handling.
Proposed solutions on a similar question.