How to import nested modules inside a folder in Python

I'm having an issue similar to, but different from, the other posts I've seen on here: I'm trying to import a module from a nested package, and even though the Python linter resolves everything fine, I can't execute the file because of an import error (module not found).
ParentFolder/
|__ ContainerFolder/
    |__ __init__.py
    |__ Camera/
    |   |__ __init__.py
    |   |__ CameraService.py
    |   |__ Data.py
    |__ Settings/
    |   |__ __init__.py
    |   |__ SettingService.py
    |__ Handler/
    |   |__ __init__.py
    |   |__ Handle.py
    |__ Models/
        |__ __init__.py
        |__ Setting.py
What I want is to import Data.py inside of Handle.py
I have tried:
from ContainerFolder.Camera.Data import DataClass
The linter says it's fine, and autocomplete in VS Code gives me type-ahead; however, at execution I get a ModuleNotFoundError for the ContainerFolder module. I have an __init__.py in every directory, so what am I missing to make that a package I can import from?
Edit:
So CameraService.py and SettingService.py are both Tornado APIs. Since they both execute as main, how would one share modules between the two? I.e., use Data.py from modules under the Settings directory, or Setting.py from modules under the Camera directory?
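A minimal sketch of one way to make that import resolve in both services, assuming everything is launched from ParentFolder with -m (paths and names taken from the tree above):
# ContainerFolder/Handler/Handle.py
# The absolute import resolves because the process is started from
# ParentFolder, which puts that directory on sys.path.
from ContainerFolder.Camera.Data import DataClass
# Launch each Tornado service the same way, e.g.:
#   cd ParentFolder
#   python -m ContainerFolder.Camera.CameraService
#   python -m ContainerFolder.Settings.SettingService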

Try converting the file into a package, as described here:
https://www.jetbrains.com/help/pycharm/refactoring-convert.html
or here:
Converting a python package into a single importable file
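Another option (a sketch, not from the linked pages; the distribution name and version are assumptions) is to give ParentFolder a minimal setup.py and install the project in editable mode, so ContainerFolder is importable no matter which script runs as main:
# ParentFolder/setup.py -- minimal sketch for packaging ContainerFolder.
from setuptools import setup, find_packages
setup(
    name="containerfolder",  # assumed distribution name
    version="0.1.0",
    packages=find_packages(include=["ContainerFolder", "ContainerFolder.*"]),
)
After pip install -e . (run from ParentFolder), from ContainerFolder.Camera.Data import DataClass works in both Tornado services.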

Related

What statement to use to import relative packages from a relative folder without appending to the sys.path?

Given the following structure,
my_project/
|__ __init__.py
|__ README.md
|__ src/
|   |__ __init__.py
|   |__ data_processing.py
|   |__ utils.py
|   |__ models.py
|   |__ bootstrap.py
|__ python_scripts/
|   |__ myscript.py
|__ data/
|__ output/
I want to import my modules data_processing, utils, models, and bootstrap in my script myscript.py, without appending to sys.path.
Currently, I have at the top of my myscript.py:
import os, sys
module_path = os.path.abspath(os.path.join('..'))
if module_path not in sys.path:
    sys.path.append(module_path)
from src.utils import foo, bar
from src.data_processing import data_process_1, data_process_2
from src.models import MyModel
from src.bootstrap import bootstrap_eval
However, I am unsure what I need to add to my __init__.py file in src (and maybe elsewhere) in order to import my modules without appending the module path to sys.path.
I.e., I would like to be able to do something like this:
from src.utils import foo, bar
from src.data_processing import data_process_1, data_process_2
from src.models import MyModel
from src.bootstrap import bootstrap_eval
Is it possible to do this without building a Python wheel and installing the modules as a package?
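One way to get those imports to resolve without any sys.path manipulation (a sketch, assuming Python 3) is to run the script as a module from the my_project directory, so that my_project itself becomes the import root:
# python_scripts/myscript.py -- no sys.path editing needed when started as:
#   cd my_project
#   python -m python_scripts.myscript
# With -m, the current directory (my_project) is placed on sys.path, so the
# src package resolves directly.
from src.utils import foo, bar
from src.data_processing import data_process_1, data_process_2
from src.models import MyModel
from src.bootstrap import bootstrap_eval
if __name__ == "__main__":
    bootstrap_eval()  # illustrative call; real arguments depend on your code
Nothing needs to be added to src/__init__.py for this to work.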

Python Scripts and Classes outside Django App

Apologies if this is a duplicate question. I tried searching up my question but didn't find anything.
I have a Django project with 3 Django apps, and the project directory looks like this:
Project/
|__ App_1/
|__ App_2/
|__ App_3/
|   |__ views.py
|   |__ models.py
|__ scripts/
    |__ class1.py
The scripts folder has Python scripts that run algorithms using data from the database. I am using Django QuerySets to retrieve data. The class1.py file would look something like this:
from App_1.models import table1
.
.
.
When doing that, I get this error:
ModuleNotFoundError: No module named 'App_1'
Curious, I moved class1.py into the App_1 folder and got this error:
ImportError: attempted relative import with no known parent package
I am a bit confused as to why BOTH methods did not work. What would be the best way to go about this? Ideally, I want to set up my scripts so that I can just instantiate an object inside a view function in App_1/views.py.
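A common pattern for standalone scripts that need the ORM is to configure Django before importing any models; a sketch (the file name and the settings module Project.settings are assumptions, adjust to your project):
# scripts/standalone_example.py -- hypothetical; runs ORM code outside the apps.
import os
import sys
import django
if __name__ == "__main__":
    # Make the project root importable, then configure Django before
    # importing any models.
    sys.path.append(os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
    os.environ.setdefault("DJANGO_SETTINGS_MODULE", "Project.settings")
    django.setup()
    from App_1.models import table1
    print(table1.objects.count())
When class1.py is imported from App_1/views.py instead, Django is already configured, so an import like from scripts.class1 import SomeClass (SomeClass being whatever class1.py defines) is enough, because manage.py puts the project root on sys.path.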

How to import modules from adjacent package without setting PYTHONPATH

I have a python 2.7 project which I have structured as below:
project
|
|____src
|    |
|    |__pkg
|       |
|       |__ __init__.py
|
|____test
     |
     |__test_pkg
     |  |
     |  |__ __init__.py
     |
     |__helpers
     |  |
     |  |__ __init__.py
     |
     |__ __init__.py
I am adding the src folder to PYTHONPATH, so importing works nicely for the packages inside src. I am using Eclipse, pylint inside Eclipse, and nosetests in Eclipse as well as via bash and in a makefile (for the project), so I have to satisfy, let's say, every stakeholder!
The problem is importing some code from the helpers package in test. Weirdly enough, my test directory is also a Python package, with an __init__.py containing some top-level setUp and tearDown methods for all tests. So when I try this:
import helpers
from helpers.blaaa import Blaaa
in some module inside test_pkg, none of my stakeholders are satisfied. I get ImportError: No module named ..., and pylint also complains about not finding it. I can live with pylint complaining in the test folders, but nosetests also dies if I run it in the project directory or the test directory. I would prefer not to do relative imports with a dot (.).
The problem is that you cannot escape the current directory by importing from ..helpers.
But if you start your test code inside the test directory with
python3 -m test_pkg.foo
the current directory will be the test directory, and importing helpers will work. On the minus side, that means you have to import from . inside test_pkg.
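A concrete sketch of that second approach (foo is the module name from the command above; the helper function is illustrative):
# test/test_pkg/foo.py -- run from inside the test directory as:
#   cd test
#   python3 -m test_pkg.foo
# The current directory (test) is then on sys.path, so helpers resolves as a
# top-level package; sibling modules inside test_pkg have to be imported
# relatively, e.g. from . import some_module (a hypothetical name).
import helpers
from helpers.blaaa import Blaaa
def make_one():
    # Illustrative helper, not from the original post.
    return Blaaa()
if __name__ == "__main__":
    make_one()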

How can I upload documentation to pythonhosted.org

I've been struggling for a long time to upload some documentation online, but all my attempts have been in vain. I've read so many guides, but they still don't help. I've zipped the files, but I get errors when uploading manually: the PyPI website says index.html can't be found even though it's there. I've also tried the Sphinx upload package but can't really get it to work successfully.
Any help would be greatly appreciated. The PyPI help guides don't help either.
This is how my folder is structured:
Eventsim
|__ build
|__ dist
|__ doc
|   |__ _build
|   |   |__ html
|   |   |   |__ index.html
|   |   |   |__ (all html files, etc.)
|   |   |__ doctrees
|   |__ _static
|   |__ _templates
|   |__ conf.py
|   |__ index.rst
|   |__ make.bat
|   |__ etc.
|__ eventsim
|__ setup.py
|__ readme
The doc folder contains all the Sphinx data.
I once made a setup.cfg and the docs built fine, but uploading was a fatal error.
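One thing worth checking: the documentation upload expects an index.html at the top level of the zip, so the archive should contain the contents of doc/_build/html rather than the html folder itself. A sketch of packing it that way with the standard library (the helper script name is hypothetical):
# pack_docs.py -- zips the contents of doc/_build/html so that index.html
# ends up at the root of the resulting docs.zip.
import shutil
shutil.make_archive(base_name="docs", format="zip", root_dir="doc/_build/html")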

Import error when running tests in a submodule with nose

We use git submodules to share common modules within our team.
Each module has test cases in its source folder, run by nose.
I have a project with this structure:
/project_sample
|__/project_sample
|  |__ __init__.py
|  |__ moduleA.py
|  |__/tests
|  |  |__ __init__.py
|  |  |__ moduleA_tests.py
|  |__/subpackage
|     |__ __init__.py
|     |__ moduleB.py
|     |__/tests
|        |__ __init__.py
|        |__ moduleB_tests.py
|__setup.py
All of these __init__.py files are empty.
The subpackage is developed separately and added to the project as a git submodule. We want it to be self-contained, and to share it across different projects. Its test case looks like this:
moduleB_tests.py:
from subpackage import moduleB
def test_funcA():
    moduleB.funcA()
The test passes when I run nosetests from the subpackage's repo folder.
It seems like nose finds an __init__.py file in the parent folder of subpackage (project_sample); when I run nosetests from project_sample's root directory, I get "ImportError: No module named subpackage". But it passes when I change the first line to:
from project_sample.subpackage import moduleB
But this way makes subpackage not self-contained.
I have tried things like adding subpackage to sys.path or using the -w option of nose, but I still get this exception.
My teammate runs subpackage's test cases separately in PyCharm and they pass, so I think there should be some way to make them pass from the command line.
Is there any way to resolve the problem, or any suggestions about the project structure?
This is my first question on SO; any suggestion is appreciated.
I know this question is a bit old and has already been answered but we use a different strategy in our code base.
If for some reason you still want the package in the project_sample module directory, you can structure it something like this:
/project_sample
|__/project_sample
|  |__ __init__.py
|  |__ moduleA.py
|  |__/tests
|  |  |__ __init__.py
|  |  |__ moduleA_tests.py
|  |__/subpackage_repo
|     |__/subpackage
|     |  |__ __init__.py
|     |  |__ moduleB.py
|     |  |__/tests
|     |     |__ __init__.py
|     |     |__ moduleB_tests.py
|     |__setup.py
|__setup.py
Without an __init__.py in subpackage_repo, it won't be a part of your project.
Then, in the __init__.py of the main package, you can include:
# Put the subpackage_repo directory (resolved relative to this file) on
# sys.path so that subpackage can be imported as a top-level package.
import os
import sys
sys.path.insert(
    0,
    os.path.join(
        os.path.dirname(os.path.dirname(__file__)),
        'subpackage_repo'
    )
)
import subpackage
which puts subpackage on the path for the main project.
Then, in moduleA.py, that allows us to do something like:
from . import subpackage
since the package is imported at the __init__.py level.
Of course, you can also move it up a level and just reflect that in the path you're adding, if you want to do it that way. The only disadvantage is that when you run your tests with something like
python -m unittest discover
you'll discover the subpackage tests as well and run those. Ideally, those tests should be taken care of by whatever CI you have for the package, so we'd hope they don't need to be run.
If you want subpackage to be self-contained, you need to treat it as such. That means putting it at the top level, parallel to project_sample, instead of inside it.
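The layout above already shows a setup.py inside subpackage_repo; a minimal sketch of what it could contain (the distribution name and version are assumptions), so the self-contained checkout can be installed in editable mode and imported as subpackage everywhere:
# subpackage_repo/setup.py -- minimal sketch; after
#   pip install -e ./subpackage_repo
# (or pip install -e . from a top-level checkout), from subpackage import
# moduleB works in the main project and in the submodule's own tests
# without any sys.path editing.
from setuptools import setup, find_packages
setup(
    name="subpackage",  # assumed distribution name
    version="0.1.0",
    packages=find_packages(include=["subpackage", "subpackage.*"]),
)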
