How can I import my own package in Python?

I have a project:
testci/
├── __init__.py
├── README.md
├── requirements.txt
├── src
│   ├── __init__.py
│   └── mylib.py
└── test
    ├── __init__.py
    └── pow_test.py
When I run python3.6 test/pow_test.py I see this error:
File "test/pow_test.py", line 3, in <module>
    import testci.src.mylib as mylib
ModuleNotFoundError: No module named 'testci'
pow_test.py:
from testci.src.mylib import get_abs

def test_abs():
    assert get_abs(-10) == 10
How can I fix this error?
System details: Ubuntu 16.04 LTS, Python 3.6.10

Try this:
from .src import mylib
from mylib import get_abs
If that doesn't work, import the names one by one. But don't import through the root folder: when the file doing the importing lives inside the same package it is trying to import, the import will always raise an error.
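Note that relative imports like from .src import mylib only work when the file is executed as part of a package. A minimal sketch, assuming you launch Python from the directory that contains testci/:

# testci/test/pow_test.py
# Relative import: go up one package level (test -> testci), then into src.
from ..src.mylib import get_abs

def test_abs():
    assert get_abs(-10) == 10

if __name__ == '__main__':
    test_abs()

# Run it from the parent directory of testci/ so testci is seen as a package:
#   python -m testci.test.pow_test

Running pow_test.py directly as a script instead raises an "attempted relative import" error, because the file is then executed as __main__ rather than as testci.test.pow_test.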

Run Python with the -m flag from the directory containing the base testci package, so the test executes as a submodule.
I made a similar mock folder structure:
├── abc_blah
│   ├── abc_blah.py
│   └── __init__.py
└── def
    ├── def.py
    └── __init__.py

abc_blah.py:
print('abc')

def.py:
import abc_blah.abc_blah

Execute it like this:
python -m def.def

This correctly prints out 'abc' as expected.

Simply add __package__ = "testci" at the top of the file; it is also good practice to wrap the import in a try/except block.
Your final code should look something like this:
try:
    from testci.src.mylib import get_abs
except ModuleNotFoundError:
    from ..testci.src.mylib import get_abs
To run it, type python -m test.pow_test

I think your issue is how the package is installed. The import looks fine to me. Since the name mentions CI, I'm guessing the package is being installed remotely with only the test folder present somehow.
Try adding a setup.py file in which you declare that both the test and the src packages are part of your testci package.
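A minimal sketch of such a setup.py, assuming it lives in the directory that contains testci/ (the name and version below are placeholders):

# setup.py, placed next to the testci/ folder
from setuptools import setup

setup(
    name='testci',
    version='0.1',
    # List every sub-package explicitly so src and test ship together.
    packages=['testci', 'testci.src', 'testci.test'],
)

After installing with python -m pip install -e ., the absolute import from testci.src.mylib import get_abs resolves from anywhere, including the test runner.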

There are many ways to organize a project. Keep a few things in mind: the structure should be simple and scalable, and it should let you tell the parts of the codebase apart easily.
One good possible way to structure a project is below:
project/
├── app.py
├── dockerfile
├── pipfile
├── Readme.md
├── requirements.txt
├── src_code
│   ├── code
│   │   ├── __init__.py
│   │   └── mylib.py
│   └── test
│       ├── __init__.py
│       └── test_func.py
└── travisfile

Here app.py is the main file, responsible for running the entire project.
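As an illustration only, the entry point under this layout might look like the sketch below; the run function is a made-up name, not something from the original answer, and it assumes src_code is importable (Python 3 namespace packages, or an __init__.py inside src_code):

# app.py - entry point at the project root
from src_code.code.mylib import run  # hypothetical function in mylib.py

if __name__ == "__main__":
    run()

Because app.py sits at the project root, python app.py puts that root on sys.path, so the absolute import of src_code.code.mylib resolves without any path tweaking.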

Related

ImportError: No module named - unable to import top-level directory of a module in Python

My directory structure:
├── common
│   ├── common.py
│   └── __init__.py
├── project1
│   ├── __init__.py
│   └── scripts
│       ├── example_import.py
│       └── __init__.py
└── project2
    ├── __init__.py
    └── scripts
        └── __init__.py
I need to import the common/common.py module in the project1/scripts/example_import.py file.
example_import.py:
import sys
sys.path.append("../common")
from common import Test
print("Module Not import error")
Error:
Traceback (most recent call last):
  File "project1/scripts/example_import.py", line 3, in <module>
    from common import Test
ImportError: No module named common

How do I fix this issue?
Understanding how Python imports work is tricky in the beginning, but it makes sense once you see the underlying mechanics.
There are different ways to fix your import issue. I would not recommend messing with sys.path. Depending on where you are calling your script from, you have multiple choices at hand.
Your directory structure:
├── common
│   ├── common.py
│   └── __init__.py
├── project1
│   ├── __init__.py
│   └── scripts
│       ├── example_import.py
│       └── __init__.py
└── project2
    ├── __init__.py
    └── scripts
        └── __init__.py
From the root of the directory,
python project1/scripts/example_import.py
will work, assuming the import in example_import.py looks like
from common.common import Test
If you want to use from common import Test, you need to add from common.common import Test to the __init__.py file in the common folder.
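That re-export is a one-liner. A sketch of it under this layout:

# common/__init__.py
# Re-export Test so callers can write `from common import Test`.
from common.common import Test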
You can also use the PYTHONPATH environment variable to tell Python where to look for the base module:
PYTHONPATH=/pathto/basefolder python folder/filesx.py
Another way is to create a setup.py in the base folder and do a development installation with python -m pip install --editable . in the environment you are working in.
Example of setup.py:
#!/usr/bin/env python
from distutils.core import setup

setup(name='projectX',
      version='1.0',
      description='The World Is Not Enough',
      author='James Bond',
      author_email='agent#007.net',
      url='https://www.python.org/sigs/distutils-sig/',
      packages=['common', 'project1', 'project2'],
      )
See Python Documentation for more setup.py options.

importing python modules and packages in different sub-directories of the same project

I'd like to figure out the cleanest, and preferably self-contained, way to use my packages in scripts that live in a different directory from the package itself.
The example problem is as follows:
The modules in lib need to be both importable and runnable as scripts.
My project directory is as below and I'm having two issues:
in lib/api.py, I want to read in data_files/key.txt correctly when api.py is called or imported
in testing_script.py I want to import and use lib/get_data.py
I can't seem to find a clean way to do this. Does this mean my project is structured in a non-Pythonic way?
Thanks for the help.
my-project-git
├── LICENSE
├── README.md
├── my_project
│   ├── data_files
│   │   ├── key.txt
│   │   └── mappings.csv
│   ├── lib
│   │   ├── __init__.py
│   │   ├── api.py
│   │   └── get_data.py
│   └── test
│       ├── __init__.py
│       └── testing_script.py
├── requirements.txt
└── setup.py
As far as I know, there isn't a single Pythonic way to structure a project.
This is what Kenneth Reitz recommended in 2013 and it's what I use: https://www.kennethreitz.org/essays/repository-structure-and-python.
README.rst
LICENSE
setup.py
requirements.txt
sample/__init__.py
sample/core.py
sample/helpers.py
docs/conf.py
docs/index.rst
tests/test_basic.py
tests/test_advanced.py
Inside sample (my_project in your case) you can separate into categories as you like. E.g. Utils (common functions), Database (read, write), View (user commands), etc. It depends on your project.
As for calling modules at the same level, you should expose them in the __init__.py file of the top-level package, which is sample in this case.
For example:
__init__.py in my_project:
from sample.core import a_Class
from sample.core import a_function
from sample.core import anything
Then from tests/test_basic.py you do:
from sample import a_Class
# or import sample
a = a_Class() # use the class from core.py
# or a = sample.a_Class()
Take a look at the sample module repository: https://github.com/navdeep-G/samplemod

python import not working from one module to another

I have the below structure.
In the migrations/env.py file I am trying to do from database import *,
but it shows no module named database.
I tried from ..database import * and adding the file to PYTHONPATH as well, but no luck :(
Your directory structure looks a bit suspicious to me. The alembic.ini shouldn't normally be part of the package (and setuptools won't pick it up by default when packaging). It would be better placed in the project root.
Something like this would be more standard:
├── alembic.ini
├── migrations
│   ├── env.py
│   ├── script.py.mako
│   └── versions
│       └── ...
├── package_name
│   ├── database
│   │   ├── __init__.py
│   │   └── ...
│   └── models
│       ├── __init__.py
│       └── ...
├── README.md
└── setup.py
Now, this alone would not make database available from env.py. For this to work you have to somehow make your package discoverable. Usually this would be done by installing package_name into some virtualenv. In that environment you could then use from package_name.database import * in your env.py.
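A minimal sketch of that workflow, assuming the layout above (package_name is the placeholder name from the tree):

# setup.py at the project root
from setuptools import setup, find_packages

setup(
    name='package_name',
    version='0.1',
    packages=find_packages(),  # picks up package_name and its sub-packages
)

Then run python -m pip install -e . once inside the virtualenv; after that, migrations/env.py can simply do from package_name.database import *.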
Migrations needs to know where to import from. Either both directories belong to the same package:
A:
├── migrations
├── database
└── __init__.py
And then in migrations:
from A.database.whatever import whatever_else
Or you install them as separate packages inside your virtualenv, each with its own setup.py:
database/setup.py
migrations/setup.py
Then each can depend on the other; because both are installed, migrations/env.py can import the installed database package.

Trouble loading local modules only with AWS Lambda

app structure:
.
├── Makefile
├── Pipfile
├── Pipfile.lock
├── README.md
├── template.yaml
├── tests
│   ├── __init__.py
│   └── unit
│       └── lambda_application
│           ├── test_handler.py
│           └── test_parent_child_class.py
└── lambda_application
    ├── __init__.py
    ├── first_child_class.py
    ├── lambda_function.py
    ├── second_child_class.py
    ├── requirements.txt
    └── parent_class.py

4 directories, 14 files
Code sample from lambda_function.py:
import os
import json
from hashlib import sha256

import boto3
from requests import Session

from .first_child_class import FirstChildClass

def lambda_handler(event, context):
    # Do some stuff.
    ...
As is, I get the error message
Unable to import module 'lambda_function'
but if I comment out the last import, from .first_child_class import FirstChildClass, it gets past that part and instead errors because the module for that class hasn't been loaded.
I only seem to get this error when I run it in the lambci/lambda:python3.7 docker image and when I deploy on AWS. All my tests pass, and they import the module with no problems.
Is there something I should load/set up in the __init__.py file?
EDIT: I changed the names of some of the files to post it here.
You are using a relative import here, which works when the code being executed is part of a package. However, since your code is not executed as part of a package, your AWS Lambda fails.
https://stackoverflow.com/a/73149/6391078
A quick run locally gave the following error:
PYTHON 3.6
Traceback (most recent call last):
  File "lambda_function.py", line 4, in <module>
    from .first_child_class import FirstChildClass
ModuleNotFoundError: No module named '__main__.first_child_class'; '__main__' is not a package

Your tests pass because your testing suite imports the file as a module from the lambda_application folder, which gets treated as a package by the test runner.
This got me going in the correct direction; it didn't quite give me the answer, but it led me to it, so I thought I would post what I found here.
I didn't try it, but from what I found I believe that
from first_child_class import FirstChildClass
would be the simplest resolution.
What I ended up doing was moving the classes into a subdirectory and doing essentially the same as above, but with a package name prepended.
So, the file structure changed to:
.
├── Makefile
├── Pipfile
├── Pipfile.lock
├── README.md
├── template.yaml
├── tests
│   ├── __init__.py
│   └── unit
│       └── lambda_application
│           ├── test_handler.py
│           └── test_parent_child_class.py
└── lambda_application
    ├── __init__.py
    ├── lib
    │   ├── first_child_class.py
    │   ├── second_child_class.py
    │   └── parent_class.py
    ├── lambda_function.py
    └── requirements.txt

and my import became from lib.first_child_class import FirstChildClass.
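For illustration, the top of lambda_function.py after that change might look like the sketch below (the handler body is elided, as in the question):

# lambda_application/lambda_function.py
import os
import json
from hashlib import sha256

import boto3
from requests import Session

# Absolute import: Lambda executes lambda_function.py as a top-level module,
# so lib is resolved relative to the deployed lambda_application folder.
from lib.first_child_class import FirstChildClass

def lambda_handler(event, context):
    # Do some stuff.
    ...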

Python project structure for selenium test

I'm a bit rusty with Python (2.7) modules & packages, so I hope to find some help. I am writing some tests using selenium. I would like to organize the tests by "scenario families", and I'd like to implement a helper class to handle some boilerplate that would be the base class for all the tests.
Basically I'd like to have a structure looking like:
.
├── assets
│   └── ressource.ext
├── tests
│   ├── __init__.py
│   ├── user
│   │   ├── __init__.py
│   │   └── upload.py
│   └── visitor
│       ├── __init__.py
│       ├── homepage.py
│       ├── login.py
│       ├── search.py
│       └── signup.py
└── utils
    ├── __init__.py
    └── base.py
I am completely stuck on how to make the utils.base module visible to e.g. the tests.visitor.signup module while still allowing this specific test to be run with python tests/visitor/signup.py.
How would I do that?
Thanks!
Edit: to make things easier, here is a dummy sample of what I'm trying to do:
mkdir -p {utils,tests/user}
touch {utils,tests{,/user}}/__init__.py
echo -e "import unittest\n\nclass Base(unittest.TestCase):\n pass" > utils/base.py
echo -e "from utils.base import Base\n\nclass MyTest(Base):\n pass\n\nif __name__ == '__main__':\n unittest.main()" > tests/user/upload.py
This produces the following tree with empty __init__.py files and the import I'm trying to achieve in tests/user/upload.py:
dummy/
├── tests
│   ├── __init__.py
│   └── user
│       ├── __init__.py
│       └── upload.py
└── utils
    ├── __init__.py
    └── base.py
Now if I am in the dummy/ folder I am getting this:
(env) dummy $ python tests/user/upload.py
Traceback (most recent call last):
  File "tests/user/upload.py", line 1, in <module>
    from utils.base import Base
ImportError: No module named utils.base
But if I run it interactively there is obviously no problem:
(env) dummy $ python
Python 2.7.6 (default, Jan 16 2014, 16:39:48)
>>> from utils.base import Base
>>>
My problem must be really silly, but I can't see what I'm doing wrong. What I also don't get is that specifying the Python path ((env) dummy $ PYTHON_PATH=. python tests/user/upload.py) doesn't fix the issue.
You can set PYTHONPATH to search the current directory (note the variable is PYTHONPATH, not PYTHON_PATH as in your attempt):
PYTHONPATH=. python tests/user/upload.py
Or you can use a test runner, like nose:
nosetests tests/user/upload.py
Or you can create your own test runner if you don't want to use nose. For instance a runtests.py file with:
import unittest
from tests.user.upload import *
unittest.main()
Then:
python runtests.py
This test runner could import more tests or could be selective about what tests it imports. Ultimately, I'd recommend using nose over writing your own test runner.
Starting a test runner at the top of your hierarchy works because Python adds the directory containing the script it runs to the list of paths it searches for modules.
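If you really want each test file to stay runnable directly, a common (if less clean) self-contained workaround is to push the project root onto sys.path at the top of the test. A sketch assuming the dummy/ layout above:

# tests/user/upload.py
import os
import sys
import unittest

# Go two levels up from this file's directory (tests/user -> project root)
# so that `utils` becomes importable regardless of the current directory.
sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.dirname(os.path.abspath(__file__)))))

from utils.base import Base

class MyTest(Base):
    pass

if __name__ == '__main__':
    unittest.main()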
