I have a standard Python proj package::
proj/
├── proj
│   ├── data
│   │   └── data.csv
│   ├── __init__.py
│   └── lib.py
├── MANIFEST.in
├── setup.py
└── tests
    ├── __init__.py
    └── lib_test.py
data/data.csv is installed via MANIFEST.in and the include_package_data key in setup.py::
~/proj$ more setup.py
from setuptools import setup
setup(name='proj',
      version='1.0',
      packages=['proj'],
      test_suite='tests',
      include_package_data=True,
      zip_safe=False)
~/proj$ more MANIFEST.in
include proj/data/*.csv
So after a python setup.py install, data.csv is where I want it in site-packages::
$ tree ~/venv/lib/python2.7/site-packages/proj-1.0-py2.7.egg
~/venv/lib/python2.7/site-packages/proj-1.0-py2.7.egg
├── EGG-INFO
│   └── ...
└── proj
    ├── data
    │   └── data.csv
    ├── __init__.py
    └── ...
In setup.py, test_suite is declared, so when I run the tests from the top-level proj
folder, this works well::
~/proj$ python setup.py test
running test
...
----------------------------------------------------------------------
Ran 1 test in 0.000s
OK
I would like my unit test to use the data.csv file from ~/venv/lib/python2.7/site-packages/proj-1.0-py2.7.egg/proj/data/data.csv.
In short, I need a way to get the ~/venv/lib/python2.7/site-packages/proj-1.0-py2.7.egg/proj/data/ folder.
With this code::
$ more tests/lib_test.py
import unittest, os, proj
from proj.lib import get_data

class TestUtils(unittest.TestCase):
    def test_get_data(self):
        datapath = os.path.dirname(os.path.abspath(proj.__file__)) + '/data'
        data = '{}/data.csv'.format(datapath)
        print '\n my data:', data

if __name__ == '__main__':
    unittest.main()
Using __file__, I only manage to get ~/sandbox/proj/proj/data/data.csv::
$ python setup.py test
running test
...
test_get_data (tests.lib_test.TestUtils) ...
my data: ~/sandbox/proj/proj/data/data.csv
ok
----------------------------------------------------------------------
Ran 1 test in 0.000s
In fact,
python setup.py test
does not load the Python package from the place where python setup.py install put it (~/venv/lib/python2.7/site-packages/proj-xxx), but from the place where python setup.py test is run.
That's why, with:
import proj, os
os.path.abspath(proj.__file__)
we can't reach the site-packages folder, only the working-copy files.
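One way to resolve the data path relative to whichever proj actually gets imported (installed egg or working copy) is setuptools' pkg_resources; this is only a minimal sketch, assuming the proj/data/data.csv layout above::
import pkg_resources
# Resolve data.csv relative to the proj package that was actually imported,
# extracting it to a real file first if the package happens to be zipped.
datapath = pkg_resources.resource_filename('proj', 'data/data.csv')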
Related
Poetry and pytest are relatively new to me and I am seeking to understand a specific behavior.
I have created a project with Poetry, and I have added pytest as a dependency with poetry add --group dev pytest. As a result, here are the relevant lines from pyproject.toml:
[tool.poetry.dependencies]
python = "^3.10"
[tool.poetry.group.dev.dependencies]
pytest = "^7.2.0"
Here is the project/modules structure:
.
├── app
│   ├── core
│   │   ├── cli.py
│   │   └── __init__.py
│   ├── __init__.py
│   └── run.py
├── poetry.lock
├── pyproject.toml
├── README.md
└── tests
    └── test_cli.py
In test_cli.py I am importing the cli module with from app.core.cli import *
With this particular setup, running poetry run pytest fails with:
tests/test_cli.py:2: in <module>
from app.core.cli import *
E ModuleNotFoundError: No module named 'app'
However, if I invoke a poetry shell and run python3 -m pytest tests/ it works.
It also works if I create an __init__.py in the tests/ directory.
I have seen a similar issue described here, but I do not understand how it was resolved. One of the suggestions, to include the project root name in the import statement, did not help.
I also tried this answer, but it did not work either.
I was under the impression that an __init__.py is not necessary in tests/ and that poetry run pytest should work without it.
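For what it is worth, one possible workaround, assuming pytest >= 7.0 (which ships a built-in pythonpath ini option), is to point pytest at the project root in pyproject.toml; this is only a sketch, not necessarily how the linked issue was resolved:
[tool.pytest.ini_options]
pythonpath = ["."]
testpaths = ["tests"]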
I have the following directory structure:
Project
├── README.md
├── app
│   ├── __init__.py
│   └── main.py
│
└── tests
    ├── __init__.py
    └── test_one.py
I want to import app.main in test_one. I went through similar Stack Overflow questions and tried to add the path to the app folder in test_one.py as follows:
sys.path.append('/path to project/app')
However, I am getting the following error:
ModuleNotFoundError: No module named 'app'
How can I import the files from app into the test_one.py file? Is there a simple from ... import statement to achieve this?
You did not mention what unit testing framework you are using, but if you are using pytest, it already supports that kind of app/test organization where the tests are outside the application code.
setup.py
mypkg/
    __init__.py
    app.py
    view.py
tests/
    test_app.py
    test_view.py
    ...
If you don’t have a setup.py file and are relying on the fact that
Python by default puts the current directory in sys.path to import
your package, you can execute python -m pytest to execute the tests
against the local copy directly, without using pip.
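As a side note (this is not from the quoted docs): another commonly seen fallback, when there is no setup.py and the package is not installed, is a small conftest.py at the project root that puts the root directory on sys.path explicitly. A rough sketch, assuming the Project/app layout from the question:
# conftest.py, placed at the project root next to app/ and tests/
import os
import sys
# Make "from app import main" work when pytest collects the tests,
# regardless of the current working directory.
sys.path.insert(0, os.path.dirname(os.path.abspath(__file__)))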
Your code is already structured correctly. You don't need to set sys.path manually. You can use from app import main normally (see my sample test_one.py below). The only thing I would add is a pytest.ini to specify the test paths and the pattern for the test files.
directory structure:
Project
├── pytest.ini
├── app
│   ├── __init__.py
│   └── main.py
│
└── tests
    ├── __init__.py
    └── test_one.py
pytest.ini:
[pytest]
addopts = -v
console_output_style = count
python_files = test_*.py
testpaths = tests
sample main.py:
def calculate(x, y):
    return x + y
sample test_one.py:
from app import main

def test_calculate():
    assert 3 == main.calculate(1, 2)
running pytest:
$ pytest tests
=============================================================================== test session starts ===============================================================================
platform linux -- Python 3.7.2, pytest-4.6.2, py-1.8.0, pluggy-0.12.0 -- /home/gino/.virtualenvs/test-py37/bin/python3.7
cachedir: .pytest_cache
rootdir: /home/gino/Project, inifile: pytest.ini, testpaths: tests
collected 1 item
tests/test_one.py::test_calculate PASSED [1/1]
============================================================================ 1 passed in 0.01 seconds =============================================================================
Python 3.6
I've written some components and I'm trying to import one of them into the other.
Below is what my project structure looks like:
.
└── components
    ├── __init__.py
    ├── extract
    │   └── python3
    │       ├── __init__.py
    │       └── extract.py
    └── transform
        └── python3
            ├── __init__.py
            └── preprocess.py
extract.py
from components.transform.python3.preprocess import my_function

if __name__ == '__main__':
    my_function()
preprocess.py
def my_function():
    print("Found me")
When I run python components/extract/python3/extract.py
I see the following error:
ModuleNotFoundError: No module named 'components'
I've added an empty __init__.py file to the directories that contain modules as well as the top level package directory.
OK, imports require the top-level package to be available on the Python path (sys.path).
So to make it work, you should:
cd to the directory containing components
add . to the Python PATH:
export PYTHONPATH='.'
launch your script:
python components/extract/python3/extract.py
On my system, it successfully displays:
Found me
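As an alternative (an equivalent invocation, assuming you are still in the directory containing components), you can let Python put the current directory on sys.path itself by running the script as a module:
python -m components.extract.python3.extract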
I want to create a distributable Python package. For that, I organized my directories as follows:
.
├── config
│   └── test.yml
├── MANIFEST.in
├── sample
│   ├── hello.py
│   ├── __init__.py
│   └── world
│       ├── __init__.py
│       └── refer.py
└── setup.py
The MANIFEST.in contains only one line:
graft config
The setup.py is as follows:
from setuptools import setup
setup(
name='sample',
version='1.0',
packages=[
'sample',
'sample.world'
],
include_package_data=True
)
However, after I have run pip install ., I end up with the following content of the target directory:
.
./__pycache__
./__pycache__/__init__.cpython-36.pyc
./__pycache__/hello.cpython-36.pyc
./world
./world/__pycache__
./world/__pycache__/__init__.cpython-36.pyc
./world/__pycache__/refer.cpython-36.pyc
./world/refer.py
./world/__init__.py
./hello.py
./__init__.py
However, I expected the config directory to be there as well, along with the YAML file it contains. What am I doing wrong? Thank you!
Here is an example setup.py that includes the data files, and a reference for finding out more.
from setuptools import setup, find_packages
setup(...
      packages=find_packages(),
      package_data={'': ['config/*.yml']},
      ...)
setuptools docs
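Note that package_data patterns are resolved relative to each package directory, so for the layout in the question the config folder would most likely need to live inside the sample package (for example sample/config/test.yml) for this mapping to pick anything up. A hedged sketch of that variant:
from setuptools import setup, find_packages
setup(
    name='sample',
    version='1.0',
    packages=find_packages(),
    # assumes the YAML files were moved to sample/config/
    package_data={'sample': ['config/*.yml']},
)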
I am looking for a way to call Fabric from a script inside one of my packages, essentially turning it into an alias for fab -f /path/to/my/installed/package/scripts/fabfile.py.
Is there a standard way to do that, or should I just call it from subprocess?
I don't really have a complete solution for your problem, but you would need to start off by using the pkg_resources package to reliably identify the location of your fabric file inside the other project.
In the following example I've created a small test project called hellofabric containing a file called testfab.py (please ignore fabfile.py in the root folder of the project; it comes from my Python bootstrap script). Here is the file structure.
.
├── fabfile.py
├── hellofabric
│   ├── __init__.py
│   ├── testfab.py
│   └── version.txt
├── hellofabric.egg-info
│   ├── dependency_links.txt
│   ├── entry_points.txt
│   ├── not-zip-safe
│   ├── PKG-INFO
│   ├── SOURCES.txt
│   └── top_level.txt
├── MANIFEST.in
├── README.rst
└── setup.py
testfab.py contains the following code.
import fabric.api as fab

@fab.task
def hellofabric():
    fab.local("echo Hello from fabric")
The next step would be to create a dist file of this project (python setup.py sdist) and install that distribution file inside your destination project. Once I did that, I was able to execute the following, which ran the Fabric script.
>>> from hellofabric import testfab
>>> testfab.hellofabric()
[localhost] local: echo Hello from fabric
Hello from fabric
>>>
Hope this is what you are looking for.
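To tie this back to the original question, a minimal sketch of the fab -f alias could look like the following (the package name mypackage and the scripts/fabfile.py path are assumptions, and it also assumes the fab executable is on PATH):
import subprocess
import pkg_resources

def run_fab(*tasks):
    # Locate the fabfile that was installed together with the package
    # (mypackage and scripts/fabfile.py are placeholder names).
    fabfile = pkg_resources.resource_filename('mypackage', 'scripts/fabfile.py')
    # Delegate to the fab CLI, exactly like "fab -f <fabfile> task1 task2 ...".
    return subprocess.call(['fab', '-f', fabfile] + list(tasks))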