setuptools: package data folder location - python

I use setuptools to distribute my Python package. Now I need to distribute additional data files.
From what I've gathered from the setuptools documentation, I need to have my data files inside the package directory. However, I would rather have my data files inside a subdirectory of the root directory.
What I would like to avoid:
/ #root
|- src/
| |- mypackage/
| | |- data/
| | | |- resource1
| | | |- [...]
| | |- __init__.py
| | |- [...]
|- setup.py
What I would like to have instead:
/ #root
|- data/
| |- resource1
| |- [...]
|- src/
| |- mypackage/
| | |- __init__.py
| | |- [...]
|- setup.py
I just don't feel comfortable with having so many subdirectories if it's not essential. I fail to find a reason why I *have* to put the files inside the package directory. It is also cumbersome to work with so many nested subdirectories, IMHO. Or is there any good reason that would justify this restriction?

Option 1: Install as package data
The main advantage of placing data files inside the root of your Python package
is that it lets you avoid worrying about where the files will live on a user's
system, which may be Windows, Mac, Linux, some mobile platform, or inside an Egg. You can
always find the data directory relative to your Python package root, no matter where or how the package is installed.
For example, if I have a project layout like so:
project/
    foo/
        __init__.py
        data/
            resource1/
                foo.txt
You can add a function to __init__.py to locate an absolute path to a data
file:
import os

_ROOT = os.path.abspath(os.path.dirname(__file__))

def get_data(path):
    return os.path.join(_ROOT, 'data', path)

print(get_data('resource1/foo.txt'))
Outputs:
/Users/pat/project/foo/data/resource1/foo.txt
After the project is installed as an Egg the path to data will change, but the code doesn't need to change:
/Users/pat/virtenv/foo/lib/python2.6/site-packages/foo-0.0.0-py2.6.egg/foo/data/resource1/foo.txt
Option 2: Install to fixed location
The alternative would be to place your data outside the Python package and then either have the location of the data passed in via a configuration file or command-line arguments, or embed the location in your Python code.
This is far less desirable if you plan to distribute your project. If you really want to do this, you can install your data wherever you like on the target system, specifying the destination for each group of files by passing in a list of tuples:
from setuptools import setup

setup(
    ...
    data_files=[
        ('/var/data1', ['data/foo.txt']),
        ('/var/data2', ['data/bar.txt'])
    ]
)
Updated: Example of a shell function to recursively grep Python files:
atlas% function grep_py { find . -name '*.py' -exec grep -Hn $* {} \; }
atlas% grep_py ": \["
./setup.py:9: package_data={'foo': ['data/resource1/foo.txt']}
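For completeness, here is a minimal setup.py sketch for Option 1 (the name, version, and file list are illustrative and assume the foo/data layout above):

# Sketch: declare the data as package data so it is installed inside the
# package itself; adjust name/version/paths to your project.
from setuptools import setup, find_packages

setup(
    name='foo',
    version='0.0.0',
    packages=find_packages(),
    package_data={'foo': ['data/resource1/foo.txt']},
)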

I think I found a good compromise which will allow you to maintain the following structure:
/ #root
|- data/
| |- resource1
| |- [...]
|- src/
| |- mypackage/
| | |- __init__.py
| | |- [...]
|- setup.py
You should install the data as package_data, to avoid the problems described in samplebias' answer, but in order to maintain the file structure you should add this to your setup.py:
import os
from setuptools import setup

try:
    os.symlink('../../data', 'src/mypackage/data')
    setup(
        ...
        package_data={'mypackage': ['data/*']}
        ...
    )
finally:
    os.unlink('src/mypackage/data')
This way we create the appropriate structure "just in time" and keep our source tree organized.
To access such data files within your code, you 'simply' use:
from pkg_resources import resource_filename, Requirement

data = resource_filename(Requirement.parse("main_package"), 'mypackage/data')
I still don't like having to specify 'mypackage' in the code, as the data might have nothing to do with this module, but I guess it's a good compromise.

You could use importlib_resources or importlib.resources (depending on the Python version).
https://importlib-resources.readthedocs.io/en/latest/using.html
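For example, a sketch assuming the foo/data/resource1/foo.txt layout from the first answer and Python 3.9+ (the importlib_resources backport exposes the same files() API on older versions):

# Sketch: read a packaged data file without caring where the package is
# installed; 'foo' and the path below are assumptions from the layout above.
from importlib.resources import files

resource = files('foo') / 'data' / 'resource1' / 'foo.txt'
print(resource.read_text())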

I think that you can pass basically anything as the *data_files* argument to setup().
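For example, here is a sketch with a relative destination; relative data_files destinations are installed relative to the installation prefix, and the package name and the assumption that data/resource1 is a single file are purely illustrative:

from setuptools import setup

setup(
    name='mypackage',  # illustrative name
    data_files=[
        # a relative destination is installed relative to sys.prefix
        ('share/mypackage/data', ['data/resource1']),
    ],
)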

Related

Pytest import problems when tests import from adjacent directory

/root
|- __init__.py
|
|- /src
|  |- __init__.py
|  |- /x
|  |  |- __init__.py
|  |  |- xy.py
|
|- /tests
|  |- __init__.py
|  |- /test_x
|  |  |- __init__.py
|  |  |- test_xy.py
# tests/test_x/test_xy.py
from src.x.xy import XY

class TestXY:
    # etc
When I'm in root I try to run pytest tests/*/* and I get an error on from src.x.xy import XY because src can't be found. If I change the import to from ..src.x.xy import XY I get "cannot import" because it's from a directory one level above.
I also tried running python -m pytest tests/*/* but I get an error about conftest not being found in __pycache__, which I don't understand. (ERROR: not found: /root/tests/__pycache__/conftest.cpython-310-pytest-7.1.2.pyc (no name '/root/tests/__pycache__/conftest.cpython-310-pytest-7.1.2.pyc' in any of []))
What am I missing? Why is it so hard to run tests this way? I can run them individually in PyCharm by clicking the little green arrows in the test script, no problem.
With that project architecture you should use a config file. The config file should add the path to src.
pyproject.toml example:
[tool.pytest.ini_options]
pythonpath = [
"src"
]
pytest.ini example:
[pytest]
pythonpath = src
If you have multiple source directories, you can also add them to the config.
For example, in pyproject.toml:
[tool.pytest.ini_options]
pythonpath = [
"src", "src2",
]

Problems trying to import a package

I have the following project structure:
Project
|- sim
|  |- out
|  |  |- main.py
|- libs
|  |- __init__.py
|  |- plus.py
Inside plus.py there is a function called sum(a, b) that returns the sum of two numbers. I'm trying to import the module plus.py into main.py to be able to call this function, however I'm getting the following error: ImportError: attempted relative import with no known parent package.
Here is the code inside main.py:
from ...libs import plus
a = 1
b = 5
c = plus.sum(a,b)
print(c)
One of the solutions I found is to add the project directory to the path, but I'm trying to avoid that.
I'm using VSCode to call Python; this could also be useful information.
What am I doing wrong here?
Thanks in advance.
EDIT:
Added __init__.py files in the sim, out and Project directories as @ThePjot suggested and the error remains. Now the project structure is in the following form:
Project
|- __init__.py
|
|- sim
|  |- __init__.py
|  |- out
|  |  |- __init__.py
|  |  |- main.py
|- libs
|  |- __init__.py
|  |- plus.py
The __init__.py files are empty.
I've had similar issues, and I've created an experimental new import library, ultraimport, that allows file-system-based imports to solve your issue.
In your main.py you would then write:
import ultraimport
plus = ultraimport('__dir__/../../libs/plus.py', 'plus')
a = 1
b = 5
c = plus.sum(a,b)
print(c)
PS: With ultraimport, it's also not necessary to create those __init__.py files. You could remove them again and it will still work.

Dealing with subfolders when using crontab

I'm having some trouble trying to schedule the execution of a project.
The structure is:
main folder
|- lib
|  |- file1.py
|  |- file2.py
|
|- data
|  |- file.csv
|
|- temp
|  |- file.json
|
|- main.py
The crontab line is:
*/5 * * * * python3 /home/myName/main_folder/main.py
I've been testing this command line with simple Python scripts without dependencies and it works fine. The problem is that in this case main.py imports classes and functions from inside lib, and I think it can't deal with that.
In my main.py I'm importing like this: from lib import file1, file2. Is there another way, maybe using os, so that the program knows the absolute path?
Please try adding an empty file named __init__.py in the lib directory.
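Since cron runs the script with a different working directory, another option (a sketch, assuming main.py sits in main_folder next to lib) is to anchor the import path to the script's own location using os and sys:

# Sketch for the top of main.py: add the directory containing main.py to
# sys.path so `from lib import file1, file2` works no matter which working
# directory cron uses (assumes lib/ sits next to main.py).
import os
import sys

_HERE = os.path.dirname(os.path.abspath(__file__))
if _HERE not in sys.path:
    sys.path.insert(0, _HERE)

from lib import file1, file2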

Python Packaging with .pth files

I have a suite of packages that are developed together and bundled into one distribution package.
For sake of argument, let's assume I have Good Reasons for organizing my python distribution package in the following way:
SpanishInqProject/
|---SpanishInq/
| |- weapons/
| | |- __init__.py
| | |- fear.py
| | |- surprise.py
| |- expectations/
| | |- __init__.py
| | |- noone.py
| |- characters/
| | |- __init__.py
| | |- biggles.py
| | |- cardinal.py
|- tests/
|- setup.py
|- spanish_inq.pth
I've added the path configuration file spanish_inq.pth to add SpanishInq to sys.path, so I can import weapons, etc. directly.
I want to be able to use setuptools to build wheels and have pip install weapons, expectations and characters inside the SpanishInq directory, but without making SpanishInq a package or namespace.
My setup.py:
from setuptools import setup, find_packages

setup(
    name='spanish_inq',
    packages=find_packages(),
    include_package_data=True,
)
With a MANIFEST.in file containing:
spanish_inq.pth
This has been challenging in a couple of ways:
pip install has put weapons etc. directly in the site-packages directory, rather than in a SpanishInq dir.
my spanish_inq.pth file ends up in the sys.exec_prefix dir, rather than in my site-packages dir, meaning the relative path in it is now useless.
The first problem I was able to sort of solve by turning SpanishInq into a module (which I'm not happy about), but I still want to be able to import weapons etc. without SpanishInq as a namespace, and to do this I need SpanishInq added to the sys.path, which is where I was hoping the .pth file would help...but I can't get it to go where it ought to.
So...
How do I get the .pth file to install into the site-packages dir?
This is very similar to setup.py: installing just a pth file? (this question is strictly a superset, in terms of functionality) -- I've adapted the relevant part of my answer there below.
The right thing to do here is to extend setuptools' build_py and copy the .pth file into the build directory, at the location where setuptools prepares all the files that go into site-packages.
import os

from setuptools import setup
from setuptools.command.build_py import build_py


class build_py_with_pth_file(build_py):
    """Include the .pth file for this project, in the generated wheel."""

    def run(self):
        super().run()
        destination_in_wheel = "spanish_inq.pth"
        location_in_source_tree = "spanish_inq.pth"
        outfile = os.path.join(self.build_lib, destination_in_wheel)
        self.copy_file(location_in_source_tree, outfile, preserve_mode=0)


setup(
    ...,
    cmdclass={"build_py": build_py_with_pth_file},
)

Handle file imports after package installation [duplicate]

