When using setuptools, I cannot get the installer to pull in any package_data files. Everything I've read says that the following is the correct way to do it. Can someone please advise?
setup(
    name='myapp',
    packages=find_packages(),
    package_data={
        'myapp': ['data/*.txt'],
    },
    include_package_data=True,
    zip_safe=False,
    install_requires=['distribute'],
)
where myapp/data/ is the location of the data files.
I realize that this is an old question, but for people finding their way here via Google: package_data is a low-down, dirty lie. It is only used when building binary packages (python setup.py bdist ...) but not when building source packages (python setup.py sdist ...). This is, of course, ridiculous -- one would expect that building a source distribution would result in a collection of files that could be sent to someone else to build the binary distribution.
In any case, using MANIFEST.in will work both for binary and for source distributions.
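For the layout in the question, a minimal MANIFEST.in would be something along these lines (a sketch, assuming the data files live under myapp/data/):

include myapp/data/*.txt

As explained further down the thread, MANIFEST.in directly controls the sdist; combined with include_package_data=True, the same files are carried into the bdist as well.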
I just had this same issue. The solution was simply to remove include_package_data=True.
After reading here, I realized that include_package_data aims to include files from version control, as opposed to merely "include package data" as the name implies. From the docs:
The data files [of include_package_data] must be under CVS or Subversion control
...
If you want finer-grained control over what files are included (for example, if
you have documentation files in your package directories and want to exclude
them from installation), then you can also use the package_data keyword.
Taking that argument out fixed it, which is coincidentally why it also worked when you switched to distutils, since it doesn't take that argument.
Following @Joe's recommendation to remove the include_package_data=True line also worked for me.
To elaborate a bit more, I have no MANIFEST.in file. I use Git and not CVS.
The repository takes this kind of shape:
/myrepo
- .git/
- setup.py
- myproject
  - __init__.py
  - some_mod
    - __init__.py
    - animals.py
    - rocks.py
  - config
    - __init__.py
    - settings.py
    - other_settings.special
    - cool.huh
    - other_settings.xml
  - words
    - __init__.py
    - word_set.txt
setup.py:
from setuptools import setup, find_packages
import os.path
setup(
    name='myproject',
    version='4.19',
    packages=find_packages(),
    # package_dir={'mypkg': 'src/mypkg'},  # didn't use this.
    package_data={
        # If any package contains *.txt or *.rst files, include them:
        '': ['*.txt', '*.xml', '*.special', '*.huh'],
    },
    # Oddly enough, include_package_data=True prevented package_data from working.
    # include_package_data=True,  # Commented out.
    data_files=[
        # ('bitmaps', ['bm/b1.gif', 'bm/b2.gif']),
        ('/opt/local/myproject/etc', ['myproject/config/settings.py', 'myproject/config/other_settings.special']),
        ('/opt/local/myproject/etc', [os.path.join('myproject/config', 'cool.huh')]),
        ('/opt/local/myproject/etc', [os.path.join('myproject/config', 'other_settings.xml')]),
        ('/opt/local/myproject/data', [os.path.join('myproject/words', 'word_set.txt')]),
    ],
    install_requires=['jsonschema', 'logging'],
    entry_points={
        'console_scripts': [
            # Blah...
        ],
    },
)
I run python setup.py sdist for a source distribution (haven't tried binary).
And when inside a brand-new virtual environment, I have a myproject-4.19.tar.gz file,
and I use
(venv) pip install ~/myproject-4.19.tar.gz
...
And besides everything getting installed into my virtual environment's site-packages, those special data files get installed to /opt/local/myproject/data and /opt/local/myproject/etc.
include_package_data=True worked for me.
If you use Git, remember to include setuptools-git in install_requires. Far less boring than maintaining a MANIFEST.in or listing every path in package_data (in my case it's a Django app with all kinds of static files).
(Pasted from the comment I made, as k3-rnc mentioned it's actually helpful as-is.)
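A sketch of what that might look like in setup.py (note: placing the plugin in setup_requires so that it is available at build time is an assumption on top of the comment above, which says install_requires):

from setuptools import setup, find_packages

setup(
    name='myapp',
    packages=find_packages(),
    include_package_data=True,  # pull in every file tracked by Git
    setup_requires=['setuptools-git'],  # assumption: build-time plugin placement
)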
Using setup.cfg (setuptools ≥ 30.3.0)
Starting with setuptools 30.3.0 (released 2016-12-08), you can keep your setup.py very small and move the configuration to a setup.cfg file. With this approach, you could put your package data in an [options.package_data] section:
[options.package_data]
* = *.txt, *.rst
hello = *.msg
In this case, your setup.py can be as short as:
from setuptools import setup
setup()
For more information, see configuring setup using setup.cfg files.
There is some talk of deprecating setup.cfg in favour of pyproject.toml as proposed in PEP 518, but this is still provisional as of 2020-02-21.
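Pulling those pieces together, a complete minimal setup.cfg might look something like this (a sketch; the name, version and data pattern are placeholders):

[metadata]
name = myapp
version = 0.1.0

[options]
packages = find:

[options.package_data]
myapp = data/*.txt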
Update: This answer is old and the information is no longer valid. All setup.py configs should use import setuptools. I've added a more complete answer at https://stackoverflow.com/a/49501350/64313
I solved this by switching to distutils. Looks like distribute is deprecated and/or broken.
from distutils.core import setup
setup(
    name='myapp',
    packages=['myapp'],
    package_data={
        'myapp': ['data/*.txt'],
    },
)
I had the same problem for a couple of days, but even this thread wasn't able to help me, as everything was confusing. So I did my research and found the following solution:
Basically in this case, you should do:
from setuptools import setup
setup(
    name='myapp',
    packages=['myapp'],
    package_dir={'myapp': 'myapp'},  # the one line where all the magic happens
    package_data={
        'myapp': ['data/*.txt'],
    },
)
The full explanation is in this other Stack Overflow answer.
I found this post while stuck on the same problem.
My experience contradicts the experiences in the other answers.
include_package_data=True does include the data in the
bdist! The explanation in the setuptools
documentation
lacks context and troubleshooting tips, but
include_package_data works as advertised.
My setup:
- Windows / Cygwin
- git version 2.21.0
- Python 3.8.1 Windows distribution
- setuptools v47.3.1
- check-manifest v0.42
Here is my how-to guide.
How-to include package data
Here is the file structure for a project I published on PyPI.
(It installs the application in __main__.py).
├── LICENSE.md
├── MANIFEST.in
├── my_package
│ ├── __init__.py
│ ├── __main__.py
│ └── _my_data <---- folder with data
│ ├── consola.ttf <---- data file
│ └── icon.png <---- data file
├── README.md
└── setup.py
Starting point
Here is a generic starting point for the setuptools.setup() in
setup.py.
setuptools.setup(
    ...
    packages=setuptools.find_packages(),
    ...
)
setuptools.find_packages() includes all of my packages in the
distribution. My only package is my_package.
The sub-folder with my data, _my_data, is not considered a
package by Python because it does not contain an __init__.py,
and so find_packages() does not find it.
An often-cited, but incorrect, solution is to put an empty
__init__.py file in the _my_data folder.
This does make it a package, so it does include the folder
_my_data in the distribution. But the data files inside
_my_data are not included.
So making _my_data into a package does not help.
The solution is:
- the sdist already contains the data files
- add include_package_data=True to include the data files in the bdist as well
Experiment (how to test the solution)
There are three steps to make this a repeatable experiment:
$ rm -fr build/ dist/ my_package.egg-info/
$ check-manifest
$ python setup.py sdist bdist_wheel
I will break these down step-by-step:
Clean out the old build:
$ rm -fr build/ dist/ my_package.egg-info/
Run check-manifest to be sure MANIFEST.in matches the
Git index of files under version control:
$ check-manifest
If MANIFEST.in does not exist yet, create it from the Git
index of files under version control:
$ check-manifest --create
Here is the MANIFEST.in that is created:
include *.md
recursive-include my_package *.png
recursive-include my_package *.ttf
There is no reason to manually edit this file.
As long as everything that should be under version control is
under version control (i.e., is part of the Git index),
check-manifest --create does the right thing.
Note: files are not part of the Git index if they are either:
- ignored in a .gitignore
- excluded in a .git/info/exclude
- or simply new files that have not been added to the index yet
And if any files are under version control that should not be
under version control, check-manifest issues a warning and
specifies which files it recommends removing from the Git index.
Build:
$ python setup.py sdist bdist_wheel
Now inspect the sdist (source distribution) and bdist_wheel
(build distribution) to see if they include the data files.
Look at the contents of the sdist (only the relevant lines are
shown below):
$ tar --list -f dist/my_package-0.0.1a6.tar.gz
my_package-0.0.1a6/
...
my_package-0.0.1a6/my_package/__init__.py
my_package-0.0.1a6/my_package/__main__.py
my_package-0.0.1a6/my_package/_my_data/
my_package-0.0.1a6/my_package/_my_data/consola.ttf <-- yay!
my_package-0.0.1a6/my_package/_my_data/icon.png <-- yay!
...
So the sdist already includes the data files because they are
listed in MANIFEST.in. There is nothing extra to do to include
the data files in the sdist.
Look at the contents of the bdist (it is a .zip file, parsed
with zipfile.ZipFile):
$ python check-whl.py
my_package/__init__.py
my_package/__main__.py
my_package-0.0.1a6.dist-info/LICENSE.md
my_package-0.0.1a6.dist-info/METADATA
my_package-0.0.1a6.dist-info/WHEEL
my_package-0.0.1a6.dist-info/entry_points.txt
my_package-0.0.1a6.dist-info/top_level.txt
my_package-0.0.1a6.dist-info/RECORD
Note: you need to create your own check-whl.py script to produce the
above output. It is just three lines:
from zipfile import ZipFile
path = "dist/my_package-0.0.1a6-py3-none-any.whl" # <-- CHANGE
print('\n'.join(ZipFile(path).namelist()))
As expected, the bdist is missing the data files.
The _my_data folder is completely missing.
What if I create a _my_data/__init__.py? I repeat the
experiment and I find the data files are still not there! The
_my_data/ folder is included but it does not contain the data
files!
Solution
Contrary to the experience of others, this does work:
setuptools.setup(
    ...
    packages=setuptools.find_packages(),
    include_package_data=True,  # <-- adds data files to bdist
    ...
)
With the fix in place, redo the experiment:
$ rm -fr build/ dist/ my_package.egg-info/
$ check-manifest
$ python.exe setup.py sdist bdist_wheel
Make sure the sdist still has the data files:
$ tar --list -f dist/my_package-0.0.1a6.tar.gz
my_package-0.0.1a6/
...
my_package-0.0.1a6/my_package/__init__.py
my_package-0.0.1a6/my_package/__main__.py
my_package-0.0.1a6/my_package/_my_data/
my_package-0.0.1a6/my_package/_my_data/consola.ttf <-- yay!
my_package-0.0.1a6/my_package/_my_data/icon.png <-- yay!
...
Look at the contents of the bdist:
$ python check-whl.py
my_package/__init__.py
my_package/__main__.py
my_package/_my_data/consola.ttf <--- yay!
my_package/_my_data/icon.png <--- yay!
my_package-0.0.1a6.dist-info/LICENSE.md
my_package-0.0.1a6.dist-info/METADATA
my_package-0.0.1a6.dist-info/WHEEL
my_package-0.0.1a6.dist-info/entry_points.txt
my_package-0.0.1a6.dist-info/top_level.txt
my_package-0.0.1a6.dist-info/RECORD
How not to test if data files are included
I recommend troubleshooting/testing using the approach outlined
above to inspect the sdist and bdist.
pip install in editable mode is not a valid test
Note: pip install -e . does not show if data files are
included in the bdist.
The symbolic link causes the installation to behave as if the
data files are included (because they already exist locally on
the developer's computer).
After pip install my_package, the data files are in the
virtual environment's lib/site-packages/my_package/ folder,
using the exact same file structure shown above in the list of
the whl contents.
Publishing to TestPyPI is a slow way to test
Publishing to TestPyPI and then installing and looking in
lib/site-packages/my_packages is a valid test, but it is too
time-consuming.
Ancient question, and yet... Python package management really leaves a lot to be desired. I had the use case of installing with pip locally to a specified directory, and was surprised that both the package_data and data_files paths did not work out. I was not keen on adding yet another file to the repo, so I ended up leveraging data_files and the setup.py option --install-data; something like this:
pip install . --install-option="--install-data=$PWD/package" -t package
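For context, here is a sketch of the data_files part of the setup.py this relies on (the package and file names are hypothetical; a relative target path such as 'etc' is resolved against the directory passed to --install-data):

from setuptools import setup, find_packages

setup(
    name='mypkg',
    packages=find_packages(),
    data_files=[
        # 'etc' is relative, so --install-data decides where it ends up.
        ('etc', ['conf/settings.ini']),
    ],
)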
Moving the folder containing the package data into the module folder solved the problem for me.
See this question: MANIFEST.in ignored on "python setup.py install" - no data files installed?
Just remove the line:
include_package_data=True,
from your setup script, and it will work fine. (Tested just now with latest setuptools.)
Like others in this thread, I'm more than a little surprised at the combination of this question's longevity and the continued lack of clarity, but for me the best answer was using check-manifest, as recommended in the answer from @mike-gazes.
So, using just a setup.cfg and no setup.py, with additional text and Python files required in the package, what worked for me was keeping this in setup.cfg:
[options]
packages = find:
include_package_data = true
and updating the MANIFEST.in based on the check-manifest output:
include *.in
include *.txt
include *.yml
include LICENSE
include tox.ini
recursive-include mypkg *.py
recursive-include mypkg *.txt
For a directory structure like:
foo/
├── foo
│ ├── __init__.py
│ ├── a.py
│ └── data.txt
└── setup.py
and setup.py
#!/usr/bin/env python
# -*- coding: utf-8 -*-
from setuptools import setup
NAME = 'foo'
DESCRIPTION = 'Test library to check how setuptools works'
URL = 'https://none.com'
EMAIL = 'gzorp@bzorp.com'
AUTHOR = 'KT'
REQUIRES_PYTHON = '>=3.6.0'
setup(
    name=NAME,
    version='0.0.0',
    description=DESCRIPTION,
    author=AUTHOR,
    author_email=EMAIL,
    python_requires=REQUIRES_PYTHON,
    url=URL,
    license='MIT',
    classifiers=[
        'Programming Language :: Python',
        'Programming Language :: Python :: 3',
        'Programming Language :: Python :: 3.6',
    ],
    packages=['foo'],
    package_data={'foo': ['data.txt']},
    include_package_data=True,
    install_requires=[],
    extras_require={},
    cmdclass={},
)
python setup.py bdist_wheel works.
Starting with setuptools 62.3.0, you can use recursive wildcards ("**") to include a (sub)directory recursively. This way you can include whole folders with all their subfolders and files.
For example, when using a pyproject.toml file, this is how you include two folders recursively:
[tool.setuptools.package-data]
"ema_workbench.examples.data" = ["**"]
"ema_workbench.examples.models" = ["**"]
You can also include only certain file types in a folder and all its subfolders. For example, to include all Markdown (.md) files:
[tool.setuptools.package-data]
"ema_workbench.examples.data" = ["**/*.md"]
It should also work when using setup.py or setup.cfg.
See https://github.com/pypa/setuptools/pull/3309 for the details.
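For reference, a sketch of what the setup.cfg equivalent might look like (assuming the same package layout as the pyproject.toml example above):

[options.package_data]
ema_workbench.examples.data = **
ema_workbench.examples.models = **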
I have a package called clana (Github, PyPI) with the following structure:
.
├── clana
│ ├── cli.py
│ ├── config.yaml
│ ├── __init__.py
│ ├── utils.py
│ └── visualize_predictions.py
├── docs/
├── setup.cfg
├── setup.py
├── tests/
└── tox.ini
The setup.py looks like this:
from setuptools import find_packages
from setuptools import setup
requires_tests = [...]
install_requires = [...]
config = {
    "name": "clana",
    "version": "0.3.6",
    "author": "Martin Thoma",
    "author_email": "info@martin-thoma.de",
    "maintainer": "Martin Thoma",
    "maintainer_email": "info@martin-thoma.de",
    "packages": find_packages(),
    "entry_points": {"console_scripts": ["clana=clana.cli:entry_point"]},
    "install_requires": install_requires,
    "tests_require": requires_tests,
    "package_data": {"clana": ["clana/config.yaml"]},
    "include_package_data": True,
    "zip_safe": False,
}
setup(**config)
How to check that it didn't work
Quick
python3 setup.py sdist
open dist/clana-0.3.8.tar.gz # config.yaml is not in this file
The real check
I thought this would make sure that the config.yaml is in the same directory as the cli.py when the package is installed. But when I try this:
virtualenv venv
source venv/bin/activate
pip install clana
cd venv/lib/python3.6/site-packages/clana
ls
I get:
cli.py __init__.py __pycache__ utils.py visualize_predictions.py
The way I upload it to PyPI:
python3 setup.py sdist bdist_wheel && twine upload dist/*
So the config.yaml is missing. How can I make sure it is there?
You can add a file named MANIFEST.in next to setup.py with a list of the files you want to add, wildcards allowed (e.g. include *.yaml or include clana/config.yaml);
then the option include_package_data=True will activate the manifest file.
In short: add config.yaml to MANIFEST.in, and set include_package_data. One without the other is not enough.
Basically it goes like this:
MANIFEST.in adds files to the sdist (source distribution).
include_package_data adds these same files to the bdist (built distribution), i.e. it extends the effect of MANIFEST.in to the bdist.
exclude_package_data prevents files in the sdist from being added to the bdist, i.e. it filters the effect of include_package_data.
package_data adds files to the bdist, i.e. it adds build artifacts (typically the products of custom build steps) to your bdist and of course has no effect on the sdist.
So in your case, the file config.yaml is not installed because it is not added to your bdist (built distribution). There are two ways to fix this, depending on where the file comes from:
either the file is a build artifact (typically it is somehow created during the ./setup.py build phase), in which case you need to add it to package_data;
or the file is part of your source (typically it is in your source code repository), in which case you need to add it to MANIFEST.in, set include_package_data, and leave it out of exclude_package_data (this seems to be your case here).
See:
https://stackoverflow.com/a/54953494/11138259
https://setuptools.readthedocs.io/en/latest/setuptools.html#including-data-files
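Applied to this question, the fix sketched by this answer is a MANIFEST.in next to setup.py containing

include clana/config.yaml

while keeping "include_package_data": True in the config dictionary. (As an aside: package_data globs are interpreted relative to the package directory, so if you went the package_data route instead, the entry would presumably need to be {"clana": ["config.yaml"]} rather than {"clana": ["clana/config.yaml"]}.)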
Following from the documentation on including data files, if your package has data files such as .yaml files, you may include them like so:
setup(
    ...
    package_data={
        "": ["*.yaml"],
    },
    ...
)
This will allow any file in your package with the file extension .yaml to be included.
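Once the files are included, a common way to read them back at runtime is importlib.resources (the files() API needs Python 3.9+); a minimal sketch with placeholder package and file names:

from importlib.resources import files

# Read a data file that was packaged inside "mypkg".
text = files("mypkg").joinpath("config.yaml").read_text(encoding="utf-8")
print(text)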
In the last few days, I was working on a Python module. Until now, I have used Poetry as a package management tool in many other projects, but this is my first time wanting to publish a package to PyPI.
I was able to run the poetry build and poetry publish commands. I was also able to install the published package:
$ pip3 install git-profiles
Collecting git-profiles
Using cached https://files.pythonhosted.org/packages/0e/e7/bac9027effd1e34a5b5718f2b35c0b28b3d67f3809e2f2981b6c7b58963e/git_profiles-1.1.0-py3-none-any.whl
Installing collected packages: git-profiles
Successfully installed git-profiles-1.1.0
However, right after the install, I am not able to run my package:
$ git-profiles --help
git-profiles: command not found
My project has the following structure:
git-profiles/
├── src/
│ ├── commands/
│ ├── executor/
│ ├── git_manager/
│ ├── profile/
│ ├── utils/
│ ├── __init__.py
│ └── git_profiles.py
└── tests
I tried different script configurations in the pyproject.toml file, but I was never able to make it work after installing.
[tool.poetry.scripts]
poetry = "src:git_profiles.py"
or
[tool.poetry.scripts]
git-profile = "src:git_profiles.py"
I don't know if this is a python/pip path/version problem or I need to change something in the configuration file.
If it is helpful, this is the GitHub repository I'm talking about. The package is also published on PyPI.
Poetry's scripts section wraps around setuptools' console-script definitions. As such, the entry-point name and the call path you give it need to follow exactly the same rules.
In short, a console script does more or less this from the shell:
import my_lib # the module isn't called src, that's just a folder name
# the right name to import is whatever you put at [tool.poetry].name
my_lib.my_module.function()
Which, if given the name my-lib-call (the name can be the same as your module, but it doesn't need to be) would be written like this:
[tool.poetry.scripts]
my-lib-call = "my_lib.my_module:function"
Adapted to your project structure, the following should do the job:
[tool.poetry.scripts]
git-profile = "git_profiles:main"
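(Note the underscore on the right-hand side: the part after the equals sign must be an importable module path, which cannot contain hyphens, even though the distribution name is git-profiles.) For this to resolve, the module also needs a callable named main; a minimal sketch of what git_profiles.py might contain (the body is hypothetical):

# git_profiles.py
import sys

def main() -> None:
    # Entry point referenced by [tool.poetry.scripts].
    print(f"git-profiles called with: {sys.argv[1:]}")

if __name__ == "__main__":
    main()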
I'm migrating all my modules to Poetry, and I have a problem.
Before, with python setup.py test, I was able to run my tests with the correct coverage information.
Now I'm moving to Poetry, so my best option is poetry run pytest (or otherwise poetry install; pytest). In both cases, I have to specify the source location in Sonar to collect the coverage data. Here I would naturally just pass my src folder, but the references will be wrong because pytest runs against the code installed into the environment by Poetry, not against the local code as it used to. No amount of tinkering seems to be working.
So, is there a way with Poetry to use the local references instead of the environment references when running pytest? Or should I give up and use some weird trick with inspect to retrieve the path of the installed package in the site-packages folder?
Your current setup, where pytest is run against the installed package instead of the source files, is vastly preferable, since it simulates how the code will behave in use. Path errors, files that were not correctly marked/moved for install, or anything else that can go wrong during deployment will be encountered right away, at no cost whatsoever.
It also gives more accurate coverage, since e.g. any build files that are not part of the package will be ignored. All you need in order to tell coverage to look at the package instead of your source files is to tell it exactly that. Having this in your .coveragerc should be enough:
[run]
source = sample_project
Given a project structure like this[1]
.
├── .coveragerc
├── src
│ └── sample_project
│ ├── __init__.py
│ └── util.py
└── tests
├── __init__.py
└── test_util.py
Running pytest --cov tests/ looks inside the installed package correctly:
Test session starts (platform: linux, Python 3.7.2, pytest 3.10.1, pytest-sugar 0.9.2)
rootdir: /home/user/dev/sample_project, inifile:
plugins: sugar-0.9.2, cov-2.7.1
collecting ...
tests/test_util.py ✓ 100% ██████████
----------- coverage: platform linux, python 3.7.2-final-0 -----------
Name Stmts Miss Cover
----------------------------------------
tests/__init__.py 0 0 100%
tests/test_util.py 6 0 100%
----------------------------------------
TOTAL 6 0 100%
Results (0.10s):
1 passed
[1] It might be important to split off the source code into a separate directory to avoid name shadowing (the import mechanism will prefer a local package foo in its PYTHONPATH, which the working directory is always part of, over an installed package foo). From your description, it seems that you're already doing that. If you aren't, consider setting up your project again with poetry new and its optional --src flag enabled.
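For reference, generating such a layout from scratch looks like this (the project name is a placeholder):

$ poetry new --src sample-project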
I'm writing a couple of packages that I'd like to release on PyPi for other people to use.
I've not released to PyPi before so I have been mocking up a submission template: https://github.com/chris-brown-nz/pypi-package-template
Here's a tree of the project template:
| MANIFEST.in
| README.rst
| setup.cfg
| setup.py
|
\---package
module_one.py
module_three.py
module_two.py
__init__.py
In terms of interacting with the package, this is what I would usually do - is it the best way?
To run a method:
from package import module_one
module_one.ClassOne().method_a()
To get a value from a method:
from package import module_two
print(module_two.ClassFive().method_e())
To set then use an attribute of an instance:
from package import module_three
cls = module_three.ClassSeven("Hello World")
print(cls.value)
'package' is obviously just a placeholder name and won't be used in the final project.
I'd be grateful for some feedback on how I've structured my project and whether it is considered standard, or if it should be modified in some way.
There are different approaches to this; whether one or the other is better depends on how you want to develop, how the package will be used (e.g. whether you ever install it using pip install -e package_name), etc.
What is missing from your tree is the name of the directory where the setup.py resides, and that is usually the package name:
└── package
├── package
│ ├── __init__.py
│ ├── module_one.py
│ ├── module_three.py
│ └── module_two.py
├── MANIFEST.in
├── README.rst
├── setup.cfg
└── setup.py
As you can see, you are doubling the 'package' name, which means that your setup.py has to be adapted for each package, or has to dynamically determine the name of the directory where the module .py files reside. If you go this route, I would suggest you put the module .py files in a generically named directory such as 'src' or 'lib'.
I don't like the above "standard" setup for multiple reasons:
It doesn't map well to how Python programs "grow" before they are split up into packages. Before splitting up, having such a 'src' directory would mean using:
from package.src.module_one import MyModuleOneClass
Instead, you would have your module .py files directly under package.
Having a setup.py to control installation, a README.rst for documentation and an __init__.py to satisfy Python's import is one thing, but all other stuff, apart from your module .py files containing the actual functionality, is garbage. Garbage that might be needed at some point during the package creation process, but is not necessary for the package's functionality.
There are other considerations, such as being able to access the version number of the package from the setup.py as well as from the program, without the former having to import the package itself (which may lead to install complications), and without needing an extra version.py file that has to be imported.
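A common sketch of that pattern, assuming __init__.py contains a line like __version__ = "1.2.3" (the package name below is a placeholder): setup.py extracts the version with a regular expression instead of importing the package.

import re
from pathlib import Path

def read_version(init_path: str = "mypkg/__init__.py") -> str:
    """Extract __version__ from a package's __init__.py without importing it."""
    text = Path(init_path).read_text(encoding="utf-8")
    match = re.search(r"^__version__\s*=\s*['\"]([^'\"]+)['\"]", text, re.MULTILINE)
    if match is None:
        raise RuntimeError(f"No __version__ found in {init_path}")
    return match.group(1)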
In particular I always found the transition from using a directory structure under site-packages that looked like:
└── organisation
├── package1
└── package2
├── subpack1
└── subpack2
and that could intuitively be used for both importing and navigation to source files, to something like:
├── organisation_package1
│ └── src
├── organisation_package2_subpack1
│ └── src
└── organisation_package2_subpack2
└── src
unnatural. Rearranging and breaking a working structure just to be able to package things seems wrong.
For my set of published packages I followed another way:
- I kept the natural tree structure that you can use "before packaging", without 'src' or 'lib' directories.
- I have a generic setup.py which reads and parses (it does not import) the metadata (such as version number, package name, license information, and whether to install a utility (and its name)) from a dictionary in the __init__.py file. A file you need anyway.
- The setup.py is smart enough to distinguish subdirectories containing other packages from subdirectories that are part of the parent package.
- setup.py generates files that are needed during package generation only (like setup.cfg), on the fly, and deletes them when no longer needed.
The above allows you to have nested namespaced packages (i.e. package2 can be a package you upload to PyPI, in addition to package2.subpack1 and package2.subpack2). The major thing it (currently) doesn't allow is using pip install -e to edit a single package (and not have the others editable). Given the way I develop, that is not a restriction.
The above embraces namespace packages, whereas many other approaches have problems with these (remember the last line of the Zen of Python: Namespaces are one honking great idea – let's do more of those).
Examples of the above can e.g. be found in my packages ruamel.yaml (and e.g. ruamel.yaml.cmd), or generically by searching PyPI for ruamel.
As is probably obvious, the standard disclaimer applies: I am the author of those packages.
As I use a utility to start packaging, which also runs the tests and does other sanity checks, the generic setup.py could be removed from the setup and inserted by that utility as well. But since subpackage detection is based on whether a setup.py is present or not, this requires some rework of the generic setup.py.