I am packaging a python project which has the following directory structure:
toingpkg/
    src/
        subtoingpkg1/
            subsubtoingpkg1/
                ...
                __init__.py
            __init__.py
        subtoingpkg2/
            ...
            subtoingpkg2.py
            __init__.py
        toingpkg.py
        __init__.py
    setup.cfg
    pyproject.toml
My setup.cfg is as follows:
[metadata]
name = toingpkg
...
classifiers =
    Programming Language :: Python :: 3
    ...

[options]
package_dir =
    = src
packages = find_namespace:
python_requires = >=3.6
install_requires =
    requests
    pytz

[options.packages.find]
where = src
And my pyproject.toml is as follows:
[build-system]
requires = [
    "setuptools>=42",
    "wheel"
]
build-backend = "setuptools.build_meta"
When I build my package using python3 -m build as mentioned in the docs, I get a whl file in my dist folder but it does not include the files in the root of the src directory.
So when I do a pip3 install dist/toingpkg-xxx-.whl, the package gets installed (it shows up in pip3 list) but I cannot do a:
>>> import toingpkg
I get:
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
ModuleNotFoundError: No module named 'toingpkg'
I also tried specifying all the subpackages manually but got the same result. My Python environment is 3.8.5 with setuptools 45.2.0.
What am I doing wrong?
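For context, packages = find: / find_namespace: only collect packages (directories), not single-file modules, so src/toingpkg.py and the __init__.py sitting directly under src are never picked up, and no toingpkg module ends up in the wheel. A sketch of the conventional src layout that the setup.cfg above would work with, assuming the intent is a single importable toingpkg package:

toingpkg/
    src/
        toingpkg/
            __init__.py
            toingpkg.py
            subtoingpkg1/
                ...
            subtoingpkg2/
                ...
    setup.cfg
    pyproject.toml

With this layout, find_namespace: (searching under src) discovers toingpkg and its subpackages, and import toingpkg works after installing the built wheel.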
Prompted by the console message that setup.py install is deprecated, I am in the middle of upgrading my existing setup.py install to the recommended setup.cfg with build.
My existing setup.py looks something like
from setuptools import setup

setup(
    name='pybindsample',
    version='0.1.0',
    packages=[''],
    package_data={'': ['pybindsample.so']},
    has_ext_modules=lambda: True,
)
My current translation looks like:
setup.cfg
[metadata]
name = pybindsample
version = 0.1.0
[options]
packages = .
[options.package_data]
. = pybindsample.so
pyproject.toml
[build-system]
requires = ["setuptools"]
build-backend = "setuptools.build_meta"
My question is: how can I translate has_ext_modules=lambda: True? (It comes from the solution here.) Without it, python3 -m build --wheel generates a wheel named pybindsample-0.1.0-py3-none-any.whl, whereas my old python3 setup.py bdist_wheel generated a wheel named pybindsample-0.1.0-cp39-cp39-macosx_11_0_x86_64.whl. I have attempted:
setup.cfg
[metadata]
name = pybindsample
version = 0.1.0
[options]
packages = .
has_ext_modules=lambda: True,
[options.package_data]
. = pybindsample.so
but it still generates pybindsample-0.1.0-py3-none-any.whl. I also attempted:
setup.cfg
[metadata]
name = pybindsample
version = 0.1.0
[options]
packages = .
[options.package_data]
. = pybindsample.so
[bdist_wheel]
python-tag = c39
plat-name = macosx_11_0_x86_64
py-limited-api = c39
This generates pybindsample-0.1.0-cp39-none-macosx_11_0_x86_64.whl, and I cannot figure out why the ABI tag is still none.
What is the right way to configure setuptools with setup.cfg to include platform name, python tag, and ABI tag?
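For what it's worth, has_ext_modules is a method on the setuptools Distribution object, and setup.cfg is purely declarative, so a callable such as lambda: True cannot be expressed there. A common workaround (sketched here, not taken from the post above) is to keep a minimal setup.py next to setup.cfg and override the method through a custom distribution class:

setup.py

from setuptools import setup
from setuptools.dist import Distribution


class BinaryDistribution(Distribution):
    # Report that the distribution ships compiled code so that bdist_wheel
    # produces platform-specific tags instead of py3-none-any.
    def has_ext_modules(self):
        return True


setup(distclass=BinaryDistribution)

With this in place, python3 -m build --wheel should tag the wheel for the running interpreter and platform without any [bdist_wheel] overrides.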
msgpack includes an optional Cython extension. Some users of the package want py3-none-any wheels of msgpack. I'm trying to figure out how to make it possible to build wheels both with and without the optional extension.
One possible solution is to use an environment variable in setup.py to decide whether to set ext_modules to an empty list or a list of setuptools.Extension:
pyproject.toml
[build-system]
requires = ["setuptools", "wheel", "cython"]
build-backend = "setuptools.build_meta"
setup.py
from setuptools import setup, Extension
import os

# Build the Cython extension unless ONLY_PURE is set in the environment.
if 'ONLY_PURE' in os.environ:
    ext_modules = []
else:
    module1 = Extension('helloworld', sources=['helloworld.pyx'])
    ext_modules = [module1]

setup(ext_modules=ext_modules)
setup.cfg
[metadata]
name = mypackage
version = 0.0.1
[options]
py_modules = mypackage
mypackage.py
try:
    import helloworld  # the extension prints "hello extension" when imported
except ImportError:
    print('hello pure python')
helloworld.pyx
print("hello extension")
To build with the extension:
$ pip install build
...
$ python -m build
...
$ ls dist/
mypackage-0.0.1-cp39-cp39-linux_x86_64.whl mypackage-0.0.1.tar.gz
To build without the extension:
$ pip install build
...
$ ONLY_PURE='a nonempty string' python -m build
...
$ ls dist/
mypackage-0.0.1-py3-none-any.whl mypackage-0.0.1.tar.gz
For traditional Python projects with a setup.py, there are various ways of ensuring that the version string does not have to be repeated throughout the code base. See PyPA's guide on "Single-sourcing the package version" for a list of recommendations.
Many are trying to move away from setup.py to setup.cfg (probably under the influence of PEP 517 and PEP 518; setup.py was mostly used declaratively anyway, and when there was logic in setup.py, it was probably for the worse). This means that most of the suggestions won't work anymore, since setup.cfg cannot contain "code".
How can I single-source the package version for Python projects that use setup.cfg?
There are a couple of ways to do this (see below for the project structure used in these examples):
1.
setup.cfg
[metadata]
version = 1.2.3.dev4
src/my_top_level_package/__init__.py
import importlib.metadata
__version__ = importlib.metadata.version('MyProject')
2.
setup.cfg
[metadata]
version = file: VERSION.txt
VERSION.txt
1.2.3.dev4
src/my_top_level_package/__init__.py
import importlib.metadata
__version__ = importlib.metadata.version('MyProject')
3.
setup.cfg
[metadata]
version = attr: my_top_level_package.__version__
src/my_top_level_package/__init__.py
__version__ = '1.2.3.dev4'
And more...
There are probably other ways to do this, by playing with different combinations.
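Note that methods 1 and 2 rely on importlib.metadata, which is in the standard library only from Python 3.8 onward; on older interpreters, a fallback to the importlib_metadata backport from PyPI can be sketched like this (assuming the backport is declared as a dependency):

src/my_top_level_package/__init__.py

try:
    import importlib.metadata as importlib_metadata  # Python 3.8+
except ImportError:
    import importlib_metadata  # backport package from PyPI

__version__ = importlib_metadata.version('MyProject')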
References:
https://setuptools.readthedocs.io/en/latest/userguide/declarative_config.html
https://docs.python.org/3/library/importlib.metadata.html
The structure assumed in the previous examples is as follows...
MyProject
├── setup.cfg
├── setup.py
└── src
    └── my_top_level_package
        └── __init__.py
setup.py
#!/usr/bin/env python3

import setuptools

if __name__ == '__main__':
    setuptools.setup(
        # see 'setup.cfg'
    )
setup.cfg
[metadata]
name = MyProject
# See above for the value of 'version = ...'

[options]
package_dir =
    = src
packages = find:

[options.packages.find]
where = src
$ cd path/to/MyProject
$ python3 setup.py --version
1.2.3.dev4
$ python3 -m pip install .
# ...
$ python3 -c 'import my_top_level_package; print(my_top_level_package.__version__)'
1.2.3.dev4
$ python3 -V
Python 3.6.9
$ python3 -m pip list
Package       Version
------------- ----------
MyProject     1.2.3.dev4
pip           20.0.2
pkg-resources 0.0.0
setuptools    45.2.0
wheel         0.34.2
zipp          3.0.0
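One caveat worth adding (a sketch, not part of the session above): importlib.metadata reads the metadata of the installed distribution, so the __version__ of methods 1 and 2 resolves only after the project has been installed; importing the package straight from an uninstalled checkout raises PackageNotFoundError, which can be guarded against:

src/my_top_level_package/__init__.py

import importlib.metadata

try:
    __version__ = importlib.metadata.version('MyProject')
except importlib.metadata.PackageNotFoundError:
    # Metadata is only available once the project has been installed.
    __version__ = None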
I'm trying to compile and install the following python package, system-wide:
https://github.com/mathurinm/BlitzL1/
(note that the __init__.py of the module is inside a folder named python)
So I run, at the root of the repo,
pip install -e .
I get:
zongo@zongo-HP-EliteBook-840-G3:~/workspace/BlitzL1$ pip install -e .
Obtaining file:///home/zongo/workspace/BlitzL1
Installing collected packages: blitzl1
Running setup.py develop for blitzl1
Successfully installed blitzl1
zongo@zongo-HP-EliteBook-840-G3:~/workspace/BlitzL1$ ipython
Python 3.6.6 | packaged by conda-forge | (default, Jul 26 2018, 09:53:17)
Type 'copyright', 'credits' or 'license' for more information
IPython 7.0.1 -- An enhanced Interactive Python. Type '?' for help.
In [1]: import blitzl1
---------------------------------------------------------------------------
ModuleNotFoundError Traceback (most recent call last)
<ipython-input-1-8bb5a22c28e9> in <module>
----> 1 import blitzl1
ModuleNotFoundError: No module named 'blitzl1'
After trial and error, I found that renaming the python folder to blitzl1 and replacing, in setup.py:
package_dir = {"blitzl1": "python"},
by
package_dir = {"blitzl1": "blitzl1"},
makes it possible to import the package. Why is the first one not working?
By the way:
zongo@zongo-HP-EliteBook-840-G3:~/workspace/BlitzL1$ which pip
/home/zongo/anaconda3/bin/pip
This is due to a long-standing issue in pip with installing a package in develop mode when the package directory is not in the same folder as the setup.py. See here for more info.
To be clearer, if the package name is my_package and the structure of the source is:
|- setup.py
|- src
|  |- __init__.py
|  |- ...
with package_dir={'my_package':'src'}, installing the package with either pip install -e . or python setup.py develop will raise the error reported by the OP.
A way to mitigate this is to change to package_dir={'':'src'} and change the structure of the repo to
|- setup.py
|- src
|  |- mypackage
|  |  |- __init__.py
|  |  |- ...
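A minimal setup.py matching that restructured layout might look like the following sketch (the name and version are placeholders):

setup.py

from setuptools import setup, find_packages

setup(
    name='my_package',                    # placeholder project name
    version='0.1.0',                      # placeholder version
    package_dir={'': 'src'},              # code lives under src/
    packages=find_packages(where='src'),  # discovers src/mypackage
)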
I'm trying to get the VERSION file at the root of my Python package installed along with the package so that I can read from it once the package is installed; however, using MANIFEST.in, the file is placed in the top level of /usr/lib/python3.6/site-packages rather than inside /usr/lib/python3.6/site-packages/mypackage.
I'm trying to do this so that I can display the package's version at runtime and also easily make use of it inside the repo.
directory structure:
setup.py
MANIFEST.in
VERSION
mypackage/
    - __init__.py
    - __main__.py
    - foo.py
MANIFEST.in:
include VERSION
setup.py:
#!/usr/bin/env python3

from setuptools import setup, find_packages

with open("VERSION", "r") as versionFile:
    version = versionFile.read().strip()

setup(
    name="mypackage",
    version=version,
    packages=find_packages(),
    include_package_data=True)
mypackage/__main__.py:
...
verFile = pkg_resources.resource_string(__name__, "VERSION")
with open(verFile, "r") as fin:
    version = str(fin.read().strip())
print(version)
...
How can I get the VERSION file to install inside of the package directory?
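One possible approach, sketched under the assumption that the VERSION file can live inside mypackage/ rather than at the repository root, is to ship it as package data and read it through pkg_resources. Note that pkg_resources.resource_string returns the file's contents as bytes, not a path, so it should not be passed to open.

setup.py

from setuptools import setup, find_packages

# Read the version from the copy that ships inside the package.
with open("mypackage/VERSION", "r") as versionFile:
    version = versionFile.read().strip()

setup(
    name="mypackage",
    version=version,
    packages=find_packages(),
    package_data={"mypackage": ["VERSION"]})  # install VERSION next to the modules

mypackage/__main__.py

import pkg_resources

# resource_string returns bytes; decode and strip to get the version string.
version = pkg_resources.resource_string("mypackage", "VERSION").decode("utf-8").strip()
print(version)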