I am managing my dependencies using poetry and set
[build-system]
requires = [
"setuptools > =49.0.1",
"wheel",
However, this does not seem to be working. I am still getting "no matching distribution found for setuptools >= 40.8.0" at build time for one of my packages.
My build.sh file looks like
release(){
export TOX_PARALLEL_NO_SPINNER=1
poetry install
****
}
The "no matching distribution" error pops up while poetry install is running, so maybe there is a way to specify the setuptools version for poetry install?
As far as I know, Poetry does not allow using a build back-end other than poetry-core, meaning that you would not be allowed to use setuptools in the [build-system] section. But I am not 100% sure; you might want to ask the Poetry maintainers directly.
You have this in your question: setuptools > =49.0.1. This seems incorrect to me; there should be no space between > and =. If I were you, I would replace it with setuptools>=49.0.1.
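For reference, the corrected [build-system] table would look roughly like this (a sketch; the build-backend line is an assumption, since the snippet in the question did not show one, and as noted above it is unclear whether Poetry will honor a setuptools back-end at all):
[build-system]
# build requirements read by PEP 518 front-ends such as pip before building
requires = ["setuptools>=49.0.1", "wheel"]
build-backend = "setuptools.build_meta"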
Related
I would like to use the Python library pyhash in my project. The dependencies are managed by Poetry. If I add pyhash as a dependency, I get a build error: error in pyhash setup command: use_2to3 is invalid.
This is a well-known bug due to setuptools > 58.0.0 not supporting use_2to3 anymore. In a non-Poetry setup, the fix is easy. Just downgrade setuptools to <= 58.0.0: pip3 install setuptools==58.0.0.
However, in a Poetry project, I could not make this work. I added setuptools=58.0.0 as a dependency, but when I install my project I still get the use_2to3 error. I assume that poetry still uses a setuptools>58.0.0.
How can I fix this?
I found a workaround for my problem. In the case of pyhash, the dependency on use_2to3 has already been removed in the master branch. This fix has unfortunately not been released yet. However, it is possible for pip and also for Poetry to install directly from a GitHub repository. Any ref can be specified: branches, tags, and also individual commits.
The workaround with Poetry is to add the pyhash dependency with the Git repository as its source:
poetry add git+https://github.com/flier/pyfasthash.git#20a53f9bb7bf15f98e3e549f523b49e1e0f62e15
One can also specify master, but this is not advisable, as any branch is a moving target and will lead to non-reproducible releases.
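With Poetry this ends up as a Git source in pyproject.toml; the resulting dependency entry would look roughly like this (a sketch, using the same repository and commit hash as in the command above):
[tool.poetry.dependencies]
pyhash = { git = "https://github.com/flier/pyfasthash.git", rev = "20a53f9bb7bf15f98e3e549f523b49e1e0f62e15" }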
I want to achieve a behavior similar to what the Dask library does: it is possible to use pip to install dask, dask[dataframe], dask[array], and others. They do it by using setup.py with a packages key like this. If I install only dask, dask[dataframe] is not installed, and they warn you about this when executing the module.
I found this in the Poetry documentation, but when I execute poetry build I only get one .whl file with all of the packages within.
How can I package my module to be able to install specific parts of a library using poetry?
Actually, the Dask example does not install sub-packages separately; it just installs the custom dependencies separately, as explained in this link.
In order to accomplish the same behavior using Poetry, you need to use extras (as mentioned by user #sinoroc in this comment).
The example pyproject.toml from the poetry extras page is this:
[tool.poetry]
name = "awesome"
[tool.poetry.dependencies]
# These packages are mandatory and form the core of this package’s distribution.
mandatory = "^1.0"
# A list of all of the optional dependencies, some of which are included in the
# below `extras`. They can be opted into by apps.
psycopg2 = { version = "^2.7", optional = true }
mysqlclient = { version = "^1.3", optional = true }
[tool.poetry.extras]
mysql = ["mysqlclient"]
pgsql = ["psycopg2"]
By using poetry build --format wheel, a single wheel file will be created.
In order to install a specific set of extra dependencies using pip and the wheel file you should use:
pip install "wheel_filename.whl[mysql]"
I am writing a new Python package that I want to make available via PyPI.
In the past I always used setup.py. But this time I decided to embrace the new
best practice of using setup.cfg
instead. So I started reading a little bit of the documentation, mainly
https://setuptools.pypa.io/en/latest/userguide/declarative_config.html.
I also decided to use pyscaffold for the generation of the files.
pyscaffold generated a setup.cfg file for me, and (just for testing
purposes) I added version = 5.1.1 under the metadata section, as described in the
documentation above.
For convenience I'm using anaconda and created a new empty environment:
$ python --version
Python 3.9.7
$ pip list
...
PyScaffold 4.1.1
setuptools 58.4.0
setuptools-scm 6.3.2
wheel 0.37.0
But when I executed pip install -e ., the version was ignored and pip assigned a
different one:
$ pip install -e .
Obtaining file:///tmp/test_package
Installing build dependencies ... done
Checking if build backend supports build_editable ... done
Getting requirements to build wheel ... done
Preparing metadata (pyproject.toml) ... done
Installing collected packages: test-package
Attempting uninstall: test-package
Found existing installation: test-package 0.0.post1.dev10+g3ed39c8.d20211106
Uninstalling test-package-0.0.post1.dev10+g3ed39c8.d20211106:
Successfully uninstalled test-package-0.0.post1.dev10+g3ed39c8.d20211106
Running setup.py develop for test-package
Successfully installed test-package-0.0.post1.dev1+g228b46c
https://stackoverflow.com/a/27077610/1480131 mentions that setuptools version
30.3.0 supports putting metadata in setup.cfg; I'm using 58.4.0, so a much
more recent version.
I also tested different formats like version = file:VERSION.txt, but that
didn't help either; pip simply ignores the version entry.
What am I missing? What am I doing wrong?
I prepared a small git repo with the files in order to be able to reproduce the
error: https://github.com/shaoran/test_package Does anybody get a different
result?
That's because pyscaffold generated a project that uses setuptools-scm for version detection. When setuptools-scm is used, the version is not read from version metadata, but parsed from the repository tags. The version 0.0.post1.dev10+g3ed39c8.d20211106 can be read as follows:
0.0.post1 - a dummy version, since you don't have any tags in the repo yet;
dev10 - you have ten commits that are not included in any version tag (basically the number of commits made since the last tag);
g3ed39c8 - the short hash of commit you have installed from is 3ed39c8 (prefix g means it is a Git repo);
d20211106 - d means you have installed from a dirty state (some files versioned by Git were modified), the rest is just the timestamp.
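If you want to check which version setuptools-scm would compute without installing anything, you can ask it directly (this assumes setuptools-scm is available in the environment; it ships a small command-line entry point):
python -m setuptools_scm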
If you want to use the version metadata from setup.cfg instead, just drop the setuptools-scm activation in setup.py, leaving a plain call to setup():
from setuptools import setup

if __name__ == "__main__":
    try:
        setup()
    except:
        ...
You can then also clean up pyproject.toml, although this isn't strictly required:
[build-system]
requires = ["setuptools>=46.1.0", "wheel"]
build-backend = "setuptools.build_meta"
If you want to keep using setuptools-scm (which IMO is a great tool to prevent you from accidentally releasing dists with the same version but different contents), then just add a new tag to start versioning from:
$ git tag -a 5.1.1 -m 'start versioning'
If you have a clean repo state (no modified files), pip install -e . will install version 5.1.1 right away. Otherwise, you will get 5.1.1 with a suffix.
The neat part of using setuptools-scm is that the version metadata in setup.cfg is ignored, so you don't have to bump it yourself and can safely remove the version = 5.1.1 line.
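After installing, a quick way to confirm which version was actually picked up is to query the installed metadata (importlib.metadata is in the standard library on Python 3.8+; test-package is the distribution name from the example repository above):
python -c "from importlib.metadata import version; print(version('test-package'))"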
Background
I was about to try a Python package downloaded from GitHub and realized that it did not have a setup.py, so I could not install it with
pip install -e <folder>
Instead, the package had a pyproject.toml file, which seems to contain entries very similar to what setup.py usually has.
What I found
Googling led me to PEP 518, which gives some criticism of setup.py in its Rationale section. However, it does not clearly state that the use of setup.py should be avoided, or that pyproject.toml as such completely replaces setup.py.
Questions
Is the pyproject.toml something that is used to replace setup.py? Or should a package come with both, a pyproject.toml and a setup.py?
How would one install a project with pyproject.toml in an editable state?
Yes, pyproject.toml is the file format specified by PEP 518, which contains the build system requirements of Python projects.
This solves the chicken-and-egg problem of build-tool dependencies: pip can read pyproject.toml to learn which version of setuptools or wheel is needed before building the project.
If you need a setup.py for an editable install, you could use a shim in setup.py:
#!/usr/bin/env python
import setuptools
if __name__ == "__main__":
    setuptools.setup()
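With that shim in place, the usual editable-install command should work:
pip install -e .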
pyproject.toml is the new unified Python project settings file that replaces setup.py.
Editable installs still need a minimal setup.py shim (import setuptools; setuptools.setup()) on older pip versions; newer pip releases support PEP 660 editable installs without it, as discussed further below.
To use pyproject.toml, run python -m pip install .
Then, if the project is using Poetry instead of pip, you can install dependencies (on Windows, into %USERPROFILE%\AppData\Local\pypoetry\Cache\virtualenvs) like this:
poetry install
And then run development tools like pytest:
poetry run pytest tests/
And pre-commit (uses .pre-commit-config.yaml):
poetry run pre-commit install
poetry run pre-commit run --all-files
What is it for?
Currently there are multiple popular packaging tools in the Python community, and while setuptools still seems to be prevalent, it is no longer a de facto standard. This situation creates a number of hassles for both end users and developers:
For setuptools-based packages, installation from source (or building a distribution) can fail if one doesn't have setuptools installed;
pip doesn't support installing packages based on other packaging tools from source, so these tools had to generate a setup.py file to produce a compatible package. To build a distribution package, one has to install the packaging tool first and then use tool-specific commands;
If the package author decides to change the packaging tool, workflows must be changed as well to use different tool-specific commands.
pyproject.toml is a new configuration file introduced by PEP 517 and PEP 518 to solve these problems:
... think of the (rough) steps required to produce a built artifact for a project:
The source checkout of the project.
Installation of the build system.
Execute the build system.
This PEP [518] covers step #2. PEP 517 covers step #3 ...
Any tool can also extend this file with its own section (table) to accept tool-specific options, but it's up to them and not required.
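A minimal pyproject.toml illustrating both parts might look like this (a sketch; the [tool.black] table is just an arbitrary example of a tool-specific section):
[build-system]
requires = ["setuptools>=61.0", "wheel"]
build-backend = "setuptools.build_meta"
# a tool-specific table, entirely optional
[tool.black]
line-length = 100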
PEP 621 suggests using pyproject.toml to specify the package's core metadata in a static, tool-agnostic way (a minimal [project] example is sketched after the table below). Which back-ends currently support this is shown in the following table:
enscons: 0.26.0+
flit_core: 3.2+
hatchling: 0.3+
pdm-pep517: 0.3.0+
poetry-core: Issue #3332
setuptools: 61.0.0+
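The PEP 621 [project] table mentioned above could look roughly like this (a sketch; all names and values are illustrative):
[project]
name = "awesome"
version = "1.0.0"
description = "An example package"
requires-python = ">=3.8"
dependencies = ["requests>=2.0"]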
Does it replace setup.py?
For setuptools-based packages pyproject.toml is not strictly meant to replace setup.py, but rather to ensure its correct execution if it's still needed. For other packaging tools – yes, it is:
Where the build-backend key exists, this takes precedence and the source tree follows the format and conventions of the specified backend (as such no setup.py is needed unless the backend requires it). Projects may still wish to include a setup.py for compatibility with tools that do not use this spec.
How to install a package in editable mode?
Originally "editable install" was a setuptools-specific feature and as such it was not supported by PEP 517. Later on PEP 660 extended this concept to packages using pyproject.toml.
There are two possible conditions for installing a package in editable mode using pip:
Modern:
Both the frontend (pip) and a backend must support PEP 660.
pip has supported it since version 21.3;
Legacy:
The packaging tool must provide a setup.py file which supports the develop command.
Since version 21.1, pip can also install packages in editable mode using only a setup.cfg file.
The following table describes the support of editable installs by various backends:
enscons: 0.28.0+
flit_core: 3.4+
hatchling: 0.3+
pdm-pep517: 0.8.0+
poetry-core: 1.0.8+
setuptools: 64.0.0+
Answering this part only, as the rest has nicely been explained by others:
How would one install a project with pyproject.toml in an editable state?
Solution
Since the release of poetry-core v1.0.8 in Feb 2022 you can do this:
a) you need this entry in your pyproject.toml:
[build-system]
requires = ["poetry-core>=1.0.8"]
build-backend = "poetry.core.masonry.api"
b) run:
pip install -e .
Sources
https://github.com/python-poetry/poetry/issues/34#issuecomment-1054626460
pyproject.toml can declare the files in your Python package and all the metadata for it that will show on PyPI.
A tool like flit can process the pyproject.toml file into a package that can be uploaded to PyPI or installed with pip.
Other tools use pyproject.toml for other purposes. For example, pytest stores information there about where to find and how to run tests, as well as instructions about modifying pythonpath (sys.path) before running the tests. Many IDEs can use this to help developers conveniently run tests.
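As an illustration, a pytest configuration kept in pyproject.toml might look like this (a sketch; these settings live under [tool.pytest.ini_options], and the pythonpath option requires pytest 7.0 or newer):
[tool.pytest.ini_options]
testpaths = ["tests"]
pythonpath = ["src"]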
The importlib_resources package (a backport for Python < 3.7 of the importlib.resources standard library module) has the following section in its setup.cfg file:
[options]
python_requires = >=2.7,!=3.0,!=3.1,!=3.2,!=3.3
setup_requires =
    setuptools
    wheel
install_requires =
    pathlib2; python_version < '3'
    typing; python_version < '3.5'
packages = find:
Why does setup_requires include setuptools? This does not seem to make sense since:
the first line of the setup.py file imports setuptools, so by the time the setup function is called and reads the setup.cfg file that asks for setuptools to be installed, it is already too late to install it:
from setuptools import setup
setup()
setuptools is already installed on any fresh Python installation (well, I only tested on Windows 10 and macOS 10.15 with Python 3.8.0):
$ python -V
Python 3.8.0
$ pip list
Package Version
---------- -------
pip 19.2.3
setuptools 41.2.0
WARNING: You are using pip version 19.2.3, however version 19.3.1 is available.
You should consider upgrading via the 'python -m pip install --upgrade pip' command.
No, setuptools should not be included in setup_requires, according to PEP 518 (bold emphasis mine):
Setuptools tried to solve this with a setup_requires argument to its
setup() function [3]. This solution has a number of issues, such as:
No tooling (besides setuptools itself) can access this information without executing the setup.py, but setup.py can't be executed without having these items installed.
While setuptools itself will install anything listed in this, they won't be installed until during the execution of the setup() function, which means that the only way to actually use anything added here is through increasingly complex machinations that delay the import and usage of these modules until later on in the execution of the setup() function.
This cannot include setuptools itself nor can it include a replacement to setuptools, which means that projects such as numpy.distutils are largely incapable of utilizing it and projects cannot take advantage of newer setuptools features until their users naturally upgrade the version of setuptools to a newer one.
The items listed in setup_requires get implicitly installed whenever you execute the setup.py but one of the common ways that the setup.py is executed is via another tool, such as pip, who is already managing dependencies. This means that a command like pip install spam might end up having both pip and setuptools downloading and installing packages and end users needing to configure both tools (and for setuptools without being in control of the invocation) to change settings like which repository it installs from. It also means that users need to be aware of the discovery rules for both tools, as one may support different package formats or determine the latest version differently.
The accepted answer is mostly correct, but where PEP 518 says:
[The setup_requires mechanism] cannot include setuptools itself...
it is technically incorrect, and as importlib_resources demonstrates, it can actually include setuptools. The problem is that including setuptools in setup_requires serves mostly as documentation. It declares that setuptools is a build requirement (required to run setup.py), but setuptools won't be capable of satisfying that requirement if it isn't already satisfied.
Still, the presence of setuptools in setup_requires is technically correct and does serve the purpose of declaring the requirement and asking setuptools to verify that the requirement is in fact installed (alongside other setup-time requirements).
It is, however, just a legacy artifact and doesn't provide that much value, and as can be seen in the question and answers, it does lead to confusion. The recommended, proper approach is to use PEP 517 and 518 declarations and builders, but that part of the ecosystem hasn't matured yet, so setuptools vestiges will remain. Try not to let them bother you.
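For comparison, the PEP 518 way to declare the same build requirements is a [build-system] table in pyproject.toml instead of setup_requires in setup.cfg (a sketch; the version pin is illustrative and not taken from the importlib_resources project):
[build-system]
requires = ["setuptools>=40.8.0", "wheel"]
build-backend = "setuptools.build_meta"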
Why does setup_requires include setuptools? This does not seem to make sense
It does not make sense at all. On the other hand, it doesn't hamper anything, so why not?