What is the pyproject.toml file for? - python

Background
I was about to try a Python package downloaded from GitHub and realized that it did not have a setup.py, so I could not install it with
pip install -e <folder>
Instead, the package had a pyproject.toml file, which seems to have entries very similar to what setup.py usually has.
What I found
Googling led me to PEP 518, which offers some criticism of setup.py in its Rationale section. However, it does not clearly state that setup.py should be avoided, or that pyproject.toml as such completely replaces setup.py.
Questions
Is pyproject.toml something that is used to replace setup.py? Or should a package come with both a pyproject.toml and a setup.py?
How would one install a project with pyproject.toml in an editable state?

Yes, pyproject.toml is the file format specified by PEP 518, and it contains the build system requirements of a Python project.
This solves the build-tool chicken-and-egg problem: pip can read pyproject.toml to learn which version of setuptools or wheel the build needs before it runs.
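For example, a minimal sketch of such a declaration in pyproject.toml (the version pins are illustrative):
[build-system]
requires = ["setuptools>=40.8.0", "wheel"]
build-backend = "setuptools.build_meta"
pip reads this table, installs the listed requirements into an isolated build environment, and only then runs the build.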
If you need a setup.py for an editable install, you could use a shim in setup.py:
#!/usr/bin/env python
import setuptools

if __name__ == "__main__":
    setuptools.setup()

pyproject.toml is the new unified Python project settings file that replaces setup.py.
With older tooling, editable installs still need a minimal setup.py stub: import setuptools; setuptools.setup() (see the discussion of PEP 660 below).
To install a project that uses pyproject.toml, run python -m pip install .
Then, if the project uses poetry instead of pip, you can install its dependencies (on Windows, into %USERPROFILE%\AppData\Local\pypoetry\Cache\virtualenvs) like this:
poetry install
And then run development tools such as pytest:
poetry run pytest tests/
And pre-commit (uses .pre-commit-config.yaml):
poetry run pre-commit install
poetry run pre-commit run --all-files
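For reference, a minimal sketch of a Poetry-managed pyproject.toml (the name, versions, and dependencies below are illustrative; newer Poetry versions also accept a [tool.poetry.group.dev.dependencies] table instead of dev-dependencies):
[tool.poetry]
name = "myapp"
version = "0.1.0"
description = ""
authors = ["Your Name <you@example.com>"]

[tool.poetry.dependencies]
python = "^3.8"

[tool.poetry.dev-dependencies]
pytest = "^6.0"
pre-commit = "^2.0"

[build-system]
requires = ["poetry-core>=1.0.0"]
build-backend = "poetry.core.masonry.api"
poetry install then creates the virtualenv and installs everything listed above.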

What is it for?
Currently there are multiple packaging tools popular in the Python community, and while setuptools still seems to be prevalent, it is no longer a de facto standard. This situation creates a number of hassles for both end users and developers:
For setuptools-based packages, installing from source or building a distribution can fail if setuptools is not installed;
pip does not support installing packages based on other packaging tools from source, so those tools had to generate a setup.py file to produce a compatible package. To build a distribution package, one had to install the packaging tool first and then use tool-specific commands;
If a package author decides to change the packaging tool, workflows must be changed as well to use different tool-specific commands.
pyproject.toml is a new configuration file introduced by PEP 517 and PEP 518 to solve these problems:
... think of the (rough) steps required to produce a built artifact for a project:
1. The source checkout of the project.
2. Installation of the build system.
3. Execute the build system.
This PEP [518] covers step #2. PEP 517 covers step #3 ...
Any tool can also extend this file with its own section (table) to accept tool-specific options, but it's up to them and not required.
PEP 621 suggests using pyproject.toml to specify a package's core metadata in a static, tool-agnostic way. Which backends currently support this is shown below:
enscons: 0.26.0+
flit_core: 3.2+
hatchling: 0.3+
pdm-pep517: 0.3.0+
poetry-core: not yet (see Issue #3332)
setuptools: 61.0.0+
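As a sketch, PEP 621 core metadata in pyproject.toml looks like this (all values illustrative); any backend at the versions listed above can build from it:
[project]
name = "myapp"
version = "0.1.0"
description = "An example package"
requires-python = ">=3.8"
dependencies = [
    "requests>=2.0",
]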
Does it replace setup.py?
For setuptools-based packages, pyproject.toml is not strictly meant to replace setup.py, but rather to ensure its correct execution if it is still needed. For other packaging tools, yes, it does:
Where the build-backend key exists, this takes precedence and the source tree follows the format and conventions of the specified backend (as such no setup.py is needed unless the backend requires it). Projects may still wish to include a setup.py for compatibility with tools that do not use this spec.
How to install a package in editable mode?
Originally "editable install" was a setuptools-specific feature and as such it was not supported by PEP 517. Later on PEP 660 extended this concept to packages using pyproject.toml.
There are two possible conditions for installing a package in editable mode using pip:
Modern:
Both the frontend (pip) and a backend must support PEP 660.
pip supports it since version 21.3;
Legacy:
Packaging tool must provide a setup.py file which supports the develop command.
Since version 21.1 pip can also install packages using only setup.cfg file in editable mode.
The following list describes editable-install support in the various backends:
enscons: 0.28.0+
flit_core: 3.4+
hatchling: 0.3+
pdm-pep517: 0.8.0+
poetry-core: 1.0.8+
setuptools: 64.0.0+
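For example, with a sufficiently new setuptools as the declared backend, a modern editable install needs no setup.py at all; a sketch, assuming pip 21.3+:
[build-system]
requires = ["setuptools>=64.0.0"]
build-backend = "setuptools.build_meta"
With that in place, pip install -e . performs a PEP 660 editable install.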

Answering this part only, as the rest has nicely been explained by others:
How would one install a project with pyproject.toml in an editable state?
Solution
Since the release of poetry-core v1.0.8 in February 2022, you can do this:
a) you need this entry in your pyproject.toml:
[build-system]
requires = ["poetry-core>=1.0.8"]
build-backend = "poetry.core.masonry.api"
b) run:
pip install -e .
Sources
https://github.com/python-poetry/poetry/issues/34#issuecomment-1054626460

pyproject.toml can declare the files in your Python package and all the metadata for it that will show on PyPI.
A tool like flit can process the pyproject.toml file into a package that can be uploaded to PyPI or installed with pip.
Other tools use pyproject.toml for other purposes. For example, pytest stores information about where to find and how to run tests, along with instructions for modifying pythonpath (sys.path) before running the tests. Many IDEs can use this to help developers run tests conveniently.
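For instance, a sketch of pytest settings in pyproject.toml (the paths are illustrative; the pythonpath option requires pytest 7.0+):
[tool.pytest.ini_options]
testpaths = ["tests"]
pythonpath = ["src"]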

Related

How to create a deb package for a python project without setup.py

All the documentation I've found about this topic mentions that the "only" requirement to build a deb package is to have a correct setup.py (and requirements.txt), for instance in the dh-virtualenv tutorial, the stdeb documentation, and Debian's library style guide for Python.
But nowadays new (amazing) tools like poetry allow developing (and uploading to PyPI) Python projects without any setup.py (this file and several others, including requirements.txt, are all replaced by pyproject.toml). I believe flit allows this too.
I have developed a Python project managed by poetry and would like to package it for Ubuntu/Debian. I guess, as a workaround, I can still write a setup.py file that takes its values from pyproject.toml, and a requirements.txt file (written by hand using values from poetry.lock).
But, is there a way to do this without any setup.py file?
setuptools, and the setup.py file it requires, has long been the de facto packaging standard in Python. The new package managers you mention were enabled by the introduction of PEP 517 and PEP 518 (or read this for a high-level description of the topic), which provide a standardized way of specifying the build backend without the need for a setup.py (and the ensuing chicken-and-egg problem where you already need setuptools to parse it correctly).
Anyway, it's all still very fresh, and the Linux packaging community hasn't caught up yet. I found no recent discussion regarding Debian packages, but the rpm side sums it up neatly over here.
So, the short answer is to just wait a while, and google debian packaging pep517 support every now and then.
Until then, you can use dephell to generate the setup.py, and poetry to generate the requirements.txt for you as a workaround to keep using the existing tools:
dephell deps convert --from=poetry --to=setuppy
poetry export -f requirements.txt -o requirements.txt
And, during the build, tell your pyproject.toml that you plan to use setuptools for the build instead of poetry:
[build-system]
requires = ["setuptools >= 40.6.0", "wheel"]
build-backend = "setuptools.build_meta"

Python 3 setuptools: adding a local package to a build

There is a locally built package (e.g. main-0.1.tar.gz). There is another package (for example, base-0.1) that requires main-0.1 as a dependency.
It is necessary that, during a subsequent installation of the base-0.1 package, the main-0.1 package is installed as well.
That is, you can only specify PyPI packages in install_requires; how to add local packages to the build is unclear.
You can add the package main-0.1.tar.gz to the base-0.1 archive using MANIFEST.in (include main-0.1.tar.gz). But then dependency_links, for example, does not work correctly.
How do I add a local package to the build of another package and then install it along with that package, as if it were pulled from PyPI?
You might want to look at:
PEP 440 ("File URLs")
PEP 508
import setuptools

setuptools.setup(
    # [...]
    install_requires=[
        'main @ file:///path/to/main-0.1.tar.gz',
        # [...]
    ],
)
Alternatively (probably better actually), use some combination of pip install options:
pip install --no-index --find-links '/path/to/distributions' main base
Reference:
https://pip.pypa.io/en/stable/user_guide/#installing-from-local-packages
Found a rough solution. I don't know how well it conforms to best practices, but it works.
Add include main-0.1.tar.gz to MANIFEST.in
In setup.py, at the end of the file (after the call to setup()), add:
import os
import sys

if 'sdist' not in sys.argv[1:]:
    os.system('pip install main-0.1.tar.gz')
The condition may be different if, for example, sdist is not used for building (python setup.py sdist). The main thing is to somehow determine whether setup is running for a build rather than for an installation (a later pip install base-0.1.tar.gz).
In this case, the local dependent package is copied into the archive of the package being built and is distributed along with it, and installed the same way.

Write test-cases for python setuptools entry-points plugins

I built a Python application (the "host" app) that defines a setuptools entry point so that it can be extended. Plugin authors then have to add the following to their setup.py file:
setup(
    # ...
    entry_points={
        'myapp.plugins': [
            'plugin_1 = <foo.plugin.module>:<plugin-install-func>',
        ],
    },
)
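For context, a minimal sketch of how the host app might load such plugins at runtime (the group name myapp.plugins is taken from the setup above):
import pkg_resources

# Iterate over every installed distribution that registered an entry
# point in the 'myapp.plugins' group, resolve it, and call it.
for entry_point in pkg_resources.iter_entry_points('myapp.plugins'):
    plugin_install_func = entry_point.load()
    plugin_install_func()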
In order to test my setup, I have to:
build a dummy wheel package,
use pip to install it,
append the new package's folder to sys.path and invoke pkg_resources.working_set.add_entry(package_dir) [*],
only then can I check the expected behavior (run the test cases),
use pip to uninstall the package, and
finally remove the installed package's folder from sys.path.
And a separate package is needed for each test case if different functionality must be validated.
This whole testing rig is rather verbose and clumsy.
Is there a more elegant way to write test cases for setuptools entry-point plugins?
[*] Note: Installing a wheel, or using pip in develop mode with pip install -e <plugin-package>, does not activate the plugin in the same interpreter on Linux; or at least not without appending the package folder to sys.path afterwards.
On Windows, the above problem exists only in develop mode.
I had the same problem and resolved it with a dirty hack: build and pip install the plugin package under test into a tox environment, and run the tests in that environment.
The test script that does the dirty hack, i.e. builds and installs a wheel package and runs the tests: https://github.com/ssato/yamllint-plugin-example/blob/master/tests/test_plugin.sh
The tox configuration: https://github.com/ssato/yamllint-plugin-example/blob/master/tox.ini
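A rough sketch of what that script boils down to, reduced to its essential commands (paths illustrative):
# Build a wheel of the plugin without pulling in dependencies,
# install it into the active (tox) environment, then run the tests.
pip wheel --no-deps -w dist .
pip install dist/*.whl
pytest tests/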

Bundle two Python packages together

I have a Python package myapp which depends on a Python package theirapp.
theirapp is used by others and may be updated occasionally, but it is not hosted on PyPI.
I currently have my repository setup like this:
my-app/
    myapp/
        __init__.py
    requirements.txt
    their-app/
        setup.py
        theirapp/
            __init__.py
My requirements.txt file contains the following line (among others):
./their-app/
their-app is not hosted on PyPI, but I want to make sure the latest version is installed. Up to this point I have been downloading a zip file containing my-app, typing pip install -U -r requirements.txt, and using the application manually.
I would like to make an installable Python package. Ideally I would like to download a my-app.zip file and type pip install my-app.zip to install myapp, theirapp and any other dependencies.
Is this possible? If not, what is the best way to handle this scenario?
You may just need to bundle theirapp as part of your project and import it as myapp.contrib.theirapp. If both projects are versioned in git, you can implement it as a submodule, but it may increase complexity for maintainers.
How pip handles a similar problem:
https://github.com/pypa/pip/tree/develop/pip/_vendor
You can see pip imports bundled vendor packages as pip._vendor.theirapp.
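A sketch of the resulting layout, mirroring the question's tree with theirapp vendored under myapp/contrib:
my-app/
    myapp/
        __init__.py
        contrib/
            __init__.py
            theirapp/
                __init__.py
Code in myapp then does import myapp.contrib.theirapp instead of import theirapp.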

How to require and install a package using python 3.x distutils?

I have a program that uses dateutil from the package index. I would like setup.py to check for its presence and try to get it using easy_install if it is not there.
The documentation for distutils seems to indicate that this can be done using the requires keyword in setup(), but when I try it on a system without dateutil, the program installs without giving a warning or installing the required package.
The only thing I could find on google was this blog post about the same issue which did not have any answer either.
Am I using distutils wrong? Do I need to subclass distutils.command.install and do the checking/installing myself?
Automatic downloading of dependencies is a feature introduced by setuptools, a third-party add-on to distutils; in particular, it is the install_requires argument that setuptools adds. See the setuptools documentation for more information.
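A minimal sketch of that approach for the dateutil case (the project name is illustrative; python-dateutil is dateutil's distribution name on PyPI):
# setup.py: import setup from setuptools, not distutils, so that
# install_requires is honored and missing dependencies are installed.
from setuptools import setup

setup(
    name='myprogram',
    version='0.1',
    install_requires=['python-dateutil'],
)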
Another option is to use a requirements.txt file with pip, rather than easy_install, as the package installer. pip has now become the recommended installer; see the Python Packaging User Guide for more information.
Update [2015-01]: The previous version of this answer referred to the distribute fork of setuptools. The distribute fork has since been merged back into a newer active setuptools project. distribute is now dead and should no longer be used. setuptools and pip are now very actively maintained and support Python 3.
The install_requires argument of the setup() function works well for me, but only if I create an sdist distribution, e.g. python setup.py sdist
