I created a package that has two modes:
a basic mode, with basic functionality only, and
an extended mode, which adds additional modules and needs extra requirements.
For example:
MyPackageName
    core
    cyber_analyzer
    parsing
Here "parsing" is the extension and needs "pandas" as a requirement.
Then, I want my package to be installable in two ways:
pip install mypackage
pip install mypackage[parsing]
I found out I can use extras_require to install "pandas". Yet, installing the wheel file installs all three modules: core, cyber_analyzer and parsing. I would like "parsing" to be installed only if the extra flag "parsing" was specified.
Is it possible to do so? How can I achieve it? Or should I always install "parsing" and simply have users not use it?
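For reference, the extra itself is declared like this in setup.py (a minimal sketch using the names from the question; note that extras_require only controls dependencies, so the wheel still ships all three modules):

```python
from setuptools import setup, find_packages

setup(
    name="mypackage",
    version="0.1.0",
    packages=find_packages(),  # the wheel still contains core, cyber_analyzer and parsing
    extras_require={
        # `pip install mypackage[parsing]` additionally pulls in pandas
        "parsing": ["pandas"],
    },
)
```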
Using setuptools, is it possible to list another editable package as a dependency for an editable package?
I'm trying to develop a collection of packages to use across different production services. One of these packages (my_pkg_1) depends on a subset of the collection (my_pkg_2, my_pkg_x, ...). So far, I've managed to put together this pyproject.toml:
[build-system]
requires = ["setuptools"]
build-backend = "setuptools.build_meta"

[project]
name = "my_pkg_1"
version = "0.0.1"
dependencies = [
    "my_pkg_2 @ file:///somewhere/in/mysystem/my_pkg_2"
]
This works for installing my_pkg_1 in editable mode, and it does install my_pkg_2, but not in editable mode. This is what I see when I run pip list:
Package Version Editable project location
--------------- ------- -------------------------
my_pkg_2 0.0.1
my_pkg_1 0.0.1 /somewhere/in/mysystem/my_pkg_1
Is what I'm trying to do even possible? If so, how?
You may install my_pkg_2 explicitly in editable mode before installing my_pkg_1:
pip install --editable /somewhere/in/mysystem/my_pkg_2
Unfortunately, it is not possible to automatically install dependencies (and dependencies of dependencies) in editable mode when installing the main package. I am curious why this is not implemented.
Alternatively, you may add the package paths to the environment variable PYTHONPATH before running code from your main package. That way, you are able to import python modules from your other packages without having to install them.
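A minimal sketch of the same idea within a single process, using sys.path (which is what PYTHONPATH feeds into); the package layout here is fabricated for illustration:

```python
import pathlib
import sys
import tempfile

# Simulate a sibling package that is never pip-installed.
root = pathlib.Path(tempfile.mkdtemp())
pkg = root / "my_pkg_2"
pkg.mkdir()
(pkg / "__init__.py").write_text('VERSION = "0.0.1"\n')

# Equivalent of prepending the directory to PYTHONPATH for this process:
sys.path.insert(0, str(root))

import my_pkg_2  # now importable without any installation
print(my_pkg_2.VERSION)  # -> 0.0.1
```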
This cannot be done in pyproject.toml, at least not the way you want it and not in a standard way.
If I were you I would write for myself a requirements.txt file (you could also give it a different name, obviously):
# install the current project as editable
--editable .
# install `my_pk_2` as editable
--editable /somewhere/in/mysystem/my_pkg_2
And you could use it like so:
path/to/venv/bin/python -m pip install --requirement 'path/to/requirements.txt'
for when you want to work on (edit) both pieces of software at the same time in the same environment.
Alternatively you could use a "development workflow tool" (such as PDM, Hatch, Poetry, and so on), and maybe one of those would be a better fit for your expectations.
There is a locally built package (e.g. main-0.1.tar.gz). There is another package (for example base-0.1) that requires main-0.1 as a dependency.
It is necessary that during the subsequent installation of the base-0.1 package, the main-0.1 package is also installed.
That is, install_requires only lets you specify packages available on PyPI; it is not clear how to add local packages to the build.
You can add the package main-0.1.tar.gz to the base-0.1 archive using MANIFEST.in (include main-0.1.tar.gz). But then dependency_links, for example, does not work correctly.
How do I add a local package to the build of another package and then install it along with another package, as if it were pulled from PyPI?
You might want to look at:
PEP 440 ("File URLs")
PEP 508
import setuptools

setuptools.setup(
    # [...]
    install_requires=[
        'main @ file:///path/to/main-0.1.tar.gz',
        # [...]
    ],
)
Alternatively (probably better actually), use some combination of pip install options:
pip install --no-index --find-links '/path/to/distributions' main base
Reference:
https://pip.pypa.io/en/stable/user_guide/#installing-from-local-packages
I found a rough solution. I don't know how idiomatic it is, but it works.
Add include main-0.1.tar.gz to MANIFEST.in
In setup.py, at the end of the file (after the setup() call), add:
import os
import sys

if 'sdist' not in sys.argv:
    os.system('pip install main-0.1.tar.gz')
The condition may need to be different if, for example, sdist is not used for building (python setup.py sdist). The main thing is to somehow detect that setup is running for a build, and not for an installation (a later pip install base-0.1.tar.gz).
In this case, we copy the local dependency package into the archive of the package being built, so it is distributed along with it and installed the same way.
Background
I was about to try a Python package downloaded from GitHub and realized that it did not have a setup.py, so I could not install it with
pip install -e <folder>
Instead, the package had a pyproject.toml file, which seems to have entries very similar to what setup.py usually has.
What I found
Googling led me to PEP 518, which gives some critique of setup.py in its Rationale section. However, it does not clearly say that usage of setup.py should be avoided, or that pyproject.toml as such completely replaces setup.py.
Questions
Is the pyproject.toml something that is used to replace setup.py? Or should a package come with both, a pyproject.toml and a setup.py?
How would one install a project with pyproject.toml in an editable state?
Yes, pyproject.toml is the specified file format of PEP 518 which contains the build system requirements of Python projects.
This solves the build-tool dependency chicken and egg problem, i.e. pip can read pyproject.toml and what version of setuptools or wheel one may need.
If you need a setup.py for an editable install, you could use a shim in setup.py:
#!/usr/bin/env python
import setuptools

if __name__ == "__main__":
    setuptools.setup()
pyproject.toml is the new unified Python project settings file that replaces setup.py.
Editable installs still need a setup.py: import setuptools; setuptools.setup()
To use pyproject.toml, run python -m pip install .
Then, if the project uses poetry instead of pip, you can install dependencies (into %USERPROFILE%\AppData\Local\pypoetry\Cache\virtualenvs on Windows) like this:
poetry install
And then run development dependencies like pytest:
poetry run pytest tests/
And pre-commit (uses .pre-commit-config.yaml):
poetry run pre-commit install
poetry run pre-commit run --all-files
What is it for?
Currently there are multiple packaging tools popular in the Python community, and while setuptools still seems to be prevalent, it is not a de facto standard anymore. This situation creates a number of hassles for both end users and developers:
For setuptools-based packages installation from source / build of a distribution can fail if one doesn't have setuptools installed;
pip doesn't support the installation of packages based on other packaging tools from source, so these tools had to generate a setup.py file to produce a compatible package. To build a distribution package one has to install the packaging tool first and then use tool-specific commands;
If the package author decides to change the packaging tool, workflows must be changed as well to use different tool-specific commands.
pyproject.toml is a new configuration file introduced by PEP 517 and PEP 518 to solve these problems:
... think of the (rough) steps required to produce a built artifact for a project:
The source checkout of the project.
Installation of the build system.
Execute the build system.
This PEP [518] covers step #2. PEP 517 covers step #3 ...
Any tool can also extend this file with its own section (table) to accept tool-specific options, but it's up to them and not required.
PEP 621 suggests using pyproject.toml to specify the package's core metadata in a static, tool-agnostic way. Which backends currently support this is shown in the following table:

Backend         PEP 621 support
-----------     ----------------------
enscons         0.26.0+
flit_core       3.2+
hatchling       0.3+
pdm-pep517      0.3.0+
poetry-core     not yet (issue #3332)
setuptools      61.0.0+
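A minimal sketch of such a static [project] table, here assuming setuptools 61+ as the backend (the project name, version and dependency are illustrative):

```toml
[build-system]
requires = ["setuptools>=61.0.0"]
build-backend = "setuptools.build_meta"

[project]
name = "my-package"
version = "0.1.0"
dependencies = [
    "requests",
]
```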
Does it replace setup.py?
For setuptools-based packages pyproject.toml is not strictly meant to replace setup.py, but rather to ensure its correct execution if it's still needed. For other packaging tools – yes, it is:
Where the build-backend key exists, this takes precedence and the source tree follows the format and conventions of the specified backend (as such no setup.py is needed unless the backend requires it). Projects may still wish to include a setup.py for compatibility with tools that do not use this spec.
How to install a package in editable mode?
Originally "editable install" was a setuptools-specific feature and as such it was not supported by PEP 517. Later on PEP 660 extended this concept to packages using pyproject.toml.
There are two possible routes for installing a package in editable mode using pip:

Modern:
Both the frontend (pip) and the backend must support PEP 660; pip supports it since version 21.3.

Legacy:
The packaging tool must provide a setup.py file that supports the develop command. Since version 21.1, pip can also install packages in editable mode using only a setup.cfg file.
The following table describes the support of editable installs by various backends:

Backend         Editable install support
-----------     ------------------------
enscons         0.28.0+
flit_core       3.4+
hatchling       0.3+
pdm-pep517      0.8.0+
poetry-core     1.0.8+
setuptools      64.0.0+
Answering this part only, as the rest has nicely been explained by others:
How would one install a project with pyproject.toml in an editable state?
Solution
Since the release of poetry-core v1.0.8 in Feb 2022 you can do this:
a) you need this entry in your pyproject.toml:
[build-system]
requires = ["poetry-core>=1.0.8"]
build-backend = "poetry.core.masonry.api"
b) run:
pip install -e .
Sources
https://github.com/python-poetry/poetry/issues/34#issuecomment-1054626460
pyproject.toml can declare the files in your Python package and all the metadata for it that will show on PyPI.
A tool like flit can process the pyproject.toml file into a package that can be uploaded to PyPI or installed with pip.
Other tools use pyproject.toml for other purposes. For example, pytest stores information about where to find and how to run tests, and instructions to pytest about modifying pythonpath (sys.path) before running the tests. Many IDEs can use this to help developers conveniently run tests.
I want to create a new PyPI package, but it will have special wheels that I will invoke like this:
pip install misoftware[customer1]
Is this possible?
If so, how can I provide patches for [customer1]?
For example, my main release is:
misoftware==1.1 and
misoftware[customer1]
I want:
misoftware[customer1]==1.1.2
This would be 3 wheels in total.
You're describing setuptools 'extras'. This allows you to specify additional dependencies, so for example
misoftware just installs the misoftware package
misoftware[customer1] would install the misoftware package, plus some extra dependencies
The downside is that the dependencies you list in your extras must themselves be hosted on PyPI as packages. So you'd need to create a misoftware_customer1 package, and so on.
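A sketch of that setup, with the hypothetical companion package misoftware_customer1 pinned so that patches for the extra can be released independently of the main package:

```python
from setuptools import setup, find_packages

setup(
    name="misoftware",
    version="1.1",
    packages=find_packages(),
    extras_require={
        # `pip install misoftware[customer1]` additionally installs the
        # separately published (and separately patchable) companion package
        "customer1": ["misoftware_customer1>=1.1.2"],
    },
)
```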
I have a program that uses dateutil from the package index. I would like setup.py to check for its presence and try to get it using easy_install if it is not there.
The documentation for distutils seems to indicate that this can be done using the requires keyword in setup(), but when I try, it installs on a system without dateutil without giving a warning or installing the required package.
The only thing I could find on google was this blog post about the same issue which did not have any answer either.
Am I using distutils wrong? Do I need to subclass distutils.command.install and do the checking/installing myself?
Automatic downloading of dependencies is a feature introduced by setuptools which is a third-party add-on to distutils, in particular, the install_requires argument it adds. See the setuptools documentation for more information.
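A minimal sketch of the setuptools counterpart of the distutils requires keyword (the project name and version are made up; python-dateutil is the name dateutil is published under on PyPI):

```python
from setuptools import setup

setup(
    name="myprogram",
    version="0.1.0",
    # Unlike the distutils `requires` keyword, install_requires is actually
    # resolved and installed by the installer (pip) at install time.
    install_requires=["python-dateutil"],
)
```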
Another option is to use requirements.txt file with pip rather than using easy_install as a package installer. pip has now become the recommended installer; see the Python Packaging User Guide for more information.
Update [2015-01]: The previous version of this answer referred to the distribute fork of setuptools. The distribute fork has since been merged back into a newer active setuptools project. distribute is now dead and should no longer be used. setuptools and pip are now very actively maintained and support Python 3.
The install_requires argument of the setup function works well for me only if I create an sdist distribution, like: python setup.py sdist