I want to create a new PyPI package, but it will have a special wheel that I will install like this:
pip install misoftware[customer1]
Is this possible?
If so, how can I provide patches for [customer1]?
For example, my main release is:
misoftware==1.1 and
misoftware[customer1]
and I want
misoftware[customer1]==1.1.2
That would be three wheels in total.
You're describing setuptools 'extras'. These allow you to specify additional dependencies, so for example:
misoftware just installs the misoftware package
misoftware[customer1] would install the misoftware package, plus some extra dependencies
The downside is that the dependencies you list in your extras must themselves be hosted on PyPI as packages as well. So you'd need to create a misoftware_customer1 package, and so on.
Related
Using setuptools, is it possible to list another editable package as a dependency for an editable package?
I'm trying to develop a collection of packages to use across different production services. One of these packages (my_pkg_1) depends on a subset of my package collection (my_pkg_2, my_pkg_x, ...). So far, I've managed to put together this pyproject.toml:
[build-system]
requires = ["setuptools"]
build-backend = "setuptools.build_meta"

[project]
name = "my_pkg_1"
version = "0.0.1"
dependencies = [
    "my_pkg_2 @ file:///somewhere/in/mysystem/my_pkg_2"
]
which does work for installing my_pkg_1 in editable mode, and it does install my_pkg_2, but not in editable mode. This is what I see when I run pip list:
Package   Version  Editable project location
--------  -------  -------------------------
my_pkg_2  0.0.1
my_pkg_1  0.0.1    /somewhere/in/mysystem/my_pkg_1
Is what I'm trying to do even possible? If so, how?
You may install my_pkg_2 explicitly in editable mode before installing my_pkg_1:
pip install --editable /somewhere/in/mysystem/my_pkg_2
Unfortunately, it is not possible to install dependencies (and dependencies of dependencies) automatically in editable mode by installing the main package. I am curious why this is not implemented.
Alternatively, you may add the package paths to the PYTHONPATH environment variable before running code from your main package. That way, you are able to import Python modules from your other packages without having to install them.
This cannot be done in pyproject.toml, at least not the way you want it and not in a standard way.
If I were you, I would write myself a requirements.txt file (you could give it a different name, obviously):
# install the current project as editable
--editable .
# install `my_pkg_2` as editable
--editable /somewhere/in/mysystem/my_pkg_2
And you could use it like so:
path/to/venv/bin/python -m pip install --requirement 'path/to/requirements.txt'
for when you want to work on (edit) both pieces of software at the same time in the same environment.
Alternatively you could use a "development workflow tool" (such as PDM, Hatch, Poetry, and so on), and maybe one of those would be a better fit for your expectations.
I created a package which has 2 modes:
a basic mode with basic functionality only
an extended mode which adds additional modules and needs extra requirements.
For example:
MyPackageName
    core
    cyber_analyzer
    parsing
Where "parsing" is the extension and needs "pandas" as requirement.
Then, I want my package to have 2 modes:
pip install mypackage
pip install mypackage[parsing]
I found out I can use extras_require to install pandas. Yet installing the .whl file installs all three modules: core, cyber_analyzer, and parsing. I would like "parsing" to be installed only if the extra flag "parsing" was specified.
Is it possible to do so? How can I achieve it? Or should I always ship "parsing" and let users simply not use it?
There is a locally built package (e.g. main-0.1.tar.gz). There is another package (for example base-0.1) that requires main-0.1 as a dependency.
When the base-0.1 package is later installed, the main-0.1 package must be installed along with it.
That is, install_requires only lets you specify packages available on PyPI; it is not clear how to add a local package to the build.
You can add the main-0.1.tar.gz package to the base-0.1 archive using MANIFEST.in (include main-0.1.tar.gz). But beyond that, dependency_links, for example, does not work correctly.
How do I add a local package to the build of another package and then install it along with that package, as if it were pulled from PyPI?
You might want to look at:
PEP 440 ("File URLs")
PEP 508
import setuptools

setuptools.setup(
    # [...]
    install_requires=[
        'main @ file:///path/to/main-0.1.tar.gz',
        # [...]
    ],
)
Alternatively (probably better actually), use some combination of pip install options:
pip install --no-index --find-links '/path/to/distributions' main base
Reference:
https://pip.pypa.io/en/stable/user_guide/#installing-from-local-packages
I found a rough solution. I don't know how proper it is, but it works.
Add include main-0.1.tar.gz to MANIFEST.in
In setup.py, at the end of the file (after the call to setup()), add:
import os
import sys

if 'sdist' not in sys.argv[1]:
    os.system('pip install main-0.1.tar.gz')
The condition may need to be different if, for example, sdist is not used for building (python setup.py sdist). The main thing is to somehow detect that setup.py is being run to build the package, not to install it (via a later pip install base-0.1.tar.gz).
This way, the local dependency is copied into the archive of the package being built, so it is distributed along with it and installed the same way.
Background
I was about to try a Python package downloaded from GitHub and realized that it did not have a setup.py, so I could not install it with
pip install -e <folder>
Instead, the package had a pyproject.toml file, which seems to have entries very similar to those setup.py usually has.
What I found
Googling led me to PEP 518, which gives some critique of setup.py in its Rationale section. However, it does not clearly say that setup.py should be avoided, or that pyproject.toml as such completely replaces setup.py.
Questions
Is the pyproject.toml something that is used to replace setup.py? Or should a package come with both, a pyproject.toml and a setup.py?
How would one install a project with pyproject.toml in an editable state?
Yes, pyproject.toml is the file format specified by PEP 518; it contains the build-system requirements of a Python project.
This solves the chicken-and-egg problem with build-tool dependencies: pip can read pyproject.toml to learn what version of setuptools or wheel a project needs before building it.
If you need a setup.py for an editable install, you could use a shim in setup.py:
#!/usr/bin/env python
import setuptools

if __name__ == "__main__":
    setuptools.setup()
pyproject.toml is the new unified Python project settings file that replaces setup.py.
Editable installs still need a setup.py: import setuptools; setuptools.setup()
To use pyproject.toml, run python -m pip install .
Then, if the project is using poetry instead of pip, you can install dependencies (into %USERPROFILE%\AppData\Local\pypoetry\Cache\virtualenvs) like this:
poetry install
And then run dependencies like pytest:
poetry run pytest tests/
And pre-commit (uses .pre-commit-config.yaml):
poetry run pre-commit install
poetry run pre-commit run --all-files
What is it for?
Currently, multiple packaging tools are popular in the Python community, and while setuptools still seems to be prevalent, it is no longer a de facto standard. This situation creates a number of hassles for both end users and developers:
For setuptools-based packages, installing from source or building a distribution can fail if one doesn't have setuptools installed;
pip doesn't support installing packages based on other packaging tools from source, so those tools had to generate a setup.py file to produce a compatible package. To build a distribution, one has to install the packaging tool first and then use tool-specific commands;
If a package author decides to change the packaging tool, workflows must change as well to use different tool-specific commands.
pyproject.toml is a new configuration file introduced by PEP 517 and PEP 518 to solve these problems:
... think of the (rough) steps required to produce a built artifact for a project:
The source checkout of the project.
Installation of the build system.
Execute the build system.
This PEP [518] covers step #2. PEP 517 covers step #3 ...
Any tool can also extend this file with its own section (table) to accept tool-specific options, but that is up to the individual tool and not required.
PEP 621 suggests using pyproject.toml to specify package core metadata in a static, tool-agnostic way. Which backends currently support this is shown in the following table:

Backend      PEP 621 support
-----------  ---------------
enscons      0.26.0+
flit_core    3.2+
hatchling    0.3+
pdm-pep517   0.3.0+
poetry-core  Issue #3332
setuptools   61.0.0+
Does it replace setup.py?
For setuptools-based packages, pyproject.toml is not strictly meant to replace setup.py, but rather to ensure its correct execution where it is still needed. For other packaging tools, yes, it is:
Where the build-backend key exists, this takes precedence and the source tree follows the format and conventions of the specified backend (as such no setup.py is needed unless the backend requires it). Projects may still wish to include a setup.py for compatibility with tools that do not use this spec.
How to install a package in editable mode?
Originally "editable install" was a setuptools-specific feature and as such it was not supported by PEP 517. Later on PEP 660 extended this concept to packages using pyproject.toml.
There are two possible conditions for installing a package in editable mode using pip:
Modern:
Both the frontend (pip) and a backend must support PEP 660.
pip supports it since version 21.3;
Legacy:
Packaging tool must provide a setup.py file which supports the develop command.
Since version 21.1, pip can also install packages in editable mode using only a setup.cfg file.
The following table describes the support of editable installs by various backends:

Backend      PEP 660 support
-----------  ---------------
enscons      0.28.0+
flit_core    3.4+
hatchling    0.3+
pdm-pep517   0.8.0+
poetry-core  1.0.8+
setuptools   64.0.0+
Answering this part only, as the rest has nicely been explained by others:
How would one install a project with pyproject.toml in an editable state?
Solution
Since the release of poetry-core v1.0.8 in Feb 2022 you can do this:
a) you need this entry in your pyproject.toml:
[build-system]
requires = ["poetry-core>=1.0.8"]
build-backend = "poetry.core.masonry.api"
b) run:
pip install -e .
Sources
https://github.com/python-poetry/poetry/issues/34#issuecomment-1054626460
pyproject.toml can declare the files in your Python package and all the metadata for it that will show on PyPI.
A tool like flit can process the pyproject.toml file into a package that can be uploaded to PyPI or installed with pip.
Other tools use pyproject.toml for other purposes. For example, pytest stores information there about where to find and how to run tests, along with instructions for modifying pythonpath (sys.path) before running the tests. Many IDEs can use this to help developers conveniently run tests.
I have developed a Python package on GitHub that I released on PyPI. It installs with pip install PACKAGENAME, but does not do anything with the dependencies that are stated in the install_requires of the setup.py file.
Weirdly enough, the zip file of the associated release does install all dependencies. I tried with different virtual environments and on different computers, but it never installs the dependencies. Any help appreciated.
pip install pythutils downloads a wheel if it's available — and it's available for your package.
When generating a wheel, setuptools runs python setup.py locally but doesn't include setup.py in the wheel. Download your wheel file and unzip it (it's just a zip archive): there is your main package directory pythutils and a metadata directory pythutils-1.1.1.dist-info. In the metadata directory there is a file METADATA that usually lists static dependencies, but your file doesn't list any, because when you were generating the wheel all your dependencies had already been installed, so all your dynamic code paths were skipped.
The archive that you downloaded from the GitHub release installs dependencies because it's not a wheel, so pip runs python setup.py install and your dynamic dependencies work.
What can you do? My advice is to avoid dynamic dependencies. Declare static dependencies and let pip decide what versions to install:
install_requires=[
    'numpy==1.16.5; python_version>="2" and python_version<"3"',
    'numpy; python_version>="3"',
],
Another approach would be to create version-specific wheel files — one for Python 2 and another for Python 3 — with fixed dependencies.
Yet another approach is to not publish wheels at all and only publish an sdist (source distribution). Then pip is forced to run python setup.py install on the target machine. That's not the best approach, and it certainly will be problematic for packages with C extensions (the user must have a compiler and developer tools to install from source).
Your setup.py does a series of checks like
import sys

install_requires = []

try:
    import numpy
except ImportError:
    if sys.version_info[0] == 2:
        install_requires.append('numpy==1.16.5')
    if sys.version_info[0] == 3:
        install_requires.append("numpy")
Presumably the system where you built the wheel had all the required modules already installed, and so you ended up with an empty install_requires list. But this is the wrong way to do it anyway; you should simply make a static list (or two static lists, one each for Python 2 and Python 3, if you really want to support both in the same package).