I've read a discussion where the suggestion was to have setup.py read requirements.txt, so that the correct dependencies are installed on every deployment without having to maintain both a requirements.txt and the list in setup.py.
However, when I try to do an installation via pip install -e ., I get an error:
Obtaining file:///Users/myuser/Documents/myproject
Processing /home/ktietz/src/ci/alabaster_1611921544520/work
ERROR: Could not install packages due to an OSError: [Errno 2] No such file or directory:
'/System/Volumes/Data/home/ktietz/src/ci/alabaster_1611921544520/work'
It looks like pip is trying to look for a package that is available on PyPI (alabaster) on my local machine. Why? What am I missing here? Why isn't pip looking for the required packages on the PyPI server?
I have done it before the other way around, maintaining the setup file and not the requirements file. For the requirements file, just save it as:
*
and for setup, do
from setuptools import setup, find_packages

try:
    from Module.version import __version__
except ModuleNotFoundError:
    exec(open("Module/version.py").read())

setup(
    name="Package Name",
    version=__version__,
    packages=find_packages(),
    package_data={p: ["*"] for p in find_packages()},
    url="",
    license="",
    install_requires=[
        "numpy",
        "pandas",
    ],
    python_requires=">=3.8.0",
    author="First.Last",
    author_email="author@company.com",
    description="Description",
)
For reference, my version.py script looks like:
__build_number__ = "_LOCAL_"
__version__ = f"1.0.{__build_number__}"
Jenkins replaces __build_number__ with a build tag during CI builds.
This question really consists of two separate questions, since the rather philosophical choice of how to arrange setup requirements is unrelated to the installation error that you are experiencing.
First, about the error: it looks like the project you are trying to install depends on another library (alabaster), of which you apparently also did an editable install (pip3 install -e .) that points to this directory:
/home/ktietz/src/ci/alabaster_1611921544520/work
What the error tells you is that this directory, where the editable install is supposed to live, does not exist anymore. You should only install your project itself in editable mode; the dependencies should be installed into a regular site-packages location, i.e. without the -e option.
To clean up, I would suggest that you do the following:
# clean up references to the broken editable install
pip3 uninstall alabaster
# now do a proper non-editable install
pip3 install alabaster
Concerning the question how to arrange setup requirements, you should primarily use the install_requires and extras_require options of setuptools:
# either in setup.py
setuptools.setup(
    install_requires=[
        'dep1>=1.2',
        'dep2>=2.4.1',
    ]
)

# or in setup.cfg
[options]
install_requires =
    dep1>=1.2
    dep2>=2.4.1

[options.extras_require]
extra_deps_a =
    dep3
    dep4>=4.2.3
extra_deps_b =
    dep5>=5.2.1
Optional requirements can be organised in groups. To include such an extra group with the install, you can do pip3 install .[extra_deps_name].
If you wish to define specific dependency environments with exact versions (e.g. for Continuous Integration), you may use requirements.txt files in addition, but the general dependency and version constraint definitions should be done in setup.cfg or setup.py.
Background
I was about to try a Python package downloaded from GitHub, and realized that it did not have a setup.py, so I could not install it with
pip install -e <folder>
Instead, the package had a pyproject.toml file, which seems to contain entries very similar to the ones setup.py usually has.
What I found
Googling led me to PEP 518, which offers some criticism of setup.py in its Rationale section. However, it does not clearly say that setup.py should be avoided, or that pyproject.toml as such completely replaces setup.py.
Questions
Is the pyproject.toml something that is used to replace setup.py? Or should a package come with both, a pyproject.toml and a setup.py?
How would one install a project with pyproject.toml in an editable state?
Yes, pyproject.toml is the file format specified by PEP 518; it contains the build system requirements of a Python project.
This solves the chicken-and-egg problem of build-tool dependencies: pip can read pyproject.toml to learn which version of setuptools or wheel it needs before building the project.
If you need a setup.py for an editable install, you could use a shim in setup.py:
#!/usr/bin/env python
import setuptools

if __name__ == "__main__":
    setuptools.setup()
pyproject.toml is the new unified Python project settings file that replaces setup.py.
Editable installs still need a setup.py: import setuptools; setuptools.setup()
To use pyproject.toml, run python -m pip install .
Then, if the project uses Poetry instead of pip, you can install the dependencies (on Windows, into %USERPROFILE%\AppData\Local\pypoetry\Cache\virtualenvs) like this:
poetry install
And then run development tools like pytest:
poetry run pytest tests/
And pre-commit (which uses .pre-commit-config.yaml):
poetry run pre-commit install
poetry run pre-commit run --all-files
What is it for?
There are currently multiple popular packaging tools in the Python community, and while setuptools still seems to be prevalent, it is no longer a de facto standard. This situation creates a number of hassles for both end users and developers:
For setuptools-based packages, installing from source or building a distribution can fail if setuptools is not installed;
pip does not support installing packages based on other packaging tools from source, so those tools had to generate a setup.py file to produce a compatible package. To build a distribution, one had to install the packaging tool first and then use tool-specific commands;
If a package author decides to change the packaging tool, workflows must change as well to use different tool-specific commands.
pyproject.toml is a new configuration file introduced by PEP 517 and PEP 518 to solve these problems:
... think of the (rough) steps required to produce a built artifact for a project:
The source checkout of the project.
Installation of the build system.
Execute the build system.
This PEP [518] covers step #2. PEP 517 covers step #3 ...
Any tool can also extend this file with its own section (table) to accept tool-specific options, but it's up to them and not required.
PEP 621 suggests using pyproject.toml to specify package core metadata in a static, tool-agnostic way. Which backends currently support this is shown below:
enscons: 0.26.0+
flit_core: 3.2+
hatchling: 0.3+
pdm-pep517: 0.3.0+
poetry-core: not yet (Issue #3332)
setuptools: 61.0.0+
Does it replace setup.py?
For setuptools-based packages pyproject.toml is not strictly meant to replace setup.py, but rather to ensure its correct execution if it's still needed. For other packaging tools – yes, it is:
Where the build-backend key exists, this takes precedence and the source tree follows the format and conventions of the specified backend (as such no setup.py is needed unless the backend requires it). Projects may still wish to include a setup.py for compatibility with tools that do not use this spec.
How to install a package in editable mode?
Originally "editable install" was a setuptools-specific feature and as such it was not supported by PEP 517. Later on PEP 660 extended this concept to packages using pyproject.toml.
There are two possible conditions for installing a package in editable mode using pip:
Modern:
Both the frontend (pip) and the backend must support PEP 660. pip supports it since version 21.3.
Legacy:
The packaging tool must provide a setup.py file that supports the develop command. Since version 21.1, pip can also perform editable installs of packages that have only a setup.cfg file.
The following table describes the support of editable installs by various backends:
enscons: 0.28.0+
flit_core: 3.4+
hatchling: 0.3+
pdm-pep517: 0.8.0+
poetry-core: 1.0.8+
setuptools: 64.0.0+
Answering this part only, as the rest has nicely been explained by others:
How would one install a project with pyproject.toml in an editable state?
Solution
Since the release of poetry-core 1.0.8 in February 2022, you can do this:
a) You need this entry in your pyproject.toml:
[build-system]
requires = ["poetry-core>=1.0.8"]
build-backend = "poetry.core.masonry.api"
b) run:
pip install -e .
Sources
https://github.com/python-poetry/poetry/issues/34#issuecomment-1054626460
pyproject.toml can declare the files in your Python package and all the metadata for it that will show up on PyPI.
A tool like flit can process the pyproject.toml file into a package that can be uploaded to PyPI or installed with pip.
Other tools use pyproject.toml for other purposes. For example, pytest stores information there about where to find and how to run tests, and instructions about modifying pythonpath (sys.path) before running them. Many IDEs can use this to help developers conveniently run tests.
I built a Python application (the "host" app) that defines a setuptools entry point so that it can be extended. Plugin authors then have to add the following to their setup.py file:
setup(
    # ...
    entry_points={
        'myapp.plugins': [
            'plugin_1 = <foo.plugin.module>:<plugin-install-func>',
        ],
    },
)
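(For context, the host side then discovers and activates such plugins with something roughly like the following simplified sketch; the group name matches the snippet above, the rest is illustrative.)

import pkg_resources

def load_plugins(app):
    # Iterate over every installed distribution that advertises the "myapp.plugins"
    # group and call its registered install function with the host app.
    for entry_point in pkg_resources.iter_entry_points("myapp.plugins"):
        plugin_install_func = entry_point.load()
        plugin_install_func(app)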
In order to test my setup, I have to:
build a dummy wheel package,
use pip to install it,
append the new package's folder to sys.path and invoke pkg_resources.working_set.add_entry(package_dir) [*],
only then can I check the expected behaviour (run test cases),
use pip to uninstall the package, and
finally remove the installed package's folder from sys.path.
And a separate package is needed for each test case if different functionality must be validated.
This whole testing rig is rather verbose and clumsy.
Is there a more elegant way to write test cases for setuptools entry-point plugins?
[*] Note: Installing a wheel, or using pip in develop mode with pip install -e <plugin-package>, does not activate the plugin in the same interpreter on Linux; or at least not without afterwards appending the package folder to sys.path.
On Windows, the above problem exists only in develop mode.
I had the same problem and resolved it with a dirty hack: build and pip install the plugin package under test into a tox environment, and run the tests in that environment.
The test script that does the dirty hack, i.e. builds and installs a wheel package and runs the tests: https://github.com/ssato/yamllint-plugin-example/blob/master/tests/test_plugin.sh
The tox configuration: https://github.com/ssato/yamllint-plugin-example/blob/master/tox.ini
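A lighter-weight alternative (a sketch, not the approach from the linked repository) is to fake the entry-point lookup in-process instead of building and pip-installing a wheel for every test case. The group name myapp.plugins comes from the question; the small load_plugins helper stands in for the host app's real discovery code, and in practice the patch has to target the module where the host actually looks up iter_entry_points:

import pkg_resources
from unittest import mock

def fake_install(registry):
    # Stands in for a real <plugin-install-func>.
    registry.append("plugin_1 active")

def load_plugins(registry, group="myapp.plugins"):
    # Stand-in for the host app's discovery loop.
    for ep in pkg_resources.iter_entry_points(group):
        ep.load()(registry)

def test_plugin_is_discovered_and_run():
    ep = pkg_resources.EntryPoint.parse("plugin_1 = tests.fake_plugin:fake_install")
    ep.load = lambda *args, **kwargs: fake_install  # bypass the real import/resolve step
    with mock.patch.object(pkg_resources, "iter_entry_points", return_value=[ep]):
        registry = []
        load_plugins(registry)
        assert registry == ["plugin_1 active"]

This avoids the wheel-build/install/uninstall cycle entirely, at the cost of not exercising the real packaging metadata.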
This is the tree structure of the module I'm writing the setup.py file for:
ls .
LICENSE
README.md
bin
examples
module
scratch
setup.py
tests
tox.ini
I configured my setup.py as follows:
from setuptools import setup, find_packages
setup(
    name="package_name",
    version="0.1",
    packages=find_packages(),
    install_requires=[
        # [...]
    ],
    extras_require={
        # [...]
    },
    tests_require={
        'pytest',
        'doctest'
    },
    scripts=['bin/bootstrap'],
    data_files=[
        ('license', ['LICENSE']),
    ],
    # [...]
    # could also include long_description, download_url, classifiers, etc.
)
If I install the package from my python environment (also a virtualenv)
pip install .
the LICENSE file gets correctly installed.
But running tox:
[tox]
envlist = py27, py35

[testenv]
deps =
    pytest
    git+https://github.com/djc/couchdb-python
    docopt
commands = py.test \
    {posargs}
I get this error:
running install_data
creating build/bdist.macosx-10.11-x86_64/wheel/leafline-0.1.data
creating build/bdist.macosx-10.11-x86_64/wheel/leafline-0.1.data/data
creating build/bdist.macosx-10.11-x86_64/wheel/leafline-0.1.data/data/license
error: can't copy 'LICENSE': doesn't exist or not a regular file
Removing the data_files part from setup.py makes tox run correctly.
Your issue here is that setuptools cannot find the LICENSE file among the files included when building the source distribution. You have two options to tell setuptools to include that file (both have been pointed out here):
Add a MANIFEST.in file containing a line such as include LICENSE (like https://github.com/pypa/sampleproject/)
Use include_package_data=True in your setup.py file.
Using MANIFEST.in is often simpler and easier to verify thanks to https://pypi.org/project/check-manifest/, which makes it possible to automatically verify that things are indeed correct (if you use a VCS like Git or SVN).
pip install . builds a wheel using python setup.py bdist_wheel, which is installed by simply unpacking it appropriately, as defined in the Wheel specification: https://www.python.org/dev/peps/pep-0427/
tox builds a source distribution using python setup.py sdist, which is then unpacked and installed using python setup.py install.
That might be the reason for the difference in behavior you are seeing.
I have some resource files inside my packages which I use during execution. To make setup ship them in the package alongside the Python code, I use include_package_data=True, and I access them using importlib.resources. For Python versions older than 3.7 you can use the backport or another library.
Before each release I have a script which verifies that all the files I need are included in the built wheel, to be sure that everything is in place.
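For illustration, reading such a bundled file might look roughly like the sketch below; mypackage and config.json are placeholder names. The files() API shown here was added in Python 3.9; on older versions the importlib_resources backport (or importlib.resources.read_text() in 3.7+) provides the equivalent.

from importlib import resources

def load_config_text():
    # Locate config.json inside the installed mypackage and read it as text.
    return resources.files("mypackage").joinpath("config.json").read_text(encoding="utf-8")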
What is setup.py and how can it be configured or used?
setup.py is a Python file whose presence indicates that the module/package you are about to install has likely been packaged and distributed with Distutils, which is the standard for distributing Python modules.
This allows you to easily install Python packages. Often it's enough to write:
$ pip install .
pip will use setup.py to install your module. Avoid calling setup.py directly.
https://docs.python.org/3/installing/index.html#installing-index
It helps you install a Python package foo on your machine (possibly inside a virtualenv) so that you can import the package foo from other projects and also from [I]Python prompts.
It does a similar job to pip, easy_install, etc.
Using setup.py
Let's start with some definitions:
Package - A folder/directory that contains an __init__.py file.
Module - A valid Python file with a .py extension.
Distribution - How one package relates to other packages and modules.
Let's say you want to install a package named foo. Then you do,
$ git clone https://github.com/user/foo
$ cd foo
$ python setup.py install
Instead, if you don't want to actually install it but would still like to use it, do:
$ python setup.py develop
This command creates a link to the source directory within site-packages (an egg-link and a path entry, rather than a copy of the files). Because of this, it is quite fast, particularly for large packages.
Creating setup.py
If you have your package tree like,
foo
├── foo
│ ├── data_struct.py
│ ├── __init__.py
│ └── internals.py
├── README
├── requirements.txt
└── setup.py
Then, you do the following in your setup.py script so that it can be installed on some machine:
from setuptools import setup

setup(
    name='foo',
    version='1.0',
    description='A useful module',
    author='Man Foo',
    author_email='foomail@foo.example',
    packages=['foo'],  # same as name
    install_requires=['wheel', 'bar', 'greek'],  # external packages as dependencies
)
Instead, if your package tree is more complex like the one below:
foo
├── foo
│ ├── data_struct.py
│ ├── __init__.py
│ └── internals.py
├── README
├── requirements.txt
├── scripts
│ ├── cool
│ └── skype
└── setup.py
Then, your setup.py in this case would be like:
from setuptools import setup

setup(
    name='foo',
    version='1.0',
    description='A useful module',
    author='Man Foo',
    author_email='foomail@foo.example',
    packages=['foo'],  # same as name
    install_requires=['wheel', 'bar', 'greek'],  # external packages as dependencies
    scripts=[
        'scripts/cool',
        'scripts/skype',
    ],
)
Add more stuff to setup.py and make it decent:
from setuptools import setup

with open("README", 'r') as f:
    long_description = f.read()

setup(
    name='foo',
    version='1.0',
    description='A useful module',
    license="MIT",
    long_description=long_description,
    author='Man Foo',
    author_email='foomail@foo.example',
    url="http://www.foopackage.example/",
    packages=['foo'],  # same as name
    install_requires=['wheel', 'bar', 'greek'],  # external packages as dependencies
    scripts=[
        'scripts/cool',
        'scripts/skype',
    ],
)
The long_description is used on pypi.org as the README description of your package (if your README is written in Markdown, also pass long_description_content_type="text/markdown").
And finally, you're now ready to upload your package to PyPI so that others can install it using pip install yourpackage.
At this point there are two options:
Publish on the test.pypi.org server first to familiarize yourself with the procedure, and then publish on the permanent pypi.org server for the public to use your package.
Publish straight away on the permanent pypi.org server, if you are already familiar with the procedure and have your user credentials (username, password) and a package name.
Once your package name is registered on pypi.org, nobody else can claim or use it. The Python packaging guide recommends the twine package for uploading your package to PyPI. Thus,
the first step is to locally build the distributions using:
# prereq: wheel (pip install wheel)
$ python setup.py sdist bdist_wheel
then use twine to upload it either to test.pypi.org or pypi.org:
$ twine upload --repository testpypi dist/*
username: ***
password: ***
It will take a few minutes for the package to appear on test.pypi.org. Once you're satisfied with it, you can upload your package to the real and permanent index of pypi.org simply with:
$ twine upload dist/*
Optionally, you can also sign the files in your package with a GPG key:
$ twine upload dist/* --sign
Bonus Reading:
See a sample setup.py from a real project here: torchvision-setup.py
PEP 517, setuptools
why twine? using twine
setup.py is Python's answer to a multi-platform installer and make file.
If you’re familiar with command line installations, then make && make install translates to python setup.py build && python setup.py install.
Some packages are pure Python, and are only byte compiled. Others may contain native code, which will require a native compiler (like gcc or cl) and a Python interfacing module (like swig or pyrex).
If you downloaded a package that has setup.py in its root folder, you can install it by running
python setup.py install
If you are developing a project and are wondering what this file is useful for, check the Python documentation on writing the setup script.
setup.py is a Python script that is usually shipped with libraries or programs written in that language. Its purpose is the correct installation of the software.
Many packages use the distutils framework in conjunction with setup.py.
http://docs.python.org/distutils/
setup.py can be used in two scenarios: first, when you want to install a Python package; second, when you want to create your own Python package. Usually a standard Python package has a couple of important files, like setup.py, setup.cfg and MANIFEST.in. When you are creating the Python package, these three files determine (via the content in PKG-INFO under the egg-info folder) the name, version, description, other required installations (usually listed in a .txt file) and a few other parameters. setup.cfg is read by setup.py while the package is being created (it could be a tar.gz). MANIFEST.in is where you define what should be included in your package. Anyway, you can do a bunch of things using setup.py, like
python setup.py build
python setup.py install
python setup.py sdist <distname> upload [-r urltorepo] (to upload package to pypi or local repo)
There are a bunch of other commands which can be used with setup.py. For help:
python setup.py --help-commands
setup.py is a Python file like any other. It can take any name, except that by convention it is named setup.py so that there is not a different procedure for each script.
Most frequently setup.py is used to install a Python module, but it serves other purposes as well:
Modules:
Perhaps the most famous usage of setup.py is installing modules. Although modules can be installed using pip, old Python versions did not include pip by default, and it had to be installed separately.
If you wanted to install a module but did not want to install pip, just about the only alternative was to install the module from its setup.py file. This could be achieved via python setup.py install, which would install the Python module into the system's site-packages directory (without pip, easy_install, etc.).
This method is often used when pip would fail, for example if the correct Python version of the desired package is not available via pip, perhaps because it is no longer maintained. Downloading the source and running python setup.py install performs the same thing, except when compiled binaries are required (and it will disregard the Python version, unless an error is returned).
Another use of setup.py is installing a package from source. If a module is still under development, wheel files will not be available and the only way to install it is from source directly.
Building Python extensions:
When a module has been written, it can be converted into a module ready for distribution using a distutils setup script. Once built, these can be installed using the command above.
A setup script is easy to write, and once the file has been properly configured, the extension can be compiled by running python setup.py build (see the linked documentation for all commands).
Once again it is named setup.py for ease of use and by convention, but can take any name.
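As a rough sketch of what such a setup script for a native extension can look like (shown here with setuptools, which provides the same Extension interface as distutils; the module and file names are placeholders):

from setuptools import setup, Extension

setup(
    name="spam",
    version="0.1",
    # Compile spammodule.c into an importable extension module named "spam".
    ext_modules=[Extension("spam", sources=["spammodule.c"])],
)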
Cython:
Another famous use of setup.py files is compiling Cython extensions. These require a setup script with user-defined values. They allow fast execution (but, once compiled, are platform dependent). Here is a simple example from the documentation:
from distutils.core import setup
from Cython.Build import cythonize

setup(
    name='Hello world app',
    ext_modules=cythonize("hello.pyx"),
)
This can be compiled via python setup.py build.
cx_Freeze:
Another module requiring a setup script is cx_Freeze, which converts a Python script into executables. It allows many options such as descriptions, names, icons, packages to include or exclude, etc., and once run will produce a distributable application. An example from the documentation:
import sys
from cx_Freeze import setup, Executable

build_exe_options = {"packages": ["os"], "excludes": ["tkinter"]}

base = None
if sys.platform == "win32":
    base = "Win32GUI"

setup(
    name="guifoo",
    version="0.1",
    description="My GUI application!",
    options={"build_exe": build_exe_options},
    executables=[Executable("guifoo.py", base=base)],
)
This can be compiled via python setup.py build.
So what is a setup.py file?
Quite simply, it is a script that builds or configures something in the Python environment.
A package, when distributed, should contain only one setup script, but it is not uncommon to combine several tasks into a single setup script. Note that this often involves distutils, but not always (as shown in the last example). The thing to remember is that it just configures a Python package/script in some way.
It takes the same name so that the same command can always be used when building or installing.
When you download a package with a setup.py, open your Terminal (macOS, Linux) or Command Prompt (Windows). Using cd (and the Tab key to help), navigate to the folder where you downloaded the package and where the setup.py is located:
iMac:~ user $ cd path/pakagefolderwithsetupfile/
Press Enter; you should see something like this:
iMac:pakagefolderwithsetupfile user$
Then type python setup.py install:
iMac:pakagefolderwithsetupfile user$ python setup.py install
Press enter. Done!
To put it simply, setup.py is run as "__main__" when you call the install functions the other answers mentioned. Inside setup.py, you should put everything needed to install your package.
Common setup.py functions
The following two sections discuss two things many setup.py modules have.
setuptools.setup
This function allows you to specify project attributes like the name of the project, the version, and so on. Most importantly, it allows you to declare and install other packages (dependencies) if they are specified properly. See this webpage for an example of setuptools.setup.
These attributes of setuptools.setup enable installing these types of packages:
Packages that are imported into your project and listed on PyPI, using setuptools.find_packages:
packages=find_packages(exclude=["docs","tests", ".gitignore", "README.rst","DESCRIPTION.rst"])
Packages not on PyPI, but which can be downloaded from a URL, using dependency_links:
dependency_links=["http://peak.telecommunity.com/snapshots/",]
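Put together, those fragments might sit in a call roughly like the sketch below (the names are placeholders; note that modern pip ignores dependency_links, so a PEP 508 direct URL in install_requires is the current way to point at a package that is not on PyPI):

from setuptools import setup, find_packages

setup(
    name="myproject",
    version="0.1.0",
    packages=find_packages(exclude=["docs", "tests"]),
    install_requires=[
        "requests>=2.0",  # a normal PyPI dependency
        # a package not on PyPI, as a PEP 508 direct URL (hypothetical):
        # "somepkg @ https://example.com/somepkg-1.0.tar.gz",
    ],
)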
Custom functions
In an ideal world, setuptools.setup would handle everything for you. Unfortunately this isn't always the case. Sometimes you have to do specific things, like installing dependencies with the subprocess command, to get the system you're installing on into the right state for your package. Try to avoid this; these functions get confusing and often differ between operating systems and even distributions.
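If you really do need such a step, one common pattern is to hook a custom command class into setup() rather than running arbitrary code at module level. A minimal sketch (the post-install subprocess call is purely illustrative):

import subprocess
from setuptools import setup
from setuptools.command.install import install

class CustomInstall(install):
    def run(self):
        install.run(self)  # perform the normal install first
        # Illustrative extra step; replace with whatever the package actually needs.
        subprocess.check_call(["python", "--version"])

setup(
    name="example",
    version="0.1",
    cmdclass={"install": CustomInstall},
)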
To install a Python package you've downloaded, you extract the archive and run the setup.py script inside:
python setup.py install
To me, this has always felt odd. It would be more natural to point a package manager at the download, as one would do in Ruby or Node.js, e.g. gem install rails-4.1.1.gem.
A package manager is more comfortable too, because it's familiar and reliable. On the other hand, each setup.py is novel, because it's specific to the package. It demands faith in convention: "I trust this setup.py takes the same commands as others I have used in the past." That's a regrettable tax on mental willpower.
I'm not saying the setup.py workflow is less secure than a package manager (I understand pip just runs the setup.py inside), but it certainly feels awkward and jarring. There's a harmony to all commands going to the same package manager application. You might even grow fond of it.