I'm setting up the mozmill-automation package by running:
python setup.py develop
The setup.py file has a dependency definition:
deps = [...,
    'mozmill == 2.1-dev',
    'mozversion >= 0.7'
]
...
setup(..., install_requires=deps, ...)
It ends up downloading mozversion 1.1, which is the latest version available on PyPI. Is setuptools guaranteed to download the latest available dependency versions?
It appears so, per https://pythonhosted.org/setuptools/easy_install.html (search for "latest available version" on that page). But I'm not sure whether setuptools uses easy_install to download dependencies.
Note: I'm not using virtualenv
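For what it's worth, an upper bound in the version specifier would keep setuptools from picking up 1.1; the < 1.0 cap below is my own addition, not from the original setup.py:
deps = [...,
    'mozmill == 2.1-dev',
    'mozversion >= 0.7, < 1.0'
]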
Related
I wanted to write a new Python package and make it available via PyPI.
In the past I always used setup.py, but this time I decided to embrace the newer
best practice of using setup.cfg
instead. So I started reading a bit of the documentation, mainly
https://setuptools.pypa.io/en/latest/userguide/declarative_config.html.
I also decided to use pyscaffold to generate the files.
pyscaffold generated a setup.cfg file for me, and (just for testing purposes)
I added version = 5.1.1 under the metadata section, as described in the
documentation above.
For convenience I'm using anaconda and created a new empty environment:
$ python --version
Python 3.9.7
$ pip list
...
PyScaffold 4.1.1
setuptools 58.4.0
setuptools-scm 6.3.2
wheel 0.37.0
But when I executed pip install -e ., the version was ignored and pip assigned a
different one:
$ pip install -e .
Obtaining file:///tmp/test_package
Installing build dependencies ... done
Checking if build backend supports build_editable ... done
Getting requirements to build wheel ... done
Preparing metadata (pyproject.toml) ... done
Installing collected packages: test-package
Attempting uninstall: test-package
Found existing installation: test-package 0.0.post1.dev10+g3ed39c8.d20211106
Uninstalling test-package-0.0.post1.dev10+g3ed39c8.d20211106:
Successfully uninstalled test-package-0.0.post1.dev10+g3ed39c8.d20211106
Running setup.py develop for test-package
Successfully installed test-package-0.0.post1.dev1+g228b46c
https://stackoverflow.com/a/27077610/1480131 mentions that setuptools version
30.3.0 added support for putting metadata in setup.cfg; I'm using 58.4.0, a much
more recent version.
I also tested different formats like version = file:VERSION.txt, but that
didn't help either; pip simply ignores the version entry.
What am I missing? What am I doing wrong?
I prepared a small git repo with the files in order to be able to reproduce the
error: https://github.com/shaoran/test_package Does anybody get a different
result?
That's because pyscaffold generated a project that uses setuptools-scm for version detection. When setuptools-scm is used, the version is not read from version metadata, but parsed from the repository tags. The version 0.0.post1.dev10+g3ed39c8.d20211106 can be read as follows:
0.0.post1 - a dummy version, since you don't have any tags in the repo yet;
dev10 - you have ten commits that are not included in any version tag (basically the number of commits you made since the last tag);
g3ed39c8 - the short hash of the commit you installed from is 3ed39c8 (the g prefix means it is a Git repo);
d20211106 - d means you installed from a dirty state (some files versioned by Git were modified); the rest is just the date stamp.
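If you want to see which version string setuptools-scm derives before installing, you can ask it directly; this one-liner (assuming setuptools-scm is importable in the environment) should print the same 0.0.post1.dev10+... string:
$ python -c "from setuptools_scm import get_version; print(get_version())"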
If you want to use the version metadata from setup.cfg instead, just drop setuptools-scm activation in setup.py:
from setuptools import setup

if __name__ == "__main__":
    try:
        setup()  # no use_scm_version argument, so setuptools-scm stays inactive
    except:  # noqa -- error handling elided here
        ...
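With that change, the static metadata from setup.cfg takes effect. For reference, the relevant section looks like this (the name is a placeholder based on the repo linked above):
[metadata]
name = test_package
version = 5.1.1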
You can then also clean up pyproject.toml, although that isn't strictly required:
[build-system]
requires = ["setuptools>=46.1.0", "wheel"]
build-backend = "setuptools.build_meta"
If you want to keep using setuptools-scm (which IMO is a great tool to prevent you from accidentally releasing dists with the same version but different contents), then just add a new tag to start versioning from:
$ git tag -a 5.1.1 -m 'start versioning'
If the repo state is clean (no modified files), pip install -e . will install version 5.1.1 right away; otherwise you will get 5.1.1 with a suffix.
The neat part of using setuptools-scm is that the version metadata in setup.cfg is ignored, so you don't have to bump it yourself and can safely remove the version = 5.1.1 line.
According to my research, the following should work:
from setuptools import setup
from setuptools import find_packages
...
REQUIRES_INSTALL = [
    'spacy==2.3.2',
    'tensorflow==1.14.0',
    'Keras==2.2.4',
    'keras-contrib@git+https://github.com/keras-team/keras-contrib.git#egg=keras-contrib',
    'en-core-web-sm@https://github.com/explosion/spacy-models/releases/download/en_core_web_sm-2.3.0/en_core_web_sm-2.3.0.tar.gz#egg=en-core-web-sm'
]
...
setup(
    name=NAME,
    version=VERSION,
    description=DESCRIPTION,
    install_requires=REQUIRES_INSTALL,
    ...
)
When building a wheel or egg, everything is fine: python setup.py bdist_wheel.
But when trying to install the package (whl or egg) with pip install -U dist/mypack-....whl, I get:
ERROR: Could not find a version that satisfies the requirement keras-contrib (from mypack==0.3.5) (from versions: none)
ERROR: No matching distribution found for keras-contrib (from mypack==0.3.5)
...
ERROR: Could not find a version that satisfies the requirement en-core-web-sm (from mypack==0.3.5) (from versions: none)
ERROR: No matching distribution found for en-core-web-sm (from mypack==0.3.5)
I have tried the same via setup.cfg, but still no luck.
For reference, all of these dependencies work when installing them first from requirements.txt and then installing the wheel:
spacy==2.3.2
tensorflow==1.14.0
Keras==2.2.4
keras-contrib@git+https://github.com/keras-team/keras-contrib.git#egg=keras-contrib
en-core-web-sm@https://github.com/explosion/spacy-models/releases/download/en_core_web_sm-2.3.0/en_core_web_sm-2.3.0.tar.gz#egg=en-core-web-sm
pip install -r requirements.txt
pip install -U dist/mypack-....whl
But this is not a clean way, since a wheel should be self-contained.
Thank you for any hint!
Environment
Python: 3.7.0
Pip: 20.2.4
setuptools: 50.3.2
Some time ago it was possible to define a single requirements.txt or similar containing both specs for PyPI packages and links to repositories and archives.
That required parsing the requirements.txt and splitting it into "requirements" and "dependencies", where "requirements" would contain definitions of PyPI packages and "dependencies" the links.
Setuptools has different setup() args for these: install_requires and dependency_links.
And it really worked: one could define a requirements.txt and install a package both with python setup.py install and with pip install .. Moreover, it was possible to install just the dependencies via pip install -r requirements.txt. All ways worked and allowed a single place to define all requirements, including non-PyPI links.
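A minimal sketch of that old pattern, with a crude split heuristic and hypothetical package metadata of my own choosing:
from setuptools import setup

requirements, links = [], []
with open('requirements.txt') as f:
    for line in (raw.strip() for raw in f):
        if not line or line.startswith('#'):
            continue
        # anything with a URL scheme is treated as a link
        (links if '://' in line else requirements).append(line)

setup(
    name='mypack',  # hypothetical
    version='0.3.5',
    install_requires=requirements,
    dependency_links=links,
)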
However, support for the dependency_links arg was dropped by pip as of v19. And here's the weird part: it was not dropped by setuptools. But there's more.
As of today, pip:
Supports only install_requires.
Prefers PEP 508 notation for dependencies in definitions of packages (install_requires) and in standalone requirements.txt or similar.
Aborts installation of packages that contain links in their install_requires.
Your definitions of dependencies mix 2 notations: prefixes like keras-contrib@ are from PEP 508, and the #egg= parts are from the setuptools links notation.
This is not an issue: pip will ignore the "eggs", as the names are already defined before the @.
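For reference, a pure PEP 508 direct reference that pip accepts needs no #egg= fragment at all, since the name already stands before the @ (URLs taken from your list):
install_requires=[
    'keras-contrib @ git+https://github.com/keras-team/keras-contrib.git',
    'en-core-web-sm @ https://github.com/explosion/spacy-models/releases/download/en_core_web_sm-2.3.0/en_core_web_sm-2.3.0.tar.gz',
]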
I believe the installation of the package via pip works fine, i.e.:
pip install .
However, issues will arise if the package is installed via setuptools, i.e.:
python setup.py install
setuptools does not understand PEP 508 notation and ignores links in install_requires. As of today, to make setuptools follow links, both install_requires and dependency_links have to be used, e.g.:
setup(
    ...
    install_requires=[
        ...
        "keras_contrib==2.0.8",
        ...
    ],
    dependency_links=[
        "https://github.com/keras-team/keras-contrib/tarball/master#egg=keras_contrib-2.0.8",
        ...
    ],
)
Here are several tricky points:
A single dependency is defined in 2 places: a package name in install_requires and a link in dependency_links to resolve the package dependency.
The link is not git+https://.../....git, but it's a link to an archive: https://.../tarball/....
The egg name is in snake_case, not in dash-case. While it's possible to use dash-case, that will not allow specifying the version.
The version in install_requires is delimited via ==, in dependency_links via -.
It's possible to omit the version, but the only viable use case for that is if the package is not present on PyPI and is rarely updated. If the package is present on PyPI but an unpublished version is needed, then the version must be specified.
And here's the bummer: fixing the links for setuptools will break pip, as PEP 508 does not allow specifying versions. Keeping keras-contrib==x.y.z @ ... in install_requires will make pip search for the package keras-contrib==x.y.z, where ==x.y.z is not a version but part of the name. At the same time, not specifying a version will make setuptools grab the latest version available on PyPI, not the one at the link from dependency_links.
In your case neither keras-contrib nor en-core-web-sm is present on PyPI, so using keras_contrib@git+https://... together with dependency_links without a version specified might work.
Otherwise, stick to pip install . and avoid using python setup.py install if the package depends on links.
See also:
PEP 508
PEP508: why either version requirement or URL but not both?
How can I make setuptools install a package that's not on PyPI?
pip install dependency links
pip3 setup.py install_requires PEP 508 git URL for private repo
Why is dependency links in setup.py deprecated?
Changing PEP 508 URLs in setup.py doesn't reinstall the dependency
Updating remote links with new URLs for PEP508 functionality
Requirements using PEP 508 direct references ignore the URL
Suggest alternatives for --process-dependency-links
Un-deprecate --process-dependency-links until an alternative is implemented
Changes to the pip dependency resolver in 20.3 (2020)
Trivia: several issues on GitHub are still open, and PEP 508 has been in Active state since 2015. Digging around the source code reveals that setuptools is a wrapper around Python's distutils. setuptools is not part of Python's stdlib, but the distutils docs imply the stdlib docs will be removed once the setuptools docs are updated. At the same time, pip is already bundled with Python installations as a module. And yet we have pipfiles, pipenv, poetry, conda, pipx, pip-tools, shiv, spack, and the rest. Looks a bit overwhelming.
I have developed a Python package on GitHub that I released on PyPI. It installs with pip install PACKAGENAME, but does not do anything with the dependencies stated in the install_requires of the setup.py file.
Weirdly enough, the zip file of the associated release does install all dependencies. I tried with different virtual environments and on different computers, but it never installs the dependencies. Any help appreciated.
pip install pythutils downloads a wheel if one is available, and one is available for your package.
When generating a wheel, setuptools runs python setup.py locally but doesn't include setup.py in the wheel. Download your wheel file and unzip it (it's just a zip archive): there is your main package directory pythutils and a metadata directory pythutils-1.1.1.dist-info. In the metadata directory there is a file METADATA that usually lists static dependencies, but your file doesn't list any, because when you generated the wheel all your dependencies had already been installed, so all your dynamic code paths were skipped.
The archive that you downloaded from the GitHub release installs dependencies because it's not a wheel, so pip runs python setup.py install and your dynamic dependencies work.
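You can check this yourself, since a wheel is just a zip archive; the exact wheel file name below is my guess:
$ unzip -p pythutils-1.1.1-py2.py3-none-any.whl pythutils-1.1.1.dist-info/METADATA | grep Requires-Dist
In your case the grep comes back empty, because no Requires-Dist entries were written into the metadata.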
What can you do? My advice is to avoid dynamic dependencies. Declare static dependencies and allow pip to decide what versions to install:
install_requires=[
    'numpy==1.16.5; python_version>="2" and python_version<"3"',
    'numpy; python_version>="3"',
],
Another approach would be to create version-specific wheel files, one for Python 2 and another for Python 3, with fixed dependencies.
Yet another approach is to not publish wheels at all and publish only an sdist (source distribution). Then pip is forced to run python setup.py install on the target machine. That's not the best approach, and it will certainly be problematic for packages with C extensions (the user must have a compiler and developer tools to install from source).
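For completeness, the sdist-only route boils down to building and uploading just the source distribution, roughly (the archive name is assumed):
$ python setup.py sdist
$ twine upload dist/pythutils-1.1.1.tar.gz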
Your setup.py does a series of checks like
import sys

# numpy is only added to install_requires if it isn't importable already
install_requires = []
try:
    import numpy
except ImportError:
    if sys.version_info[0] == 2:
        install_requires.append('numpy==1.16.5')
    if sys.version_info[0] == 3:
        install_requires.append("numpy")
Presumably the system where you ran it had all the required modules already installed, so you ended up with an empty install_requires list. But this is the wrong way to do it anyway; you should simply provide a static list (or two static lists, one each for Python 2 and Python 3, if you really want to support both in the same package).
Looking at past solutions such as
pip ignores dependency_links in setup.py, this configuration should work.
Relevant content of my setup.py
packages=find_packages(),
dependency_links=['http://github.com/koji-project/koji/tarball/master#egg=koji'],
install_requires=['jira', 'PyYAML', 'requests', 'psycopg2',
                  'elasticsearch', 'beanbag', 'pyzabbix', 'enum34',
                  'beautifulsoup4', 'pytz', 'koji'],
tests_require=['flake8', 'autopep8', 'mock'],
include_package_data=True,
cmdclass={'test': setupTestRequirements}
The only thing I can think of is that my URL is invalid. I don't see why it would be, since it points at version 1.14.0.
Upon running pip install ., I get:
Could not find a version that satisfies the requirement koji (from MARs==0.17.10) (from versions: ) No matching distribution found for koji (from MARs==0.17.10)
Upon running python setup.py develop --user, the output doesn't mention Koji
Your configuration is correct. However, the problem lies elsewhere. Take a look at the koji repo on GitHub: the project has no setup.py committed. As long as there's no setup.py script, neither pip nor setuptools (via setup.py install / setup.py develop) will be able to install your project, because they can't install the koji dependency: without a setup script it is not a valid Python package at all.
Update:
The problem with the koji repo on GitHub is that it is only a mirror of the actual dev repo located on Fedora Pagure and is not synced with the upstream. So the correct answer is to use the real development repository instead of the GitHub mirror:
dependency_links=['git+https://pagure.io/koji.git#egg=koji-1.14.0']
Easy peasy. :-)
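Plugged into a setup script, that could look roughly like this (a sketch: name and version are taken from the pip error output in the question, and the dependency list is shortened to koji):
from setuptools import setup, find_packages

setup(
    name='MARs',
    version='0.17.10',
    packages=find_packages(),
    install_requires=['koji'],
    dependency_links=['git+https://pagure.io/koji.git#egg=koji-1.14.0'],
)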
Original answer (obsolete; only relevant if you want to install from koji's repo mirror on GitHub):
I see two ways out of this situation:
Forking
fork koji on GitHub
write your own setup.py script or copy it from somewhere (see below for more info), commit and push
adapt the URL in dependency_links in your project's setup.py.
For testing, I prepared a fork of koji with a setup script; if I use its URL instead of the upstream repo, the installation succeeds. I also tagged my own "release" with koji-1.14.0.post1 to distinguish the version with the setup script from the vanilla ones. Example setup.py with the new dependency:
from setuptools import setup, find_packages

setup(
    name='spam',
    version='0.1',
    author='nobody',
    author_email='nobody@nowhere.com',
    url='www.example.com',
    packages=[],
    dependency_links=['https://github.com/hoefling/koji/tarball/master#egg=koji-1.14.0.post1'],
    install_requires=['koji==1.14.0.post1'],
)
Testing the installation with pip yields:
$ pip install . --process-dependency-links
Obtaining file:///home/hoefling/python/spam
DEPRECATION: Dependency Links processing has been deprecated and will be removed in a future release.
Collecting koji==1.14.0.post1 (from spam==0.1)
Downloading https://github.com/hoefling/koji/tarball/master (1.4MB)
100% |████████████████████████████████| 1.4MB 759kB/s
Collecting pyOpenSSL (from koji==1.14.0.post1->spam==0.1)
Using cached pyOpenSSL-17.5.0-py2.py3-none-any.whl
Collecting pycurl (from koji==1.14.0.post1->spam==0.1)
Using cached pycurl-7.43.0.1.tar.gz
...
Installing collected packages: six, idna, asn1crypto, pycparser, cffi,
cryptography, pyOpenSSL, pycurl, python-dateutil, chardet, certifi,
urllib3, requests, pykerberos, requests-kerberos, rpm-py-installer,
koji, spam
Running setup.py install for rpm-py-installer ... done
Running setup.py install for koji ... done
Running setup.py install for spam ... done
Successfully installed asn1crypto-0.23.0 certifi-2017.11.5 cffi-1.11.2
chardet-3.0.4 cryptography-2.1.4 idna-2.6 koji-1.14.0.post1 pyOpenSSL-17.5.0
pycparser-2.18 pycurl-7.43.0.1 pykerberos-1.1.14 python-dateutil-2.6.1
requests-2.18.4 requests-kerberos-0.11.0 rpm-py-installer-0.5.0 six-1.11.0
spam-0.1 urllib3-1.22
Installed packages look good:
$ pip list
Package Version
----------------- ------------
asn1crypto 0.23.0
certifi 2017.11.5
cffi 1.11.2
chardet 3.0.4
cryptography 2.1.4
idna 2.6
koji 1.14.0.post1
pip 9.0.1
pycparser 2.18
pycurl 7.43.0.1
pykerberos 1.1.14
pyOpenSSL 17.5.0
python-dateutil 2.6.1
requests 2.18.4
requests-kerberos 0.11.0
rpm-py-installer 0.5.0
rpm-python 4.11.3
setuptools 38.2.4
six 1.11.0
spam 0.1
urllib3 1.22
wheel 0.30.0
The downside of this method is the additional overhead of maintaining the fork until the setup script is merged upstream. This includes testing and eventually adapting koji's setup.py in your fork each time you want to sync the upstream updates. I would probably create a separate branch with the setup script committed there, sync the fork as usual, and then rebase the branch on top of the fork's master; but if you are used to another update strategy, stick to it.
Use the koji package from TestPyPI
Actually, I found some koji wheels of the most recent version on TestPyPI. This is also where I got the setup.py for the fork above: I downloaded the source tar, unpacked it and copied the setup script. This means the koji devs are looking into distributing the project via PyPI and are working on the setup script, but haven't committed it yet. While they are working on it, you can use the testing package index as a workaround. This way you will not build the package from source; instead you take the wheel that the koji devs built and uploaded:
setup(
...
dependency_links=['https://testpypi.python.org/pypi/koji'],
install_requires=['koji'],
)
The downsides of this method are:
You don't know whether the koji package from TestPyPI is installable at all. Even if it is, there's no guarantee that the installed code will work as intended (although it should). When you have the fork, you can always fix the setup script yourself; here you are doomed if the wheel file has errors.
Packages on TestPyPI are removed on a regular basis. From the docs:
Note: The database for TestPyPI may be periodically pruned, so it is not unusual for user accounts to be deleted.
Last note
You can of course combine the two workarounds and use both URLs in dependency_links:
setup(
    ...
    dependency_links=[
        'https://testpypi.python.org/pypi/koji',
        'https://github.com/hoefling/koji/tarball/master#egg=koji-1.14.0.post1',
    ],
    install_requires=['koji'],
)
This way, if the package is not found on TestPyPI, it will be built from your fork.
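Whichever URL ends up being used, remember that recent pip versions only honor dependency_links when the deprecated flag is passed explicitly, as in the test run above:
$ pip install . --process-dependency-links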
Last note 2
You will probably need to install some additional system packages; at least on my system, CentOS Linux release 7.3.1611 (Core), I had to install curl-devel to satisfy pycurl.
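On CentOS that amounts to something like:
$ sudo yum install curl-devel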
It's a similar question to How can I make setuptools install a package that's not on PyPI? but not the same.
As I would like to use a forked version of some package, setuptools ignores the dependency link (as it has the same version number).
Is there a way to force using the link from the dependency_links? Or is the only way to change the version number in the forked repo?
requires = [
    ...
    'pyScss==1.1.3',
    ...
]

dependencies = [
    'https://github.com/nadavshatz/pyScss/zipball/master#egg=pyScss-1.1.3'
]
Update
Weird, apparently it works if this package is the only one in the required list that is not installed yet. If there's another missing package, it will download it from PyPI.
I believe you can just use dependency_links as described in that question:
from setuptools import setup

setup(
    name='mypkg',
    version='0.0.1',
    description='Foo',
    author='bar',
    author_email='bar@example.com',
    install_requires=['pyScss==1.1.3'],
    dependency_links=[
        'https://github.com/nadavshatz/pyScss/zipball/master#egg=pyScss-1.1.3'
    ]
)
Tested using python setup.py develop
You probably want to rename the egg to emphasize it's a fork (see PEP 386: http://www.python.org/dev/peps/pep-0386/).
Outside of setup.py you can enforce this locally using requirements.txt and pip. While this won't make your package depend on the fork, you can easily document it as the way to install:
$ cat requirements.txt
https://github.com/nadavshatz/pyScss/zipball/master#egg=pyScss-1.1.3
$ pip install -r requirements.txt
I ended up doing something very similar to the answer in stackoverflow.com/a/17442663/368102.
I need a requests-file GitHub package that name-conflicts with a different requests-file package on PyPI. They both have a version 1.0, and the PyPI version has some higher versions.
The workaround in my ias_tools/setup.py looks like this:
setup(
...
install_requires=[
'requests-file<=99.99',
],
dependency_links=[
'https://github.com/jvantuyl/requests-file/archive/b0a7b34af6e287e07a96bc7e89bac3bc855323ae.zip#egg=requests-file-99.99'
]
)
In my case, I'm using pip so I also had to use --process-dependency-links:
% pip install --process-dependency-links ./ias_tools
You are using pip version 6.0.6, however version 6.1.1 is available.
You should consider upgrading via the 'pip install --upgrade pip' command.
Processing ./ias_tools
DEPRECATION: Dependency Links processing has been deprecated and will be removed in a future release.
Collecting requests-file<=99.99 (from ias-tools==0.1)
Downloading https://github.com/jvantuyl/requests-file/archive/b0a7b34af6e287e07a96bc7e89bac3bc855323ae.zip
Requirement already satisfied (use --upgrade to upgrade): requests>=1.1.0 in ./venv/lib/python2.7/site-packages (from requests-file<=99.99->ias-tools==0.1)
Installing collected packages: ias-tools, requests-file
Running setup.py install for ias-tools
Running setup.py install for requests-file
Successfully installed ias-tools-0.1 requests-file-1.0
I'm not too worried about the deprecation notice, as a pull request was submitted to pip to deprecate the deprecation (after a discussion about it).