Here is my setup.py:
setup(
    ...
    install_requires=['GEDThriftStubs'],
    dependency_links=['git+ssh://user@git.server.com/ged-thrift-stubs.git#egg=GEDThriftStubs'],
    ...
)
Then I create the package:
python setup.py sdist
Then I try to install it:
pip install file://path/package-0.0.1.tar.gz
And get this in terminal:
Downloading/unpacking GEDThriftStubs (from package==0.0.1)
Could not find any downloads that satisfy the requirement GEDThriftStubs (from package==0.0.1)
No distributions at all found for GEDThriftStubs (from package==0.0.1)
And in pip.log messages like this:
Skipping link git+ssh://user@git.server.com/ged-thrift-stubs.git#egg=GEDThriftStubs; wrong project name (not gedthriftstubs)
And I don't use that exact name "gedthriftstubs" anywhere in my project, if it matters.
But this works fine:
pip install git+ssh://user@git.server.com/ged-thrift-stubs.git#egg=GEDThriftStubs
Try:
$ pip install --process-dependency-links file://path/package-0.0.1.tar.gz
Note that this flag was removed in pip 1.6. See this article on pip.pypa.io for more information.
In pip 1.5, processing dependency links was deprecated, and it was removed completely in pip 1.6.
There's also a lengthy discussion (issue #1519) regarding pip & dependency links.
If that doesn't work, you may also need to add a version suffix on your link, like this:
git+ssh://user@git.server.com/ged-thrift-stubs.git#egg=GEDThriftStubs-0.0.1
where 0.0.1 is the version specified in the setup.py of ged-thrift-stubs. (The lowercase "gedthriftstubs" in your pip.log is just pip normalizing project names for comparison; it is not itself the problem.)
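Putting the pieces together, a minimal sketch of the question's setup.py with the versioned link (the name "package" and version 0.0.1 are taken from the question; pinning GEDThriftStubs==0.0.1 is optional but matches the egg suffix):

setup(
    name="package",
    version="0.0.1",
    install_requires=["GEDThriftStubs==0.0.1"],
    dependency_links=[
        "git+ssh://user@git.server.com/ged-thrift-stubs.git#egg=GEDThriftStubs-0.0.1"
    ],
)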
Related
This is an extension of the SO question setup.py ignores full path dependencies, instead looks for "best match" in pypi.
I am trying to write a setup.py to install a proprietary package from a .tar.gz file on an internal web site. Unfortunately for me, the proprietary package's name duplicates a public package on the public PyPI, so I need to force installation of the proprietary package at a specific version. I'm building a Docker image from a Debian-Buster base image, so pip, setuptools and tox are all freshly installed; the image provides Python 3.8, and pip upgrades itself to version 21.2.4.
Solution 1 - dependency_links
I followed the instructions at the post linked above to put the prop package in install_requires and dependency_links. Here are the relevant lines from my setup.py:
install_requires=["requests", "proppkg==70.1.0"],
dependency_links=["https://site.mycompany.com/path/to/proppkg-70.1.0.tar.gz#egg=proppkg-70.1.0"]
Installation is successful in Debian-Buster if I run python3 setup.py install in my package directory. I see the proprietary package get downloaded and installed.
Installation fails if I run pip3 install .; tox (version 3.24.4) fails similarly. In both cases, pip shows the message "Looking in indexes" and then fails with "ERROR: Could not find a version that satisfies the requirement".
Solution 2 - PEP 508
Studying the SO answer pip ignores dependency_links in setup.py, which states that dependency_links is deprecated, I started over and revised setup.py to have:
install_requires=[
    "requests",
    "proppkg @ https://site.mycompany.com/path/to/proppkg-70.1.0.tar.gz#egg=proppkg-70.1.0"
],
Installation is successful in Debian-Buster if I run pip3 install . in my package directory. Pip shows a message "Looking in indexes" but still downloads and installs the proprietary package successfully.
Installation fails in Debian-Buster if I run python3 setup.py install in my package directory. I see these messages:
Searching for proppkg@ https://site.mycompany.com/path/to/proppkg-70.1.0.tar.gz#egg=proppkg-70.1.0
..
Reading https://pypi.org/simple/proppkg/
..
error: Could not find suitable distribution for Requirement.parse(...).
Tox also fails in this scenario as it installs dependencies.
Really speculating now, it almost seems like there's an ordering issue. Tox invokes pip like this:
python -m pip install --exists-action w .tox/.tmp/package/1/te-0.3.5.zip
In that output I see "Collecting proppkg@ https://site.mycompany.com/path/to/proppkg-70.1.0.tar.gz#egg=proppkg-70.1.0" as the first step. That install fails because it cannot import the package requests. Then tox continues collecting other dependencies. Finally, tox reports "Collecting requests" as its last step (and that succeeds). Do I have to worry about the ordering of install steps?
I'm starting to think that maybe the proprietary package is broken. I verified that the prop package setup.py has requests in its install_requires entry. Not sure what else to check.
Workaround solution
My workaround is installing the proprietary package in the docker image as a separate step before I install my own package, just by running pip3 install https://site.mycompany.com/path/to/proppkg-70.1.0.tar.gz. The setup.py keeps the PEP 508 URL in install_requires. Then pip and tox find the prop package in the pip cache and work fine.
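In Dockerfile terms the workaround is just two RUN steps, with the proprietary package installed first (a sketch; the URL is the one above, and it assumes WORKDIR is already set to my package directory):

RUN pip3 install https://site.mycompany.com/path/to/proppkg-70.1.0.tar.gz
RUN pip3 install .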
Please suggest what to try with the latest pip and tox, or tell me if this is as good as it gets. Thanks in advance.
Update - add setup.py
Here's a (slightly sanitized) version of my package's setup.py
from setuptools import setup, find_packages

def get_version():
    """
    read version string
    """
    version_globals = {}
    with open("te/version.py") as fp:
        exec(fp.read(), version_globals)
    return version_globals['__version__']

setup(
    name="te",
    version=get_version(),
    packages=find_packages(exclude=["tests.*", "tests"]),
    author="My Name",
    author_email="email@mycompany.com",
    description="My Back-End Server",
    entry_points={"console_scripts": [
        "te-be=te.server:main"
    ]},
    python_requires=">=3.7",
    install_requires=["connexion[swagger-ui]",
                      "Flask",
                      "gevent",
                      "redis",
                      "requests",
                      "proppkg @ https://site.mycompany.com/path/to/proppkg-70.1.0.tar.gz#egg=proppkg-70.1.0"
                      ],
    package_data={"te": ["openapi_te.yml"]},
    include_package_data=True,  # read MANIFEST.in
)
According to my research, the following should work:
from setuptools import setup
from setuptools import find_packages
...
REQUIRES_INSTALL = [
    'spacy==2.3.2',
    'tensorflow==1.14.0',
    'Keras==2.2.4',
    'keras-contrib @ git+https://github.com/keras-team/keras-contrib.git#egg=keras-contrib',
    'en-core-web-sm @ https://github.com/explosion/spacy-models/releases/download/en_core_web_sm-2.3.0/en_core_web_sm-2.3.0.tar.gz#egg=en-core-web-sm'
]
...
setup(
    name=NAME,
    version=VERSION,
    description=DESCRIPTION,
    install_requires=REQUIRES_INSTALL,
    ...
)
When building a wheel or egg, everything is fine: python setup.py bdist_wheel.
But when I try to install the package (whl or egg) with pip install -U dist/mypack-....whl, I get:
ERROR: Could not find a version that satisfies the requirement keras-contrib (from mypack==0.3.5) (from versions: none)
ERROR: No matching distribution found for keras-contrib (from mypack==0.3.5)
...
ERROR: Could not find a version that satisfies the requirement en-core-web-sm (from mypack==0.3.5) (from versions: none)
ERROR: No matching distribution found for en-core-web-sm (from mypack==0.3.5)
I have tried the same via setup.cfg but still no luck.
For reference, all of these dependencies work when installing them first from requirements.txt and then installing the wheel:
spacy==2.3.2
tensorflow==1.14.0
Keras==2.2.4
keras-contrib @ git+https://github.com/keras-team/keras-contrib.git#egg=keras-contrib
en-core-web-sm @ https://github.com/explosion/spacy-models/releases/download/en_core_web_sm-2.3.0/en_core_web_sm-2.3.0.tar.gz#egg=en-core-web-sm
pip install -r requirements.txt
pip install -U dist/mypack-....whl
But this is not a clean way, since a wheel should be self-contained.
Thank you for any hint!
Environment
Python: 3.7.0
Pip: 20.2.4
setuptools: 50.3.2
Some time ago it was possible to define a single requirements.txt or similar containing both specs of PyPI packages and links to repositories and archives.
That required parsing the requirements.txt and splitting it into "requirements" and "dependencies", where "requirements" would hold the PyPI package specs and "dependencies" the links.
Setuptools has separate setup() args for these: install_requires and dependency_links.
And it really worked: one could define a requirements.txt and install a package both with python setup.py install and with pip install .. Moreover, it was possible to install just the dependencies via pip install -r requirements.txt. All of these worked and gave a single place to define all requirements, including non-PyPI links.
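That split was typically a few lines of helper code in setup.py; a minimal sketch of the old pattern (my reconstruction, not code from any of the questions here):

def split_requirements(path):
    """Split requirements.txt lines into PyPI specs and direct links."""
    requirements, links = [], []
    with open(path) as fp:
        for line in fp:
            line = line.strip()
            if not line or line.startswith('#'):
                continue  # skip blanks and comments
            if '://' in line:
                links.append(line)  # a VCS or archive URL: a dependency link
            else:
                requirements.append(line)  # a plain PyPI requirement
    return requirements, links

# install_requires, dependency_links = split_requirements('requirements.txt')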
However, support for the dependency_links arg was dropped by pip as of v19. And here's the weird part: it has not been dropped by setuptools. But there's more.
As of today, pip:
Supports only install_requires.
Prefers PEP 508 notation for dependencies in definitions of packages (install_requires) and standalone requirements.txt or similar.
Aborts installation of packages that contain links in their install_requires.
Your definitions of dependencies mix two notations: prefixes like keras-contrib @ are from PEP 508, and the #egg= parts are from the setuptools links notation.
This is not an issue: pip will ignore the "eggs", as the names are already defined before the @.
I believe the installation of the package via pip works fine, i.e.:
pip install .
However, the issues will arise if the package is installed via setuptools, i.e.:
python setup.py install
setuptools does not understand PEP 508 notation and ignores links in install_requires. As of today, to make setuptools follow links, both install_requires and dependency_links have to be used, e.g.:
setup(
    ...
    install_requires=[
        ...
        "keras_contrib==2.0.8",
        ...
    ],
    dependency_links=[
        "https://github.com/keras-team/keras-contrib/tarball/master#egg=keras_contrib-2.0.8",
        ...
    ],
)
Here are several tricky points:
A single dependency is defined in 2 places: a package name in install_requires and a link in dependency_links to resolve the package dependency.
The link is not git+https://.../....git, but it's a link to an archive: https://.../tarball/....
The egg name is in snake_case, not dash-case. While it's possible to use dash-case, that will not allow specifying the version.
The version in install_requires is delimited with == and in dependency_links with -.
It's possible to omit the version. But the only viable use case for that is a package that is not present on PyPI and is rarely updated. If the package is present on PyPI but an unpublished version is needed, then the version must be specified.
And here's the bummer: fixing the links for setuptools will break pip, as PEP 508 does not allow specifying versions. Keeping keras-contrib==x.y.z @ ... in install_requires will make pip search for the package keras-contrib==x.y.z, where ==x.y.z is not a version but part of the name. At the same time, not specifying a version will make setuptools grab the latest version available on PyPI, not the one at the link from dependency_links.
In your case neither keras-contrib nor en-core-web-sm is present on PyPI, so using keras_contrib @ git+https://... plus dependency_links without a version specified might work.
Otherwise, stick to pip install . and avoid using python setup.py install if the package depends on links.
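To make the clash concrete, here is a side-by-side sketch of the two notations (versions and URLs from the example above; the comments are mine):

# PEP 508 direct reference: understood by pip, ignored by python setup.py install.
# The name before the '@' cannot carry a version specifier.
install_requires=[
    "keras-contrib @ git+https://github.com/keras-team/keras-contrib.git",
],

# setuptools links notation: followed by python setup.py install, ignored by modern pip.
# The '-2.0.8' egg suffix pairs with '==2.0.8' in install_requires.
install_requires=[
    "keras_contrib==2.0.8",
],
dependency_links=[
    "https://github.com/keras-team/keras-contrib/tarball/master#egg=keras_contrib-2.0.8",
],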
See also:
PEP 508
PEP508: why either version requirement or URL but not both?
How can I make setuptools install a package that's not on PyPI?
pip install dependency links
pip3 setup.py install_requires PEP 508 git URL for private repo
Why is dependency links in setup.py deprecated?
Changing PEP 508 URLs in setup.py doesn't reinstall the dependency
Updating remote links with new URLs for PEP508 functionality
Requirements using PEP 508 direct references ignore the URL
Suggest alternatives for --process-dependency-links
Un-deprecate --process-dependency-links until an alternative is implemented
Changes to the pip dependency resolver in 20.3 (2020)
Trivia: several issues on GitHub are still open, and PEP 508 has been in the Active state since 2015. Digging around the source code reveals that setuptools is a wrapper around Python's distutils. setuptools is not part of Python's stdlib, but the distutils docs imply distutils will be removed from the stdlib docs once the setuptools docs are updated. At the same time, pip is already bundled with Python installations as a module. And yet we have pipfiles, pipenv, poetry, conda, pipx, pip-tools, shiv, spack, and the rest. Looks a bit overwhelming.
I'm having serious trouble with using setup.py to pip install my package which also has dependency links. I have read this answer and this one thoroughly and none of the answers including the accepted ones help.
Here is the setup.py for the package trying to install.
Basically, it reads the requirements.txt to fill install_requires and dependency_links; most of the rest of the code is boilerplate from cookie-cutter. requirements.txt has a private GitHub repo in it, which is causing the issues, e.g. git+https://${GITHUB_OAUTH_TOKEN}@github.com/jmerkow/pripy.git#egg=pripy
When I run pip install -r requirements.txt everything works great; it installs the private repository. However, if I try to install using pip install . --process-dependency-links, I get this error:
Could not find a version that satisfies the requirement pripy (from mypackage==<sha>) (from versions: )
No matching distribution found for pripy (from mypackage==<sha>)
If I take off the #egg=xxx from the link in requirements, the private repo package is completely ignored by pip install . but not by pip install -r requirements.txt.
I have confirmed that dependency_links contains 'git+https://<actual-token>@github.com/jmerkow/pripy.git#egg=pripy' and that install_requires includes 'pripy'.
How do you get setup.py to handle this properly? Is this a problem with the sub-package? Its setup.py is done pretty much the same way, except there are no private links.
Ugh, this always happens. I put all that work into the question, then I figure it out myself.
The issue is two things: first, all dependency_links need to have a version; second, to pull the version from the requirements file properly, you need to do some magic on the string.
Compared to the setup.py above, I changed the way requirements are added to the two lists (updated here). Then add the version to the #egg=xxx on the link, e.g.:
git+https://${GITHUB_OAUTH_TOKEN}@github.com/jmerkow/pripy.git#egg=pripy-0
Now setup.py will parse that file, take the egg version info, convert it to a pip version (basically replace the first '-' with an '==') for the install_requires, and you're good to go.
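A minimal sketch of that string magic (the helper name and implementation are mine, assuming every dependency link ends in #egg=name-version and the egg name itself contains no hyphens):

def egg_to_requirement(link):
    """Turn 'git+https://...#egg=pripy-0.1' into 'pripy==0.1'."""
    egg = link.rsplit('#egg=', 1)[-1]
    # replace the first '-' with '==' to get a pip version spec
    name, _, version = egg.partition('-')
    return '{0}=={1}'.format(name, version) if version else name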
I've published a project to PyPI for the first time (https://pypi.org/project/xontrib-autojump/). But I get the following error when I try to install the project with this pip command:
$ pip install xontrib-autojump --user
Collecting xontrib-autojump
Could not find a version that satisfies the requirement xontrib-autojump (from versions: 0.1.linux-x86_64, 0.2.linux-x86_64, 0.3.linux-x86_64, 0.4.linux-x86_64)
No matching distribution found for xontrib-autojump
This project does appear when I run pip search xontrib-autojump:
$ pip search xontrib-autojump
xontrib-autojump (0.4) - autojump support for xonsh
...
Why can't I install this package with pip?
There are a number of possible problem areas. The main one is that it looks to me as though you have not followed the naming convention needed to specify which Python version the download is suitable for.
It is also a very good idea to set the metadata, as this assists with finding packages.
The Packaging Tutorial is very helpful on this. It is also recommended that you test the upload and install process using the test instance of PyPI.
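Incidentally, version strings like 0.1.linux-x86_64 usually mean the uploads were built with python setup.py bdist, which produces platform-tagged "dumb" archives that pip cannot match against a requirement. Uploading a plain source distribution avoids that (a sketch, assuming a standard setup.py):

$ python setup.py sdist
$ twine upload dist/*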
The other big problem is that your "package" does not contain any Python code and is not a Python package in any way, shape or form.
It's a similar question to How can I make setuptools install a package that's not on PyPI? but not the same.
As I would like to use a forked version of some package, setuptools ignores the dependency link (as it has the same version number).
Is there a way to force using the link from the dependency_links? Or is the only way to change the version number in the forked repo?
requires = [
    ...
    'pyScss==1.1.3',
    ...
]

dependencies = [
    'https://github.com/nadavshatz/pyScss/zipball/master#egg=pyScss-1.1.3'
]
Update
Weird: apparently it works if this package is the only one in the required list that is not installed yet. If there's another missing package, it will download it from PyPI.
I believe you can just use dependency_links as described in that question:
from setuptools import setup

setup(name = 'mypkg',
      version = '0.0.1',
      description = 'Foo',
      author = 'bar',
      author_email = 'bar@example.com',
      install_requires = ['pyScss==1.1.3'],
      dependency_links = [
          'https://github.com/nadavshatz/pyScss/zipball/master#egg=pyScss-1.1.3'
      ]
)
Tested using python setup.py develop
You probably want to rename the egg to emphasize it's a fork: http://www.python.org/dev/peps/pep-0386/
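A hedged sketch of what that rename might look like if you bump the fork's version (the 1.1.3.post1 suffix is my choice, purely illustrative; it also requires bumping the version in the fork's own setup.py):

install_requires = ['pyScss==1.1.3.post1'],
dependency_links = [
    'https://github.com/nadavshatz/pyScss/zipball/master#egg=pyScss-1.1.3.post1'
]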
Outside of setup.py, you can enforce this locally using requirements.txt and pip. Whilst this won't make your package depend on the fork, you can easily document it as the way to install.
$ cat requirements.txt
https://github.com/nadavshatz/pyScss/zipball/master#egg=pyScss-1.1.3
$ pip install -r requirements.txt
I ended up doing something very similar to the answer at stackoverflow.com/a/17442663/368102.
I need a requests-file GitHub package whose name conflicts with a different requests-file package on PyPI. They both have a version 1.0, and the PyPI version has some higher versions.
The workaround in my ias_tools/setup.py looks like this:
setup(
    ...
    install_requires=[
        'requests-file<=99.99',
    ],
    dependency_links=[
        'https://github.com/jvantuyl/requests-file/archive/b0a7b34af6e287e07a96bc7e89bac3bc855323ae.zip#egg=requests-file-99.99'
    ]
)
In my case, I'm using pip so I also had to use --process-dependency-links:
% pip install --process-dependency-links ./ias_tools
You are using pip version 6.0.6, however version 6.1.1 is available.
You should consider upgrading via the 'pip install --upgrade pip' command.
Processing ./ias_tools
DEPRECATION: Dependency Links processing has been deprecated and will be removed in a future release.
Collecting requests-file<=99.99 (from ias-tools==0.1)
Downloading https://github.com/jvantuyl/requests-file/archive/b0a7b34af6e287e07a96bc7e89bac3bc855323ae.zip
Requirement already satisfied (use --upgrade to upgrade): requests>=1.1.0 in ./venv/lib/python2.7/site-packages (from requests-file<=99.99->ias-tools==0.1)
Installing collected packages: ias-tools, requests-file
Running setup.py install for ias-tools
Running setup.py install for requests-file
Successfully installed ias-tools-0.1 requests-file-1.0
I'm not too worried about the deprecation notice, as a pull request was submitted to pip to deprecate the deprecation (after a discussion about it).