Can Pip install dependencies not specified in setup.py at install time?

I'd like pip to install a dependency that I have on GitHub when the user issues the command to install the original software, also from source on GitHub. Neither of these packages is on PyPI (and never will be).
The user issues the command:
pip install -e git+https://github.com/Lewisham/cvsanaly@develop#egg=cvsanaly
This repo has a requirements.txt file, with another dependency on GitHub:
-e git+https://github.com/Lewisham/repositoryhandler#egg=repositoryhandler
What I'd like is a single command that a user can issue to install the original package, have pip find the requirements file, then install the dependency too.

This answer helped me solve the same problem you're talking about.
There doesn't seem to be an easy way for setup.py to use the requirements file directly to define its dependencies, but the same information can be put into the setup.py itself.
I have this requirements.txt:
PIL
-e git://github.com/gabrielgrant/django-ckeditor.git#egg=django-ckeditor
But when installing the package that contains this requirements.txt, those requirements are ignored by pip.
This setup.py seems to coerce pip into installing the dependencies (including my GitHub version of django-ckeditor):
from setuptools import setup

setup(
    name='django-articles',
    ...,
    install_requires=[
        'PIL',
        'django-ckeditor>=0.9.3',
    ],
    dependency_links=[
        'http://github.com/gabrielgrant/django-ckeditor/tarball/master#egg=django-ckeditor-0.9.3',
    ]
)
Edit:
This answer also contains some useful information.
Specifying the version as part of "#egg=..." is required to identify which version of the package is available at the link. Note, however, that if you always want to depend on your latest version, you can set the version to dev in install_requires, dependency_links, and the other package's setup.py.
Edit: using dev as the version isn't a good idea, as per comments below.
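Note also that pip's handling of dependency_links changed over time: for a long stretch pip only honoured them when given an explicit flag, and that flag was removed entirely in pip 19.0. A sketch of the old invocation, for pip versions that still accept it:
# Only works on older pip versions; --process-dependency-links was removed in pip 19.0
pip install --process-dependency-links .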

Here's a small script I used to generate install_requires and dependency_links from a requirements file.
import os
import re

def which(program):
    """
    Detect whether or not a program is installed.
    Thanks to http://stackoverflow.com/a/377028/70191
    """
    def is_exe(fpath):
        return os.path.exists(fpath) and os.access(fpath, os.X_OK)

    fpath, _ = os.path.split(program)
    if fpath:
        if is_exe(program):
            return program
    else:
        for path in os.environ['PATH'].split(os.pathsep):
            exe_file = os.path.join(path, program)
            if is_exe(exe_file):
                return exe_file
    return None

EDITABLE_REQUIREMENT = re.compile(
    r'^-e (?P<link>(?P<vcs>git|svn|hg|bzr).+#egg=(?P<package>.+)-(?P<version>\d(?:\.\d)*))$'
)

install_requires = []
dependency_links = []

for requirement in (l.strip() for l in open('requirements')):
    match = EDITABLE_REQUIREMENT.match(requirement)
    if match:
        assert which(match.group('vcs')) is not None, \
            "VCS '%(vcs)s' must be installed in order to install %(link)s" % match.groupdict()
        install_requires.append("%(package)s==%(version)s" % match.groupdict())
        dependency_links.append(match.group('link'))
    else:
        install_requires.append(requirement)
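The two lists can then be handed straight to setup(); a minimal sketch, assuming the script above runs at the top of setup.py (name and version are placeholders):
from setuptools import setup

setup(
    name='my-package',  # placeholder
    version='0.1',      # placeholder
    install_requires=install_requires,
    dependency_links=dependency_links,
)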

Does this answer your question?
setup(name='application-xpto',
      version='1.0',
      author='me,me,me',
      author_email='xpto@mail.com',
      packages=find_packages(),
      include_package_data=True,
      description='web app',
      install_requires=open('app/requirements.txt').readlines(),
      )

Related

setup.py file using requirements.txt

I've read a discussion suggesting that setup.py read requirements.txt directly, so that the correct installation is available across multiple deployments without having to maintain both a requirements.txt and the list in setup.py.
However, when I try to install via pip install -e ., I get an error:
Obtaining file:///Users/myuser/Documents/myproject
Processing /home/ktietz/src/ci/alabaster_1611921544520/work
ERROR: Could not install packages due to an OSError: [Errno 2] No such file or directory:
'/System/Volumes/Data/home/ktietz/src/ci/alabaster_1611921544520/work'
It looks like pip is trying to find packages that are available on PyPI (alabaster) on my local machine. Why? What am I missing here? Why isn't pip looking for the required packages on the PyPI server?
I have done it the other way around before, maintaining the setup file and not the requirements file. For the requirements file, just save it as:
*
and for setup, do
from distutils.core import setup
from setuptools import find_packages

try:
    from Module.version import __version__
except ModuleNotFoundError:
    exec(open("Module/version.py").read())

setup(
    name="Package Name",
    version=__version__,
    packages=find_packages(),
    package_data={p: ["*"] for p in find_packages()},
    url="",
    license="",
    install_requires=[
        "numpy",
        "pandas"
    ],
    python_requires=">=3.8.0",
    author="First.Last",
    author_email="author@company.com",
    description="Description",
)
For reference, my version.py script looks like:
__build_number__ = "_LOCAL_"
__version__ = f"1.0.{__build_number__}"
Jenkins then replaces __build_number__ with a build tag.
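The replacement step itself isn't shown above; a hypothetical Jenkins shell step might stamp the number in with sed (BUILD_NUMBER is the environment variable Jenkins provides):
# Hypothetical build step: substitute the Jenkins build number into version.py
sed -i "s/_LOCAL_/${BUILD_NUMBER}/" Module/version.py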
This question consists of two separate questions: the rather philosophical choice of how to arrange setup requirements is actually unrelated to the installation error you are experiencing.
First, about the error: it looks like the project you are trying to install depends on another library (alabaster) of which you apparently also did an editable install using pip3 install -e . that points to this directory:
/home/ktietz/src/ci/alabaster_1611921544520/work
What the error tells you is that the directory where that install is supposed to be located no longer exists. You should only install your own project in editable mode; the dependencies should be installed into a regular site-packages directory, i.e. without the -e option.
To clean up, I would suggest that you do the following:
# clean up references to the broken editable install
pip3 uninstall alabaster
# now do a proper non-editable install
pip3 install alabaster
Concerning the question of how to arrange setup requirements: you should primarily use the install_requires and extras_require options of setuptools:
# either in setup.py
setuptools.setup(
    install_requires=[
        'dep1>=1.2',
        'dep2>=2.4.1',
    ]
)

# or in setup.cfg
[options]
install_requires =
    dep1>=1.2
    dep2>=2.4.1

[options.extras_require]
extra_deps_a =
    dep3
    dep4>=4.2.3
extra_deps_b =
    dep5>=5.2.1
Optional requirements can be organised in groups. To include such an extra group with the install, you can do pip3 install .[extra_deps_name].
If you wish to define specific dependency environments with exact versions (e.g. for Continuous Integration), you may use requirements.txt files in addition, but the general dependency and version constraint definitions should be done in setup.cfg or setup.py.
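A minimal sketch of such a pinned file, reusing the dependency names from the example above:
# requirements.txt -- exact pins for CI; loose constraints stay in setup.cfg
-e .
dep1==1.2
dep2==2.4.1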

pip and tox ignore full path dependencies, instead look for "best match" in pypi

This is an extension of SO setup.py ignores full path dependencies, instead looks for "best match" in pypi
I am trying to write setup.py to install a proprietary package from a .tar.gz file on an internal web site. Unfortunately for me, the prop package name duplicates a public package on the public PyPI, so I need to force installation of the proprietary package at a specific version. I'm building a Docker image from a Debian Buster base image, so pip, setuptools and tox are all freshly installed; the image ships Python 3.8, and pip upgrades itself to version 21.2.4.
Solution 1 - dependency_links
I followed the instructions in the post linked above and put the prop package in install_requires and dependency_links. Here are the relevant lines from my setup.py:
install_requires=["requests", "proppkg==70.1.0"],
dependency_links=["https://site.mycompany.com/path/to/proppkg-70.1.0.tar.gz#egg=proppkg-70.1.0"]
Installation is successful in Debian-Buster if I run python3 setup.py install in my package directory. I see the proprietary package get downloaded and installed.
Installation fails if I run pip3 install .; tox (version 3.24.4) fails similarly. In both cases, pip shows a message "Looking in indexes", then fails with "ERROR: Could not find a version that satisfies the requirement".
Solution 2 - PEP 508
Studying the SO answer pip ignores dependency_links in setup.py, which states that dependency_links is deprecated, I started over and revised setup.py to have:
install_requires=[
    "requests",
    "proppkg @ https://site.mycompany.com/path/to/proppkg-70.1.0.tar.gz#egg=proppkg-70.1.0"
],
Installation is successful in Debian-Buster if I run pip3 install . in my package directory. Pip shows a message "Looking in indexes" but still downloads and installs the proprietary package successfully.
Installation fails in Debian-Buster if I run python3 setup.py install in my package directory. I see these messages:
Searching for proppkg@ https://site.mycompany.com/path/to/proppkg-70.1.0.tar.gz#egg=proppkg-70.1.0
..
Reading https://pypi.org/simple/proppkg/
..
error: Could not find suitable distribution for Requirement.parse(...).
Tox also fails in this scenario as it installs dependencies.
Really speculating now, it almost seems like there's an ordering issue. Tox invokes pip like this:
python -m pip install --exists-action w .tox/.tmp/package/1/te-0.3.5.zip
In that output I see "Collecting proppkg@ https://site.mycompany.com/path/to/proppkg-70.1.0.tar.gz#egg=proppkg-70.1.0" as the first step. That install fails because it cannot import the requests package. Tox then continues collecting the other dependencies, and finally reports "Collecting requests" as its last step (which succeeds). Do I have to worry about the ordering of install steps?
I'm starting to think that maybe the proprietary package is broken. I verified that the prop package setup.py has requests in its install_requires entry. Not sure what else to check.
Workaround solution
My workaround is to install the proprietary package in the Docker image as a separate step before installing my own package, simply by running pip3 install https://site.mycompany.com/path/to/proppkg-70.1.0.tar.gz. The setup.py keeps the PEP 508 URL in install_requires. Then pip and tox find the prop package in the pip cache, and work fine.
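Condensed, the workaround amounts to two install steps (URL as in the question):
# Pre-install the proprietary package so later resolution finds it in the pip cache
pip3 install https://site.mycompany.com/path/to/proppkg-70.1.0.tar.gz
# Then install the local package, whose install_requires carries the PEP 508 URL
pip3 install .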
Please suggest what to try with the latest pip and tox, or let me know if this is as good as it gets. Thanks in advance.
Update - add setup.py
Here's a (slightly sanitized) version of my package's setup.py
from setuptools import setup, find_packages

def get_version():
    """
    read version string
    """
    version_globals = {}
    with open("te/version.py") as fp:
        exec(fp.read(), version_globals)
    return version_globals['__version__']

setup(
    name="te",
    version=get_version(),
    packages=find_packages(exclude=["tests.*", "tests"]),
    author="My Name",
    author_email="email@mycompany.com",
    description="My Back-End Server",
    entry_points={"console_scripts": [
        "te-be=te.server:main"
    ]},
    python_requires=">=3.7",
    install_requires=["connexion[swagger-ui]",
                      "Flask",
                      "gevent",
                      "redis",
                      "requests",
                      "proppkg @ https://site.mycompany.com/path/to/proppkg-70.1.0.tar.gz#egg=proppkg-70.1.0"
                      ],
    package_data={"te": ["openapi_te.yml"]},
    include_package_data=True,  # read MANIFEST.in
)

Python setup.py install - remove previous version while updating

I have a setup.py script for my package which I install using
python ./setup.py install
What seems to happen is that every time I increase the version, the old version is not removed from /usr/local/lib/python2.7/dist-packages, so I end up seeing multiple versions.
Is there a way to set this up in a way that when a person updates, the old version gets removed?
There is a similar (but not identical) question on SO that asks how to uninstall a package in setup.py, but I'm not really looking for uninstalling as a separate option. I am looking for a clean 'update' process that removes old versions before installing new ones.
The other option would be to cleanly remove the version number from the installed package name, in which case I suppose it would overwrite, but I haven't been successful in doing that. If I remove the version, the package is created with version "0.0", which looks weird.
My setup script:
import io
import os
import sys
from setuptools import setup

# Package meta-data.
NAME = 'my_package'
DESCRIPTION = 'My description'
URL = 'https://github.com/myurl'
EMAIL = 'myemail@gmail.com'
AUTHOR = 'Me'
VERSION = '3.1.12'

setup(name=NAME,
      version=VERSION,
      py_modules=['dir.mod1',
                  'dir.mod2',
                  ]
      )
If you want to remove previous versions of your package, you can use pip from the parent directory of your package. Let's assume your setup.py is in the directory my_package; then you can use:
pip install my_package --upgrade
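Equivalently, from inside the directory that contains setup.py:
# --upgrade uninstalls the previously installed version before installing
pip install --upgrade .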

How can I use git repos as dependencies for my PyPI package?

I have a package which I'm pushing to PyPI, and some of the dependencies are not packages but installable git repositories. My requirements.txt looks like this:
sphinx_bootstrap_theme>=0.6.5
matplotlib>=2.2.0
numpy>=1.15.0
sphinx>=1.7.5
sphinx-argparse>=0.2.2
tensorboardX
tqdm>=4.24.0
Cython>=0.28.5
# git repos
git+git://github.com/themightyoarfish/svcca-gpu.git
Accordingly, my setup.py has this content:
#!/usr/bin/env python
from distutils.core import setup
import setuptools
import os

with open('requirements.txt', mode='r') as f:
    requirements = f.read()
required_pkgs, required_repos = requirements.split('# git repos')
required_pkgs = required_pkgs.split()
required_repos = required_repos.split()

with open('README.md') as f:
    readme = f.read()

setup(name=...,
      ...,
      packages=setuptools.find_packages('.', include=[...]),
      install_requires=required_pkgs,
      dependency_links=required_repos,
      zip_safe=False,  # don't install egg, but source
      )
But running pip install <package> does not actually install the git dependency. I assume that pip doesn't actually use the setup script. It works when I run python setup.py install manually.
Edit:
I also tried removing dependency_links and just using install_requires with the repository, but when installing my repository from GitHub (the project including the above files), I'm met with
Complete output from command python setup.py egg_info:
error in ikkuna setup command: 'install_requires' must be a string or
list of strings containing valid project/version requirement specifiers; Invalid requirement, parse error at "'+git://g'"
It has been suggested in other answers that one can put something like
git+https://github.com/themightyoarfish/svcca-gpu.git#egg=svcca
into requirements.txt, but that fails with
error in <pkg> setup command: 'install_requires' must be a string or list of strings containing valid project/version requirement specifiers; Invalid requirement, parse error at "'+https:/'
Question: (How) Can I list git repositories as dependencies for a pip package?
Out of the 50 or so different ways to specify git dependencies for pip, the only one that did what I intended was this one (outlined in PEP 508):
svcca @ git+ssh://git@github.com/themightyoarfish/svcca-gpu
This can be used in install_requires, which solves the issue of dependency_links being ignored by pip.
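A minimal sketch of where the line goes (all metadata besides the requirement is placeholder):
from setuptools import setup

setup(
    name='my-package',  # placeholder
    install_requires=[
        'svcca @ git+ssh://git@github.com/themightyoarfish/svcca-gpu',
    ],
)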
An amusing side-effect is that the package cannot be uploaded to PyPi with such a dependency:
HTTPError: 400 Client Error: Invalid value for requires_dist. Error: Can't have direct dependency: 'svcca @ git+ssh://git@github.com/themightyoarfish/svcca-gpu' for url: https://upload.pypi.org/legacy/
According to the related post How to state in requirements.txt a direct github source, you can add a package from a remote git repository with the following syntax:
-e git://github.com/themightyoarfish/svcca-gpu.git
Reference:
Install a project in editable mode (i.e. setuptools "develop mode") from a local project path or a VCS URL with -e.

How to install data_files to absolute path?

I use pip with setuptools to install a package.
I want pip to copy some resource files to, say, /etc/my_package.
My setup.py looks like this:
setup(
    ...
    data_files=[('/etc/my_package', ['config.yml'])]
)
When running pip install, the file ends up in
~/.local/lib/python3.5/site-packages/etc/my_package/config.yml
instead of /etc/my_package.
What am I doing wrong?
(pip version 9.0.1)
Short answer: use pip install --no-binary :all: to install your package.
I struggled with this for a while and eventually figured out that there is some weirdness/inconsistency in how data_files are handled between binary wheels and source distributions. Specifically, there is a bug with wheels that makes all paths in data_files relative to the install location (see https://github.com/pypa/wheel/issues/92 for an issue tracking this).
"Thats fine", you might say, "but I'm not using a wheel!". Not so fast! It turns out recent versions of pip (I am working with 9.0.1) will try to compile a wheel even from a source distribution. For example, if you have a package my_package you can see this doing something like
$ python setup.py sdist # create source tarball as dist/my_package.tar.gz
[...]
$ pip install dist/my_package.tar.gz # install the generated source
[...]
Building wheels for collected packages: my_package
Running setup.py bdist_wheel for my_package ... done
pip tries to be helpful and builds a wheel to install from and to cache for later. This means you will run into the above bug even though, in theory, you are not using bdist_wheel yourself. You can get around this by running python setup.py install directly from the package source folder. This avoids the building and caching of wheels, but is majorly inconvenient when the package you want is already on PyPI somewhere. Fortunately, pip offers an option to explicitly disable binaries.
$ pip install --no-binary :all: my_package
[...]
Skipping bdist_wheel for my_package, due to binaries being disabled for it.
Installing collected packages: my_package
Running setup.py install for my_package ... done
Successfully installed my_package-0.1.0
Using the --no-binary option prevents wheel building and lets us reference absolute paths in our data_files again. If you are installing many packages together and want to selectively disable wheels, you can replace :all: with a comma-separated list of packages.
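For example, with hypothetical package names:
# Disable wheels for just these two packages rather than for everything
pip install --no-binary my_package,other_package my_package other_package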
It seems that data_files doesn't support absolute paths; pip prepends sys.prefix to "/etc/my_package". If you want to put config.yml into .../site-packages/my_package instead, try:
import os
import sys
from distutils.sysconfig import get_python_lib

# Compute the site-packages path relative to sys.prefix
relative_site_packages = get_python_lib().split(sys.prefix + os.sep)[1]
data_files_relative_path = os.path.join(relative_site_packages, "my_package")

setup(
    ...
    data_files=[(data_files_relative_path, ['config.yml'])]
)
I ended up writing an init() function that installs the config file on first run instead of creating it during the installation:
import os
from os import path
from shutil import copyfile

import pkg_resources

# config_dir and config_file are defined elsewhere in the module

def init():
    """Create the config directory and copy over the default config on first run."""
    try:
        if not path.isdir(config_dir):
            os.mkdir(config_dir)
        copyfile(pkg_resources.resource_filename(
            __name__, "default_config.yml"), config_file)
        print("INFO: config file created.")
    except IOError as ex:
        print("ERROR: could not create config directory: " + str(ex))

if __name__ == "__main__":
    init()
    main()
