Installing with pip, I can write the following requirements.txt file:
git+https://repo@branch#egg=foo&subdirectory=this/bar/foo
numpy
And successfully install the requirements file:
python3 -m pip install -r requirements.txt
However, I have co-located in the directory a setup.py script that lists:
setuptools.setup(
...
install_requires = get_lines('requirements.txt'),
...
)
And installing this submodule using pip involves pip running setup.py, which fails to handle the module link:
git+https://github.com/repo@branch#egg=foo&subdirectory=this/bar/foo
I can see a lot of ways around this, but it seems like there should be one non-ambiguous way to do this which changes as little as possible in the setup.py script.
Is there such a way?
You probably need to change the line in requirements.txt to something like:
foo @ git+https://repo@branch#egg=foo&subdirectory=this/bar/foo
References:
https://pip.pypa.io/en/stable/reference/pip_install/#requirement-specifiers
https://www.python.org/dev/peps/pep-0440/#direct-references
Although I am not entirely sure it will work. There might be subtle differences between the notations accepted in requirements.txt files, pip directly and setuptools. In particular I do not know how well things like egg and subdirectory are supported.
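For illustration, here is a minimal sketch of what that could look like inside setup.py itself, assuming the direct-reference notation is accepted by your setuptools version (the name 'myproject' is a hypothetical placeholder; the URL is the one from the question):

import setuptools

setuptools.setup(
    name='myproject',  # hypothetical placeholder
    install_requires=[
        'numpy',
        # PEP 440 direct reference to a non-PyPI dependency
        'foo @ git+https://repo@branch#egg=foo&subdirectory=this/bar/foo',
    ],
)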
Advice:
Avoid calling python setup.py install or python setup.py develop from now on, and make sure to call python -m pip install . or python -m pip install --editable . instead.
I do consider reading requirements.txt from within setup.py as a red flag (or at least a yellow one). The contents of setuptools' install_requires and requirements.txt usually serve different purposes: install_requires should list the package's abstract, loosely-pinned dependencies, while requirements.txt typically pins exact versions to reproduce a complete working environment.
According to pip documentation, it is possible to specify the hash of a requirement in the requirements.txt file.
Is it possible to get the same by specifying the hash in setup.py, so that the hash is checked when someone simply does pip install <package>?
I'm specifying the requirements in the setup.py by passing the install_requires keyword argument to the setup function in the distutils package.
from distutils.core import setup
from setuptools import find_packages
setup(name='<package-name>',
      ...
      ...
      install_requires=['ecdsa==0.13', 'base58==0.2.5'])
Maybe there is another way to achieve the same, but I couldn't find any documentation.
Currently, I don't believe there is a simple way to specify a hash check within setup.py. My workaround is to use virtualenv with hashed dependencies in requirements.txt. Once those are installed in the virtual environment, you can run python setup.py install; it will resolve against the local environment (your virtual environment), in which the installed packages are the hash-verified ones.
Inside requirements.txt your hashed packages will look something like this:
requests==2.19.1 \
--hash=sha256:63b52e3c866428a224f97cab011de738c36aec0185aa91cfacd418b5d58911d1 \
--hash=sha256:ec22d826a36ed72a7358ff3fe56cbd4ba69dd7a6718ffd450ff0e9df7a47ce6a
Activate your virtualenv and install requirements.txt file:
pip install -r requirements.txt --require-hashes
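If you need to produce those --hash lines yourself, one way (assuming the archives are available locally, e.g. fetched with pip download) is pip's built-in hash command, which prints a file's sha256 in the same --hash=sha256:... form used by requirements.txt; the exact wheel filename below may differ on your platform:

pip download requests==2.19.1 -d ./downloads
pip hash ./downloads/requests-2.19.1-py2.py3-none-any.whl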
According to the setuptools documentation, setuptools version 30.3.0 (December 8, 2016) "allows using configuration files (usually setup.cfg) to define package’s metadata and other options which are normally supplied to setup() function". Similar to running pip install -r requirements.txt to install Python packages from a requirements file, is there a way to ask pip to install the packages listed in the install_requires option of a setup.cfg configuration file?
If you have all your dependencies and other metadata defined in setup.cfg, just create a minimal setup.py file in the same directory that looks like this:
from setuptools import setup
setup()
From now on you can run pip install . and it will install all the dependencies defined in setup.cfg, as if they were declared in setup.py.
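For reference, a minimal setup.cfg carrying the dependencies might look like this (the package name and requirements are made-up placeholders):

[metadata]
name = mypackage
version = 0.1.0

[options]
install_requires =
    requests
    numpy>=1.15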
If your setup.cfg belongs to a well-formed package, you can do e.g.:
pip install -e .[tests,dev]
(install this package in place, with given extras)
Afterwards, you can pip uninstall that package by name, leaving the deps in place.
Here is my workaround. I use the following command to parse the install_requires element from the setup.cfg file and install the packages using pip.
python3 -c "import configparser; c = configparser.ConfigParser(); c.read('setup.cfg'); print(c['options']['install_requires'])" | xargs pip install
Here is a more readable version of the Python script before the pipe in the above command line.
import configparser
c = configparser.ConfigParser()
c.read('setup.cfg')
print(c['options']['install_requires'])
No, pip does not currently have facilities for parsing requirements from setup.cfg. It will only install dependencies along with the main package(s) provided in setup.py.
I'm building a virtualenv (system details follow) and numpy, scipy and pandas don't seem to be treated correctly as a dependency.
To clarify, this problem seems to exist regardless of whether numpy appears in the requirements.txt, even if they are placed in the correct order.
This is inconvenient, and the opposite of how a package manager is supposed to work, I think :)
So what gives? When I build the virtualenv from scratch, this is the output:
[bdundee@etl-dev Py26]$ ls
requirements.txt  requirements.txt~
[bdundee@etl-dev Py26]$ virtualenv ./env/sqrt_python26 --no-site-packages
New python executable in ./env/sqrt_python26/bin/python
Installing setuptools, pip...done.
[bdundee@etl-dev Py26]$ source ./env/sqrt_python26/bin/activate
(sqrt_python26)[bdundee@etl-dev Py26]$ pip install -r ./requirements.txt
Downloading/unpacking Bottleneck==0.8.0 (from -r ./requirements.txt (line 6))
...
import numpy as np
ImportError: No module named numpy
Clearly numpy should be treated as a dependency of Bottleneck and isn't. The same problem occurs with matplotlib.
Bottleneck is not the only module with this issue, there are a few others. This has forced me to create pre_pip.sh:
#!/usr/bin/bash
## Install numpy
pip install numpy==1.7.1
## Install scipy
pip install scipy==0.12.0
## Install pandas
pip install pandas==0.12.0
I'm also running into errors with scipy and pandas (for example, statsmodels).
The question(s):
Are these bugs in the setup instructions for these packages?
Is this a numpy-specific thing?
Is there a way to solve this without a "pre" build script that installs numpy, scipy and pandas?
System details:
AWS CentOS (whatever the current version is)
Python 2.6.9
numpy 1.7.1
pip seems to work as follows (feel free to correct me).
Each package is downloaded and unpacked.
Each package is built with python setup.py build.
Each package is installed with python setup.py install.
The problem is that the setup.py scripts of some modules import their dependencies (numpy, scipy, etc.) during the build or install step, and pip does not guarantee those are already present when everything is listed together in requirements.txt.
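(As an aside: this build-time dependency problem is what PEP 518 later addressed. A package can declare its build requirements in pyproject.toml, and pip installs them before running the build; a hypothetical declaration for a package that builds against numpy would look like this:)

[build-system]
requires = ["setuptools", "numpy"]
build-backend = "setuptools.build_meta"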
A similar issue exists for matplotlib, the pip community's sentiment is "it's not pip". Fair enough.
The best workaround, in my opinion, is to just write a wrapper. If anyone else knows any better ways, please let me know :)
#!/usr/bin/bash
INSTALL_DIR=$IMPORT/../Environment/Py26/env/sqrt_python26
## Step 1: build the virtualenv
virtualenv $INSTALL_DIR
## Now use the virtualenv
source $INSTALL_DIR/bin/activate
## Install numpy
pip install numpy==1.7.1
## Install scipy
pip install scipy==0.12.0
## Install pandas
pip install pandas==0.12.0
## Some others...
pip install patsy==0.2.1
pip install pycurl==7.19.0
## Now run requirements.txt
pip install -r ./requirements.txt
## finished, shut down virtualenv
deactivate
It seems to be an issue with a few different packages when using requirements.txt. You could use a script to parse each line and install it separately. I am sure there are more elegant ways to do it, but at least it installs in order, so you won't get the errors:
import subprocess
import sys

# pip.main was removed in pip 10; call pip as a subprocess instead.
with open("requirements.txt") as f:
    for line in f:
        if line.strip():
            subprocess.check_call([sys.executable, "-m", "pip", "install", line.strip()])
Is it possible to install packages using pip from the local filesystem?
I have run python setup.py sdist for my package, which has created the appropriate tar.gz file. This file is stored on my system at /srv/pkg/mypackage/mypackage-0.1.0.tar.gz.
Now in a virtual environment I would like to install packages either coming from pypi or from the specific local location /srv/pkg.
Is this possible?
PS
I know that I can specify pip install /srv/pkg/mypackage/mypackage-0.1.0.tar.gz. That will work, but I am talking about using the /srv/pkg location as another place for pip to search if I typed pip install mypackage.
What about:
pip install --help
...
-e, --editable <path/url> Install a project in editable mode (i.e. setuptools
"develop mode") from a local project path or a VCS url.
eg, pip install -e /srv/pkg
where /srv/pkg is the top-level directory where 'setup.py' can be found.
I am pretty sure that what you are looking for is the --find-links option.
You can do
pip install mypackage --no-index --find-links file:///srv/pkg/mypackage
From the installing-packages page you can simply run:
pip install /srv/pkg/mypackage
where /srv/pkg/mypackage is the directory containing setup.py.
Additionally, you can install it from the archive file:
pip install ./mypackage-1.0.4.tar.gz
(Although this is noted in the question, it is included here due to its popularity.)
I am installing pyfuzzy, but it is not in PyPI; it returns the message: No matching distribution found for pyfuzzy.
I tried the accepted answer
pip install --no-index --find-links=file:///Users/victor/Downloads/pyfuzzy-0.1.0 pyfuzzy
But it does not work either and returns the following error:
Ignoring indexes: https://pypi.python.org/simple
Collecting pyfuzzy
Could not find a version that satisfies the requirement pyfuzzy (from versions: )
No matching distribution found for pyfuzzy
At last, I found a simple, good way here: https://pip.pypa.io/en/latest/reference/pip_install.html
Install a particular source archive file.
$ pip install ./downloads/SomePackage-1.0.4.tar.gz
$ pip install http://my.package.repo/SomePackage-1.0.4.zip
So the following command worked for me:
pip install ../pyfuzzy-0.1.0.tar.gz
Hope it can help you.
This is the solution that I ended up using:
import subprocess
import sys

def install(package):
    # Debugging variant:
    # subprocess.check_call([sys.executable, "-m", "pip", "install", "--pre",
    #                        "--upgrade", "--no-index", "--find-links=.",
    #                        package, "--log-file", "log.txt", "-vv"])
    subprocess.check_call([sys.executable, "-m", "pip", "install", "--upgrade",
                           "--no-index", "--find-links=.", package])

if __name__ == "__main__":
    install("mypackagename")
    input("Press Enter to Exit...\n")
I pieced this together from pip install examples as well as from Rikard's answer on another question. The "--pre" argument lets you install non-production versions. The "--no-index" argument avoids searching the PyPI indexes. The "--find-links=." argument searches in the local folder (this can be relative or absolute). I used the "--log-file", "log.txt", and "-vv" arguments for debugging. The "--upgrade" argument lets you install newer versions over older ones.
I also found a good way to uninstall them. This is useful when you have several different Python environments. It's the same basic format, just using "uninstall" instead of "install", with a safety measure to prevent unintended uninstalls:
import subprocess
import sys

def uninstall(package):
    # Safety measure: confirm before uninstalling.
    response = input("Uninstall '%s'? [y/n]:\n" % package)
    if "y" in response.lower():
        # Debugging variant:
        # subprocess.check_call([sys.executable, "-m", "pip", "uninstall", package, "-vv"])
        subprocess.check_call([sys.executable, "-m", "pip", "uninstall", package])

if __name__ == "__main__":
    uninstall("mypackagename")
    input("Press Enter to Exit...\n")
The local folder contains these files: install.py, uninstall.py, mypackagename-1.0.zip
The --find-links option does the job, and it works from a requirements.txt file too!
You can put package archives in a folder and always pick up the latest one, without changing the requirements file; for example, with this layout:
.
├───requirements.txt
└───requirements
    ├───foo_bar-0.1.5-py2.py3-none-any.whl
    ├───foo_bar-0.1.6-py2.py3-none-any.whl
    ├───wiz_bang-0.7-py2.py3-none-any.whl
    ├───wiz_bang-0.8-py2.py3-none-any.whl
    ├───base.txt
    ├───local.txt
    └───production.txt
Now in requirements/base.txt put:
--find-links=requirements
foo_bar
wiz_bang>=0.8
This is a neat way to update proprietary packages: just drop a new archive in the folder.
This way you can install packages from a local folder AND PyPI with the same single call: pip install -r requirements/production.txt
PS. See my cookiecutter-djangopackage fork to see how to split requirements and use folder based requirements organization.
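For completeness, split requirements files like these are commonly chained together with -r includes; the following is an assumption about this particular layout, following the usual pattern (paths inside a requirements file are resolved relative to that file):

# requirements/production.txt
-r base.txt
# production-only requirements go below

# top-level requirements.txt
-r requirements/base.txt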
Assuming you have virtualenv and a requirements.txt file, then you can define inside this file where to get the packages:
# Published pypi packages
PyJWT==1.6.4
email_validator==1.0.3
# Remote GIT repo package, this will install as django-bootstrap-themes
git+https://github.com/marquicus/django-bootstrap-themes#egg=django-bootstrap-themes
# Local GIT repo package, this will install as django-knowledge
git+file:///soft/SANDBOX/python/django/forks/django-knowledge#egg=django-knowledge
To install only from local you need 2 options:
--find-links: where to look for dependencies. There is no need for the file:// prefix mentioned by others.
--no-index: do not look in pypi indexes for missing dependencies (dependencies not installed and not in the --find-links path).
So you could run from any folder the following:
pip install --no-index --find-links /srv/pkg /path/to/mypackage-0.1.0.tar.gz
If your mypackage is set up properly, it will list all its dependencies, and if you used pip download to fetch the cascade of dependencies (i.e. dependencies of dependencies, etc.), everything will work.
If you want to use the PyPI index when it is accessible, but fall back to local wheels when it is not, you can remove --no-index and add --retries 0. pip will pause for a bit while it tries to reach PyPI for a missing dependency (one not installed), and when it finds it cannot, it will fall back to local. There does not seem to be a way to tell pip to "look for local ones first, then the index".
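In other words, the fallback variant of the command above would look something like this (same assumed paths as before):

pip install --retries 0 --find-links /srv/pkg /path/to/mypackage-0.1.0.tar.gz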
Having requirements in requirements.txt and eggs_dir as a directory,
you can build your local cache:
$ pip download -r requirements.txt -d eggs_dir
then, using that "cache" is simple like:
$ pip install -r requirements.txt --find-links=eggs_dir
What you need is --find-links of pip install.
-f, --find-links If a url or path to an html file, then parse for links to archives. If a local path or
file:// url that's a directory, then look for archives in the directory listing.
In my case, after python -m build, a tar.gz package (and a whl file) are generated in the ./dist directory.
pip install --no-index -f ./dist YOUR_PACKAGE_NAME
Any tar.gz Python package in ./dist can be installed this way.
But if your package has dependencies, this command will raise an error.
To solve this, you can either pip install those deps from the official PyPI source and then add --no-deps, like this:
pip install --no-index --no-deps -f ./dist YOUR_PACKAGE_NAME
or copy your dependencies' packages to the ./dist directory.
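One way to populate ./dist with those dependency archives, assuming they are listed in a requirements.txt, is the pip download approach shown in an earlier answer:

pip download -r requirements.txt -d ./dist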
I've been trying to achieve something really simple and failed miserably; probably I'm stupid.
Anyway, if you have a script/Dockerfile which downloads a Python package zip file (e.g. from GitHub) and you then want to install it, you can use the file:// prefix, as shown in the following example:
$ wget https://example.com/mypackage.zip
$ echo "${MYPACKAGE_MD5} mypackage.zip" | md5sum --check -
$ pip install file://$PWD/mypackage.zip
NOTE: I know you could install the package straight away using pip install https://example.com/mypackage.zip, but in my case I wanted to verify the checksum (never paranoid enough), and I failed miserably when trying to use the various options pip provides, or the #md5 fragment.
It's been surprisingly frustrating to do something so simple directly with pip. I just wanted to pass a checksum and have pip verify that the zip was matching before installing it.
I was probably doing something very stupid but in the end I gave up and opted for this. I hope it helps others trying to do something similar.
In my case, it was because this library depended on another local library which I had not yet installed. Installing the dependency with pip first, and then the dependent library, solved the issue.
If you want to install one local package (package A) to be used inside another local project/package (B), this is quite simple. All you need is to cd into B and call:
pip install /path/to/package(A)
Of course, you will need to first install package A with:
sudo python3 ./setup.py install
And each time you change package A, just run setup.py in package A again, then pip install ... inside the using project/package B.
Just add the directory to the pip command:
pip install file:///location/in/disk/mypackagename.filetype