I would like to install a set of packages from requirements.txt. pip seems to be incredibly slow in its default operation (1-5 kbps):
pip install -r requirements.txt
The following command was not much help either; the download was still slow:
pip install --download DIR -r requirements.txt
The objective now is to download those packages from the same links that pip would use, but fetch them with a download accelerator (like axel, which achieves 500-700 kbps) into a directory DIR. Then I would be able to install them locally using the command:
pip install --no-index --find-links=DIR -r requirements.txt
How could I do this?
Specs: Pip-6.0.6, Python-2.7, Mac OS X 10.9
PS: All this to install Odoo (formerly OpenERP).
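A note for anyone reading later: pip 8 and newer replace the `--download` flag with a dedicated `pip download` subcommand, so the fetch-then-install-offline workflow (DIR being any local directory you choose) looks roughly like this:

```shell
# Fetch the packages (and their dependencies) into DIR without installing them;
# on pip 8+ this replaces the old `pip install --download DIR` form
pip download -r requirements.txt -d DIR

# Later, install entirely from DIR without touching the network
pip install --no-index --find-links=DIR -r requirements.txt
```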
I have a situation where I need to download some Python packages on an internet-connected asset, then install them on a disconnected asset. Let's say I go through the following steps:
Create a requirements file, requirements.txt
Download packages to a local folder: py -m pip download -r "requirements.txt" -d "pkg_dir"
Move requirements.txt and pkg_dir to the disconnected asset
Install the packages from pkg_dir: here is my question
Is this command: py -m pip install -r "requirements.txt" --no-index --find-links "pkg_dir"
equivalent to this command: py -m pip install -r "requirements.txt" --index-url "pkg_dir"
The pip documentation states that the --index-url is the
Base URL of the Python Package Index (default https://pypi.org/simple). This should point to a repository compliant with PEP 503 (the simple repository API) or a local directory laid out in the same format.
(emphasis mine)
I'm just curious if these commands behave the same way when the "index URL" is a local folder. I know that the typical --no-index --find-links variant works as expected; this is just something I was wondering about.
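As a point of comparison, the two flags expect differently shaped directories: `--find-links` accepts a flat folder of distribution files, while `--index-url` expects the PEP 503 layout with one subdirectory per normalized project name. A sketch with an illustrative package name:

```shell
# Flat layout: works with --no-index --find-links (filenames illustrative)
mkdir -p pkg_dir_flat
touch pkg_dir_flat/example_pkg-1.0-py3-none-any.whl

# PEP 503 layout: what --index-url expects, i.e. one subdirectory per
# normalized project name, containing that project's files
mkdir -p pkg_dir_index/example-pkg
touch pkg_dir_index/example-pkg/example_pkg-1.0-py3-none-any.whl
```

So a flat pkg_dir produced by `pip download` would not, as-is, be usable as an `--index-url`; the two commands are not interchangeable on the same folder.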
I am working in an offline Linux environment (RedHat 7.6). Until today I've installed files with pip by giving the full path, and that works great (it still does). Now, for automated testing, I want to create a virtual environment and pip install from a requirements file. The problem is that pip keeps searching the web, even though I've used --prefix and tried --target; I can't get it to install from a given folder, it always tries to search the web.
requirements file:
numpy==1.16.4
folder:
/custom_dev/install/
inside the folder:
numpy-1.16.4-cp37-cp37m-manylinux1_x86_64.whl
tried:
pip3 install -r requirements.txt --target=/custom_dev/install/
pip3 install -r requirements.txt --prefix=/custom_dev/install/
I've tried other suggestions from Stack Overflow as well, but I've yet to find a solution to my problem, or a thread with the same one. Suggestions?
ty!
Our pip-local script does this:
c:\srv\bin> cat pip-local.bat
@echo off
rem pip install with `--upgrade --no-deps --no-index --find-links=file:///%SRV%/wheelhouse`
pip %* --upgrade --no-deps --no-index --find-links=file:///%SRV%/wheelhouse
the linux version uses $* instead of %* and $SRV instead of %SRV%:
pip $* --upgrade --no-deps --no-index --find-links=file:///${SRV}/wheelhouse
You can remove the --no-deps if you want dependencies to be resolved as well (although, because of --no-index, the install will fail rather than search the web if the wheelhouse lacks a wheel satisfying a dependency).
The companion tool is getwheel
c:\srv\bin> cat getwheel.bat
@echo off
rem
rem Download wheel file for package (getwheel foo==1.4.1)
rem
pip wheel --wheel-dir=%SRV%\wheelhouse %*
linux version:
pip wheel --wheel-dir=${SRV}/wheelhouse $*
which is used like:
getwheel numpy==1.16.4
or
getwheel -r requirements.txt
which causes the wheels of the package and its dependencies to be placed in the wheelhouse folder.
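Put together, the round trip behind the two scripts is roughly (assuming $SRV points at your tools directory):

```shell
# On a machine with network access: collect wheels for everything
# in the requirements file into the shared wheelhouse
pip wheel --wheel-dir="$SRV/wheelhouse" -r requirements.txt

# Later, install strictly from the wheelhouse, never touching the network
pip install --no-index --find-links="file://$SRV/wheelhouse" -r requirements.txt
```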
pip3 install -r requirements.txt --find-links=/custom_dev/install/ --no-index
The key option that prevents pip from connecting to PyPI over the network is --no-index.
I am a Mac user and I'm used to running pip install with --user, but recently, after a brew update, I noticed some strange behavior that may be related.
Whatever I try, the packages are always installed to ~/Library/Python/2.7/lib/python/site-packages
Here are the commands I run.
$ python -m site --user-site
~/Library/Python/2.7/lib/python/site-packages
$ pip install --user -r requirements.txt
$ PYTHONUSERBASE=. pip install --user -r requirements.txt
So what could be the problem?
I used this for Lambda zip packaging.
Updates:
If you are using Mac OS X and have Python installed via Homebrew, the accepted answer's command will not work. A simple workaround is to add a setup.cfg file in your /path/to/project-dir with the following content:
[install]
prefix=
https://docs.aws.amazon.com/lambda/latest/dg/lambda-python-how-to-create-deployment-package.html
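For reference, the workaround file can be created from the shell like this (run in the project directory; the empty prefix is what stops Homebrew's Python from redirecting the install):

```shell
# Write the workaround config into the current project directory;
# the empty `prefix=` neutralizes Homebrew's distutils prefix setting
cat > setup.cfg <<'EOF'
[install]
prefix=
EOF
```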
You can use the --target (-t) flag of pip install to specify a target location for the installation.
In use:
pip install -r requirements.txt -t /path/to/directory
or, to install into the current directory:
pip install -r requirements.txt -t .
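For the Lambda packaging use case mentioned above, a common pattern (file and folder names here are illustrative, not from the original answer) is to vendor the dependencies into a folder with -t and then zip them together with the handler:

```shell
# Install the dependencies into a local folder instead of site-packages
pip install -r requirements.txt -t ./package

# Zip the vendored dependencies, then add the handler file on top
(cd package && zip -r ../deployment.zip .)
zip -g deployment.zip lambda_function.py
```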
I have a requirements.txt file with the list of Python packages to install. One of the packages is psycopg2==2.6.2, and I need to update it to psycopg2==2.7. I tried installing with pip3 install psycopg2, but it doesn't affect the requirements.txt file. Can you please point me in the right direction?
Note that running pip3 install psycopg2 doesn't update the requirements.txt file, and it won't upgrade a package that is already installed. To upgrade the package you need the -U option:
pip3 install -U psycopg2
which is a shorthand for:
pip3 install --upgrade psycopg2
After that, you can update your requirements.txt with the following command:
pip freeze > requirements.txt
If you're looking for a solution to automatically update the requirements.txt file after you upgrade package/packages, you can use pip-upgrader.
Installation:
pip install pip-upgrader
Usage:
pip-upgrade
The above command auto-discovers the requirements file and prompts for selecting upgrades. You can also specify a path to the requirements file or/and specify a package to upgrade:
pip-upgrade /path/to/requirements.txt -p psycopg2
As you've discovered, pip doesn't update the requirements file. So the workflow you'd likely want to use is:
Update the version of psycopg2 in your requirements file from 2.6.2 to 2.7
Run pip install with the upgrade flag
pip3 install -U -r requirements.txt
If you're familiar with tools like npm that do update the version in the catalog file, you may be interested in using pipenv, which manages your dependencies and the virtual environment for you, much like npm does.
If you don't know the latest version of your package, then use pip to figure it out:
$ pip list --outdated | grep psycopg2
psycopg2 (2.7.3.2) - Latest: 2.7.4 [wheel]
You can try:
pip install --upgrade --force-reinstall -r requirements.txt
You can also ignore the already-installed packages and install new ones:
pip install --ignore-installed -r requirements.txt
I have Python 2.7 installed through Homebrew and am running pip install -r requirements.txt on a project's requirements file. The packages download and everything goes fine until it's time to link the binaries; then pip tries to put the binaries for f2py (a dependency of a package in the requirements.txt file) into /bin, and I'm left with this error:
IOError: [Errno 1] Operation not permitted: '/bin/f2py'
I don't have root access, so I'd like pip to put all binaries in /usr/local/bin instead. How do I tell pip to install binaries into that directory?
I'd create a virtualenv (install it with pip first), then use the virtualenv to install all your requirements. That way you both have a writable path and keep your global Python installation clean for other projects.
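A sketch of that approach (paths illustrative):

```shell
# Install virtualenv into a user-writable location, then create an env;
# inside the env, pip installs into the env's own bin/ and site-packages,
# so no root access is needed
pip install --user virtualenv
python -m virtualenv ~/venvs/myproject
source ~/venvs/myproject/bin/activate
pip install -r requirements.txt
```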
Alternatively, you could use the --user option to install into the site.USER_SITE location:
pip install --user virtualenv
or
pip install --user -r requirements.txt
See the User Installs section in the documentation.
If you downloaded the package you can do
python setup.py install --user
pip now supports this behaviour by passing --user through to setup.py:
pip install --user somepackage