Python pip rebuild requirements

Using a virtualenv with pip install and pip freeze is quite a nice way to work. All your requirements can be handled at the shell, and at a later date another developer can rebuild things:
pip install lib1
pip freeze > requirements.txt
# Do some development
# Oh I need this as well
pip install lib2
pip freeze > requirements.txt
# Another developer comes along
pip install -r requirements.txt
# They can carry on developing
However, if you then want to update your libraries, things are difficult, because the freeze captures all transitive dependencies rather than just the packages you use directly.
Is there a way to work like this but also update your libraries at a later date?
One approach is to use pip-tools and keep a source requirements file (pip-tools is used internally by pipenv), but this isn't very "shelly".

My workflow avoids pip freeze. It goes:
# Oh, I need lib1
echo "lib1~=1.0" >> requirements.txt
pip install -r requirements.txt
# Oh, I need lib2
echo "lib2~=3.0" >> requirements.txt
pip install -r requirements.txt
That way, requirements.txt contains only my direct dependencies, so hopefully it's much easier to maintain.
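The append-and-install step above could be wrapped in a small shell function; `pipadd` is a hypothetical helper name, not a standard tool, and it assumes a requirements.txt in the current directory:

```shell
# Append a new requirement and (re)install everything from the file.
# "pipadd" is a hypothetical name for this sketch.
pipadd() {
    echo "$1" >> requirements.txt
    pip install -r requirements.txt
}

# Usage:
# pipadd "lib1~=1.0"
# pipadd "lib2~=3.0"
```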

Related

Does `pip install -U pip -r requirements.txt` upgrade pip before installing the requirements?

It seems to be common practice to set up a Python virtual environment using some variant of the following:
python -m venv venv && source ./venv/bin/activate
python -m pip install -U pip -r requirements.txt
What I hope the above command does is:
Upgrade pip first
Run the installation of the packages in requirements.txt
However, what actually seems to happen is:
Collects all packages, including newest version of pip
Installs them all together
The original/outdated version of pip is what actually runs the installs
And the new version of pip is not used until after this command
Question(s)
Is it possible to have pip upgrade itself and then install a requirements file, in one command?
Would this confer any particular benefits?
Should I switch to the following?
python -m venv venv && source ./venv/bin/activate
python -m pip install -U pip
python -m pip install -r requirements.txt
What is the optimal method to install requirements files?
I see people sometimes installing/upgrading wheel and setuptools as well
The answers to your questions are:
No. pip doesn't currently treat itself as a special dependency, so it doesn't know to install then execute itself, which is what it would need to do to overcome the problems you observed.
Updating pip in a separate step is indeed the recommended way to proceed.
You may from time to time see pip issue a message advising that a newer version is available. This happens a lot with virtual environments created from a Python installation whose bundled pip is outdated.
I had a situation similar to yours: I needed to first upgrade pip and then install a bunch of libraries in a lab that had 20 PCs. What I did was write all the libraries' names in a requirements.txt file, then create a .bat file with two commands:
`python -m pip install --upgrade pip`
`pip install -r requirements.txt`
The first command upgrades pip and the second installs all the libraries listed in the requirements.txt file.

How to make pip run a specific command from requirements file?

I would like to add lines for torch and torchvision to my requirements.txt file, to allow for easy cloning, as I will be moving from computer to computer and to the cloud in the near future.
I want an easy pip install -r requirements.txt and be done with it for my project.
> pip freeze > requirements.txt
gives something like
...
torch==1.5.0
torchvision==0.6.0
...
However, pip install -r requirements.txt (which effectively runs pip install torch) doesn't work; instead, as the official torch site clearly states, the command should be:
pip install torch===1.5.0 torchvision===0.6.0 -f https://download.pytorch.org/whl/torch_stable.html
How do I make the requirements file reflect this?
My desktop is Windows 10.
Bonus question:
My cloud is Linux based.
How do I make the requirements file fit both desktop and cloud?
You can use the same options in your requirements.txt file, e.g.
torch===1.5.0
torchvision===0.6.0
-f https://download.pytorch.org/whl/torch_stable.html
Then simply run pip install -r requirements.txt
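For the bonus question, one file can often serve both the Windows desktop and the Linux cloud via PEP 508 environment markers in requirements.txt. A sketch only — the +cpu/+cu101 local version labels are illustrative placeholders for builds on PyTorch's wheel index, not verified pins:

```text
# CUDA build on the Windows desktop, CPU-only build on the Linux cloud.
torch===1.5.0+cu101; sys_platform == "win32"
torch===1.5.0+cpu; sys_platform == "linux"
torchvision===0.6.0; sys_platform == "win32" or sys_platform == "linux"
-f https://download.pytorch.org/whl/torch_stable.html
```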

pip force reinstall in requirements.txt

I have a requirements.txt file in which I have some git+ references. I would like to always reinstall these because, for some reason, even if I make changes, bump the version, and push to my GitHub repo, pip says the requirement is already satisfied and doesn't install.
Here is part of my requirements.txt file:
Django==1.10
git+https://github.com/myaccount/myrepo.git#master#egg=some_egg
I don't want to reinstall everything in the requirements.txt file. Only the git+ requirements.
I tried this:
git+https://github.com/myaccount/myrepo.git#master#egg=some_egg --install-option="--upgrade --ignore-installed --force-reinstall"
But none of the above options worked.
The problem is that you haven't told pip which version you have in git:
git+https://github.com/myaccount/myrepo.git#master#egg=some_egg
For VCS URLs, pip doesn't look into the repo to find out the version; it only looks at the URL:
git+https://github.com/myaccount/myrepo.git#master#egg=some_egg-version
For example:
git+https://github.com/myaccount/myrepo.git#master#egg=package-1.0.8
When you push a new version to GitHub, update your requirements.txt with the new version(s) and run pip install -r requirements.txt -U.
One option is probably to install the package in editable mode, like:
Django==1.10
-e git+https://github.com/myaccount/myrepo.git#master#egg=some_egg
Pip developers stated in 2017 that they don't want you to be able to force reinstall in requirements.txt, although I don't think they explained why.
I use this:
pip install -r requirements.txt
And you can use something more like:
pip install -r requirements.txt --no-index --find-links
--no-index - Ignore package index (only looking at --find-links URLs instead).
-f, --find-links <URL> - If a URL or path to an html file, then parse for links to archives

Upgrade python packages from requirements.txt using pip command

How do I upgrade all my python packages from requirements.txt file using pip command?
I tried the command below:
$ pip install --upgrade -r requirements.txt
Since the Python packages are pinned to a specific version (Django==1.5.1), they don't seem to upgrade. Is there a better approach than manually editing the requirements.txt file?
EDIT
As Andy mentioned in his answer, packages are pinned to a specific version, hence it is not possible to upgrade them through the pip command.
But we can achieve this with pip-review, using the following command:
$ pip-review --auto
This will automatically upgrade all packages from requirements.txt (make sure to install it first with pip install pip-review).
I already answered this question here. Here's my solution:
Because there was no easy way to upgrade package by package while also updating the requirements.txt file, I wrote pip-upgrader, which also updates the versions in your requirements.txt file for the chosen packages (or all packages).
Installation
pip install pip-upgrader
Usage
Activate your virtualenv (important, because it will also install the new versions of upgraded packages in current virtualenv).
cd into your project directory, then run:
pip-upgrade
Advanced usage
If the requirements are placed in a non-standard location, send them as arguments:
pip-upgrade path/to/requirements.txt
If you already know which packages you want to upgrade, simply send them as arguments:
pip-upgrade -p django -p celery -p dateutil
If you need to upgrade to pre-release / post-release version, add --prerelease argument to your command.
Full disclosure: I wrote this package.
you can try:
pip install --upgrade --force-reinstall -r requirements.txt
You can also ignore the installed packages and install new ones:
pip install --ignore-installed -r requirements.txt
No. Your requirements file has been pinned to specific versions. If your requirements are set to that version, you should not be trying to upgrade beyond those versions. If you need to upgrade, then you need to switch to unpinned versions in your requirements file.
Example:
lxml>=2.2.0
This would upgrade lxml to any version newer than 2.2.0
lxml>=2.2.0,<2.3.0
This would upgrade lxml to the most recent version between 2.2.0 and 2.3.0.
I suggest freezing all of your dependencies in order to have predictable builds.
When doing that, you can update all dependencies at once like this:
sed -i '' 's/[~=]=/>=/' requirements.txt
pip install -U -r requirements.txt
pip freeze | sed 's/==/~=/' > requirements.txt
Having done the above, test your project with the new set of packages and eventually commit the requirements.txt file to the repository while still allowing for installing hot-fixes.
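To illustrate what the two sed substitutions in the workflow above do on their own (the package names and versions are made up):

```shell
# Before upgrading: loosen '==' and '~=' pins to '>='
echo 'requests==2.22.0' | sed 's/[~=]=/>=/'   # -> requests>=2.22.0
echo 'flask~=1.1.0'     | sed 's/[~=]=/>=/'   # -> flask>=1.1.0

# After upgrading: re-pin the frozen versions as compatible releases
echo 'requests==2.23.0' | sed 's/==/~=/'      # -> requests~=2.23.0
```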
Another solution is to use the upgrade-requirements package
pip install upgrade-requirements
And then run :
upgrade-requirements
It will upgrade all the packages that are not at their latest versions, and also create an updated requirements.txt at the end.
Fixing dependencies to a specific version is the recommended practice.
Here's another solution using pur to keep the dependencies fresh!
Give pur your requirements.txt file and it will auto update all your high-level packages to the latest versions, keeping your original formatting and comments in-place.
For example, running pur on the example requirements.txt updates the packages to the currently available latest versions:
$ pur -r requirements.txt
Updated flask: 0.9 -> 0.10.1
Updated sqlalchemy: 0.9.10 -> 1.0.12
Updated alembic: 0.8.4 -> 0.8.6
All requirements up-to-date.
As pur never modifies your environment or installed packages, it's extremely fast and you can safely run it without fear of corrupting your local virtual environment. Pur separates updating your requirements.txt file from installing the updates. So you can use pur, then install the updates in separate steps.
I've just had to do the same... used this small one-liner to do the job:
packages=$(cat requirements.txt | sed 's/==.*//g'); echo $packages | xargs pip3 install -U; freeze=$(pip3 freeze); for p in $(echo $packages); do echo $freeze | grep -E "^${p}==" >> requirements.new; done
which:
packages=$(cat requirements.txt | sed 's/==.*//g') creates a list of the current packages names in requirements.txt (removing the version).
echo $packages | xargs pip3 install -U then passes all of the packages as arguments to pip3 to upgrade.
freeze=$(pip3 freeze); Gets all of the current package versions in the format required for requirements.txt
for p in $(echo $packages) then iterates through the package names
echo $freeze | grep -E "^${p}==" >> requirements.new gets the package version line from the pip freeze output which matches the package and writes to new requirements.txt
This has the added benefit of preserving the ordering of the original requirements.txt. :)
Hope this helps!
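The same upgrade-then-re-pin idea can be sketched in Python. The helpers below are hypothetical (not from any library); `repin` matches pip freeze output back onto the original package order, assuming the file uses plain `==` pins:

```python
import subprocess

def repin(names, frozen_lines):
    """Map upgraded versions from `pip freeze` back onto the original order."""
    pins = {l.split("==")[0].lower(): l for l in frozen_lines if "==" in l}
    return [pins[n.lower()] for n in names if n.lower() in pins]

def upgrade_requirements(path="requirements.txt", out="requirements.new"):
    # Read package names, dropping the '==version' suffix
    with open(path) as f:
        names = [line.split("==")[0].strip() for line in f if line.strip()]
    # Upgrade everything in one pip invocation
    subprocess.run(["pip", "install", "-U", *names], check=True)
    # Re-pin from pip freeze, preserving the original ordering
    frozen = subprocess.check_output(["pip", "freeze"], text=True).splitlines()
    with open(out, "w") as f:
        f.write("\n".join(repin(names, frozen)) + "\n")
```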
The second answer is the most useful but what I wanted to do is lock some packages while having others at the latest version (e.g. youtube-dl).
An example requirements.txt would look like this (~= means a compatible release):
Pillow==6.2.2
requests~=2.22.0
youtube_dl
Then in the terminal, use the command pip install --upgrade -r requirements.txt
This ensures that Pillow will stay at 6.2.2, requests will be upgraded to the latest 2.22.x (if available), and the latest version of youtube-dl will be installed if not already.
Since I couldn't do that using bash, I wrote a small Python script to create a new requirements file with no versions and use it:
with open('requirements-prod.pip') as src, open('requirements-prod-no-version.pip', 'w') as dst:
    for line in src:
        # Strip the pinned version; lines without '==' pass through unchanged
        dst.write(line.split('==')[0].strip() + '\n')
Then install the libs from the new file: pip install -U -r requirements-prod-no-version.pip
Finally, freeze the versions back into the original file: pip freeze > requirements-prod.pip
A more robust solution, IMO, is to use a dependency manager such as Poetry (https://python-poetry.org), which comes with an exhaustive dependency resolver.
I guess the simplest solution is creating the requirements.txt with:
pip freeze | sed 's/==/>=/' > requirements.txt
You can use the command below on Linux and Mac:
cat requirements.txt | cut -f1 -d= | xargs pip install -U
1) To upgrade pip-installed packages from reqs.txt, replace == with >=.
This tells pip to install a version greater than or equal to the one you request, i.e. the most up-to-date version of the requested library.
1.a) By adding py -m pip install -r reqs.txt to a daily restart, or something of that nature, you can keep your installed libs updated.
This was summed up by Andy perfectly.
My reason for entering this thread was to find out how to update the base pip of a virtual env (usually 10.0.03 for me). I hoped to address this either at (A) creation of the venv or (B) installation of the required libs.
Thanks to Andy, I satisfied need B by adding pip >= <requested version> to reqs.txt.
Then, upon instantiating a new virtual environment (or re-instantiating a previous one):
py -m venv devenv
to set up a new dev env
devenv\scripts\activate.bat
to activate the dev env
python -m pip install -r requirements.txt
to install the base libs
This yields the output:
Collecting pip >= 20.0.2 (from -r requirements.txt (line 1))
Using cached https://files.pythonhosted.org/packages/54/0c/d01aa759fdc501a58f431eb594a17495f15b88da142ce14b5845662c13f3/pip-20.0.2-py2.py3-none-any.whl
Found existing installation: pip 10.0.1
Uninstalling pip-10.0.1:
Successfully uninstalled pip-10.0.1
Successfully installed pip-20.0.2
Sorry for the brain dump, hope this helps someone :)
If you install anything in your Django project and afterwards want to update your requirements file, this command will update your requirements.txt file:
pip freeze > requirements.txt
If the requirements file does not exist in your project yet, you can use the same command to create a new requirements.txt file:
pip freeze > requirements.txt
With pip-tools you have a basic requirements.in with desired dependencies and a requirements.txt file with pinned versions. pip-tools then generates the pinned versions automatically, which makes handling the whole process including upgrading your dependencies a lot easier.
# requirements.in
django
and the autogenerated requirements.txt (to pin all dependencies)
$ pip-compile requirements.in
#
# This file is autogenerated by pip-compile
# To update, run:
#
# pip-compile requirements.in
#
asgiref==3.2.3
# via django
django==3.0.3
# via -r requirements.in
pytz==2019.3
# via django
sqlparse==0.3.0
# via django
If you use that workflow, which I can highly recommend, it's
pip-compile --upgrade
which generates the requirements.txt with the latest versions.
I edit the requirements.txt as below and run sh ./requirements.txt:
pip install -U amqp;
pip install -U appdirs;
pip install -U arrow;
pip install -U Babel;
pip install -U billiard;
pip install -U celery;
pip install -U Django;
pip install -U django-cors-headers;
pip install -U django-crispy-forms;
pip install -U django-filter;
pip install -U django-markdown-deux;
pip install -U django-pagedown;
pip install -U django-timezone-field;
pip install -U djangorestframework;
pip install -U fcm-django;
pip install -U flower;
pip install -U gunicorn;
pip install -U kombu;
pip install -U Markdown;
pip install -U markdown2;
pip install -U packaging;

How can I use a pip requirements file to uninstall as well as install packages?

I have a pip requirements file that changes during development.
Can pip be made to uninstall packages that do not appear in the requirements file as well as installing those that do appear? Is there a standard method?
This would allow the pip requirements file to be the canonical list of packages - an 'if and only if' approach.
Update: I suggested it as a new feature at https://github.com/pypa/pip/issues/716
This should uninstall anything not in requirements.txt:
pip freeze | grep -v -f requirements.txt - | grep -v '^#' | xargs pip uninstall -y
Although this won't work quite right with packages installed with -e, i.e. from a git repository or similar. To skip those, just filter out packages starting with the -e flag:
pip freeze | grep -v -f requirements.txt - | grep -v '^#' | grep -v '^-e ' | xargs pip uninstall -y
Then, obviously:
pip install -r requirements.txt
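A rough Python version of the same set difference can sidestep grep's pattern-matching quirks. These helpers are a sketch of my own (not part of pip); feed them the lines of pip freeze and requirements.txt, then pass the result to pip uninstall -y:

```python
import re

def parse_names(lines):
    """Bare, lowercased package names from requirement/freeze lines.

    Skips comments, blank lines, and option/editable (-e) lines.
    """
    names = set()
    for line in lines:
        line = line.strip()
        if not line or line.startswith(("#", "-")):
            continue
        # A name ends at the first specifier/extras/marker character
        m = re.match(r"[A-Za-z0-9._\-]+", line)
        if m:
            names.add(m.group(0).lower())
    return names

def extras(frozen, required):
    """Installed packages that requirements.txt does not mention."""
    return sorted(parse_names(frozen) - parse_names(required))
```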
Update for 2016:
You probably don't really want to actually use the above approach, though. Check out pip-tools and pip-sync which accomplish what you are probably looking to do in a much more robust way.
https://github.com/nvie/pip-tools
Update for May, 2016:
You can now also use pip uninstall -r requirements.txt, however this accomplishes basically the opposite - it uninstalls everything in requirements.txt
Update for May, 2019:
Check out pipenv or Poetry. A lot has happened in the world of package management that makes this sort of question a bit obsolete. I'm actually still quite happily using pip-tools, though.
You can now pass the -r requirements.txt argument to pip uninstall.
pip uninstall -r requirements.txt -y
At least as of pip 8.1.2, pip help uninstall shows:
...
Uninstall Options:
-r, --requirement <file> Uninstall all the packages listed in the given requirements file. This option can be
used multiple times.
...
It's not a feature of pip, no. If you really want such a thing, you could write a script to compare the output of pip freeze with your requirements.txt, but it would likely be more hassle than it's worth.
Using virtualenv, it is easier and more reliable to just create a clean environment and (re)install from requirements.txt, like:
deactivate
rm -rf venv/
virtualenv venv/
source venv/bin/activate
pip install -r requirements.txt
The short answer is no, you can't do that with pip.
Here's a simple solution that works:
pip uninstall $(pip freeze) -y
This is an old question (but a good one), and things have changed substantially since it was asked.
There's an offhand reference to pip-sync in another answer, but it deserves its own answer, because it solves precisely the OP's problem.
pip-sync takes a requirements.txt file as input, and "trues up" your current Python environment so that it matches exactly what's in that requirements.txt. This includes removing any packages that are present in your env but absent from requirements.txt.
Example: Suppose we want our env to contain (only) 3 libraries: libA, libB, and libC, like so:
> cat requirements.txt
libA==1.0
libB==1.1
libC==1.2
But our env currently contains libC and libD:
> pip freeze
libC==1.2
libD==1.3
Running pip-sync will result in this, which was our desired final state:
> pip-sync requirements.txt
> pip freeze
libA==1.0
libB==1.1
libC==1.2
Stephen's proposal is a nice idea, but unfortunately it doesn't work if you include only direct requirements in your file, which sounds cleaner to me. All dependencies will be uninstalled, including even distribute, breaking pip itself.
Maintaining a clean requirements file while version tracking a virtual environment
Here is how I try to version-track my virtual environment.
I try to maintain a minimal requirements.txt, including only the direct requirements, and not even mentioning version constraints where I'm not sure.
Besides that, I keep the actual status of my virtualenv in a venv.pip file, which I include in version tracking (say, git).
Here is a sample workflow:
setup virtualenv workspace, with version tracking:
mkdir /tmp/pip_uninstalling
cd /tmp/pip_uninstalling
virtualenv venv
. venv/bin/activate
initialize version tracking system:
git init
echo venv > .gitignore
pip freeze > venv.pip
git add .gitignore venv.pip
git commit -m "Python project with venv"
install a package with dependencies, include it in requirements file:
echo flask > requirements.txt
pip install -r requirements.txt
pip freeze > venv.pip
Now start building your app, then commit and start a new branch:
vim myapp.py
git commit -am "Simple flask application"
git checkout -b "experiments"
install an extra package:
echo flask-script >> requirements.txt
pip install -r requirements.txt
pip freeze > venv.pip
... play with it, and then come back to the earlier version
vim manage.py
git commit -am "Playing with flask-script"
git checkout master
Now uninstall extraneous packages:
pip freeze | grep -v -f venv.pip | xargs pip uninstall -y
I suppose the process can be automated with git hooks, but let's not go off topic.
Of course, it makes sense then to use some package caching system
or local repository like pip2pi
You can create a new file with all installed packages:
pip freeze > uninstall.txt
Then uninstall all of those:
pip uninstall -r uninstall.txt -y
Finally, re-install the packages you had in your original requirements.txt file:
pip install -r requirements.txt
Piggybacking off @stephen-j-fuhry, here is a PowerShell equivalent I use:
pip freeze | ? { $_ -notmatch ((gc req.txt) -join "|") }
While this doesn't directly answer the question, a better alternative to requirements.txt now is using a Pipfile. This functions similarly to a Ruby Gemfile. Currently, you need to make use of the pipenv tool but hopefully this will eventually be incorporated into pip. This provides the pipenv clean command which does what you want.
(Note that you can import an existing requirements.txt with pipenv install -r requirements.txt. After this you should have a Pipfile and the requirements.txt can be removed.)
It is possible now using:
pip uninstall -r requirements.txt
