Upgrade Python packages from requirements.txt using pip
How do I upgrade all my Python packages from a requirements.txt file using pip?
I tried the following command:
$ pip install --upgrade -r requirements.txt
Since the Python packages are suffixed with a pinned version number (Django==1.5.1), they don't seem to upgrade. Is there a better approach than manually editing the requirements.txt file?
EDIT
As Andy mentioned in his answer, the packages are pinned to specific versions, hence it is not possible to upgrade them through the pip command alone.
But, we can achieve this with pip-tools using the following command.
$ pip-review --auto
This will automatically upgrade all packages from requirements.txt (make sure to install pip-tools first, using pip install pip-tools).
I already answered this question here. Here's my solution:
Because there was no easy way to upgrade package by package while also updating the requirements.txt file, I wrote pip-upgrader, which also updates the versions in your requirements.txt file for the packages chosen (or all packages).
Installation
pip install pip-upgrader
Usage
Activate your virtualenv (important, because the tool will also install the new versions of the upgraded packages into the current virtualenv).
cd into your project directory, then run:
pip-upgrade
Advanced usage
If the requirements are placed in a non-standard location, send them as arguments:
pip-upgrade path/to/requirements.txt
If you already know which packages you want to upgrade, simply send them as arguments:
pip-upgrade -p django -p celery -p dateutil
If you need to upgrade to a pre-release / post-release version, add the --prerelease argument to your command.
Full disclosure: I wrote this package.
You can try:
pip install --upgrade --force-reinstall -r requirements.txt
You can also ignore the installed packages and install new ones:
pip install --ignore-installed -r requirements.txt
No. Your requirements file has been pinned to specific versions. If your requirements are set to that version, you should not be trying to upgrade beyond those versions. If you need to upgrade, then you need to switch to unpinned versions in your requirements file.
Example:
lxml>=2.2.0
This would upgrade lxml to version 2.2.0 or any newer version.
lxml>=2.2.0,<2.3.0
This would upgrade lxml to the most recent version that is at least 2.2.0 but below 2.3.0.
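To make the effect of these specifiers concrete, here is a minimal Python sketch of how a range constraint accepts or rejects candidate versions. It only handles plain dotted numeric versions (real pip implements the full PEP 440 rules, including pre-releases and epochs), and the function names are just for illustration:

```python
# A simplified model of pip version specifiers such as "lxml>=2.2.0,<2.3.0".
# Only plain dotted numeric versions are handled here.

def parse(version):
    """Turn '2.2.0' into a comparable tuple: (2, 2, 0)."""
    return tuple(int(part) for part in version.split("."))

def satisfies(candidate, lower, upper=None):
    """True if lower <= candidate, and candidate < upper when an upper bound is given."""
    v = parse(candidate)
    if v < parse(lower):
        return False
    if upper is not None and v >= parse(upper):
        return False
    return True

# "lxml>=2.2.0" accepts 2.2.0 itself and anything newer:
print(satisfies("2.2.0", "2.2.0"))           # True
print(satisfies("3.0.0", "2.2.0"))           # True

# "lxml>=2.2.0,<2.3.0" additionally rejects 2.3.0 and above:
print(satisfies("2.2.5", "2.2.0", "2.3.0"))  # True
print(satisfies("2.3.0", "2.2.0", "2.3.0"))  # False
```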
I suggest freezing all of your dependencies in order to have predictable builds.
When doing that, you can update all dependencies at once like this:
sed -i '' 's/[~=]=/>=/' requirements.txt
pip install -U -r requirements.txt
pip freeze | sed 's/==/~=/' > requirements.txt
Having done the above, test your project with the new set of packages, and finally commit the requirements.txt file to the repository, while still allowing hot-fixes to be installed.
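Note that the sed call above uses the BSD (macOS) -i '' syntax; on GNU sed it would be sed -i 's/[~=]=/>=/' requirements.txt. As a portable alternative, the pin-relaxing step can be sketched in Python (the helper name is just for illustration):

```python
import re

def relax_pins(text):
    """Rewrite '==' and '~=' pins to '>=' so pip may upgrade past them."""
    return re.sub(r"[~=]=", ">=", text)

pinned = "Django==1.5.1\nlxml~=2.2.0\n"
print(relax_pins(pinned))
# Django>=1.5.1
# lxml>=2.2.0
```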
Another solution is to use the upgrade-requirements package:
pip install upgrade-requirements
And then run:
upgrade-requirements
It will upgrade all the packages that are not at their latest versions, and also create an updated requirements.txt at the end.
Fixing dependencies to a specific version is the recommended practice.
Here's another solution using pur to keep the dependencies fresh!
Give pur your requirements.txt file and it will auto update all your high-level packages to the latest versions, keeping your original formatting and comments in-place.
For example, running pur on the example requirements.txt updates the packages to the currently available latest versions:
$ pur -r requirements.txt
Updated flask: 0.9 -> 0.10.1
Updated sqlalchemy: 0.9.10 -> 1.0.12
Updated alembic: 0.8.4 -> 0.8.6
All requirements up-to-date.
As pur never modifies your environment or installed packages, it's extremely fast and you can safely run it without fear of corrupting your local virtual environment. Pur separates updating your requirements.txt file from installing the updates. So you can use pur, then install the updates in separate steps.
I've just had to do the same... used this small one-liner to do the job:
packages=$(cat requirements.txt | sed 's/==.*//g'); echo $packages | xargs pip3 install -U; freeze=$(pip3 freeze); for p in $(echo $packages); do echo $freeze | grep -E "^${p}==" >> requirements.new; done
which:
packages=$(cat requirements.txt | sed 's/==.*//g') creates a list of the current package names in requirements.txt (removing the versions).
echo $packages | xargs pip3 install -U then passes all of the packages as arguments to pip3 to upgrade.
freeze=$(pip3 freeze) gets all of the current package versions in the format required for requirements.txt.
for p in $(echo $packages) then iterates through the package names
echo $freeze | grep -E "^${p}==" >> requirements.new takes the matching package's pinned line from the pip freeze output and appends it to the new requirements.new file.
This has the added benefit of preserving the ordering of the original requirements.txt. :)
Hope this helps!
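For readers who prefer Python over shell, the merge step of that one-liner (keeping the original ordering while picking up the frozen versions) can be sketched roughly like this; freeze_output stands in for whatever pip3 freeze returned, and the function name is just illustrative:

```python
def updated_requirements(old_requirements, freeze_output):
    """Rebuild a requirements list in the original order, taking the
    version of each package from a `pip freeze`-style listing."""
    # Map package name -> pinned line from the freeze output.
    frozen = {}
    for line in freeze_output.splitlines():
        if "==" in line:
            name, _, _ = line.partition("==")
            frozen[name.lower()] = line
    # Keep the old file's ordering, but swap in the new pins.
    result = []
    for line in old_requirements.splitlines():
        name = line.partition("==")[0].strip()
        if name.lower() in frozen:
            result.append(frozen[name.lower()])
    return result

old = "requests==2.22.0\nflask==0.9\n"
freeze = "flask==1.1.2\nitsdangerous==1.1.0\nrequests==2.25.1\n"
print(updated_requirements(old, freeze))
# ['requests==2.25.1', 'flask==1.1.2']
```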
The second answer is the most useful, but what I wanted to do was lock some packages while keeping others at their latest version (e.g. youtube-dl).
An example requirements.txt would look like this (~= means a compatible release):
Pillow==6.2.2
requests~=2.22.0
youtube_dl
Then in the terminal, use the command pip install --upgrade -r requirements.txt
This ensures that Pillow will stay at 6.2.2, requests will be upgraded to the latest 2.22.x release (if available), and the latest version of youtube-dl will be installed if it isn't already.
Since I couldn't do that using bash, I wrote a python module to create a new requirements file with no versions and use it:
with open('requirements-prod.pip') as src, \
     open('requirements-prod-no-version.pip', 'w') as dst:
    for line in src:
        # Keep only the package name; lines without '==' pass through unchanged.
        dst.write(line.split('==')[0].rstrip() + '\n')
Then install the libs from the new file pip install -U -r requirements-prod-no-version.pip
Finally freeze the versions to the original file pip freeze > requirements-prod.pip
A more robust solution, in my opinion, is to use a dependency manager such as Poetry (https://python-poetry.org), which comes with an exhaustive dependency resolver.
I guess the simplest solution is creating the requirements.txt with:
pip freeze | sed 's/==/>=/' > requirements.txt
You can use the following command on Linux and macOS:
cat requirements.txt | cut -f1 -d= | xargs pip install -U
1) To upgrade pip-installed packages from reqs.txt, replace the == with >=.
This tells pip to install a version of the lib greater than or equal to the one you are requesting, thereby installing the most up-to-date version of the requested library.
1.a) By adding py -m pip install -r reqs.txt to a daily restart (or something of that nature), you can keep your installed libs up to date.
Summed up by Andy perfectly.
My reason for entering this thread was to find information on how to update the virtualenv's base pip (usually 10.0.03 for me?), in hopes of resolving an issue, for which I was able to derive one of two solutions:
A. creation of a new venv, or B. installation of the required libs.
Thanks to Andy, I satisfied need B by adding pip >= <requested version> to reqs.txt.
Upon instantiation of a new virtual environment (or re-instantiation of a previous venv):
py -m venv devenv
to set up the new dev env,
devenv\scripts\activate.bat
to activate the dev env,
python -m pip install -r requirements.txt
to install the base libs. This yields the output:
Collecting pip >= 20.0.2 (from -r requirements.txt (line 1))
  Using cached https://files.pythonhosted.org/packages/54/0c/d01aa759fdc501a58f431eb594a17495f15b88da142ce14b5845662c13f3/pip-20.0.2-py2.py3-none-any.whl
Found existing installation: pip 10.0.1
Uninstalling pip-10.0.1:
  Successfully uninstalled pip-10.0.1
Successfully installed pip-20.0.2
Sorry for the brain dump; hope this helps someone :)
If you install anything in your Django project and, after installation, you want to update your requirements file, this command updates the requirements.txt file:
pip freeze > requirements.txt
If the requirements file does not yet exist in your project, the same command creates a new requirements.txt file.
With pip-tools you have a basic requirements.in with the desired dependencies and a requirements.txt file with pinned versions. pip-tools then generates the pinned versions automatically, which makes handling the whole process, including upgrading your dependencies, a lot easier.
# requirements.in
django
and the autogenerated requirements.txt (pinning all dependencies):
$ pip-compile requirements.in
#
# This file is autogenerated by pip-compile
# To update, run:
#
# pip-compile requirements.in
#
asgiref==3.2.3
# via django
django==3.0.3
# via -r requirements.in
pytz==2019.3
# via django
sqlparse==0.3.0
# via django
If you use that workflow, which I can highly recommend, upgrading all dependencies is a single command:
pip-compile --upgrade
which regenerates requirements.txt with the latest versions.
I edit the requirements.txt as below and run $ sh ./requirements.txt:
pip install -U amqp;
pip install -U appdirs;
pip install -U arrow;
pip install -U Babel;
pip install -U billiard;
pip install -U celery;
pip install -U Django;
pip install -U django-cors-headers;
pip install -U django-crispy-forms;
pip install -U django-filter;
pip install -U django-markdown-deux;
pip install -U django-pagedown;
pip install -U django-timezone-field;
pip install -U djangorestframework;
pip install -U fcm-django;
pip install -U flower;
pip install -U gunicorn;
pip install -U kombu;
pip install -U Markdown;
pip install -U markdown2;
pip install -U packaging;
Related
Does `pip install -U pip -r requirements.txt` upgrade pip before installing the requirements?
It seems to be common practice to set up a Python virtual environment using some variant of the following:
python -m venv venv && source ./venv/bin/activate
python -m pip install -U pip -r requirements.txt
What I hope the above command does is:
1. Upgrade pip first
2. Run the installation of the packages in requirements.txt
However, what actually seems to happen is:
1. It collects all packages, including the newest version of pip
2. It installs them all together
3. The original/outdated version of pip is what actually runs the installs
4. The new version of pip is not used until after this command
Question(s):
- Is it possible to have pip upgrade itself and then install a requirements file, in one command? Would this confer any particular benefits?
- Should I switch to the following?
python -m venv venv && source ./venv/bin/activate
python -m pip install -U pip
python -m pip install -r requirements.txt
- What is the optimal method to install requirements files? I see people sometimes installing/upgrading wheel and setuptools as well.
The answers to your questions are:
1. No. pip doesn't currently treat itself as a special dependency, so it doesn't know to install and then re-execute itself, which is what it would need to do to overcome the problems you observed.
2. Updating pip in a separate step is indeed the recommended way to proceed. You may from time to time see pip issue a message advising that a newer version is available; this happens a lot with virtual environments created from a Python whose bundled pip is outdated.
I had a situation similar to yours: I needed to first upgrade pip and then install a bunch of libraries in a lab that had 20 PCs. What I did was write all the library names in a requirements.txt file, then create a .bat file with two commands:
python -m pip install --upgrade pip
pip install -r requirements.txt
The first command upgrades pip and the second one installs all the libraries listed in the requirements.txt file.
Python update package version in requirements.txt
I have a requirements.txt file with the list of Python packages to install. One of the packages is psycopg2==2.6.2, and I need to update it to psycopg2==2.7. I tried installing with pip3 install psycopg2, but it doesn't affect the requirements.txt file. Can you please point me in the right direction?
Notice that running pip3 install psycopg2 doesn't respect the requirements.txt file. To upgrade this package you need to use the -U option:
pip3 install -U psycopg2
which is shorthand for:
pip3 install --upgrade psycopg2
After that, you can update your requirements.txt with the following command:
pip freeze > requirements.txt
If you're looking for a solution that automatically updates the requirements.txt file after you upgrade a package or packages, you can use pip-upgrader.
Installation:
pip install pip-upgrader
Usage:
pip-upgrade
The above command auto-discovers the requirements file and prompts for selecting upgrades. You can also specify a path to the requirements file and/or specify a package to upgrade:
pip-upgrade /path/to/requirements.txt -p psycopg2
As you've discovered, pip doesn't update the requirements file. So the workflow you'd likely want to use is:
1. Update the version of psycopg2 in your requirements file from 2.6.2 to 2.7
2. Run pip install with the upgrade flag:
pip3 install -U -r requirements.txt
If you're familiar with tools like npm that do update the version in the catalog file, you may be interested in using pipenv, which manages your dependencies and the virtual environment for you, much like npm does.
If you don't know the latest version of your package, then use pip to figure it out:
$ pip list --outdated | grep psycopg2
psycopg2 (2.7.3.2) - Latest: 2.7.4 [wheel]
You can try:
pip install --upgrade --force-reinstall -r requirements.txt
You can also ignore the installed packages and install new ones:
pip install --ignore-installed -r requirements.txt
How can I revert changes done to python packages?
I'm debugging an issue on staging, and I've added a bunch of logging statements to a third-party package. Once I'm done with that, I'd like to get the package back to its original state. In Ruby, I could do gem pristine lib_name and that would restore the lib to its original source code. It might be relevant to mention that I'm modifying code that was installed with sudo pip install some_pkg. What's the usual way of reverting any changes done to a lib?
On Linux, just type the following command in a terminal (with pip, pip2, or pip3, according to the Python version you're targeting):
sudo -H pip install --upgrade --force-reinstall some_pkg
On Windows, open an admin terminal and run the same command:
pip install --upgrade --force-reinstall some_pkg
Try this: pip install -r requirements.txt --force-reinstall --upgrade
Suppose you have a requirements.txt file with the packages you want to revert back to:
# remove all currently installed packages
pip freeze > remove.txt
pip uninstall -r remove.txt
# re-install all packages
pip install -r requirements.txt
This ensures that you also remove all unwanted packages.
pip - installation of sub-dependencies overrides other packages on requirements.txt
My requirements file is like this:
https://github.com/sontek/pyramid_webassets/archive/38b0b9f9f4e36dc22b3a5c10eabf4d9228d97740.zip#egg=pyramid_webassets-0.0
https://github.com/miracle2k/webassets/archive/334d55c6bcfd091cb2d984777daf943acde0d364.zip#egg=webassets-0.8.dev
When running pip install -r requirements.txt, I want it to install the specific version of pyramid_webassets, and then the specific webassets version (0.8.dev). The problem is that pyramid_webassets has webassets as a sub-dependency, and pip installs the latest version of that package. So the output of pip freeze is:
Chameleon==2.14
Mako==0.9.1
MarkupSafe==0.18
PasteDeploy==1.5.2
WebOb==1.3.1
argparse==1.2.1
pyramid==1.4.5
pyramid-webassets==0.0
repoze.lru==0.6
translationstring==1.1
venusian==1.0a8
webassets==0.9
wsgiref==0.1.2
zope.deprecation==4.1.0
zope.interface==4.0.5
You might notice that the webassets version is the latest (0.9), even though I specified the version I want (0.8.dev). I tried reordering the list and adding the --upgrade flag; nothing helped. Any idea how I can install it and still have the required version of webassets? Thanks.
Solution: I found this command useful:
cat requirements.txt | xargs -L1 pip install
That installs the packages one by one, in order. Note that we should add --upgrade for the last package so it'll upgrade it.
Use pip's install option to skip package dependencies:
$ pip install --no-deps -r requirements.txt
Doing a pip freeze afterwards:
gottfried@sascha-Latitude-XT2:~/venv$ bin/pip freeze
argparse==1.2.1
pyramid-webassets==0.0
webassets==0.8.dev
wsgiref==0.1.2
References: pip cookbook - Ensuring repeatability
What happens when you move webassets higher than pyramid_webassets in the list?
How can I use a pip requirements file to uninstall as well as install packages?
I have a pip requirements file that changes during development. Can pip be made to uninstall packages that do not appear in the requirements file as well as installing those that do appear? Is there a standard method? This would allow the pip requirements file to be the canonical list of packages - an 'if and only if' approach. Update: I suggested it as a new feature at https://github.com/pypa/pip/issues/716
This should uninstall anything not in requirements.txt:
pip freeze | grep -v -f requirements.txt - | grep -v '^#' | xargs pip uninstall -y
Although this won't work quite right with packages installed with -e, i.e. from a git repository or similar. To skip those, just filter out lines starting with the -e flag:
pip freeze | grep -v -f requirements.txt - | grep -v '^#' | grep -v '^-e ' | xargs pip uninstall -y
Then, obviously:
pip install -r requirements.txt
Update for 2016: You probably don't really want to actually use the above approach, though. Check out pip-tools and pip-sync, which accomplish what you are probably looking to do in a much more robust way: https://github.com/nvie/pip-tools
Update for May 2016: You can now also use pip uninstall -r requirements.txt; however, this accomplishes basically the opposite - it uninstalls everything listed in requirements.txt.
Update for May 2019: Check out pipenv or Poetry. A lot has happened in the world of package management that makes this sort of question a bit obsolete. I'm actually still quite happily using pip-tools, though.
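The grep pipeline above is essentially a set difference. As a rough illustration, the same filtering can be sketched in Python, operating on captured pip freeze output rather than a live environment; note it compares whole lines, whereas grep -v -f does pattern matching, and the function name is just illustrative:

```python
def extraneous_packages(freeze_output, requirements_text):
    """Return freeze lines absent from the requirements file, skipping
    comments and editable (-e) installs, mirroring the grep chain."""
    wanted = {
        line.strip().lower()
        for line in requirements_text.splitlines()
        if line.strip() and not line.startswith("#")
    }
    extraneous = []
    for line in freeze_output.splitlines():
        line = line.strip()
        if not line or line.startswith("#") or line.startswith("-e "):
            continue
        if line.lower() not in wanted:
            extraneous.append(line)
    return extraneous

freeze = "libC==1.2\nlibD==1.3\n-e git+https://example.invalid/repo#egg=dev\n"
reqs = "libC==1.2\n"
print(extraneous_packages(freeze, reqs))
# ['libD==1.3']
```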
You can now pass the -r requirements.txt argument to pip uninstall:
pip uninstall -r requirements.txt -y
At least as of pip 8.1.2, pip help uninstall shows:
...
Uninstall Options:
  -r, --requirement <file>  Uninstall all the packages listed in the given requirements file. This option can be used multiple times.
...
It's not a feature of pip, no. If you really want such a thing, you could write a script to compare the output of pip freeze with your requirements.txt, but it would likely be more hassle than it's worth. Using virtualenv, it is easier and more reliable to just create a clean environment and (re)install from requirements.txt:
deactivate
rm -rf venv/
virtualenv venv/
source venv/bin/activate
pip install -r requirements.txt
The short answer is no, you can't do that with pip.
Here's a simple solution that works: pip uninstall $(pip freeze) -y
This is an old question (but a good one), and things have changed substantially since it was asked. There's an offhand reference to pip-sync in another answer, but it deserves its own answer, because it solves precisely the OP's problem.
pip-sync takes a requirements.txt file as input and "trues up" your current Python environment so that it matches exactly what's in that requirements.txt. This includes removing any packages that are present in your env but absent from requirements.txt.
Example: suppose we want our env to contain (only) three libraries, libA, libB, and libC, like so:
> cat requirements.txt
libA==1.0
libB==1.1
libC==1.2
But our env currently contains libC and libD:
> pip freeze
libC==1.2
libD==1.3
Running pip-sync will result in this, which was our desired final state:
> pip-sync requirements.txt
> pip freeze
libA==1.0
libB==1.1
libC==1.2
Stephen's proposal is a nice idea, but unfortunately it doesn't work if you include only direct requirements in your file, which sounds cleaner to me. All dependencies will be uninstalled, including even distribute, breaking pip itself.
Maintaining a clean requirements file while version-tracking a virtual environment
Here is how I try to version-track my virtual environment. I try to maintain a minimal requirements.txt, including only the direct requirements, and not even mentioning version constraints where I'm not sure. But besides that, I keep, and include in version tracking (say git), the actual status of my virtualenv in a venv.pip file.
Here is a sample workflow:
Set up the virtualenv workspace, with version tracking:
mkdir /tmp/pip_uninstalling
cd /tmp/pip_uninstalling
virtualenv venv
. venv/bin/activate
Initialize the version tracking system:
git init
echo venv > .gitignore
pip freeze > venv.pip
git add .gitignore venv.pip
git commit -m "Python project with venv"
Install a package with dependencies, and include it in the requirements file:
echo flask > requirements.txt
pip install -r requirements.txt
pip freeze > venv.pip
Now start building your app, then commit and start a new branch:
vim myapp.py
git commit -am "Simple flask application"
git checkout -b "experiments"
Install an extra package:
echo flask-script >> requirements.txt
pip install -r requirements.txt
pip freeze > venv.pip
...play with it, and then come back to the earlier version:
vim manage.py
git commit -am "Playing with flask-script"
git checkout master
Now uninstall extraneous packages:
pip freeze | grep -v -f venv.pip | xargs pip uninstall -y
I suppose the process can be automated with git hooks, but let's not go off topic. Of course, it makes sense then to use some package caching system or local repository like pip2pi.
You can create a new file with all installed packages:
pip freeze > uninstall.txt
then uninstall all of those:
pip uninstall -r uninstall.txt -y
and finally re-install the packages from your original requirements.txt file:
pip install -r requirements.txt
Piggybacking off Stephen J. Fuhry's answer, here is a PowerShell equivalent I use:
pip freeze | ? { $_ -notmatch ((gc req.txt) -join "|") }
While this doesn't directly answer the question, a better alternative to requirements.txt now is using a Pipfile. This functions similarly to a Ruby Gemfile. Currently, you need to make use of the pipenv tool but hopefully this will eventually be incorporated into pip. This provides the pipenv clean command which does what you want. (Note that you can import an existing requirements.txt with pipenv install -r requirements.txt. After this you should have a Pipfile and the requirements.txt can be removed.)
It is possible now using: pip uninstall -r requirements.txt