I'm using pip requirements files to keep my dependency list.
I also try to follow best practices for managing dependencies and pin precise package versions inside the requirements file. For example:
Django==1.5.1
lxml==3.0
The question is: is there a way to tell whether newer versions of the packages listed inside requirements.txt are available on the Python Package Index?
For this particular example, the latest available versions are currently 1.6.2 and 3.3.4 for Django and lxml respectively.
I've tried pip install --upgrade -r requirements.txt, but it says everything is up to date:
$ pip install --upgrade -r requirements.txt
Requirement already up-to-date: Django==1.5.1 ...
Note that at this point I don't want to run an actual upgrade; I just want to see whether any updates are available.
Pip has this functionality built in. Assuming that you're inside your virtualenv, type:
$ pip list --outdated
psycopg2 (Current: 2.5.1 Latest: 2.5.2)
requests (Current: 2.2.0 Latest: 2.2.1)
$ pip install -U psycopg2 requests
After that, the new versions of psycopg2 and requests will be downloaded and installed. Then:
$ pip freeze > requirements.txt
And you are done. This is not a single command, but the advantage is that you don't need any external dependencies.
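If you ever want to script such a check yourself, the first step is reading the pins out of requirements.txt. Here is a minimal sketch that handles only simple name==version lines (no extras, markers, or VCS URLs); you could then compare the result against data fetched from PyPI:

```python
# Parse simple "name==version" pins from a requirements file so they can
# be compared against whatever index data you fetch (e.g. from PyPI).
def parse_pins(lines):
    pins = {}
    for line in lines:
        line = line.split("#", 1)[0].strip()  # drop comments and whitespace
        if "==" in line:
            name, version = line.split("==", 1)
            pins[name.strip()] = version.strip()
    return pins

print(parse_pins(["Django==1.5.1", "lxml==3.0", "# pinned on purpose"]))
# -> {'Django': '1.5.1', 'lxml': '3.0'}
```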
Just found a Python package built specifically for this task: piprot, with the following slogan:
How rotten are your requirements?
It's very straightforward to work with:
$ piprot requirements.txt
Django (1.5.1) is 315 days out of date. Latest is 1.6.2
lxml (3.0) is 542 days out of date. Latest is 3.3.4
Your requirements are 857 days out of date
You can also pipe pip freeze to the piprot command, so it can inspect how rotten the packages installed in your sandbox/virtual environment actually are:
pip freeze | piprot
Hope this helps somebody in the future.
Since you mentioned you like to follow best practices, I am guessing you are using virtualenv too, correct? Assuming that is the case, and since you are already pinning your packages, there is a tool called pip-tools that you can run against your virtualenv to check for updates.
There is a downside, though, which is why I mentioned the use of virtualenv.
[the tool] checks PyPI and reports available updates. It uses the list of
currently installed packages to check for updates, it does not use any
requirements.txt
If you run it in your virtualenv, you can easily see which packages have updates available for your currently active environment. If you aren't using virtualenv, though, it's probably best not to run it against the system, as your other projects may depend on different versions (or may not work well with the updated versions even if they all currently work).
From the documentation provided, usage is simple. pip-review shows you what updates are available but does not install them:
$ pip-review
requests==0.13.4 available (you have 0.13.2)
redis==2.4.13 available (you have 2.4.9)
rq==0.3.2 available (you have 0.3.0)
If you want to automatically install as well, the tool can handle that too: $ pip-review --auto. There is also an --interactive switch that you can use to selectively update packages.
Once all of this is done, pip-tools provides a way to update your requirements.txt with the newest versions: pip-dump. Again, this runs against the currently active environment, so it is recommended for use within a virtualenv.
Installation of the project can be accomplished via pip install pip-tools.
Author's note: I've used this for small Django projects and been very pleased with it. One note, though, if you install pip-tools into your virtual environment, when you run pip-dump you'll find that it gets added to your requirements.txt file. Since my projects are small, I've always just manually removed that line. If you have a build script of some kind, you may want to automatically strip it out before you deploy.
You can simply do something like this in your environment (virtual or not):
pip freeze | cut -d = -f 1 | xargs -n 1 pip search | grep -B2 'LATEST:'
For some context, my project uses python version 3.8 and has some packages in a requirements.txt file. I'm trying to upgrade the python version to 3.10 and also all the packages in my requirements.txt file to the latest possible version such that there are no conflicts between dependencies. Is there a way to do this?
It is a bit hard to say what will work best for you since you've given no info on your OS, and what you need is done differently on macOS and Windows.
If you're on macOS (this one could also work for Linux I guess), the best way to manage python versions is with pyenv and to update python packages with pip-review, both of which you can install via brew.
So first, you want to create a list of all your installed packages. To do that, type pip3 freeze > requirements.txt in the terminal. The command creates a txt file in your current directory. In the file, you can see the names of all your installed packages, and it will look like this:
astroid==2.11.6
async-generator==1.10
attrs==21.4.0
autopep8==1.6.0
beautifulsoup4==4.11.1
Secondly, to update your Python version, type pyenv install -l in the terminal to get a list of all currently available versions of Python. To install the one you need (let's say it's the most recent at the moment, 3.10.5), type pyenv install 3.10.5. To set this version as the default one, type pyenv global 3.10.5. If you don't need your previous version of Python, just type pyenv uninstall followed by the number of the version you want to delete (it works just like installing one).
Thirdly, to bring all your previous packages into the newly installed Python version, just type pip3 install -r requirements.txt in the terminal. The command will install the same versions of the packages you had before.
Lastly, to upgrade the packages to the most recent version, type pip-review --auto. It will fetch the latest versions available and install them with no more commands from you.
Now here's why I think pip-review works better than anything else I've tried so far: in case some dependencies are broken (i.e. a package you are using needs an older version of another package than the one you've just upgraded to), it will point that out in the terminal, and you can go get the right one.
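Collected in one place, the sequence described above would be (assuming pyenv and pip-review are installed and 3.10.5 is the version you want):

```shell
pip3 freeze > requirements.txt    # snapshot the currently installed packages
pyenv install 3.10.5              # install the new interpreter
pyenv global 3.10.5               # make it the default python
pip3 install -r requirements.txt  # reinstall the same pinned versions
pip-review --auto                 # then bump everything to the latest
```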
Yes, you can do that; you just need to update your requirements file and then install from it. That will automatically update all your packages.
You can update to Python3.10 and drop the previous requirements.txt into your project root directory. From there you can run pip install -r requirements.txt to install all the packages.
You can also run pip install -U <package-name> to update any package to the latest version.
Also, to re-generate a new requirements.txt file, run pip freeze > requirements.txt.
Pretty sure there is no such thing as automatically updating dependencies in such a way that there are no conflicts. Some dependencies may have changes that are not backward compatible, and a complete test of all your project's features will be necessary whenever you change the version of anything.
There is also an open-source project called Renovate that can help you maintain your packages and update them to recent versions.
I would like to install the newest version of docutils via pip, but I can't figure out how to upgrade the system version installed via apt.
$ sudo --set-home python2 -m pip install --upgrade docutils
Collecting docutils
Using cached https://files.pythonhosted.org/packages/3a/dc/bf2b15d1fa15a6f7a9e77a61b74ecbbae7258558fcda8ffc9a6638a6b327/docutils-0.15.2-py2-none-any.whl
Installing collected packages: docutils
Found existing installation: docutils 0.14
ERROR: Cannot uninstall 'docutils'. It is a distutils installed project and thus we cannot accurately determine which files belong to it which would lead to only a partial uninstall.
$ apt-cache show python-docutils | head -n 3
Package: python-docutils
Architecture: all
Version: 0.14+dfsg-3
None of the solutions I've thought of or found on the web appeal:
Delete the apt version with rm -rf /usr/lib/python2.7/dist-packages/docutils*. This silences pip but means:
Installed files on the system no longer match what the Debian packaging system thinks
I might break dependencies of system software on docutils 0.14
Any updates to the Debian package will cause apt to reinstall
Other problems discussed in this answer
pip install --force-reinstall. (Same problems.)
pip install --ignore-installed. (Same problems.)
Is there a way to get a default environment that works for me with the newest versions of packages from pip but has no chance of breaking any system software? The same answer linked above suggests using one of virtualenv, venv, pyenv, or pipenv. I tried pipenv: it doesn't want to install individual packages listed on the command line when using --system, and I don't know whether creating a Pipfile will actually solve this problem.
I would rather not have to manually switch environments somehow to use the apt-installed packages versus the pip-installed packages. Is there a way to get only apt-installed software to use one environment and otherwise use the environment with the pip-installed stuff?
Ideally, one should use either the system version or the pip version.
Per the Debian Python Policy,
As long as you don't install other versions of Python in your path, Debian's Python versions won't be affected by a new version.
If you install a different micro version of the version of Python you have got installed, you will need to be careful to install all the modules you use for that version of Python too.
So far adding the following to ~/.bashrc seems work well:
if [ ! -d ~/venv/python3 ]; then
    python3 -m venv --system-site-packages ~/venv/python3
fi
if [ -d ~/venv/python3 ]; then
    VIRTUAL_ENV_DISABLE_PROMPT=1 . ~/venv/python3/bin/activate
fi
Most of the system-installed scripts have one of the Pythons in /usr/bin hard-coded instead of using /usr/bin/env python, so they are unaffected by this.
I am using tox to manage some testing environments. I have a dependency (backports.ssl-match-hostname) that I cannot download using the latest version of pip, so I need to revert to pip 8.0.3 for the install to work.
I have included the 8.0.3 version of pip inside my tox.ini file for dependencies.
deps=
pip==8.0.3
However, when I run
source .tox/py27/bin/activate
and enter the virtual testing environment, and then run
pip --version
I end up with
8.1.2
However, outside of my tox environment, when I run the same command, I get
8.0.3
Is there anything special that tox does when grabbing pip? Why am I not able to specify the version of pip that I want to use as a dependency?
EDIT: to add to this, it seems as though I am able to grab the dependency pip==8.0.3, but the other dependencies are still installed by the command launched with pip==8.1.2.
So I need to be able to grab pip==8.0.3 first, and only then grab everything else. I am still unsure why tox starts with pip==8.1.2.
This was apparently the result of the virtualenv Python package bundling a pre-selected group of Python packages, one of which was the latest and greatest pip.
I don't know if this is the preferred way of doing this, but I found success by running
pip uninstall virtualenv
And then reinstalling with the version that worked
pip install virtualenv==15.0.1
With the "correct" version of virtualenv in place, I was able to run my tox command
source .tox/py27/bin/activate
and see the desired version of pip
pip --version
pip 8.0.3
A workaround for this is here: https://github.com/pypa/pip/issues/3666
Although to make it work I had to write "pip install pip==8.1.1" in my script. So to recap:
Add a pip.sh script to your project:
#!/bin/bash
pip install pip==8.1.1
pip install "$@"
Add to your tox.ini:
install_command = {toxinidir}/pip.sh {opts} {packages}
I've recently hit this problem. I'd had it for a while, but it just didn't register because I only saw occasional failures with Python 2/3 code. Another way this can happen is if, like me, you switch the virtualenv between different Python versions and don't clean up.
Check /bin or /Scripts in the virtualenv to see whether python2 points to python. If the virtualenv is Python 3, this means that python2 actually calls Python 3. And vice versa, of course, if the virtualenv is Python 2 and you want to test Python 3 code.
New versions of virtualenv reach out to download the latest pip, setuptools, and wheel. You can disable this behavior when running through tox with the tox-virtualenv-no-download package. See: https://github.com/asottile/tox-virtualenv-no-download#wait-why
This feels like such a simple question, but I can't find any reference in the pip documentation and the only question that seemed relevant mentions a flag that has apparently been deprecated since version 1.5 (version 8.1 is current at the time of this writing).
How do I "pretend" to install a package or list of packages using pip, without actually installing them? I have two separate use cases for this:
I need to see which packages out of a long (~70 line) requirements.txt are missing, without actually installing them; seeing which requirements are already satisfied, without installing the missing ones, would satisfy this for me.
Finding the dependencies for a package that I have not yet installed on my computer, without using something like Portage or Aptitude.
There is also the pretty useful pip-tools package, which provides a pip-sync tool that you can execute in "dry run" mode against your requirements file(s):
$ mkvirtualenv test_so
New python executable in test_so/bin/python
Installing setuptools, pip, wheel...done.
...
(test_so) $ pip install pip-tools
...
Installing collected packages: six, click, first, pip-tools
(test_so) $ echo "Django==1.6.11" > requirements.txt
(test_so) $ pip-sync --dry-run requirements.txt
Would install:
Django==1.6.11
Also, here is a partially relevant thread: Check if requirements are up to date.
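If pulling in pip-tools is not an option, the first use case (seeing which pins are unsatisfied without installing anything) can also be sketched with just the standard library. This assumes Python 3.8+ and simple name==version pins:

```python
# Report which "name==version" pins are missing or installed at a different
# version, without touching the environment (Python 3.8+ standard library).
from importlib.metadata import version, PackageNotFoundError

def unsatisfied(pins):
    problems = []
    for pin in pins:
        name, wanted = pin.split("==", 1)
        try:
            have = version(name)
        except PackageNotFoundError:
            problems.append("%s (not installed)" % pin)
            continue
        if have != wanted:
            problems.append("%s (have %s)" % (pin, have))
    return problems

print(unsatisfied(["definitely-not-installed-xyz==1.0"]))
# -> ['definitely-not-installed-xyz==1.0 (not installed)']
```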
Per the pip documentation, the proper way to generate the requirements.txt file is via pip freeze > requirements.txt. Hopefully this is what you wanted.
I'm working with fabric (0.9.4) + pip (0.8.2) and I need to install some Python modules on multiple servers. All servers have an old version of setuptools (0.6c8), which needs to be upgraded for the pymongo module. Pymongo requires setuptools>=0.6c9.
My problem is that pip starts the installation with pymongo instead of setuptools, which causes pip to stop. Shuffling the module order in the requirements file doesn't seem to help.
requirements.txt:
setuptools>=0.6c9
pymongo==1.9
simplejson==2.1.3
Is there a way to specify install order for pip as it doesn't seem to do it properly by itself?
This can be resolved with two separate requirements files but it would be nice if I didn't need to maintain multiple requirements files now or in the future.
Problem persists with pip 0.8.3.
You can just use:
cat requirements.txt | xargs pip install
To allow all types of entries (for example, packages from git repositories) in requirements.txt, you need to use the following set of commands:
cat requirements.txt | xargs -n 1 -L 1 pip install
The -n 1 and -L 1 options are necessary to install the packages one by one and to treat every line of the requirements.txt file as a separate item.
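You can see what those two flags do with echo standing in for pip install; each input line becomes its own invocation:

```shell
# Each line is handed to a separate `echo` call, one package per invocation.
printf 'Django==1.5.1\nlxml==3.0\n' | xargs -n 1 -L 1 echo
# Django==1.5.1
# lxml==3.0
```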
This is a silly hack, but it might just work. Write a bash script that reads your requirements file line by line and runs the pip command on each:
#!/bin/bash
# Install each requirement separately; reading line by line avoids the
# word-splitting problems of `for line in $(cat requirements.txt)`.
while read -r line
do
    pip install "$line" -E /path/to/virtualenv
done < requirements.txt
Sadly, the upgrade suggestion won't work. If you read the other details in https://github.com/pypa/pip/issues/24, you will see why.
pip builds all packages first, before attempting to install them. So with a requirements file like the following:
numpy==1.7.1
scipy==0.13.2
statsmodels==0.5.0
The build of statsmodels will fail with the following statement:
ImportError: statsmodels requires numpy
The workaround given for manually calling pip for each entry in the requirements file (via a shell script) seems to be the only current solution.
Pymongo requires setuptools>=0.6c9
How do you know? Required to build, or to install? You don't say which version of Pymongo you were trying to install, but looking at the setup.py file for the current (3.2.2) version, there is no specification of either what Pymongo requires to run setup.py (setup_requires) or what it requires to install (install_requires). Without such information pip can't ensure a specific version of setuptools. If Pymongo requires a specific version of setuptools to run its setup.py (as opposed to requiring setuptools to run the setup function itself), then the other problem is that until recently there was no way to specify this. Now there is a specification – PEP 518 – Specifying Minimum Build System Requirements for Python Projects, which should shortly be implemented in pip – Implement PEP 518 support #3691.
As to the order of installation, this was fixed in pip 6.1.0.
From pip install – Installation Order section of pip's documentation:
As of v6.1.0, pip installs dependencies before their dependents, i.e.
in "topological order". This is the only commitment pip currently
makes related to order.
And later:
Prior to v6.1.0, pip made no commitments about install order.
However, without proper specification of its requirements by Pymongo, it won't help either.
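As a toy illustration of what "topological order" means here (just a sketch; pip's real resolver is far more involved), dependencies are emitted before their dependents:

```python
# Depth-first walk that emits each package only after its dependencies.
def topo(deps):
    ordered, seen = [], set()
    def visit(pkg):
        if pkg in seen:
            return
        seen.add(pkg)
        for dep in deps.get(pkg, ()):
            visit(dep)
        ordered.append(pkg)
    for pkg in deps:
        visit(pkg)
    return ordered

# statsmodels and scipy both depend on numpy, so numpy comes first:
print(topo({"statsmodels": ["numpy"], "scipy": ["numpy"], "numpy": []}))
# -> ['numpy', 'statsmodels', 'scipy']
```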
Following on from #lukasrms's solution, I had to do this to get pip to install my requirements one at a time:
cat requirements.txt | xargs -n 1 pip install
If you have comments in your requirements file you'll want to use:
grep -v "^#" requirements.txt | xargs pip install
I ended up running pip inside the virtualenv instead of using pip -E, because with -E pip could still see the server's site-packages, and that obviously messed up some of the installs.
I also had trouble with servers without virtualenvs. Even if I installed setuptools with a separate pip command, pymongo would refuse to install.
I resolved this by installing setuptools separately with easy_install, as this seems to be a problem between pip and setuptools.
snippets from fabfile.py:
env.activate = "source %s/bin/activate" % virtualenv_path

def _virtualenv(command):
    if env.virtualenv:
        sudo(env.activate + " && " + command)
    else:
        sudo(command)

_virtualenv("easy_install -U setuptools")
_virtualenv("pip install -r requirements.txt")
I had these problems with pip 0.8.3 and 0.8.2.
Sorry, my first answer was wrong, because I already had setuptools>=0.6c9 installed.
It seems it is not possible, because pymongo's setup.py needs setuptools>=0.6c9, but at that point pip has only downloaded setuptools>=0.6c9 and not yet installed it.
Someone discussed this in the issue I pointed to before.
I created an issue about it myself some weeks ago: Do not run egg_info on each package in the requirements list before installing the previous packages.
Sorry for the noise.
First answer:
Upgrade your pip to version 0.8.3; it has a bugfix related to installation order.
Now if you upgrade, everything works :-)
Check the news here: http://www.pip-installer.org/en/0.8.3/news.html