How do I "pretend" to install a package using pip? - python

This feels like such a simple question, but I can't find any reference in the pip documentation and the only question that seemed relevant mentions a flag that has apparently been deprecated since version 1.5 (version 8.1 is current at the time of this writing).
How do I "pretend" to install a package or list of packages using pip, without actually installing them? I have two separate use cases for this:
1. I need to see which packages out of a long (~70 line) requirements.txt are missing, without actually installing them; a report of which requirements are already satisfied, without installing the missing ones, would cover this.
2. I need to find the dependencies of a package that I have not yet installed on my computer, without using something like Portage or Aptitude.

There is also the pretty useful pip-tools package that provides a pip-sync tool which you can execute in a "dry run" mode against your requirements file(s):
$ mkvirtualenv test_so
New python executable in test_so/bin/python
Installing setuptools, pip, wheel...done.
...
(test_so) $ pip install pip-tools
...
Installing collected packages: six, click, first, pip-tools
(test_so) $ echo "Django==1.6.11" > requirements.txt
(test_so) $ pip-sync --dry-run requirements.txt
Would install:
Django==1.6.11
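For completeness: newer versions of pip (22.2 and later) also have a built-in dry-run mode that resolves everything without installing anything; the exact output format varies by version, but it looks roughly like this:
$ pip install --dry-run -r requirements.txt
...
Would install Django-1.6.11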
Also, here is a partially relevant thread: Check if requirements are up to date.

Per the pip documentation, the proper way to generate the requirements.txt file is via pip freeze > requirements.txt. Hopefully this is what you wanted.

Related

List Python packages that will be installed from requirements.txt

In my requirements.txt I have packages defined in the following manner:
Django ~= 2.2.0
This means that when I run pip install -r requirements.txt, pip will find the latest available 2.2.x version and install it along with all of its dependencies.
What I need is a requirements-formatted list of all packages, with explicit versions, that would be installed, but without actually installing any packages. So example output would be something like:
Django==2.2.23
package1==0.2.1
package2==1.4.3
...
So in other words I'm looking for something like pip freeze results but without installing anything.
pip-compile is what you need!
Docs: https://github.com/jazzband/pip-tools
python -m pip install pip-tools
pip-compile requirements.txt --output-file requirements-all.txt
The pip-compile command compiles a fully pinned requirements file from your dependencies, so that installing from it always reproduces the same environment.
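For example, here is a quick sketch of the workflow; the pinned versions below are purely illustrative, since pip-compile resolves whatever is current when you run it:
$ echo "Django ~= 2.2.0" > requirements.in
$ pip-compile requirements.in --output-file requirements.txt
$ cat requirements.txt
django==2.2.28
    # via -r requirements.in
pytz==2021.3
    # via django
sqlparse==0.4.2
    # via django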
TL;DR
Try pipdeptree or pip-tree.
Explanation
pip, contrary to most package managers, doesn't have one big dependency graph to look up. Instead, it lets arbitrary setup code execute, and that code pulls in the dependencies. This means that a package can manage its dependencies in some way other than putting them in requirements.txt (see fastai for an example of a project that handles dependencies differently).
So, theoretically, there is no way to see all the dependencies other than actually running an install in an isolated environment, seeing what was pulled in, and then deleting the environment (because the very code that performs the installation may be what brings in the dependencies). You could actually do that with venv.
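A minimal sketch of that isolated-environment approach, assuming a Unix shell and a placeholder package name:
$ python -m venv /tmp/dep_probe                 # throwaway environment
$ /tmp/dep_probe/bin/pip install somepackage    # setup code runs, pulling the real deps
$ /tmp/dep_probe/bin/pip freeze                 # everything that ended up installed
$ rm -rf /tmp/dep_probe                         # discard the environment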
In practice, tools like pipdeptree or pip-tree infer the dependencies from some standardization of the requirements (most packages separate their dependencies from the installation logic, and let pip handle both).
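For instance (pipdeptree's exact tree output depends on what is installed, so this shows only the shape of the invocation):
$ pip install pipdeptree
$ pipdeptree -p requests    # show the dependency subtree for one package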

(Re)Checking Dependencies with PIP

Is it possible to re-check the dependencies of packages installed with pip? That is, suppose we have a working environment. Then one of the packages changes (gets upgraded, etc.). Is there a command one can run to make sure that the dependency tree is still sound and free of conflicts?
Nowadays python -m pip check should do the trick.
It seems to have been added as early as pip 9.0.0, released on 2016-11-02.
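When something is broken, the output looks roughly like the following; the package names here are hypothetical:
$ python -m pip check
some-package 1.0 requires missing-package, which is not installed.
other-package 2.0 has requirement dep>=2.0, but you have dep 1.5.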
It's not part of pip, but there is a tool you can use called pip-conflict-checker. Just install it through pip and run pipconflictchecker to get a dump of all the conflicts. pipdeptree could also help here.
You might also be interested in reading this article about dealing with pip dependency issues. The article also discusses the two tools I mentioned above along with strategies to fix broken dependencies.
In recent pip versions using pip install -r requirements.txt will fail if you have any conflict in your dependencies specified in requirements.txt.

Check if requirements are up to date

I'm using pip requirements files for keeping my dependency list.
I also try to follow best practices for managing dependencies and provide precise package versions inside the requirements file. For example:
Django==1.5.1
lxml==3.0
The question is: Is there a way to tell that there are any newer package versions available in the Python Package Index for packages listed inside requirements.txt?
For this particular example, currently latest available versions are 1.6.2 and 3.3.4 for Django and lxml respectively.
I've tried pip install --upgrade -r requirements.txt, but it says that all is up-to-date:
$ pip install --upgrade -r requirements.txt
Requirement already up-to-date: Django==1.5.1 ...
Note that at this point I don't want to run an actual upgrade - I just want to see if there are any updates available.
Pip has this functionality built-in. Assuming that you're inside your virtualenv type:
$ pip list --outdated
psycopg2 (Current: 2.5.1 Latest: 2.5.2)
requests (Current: 2.2.0 Latest: 2.2.1)
$ pip install -U psycopg2 requests
After that new versions of psycopg2 and requests will be downloaded and installed. Then:
$ pip freeze > requirements.txt
And you are done. This is not one command but the advantage is that you don't need any external dependencies.
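If you want to consume the list of outdated packages from a script rather than by eye, newer pip versions can also emit JSON (check that your pip supports --format=json):
$ pip list --outdated --format=json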
Just found a python package specifically for the task - piprot, with the following slogan:
How rotten are your requirements?
It's very straightforward to work with:
$ piprot requirements.txt
Django (1.5.1) is 315 days out of date. Latest is 1.6.2
lxml (3.0) is 542 days out of date. Latest is 3.3.4
Your requirements are 857 days out of date
Also you can "pipe" pip freeze to piprot command, so it can actually inspect how rotten are the packages installed in your sandbox/virtual environment:
pip freeze | piprot
Hope that will help somebody in the future.
Since you mentioned you like to follow best practices, I am guessing you are using virtualenv too, correct? Assuming that is the case, and since you are already pinning your packages, there is a tool called pip-tools that you can run against your virtualenv to check for updates.
There is a downside, though, which is why I mentioned the use of virtualenv.
[the tool] checks PyPI and reports available updates. It uses the list of
currently installed packages to check for updates, it does not use any
requirements.txt
If you run it in your virtualenv, you can easily see which packages have updates available for your current active environment. If you aren't using virtualenv, though, it's probably not best to run it against the system as your other projects may depend on different versions (or may not work well with updated version even if they all currently work).
Going by the documentation provided, usage is simple. pip-review shows you what updates are available, but does not install them.
$ pip-review
requests==0.13.4 available (you have 0.13.2)
redis==2.4.13 available (you have 2.4.9)
rq==0.3.2 available (you have 0.3.0)
If you want to automatically install as well, the tool can handle that too: $ pip-review --auto. There is also an --interactive switch that you can use to selectively update packages.
Once all of this is done, pip-tools provides a way to update your requirements.txt with the newest versions: pip-dump. Again, this runs against the currently active environment, so it is recommended for use within a virtualenv.
Installation of the project can be accomplished via pip install pip-tools.
Author's note: I've used this for small Django projects and been very pleased with it. One note, though, if you install pip-tools into your virtual environment, when you run pip-dump you'll find that it gets added to your requirements.txt file. Since my projects are small, I've always just manually removed that line. If you have a build script of some kind, you may want to automatically strip it out before you deploy.
You can also just do something like this in your env (virtual or not):
pip freeze | cut -d = -f 1 | xargs -n 1 pip search | grep -B2 'LATEST:'
(Note that this relies on pip search, which depends on a PyPI API that has since been disabled, so on current pip this approach no longer works.)

Installing Python packages from local file system folder to virtualenv with pip

Is it possible to install packages using pip from the local filesystem?
I have run python setup.py sdist for my package, which has created the appropriate tar.gz file. This file is stored on my system at /srv/pkg/mypackage/mypackage-0.1.0.tar.gz.
Now in a virtual environment I would like to install packages either coming from pypi or from the specific local location /srv/pkg.
Is this possible?
PS
I know that I can specify pip install /srv/pkg/mypackage/mypackage-0.1.0.tar.gz. That will work, but I am talking about using the /srv/pkg location as another place for pip to search if I typed pip install mypackage.
What about:
pip install --help
...
-e, --editable <path/url> Install a project in editable mode (i.e. setuptools
"develop mode") from a local project path or a VCS url.
e.g. pip install -e /srv/pkg
where /srv/pkg is the top-level directory where 'setup.py' can be found.
I am pretty sure that what you are looking for is the --find-links option.
You can do
pip install mypackage --no-index --find-links file:///srv/pkg/mypackage
From the installing-packages page you can simply run:
pip install /srv/pkg/mypackage
where /srv/pkg/mypackage is the directory containing setup.py.
Additionally, you can install it from the archive file:
pip install ./mypackage-1.0.4.tar.gz
(Although noted in the question, this is included here due to its popularity.)
I am installing pyfuzzy, but it is not in PyPI; it returns the message: No matching distribution found for pyfuzzy.
I tried the accepted answer
pip install --no-index --find-links=file:///Users/victor/Downloads/pyfuzzy-0.1.0 pyfuzzy
But it does not work either and returns the following error:
Ignoring indexes: https://pypi.python.org/simple
Collecting pyfuzzy
Could not find a version that satisfies the requirement pyfuzzy (from versions: )
No matching distribution found for pyfuzzy
At last, I found a simple, good way, documented here: https://pip.pypa.io/en/latest/reference/pip_install.html
Install a particular source archive file.
$ pip install ./downloads/SomePackage-1.0.4.tar.gz
$ pip install http://my.package.repo/SomePackage-1.0.4.zip
So the following command worked for me:
pip install ../pyfuzzy-0.1.0.tar.gz
Hope it can help you.
This is the solution that I ended up using:
import pip

def install(package):
    # Debugging:
    # pip.main(["install", "--pre", "--upgrade", "--no-index",
    #           "--find-links=.", package, "--log-file", "log.txt", "-vv"])
    pip.main(["install", "--upgrade", "--no-index", "--find-links=.", package])

if __name__ == "__main__":
    install("mypackagename")
    raw_input("Press Enter to Exit...\n")
I pieced this together from pip install examples as well as from Rikard's answer on another question. The "--pre" argument lets you install non-production versions. The "--no-index" argument avoids searching the PyPI indexes. The "--find-links=." argument searches in the local folder (this can be relative or absolute). I used the "--log-file", "log.txt", and "-vv" arguments for debugging. The "--upgrade" argument lets you install newer versions over older ones.
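One caveat for modern environments: pip.main was an internal API and was removed in pip 10, so on current pip the supported equivalent is to invoke pip as its own process, e.g.:
$ python -m pip install --upgrade --no-index --find-links=. mypackagename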
I also found a good way to uninstall them. This is useful when you have several different Python environments. It's the same basic format, just using "uninstall" instead of "install", with a safety measure to prevent unintended uninstalls:
import pip

def uninstall(package):
    response = raw_input("Uninstall '%s'? [y/n]:\n" % package)
    if "y" in response.lower():
        # Debugging:
        # pip.main(["uninstall", package, "-vv"])
        pip.main(["uninstall", package])

if __name__ == "__main__":
    uninstall("mypackagename")
    raw_input("Press Enter to Exit...\n")
The local folder contains these files: install.py, uninstall.py, mypackagename-1.0.zip
The --find-links option does the job, and it works from a requirements.txt file!
You can put package archives in a folder and pip will pick up the latest one, without you changing the requirements file. For example:
.
├───requirements.txt
└───requirements
    ├───foo_bar-0.1.5-py2.py3-none-any.whl
    ├───foo_bar-0.1.6-py2.py3-none-any.whl
    ├───wiz_bang-0.7-py2.py3-none-any.whl
    ├───wiz_bang-0.8-py2.py3-none-any.whl
    ├───base.txt
    ├───local.txt
    └───production.txt
Now in requirements/base.txt put:
--find-links=requirements
foo_bar
wiz_bang>=0.8
This is a neat way to update proprietary packages: just drop a new archive into the folder.
This way you can install packages from the local folder AND PyPI with the same single call: pip install -r requirements/production.txt
PS. See my cookiecutter-djangopackage fork to see how to split requirements and use folder based requirements organization.
Assuming you have virtualenv and a requirements.txt file, then you can define inside this file where to get the packages:
# Published pypi packages
PyJWT==1.6.4
email_validator==1.0.3
# Remote GIT repo package, this will install as django-bootstrap-themes
git+https://github.com/marquicus/django-bootstrap-themes#egg=django-bootstrap-themes
# Local GIT repo package, this will install as django-knowledge
git+file:///soft/SANDBOX/python/django/forks/django-knowledge#egg=django-knowledge
To install only from local you need 2 options:
--find-links: where to look for dependencies. There is no need for the file:// prefix mentioned by others.
--no-index: do not look in PyPI indexes for missing dependencies (dependencies not installed and not in the --find-links path).
So you could run from any folder the following:
pip install --no-index --find-links /srv/pkg /path/to/mypackage-0.1.0.tar.gz
If your mypackage is set up properly, it will list all its dependencies, and if you used pip download to download the cascade of dependencies (i.e. dependencies of dependencies, etc.), everything will work.
If you want to use the PyPI index when it is accessible, but fall back to local wheels when it is not, you can remove --no-index and add --retries 0. You will see pip pause for a bit while it tries to check PyPI for a missing dependency (one not installed), and when it finds it cannot reach it, it will fall back to the local ones. There does not seem to be a way to tell pip to "look for local ones first, then the index".
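For example, a sketch of that fallback invocation, using the same paths as above:
$ pip install --retries 0 --find-links /srv/pkg /path/to/mypackage-0.1.0.tar.gz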
With your requirements in requirements.txt and eggs_dir as a directory, you can build your local cache:
$ pip download -r requirements.txt -d eggs_dir
then, using that "cache" is simple like:
$ pip install -r requirements.txt --find-links=eggs_dir
What you need is the --find-links option of pip install.
-f, --find-links If a url or path to an html file, then parse for links to archives. If a local path or
file:// url that's a directory, then look for archives in the directory listing.
In my case, after running python -m build, a tar.gz package (and a .whl file) is generated in the ./dist directory.
pip install --no-index -f ./dist YOUR_PACKAGE_NAME
Any tar.gz Python package in ./dist can be installed this way.
But if your package has dependencies, this command will raise an error.
To solve this, you can either pip install those deps from the official PyPI source first and then add --no-deps, like this:
pip install --no-index --no-deps -f ./dist YOUR_PACKAGE_NAME
or copy your deps' packages into the ./dist directory.
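Alternatively, if your dependencies are listed in a requirements.txt, pip download can pre-fetch them into ./dist for you, after which the plain --no-index install works; a sketch:
$ python -m pip download -r requirements.txt -d ./dist
$ pip install --no-index -f ./dist YOUR_PACKAGE_NAME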
I've been trying to achieve something really simple and failed miserably; probably I'm stupid.
Anyway, if you have a script/Dockerfile which downloads a Python package zip file (e.g. from GitHub) and you then want to install it, you can use the file:/// prefix, as shown in the following example:
$ wget https://example.com/mypackage.zip
$ echo "${MYPACKAGE_MD5} mypackage.zip" | md5sum --check -
$ pip install file:///.mypackage.zip
NOTE: I know you could install the package straight away using pip install https://example.com/mypackage.zip, but in my case I wanted to verify the checksum (never paranoid enough), and I failed miserably when trying to use the various options that pip provides / the #md5 fragment.
It's been surprisingly frustrating to do something so simple directly with pip: I just wanted to pass a checksum and have pip verify that the zip matched before installing it.
I was probably doing something very stupid, but in the end I gave up and opted for this. I hope it helps others trying to do something similar.
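For reference, here is a sha256 variant of the same check; MYPACKAGE_SHA256 is a placeholder you supply, and pip hash is only used to print the digest in the --hash=sha256:... format that pip's hash-checking mode expects:
$ wget https://example.com/mypackage.zip
$ echo "${MYPACKAGE_SHA256}  mypackage.zip" | sha256sum --check -
$ python -m pip hash mypackage.zip    # prints the digest in pip's --hash format
$ pip install ./mypackage.zip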
In my case, the error was because this library depended on another local library which I had not yet installed. Installing the dependency with pip, and then the dependent library, solved the issue.
If you want to install one local package (package A) to be used inside another local project/package (B), this is quite simple. All you need is to cd into (B) and call:
pip install /path/to/package(A)
Of course you will first need to build/install package (A) with:
sudo python3 ./setup.py install
And each time you change package A, just run setup.py in package (A) again, then run pip install ... inside the using project/package (B).
Just add the directory to the pip command:
pip install mypackage file:/location/in/disk/mypackagename.filetype

How to specify install order for python pip?

I'm working with fabric(0.9.4)+pip(0.8.2) and I need to install some python modules for multiple servers. All servers have old version of setuptools (0.6c8) which needs to be upgraded for pymongo module. Pymongo requires setuptools>=0.6c9.
My problem is that pip starts installation with pymongo instead of setuptools which causes pip to stop. Shuffling module order in requirements file doesn't seem to help.
requirements.txt:
setuptools>=0.6c9
pymongo==1.9
simplejson==2.1.3
Is there a way to specify install order for pip as it doesn't seem to do it properly by itself?
This can be resolved with two separate requirements files but it would be nice if I didn't need to maintain multiple requirements files now or in the future.
Problem persists with pip 0.8.3.
You can just use:
cat requirements.txt | xargs pip install
To allow all types of entries (for example packages from git repositories) in requirements.txt, you need to use the following command:
cat requirements.txt | xargs -n 1 -L 1 pip install
-n 1 and -L 1 options are necessary to install packages one by one and treat every line in the requirements.txt file as a separate item.
This is a silly hack, but might just work. Write a bash script that reads your requirements file line by line and runs pip on each entry. The original used pip's old -E /path/to/virtualenv option, which has since been removed; activate the target virtualenv before running this instead:
#!/bin/bash
# Install requirements one line at a time, in file order.
while read -r line; do
    pip install "$line"
done < requirements.txt
Sadly, the upgrade suggestion won't work. If you read the other details in https://github.com/pypa/pip/issues/24, you will see why:
pip will build all packages first, before attempting to install them. So with a requirements file like the following
numpy==1.7.1
scipy==0.13.2
statsmodels==0.5.0
The build of statsmodels will fail with the following statement
ImportError: statsmodels requires numpy
The workaround of manually calling pip for each entry in the requirements file (via a shell script) seems to be the only current solution.
Pymongo requires setuptools>=0.6c9
How do you know? Requires it to build, or to install? You don't say which version of Pymongo you were trying to install, but looking at the setup.py file for the current (3.2.2) version, there is no specification of either what Pymongo requires to run setup.py (setup_requires) or what it requires to install (install_requires). Without such information, pip can't ensure a specific version of setuptools. If Pymongo requires a specific version of setuptools to run its setup.py (as opposed to requiring setuptools to run the setup function itself), then the other problem is that until recently there was no way to specify this. Now there is a specification, PEP 518 (Specifying Minimum Build System Requirements for Python Projects), which should shortly be implemented in pip: Implement PEP 518 support #3691.
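For reference, the PEP 518 mechanism is a pyproject.toml file next to setup.py that declares the build requirements; a minimal sketch of what would cover this case:
[build-system]
requires = ["setuptools>=0.6c9", "wheel"]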
As to the order of installation, this was fixed in pip 6.1.0.
From pip install – Installation Order section of pip's documentation:
As of v6.1.0, pip installs dependencies before their dependents, i.e.
in "topological order". This is the only commitment pip currently
makes related to order.
And later:
Prior to v6.1.0, pip made no commitments about install order.
However, without proper specification of requirements by Pymongo it won't help either.
Following on from lukasrms's solution - I had to do this to get pip to install my requirements one at a time:
cat requirements.txt | xargs -n 1 pip install
If you have comments in your requirements file you'll want to use:
grep -v "^#" requirements.txt | xargs pip install
I ended up running pip inside the virtualenv instead of using "pip -E", because with -E pip could still see the server's site-packages, and that obviously messed up some of the installs.
I also had trouble with servers without virtualenvs. Even if I installed setuptools with a separate pip command, pymongo would refuse to install.
I resolved this by installing setuptools separately with easy_install, as this seems to be a problem between pip and setuptools.
snippets from fabfile.py:

env.activate = "source %s/bin/activate" % virtualenv_path

def _virtualenv(command):
    # Run the command inside the virtualenv when one is configured.
    if env.virtualenv:
        sudo(env.activate + " && " + command)
    else:
        sudo(command)

_virtualenv("easy_install -U setuptools")
_virtualenv("pip install -r requirements.txt")
I had these problems with pip 0.8.3 and 0.8.2.
Sorry, my first answer was wrong, because I had setuptools>=0.6c9.
It seems it is not possible, because pymongo's setup.py needs setuptools>=0.6c9, but at that point pip has only downloaded setuptools>=0.6c9, not installed it yet.
This is discussed in the issue I pointed to before.
I also created an issue about it some weeks ago: Do not run egg_info on each package in the requirements list before installing the previous packages.
Sorry for the noise.
First answer:
Upgrade your pip to version 0.8.3; it has a bugfix related to installation order.
Now if you upgrade everything works :-)
Check the news here: http://www.pip-installer.org/en/0.8.3/news.html
