Single pip command to download + keep + search + install packages - python

I have a requirements.txt file with several dependencies listed.
Whenever I try pip install -r requirements.txt on a brand new system, it usually fails when some dependency is not met (see for example here and here). This happens especially with the matplotlib package.
After downloading the entire package (in the case of matplotlib it's about 50 MB) and failing to install it, I go and fix the issue and then attempt to install the package again.
pip does not seem to be smart enough to realize it just downloaded that package and to automatically re-use the same file (perhaps because it keeps no copy by default?), so the package is downloaded in its entirety again.
To get around this issue I can follow the instructions given here and use:
pip install --download=/path/to/packages -r requirements.txt
to first download all packages and then:
pip install --no-index --find-links=/path/to/packages -r requirements.txt
to install all the packages using the locally stored files.
My question: is there a single smart command that combines both these steps? I'm after one line I can run repeatedly, so that it uses stored copies of the packages if they exist, downloads them if they don't, and in the latter case stores those copies somewhere so they can be re-used later on if needed.
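For what it's worth, a minimal sketch of one reusable line, assuming the cache directory /path/to/packages already exists: pip download should skip files that are already present in the destination, so re-running the line only fetches what is missing.

```shell
# Sketch: download into a local cache dir (files already there are reported
# as "File was already downloaded" and not re-fetched), then install
# strictly from that dir. /path/to/packages is a placeholder path.
pip download -r requirements.txt -d /path/to/packages && \
pip install --no-index --find-links=/path/to/packages -r requirements.txt
```

Note also that newer pip versions keep an HTTP/wheel cache by default (see pip cache dir), so a plain pip install -r requirements.txt may already avoid re-downloading a package after a failed install.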

Related

Run pip install -r requirements.txt from a specific local directory?

I have a requirements.txt file that contains both normal package names (to install from PyPI) and paths to local tar.gz packages within the repo, e.g.
flask
pandas
local_dir/local_pkg.tar.gz
The problem is, there are two different deployment pipelines used for this repo, which both need to work.
The first will only run pip install -r requirements.txt (I cannot modify this command or add any additional options), but it always runs it from the base repo path. So currently, this runs successfully without issue.
The second is the problem. It runs from a different location entirely, and installs the packages via pip install -r /path/to/repo/requirements.txt. The trouble is, pip install doesn't automatically look in /path/to/repo/ for the listed local package path (local_dir/local_pkg.tar.gz); it instead looks for that local package path in the location where the command is being run. It obviously can't find the local package there, and so throws an error.
With this second deployment pipeline, I can add additional options to pip install. However, I've tried out some of the listed options and cannot find anything that resolves my issue.
tl;dr:
How can I modify the pip install -r /path/to/repo/requirements.txt command, so that it looks for local packages as if it's running from /path/to/repo/ (regardless of where the command is actually being run from)?
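One way to sidestep this, sketched below on the assumption that the second pipeline lets you wrap the command in a shell: change into the repo directory in a subshell, so the relative path in requirements.txt resolves against /path/to/repo/ while the caller's working directory stays untouched.

```shell
# Sketch: the parentheses run cd in a subshell, so the caller's working
# directory is unchanged afterwards, while pip resolves
# local_dir/local_pkg.tar.gz relative to the repo root.
(cd /path/to/repo && pip install -r requirements.txt)
```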

How to create a python program where some libraries are automatically installed

I have created a program that uses some external libraries such as matplotlib, NumPy, pandas... I want to share my program with other users. The problem is that those users need to install those libraries, and I want to avoid that. I have read that an executable file can be created; however, I want the other users to also be able to edit the program. How could I do this?
If you share the program as an executable file then it is true that the user won't have to install the libraries. But once the code is converted it cannot be edited; each time you change the code you will need to convert it again to update the version. If you want to share editable code then the best option is to share a requirements.txt file along with the program.
You can generate a requirements.txt that lists all your current dependencies. Use the pipreqs module to list only the libraries actually used in your program.
install pipreqs (without $)
$ pip install pipreqs
Then go to your program directory and in the terminal/PowerShell/command prompt type (without $)
$ pipreqs .
This will generate requirements.txt in the program directory. Zip the entire folder and share it. Users then need to install from this requirements.txt file, which they can do using the following command (without $)
$ pip install -r requirements.txt
What you need is to add a requirements.txt. This is a file where one specifies the dependencies of a project. For example, your program may depend on a specific NumPy version; by adding it to the requirements.txt file, another user who needs to install the project's dependencies can automate that easily.
If you are using a virtual environment, you can simply generate the file by doing a pip freeze > requirements.txt. Otherwise you might need to add the used libraries to this file yourself.
To install the dependencies in their own environment, other users then execute the following pip command instead of installing every single module:
$ pip install -r requirements.txt
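For illustration, a requirements.txt is just a plain text file with one requirement per line; the package names and version pins below are hypothetical examples.

```
numpy==1.24.3
pandas==2.0.1
matplotlib>=3.7
```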
As mentioned in the comments, using the requirements file is the way to go and a common standard. You can create the requirements file using the following pip command:
$ cd <root directory>
$ pip freeze > requirements.txt

Get list of filenames pip would install from local folder

I am using pip to install requirements from a local folder containing .whl (wheel) files. I.e., using
pip install [package] --no-index --find-links /some/local/folder
I'm interested to know how I can get the filenames of the wheels pip WOULD install without actually installing them, so that I can copy those wheels to create a deployment.
I've seen some work on a pip "dry run" or "resolve" feature but this is still work in progress. Ideally I would also want transitive wheels to be resolved as well, but this is optional (e.g. pandas also requires numpy).
Example input (requirements.txt):
aiofiles
pandas
Desired output (filenames from /some/local/folder):
aiofiles-0.5.0-py3-none-any.whl
pandas-0.23.4-cp37-cp37m-manylinux1_x86_64.whl
It's the next day and I've managed to find a solution myself. The following copies all needed wheels, including transitive dependencies, into the destination you specify:
pip download -r requirements.txt --no-index --find-links /some/local/folder --dest /will/copy/wheels/here
pip download reference
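As an aside, newer pip releases (22.2 and later, if memory serves) grew a dry-run mode that answers the original question more directly; a sketch, assuming a recent pip:

```shell
# Sketch: resolve against the local folder without installing anything and
# write the resolution to report.json; the chosen wheel filenames appear
# in the "download_info" URLs of the report's "install" entries.
pip install --dry-run --no-index --find-links /some/local/folder \
    --report report.json -r requirements.txt
```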

Pip install binary and preserve requirements.txt

I'm making a Python package that depends on spaCy. spaCy works with binary language models, so I have their URLs listed at the end of my requirements.txt:
https://github.com/explosion/spacy-models/releases/download/es_core_news_sm-2.0.0/es_core_news_sm-2.0.0.tar.gz#egg=spacy-english-model
But if I freeze the environment the package does not appear with the URL for download:
spacy-english-model==2.0.0
So if I add a package this way, I can't pip install it and then pip freeze. How can I specify the package in requirements.txt so that its URL shows up when freezing?
You don't need to use pip freeze to distribute your package. When writing a package you'll need to add the models to your requirements.txt file, according to the documentation, like so:
https://github.com/explosion/spacy-models/releases/download/es_core_news_sm-2.0.0/es_core_news_sm-2.0.0.tar.gz
You can see this done here (check the dev-requirements.txt).
I don't know where pip freeze got spacy-english-model from. I'd start with a fresh Python virtualenv and test everything again.
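One possible fix, assuming a reasonably recent pip: use a PEP 508 direct URL reference in requirements.txt, which pip freeze preserves in the name @ URL form (the package name below is taken from the question's URL and may need adjusting):

```
es_core_news_sm @ https://github.com/explosion/spacy-models/releases/download/es_core_news_sm-2.0.0/es_core_news_sm-2.0.0.tar.gz
```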

Installing Packages Offline with Pip, but for some reason it will not recognize a wheel?

I'm trying to make a script to install python packages offline. Here's what I've done:
Uninstall Python and delete old Python 2.7 folder on C:
Install clean copy of Python 2.7.16
Install all required packages via pip install (with internet connection) like normal
In the command prompt, navigate to the folder I am going to put my wheels into and do pip freeze > requirements.txt
Do pip download -r requirements.txt
Uninstall python, delete Python 2.7 again, install python, test script
My script is literally: python -m pip install --no-index --find-links . -r requirements.txt
It installs packages normally until it hits cffi. For some reason, it does not recognize this package even though it is in the folder. In the folder, its name is cffi-1.12.2-cp27-cp27m-win32.whl; in the requirements file, it's listed as cffi==1.12.2. They're the same version, so I'm not quite sure what is causing this, and I couldn't really find any questions about a similar problem where only one package failed and the versions and procedure were otherwise the same. If someone could steer me in the right direction, that would be great. Thank you! All I can think of is to install it explicitly at the beginning of the script, but I really don't want to do it that way if I can avoid it (in case I run into the same issue with other packages later).
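One way to debug this, assuming a pip new enough to have the debug subcommand (19.2+): list the wheel tags the running interpreter accepts and check whether the wheel's tag (cp27-cp27m-win32) is among them. If it isn't (for example because the reinstalled Python is 64-bit while the wheel is win32), pip will silently skip the file.

```shell
# Sketch: print the wheel tags this interpreter/pip combination accepts,
# then look for the tag of the problematic cffi wheel.
python -m pip debug --verbose | grep -i "cp27-cp27m-win32"
```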
