Pip install binary and preserve requirements.txt - python

I'm making a Python package that depends on spacy. spaCy works with binary language models, so I have their URLs listed at the end of my requirements.txt:
https://github.com/explosion/spacy-models/releases/download/es_core_news_sm-2.0.0/es_core_news_sm-2.0.0.tar.gz#egg=spacy-english-model
But if I freeze the environment, the package does not appear with its download URL:
spacy-english-model==2.0.0
So if I add a package this way, I can't just pip install it and then pip freeze. How can I specify the package in requirements.txt so that its URL shows up when freezing?

You don't need to use pip freeze to distribute your package. When writing a package, you'll need to add the models to your requirements.txt file, according to the documentation, like so:
https://github.com/explosion/spacy-models/releases/download/es_core_news_sm-2.0.0/es_core_news_sm-2.0.0.tar.gz
You can see that happening here (check the dev-requirements.txt).
I don't know where pip freeze got spacy-english-model from. I'd start with a fresh Python virtualenv and test everything again.
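For what it's worth, newer versions of pip (roughly 20.1 and later, which record direct-URL metadata per PEP 610) support direct references in requirements.txt, and pip freeze will reproduce the URL for packages installed that way. A minimal sketch, reusing the model URL from the question (the name on the left must match the package's actual distribution name):
# requirements.txt -- a PEP 508 direct reference; pip freeze preserves the URL
es_core_news_sm @ https://github.com/explosion/spacy-models/releases/download/es_core_news_sm-2.0.0/es_core_news_sm-2.0.0.tar.gz
After pip install -r requirements.txt, pip freeze emits the same es_core_news_sm @ https://... line instead of a bare version pin.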


List Python packages that will be installed from requirements.txt

In my requirements.txt I have packages defined in the following manner:
Django ~= 2.2.0
This means that when I run pip install -r requirements.txt, pip will find the latest available 2.2.x version and install it along with all its dependencies.
What I need is a requirements-formatted list of all packages with explicit versions that will be installed, but without actually installing any packages. So an example output would be something like:
Django==2.2.23
package1==0.2.1
package2==1.4.3
...
So, in other words, I'm looking for something like the pip freeze output, but without installing anything.
pip-compile is what you need!
Docs: https://github.com/jazzband/pip-tools
python -m pip install pip-tools
pip-compile requirements.txt --output-file requirements-all.txt
The pip-compile command lets you compile a pinned requirements file from your dependencies; this way you can pip install your dependencies and always get the same environment.
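To illustrate, a requirements.txt containing only Django ~= 2.2.0 might compile to something like the following (the exact versions shown are illustrative and depend on when you run the command):
django==2.2.23
    # via -r requirements.txt
pytz==2021.1
    # via django
sqlparse==0.4.1
    # via django
Nothing gets installed in the process; pip-compile only resolves the dependency tree and writes out the pinned list.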
TL;DR
Try pipdeptree or pip-tree.
Explanation
pip, contrary to most package managers, doesn't have a big dependency graph to look up. What it does instead is let arbitrary setup code be executed, which automatically pulls in the dependencies. This means that, for example, a package could manage its dependencies in another way than putting them in requirements.txt (see fastai for an example of a project that handles its dependencies differently).
So there is, theoretically, no other way to see all the dependencies than to actually run an install in an isolated environment, see what was pulled in, and then delete the environment (because it could well be the same part of the code that performs the installation and that declares the dependencies). You could actually do that with venv.
In practice, tools like pipdeptree or pip-tree fetch the dependencies based on some standardization of the requirements (most packages separate the dependencies from the installation and actually let pip handle both).
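A minimal sketch of the isolated-environment approach described above, assuming a POSIX shell (the paths and file names are placeholders):
# create a throwaway environment, install into it, record what was pulled in
python3 -m venv /tmp/probe-env
/tmp/probe-env/bin/pip install -r requirements.txt
/tmp/probe-env/bin/pip freeze > pinned-requirements.txt
# discard the environment, keeping only the pinned list
rm -rf /tmp/probe-env
Unlike pip-compile, this really installs the packages, but into an environment you immediately throw away.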

How to pip freeze source package

I am learning how to use venv here: https://packaging.python.org/guides/installing-using-pip-and-virtual-environments/#installing-from-source
And it says I can install a source package by:
python3 -m pip install .
That works, but now if I do pip freeze, I see:
my-package # file:///Users/joesmith/my-package
The problem is that if I export this to a requirements.txt and try to install the environment on another machine, it won't work, because the path to the source will obviously have changed.
What is the proper way to use a source package locally like I did, but also export it afterwards so that another person can recreate the environment and run the code on another machine?
Pip has support for VCSs like Git. You can upload your code to a Git host (e.g. GitHub, GitLab, ...) and then reference it in requirements.txt like this:
git+http://git.example.com/MyProject#egg=MyProject
https://pip.pypa.io/en/stable/cli/pip_install/#vcs-support
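To make the install reproducible rather than just shareable, you would typically pin the VCS reference to a tag or commit. A hedged sketch (the host, MyProject, and the v1.0 tag are placeholders):
# pin to a tag or a full commit hash so every machine gets identical code
git+https://git.example.com/MyProject.git@v1.0#egg=MyProject
Pinning to an exact commit hash gives the strongest guarantee that everyone installs the same code.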
You would install the package from PyPI rather than from source,
i.e. pip install requests
That way, other developers will also be able to run your project easily.

Dependencies in Python project

I am currently working on a Python project, and I want users to automatically have access to the dependencies that I used. Do they have to download (e.g. pip install) them manually? If so, is there an easy way for them to download the necessary packages?
virtualenv is what you need. You install the packages your project needs while developing it. After coding, you can run pip freeze > requirements.txt to save all the packages to requirements.txt; pip install -r requirements.txt will then install all of them automatically.
Furthermore, Docker is even better for releasing projects to other machines.
You need to create a virtual environment (see the link on how to), then use pip freeze > requirements.txt to store your environment's dependencies in a file; your user can then simply run pip install -r requirements.txt to install them in one go.
See the documentation for more details.
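A minimal end-to-end sketch of that workflow on a POSIX shell (requests stands in for whatever your project actually depends on):
python3 -m venv .venv                # create the virtual environment
source .venv/bin/activate            # on Windows: .venv\Scripts\activate
pip install requests                 # install dependencies while developing
pip freeze > requirements.txt        # record the exact versions
Then, on another machine, inside a fresh virtual environment, pip install -r requirements.txt recreates the same set of packages.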

Setup.py for uninstalling via pip

According to another question, pip offers a facility for uninstalling eggs, which its help also indicates.
I have a project that, once installed, has a structure in my local site-packages folder that looks like this:
projecta/
projecta-1.0-py2.6.egg-info/
Using an up-to-date version, pip uninstall projecta asks me the following question:
/path/to/python2.6/site-packages/projecta-1.0-py2.6.egg-info
Proceed (y/n)?
Answering y will remove the .egg-info directory, but not the main projecta directory, without reporting any sort of error. Why doesn't pip know to remove this directory?
The project itself is installed via a setup.py file using distutils. Are there any special settings I could/should use in that file to help pip with the removal process?
If I recall correctly, pip knows how to uninstall packages installed via setuptools/distribute, not raw distutils.
Pip relies on some setuptools features, like the --record option, which stores package metadata (and that is what allows pip to uninstall a package's files).
Try doing:
$ pip install /path/to/projecta
$ pip uninstall projecta
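For reference, switching a project from raw distutils to setuptools is usually a small change in setup.py. A minimal sketch (the name and version are placeholders matching the projecta example above):
# setup.py -- using setuptools, so pip records the metadata it needs to uninstall cleanly
from setuptools import setup, find_packages

setup(
    name="projecta",
    version="1.0",
    packages=find_packages(),
)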

Any productive way to install a bunch of packages

I had one machine with my commonly used Python packages installed, and I would like to install the same packages on another machine, or on the same machine with a different Python version. I would like to know whether pip, easy_install, or some other method can let me install those packages in a batch. Perl has something like a bundle package; how do I do that in Python?
Pip has some great features for this.
It lets you save all the requirements of an environment to a file using pip freeze > reqs.txt
You can then later run pip install -r reqs.txt and you'll get the exact same environment.
You can also bundle several libraries into a .pybundle file with the command pip bundle MyApp.pybundle -r reqs.txt, and later install it with pip install MyApp.pybundle.
I guess that's what you're looking for :)
I keep a requirements.txt file in one of my repositories that has all my basic Python requirements, and I use pip to install them on any new machine.
Each of my projects also has its own requirements.txt file that contains all of its dependencies, for use with virtualenv.
