pip build-system has unpinned dependencies that break on update - python

I'm looking to pin Python dependencies installed as part of the pip build system. Here is the scenario I'm working with:
I'm using a requirements file to install a 3rd party package (from a git+ssh source, but I doubt that matters).
i.e., requirements.txt:
git+ssh://git@gitremote/some/path/included-project.git@1.2.3#egg=included-project
Then, of course:
pip install -r requirements.txt
That "included-project" has a pyproject.toml that looks like this:
[build-system]
requires = ["poetry>=0.12"]
build-backend = "poetry.masonry.api"
Because poetry is not pinned, the newest version gets installed. However, the new version of poetry appears to pull in keyring, which pulls in SecretStorage, which pulls in cryptography... none of which are pinned. The new version of cryptography that gets pulled in recently started breaking my build because Rust is missing from the build environment.
I want my build to be deterministic, so I'd like to pin all the dependencies that poetry installs as part of the "build-system". This build-system mechanism and PEP 517 are new to me; my understanding is that pip creates a virtualenv of sorts just to install the build backend (poetry and its dependencies) and then discards it after the build is done. And as I understand it, that virtualenv is different from the virtualenv created by the normal install.
As such, I'm not sure how to pin packages inside of that temporary virtualenv that build-system creates. How can I do this?
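One workaround, sketched here on the assumption that you control the installing environment but not included-project's pyproject.toml: pre-install pinned versions of the build backend and its problem dependencies yourself, then disable pip's isolated build environment with --no-build-isolation so those pins are actually used (the version numbers below are illustrative, not recommendations):
# Pin the build backend and the transitive dependencies that were breaking the build
python -m pip install "poetry==1.1.12" "cryptography==3.3.2"
# Build without the isolated environment so the pins above take effect
python -m pip install --no-build-isolation -r requirements.txt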

Related

List Python packages that will be installed from requirements.txt

In my requirements.txt I have packages defined in the following manner:
Django ~= 2.2.0
This means that when I run pip install -r requirements.txt, pip will find the latest available 2.2.x version and install it along with all its dependencies.
What I need is requirements-formatted list of all packages with explicit versions that will be installed but without actually installing any packages. So example output would be something like:
Django==2.2.23
package1==0.2.1
package2==1.4.3
...
So in other words I'm looking for something like pip freeze results but without installing anything.
pip-compile is what you need!
Doc: https://github.com/jazzband/pip-tools
python -m pip install pip-tools
pip-compile requirements.txt --output-file requirements-all.txt
The pip-compile command lets you compile a fully pinned requirements file from your dependencies; this way you can pip install your dependencies and always get the same environment.
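For example, to reproduce the same environment elsewhere from the compiled file:
python -m pip install -r requirements-all.txt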
TL;DR
Try pipdeptree or pip-tree.
Explanation
pip, contrary to most package managers, doesn't have a big dependency graph to look up. Instead, it lets arbitrary setup code execute, which pulls in the dependencies automatically. This means that, for example, a package could manage its dependencies in some way other than putting them in requirements.txt (see fastai for an example of a project that handles dependencies differently).
So, theoretically, there is no way to see all the dependencies other than to actually run an install in an isolated environment, see what was pulled in, then delete the environment (because the same code that performs the installation may be what determines the dependencies). You could actually do that with venv.
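A minimal sketch of that probe-and-discard approach (the paths are illustrative):
python -m venv /tmp/probe-env                        # throwaway environment
/tmp/probe-env/bin/pip install -r requirements.txt   # actually resolve and install
/tmp/probe-env/bin/pip freeze > pinned.txt           # record the exact versions pulled in
rm -rf /tmp/probe-env                                # discard the environment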
In practice, tools like pipdeptree or pip-tree fetch the dependencies based on some standardization of the requirements (most packages declare their dependencies separately from the installation code, and let pip handle both).

How to build python project based on pyproject.toml

I would like to understand the current state of Python build systems and requirements management.
Imagine that I checked out the sources of some project that uses poetry (or pipenv), and this project has a pyproject.toml file with a build system specified. Of course I can look into pyproject.toml, see that this one uses Poetry, install poetry and run poetry install, but I would like to avoid that.
Question: Is there a build-system-agnostic way to build Python project?
By "build" I mean install all the necessary requirements for the project to be run in-place.
With requirements.txt I would achieve that by running pip install -r requirements.txt.
To install the project (and its dependencies), recent versions of pip are perfectly capable of doing this:
path/to/python -m pip install path/to/project
or
path/to/python -m pip install --editable path/to/project
To build distributions of the project, currently build is the only build back-end agnostic tool I know of:
python -m build
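build itself is installed from PyPI; a typical invocation from the project root looks like this (the resulting sdist and wheel land in dist/):
python -m pip install build
python -m build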
For the content of the pyproject.toml itself, see:
https://stackoverflow.com/a/64151860

How do I "pretend" to install a package using pip?

This feels like such a simple question, but I can't find any reference in the pip documentation and the only question that seemed relevant mentions a flag that has apparently been deprecated since version 1.5 (version 8.1 is current at the time of this writing).
How do I "pretend" to install a package or list of packages using pip, without actually installing them? I have two separate use cases for this:
I need to see what packages out of a long (~70 line) requirements.txt are missing, without actually installing them; seeing what requirements are already satisfied without installing the missing requirements would satisfy this for me.
Finding the dependencies for a package that I have not yet installed on my computer, without using something like Portage or Aptitude.
There is also the pretty useful pip-tools package that provides a pip-sync tool which you can execute in a "dry run" mode against your requirements file(s):
$ mkvirtualenv test_so
New python executable in test_so/bin/python
Installing setuptools, pip, wheel...done.
...
(test_so) $ pip install pip-tools
...
Installing collected packages: six, click, first, pip-tools
(test_so) $ echo "Django==1.6.11" > requirements.txt
(test_so) $ pip-sync --dry-run requirements.txt
Would install:
Django==1.6.11
Also, here is a partially relevant thread: Check if requirements are up to date.
Per the pip documentation, the proper way to generate the requirements.txt file is via pip freeze > requirements.txt. Hopefully this is what you wanted.
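Note that newer versions of pip (22.2 and later) also ship a built-in dry-run mode, which may cover both use cases directly:
python -m pip install --dry-run -r requirements.txt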

Python - Best way to auto install dependencies

I have a python project hosted on PyPI. I've discovered I have some dependency conflicts that need to be pinned. I know pip looks at the install_requires key in setup.py for dependencies, but I've read it's best to place pinned dependencies in a requirements.txt file. I've generated this file using pip freeze (see below), but I am unsure whether pip install project is sufficient to install the dependencies as well.
# requirements.txt
numpy==1.9.2
pandas==0.16.2
I would like to make the simplest installation process for the user. For a package hosted on PyPI:
How do I set up requirements so the user can simply pip install the project and get all of its pinned dependencies automatically (similar to conda)?
Must install_requires=['numpy', 'pandas'] be included? If so, how do I best set it up to install the pinned versions only?
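For reference, a minimal sketch of what pinning directly in setup.py looks like (the project name and metadata are illustrative):
from setuptools import setup, find_packages

setup(
    name="yourproject",  # illustrative name
    version="0.1.0",
    packages=find_packages(),
    # pip reads install_requires when installing from PyPI;
    # exact pins here install precisely the versions from the question
    install_requires=[
        "numpy==1.9.2",
        "pandas==0.16.2",
    ],
)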

How to specify dependencies for your python code

I am working in a team and wrote some python code that uses libraries that need to be installed separately (because they are not part of the standard Python distribution). How should I specify those? What is the right/correct/pythonic way to do this?
I personally use pip install -r requirements.txt
https://pip.pypa.io/en/latest/user_guide.html#requirements-files
Check out the tool called pip. It's what most Python projects use these days.
Typically, one would do the following (for example, we want to install the requests package for our new project):
pip install requests
and then
pip freeze > requirements.txt
Now, we have installed requests on our system and saved the dependency version to a file which we can distribute with our project.
At this point, requirements.txt contains:
requests==2.7.0
To install the same set of requirements (in our case only the requests package) on some other system, one would do the following:
pip install -r requirements.txt
You need to make a setup.py file for your package that specifies required packages. You may need to reorganize your file structure and include data and a manifest.
Then create a distribution of your package, e.g. a wheel file.
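A minimal sketch of producing that wheel from a setup.py-based project (assuming the wheel package is installed):
python -m pip install wheel
python setup.py bdist_wheel   # writes the .whl into dist/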
Then, when you run pip install your_package_distro.whl, pip will determine whether your package's dependencies are met and install them from PyPI, unless you specify another package source (e.g. https://pypi.anaconda.org/).
Read through the following references to distribute your code:
Python 2.7.10 documentation - Distributing Python Modules, Section 2: Writing the Setup Script
Setuptools - Building and Distributing Packages with Setuptools
Hitchhiker's Guide to Packaging
Hitchhiker's Guide to Python - Packaging your Code
