How to pin pipenv requirements with brackets? - python

I just did:
pipenv install django[argon2]
And this changed my Pipfile:
-django = "==2.1.5"
+django = {extras = ["argon2"],version = "*"}
I want to pin the requirements. First I will pin django to 2.1.5:
django = {extras = ["argon2"],version = "==2.1.5"}
What about argon2? Is that a separate package? There is no package named just argon2 when I do pip freeze, only this:
$ pip freeze | grep -i argon2
argon2-cffi==19.1.0
What is that? How do I fully pin django[argon2]?

In my Pipfile, I found this works by double-quoting the package name (together with its extras) and the version:
[packages]
"django[argon2]" = "==2.1.5"

From the Requirement Specifier docs for pip, you can combine these forms:
SomeProject == 1.3
SomeProject >=1.2,<2.0
SomeProject[foo, bar]
This means you can do this command:
pipenv install "django[argon2]==2.1.5"
Which generates this Pipfile entry:
django = {version = "==2.1.5", extras = ["argon2"]}
That command installs Django and:
- pins it at version 2.1.5 (or whatever version is specified after ==)
- includes Django's optional support for Argon2
There is no argon2 package. The [argon2] part means it is an optional feature ("extra") of Django, not a separate package. What gets installed are the argon2-cffi and cffi packages, which are the optional dependencies Django needs for its Argon2 support. You can see this in the Pipfile.lock:
"argon2-cffi": {
"hashes": [
...
],
"version": "==20.1.0"
},
"cffi": {
"hashes": [
...
],
"version": "==1.14.6"
},
"django": {
"extras": [
"argon2"
],
"hashes": [
...
],
"index": "pypi",
"version": "==2.1.5"
},
This is also mentioned in the Django docs:
To use Argon2 as your default storage algorithm, do the following:
This can be done by running python -m pip install django[argon2], which is equivalent to python -m pip install argon2-cffi (along with any version requirement from Django’s setup.cfg)
The difference between running pipenv install django[argon2] and installing django and argon2-cffi separately (as in this other answer) is that, during installation, you let Django's own packaging metadata decide which version of argon2-cffi to use. This is better because the Django maintainers presumably wrote and tested the Argon2 support against a compatible version of argon2-cffi.
This can be seen in Django's setup.cfg file (for Django 3.2.6 at the time of this writing):
[options.extras_require]
argon2 = argon2-cffi >= 19.1.0
which indicates that when the optional [argon2] feature is requested, a version of argon2-cffi in that range must be installed. As James O'Brien commented: "A specific version of django would require specific versions of the extras."

If you want full control, you can pin both packages explicitly:
pipenv install "django==2.1.5" "argon2-cffi==19.1"
Is that what you need?

Related

How to build multiple packages from a single python module using pyproject.toml and poetry?

I want to achieve behavior similar to the Dask library: with pip it is possible to install dask, dask[dataframe], dask[array] and others. They do it in setup.py with a packages key like this. If I install only dask, then dask[dataframe] is not installed, and they warn you about this when executing the module.
I found this in the poetry documentation but when I execute poetry build I only get one .whl file with all of the packages within.
How can I package my module to be able to install specific parts of a library using poetry?
Actually, the Dask example does not install sub-packages separately; it just installs the optional dependencies separately, as explained in this link.
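For reference, the mechanism such projects use in setup.py is setuptools' extras_require. A minimal sketch, with hypothetical package and dependency names:
# setup.py -- minimal sketch of setuptools extras (hypothetical names)
from setuptools import setup, find_packages

setup(
    name="mylib",                      # hypothetical project name
    version="0.1.0",
    packages=find_packages(),
    install_requires=["numpy"],        # always installed
    extras_require={
        # installed only with: pip install "mylib[dataframe]"
        "dataframe": ["pandas"],
    },
)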
In order to accomplish the same behavior using poetry, you need to use extras (as mentioned by user @sinoroc in this comment).
The example pyproject.toml from the poetry extras page is this:
[tool.poetry]
name = "awesome"
[tool.poetry.dependencies]
# These packages are mandatory and form the core of this package’s distribution.
mandatory = "^1.0"
# A list of all of the optional dependencies, some of which are included in the
# below `extras`. They can be opted into by apps.
psycopg2 = { version = "^2.7", optional = true }
mysqlclient = { version = "^1.3", optional = true }
[tool.poetry.extras]
mysql = ["mysqlclient"]
pgsql = ["psycopg2"]
Using poetry build --format wheel still creates a single wheel file.
To install a specific set of extra dependencies from that wheel with pip, use:
pip install "wheel_filename.whl[mysql]"

Download dependencies declared in pyproject.toml using Pip

I have a Python project that doesn't contain requirements.txt.
But it has a pyproject.toml file.
How can I download the packages (dependencies) required by this Python project and declared in pyproject.toml, using the pip package manager (instead of the build tool Poetry)?
So instead of pip download -r requirements.txt, something like pip download -r pyproject.toml.
Here is an example of the .toml file:
[build-system]
requires = [
    "flit_core >=3.2,<4",
]
build-backend = "flit_core.buildapi"

[project]
name = "aedttest"
authors = [
    {name = "Maksim Beliaev", email = "beliaev.m.s@gmail.com"},
    {name = "Bo Yang", email = "boy@kth.se"},
]
readme = "README.md"
requires-python = ">=3.7"
classifiers = ["License :: OSI Approved :: MIT License"]
dynamic = ["version", "description"]
dependencies = [
    "pyaedt==0.4.7",
    "Django==3.2.8",
]

[project.optional-dependencies]
test = [
    "black==21.9b0",
    "pre-commit==2.15.0",
    "mypy==0.910",
    "pytest==6.2.5",
    "pytest-cov==3.0.0",
]
deploy = [
    "flit==3.4.0",
]
To install the core dependencies, run:
pip install .
If you need the test (develop) environment (we use test because that is the name defined in the .toml file; you can use any name):
pip install .[test]
To install from a wheel:
pip install C:\git\aedt-testing\dist\aedttest-0.0.1-py3-none-any.whl[test]
pip supports installing pyproject.toml dependencies natively.
As of version 10.0, pip supports projects declaring dependencies that are required at install time using a pyproject.toml file, in the form described in PEP 518. When building a project, pip will install the required dependencies locally, and make them available to the build process. Furthermore, from version 19.0 onwards, pip supports projects specifying the build backend they use in pyproject.toml, in the form described in PEP 517.
From the project's root, use pip's local project install:
python -m pip install .
You can export the dependencies to a requirements.txt and use pip download afterwards:
poetry export -f requirements.txt > requirements.txt
pip download -r requirements.txt
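Alternatively, since this project declares its metadata in the standard [project] table (PEP 621) rather than in a Poetry-specific section, a small script can produce the requirements file without Poetry at all. A minimal sketch, assuming Python 3.11+ for the stdlib tomllib module and the pyproject.toml layout shown above:
# write requirements.txt from the [project] table of pyproject.toml
import tomllib  # Python 3.11+; on older versions use the third-party "tomli" package

with open("pyproject.toml", "rb") as f:
    data = tomllib.load(f)

deps = list(data["project"]["dependencies"])
# optionally add an extras group, e.g. the "test" group from the example above
deps += data["project"].get("optional-dependencies", {}).get("test", [])

with open("requirements.txt", "w") as f:
    f.write("\n".join(deps) + "\n")
After that, pip download -r requirements.txt works as before.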

Pipenv: dependencies of platform specific packages are installed unconditionally?

I am trying to integrate pipenv into my new project because I need to have some dependencies for development only and I couldn't find an easy way to have additional dev dependencies with pip or venv.
However, I ran into an issue when trying it:
My project depends on pypiwin32 when it's used on Windows. Because it's a Windows-only dependency, I installed it with the command:
pipenv install "pypiwin32 ; platform_system == 'Windows'"
which successfully added this dependency to my Pipfile with the platform restriction:
# Pipfile
[[source]]
name = "pypi"
url = "https://pypi.org/simple"
verify_ssl = true
[dev-packages]
# dependencies...
[packages]
# dependencies...
pypiwin32 = {markers = "platform_system == 'Windows'",version = "*"}
[requires]
python_version = "3.7"
The problem is that pypiwin32 depends on the pywin32 package, and when I looked in the Pipfile.lock file I saw that pywin32 is not restricted to Windows:
# Pipfile.lock
"pypiwin32": {
"index": "pypi",
"markers": "platform_system == 'Windows'",
"version": "==223"
},
"pywin32": {
"version": "==224"
},
which is weird. I tested it by changing the restriction to 'Linux' and installing the dependencies into a new environment without the Pipfile.lock. It ignored pypiwin32 as expected, but it installed pywin32 anyway, even though it's not listed in the Pipfile and is only a dependency of a package that was skipped.
This is a serious problem: if I decide to deploy or develop on a Linux machine, it will simply not work, because pywin32 isn't available on Linux or macOS (as far as I know).
Is there a way to correctly mark the dependencies of a package with the restrictions of the package?
I don't want to manually edit Pipfile.lock (I also don't think it will work because when I tried to install it into a new environment, I copied the Pipfile only).
Alternatively, is there a better way to manage my dependencies? I need the ability to specify development dependencies that won't be installed in a production environment, and I need the ability to specify platform-specific dependencies. pip with requirements.txt can't handle the first requirement as far as I know, which is why I tried switching to pipenv.
I tried looking it up with no success. Any help would be appreciated.
Adding platform-specific entries for both the pypiwin32 and pywin32 packages explicitly in the Pipfile should solve the issue.
pypiwin32 = {version = "*", sys_platform = "== 'win32'"}
pywin32 = {version = "*", sys_platform = "== 'win32'"}
The prerequisite package pywin32 will in that case also be ignored on Linux.
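If you prefer the markers style that the question's Pipfile already uses, the equivalent entries would look roughly like this (a sketch; not verified against every pipenv version):
[packages]
pypiwin32 = {version = "*", markers = "platform_system == 'Windows'"}
pywin32 = {version = "*", markers = "platform_system == 'Windows'"}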

How can I make setuptools install a package from another source that's also available on pypi with the same version number?

It's a similar question to How can I make setuptools install a package that's not on PyPI? but not the same.
I would like to use a forked version of some package, but setuptools ignores the dependency link (as the fork has the same version number).
Is there a way to force using the link from the dependency_links? Or is the only way to change the version number in the forked repo?
requires = [
    ...
    'pyScss==1.1.3',
    ...
]
dependencies = [
    'https://github.com/nadavshatz/pyScss/zipball/master#egg=pyScss-1.1.3'
]
Update
Weird: apparently it works if this package is the only one in the required list that is not installed yet. If there's another missing package, it downloads this one from PyPI instead.
I believe you can just use dependency_links as described in that question:
from setuptools import setup

setup(
    name = 'mypkg',
    version = '0.0.1',
    description = 'Foo',
    author = 'bar',
    author_email = 'bar@example.com',
    install_requires = ['pyScss==1.1.3'],
    dependency_links = [
        'https://github.com/nadavshatz/pyScss/zipball/master#egg=pyScss-1.1.3'
    ]
)
Tested using python setup.py develop
You probably want to rename the egg to emphasize that it's a fork: http://www.python.org/dev/peps/pep-0386/
Outside of setup.py, you can enforce this locally using requirements.txt and pip. While this won't make your package depend on the fork, you can easily document it as the recommended way to install.
$ cat requirements.txt
https://github.com/nadavshatz/pyScss/zipball/master#egg=pyScss-1.1.3
$ pip install -r requirements.txt
I ended up doing something very similar to the answer in stackoverflow.com/a/17442663/368102.
I need a requests-file GitHub package that name-conflicts with a different requests-file package on PyPI. They both have a version 1.0, and the PyPI version has some higher versions.
The workaround in my ias_tools/setup.py looks like this:
setup(
    ...
    install_requires=[
        'requests-file<=99.99',
    ],
    dependency_links=[
        'https://github.com/jvantuyl/requests-file/archive/b0a7b34af6e287e07a96bc7e89bac3bc855323ae.zip#egg=requests-file-99.99'
    ]
)
In my case, I'm using pip so I also had to use --process-dependency-links:
% pip install --process-dependency-links ./ias_tools
You are using pip version 6.0.6, however version 6.1.1 is available.
You should consider upgrading via the 'pip install --upgrade pip' command.
Processing ./ias_tools
DEPRECATION: Dependency Links processing has been deprecated and will be removed in a future release.
Collecting requests-file<=99.99 (from ias-tools==0.1)
Downloading https://github.com/jvantuyl/requests-file/archive/b0a7b34af6e287e07a96bc7e89bac3bc855323ae.zip
Requirement already satisfied (use --upgrade to upgrade): requests>=1.1.0 in ./venv/lib/python2.7/site-packages (from requests-file<=99.99->ias-tools==0.1)
Installing collected packages: ias-tools, requests-file
Running setup.py install for ias-tools
Running setup.py install for requests-file
Successfully installed ias-tools-0.1 requests-file-1.0
I'm not too worried about the deprecation notice, as a pull request was submitted to pip to deprecate the deprecation (after a discussion about it).
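A hedged aside from hindsight: --process-dependency-links was eventually removed from pip, and on current pip/setuptools the same effect can usually be achieved with a PEP 508 direct reference instead of dependency_links (note that PyPI will reject uploads of packages whose requirements use direct URLs):
install_requires=[
    'requests-file @ https://github.com/jvantuyl/requests-file/archive/b0a7b34af6e287e07a96bc7e89bac3bc855323ae.zip',
],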

Does pip handle extras_requires from setuptools/distribute based sources?

I have package "A" with a setup.py and an extras_require section like:
extras_require = {
    'ssh': ['paramiko'],
},
And a package "B" that depends on "A":
install_requires = ['A[ssh]']
If I run python setup.py install on package B, which uses setuptools.command.easy_install under the hood, the extras_require is correctly resolved, and paramiko is installed.
However, if I run pip install /path/to/B or pip install http://.../b-version.tar.gz, package A is installed, but paramiko is not.
Because pip "installs from source", I'm not quite sure why this isn't working. It should be invoking the setup.py of B, then resolving & installing dependencies of both B and A.
Is this possible with pip?
We use setup.py and pip to manage development dependencies for our packages, though you need a newer version of pip (we're using 1.4.1 currently).
#!/usr/bin/env python
from setuptools import setup
from myproject import __version__

required = [
    'gevent',
    'flask',
    ...
]

extras = {
    'develop': [
        'Fabric',
        'nose',
    ]
}

setup(
    name="my-project",
    version=__version__,
    description="My awesome project.",
    packages=[
        "my_project"
    ],
    include_package_data=True,
    zip_safe=False,
    scripts=[
        'runmyproject',
    ],
    install_requires=required,
    extras_require=extras,
)
To install the package:
$ pip install -e . # only installs "required"
To develop:
$ pip install -e .[develop] # installs develop dependencies
This is supported since pip 1.1, which was released in February 2012 (one year after this question was asked).
The answer from @aaronfay is completely correct, but it may be nice to point out that if you're using zsh, the install command pip install -e .[dev] needs to be replaced by pip install -e ".[dev]", because zsh interprets the square brackets as globbing characters.
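In other words (the same command, only the quoting differs):
# bash / sh: the brackets pass through unchanged
pip install -e .[dev]
# zsh: [ ] are glob characters, so quote the argument
pip install -e ".[dev]"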
