I'm on NixOS. How can I install packages on PyPy?

I'm on NixOS 22.11, and I'm trying to install pypy3 along with packages. In particular, I'm modifying /etc/nixos/configuration.nix.
Installing plain CPython with packages works fine:
environment.systemPackages = with pkgs; [
  (python3.withPackages (p: with p; [
    scipy
    matplotlib
    torch
  ]))
];
But doing this for PyPy is a pain. I tried this:
environment.systemPackages = with pkgs; [
  (pypy3.withPackages (p: with p; [
    matplotlib
  ]))
];
And sudo nixos-rebuild switch complained that tkinter-7.3.9 is not supported on PyPy.
I also tried this:
environment.systemPackages = with pkgs; [
  (pypy3.withPackages (p: with p; [
    torch
  ]))
];
And sudo nixos-rebuild switch complained that protobuf-4.21.8 is not supported on PyPy.
Does this mean I cannot install these at all, or is it just a dependency issue? I thought NixOS was supposed to solve all dependency issues.
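For reference, those errors appear to come from nixpkgs itself: the dependencies in question (tkinter, protobuf) are marked as not supported on the PyPy interpreter, so evaluation aborts before anything is built. One possible workaround is to override the offending dependency out of the closure via the interpreter's packageOverrides. The sketch below is an untested, assumption-laden example, not a confirmed fix: it assumes pypy3 accepts packageOverrides the same way the CPython interpreters do, and that matplotlib in your nixpkgs revision exposes an enableTk argument controlling the tkinter dependency; check the matplotlib expression in nixpkgs before relying on it.

environment.systemPackages = with pkgs; [
  (let
    # Hypothetical PyPy package set where matplotlib is built without Tk
    # support, so the unsupported tkinter package is never pulled in.
    # `enableTk` is an assumption about matplotlib's override arguments.
    pypyNoTk = pypy3.override {
      packageOverrides = self: super: {
        matplotlib = super.matplotlib.override { enableTk = false; };
      };
    };
  in pypyNoTk.withPackages (p: with p; [
    matplotlib
  ]))
];

Whether the same trick helps for torch depends on why its dependency is flagged: if protobuf has no PyPy-compatible build in nixpkgs at all, there is nothing to override it with, and the failure reflects missing PyPy support upstream rather than the kind of dependency conflict Nix is designed to prevent.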

Related

Install pytorch before importing in setup.py of another package in a single shot or instruction

I have a project that makes use of a certain C package. Building the package is done as below in a setup.py file:
from setuptools import setup
import torch  # torch must already be installed for this import to succeed

if torch.cuda.is_available():
    from torch.utils.cpp_extension import BuildExtension, CUDAExtension

    setup(
        name='iou3d',
        ext_modules=[
            CUDAExtension('iou3d_cuda', [
                'src/iou3d.cpp',
                'src/iou3d_kernel.cu',
            ],
            extra_compile_args={'cxx': ['-g'],
                                'nvcc': ['-O2']})
        ],
        cmdclass={'build_ext': BuildExtension})
This is set up correctly if I already have PyTorch installed in my environment.
But if I am doing a fresh install in a clean environment using requirements.txt, and I want to install everything in one shot with pip install -r requirements.txt, I am not sure how to get PyTorch installed first so that it can be imported in setup.py.
Appreciate your help.

nrfutil - "ImportError: No module named main" on NixOS

I'm using the tool nrfutil, which is implemented in Python. To be able to use it under NixOS I was using a default.nix file that installed nrfutil into a venv. This worked very well for some time. (The last build on the build server, using Nix within an Alpine container, built the software I'm working on successfully 11 days ago.) When I do exactly the same thing (i.e. restart the CI build without changes), the build now fails, complaining about pip being incorrect:
$ nix-shell
New python executable in /home/matthias/source/tbconnect/bootloader/.venv/bin/python2.7
Not overwriting existing python script /home/matthias/source/tbconnect/bootloader/.venv/bin/python (you must use /home/matthias/source/tbconnect/bootloader/.venv/bin/python2.7)
Installing pip, wheel...
done.
Traceback (most recent call last):
File "/home/matthias/source/tbconnect/bootloader/.venv/bin/pip", line 6, in <module>
from pip._internal.main import main
ImportError: No module named main
To me it seems that the module main should exist:
$ ls -l .venv/lib/python2.7/site-packages/pip/_internal/main.py
-rw-r--r-- 1 matthias matthias 1359 10月 15 12:27 .venv/lib/python2.7/site-packages/pip/_internal/main.py
I'm not very familiar with the Python ecosystem, so I don't know how to proceed. Does anybody have a pointer on where to continue debugging? How does Python resolve modules? Why doesn't it find the module that, as far as I can tell, is present?
This is my default.nix that I use to install pip:
with import <nixpkgs> {};
with pkgs.python27Packages;

stdenv.mkDerivation {
  name = "impurePythonEnv";
  buildInputs = [
    automake
    autoconf
    gcc-arm-embedded-7

    # these packages are required for virtualenv and pip to work:
    python27Full
    python27Packages.virtualenv
    python27Packages.pip

    # the following packages are related to the dependencies of your python
    # project.
    # In this particular example the python modules listed in the
    # requirements.txt require the following packages to be installed locally
    # in order to compile any binary extensions they may require.
    taglib
    openssl
    git
    stdenv
    zlib
  ];
  src = null;
  shellHook = ''
    # set SOURCE_DATE_EPOCH so that we can use python wheels
    SOURCE_DATE_EPOCH=$(date +%s)
    virtualenv --no-setuptools .venv
    export PATH=$PWD/.venv/bin:$PATH
    #pip install nrfutil
    pip help
    # the following is required to build micro_ecc_lib_nrf52.a in the SDK
    export GNU_INSTALL_ROOT="${gcc-arm-embedded-7}/bin/"
    unset CC
  '';
}
I replaced pip install nrfutil with pip help to make sure the problem is not with the package I'm trying to install.
I'm still using Python 2.7, as nrfutil is not yet fit for Python 3.
Anyway, replacing python27 with python37 did not change the error I get when trying to start pip.
The NixOS version used locally is 19.09. Nix in the CI Docker container is nixos/nix:latest, which is the Nix package manager on Alpine Linux.
Update:
Actually, it works when I replace the call pip install nrfutil with python2.7 -m pip install nrfutil. This confuses me even more: python2.7 is exactly the binary that is in the shebang of pip:
[nix-shell:~/source/tbconnect/bootloader]$ type python2.7
python2.7 is /home/matthias/source/tbconnect/bootloader/.venv/bin/python2.7
[nix-shell:~/source/tbconnect/bootloader]$ type pip
pip is /home/matthias/source/tbconnect/bootloader/.venv/bin/pip
[nix-shell:~/source/tbconnect/bootloader]$ head --lines 2 .venv/bin/pip
#!/home/matthias/source/tbconnect/bootloader/.venv/bin/python2.7
# -*- coding: utf-8 -*-
Update 2:
I found out that another way to fix the problem is to edit .venv/bin/pip. This script tries the following import:
from pip._internal.main import main
I think this is the new module path introduced in pip 19.3, but I still have pip 19.2. When I change this line to:
from pip._internal import main
Running pip by typing pip then works.
The thing is, I have no idea why the pip script tries to load the new module path while NixOS still ships the old version of pip.
I also opened an issue for NixOS on GitHub: https://github.com/NixOS/nixpkgs/issues/71178
I got your shell derivation to work by dropping python27Packages.pip from buildInputs (the pip provided by the Nix package was apparently shadowing the newer pip that virtualenv installs into .venv):
(nix-shell) 2d [azul:/tmp/lixo12333] $
>>> pip list
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7. More details about Python 2 support in pip, can be found at https://pip.pypa.io/en/latest/development/release-process/#python-2-support
Package Version
---------------- -------
behave 1.2.6
Click 7.0
crcmod 1.7
ecdsa 0.13.3
enum34 1.1.6
future 0.18.2
intelhex 2.2.1
ipaddress 1.0.23
libusb1 1.7.1
linecache2 1.0.0
nrfutil 5.2.0
parse 1.12.1
parse-type 0.5.2
pc-ble-driver-py 0.11.4
piccata 1.0.1
pip 19.3.1
protobuf 3.10.0
pyserial 3.4
pyspinel 1.0.0a3
PyYAML 4.2b4
setuptools 41.6.0
six 1.12.0
tqdm 4.37.0
traceback2 1.4.0
virtualenv 16.4.3
wheel 0.33.6
wrapt 1.11.2
(nix-shell) 2d [azul:/tmp/lixo12333] $
and my default.nix
with import <nixpkgs> {};
with pkgs.python27Packages;

stdenv.mkDerivation {
  name = "impurePythonEnv";
  buildInputs = [
    automake
    autoconf
    gcc-arm-embedded-7

    # these packages are required for virtualenv and pip to work:
    python27Full
    python27Packages.virtualenv

    # the following packages are related to the dependencies of your python
    # project.
    # In this particular example the python modules listed in the
    # requirements.txt require the following packages to be installed locally
    # in order to compile any binary extensions they may require.
    taglib
    openssl
    git
    stdenv
    zlib
  ];
  src = null;
  shellHook = ''
    # set SOURCE_DATE_EPOCH so that we can use python wheels
    SOURCE_DATE_EPOCH=$(date +%s)
    virtualenv .venv
    export PATH=$PWD/.venv/bin:$PATH
    pip install nrfutil
    #pip help
    # the following is required to build micro_ecc_lib_nrf52.a in the SDK
    export GNU_INSTALL_ROOT="${gcc-arm-embedded-7}/bin/"
    unset CC
  '';
}

How to pin pipenv requirements with brackets?

I just did:
pipenv install django[argon2]
And this changed my Pipfile:
-django = "==2.1.5"
+django = {extras = ["argon2"],version = "*"}
I want to pin the requirements. First I will pin django to 2.1.5:
django = {extras = ["argon2"],version = "==2.1.5"}
What about argon2? Is that a separate package? There is no such package when I do pip freeze:
$ pip freeze | grep -i argon2
argon2-cffi==19.1.0
What is that? How do I fully pin django[argon2]?
In my Pipfile, I found this is possible by double-quoting the package together with its extra, and pinning the version:
[packages]
"django[argon2]" = "==2.1.5"
From the Requirement Specifier docs for pip, you can combine these forms:
SomeProject == 1.3
SomeProject >=1.2,<2.0
SomeProject[foo, bar]
This means you can run this command:
pipenv install "django[argon2]==2.1.5"
Which generates this Pipfile entry:
django = {version = "==2.1.5", extras = ["argon2"]}
That command installs Django and:
Pins Django at version 2.1.5 (or whatever is specified as ==VERSION)
Includes Django's optional support for Argon2
There is no argon2 package. The [argon2] part refers to an optional feature of Django. What gets installed are the argon2-cffi and cffi packages, which are the optional dependencies Django needs in order to use Argon2. You can see this in the Pipfile.lock:
"argon2-cffi": {
    "hashes": [
        ...
    ],
    "version": "==20.1.0"
},
"cffi": {
    "hashes": [
        ...
    ],
    "version": "==1.14.6"
},
"django": {
    "extras": [
        "argon2"
    ],
    "hashes": [
        ...
    ],
    "index": "pypi",
    "version": "==2.1.5"
},
This is also mentioned in the Django docs:
To use Argon2 as your default storage algorithm, do the following:
This can be done by running python -m pip install django[argon2], which is equivalent to python -m pip install argon2-cffi (along with any version requirement from Django’s setup.cfg)
The difference between doing pipenv install django[argon2] and installing django and argon2-cffi separately (as in this other answer) is that, during installation, you let Django's setuptools decide which version of argon2-cffi to use. This is better because the Django maintainers presumably wrote and tested the Argon2 support against a compatible version of argon2-cffi.
This can be seen in Django's setup.cfg file (for Django 3.2.6 at the time of this writing):
[options.extras_require]
argon2 = argon2-cffi >= 19.1.0
which indicates that when the optional [argon2] feature is used, argon2-cffi must be installed within that version range. As James O'Brien commented: "A specific version of django would require specific versions of the extras."
If you want full control you can:
pipenv install "django==2.1.5" "argon2-cffi==19.1"
Is that what you need?

Travis CI Python build failing on osx - "2.7 not installed"

I am trying to build a Python package with some wrapped C++ code on Mac OS X via Travis CI. This is my build config:
{
  "os": "osx",
  "env": "PYTHON=3.6 CPP=14 CLANG DEBUG=1",
  "sudo": false,
  "script": [
    "python setup.py install",
    "py.test"
  ],
  "install": [
    "if [ \"$TRAVIS_OS_NAME\" = \"osx\" ]; then\n if [ \"$PY\" = \"3\" ]; then\n brew update && brew upgrade python\n else\n curl -fsSL https://bootstrap.pypa.io/get-pip.py | $PY_CMD - --user\n fi\n fi\nif [[ \"${TRAVIS_OS_NAME}\" == \"osx\" ]]; then\n export CXX=clang++ CC=clang;\n # manually install python on osx\n brew update\n brew install python3\n brew reinstall gcc\n virtualenv venv\n source venv/bin/activate\n pip install -r requirements.txt --upgrade\nfi\n",
    "pip install -r requirements.txt --upgrade",
    "python --version"
  ],
  "language": "python",
  "osx_image": "xcode9"
}
I get the following build error:
2.7 is not installed; attempting download
Downloading archive: https://s3.amazonaws.com/travis-python-archives/binaries/osx/10.12/x86_64/python-2.7.tar.bz2
$ curl -sSf -o python-2.7.tar.bz2 ${archive_url}
curl: (22) The requested URL returned error: 403 Forbidden
Unable to download 2.7 archive. The archive may not exist. Please consider a different version.
I'm not sure what to do about this.

Does pip handle extras_requires from setuptools/distribute based sources?

I have package "A" with a setup.py and an extras_require line like:
extras_require = {
    'ssh': ['paramiko'],
},
And a package "B" that depends on package "A":
install_requires = ['A[ssh]']
If I run python setup.py install on package B, which uses setuptools.command.easy_install under the hood, the extras_require is correctly resolved, and paramiko is installed.
However, if I run pip install /path/to/B or pip install http://.../b-version.tar.gz, package A is installed, but paramiko is not.
Because pip "installs from source", I'm not quite sure why this isn't working. It should be invoking the setup.py of B, then resolving & installing dependencies of both B and A.
Is this possible with pip?
We use setup.py and pip to manage development dependencies for our packages, though you need a newer version of pip (we're using 1.4.1 currently).
#!/usr/bin/env python
from setuptools import setup

from myproject import __version__

required = [
    'gevent',
    'flask',
    ...
]

extras = {
    'develop': [
        'Fabric',
        'nose',
    ]
}

setup(
    name="my-project",
    version=__version__,
    description="My awesome project.",
    packages=[
        "my_project"
    ],
    include_package_data=True,
    zip_safe=False,
    scripts=[
        'runmyproject',
    ],
    install_requires=required,
    extras_require=extras,
)
To install the package:
$ pip install -e . # only installs "required"
To develop:
$ pip install -e .[develop] # installs develop dependencies
This is supported since pip 1.1, which was released in February 2012 (one year after this question was asked).
The answer from @aaronfay is completely correct, but it may be nice to point out that if you're using zsh, the install command pip install -e .[dev] needs to be written as pip install -e ".[dev]".
