Including GLIBC in a conda package - python

I'm learning how to create conda packages, and I am having trouble with a package that works locally but, when downloaded from Anaconda onto a different server, returns an error:
/lib64/libm.so.6: version `GLIBC_2.29' not found
The meta.yaml looks like the following:
package:
  name: app
  version: 2.4
source:
  git_url: https://gitlab.com/user/repo.git
  git_tag: v2.4
requirements:
  build:
  host:
  run:
about:
  home: https://gitlab.com/user/repo
  license: GPL-3
  license_family: GPL
  summary: blabla.
The app is built with a simple build.sh script:
#!/bin/bash
set -x
echo $(pwd)
make
BIN=$PREFIX/bin
mkdir -p $BIN
cp app $BIN
I assumed that adding glibc >=2.29 under requirements: build: would do the job, but that results in an error when running conda build .
How can I include GLIBC in the package? Is that something meant to be done manually? From the package version I can download from Anaconda, I can see that other packages are pulled in as well (e.g. libgcc-ng) that I did not mention in the meta.yaml or anywhere else.

How can I include GLIBC in the package?
You can't, for reasons explained here.
Your best bet is to build on a system (or in a docker container) which has the lowest version of GLIBC you need to support (i.e. the version installed on the server(s) you will be running your package on).
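As a rough sketch of that approach, assuming CentOS 7 (which ships glibc 2.17) is old enough for the target servers, the whole build can be run inside an older-distro container; the image, install prefix and mount path below are illustrative and would need adapting to the real recipe (CentOS 7 is EOL, so its yum mirrors may first need pointing at vault.centos.org):
docker run --rm -v "$(pwd)":/recipe centos:7 bash -c '
  yum install -y gcc make &&
  curl -sLO https://repo.anaconda.com/miniconda/Miniconda3-latest-Linux-x86_64.sh &&
  bash Miniconda3-latest-Linux-x86_64.sh -b -p /opt/conda &&
  /opt/conda/bin/conda install -y conda-build &&
  /opt/conda/bin/conda build /recipe
'
Anything compiled this way links against the container's glibc 2.17, so the resulting package should load on any server with glibc >= 2.17.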

build conda package on platform with no dependencies for a platform with dependencies

I have a package written in pure Python which does not require any platform-specific build procedures. However, one of its run dependencies is available only for the linux-64 platform. I understand that I cannot run my package on win-64; I just want to create a package for linux-64 on my Windows PC. I would be happy to build the Windows package without publishing it and then run conda convert -p linux-64.
But if I try to build the following meta.yaml on the Windows platform,
package:
  name: device
  version: 1.0.0
source:
  path: ../
build:
  entry_points:
    - Starter = device:main
  script: "{{ PYTHON }} -m pip install . -vv"
requirements:
  build:
    - python
    - setuptools
  run:
    - python>=3.10
    - pytango>=9.3.6
    - slsdetlib==6.1.2  # linux specific
    - slsdet==6.1.2  # linux specific
    - bidict>=0.22.1
    - numpy>=1.24.1
I get a reasonable error message:
conda_build.exceptions.DependencyNeedsBuildingError: Unsatisfiable dependencies for platform win-64: {'slsdetlib==6.1.2', 'slsdet==6.1.2'}
Can I somehow define a target platform which differs from the platform I am working on and create a package for linux-64 only? I know that I can just build this package in a Linux virtual machine, and I already do so, but I would like a more elegant way.
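For reference, conda convert is only a re-labelling step for pure-Python packages: it rewrites the platform metadata of an already-built artifact and does not re-resolve platform-specific binary dependencies such as slsdet/slsdetlib. A sketch of the invocation (the package filename and paths are illustrative):
conda convert --platform linux-64 C:\Miniconda3\conda-bld\win-64\device-1.0.0-py310_0.tar.bz2 -o out
The error above, however, is raised earlier, while conda-build resolves the recipe's dependencies for the platform it is actually running on (win-64), so there is nothing to convert yet.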

Install newer version of sqlite3 on AWS Lambda for use with Python

I have a Python script running in a Docker container on AWS Lambda. I'm using the recommended AWS image (public.ecr.aws/lambda/python:3.9), which comes with SQLite version 3.7.17 (from 2013!). When I test the container locally on my M1 Mac, I see this:
$ docker run --env-file .env --entrypoint bash -ti my-image
bash-4.2# uname -a
Linux e9ed14d35cbe 5.10.104-linuxkit #1 SMP PREEMPT Thu Mar 17 17:05:54 UTC 2022 aarch64 aarch64 aarch64 GNU/Linux
bash-4.2# sqlite3 --version
3.7.17 2013-05-20 00:56:22 118a3b35693b134d56ebd780123b7fd6f1497668
However, I use newer SQLite features, so I need to find a way to use a newer version of the library. The most straightforward solution would be to install a binary package as suggested in this answer. The docs say it should be as simple as installing using pip. Unfortunately, when I attempt to use this approach inside the Docker container, I get this:
bash-4.2# pip3 install pysqlite3-binary
ERROR: Could not find a version that satisfies the requirement pysqlite3-binary (from versions: none)
ERROR: No matching distribution found for pysqlite3-binary
And I get the same error when I attempt to install it outside the container using pipenv (which is what I'm actually using for package management):
🕙 01:08:24 ❯ pipenv install pysqlite3-binary
Installing pysqlite3-binary...
Error: An error occurred while installing pysqlite3-binary!
Error text:
ERROR: Could not find a version that satisfies the requirement pysqlite3-binary (from versions: none)
ERROR: No matching distribution found for pysqlite3-binary
✘ Installation Failed
Am I doing something wrong? And if not, how can I get a recent version of SQLite which Python can use in this container? Do I really need to use a separate build stage in the Dockerfile as suggested here and copy the rpm components into place as laid out here? That feels like a lot of work for something that many people presumably need to do all the time.
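For what it's worth, a purely illustrative sketch of that build-stage idea might look like the following; the SQLite tarball URL and version are placeholders to check against sqlite.org, and, as the answer below shows, this route was ultimately not needed:
FROM public.ecr.aws/lambda/python:3.9 AS sqlite-build
RUN yum install -y gcc make tar gzip
RUN curl -sO https://www.sqlite.org/2022/sqlite-autoconf-3390200.tar.gz \
    && tar xzf sqlite-autoconf-3390200.tar.gz \
    && cd sqlite-autoconf-3390200 \
    && ./configure --prefix=/usr/local \
    && make && make install

FROM public.ecr.aws/lambda/python:3.9
COPY --from=sqlite-build /usr/local/lib/libsqlite3.so.0* /usr/lib64/
The idea is that Python's standard-library sqlite3 module loads libsqlite3.so.0 at runtime, so dropping a newer build of that shared library into /usr/lib64 is in principle enough, although whether that is safe for everything else in the image is another question.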
Update: I tried the rpm approach inside the container using version 3.26 from EPEL8 (IIUC) and it failed with a bunch of dependency errors like this:
bash-4.2# curl --output-dir /tmp -sO https://vault.centos.org/centos/8/BaseOS/aarch64/os/Packages/sqlite-3.26.0-15.el8.aarch64.rpm
bash-4.2# yum localinstall /tmp/sqlite-3.26.0-15.el8.aarch64.rpm
Loaded plugins: ovl
Examining /tmp/sqlite-3.26.0-15.el8.aarch64.rpm: sqlite-3.26.0-15.el8.aarch64
# etc.
--> Finished Dependency Resolution
Error: Package: sqlite-3.26.0-15.el8.aarch64 (/sqlite-3.26.0-15.el8.aarch64)
Requires: libc.so.6(GLIBC_2.28)(64bit)
# Plus 6 other package dependency errors
Error: Package: nss-softokn-3.67.0-3.amzn2.0.1.aarch64 (@amzn2-core)
Requires: libsqlite3.so.0()(64bit)
Removing: sqlite-3.7.17-8.amzn2.1.1.aarch64 (@amzn2-core)
libsqlite3.so.0()(64bit)
Updated By: sqlite-3.26.0-15.el8.aarch64 (/sqlite-3.26.0-15.el8.aarch64)
Not found
Obsoleted By: sqlite-3.26.0-15.el8.aarch64 (/sqlite-3.26.0-15.el8.aarch64)
Not found
Available: sqlite-3.7.17-8.amzn2.0.2.aarch64 (amzn2-core)
libsqlite3.so.0()(64bit)
You could try using --skip-broken to work around the problem
You could try running: rpm -Va --nofiles --nodigest
When I try --skip-broken, it just skips installing the 3.26 package altogether.
Update 2: I've tried downloading the Python 3.9 wheel from pysqlite3-binary manually. However, it looks like that project only produces wheels for x86_64, not the aarch64 platform which Lambda uses. (This is not correct, see answer.) So presumably that's why pip is not finding it.
The problem was that I was running Docker locally to do my testing, on an M1 Mac. Hence the aarch64 architecture. Lambda does allow you to use ARM, but thankfully it still defaults to x86_64. I confirmed that my Lambda function was running x86_64, which is what the binary wheel uses, so that's good.
So I needed to do three things:
Change my Pipfile to conditionally install the binary package only on x86_64:
pysqlite3-binary = { version = "*", platform_machine = "== 'x86_64'" }
Tweak the sqlite import, as described in the original answer:
try:
    import pysqlite3 as sqlite3
except ModuleNotFoundError:
    import sqlite3  # for local testing because pysqlite3-binary couldn't be installed on macOS
print(f"{sqlite3.sqlite_version=}")
Set my Docker container to launch in x86 emulation mode locally.
$ DOCKER_DEFAULT_PLATFORM=linux/amd64 docker build -t my-image .
$ DOCKER_DEFAULT_PLATFORM=linux/amd64 docker run -ti my-image
Et, voilà!
sqlite3.sqlite_version='3.39.2'

Conda glibc dependency conflict

I get a weird error when I try to build a conda package.
$ conda-build pkg2
....
Found conflicts! Looking for incompatible packages.
This can take several minutes. Press CTRL-C to abort.
failed
....
The following specifications were found to be incompatible with your system:
- feature:/linux-64::__glibc==2.17=0
- feature:|@/linux-64::__glibc==2.17=0
- pkg1 -> __glibc[version='>=2.17,<3.0.a0']
Your installed version is: 2.17
It looks to me as if glibc 2.17 satisfies all three requirements; however, conda thinks there is a conflict.
To clarify, pkg2 depends on another package that I have built locally, pkg1. pkg1 is a C++ library with a Python interface that depends on libfftw. I could not find libfftw on conda, so I had to install it via yum on the build host, which runs CentOS 7 with glibc 2.17, hence the dependency.
In pkg1 meta.yaml I have:
requirements:
  build:
    - sysroot_linux-64 >=2.17  # [linux]
pkg2 meta.yaml:
requirements:
  host:
    - pkg1
  run:
    - pkg1
Thanks @merv for the fftw tip.
I strongly suspect that the issue was caused by a mixture of packages from conda-forge and the Anaconda default channels in the same environment. According to the conda developers, this is considered bad practice. Once I switched everything over to conda-forge only, the problems went away.
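For anyone cleaning up a similar mix, a minimal sketch of a .condarc that keeps an environment on conda-forge only (strict channel priority is what conda-forge recommends):
channels:
  - conda-forge
channel_priority: strict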

Package not found after pip install

I've published a module to PyPI using Flit: a2d_diary (I've checked that the tar.gz contains all the scripts).
Then I tried to install it in a virtual env on Windows and Linux using pip install a2d_diary, and although it works and all dependencies are installed, if I try to run a2d_diary in a terminal (with the venv activated) it does not find my package.
Is this a problem with Flit, PyPI, or am I missing something in the main script? The source code is here.
Thanks!
The file a2d_diary.py is installed, but it won't be accessible by running $ a2d_diary.py from your terminal. These are the package files that were installed:
$ pip show -f a2d_diary
Name: a2d-diary
Version: 0.1
Summary: A2D-Diary web app. Create and encode paper diaries automatically
Home-page: https://a2d-diary.netlify.com
Author: Julio Vega
Author-email: julio.vega@protonmail.com
License: UNKNOWN
Location: /Users/hoefling/.virtualenvs/stackoverflow/lib/python3.6/site-packages
Requires: PyPDF2, numpy, waitress, opencv-python, reportlab, falcon-multipart, falcon, Pillow
Files:
__pycache__/a2d_diary.cpython-36.pyc
a2d_diary-0.1.dist-info/INSTALLER
a2d_diary-0.1.dist-info/LICENSE
a2d_diary-0.1.dist-info/METADATA
a2d_diary-0.1.dist-info/RECORD
a2d_diary-0.1.dist-info/WHEEL
a2d_diary.py
If you want the script to be executable after the installation, you have to declare it as such in the package setup file (btw, I don't see any setup.py in your repository - did you commit it?). Example setup.py:
from setuptools import setup, find_packages

setup(
    name='a2d_diary',
    version='0.1',
    packages=find_packages(where='src'),
    package_dir={
        '': 'src',
    },
    scripts=['src/a2d_diary.py'],
)
Another thing you will need in order to make your a2d_diary.py script executable is the shebang line (this works for Unix; no idea what to do on Windows since I don't do Windows at all): the first line in a2d_diary.py should be
#!/usr/bin/env python
if your script can be run with any version of Python or
#!/usr/bin/env python3
for Python 3 specifically or
#!/usr/bin/env python2
for Python 2 specifically.
Now, if you build a wheel or source tar and install it, you will be able to run the script via
$ a2d_diary.py
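Since the question actually uses Flit rather than setuptools, the more direct equivalent there is a console-script entry point in pyproject.toml; this sketch assumes a2d_diary.py exposes a callable named main() (a hypothetical name, adjust to the real function):
[project.scripts]
a2d_diary = "a2d_diary:main"
Older Flit versions put the same mapping under a [tool.flit.scripts] table. After reinstalling, the command a2d_diary (without the .py suffix) lands on the virtualenv's PATH.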

Minimal working example for conda package build

I am trying to look into conda for Python package/virtual environment management. However, I can't seem to build my own conda package. Could somebody help me construct a minimal working example?
First of all some directory structure:
- src/
  |- foo1.py
  |- foo2.py
- conda-build/
  |- meta.yaml
- setup.py
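(The setup.py itself is not shown; a hypothetical minimal one for this layout, treating foo1 and foo2 as plain top-level modules, could be:)
from setuptools import setup

setup(
    name='example_pkg',
    version='0.5.1',
    package_dir={'': 'src'},
    py_modules=['foo1', 'foo2'],
)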
Running python setup.py install installs the package using pip. Now, if I cd into the conda-build directory and run conda build ., I get the following output:
Removing old build directory
Removing old work directory
BUILD START: example_pkg-0.5.1-abc
Fetching package metadata: ......
Solving package specifications: .
The following NEW packages will be INSTALLED:
pip: 6.1.1-py34_0
python: 3.4.3-0
setuptools: 15.0-py34_0
Linking packages ...
[ COMPLETE ]|##################################################| 100%
Removing old work directory
Copying C:\some\path\ to C:\Anaconda3\conda-bld\work
Package: example_pkg-0.5.1-abc
source tree in: C:\Anaconda3\conda-bld\work
number of files: 0
Fixing permissions
Fixing permissions
BUILD END: example_pkg-0.5.1-abc
Nothing to test for: example_pkg-0.5.1-abc
# If you want to upload this package to binstar.org later, type:
#
# $ binstar upload C:\Anaconda3\conda-bld\win-64\example_pkg-0.5.1-abc.tar.bz2
#
# To have conda build upload to binstar automatically, use
# $ conda config --set binstar_upload yes
I can indeed find the package in the directory C:\Anaconda3\conda-bld\win-64, but it does not seem to contain any files. I can install the package using conda install --use-local .\example_pkg-0.5.1-abc.tar.bz2 and it is then listed by conda list; however, I cannot import it in Python. This is my meta.yaml:
package:
  name: example_pkg
  version: "0.5.1"
source:
  path: ../src
build:
  number: 1
  string: abc
  script: python setup.py install
requirements:
  build:
    - python
  run:
    - python
Any help is greatly appreciated! :)
Looks like there's an issue where build/script doesn't work on Windows. Until that PR is merged, you should just create a bld.bat with
python setup.py install
if errorlevel 1 exit 1
and put it in the conda recipe.
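Assuming the layout from the question, the recipe directory would then look something like this (a build.sh would be the Linux/macOS counterpart, should one ever be needed):
- src/
  |- foo1.py
  |- foo2.py
- conda-build/
  |- meta.yaml
  |- bld.bat
- setup.py
It may also be worth double-checking the source entry: with path: ../src only the contents of src/ are copied into the build work directory, so setup.py at the project root never makes it there; pointing path at .. (the project root) instead is one way to let python setup.py install find it.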
