Build error on Apple Silicon M1 with Docker - python

I was trying to dockerize a Flask application with a third-party CLI (plastimatch) on my M1.
I used ubuntu:18.04 as the base image. The build on more recent versions would fail with the error message 'no installation candidate was found'. The first odd thing I noticed was that the exact same build would succeed on a Linux server.
I used a local venv to finalize the application, and as I started to dockerize everything I got the following error:
#16 22.37 note: This error originates from a subprocess, and is likely not a problem with pip.
#16 22.37 ERROR: Failed building wheel for pylibjpeg-libjpeg
#16 22.37 Failed to build pylibjpeg-openjpeg pylibjpeg-libjpeg
#16 22.37 ERROR: Could not build wheels for pylibjpeg-openjpeg, pylibjpeg-libjpeg, which is required to install pyproject.toml-based projects
These Python packages are wrappers for C++ libraries that handle images. The local build fails, while the build on our Linux server runs perfectly fine.
Has anyone noticed similar problems when dockerizing their applications locally in development? And are there any solutions?
Here are the Dockerfile and requirements.txt used, for reference (currently missing specific versions):
FROM ubuntu:18.04 as base
RUN apt-get update -y && apt-get upgrade -y
RUN apt-get install -y software-properties-common
RUN add-apt-repository ppa:deadsnakes/ppa
RUN apt-get install -y python3.8 python3-pip
RUN rm /usr/bin/python3 && ln -s /usr/bin/python3.8 /usr/bin/python3
RUN apt-get install -y \
plastimatch \
zlib1g \
cmake
WORKDIR /app
COPY requirements.txt requirements.txt
RUN python3 -m pip install -U --force-reinstall pip
RUN pip3 install --upgrade pip setuptools wheel
RUN pip3 install -r requirements.txt
ENV LC_ALL=C.UTF-8
ENV LANG=C.UTF-8
FROM base as upload-dev
RUN echo "Building dev version"
COPY requirements_dev.txt requirements_dev.txt
RUN pip3 install -r requirements_dev.txt
COPY . .
requirements.txt:
python-dotenv
cython
pynrrd
flask-cors
Flask
Werkzeug
httplib2
numpy
pydicom
highdicom
dicomweb-client
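For reference, the upload-dev stage in the Dockerfile above is built by targeting it explicitly (standard multi-stage usage; the tag name is illustrative):
docker build --target upload-dev -t flask-app:dev .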
Update: 1 July 2022
I was able to track down the error.
The problem was that some third-party libraries do not ship a prebuilt wheel for the ARM64 platform. If pip cannot locate a matching wheel, it fetches the source code and compiles it, and that compilation crashed on my machine for the libraries that use C++ at their core.
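To check up front whether a prebuilt wheel exists for the image's architecture, pip can be told to refuse source builds entirely. A diagnostic sketch, using one of the failing packages from the log above:
# Fails with "Could not find a version that satisfies the requirement ..."
# when no binary wheel matches this platform, instead of compiling from source.
RUN pip3 download --only-binary=:all: pylibjpeg-libjpeg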
An easy way to fix this problem is to use the linux/amd64 platform directly:
FROM --platform=linux/amd64 $YOUR_BASE_IMAGE
This is a bit slower, but sufficient for most development environments.
A detailed explanation: https://pythonspeed.com/articles/docker-build-problems-mac/
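Put together, a minimal sketch of the pinned build for the Dockerfile above (only the FROM line changes; the rest stays as-is):
# Pin the build to amd64; on an M1 the image runs under QEMU/Rosetta 2
# emulation, so pip resolves prebuilt x86_64 wheels instead of compiling.
FROM --platform=linux/amd64 ubuntu:18.04 as base
Alternatively, leave the Dockerfile untouched and pass the platform at build time (assuming BuildKit, which Docker Desktop enables by default; the tag name is illustrative):
docker build --platform linux/amd64 -t flask-app .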

For me, the fix was to install Rosetta 2, which is listed in Docker's system requirements: https://docs.docker.com/desktop/mac/apple-silicon/#system-requirements
softwareupdate --install-rosetta

Related

Unable to install tensorflow v1 Python package in Docker image

I am trying to install the tensorflow<2.0,>=1.15 pip package during the Docker build. I am not able to build it, and I get this error in my terminal during the pip installation:
> [12/12] RUN pip3 install --no-cache-dir -r requirements.txt:
#16 0.488 ERROR: Could not find a version that satisfies the requirement tensorflow<2.0,>=1.15 (from versions: none)
#16 0.489 ERROR: No matching distribution found for tensorflow<2.0,>=1.15
To replicate the error:
Dockerfile:
FROM python:3.7-slim-buster
RUN apt-get update
RUN apt-get install -y unzip
RUN apt-get install -y build-essential
RUN apt-get install -y python-all-dev
RUN apt-get install -y libexiv2-dev
RUN apt-get install -y libboost-python-dev
RUN apt-get install -y wget
COPY . /usr/src/app
WORKDIR /usr/src/app
ENV PYTHONUNBUFFERED True
RUN pip3 install --upgrade pip
RUN pip3 install --no-cache-dir -r requirements.txt
requirements.txt:
tensorflow>=1.15,<2.0
I have tried other Python versions in the FROM line (the first line of the Dockerfile), 3.7 or lower, never newer. Still the same result.
I use Docker Desktop for Mac M1 version 4.3.2, Engine version 20.10.11.
When I run it on Fedora Linux, I can build it successfully.
I suspect that this can be Docker-related. There might be a difference between Docker Desktop and Docker for Linux. But I might be doing something wrong.
Have some of you folks ever encountered the same error? How did you solve this? Thanks for any tips.
TensorFlow 1.x does not support the Mac M1 chip. It is recommended to install TensorFlow >= 2.5 on the M1.
Take a look at these release notes from Apple's TensorFlow fork for macOS:
https://github.com/apple/tensorflow_macos/releases
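If TensorFlow 1.x inside Docker is a hard requirement on an M1, the practical route is the same platform pin discussed in the parent question, since the 1.x wheels were only ever published for x86_64. A sketch, not a supported configuration, and slow under emulation:
# Emulate amd64 so the x86_64 TensorFlow 1.x wheels resolve; note that
# tensorflow 1.15 only ships wheels for Python 3.7 and below.
FROM --platform=linux/amd64 python:3.7-slim-buster
RUN pip3 install --no-cache-dir "tensorflow>=1.15,<2.0"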

How do I install Cython, cartopy and shapely in a python docker container?

I am trying to get Cython, cartopy and shapely running in a Docker container so I can leverage the Python library traffic. I am currently getting an error with Cython:
Collecting Cython==0.26 (from -r requirements.txt (line 1))
Downloading https://files.pythonhosted.org/packages/87/6c/53a9e636c9dbe7acd5c002422c1a7a48a367f3b4c0cf6490908f43398ca6/Cython-0.26-cp27-cp27mu-manylinux1_x86_64.whl (7.0MB)
Collecting geos (from -r requirements.txt (line 2))
Downloading https://files.pythonhosted.org/packages/11/9b/a190f02fb92f465a7640b9ee7da732d91610415a1102f6e9bb08125a3fef/geos-0.2.2.tar.gz (365kB)
Collecting cartopy (from -r requirements.txt (line 3))
Downloading https://files.pythonhosted.org/packages/e5/92/fe8838fa8158931906dfc4f16c5c1436b3dd2daf83592645b179581403ad/Cartopy-0.17.0.tar.gz (8.9MB)
Complete output from command python setup.py egg_info:
Traceback (most recent call last):
File "<string>", line 1, in <module>
File "/tmp/pip-build-Se89QB/cartopy/setup.py", line 42, in <module>
raise ImportError('Cython 0.15.1+ is required to install cartopy.')
ImportError: Cython 0.15.1+ is required to install cartopy.
----------------------------------------
Command "python setup.py egg_info" failed with error code 1 in /tmp/pip-build-Se89QB/cartopy/
The command '/bin/sh -c pip install --no-cache-dir -r requirements.txt' returned a non-zero code: 1
Below is my setup:
Dockerfile:
FROM ubuntu:latest
WORKDIR /usr/src/app
#apt-get install -y build-essential -y python python-dev python-pip python-virtualenv libmysqlclient-dev curl&& \
RUN \
apt-get update && \
apt-get install -y build-essential python python-dev python-pip python-virtualenv libmysqlclient-dev curl && \
rm -rf /var/lib/apt/lists/*
COPY requirements.txt ./
RUN pip install --no-cache-dir -r requirements.txt
COPY . .
# Install cron
RUN apt-get update && apt-get install -y cron
# Add crontab file in the cron directory
ADD crontab /etc/cron.d/simple-cron
# Add shell script and grant execution rights
ADD script.sh /script.sh
RUN chmod +x /script.sh
# Give execution rights on the cron job
RUN chmod 0644 /etc/cron.d/simple-cron
# Create the log file to be able to run tail
RUN touch /var/log/cron.log
# Run the command on container startup
CMD cron && tail -f /var/log/cron.log
requirements.txt
Cython==0.26
geos
cartopy
shapely
traffic
Try installing Cython first with pip, before using your requirements.txt.
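In the Dockerfile that means splitting the install into two steps, something like this (the version pin is taken from the requirements.txt in the question):
# Install Cython first so cartopy's setup.py can import it during the build.
RUN pip install --no-cache-dir Cython==0.26
RUN pip install --no-cache-dir -r requirements.txt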
cartopy has a lot of dependencies, and some, especially Proj, may not be resolvable using pip or apt-get. numpy and cython may be resolved by installing them separately, just prior to installing cartopy (like u/dopplershift suggests), but Proj will never resolve, grr.
My solution was to use conda install, which solves the dependencies for you. Unfortunately Docker and conda don't play well together, but you can work around it using miniconda. Try this:
FROM continuumio/miniconda3
RUN mkdir /app
ADD . /app
WORKDIR /app
# cartopy cannot be installed using pip because the Proj dependency never
# gets resolved: there are two Python packages called proj, and pip always
# loads the wrong one. The conda install command, however, using the
# conda-forge channel, does know how to resolve the dependency issues,
# including packages like numpy.
RUN conda install -c conda-forge cartopy
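A quick smoke test can be appended to that Dockerfile (an illustrative addition, not part of the original answer):
# Fail the build early if cartopy did not install correctly.
RUN python -c "import cartopy; print(cartopy.__version__)"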
Since 2022-09-09 this is much easier, because Cartopy v0.21.0 does not depend on PROJ.
Solution
Dockerfile:
FROM python:3.11-slim-bullseye
RUN apt update && apt install -y git gcc build-essential python3-dev libgeos-dev
RUN python3 -m pip install --upgrade pip setuptools wheel
ADD requirements.txt .
RUN python3 -m pip install --no-cache-dir --compile -r requirements.txt
# add files and set CMD/ENTRYPOINT down here
requirements.txt:
Cartopy==0.21.0
Test
docker build -t cartopy -f Dockerfile .
docker run -it cartopy pip freeze
Results in:
Cartopy==0.21.0
certifi==2022.12.7
contourpy==1.0.6
cycler==0.11.0
fonttools==4.38.0
kiwisolver==1.4.4
matplotlib==3.6.2
numpy==1.23.5
packaging==22.0
Pillow==9.3.0
pyparsing==3.0.9
pyproj==3.4.0
pyshp==2.3.1
python-dateutil==2.8.2
Shapely==1.8.5.post1
six==1.16.0

Problem building a Docker image with numpy and pandas on ARM64

I'm trying to build a Docker image with docker-compose on my ARM64 Raspberry Pi, but it seems to be impossible.
This is my dockerfile:
FROM python:3.6-slim
RUN apt-get update && apt-get -y install python3-dev
RUN apt-get -y install python3-numpy
RUN apt-get -y install python3-pandas
ENTRYPOINT ["python3", "app.py"]
It seems to be OK, but when app.py is run, it gives an error: "Module numpy not found", and the same for the pandas module.
If I try to install numpy and pandas using pip:
RUN pip install numpy pandas
It gives me an error or, more usually, the Raspberry Pi just freezes and I have to unplug it to recover.
I have tried different Python versions for the base image, and also several Ubuntu images with Python installed on top.
Any idea how I can install numpy and pandas in Docker on my Raspberry Pi (ARM64)?
Thanks
The problem seems to be the Python version. I'm using a python3.6 Docker image, but both the python3-numpy and python3-pandas packages require python3.5, so when those packages are installed, a second version of Python is installed alongside. This is why the interpreter can't find the modules when I try to import them: they are installed for another Python version.
Finally I solved it by using a generic Debian image and installing python3.5 myself, instead of using a python Docker image.
FROM debian:stretch-slim
RUN apt-get update && apt-get -y dist-upgrade
RUN apt-get -y install build-essential libssl-dev libffi-dev python3.5 libblas3 libc6 liblapack3 gcc python3-dev python3-pip cython3
RUN apt-get -y install python3-numpy python3-sklearn
RUN apt-get -y install python3-pandas
COPY requirements.txt /tmp/
RUN pip3 install -r /tmp/requirements.txt
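A quick way to confirm that the default interpreter is the one the Debian packages were installed into (an illustrative check, not part of the original answer):
# Both imports fail at build time if the packages landed in a different
# Python version than the default python3.
RUN python3 --version && python3 -c "import numpy, pandas; print(numpy.__version__)"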
(Disclaimer: the Raspberry Pi 3 B+ is probably too slow to install big dependencies like numpy.)
This Dockerfile worked for me on the Raspberry Pi 3 B+ running Linux raspberrypi 5.10.63-v7+ (consider updating yours):
FROM python:3.9-buster
WORKDIR /
COPY requirements.txt requirements.txt
RUN pip3 install -r requirements.txt
I am not sure, but I think it also helped to clean up Docker, i.e. remove images and containers, with the following commands:
Warning: these commands delete all stopped containers and all unused images!
$ docker container prune
$ docker image prune -a
Or reset Docker completely (deletes also volumes and networks):
$ docker system prune --volumes
I recommend creating a requirements.txt file, in which you declare the packages to install.
The Dockerfile:
FROM python
COPY app.py /workdir/
COPY requirements.txt /workdir/
WORKDIR /workdir
RUN pip install --trusted-host pypi.python.org -r requirements.txt
CMD python app.py
Edit
I created a Dockerfile that installs the pandas lib and then checked that it works:
cat Dockerfile
FROM python
COPY app.py /workdir/
WORKDIR /workdir
RUN python -m pip install pandas
CMD python app.py

ImportError: libGL.so.1: cannot open shared object file: No such file or directory

I am trying to run cv2, but when I try to import it, I get the following error:
ImportError: libGL.so.1: cannot open shared object file: No such file or directory
The solution suggested online is to install
apt install libgl1-mesa-glx
but this is already installed and the latest version.
NB: I am actually running this on Docker, and I am not able to check the OpenCV version. I tried importing matplotlib and that imports fine.
Add the following lines to your Dockerfile:
RUN apt-get update && apt-get install ffmpeg libsm6 libxext6 -y
These commands install the cv2 dependencies that are normally present on the local machine, but might be missing in your Docker container causing the issue.
[Minor update on 20 Jan 2022: as Docker recommends, never put RUN apt-get update on a line by itself, as it causes caching issues.]
Even though the above solutions work, their package sizes are quite big.
libGL.so.1 is provided by the package libgl1, so the following is sufficient:
apt-get update && apt-get install -y libgl1
This is a slightly better solution, in my opinion: the package python3-opencv includes all of OpenCV's system dependencies.
RUN apt-get update && apt-get install -y python3-opencv
RUN pip install opencv-python
Try installing the opencv-python-headless Python dependency instead of opencv-python. It includes a precompiled binary wheel with no external dependencies (other than numpy), and is intended for headless environments like Docker. This saved almost 700 MB in my Docker image compared with using the python3-opencv Debian package (with all its dependencies).
The package documentation discusses this and the related (more expansive) opencv-contrib-python-headless pypi package.
Example reproducing the ImportError in the question
# docker run -it python:3.9-slim bash -c "pip -q install opencv-python; python -c 'import cv2'"
WARNING: Running pip as the 'root' user can result in broken permissions and conflicting behaviour with the system package manager. It is recommended to use a virtual environment instead: https://pip.pypa.io/warnings/venv
Traceback (most recent call last):
File "<string>", line 1, in <module>
File "/usr/local/lib/python3.9/site-packages/cv2/__init__.py", line 5, in <module>
from .cv2 import *
ImportError: libGL.so.1: cannot open shared object file: No such file or directory
# docker run -it python:3.9-slim bash -c "pip -q install opencv-python-headless; python -c 'import cv2'"
WARNING: Running pip as the 'root' user can result in broken permissions and conflicting behaviour with the system package manager. It is recommended to use a virtual environment instead: https://pip.pypa.io/warnings/venv
For me, the only workaround that worked is the following:
# These are for libGL.so issues
# RUN apt-get update
# RUN apt install libgl1-mesa-glx
# RUN apt-get install -y python3-opencv
# RUN pip3 install opencv-python
RUN pip3 install opencv-python-headless==4.5.3.56
If you're on CentOS, RHEL, Fedora, or other linux distros that use yum, you'll want:
sudo yum install mesa-libGL -y
In my case it was enough to do the following, which also saves space compared to the solutions above:
RUN apt-get update && apt-get install -y --no-install-recommends \
libgl1 \
libglib2.0-0
Put this in the Dockerfile
RUN apt-get update
RUN apt install -y libgl1-mesa-glx
Before the line
COPY requirements.txt requirements.txt
For example
......
RUN apt-get update
RUN apt install -y libgl1-mesa-glx
COPY requirements.txt requirements.txt
......
I was getting the same error when I was trying to use OpenCV in the GCP Appengine Flex server environment. Replacing "opencv-python" by "opencv-python-headless" in the requirements.txt solved the problem.
The OpenCV documentation talks about the different packages for desktop vs. server (headless) environments.
I met this problem while using cv2 in a Docker container. I fixed it by installing opencv-contrib-python rather than opencv-python:
pip install opencv-contrib-python
Here is the solution you need:
pip install -U opencv-python
apt update && apt install -y libsm6 libxext6 ffmpeg libfontconfig1 libxrender1 libgl1-mesa-glx
I had the same issue on CentOS 8 after running pip3 install opencv on a non-GUI server lacking all sorts of graphics libraries.
dnf install opencv
pulls in all needed dependencies.
"installing opencv-python-headless instead of opencv-python"
this works in my case!
I was deploying my website to Azure when this exception popped up:
ImportError: libGL.so.1: cannot open shared object file: No such file or directory
I uninstalled the opencv-python package, installed the headless variant instead, froze the requirements, and deployed again; that solved the problem.
For a Raspberry Pi, this worked for me:
sudo apt-get install ffmpeg libsm6 libxext6 -y
For me, the problem was related to proxy settings. I was using a Nexus mirror for PyPI, and for opencv nothing worked until I connected to a different network.
On Rocky Linux 9 I resolved the error using the command
dnf install mesa-libGLU
Use opencv-python-headless if you're using Docker or running in a server environment.
I got the same issue on Ubuntu desktop, and none of the other solutions worked for me.
libGL.so.1 was correctly installed but for some reason Python wasn’t able to see it:
$ ldconfig -p | grep libGL.so.1
libGL.so.1 (libc6,x86-64) => /lib/x86_64-linux-gnu/libGL.so.1
The only solution that worked was to force it via LD_LIBRARY_PATH. Add the following to your ~/.bashrc, then run source ~/.bashrc or restart your shell:
export LD_LIBRARY_PATH="/lib/x86_64-linux-gnu:$LD_LIBRARY_PATH"
I understand that overriding LD_LIBRARY_PATH is bad practice, but for me this is the only solution that works.

apache-airflow fails to install

I'm trying to install apache-airflow the recommended way with pip install apache-airflow. During the install of pendulum (a dependency), I get an error:
error: can't copy 'pendulum/parsing': doesn't exist or not a regular file
I think it's related to Python distutils error: "[directory]... doesn't exist or not a regular file", but that doesn't give an answer as to how one resolves this when using pip. Pulling the tar for pendulum and installing with python setup.py install works, but when I subsequently run pip install apache-airflow again, it sees that pendulum is already installed, UNINSTALLS it, and then tries to install it again using pip, resulting in the same error. I'm using a Docker container and install python-setuptools with apt-get before any of this. Here's my Dockerfile, fwiw...
FROM phusion/baseimage:0.10.1
MAINTAINER a curious dev
RUN apt-get update && apt-get install -y python-setuptools python-pip python-dev libffi-dev libssl-dev zip wget
ENV SLUGIFY_USES_TEXT_UNIDECODE=yes
RUN wget https://files.pythonhosted.org/packages/5b/57/71fc910edcd937b72aa0ef51c8f5734fbd8c011fa1480fce881433847ec8/pendulum-2.0.4.tar.gz
RUN tar -xzvf pendulum-2.0.4.tar.gz
RUN cd pendulum-2.0.4/ && python setup.py install
RUN pip install apache-airflow
CMD airflow initdb && airflow webserver -p 8080
Does anyone see anything I'm doing wrong? I haven't found anyone else with this error so I think there's something really obvious I'm missing. Thanks for reading.
Upgrade pip first.
FROM phusion/baseimage:0.10.1
RUN apt-get update && apt-get install -y python-setuptools python-pip python-dev libffi-dev libssl-dev zip wget
ENV SLUGIFY_USES_TEXT_UNIDECODE=yes
RUN pip install -U pip
RUN pip install apache-airflow
CMD airflow initdb && airflow webserver -p 8080
seems to work fine for me.
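The upgrade most likely matters because the old pip shipped with this base image falls back to a setup.py-based source build of pendulum, which is exactly what produces the "can't copy 'pendulum/parsing'" error, while a newer pip can select a prebuilt wheel. A diagnostic sketch to confirm which path pip would take inside the image:
# After upgrading pip, check whether pendulum resolves to a binary wheel;
# if it does not, pip would fall back to the failing source build.
RUN pip install -U pip && pip download --only-binary=:all: pendulum==2.0.4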
