LightGBM inside Docker: libgomp.so.1: cannot open shared object file - python

I have LightGBM installed on my Mac and tested it earlier for a different project.
Now I am inside a Docker container with Python 3.6 on my Mac. As soon as I add import lightgbm as lgbm to my Flask application, I get this error:
OSError: libgomp.so.1: cannot open shared object file: No such file or directory
What is going on? Can anyone please suggest a fix?

This worked for me; include it in your Dockerfile:
RUN apt-get update && apt-get install -y --no-install-recommends apt-utils
RUN apt-get -y install curl
RUN apt-get install -y libgomp1
Source: https://github.com/microsoft/LightGBM/issues/2223#issuecomment-499788066
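If you prefer to keep the image small, the same packages can be installed in a single layer and the apt cache cleaned afterwards. A minimal sketch, assuming a Debian/Ubuntu based image:
RUN apt-get update && \
    apt-get install -y --no-install-recommends apt-utils curl libgomp1 && \
    rm -rf /var/lib/apt/lists/*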

Depending on the image you use, you may need a C++ compiler together with libgomp1. LightGBM is written in C++, and the base image of your Dockerfile may not have all of its dependencies installed by default (while your Mac does).
Following these links
( https://raw.githubusercontent.com/Microsoft/LightGBM/master/docker/dockerfile-cli)
(https://github.com/microsoft/LightGBM/issues/2223)
the solution would be to add the following to the Dockerfile:
RUN apt-get update && \
    apt-get install -y --no-install-recommends \
        ca-certificates \
        cmake \
        build-essential \
        gcc \
        g++ \
        git \
        libgomp1 && \
    rm -rf /var/lib/apt/lists/*
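Note that the full compiler toolchain above is only needed if LightGBM is built from source. If pip can install a prebuilt wheel for your platform, libgomp1 alone is usually enough; a minimal sketch on a slim Python base image (assuming a wheel is available for your interpreter):
FROM python:3.6-slim
RUN apt-get update && \
    apt-get install -y --no-install-recommends libgomp1 && \
    rm -rf /var/lib/apt/lists/*
RUN pip install flask lightgbm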

Related

Issue with the installation of Python 3.8 with a Dockerfile

I am trying to update my CI pipeline for GitLab, but my pipeline keeps failing because the Docker-in-Docker of my runner fails to install Python 3.8.
In my Dockerfile I am running the following commands:
FROM ubuntu:latest
ENV http_proxy $HTTPS_PROXY
ENV https_proxy $HTTPS_PROXY
RUN apt-get update && apt-get install -y \
python3.8 \
python3-pip \
&& rm -rf /var/lib/apt/lists/*
but my pipeline fails, giving me the following error:
Package python3.8 is not available, but is referred to by another package. This may mean that the package is missing, has been obsoleted, or is only available from another source
E: Package 'python3.8' has no installation candidate
error building image: error building stage: failed to execute command: waiting for process to exit: exit status 100
Many suggestions I have found say that using the apt-get update command should solve the problem, however that is not working for me.
The latest Ubuntu repos don't contain old Python versions by default.
You can either try using a newer Python version or adding the deadsnakes repo with something like this:
FROM ubuntu:latest
ENV http_proxy $HTTPS_PROXY
ENV https_proxy $HTTPS_PROXY
RUN apt-get install -y software-properties-common && add-apt-repository -y ppa:deadsnakes/ppa && apt-get update && apt-get install -y \
python3.8 \
python3-pip \
&& rm -rf /var/lib/apt/lists/*
You may also need to run apt-get update before installing the software-properties-common package.
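Putting that together, a sketch of the whole block with the update done first:
RUN apt-get update && \
    apt-get install -y software-properties-common && \
    add-apt-repository -y ppa:deadsnakes/ppa && \
    apt-get update && \
    apt-get install -y python3.8 python3-pip && \
    rm -rf /var/lib/apt/lists/*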
As an alternative, you could always consider using one of the official Python Docker images rather than installing Python on top of an Ubuntu image yourself. python:3.8-buster or python:3.8-slim-buster may be close enough to what you need?
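For example, a typical requirements-based project could then be built with something as small as this (a sketch, assuming a requirements.txt in your build context):
FROM python:3.8-slim-buster
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .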

Install pip with a specific Python version

I am building an Ubuntu Docker image that is going to run my Python application, and I have some libraries that require Python <= 3.6 to work, otherwise they throw errors.
My problem is that when I install pip, it always automatically uses Python 3.8, and I'm not sure how to make pip use an older version of Python. This is the installation in my Dockerfile:
RUN apt-get update && \
apt-get upgrade -y && \
apt-get install -y software-properties-common && \
add-apt-repository ppa:deadsnakes/ppa && \
apt-add-repository universe && \
apt-get update && \
apt-get install -y \
libmysqlclient-dev \
netcat \
python3 \
python-dev \
build-essential \
python3-setuptools \
python3-pip \
supervisor && \
pip install -U pip setuptools && \
rm -rf /var/lib/apt/lists/*
I tried to change python3-pip to just python-pip, but when I run it, it gives me the following error:
E: Unable to locate package python-pip
I've tried a lot of solutions, but I always get the same problem.
Outside of Docker, if python3.6 is the python you need, you can do:
python3.6 -m pip install
In Docker right now, python3 obviously points to Python 3.8, so you must first install Python 3.6 and find out how to call it (python3.6 or python3). You might need to compile it from source and probably create a symbolic link. This can get very ugly to do inside a Dockerfile, but you can write a shell script with all the commands and run that script as part of the Docker build. Or, if you are lucky, you may find a ready-made Python 3.6 package that works for you and apt-get install it instead of python3, the same way as you do now.
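As a sketch of that last idea on ubuntu:20.04, where a prebuilt python3.6 package is available from the deadsnakes PPA (package names and the get-pip URL are assumptions to verify against your base image):
FROM ubuntu:20.04
# avoid interactive tzdata prompts during apt installs
ENV DEBIAN_FRONTEND=noninteractive
RUN apt-get update && \
    apt-get install -y software-properties-common curl && \
    add-apt-repository -y ppa:deadsnakes/ppa && \
    apt-get update && \
    apt-get install -y python3.6 python3.6-distutils && \
    rm -rf /var/lib/apt/lists/*
# install pip for the 3.6 interpreter specifically and call it via python3.6 -m pip
RUN curl -sS https://bootstrap.pypa.io/pip/3.6/get-pip.py | python3.6
RUN python3.6 -m pip install --upgrade setuptools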

Removing perl from an Ubuntu Docker image causes pyodbc to fail

Running into an unexpected issue trying to prepare an Ubuntu 20.04 based image with Python and pyodbc.
FROM ubuntu:20.04
# install mssql odbc driver
RUN apt-get update && apt-get upgrade -y && apt-get install -y curl gnupg build-essential
RUN curl https://packages.microsoft.com/keys/microsoft.asc | apt-key add - \
&& curl https://packages.microsoft.com/config/ubuntu/20.04/prod.list > /etc/apt/sources.list.d/mssql-release.list
RUN apt-get update && ACCEPT_EULA=Y apt-get install -y msodbcsql17 unixodbc-dev
# install python 3.7.9 from source
RUN apt-get install -y python3 python3-pip
# clean up
# this does not work
RUN apt-get remove -y perl curl gnupg && apt-get autoremove -y
# this works
# RUN apt-get remove -y curl gnupg && apt-get autoremove -y
RUN pip3 install pyodbc
If perl is not removed, the installation of pyodbc is uneventful, but if perl is removed, the following error is displayed:
src/pyodbc.h:56:10: fatal error: sql.h: No such file or directory
It is as if unixodbc-dev is also removed for some reason. Has anyone run into this before? If perl is required, wouldn't apt-get prevent it from being deleted? Or do I need to install a different set of C bindings to make this work?
Running apt-get install -f after installing msodbcsql17 doesn't help either.
Thanks.
unixodbc-dev was installed as a transitive dependency and was automatically removed when no longer needed, i.e. after perl was removed. You need to install it explicitly:
RUN apt-get install -y unixodbc-dev
See the following bug report for details: https://github.com/mkleehammer/pyodbc/issues/441
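A sketch of the relevant part of the Dockerfile with that fix applied (the earlier msodbcsql17 steps stay as they are):
# install unixodbc-dev explicitly so it is marked as manually installed
RUN apt-get update && apt-get install -y unixodbc-dev
# an alternative would be: RUN apt-mark manual unixodbc-dev
RUN apt-get remove -y perl curl gnupg && apt-get autoremove -y
RUN pip3 install pyodbc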

Couldn't find any package by regex in python:3.8.3 docker image

I'm new to Docker. I created a Docker image, and this is what my Dockerfile looks like.
FROM python:3.8.3
RUN apt-get update \
&& apt-get install -y --no-install-recommends \
postgresql-client \
&& rm -rf /var/lib/apt/lists/* \
&& apt-get install -y gcc libtool-ltdl-devel xmlsec1-1.2.20 xmlsec1-devel-1.2.20 xmlsec1 openssl-1.2.20 xmlsec1-openssl-devel-1.2.20 \
&& apt-get -y install curl gnupg \
&& curl -sL https://deb.nodesource.com/setup_14.x | bash - \
&& apt-get -y install nodejs
WORKDIR /app/
COPY . /app
RUN pip install -r production_requirements.txt \
&& front_end/noa-frontend/npm install
This image is used in docker-compose.yml's app service. When I run docker-compose build, I get the error below saying it couldn't find the package. Those are a few dependencies which I want to install in order to install a Python package.
At the beginning, I run apt-get update to update the package lists.
Can anyone please help me with this issue?
Updated Dockerfile
FROM python:3.8.3
RUN apt-get update
RUN apt-get install -y postgresql-client \
&& apt-get install -y gcc libtool-ltdl-devel xmlsec1-1.2.20 xmlsec1-devel-1.2.20 xmlsec1 openssl-1.2.20 xmlsec1-openssl-devel-1.2.20 \
&& apt-get -y install curl gnupg \
&& curl -sL https://deb.nodesource.com/setup_14.x | bash - \
&& apt-get -y install nodejs
WORKDIR /app/
COPY . /app
RUN pip install -r production_requirements.txt \
&& front_end/noa-frontend/npm install
You are trying to use apt-get install after doing rm -rf /var/lib/apt/lists/*. That is guaranteed not to end well. Try removing the rm command initially to see if that helps. If you really need to keep the image size down, then put the rm command as the very last command in the RUN statement.
If you really want to reduce your image size, then try switching to python:3.8-slim or python:3.8-alpine. Alpine is a different OS from the Debian base of the default image, but its package manager can be told not to cache files locally, e.g.:
FROM python:3.8-alpine
RUN apk add --no-cache postgresql-client
RUN apk add --no-cache gcc libtool-ltdl-devel xmlsec1-1.2.20 xmlsec1-devel-1.2.20 xmlsec1 \
openssl-1.2.20 xmlsec1-openssl-devel-1.2.20
RUN apk add --no-cache curl gnupg
RUN apk add --no-cache nodejs
RUN curl -sL https://deb.nodesource.com/setup_14.x | bash -
WORKDIR /app/
COPY . /app
RUN pip install -r production_requirements.txt \
&& front_end/noa-frontend/npm install
Certain bits of software might be available under different package names, so you'll have to check that out.
The instruction rm -rf /var/lib/apt/lists/* is more or less negating apt-get update. APT is no longer able to access the list of available packages after that. Move the rm to the end (and perhaps consider using the safer apt-get clean all).
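Following that advice, a sketch of the apt portion rebuilt as a single RUN with the cleanup at the very end (the xmlsec/openssl packages are left out here because their Debian names differ from the ones in the question, e.g. libxmlsec1-dev, so check the exact names you need):
FROM python:3.8.3
RUN apt-get update && \
    apt-get install -y --no-install-recommends postgresql-client gcc curl gnupg && \
    curl -sL https://deb.nodesource.com/setup_14.x | bash - && \
    apt-get install -y nodejs && \
    rm -rf /var/lib/apt/lists/*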

How to install xapian with Python 3.6 on Ubuntu 16.04?

I installed Python 3.6 on Ubuntu 16.04 in Docker using the ppa:jonathonf/python-3.6 repository. Now I'd like to install xapian so I can use it with Python. I have not found any ready-made packages, so I am trying to build it from source. I set the PYTHON3 and PYTHON3_LIB parameters to point to Python 3.6. During the build process I get the following error:
ImportError: libxapian.so.30: cannot open shared object file: No such file or directory
I tried xapian versions 1.3.7 and 1.4.5 without luck.
How can I install xapian?
Here's a Dockerfile to reproduce my error:
FROM ubuntu:16.04
RUN apt-get update \
&& apt-get install -y software-properties-common python-software-properties
RUN add-apt-repository ppa:jonathonf/python-3.6
RUN apt-get update \
&& apt-get install -y python3-pip docker.io python3.6 python3.6-dev software-properties-common \
python-software-properties build-essential wget unzip cmake python3-sphinx \
&& cd /usr/local/bin \
&& ln -s /usr/bin/python3.6 python
RUN python -m pip install --upgrade pip
# install xapian 1.4.5
RUN apt-get update && apt-get install -y curl uuid-dev zlib1g-dev
WORKDIR /root
RUN curl --silent --show-error --fail --next -O https://oligarchy.co.uk/xapian/1.4.5/xapian-core-1.4.5.tar.xz
RUN curl --silent --show-error --fail --next -O https://oligarchy.co.uk/xapian/1.4.5/xapian-bindings-1.4.5.tar.xz
RUN tar xvf xapian-core-1.4.5.tar.xz
RUN tar xvf xapian-bindings-1.4.5.tar.xz
WORKDIR /root/xapian-core-1.4.5
RUN ./configure && make && make install
WORKDIR /root/xapian-bindings-1.4.5
RUN ./configure PYTHON3=/usr/bin/python3.6 PYTHON3_LIB=/usr/lib/python3.6 --with-python3 && make && make install
RUN python -c "import xapian"
The problem is that the Xapian library (libxapian.so.30) is being installed into /usr/local/lib by default, but Ubuntu doesn't know that it's been put there yet. You can tell it by adding:
RUN ldconfig
after installing the core (so before you change WORKDIR to build the bindings).
There's some helpful information about ldconfig and library search paths on Ubuntu in the answers to this Unix Stackexchange question.
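Concretely, in the Dockerfile above that means running ldconfig right after the core install, before building the bindings (a sketch of just the affected steps):
WORKDIR /root/xapian-core-1.4.5
RUN ./configure && make && make install
# refresh the shared-library cache so libxapian.so.30 in /usr/local/lib can be found
RUN ldconfig
WORKDIR /root/xapian-bindings-1.4.5
RUN ./configure PYTHON3=/usr/bin/python3.6 PYTHON3_LIB=/usr/lib/python3.6 --with-python3 && make && make install
RUN python -c "import xapian"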
