Bazel cross compile of tensorflow for ARM fails - python

I am trying to build TensorFlow to run on a Zynq, specifically the Z7020. I have PetaLinux running on the board, and Python 3.4.9. I am trying to build TensorFlow following the instructions found here: https://www.tensorflow.org/install/install_raspbian#cross-compiling_from_sources
Note that PetaLinux and Raspbian are both Debian derivatives, and the Z7020 has the same Cortex-A9 cores as the Raspberry Pi 0 and 1 series boards.
I am building on an Ubuntu 16.04 host. The command I am using to build is:
sudo CI_DOCKER_EXTRA_PARAMS="-e CI_BUILD_PYTHON=python3 -e CROSSTOOL_PYTHON_INCLUDE=/home/rklein/Python-3.4.9/Include" tensorflow/tools/ci_build/ci_build.sh PI-PYTHON3 tensorflow/tools/ci_build/pi/build_raspberry_pi.sh PI_ONE
Bazel churns for about 2 hours and comes back with the following error message:
/home/rklein/tensorflow/bazel-ci_build-cache/.cache/bazel/_bazel_root/eab0--lots of hex digits--85e8/external/arm_compiler/bin/arm-linux-gnueabihf-gcc --lots of options
In file included from /usr/include/python2.7/Python.h:8:0,
                 from ./tensorflow/python/lib/core/bfloat16.h:19,
                 from tensorflow/python/lib/core/bfloat16.h:18:
/usr/include/python2.7/pyconfig.h:13:54: fatal error: arm-linux-gnueabihf/python2.7/pyconfig.h: No such file or directory
 #include <arm-linux-gnueabihf/python2.7/pyconfig.h>
                                                    ^
compilation terminated.
What settings are needed to tell Bazel to use Python 3? Note that there is no /usr/include/python2.7 directory on the host machine, so I suspect that Bazel is doing some voodoo behind the scenes. The command
find ~ -name python2.7
comes up empty.
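For completeness, a broader way to look for any Python headers on the host (purely illustrative; run outside the build container, whose own include paths may differ):
find / -name pyconfig.h 2>/dev/null
dpkg -l | grep python3-dev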
I have tried to read up as much as I can on Bazel, but the documentation seems pretty lean - any good references would be appreciated.

I can't help you with your error message (or Bazel altogether). However, I installed TensorFlow on a Xilinx Zynq UltraScale+ with a PetaLinux kernel and an Ubuntu (arm64) root filesystem. It's not the exact same chip, but the installation process should be similar. I didn't build TensorFlow myself; instead I used the packages provided by the tensorflow-on-arm project. Maybe my experience will be useful for other people to get TensorFlow running:
You need a working OS (Xilinx has documentation for that). Depending on your chip you need either a 32-bit (armhf) or 64-bit (arm64) rootfs. I used an Ubuntu rootfs, so I could use apt to install packages.
You need to install some dependencies. I followed the instructions from the tensorflow-on-arm project.
apt-get install openjdk-8-jdk automake autoconf curl zip unzip libtool swig libpng12-dev zlib1g-dev pkg-config git g++ wget xz-utils
You also need Python (be sure to install Python v3.5 - not Python v3.6, etc.).
apt-get install python3-numpy python3-dev python3-pip python3-mock
I also needed to install two packages not listed there.
apt-get install cython3 libhdf5-dev
Install some pip3 packages (you might want to install those in a virtual-environment and also update pip3).
pip3 install -U --user keras_applications==1.0.5 --no-deps
pip3 install -U --user keras_preprocessing==1.0.3 --no-deps
pip3 install -U --user numpy grpcio h5py
Now you should download the TensorFlow pip package. The different packages are listed under Releases. I chose TensorFlow v.1.12 for Python v3.5 and arm64 / aarch64.
wget https://github.com/lhelontra/tensorflow-on-arm/releases/download/v1.12.0/tensorflow-1.12.0-cp35-none-linux_aarch64.whl
Now you can install the package with pip3.
pip3 install -U --user tensorflow-1.12.0*
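A quick sanity check after the install (a minimal sketch; the printed version should match the wheel you chose):
python3 -c "import tensorflow as tf; print(tf.__version__)"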
I hope it worked for you!

Related

Can't install Proj 8.0.0 for cartopy linux

I am trying to install Cartopy on Ubuntu and need to install proj v8.0.0 binaries for Cartopy. However when I try to apt-get install proj-bin I can only get proj v6.3.1. How do I install the latest (or at least v8.0.0) proj for cartopy?
I'm answering my own question here partly to help others with this problem, and partly as an archive for myself so I know how to fix this issue if I come across it again. I spent quite a while trying to figure it out, and wrote detailed instructions, so see below:
Installing cartopy is a huge pain, and I've found using conda to be a very bad idea (it has bricked itself and python along with it multiple times for me)
THIS INSTALLATION IS FOR LINUX.
Step 0. Update apt:
apt update
Step 1. Install GEOS:
Run the following command to install GEOS:
apt-get install libgeos-dev
In case that doesn't do it, install all files with this:
apt-get install libgeos-dev libgeos++-dev libgeos-3.8.0 libgeos-c1v5 libgeos-doc
Step 2. Install proj dependencies:
Install cmake:
apt install cmake
Install sqlite3:
apt install sqlite3
Install the curl development package:
apt install curl && apt-get install libcurl4-openssl-dev
Step 3. Install Proj
Trying apt-get just in case it works:
Unfortunately, cartopy requires proj v8.0.0 as a minimum, but if you install proj using apt you can only install proj v6.3.1
Just for reference in case anything changes, this is the command to install proj from apt:
apt-get install proj-bin
I'm fairly sure this is all you need, but in case it's not, this command will install the remaining proj files:
apt-get install proj-bin libproj-dev proj-data
To remove the above installation, run:
apt-get remove proj-bin
or:
apt-get remove proj-bin libproj-dev proj-data
Building Proj from source
So if the above commands don't work (it's not working as of 2022/4/8), then follow the below instructions to install proj from source:
Go to your install folder and download proj-9.0.0 (or any version with proj-x.x.x.tar.gz):
wget https://download.osgeo.org/proj/proj-9.0.0.tar.gz
Extract the tar.gz file:
tar -xf proj-9.0.0.tar.gz
cd into the folder:
cd proj-9.0.0
Make a build folder and cd into it:
mkdir build && cd build
Run (this may take a while):
cmake ..
cmake --build .
cmake --build . --target install
Run to make sure everything installed correctly:
ctest
The test command failed on one test for me (19 - nkg), but otherwise was fine.
You should find the required files in the ./bin directory
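A quick way to confirm which version you just built (illustrative; running proj with no arguments prints a usage header that includes the release number):
./bin/proj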
Finally:
Move binaries to the /bin directory:
cp ./bin/* /bin
As per Justino, you may also need to move the libraries:
cp ./lib/* /lib
Now after all this, you can finally install cartopy with pip:
pip install cartopy
After doing this, my cartopy still wasn't working. I went home planning to work on this the next week, came back, and all of a sudden it was working, so maybe try restarting.
The libraries should be copied manually
sudo cp ./lib/* /lib
This works for me
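One extra step, not in the answers above, that may help if the copied libraries still aren't picked up at runtime: refresh the dynamic linker cache (assumes a standard glibc-based Linux):
sudo ldconfig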

How to compile and install Python 3.9.6 on Ubuntu

I'm trying to install Python 3.9.6 on Ubuntu.
apt only had Python 3.8,
so I tried this https://tecadmin.net/how-to-install-python-3-9-on-ubuntu-18-04/
but it installed Python 3.9.5.
Next, I tried to compile and build Python myself, but it didn't install pip, so I had to install zlib and spent about 5 days trying to make it work. It did work, and I was able to install both Python 2.7.18 and 3.9.6 with pip, but it didn't install the SSL module, so I had to install that, and so on...
It worked fine after installing OpenSSL, but when I tried to install scapy it showed an error message; after some research I found out that the error was caused by an outdated SSL module.
I figured that compiling and building Python had too many problems; it didn't install all the packages for tools like pip.
If I spend some more time I think I could fix this, but I'm worried that this kind of problem
could happen again in the future.
I'm really desperate, so if you have any ideas please let me know.
1. Update your local repositories
sudo apt update
2. Install supporting software (installing from source requires additional tools)
sudo apt install build-essential zlib1g-dev libncurses5-dev \
libgdbm-dev libnss3-dev libssl-dev libreadline-dev libffi-dev wget
3. Download the latest version of Python Source Code
You might want to do this in a separate directory (like /tmp)
wget https://www.python.org/ftp/python/3.9.6/Python-3.9.6.tgz
4. Extract downloaded files
tar -xf Python-3.9.6.tgz
5. Test system and optimize python
cd Python-3.9.6
./configure --enable-optimizations
This might take a bit of time to complete
6a. Install a second instance of Python (highly recommended)
sudo make altinstall
It is recommended that you use the altinstall method. Your Ubuntu system may have software packages dependent on Python2.x/3.x.
6b. Overwrite default python installation (not recommended!!!)
sudo make install
7. Verify Python installation
python3.9 --version
# or, if you overwrote the default installation in step 6b
python3 --version
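Since the original question was partly about pip not being installed, a quick check after make altinstall (a minimal sketch; ensurepip ships with CPython and can restore pip if it is missing):
python3.9 -m pip --version
# if pip is missing, bootstrap it with the bundled ensurepip module
sudo python3.9 -m ensurepip --upgrade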

ImportError: libGL.so.1: cannot open shared object file: No such file or directory

I am trying to run cv2, but when I try to import it, I get the following error:
ImportError: libGL.so.1: cannot open shared object file: No such file or directory
The suggested solution online is installing
apt install libgl1-mesa-glx
but this is already installed and the latest version.
NB: I am actually running this on Docker, and I am not able to check the OpenCV version. I tried importing matplotlib and that imports fine.
Add the following lines to your Dockerfile:
RUN apt-get update && apt-get install ffmpeg libsm6 libxext6 -y
These commands install the cv2 dependencies that are normally present on the local machine, but might be missing in your Docker container causing the issue.
[minor update on 20 Jan 2022: as Docker recommends, never put RUN apt-get update on its own line, as it causes caching issues]
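For context, a minimal Dockerfile sketch showing where that line fits (the base image tag and the final check are illustrative, not from the original answer):
FROM python:3.9-slim
RUN apt-get update && apt-get install -y ffmpeg libsm6 libxext6
RUN pip install opencv-python
# smoke test: importing cv2 should no longer fail on libGL.so.1
CMD ["python", "-c", "import cv2; print(cv2.__version__)"]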
Even though the above solutions work, their package sizes are quite big.
libGL.so.1 is provided by the package libgl1, so the following command is sufficient.
apt-get update && apt-get install libgl1
In my opinion this is a slightly better solution. The package python3-opencv includes all system dependencies of OpenCV.
RUN apt-get update && apt-get install -y python3-opencv
RUN pip install opencv-python
Try installing opencv-python-headless python dependency instead of opencv-python. That includes a precompiled binary wheel with no external dependencies (other than numpy), and is intended for headless environments like Docker. This saved almost 700mb in my docker image compared with using the python3-opencv Debian package (with all its dependencies).
The package documentation discusses this and the related (more expansive) opencv-contrib-python-headless pypi package.
Example reproducing the ImportError in the question
# docker run -it python:3.9-slim bash -c "pip -q install opencv-python; python -c 'import cv2'"
WARNING: Running pip as the 'root' user can result in broken permissions and conflicting behaviour with the system package manager. It is recommended to use a virtual environment instead: https://pip.pypa.io/warnings/venv
Traceback (most recent call last):
File "<string>", line 1, in <module>
File "/usr/local/lib/python3.9/site-packages/cv2/__init__.py", line 5, in <module>
from .cv2 import *
ImportError: libGL.so.1: cannot open shared object file: No such file or directory
# docker run -it python:3.9-slim bash -c "pip -q install opencv-python-headless; python -c 'import cv2'"
WARNING: Running pip as the 'root' user can result in broken permissions and conflicting behaviour with the system package manager. It is recommended to use a virtual environment instead: https://pip.pypa.io/warnings/venv
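A minimal Dockerfile along the same lines, assuming the headless wheel is acceptable for your use case (the image tag is illustrative):
FROM python:3.9-slim
RUN pip install --no-cache-dir opencv-python-headless
# no system GL libraries needed for the headless build
CMD ["python", "-c", "import cv2; print(cv2.__version__)"]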
For me, the only workaround that worked is the following:
# These are for libGL.so issues
# RUN apt-get update
# RUN apt install libgl1-mesa-glx
# RUN apt-get install -y python3-opencv
# RUN pip3 install opencv-python
RUN pip3 install opencv-python-headless==4.5.3.56
If you're on CentOS, RHEL, Fedora, or other linux distros that use yum, you'll want:
sudo yum install mesa-libGL -y
In my case it was enough to do the following, which also saves space compared to the above solutions:
RUN apt-get update && apt-get install -y --no-install-recommends \
    libgl1 \
    libglib2.0-0
Put this in the Dockerfile
RUN apt-get update
RUN apt install -y libgl1-mesa-glx
Before the line
COPY requirements.txt requirements.txt
For example
......
RUN apt-get update
RUN apt install -y libgl1-mesa-glx
COPY requirements.txt requirements.txt
......
I was getting the same error when I was trying to use OpenCV in the GCP Appengine Flex server environment. Replacing "opencv-python" by "opencv-python-headless" in the requirements.txt solved the problem.
The OpenCV documentation talks about different packages for desktop vs. server (headless) environments.
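In practice that is a one-line change in requirements.txt (a minimal sketch; the original answer doesn't show the file itself):
# requirements.txt - replace the desktop build with the headless one
# opencv-python
opencv-python-headless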
I met this problem while using cv2 in a docker container. I fixed it by:
pip install opencv-contrib-python
install opencv-contrib-python rather than opencv-python.
Here is the solution you need:
pip install -U opencv-python
apt update && apt install -y libsm6 libxext6 ffmpeg libfontconfig1 libxrender1 libgl1-mesa-glx
I had the same issue on CentOS 8 after using pip3 install opencv on a non-GUI server which lacks all sorts of graphics libraries.
dnf install opencv
pulls in all needed dependencies.
"installing opencv-python-headless instead of opencv-python"
this works in my case!
I was deploying my website to Azure and this exception popped up:
ImportError: libGL.so.1: cannot open shared object file: No such file or directory
Then I uninstalled the opencv-python package, installed the later one,
froze the requirements, and deployed again;
the problem was then solved.
For a Raspberry Pi, this worked for me:
sudo apt-get install ffmpeg libsm6 libxext6 -y
For me, the problem was related to proxy settings. For PyPI I was using a Nexus mirror; for opencv nothing worked until I connected to a different network.
On Rocky Linux 9 I resolved the error using the command
dnf install mesa-libGLU
Use opencv-python-headless if you're using Docker or are in a server environment.
I got the same issue on Ubuntu desktop, and none of the other solutions worked for me.
libGL.so.1 was correctly installed but for some reason Python wasn’t able to see it:
$ ldconfig -p | grep libGL.so.1
libGL.so.1 (libc6,x86-64) => /lib/x86_64-linux-gnu/libGL.so.1
The only solution that worked was to force it in LD_LIBRARY_PATH. Add the following in your ~/.bashrc then run source ~/.bashrc or restart your shell:
export LD_LIBRARY_PATH="/lib/x86_64-linux-gnu:$LD_LIBRARY_PATH"
I understand that LD_LIBRARY_PATH is bad but for me this is the only solution that works.

Can one use python 3.5 in a docker container based out of the TensorFlow docker image?

I was trying to use python 3.5 with my docker container. I tried:
gcr.io/tensorflow/tensorflow:latest-devel
and
gcr.io/tensorflow/tensorflow:latest-devel-py3
but it seems that both images only have Python versions up to 3.4. Is it possible to use that Docker container as the base image but also have Python 3.5? Or even better, is it possible for the base image from the official TensorFlow image to have Python 3.5 itself?
I know it's possible to pip install it in the Dockerfile, as shown on the TF download page:
RUN export TF_BINARY_URL=https://storage.googleapis.com/tensorflow/linux/cpu/tensorflow-0.12.1-cp35-cp35m-linux_x86_64.whl
pip3 install --upgrade $TF_BINARY_URL
however it seems that would not get me the latest TensorFlow version. If one can pip install the most recent TensorFlow version, why doesn't the latest base image provide a way to get the most recent TensorFlow build and have it on Python 3.5?
I have definitely tried installing Python 3.5 as suggested here; however, even though the installation of Python 3.5 is successful, it breaks numpy in a way I can't fix (as explained here). Honestly, the best solution would be to just have Python 3.5 automatically available on the image, but for some reason it's not there. I have done some research on this and it seems installing Python 3.5 is a little difficult. Why is that? Is Python 3.5 missing because of TensorFlow or because of Ubuntu? My ideal solution would be not to have to install Python 3.5 myself and for it to come with the image, but it seems there might be a fundamental issue with this. What is it? Is it just that it has not been installed for the TensorFlow Docker image and Ubuntu, or am I overcomplicating a simple problem?
As another solution, I was thinking maybe to install Anaconda or something and then do that, but I wanted to have TensorFlow as my base image, and Anaconda seems to suggest using their image as the base. Since there isn't an easy way to install Anaconda with apt install, I am still working out how to programmatically install Anaconda so that there can be a TensorFlow image as the base and then, as instructed in a Dockerfile, install some version of Anaconda.
There is now a GitHub issue in the official tensorflow repo for this:
https://github.com/tensorflow/tensorflow/issues/7368
I mentioned that one can just install TensorFlow in the Dockerfile directly, so here is an example Dockerfile that worked for me:
RUN apt-get update && apt-get install -y build-essential git libjpeg-dev
RUN apt-get install -y vim
# get wget
RUN apt-get install -y wget
# install python 3.5 (add-apt-repository is provided by software-properties-common on minimal images)
RUN add-apt-repository -y ppa:fkrull/deadsnakes
RUN apt-get -y update
RUN apt-get -y install python3.5
RUN wget https://bootstrap.pypa.io/get-pip.py
RUN python3.5 get-pip.py
RUN python3.5 -m pip install -U numpy
#Install some stuff my lib needs
RUN python3.5 -m pip install -U numpy
RUN python3.5 -m pip install -U namespaces
RUN python3.5 -m pip install -U scikit-learn
RUN python3.5 -m pip install -U scipy
RUN python3.5 -m pip install -U pdb
RUN python3.5 -m pip install -U keras
#
#export TF_BINARY_URL=https://storage.googleapis.com/tensorflow/linux/cpu/tensorflow-0.12.1-cp35-cp35m-linux_x86_64.whl
RUN python3.5 -m pip install https://storage.googleapis.com/tensorflow/linux/cpu/tensorflow-0.12.1-cp35-cp35m-linux_x86_64.whl
I think the only interesting thing to note is that I installed pip directly, because that package/installation of Python 3.5 doesn't come with pip for some reason. That led me to install Python packages using:
python3.5 -m pip install
instead of
pip3
you can see more of those details here: How does one install/fix a failed numpy installation that works on python 3.4 but not in 3.5?
Also note that I had issues installing python the "official way" (i.e. with apt-get or something like that) so I resorted to what the following question/answer suggested: https://askubuntu.com/questions/682869/how-do-i-install-newer-python-versions-using-apt-get
With a plain 'ubuntu' docker image from DockerHub and relying on pip to do its own dependency resolution (I wonder if not doing that is what causes your numpy problems) I ran:
apt-get install python3
apt-get install python3-pip
pip install tensorflow
As far as I can tell, this gave me python3.5 with the latest tensorflow - like their docker image but with python3.5.
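The same three steps as a minimal Dockerfile sketch (the 16.04 tag is my assumption, since that release's default python3 is 3.5; it is not stated in the answer):
FROM ubuntu:16.04
RUN apt-get update && apt-get install -y python3 python3-pip
RUN pip3 install tensorflow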
To me, the provided docker image is something that's intended to work 'as is' and presumably the bits and pieces provided are intended to convey the developer's higher confidence that they all work correctly together. If you need to make substantial changes it seems easier and simpler to just start from scratch.
Yes. Python 3 docker images are available on dockerhub nightly builds.
CPU only
docker pull tensorflow/tensorflow:nightly-py3
with GPU support
docker pull tensorflow/tensorflow:nightly-gpu-py3
https://hub.docker.com/r/tensorflow/tensorflow/tags/
https://github.com/tensorflow/tensorflow/issues/3467
I have tried pulling in a python:3.5 image and installing tensorflow based on this image, which works.
What I have in the Dockerfile:
FROM python:3.5
RUN pip install tensorflow
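A quick way to build and smoke-test that image (the tag name is hypothetical; assumes the two-line Dockerfile above is in the current directory):
docker build -t tf-py35 .
docker run --rm tf-py35 python -c "import tensorflow as tf; print(tf.__version__)"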

I can't install python-ldap

When I run the following command:
sudo pip install python-ldap
I get this error:
In file included from Modules/LDAPObject.c:9:
Modules/errors.h:8: fatal error: lber.h: No such file or directory
Any ideas how to fix this?
The python-ldap is based on OpenLDAP, so you need to have the development files (headers) in order to compile the Python module. If you're on Ubuntu, the package is called libldap2-dev.
Debian/Ubuntu:
sudo apt-get install libsasl2-dev python-dev libldap2-dev libssl-dev
RedHat/CentOS:
sudo yum install python-devel openldap-devel
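With the development headers in place, the pip command from the question should then compile the module cleanly (the command itself is unchanged from the question):
sudo pip install python-ldap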
To install python-ldap successfully with pip, the following development libraries are needed (package names taken from an Ubuntu environment):
sudo apt-get install -y python-dev libldap2-dev libsasl2-dev libssl-dev
On CentOS/RHEL 6, you need to install:
sudo yum install python-devel
sudo yum install openldap-devel
and yum will also install cyrus-sasl-devel as a dependency. Then you can run:
pip-2.7 install python-ldap
"Don't blindly remove/install software"
In a Ubuntu or Debian based distro, you can use apt-file to find the name of the exact package that includes the missing header file.
# do this once
sudo apt-get install apt-file
sudo apt-file update
$ apt-file search lber.h
libldap2-dev: /usr/include/lber.h
As you could see from the output of apt-file search lber.h, you'd just need to install the package libldap2-dev.
sudo apt-get install libldap2-dev
In Ubuntu it looks like this:
$ sudo apt-get install python-dev libldap2-dev libsasl2-dev libssl-dev
$ sudo pip install python-ldap
Windows: I completely agree with the accepted answer, but digging through the comments took a while to get to the meat of what I needed. I ran across this specific problem with Reviewboard on Windows using Bitnami. To give an answer for Windows, then, I used this link mentioned in the comments:
http://www.lfd.uci.edu/~gohlke/pythonlibs/#python-ldap
I placed that wheel (.whl file) into my Reviewboard install directory,
then executed the following commands:
easy_install pip
pip install python_ldap-2.4.20-cp27-none_win32.whl
(because I had python 2.7 and a 32bit install at that)
easy_install python-ldap
For those having the same issue of a missing lber.h on Alpine Linux, for instance in a Docker image that you are trying to adapt to Alpine:
The package you are looking for is openldap-dev.
So run
apk add openldap-dev
Available from version 3.3 up to Edge, for both armhf and x86_64 architectures.
On Fedora 22, you need to do this instead:
sudo dnf install python-devel
sudo dnf install openldap-devel
On openSUSE you need to install the packages openldap2-devel, cyrus-sasl-devel, python-devel and libopenssl-devel.
zypper install openldap2-devel cyrus-sasl-devel python-devel libopenssl-devel
python-ldap does not support Python 3; install ldap3 instead.
For Alpine Docker:
apk add openldap-dev
If the Python version is 3 or above, try:
pip install python3-ldap
I had problems with the installation on Windows, so one of the solutions is to install the ldap package manually.
A few steps:
Go to the pyldap and/or python-ldap page and download the latest *.whl version.
Open a console then cd to where you've downloaded your file like some-package.whl and use:
pip install some-package.whl
The current version for pyldap is 2.4.45. As a concrete example, the installation would be:
pip install .\pyldap-2.4.45-cp37-cp37m-win_amd64.whl
# or
pip install .\python_ldap-3.3.1-cp39-cp39-win_amd64.whl
Output:
Installing collected packages: pyldap
Successfully installed pyldap-2.4.45
EDIT
You can install the proper version for Python 3.x using the following command:
# if pip3 is the default pip alias for python-3
pip3 install python3-ldap
# otherwise
pip install python3-ldap
Also, here is the link to the PyPI package for further information: python3-ldap 0.9.8.4
OR
ldap3 is a strictly RFC 4510 conforming LDAP V3 pure Python client library. The same codebase runs in Python 2, Python 3, PyPy and PyPy3: https://github.com/cannatag/ldap3
pip install ldap3
from ldap3 import Server, Connection, SAFE_SYNC
server = Server('my_server')
conn = Connection(server, 'my_user', 'my_password', client_strategy=SAFE_SYNC, auto_bind=True)
status, result, response, _ = conn.search('o=test', '(objectclass=*)')
# usually you don't need the original request (4th element of the returned tuple)
For most systems, the build requirements are now mentioned in python-ldap's documentation, in the "Installing" section.
If anything is missing for your system (or your system is missing entirely), please let maintainer know!
(As of 2018, I am the maintainer, so a comment here should be enough. Or you can send a pull request or mail.)
To correct the error due to missing dependencies when installing python-ldap on Windows 7/10:
Download the whl file from
http://www.lfd.uci.edu/~gohlke/pythonlibs/#python-ldap
Python 3.6 works with
python_ldap-3.2.0-cp36-cp36m-win_amd64.whl
Deploy the file in:
c:\python36\Scripts\
Install it with:
python -m pip install python_ldap-3.2.0-cp36-cp36m-win_amd64.whl
sudo apt-get install build-essential python3-dev python2.7-dev libldap2-dev libsasl2-dev slapd ldap-utils python-tox lcov valgrind
Debian Reference :
https://www.python-ldap.org/en/latest/installing.html#debian
For others: https://www.python-ldap.org/en/latest/installing.html
On OSX, you need the xcode CLI tools. Just open a terminal and run:
xcode-select --install
For Arch Linux/Manjaro the following command helped me:
yay libldap24
As of December 2021 there was/is a strange problem with the ldap library (at least in Arch/Manjaro).
While installing python-ldap (at 'Building wheel for python-ldap') I got the message 'ERROR: Failed building wheel for python-ldap':
/usr/bin/ld: cannot find -lldap_r
collect2: error: ld returned 1 exit status
error: command '/usr/bin/gcc' failed with exit code 1
a workaround is provided here: https://github.com/python-ldap/python-ldap/issues/432#issuecomment-974799221
Quoting:
As a workaround create the file /usr/lib64/libldap_r.so with content
INPUT ( libldap.so ). The approach works on all systems that use a GNU
ld-compatible linker.
# cat > /usr/lib64/libldap_r.so << EOF
INPUT ( libldap.so )
EOF
In FreeBSD 11:
pkg install openldap-client # for lber.h
pkg install cyrus-sasl # if you need sasl.h
pip install python-ldap
As a general solution to install Python packages with binary dependencies [1] on Debian/Ubuntu:
sudo apt-get build-dep python-ldap
# installs system dependencies (but not the package itself)
pew workon my_virtualenv # enter your virtualenv
pip install python-ldap
You'll have to check the name of your Python package on Ubuntu versus PyPI. In this case they're the same.
Obviously doesn't work if the Python package is not in the Ubuntu repos.
[1] I learnt this trick when trying to pip install matplotlib on Ubuntu.
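For illustration, the same pattern applied to that matplotlib case (the package name python-matplotlib is an assumption and varies by Ubuntu release):
sudo apt-get build-dep python-matplotlib
pip install matplotlib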
If you're working with Windows machines, you can find a 'python-ldap' wheel at this link and then install it.
For those who are using Alpine Linux:
apk add openldap-dev
try:
ARCHFLAGS="-arch x86_64" pip3 install python-ldap
Adding libzbar-dev as well solved the python-ldap installation for me when building a Docker image.
The full command becomes:
apt-get install -y python-dev libldap2-dev libsasl2-dev libssl-dev libzbar-dev
A hack answer for FreeBSD 13.1 (yes, I know this is deep South of best practices, but I just needed a quick fix):
pkg install openldap24-client
cd /usr/local/include/python3.9
# symlink the OpenLDAP/SASL headers from /usr/local/include into Python's include directory
ln -s ../lber.h ../lber_types.h ../ldap.h ../ldap_cdefs.h ../ldap_features.h ../ldap_schema.h ../ldap_utf8.h ../openldap.h ../sasl .
pip install python-ldap
