LINK : fatal error LNK1181: cannot open input file 'lapack.lib' - python

LINK : fatal error LNK1181: cannot open input file 'lapack.lib'
error: command 'C:\\Program Files (x86)\\Microsoft Visual Studio\\2022\\BuildTools\\VC\\Tools\\MSVC\\14.34.31933\\bin\\HostX86\\x64\\link.exe' failed with exit code 1181
I'm getting this error while installing cvxopt. I've set the directory for the LAPACK lib in the setup.py file inside the cvxopt folder. However, running the command
python setup.py install
generates this error. Can anyone help me with this?

If you use pip install cvxopt, it will install a pre-built binary wheel package instead.
If you want to build from source (as you are currently doing), you need BLAS and LAPACK installed on your system:
https://cvxopt.org/install/#building-and-installing-from-source
Required and optional software
The package requires version 3.x of Python, and building from source requires core binaries, header files, and libraries for Python.
The installation requires BLAS and LAPACK. Using an architecture optimized implementation such as ATLAS, OpenBLAS, or MKL is recommended and gives a large performance improvement over reference implementations of the BLAS and LAPACK libraries.
The installation also requires SuiteSparse. We recommend linking against a shared SuiteSparse library. It is also possible to build the required components of SuiteSparse when building CVXOPT, but this requires the SuiteSparse source which is no longer included with CVXOPT and must be downloaded separately.
The following software libraries are optional.
The GNU Scientific Library GSL.
FFTW is a C library for discrete Fourier transforms.
GLPK is a linear programming package.
MOSEK version 9 is a commercial library of convex optimization solvers.
DSDP5.8 is a semidefinite programming solver.
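If you do continue building from source, it is worth first confirming that the import library actually exists where setup.py expects it; LNK1181 simply means the linker never found lapack.lib. A minimal sketch (the lib_dir below is a placeholder, not your real path):

```python
import os

# Placeholder directory: substitute the BLAS/LAPACK path you set in setup.py
lib_dir = r"C:\lapack\lib"

def check_libs(lib_dir, names=("lapack.lib", "blas.lib")):
    """Report whether each import library exists in lib_dir."""
    return {name: os.path.isfile(os.path.join(lib_dir, name)) for name in names}

print(check_libs(lib_dir))
```

If any entry comes back False, the linker will fail with exactly the LNK1181 above, regardless of what setup.py says.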

Importing Numpy fails after building from source against amd blis

I'm trying to build a local version of Numpy from source against BLIS (for BLAS and CBLAS) and against OpenBLAS for LAPACK.
I started with building BLIS locally for zen3 with CBLAS enabled, like so:
./configure --enable-threading=openmp --enable-cblas --prefix=$HOME/blis zen3
then ran the tests (which all passed) and ran make install. I made sure all relevant files are in the $HOME/blis directory.
I also built openBLAS locally, no special configs there.
Afterwards, I modified numpy's site.cfg to configure openBLAS and blis folders' accordingly:
[blis]
libraries = blis
library_dirs = /home/or/blis/lib/
include_dirs = /home/or/blis/include/blis
runtime_library_dirs = /home/or/blis/lib/
[openblas]
libraries = openblas
library_dirs = /opt/OpenBLAS/lib
include_dirs = /opt/OpenBLAS/include
runtime_library_dirs = /opt/OpenBLAS/lib
I continued by building and installing numpy with:
NPY_BLAS_ORDER=blis NPY_LAPACK_ORDER=openblas NPY_CBLAS_LIBS= python ./numpy/setup.py build -j 32
Note that NPY_CBLAS_LIBS is empty as numpy's build docs say to do so if CBLAS is included in the BLIS library, which it is.
Then, importing numpy resulted in:
Original error was: /home/or/.pyenv/versions/3.9.6/lib/python3.9/site-packages/numpy-1.24.0.dev0+998.g6a5086c9b-py3.9-linux-x86_64.egg/numpy/core/_multiarray_umath.cpython-39-x86_64-linux-gnu.so: undefined symbol: cblas_sgemm
I'm clueless at this point as I couldn't find anything online about this specific case.
Numpy installed from pip (which comes built with OpenBLAS) imports successfully.
Update 1:
While reading make install logs, I found out that it couldn't find my BLIS library files at the location, even though the files are in the specified path. I also tried to recompile and install BLIS in various paths and reconfigure numpy before compiling it, but got the same result.
When I downloaded a pre-compiled version of BLIS from AMD's website, numpy picked it up, but this isn't the recommended way to go because I'd be missing the Zen3 optimizations.
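Since the failure is an undefined cblas_sgemm at import time, one useful check is whether the library that was actually linked exports that symbol at all. A small ctypes sketch; point it at your libblis.so (libm is used below only as a stand-in that exists on most Linux systems):

```python
import ctypes
import ctypes.util

def exports_symbol(lib_path, symbol):
    """Return True if the shared library at lib_path exports `symbol`."""
    lib = ctypes.CDLL(lib_path)
    try:
        getattr(lib, symbol)
        return True
    except AttributeError:
        return False

# Replace with e.g. "/home/or/blis/lib/libblis.so" to test your own build
libm = ctypes.util.find_library("m")
if libm:
    print("cos:", exports_symbol(libm, "cos"))
    print("cblas_sgemm:", exports_symbol(libm, "cblas_sgemm"))
```

If your libblis.so reports False for cblas_sgemm, the CBLAS-enabled configure flags never took effect in the library you linked against.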

OpenCV won't install to python

I am trying to install OpenCV 4.5.5 with CUDA support. I downloaded the source from their GitHub along with the additional OpenCV contrib modules and built it with CMake. I checked BUILD_opencv_python3, and when it was done configuring and generating it showed that it had detected python3 and would install cv2 into site-packages.
But cv2 isn't in the site-packages.
So it doesn't load in python. But it does give an interesting error
What should I do?
I already tried doing a fresh build with CMake but it didn't work.
After building OpenCV it generates a PYD file in the lib folder. I don't know if that is helpful or not.
Thanks
You must have missed a step in the process.
"Configure" step in cmake-gui, investigates the environment (various paths, e.g. where python wants packages to go) and prepares information for the build
"Generate" step in cmake-gui, generates the actual build files (VS Solution)
open the .sln in Visual Studio
build the ALL_BUILD target
build the INSTALL target, and this installs the files, also the python package
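If cv2 still does not show up after the INSTALL step, compare the install path CMake reported against where the interpreter you actually run looks for packages; a quick sketch:

```python
import site
import sysconfig

# The directory this interpreter expects third-party packages in; the
# python3 install path CMake detected should match one of these.
purelib = sysconfig.get_paths()["purelib"]
print("purelib:", purelib)
print("site dirs:", site.getsitepackages())
```

A common cause is CMake picking up a different Python (e.g. a system install) than the one you launch, so the PYD lands in a site-packages you never import from.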

linking ipopt against openblas

Currently, I am trying to build Ipopt linking against openblas. I downloaded the openblas source and did make in the parent directory.
The configure script of Ipopt has several options to link against BLAS.
I tried ./configure --with-blas="-L/home/moritz/build/CoinIpopt_test/ThirdParty/OpenBLAS-0.2.14/libopenblas.so"
but I do get the error
checking whether user supplied BLASLIB="-L/home/moritz/build/CoinIpopt_test/ThirdParty/OpenBLAS-0.2.14/libopenblas.so" works... no
configure: error: user supplied BLAS library "-L/home/moritz/build/CoinIpopt_test/ThirdParty/OpenBLAS-0.2.14/libopenblas.so" does not work
Any tips on how to achieve this? Eventually, I would like to make a conda package. I do have openblas installed via anaconda, but I get the same error message if I link against that installed libopenblas.so.
Managed to get it to work. I had to install openblas to a directory of my choice by
make install PREFIX=/home/....../
Afterwards, I compiled Ipopt using
./configure --with-blas-incdir="-I/home/.../openblas/include/" --with-blas-lib="-L/home/.../openblas/lib/"
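The earlier failure likely came from passing the full path of libopenblas.so to -L, which expects a directory; the library itself is named with a separate -l flag. A tiny sketch of how the flags compose (the directory is a placeholder):

```python
def blas_flags(lib_dir, lib="openblas"):
    """-L takes the directory containing the library; -l takes the library
    name without the 'lib' prefix and '.so' suffix."""
    return f"-L{lib_dir} -l{lib}"

# e.g. after `make install PREFIX=$HOME/openblas`
print(blas_flags("/home/me/openblas/lib"))
```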

Installing lxml for Python 3.4 on Windows x 86 (32 bit) with Visual Studio C++ 2010 Express

Related questions:
error: Unable to find vcvarsall.bat
LXML 3.3 with Python 3.3 on windows 7 32-bit
Related answers:
https://stackoverflow.com/a/18045219/1175496
Related comments:
Building lxml for Python 2.7 on Windows
"#ziyuang This would mean you use Python 3.3 which uses Microsoft Visual Studio 2010. If that's the case then the answer is yes, you should install this version."
Facts
Windows x86 (32-bit)
Installed both Visual Studio C++ 2008 (from here) Express and Visual Studio C++ 2010 (from here)
Python 3.4.1 (apparently compiled with newer version than Visual Studio 2008)
I use pip (or pip3.4.exe; builtin to Python 3.4) to pip install lxml
distutils uses Visual Studio C++ 2010 Express to compile
The last few lines of my error, logged by pip:
cl : Command line warning D9025 : overriding '/W3' with '/w'
lxml.etree.c
C:\Users\NATHAN~1\AppData\Local\Temp\pip_build_nathanielanderson\lxml\src\lxml\includes\etree_defs.h(9)
: fatal error C1083: Cannot open include file: 'libxml/xmlversion.h':
No such file or directory
C:\Python34\lib\distutils\dist.py:260: UserWarning: Unknown
distribution option: 'bugtrack_url'
warnings.warn(msg)
error: command 'C:\Program Files\Microsoft Visual Studio
10.0\VC\BIN\cl.exe' failed with exit status 2
So I can't install from the .egg or by compiling...
Other Options
I also can't find a Windows installer (exe or msi or whatever) of lxml for this version of Python
Not here at PyPi
Nor here at Chris' great site
Update 10/16/2019
As a commenter says, the executable links are no longer available
This archived version (from 2014; executable links don't work) shows the old links
As a commenter says, the site currently has whl (wheel) files for the lxml library; you can use pip to install from whl files
Or check out this set of links to executable files on pypi.org; the executables only cover Python 2.6, 2.7, 3.2, 3.3, and 3.4
Looks like Chris does provide a direct exe here:
http://www.lfd.uci.edu/~gohlke/pythonlibs/#lxml
Thanks, Chris! Any ideas why I cannot compile using pip?
I also had this problem, and the workarounds provided above did not work for me either.
Here is my system configuration:
Win7 64bit
python3.3
visual studio 2013
I tried the method in the first link in the Related questions, but it failed. That method creates a system variable for VS2010 to use, copied from my original Visual Studio 2013 configuration.
However, the command line still prompted the error "libxml/xmlversion.h: No such file or directory".
Then I searched further on the internet and found a method which works in my case:
downloading the precompiled lxml plugin
Precompiled lxml 3.3.5: https://pypi.python.org/pypi/lxml/3.3.5#downloads
if your system is 64-bit, you can get an unofficial x64 build here: http://www.lfd.uci.edu/~gohlke/pythonlibs/#lxml (this is what I use)
installing from the command line with easy_install lxml-3.2.1.win32-py3.3.exe
Reference: https://pytools.codeplex.com/workitem/1520
If you are using Python 3.4, this is the download link:
Download Here
If you have any other configuration, find it HERE according to your need. ;-)
The short version is: You need to have the C library libxml2 (and also libxslt2) before you can build lxml.
As the lxml installation docs say:
Unless you are using a static binary distribution (e.g. from a Windows binary installer), you need to install libxml2 and libxslt, in particular:
libxml2 2.6.21 or later. It can be found here: http://xmlsoft.org/downloads.html
We recommend libxml2 2.7.8 or a later version.
If you want to use XPath, do not use libxml2 2.6.27.
If you want to use the feed parser interface, especially when parsing from unicode strings, do not use libxml2 2.7.4 through 2.7.6.
libxslt 1.1.15 or later. It can be found here: http://xmlsoft.org/XSLT/downloads.html
We recommend libxslt 1.1.26 or later.
The build from source docs similarly start off with:
To build lxml from source, you need libxml2 and libxslt properly installed, including the header files.
Windows (unlike most other platforms) doesn't come with these libraries. You don't mention anything in your "Facts" about having them.
And the error message that you showed is:
C:\Users\NATHAN~1\AppData\Local\Temp\pip_build_nathanielanderson\lxml\src\lxml\includes\etree_defs.h(9) :
fatal error C1083: Cannot open include file: 'libxml/xmlversion.h':
No such file or directory
That 'libxml/xmlversion.h' that it can't find is part of libxml2.
It's also worth noting that the same installation docs explicitly say:
consider using the binary builds from PyPI or the unofficial Windows binaries that Christoph Gohlke generously provides.
So, the fact that you thought Christoph Gohlke didn't provide binaries for lxml implies that you hadn't found these docs.
So, it's possible that you did install libxml2, but not in a way that lxml's setup script can find it. But all the evidence implies it's a lot more likely that you just don't have it.
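A quick way to check whether the libxml2 headers are present at all; the candidate directories below are common guesses, not known install locations, so adjust them for your system:

```python
import os

# Common guesses for libxml2 include roots; adjust for your install
candidates = [r"C:\libxml2\include", "/usr/include/libxml2"]

def find_xmlversion(dirs):
    """Return the include roots that contain libxml/xmlversion.h."""
    return [d for d in dirs
            if os.path.isfile(os.path.join(d, "libxml", "xmlversion.h"))]

found = find_xmlversion(candidates)
print(found or "libxml2 headers not found")
```

If nothing is found, the C1083 error above is guaranteed: the compiler is looking for exactly that libxml/xmlversion.h header.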

How to include external library with python wheel package

I want to create a package for Python that embeds and uses an external library (.so) on Linux using the cffi module.
Is there a standard way to include a .so file in a Python package?
The package will be used only internally and won't be published to pypi.
I think wheel packages are the best option: they create a platform-specific package with all files ready to be copied, so there will be no need to build anything on the target environments.
You can use auditwheel to inject the external libraries into the wheel:
auditwheel repair: copies these external shared libraries into the wheel itself, and automatically modifies the appropriate RPATH entries such that these libraries will be picked up at runtime. This accomplishes a similar result as if the libraries had been statically linked without requiring changes to the build system. Packagers are advised that bundling, like static linking, may implicate copyright concerns.
You can pre-build the external c++ library by typically executing the following:
./configure && make && make install
This will generate an my_external_library.so file and install it in the appropriate path. However, you'll need to ensure that the library path is properly set in order for the auditwheel to discover the missing dependency.
export LD_LIBRARY_PATH=/usr/local/lib
You can then build the python wheel by executing:
python setup.py bdist_wheel
Finally, you can repair the wheel, which will inject the my_external_library.so into the package.
auditwheel repair my-python-wheel-1.5.2-cp35-cp35m-linux_x86_64.whl
I successfully applied the above steps to the python library confluent-kafka-python which has a required c/c++ dependency on librdkafka.
Note: auditwheel is Linux-only. For MacOS, see the delocate tool.
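After repairing, you can verify what auditwheel vendored in by inspecting the wheel, which is just a zip archive. A sketch using a synthetic wheel layout for illustration (substitute the path of your real repaired .whl):

```python
import os
import tempfile
import zipfile

def bundled_libs(wheel_path):
    """List shared objects under a *.libs/ directory, which is where
    auditwheel places the external libraries it copies into a wheel."""
    with zipfile.ZipFile(wheel_path) as whl:
        return sorted(n for n in whl.namelist()
                      if ".libs/" in n and n.endswith(".so"))

# Synthetic wheel for the demo; point bundled_libs at your real file
tmp = os.path.join(tempfile.mkdtemp(), "demo-1.0-cp35-cp35m-linux_x86_64.whl")
with zipfile.ZipFile(tmp, "w") as whl:
    whl.writestr("demo/__init__.py", "")
    whl.writestr("demo.libs/libexternal-abc123.so", b"")
print(bundled_libs(tmp))
```

If the repaired wheel lists your external library here, the dependency was bundled and the RPATH patching will let it resolve at import time.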
Wheels are the standard way of distributing Python packages, but there is a problem when you have extension modules that depend on other .so files. This is because the normal Linux dynamic linker is used, and that only looks in /usr/lib or /usr/local/lib. This is a problem when installing a wheel in a virtualenv.
As far as I know, you have three options:
Static linking, so the 'wrapper' does not depend on anything else;
using ctypes to wrap the .so directly from Python;
splitting the distribution into a wheel with the Python code & wrapper, and a separate RPM or DEB to install the .so into either /usr/lib or /usr/local/lib.
A wheel may work if you include the dependent .so as a data file to be stored in /lib and install into the root Python environment (haven't tried that), but this will break if someone tries to install the wheel into a virtualenv (did try that).
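The ctypes option above can look like the following; libm is only a stand-in for the bundled .so, and loading by absolute path is the point, since it bypasses the dynamic linker's search path entirely:

```python
import ctypes
import ctypes.util

# libm stands in for your own shared object; in a real package you would
# build the absolute path from the package's __file__ location instead.
path = ctypes.util.find_library("m") or "libm.so.6"
lib = ctypes.CDLL(path)

# Declare the C signature so ctypes converts arguments correctly
lib.sqrt.restype = ctypes.c_double
lib.sqrt.argtypes = [ctypes.c_double]

print(lib.sqrt(9.0))  # → 3.0
```

Because CDLL takes any path, the wheel can ship the .so inside the package directory and load it from there, which keeps virtualenv installs working.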
