Cython dynamic library linking - python

I'm actually trying to link an existing C library to my Cython program.
I have access to the entrypoint header (.h) of the library with all functions declared as:
EXPORT_API int _stdcall LibFunction();
I suppose the EXPORT_API is used to create the dll with __declspec(dllexport)...
I also have access to the .lib and .dll files.
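For context, such headers usually follow the common dllexport/dllimport pattern sketched below (the guard macro BUILDING_MYLIB and the stand-in function body are my illustrations, not taken from the actual library). The __declspec(dllimport) side is what makes MSVC reference the function through an __imp_-prefixed import-table symbol, the same prefix that shows up in the linker error further down.

```c
#include <assert.h>

/* Typical entry-point header pattern (sketch). When building the DLL
   itself, BUILDING_MYLIB is defined and EXPORT_API expands to
   __declspec(dllexport); consumers of the header instead get
   __declspec(dllimport), which is resolved through an __imp_ symbol. */
#if defined(_WIN32)
  #ifdef BUILDING_MYLIB
    #define EXPORT_API __declspec(dllexport)
  #else
    #define EXPORT_API __declspec(dllimport)
  #endif
  #define CALLCONV __stdcall
#else
  /* On non-Windows platforms the decorations are empty. */
  #define EXPORT_API
  #define CALLCONV
#endif

EXPORT_API int CALLCONV LibFunction(void);

/* Stand-in definition so this sketch is self-contained. */
int CALLCONV LibFunction(void) { return 1; }
```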
I've tried to use this function with the usual cdef extern from of Cython:
cdef extern from "include\\entrypoint.h":
    int LibFunction()

def c_LibFunction():
    LibFunction()
And I'm using the following setup.py:
from setuptools import setup, Extension
from Cython.Distutils import build_ext

NAME = 'testlib'
REQUIRES = ['cython']
SRC_DIR = 'testlib'
PACKAGES = [SRC_DIR]
INCLUDE_DIR = 'testlib\include'
LIB_DIR = 'testlib\lib'

ext = Extension(SRC_DIR + '.wrapped',
                [SRC_DIR + '/wrapped.pyx'],
                include_dirs=[INCLUDE_DIR],
                library_dirs=[LIB_DIR],
                libraries=['cfunc', 'MyLib'])

if __name__ == "__main__":
    setup(install_requires=REQUIRES,
          packages=PACKAGES,
          name=NAME,
          ext_modules=[ext],
          cmdclass={"build_ext": build_ext})
But when I compile my Cython code with python setup.py build_ext, I get an unresolved external reference:
error LNK2001: unresolved external symbol __imp_LibFunction
As I found on other threads, it seems to be a question of static versus dynamic library linking.
I think it comes from the setuptools compile options; I tried to investigate using the distutils documentation and the Cython documentation.
The thing is, I also tried building my own C library (cfunc.lib, a static one) and I managed to use functions from it the same way I described above.
I also ran DUMPBIN on MyLib.lib and found the symbol int __cdecl LibFunction(void) and, as expected, the __imp_ prefix is not in the symbol.
If someone has an idea of what's going on, why it's going on, and how I can solve my problem, it would be really helpful!

I finally found the solution, so I'm posting it in case it helps someone in the future!
I work on Windows and use Visual Studio to compile my code.
Even though I created my cfunc project as a Visual C++ project, it was compiled as a C project rather than a C++ project (it only has a .c file and a .h file), so it worked by default.
My entrypoint.h only contains C-style function declarations, but the DLL was compiled as a C++ project; that's why it couldn't work: the name mangling didn't match.
So I just added language = 'c++' to my setup.py.
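A minimal sketch of the resulting Extension (paths and library names as in my question above):

```python
from setuptools import Extension

SRC_DIR = 'testlib'

# language='c++' makes Cython emit a .cpp file, so the generated code is
# compiled with C++ name mangling that matches the DLL's exports.
ext = Extension(SRC_DIR + '.wrapped',
                [SRC_DIR + '/wrapped.pyx'],
                include_dirs=[SRC_DIR + '/include'],
                library_dirs=[SRC_DIR + '/lib'],
                libraries=['cfunc', 'MyLib'],
                language='c++')
```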

Related

Compiling cython with gcc: No such file or directory from #include "ios"

Given a file docprep.pyx as simple as
from spacy.structs cimport TokenC
print("loading")
And trying to cythonize it via
cythonize -3 -i docprep.pyx
I get the following error message
docprep.c:613:10: fatal error: ios: No such file or directory
#include "ios"
^~~~~
compilation terminated
As you can tell from the paths, this system has an Anaconda installation with Python 3.7. numpy, spacy and cython are all installed through conda.
In my case, it worked using @mountrix's tip: just add language="c++" to your setup.py. An example:
from distutils.core import setup
from distutils.extension import Extension
from Cython.Build import cythonize
import numpy

extensions = [
    Extension("processing_module", sources=["processing_module.pyx"],
              include_dirs=[numpy.get_include()],
              extra_compile_args=["-O3"], language="c++")
]

setup(
    name="processing_module",
    ext_modules=cythonize(extensions),
)
<ios> is a C++ header. The error message shows that you are trying to compile C++ code as C code.
By default, Cython produces a file with the extension *.c, which the compiler then treats as C code.
Cython can also produce a file with the right extension for C++, i.e. *.cpp, and there are multiple ways to trigger this behavior:
adding # distutils: language = c++ at the beginning of the pyx file.
adding language="c++" to the Extension definition in the setup.py file.
calling cython with the option --cplus.
in IPython, calling the %%cython magic with -+, i.e. %%cython -+.
for alternatives when building with pyximport, see this SO question.
Actually, for cythonize there is no command-line option to trigger C++ generation, so the first option looks like the best way to go:
# distutils: language = c++
from spacy.structs cimport TokenC
print("loading")
The problem is that spacy/structs.pxd uses C++ constructs, for example vectors, or anything else cimported from libcpp:
...
from libcpp.vector cimport vector
...
and thus C++ libraries/headers are also needed for the build.

Python and C++: How to use pybind11 with Cmakelists including GSL libraries

I want to be able to call my C++ code as a Python package. To do this I am using pybind11 with CMakeLists (following this example https://github.com/pybind/cmake_example). My problem is that I have to include the GSL libraries in the compilation of the code, and these need an explicit linker flag -lgsl.
If I were just to compile and run the C++ without wrapping it for Python, the following CMakeLists.txt file does the job:
cmake_minimum_required(VERSION 3.0)
set(CMAKE_BUILD_TYPE Debug)
set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -std=c++14")
project(myProject)

add_executable(
    myexecutable
    main.cpp
    function1.cpp
)

find_package(GSL REQUIRED)
target_link_libraries(myexecutable GSL::gsl GSL::gslcblas)
but when using pybind11, the template I found doesn't allow add_executable, and therefore target_link_libraries doesn't work.
I have tried this:
project(myProject)
set(CMAKE_CXX_STANDARD 11)
set(CMAKE_CXX_STANDARD_REQUIRED YES) # See below (1)

# Set source directory
set(SOURCE_DIR "project")
# Tell CMake that headers are also in SOURCE_DIR
include_directories(${SOURCE_DIR})
set(SOURCES "${SOURCE_DIR}/functions.cpp")

# Generate Python module
add_subdirectory(lib/pybind11)
pybind11_add_module(namr ${SOURCES} "${SOURCE_DIR}/bindings.cpp")

FIND_PACKAGE(GSL REQUIRED)
target_link_libraries(GSL::gsl GSL::gslcblas)
but this produces errors in the build.
Any idea?
The function pybind11_add_module creates a library target, which can be used to link the added module with other libraries:
pybind11_add_module(namr ${SOURCES} "${SOURCE_DIR}/bindings.cpp")
target_link_libraries(namr PUBLIC GSL::gsl GSL::gslcblas)
This is explicitly stated in the documentation:
This function behaves very much like CMake’s builtin add_library (in fact, it’s a wrapper function around that command). It will add a library target called <name> to be built from the listed source files. In addition, it will take care of all the Python-specific compiler and linker flags as well as the OS- and Python-version-specific file extension. The produced target <name> can be further manipulated with regular CMake commands.
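Putting it together, the CMakeLists.txt from the question would become something like the following (a sketch; the lib/pybind11 checkout path and the source layout are taken from the question):

```cmake
cmake_minimum_required(VERSION 3.0)
project(myProject)

set(CMAKE_CXX_STANDARD 11)
set(CMAKE_CXX_STANDARD_REQUIRED YES)

# Source directory as in the question
set(SOURCE_DIR "project")
include_directories(${SOURCE_DIR})
set(SOURCES "${SOURCE_DIR}/functions.cpp")

# Generate the Python module; this creates the target "namr"
add_subdirectory(lib/pybind11)
pybind11_add_module(namr ${SOURCES} "${SOURCE_DIR}/bindings.cpp")

# target_link_libraries now names the target created by pybind11_add_module
find_package(GSL REQUIRED)
target_link_libraries(namr PUBLIC GSL::gsl GSL::gslcblas)
```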

Writing a python wrapper for c++ code

I modified a C++ code file (Kraskov_v1.C) and I now wish to call it from Python.
I am able to convert the C++ code (Kraskov_v1.C) into a .so file and integrate it into my Python library. However, when I try to import the library, it throws an error: "undefined symbol: _Z8mir_xnynPPdiiiiS_S_S_".
mir_xn_yn is a function (written in another C++ file, namely miutils) that my Kraskov_v1 code calls. I included the header file
#include "miutils.h"
in the file containing Kraskov_v1.
Here is the setup.py file I wrote to build and install this package:
from distutils.core import setup
from distutils.extension import Extension
from Cython.Distutils import build_ext
import numpy.distutils.misc_util

setup(name='Kraskov_v1',
      version='0.1.0',
      ext_modules=[Extension('_Kraskov_v1',
                             sources=["Kraskov_v1.i", "Kraskov_v1.C"],
                             include_dirs=['src'])
                   ])
Can someone tell me what's wrong? I am new to Python and C++ and would appreciate some help.
The Extension needs a list of libraries to link with after compiling.
Missing symbols mean a required library is not linked into the shared object (.so), so definitions from that library are not available.
setup(name='Kraskov_v1',
      version='0.1.0',
      ext_modules=[Extension('_Kraskov_v1',
                             sources=["Kraskov_v1.i", "Kraskov_v1.C"],
                             include_dirs=['src'],
                             libraries=['kraskov', <..>])
                   ])

Compiling an optional cython extension only when possible in setup.py

I have a Python module fully implemented in Python (for portability reasons).
The implementation of a small part has been duplicated in a Cython module, to improve performance where possible.
I know how to install the .c modules created by Cython with distutils. However, if a machine has no compiler installed, I suspect the setup will fail even though the module is still usable in pure-Python mode.
Is there a way to compile the .c module if possible but fail gracefully and install without it if compiling is not possible?
I guess you will have to make some modifications both in your setup.py and in one __init__ file in your module.
Let's say the name of your package is "module", and you have a functionality, sub, for which you have pure Python code in the sub subfolder and the equivalent C code in the c_sub subfolder.
For example, in your setup.py:
import logging

from setuptools import setup
from setuptools.extension import Extension
from setuptools.command.build_ext import build_ext
from distutils.errors import CCompilerError, DistutilsExecError, DistutilsPlatformError

logging.basicConfig()
log = logging.getLogger(__file__)

ext_errors = (CCompilerError, DistutilsExecError, DistutilsPlatformError, IOError, SystemExit)

setup_args = {'name': 'module', 'license': 'BSD', 'author': 'xxx',
              'packages': ['module', 'module.sub', 'module.c_sub'],
              'cmdclass': {'build_ext': build_ext}}

ext_modules = [Extension("module.c_sub._sub", ["module/c_sub/_sub.c"])]

try:
    # Try building with the C code:
    setup(ext_modules=ext_modules, **setup_args)
except ext_errors as ex:
    log.warn(ex)
    log.warn("The C extension could not be compiled")

    # Retry installing the module without C extensions:
    # Remove any previously defined build_ext command class.
    if 'build_ext' in setup_args['cmdclass']:
        del setup_args['cmdclass']['build_ext']

    # If this new 'setup' call doesn't fail, the module
    # will be successfully installed, without the C extension:
    setup(**setup_args)
    log.info("Plain-Python installation succeeded.")
Now you will need to include something like this in your __init__.py file (or at any place relevant in your case):
try:
    from .c_sub import *
except ImportError:
    from .sub import *
In this way the C version will be used if it was built; otherwise the plain Python version is used. It assumes that sub and c_sub provide the same API.
You can find an example of a setup file that works this way in the Shapely package. Actually, most of the code I posted was copied (the construct_build_ext function) or adapted (the lines after it) from this file.
The Extension class has the parameter optional in its constructor:
optional - specifies that a build failure in the extension should not
abort the build process, but simply skip the extension.
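A minimal sketch of that approach (the module and file names are placeholders):

```python
from setuptools.extension import Extension

# With optional=True, a build failure in this extension is reported
# but does not abort the installation; the extension is simply skipped.
ext = Extension("module.c_sub._sub",
                ["module/c_sub/_sub.c"],
                optional=True)

# Then pass it to setup() as usual: setup(name="module", ext_modules=[ext], ...)
```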
Here is also a link to the quite interesting history of the piece of code proposed by mgc.
The question How should I structure a Python package that contains Cython code is related; there the question is how to fall back from Cython to the already-generated C code. You could use a similar strategy to select which of the .py or .pyx code to install.

Importing python boost module

I built a DLL in VS2010 with boost::python to export some functions to a Python module:
myDLL.cpp:
#include <string>
#include <boost/python.hpp>
using namespace boost::python;

std::string greet() { return "hello, world"; }
int square(int number) { return number * number; }

BOOST_PYTHON_MODULE(getting_started1)
{
    // Add regular functions to the module.
    def("greet", greet);
    def("square", square);
}
Up to here, everything compiles just fine. I then get the myDLL.dll and myDLL.lib file in c:\myDLL\Debug.
According to boost doc (http://wiki.python.org/moin/boost.python/SimpleExample), I need to add this to PYTHONPATH, so I added c:\myDLL\Debug to it:
PYTHONPATH:
C:\Python27;c:\myDLL\Debug;
then, from my .py file, I try to import it:
import getting_started1
print getting_started1.greet()
number = 11
print number, '*', number, '=', getting_started1.square(number)
I have also tried from myDLL import getting_started1, and from getting_started1 import *, and all possible combinations of sorts.
Can anyone tell me how am I supposed to call my module? Thanks
EDIT:
According to cgohlke, there should be a getting_started1.pyd somewhere in my PYTHONPATH when I compile in VS? This file is nonexistent... Do I have to set something different in VS2010? I have a default Win32 DLL project.
But the boost doc says "If we build this shared library and put it on our PYTHONPATH"; isn't a shared library on Windows a DLL? Ergo, the DLL should be in the PYTHONPATH?
The standard, portable way to build Python extensions is via distutils. However, Visual Studio 2010 is not a supported compiler for Python 2.7. The following setup.py works for me with Visual Studio 2008 and boost_1_48_0. The build command is python setup.py build_ext --inplace.
# setup.py
from distutils.core import setup
from distutils.extension import Extension

setup(name="getting_started1",
      ext_modules=[
          Extension("getting_started1", ["getting_started1.cpp"],
                    include_dirs=['boost_1_48_0'],
                    libraries=['boost_python-vc90-mt-1_48'],
                    extra_compile_args=['/EHsc', '/FD', '/DBOOST_ALL_DYN_LINK=1'])
      ])
For your Visual Studio 2010 project, try to change the linker output file to getting_started1.pyd instead of myDLL.dll.
I managed to get it working only in the Release configuration and not in Debug.
From the project properties, on the General tab, modify Target Extension to .pyd.
The project should indeed be a DLL, as you did.
In the Python script you need to specify the location of the DLL, as in this example:
import sys
sys.path.append("d:\\TheProjectl\\bin\\Release")
import getting_started #Dll name without the extension
