I modified some C++ code (Kraskov_v1.C) and now wish to call it from Python.
I am able to convert the C++ code (Kraskov_v1.C) into a .so file and integrate it into my Python library. However, when I try to import the library, it throws an error: "undefined symbol: _Z8mir_xnynPPdiiiiS_S_S_".
mir_xn_yn is a function (written in another C++ file, miutils) that my Kraskov_v1 code calls. I included the header file
#include "miutils.h"
in the file containing Kraskov_v1.
Here is the setup.py file I wrote to build and install this package.
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
from distutils.core import setup
from distutils.extension import Extension
from Cython.Distutils import build_ext
import numpy.distutils.misc_util
setup(name='Kraskov_v1',
      version='0.1.0',
      ext_modules=[Extension('_Kraskov_v1',
                             sources=["Kraskov_v1.i", "Kraskov_v1.C"],
                             include_dirs=['src'])
                   ])
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
Can someone tell me what's wrong? I am new to Python and C++ and would appreciate some help.
The Extension needs a list of libraries to link with after compiling.
A missing symbol means a required library was not linked into the shared object (.so), so the definitions from that library are not available.
setup(name='Kraskov_v1',
      version='0.1.0',
      ext_modules=[Extension('_Kraskov_v1',
                             sources=["Kraskov_v1.i", "Kraskov_v1.C"],
                             include_dirs=['src'],
                             libraries=['kraskov', <..>])
                   ])
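Alternatively, if mir_xnyn lives in a source file you have rather than in a prebuilt library (the question mentions a miutils header, so the file name miutils.C below is an assumption), you can simply compile that file into the extension by adding it to sources:
# Sketch: compile the file that defines mir_xnyn (assumed to be miutils.C) into the extension itself
setup(name='Kraskov_v1',
      version='0.1.0',
      ext_modules=[Extension('_Kraskov_v1',
                             sources=["Kraskov_v1.i", "Kraskov_v1.C", "miutils.C"],
                             include_dirs=['src'])
                   ])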
Related
I wrote some binding code in a pybindx.cpp file to bind C++ code with Python. I want to call some functions (implemented in C++) from Python. When I run the python setup.py build_ext command, the .so file ./build/lib.linux-x86_64-3.8/pybindx.cpython-38-x86_64-linux-gnu.so is created, but when I try to import it (import pybindx) in test.py to call the bound functions, it gives the following error:
ImportError: <path-to-.so-file>/pybindx.cpython-38-x86_64-linux-gnu.so: undefined symbol: _ZN6google8protobuf8internal26fixed_address_empty_stringB5cxx11E
I have added <path-to-.so-file> to PYTHONPATH and LD_LIBRARY_PATH.
My setup.py file contains following code:
import os, sys
from distutils.core import setup, Extension
from distutils import sysconfig
cpp_args = ['-std=c++11']
ext_modules = [
    Extension(
        'pybindx',
        ['class1.cpp', 'class2.cpp', 'base_class1.cpp', 'base_class2.cpp', 'pybindx.cpp'],
        include_dirs=['paths/to/include/header/files', 'path/to/protobuf/include'],
        language='c++',
        extra_compile_args=cpp_args,
    ),
]

setup(
    name='pybindx',
    version='0.0.1',
    author='xxxxx',
    author_email='xxxxx',
    description='desc',
    ext_modules=ext_modules,
)
Here class1.cpp, class2.cpp, base_class1.cpp, and base_class2.cpp are the files containing the implementations of the classes and functions which I want to bind with Python.
I am new to pybind11; can someone help me with this?
Thanks!
I tried writing a small example without protobuf, where I am able to call the C++ function from test.py, but here I want to use protobuf.
Anyway, I fixed it. I used a Makefile rather than setup.py: I had to link the static libprotobuf.a into my pybindx.so.
I added the following command (along with the other required commands) to the Makefile:
g++ -shared -fPIC ./build/*.o <path-to-static-protobuf>/libprotobuf.a -o pybindx.so
Here the *.o files are the object files created by other command(s) in the Makefile.
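For reference, roughly the same link step can be expressed in setup.py instead of a Makefile by handing the static archive to the Extension via extra_objects (the protobuf path below is a placeholder):
ext_modules = [
    Extension(
        'pybindx',
        ['class1.cpp', 'class2.cpp', 'base_class1.cpp', 'base_class2.cpp', 'pybindx.cpp'],
        include_dirs=['paths/to/include/header/files', 'path/to/protobuf/include'],
        language='c++',
        extra_compile_args=cpp_args,
        # Link the static protobuf archive into the extension (placeholder path).
        extra_objects=['<path-to-static-protobuf>/libprotobuf.a'],
    ),
]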
When I use Cython to compile Python code that uses Python.NET to access .NET assemblies, the compiled module can't find those assemblies:
ModuleNotFoundError: No module named 'System'
Without compilation it works ok.
For demo code, I used https://github.com/pythonnet/pythonnet/blob/master/demo/helloform.py
My setup.py file:
from distutils.core import setup
from distutils.extension import Extension
from Cython.Build import cythonize
ext_modules = [
    Extension(
        'helloform',
        sources=['helloform.py'],
        language='c++'
    )
]

setup(
    name='helloform',
    ext_modules=cythonize(ext_modules),
)
Then I build it with python setup.py build_ext --inplace.
I wanted to load the compiled module from the Python prompt with import helloform, but it failed with:
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "helloform.py", line 8, in init helloform
ModuleNotFoundError: No module named 'System'
This answer is untested - I don't think I can easily set up an environment to test with, so it's a bit of a guess. If it doesn't work I'll remove it.
This is probably a bug, and if you want it fixed in the longer term you should report it. Cython does try to be compatible with Python wherever possible. A quick investigation suggests that Python.NET overrides the built-in __import__ function. Cython looks up and uses this function in Python 2, but not in Python 3. Overriding __import__ is no longer the preferred way of customizing import behaviour (but is still supported). I'd guess it would work with Cython + Python 2.
As a workaround you should probably just run the import statements in Python. There are two obvious ways to do it:
Write a small separate module just containing the import statements, then in Cython import from that module (a sketch of such a module appears after these two options):
from import_module import WinForms, Size, Point
Run the import statements in exec; extract the values out of the global dict you pass it:
import_dict = {}
exec("""import clr
# etc...
""", import_dict) # pass a dict in as `globals`
WinForms = import_dict['WinForms']
# etc.
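For the first approach, the helper module is just a plain (uncompiled) Python file. A minimal sketch, with the imports taken from the linked helloform demo and the file name import_module.py assumed from the snippet above:
# import_module.py -- plain Python, not compiled by Cython
import clr
clr.AddReference("System.Windows.Forms")

import System.Windows.Forms as WinForms
from System.Drawing import Size, Point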
I have a Cython module compiled from pyx files to c files that I am trying to import and use in a python module. I'm running python 3.6 on a Mac. When I run gcc -v the output is:
Configured with: --prefix=/Library/Developer/CommandLineTools/usr --with-gxx-include-dir=/Library/Developer/CommandLineTools/SDKs/MacOSX10.14.sdk/usr/include/c++/4.2.1
Apple LLVM version 10.0.1 (clang-1001.0.46.4)
Target: x86_64-apple-darwin18.7.0
Thread model: posix
InstalledDir: /Library/Developer/CommandLineTools/usr/bin
Running python setup.py build and python setup.py install gives no errors, and the corresponding .so and .c files appear in the right directory, which is on the path.
When I try to import the module, I get an error in the __init__ file, from a line that tries to import another submodule:
from . import subModule
I've tried updating Python and Cython, and I've made sure that gcc is in /usr/lib.
This is my setup.py file:
from Cython.Build import cythonize

setupArgs = dict(
    name="module",
    version="1.1.0",
    description=description,
    url="https://github.com/XXXX/XXXXX/module",
    author="CTcue",
    author_email="info#XXXXXX.com",
    ext_modules=cythonize(["module/*.pyx"]),
)

# Use distutils setup
from distutils.core import setup
setup(**setupArgs)
This is the error message:
File "module/subModule.pyx", line 4, in init module.subModule
ModuleNotFoundError: No module named 'thirdModule'
The thirdModule in question has a .c file and a .so file that correspond to the .pyx file, and as far as I can tell everything is in order there.
module's __init__.py:
__title__ = 'Pseudonomizer'
__version__ = '1.0.2'
__author__ = 'CTcue'
__license__ = 'Proprietary'
__copyright__ = 'Copyright 2016 CTcue'
from . import subModule
subModule:
import thirdModule
thirdModule.doSomething()
thirdModule:
import re
from . import anotherModule
def doSomething():
    # code that does something
Edit: In an attempt to see if the compiler is at fault, I tried to manually compile thirdModule's .c file with "gcc thirdModule", and got the following error:
Undefined symbols for architecture x86_64:
This seems to suggest that the issue is compiler-related, but I still haven't found the solution.
Any help would be much appreciated.
It turns out @ead was right, and the problem was that the module had implicit relative imports, which are no longer allowed in Python 3.
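In other words, a minimal sketch of the fix is to make subModule's import explicit and relative:
# subModule.pyx -- explicit relative import instead of "import thirdModule"
from . import thirdModule

thirdModule.doSomething()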
I have written a multi-threaded module called fast_nn in Cython and compiled it with the following setup.py:
from distutils.core import setup
from distutils.extension import Extension
from Cython.Build import cythonize
import numpy
setup(
    ext_modules=cythonize([
        Extension("fast_nn", ["fast_nn.pyx"],
                  language='c++',
                  extra_compile_args=['-O3', '-fopenmp'],
                  extra_link_args=['-fopenmp'],
                  include_dirs=[numpy.get_include()])
    ])
)
In addition, I use the python bindings of the caffe framework.
If I use my module alone or pycaffe alone, everything is fine. However, the following combination is problematic:
import caffe.io
import fast_nn
On one machine, it does not lead to any problems. On another machine, I get the following exception:
Traceback (most recent call last):
File "test.py", line 2, in <module>
import fast_nn
ImportError: dlopen: cannot load any more object with static TLS
From this thread I understand that too many libraries with initial-exec static TLS have been loaded dynamically. But what can I do to work around this problem? Is there some way to compile my module with static TLS disabled or a better solution?
However, the output of readelf -a fast_nn.cpython-35m-x86_64-linux-gnu.so | grep TLS does not show any TLS symbols in my library. But some dependent libraries contain such symbols.
Solutions tried so far
Changing the order of the two imports (i.e., importing caffe after my cython module) solves the problem. However, since my actual real-world case is much more complex and both modules are imported indirectly from other imports, I cannot change the order of imports.
Similarly, exporting the path of my compiled module library to LD_PRELOAD has the same effect, but is not an elegant solution either.
I have a Python module fully implemented in Python (for portability reasons).
The implementation of a small part has been duplicated in a Cython module, to improve performance where possible.
I know how to install the .c modules created by Cython with distutils. However, if a machine has no compiler installed, I suspect the setup will fail, even though the module is still usable in pure-Python mode.
Is there a way to compile the .c module if possible, but fail gracefully and install without it if compiling is not possible?
I guess you will have to make some modifications both in your setup.py and in one __init__.py file in your module.
Let's say the name of your package is "module", and you have a functionality sub for which you have pure Python code in the sub subfolder and the equivalent C code in the c_sub subfolder.
For example, in your setup.py:
import logging
from setuptools import setup
from setuptools.extension import Extension
from setuptools.command.build_ext import build_ext
from distutils.errors import CCompilerError, DistutilsExecError, DistutilsPlatformError

logging.basicConfig()
log = logging.getLogger(__file__)

ext_errors = (CCompilerError, DistutilsExecError, DistutilsPlatformError, IOError, SystemExit)

setup_args = {'name': 'module', 'license': 'BSD', 'author': 'xxx',
              'packages': ['module', 'module.sub', 'module.c_sub'],
              'cmdclass': {'build_ext': build_ext}
              }

ext_modules = [Extension("module.c_sub._sub", ["module/c_sub/_sub.c"])]

try:
    # Try building with the C code:
    setup(ext_modules=ext_modules, **setup_args)
except ext_errors as ex:
    log.warn(ex)
    log.warn("The C extension could not be compiled")

    ## Retry to install the module without C extensions:
    # Remove any previously defined build_ext command class.
    if 'build_ext' in setup_args['cmdclass']:
        del setup_args['cmdclass']['build_ext']

    # If this new 'setup' call doesn't fail, the module
    # will be successfully installed, without the C extension:
    setup(**setup_args)
    log.info("Plain-Python installation succeeded.")
Now you will need to include something like this in your __init__.py file (or at any place relevant in your case):
try:
    from .c_sub import *
except ImportError:
    from .sub import *
In this way the C version will be used if it was built; otherwise the plain Python version is used. This assumes that sub and c_sub provide the same API.
You can find an example of a setup file that works this way in the Shapely package. Actually most of the code I posted was copied (the construct_build_ext function) or adapted (the lines after) from that file.
The Extension class has an optional parameter in its constructor:
optional - specifies that a build failure in the extension should not
abort the build process, but simply skip the extension.
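A minimal sketch of how that could look for the layout above (assuming the same module names as in the other answer):
# If compiling the C version fails, the build just skips this extension instead of aborting.
ext_modules = [Extension("module.c_sub._sub", ["module/c_sub/_sub.c"], optional=True)]
setup(ext_modules=ext_modules, **setup_args)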
Here is also a link to the quite interesting history of the piece of code proposed by mgc.
The question How should I structure a Python package that contains Cython code is related; there the question is how to fall back from Cython to the "already generated C code". You could use a similar strategy to select whether the .py or the .pyx code gets installed.