How to compile .c code from Cython with gcc - python

Now that I've successfully installed Cython on Windows 7, I'm trying to compile some Cython code, but gcc is making my life hard.
cdef void say_hello(name):
print "Hello %s" % name
Using gcc to compile the generated code throws dozens of undefined reference errors, and I'm pretty sure libpython.a is available (as the installation tutorial said, undefined reference errors are thrown if this file is missing).
$ cython ctest.pyx
$ gcc ctest.c -I"C:\Python27\include"
C:\Users\niklas\AppData\Local\Temp\cckThGrF.o:ctest.c:(.text+0x1038): undefined reference to `_imp__PyString_FromStringAndSize'
C:\Users\niklas\AppData\Local\Temp\cckThGrF.o:ctest.c:(.text+0x1075): undefined reference to `_imp___Py_TrueStruct'
C:\Users\niklas\AppData\Local\Temp\cckThGrF.o:ctest.c:(.text+0x1086): undefined reference to `_imp___Py_ZeroStruct'
C:\Users\niklas\AppData\Local\Temp\cckThGrF.o:ctest.c:(.text+0x1099): undefined reference to `_imp___Py_NoneStruct'
C:\Users\niklas\AppData\Local\Temp\cckThGrF.o:ctest.c:(.text+0x10b8): undefined reference to `_imp__PyObject_IsTrue'
c:/program files/mingw/bin/../lib/gcc/mingw32/4.5.2/../../../libmingw32.a(main.o):main.c:(.text+0xd2): undefined reference to `WinMain@16'
collect2: ld returned 1 exit status
The weird thing is that using pyximport* or a setup script works just fine, but neither is very handy while still working on a module.
How can I compile those .c files generated by Cython using gcc, or any other compiler? The important thing is that it works!
*pyximport: Is it normal that only Python-native functions and classes are contained in the imported module, and not cdef functions and classes?
like:
# filename: cython_test.pyx
cdef c_foo():
    print "c_foo !"

def foo():
    print "foo !"
    c_foo()
import pyximport as p; p.install()
import cython_test
cython_test.foo()
# foo !
# c_foo !
cython_test.c_foo()
# AttributeError, module object has no attribute c_foo
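(An aside on the pyximport footnote: this behaviour is expected, since cdef functions get no Python-level wrapper. A minimal sketch, using a hypothetical variant of the file above, of how declaring the function cpdef instead makes it reachable from Python:)
# cython_test2.pyx -- hypothetical variant of the example above, not the asker's file.
# cpdef generates both a C-level function and a Python wrapper, so the function
# also appears as an attribute of the imported module.
cpdef c_foo():
    print "c_foo !"

def foo():
    print "foo !"
    c_foo()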
UPDATE
Calling $ gcc ctest.c "C:\Python27\libs\libpython27.a" gets rid of the undefined reference errors, but this one remains:
c:/program files/mingw/bin/../lib/gcc/mingw32/4.5.2/../../../libmingw32.a(main.o):main.c:(.text+0xd2): undefined reference to `WinMain@16'

Try:
gcc -c -IC:\Python27\include -o ctest.o ctest.c
gcc -shared -LC:\Python27\libs -o ctest.pyd ctest.o -lpython27
-shared creates a shared library. -lpython27 links with the import library C:\Python27\libs\libpython27.a.
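For comparison, the setup-script route the asker says already works boils down to the same flags; a minimal sketch, assuming the module is called ctest as in the commands above:
# setup.py -- minimal sketch using the classic Cython/distutils integration;
# it supplies the include path and import library for the running Python,
# which is what the manual gcc invocation otherwise has to spell out.
from distutils.core import setup
from distutils.extension import Extension
from Cython.Distutils import build_ext

setup(
    cmdclass={'build_ext': build_ext},
    ext_modules=[Extension("ctest", ["ctest.pyx"])],
)
Running python setup.py build_ext --inplace then produces the importable extension next to the source.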

That is a linker (ld) error, not a compiler error. You need to point the linker at the library itself (-L for the library directory, -l for the library name), not only at the headers (-I).

Related

How can I solve "undefined reference to" errors in Cython when using MinGW-w64?

I am new to Cython and I am trying to convert a Python file to a C file and then to an executable, but after generating the C file, when I try to compile it using GCC on Windows, I get a lot of undefined reference errors.
Here's what I did:
My pyx file:
cdef public void fun():
    print('hello, world!')
if __name__ == "__main__":
    fun()
Then I used this command to generate the C file:
python -m cython hello.pyx --embed -3
This succeeded and I got a hello.c file. Then I tried to compile it using:
x86_64-w64-mingw32-gcc -mconsole -DSIZEOF_VOID_P=8 hello.c -IE:\phyton\include -LE:\phyton\libs -lpython38 -o hello.exe -DMS_WIN64
but that didn't work and I got this:
C:\Users\STIP\AppData\Local\Temp\ccZz6iu8.o:hello.c:(.text+0xf7c): undefined reference to `__imp_PyExc_SystemError'
C:\Users\STIP\AppData\Local\Temp\ccZz6iu8.o:hello.c:(.text+0x1175): undefined reference to `__imp__Py_NoneStruct'
C:\Users\STIP\AppData\Local\Temp\ccZz6iu8.o:hello.c:(.text+0x129c): undefined reference to `__imp_PyUnicode_Type'
C:\Users\STIP\AppData\Local\Temp\ccZz6iu8.o:hello.c:(.text+0x12b7): undefined reference to `__imp_PyUnicode_Type'
C:\Users\STIP\AppData\Local\Temp\ccZz6iu8.o:hello.c:(.text+0x17da): undefined reference to `__imp__Py_NoneStruct'
C:\Users\STIP\AppData\Local\Temp\ccZz6iu8.o:hello.c:(.text+0x17f4): undefined reference to `__imp__Py_NoneStruct'
C:\Users\STIP\AppData\Local\Temp\ccZz6iu8.o:hello.c:(.text+0x19a3): undefined reference to `__imp__Py_FalseStruct'
C:\Users\STIP\AppData\Local\Temp\ccZz6iu8.o:hello.c:(.text+0x19ac): undefined reference to `__imp__Py_TrueStruct'
C:\Users\STIP\AppData\Local\Temp\ccZz6iu8.o:hello.c:(.text+0x19fe): undefined reference to `__imp__Py_FalseStruct'
C:\Users\STIP\AppData\Local\Temp\ccZz6iu8.o:hello.c:(.text+0x1a16): undefined reference to `__imp__Py_FalseStruct'
C:\Users\STIP\AppData\Local\Temp\ccZz6iu8.o:hello.c:(.text+0x1a23): undefined reference to `__imp__Py_TrueStruct'
C:\Users\STIP\AppData\Local\Temp\ccZz6iu8.o:hello.c:(.text+0x21a0): undefined reference to `__imp_PyModule_Type'
C:\Users\STIP\AppData\Local\Temp\ccZz6iu8.o:hello.c:(.text+0x21b8): undefined reference to `__imp_PyModule_Type'
C:\Users\STIP\AppData\Local\Temp\ccZz6iu8.o:hello.c:(.text+0x22b4): undefined reference to `__imp_PyBaseObject_Type'
C:\Users\STIP\AppData\Local\Temp\ccZz6iu8.o:hello.c:(.text+0x284b): undefined reference to `__imp__Py_TrueStruct'
C:\Users\STIP\AppData\Local\Temp\ccZz6iu8.o:hello.c:(.text+0x285f): undefined reference to `__imp__Py_FalseStruct'
C:\Users\STIP\AppData\Local\Temp\ccZz6iu8.o:hello.c:(.text+0x2875): undefined reference to `__imp__Py_NoneStruct'
C:\Users\STIP\AppData\Local\Temp\ccZz6iu8.o:hello.c:(.text+0x28d7): undefined reference to `__imp_PyExc_DeprecationWarning'
C:\Users\STIP\AppData\Local\Temp\ccZz6iu8.o:hello.c:(.text+0x292e): undefined reference to `__imp_PyExc_TypeError'
collect2.exe: error: ld returned 1 exit status
So where is my mistake? Most of the time this has to do with linking libraries, but I think I have done that properly, so can anybody help?
While MSVC needs the corresponding *.lib file for linking against a DLL, the x86_64-w64-mingw32-gcc linker is different and links directly against the *.dll (similar to linking against a *.so file on Linux). See also this SO post for more details.
Your command passes -LE:\phyton\libs to the linker, which is where the *.lib files reside. The linker finds python38.lib (so there is no error that -lpython38 was not found) but doesn't find the expected symbols inside it.
You need to provide the path where the needed python38.dll can be found. Normally it is in the same folder as the Python executable, so probably in E:\phyton, which means
x86_64-w64-mingw32-gcc <....> -LE:\phyton -lpython38 -o hello.exe -DMS_WIN64 -municode
should be used. For why -municode is needed, see this SO post.
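If in doubt about where python38.dll actually lives, a quick check from the target interpreter (a small sketch, nothing MinGW-specific):
# Print the folder of the running interpreter; on Windows the matching
# pythonXY.dll (here python38.dll) normally sits right next to python.exe.
import os
import sys
print(os.path.dirname(sys.executable))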

Is there a way to create a python module using SWIG C++ which can be imported in both Python2 and Python3

I'm writing a SWIG C++ interface file to generate a Python module. I want users to be able to import it in both Python 2 and Python 3 scripts. Since SWIG has different flags for binding Python 2 and Python 3, I was wondering whether there is a way I can create a general module for both.
Let's rewind a little and keep SWIG itself out of the question to start with.
Historically it's been necessary to compile a Python module for every (sometimes even minor) version change of the Python interpreter. This led to PEP-384, which defined a stable ABI for a subset of the Python C-API from Python 3.2 onwards. So obviously that doesn't actually work for your request, because you're interested in 2 vs 3. Furthermore, SWIG itself doesn't actually generate PEP-384-compatible code.
To experiment further and see a bit more about what's going on I've made the following SWIG file:
%module test
%inline %{
char *frobinate(const char *in) {
    static char buf[1024];
    snprintf(buf, 1024, "World of hello: %s", in);
    return buf;
}
%}
If we try to compile this with -DPy_LIMITED_API it fails:
swig3.0 -Wall -python test.i && gcc -Wall -Wextra -I/usr/include/python3.2 -shared -o _test.so test_wrap.c -DPy_LIMITED_API 2>&1|head
test_wrap.c: In function ‘SWIG_PyInstanceMethod_New’:
test_wrap.c:1114:3: warning: implicit declaration of function ‘PyInstanceMethod_New’ [-Wimplicit-function-declaration]
return PyInstanceMethod_New(func);
^
test_wrap.c:1114:3: warning: return makes pointer from integer without a cast
test_wrap.c: In function ‘SWIG_Python_UnpackTuple’:
test_wrap.c:1315:5: warning: implicit declaration of function ‘PyTuple_GET_SIZE’ [-Wimplicit-function-declaration]
Py_ssize_t l = PyTuple_GET_SIZE(args);
^
test_wrap.c:1327:2: warning: implicit declaration of function ‘PyTuple_GET_ITEM’ [-Wimplicit-function-declaration]
I.e. nobody has picked up that ticket, at least not in the SWIG version (3.0.2) that I am testing with.
So where does that leave us? Well the SWIG 3.0 documentation on Python 3 support says something interesting:
SWIG is able to support Python 3.0. The wrapper code generated by SWIG can be compiled with both Python 2.x or 3.0. Further more, by passing the -py3 command line option to SWIG, wrapper code with some Python 3 specific features can be generated (see below subsections for details of these features). The -py3 option also disables some incompatible features for Python 3, such as -classic.
My reading of that statement is that if you want to generate source code that can be compiled with both 2.x and 3.x Python headers, all you need to do is omit the -py3 argument when running SWIG. And that seems to hold in my testing: the same code generated by SWIG compiles just fine with all of:
$ gcc -Wall -Wextra -I/usr/include/python2.7/ -shared -o _test.so test_wrap.c
$ gcc -Wall -Wextra -I/usr/include/python3.2 -shared -o _test.so test_wrap.c
$ gcc -Wall -Wextra -I/usr/include/python3.4 -shared -o _test.so test_wrap.c
(Note that 3.4 does generate some warnings, but no errors and it does work)
The problem is that there's no interchangeability between the compiled _test.so files of the different versions. Trying to import a module compiled for one version into another will fail with errors from the dynamic linker about missing or undefined symbols. So we're still stuck at the initial problem: although we can compile our module for any given Python interpreter, we can't compile it once for all versions.
In my view the right way to deal with this is to use distutils and simply install your module into the search path for each version of Python you want to support on any given machine.
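A minimal sketch of that distutils route, reusing the test.i example from above (module and file names assumed from the default SWIG output):
# setup.py -- sketch only: builds the SWIG-generated wrapper for whichever
# interpreter runs this script and installs it into that interpreter's own
# site-packages, so each Python version gets its own copy.
from distutils.core import setup, Extension

setup(
    name='test',
    py_modules=['test'],                                # test.py emitted by SWIG
    ext_modules=[Extension('_test', ['test_wrap.c'])],
)
Running it once per interpreter, e.g. python2.7 setup.py install and then python3.4 setup.py install, gives each version its own correctly-linked _test.so.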
That said, there is a workaround we could use. Essentially we want to build one version of our module for each interpreter and use some generic code to switch between the various implementations. Assuming that you're not building with code generated from SWIG's -builtin option, there are two places we could try to do this:
In the shared object itself
In some Python code
The latter is substantially simpler to achieve, so let's look at that. I did it by using the following bash script to build a shared object for each version of Python I intended to support:
#!/bin/bash
for v in 2.7 3.2 3.4
do
    d=$(echo $v|tr . _)
    mkdir -p $d
    touch $d/__init__.py
    gcc -Wall -Wextra -I/usr/include/python$v -shared -o $d/_test.so test_wrap.c
done
This builds 3 versions of _test.so, in directories named after the Python version they're intended to support. If we tried to import our SWIG generated module now it would complain because there's no indication of where to find the _test module we've compiled. So we need to fix that. There are three ways to do it:
Modify the SWIG generated import code. I didn't manage to make this work with SWIG 3.0.2 - perhaps it's too old?
Modify the Python search path, before the second import. We can do that using %pythonbegin:
%pythonbegin %{
import os.path
import sys
sys.path.append(os.path.join(os.path.abspath(os.path.dirname(__file__)), '%d_%d' % (sys.version_info.major, sys.version_info.minor)))
%}
Write a _test.py stub that finds the right one and then switches them around:
from sys import version_info,modules
from importlib import import_module
real_mod = import_module('%d_%d.%s' % (version_info.major, version_info.minor, __name__))
modules[__name__] = modules[real_mod.__name__]
Either of the last two options works and results in import test succeeding in all three versions of Python. But it's cheating a bit, really, and it's far better just to install from source three times over.

Cython generated module can't be used because of undefined symbol

The Python extension lsblib.so is created by Cython and distutils; the build command generated by distutils is as follows:
/gpfs/software/openlava_sqa/3.3.3/etc/../include/lsbatch.h:1681: warning: function declaration isn’t a prototype
gcc -pthread -shared -L/gpfs/DEV/PLT/software/anaconda2-4.0.0/lib -Wl,-rpath=/gpfs/DEV/PLT/software/anaconda2-4.0.0/lib,--no-as-needed build/temp.linux-x86_64-2.7/openlava/lsblib.o /gpfs/software/openlava_sqa/3.3.3/etc/../lib/liblsfint.a /gpfs/software/openlava_sqa/3.3.3/etc/../lib/liblsf.a /gpfs/software/openlava_sqa/3.3.3/etc/../lib/liblsbatch.a -L/gpfs/software/openlava_sqa/3.3.3/etc/../lib -L/gpfs/DEV/PLT/software/anaconda2-4.0.0/lib -llsfint -llsf -llsbatch -lnsl -ltools -lpython2.7 -o build/lib.linux-x86_64-2.7/openlava/lsblib.so -g
Note that the libraries are linked twice, once as static archives alongside lsblib.o and once via -l flags; the target module is lsblib.so:
-llsfint -llsf -llsbatch -lnsl -ltools
build/temp.linux-x86_64-2.7/openlava/lsblib.o /gpfs/software/openlava_sqa/3.3.3/etc/../lib/liblsfint.a /gpfs/software/openlava_sqa/3.3.3/etc/../lib/liblsf.a /gpfs/software/openlava_sqa/3.3.3/etc/../lib/liblsbatch.a
When I import the module:
/gpfs/software/openlava_sqa/3.3.3/lib/liblsbatch.so.0: undefined symbol: mergeResreq
However, the compiled lsblib.so has the symbol
000000000009dc9b T mergeResreq
and the symbol mergeResreq is present in both liblsfint.a and liblsfint.so.0.0.1:
nm /gpfs/software/openlava_sqa/3.3.3/lib/liblsfint.so.0.0.1 |grep merge
0000000000010017 T mergeResreq
00000000000161d5 t mergeW
nm /gpfs/software/openlava_sqa/3.3.3/lib/liblsfint.a |grep merge
00000000000041ff T mergeResreq
0000000000000581 t mergeW
Why, when importing the module, is the symbol looked up in the wrong library (liblsbatch.so)? It does not use the one in lsblib.so, and even if I set LD_LIBRARY_PATH it doesn't use the one in liblsfint.so.0.0.1.

Python / C++ binding, how to link against a static C++ library (portaudio) with distutils?

I am trying to statically link the C++ portaudio library against my C++ "demo" module, which is a Python-callable library (module).
I'm doing this with distutils, and in order to perform the static linking I've added libportaudio.a to the extra_objects argument, as follows:
module1 = Extension(
    "demo",
    sources=cppc,
    # TODO remove os dependency
    extra_compile_args=gccArgs,
    # link against shared libraries
    #libraries=[""]
    # link against static libraries
    extra_objects=["./clib-3rd-portaudio/libportaudio.a"])  # << I've added the static lib here
Compiling with "python setup.py build" results in the following linker error:
/usr/bin/ld: ./clib-3rd-portaudio/libportaudio.a(pa_front.o): relocation R_X86_64_32 against `.rodata.str1.8' can not be used when making a shared object; recompile with -fPIC
./clib-3rd-portaudio/libportaudio.a: error adding symbols: Bad value
collect2: error: ld returned 1 exit status
So at this point I've tried the obvious: I've added the -fPIC flag to gccArgs (note extra_compile_args=gccArgs above), as follows:
gccArgs = [
    "-Icsrc",
    "-Icsrc/paExamples",
    "-Icinc-3rd-portaudio",
    "-Icinc-3rd-portaudio/common",
    "-Icinc-3rd-portaudio/linux",
    "-fPIC"]  # << I've added the -fPIC flag here
However this results in exactly the same error, so I guess the -fPIC flag is not the root cause. I'm probably missing something trivial, but I'm a bit lost here; I hope somebody can help.
As the error message says, you should recompile the external library libportaudio.a with the -fPIC argument, NOT your own code. That's why adding -fPIC to your extra_compile_args doesn't help.
Several other posts suggest that the file libportaudio.a cannot be used to build a shared library, probably because the default build settings of portaudio don't include -fPIC.
To recompile portaudio correctly, download the source and try running ./configure with a -shared option (or something similar). If you cannot find the proper option, modify the Makefile and append -fPIC to the extra compile options. You can also compile each object file manually and pack them into libportaudio.a.
Since your target file (libdemo.so) is a shared library, you must make sure ANY object code included in it is compiled with the -fPIC option. To understand why you need this option, please refer to:
What does -fPIC mean when building a shared library? and Position Independent Code (PIC) in shared libraries
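If rebuilding portaudio with -fPIC turns out to be impractical, one alternative, not part of the answer above and assuming a system-wide shared libportaudio is installed, is to link dynamically via the libraries/library_dirs arguments instead of extra_objects; a sketch reusing the question's variables:
# Sketch: link demo against the shared libportaudio instead of the static
# archive. The install prefix below is an assumption; adjust as needed.
module1 = Extension(
    "demo",
    sources=cppc,
    extra_compile_args=gccArgs,
    libraries=["portaudio"],           # resolves to libportaudio.so
    library_dirs=["/usr/local/lib"],   # assumed portaudio install location
)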

undefined symbol: PyOS_InputHook, from shared library

I wrote a C++ "Python plugin" for a non-Python C++ application.
At some point this plugin, which is a .so, initializes the Python interpreter and opens a Python console.
For convenience the "readline" module is then imported and we get this error:
ImportError: /usr/lib/python2.7/lib-dynload/readline.so: undefined symbol: PyOS_InputHook
The link command (generated by CMake) is:
/usr/bin/c++ -fPIC -Wall -Wextra -O3 -DNDEBUG -Xlinker -export-dynamic -Wl,-fwhole-program /usr/lib/libpython2.7.a -shared -Wl,-soname,libMyplugin.so -o libMyplugin.so [sources] [qt libs] -lGLU -lGL -lX11 -lXext -lc -lc -lpython2.7 -Wl,-rpath,/src:/usr/local/Trolltech/Qt-4.8.4/lib:
nm libMyplugin.so gives the following python-related symbols:
U Py_Finalize
U Py_Initialize
00000000002114a8 B PyOS_InputHook
U PyRun_InteractiveLoopFlags
U PyRun_SimpleStringFlags
We observe that PyOS_InputHook is defined in the BSS section of the plugin, yet Python's readline.so fails to find it.
The question is why, and how to fix it.
The issue is with how the main application loads the plugin: it uses dlopen() without the flag RTLD_GLOBAL.
This means that the plugin's symbols that are not currently needed (like PyOS_InputHook in this instance) are not placed in the global symbol scope, so they cannot be used to resolve references in other shared libraries loaded afterwards (like readline.so in this instance).
To fix this, the flag RTLD_GLOBAL should be used when loading the plugin.
If there is no control over the main application (as in this instance) or over how it uses dlopen(), it is still possible to "reopen" the plugin from within the plugin itself using dlopen() with the flags RTLD_NOLOAD | RTLD_GLOBAL, which makes the already-loaded library's symbols available globally for later lookups.
Doing this solves the issue.
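If patching the C++ side is inconvenient, a similar effect can sometimes be obtained from the embedded interpreter itself. This is only a sketch, under the assumption that re-opening an already-loaded library with RTLD_GLOBAL promotes its symbols to the global scope (glibc behaviour), using the plugin's soname from the link command above:
# Sketch: re-open the already-loaded plugin with RTLD_GLOBAL so that symbols
# such as PyOS_InputHook become visible to extension modules loaded later.
import ctypes
ctypes.CDLL("libMyplugin.so", mode=ctypes.RTLD_GLOBAL)
import readline  # should now resolve PyOS_InputHook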
