Python.h not found using SWIG and Anaconda Python

I'm trying to compile a simple python/C example following this tutorial:
http://www.swig.org/tutorial.html
I'm on macOS using Anaconda Python. However, when I run
gcc -c example.c example_wrap.c -I/Users/myuser/anaconda/include/
I get:
example_wrap.c:130:11: fatal error: 'Python.h' file not found
# include <Python.h>
^
It seems that this problem is reported in a number of questions:
Missing Python.h while trying to compile a C extension module
Missing Python.h and impossible to find
Python.h: No such file or directory
but none seems to provide an answer specific to Anaconda on macOS.
Anyone solved this?

Use the option -I/Users/myuser/anaconda/include/python2.7 in the gcc command. (That assumes you are using Python 2.7; change the directory name to match the version of Python you are using.) You can run python-config --cflags to get the full set of recommended compilation flags:
$ python-config --cflags
-I/Users/myuser/anaconda/include/python2.7 -I/Users/myuser/anaconda/include/python2.7 -fno-strict-aliasing -I/Users/myuser/anaconda/include -arch x86_64 -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes
However, to build the extension module, I recommend using a simple setup script, such as the following setup.py, and letting distutils figure out all the compiling and linking options for you.
# setup.py
from distutils.core import setup, Extension
example_module = Extension('_example', sources=['example_wrap.c', 'example.c'])
setup(name='example', ext_modules=[example_module], py_modules=["example"])
Then you can run:
$ swig -python example.i
$ python setup.py build_ext --inplace
(Take a look at the compiler commands that are echoed to the terminal when setup.py is run.)
distutils knows about SWIG, so instead of including example_wrap.c in the list of source files, you can include example.i, and swig will be run automatically by the setup script:
# setup.py
from distutils.core import setup, Extension
example_module = Extension('_example', sources=['example.c', 'example.i'])
setup(name='example', ext_modules=[example_module], py_modules=["example"])
With the above version of setup.py, you can build the extension module with the single command
$ python setup.py build_ext --inplace
Once you've built the extension module, you should be able to use it in Python:
>>> import example
>>> example.fact(5)
120
If you'd rather not use the script setup.py, here's a set of commands that worked for me:
$ swig -python example.i
$ gcc -c -I/Users/myuser/anaconda/include/python2.7 example.c example_wrap.c
$ gcc -bundle -undefined dynamic_lookup -L/Users/myuser/anaconda/lib example.o example_wrap.o -o _example.so
Note: I'm using Mac OS X 10.9.4:
$ gcc --version
Configured with: --prefix=/Library/Developer/CommandLineTools/usr --with-gxx-include-dir=/usr/include/c++/4.2.1
Apple LLVM version 5.1 (clang-503.0.40) (based on LLVM 3.4svn)
Target: x86_64-apple-darwin13.3.0
Thread model: posix

Related

Cannot avoid Cython from compiling external C module in Python2.7 rather than Python3.x

This question is related to a couple of questions on SO, like this one and this other one. Unfortunately the solution there is not working for me so far. I have a module.pyx file which I am compiling via a setup.py file such as the following one:
# setup.py
from distutils.core import setup
from distutils.extension import Extension
from Cython.Build import cythonize

module_extension = Extension(
    name="iolif",
    sources=["/home/maurizio/Ongoing.Projects/c_libraries/dcomplex_libc.c",
             "/home/maurizio/Ongoing.Projects/c_libraries/special_functions_libc.c",
             "/home/maurizio/Ongoing.Projects/c_libraries/models/freq_cv_libc.c",
             "module.pyx"],
    libraries=['gsl', 'gslcblas', 'm'],
    # library_dirs=["lib"],
    include_dirs=["/home/maurizio/Ongoing.Projects/pycustommodules",
                  "/home/maurizio/Ongoing.Projects/c_libraries",
                  "/home/maurizio/Ongoing.Projects/c_libraries/models"]
)

setup(
    name="iolif",
    ext_modules=cythonize([module_extension])
)
From the command line, in the same directory as module.pyx, running python setup.py build_ext --inplace works fine, and the iolif.so library is produced. The issue is that I can only import this library with Python 2.7; if I attempt to import it in Python 3.x I get the well-known ImportError: dynamic module does not define module export function (PyInit_iolif).
Googling around, and as pointed out in the two questions linked above, it seems this is due to the fact that cython is looking at Python 2.7 rather than Python 3.x (which is the one I work with). Accordingly, I attempted to tell cythonize in my setup.py to use Python 3.x via:
...
setup(
    name="iolif",
    ext_modules=cythonize([module_extension],
                          compiler_directives={'language_level': "3"})
)
but it still does not work. The last compilation message indeed produces:
gcc -pthread -shared -Wl,-z,relro -Wl,--as-needed -Wl,-z,now -specs=/usr/lib/rpm/redhat/redhat-hardened-ld build/temp.linux-x86_64-2.7/pylif_io.o build/temp.linux-x86_64-2.7/home/maurizio/Ongoing.Projects/c_libraries/dcomplex_libc.o build/temp.linux-x86_64-2.7/home/maurizio/Ongoing.Projects/c_libraries/special_functions_libc.o build/temp.linux-x86_64-2.7/home/maurizio/Ongoing.Projects/c_libraries/models/freq_cv_libc.o -L/usr/lib64 -lgsl -lgslcblas -lm -lpython2.7 -o /home/maurizio/Ongoing.Projects/DePitta.PNAS/Software/LIF.Analysis/iolif.so
where you can see that it is still linking with the -lpython2.7 library (whereas it should use for example -lpython3.7m). How do I solve it? What am I missing?
Easy solution: my python command was still associated with Python 2.7 (I recently moved to Python 3.x). Sorry about that. Hence:
python3 setup.py build_ext --inplace
will do the trick. Indeed, the compilation now reads:
gcc -pthread -shared -Wl,-z,relro -Wl,--as-needed -Wl,-z,now -g build/temp.linux-x86_64-3.7/pylif_io.o build/temp.linux-x86_64-3.7/home/maurizio/Ongoing.Projects/c_libraries/dcomplex_libc.o build/temp.linux-x86_64-3.7/home/maurizio/Ongoing.Projects/c_libraries/special_functions_libc.o build/temp.linux-x86_64-3.7/home/maurizio/Ongoing.Projects/c_libraries/models/freq_cv_libc.o -L/usr/lib64 -lgsl -lgslcblas -lm -lpython3.7m -o /home/maurizio/Ongoing.Projects/DePitta.PNAS/Software/LIF.Analysis/iolif.cpython-37m-x86_64-linux-gnu.so
as required.
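An easy way to catch this earlier (my addition, not part of the original answer) is to have the setup script report which interpreter is actually running it:
# Sketch: print the interpreter executing setup.py; if this shows 2.7, the
# extension will be linked against libpython2.7 regardless of any
# compiler_directives passed to cythonize.
import sys
print(sys.executable)    # path of the running interpreter
print(sys.version_info)  # e.g. (3, 7, ...) vs (2, 7, ...)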

Python3 shared extension doesn't link against library dependency

I'm creating a shared Python extension for my library and I'm using distutils to build it.
These are the relevant sections of my setup.py:
import distutils.core as dc
from os.path import join as path_join

module = dc.Extension(module_name,
                      sources=[path_join(meson_src_root, "py3_bindings", "module.c")],
                      include_dirs=[path_join(meson_src_root, "include")],
                      libraries=["bbmputil"],
                      runtime_library_dirs=[meson_build_root])

dc.setup(name=module_name,
         version=module_version,
         description="Python3 bindings for the bbmp_utils library",
         ext_modules=[module])
Running $ setup.py build results in the shared extension module being built successfully, but it isn't linked against the bbmputil library.
$ ldd build/lib.linux-x86_64-3.8/bbmp_utils.cpython-38-x86_64-linux-gnu.so
linux-vdso.so.1 (0x00007ffc85ce1000)
libc.so.6 => /usr/lib/libc.so.6 (0x00007f49f0d70000)
/usr/lib64/ld-linux-x86-64.so.2 (0x00007f49f0f74000)
libbbmputil.so is nowhere to be found, despite being specified in the libraries kwarg of Extension().
It does exist in the location specified in the runtime_library_dirs kwarg.
This leads to the Python interpreter raising an ImportError when a symbol from the missing library is referenced in the extension:
$ env PYTHONPATH="sharedextension_build_path" python3
>>> import bbmp_utils
ImportError: /home/bogdan/dev/bbmp_utils/build_dbg/build/lib.linux-x86_64-3.8/bbmp_utils.cpython-38-x86_64-linux-gnu.so: undefined symbol: bbmp_vertflip
where bbmp_vertflip is a symbol defined in the library that doesn't seem to be linked for some reason.
The two C compiler invocations look as follows:
gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O3 -Wall -march=x86-64 -mtune=generic -O3 -pipe -fno-plt -march=x86-64 -mtune=generic -O3 -pipe -fno-plt -march=x86-64 -mtune=generic -O3 -pipe -fno-plt -fPIC -I/home/bogdan/dev/bbmp_utils/include -I/usr/include/python3.8 -c /home/bogdan/dev/bbmp_utils/py3_bindings/module.c -o build/temp.linux-x86_64-3.8/home/bogdan/dev/bbmp_utils/py3_bindings/module.o
gcc -pthread -shared -Wl,-O1,--sort-common,--as-needed,-z,relro,-z,now -Wl,-O1,--sort-common,--as-needed,-z,relro,-z,now build/temp.linux-x86_64-3.8/home/bogdan/dev/bbmp_utils/py3_bindings/module.o -L/usr/lib -Wl,--enable-new-dtags,-R/home/bogdan/dev/bbmp_utils/build_dbg -lbbmputil -o build/lib.linux-x86_64-3.8/bbmp_utils.cpython-38-x86_64-linux-gnu.so
In the second invocation, both -lbbmputil and -R are passed properly when building the shared extension, so I'm out of ideas.
Minimal example producing the same behavior
Attempting to build a module that utilizes functions and other symbols from the math shared library:
#!/usr/bin/env python3
import distutils.core as dc

module = dc.Extension('example',
                      sources=['example.c'],
                      libraries=['m'])

dc.setup(name='example',
         version='0.1',
         ext_modules=[module])
$ ./setup.py build
$ ldd .../.../example.cpython-38-x86_64-linux-gnu.so
linux-vdso.so.1 (0x00007ffd0b9e5000)
libc.so.6 => /usr/lib/libc.so.6 (0x00007fab528e8000)
/usr/lib64/ld-linux-x86-64.so.2 (0x00007fab52aec000)
Again, the libm.so dependency is nowhere to be found.
Environment:
python3 3.8.1
linux 5.4.6
gcc 9.2.0
ld 2.33.1
ldd 2.3.0
UPDATE: The problem in this case is the linker optimization option --as-needed, which is enabled by default; see Missing a library in ldd after using gcc -l. Adding --no-as-needed fixes this error.
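One way to pass that flag through distutils is sketched below (my assumption, not from the update above: flags injected via the LDFLAGS environment variable are appended to the link driver command, so they land before the -l options, which matters because --no-as-needed only affects libraries named after it):
#!/usr/bin/env python3
# Sketch: disable the linker's --as-needed default so the DT_NEEDED entry
# for libm is kept even when the linker believes it is unused.
import os
os.environ['LDFLAGS'] = os.environ.get('LDFLAGS', '') + ' -Wl,--no-as-needed'

import distutils.core as dc

module = dc.Extension('example',
                      sources=['example.c'],
                      libraries=['m'])

dc.setup(name='example',
         version='0.1',
         ext_modules=[module])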
For debugging linker errors you can use LD_DEBUG=files,libs /usr/local/ABC/bin/ABC, where ABC is the executable that throws linker errors at runtime; cf. http://www.bnikolic.co.uk/blog/linux-ld-debug.html and libm.so.6: cannot open shared object file: No such file or directory. On Linux you can locate a .so with, e.g., locate libm (I think you know this).
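As a stopgap while debugging (my suggestion, not part of the original answer), you can also preload the dependency with global symbol visibility before importing the extension; the import then succeeds even though the extension itself lacks the DT_NEEDED entry:
# Sketch of a preload workaround; the library name comes from the question and
# must be findable through the normal dynamic-loader search path.
import ctypes
ctypes.CDLL('libbbmputil.so', mode=ctypes.RTLD_GLOBAL)  # export its symbols globally
import bbmp_utils  # bbmp_vertflip is now resolved against the preloaded library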
As the linking is dynamic, one option is to specify the path where your .so files can be found using the library_dirs option of distutils.core; that corresponds to the -L gcc linker option (the run-time analogue being LD_LIBRARY_PATH), and for debugging and testing I would use the absolute path (https://docs.python.org/2/distutils/apiref.html).
In your Python minimal example the code is then:
#!/usr/bin/env python3
import distutils.core as dc

module = dc.Extension('example',
                      sources=['example.c'],
                      # library_dirs takes directories, not the .so file itself
                      library_dirs=['/usr/lib/x86_64-linux-gnu'],
                      libraries=['m'])

dc.setup(name='example',
         version='0.1',
         ext_modules=[module])
You use the -R linker flag to specify the rpath in your gcc invocation, cf. Shared library dependencies with distutils and What does the gcc -R parameter do?. A description of the linking process is given in https://www.mpcdf.mpg.de/services/computing/software/libraries/static-and-dynamic-linking-on-linux-systems. It is said there that LD_LIBRARY_PATH (or, at link time, the -L gcc linker option) overrides the rpath and that it should be avoided, however you should give it a try anyway ...
Another possibility for this behavior could be permission problems, i.e. when you execute example, does it have permission to access libm? Cf. https://unix.stackexchange.com/questions/303292/permission-denied-on-some-shared-libraries. A quick check is sketched below.
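A quick way to run that check from Python (the path below is hypothetical; adjust it to where your libm actually lives):
# Sketch: verify the current user can read the shared library at all.
import os
lib = '/usr/lib/x86_64-linux-gnu/libm.so.6'  # hypothetical location
print(os.access(lib, os.R_OK))  # False would point to a permission problem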

python setup.py - how to remove compiler option HP-UX

I'm new to Python, and I am trying to compile a Python module on HP-UX 11.31.
% CC=gcc python setup.py build
unix
running build
running build_ext
building 'pygoldilocks' extension
gcc -DNDEBUG -O +z -DPYGOLDILOCKS_VERSION=1.0
gcc: +z: not recognized ...
This produces the '+z' not recognized error. How do I solve it? Can I remove the option '+z'?
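One possible approach, offered as an untested sketch: '+z' is the HP aCC flag for position-independent code, and distutils picks it up from the Python build configuration, so a custom build_ext can swap it for gcc's equivalent -fPIC (the extension name and source list below are placeholders):
# setup.py -- sketch: replace the HP-specific '+z' flag with gcc's -fPIC by
# patching the compiler's command templates in a custom build_ext.
from distutils.command.build_ext import build_ext
from distutils.core import setup, Extension

class FilterPlusZ(build_ext):
    def build_extensions(self):
        for attr in ('compiler', 'compiler_so', 'linker_so'):
            cmd = getattr(self.compiler, attr, None)
            if cmd:
                setattr(self.compiler, attr,
                        ['-fPIC' if flag == '+z' else flag for flag in cmd])
        build_ext.build_extensions(self)

setup(name='pygoldilocks',
      cmdclass={'build_ext': FilterPlusZ},
      ext_modules=[Extension('pygoldilocks', sources=['pygoldilocks.c'])])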

Using cython to cross compile project from intel ubuntu to arm

I have a simple python + cython project (the hello world example from http://docs.cython.org/src/tutorial/cython_tutorial.html) on my Ubuntu 16 x86_64. I can build this project with cython for x86_64.
How can I build the project for armv7 version of ubuntu 15 without using real armv7 board/cpu?
I have arm-linux-gnueabihf-gcc (http://packages.ubuntu.com/xenial/devel/gcc-arm-linux-gnueabihf) and it can compile simple C programs for armv7. How can I change settings of cython to use cross compiler for building shared objects for arm?
Architecture-dependent libraries and header files are needed for cross compiling.
When testing whether the python3.5-dev package and others could be installed after dpkg --add-architecture armhf and apt-get update (after some modifications to sources.list), the result was basically:
python3.5-dev:armhf : Depends: python3.5:armhf (= 3.5.1-10) but it is not going to be installed
apt-get install python3.5:armhf is something that doesn't work; see:
The existing proposals allow for the co-installation of libraries and
headers for different architectures, but not (yet) binaries.
One possible solution that does not require a "full" virtual machine is provided by QEMU and chroot. A suitable directory for the chroot can be created by the debootstrap command. After creation, schroot can give access to that environment.
Substitute <DIRECTORY> and <USER> in the following commands:
apt-get install -y debootstrap qemu-user-static binfmt-support schroot
debootstrap --arch=armhf --foreign --include=gcc,g++,python3.5-dev xenial <DIRECTORY>
cp /usr/bin/qemu-arm-static <DIRECTORY>/usr/bin
chroot <DIRECTORY>
/debootstrap/debootstrap --second-stage
echo "deb http://ports.ubuntu.com/ubuntu-ports xenial universe" >> /etc/apt/sources.list
echo "deb http://ports.ubuntu.com/ubuntu-ports xenial multiverse" >> /etc/apt/sources.list
apt-get update
apt-get install -y cython cython3
exit
cat <<END > /etc/schroot/chroot.d/xenial-armhf
[xenial-armhf]
description=Ubuntu xenial armhf
type=directory
directory=/home/xenial-armhf
groups=sbuild,root
root-groups=sbuild,root
users=root,<USER>
END
The environment should be accessible by
schroot -c chroot:xenial-armhf
and for a root user session (the user must be in a group listed in root-groups):
schroot -c chroot:xenial-armhf -u root
After this, it is also possible to cross compile a cython module:
hello.pyx:
print("hello world")
compiling (run python3.5-config --cflags and python3.5-config --libs in the chroot for the options; note -fPIC):
cython hello.pyx
arm-linux-gnueabihf-gcc --sysroot <DIRECTORY> -I/usr/include/python3.5m -I/usr/include/python3.5m -Wno-unused-result -Wsign-compare -g -fstack-protector-strong -Wformat -Werror=format-security -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC -c hello.c
arm-linux-gnueabihf-gcc --shared --sysroot <DIRECTORY> -lpython3.5m -lpthread -ldl -lutil -lm hello.o -o hello.so
The module can then be tested:
schroot -c chroot:xenial-armhf
python3
import hello
Cross compiling cython based python modules may also work. With setup.py:
from distutils.core import setup
from distutils.extension import Extension
from Cython.Distutils import build_ext
import os

os.environ['CC'] = 'arm-linux-gnueabihf-gcc'
os.environ['LDSHARED'] = 'arm-linux-gnueabihf-gcc -shared'

sysroot_args = ['--sysroot', '/path/to/xenial-armhf']

setup(cmdclass={'build_ext': build_ext},
      ext_modules=[Extension("hello", ["hello.pyx"],
                             extra_compile_args=sysroot_args,
                             extra_link_args=sysroot_args)])
Building a simple hello world module was possible this way. The file name of the module was wrong, though: in this case it was hello.cpython-35m-x86_64-linux-gnu.so, since the suffix comes from the host interpreter. After renaming it to hello.so it was possible to import it.
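The rename can also be scripted; a small sketch (my addition, assuming the build ran under the host Python whose EXT_SUFFIX produced the wrong name):
# Sketch: rename the freshly built module from the host's EXT_SUFFIX name
# (e.g. hello.cpython-35m-x86_64-linux-gnu.so) to plain hello.so.
import os
import sysconfig

built = 'hello' + sysconfig.get_config_var('EXT_SUFFIX')
if os.path.exists(built):
    os.rename(built, 'hello.so')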

Cannot get Cython to find the MinGW gcc compiler even after editing PATH, making a file in distutils, removing all instances of -mno-cygwin

I am trying to get cython to realize I have a C compiler in MinGW 32-bit, and I've tried everything I can find on the web, but it's still not working. I am running Windows 7 Professional 64-bit. Here is what I have tried:
(1) I have Python 2.7 and I just installed MinGW with options gcc and g++ and some other options
(2) I edited the PATH environmental variable so it includes
C:\MinGW\bin;C:\MinGW\MSYS\1.0\local\bin;C:\MinGW\MSYS\1.0\bin
(3) I told Python to use MinGW as the default compiler by creating a file named
C:\Python27\Lib\distutils\distutils.cfg, containing
[build]
compiler = mingw32
(I do have MinGW32 by the way)
(4) I removed all instances of -mno-cygwin from the file C:\Python27\Lib\distutils\cygwincompiler.py
(5) I have a file called setup.py and a module called tryingcython.pyx that is written in Python. My setup.py says:
from distutils.core import setup
from distutils.extension import Extension
from Cython.Distutils import build_ext
setup(
    cmdclass={'build_ext': build_ext},
    ext_modules=[Extension("tryingcython", ["tryingcython.pyx"])]
)
So then I open Command Prompt and get into the directory that contains setup.py and tryingcython.pyx, and I type
python setup.py build_ext --inplace --compiler=mingw32
Then it tells me:
running build_ext
skipping 'tryingcython.c' Cython extension (up-to-date)
building 'tryingcython' extension
gcc -mdll -O -Wall -IC:\Python27\include -IC:\Python27\PC -c tryingcython.c -o build\temp.win32-2.7\Release\tryingcython.o
error: command 'gcc' failed: No such file or directory
So I guess Cython can't tell that I have gcc, or it can't find it, even though I've tried about every single piece of advice I can find online for making it realize that I have MinGW, which includes gcc.
Any help/additional ideas on how I can get cython to actually work would be much appreciated.
You are using exactly the same operating system and versions as me.
Try calling gcc using:
SET input=input.c
SET output=output.pyd
gcc -shared -IC:\Python27\include -LC:\Python27\libs -O2 -o %output% %input% -lpython27
Usually I put this call in a cythongcc.bat file, in a directory recognized by the PATH environment variable, as:
gcc -shared -IC:\Python27\include -LC:\Python27\libs -O3 -mtune=native -o %1.pyd %2.c -lpython27
So that, from where my cython .pyx files are, I can just do:
cython input.pyx
cythongcc input input
To get the compiled .pyd working!
