I have a large C/C++ project which I usually just build as a standalone executable. It also requires some libraries (system libraries like pthread, but also MKL from Intel, for example) which I specify when compiling the project.
Now, I want to use some functions from this project in Python. So my first plan was to simply build the project as a static library and then use that to write a cython wrapper. I.e.
Build the C project: icc -c src/main.cpp .. options and stuff.. linking to all libraries needed .. -o main.o
Create a static library (including the libraries I needed in step 1): ar -rc libmain.a main.o ${MKLROOT}/lib/intel64/libmkl_intel_lp64.a ... /usr/lib/x86_64-linux-gnu/libpthread.a ...
I use the generated static library to build my Cython wrapper.
When I try to execute a test Python script calling a random function from my C program, I get this error: ImportError: /home/.../main.cpython-37m-x86_64-linux-gnu.so: undefined symbol: __kmpc_ok_to_fork
It seems like I need to tell Cython to link against the corresponding libraries again, but I'm not really sure how to resolve this. Also, I don't want the Python project to depend on special libraries installed on my system (that's what I'm trying to achieve by adding all the libraries in step 2). Is this possible at all?
My setup.py looks like this:
from setuptools import setup, Extension
from Cython.Build import cythonize
import os
os.environ["CC"] = "icc"
os.environ["CXX"] = "icc"
setup(
    ext_modules = cythonize([Extension("test", ["test.pyx"], language="c++",
                                       library_dirs=[r'.'], libraries=['main'])])
)
Create a shared library (.so on Linux). Python only supports dynamic/shared libraries for extension modules; the loader is looking for a .so, not a .a.
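For example, a sketch of that link step with icc (the flags are illustrative, not from the original post; -qopenmp links in the Intel OpenMP runtime that provides __kmpc_ok_to_fork, and -mkl pulls in MKL):
icc -shared -fPIC src/main.cpp -qopenmp -mkl=parallel -o libmain.so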
So your setup.py would remain the same:
from setuptools import setup, Extension
from Cython.Build import cythonize
import os
os.environ["CC"] = "icc"
os.environ["CXX"] = "icc"
setup(
    ext_modules = cythonize([Extension("test", ["test.pyx"], language="c++",
                                       library_dirs=[r'.'], libraries=['main'])])
)
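If you keep the static libmain.a route instead, note that ar -rc libmain.a main.o libmkl_intel_lp64.a ... stores the MKL archives as nested members; the linker does not look inside nested .a files, so their symbols still go missing. One way around this (a sketch, assuming MKLROOT is set and libiomp5, Intel's OpenMP runtime and the home of __kmpc_ok_to_fork, sits in the chosen library directory) is to name the runtime libraries again when the extension is linked:
import os
from setuptools import setup, Extension
from Cython.Build import cythonize

os.environ["CC"] = "icc"
os.environ["CXX"] = "icc"

# The directory containing libiomp5 varies between Intel releases;
# adjust for your installation.
mkl_lib = os.path.join(os.environ["MKLROOT"], "lib", "intel64")

setup(
    ext_modules=cythonize([Extension(
        "test", ["test.pyx"],
        language="c++",
        library_dirs=[".", mkl_lib],
        # Relink the runtimes the static archive depends on;
        # iomp5 (Intel's OpenMP runtime) provides __kmpc_ok_to_fork.
        libraries=["main", "iomp5", "pthread", "m", "dl"],
    )])
)
This resolves the undefined symbol at import time; it does not by itself make the module independent of the Intel runtimes on the target system.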
Related - Using Cython To Link Python To A Shared Library.
I have written a Python C-API package that uses OpenSSL to perform HTTPS requests. This builds fine using the setup below:
from distutils.core import setup, Extension
my_module = Extension('my_module',
                      sources = ['my_module.cpp'],
                      include_dirs = ['<...>/openssl-1.1/x64/include', '<...>/cpp-httplib'],
                      library_dirs = ['<...>/openssl-1.1/x64/lib'],
                      libraries = ['libssl', 'libcrypto']
                      )
setup(name = 'my_module', version = '1.0', description = '', ext_modules = [my_module])
But when I use the module (test.pyd), i.e.
import my_module
It complains that the DLLs for OpenSSL (libcrypto-1_1.dll and libssl-1_1.dll) are missing:
ImportError: DLL load failed while importing my_module: The specified module could not be found.
I could add these to the folder to make the import work, but I find this somewhat unnecessary because these DLLs can also be found in the Python installation itself (at C:\Users\<username>\AppData\Local\Programs\Python\Python38\DLLs). Is there any way I can modify the setup so that Python looks for the missing DLLs in the Python DLLs folder?
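One possible approach (a sketch): since Python 3.8 on Windows, an extension module's DLL dependencies are no longer resolved via PATH, but os.add_dll_directory() can register additional search directories before the import:
import os
import sys

# Register the interpreter's own DLLs folder (e.g.
# ...\Python38\DLLs) as a search location for dependent DLLs.
os.add_dll_directory(os.path.join(sys.prefix, "DLLs"))

import my_module  # libssl/libcrypto can now be resolved
Alternatively, the same call can go into a small wrapper package so that importing my_module does this automatically.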
I'm writing an extension for a C++ library to make it available in Python using Pybind11. The library itself depends on a couple of other C++ libraries.
What I don't get is which files I am supposed to include in my distribution package, and how. After mixing up some code from the Python Packaging Guide and Building C++ Extensions, I got the following files:
setup.py
from setuptools import setup, Extension
# from distutils.core import setup, Extension  # used this at first, switched to setuptools; didn't see the difference

src = ['module.cpp']  # plus other cpp files
include = ['MyLibrary/include']  # plus other header dirs for 3rd-party libs

module = Extension(
    'TestlibPy',
    sources = src,
    include_dirs = include,
    libraries = [],      # library names
    library_dirs = [],   # library dirs
    language = 'c++',
)

setup(
    ext_modules = [module],
)
setup.cfg
[metadata]
name = TestlibPy
version = 0.0.1
description = Python interface for Test library
classifiers =
    Programming Language :: Python :: 3
[options]
packages = find:
python_requires = >=3.6
pyproject.toml
[build-system]
requires = [
"setup tools>=42",
"wheel"
]
build-backend = "setuptools.build_meta"
After building with
py -m build
I got the bare-minimum packages that don't even include the headers (and I don't get the logic here: it builds the distribution with them in mind, from the directory provided in 'include', yet doesn't think other users will need those headers?).
So I wrote a MANIFEST.in:
graft MyLibC++/include
graft MyLib/3rdpartyLibs
After another build I get
package.tar.gz - contains everything I asked for and works on my other laptop after installation, but it's a plain, readable archive that gives away the source code to anyone who bothers to look (I obviously don't want that);
package.whl - ignores my MANIFEST.in and doesn't seem to work (is it supposed to? Did all the necessary information get into the .pyd file without me knowing any better?)
My questions are:
Is it alright to include all the 3rd party C++ libraries + pybind11 in my distribution package, or is there some better way to do things?
Should the C++ library be in .dll format, or can I get away with a bunch of .cpp files?
Can I somehow write the .hpp and 3rd-party files into the .whl package? Or should it work without them?
That's my first time working with Python, extensions and packages, so maybe I'm asking all the wrong questions. Any advice would be helpful.
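A note on the wheel questions: MANIFEST.in only controls what goes into the sdist, which is why the .whl ignores it, and a compiled extension is self-contained, so the wheel does not need the .hpp or .cpp files at runtime. If extra files should ship inside the wheel anyway, they have to live inside a package directory and be declared as package data; a minimal sketch, assuming a regular Python package exists alongside the extension:
from setuptools import setup, find_packages

setup(
    packages=find_packages(),
    # Files matched by MANIFEST.in that live *inside* a package
    # directory are copied into the wheel as well; files outside
    # any package (e.g. MyLibrary/include) never reach the wheel.
    include_package_data=True,
)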
I'm working on a project to call C++ from Python. We use Cython for this purpose. When compiling with the command "python3.6 setup.py build_ext --inplace", the compiler "x86_64-linux-gnu-gcc" is used. Is there a way to use a different compiler like "arm-linux-gnueabihf-g++"?
Also, is there a way to add a compilation option such as "-DPLATFORM=linux"?
Here's the setup.py:
from distutils.core import setup, Extension
from Cython.Build import cythonize
setup(ext_modules = cythonize(Extension(
"app",
sources=["app.pyx", "myapp.cpp"],
language="c++",
include_dirs=["../base"]
)))
Distutils by default uses the CC environment variable to decide which compiler to use. You can set CC to whatever you want, either in the shell or at the beginning of the script, before setup() is called.
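For example, a minimal sketch at the top of setup.py, using the cross-compiler named in the question:
import os

# Must run before setup() is called so distutils picks it up.
os.environ["CC"] = "arm-linux-gnueabihf-g++"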
As for passing flags to the compiler, add an extra_compile_args named argument to your Extension(). It may look like this, for example:
from distutils.core import setup, Extension
from Cython.Build import cythonize
setup(ext_modules = cythonize(Extension(
"app",
sources=["app.pyx", "myapp.cpp"],
language="c++",
include_dirs=["../base"],
extra_compile_args=["-DPLATFORM=linux"]
)))
You can set the value of the CC environment variable in setup.py. For instance:
os.environ["CC"] = "g++"
or
os.environ["CC"] = "clang++"
In order to specify a C++ compiler for Cython, you need to set a proper CXX environment variable prior to calling setup.py.
This could be done:
From the command line:
export CXX=clang++-10
In the setup.py:
os.environ["CXX"] = "clang++-10"
Note: clang++-10 is used as an example of an alternative C++ compiler.
Note: the CC environment variable specifies the C compiler, not the C++ compiler. You may want to set it as well, e.g. export CC=clang-10.
We have a bunch of C++ files with classes that we wrap for Python using Cython. We use setuptools to build the Cython extension. This all works fine; we followed the guide here:
http://cython.readthedocs.io/en/latest/src/userguide/wrapping_CPlusPlus.html
We are basically doing something like this
from distutils.core import setup
from Cython.Build import cythonize
setup(ext_modules = cythonize(
"rect.pyx", # our Cython source
sources=["Rectangle.cpp"], # additional source file(s)
language="c++", # generate C++ code
))
What we don't like about this is that we have to recompile everything, even if only the Cython part changes (rect.pyx in this example). In fact, we never touch the .cpp files, but we change the .pyx files often.
We would like to compile the .cpp files separately into a static or shared library, then build the .pyx files independently, linking against the library generated from the .cpp files. All this would be easy with make or cmake, but we want a pure Python solution that uses only setuptools. Mock-up code would look something like this:
from distutils.core import setup
from Cython.Build import cythonize
class CppLibrary:
    # somehow get that to work
    ...

# this should only recompile cpplib when source files changed
cpplib = CppLibrary('cpplib',
                    sources=["Rectangle.cpp"],  # put static cpp code here
                    include_dirs=["include"])
setup(ext_modules = cythonize(
"rect.pyx", # our Cython source
libraries=[cpplib], # link to cpplib
language="c++", # generate C++ code
))
There is a seemingly undocumented feature of setup that can do this, for example:
import os
from setuptools import setup, Extension
from Cython.Build import cythonize

ext_lib_path = 'rectangle'
include_dir = os.path.join(ext_lib_path, 'include')
sources = ['Rectangle.cpp']

# Use as macros = [('<DEFINITION>', '<VALUE>')]
# where value can be None
macros = None

ext_libraries = [['rectangle', {
    'sources': [os.path.join(ext_lib_path, src) for src in sources],
    'include_dirs': [include_dir],
    'macros': macros,
    }
]]

extensions = [Extension("rect",
                        sources=["rect.pyx"],
                        language="c++",
                        include_dirs=[include_dir],
                        libraries=['rectangle'],
                        )]

setup(ext_modules=cythonize(extensions),
      libraries=ext_libraries)
The libraries argument builds the external library (via the build_clib command, which runs before build_ext) from the sources in the rectangle directory, with the include directory rectangle/include shared between it and the extension.
I have also switched the import from distutils, which is deprecated and now part of setuptools, to setuptools.
I have not seen any documentation for this argument, but I have seen it used in other projects.
This is untested; please provide sample files for testing if it does not work.
I have a question. I would like to distribute my Cython-powered packages, but I see no easy way to build them in setup.py. I would like setup.py to:
most importantly: install my package without Cython (from pre-generated C files, or by installing Cython beforehand)
rebuild (run cythonize) the package on sdist
not need a hard-coded list of my Cython modules (just use glob or something)
be able to work without the .c files (they should not be stored in git) or the .pyx files (they might not be distributed); at least one of those sets will always be present, of course.
Currently in my itchy package, I am using this quite complicated code:
import os
from glob import glob
from distutils.command.build_ext import build_ext as _build_ext
from distutils.command.sdist import sdist as _sdist
from distutils.core import setup
from distutils.core import Extension
def generate_extensions():
    return [
        # Compile cython-generated .c files into importable .so libraries.
        Extension(os.path.splitext(name)[0], [name])
        for name in C_FILES
    ]

# In distribution version, there are no pyx files, when you clone package from git, there will be no c files.
CYTHON_FILES = glob('itchy/*.pyx')
C_FILES = glob('itchy/*.c')
extensions = generate_extensions()

class build_ext(_build_ext):
    def run(self):
        # Compile cython files (.pyx, some of the .py) into .c files if Cython is available.
        try:
            from Cython.Build import cythonize
            if CYTHON_FILES:
                cythonize(CYTHON_FILES)
                # Update C_FILES in case they were originally missing.
                global C_FILES, extensions
                C_FILES = glob('itchy/*.c')
                extensions = generate_extensions()
            else:
                print('No .pyx files found, building extensions skipped. Pre-built versions will be used.')
        except ImportError:
            print('Cython is not installed, building extensions skipped. Pre-built versions will be used.')
        assert C_FILES, 'C files have to be present in distribution or Cython has to be installed'
        _build_ext.run(self)

class sdist(_sdist):
    def run(self):
        # Make sure the compiled Cython files in the distribution are up-to-date
        self.run_command("build_ext")
        _sdist.run(self)

setup(
    (...)
    ext_modules = extensions,
    cmdclass = {
        'build_ext': build_ext,
        'sdist': sdist,
    },
)
This is usually done by attempting to import Cython and adjusting the extensions to either:
build the .pyx files with Cython if Cython is present, or
build the .c files if Cython is not present.
For example:
import os
from glob import glob
from setuptools import setup

try:
    from Cython.Distutils.extension import Extension
    from Cython.Distutils import build_ext
except ImportError:
    from setuptools import Extension
    USING_CYTHON = False
else:
    USING_CYTHON = True

ext = 'pyx' if USING_CYTHON else 'c'
sources = glob('my_module/*.%s' % (ext,))

extensions = [
    Extension(source.split('.')[0].replace(os.path.sep, '.'),
              sources=[source],
              )
    for source in sources]

cmdclass = {'build_ext': build_ext} if USING_CYTHON else {}

setup(<..>, ext_modules=extensions, cmdclass=cmdclass)
The source.split stuff is needed because cythonized extension names need to be in the form my_module.ext, while glob yields path names like my_module/ext.
See this repository for a real-world example.
You should, however, include the .c files in your git repo as well as in the distributable; otherwise, when it comes time to build a distribution, the .c files will be re-built and may or may not be the same files as were built on your machine.
They may be built by another version of Cython, for example, or on a different platform, producing different code.
Cython is a static compiler; it's recommended to commit the files it produces to the repository.
It is strongly recommended that you distribute the generated .c files as well as your Cython sources, so that users can install your module without needing to have Cython available.
See Cython documentation on distributing modules.