I'm trying to make a setup.py for cgal-bindings. To install this, the user needs to have at least a certain version of CGAL. In addition, CGAL has a few optional targets that should be built if the user has some libraries (like Eigen3). Is there a cross-platform way in Python to check for this?
I can use find_library from ctypes.util to check whether the library exists, but I don't see any easy way to get its version. Note that this doesn't even work in all cases: some libraries, like Eigen3, are header-only C++ template libraries and have no shared object to find.
Using the install_requires argument of setup() only works for Python libraries and CGAL is a C/C++ library.
Deciding whether a particular extension module should be compiled, depending on the availability of some library (version), can be accomplished by dynamically generating the ext_modules argument of setup() in setup.py.
For the _yaml.so module of ruamel.yaml, which should only be compiled when the libyaml development libraries are installed on the system, I do:
import os
from textwrap import dedent
from setuptools import Extension

def check_extensions():
    """check if the C module can be built by trying to compile a small
    program against the libyaml development library"""
    import tempfile
    import shutil

    import distutils.sysconfig
    import distutils.ccompiler
    from distutils.errors import CompileError, LinkError

    libraries = ['yaml']

    # write a temporary .c file to compile
    c_code = dedent("""
    #include <yaml.h>

    int main(int argc, char* argv[])
    {
        yaml_parser_t parser;
        parser = parser;  /* prevent warning */
        return 0;
    }
    """)
    tmp_dir = tempfile.mkdtemp(prefix='tmp_ruamel_yaml_')
    bin_file_name = os.path.join(tmp_dir, 'test_yaml')
    file_name = bin_file_name + '.c'
    with open(file_name, 'w') as fp:
        fp.write(c_code)

    # and try to compile it
    compiler = distutils.ccompiler.new_compiler()
    assert isinstance(compiler, distutils.ccompiler.CCompiler)
    distutils.sysconfig.customize_compiler(compiler)

    try:
        compiler.link_executable(
            compiler.compile([file_name]),
            bin_file_name,
            libraries=libraries,
        )
    except CompileError:
        print('libyaml compile error')
        ret_val = None
    except LinkError:
        print('libyaml link error')
        ret_val = None
    else:
        ret_val = [
            Extension(
                '_yaml',
                sources=['ext/_yaml.c'],
                libraries=libraries,
            ),
        ]
    shutil.rmtree(tmp_dir)
    return ret_val
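A minimal sketch of how this hooks into setup() (package metadata abbreviated):

from setuptools import setup

setup(
    name='ruamel.yaml',
    # ... more metadata ...
    ext_modules=check_extensions(),  # a list enables the extension, None skips it
)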
This way you require no extra files in the distribution. Even if you cannot make compilation fail based on the version number at compile time, you should be able to run the resulting program from the temporary directory and check its exit value and/or output.
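For example, a version probe for a header-only library like Eigen3 could compile and run a small program that prints the version macros. A sketch along those lines; the EIGEN_* macros are real, but the include-path handling and the minimum version here are assumptions for illustration:

import subprocess
from textwrap import dedent

version_code = dedent("""
    #include <Eigen/Core>
    #include <stdio.h>

    int main(void)
    {
        printf("%d.%d.%d\\n", EIGEN_WORLD_VERSION,
               EIGEN_MAJOR_VERSION, EIGEN_MINOR_VERSION);
        return 0;
    }
    """)

# compile and link exactly as in check_extensions() above
# (note: Eigen is C++, so the probe file needs a .cpp suffix, and you may have
# to pass include_dirs=[...] to compiler.compile()); then:
output = subprocess.check_output([bin_file_name]).decode().strip()
eigen_version = tuple(int(part) for part in output.split('.'))
have_eigen3 = eigen_version >= (3, 2, 0)  # required minimum: an assumption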
Related
I am using Python 3.6. I have created a C++ extension using pybind11 (https://github.com/pybind/pybind11). I copied the compiled *.pyd file, along with the dependent dll, to site-packages. But when I try to load any function from the external DLL, Python complains that the function is not present. To access the function, I need to write
sys.path.append(r'C:\Users\test\AppData\Local\Programs\Python\Python36\Lib\site-packages\CppProject')
or I need to add the same path to the PYTHONPATH environment variable.
Why is Python not able to load the function even though it is in the same path as the pyd? I don't want to append to sys.path every time I need to use the module, or rely on the environment variable. Is there any way to avoid this, and to add this path to sys.path automatically whenever the user imports the module?
Example:
CppExport.dll
#include <math.h>   /* for pow */

#ifdef CPPEXPORT_EXPORTS
#define CPPEXPORT_API __declspec(dllexport)
#else
#define CPPEXPORT_API __declspec(dllimport)
#endif

extern "C" CPPEXPORT_API double sin_impl(double x);

const double e = 2.7182818284590452353602874713527;

double sin_impl(double x){
    return (1 - pow(e, (-2 * x))) / (2 * pow(e, -x));
}
CppProject.pyd
#include <pybind11/pybind11.h>

extern "C" double sin_impl(double x);  // from CppExport.dll

PYBIND11_MODULE(CppProject, m) {
    m.def("sin_impl", &sin_impl, R"pbdoc(
        Compute a hyperbolic tangent of a single argument expressed in radians.
    )pbdoc");

#ifdef VERSION_INFO
    m.attr("__version__") = VERSION_INFO;
#else
    m.attr("__version__") = "dev";
#endif
}
Setup.py
from setuptools import setup
import os
import sys
from setuptools.dist import Distribution
from distutils.sysconfig import get_python_lib

relative_site_packages = get_python_lib().split(sys.prefix + os.sep)[1]
data_files_relative_path = os.path.join(relative_site_packages, "CppProject")

class BinaryDistribution(Distribution):
    """Distribution which always forces a binary package with platform name"""
    def has_ext_modules(self):
        return True

setup(
    name='CppProject',
    version='1.0',
    description='CppProject Library',
    packages=['CppProject'],
    package_data={
        'CppProject': ['CppProject.pyd'],
    },
    data_files=[(data_files_relative_path, ["CppExport.dll"])],
    distclass=BinaryDistribution,
)
In Python:
from CppProject import sin_impl
Error:
ImportError: cannot import name 'sin_impl'
The full code is present on GitHub.
Sorry for the previous reply, here is some better advice:
You want to distribute your library; to do so you need to create a setup.py and an __init__.py. Once this is done you will be able to install your package using python setup.py install.
For me the setup.py looks like:
README_rst = ''
from distutils.core import setup

with open('README.rst', mode='r', encoding='utf-8') as fd:
    README_rst = fd.read()

setup(
    name='MyStack',
    version='0.0.1',
    description='Cool short description',
    author='Author',
    author_email='author@mail.com',
    url='repo.com',
    packages=['Pkg'],
    long_description=README_rst,
    include_package_data=True,
    classifiers=[
        # Trove classifiers
        # The full list is here: https://pypi.python.org/pypi?%3Aaction=list_classifiers
        'Development Status :: 3 - Alpha',
    ]
)
In the __init__.py you will have to find your library and import it. Here is an example of how PyQt does it:
def find_qt():
    import os

    path = os.environ['PATH']

    dll_dir = os.path.dirname(__file__) + '\\Qt\\bin'
    if os.path.isfile(dll_dir + '\\Qt5Core.dll'):
        path = dll_dir + ';' + path
        os.environ['PATH'] = path
    else:
        for dll_dir in path.split(';'):
            if os.path.isfile(dll_dir + '\\Qt5Core.dll'):
                break
        else:
            raise ImportError("unable to find Qt5Core.dll on PATH")

    try:
        os.add_dll_directory(dll_dir)
    except AttributeError:
        pass

find_qt()
del find_qt
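Adapted to the CppProject package from the question, a minimal sketch (the DLL name is taken from the question; everything else is an assumption):

def _find_cpp_export():
    import os

    dll_dir = os.path.dirname(__file__)
    if not os.path.isfile(os.path.join(dll_dir, 'CppExport.dll')):
        raise ImportError("unable to find CppExport.dll next to the package")
    os.environ['PATH'] = dll_dir + ';' + os.environ['PATH']
    try:
        os.add_dll_directory(dll_dir)  # only exists on Python 3.8+ (Windows)
    except AttributeError:
        pass

_find_cpp_export()
del _find_cpp_export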
Hope this helps.
The fact that your code works when you explicitly add the directory to sys.path is the key to understand what's happening.
Since site-packages is one of the locations searched by the interpreter when importing modules, this statement:
from CppProject import sin_impl
is actually searching for a module named sin_impl inside the CppProject folder.
Instead you should do:
from CppProject.CppProject import sin_impl
which points to the actual module of the same name.
This actually doesn't require the presence of __init__.py inside CppProject to qualify it as a Python package, since Python 3.3+ implements implicit namespace packages.
However, when you are building a complex program with many dependencies, the package constructor (__init__.py) enables you to add initialization to be performed before any regular module is executed.
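For instance, to keep the shorter from CppProject import sin_impl working, the package's __init__.py can simply re-export the symbol (a one-line sketch):

# CppProject/__init__.py
from .CppProject import sin_impl  # lift the function from the extension module to package level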
I'm building a C Python extension which makes use of a "third party" library, in this case one that I've built using a separate build process and toolchain. Call this library libplumbus.dylib.
Directory structure would be:
grumbo/
include/
plumbus.h
lib/
libplumbus.so
grumbo.c
setup.py
My setup.py looks approximately like:

from setuptools import Extension, setup

native_module = Extension(
    'grumbo',
    define_macros=[('MAJOR_VERSION', '1'),
                   ('MINOR_VERSION', '0')],
    sources=['grumbo.c'],
    include_dirs=['include'],
    libraries=['plumbus'],
    library_dirs=['lib'])

setup(
    name='grumbo',
    version='1.0',
    ext_modules=[native_module])
Since libplumbus is an external library, when I run import grumbo I get:
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
ImportError: dlopen(/path/to/grumbo/grumbo.cpython-37m-darwin.so, 2): Library not loaded: lib/libplumbus.dylib
  Referenced from: /path/to/grumbo/grumbo.cpython-37m-darwin.so
  Reason: image not found
What's the simplest way to set things up so that libplumbus is included with the distribution and properly loaded when grumbo is imported? (Note that this should work with a virtualenv).
I have tried adding lib/libplumbus.dylib to package_data, but this doesn't work, even if I add -Wl,-rpath,@loader_path/grumbo/lib to the Extension's extra_link_args.
The goal of this post is to have a setup.py which would create a source distribution. That means after running
python setup.py sdist
the resulting dist/grumbo-1.0.tar.gz could be used for installation via
pip install grumbo-1.0.tar.gz
We will start with a setup.py for Linux/MacOS, and then tweak it to work for Windows as well.
The first step is to get the additional data (includes/library) into the distribution. I'm not sure it is really impossible to add data for a module, but setuptools offers functionality to add data for packages, so let's make a package from your module (which is probably a good idea anyway).
The new structure of package grumbo looks as follows:
src/
  grumbo/
    __init__.py   # empty
    grumbo.c
    include/
      plumbus.h
    lib/
      libplumbus.so
setup.py
and changed setup.py:
from setuptools import setup, Extension, find_packages

native_module = Extension(
    name='grumbo.grumbo',
    sources=["src/grumbo/grumbo.c"],
)

kwargs = {
    'name': 'grumbo',
    'version': '1.0',
    'ext_modules': [native_module],
    'packages': find_packages(where='src'),
    'package_dir': {"": "src"},
}

setup(**kwargs)
It doesn't do much yet, but at least our package can be found by setuptools. The build fails because the includes are missing.
Now let's add the needed includes from the include-folder to the distribution via package-data:
...
kwargs = {
    ...,
    'package_data': {'grumbo': ['include/*.h']},
}
...
With that, our include files are copied into the source distribution. However, because it will be built "somewhere" we don't know yet, adding include_dirs = ['include'] to the Extension definition just doesn't cut it.
There must be a better (and less brittle) way to find the right include path, but this is what I came up with:
...
import os
import sys
import sysconfig

def path_to_build_folder():
    """Returns the name of a distutils build directory"""
    f = "{dirname}.{platform}-{version[0]}.{version[1]}"
    dir_name = f.format(dirname='lib',
                        platform=sysconfig.get_platform(),
                        version=sys.version_info)
    return os.path.join('build', dir_name, 'grumbo')

native_module = Extension(
    ...,
    include_dirs=[os.path.join(path_to_build_folder(), 'include')],
)
...
Now the extension is built, but it cannot yet be loaded because it is not linked against the shared object libplumbus.so, and thus some symbols are unresolved.
Similar to the header files, we can add our library to the distribution:
kwargs = {
    ...,
    'package_data': {'grumbo': ['include/*.h', 'lib/*.so']},
}
...
and add the right lib-path for the linker:
...
native_module = Extension(
    ...,
    libraries=['plumbus'],
    library_dirs=[os.path.join(path_to_build_folder(), 'lib')],
)
...
Now, we are almost there:

- the extension is built and put into site-packages/grumbo/
- the extension depends on libplumbus.so, as can be seen with the help of ldd
- libplumbus.so is put into site-packages/grumbo/lib
However, we still cannot import the extension, as import grumbo.grumbo leads to
ImportError: libplumbus.so: cannot open shared object file: No such file or directory

because the loader cannot find the needed shared object, which resides in the folder ./lib relative to our extension. We could use rpath to "help" the loader:
...
native_module = Extension(
    ...,
    extra_link_args=["-Wl,-rpath=$ORIGIN/lib/."],
)
...
And now we are done:
>>> import grumbo.grumbo
# works!
Also building and installing a wheel should work:
python setup.py bdist_wheel
and then:
pip install grumbo-1.0-xxxx.whl
The first milestone is achieved. Now we extend it so it works on other platforms as well.
Same source distribution for Linux and Macos:
To be able to install the same source distribution on Linux and MacOS, both versions of the shared library (for Linux and MacOS) must be present. An option is to add a suffix to the names of the shared objects: e.g. having libplumbus.linux.so and libplumbus.macos.so. The right shared object can be picked in the setup.py depending on the platform:
...
import platform

def pick_library():
    my_system = platform.system()
    if my_system == 'Linux':
        return "plumbus.linux"
    if my_system == 'Darwin':
        return "plumbus.macos"
    if my_system == 'Windows':
        return "plumbus"
    raise ValueError("Unknown platform: " + my_system)

native_module = Extension(
    ...,
    libraries=[pick_library()],
    ...
)
Tweaking for Windows:
On Windows, dynamic libraries are dlls and not shared objects, so there are some differences that need to be taken into account:

- when the C extension is built, it needs the plumbus.lib file, which we need to put into the lib subfolder.
- when the C extension is loaded at run time, it needs the plumbus.dll file.
- Windows has no notion of rpath, thus we need to put the dll right next to the extension so it can be found (see also this SO-post for more details), or register its folder as sketched below.
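On Python 3.8+ an alternative is to register the dll folder from the package constructor instead of copying the dll next to the extension. A sketch (not part of the layout below, which keeps the original approach):

# src/grumbo/__init__.py
import os

if os.name == 'nt':  # Windows only
    _lib_dir = os.path.join(os.path.dirname(__file__), 'lib')
    if os.path.isdir(_lib_dir):
        os.add_dll_directory(_lib_dir)  # make plumbus.dll findable from lib/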
Taking these points into account, the folder structure should be as follows:
src/
  grumbo/
    __init__.py
    grumbo.c
    plumbus.dll            # needed for Windows
    include/
      plumbus.h
    lib/
      libplumbus.linux.so  # needed on Linux
      libplumbus.macos.so  # needed on Macos
      plumbus.lib          # needed on Windows
setup.py
There are also some changes in the setup.py. First, extending the package_data so dll and lib are picked up:
...
kwargs = {
    ...,
    'package_data': {'grumbo': ['include/*.h', 'lib/*.so',
                                'lib/*.lib', '*.dll',  # for Windows
                               ]},
}
...
Second, rpath can only be used on Linux/MacOS, thus:
def get_extra_link_args():
    if platform.system() == 'Windows':
        return []
    else:
        return ["-Wl,-rpath=$ORIGIN/lib/."]

native_module = Extension(
    ...,
    extra_link_args=get_extra_link_args(),
)
That's it!
The complete setup file (you might want to add macro definitions or similar, which I've skipped):
from setuptools import setup, Extension, find_packages
import os
import sys
import sysconfig
import platform

def path_to_build_folder():
    """Returns the name of a distutils build directory"""
    f = "{dirname}.{platform}-{version[0]}.{version[1]}"
    dir_name = f.format(dirname='lib',
                        platform=sysconfig.get_platform(),
                        version=sys.version_info)
    return os.path.join('build', dir_name, 'grumbo')

def pick_library():
    my_system = platform.system()
    if my_system == 'Linux':
        return "plumbus.linux"
    if my_system == 'Darwin':
        return "plumbus.macos"
    if my_system == 'Windows':
        return "plumbus"
    raise ValueError("Unknown platform: " + my_system)

def get_extra_link_args():
    if platform.system() == 'Windows':
        return []
    else:
        return ["-Wl,-rpath=$ORIGIN/lib/."]

native_module = Extension(
    name='grumbo.grumbo',
    sources=["src/grumbo/grumbo.c"],
    include_dirs=[os.path.join(path_to_build_folder(), 'include')],
    libraries=[pick_library()],
    library_dirs=[os.path.join(path_to_build_folder(), 'lib')],
    extra_link_args=get_extra_link_args(),
)

kwargs = {
    'name': 'grumbo',
    'version': '1.0',
    'ext_modules': [native_module],
    'packages': find_packages(where='src'),
    'package_dir': {"": "src"},
    'package_data': {'grumbo': ['include/*.h', 'lib/*.so',
                                'lib/*.lib', '*.dll',  # for Windows
                               ]},
}

setup(**kwargs)
I'm working on packaging a Python interface to a C library. The C library comes as a binary distribution tarball with headers and the compiled library. I want to make a bdist_wheel out of it, along with my built Python extensions, and the headers.
I've written a couple of distutils commands for extracting and installing the library like so:
from distutils.core import Command
from distutils.command.build import build
import os
import tarfile

class ExtractLibraryCommand(Command):
    description = 'extract library from binary distribution'

    def initialize_options(self):
        self.build_lib = None
        self.build_temp = None
        self.library_dist = os.environ.get('LIBRARY_DIST')

    def finalize_options(self):
        self.set_undefined_options('build',
                                   ('build_lib', 'build_lib'),
                                   ('build_temp', 'build_temp'))
        assert os.path.exists(self.library_dist), 'Library dist {} does not exist'.format(self.library_dist)

    def run(self):
        with tarfile.open(self.library_dist, 'r') as tf:
            tf.extractall(path=self.build_temp)

class InstallLibraryCommand(Command):
    description = 'install library from extracted distribution'

    def initialize_options(self):
        self.build_lib = None
        self.build_temp = None

    def finalize_options(self):
        self.set_undefined_options('build',
                                   ('build_lib', 'build_lib'),
                                   ('build_temp', 'build_temp'))

    def run(self):
        self.copy_tree(
            os.path.join(self.build_temp, 'my_library'),
            os.path.join(self.build_lib, 'my_package', 'my_library')
        )
Then I override the build step to include my new commands.
class BuildCommand(build):
    def run(self):
        self.run_command('extract_library')
        self.run_command('install_library')
        build.run(self)
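For run_command('extract_library') and run_command('install_library') to resolve, the commands are presumably registered via cmdclass (not shown above); a minimal sketch:

setup(
    # ... name, version, ext_modules, etc. ...
    cmdclass={
        'build': BuildCommand,
        'extract_library': ExtractLibraryCommand,
        'install_library': InstallLibraryCommand,
    },
)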
The problem is, I'm not sure how to get the path to the headers for the library to build my extensions, as they're installed to a directory specified by distutils.
from setuptools import setup, find_packages
from setuptools.extension import Extension
from Cython.Build import cythonize

extensions = [
    Extension(
        'package.library.*',
        ['package/library/*.pyx'],
        include_dirs=???,  # how do I get this path?
    ),
]

setup(
    packages=find_packages(),
    ...
    ext_modules=cythonize(extensions),
)
EDIT: To clarify, this is one setup.py script.
You can modify the extensions in InstallLibraryCommand, after the library becomes available. I'd probably also move the extraction/installation code to finalize_options instead of run, as installing the library in the build stage is somewhat late in my opinion (it makes the library unavailable in the configuration stage). Example:
class InstallLibraryCommand(Command):

    def finalize_options(self):
        ...
        with tarfile.open(self.library_dist, 'r') as tf:
            tf.extractall(path=self.build_temp)
        include_path = os.path.join(self.build_lib, 'my_package', 'my_library')
        self.copy_tree(
            os.path.join(self.build_temp, 'my_library'),
            include_path
        )
        for ext in self.distribution.ext_modules:
            ext.include_dirs.append(include_path)
After deliberating on this problem a little more, I've decided to commit the library's headers with the Cython interface, as they're part of the interface, and required for building. This way, the code documents and uses a particular fixed version, and can be distributed with compatible binaries.
I am writing a Python extension in C++. I compile it by defining a list of the constituent source files in my setup.py file, like so:
extensions = {
    'im': [
        "im/src/buffer.cpp",
        "im/src/detail.cpp",
        "im/src/gil.cpp",
        "im/src/halideimage.cpp",
        "im/src/hybrid.cpp",
        "im/src/hybridimage.cpp",
        "im/src/options.cpp",
        "im/src/pybuffer.cpp",
        "im/src/pycapsule.cpp",
        "im/src/structcode.cpp",
        "im/src/typecode.cpp",
        "im/src/module.cpp"
    ],
}
… these are used to define an instance of setuptools.Extension which is ultimately passed to the setup() function. This has all worked just fine throughout the project, until now, when I tried to add a platform-specific bit:
preview_source = (sys.platform == 'darwin') and 'im/src/plat/preview_mac.mm' or \
                 (sys.platform == 'linux') and 'im/src/plat/preview_linux.cpp' or \
                 (sys.platform == 'win32') and 'im/src/plat/preview_windows.cpp' or \
                 'im/src/plat/preview.cpp'
extensions = {
    'im': [
        "im/src/buffer.cpp",
        "im/src/detail.cpp",
        "im/src/gil.cpp",
        "im/src/halideimage.cpp",
        "im/src/hybrid.cpp",
        "im/src/hybridimage.cpp",
        "im/src/options.cpp",
        preview_source,
        "im/src/pybuffer.cpp",
        "im/src/pycapsule.cpp",
        "im/src/structcode.cpp",
        "im/src/typecode.cpp",
        "im/src/module.cpp"
    ],
}
… adding this new bit chooses the right file for compilation – but it fails to compile at all on Mac OS X. Apparently distutils/setuptools doesn’t recognize the “.mm” extension as a source file:
error: unknown file type '.mm'
I am no expert when it comes to distutils and setuptools platform-specific configuration – what’s a simple way to conditionally add this one source file to the source file list on the Mac?
I ran into the same issue; did you ever find a solution?
It looks like '.mm' is not supported in my version of distutils, but '.m' is. So I separated the C++ parts of the .mm file into a .cpp file, and created a small C header to access the .m file from that .cpp.
I'm working on a project where I ran into this problem. Here is something that I put together. It's kind of hacky, but it works.
from distutils.unixccompiler import UnixCCompiler
from setuptools import setup
from setuptools.command.build_ext import build_ext

class DarwinInteropBuildExt(build_ext):
    def initialize_options(self):
        # add support for ".mm" files
        UnixCCompiler.src_extensions.append(".mm")
        UnixCCompiler.language_map[".mm"] = "objc"

        # then intercept and patch the compile and link methods to add needed flags
        unpatched_compile = UnixCCompiler._compile
        unpatched_link = UnixCCompiler.link

        def patched_compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts):
            # define language specific compile flags here
            if ext == ".cpp":
                patched_postargs = extra_postargs + ["-std=c++17"]
            elif ext == ".mm":
                patched_postargs = extra_postargs + [
                    "-ObjC++",
                    "-fobjc-weak",
                    "-fobjc-arc",
                ]
            else:
                patched_postargs = extra_postargs
            unpatched_compile(self, obj, src, ext, cc_args, patched_postargs, pp_opts)

        def patched_link(
            self,
            target_desc,
            objects,
            output_filename,
            output_dir=None,
            libraries=None,
            library_dirs=None,
            runtime_library_dirs=None,
            export_symbols=None,
            debug=0,
            extra_preargs=None,
            extra_postargs=None,
            build_temp=None,
            target_lang=None,
        ):
            # define additional linking arguments here if needed
            existing_postargs = extra_postargs or []
            framework_postargs = [
                "-framework", "Cocoa",
                "-framework", "Metal",
                "-framework", "QuartzCore",
            ]
            unpatched_link(
                self,
                target_desc,
                objects,
                output_filename,
                output_dir,
                libraries,
                library_dirs,
                runtime_library_dirs,
                export_symbols,
                debug,
                extra_preargs,
                existing_postargs + framework_postargs,
                build_temp,
                target_lang,
            )

        UnixCCompiler._compile = patched_compile
        UnixCCompiler.link = patched_link

        super().initialize_options()

# ...
setup(
    # use the custom cmd class here
    cmdclass={"build_ext": DarwinInteropBuildExt},
)
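With this in place, the platform-conditional source list from the question should build unchanged. A sketch reusing the preview_source variable defined in the question:

from setuptools import Extension, setup

im_extension = Extension(
    'im',
    sources=[
        "im/src/buffer.cpp",
        # ... the other .cpp files ...
        preview_source,  # resolves to the .mm file on Darwin
    ],
)

setup(
    ext_modules=[im_extension],
    cmdclass={"build_ext": DarwinInteropBuildExt},
)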
I'm using ctypes to load a DLL in Python. This works great.
Now we'd like to be able to reload that DLL at runtime.
The straightforward approach would seem to be:
1. Unload DLL
2. Load DLL
Unfortunately I'm not sure what the correct way to unload the DLL is.
_ctypes.FreeLibrary is available, but private.
Is there some other way to unload the DLL?
You should be able to do it by disposing of the object:
mydll = ctypes.CDLL('...')
del mydll
mydll = ctypes.CDLL('...')
EDIT: Hop's comment is right: this unbinds the name, but garbage collection doesn't happen that quickly, and in fact I doubt it even releases the loaded library.
ctypes doesn't seem to provide a clean way to release resources; it only provides a _handle field holding the dlopen handle...
So the only way I see, a really, really non-clean way, is to dlclose the handle in a system-dependent fashion. This is very unclean, especially since ctypes keeps internal references to this handle. So unloading takes something of the form:
mydll = ctypes.CDLL('./mylib.so')
handle = mydll._handle
del mydll
while isLoaded('./mylib.so'):
    dlclose(handle)
It's so unclean that I only checked it works using:
import os
import ctypes

def isLoaded(lib):
    libp = os.path.abspath(lib)
    ret = os.system("lsof -p %d | grep %s > /dev/null" % (os.getpid(), libp))
    return (ret == 0)

def dlclose(handle):
    libdl = ctypes.CDLL("libdl.so")
    libdl.dlclose(handle)
It is helpful to be able to unload the DLL so that you can rebuild it without having to restart the session if you are using IPython or a similar workflow. Working on Windows, I have only attempted to work with the Windows DLL-related methods.
REBUILD = True
if REBUILD:
    from subprocess import call
    call('g++ -c -DBUILDING_EXAMPLE_DLL test.cpp')
    call('g++ -shared -o test.dll test.o -Wl,--out-implib,test.a')

import ctypes
import numpy

# Simplest way to load the DLL
mydll = ctypes.cdll.LoadLibrary('test.dll')

# Call a function in the DLL
print(mydll.test(10))

# Unload the DLL so that it can be rebuilt
libHandle = mydll._handle
del mydll
ctypes.windll.kernel32.FreeLibrary(libHandle)
I don't know much of the internals, so I'm not really sure how clean this is. I think that deleting mydll releases the Python resources and the FreeLibrary call tells Windows to free it. I had assumed that freeing with FreeLibrary first would have produced problems, so I saved a copy of the library handle and freed it in the order shown in the example.
I based this method on ctypes unload dll, which loaded the handle explicitly up front. The loading convention however does not work as cleanly as the simple ctypes.cdll.LoadLibrary('test.dll'), so I opted for the method shown.
Windows- and Linux-compatible minimal reproducible example from 2020
Overview of similar discussions
Here is an overview of similar discussions (from which I constructed this answer):
How can I unload a DLL using ctypes in Python?
ctypes unload dll
Unload shared library inside ctypes loaded shared library
forcing ctypes.cdll.LoadLibrary() to reload library from file
Minimal reproducible example
This is for Windows and Linux, hence two compilation scripts are given.
Tested under:
Win 8.1, Python 3.8.3 (anaconda), ctypes 1.1.0, mingw-w64 x86_64-8.1.0-posix-seh-rt_v6-rev0
Linux Fedora 32, Python 3.7.6 (anaconda), ctypes 1.1.0, g++ 10.2.1
cpp_code.cpp
extern "C" int my_fct(int n)
{
int factor = 10;
return factor * n;
}
compile-linux.sh
#!/bin/bash
g++ cpp_code.cpp -shared -o myso.so
compile-windows.cmd
set gpp="C:\Program Files\mingw-w64\x86_64-8.1.0-posix-seh-rt_v6-rev0\mingw64\bin\g++.exe"
%gpp% cpp_code.cpp -shared -o mydll.dll
PAUSE
Python code
from sys import platform
import ctypes

if platform == "linux" or platform == "linux2":
    # https://stackoverflow.com/a/50986803/7128154
    # https://stackoverflow.com/a/52223168/7128154
    dlclose_func = ctypes.cdll.LoadLibrary('').dlclose
    dlclose_func.argtypes = [ctypes.c_void_p]

    fn_lib = './myso.so'
    ctypes_lib = ctypes.cdll.LoadLibrary(fn_lib)
    handle = ctypes_lib._handle
    valIn = 42
    valOut = ctypes_lib.my_fct(valIn)
    print(valIn, valOut)
    del ctypes_lib
    dlclose_func(handle)

elif platform == "win32":  # Windows
    # https://stackoverflow.com/a/13129176/7128154
    # https://stackoverflow.com/questions/359498/how-can-i-unload-a-dll-using-ctypes-in-python
    lib = ctypes.WinDLL('./mydll.dll')
    libHandle = lib._handle
    # do stuff with lib in the usual way
    valIn = 42
    valOut = lib.my_fct(valIn)
    print(valIn, valOut)
    del lib
    ctypes.windll.kernel32.FreeLibrary(libHandle)
A more general solution (object oriented for shared libraries with dependencies)
If a shared library has dependencies, this does not necessarily work anymore (though it can, depending on the dependency). I did not investigate the details, but the mechanism looks like the following: library and dependency are both loaded; as the dependency is not unloaded, the library cannot be unloaded either.
I found that if I include OpenCV (version 4.2) in my shared library, this messes up the unloading procedure. The following example was only tested on Linux:
code.cpp
#include <opencv2/core/core.hpp>
#include <iostream>

extern "C" int my_fct(int n)
{
    cv::Mat1b mat = cv::Mat1b(10, 8, (unsigned char) 1);  // change 1 to test unloading
    return mat(0,1) * n;
}
Compile with
g++ code.cpp -shared -fPIC -Wall -std=c++17 -I/usr/include/opencv4 -lopencv_core -o so_opencv.so
Python code
from sys import platform
import ctypes

class CtypesLib:
    def __init__(self, fp_lib, dependencies=[]):
        self._dependencies = [CtypesLib(fp_dep) for fp_dep in dependencies]
        if platform == "linux" or platform == "linux2":  # Linux
            self._dlclose_func = ctypes.cdll.LoadLibrary('').dlclose
            self._dlclose_func.argtypes = [ctypes.c_void_p]
            self._ctypes_lib = ctypes.cdll.LoadLibrary(fp_lib)
        elif platform == "win32":  # Windows
            self._ctypes_lib = ctypes.WinDLL(fp_lib)
        self._handle = self._ctypes_lib._handle

    def __getattr__(self, attr):
        return self._ctypes_lib.__getattr__(attr)

    def __del__(self):
        for dep in self._dependencies:
            del dep
        del self._ctypes_lib
        if platform == "linux" or platform == "linux2":  # Linux
            self._dlclose_func(self._handle)
        elif platform == "win32":  # Windows
            ctypes.windll.kernel32.FreeLibrary(self._handle)

fp_lib = './so_opencv.so'
ctypes_lib = CtypesLib(fp_lib, ['/usr/lib64/libopencv_core.so'])
valIn = 1
ctypes_lib.my_fct.argtypes = [ctypes.c_int]
ctypes_lib.my_fct.restype = ctypes.c_int
valOut = ctypes_lib.my_fct(valIn)
print(valIn, valOut)
del ctypes_lib
Let me know if there are any issues with the code examples or the explanation given so far, or if you know a better way! It would be great if we could settle the issue once and for all.
For total cross-compatibility: I maintain a list of various dlclose() equivalents for each platform and which library to get them from. It's a bit of a long list but feel free to just copy/paste it.
import sys
import ctypes
import platform

OS = platform.system()

if OS == "Windows":  # pragma: Windows
    dll_close = ctypes.windll.kernel32.FreeLibrary

elif OS == "Darwin":
    try:
        try:
            # macOS 11 (Big Sur). Possibly also later macOS 10s.
            stdlib = ctypes.CDLL("libc.dylib")
        except OSError:
            stdlib = ctypes.CDLL("libSystem")
    except OSError:
        # Older macOSs. Not only is the name inconsistent but it's
        # not even in PATH.
        stdlib = ctypes.CDLL("/usr/lib/system/libsystem_c.dylib")
    dll_close = stdlib.dlclose

elif OS == "Linux":
    try:
        stdlib = ctypes.CDLL("")
    except OSError:
        # Alpine Linux.
        stdlib = ctypes.CDLL("libc.so")
    dll_close = stdlib.dlclose

elif sys.platform == "msys":
    # msys can also use `ctypes.CDLL("kernel32.dll").FreeLibrary()`. Not sure
    # if or what the difference is.
    stdlib = ctypes.CDLL("msys-2.0.dll")
    dll_close = stdlib.dlclose

elif sys.platform == "cygwin":
    stdlib = ctypes.CDLL("cygwin1.dll")
    dll_close = stdlib.dlclose

elif OS == "FreeBSD":
    # FreeBSD uses `/usr/lib/libc.so.7` where `7` is another version number.
    # It is not in PATH but using its name instead of its path is somehow the
    # only way to open it. The name must include the .so.7 suffix.
    stdlib = ctypes.CDLL("libc.so.7")
    dll_close = stdlib.close

else:
    raise NotImplementedError("Unknown platform.")

dll_close.argtypes = [ctypes.c_void_p]
You can then use dll_close(dll._handle) to unload a library loaded with dll = ctypes.CDLL("your-library").
This list is taken from this file. I will update the master branch every time I encounter a new platform.
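A minimal load/use/unload cycle using the dll_close resolved above (the library name is a placeholder):

dll = ctypes.CDLL("./mylib.so")  # load (use ctypes.WinDLL on Windows)
handle = dll._handle             # grab the handle before dropping references
del dll                          # drop the Python wrapper first
dll_close(handle)                # then release the OS handle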
Piotr's answer helped me, but I did run into one issue on 64-bit Windows:
Traceback (most recent call last):
...
ctypes.ArgumentError: argument 1: <class 'OverflowError'>: int too long to convert
Adjusting the argument type of the FreeLibrary call as suggested in this answer solved this for me.
Thus we arrive at the following complete solution:
import ctypes
import ctypes.wintypes

def free_library(handle):
    kernel32 = ctypes.WinDLL('kernel32', use_last_error=True)
    kernel32.FreeLibrary.argtypes = [ctypes.wintypes.HMODULE]
    kernel32.FreeLibrary(handle)
Usage:
lib = ctypes.CDLL("foobar.dll")
free_library(lib._handle)
If you need this functionality, you could write two dlls, where dll_A loads/unloads the library from dll_B. Use dll_A as a Python interface-loader and passthrough for the functions in dll_B.
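A sketch of what the Python side of that two-dll design could look like; every name here (dll_A.dll and its exports load_b, unload_b, add) is invented for illustration and would have to be implemented in dll_A itself:

import ctypes

loader = ctypes.WinDLL("./dll_A.dll")  # stays loaded for the whole session

loader.load_b()                        # dll_A calls LoadLibrary on dll_B
print(loader.add(2, 3))                # forwarded by dll_A to dll_B
loader.unload_b()                      # dll_A calls FreeLibrary on dll_B

# rebuild dll_B here, then reload it without restarting Python:
loader.load_b()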