The Boost.Python module provides an easy way of binding C/C++ code into Python. However, most tutorials assume that bjam is used to compile this module. I was wondering: if I do not compile this module, can I still use it? By "do not compile this module" I mean including all the source files of Boost.Python in my current project. I have done this for other Boost modules. For example, when I use Boost.Filesystem, I just include all the files from that module and compile them together with the code I have written. Thanks.
Yes, absolutely, it's a library like any other.
I always use it with CMake, but anything will do. You need to (a minimal build sketch follows this list):
Add the location of the Boost headers to your include paths.
Add the location of the Python headers to your include paths (usually installed with Python; the location depends on the OS).
Link with the appropriate Boost.Python library (e.g. in my case it's boost_python-vc120-mt-1_58.lib or boost_python-vc120-mt-gd-1_58.lib; again, this depends on version/OS/toolkit).
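If you would rather drive the build from Python than from CMake or bjam, the same three steps can be expressed in a setuptools script. The following is only a minimal sketch under assumptions: the module name hello_ext, the source file hello.cpp, the paths and the boost_python library name are placeholders to adjust for your Boost version, Python version and platform.

# setup.py -- sketch of building a Boost.Python extension with setuptools.
# Paths and the 'boost_python' library name are placeholders; adjust them
# to your Boost installation and toolchain.
from setuptools import setup, Extension

hello_ext = Extension(
    "hello_ext",                          # must match BOOST_PYTHON_MODULE(hello_ext)
    sources=["hello.cpp"],
    include_dirs=["/usr/local/include"],  # Boost headers (the Python headers are added automatically)
    library_dirs=["/usr/local/lib"],      # where the Boost.Python library lives
    libraries=["boost_python"],           # e.g. 'boost_python38' on many Linux distributions
)

setup(name="hello_ext", version="0.1", ext_modules=[hello_ext])

Running python setup.py build_ext then compiles and links the module like any other C/C++ extension.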
The package I want to document with Sphinx contains pure Python modules (OK) plus a C/C++ library bound using pybind11.
There are a lot of dependencies that cannot reasonably be built on Read the Docs!
So on RTD, autodoc cannot import my library to extract the docstrings...
My first idea was to generate _build/doctrees locally and use it on RTD. But it contains binary files that are too heavy to store in the repository: no.
Is there a way to "expand" autodoc directives in RST files? It could produce the full text of the RST files, or a fake static module as a .py...
Thanks in advance for your ideas!
Mathieu
I think there are two potential ways to grab the docstrings from compiled C/C++ libraries:
Option 1
Install all the system dependencies needed to install/build the Python packages you require. This can be achieved on Read the Docs by using the build.apt_packages key in the config file (see https://docs.readthedocs.io/en/stable/config-file/v2.html).
Option 2
Try using the sphinx-autoapi extension (https://sphinx-autoapi.readthedocs.io/en/latest/) instead of autodoc. This extension does not require the dependencies to be installed, since it works statically by parsing the files directly instead of importing and inspecting them dynamically.
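As a rough illustration of Option 2, the conf.py configuration could look like the snippet below; the directory ../src is an assumption about the project layout, and autoapi_dirs should point at the Python sources (or generated .pyi stubs) you want documented.

# conf.py -- sketch of enabling sphinx-autoapi instead of autodoc.
# '../src' is an assumed location of the package sources.
extensions = [
    "autoapi.extension",
]

autoapi_type = "python"     # parse the sources statically, no import needed
autoapi_dirs = ["../src"]   # directories containing the code to document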
I wrote a small tool that produces Python code from a compiled Python extension (signatures and docstrings, of course).
In case it can be useful for others:
https://gitlab.com/doc-tools/pydoc-codedoc
(still incomplete of course)
I'd like to specify a path for ctypes.util.find_library() to search. How can I do this?
I'd like to do it from within Python.
I'm using macOS.
If I wanted to do it from outside Python, I could specify the location using LD_LIBRARY_PATH. However, I have read that I cannot modify this environment variable from within Python, as it is cached at Python's startup. Modifying the value and then restarting Python seems like a very unusable idea; for example, what would happen if the library were imported partway through execution?
Why would I like to do this? Because I would like to add a macOS wheel to a Python library that works under Windows. Currently they package the DLLs into the same directory as the Python source files and add that path to Windows' PATH environment variable, which ctypes.util.find_library() searches; this is a technique that I can't seem to replicate on macOS.
I have tried to understand delocate. It seems to state that the Python library doesn't depend on any shared objects. I suspect this is because the dylibs are loaded dynamically using ctypes.util.find_library() rather than being linked into compiled code within the package.
Any suggestions would be gratefully received!
Although there are environment variables (LD_LIBRARY_PATH and DYLD_LIBRARY_PATH) that affect the search path for shared libraries, they are read and fixed when Python starts up, so that option was eliminated.
Instead, we took the approach hinted at in the Python documentation:
If wrapping a shared library with ctypes, it may be better to determine the shared library name at development time, and hardcode that into the wrapper module instead of using find_library() to locate the library at runtime.
We hard-coded the names of the libraries that were included for each operating system, plus we included a generic name for operating systems where the libraries were not included:
names = {
    "Windows": "libogg.dll",
    "Darwin": "libogg.0.dylib",
    "external": "ogg"
}
The library loading code checks if it can find the specified library for the current operating system. If it isn't successful, a search is made for the 'external' library.
The library loading code can be found inside the PyOgg project; see Library.load().
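As a rough sketch only (not the actual PyOgg code; the helper name load_ogg and the error handling are assumptions), that strategy could look like this:

# Sketch of the loading strategy described above: try the bundled,
# hard-coded name for the current OS first, then fall back to the
# generic 'external' name via find_library().
import ctypes
import ctypes.util
import os
import platform

names = {
    "Windows": "libogg.dll",
    "Darwin": "libogg.0.dylib",
    "external": "ogg",
}

def load_ogg():
    here = os.path.dirname(os.path.abspath(__file__))
    bundled = names.get(platform.system())
    if bundled is not None:
        path = os.path.join(here, bundled)
        if os.path.exists(path):
            return ctypes.CDLL(path)
    # Fall back to whatever the operating system can find.
    found = ctypes.util.find_library(names["external"])
    if found is None:
        raise RuntimeError("Could not locate the ogg library")
    return ctypes.CDLL(found)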
Further to this, inspired by the delocate project, we were required to edit the paths found inside certain dylibs. For example, we edited the opusfile binary to point to the ogg binary found in our package:
install_name_tool -change /usr/local/opt/libogg/lib/libogg.0.dylib @loader_path/libogg.0.dylib libopusfile.0.dylib
For the details on this process please see the README file associated with the macOS binaries inside the PyOgg project.
Further discussion of the approach can be found in the issue raised on PyOgg's GitHub page (#32).
I'm writing some sort of Python C extension. It uses my own *.so library and headers from another project (let's say they're in /usr/local/lib/otherproject.so and /usr/local/include/otherproject.h).
I don't know which strategy to follow. I came up with two:
As a pure Python extension
Write a Python C extension just as described in the official docs. The problem here is that I don't know how to link with my own library and headers; to compile, I write a setup.py file and run python3.4 setup.py build. I don't know whether I can pass some option to that command, or whether I can write something in setup.py to include my headers and binaries (if so, will I also have to worry about making this distributable?).
With ctypes
Write a C library (with my other project's build system), including Python by passing /usr/include/python2.7 for the headers and linking against the python2.7 binary. Then use ctypes to wrap that library and get the functions, types, objects, etc. The inconvenience here is that I need to manually wrap every single function/type/variable with ctypes; I don't think I can use PyModule_AddObject, since I'm not creating the module in C but in the Python wrapper (with ctypes).
Also, I tried the second approach, but I could not successfully get my custom PyTypeObject from ctypes. If the second approach sounds good to any more expert brain here on SO, I would post the code to get any help =).
The second approach also yields problems with distribution. And if you create a Python object in C you should do it in the context of a module. For scenarios where distribution is problematic, you could link this module statically instead.
For your issue with linking, you'll find more information about library options in the documentation. Since your library resides in a directory that should be in the standard library search path, you only need to specify it with the libraries option of the Extension class:
mymodule_ext = Extension('mymodule', ['mymodule.c'], libraries=['otherproject'])
If you're not using the standard lib* prefix you'd need to use libraries=[':otherproject.so'] instead.
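For completeness, a minimal setup.py putting this together might look as follows; the include_dirs and library_dirs entries and the file name mymodule.c are assumptions, and the directory entries are only needed if the headers and the .so are not already on the compiler's default search paths.

# setup.py -- sketch of linking a C extension against the external library.
# /usr/local/include and /usr/local/lib are often searched by default and
# can then be omitted.
from setuptools import setup, Extension

mymodule_ext = Extension(
    "mymodule",
    sources=["mymodule.c"],
    include_dirs=["/usr/local/include"],  # location of otherproject.h
    library_dirs=["/usr/local/lib"],      # location of the shared library
    libraries=["otherproject"],           # links with -lotherproject
    # runtime_library_dirs=["/usr/local/lib"],  # optionally embed an rpath
)

setup(name="mymodule", version="0.1", ext_modules=[mymodule_ext])

Running python3 setup.py build (as in your first approach) then compiles and links it in one go.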
I need to compile my C extension so that it can be invoked by Python in Kodi. Can anyone please list the steps involved? I think I have actually cross-compiled the C extension, but it won't work with Kodi.
Sorry that I cannot provide you with specific steps, but from what I know, building binary modules for Kodi-Android is not a trivial task. Here's what I know:
You need to use Python.h from the Python sources used for Kodi's built-in interpreter.
You need to link against libkodi.so to find the necessary Python symbols.
This is important: the import mechanism for binary modules in Kodi-Android is broken.
If you use:
import foo
Kodi-Android will actually search for libfoo.so because it automatically prepends "lib" when searching for shared library files, so your import will fail. Simply renaming foo.so to libfoo.so won't help either, because the name must also match the module declaration, for example:
PyMODINIT_FUNC
initlibfoo(void)
{
    (void) Py_InitModule("libfoo", FooMethods);
}
Only if the module declaration "libfoo" matches the filename libfoo.so (the "lib" prefix is mandatory for Kodi-Android) should the import succeed, provided there are no other pitfalls. As I've said, it's not a trivial task.
BTW, you can build a pure C shared library and use ctypes instead, and not mess with all this broken Python-C module stuff. Naturally, your library name must still start with "lib" (again, this is mandatory for Kodi-Android), but using shared libs via ctypes is easier, provided your library doesn't have external dependencies.
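As a rough sketch of that ctypes route (the library name libfoo.so, the function add and its int signature are hypothetical placeholders, not anything Kodi-specific):

# Sketch of calling a plain C shared library via ctypes instead of a
# Python-C extension module. 'libfoo.so', 'add' and its signature are
# hypothetical placeholders.
import ctypes
import os

_here = os.path.dirname(os.path.abspath(__file__))
_lib = ctypes.CDLL(os.path.join(_here, "libfoo.so"))

# Prototype of a hypothetical exported function: int add(int a, int b);
_lib.add.argtypes = (ctypes.c_int, ctypes.c_int)
_lib.add.restype = ctypes.c_int

def add(a, b):
    return _lib.add(a, b)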
UPD: I forgot about permission issues. Android does not allow importing binary modules from everywhere. Kodi's temporary directory is known to work, but not always. Again, as far as binary Python modules are concerned, Kodi-Android is a total mess.
I'm an extension noob. What I want to do is create an extension that doesn't require other libraries to be installed. Is this impossible because the extension has to link against a specific version of libpython at runtime?
You can't make a statically linked extension module because Python needs to load it dynamically at runtime and because (as you reasoned) the module needs to dynamically link against libpython.
You could compile your own custom version of Python with your extension statically linked into the interpreter. That's usually more trouble than it's worth.
Why do you want to make a statically linked extension? If we have more information about your goals, we might be able to help you achieve them in a different way.
Welcome to StackOverflow. :-)
I think you're mixing things up. You don't want the extension to be statically linked into the interpreter (which is possible but cumbersome, since it involves rebuilding a custom interpreter); you want your extension not to be linked against pythonxx.dll, or to be linked statically against it. This is not possible: your extension and the Python interpreter would each have their own copies of global variables, for instance, which is Bad.
There is another approach, which is to determine which Python versions are available at runtime and use the Python/C API dynamically, by loading the Python DLL through LoadLibrary (Windows) or dlopen (Linux etc.) and then deciding at runtime on the method signatures depending on the version, and so on. Very cumbersome. For an example of this kind of manipulation in Delphi, see PythonForDelphi:
http://www.atug.com/andypatterns/pythonDelphiTalk.htm
I'm not aware of any other project that does that.