autodoc: workaround when "cannot import module" - python

The package I want to document with Sphinx contains pure Python modules (fine) plus a C/C++ library bound using pybind11.
There are a lot of dependencies that cannot reasonably be built on Read the Docs!
So on RTD, autodoc cannot import my library to extract the docstrings...
My first idea was to generate _build/doctrees locally and reuse it on RTD, but it contains binary files that are too heavy to store in the repository, so that is not an option.
Is there a way to "expand" autodoc directives in RST files? It could produce the full text of the RST files, or a fake static module as a .py file...
Thanks in advance for your ideas!
Mathieu

I think there are two potential ways to grab the docstrings from C/C++ compiled libraries:
Option 1
Install all the system dependencies needed to install/build the Python packages you need. This can be achieved on Read the Docs with the build.apt_packages key in the config file (see https://docs.readthedocs.io/en/stable/config-file/v2.html).
Option 2
Try using the sphinx-autoapi extension (https://sphinx-autoapi.readthedocs.io/en/latest/) instead of autodoc. This extension does not require the dependencies to be installed, since it works statically by parsing the files directly instead of importing and inspecting them dynamically.
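A minimal conf.py fragment for this route could look like the sketch below; the autoapi_dirs path is an assumption and should point at your package's Python sources:

# conf.py -- sketch of enabling sphinx-autoapi instead of autodoc
extensions = [
    "autoapi.extension",
]
autoapi_type = "python"              # sources are parsed statically, no import needed
autoapi_dirs = ["../src/mypackage"]  # assumed location of the package sources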

I wrote a small tool that produces Python code from a compiled Python extension (signatures and docstrings, of course).
In case it can be useful for others:
https://gitlab.com/doc-tools/pydoc-codedoc
(still incomplete of course)
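To illustrate the general idea (this is not the linked tool, just a toy sketch): import the compiled extension locally, where it can be built, and emit a stub .py containing the docstrings, which autodoc can then import on Read the Docs. The mylib name is a placeholder.

import inspect

def make_stub(module):
    """Return Python source for a stub module carrying the same docstrings."""
    lines = ['"""' + (inspect.getdoc(module) or "") + '"""', ""]
    for name, obj in inspect.getmembers(module):
        if name.startswith("_") or not inspect.isroutine(obj):
            continue
        doc = inspect.getdoc(obj) or ""
        lines.append("def %s(*args, **kwargs):" % name)
        lines.append('    """' + doc + '"""')
        lines.append("")
    return "\n".join(lines)

# Usage (locally, where the extension is importable):
# import mylib
# open("mylib_stub.py", "w").write(make_stub(mylib))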

Related

How does python venv manage C++ dependencies

I'm using a library which offers a Python wrapper around a C++ executable.
I installed it (https://github.com/bulletphysics/bullet3) using venv (https://docs.python.org/3/library/venv.html), and all is working great.
I'm considering trying to build https://github.com/bulletphysics/bullet3 myself.
From the root of the venv folder I found gym/lib/python3.7/site-packages/pybullet.cpython-37m-x86_64-linux-gnu.so. I'm guessing this is the binary that is eventually invoked from Python.
What steps are involved in calling from Python into the correct external binary? How does import pybullet as p resolve to gym/lib/python3.7/site-packages/pybullet.cpython-37m-x86_64-linux-gnu.so?
This seems to be close to the boundary of the C++ world, but I can't find the right keyword searches to see exactly how that allows usage from Python.
Thanks
In short: CPython just looks for correctly named dynamic libraries on the module search path (which includes PYTHONPATH), loads such a library, and uses a predefined interface to work out what from this library should be visible as the contents of the module inside Python.
In detail, how to prepare such a shared object and what its required contents are is described in https://docs.python.org/3/extending/index.html
So venv just puts the dynamic library in a directory that is part of the virtual environment's module search path.
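If you want to see this resolution for yourself, the standard-library import machinery can show it (run inside the venv; the printed path should be the pybullet .so from the question):

import importlib.machinery
import importlib.util

# Filename suffixes this interpreter accepts for extension modules,
# e.g. ['.cpython-37m-x86_64-linux-gnu.so', '.abi3.so', '.so']
print(importlib.machinery.EXTENSION_SUFFIXES)

# Where would "import pybullet" load the module from?
spec = importlib.util.find_spec("pybullet")
print(spec.origin if spec else "pybullet not found on sys.path")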

Can I compile boost.python module without bjam?

The Boost.Python module provides an easy way of binding C/C++ code into Python. However, most tutorials assume that bjam is used to compile this module. I was wondering: if I do not compile this module separately, can I still use it? What I mean by "do not compile this module" is including all the source files of Boost.Python in my current project. I did this for other Boost modules. For example, when I use Boost.Filesystem, I just include all the files from that module and compile them together with the code I have written. Thanks.
Yes, absolutely, it's a library like any other.
I always use it with CMake, but anything will do (a rough setuptools sketch follows the list below). You need to:
Add the location of the Boost headers to your include paths.
Add the location of the Python headers to your include paths (usually installed with Python; the location depends on the OS).
Link with the appropriate Boost.Python library (e.g. in my case it's boost_python-vc120-mt-1_58.lib or boost_python-vc120-mt-gd-1_58.lib; again, this depends on version/OS/toolkit).
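Since anything will do, here is a rough setuptools equivalent of those three steps (not the answerer's CMake setup; the paths and the boost_python library name are assumptions to adjust for your Boost/Python installation):

from setuptools import setup, Extension

ext = Extension(
    "hello",
    sources=["hello.cpp"],
    include_dirs=["/usr/local/include"],   # Boost headers; Python headers are added automatically
    library_dirs=["/usr/local/lib"],
    libraries=["boost_python312"],         # exact name depends on version/OS/toolkit
)

setup(name="hello", ext_modules=[ext])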

Creating a Python type in C using an external library: ctypes or setuptools?

I'm writing some sort of Python C extension. It uses my own *.so library and headers from another project (let's say they're in /usr/local/lib/otherproject.so and /usr/local/include/otherproject.h).
I don't know which strategy to follow. I came up with two:
As a Python C extension
Write a Python C extension just as described in the official docs. The problem here is that I don't know how to link with my own library and headers; to compile, I write a setup.py file and run python3.4 setup.py build. I don't know whether I can pass some option to that command, or whether I can write something in setup.py to include my headers and binaries (if so, will I also have to worry about making this distributable?).
With ctypes
Write a C library (with my other project's build system), including Python by passing '/usr/include/python2.7' to find the headers and linking against the python2.7 binary. Then use ctypes to wrap that library and get at the functions, types, objects, etc. The inconvenience here is that I need to manually wrap every single function/type/variable with ctypes; I don't think I can use PyModule_AddObject since I'm not creating the module in C but in the Python wrapper (with ctypes).
Also, I tried the second approach, but I could not successfully get my custom PyTypeObject from ctypes. If the second approach sounds good to any more expert brain here on SO, I would post the code to get any help =).
The second approach also causes problems with distribution, and if you create a Python type in C you should do it in the context of a module. For scenarios where distribution is problematic, you could link this module statically instead.
For your issue with linking you'll find more information about Library options in the documentation. Since your library resides in a directory which should be in the standard library search path, you'd only need to define your library with the libraries option of the Extension class:
mymodule_ext = Extension('mymodule', ['mymodule.c'], libraries=['otherproject'])
If you're not using the standard lib* prefix you'd need to use libraries=[':otherproject.so'] instead.
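Putting that together with the header location from the question, a fuller setup.py sketch might look like this (paths taken from the question; adjust as needed):

from setuptools import setup, Extension

mymodule_ext = Extension(
    "mymodule",
    sources=["mymodule.c"],
    include_dirs=["/usr/local/include"],   # where otherproject.h lives
    library_dirs=["/usr/local/lib"],       # where otherproject.so lives
    libraries=[":otherproject.so"],        # no lib* prefix, so use the ':' form
)

setup(name="mymodule", version="0.1", ext_modules=[mymodule_ext])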

Cross compiling my c python extension for kodi / xbmc android

I need to cross-compile my C extension so that it is invokable by Python in Kodi. Can anyone please list the steps involved? I think I have actually cross-compiled the C extension, but it wouldn't work with Kodi.
Sorry that I cannot provide you with specific steps, but from what I know, building binary modules for Kodi-Android is not a trivial task. Here's what I know:
You need to use Python.h from the Python sources used for Kodi's built-in interpreter.
You need to link against libkodi.so to find necessary Python symbols.
This is important: the import mechanism for binary modules in Kodi-Android is broken.
If you use:
import foo
Kodi-Android will actually search for libfoo.so, because it automatically adds the lib prefix when searching for shared library files, and your import will fail. Simply renaming foo.so to libfoo.so won't help because the name must match the module declaration, for example:
PyMODINIT_FUNC
initspam(void)
{
    /* The name given to Py_InitModule must match the libfoo.so filename
       for the import to work under Kodi-Android (Python 2 C API). */
    (void) Py_InitModule("libfoo", SpamMethods);
}
Only if the module declaration "libfoo" matches the filename libfoo.so (the "lib" part is mandatory for Kodi-Android) should the import succeed, provided there are no other pitfalls. As I've said, it's not a trivial task.
By the way, you can build a pure C shared library and use ctypes instead, and not mess with all this broken Python-C module stuff. Naturally, your library name must start with "lib" (again, this is mandatory for Kodi-Android), but using shared libraries via ctypes is easier, provided your library doesn't have external dependencies.
UPD: I forgot about permission issues. Android does not allow importing binary modules from everywhere. Kodi's temporary directory is known to work, but not always. Again, as far as binary Python modules are concerned, Kodi-Android is a total mess.
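To illustrate the ctypes route suggested above, a minimal sketch (the library name and the add() signature are made-up placeholders):

import ctypes

lib = ctypes.CDLL("libfoo.so")        # plain C shared library, no Python C API involved
lib.add.argtypes = (ctypes.c_int, ctypes.c_int)
lib.add.restype = ctypes.c_int
print(lib.add(2, 3))                  # -> 5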

Distributing a Python library (single file)

For my project I will be using the argparse library. My question is: how do I distribute it with my project? I am asking because of the technicalities and legalities involved.
Do I just:
Put the argparse.py file along with my project, i.e. in the tar file for my project?
Create a package for it for my distro?
Tell the user to install it himself?
What's your target Python version? It appears that argparse is included from version 2.7.
If you're building a small library with minimal dependencies, I would consider removing the dependency on an external module and only using facilities offered by the standard Python library. You can access command-line parameters with sys.argv and parse them yourself; it's usually not that hard to do. Your users will definitely appreciate not having to install yet another third-party module just to use your code.
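For simple cases, hand-rolled parsing really is short; a rough sketch (the --verbose flag is just an illustrative placeholder):

import sys

args = sys.argv[1:]
verbose = "--verbose" in args
positional = [a for a in args if not a.startswith("-")]
print(verbose, positional)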
It would be best for the user to install it so that only one copy is present on the system and so that it can be updated if there are any issues, but including it with your project is a viable option if you abide by all requirements specified in the license.
Try to import it from the public location, and if that fails then resort to using the included module.
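In code, that fallback could look like this (the bundled-copy location mypackage._vendor is just a placeholder):

try:
    import argparse                          # system-installed copy, preferred
except ImportError:
    from mypackage._vendor import argparse   # bundled copy shipped with the project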
You could go with Ignacio's suggestion.
But... For what it is worth, there's another library for argument parsing built into Python, which is quite powerful. Have you tried optparse? It belongs to the base Python distribution and has been there for a while...
Good luck!
