Will the Sphinx documentation engine successfully generate documentation on a project that doesn't import well? In particular my project has an exotic dependency. I don't want document generation to depend on this dependency.
Does Sphinx need to import my module and use introspection or does it parse?
If you're using the autodoc extension, then yes, your project must be importable. But sometimes it's possible to mock out dependencies in your conf.py (since, presumably, at the time of import, the dependencies are needed in name only). The Read the Docs documentation has an example of how to do this.
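The mocking trick mentioned above can be sketched with `sys.modules` and the standard-library `unittest.mock` (this would go near the top of conf.py; `exotic_dep` and `exotic_dep.core` are placeholder names for your real dependency):

```python
import sys
from unittest import mock  # Python 3; on Python 2 use the external "mock" package

# Register fake modules so "import exotic_dep" succeeds in name only.
# "exotic_dep" and "exotic_dep.core" are placeholders for the real dependency.
MOCK_MODULES = ["exotic_dep", "exotic_dep.core"]
for mod_name in MOCK_MODULES:
    sys.modules[mod_name] = mock.MagicMock()

import exotic_dep  # resolved from sys.modules; no real install needed
print(type(exotic_dep).__name__)
```

Newer Sphinx versions also offer the `autodoc_mock_imports` config value, which does the same thing declaratively.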
Core Sphinx doesn't touch your code at all. The autodoc extension does, and it indeed imports it:
For Sphinx (actually, the Python interpreter that executes Sphinx) to find your module, it must be importable.
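In practice this is why conf.py commonly prepends the project root to sys.path, so the interpreter running Sphinx can find the package. A minimal sketch (the relative path is illustrative and depends on your layout):

```python
# conf.py sketch: make the project importable for autodoc.
# The ".." here assumes conf.py lives one level below the project root.
import os
import sys

sys.path.insert(0, os.path.abspath(".."))
```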
Related
The package I want to document with Sphinx contains pure Python modules (fine) plus a C/C++ library bound with pybind11.
It has a lot of dependencies that cannot reasonably be built on Read the Docs!
So on RTD, autodoc cannot import my library to extract the docstrings...
My first idea was to generate _build/doctrees locally and reuse it on RTD. But it contains binary files that are too heavy to store in the repository: no.
Is there a way to "expand" autodoc directives in RST files? It could produce the full text of the RST files, or a fake static module as a .py file...
Thanks in advance for your ideas!
Mathieu
I think there are two potential ways to grab the docstrings from C/C++ compiled libraries:
Option 1
Install all the system dependencies needed to install/build the Python packages you need. This can be achieved on Read the Docs with the build.apt_packages key in the config file (see https://docs.readthedocs.io/en/stable/config-file/v2.html)
Option 2
Try using the sphinx-autoapi extension (https://sphinx-autoapi.readthedocs.io/en/latest/) instead of autodoc. This extension does not require the dependencies to be installed, since it works statically by parsing the files directly instead of inspecting them dynamically.
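A minimal conf.py sketch for Option 2 (the source path is an assumption; adjust it to your tree):

```python
# conf.py sketch: use sphinx-autoapi's static parsing instead of autodoc.
extensions = ["autoapi.extension"]

autoapi_type = "python"      # parse Python sources
autoapi_dirs = ["../src"]    # illustrative path to the code to document
```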
I wrote a small tool that produces Python code from a compiled Python extension (signatures and docstrings of course).
In case it can be useful for others:
https://gitlab.com/doc-tools/pydoc-codedoc
(still incomplete of course)
I am running ROS Indigo. I have what should be a simple problem: I have a utility class in my package that I want to be callable from our scripts. It only needs to be called within our own package; I don't need it to be available to other ROS packages.
I defined a class named HandControl in a file HandControl.py. All my attempts to import it, or use it without importing, fail. Where in the catkin workspace do I put it -- the root of the package, or in scripts? Do I need __init__.py anywhere (I have tried several places)?
It is good practice to follow the conventions of Python and ROS here. Scripts are typically placed in the scripts/ directory and should not be imported by other Python scripts. Reusable Python code is an indication of a Python module. Python modules should be placed in src/package_name, and you should create an __init__.py there as well. This module will then be available everywhere in your catkin workspace. There is a good chance this structure will help you in the future, even though you may not seem to need it at the moment: projects typically grow, and following guidelines helps maintain good code. For more specific details, check out this Python doc.
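The layout described above can be exercised in plain Python, outside of catkin, with a throwaway directory. Here `my_package`, `hand_control.py`, and `HandControl` are placeholder names mirroring the question, and the manual `sys.path` insert stands in for what catkin's setup.py normally does:

```python
import os
import sys
import tempfile
import textwrap

# Simulate the src/my_package/ directory of a catkin package (paths illustrative).
src = tempfile.mkdtemp()
pkg = os.path.join(src, "my_package")
os.makedirs(pkg)

# An empty __init__.py is what makes the directory an importable package.
open(os.path.join(pkg, "__init__.py"), "w").close()

with open(os.path.join(pkg, "hand_control.py"), "w") as f:
    f.write(textwrap.dedent("""\
        class HandControl(object):
            def open_hand(self):
                return "opening hand"
    """))

# In a real catkin package, setup.py plus catkin_python_setup() put the module
# on the path; here we do it by hand for the demonstration.
sys.path.insert(0, src)

from my_package.hand_control import HandControl
print(HandControl().open_hand())  # -> opening hand
```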
Erica,
please see this school project, which was written in Python and run on ROS Indigo. If you look in the /scripts folder, you can see an example of a custom class being called from other scripts. If you look into the launch files in /launch, you can see an example of configuring the ROS nodes - maybe that is your problem.
I need to programmatically block import of a python package and all child packages.
For example, I need to block loading of the package "foo" and also ensure that all children of foo such as "foo.bar" cannot be imported.
How can this be achieved in python 2.x without restructuring my site packages or PYTHONPATH?
For context, the intent is to programmatically avoid the risk of importing proprietary code into GPL licensed code.
This might not fit your situation exactly, but you can simply mock out the entire module so that the module and all of its submodules have no effect:
https://pypi.python.org/pypi/mock
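If you want imports to fail hard rather than succeed with a mock, a different sketch is a meta-path import hook that rejects the package and all of its children. This uses Python 3's find_spec; on Python 2.x the equivalent PEP 302 hook method is find_module, aliased below. Here "colorsys" stands in for the "foo" package to block:

```python
import sys

class ImportBlocker(object):
    """Meta-path hook that refuses to load given packages and their children."""
    def __init__(self, *names):
        self.names = names

    def _blocked(self, fullname):
        # Match "foo" itself and anything under it ("foo.bar", ...).
        return any(fullname == n or fullname.startswith(n + ".")
                   for n in self.names)

    def find_spec(self, fullname, path=None, target=None):  # Python 3
        if self._blocked(fullname):
            raise ImportError("import of %r is blocked" % fullname)
        return None  # defer to the normal import machinery

    find_module = find_spec  # Python 2's PEP 302 hook has the same behavior

# Install the blocker ahead of the standard finders.
sys.meta_path.insert(0, ImportBlocker("colorsys"))
sys.modules.pop("colorsys", None)  # forget any cached copy

try:
    import colorsys
except ImportError as exc:
    print("blocked:", exc)
```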
I'm interested in wrapping pep8 so I can monkey-patch it before use. What is the "right" way to wrap a module?
If my module is named pep8 and lives in my path somewhere before the real pep8, any "import pep8" in my module will just import itself. I don't know in advance where the real pep8 will live, since this needs to be generalized for multiple systems. I can't remove the path where my pep8 wrapper lives from sys.path, because that too will be different depending on the system where it's executed.
I don't want to have to rename my pep8, because I'd like for the pep8 command to work without modification.
My pep8 is a directory containing a __init__.py with the following contents:
from pep8 import *
MAX_LINE_LENGTH = 119
For Python 2.5+, you can make absolute imports the default with from __future__ import absolute_import.
For monkey-patching a Python module, you'll want to use relative imports from your project to your overriding module.
For this example, I will assume you are distributing a library. It requires a little finessing for other projects, since the __main__ python file cannot have relative imports.
myproject/__init__.py:

    from . import pep8  # optional "as pep8"
    # The rest of your code, using this pep8 module.

myproject/pep8/__init__.py:

    from __future__ import absolute_import
    from pep8 import *
    MAX_LINE_LENGTH = 119
I realize this is an old question, but it still comes up in Google searches. For instances where this is actually desired (e.g. protected library wrapping) I suggest the wrapt package.
I actually use this for instances where I have a model that is part of a core set but can be extended by other applications (such as front-ends like flask apps). The core model is protected but can be extended by other developers.
https://pypi.python.org/pypi/wrapt
I'm just starting to get to the point in my Python projects where I need to use multiple packages, and I'm a little confused about exactly how everything is supposed to work together. What exactly should go into the __init__.py of a package? Some projects I see just have blank inits, and all of their code is in modules in that package. Other projects implement what seems to be the majority of the package's classes and functions inside the init.
Is there a document or style guide or something that describes what the python authors had in mind for the use of packages and the __init__ file and such?
Edit:
I know the point of having the __init__.py file, in the simplest sense, is that it makes a folder a package. But why would I put a function there instead of in a module in that same folder (package)?
__init__.py can be empty. What it really does is make sure Python treats your directory as a package. Beyond that, it can provide any initialization you might need when your package is imported (configuring the environment, or something along those lines), or define __all__ so that Python knows what to do when someone uses from package import *.
Most everything you need to know is described in the docs on Packages. Dive Into Python also has a piece on packaging.
You already know, I guess, that __init__.py files are required to make Python treat the directories as containing packages.
In the simplest case, __init__.py can remain empty.
You can also execute initialization code for the package there.
You can also set the __all__ variable.
[Edit: learnings]
When you do from package import *, the __all__ variable determines which names get imported; it has no effect on from package import item.
See : http://docs.python.org/tutorial/modules.html
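The effect of __all__ can be demonstrated with a throwaway package ("demo", "greet", and "_private" are made-up names for the illustration):

```python
import os
import sys
import tempfile
import textwrap

# Build a one-off package on disk to show what __all__ controls.
root = tempfile.mkdtemp()
pkg = os.path.join(root, "demo")
os.makedirs(pkg)
with open(os.path.join(pkg, "__init__.py"), "w") as f:
    f.write(textwrap.dedent("""\
        __all__ = ["greet"]   # only this name is exported by "import *"

        def greet():
            return "hello"

        def _private():
            return "hidden"
    """))

sys.path.insert(0, root)
namespace = {}
exec("from demo import *", namespace)
print(sorted(n for n in namespace if not n.startswith("__")))  # ['greet']
```

Note that `_private` is left out: star imports pull in only the names listed in __all__.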