Python Sphinx: documenting the public interface of a package

I have a Python package which contains submodules. My intention is to allow use only of the functionality that the package exports, e.g.:
package_X
+-- __init__.py
+-- submodule_A.py
+-- submodule_B.py
Submodules are implementation details. Everything the user of the package needs to know is exported in the __init__.py file.
Now when building documentation with Sphinx, I get a TOC and documentation as follows:
package_X
Submodules
submodule_A.py
submodule_B.py
Is there any setting or directive I can add to __init__.py or submodule_*.py to make the public items appear as if they were defined directly in package_X? I am also ready to modify conf.py to make that work.
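One approach that usually works (a sketch under assumptions: useful_func and UsefulClass are illustrative placeholder names, and it relies on autodoc's default behaviour of respecting __all__): re-export the public names in __init__.py, set __all__, and document the package itself with a single automodule directive rather than one per submodule.

```python
# package_X/__init__.py -- illustrative sketch; the submodule names
# and exported objects are placeholders, not a verified API.
from .submodule_A import useful_func, UsefulClass
from .submodule_B import another_func

# automodule with :members: documents exactly the names listed in
# __all__, so the submodule layout stays invisible in the docs.
__all__ = ["useful_func", "UsefulClass", "another_func"]
```

In the .rst you would then write `.. automodule:: package_X` with `:members:`, which lists those names directly under package_X; skipping sphinx-apidoc for the submodules (or excluding their generated pages) keeps the Submodules section out of the TOC.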

Related

Can you autodoc only submodules using sphinx-apidoc?

I use the sphinx-click extension for my click application, and I want to use sphinx-apidoc to generate .rst files for only the submodules.
I've been working with Python's Sphinx documentation library for about a year now, and I have a use case I cannot quite figure out. Maybe I'm just blind right now.
Anyways, I have a CLI tool structured something like
my_tool/
+-- __init__.py
+-- cli_entry.py
+-- utils
| +-- __init__.py
| +-- foo.py
| +-- bar.py
Where cli_entry is a click application, and it imports my_tool.utils.foo and my_tool.utils.bar.
Since this is using the Click library, I've decided to use the sphinx_click extension to document every command in cli_entry.py (which documents all the commands great).
But here's the issue: I want to use sphinx-apidoc to generate the .rst files for everything that is in the ./my_tool/utils/ modules.
When I use the command as sphinx-apidoc -o ../docs/utils my_tool/utils the output files that I get include
docs/
+-- utils
| +-- modules.rst
| +-- utils.rst
Which looks great at first, but upon opening utils.rst the file looks something like
utils package
=============

Submodules
----------

utils.foo module
----------------

.. automodule:: utils.foo
   :members:
   :undoc-members:
   :show-inheritance:

utils.bar module
----------------

.. automodule:: utils.bar
   :members:
   :undoc-members:
   :show-inheritance:
Then when I build the documentation using make html (from the Sphinx-generated Makefile) I get an error saying Failed to import 'utils.foo': no module named utils.foo. That's because the import should exist as my_tool.utils.foo.
How can I use sphinx-apidoc to generate only submodules with the correct import paths? Maybe it's something I'm missing in conf.py, or maybe it's an option I'm missing from sphinx-apidoc?
EDIT: I should mention that I could use the exclude pattern argument, but I would prefer not to have to specify each CLI file in my root, e.g. in the case that I have cli_entry.py, cli_commandgroup1.py... cli_commandgroupN.py. I want this solution to be dynamic enough to support only submodules.
EDIT: I tried using sphinx-apidoc -o ../docs/utils my_tool/ and this creates the following output
docs/
+-- utils
| +-- modules.rst
| +-- my_tool.cli_entry.rst
| +-- my_tool.utils.rst
Now in the my_tool.utils.rst file, the imports are correct, and the docs can be generated.
my_tool.utils.foo module
------------------------

.. automodule:: my_tool.utils.foo
   :members:
   :undoc-members:
   :show-inheritance:
The issue with specifying my_tool/ is that my_tool.cli_entry.rst also gets created, even though that file is already covered by the sphinx-click extension.
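One workaround, sketched under the assumption that the apidoc output lands in docs/utils as above: keep running sphinx-apidoc against the package root (so import paths are correct) and exclude the CLI pages at build time via conf.py's glob-based exclude_patterns, so newly added cli_*.py files never need to be listed one by one. (sphinx-apidoc also accepts exclude patterns as trailing positional arguments, e.g. sphinx-apidoc -o ../docs/utils my_tool 'my_tool/cli_*.py'.)

```python
# conf.py -- assumes the generated pages live under docs/utils/;
# the glob keeps every my_tool.cli_*.rst page out of the build.
exclude_patterns = [
    'utils/my_tool.cli_*.rst',
]
```

Because exclude_patterns is evaluated at build time, this stays dynamic as new cli_* command-group files are added.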

Ansible - generate .rst file from module's DOCUMENTATION string

I have written custom Ansible module and documented it using standard Ansible convention, i.e. by writing DOCUMENTATION and EXAMPLES global strings in module file.
I already have some documentation generated using Sphinx 1.8.3 and hosted locally. I would like to have the Ansible documentation included in the Sphinx-generated pages. My directory structure is fairly simple:
./ansible/docs
├── conf.py
├── index.rst
├── _static
└── _templates
./ansible/library/
├── __init__.py
└── module.py
Now, I could write the documentation as function docstrings and then include it using the Sphinx .. automodule:: directive. This works, but uses a different format than the Ansible DOCUMENTATION string.
Although the Ansible module documentation explains in depth how the docstrings should be formatted, it does not seem to provide any information on how to generate the docs locally.
What is the correct way to convert Ansible module documentation to .rst file, so that it could be included by Sphinx?
By using the provided Makefile in the docs/docsite directory (you can also run make webdocs from the top level). You'll want to ensure you have installed the docsite requirements into your virtualenv, in addition to running pip install -e $PWD or its equivalent, because the docsite Sphinx build uses some of Ansible's own libraries to do its work.

python namespaces vs packages: making a package the default namespace

I have a project with an overarching namespace, with packages inside it. Here's the folder structure:
pypackage
├── pypackage       <-- Source code for use in this project.
│   ├── bin         <-- Module: CLI entry point into pypackage.
│   │   ├── __init__.py
│   │   └── pypackage.py
│   └── core        <-- Module: Core functionality.
│       ├── __init__.py
│       └── pypackage.py
├── tests
├── README.md
└── setup.py
Pretty simple. If I want to import it I use:
from pypackage.core import pypackage
and it works great because my setup.py looks like this:
from setuptools import setup, find_packages
...
NAME = 'pypackage'

setup(
    name=NAME,
    namespace_packages=[NAME],
    packages=[f'{NAME}.{p}' for p in find_packages(where=NAME)],
    entry_points={
        "console_scripts": [
            f'{NAME} = {NAME}.bin.{NAME}:cli',
        ]
    },
    ...
)
However, I have legacy code that imports pypackage from when it used to be a standalone Python file, like this:
import pypackage
So how do I make it so I can keep the same structure with namespaces and subpackages but still import it the old way? How do I turn this:
from pypackage.core import pypackage
into this:
import pypackage
In other words, how do I alias the pypackage.core.pypackage module to be pypackage for when I'm importing pypackage into an external project?
You would add the 'old' names inside your new package by importing into the top-level package.
Names imported as globals in pypackage/__init__.py are attributes on the pypackage package. Make use of that to give access to 'legacy' locations:
# add all public names from pypackage.core.pypackage to the top level for
# legacy package use
from .core.pypackage import *
Now any code that uses import pypackage can use pypackage.foo and pypackage.bar, even though those objects are actually defined in pypackage.core.pypackage.
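A minimal, self-contained demonstration of the re-export trick (the package is built in a temporary directory purely for illustration; foo is a made-up name):

```python
import os
import sys
import tempfile

# Build a throwaway package on disk to demonstrate the re-export trick.
root = tempfile.mkdtemp()
core = os.path.join(root, "pypackage", "core")
os.makedirs(core)

open(os.path.join(core, "__init__.py"), "w").close()
with open(os.path.join(core, "pypackage.py"), "w") as f:
    f.write("def foo():\n    return 'foo from core'\n")

# Top-level __init__.py re-exports the public names of the submodule.
with open(os.path.join(root, "pypackage", "__init__.py"), "w") as f:
    f.write("from .core.pypackage import *\n")

sys.path.insert(0, root)
import pypackage

print(pypackage.foo())  # prints: foo from core
```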
Now, because pypackage is a setuptools namespace package, you have a different problem: namespace packages exist so that multiple separate distributions can install into the same top-level package, so that top-level package must either be empty or contain only a minimal __init__.py file (namespace packages created with empty directories require Python 3.3).
If you are the only publisher of distributions that use this namespace, you can cheat a little here and use a pkgutil-style __init__.py file with the additional import shown above, but then you must not use any __init__.py files in the other distribution packages, or you must require that they all use the exact same __init__.py content. Coordination is key here.
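For reference, the pkgutil-style __init__.py is a one-liner; shown here as a sketch with the legacy re-export added (only safe if, as said above, you control every distribution in the namespace):

```python
# pypackage/__init__.py -- pkgutil-style namespace package
__path__ = __import__('pkgutil').extend_path(__path__, __name__)

# Extra content is only safe when every distribution in the
# namespace ships this exact same file:
from .core.pypackage import *
```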
Or you would have to use a different approach. Leave pypackage as a legacy wrapper module, and rename the new package format to use a new, different top-level name that can live next to the old module. At this point you can then just include the legacy package in your project, directly, as an extra top-level module.
Martin Pieters has the right idea if I were using regular packages, but a namespace package is a setuptools thing, so that didn't work.
After more research, I learned that there's no way to do what I'm trying to do: if I really want it, I must convert everything from a namespace package to a regular package hierarchy, then use Martin's solution.
I've decided to modify the legacy code instead to import it the new way.

Python: Distributing a file/module into another package with setuptools

Generically, I'm trying to distribute a module/file that I want to be part of another Python package that has already set up a directory structure under site-packages.
For the given structure under site-packages:
main_package
├── __init__.py
└── sub_package
    ├── __init__.py
    └── bar.py
I have a module foo.py that I want to exist next to bar.py, both under sub_package. If I just set up the structure in my repo along the lines of a typical package, then the __init__.py files that setuptools requires would clobber the existing __init__.py files for main_package and sub_package. Ideally this would be a wheel or sdist that could be distributed from our own PyPI.
I've tried using py_modules, but I can't get it to place the module anywhere other than the top level under site-packages.
Specifically, I'm trying to distribute a saltstack external_pillar into the salt/pillar/ structure already setup by salt. There are multiple salt instances that may or may not want the external_pillar, so simply bundling up the pillar into the salt distribution we have isn't feasible.
You may be looking for namespace packages.
With in your case a structure like this:
`-- main_package
| # no __init__.py here
`-- sub_package
|-- __init__.py
`-- foo.py
Python 3.3 added implicit namespace packages from PEP 420. All that is required to create a native namespace package is that you just omit __init__.py from the namespace package directory.
...
It is extremely important that every distribution that uses the namespace package omits the __init__.py or uses a pkgutil-style __init__.py. If any distribution does not, it will cause the namespace logic to fail and the other sub-packages will not be importable.
...
Finally, every distribution must provide the namespace_packages argument to setup() in setup.py.
However, from your question I am not sure whether sub_package already exists in main_package or not.
If it does, I am not sure of the structure to use to extend sub_package while not shadowing its modules, you may need to use an alternate sub-package name to achieve this.
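If you go the PEP 420 route, the add-on distribution's setup.py might look like the following sketch (the distribution name is made up); find_namespace_packages, unlike find_packages, picks up package directories that have no __init__.py:

```python
from setuptools import setup, find_namespace_packages

setup(
    name='main-package-foo',  # illustrative distribution name
    version='0.1',
    # Finds main_package/sub_package even though main_package
    # itself deliberately has no __init__.py.
    packages=find_namespace_packages(include=['main_package.*']),
)
```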

logging.NullHandler and __init__.py file location in Python-requests

I am trying to learn more about Python by looking through the code of popular libraries. The first library I've picked up is python-requests by Kenneth Reitz.
What I did is simply git clone <request_repo_url_from_github>, and now I'm inspecting the code.
I was looking through the __init__.py file in requests/packages.
I have a few questions to ask:
Why is there an __init__.py inside requests/packages? Should there not be an __init__.py file under requests directly? Or does it look this way simply because this is a package cloned directly from GitHub rather than installed?
The second question refers to the code below. What I would like to know is: what does NullHandler do exactly? I took a look at the documentation; what does it mean to have a 'no-op' handler? Where would this handler be used by library devs? I mean, what is special about it?
import logging

try:  # Python 2.7+
    from logging import NullHandler
except ImportError:
    class NullHandler(logging.Handler):
        def emit(self, record):
            pass
From what I remember of requests, the packages directory contains mirrors of other dependencies which the author decided to bundle along with requests instead of adding as dependencies. Personally I would have simply made them dependencies, citing a specific version if necessary, but I'm sure they had their reasons for the bundling approach. In any case, it contains an __init__.py so the code can treat it as a package and do things like this:
import requests.packages.urllib3
If you look at the requests directory on Github, you'll see that there is indeed an __init__.py in that directory too. If you want to have a hierarchy of packages you need such a file at each level, although in the simplest case it can be an empty file.
If you don't place __init__.py in a directory, Python won't recognise it as a package - this is to prevent accidental inclusion of modules from places you'd rather not. You can imagine any number of ways that a directory could be named the same as a package somewhere else on sys.path and cause untold confusion, but since there won't be an __init__.py in it, it'll be ignored by Python.
To answer your second question, the NullHandler is for cases where it's for some reason convenient to have a logging handler, but you don't actually want to do any logging. You might use this if you're using a library which performs logging but in your case you don't actually want to log anything - you install a NullHandler to throw those logging messages away, because it's a lot easier (and better practice) than changing their library to strip out the logging code.
In that example, I suppose you could instead configure logging and simply set the logging level so that no messages are actually produced, but it's arguably easier to just use a NullHandler instead.
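As a concrete sketch (the logger name mylib is made up), this is the usual pattern: the library attaches a NullHandler to its top-level logger so it can call logging unconditionally, and an application that wants output simply configures logging as normal:

```python
import logging

# Library side: attach a NullHandler so that applications which never
# configure logging see no "no handlers could be found" warnings.
logger = logging.getLogger("mylib")
logger.addHandler(logging.NullHandler())

# This call is silently discarded until an application configures
# logging (e.g. with logging.basicConfig(level=logging.INFO)).
logger.info("library event")
```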
Having an __init__.py file in a directory turns that directory into a Python package. It does not turn subdirectories into packages as well. If you look at the source tree, you'll see it looks like this (with non-relevant files removed):
requests/
|-- __init__.py
`-- packages/
    |-- __init__.py
    |-- charade/
    |   `-- __init__.py
    `-- urllib3/
        `-- __init__.py
This defines a top-level package, requests, and also the subpackages requests.packages, requests.packages.charade and requests.packages.urllib3. Defining these packages is necessary to make them properly importable.
To directly answer the question you asked, there is an __init__.py file under requests/ directly. There's just more than one in the entire tree.
The NullHandler does nothing. It's in place so that calls to the logging library can be used unconditionally, even if the user doesn't configure any loggers. Basically, all loggers attached to the logging library get called when urllib3 tries to log anything. If there are no loggers attached, then the logging library emits warnings. Those are lousy, so this is a workaround to make the library code simpler without forcing logging on people.
