I am trying to document a package in Python. At the moment I have the following directory structure:
.
└── project
├── _build
│ ├── doctrees
│ └── html
│ ├── _sources
│ └── _static
├── conf.py
├── index.rst
├── __init__.py
├── make.bat
├── Makefile
├── mod1
│ ├── foo.py
│ └── __init__.py
├── mod2
│ ├── bar.py
│ └── __init__.py
├── _static
└── _templates
This tree is the result of running sphinx-quickstart. In conf.py I uncommented sys.path.insert(0, os.path.abspath('.')) and I have extensions = ['sphinx.ext.autodoc'].
My index.rst is:
.. FooBar documentation master file, created by
sphinx-quickstart on Thu Aug 28 14:22:57 2014.
You can adapt this file completely to your liking, but it should at least
contain the root `toctree` directive.
Welcome to FooBar's documentation!
==================================
Contents:
.. toctree::
:maxdepth: 2
Indices and tables
==================
* :ref:`genindex`
* :ref:`modindex`
* :ref:`search`
In all the __init__.py's I have a docstring, and the same goes for the modules foo.py and bar.py. However, when running make html in the project I don't see any of the docstrings.
Here is an outline:
Document your package using docstrings in the sources.
Use sphinx-quickstart to create a Sphinx project.
Run sphinx-apidoc to generate .rst sources set up for use with autodoc. More information here.
Using this command with the -F flag also creates a complete Sphinx project. If your API changes a lot, you may need to re-run this command several times.
Build the documentation using sphinx-build.
Notes:
Sphinx requires .rst files with directives like automodule or autoclass in order to generate API documentation. It does not automatically extract anything from the Python sources without these files. This is different from how tools like Epydoc or Doxygen work. The differences are elaborated a bit more in What is the relationship between docutils and Sphinx?
After you have run sphinx-apidoc, it may be necessary to adjust sys.path in conf.py for autodoc to find your modules.
To avoid strange errors like the ones in these questions (How should I solve the conflict of OptionParser and sphinx-build in a large project?, Is OptionParser in conflict with sphinx?), make sure the code is properly structured, using if __name__ == "__main__": guards where needed.
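As a minimal sketch (module and argument names are made up, and argparse is used just for illustration), a guard like this keeps script-only code from running when autodoc imports the module:
# mytool.py -- hypothetical module with a command-line entry point
import argparse

def main():
    """Parse arguments and run the tool."""
    parser = argparse.ArgumentParser(description="Example tool")
    parser.add_argument("--verbose", action="store_true")
    args = parser.parse_args()
    print("verbose:", args.verbose)

if __name__ == "__main__":
    # Only executed when run as a script, not when Sphinx/autodoc imports the module.
    main()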
Related
I have a question regarding the Sphinx autodoc generation. I feel that what I am trying to do should be very simple, but for some reason, it won't work.
I have a Python project of which the directory is named slotting_tool. This directory is located at C:\Users\Sam\Desktop\picnic-data-shared-tools\standalone\slotting_tool
I set up Sphinx using sphinx-quickstart. Then my directory structure (simplified) is as follows:
slotting_tool/
|_ build/
|_ source/
|___ conf.py
|___ index.rst
|_ main/
|___ run_me.py
Now, I set the root directory of my project to slotting_tool by adding the following to the conf.py file.
import os
import sys
sys.path.insert(0, os.path.abspath('..'))
Next, I update my index.rst file to look like this:
.. toctree::
:maxdepth: 2
:caption: Contents:
.. automodule:: main.run_me
:members:
When trying to build my html using the sphinx-build -b html source .\build command, I get the following output, with the no module named error:
(base) C:\Users\Sam\Desktop\picnic-data-shared-tools\standalone\slotting_tool>sphinx-build -b html source .\build
Running Sphinx v1.8.1
loading pickled environment... done
building [mo]: targets for 0 po files that are out of date
building [html]: targets for 1 source files that are out of date
updating environment: [] 0 added, 1 changed, 0 removed
reading sources... [100%] index
WARNING: autodoc: failed to import module 'run_me' from module 'main'; the following exception was raised:
No module named 'standalone'
looking for now-outdated files... none found
pickling environment... done
checking consistency... done
preparing documents... done
writing output... [100%] index
generating indices... genindex
writing additional pages... search
copying static files... done
copying extra files... done
dumping search index in English (code: en) ... done
dumping object inventory... done
build succeeded, 1 warning.
The HTML pages are in build.
There are no HTML pages that refer to run_me.py in build. I have tried setting my root directory to all different kinds of directories and I have tried replacing all dots . with backslashes \ and so forth, but can't seem to find out what I'm doing wrong.
By the way, the statement that standalone is not a module is in fact true; it is just a directory without an __init__.py. I don't know if that might have caused some trouble.
Anyone have an idea?
This is the usual "canonical approach" to "getting started" applied to the case when your source code resides in a src directory like Project/src instead of simply being inside the Project base directory.
Follow these steps:
Create a docs directory in your Project directory (it's from this docs directory the commands in the following steps are executed).
sphinx-quickstart (choose to separate source and build; this places the .html and .rst files in different folders).
sphinx-apidoc -o ./source ../src
make html
This would yield the following structure (provided your .py source files reside in Project/src):
Project
|
├───docs
│ │ make.bat
│ │ Makefile
│ │
│ ├───build
│ └───source
│ │ conf.py
│ │ index.rst
│ │ modules.rst
│ │ stack.rst
│ │
│ ├───_static
│ └───_templates
└───src
stack.py
In your conf.py you'd add (after step 2):
import os
import sys
sys.path.insert(0, os.path.abspath(os.path.join('..', '..', 'src')))
Also include in conf.py:
extensions = ['sphinx.ext.autodoc', 'sphinx.ext.napoleon']
And in index.rst you'd link modules.rst:
Welcome to Project's documentation!
===================================
.. toctree::
:maxdepth: 2
:caption: Contents:
modules
Indices and tables
==================
* :ref:`genindex`
* :ref:`modindex`
* :ref:`search`
Your stack.rst and modules.rst were auto-generated by sphinx-apidoc; there is no need to change them (at this point). But just so you know, this is what they look like:
stack.rst:
stack module
============
.. automodule:: stack
:members:
:undoc-members:
:show-inheritance:
modules.rst:
src
===
.. toctree::
:maxdepth: 4
stack
After `make html`, open `Project/docs/build/html/index.html` in your browser to see the results.
Let's take an example with a project: dl4sci-school-2020 on master branch, commit: 6cbcc2c72d5dc74d2defa56bf63706fd628d9892:
├── dl4sci-school-2020
│ ├── LICENSE
│ ├── README.md
│ ├── src
│ │ └── __init__.py
│ └── utility
│ ├── __init__.py
│ └── utils.py
and the utility package has a utils.py module.
Follow this process (FYI, I'm using sphinx-build 3.1.2):
Create a docs/ directory under your project:
mkdir docs
cd docs
Start Sphinx within docs/; just pass your project name, your name & a version of your choice, and keep the defaults for the rest.
sphinx-quickstart
You will get the following auto-generated in your docs/ folder:
├── docs
│ ├── Makefile
│ ├── build
│ ├── make.bat
│ └── source
│ ├── _static
│ ├── _templates
│ ├── conf.py
│ └── index.rst
Since we created a separate docs directory, we need Sphinx to know where to find the Python source modules.
So edit the conf.py file; you can use my conf.py as a reference:
import os
import sys
basedir = os.path.abspath(os.path.join(os.path.dirname(__file__), '..', '..'))
sys.path.insert(0, basedir)
Now, to enable access to multiple nested packages & modules (if any), you need to edit the index.rst file:
.. toctree::
:maxdepth: 2
:caption: Description of my CodeBase:
modules
The modules entry picks up content from the modules.rst file, which we will generate below.
Make sure you're still in docs/ before running the command below:
sphinx-apidoc -o ./source ..
The output you get:
├── docs
│ ├── Makefile
│ ├── build
│ ├── make.bat
│ └── source
│ ├── _static
│ ├── _templates
│ ├── conf.py
│ ├── index.rst
│ ├── modules.rst
│ ├── src.rst
│ └── utility.rst
now run:
make html
Now open the following in the browser of your choice:
file:///<absolute_path_to_your_project>/dl4sci-school-2020/docs/build/html/index.html
and your beautiful documentation is ready.
(Screenshot: https://imgur.com/5t1uguh)
FYI, you can switch to any theme of your choice. I found the sphinx_rtd_theme theme and the sphinxcontrib.napoleon extension super dope! Thanks to their creators; I used both.
The commands below do the work:
pip install sphinxcontrib-napoleon
pip install sphinx-rtd-theme
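After installing them, enabling both takes only a couple of lines in conf.py (a sketch; merge with your existing extensions list):
# conf.py -- enable the Read the Docs theme and the Napoleon extension
extensions = [
    'sphinx.ext.autodoc',
    'sphinxcontrib.napoleon',  # on Sphinx >= 1.3 the bundled 'sphinx.ext.napoleon' also works
]
html_theme = 'sphinx_rtd_theme'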
You can host your documentation on Read the Docs.
enjoy documenting your code!
sys.path.insert(0, os.path.abspath('../..'))
That's not correct. Steve Piercy's comment is not entirely on point (you don't need to add an __init__.py since you're using a simple module), but they're right that autodoc will try to import the module and then inspect its contents.
However, assuming your tree is
doc/conf.py
src/stack.py
then you're just adding the folder which contains your repository to sys.path, which is completely useless. What you need to do is add the src folder to sys.path, so that when Sphinx tries to import stack it finds your module. So your line should be:
sys.path.insert(0, os.path.abspath('../src'))
(the path should be relative to conf.py).
Of note: since you have something which is completely synthetic and should contain no secrets, an accessible repository or a zip file of the entire thing makes it much easier to diagnose issues and provide relevant help: the less that has to be inferred, the less can be wrong in the answer.
IMHO, running pip install --no-deps -e . in the top project folder (or wherever setup.py is) to get an "editable" install is a better way to get your package modules onto the PYTHONPATH than altering sys.path in docs/conf.py.
For me, installing the package via the setup.py file and re-running the corresponding commands fixed the problem:
$ python setup.py install
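For reference, a minimal setup.py along these lines (the name and version are placeholders) is enough for either pip install --no-deps -e . or python setup.py install to put your packages on the import path:
# setup.py -- minimal sketch with placeholder metadata
from setuptools import setup, find_packages

setup(
    name="myproject",          # hypothetical distribution name
    version="0.1.0",
    packages=find_packages(),  # finds every directory that contains an __init__.py
)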
I'm using pyright for type checking and I'm also using pytest for testing inside Visual Studio Code. The folder structure for my tests is to have a 'test' subfolder in the package root. For example:
MyPackage
|-- __init__.py
|-- MyModule.py
|-- test
|   |-- __init__.py
|   |-- MyModule_test.py
I'm organizing things like this as there will be many packages and I want to keep things organized.
Inside pytest I have
import pytest
import MyPackage.MyModule
...
Pytest is able to discover the tests and run them OK because it has some special ability to adjust its sys.path (or something).
However, pyright will just complain that it cannot import the module:
Import 'MyPackage.MyModule' could not be resolved (Pyright, reportMissingImports)
This makes sense, but is there some way to deal with this, either in pyright or in the Visual Studio Code settings, to stop it from complaining?
You can add the library path to sys.path:
import sys
sys.path.insert(1, '..')
import MyModule
To enable Pylance to use your library properly (for auto-complete ...), use the following steps:
Pylance, by default, includes the root path of your workspace. If you want to include other subdirectories as import resolution paths, you can add them using the python.analysis.extraPaths setting for the workspace.
In VS Code, press Ctrl+, (Cmd+, on macOS) to open Settings.
Type in python.analysis.extraPaths
Select "Add Item"
Type in the path to your library, e.g. '..'
OK, a relative import as illustrated here was able to solve this. So in my case I should have:
# MyModule_test.py
import pytest
from .. import MyModule
You should create a pyrightconfig.json file or a pyproject.toml file at the root of your project. For example, if it's a Django project, you should have one of those files where manage.py is placed. Then set the include parameter and add the subdirectories (or app folders, in Django terms).
You can consult this sample config file. See this issue ticket.
For example, if this were my project structure:
├── manage.py
├── movie
│ ├── admin.py
│ ├── apps.py
│ ├── __init__.py
│ ├── models.py
│ ├── tests.py
│ └── views.py
├── moviereviews
│ ├── asgi.py
│ ├── __init__.py
│ ├── settings.py
│ ├── urls.py
│ └── wsgi.py
└── pyproject.toml
my pyproject.toml would be:
[tool.pyright]
include = ["movie", "moviereviews"]
If you are working within a Python virtual environment, set venvPath and venv. Consult the documentation for an exhaustive list of options.
I have the following directory structure:
├── DynamicProgramming
│ ├── 0-1_kp_problem.py
│ ├── b.py
│ ├── largest_contigous_subarray.py
│ ├── longest_common_substring.py
│ ├── min_change_for_given_money.py
│ ├── optimal_matrix_chain.py
│ ├── Readme.md
│ └── wis.py
├── helper
│ ├── a.py
│ └── __init__.py
└── Readme.md
The helper directory contains the library functions which will be used all over the code. How can I import the helper package from the scripts inside DynamicProgramming without adding it to the path?
Edit:
I cannot move the helper directory inside DynamicProgramming because more than one directory may use it.
You could use something like:
from ..helper import a
See python docs on packages.
If you run your code from the project root folder, you are likely to succeed with import helper or import helper.a. If not, you would have to add the current directory to PYTHONPATH:
$ export PYTHONPATH="."
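For example, a script inside DynamicProgramming could then do something like this (the helper function name is hypothetical), provided it is launched from the project root:
# DynamicProgramming/wis.py -- run as: PYTHONPATH=. python DynamicProgramming/wis.py
import helper.a

result = helper.a.some_helper_function()  # hypothetical function defined in helper/a.py
print(result)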
Better: use a project setup.py
Instead of playing with PYTHONPATH (which can be tricky business sometimes), you should create your project as a Python package.
Add setup.py to your project root, specify the attributes of the package, and build from it.
setup.py can define multiple packages at once, but more often only one is used. For this purpose it would be better to move the helper package into the DynamicProgramming structure and import it from there.
Search for setup.py Python packaging tutorials; it requires some study, but it will pay off.
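A rough sketch of what that could look like for this layout (untested; the name is made up, and DynamicProgramming would need an __init__.py to be listed as a package):
# setup.py at the repository root
from setuptools import setup

setup(
    name="dp-exercises",                        # hypothetical distribution name
    version="0.1.0",
    packages=["helper", "DynamicProgramming"],  # or just ["helper"] if the scripts stay scripts
)
After pip install -e . (or python setup.py develop), import helper works from anywhere.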
Say you want to create a programming project that mixes C++ and Python. The Foo C++ project structure uses CMake, and a Python module is created by using Swig. The tree structure would look something like this:
├── CMakeLists.txt
├── FooConfig.cmake.in
├── FooConfigVersion.cmake.in
├── Makefile
├── README
├── foo
│ ├── CMakeLists.txt
│ ├── config.hpp.in
│ ├── foo.cpp
│ └── foo.hpp
└── swig
└── foo.i
Now you would like to make use of the Foo project within a Python project, say Bar:
├── AUTHORS.rst
├── CONTRIBUTING.rst
├── HISTORY.rst
├── LICENSE
├── MANIFEST.in
├── Makefile
├── README.rst
├── docs
│ ├── Makefile
│ ├── authors.rst
│ ├── conf.py
│ ├── contributing.rst
│ ├── history.rst
│ ├── index.rst
│ ├── installation.rst
│ ├── make.bat
│ ├── readme.rst
│ └── usage.rst
├── bar
│ ├── __init__.py
│ └── bar.py
├── requirements.txt
├── setup.cfg
├── setup.py
├── tests
│ ├── __init__.py
│ └── test_bar.py
└── tox.ini
This structure was created by using cookiecutter's pypackage template. A BoilerplatePP template is also available to generate a CMake C++ project using cookiecutter (no Swig part).
So now that I have the structure of both projects, and considering that the development will take place mainly in Python and the project will be run on different systems, I need to address the following questions:
What's the best way to mix them? Should I collapse both root directories? Should I have the Foo C++ project as a directory of the Bar project or the other way around? I may be inclined to put the entire C++ structure shown above in a folder at the root level of the Python project, but I would like to know a priori any pitfalls as the CMake system is quite powerful and it may be convenient to do it the other way around.
In case I decide to put the Foo project as a directory within Bar, is the Python setuptools package as powerful as the CMake build system? I ask this because when I look at the Bar project, at the top level there seems to be only a bunch of scripts, but I don't know if this is the equivalent of CMake, as I'm new to Python.
The Bar project outlined above has a single bar directory, but I assume that whenever this project expands, instead of having many other directories at the root level, other directories containing Python code will be placed within bar. Is this correct (in the Pythonic sense)?
I assume that a single egg will be produced from the entire project, so that it can be installed and run in many different python systems. Is the integration of the module created by the Foo project easy? I assume that this module will be created in a different directory than bar.
In order for the Python code within the bar directory to work, the module created by Swig has to be available, so I guess the most straightforward way to do this is to modify the environment variable PYTHONPATH using the CMake system. Is this fine, or is there a better way?
If the C++ application has no use outside the Python package that will contain it:
You can pretty safely place the C++ code within the python package that owns it. Have the "foo" directory within the "bar" directory within your example. This will make packaging the final Python module a bit easier.
If the C++ application is reusable:
I would definitely try to think of things in terms of "packages", where independent parts are self-contained. All independent parts live on the same level. If one part depends on another, you import from its corresponding "package" from the same level. This is how dependencies typically work.
I would NOT include one within the other, because one does not strictly belong to the other. What if you started a third project that needed "foo", but did not need "bar"?
I would place both "foo" and "bar" packages into the same "project" directory (and I would probably give each package its own code repository so each package can be easily maintained and installed).
I'm trying to auto-generate basic documentation for my codebase using Sphinx. However, I'm having difficulty instructing Sphinx to recursively scan my files.
I have a Python codebase with a folder structure like:
<workspace>
└── src
└── mypackage
├── __init__.py
│
├── subpackageA
│ ├── __init__.py
│ ├── submoduleA1
│ └── submoduleA2
│
└── subpackageB
├── __init__.py
├── submoduleB1
└── submoduleB2
I ran sphinx-quickstart in <workspace>, so now my structure looks like:
<workspace>
├── src
│ └── mypackage
│ ├── __init__.py
│ │
│ ├── subpackageA
│ │ ├── __init__.py
│ │ ├── submoduleA1
│ │ └── submoduleA2
│ │
│ └── subpackageB
│ ├── __init__.py
│ ├── submoduleB1
│          └── submoduleB2
│
├── index.rst
├── _build
├── _static
└── _templates
I've read the quickstart tutorial, and although I'm still trying to understand the docs, the way it's worded makes me concerned that Sphinx assumes I'm going to manually create documentation files for every single module/class/function in my codebase.
However, I did notice the "automodule" statement, and I enabled autodoc during quickstart, so I'm hoping most of the documentation can be automatically generated. I modified my conf.py to add my src folder to sys.path and then modified my index.rst to use automodule. So now my index.rst looks like:
Contents:
.. toctree::
:maxdepth: 2
Indices and tables
==================
* :ref:`genindex`
* :ref:`modindex`
* :ref:`search`
.. automodule:: alphabuyer
:members:
I have dozens of classes and functions defined in the subpackages. Yet, when I run:
sphinx-build -b html . ./_build
it reports:
updating environment: 1 added, 0 changed, 0 removed
And this appears to have failed to import anything inside my package. Viewing the generated index.html shows nothing next to "Contents:". The Index page only shows "mypackage (module)", but clicking it shows it also has no contents.
How do you direct Sphinx to recursively parse a package and automatically generate documentation for every class/method/function it encounters, without having to manually list every class yourself?
You can try using sphinx-apidoc.
$ sphinx-apidoc --help
Usage: sphinx-apidoc [options] -o <output_path> <module_path> [exclude_paths, ...]
Look recursively in <module_path> for Python modules and packages and create
one reST file with automodule directives per package in the <output_path>.
You can mix sphinx-apidoc with sphinx-quickstart in order to create the whole doc project like this:
$ sphinx-apidoc -F -o docs project
This call will generate a full project with sphinx-quickstart and look recursively in <module_path> (project) for Python modules.
Perhaps apigen.py can help: https://github.com/nipy/nipy/tree/master/tools.
This tool is described very briefly here: http://comments.gmane.org/gmane.comp.python.sphinx.devel/2912.
Or better yet, use pdoc.
Update: the sphinx-apidoc utility was added in Sphinx version 1.1.
Note
For Sphinx (actually, the Python interpreter that executes
Sphinx) to find your module, it must be importable. That means that
the module or the package must be in one of the directories on
sys.path – adapt your sys.path in the configuration file accordingly
So, go to your conf.py and add
import an_example_pypi_project.useful_1
import an_example_pypi_project.useful_2
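With the layout from the question (conf.py in <workspace>, code under src/mypackage), adapting sys.path as the note says would look roughly like this (a sketch, not part of the quoted answer):
# conf.py -- put src/ on sys.path so autodoc can import mypackage
import os
import sys
sys.path.insert(0, os.path.abspath('src'))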
Now your index.rst looks like:
.. toctree::
:glob:
example
an_example_pypi_project/*
and
make html
From Sphinx version 3.1 (June 2020), if you're happy to use sphinx.ext.autosummary to display summary tables, you can use the new :recursive: option to automatically detect every module in your package, however deeply nested, and automatically generate documentation for every attribute, class, function and exception in that module.
See my answer here: https://stackoverflow.com/a/62613202/12014259