How to get Read the Docs to generate py-modindex.html?

I'm trying to get Read the Docs to generate the py-modindex.html file. Research into this question led me to the following setup:
setup.py in the project directory has the following contents, which were the minimum needed to get pytest to work and haven't been changed since I got that part of my project working:
import setuptools
setuptools.setup(
    name='polygons',
    packages=setuptools.find_packages(),
)
docs/requirements.txt contains a single line:
sphinx-autodoc-annotation
The Read the Docs repository URL points to my GitHub repository.
The RtD setting for "Install your project inside a virtualenv using setup.py install" is checked.
The RtD setting for "Requirements file" points to docs/requirements.txt.
The "Module Index" link gets included in index.html, but the py-modindex.html file is missing.
My understanding is that with the virtualenv setting above, RtD will use the setup.py file to install the project so that Sphinx can read the documentation found in the Python docstrings. I'm using function annotations and would like sphinx-autodoc-annotation to make use of those when creating the built docs files. All of this works splendidly on my local machine when I run make html while in the docs folder. Now I'm trying to get it to work on Read the Docs.
Note: This is an exercise I'm going through to understand everything before I apply it to my real project, which the polygons project is a placeholder for.
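For reference, here is a minimal docs/conf.py sketch for this kind of setup; the extension module name and the sys.path fallback are assumptions rather than anything Read the Docs requires:

# Hypothetical docs/conf.py sketch for the polygons example. Assumes the
# package is importable in the build environment (Read the Docs installs it
# via setup.py when the virtualenv option is checked); the sys.path line is
# only a local fallback for running make html without installing.
import os
import sys
sys.path.insert(0, os.path.abspath('..'))

project = 'polygons'
master_doc = 'index'
extensions = [
    'sphinx.ext.autodoc',          # pulls documentation out of docstrings
    'sphinx_autodoc_annotation',   # assumed module name for sphinx-autodoc-annotation
]

Note that Sphinx only emits py-modindex.html when at least one module is actually documented (for example via an automodule directive), so the toctree pages need to pull in the module docstrings for that file to appear.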

Related

Python edit JS code pointed at by (local) Package

Edited based on comments:
I am trying to edit JavaScript files within one project that are called by a package in Python, but I am unable to determine the steps to do so. I am looking to add some functionality to the itkwidgets package by updating the annotations returned by some function calls within the js files.
I have forked itkwidgets from git and installed it locally using python -m pip install -e . (editable mode). I can import this package in my notebook and have confirmed that it is loaded and working as expected.
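For what it's worth, a quick sanity-check sketch (not part of the original workflow) to confirm the notebook is importing the editable checkout rather than a previously installed copy:

# Sanity check: the reported path should point into the forked checkout,
# not into site-packages.
import itkwidgets
print(itkwidgets.__file__)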
My expectation is that I should be able to point the package (itkwidgets) towards a local version of the js project (itk-vtk-viewer) so that any edits made can be observed in the notebook.
(Part I do not understand)
A viewer.js file specifies the location of the JS code that I want to edit:
import createViewer from 'itk-vtk-viewer/src/createViewer'
This viewer.js file is pointed to in an index.js file:
const { ViewerModel, ViewerView } = require('./viewer.js');
and this index.js file is pointed to within the package.json file.
My original thoughts were that I should be able to change the location pointed to in the viewer.js file to a local folder and it would point towards the new code. However, even if I change the line to import createViewer from '/<NON_EXISTENT_FOLDER>/src/createViewer', the code still runs as before. If I point towards the local version, I would expect some added console.log's to be printed, but again this does not occur.
I have tried reinstalling the package using pip following the change but that did not help either.
I am at a loss as I do not really understand what is happening. Any help appreciated.

Python PDM + pre-commit using pylint: imports cannot be found

Background
I am wrangling some legacy code into shape.
I use PDM to manage dependencies, which places all dependent packages in a __pypackages__ folder directly under the repo root level. PDM also uses the relatively new pyproject.toml package config file.
I am trying to adopt pre-commit Git hooks so that I can have automated checks for formatting and style before trying to commit, merge, and/or create PRs.
I am asking pre-commit to use only a few Python tools for now: pylint and black.
Issue
Most of that toolset works great together. However, pylint cannot find any of the modules that are stored in the __pypackages__ folder. Most of what I have read suggests that I alter my $PYTHONPATH to find the modules.
This solution seems very outdated. But also, I am not sure how I can do this in a robust way across the team. I can alter the Git hooks, but the $PYTHONPATH may be different for each engineer, so this will only work for my machine.
I would like to be able to add something in the pyproject.toml file to have pylint find it. I am not sure what to write, though, so that it generically works across the whole team. Something like
[tools.pylint]
pypackages = "./__pypackages__"
Any ideas how I can do this?
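One direction to explore (a sketch, not a verified fix) is a pylint init-hook that puts __pypackages__ on sys.path; the Python below shows the path manipulation such a hook would perform. The "3.10" segment is an assumption and needs to match the interpreter version PDM used:

# Hypothetical body of a pylint init-hook (or a small helper it calls).
# Assumes pylint is invoked from the repo root, which is where pre-commit
# normally runs its hooks.
import sys
from pathlib import Path

pkg_dir = Path.cwd() / "__pypackages__" / "3.10" / "lib"
if pkg_dir.is_dir():
    sys.path.insert(0, str(pkg_dir))

Because the path is resolved relative to the working directory rather than an engineer-specific $PYTHONPATH, the same snippet should behave the same way on each team member's machine, but treat it as a starting point rather than a definitive answer.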
Details
I am not sure more details are needed, but here they are:
My actions:
> pre-commit run --all-files # The --all-files flag is just to allow me to test without a commit
Trim Trailing Whitespace.................................................Passed
Fix End of Files.........................................................Passed
Check Yaml...........................................(no files to check)Skipped
Check for added large files..............................................Passed
black....................................................................Passed
pylint...................................................................Failed
- hook id: pylint
- exit code: 30
************* Module testfile
testfile.py:18:0: E0401: Unable to import 'boto3' (import-error)
boto3 is in the __pypackages__ mentioned above. None of the modules can be imported, but I limited the output for clarity.
I can pdm run ... everything correctly and VS Code sees the modules fine. But pylint is not finding them because it cannot see the __pypackages__ folder.
You can get around this by updating the PYTHONPATH environment variable used by the extension, by creating a file named .env in your workspace (project folder) and adding the following entry:
PYTHONPATH=D:/commonScripts
Note: Relative paths are also supported.
Further info on .env files can be found here https://code.visualstudio.com/docs/python/environments#_environment-variable-definitions-file

reStructuredText: README.rst not working on PyPI

I have created a package on GitHub, django-joyride. After publishing it to PyPI, it is not showing the README.rst properly. I have checked my syntax on an online viewer and you can see it works fine. What could be the issue?
I had the same problem when uploading my Python module to PyPI.
Later I checked the README.rst for errors using rst-lint, which showed that my readme file was fine. You can also use the restructuredtext_lint package for Python to check the rst file for any errors or warnings (see the sketch after these points).
I found that the problem was not in the README file but in setup.py itself.
Follow the points below when writing the README and setup.py:
Do not write multi-line Python strings for the description, summary, or anything else that goes into the setup() arguments.
Do not use relative links in the README file (like ./path1/path2).
Make sure the rst syntax is all right using a checking tool like rst-lint.
If you have a Markdown file, you can convert it to reStructuredText easily using pandoc.
Also, do not use any new docutils syntax, since PyPI does not use the latest version the way GitHub does.
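Here is a small linting sketch using the restructuredtext_lint package mentioned above (pip install restructuredtext-lint); the attribute names on the returned errors are an assumption from its documentation:

# Lint README.rst locally before uploading to PyPI.
import restructuredtext_lint

errors = restructuredtext_lint.lint_file('README.rst')
for err in errors:
    print(err.line, err.message)
if not errors:
    print('README.rst looks clean')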
It could be that PyPI gets confused because you have both a README.md and a README.rst file. Try deleting the .md one; GitHub can handle README.rst fine.

setup.py's sdist, bdist and install behaving differently regarding data_files

I'm trying to distribute web assets along with a web app that I'm packaging, but I'm failing miserably. I don't understand why I get a different list of files installed or packaged when I run bdist, sdist, or install.
Project Layout
The project runs with python 3 on Arch. The results are the same with Py3 on Raspbian.
I've done a very trimmed down version to make things simpler, which I describe here.
The file layout is as follows:
data/index.html
MANIFEST.in
mylib.py
setup.py
The MANIFEST.in file is:
recursive-include data *
The setup.py is:
#!/usr/bin/env python
from setuptools import setup, find_packages
setup(name='mylib',
      version='0.1.2',
      url='http://www.example.org',
      author='Foo',
      packages=find_packages(),
      data_files=[('share/mylib', ['data/index.html'])],
      )
My goal is to install index.html in PREFIX/share/mylib/index.html.
Running setup.py
Now, bdist includes the file at the seemingly right location, while sdist and install just ignore it:
bdist
Using bdist, I have the following files in the resulting tar :
./usr/lib/python3.3/site-packages/mylib-0.1.2-py3.3.egg-info/SOURCES.txt
./usr/lib/python3.3/site-packages/mylib-0.1.2-py3.3.egg-info/top_level.txt
./usr/lib/python3.3/site-packages/mylib-0.1.2-py3.3.egg-info/dependency_links.txt
./usr/lib/python3.3/site-packages/mylib-0.1.2-py3.3.egg-info/PKG-INFO
./usr/share/mylib/index.html
This is exactly what I want to be installed, perfect. However, I really want sdist and install to work, since I want to distribute this thing on PyPI and be able to install from source checkouts.
sdist
When I untar the sdist file, everything seems ok and data is included:
...
mylib-0.1.2/data/
mylib-0.1.2/data/index.html
...
However, if I sudo python setup.py install --record=log.txt in the directory where it is untarred, the only file listed in the log is /usr/lib/python3.3/site-packages/mylib-0.1.2-py3.3.egg. There is no trace of data/index.html anywhere (not in /usr/local/share or /usr/share).
install
Same issue as sdist (I suppose this is expected). No trace of data/index.html anywhere (not in /usr/local/share or /usr/share).
I also tried to add a setup.cfg like this:
[install]
install-data=/usr/local/share/mylib/
install_data=/usr/local/share/mylib/
(I've added both install-data and install_data since the docs state to use the latter, while I saw other projects using the former.) No luck.
Epilogue
Now, I'm quite new to Python and its ecosystem, so I'm probably missing something obvious or misunderstanding how setuptools works. I've been reading the docs back and forth and reading Stack Overflow Q&A on data_files at great length, but haven't made any progress.
If someone could point me in the right direction, that would be great. A link to a simple project distributing assets would be a good read too; I just couldn't find one that gave me that "Aha!" moment.
Thanks for reading.
I don't know whether this helps, as I always include my data files relative to the Python packages they go with. In addition to the MANIFEST.in, you'd have a package_data key in setup.py:
setup(name='mylib',
      version='0.1.2',
      url='http://www.example.org',
      author='Foo',
      packages=find_packages(),
      # patterns must be a list and are relative to the package directory
      package_data={'package_name': ['data/*']},
      )
this would put the data inside the installed package itself (alongside the package's modules under site-packages) rather than under PREFIX/share.
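If the file ships inside the package via package_data, reading it at runtime no longer depends on where the package was installed. A usage sketch, assuming mylib is turned into a package (a directory with an __init__.py) that contains data/index.html:

# Read the bundled file from the installed package.
import pkgutil

html = pkgutil.get_data('mylib', 'data/index.html')  # returns bytes, or None if missing
if html is not None:
    print(html.decode('utf-8'))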

Exclude system paths from django_coverage

I'm running django_coverage over a project with the command test_coverage. It's working, but it's including code from /usr/local/lib/python2.6/dist-packages in the output and in the final calculation. I'm not interested in knowing about the coverage of those modules, only the test coverage for my project. I see in the django_coverage documentation on BitBucket that there is a COVERAGE_PATH_EXCLUDES, but that seems to apply only to subdirectories of the project and not absolute system paths. Also, I see that the default for COVERAGE_MODULE_EXCLUDES is to exclude any imports with "django" in them, but I'm still getting output for /usr/local/lib/python2.6/dist-packages/django.
Any thoughts on how to fix this?
Do you have 'django' listed in COVERAGE_PATH_EXCLUDES? I have a similar setup (Django 1.1.2, Python 2.6) and don't see output for any django packages in my test coverage results. Can you post what you are using for the excludes?
I'm not using django so I can't confirm this, but is it possible that you have modified the app's original settings file rather than including the settings in your own settings file, as it says in step 3 of the readme excerpt below:
Install as a Django app
1. Place the entire django_coverage app in your third-party apps directory.
2. Update your settings.INSTALLED_APPS to include django_coverage.
3. Include test coverage specific settings in your own settings file. See settings.py for more detail.
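For example, the coverage-specific settings that belong in your own settings file might look roughly like this; the setting names come from the question above, but the patterns are placeholder assumptions to adjust for your layout:

# Hypothetical excerpt for your project's settings.py. Tune the regexes so
# third-party code such as dist-packages is excluded from the report.
COVERAGE_MODULE_EXCLUDES = ['tests$', 'settings$', 'urls$', '__init__', 'django']
COVERAGE_PATH_EXCLUDES = [r'\.svn', r'dist-packages']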
