Documenting scripts that are not part of modules in Sphinx - Python

I looked at several packages (sphinx-gallery, autoprogram, ...), but found nothing on how to easily use a docstring from a Python script for documentation. Essentially, I want to run autodoc on a specific file.
Does anybody know of a way to automatically generate Sphinx documentation from Python scripts?
For example, I have a docstring at the beginning of the script and maybe some functions with docstrings in it, and I just want to auto-generate documentation the way I can with the .. automodule:: directive, but unfortunately that won't work for relative paths / scripts.
EDIT:
The scripts I want to generate documentation from are not CLI scripts; they are just Python scripts that are generally called by a cron job. So unfortunately autoprogram won't help here, as far as I can see.
EDIT2:
Okay, things became a little clearer after re-reading the documentation and experimenting. What I want to do is automatically take the docstring of a Python file and put it into the documentation without executing the whole file (because for various reasons I can't, or don't want to, hide everything behind a main routine). I got autodoc to document a specific file (a misconfiguration was the reason it didn't work before), but as stated in its documentation, it executes the file. That's my real problem here. I'd be happy if someone has a solution for this, but I would totally understand if it is not possible without major effort.

What I want to do is automatically take the docstring of a Python file and put it into the documentation without executing the whole file.
You cannot do that. To get the docstring from a module, the module needs to be imported. Importing the module executes the whole file.
If you don't want code to be executed upon a "simple import", you can use an if __name__ == '__main__': block or setuptools entry_points based automatic script generation.
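A minimal sketch of the first suggestion (the script and function names here are placeholders):

# myscript.py
"""Module docstring that Sphinx's autodoc will pick up on import."""

def do_work():
    """Do the actual work."""
    ...

if __name__ == '__main__':
    # only executed when the file is run directly, not when autodoc imports it
    do_work()

The entry_points variant goes a step further: you keep all runnable code in a function like do_work and let setuptools generate the script, roughly entry_points={'console_scripts': ['myscript = mypackage.myscript:do_work']} in setup.py, so importing the module executes nothing at all.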


How to make VSCode detect / auto-reload modules after editing them?

I've seen a few questions asking this, but none of the solutions worked for me.
I am developing a few functions/classes in different modules and have a main.py script that calls everything.
The problem is, when I make a change to a function in another module, e.g. module1.py, VSCode does not pick up the change when I call the function from main.py afterwards; it still uses the older version.
I can get around this by doing something like:
from importlib import reload
reload(module1)
but this gets old real quick especially when I'm importing specific functions or classes from a module.
Simply re-running the imports at the top of my main.py doesn't actually do anything; I only get the changes if I kill the shell and reopen it from the beginning, which is not ideal when I am incrementally developing something.
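Concretely, the workaround I end up with looks something like this (module1 and get_data are just placeholder names):

import importlib
import module1                     # the module I'm editing

importlib.reload(module1)          # pick up the edited source
from module1 import get_data       # re-bind any names that were imported directly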
I've read on a few questions that I could include this:
"files.useExperimentalFileWatcher" : true
into my settings.json, but it does not seem to be a known configuration setting in my version, 1.45.1.
This is something Spyder handles by default, and it makes it very easy to code incrementally when calling functions and classes from multiple modules in the package you are developing.
How can I achieve this in VSCode? To be clear, I don't want to use the IPython autoreload magic command.
Much appreciated
FYI, here are the other questions I saw but did not get a working solution from, amongst others with similar questions/answers:
link1
link2
There is no support for this in VS Code as Python's reload mechanism is not reliable enough to use outside of the REPL, and even then you should be careful. It isn't a perfect solution and can lead to stale code lying about which can easily trip you up (and I know this because I wrote importlib.reload() 😁).

Is there any tool that shows the order in which files get run in a project?

I am reading the source code of a project that contains several sub-folders and files. I was wondering if there is any tool that shows the order in which the files and/or functions get run in a project. The project is in Python.
I think you're asking for a list of the modules that get imported. To do that, use the python -v option for verbose output.
Print a message each time a module is initialized, showing the place (filename or built-in module) from which it is loaded. When given twice (-vv), print a message for each file that is checked for when searching for a module. Also provides information on module cleanup at exit.
That will give you a list of all the modules that get imported in the order they are imported.
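For example, if the project's entry point is main.py (just a placeholder name here), you can run

python -v main.py 2> imports.log

and read the import trace from imports.log, since the -v messages are written to stderr.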
Have you tried pycallgraph?
Python module that creates call graph visualizations for Python applications.
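If I remember its API correctly, a minimal sketch looks roughly like this (assuming your project exposes a main() entry point and Graphviz is installed; myproject is a placeholder):

from pycallgraph import PyCallGraph
from pycallgraph.output import GraphvizOutput

from myproject import main   # placeholder for your project's entry point

with PyCallGraph(output=GraphvizOutput()):
    main()   # everything called in here ends up in the rendered call graph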
Additionally you can try snakefood:
Generate dependency graphs from Python code.

Python Code Completion for C++ Libraries (Using Pycharm, VTK)

I'm trying to get back into coding for some math/physics experiments and found VTK to be a powerful tool usable from Python. So I installed Python(x,y) and PyCharm Community Edition. But I cannot get the code completion for VTK to work. I know this question has been posted quite a lot of times, but I couldn't find any concrete answer.
Here's what I know so far:
In order for code completion to work, PyCharm constructs skeletons (basically Python files with empty classes/methods that match the C++ API, which can then be used like any other Python file for code completion).
If I locate these files they don't appear to be complete and look something like this:
If this is indeed the skeleton (the file is called vtkRenderingPython.py) then shouldn't there be empty function declarations?
The result is that I get code completion for the classnames, but not the functions. For a library this huge that's rather annoying. Is there an easy way to get this working, or is this just a limitation I have to live with? Is there maybe a way to get complete Skeletons and replace the ones I have here? Am I missing the point entirely?
After another couple of hours I tried my luck with the PyDev extension for Eclipse. I didn't think that would work, but to my surprise it did! No settings necessary, it just worked out of the box.
The only drawback is that inherited methods are not shown in code completion. The base class is shown in the documentation window, though, so you can get at the available functions by creating a temporary object of the base class and scrolling through the code completion there.

Idiom for modules in packages that are not directly executed

Is there an idiom or suggested style for modules that are never useful when directly executed, but simply serve as components of a larger package, e.g. those containing definitions, etc.?
For example, is it customary to omit #!/usr/bin/env python, add a comment, report a message to the user or execute other code (e.g. using a check of whether or not __name__ is '__main__'), or simply do nothing special?
Most of the Python code I write consists of modules that generally don't get called directly as scripts. Sometimes, when I'm working on a small project and don't want to set up a more complex testing system, I'll put my tests for the module below if __name__ == '__main__':; that way I can quickly test the module just by calling python modulename.py (this sometimes does not play nicely with relative imports and such, but it works well enough for small projects). Whether or not I do this, I drop the shebang and don't give execute permission to the file, because I don't like to make modules executable unless they're meant to be run as scripts.
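A rough sketch of that pattern (module and function names are placeholders):

# mymodule.py
def add(a, b):
    return a + b

if __name__ == '__main__':
    # quick ad-hoc tests, only run via `python mymodule.py`, never on import
    assert add(1, 2) == 3
    print('all tests passed')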

Python File Structure on GitHub

I've been looking around at some open source Python projects, and I'm seeing a lot of files and patterns that I'm not familiar with.
First of all, a lot of projects just have a file called setup.py, which usually contains one function call:
setup(blah, blah, blah)
Second, a lot contain a file that is simply called __init__.py and contains next to no information.
Third, some .py files contain a statement similar to this:
if __name__ == "__main__"
Finally, I'm wondering if there are any "best practices" for dividing Python files up in a git repository. With Java, the idea of file division comes pretty naturally because of the class structure. With Python, many scripts have no classes at all, and sometimes a program will have OOP aspects, but a class by class division does not make that much sense. Is it just "whatever makes the code the most readable," or are there some guidelines somewhere about this?
The setup.py is part of Python's module distribution using the distribution utilities. It allows for easy installation of the Python module and is useful when, well, you want to distribute your project as a whole Python module.
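A minimal sketch of what such a file might contain (the project name and package here are placeholders; newer projects typically use setuptools rather than plain distutils):

from setuptools import setup

setup(
    name='myproject',        # placeholder project name
    version='0.1',
    packages=['myproject'],  # packages to install
)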
The __init__.py is used for Python’s package system. An empty file is usually enough to make Python recognize the directory it is in as a package, but you can also define different things in it.
Finally, the __name__ == '__main__' check is there to ensure that the current script is being run directly (e.g. from the command line) and is not just imported into some other script. During the execution of a Python script, only a single module's __name__ attribute will be equal to '__main__'. See also my answer here or the more general question on that topic.
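To illustrate (mymodule.py and main.py are placeholder names):

# mymodule.py
print('in mymodule, __name__ ==', __name__)

# main.py
import mymodule                          # prints: in mymodule, __name__ == mymodule
print('in main, __name__ ==', __name__)  # prints: in main, __name__ == __main__ when run as `python main.py`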
The setup.py is part of the distutils setup process. You'll want to have one of those if you're distributing a module instead of just a basic script (though even then it's a good idea to have one so you can easily expand into a module later).
The __init__.py is part of the Python package import process:

Files named __init__.py are used to mark directories on disk as Python package directories. If you have the files mydir/spam/__init__.py and mydir/spam/module.py, and mydir is on your path, you can import the code in module.py as

import spam.module

or

from spam import module

If you remove the __init__.py file, Python will no longer look for submodules inside that directory, so attempts to import the module will fail.
if __name == "__main__" is a way to indicate code that would be executed if the file was run directly instead of imported.
To answer how to lay out your code, the distutils documentation has a good guide on this.
In addition to @poke's answer, see this related question on what the directory structure of a Python project should be. Here is another useful tutorial on how to make your project easily runnable.
