Is it possible to generate a single .pot file from Sphinx documentation? - python

I'm undertaking the task of i18n / l10n for the documentation of a largish project. The documentation is done with Sphinx, which has off-the-shelf basic support for i18n.
My problem is similar to that of this other question: namely the fact that a large chunk of the strings in each .pot file is the same, and I would like my translators not to re-enter the same translation over and over again. I would rather have a single template file.
My problem is not really merging the files (that is just a msgcat *.pot > all.pot away), but rather the fact that - for the domains to work when building the documentation in a particular language - I have to copy and rename all.pot back to the original file names. So my workaroundish way of working is:
Generate fileA.pot, fileB.pot
Merge the two into all.pot
cp all.pot fileA.pot + cp all.pot fileB.pot
Is there a cleaner way to do the same? gettext_compact brings me only half-way through my goal...

After more than 7 months, extensive research and a couple of attempted answers, it seems safe to say that - at least with the current version 1.1.3 - it is not possible to generate a single .pot file from Sphinx documentation.
The workaround described in the original question is also the most straightforward way to remove duplicate strings automatically when merging the different .pot files into a single one.
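For reference, that workaround is easy to script. Here is a minimal sketch in Python rather than the shell (untested; the _build/locale path is only an assumption - adjust it to wherever sphinx-build -b gettext writes your .pot files):

import shutil
import subprocess
from pathlib import Path

pot_dir = Path("_build/locale")  # assumption: output directory of `sphinx-build -b gettext`
pot_files = [p for p in sorted(pot_dir.glob("*.pot")) if p.name != "all.pot"]

# msgcat concatenates the templates and collapses duplicate msgids into a single entry.
subprocess.run(["msgcat", "-o", str(pot_dir / "all.pot")] + [str(p) for p in pot_files],
               check=True)

# Copy the merged template back under each original name, so the per-file
# domains still resolve when building the translated documentation.
for pot in pot_files:
    shutil.copyfile(pot_dir / "all.pot", pot)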

I see two possibilities here:
One is to hack Sphinx to not use domains at all… which I guess you won’t want to do for several reasons.
The other is: since you split by domains, the -M option of msggrep looks like it does what you need. Otherwise, the -N option can still be used to filter by source file(s); there is a sketch of that route at the end of this answer.
That is, if the obvious solution doesn't work:
generate fileA.pot, fileB.pot (also look at msguniq here)
merge them into all.pot and give that to translators
for x in file*.pot; do msgmerge -o ${x%t} all-translated.po $x; done
According to TFM, msgmerge only takes the (supposedly newer) translations from its first argument (the translated po file) that match the up-to-date source locations of the second argument (po file with old translations, or pot file = template with only empty msgstrings).
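If you prefer the msggrep route mentioned above, the splitting step can be done per source file with -N. A rough sketch (the file names and the mapping are invented for illustration, and the -N values must match the source locations Sphinx writes into the #: comments, which may carry a relative path prefix):

import subprocess

# hypothetical mapping: per-domain catalogue -> the source file its messages came from
domains = {
    "fileA.po": "fileA.rst",
    "fileB.po": "fileB.rst",
}

for po_name, source in domains.items():
    # msggrep -N keeps only the messages extracted from the given source file
    subprocess.run(["msggrep", "-N", source, "all-translated.po", "-o", po_name],
                   check=True)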

Add the following to your conf.py:
gettext_compact = "docs"
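As far as I can tell, the string form of gettext_compact is only understood by recent Sphinx releases (3.3 or later, if I remember the changelog correctly), so it will not help on the 1.1.x discussed in the accepted answer. A minimal conf.py sketch of the relevant i18n settings:

# conf.py (sketch)
locale_dirs = ["locale/"]     # where the translated .po/.mo catalogues live
gettext_compact = "docs"      # string value: put all document messages into one "docs" catalogue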

Related

Get method names and lines from git commit - python

I am wondering if there is a way to get the list of method names that are modified, along with the modified lines and file paths, from a git commit, or if there is a better solution for Python files.
I have been going through a few answers on Stack Overflow where the suggestion was to edit the gitconfig file and .gitattributes, but I couldn't get the desired output.
Git has no notion of the semantics of the files it tracks. It can just give you the lines added or removed from a file or the bytes changed in binary files.
How these map to your methods is beyond Git's understanding. However, there might be tools and/or scripts that try to heuristically determine which methods are affected by a (range of) commits.
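That said, for Python files a rough heuristic is not hard to script yourself: take the changed line numbers from git show -U0 and map them onto the function definitions found by the ast module. A sketch, not a polished tool (the commit is a placeholder, and end_lineno needs Python 3.8+):

import ast
import re
import subprocess

def changed_lines(commit):
    # Map each .py path touched by `commit` to the set of line numbers added or changed.
    diff = subprocess.run(["git", "show", "--unified=0", "--format=", commit],
                          capture_output=True, text=True, check=True).stdout
    result, path = {}, None
    for line in diff.splitlines():
        if line.startswith("+++ "):
            path = line[6:] if line.startswith("+++ b/") else None
        elif line.startswith("@@") and path and path.endswith(".py"):
            # hunk header: @@ -old_start,old_count +new_start,new_count @@
            m = re.match(r"@@ -\S+ \+(\d+)(?:,(\d+))? @@", line)
            if m:
                start, count = int(m.group(1)), int(m.group(2) or "1")
                result.setdefault(path, set()).update(range(start, start + count))
    return result

def methods_touching(path, lines, commit):
    # Parse the post-commit version of `path` and list defs overlapping `lines`.
    source = subprocess.run(["git", "show", f"{commit}:{path}"],
                            capture_output=True, text=True, check=True).stdout
    names = []
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef)):
            if lines & set(range(node.lineno, (node.end_lineno or node.lineno) + 1)):
                names.append(node.name)
    return names

commit = "HEAD"  # placeholder commit-ish
for path, lines in changed_lines(commit).items():
    print(path, methods_touching(path, lines, commit))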

How to use doxygen in a project containing both C++ and Python (multi-programming language project)

I am creating documentation for my project, which has code written in both C++ and Python. I can generate documentation with doxygen for my C++ code, but I can find no way to do the same for my Python code once the former is done. And if I need both doxygen documents to appear under the same index.html, i.e. to merge them into one uniform doxygen output, how can I do this? What is the convention that everyone uses?
First off, thank you albert and shawnhcorey for your quick responses.
I tried what you suggested, and made the following modifications to parse documentation comments from files written in both languages:
In Doxyfile
INPUT = include/my_package/ scripts/my_package/
The two directories are separated by a space.
Then,
FILE_PATTERNS = *.h *.py
The two wildcard patterns are also separated by a space.
Then go back to the directory where you placed your Doxyfile and run
doxygen
And off you go!

Can I implement my own Python-based git merge strategy?

I developed a Python library for merging large numbers of XML files in a very specific way. These XML files are split up and altered by multiple users in my group and it would be much easier to put everything into a Git repo and have git-merge manage everything via my Python code.
It seems that implementing my code for git-mergetool is possible, but I would have to write my own code to manage the conflict returns for the internal git-merge (i.e. parse the >>>>>>> <<<<<<< ======= identifiers), which would be more time consuming.
So, is there a way to have Git's merge command automatically use my Python code instead of its internal git-merge?
You can implement a custom merge driver that's used for certain filetypes instead of Git's default merge driver.
The relevant documentation is in gitattributes(5).
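In case it helps, the driver protocol is small: Git hands the driver the ancestor, current and other versions as temporary files (%O, %A, %B), expects the merged result to be written back into the %A file, and treats a non-zero exit status as a conflict. A hedged sketch (the driver name, file pattern and script name are made up, and the XML-merging logic itself is left as a placeholder for your library):

# xml_merge_driver.py -- hypothetical driver; wiring it up would look like:
#   .gitattributes:  *.xml merge=xmlmerge
#   .git/config:     [merge "xmlmerge"]
#                        name = three-way XML merge via my library
#                        driver = python xml_merge_driver.py %O %A %B
import sys

def merge_xml(ancestor_path, current_path, other_path):
    # Placeholder: this is where the questioner's own XML-merging code would go.
    raise NotImplementedError("plug the XML merge logic in here")

if __name__ == "__main__":
    ancestor, current, other = sys.argv[1:4]   # the %O, %A and %B temp files
    try:
        merged = merge_xml(ancestor, current, other)
    except Exception:
        sys.exit(1)            # non-zero exit -> Git records a conflict for this file
    with open(current, "w") as out:
        out.write(merged)      # Git expects the merge result in the %A file
    sys.exit(0)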
Some related StackOverflow questions:
Git - how to force merge conflict and manual merge on selected file
How do I tell git to always select my local version for conflicted merges on a specific file?

Can I use doxygen to document a command-line program?

I contribute to a large code project that uses Doxygen to document a series of C libraries. We are also starting to use doxygen with doxypy for associated python modules.
Is there an easy way to document command-line programs (in Python or C) and their command-line options (automatically) using doxygen?
In order to generate man pages, you need to set the GENERATE_MAN tag to YES in your Doxyfile.
By default, a sub-folder named man is created within the directory set by OUTPUT_DIRECTORY to contain the generated pages.
By doing that, doxygen will render all the markup you added to the source code as a man page (one page for each translation unit).
At this point, you might want to exclude the parts you want to ignore (I assume you are interested in showing only how to call the main program) using the EXCLUDE* options.
I advise you to maintain two different Doxyfiles: one for internal use (complete javadoc-like documentation), the other for producing the program's man page and the like.
Of course, you will not get the expected result at the first try and you might need to play with doxygen markup a bit.

Python: How to create diff/patch files between 2 revisions for a single URL?

I would like to create a diff (patch) file between two revisions for a single SVN URL, including lines of unified context.
Basically I need to provide a Python method that takes the following:
URL to the SVN repository
number for first (before) revision
number for second (after) revision
The output I require is as follows:
number of lines of code in head revision
number of changed files
actual diff files.
How can I do this using Python? I see many similar questions here but none specifically on how to achieve this in Python. Can anyone suggest some libraries/code to help accomplish this?
I noticed that pysvn offers a diff method. I think this is exactly what you need.
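If I read the pysvn documentation right, Client.diff takes a scratch directory plus the URL and two revisions and returns the unified diff as a string; counting the changed files can then be done from the Index: headers. A sketch (untested; the URL and revision numbers are placeholders):

import tempfile
import pysvn

def svn_diff(url, rev_before, rev_after):
    client = pysvn.Client()
    # pysvn needs a scratch directory for its temporary files
    return client.diff(
        tempfile.mkdtemp(),
        url,
        revision1=pysvn.Revision(pysvn.opt_revision_kind.number, rev_before),
        revision2=pysvn.Revision(pysvn.opt_revision_kind.number, rev_after),
    )

patch = svn_diff("http://svn.example.com/repo/trunk", 100, 105)  # placeholders
changed = [line for line in patch.splitlines() if line.startswith("Index: ")]
print(len(changed), "file(s) changed")
print(patch)

Note that the line count of the head revision is not covered by diff; for that you would need something like pysvn's cat on individual files, or a checkout/export of the head revision.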
