Hosting built documentation from a pull request on GitHub - python

I work on an open source project.
In order to facilitate the review of Sphinx documentation changes in our Python source code, we'd love it if we could somehow get the documentation generated and hosted somewhere for each pull request, like we already do with Coveralls for our code coverage.
Pushing new commits would update the generated doc for that pull request. We'll soon be adding doc generation to our Travis build to catch Sphinx errors, but a final visual review would still need to be done locally by pulling the branch and generating the docs.
Is there any GitHub app that offers to host a webpage generated on a pull request?

2018: You can make your own GitHub repository serve pages with GitHub Pages.
If you generate your documentation in the gh-pages branch, or in the doc/ subfolder of your main branch, you can keep both your codebase and your doc in the same repo.
See for instance lneuhaus/pyrpl issue 85, which illustrates how to automate Sphinx for Python documentation:
cd doc/sphinx
sphinx-apidoc -f -o source/ ../../pyrpl/
make html
You can automate that using Syntaf/travis-sphinx:
A standalone script for automated building and deploying of sphinx docs via travis-ci
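For concreteness, a minimal .travis.yml along these lines might look like the following; treat the exact flag names as assumptions to check against the travis-sphinx README, and GH_TOKEN as an encrypted repository variable you would set up yourself:

```yaml
language: python
python:
  - "3.6"
install:
  - pip install sphinx travis-sphinx
script:
  # build the docs during the CI run so Sphinx errors fail the build
  - travis-sphinx build --source=doc/sphinx/source
after_success:
  # push the built HTML to the gh-pages branch, authenticated via GH_TOKEN
  - travis-sphinx deploy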
Update Q4 2022: the OP fronsacqc adds in the comments:
We ended up pushing the documentation generated by the build on an S3 folder tagged with the static hosting flag, so now every build's documentation is hosted on S3 for a couple of days.

Related

How can I build an updater for a flask application?

I'm building an internal only flask-based application for work. I would like the program to update automatically from the master branch of my git repository. The application is running on a Linux based (Ubuntu server) VM. Is there a way to get the flask app to automatically pull changes and update from the master branch?
Edit: I don't want a python module updater from pip/git (most common search result)
Update: I found a module called micropython-ota-updator; the problem with it is that it seems to update only on reboot. The software I'm building is meant to stay up continuously. My initial idea was to check for updates at an interval.
For anyone else having this issue:
I decided on dynamically generating GitHub links and building a function for cloning the repo. There were also some configuration files that get saved outside of the repo before cloning. Once the repo is cloned, the configuration files are moved back into their proper places in the new repo. If anyone else wants more information about this, let me know. I'm glad to share my discoveries!
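The clone-and-restore flow described above can be sketched roughly like this; the update_from_git helper and the config.ini name are illustrative, it assumes the git CLI is on PATH, and the final swap is not atomic (a production version would move the old tree aside so it can roll back):

```python
import shutil
import subprocess
import tempfile
from pathlib import Path

def update_from_git(repo_url, app_dir, preserve=("config.ini",)):
    """Replace app_dir with a fresh clone of repo_url, carrying the
    listed config files (which live outside version control) across."""
    app_dir = Path(app_dir)
    # 1. Stash the config files before touching anything.
    saved = {name: (app_dir / name).read_bytes()
             for name in preserve if (app_dir / name).exists()}
    # 2. Clone the fresh copy into a staging area.
    staging = Path(tempfile.mkdtemp())
    new_tree = staging / "repo"
    subprocess.run(["git", "clone", "--depth", "1", repo_url, str(new_tree)],
                   check=True)
    # 3. Put the stashed config files back into the new tree.
    for name, data in saved.items():
        (new_tree / name).write_bytes(data)
    # 4. Swap the new tree into place and clean up the staging area.
    if app_dir.exists():
        shutil.rmtree(app_dir)
    shutil.move(str(new_tree), str(app_dir))
    shutil.rmtree(staging, ignore_errors=True)
```

A background thread (or a cron job hitting an internal endpoint) could call this at an interval, which addresses the "stay up, don't wait for reboot" requirement.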

git pull doesn't update files on pythonanywhere

I am following the Django Girls Tutorial, and I am on the HTML section - https://tutorial.djangogirls.org/en/html/ - where changes are committed to git and then transferred to PythonAnywhere using git pull. My git repo has been updated; however, after running git pull on PythonAnywhere I do not see the updated files. I reloaded my web app as well; no changes.
After git pull I see the file changes, insertions and deletions reported, but no updated files in the Files section.
Thank You for help,
Bogdan
I found this, if you're willing to give it a shot. It allows you to reload and push everything to your PythonAnywhere site: https://blog.pythonanywhere.com/87/
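If you want to script the reload step itself, PythonAnywhere exposes an HTTP API for reloading a web app; a rough sketch (verify the exact URL and token header against their API docs - both are assumptions here, and the helper name is made up):

```python
from urllib import request

API = "https://www.pythonanywhere.com/api/v0/user/{user}/webapps/{domain}/reload/"

def build_reload_request(user, domain, token):
    """Build (but don't send) the POST that asks PythonAnywhere to
    reload a web app, e.g. right after a git pull."""
    url = API.format(user=user, domain=domain)
    return request.Request(url, method="POST",
                           headers={"Authorization": f"Token {token}"})

# To actually trigger the reload you would do something like:
#   request.urlopen(build_reload_request("bogdan", "bogdan.pythonanywhere.com", "<api-token>"))
```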

How to deploy a blog made using Sphinx

I made a blog using Ablog, a plugin for Sphinx that lets you build a complete blog with analytics, disqus integration etc.
I want to deploy this app; I tried Heroku but it didn't work. Has anybody deployed a blog using Ablog, or Sphinx?
Thanks
http://ablog.readthedocs.org/
http://sphinx-doc.org/
It looks like ablog build generates a static site, so deploying it should be as easy as copying the generated files into a directory that a web server is configured to serve. I have a similar setup using Pelican for my blog, which I deploy with GitHub Pages: create a git repo with the correct name (username.github.io), add your generated files, and push; after that your blog will be available at username.github.io.
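A rough sketch of that publish step, driving git from Python; the publish helper is hypothetical, it assumes the git CLI is available, and classic user Pages sites are served from the master branch (newer setups may use main):

```python
import subprocess

def publish(build_dir, remote_url, message="Publish site"):
    """Turn the built HTML directory into a git commit and force-push it
    to the Pages repo (e.g. username.github.io). Git metadata is
    re-initialised each time, so no history is kept -- acceptable for
    generated output."""
    def run(*cmd):
        subprocess.run(cmd, cwd=build_dir, check=True)
    run("git", "init", "-q")
    run("git", "add", "-A")
    run("git", "-c", "user.email=ci@example.invalid", "-c", "user.name=ci",
        "commit", "-qm", message)
    # Pages user sites have historically served from master
    run("git", "push", "--force", remote_url, "HEAD:master")
```

Run `ablog build`, then `publish("_website", "git@github.com:username/username.github.io.git")` (the output directory name depends on your Ablog configuration).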

build/deploy script in product repo or in external repo?

I am working on a web project written in python. Here are some facts:
Project hosted on GitHub
fabric (a python library for build and deploy automation) script (fabfile.py) for automatic build and deploy
Jenkins for build automation
My question is, where to put the fabfile.py conventionally.
I prefer to put the fabfile.py in the root of the project repo so that I can config jenkins job to grab the source code from git and simply run fab build to get the compiled package.
Someone insists that the fabfile.py should NOT be part of the project repo and should be kept in an external repo instead. In that case you need to configure Jenkins to clone the fabfile repo, have it clone the product repo, and then run the packaging.
I know this is probably a matter of personal taste, but are there any benefits to putting fabfile.py in a separate repo rather than keeping it with the product code?
Any suggestions are appreciated.
In my opinion, I can't see any benefits besides maybe preventing a junior dev from accidentally deploying some unwanted code.
On the other hand, it's nice to have everything in one repo so you don't have to maintain multiple repositories. In past experiences, we always included deployment scripts in the root of the project.

Practices while releasing the python/ruby/script based web applications on production

I am purely a windows programmer and spend all my time hacking VC++.
Recently I have been heading several web-based applications, building applications myself with Python (the Pylons framework) and doing projects in Rails. All the web projects are hosted on Ubuntu Linux.
The release procedures and checklists we followed for building and releasing VC++ Windows applications are of hardly any use when it comes to script-based languages.
So we don't build any binaries now. I used to copy ASP/PHP files into the IIS folder over FTP when using open source CMS applications.
So FTP is one way to get the files onto the web server. Now we no longer bother copying files via FTP; instead we use an SVN checkout and simply run svn update to get the latest copy.
Are svn checkout and svn update the right methods to get the latest build files onto the server? Are there any downsides to using svn update? Is there a better method to release scripts/web apps to the production server?
PS: I have used an SSH server to some extent on the Linux platform.
I would create a branch in SVN for every release of the web application; when the release is ready, I would check it out on the server and either set it running or move it into the place of the old version.
Are svn checkout and svn update the right methods to get the latest build files onto the server?
Very, very good methods. You know what you got. You can go backwards at any time.
Are there any downsides to using svn update? None.
Any better method to release the script/web based scripts into the production server?
What we do.
We do not run out of the SVN checkout directories. The SVN checkout directory is "raw" source sitting on the server.
We use Python's setup.py install to create the application in an /opt/app/app-x.y directory tree. Each tagged SVN release likewise gets its own directory in the final installation.
Ruby has gems and other installation tools that are probably similar to Python's.
Our web site's Apache and mod_wsgi configurations refer to a specific /opt/app/app-x.y version. We can then stage a version, do testing, do things like migrate data from production to the next release, and generally get ready.
Then we adjust our Apache and mod_wsgi configuration to use the next version.
Previous versions are all in place. And left in place. We'll delete them some day when they confuse us.
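One common way to implement the "point Apache/mod_wsgi at a specific /opt/app/app-x.y" idea is a current symlink that gets swapped atomically; the web server is configured against base/current and never changes. A sketch, with illustrative names:

```python
import os
from pathlib import Path

def activate(base, version):
    """Atomically repoint base/current at base/app-<version>.
    Switching versions (or rolling back to one still on disk)
    is just re-running this function with another version."""
    base = Path(base)
    target = base / f"app-{version}"
    if not target.is_dir():
        raise FileNotFoundError(target)
    # Build the new link under a temporary name, then rename it into
    # place: rename() replaces the old symlink atomically on POSIX.
    tmp = base / "current.tmp"
    if tmp.is_symlink():
        tmp.unlink()
    tmp.symlink_to(target)
    os.replace(tmp, base / "current")
```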
The one downside of doing an svn update on your web root is that the .svn directories can potentially be made public, so be careful about permissions on the web server.
That said, there are far better ways to deploy an app built with dynamic languages. In the Rails world, Capistrano is a mature deployment tool, as is Vlad the Deployer. Capistrano can easily be used for non-Rails deployments as well.
There are many deployment strategies based on version control. You could go through the tutorials and get some ideas.
I'd also like to add that even though we do not "build" (compile) the project in dynamic languages, we do "build" (test/integration) them. A serious project would use a continuous integration server to validate the integrated "build" of project on every commit or integration, well before it gets to production.
One downside of doing an svn update: although you can go back in time, which revision do you go back to? You have to look it up. svn update pseudo-deployments work much more cleanly if you use tags - in that case you'd be doing an svn switch to a different tag, not an svn update on the same branch or the trunk.
You want to tag your software with a version number, something like 1.1.4, and then have a simple script to zip it up as application-1.1.4.zip and deploy it - then you have automated, repeatable releases and rollbacks, as well as greater visibility into what is changing between releases.
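That tag-and-zip step is a few lines with the standard library; the application name and version here are placeholders:

```python
import shutil
from pathlib import Path

def make_release(src_dir, name, version, out_dir="."):
    """Zip up a source tree as <name>-<version>.zip so every release is
    a single versioned artifact (rollback = redeploy an older zip)."""
    archive = Path(out_dir) / f"{name}-{version}"
    # shutil.make_archive appends the .zip extension itself and
    # returns the full path of the archive it wrote.
    return shutil.make_archive(str(archive), "zip", root_dir=src_dir)
```

For example, make_release("checkout/tags/1.1.4", "application", "1.1.4") produces application-1.1.4.zip ready to copy to the server.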
