Is there a Python equivalent to PHP-FPM?

I'm starting a web project in Python and I'm looking for a process manager that offers reloading in the same manner as PHP-FPM.
I've built stuff with Python before and Paste seems similar to what I want, but not quite.
I need the ability to reload processes rather than restart them, so that long-running tasks can complete uninterrupted where necessary.

How about supervisor with uwsgi?
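For illustration, a minimal sketch of that pairing might look like the following (the app module, paths, and socket names are hypothetical). uWSGI's master process can gracefully reload its workers when a watched file is touched, which is the closest analogue to PHP-FPM's reload: in-flight requests on old workers are allowed to finish while new workers pick up fresh code.

; uwsgi.ini (hypothetical module "myapp")
[uwsgi]
module = myapp:application        ; your WSGI callable
master = true                     ; master process manages the worker lifecycle
processes = 4
socket = /tmp/myapp.sock
touch-reload = /tmp/reload-myapp  ; touch this file to trigger a graceful reload

; supervisord entry that keeps the uWSGI master running
[program:myapp]
command = uwsgi --ini /opt/myapp/uwsgi.ini
autorestart = true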

Related

Good way to run ONE indefinite process on Django Framework

I'm building a web app using the Django framework. There isn't much user interaction: only a few static links, a navbar, and a few plots which come from the app itself. The main part of the app is a Python script which reads data from an external source, does some processing on it, and then writes to my Django database. After writing to the database, a new page is created with information about the database entry. Note that there is no user interaction, so no starting or stopping the task; I want the task to run in the background 24/7.
So far I've looked into Celery and django-background-tasks. Celery seems like a bit much for my use case: I don't think I need a broker service, as I just want to run one task the user will never interact with, and I don't need multiple workers. django-background-tasks seems like a good lightweight alternative, but it doesn't appear to support indefinite tasks without having to refresh the task every once in a while (ideally I'd avoid this). Is there a tool better suited to this use case, or am I just completely misunderstanding Celery and django-background-tasks?
Update
Thanks for the comments, everyone! I looked up what @D Malan mentioned, and I think a tool like Supervisord might fit my use case: I can run a Python script in the background, separately from the Django application, and have that script interact with the Django app. The only problem I have now is getting the process to talk to the Django app. @deceze mentioned invoking a management command from the Python script. Would this involve creating a subprocess from the script that calls a custom Django management command to update my database? Or can I use django.core.management.call_command from a Python file separate from the Django application? If so, how would it know where to find the command?
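To make the second option concrete, here is a minimal sketch of a standalone script calling a custom management command; the project name "mysite" and command name "update_data" are hypothetical. The key is setting DJANGO_SETTINGS_MODULE and calling django.setup() before touching anything Django-related:

# run_updater.py, a standalone script living outside the Django process
import os
import django

# point at your settings module before any Django imports ("mysite" is hypothetical)
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "mysite.settings")
django.setup()

from django.core.management import call_command

# runs yourapp/management/commands/update_data.py (hypothetical name);
# call_command discovers commands through the INSTALLED_APPS of the loaded settings
call_command("update_data")

That answers the "how would it know where to get the command from" part: command discovery goes through the settings module you point the script at.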

How to automatically rerun a python program after it finishes? Supervisord?

I have a Python program that I would like to run constantly, gathering new data and updating. Essentially, I am gathering data from a bunch of domains. My processors take about a day and a half to run, and once they finish I'd like them to automatically start over again.
I don't want to use a while loop that simply restarts the processing without killing everything related first, because some of the packages supporting these processors (mainly PyV8) slowly accumulate memory, and I'm not a good enough programmer to dive into debugging a memory leak in a package that big. So I need all of the related processes to die completely and then come back to life.
I have heard that supervisord can do this type of work, but I don't like messing around with .conf files and would prefer to keep everything inside Python.
Summary: Is there a package I could put into a while loop that will kill all related processes, or some other way to create this behavior inside a Python script?
I don't see why you couldn't use supervisord. The configuration is really simple, very flexible, and not limited to Python programs.
For example, you can create file /etc/supervisor/conf.d/myprog.conf:
[program:myprog]
command=/opt/myprog/bin/myprog --opt1 --opt2
directory=/opt/myprog
user=myuser
autorestart=true    ; restart even after a clean exit; the default only restarts unexpected exits
Then reload supervisor's config:
$ sudo supervisorctl reload
and it's on. Isn't it simple enough?
More about supervisord configuration: http://supervisord.org/subprocess.html
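That said, if you'd still rather avoid .conf files and keep everything in Python, a rough sketch of the restart loop you describe could be as simple as running the processor as a child process (the script path here is hypothetical). When the child exits, the operating system reclaims all of its memory, including whatever PyV8 leaked:

# runner.py: restart the processor in a fresh process each cycle
import subprocess
import sys

while True:
    # blocks until the day-and-a-half run finishes; each child gets a clean address space
    result = subprocess.run([sys.executable, "/opt/myprog/processor.py"])
    if result.returncode != 0:
        print("processor exited with code", result.returncode, "- restarting anyway")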

Python start and manage external processes from Django

I'm in need of a way to execute external long-running processes from a web app written in Python and Django.
Right now I'm using Supervisord and its API. My problem with this solution is that it's very static: I need to build the commands from my app, rather than preconfiguring Supervisord with every possible command, because both the command and its arguments are dynamic.
I need to execute the external process, save a pid/identifier, and later be able to check whether it's still alive and running, and to stop it.
I've found https://github.com/mnaberez/supervisor_twiddler to add processes on the fly to supervisord. Maybe that's the best way to go?
Any other ideas how to best solve this problem?
I suggest you have a look at this post:
Processing long-running Django tasks using Celery + RabbitMQ + Supervisord + Monit
As the title says, there are a few additional components involved (mainly Celery and RabbitMQ), but these are good, proven technologies for this kind of requirement.
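If Celery feels too heavy, the bare-bones approach you describe (start, save a pid, check, stop) can be sketched with the standard library alone; the command below is just a placeholder. One caveat: pids can be recycled by the OS, so for anything serious store the start time or a supervisor-provided identifier alongside the pid.

# a minimal sketch of launching, probing, and stopping a dynamic command
import os
import signal
import subprocess

def start(cmd_args):
    proc = subprocess.Popen(cmd_args)  # cmd_args is built dynamically by your app
    # note: in a long-lived parent, keep the Popen object and call proc.poll()
    # periodically so finished children are reaped
    return proc.pid                    # persist this identifier, e.g. in your database

def is_alive(pid):
    try:
        os.kill(pid, 0)                # signal 0 probes the process without touching it
        return True
    except OSError:
        return False

def stop(pid):
    os.kill(pid, signal.SIGTERM)       # ask the process to shut down gracefully

pid = start(["sleep", "60"])           # placeholder command
print(pid, is_alive(pid))
stop(pid)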

Working implementation of daemon in Python

Does anyone know of a working and well-documented implementation of a daemon in Python? Please post a link here if you know of a project that fits these two requirements.
Three options I can think of:
1. Make a cron job that calls your script. Cron is a common name for a GNU/Linux daemon that periodically launches scripts according to a schedule you set. You add your script to a crontab, or place a symlink to it in a special directory, and the daemon handles the job of launching it in the background. You can read more on Wikipedia. There are a variety of cron daemons, but your GNU/Linux system should have one installed already.
2. Take a Pythonic approach: use a library so your script can daemonize itself. Yes, it will require a simple event loop (where your events are timer firings, possibly provided by the sleep function). Here is the one I recommend and use: A simple unix/linux daemon in Python.
3. Use the Python multiprocessing module. The nitty-gritty of forking a process and so on is hidden in this implementation. It's pretty neat.
I wouldn't recommend options 2 or 3 because you're in fact duplicating cron's functionality. The Linux system paradigm is to let multiple simple tools interact to solve your problems. Unless there are additional reasons to make a daemon (beyond triggering periodically), choose the other approach.
Also, if you daemonize with a loop and a crash happens, make sure you have logs that will help you debug, and devise a way for the script to start again. If the script is added as a cron job, it will simply trigger again after the interval you set.
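For example, a crontab entry (edited with crontab -e) that runs a script every five minutes and keeps a log might look like this; the paths are hypothetical:

*/5 * * * * /usr/bin/python3 /home/me/myscript.py >> /home/me/myscript.log 2>&1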
If you just want to run a daemon, consider Supervisor, a daemon that itself controls and manages daemons.
If you want to look at the nitty-gritty, you can check out Supervisor's launch script or some of the responses to this lazyweb request.
Check this link for a double-fork daemon: http://code.activestate.com/recipes/278731-creating-a-daemon-the-python-way/
The code is readable and well documented. For detailed information on Unix daemons, take a look at chapter 13 of W. Richard Stevens's book 'Advanced Programming in the UNIX Environment'.
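For reference, the core of that double-fork recipe condenses to something like this sketch (error handling and pidfile management omitted):

import os
import sys

def daemonize():
    if os.fork() > 0:
        sys.exit(0)        # first fork: parent returns control to the shell
    os.setsid()            # start a new session, detaching from the terminal
    if os.fork() > 0:
        sys.exit(0)        # second fork: can never reacquire a controlling terminal
    os.chdir("/")          # don't hold any mount point open
    os.umask(0)
    devnull = os.open(os.devnull, os.O_RDWR)
    for fd in (0, 1, 2):   # redirect stdin, stdout, stderr to /dev/null
        os.dup2(devnull, fd)

daemonize()
# ... the daemon's main loop runs here ...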

what's a good module for writing an http web service interface for a daemon?

To give a little background, I'm writing (or am going to write) a daemon in Python for scheduling tasks to run at user-specified dates. The scheduler daemon also needs to have a JSON-based HTTP web service interface (buzzword mania, I know) for adding tasks to the queue and monitoring the scheduler's status. The interface needs to receive requests while the daemon is running, so they either need to run in a separate thread or cooperatively multitask somehow. Ideally the web service interface should run in the same process as the daemon, too.
I could think of a few ways to do it, but I'm wondering if there's some obvious module out there that's specifically tailored for this kind of thing. Any suggestions about what to use, or about the project in general are quite welcome. Thanks! :)
Check out the BaseHTTPServer module, a basic HTTP server bundled with Python.
http://docs.python.org/library/basehttpserver.html
You can spin up a second thread and have it serve your requests for you very easily (probably < 30 lines of code). And it all runs in the same process and Python interpreter space, so it can access all your objects, etc.
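A sketch of that idea using http.server, the Python 3 successor to BaseHTTPServer (the status payload and port are invented for illustration):

import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

STATUS = {"queued": 0, "running": 0}   # shared with the scheduler in real code

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = json.dumps(STATUS).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

server = HTTPServer(("127.0.0.1", 8080), Handler)
# daemon=True so the interface thread exits together with the scheduler process
threading.Thread(target=server.serve_forever, daemon=True).start()
# ... the scheduler's own loop continues in the main thread ...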
I'm not sure I understand your question properly, but take a look at Twisted
I believe almost any Python web framework would work here.
You could pick one like CherryPy, which is small enough to integrate into your system; CherryPy also includes a pure-Python WSGI server for production.
Its performance may not be as good as Apache's, but it's already very stable.
Don't reinvent the wheel!
Run jobs via a cron script, and create a separate web interface using, for example, Django or Tornado.
Connect them via a database. Even SQLite will do the job if you don't need to scale to more machines.
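As a sketch of the database hand-off (the table layout and path are invented for illustration), the cron-side script would simply claim pending jobs that the web interface inserted:

# worker.py, run from cron; processes jobs the web interface wrote to SQLite
import sqlite3

conn = sqlite3.connect("/var/lib/myapp/jobs.db")   # hypothetical location
conn.execute("CREATE TABLE IF NOT EXISTS jobs "
             "(id INTEGER PRIMARY KEY, payload TEXT, done INTEGER DEFAULT 0)")
for job_id, payload in conn.execute(
        "SELECT id, payload FROM jobs WHERE done = 0").fetchall():
    print("processing", payload)                   # real work goes here
    conn.execute("UPDATE jobs SET done = 1 WHERE id = ?", (job_id,))
conn.commit()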
