Background job/task monitor for Django - python

I have a few management commands for my Django project that are run automatically by cron. Are there any Django packages that allow me to monitor the status of my background jobs?
Currently I have to trudge through my log files to find out whether everything went okay. I'm confident that writing a simple job/task monitor for Django wouldn't be hard at all, but if there are existing packages you know about, that would help a lot, as I wouldn't have to cobble something together myself.
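For the record, a minimal version of such a monitor really is small. Here is a rough sketch of the idea using stdlib sqlite3 in place of a Django model (all names here, such as `job_run`, `monitored`, and `DB_PATH`, are made up for illustration): a decorator records start time, finish time, and status for each run, so a status page only has to read one table instead of you grepping log files.

```python
import sqlite3
import traceback
from contextlib import closing
from datetime import datetime
from functools import wraps

DB_PATH = "job_monitor.db"  # hypothetical location for the status database


def _init_db():
    """Create the status table if it doesn't exist yet."""
    with closing(sqlite3.connect(DB_PATH)) as conn:
        conn.execute(
            """CREATE TABLE IF NOT EXISTS job_run (
                   id INTEGER PRIMARY KEY,
                   name TEXT,
                   started TEXT,
                   finished TEXT,
                   status TEXT,
                   message TEXT
               )"""
        )
        conn.commit()


def monitored(name):
    """Record start, finish, and success/failure of a job in job_run."""
    def decorator(func):
        @wraps(func)
        def wrapper(*args, **kwargs):
            _init_db()
            with closing(sqlite3.connect(DB_PATH)) as conn:
                cur = conn.execute(
                    "INSERT INTO job_run (name, started, status) VALUES (?, ?, ?)",
                    (name, datetime.now().isoformat(), "running"),
                )
                run_id = cur.lastrowid
                conn.commit()
                status, message = "failed", "interrupted"
                try:
                    result = func(*args, **kwargs)
                    status, message = "ok", ""
                    return result
                except Exception:
                    status, message = "failed", traceback.format_exc()
                    raise
                finally:
                    conn.execute(
                        "UPDATE job_run SET finished = ?, status = ?, message = ?"
                        " WHERE id = ?",
                        (datetime.now().isoformat(), status, message, run_id),
                    )
                    conn.commit()
        return wrapper
    return decorator
```

Wrapping each management command's `handle` body with `@monitored("command_name")` would then give you a queryable history of every cron run.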
Thanks

I recommend Celery, which can be used to set up periodic tasks.

django-tasks. Sorry I can't provide more info, haven't used it myself (yet), but it seems to provide what you're looking for.

I used django-chronograph for managing my scheduled jobs and django-peavy for logging. That provided everything I needed.

Related

Good way to run ONE indefinite process on Django Framework

I'm building a web app using the Django framework. There isn't much user interaction: only a few static links, a navbar, and a few plots generated by the app itself. The main part of the app is a Python script which reads data from an external source, does some processing on it, and then writes to my Django database. After writing to the database, a new page is created with information about the database entry. Note that there is no user interaction, so no starting or stopping the task. I want the task to run in the background 24/7.
Thus far I've looked into Celery and django-background-tasks. Celery seems like a bit much for my use case: I don't think I need a broker service, as I just want to run one task which the user will never interact with, and I don't need multiple workers. django-background-tasks seems like a good lightweight alternative, but it apparently does not support indefinite tasks without refreshing the task every once in a while (ideally I don't want this). Is there a tool that is better suited for this use case? Or am I just completely misunderstanding Celery and django-background-tasks?
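For reference, the "one process supervised by Supervisord" approach discussed below needs nothing more than a plain loop with a sleep and a SIGTERM handler so the supervisor can stop it cleanly. A rough sketch (`process_data` is a hypothetical stand-in for the data-processing step, and the `max_iterations` parameter exists only so the loop can be exercised in tests):

```python
import signal
import time

RUNNING = True


def _stop(signum, frame):
    """Let Supervisord stop the worker cleanly with SIGTERM."""
    global RUNNING
    RUNNING = False


def process_data():
    # Hypothetical: read from the external source and write to the database.
    pass


def main(poll_interval=60, max_iterations=None):
    """Run the task forever (or max_iterations times), sleeping between runs."""
    signal.signal(signal.SIGTERM, _stop)
    iterations = 0
    while RUNNING:
        process_data()
        iterations += 1
        if max_iterations is not None and iterations >= max_iterations:
            break
        time.sleep(poll_interval)
    return iterations
```

Run under Supervisord with `autorestart=true`, this gives you a 24/7 background task with no broker or worker pool involved.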
Update
Thanks for the comments, everyone! I looked up what was mentioned by @D Malan, and I think a tool like Supervisord might fit my use case: I can run a Python script in the background, separately from the Django application, and have the script interact with the Django app. The only problem I have now is getting the process to interact with the Django app. @deceze mentioned invoking a management command from the Python script. Would this involve creating a subprocess from the Python script that calls a custom Django management command to update my database? Or can I use django.core.management.call_command from a Python file separate from the Django application? If so, how would it know where to get the command from?
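To sketch an answer to that last question: `django.core.management.call_command` can be used from a standalone script, as long as `DJANGO_SETTINGS_MODULE` is set and `django.setup()` is called before any project code is imported; that environment variable is how Django knows where to find your project and its commands. In the sketch below, `mysite.settings` and `update_database` are placeholders for your actual settings module and custom command name:

```python
import os


def run_update(settings_module="mysite.settings", command="update_database"):
    """Run a Django management command from a standalone Python script.

    settings_module and command are placeholders for your project's
    actual settings module and custom command name.
    """
    # Point at the project's settings before importing anything from Django.
    os.environ.setdefault("DJANGO_SETTINGS_MODULE", settings_module)

    import django
    django.setup()  # load the app registry, as manage.py would

    from django.core.management import call_command
    call_command(command)
```

The project package also has to be on `sys.path` (or installed), so the script is typically launched from the project directory or with `PYTHONPATH` set.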

Python start and manage external processes from Django

I'm in need of a way to execute external long running processes from a web app written in Django and Python.
Right now I'm using Supervisord and its API. My problem with this solution is that it's very static: I need to build the commands from my app rather than pre-configuring Supervisord with every possible command, since both the command and its arguments are dynamic.
I need to execute the external process, save a pid/identifier, later be able to check whether it's still alive and running, and be able to stop it.
I've found https://github.com/mnaberez/supervisor_twiddler to add processes on the fly to supervisord. Maybe that's the best way to go?
Any other ideas how to best solve this problem?
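One stdlib-only way to get that start/check/stop cycle, if Supervisord proves too static, is to hold on to the `subprocess.Popen` handles (or persist the pids somewhere durable) yourself. A rough sketch:

```python
import subprocess

# pid -> Popen handle; in a real app you would persist the pids (e.g. in the
# database) so they survive a restart of the web process.
_processes = {}


def start(args):
    """Launch an external command and return its pid as the identifier."""
    proc = subprocess.Popen(args)
    _processes[proc.pid] = proc
    return proc.pid


def is_alive(pid):
    """True while the process is still running."""
    proc = _processes[pid]
    return proc.poll() is None


def stop(pid):
    """Terminate the process (SIGTERM on POSIX) and reap it."""
    proc = _processes[pid]
    proc.terminate()
    proc.wait()
```

The caveat is that the pids die with the web process that spawned them, which is exactly the gap Supervisord (or supervisor_twiddler) fills.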
I suggest you have a look at this post:
Processing long-running Django tasks using Celery + RabbitMQ + Supervisord + Monit
As the title says, there are a few additional components involved (mainly celery and rabbitMQ), but these are good and proven technologies for this kind of requirement.

Cron-like scheduler, something between cron and celery

I'd like to run periodic tasks on my django project, but I don't want all the complexity of celery/django-celery (with celerybeat) bundled in my project.
I'd like, also, to store the config with the times and which command to run within my SCM.
My production machine is running Ubuntu 10.04.
While I could learn and use cron, I feel like there should be a higher level (user friendly) way to do it. (Much like UFW is to iptables).
Is there such thing? Any tips/advice?
Thanks!
There are several Django-based scheduling apps, such as django-chronograph and django-chroniker and django-cron. I forked django-chronograph into django-chroniker to fix a few bugs and extend it for my own use case. I still use Celery in some projects, but like you point out, it's a bit overcomplicated and has a large stack.
In my personal opinion, I would learn how to use cron. This won't take more than 5 to 10 minutes, and it's an essential tool when working on a Linux server.
What you could do is set up a cronjob that requests one page of your Django instance every minute, and have the Django script figure out what time it is and what needs to be done, depending on the configuration stored in your database. This is the approach I've seen in other similar applications.
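That "cron hits a page, Django decides what is due" pattern can be sketched in a few lines. Here the schedule is a plain list for illustration; in practice it would be the configuration stored in your database, and the function would be called from the view your cronjob requests (the crontab line in the docstring is just an example):

```python
from datetime import datetime

# Hypothetical schedule configuration; in the suggestion above this would
# live in a database table (or a file kept in your SCM).
SCHEDULE = [
    {"command": "send_newsletter", "hour": 6, "minute": 0},
    {"command": "clear_sessions", "hour": 3, "minute": 30},
]


def due_commands(now, schedule=SCHEDULE):
    """Return the commands whose scheduled time matches the current minute.

    Meant to be called once per minute from the view the cronjob hits, e.g.:
        * * * * * curl -s http://localhost/cron/ > /dev/null
    """
    return [
        entry["command"]
        for entry in schedule
        if entry["hour"] == now.hour and entry["minute"] == now.minute
    ]
```

The view would then run each due command (for instance via `call_command`) and record the result, which also gives you a natural place to log successes and failures.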

Rebuilding a Django site every night

I have a Django site that needs to be rebuilt every night. I would like to check out the code from the Git repo and then do things like setting up the virtual environment, downloading the packages, etc. There would be no manual intervention, as this would be run from cron.
I'm really confused as to what to use for this. Should I write a Python script or a Shell script? Are there any tools that assist in this?
Thanks.
So what I'm looking for is CI and from what I've seen I'll probably end up using Jenkins or Buildbot for it. I've found the docs to be rather cryptic for someone who's never attempted anything like this before.
Do all CI tools like Buildbot/Jenkins simply run tests and send you reports, or do they actually set up a working Django environment that you can access through your browser?
You'll need to create some sort of build script that does everything but the GIT checkout. I've never used any Python build tools, but perhaps something like: http://www.scons.org/.
Once you've created a script you can use Jenkins to schedule a nightly build and report success/failure: http://jenkins-ci.org/. Jenkins will know how to checkout your code and then you can have it run your script.
There are literally hundreds of different tools to do this. You can write Python scripts to be run from cron, you can write shell scripts, or you can use one of the many build tools.
Most python/django shops would likely recommend Fabric. This really is a matter of you running through and making sure you understand everything that needs to be done and how to script it. Do you need to run a test suite before you deploy to ensure it doesn't really break everything? Do you need to run South database migrations? You really need to think about what needs to be done and then you just write a fabric script to do those things.
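As a concrete illustration of what such a script boils down to, a nightly rebuild can be nothing more than an ordered list of shell steps run one after another; a Fabric fabfile would express the same steps as tasks. The repo URL, paths, and step list below are placeholders, not a recommended recipe:

```python
import subprocess

# Placeholder values; substitute your repo URL and target directory.
REPO = "git@example.com:mysite.git"
CHECKOUT = "/srv/mysite"


def build_steps(repo=REPO, checkout=CHECKOUT):
    """The ordered shell commands for one nightly rebuild."""
    return [
        ["git", "clone", repo, checkout],
        ["virtualenv", checkout + "/env"],
        [checkout + "/env/bin/pip", "install", "-r",
         checkout + "/requirements.txt"],
        [checkout + "/env/bin/python", checkout + "/manage.py", "test"],
        [checkout + "/env/bin/python", checkout + "/manage.py", "syncdb",
         "--noinput"],
    ]


def rebuild():
    """Run every step, stopping at the first failure (cron mails the output)."""
    for step in build_steps():
        subprocess.check_call(step)
```

Because `check_call` raises on the first non-zero exit status, a failed test run aborts the rebuild before the database steps, and cron's output mail becomes your failure report.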
None of this even touches the fact that overall what you're asking for is continuous integration which itself has a whole slew of tools to help manage that.
What you are asking for is Continuous Integration.
There are many CI tools out there, but in the end it boils down to your personal preferences (like always, hopefully) and which one just works for you.
The Django project itself uses buildbot.
If you ask me, I would recommend continuous.io, which works out of the box with Django applications.
You can manually set how often you would like to build your Django project, which is great.
You can, of course, write a shell script which rebuilds your Django project via cron, but you deserve better than that.

Do I need PyISAPIe to run Django on IIS6?

It seems that all roads lead to having to use PyISAPIe to get Django running on IIS6. This becomes a problem for us because it appears you need separate application pools per PyISAPIe/Django instance which is something we'd prefer not to do.
Does anyone have any advice/guidance, or can share their experiences (particularly in a shared Windows hosting environment)?
You need separate application pools no matter what extension you use. This is because application pools split the handler DLLs into different w3wp.exe process instances. You might wonder why this is necessary:
Look at Django's settings mechanism: os.environ["DJANGO_SETTINGS_MODULE"]. That's the environment of the process, so if you change it for one ISAPI handler and then later for another within the same application pool, both handlers point to the new DJANGO_SETTINGS_MODULE.
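The process-wide nature of os.environ is easy to demonstrate: two handlers initialized in the same process both end up seeing whichever value was set last (site_a/site_b are hypothetical site names):

```python
import os


def configure_handler(settings_module):
    """What each ISAPI handler effectively does at startup."""
    os.environ["DJANGO_SETTINGS_MODULE"] = settings_module


configure_handler("site_a.settings")
configure_handler("site_b.settings")  # second handler, same app pool

# Both handlers now share the second value; the first site's setting is gone,
# which is why each Django site needs its own worker process.
```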
There isn't any meaningful reason for this, so feel free to convince the Django developers they don't need to do it :)
There are a few ways to hack around it but nothing works as cleanly as separate app pools.
Unfortunately, isapi-wsgi won't fix the Django problem, and I'd recommend that you keep using PyISAPIe (disclaimer: I'm the developer! ;)
Django runs well on any WSGI infrastructure (much like any other modern Python web app framework) and there are several ways to run WSGI on IIS, e.g. see http://code.google.com/p/isapi-wsgi/ .
