I'd like to run periodic tasks on my django project, but I don't want all the complexity of celery/django-celery (with celerybeat) bundled in my project.
I'd also like to keep the configuration (the schedule and which command to run) in my SCM.
My production machine is running Ubuntu 10.04.
While I could learn and use cron, I feel like there should be a higher-level (user-friendly) way to do it, much like UFW is to iptables.
Is there such thing? Any tips/advice?
Thanks!
There are several Django-based scheduling apps, such as django-chronograph and django-chroniker and django-cron. I forked django-chronograph into django-chroniker to fix a few bugs and extend it for my own use case. I still use Celery in some projects, but like you point out, it's a bit overcomplicated and has a large stack.
In my personal opinion, I would learn how to use cron. It won't take more than 5 to 10 minutes, and it's an essential tool when working on a Linux server.
What you could do is set up a cron job that requests one page of your Django instance every minute, and have the Django script figure out what time it is and what needs to be done, depending on the configuration stored in your database; a sketch of this follows below. This is the approach I've seen in other similar applications.
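For illustration, here is a minimal sketch of that approach. A crontab entry like `* * * * * curl -s http://localhost/cron/run/ >/dev/null` hits a view once a minute; the URL, the `ScheduledJob` model, and its `run()` method are all hypothetical stand-ins for however you store your schedule:

```python
# views.py -- minimal sketch; ScheduledJob and job.run() are hypothetical
from django.http import HttpResponse
from django.utils import timezone

from .models import ScheduledJob  # hypothetical model holding the schedule


def run_due_jobs(request):
    # Find every job whose scheduled time has passed and hasn't run yet.
    jobs = list(ScheduledJob.objects.filter(run_at__lte=timezone.now(),
                                            done=False))
    for job in jobs:
        job.run()        # hypothetical method that executes the task
        job.done = True
        job.save()
    return HttpResponse("ran %d job(s)" % len(jobs))
```

In practice you would also guard this URL (a secret token, or a localhost-only check) so outsiders can't trigger your jobs.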
I have made a small Django app to fit all my needs. I will use it at my company to track simple tasks of a couple of mechanical engineers. Now the only thing left is to send scheduled emails from my Django app (every day at noon; anyone who is behind on their work gets an email). Since I'm using Windows and will deploy this app on Windows, I can't use a cron job (which, from what I've seen on forums, only works on Linux), which would be simple and easy. The only way I've found so far is django-celery-beat. This is not so easy to set up, and I need to run a 'worker' each time I run my server. This is a bit above my level and I would need to learn a lot more (it also needs a message broker, like RabbitMQ, which I would additionally need to run and learn to set up).
I was wondering: is there an easier way to send a simple email every day at noon? I don't want to install additional apps; I wish to keep it as simple as possible.
You can do it by Dockerizing Django with Redis and Celery.
Dockerizing is the process of packaging, deploying, and running applications using Docker containers.
To read more about Dockerizing, see this tutorial: Dockerizing Django with Postgres, Redis and Celery.
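If you do go the Celery route, the scheduling part itself is small. Here is a minimal sketch of a beat schedule for the "email at noon" case, assuming a Redis broker on localhost and a hypothetical task named `send_status_emails` (adjust names to your project):

```python
# celery.py -- minimal sketch; "myproject" and the task name are placeholders
from celery import Celery
from celery.schedules import crontab

app = Celery("myproject", broker="redis://localhost:6379/0")

app.conf.beat_schedule = {
    "daily-status-email": {
        "task": "myproject.tasks.send_status_emails",  # hypothetical task
        "schedule": crontab(hour=12, minute=0),        # every day at noon
    },
}
```

You would then run `celery -A myproject worker` and `celery -A myproject beat` (or the worker with `-B`) alongside Django, which is exactly the kind of multi-process setup Docker Compose manages for you.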
I'm building a web app using the Django framework. There isn't much user interaction: only a few static links, a navbar, and a few plots that come from the app itself. The main part of the app is a Python script which reads data from an external source, does some data processing on it, and then writes to my Django database. After writing to the database, a new page is created with information about the database entry. Note that there is no user interaction, so no starting or stopping the task; I want the task to run in the background 24/7.
Thus far I've looked into Celery and django-background-tasks. Celery seems like a bit much for my use case: I don't think I need a broker service, as I just want to run one task which the user will never interact with, and I don't need multiple workers. django-background-tasks seems like a good lightweight alternative, but it doesn't appear to support indefinite tasks without refreshing the task every once in a while (ideally I'd avoid that). Is there a tool that is better suited for this use case? Or am I just completely misunderstanding Celery and django-background-tasks?
Update
Thanks for the comments, everyone! I looked up what was mentioned by @D Malan, and I think a tool like Supervisord might fit my use case: I can run a Python script in the background, separately from the Django application, and have that script interact with the Django app. The only problem I have now is getting the process to talk to the Django app. @deceze mentioned invoking a management command from the Python script. Would this involve spawning a subprocess from the Python script that calls a custom Django management command to update my database? Or can I use django.core.management.call_command from a Python file separate from the Django application? If so, how would it know where to find the command?
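To answer the last part: yes, a separate script can use call_command directly, as long as it configures Django first. A minimal sketch, assuming your settings live at `myproject.settings` and your custom command is called `update_db` (both hypothetical names):

```python
# standalone_worker.py -- runs outside the web process, e.g. under Supervisord
import os

import django

# Point Django at your settings before importing anything that touches them.
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "myproject.settings")
django.setup()

from django.core.management import call_command

# Django locates the command the usual way: it must live at
# yourapp/management/commands/update_db.py inside an installed app.
call_command("update_db")
```

With this, no subprocess is needed; the script shares the same ORM and settings as the web application.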
I have a web application that runs locally only (it does not run on a remote server). The web application is basically just a UI to adjust settings and see some information about the main application. A web UI was chosen, as opposed to a native application, for portability and ease of development. Now I want to start and stop the main application through a button in the web application, but I couldn't find a suitable way to start an asynchronous, managed task locally. I saw there is a library called Celery, but that seems suited to a distributed environment, which mine is not.
My main need is to be able to start/stop the task, as well as to check whether the task is running (so I can display that in the UI). Is there any way to achieve this?
Celery can work just fine locally. Distributed is just someone else's computer, after all :)
You will have to install all the same requirements and the like. You can kick off workers by hand, or as a service, just as in the Celery docs; a sketch follows below.
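A minimal sketch of what this looks like locally, assuming a Redis broker/backend on localhost and a hypothetical long-running task called `main_job`:

```python
# tasks.py -- local Celery setup sketch; names are placeholders
from celery import Celery

app = Celery(
    "local_app",
    broker="redis://localhost:6379/0",
    backend="redis://localhost:6379/1",  # result backend lets you query state
)
app.conf.task_track_started = True  # report 'STARTED' while a task runs

@app.task
def main_job():
    ...  # the main application's work goes here

# From a Django view you could then:
#   result = main_job.delay()                        # start the task
#   result.state                                     # 'PENDING', 'STARTED', ...
#   app.control.revoke(result.id, terminate=True)    # stop (kills the task)
```

Run the worker locally with `celery -A tasks worker`, store `result.id` (in the session or database), and look the state up again when rendering the UI.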
I would like to deploy several WSGI web applications with Twisted on a debian server, and need some direction for a solid production setup. These applications will be running 24/7.
I need to run several configurations, each binding to different ports/interfaces/privileges.
I want to do as much of this in Python as possible.
I do not want to package my applications with a program like 'tap2deb'.
What is the best way to implement each application as a system service? Do I need some /etc/init.d shell scripts, or can I manage this with Python? (I don't want anything quite as heavy as Daemontools.)
If I use twistd to manage most of the configuration/process management, what kind of wrappers/supervisors do I need to put in place?
I would like centralized management, but restricting control to the parent user account is not a problem.
The main problem I want to avoid is having to SSH into my server once a day to restart a blocked or crashed application.
I have found several good references for launching daemon processes with Python; see daemoncmd on PyPI. I'm still coming up a little short on monitoring/alerting solutions (in Python).
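For the "as much as possible in Python" goal, twistd's .tac files cover most of it. A minimal sketch for one WSGI app, assuming a WSGI callable at `myapp.wsgi.application` (a hypothetical import path); `twistd -y site.tac` runs it, and twistd itself handles daemonizing, pidfiles, and dropping privileges (`--uid`/`--gid`):

```python
# site.tac -- minimal sketch of a Twisted application file for one WSGI app
from twisted.application import internet, service
from twisted.internet import reactor
from twisted.web import server, wsgi

from myapp.wsgi import application as wsgi_app  # hypothetical WSGI callable

# Run the WSGI app on the reactor's thread pool behind a Twisted web server.
resource = wsgi.WSGIResource(reactor, reactor.getThreadPool(), wsgi_app)
site = server.Site(resource)

# twistd looks for a top-level variable named "application".
application = service.Application("mywebapp")
internet.TCPServer(8080, site, interface="127.0.0.1").setServiceParent(application)
```

Each of your configurations (different ports/interfaces/privileges) gets its own .tac file; a process supervisor such as Supervisord (or the init system) then restarts twistd if it crashes, which addresses the "SSH in once a day" problem.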
It seems that all roads lead to having to use PyISAPIe to get Django running on IIS6. This becomes a problem for us because it appears you need a separate application pool per PyISAPIe/Django instance, which is something we'd prefer not to do.
Does anyone have any advice/guidance, or can share their experiences (particularly in a shared Windows hosting environment)?
You need separate application pools no matter what extension you use. This is because application pools split the handler DLLs into different w3wp.exe process instances. You might wonder why this is necessary:
Look at Django's settings mechanism: os.environ["DJANGO_SETTINGS_MODULE"]. That's the environment of the whole process, so if you set it for one ISAPI handler and then later for another within the same application pool, both end up pointing at the new DJANGO_SETTINGS_MODULE. For example:
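A toy illustration of the clobbering, with two hypothetical sites sharing one w3wp.exe process:

```python
import os

# The handler for site A initializes first:
os.environ["DJANGO_SETTINGS_MODULE"] = "site_a.settings"

# Later, the handler for site B loads in the same process:
os.environ["DJANGO_SETTINGS_MODULE"] = "site_b.settings"

# os.environ is process-wide state, so from this point both handlers
# resolve their settings from site_b.settings.
```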
There isn't any meaningful reason for this, so feel free to convince the Django developers they don't need to do it :)
There are a few ways to hack around it but nothing works as cleanly as separate app pools.
Unfortunately, isapi-wsgi won't fix the Django problem, and I'd recommend that you keep using PyISAPIe (disclaimer: I'm the developer! ;)
Django runs well on any WSGI infrastructure (much like any other modern Python web framework), and there are several ways to run WSGI on IIS; e.g., see http://code.google.com/p/isapi-wsgi/ .