Update celery beat schedule at runtime - python

As far as I've seen, the native implementation of the celery beat scheduler does not provide a way to add and sync scheduled tasks at runtime. There is django-celery-beat, but I do not want to add a Django dependency that I don't really need.
I have found a couple of third-party packages backed by Redis (celery-redbeat, redisbeat);
do you know of alternatives to achieve this goal? I was also considering subclassing the Scheduler interface myself, but it doesn't seem easy to work out all the methods that need overriding.

I recommend celery-redbeat as it is a scheduler used in production by many companies.
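For reference, this is roughly what adding a schedule entry at runtime looks like with celery-redbeat. It's a sketch, not a verified setup: it assumes a local Redis server, `pip install celery-redbeat`, and a hypothetical app/task (`myapp`, `myapp.tasks.some_task`).

```python
# Sketch: add a periodic task at runtime with celery-redbeat.
# Assumes Redis on localhost and a hypothetical app module `myapp`.
from celery import Celery
from celery.schedules import crontab
from redbeat import RedBeatSchedulerEntry

app = Celery('myapp', broker='redis://localhost:6379/0')
app.conf.redbeat_redis_url = 'redis://localhost:6379/1'

# Create (or update) a schedule entry. RedBeat stores it in Redis, so a
# beat process started with `celery -A myapp beat -S redbeat.RedBeatScheduler`
# picks it up without a restart.
entry = RedBeatSchedulerEntry(
    'run-every-minute',        # entry name
    'myapp.tasks.some_task',   # task to invoke
    crontab(minute='*'),       # schedule
    app=app,
)
entry.save()
```

Deleting the entry later (`entry.delete()`) removes it from the schedule, which is exactly the runtime add/remove cycle the native beat scheduler doesn't support.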

Related

Asynchronous replacement for Celery

We're using Celery for background tasks in our Django project.
Unfortunately, our tasks contain many blocking sockets that can stay open for a long time. As a result, Celery becomes fully loaded and stops responding.
Gevent could help with the sockets, but Celery has only experimental support for gevent (and, as I found in practice, it doesn't work well).
So I'm considering switching to another task queue system.
I can choose between two options:
Write my own task system. This is the least preferred option, because it would take a lot of time.
Find a good, well-tried replacement for Celery that will work after monkey patching.
Is there any analogue of Celery that will guarantee execution of my tasks even after a sudden exit?
ZeroMQ might be suitable for your use case.
See https://serverfault.com/questions/80679/how-to-pick-between-rabbitmq-and-zeromq-or-something-else
You will, however, need to write your own messaging library to persist messages.
Have you tried Celery + eventlet? It works well in our project.
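For what it's worth, switching an existing worker to the eventlet pool is mostly a command-line change (this assumes a Celery app importable as `proj` and `pip install eventlet`; names are illustrative):

```shell
# Run the worker with the eventlet pool; -c sets the number of green
# threads (greenlets), not OS processes, so large values are normal here.
celery -A proj worker -P eventlet -c 1000
```

Your tasks still need to be cooperative (no un-patched blocking calls) for this to help with long-lived sockets.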

Cron-like scheduler, something between cron and celery

I'd like to run periodic tasks on my django project, but I don't want all the complexity of celery/django-celery (with celerybeat) bundled in my project.
I'd like, also, to store the config with the times and which command to run within my SCM.
My production machine is running Ubuntu 10.04.
While I could learn and use cron, I feel like there should be a higher-level (user-friendly) way to do it. (Much like UFW is to iptables.)
Is there such thing? Any tips/advice?
Thanks!
There are several Django-based scheduling apps, such as django-chronograph and django-chroniker and django-cron. I forked django-chronograph into django-chroniker to fix a few bugs and extend it for my own use case. I still use Celery in some projects, but like you point out, it's a bit overcomplicated and has a large stack.
In my personal opinion, I would learn how to use cron. This won't take more than 5 to 10 minutes, and it's an essential tool when working on a Linux server.
What you could do is set up a cron job that requests one page of your Django instance every minute, and have the Django script figure out what time it is and what needs to be done, depending on the configuration stored in your database. This is the approach I've seen in other similar applications.
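The dispatching logic behind that approach can be sketched in plain Python, independent of Django. This is only an illustration with a hypothetical in-memory schedule; in the real app the schedule would live in the database and the commands would be your management commands:

```python
# Sketch: cron hits a URL every minute; the view runs whatever is due
# for the current minute. Schedule and command names are hypothetical.
from datetime import datetime

SCHEDULE = {
    (3, 0): "clear_sessions",   # every day at 03:00
    (3, 30): "send_digests",    # every day at 03:30
}

def due_commands(now=None):
    """Return the commands scheduled for the current minute."""
    now = now or datetime.now()
    name = SCHEDULE.get((now.hour, now.minute))
    return [name] if name else []

print(due_commands(datetime(2012, 1, 1, 3, 0)))   # → ['clear_sessions']
print(due_commands(datetime(2012, 1, 1, 4, 0)))   # → []
```

The crontab side is then a single line that requests the dispatch page every minute.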

Running celery in django not as an external process?

I want to give celery a try. I'm interested in a simple way to schedule crontab-like tasks, similar to Spring's quartz.
I see from celery's documentation that it requires running celeryd as a daemon process. Is there a way to refrain from running another external process and simply running this embedded in my django instance? Since I'm not interested in distributing the work at the moment, I'd rather keep it simple.
Add the CELERY_ALWAYS_EAGER = True option to your Django settings file and all your tasks will be executed locally. It seems that for periodic tasks you still have to run celery beat as well.
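As a settings fragment, that looks something like the following (a sketch for the older Celery settings names used here; eager mode runs tasks synchronously in-process, so it's meant for development and testing rather than production):

```python
# settings.py -- with this flag, every .delay()/.apply_async() call runs
# the task immediately in the calling process instead of via a broker.
CELERY_ALWAYS_EAGER = True
# Re-raise task exceptions instead of hiding them in the result object:
CELERY_EAGER_PROPAGATES_EXCEPTIONS = True
```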

Background job/task monitor for Django

I have a few management commands for my Django project that are run automatically by cron. Are there any Django packages that allow me to monitor the status of my background jobs?
Currently I have to trudge through my log files to find out whether everything went okay. I'm confident that writing a simple job/task monitor for Django wouldn't be hard at all, but if there are existing packages that you know about, it would help a lot, as I wouldn't have to cobble something together myself.
Thanks
I recommend Celery; it can be used to set up periodic tasks.
django-tasks. Sorry I can't provide more info, haven't used it myself (yet), but it seems to provide what you're looking for.
I used django-chronograph for managing my scheduled jobs and django-peavy for logging. That provided everything I needed.

delayed_job like queue for python

I need a queue to send data from Ruby to Python.
The system is an application with a Ruby frontend and Python backend, and I'd rather not add another complicated piece. If it were Ruby-only I'd just go with delayed_job, but Ruby → Python is harder.
So
I'm looking for a simple database-based queue (similar to delayed_job) for Python, for which I'm planning to hack together a Ruby 'producer' part.
Or just surprise me with a solution I haven't thought of yet.
Maybe you could have a look at Celery.
Pretty old question, but just for anyone stumbling across this question now and looking for a simple answer that isn't Celery:
django-background-tasks is based on Ruby's DelayedJob.
Django Background Task is a database-backed work queue for Django,
loosely based around Ruby's DelayedJob library. This project was
adopted and adapted from this repo.
To avoid conflicts on PyPI we renamed it to django-background-tasks
(plural). For an easy upgrade from django-background-task to
django-background-tasks, the internal module structure was left
untouched.
In Django Background Task, all tasks are implemented as functions (or
any other callable).
There are two parts to using background tasks:
creating the task functions and registering them with the scheduler
setting up a cron task (or long-running process) to execute the tasks
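Those two parts can be sketched as follows (assumes `pip install django-background-tasks` and the app added to INSTALLED_APPS; the task itself is hypothetical):

```python
# Part 1: define and register a task function with the @background decorator.
from background_task import background

@background(schedule=60)  # run roughly 60 seconds after being queued
def notify_user(user_id):
    # The actual work happens here, in the worker process.
    ...

# Calling the decorated function does not run it -- it queues a row
# in the database for the worker to pick up later:
notify_user(user.id)  # `user` is a hypothetical User instance
```

Part 2 is then running `python manage.py process_tasks` from cron or as a long-running process; that command is what actually executes the queued tasks.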
