I'm working on a project and can't solve what is probably a simple issue.
I have a datetime on a Model and I need to run some code when the current time reaches that datetime; in other words, a scheduler fed from the Models. I also need to support recurrences like every day, every year, and so on.
I wonder if there is a simple, clean solution.
Thanks in advance.
I think you have two possible solutions.
The simplest one is to create management commands that do what you need to do, and use django.utils.timezone.now() as the reference value to filter the datetimes on your models. You can create as many commands as you wish, like
run_hourly
run_daily
run_weekly
Then you can set up cron on Linux to run the management commands when you need them.
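For example, a minimal sketch of such a command (the Reminder model and its fields are just placeholders for whatever your model looks like) could be:

# myapp/management/commands/run_hourly.py  (placeholder app name)
from django.core.management.base import BaseCommand
from django.utils import timezone

from myapp.models import Reminder  # placeholder model with a scheduled_at datetime field


class Command(BaseCommand):
    help = "Process every record whose scheduled time has passed."

    def handle(self, *args, **options):
        now = timezone.now()
        for reminder in Reminder.objects.filter(scheduled_at__lte=now, processed=False):
            # ... run whatever code the reminder requires ...
            reminder.processed = True
            reminder.save()

A crontab line like 0 * * * * /path/to/venv/bin/python /path/to/manage.py run_hourly would then run it at the start of every hour.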
Another solution is to use a task queue tool like Celery or RQ.
Celery needs to be configured, and you must also set up your server to run the Celery worker and the Celery beat scheduler so the tasks run at specific times. If you don't have any specific requirements and you just need to run a couple of tasks, I would use cron instead of a task queue.
More about task queue software here: https://www.fullstackpython.com/task-queues.html
I've found an extremely useful article. Everything works fine, but PyCharm reports failures importing these:
from celery.task.schedules import crontab
from celery.decorators import periodic_task
Still, it doesn't give an error while the server is running.
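For what it's worth, those import paths come from the old Celery API and were removed in newer releases (Celery 5), which is most likely why PyCharm flags them. A rough sketch of the newer equivalent (project and task names here are placeholders) is to import crontab from celery.schedules and register the schedule on the app:

# tasks.py -- sketch of the Celery 5 style equivalent; names are placeholders
from celery.schedules import crontab

from myproject.celery import app  # your Celery application instance


@app.task
def my_periodic_task():
    ...


app.conf.beat_schedule = {
    "run-every-morning": {
        "task": "tasks.my_periodic_task",  # adjust to match your module path
        "schedule": crontab(hour=7, minute=30),
    },
}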
Hi, I am working on a project where I need Celery beat to run long-term periodic tasks. The problem is that after starting Celery beat, the task waits for the full specified interval before it runs for the first time.
I want to fire the task on load the first time and then run it periodically.
I have seen this question on Stack Overflow and this issue on GitHub, but didn't find a reliable solution.
Any suggestions on this one?
Since this does not seem possible, I suggest a different approach: call the task explicitly when you need it and let the scheduler continue scheduling the tasks as usual. You can call the task on startup using one of the following methods (you probably need to guard against multiple calls of the ready() method if the task is not idempotent), or alternatively call the task from the command line with celery call after your Django server startup command.
Most of the time, the best place to call it is the ready() method of the current app's AppConfig class:
from django.apps import AppConfig
from myapp.tasks import my_task

class RockNRollConfig(AppConfig):
    # ...
    def ready(self):
        my_task.delay(1, 2, 3)
Notice the use of .delay(), which puts the invocation on the Celery queue and doesn't slow down starting the server.
See: https://docs.djangoproject.com/en/3.2/ref/applications/#django.apps.AppConfig and https://docs.celeryproject.org/en/stable/userguide/calling.html.
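For the command-line route, something like the following (the project name, task path and arguments are placeholders) should work once the server is up, assuming a recent Celery CLI:

celery -A myproject call myapp.tasks.my_task --args='[1, 2, 3]'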
I would like to run a particular function (let's say, deleting a post) at a specific time in the future (e.g. at 10 am), only once, based on a condition.
I am using Django, and I was thinking about using cron or python-crontab, but it seems these task schedulers are meant for tasks that have to be executed more than once in the future. While trying to use python-crontab with Django, I also did not find any resources that would let me express "delete this post at 10 am tomorrow only if a user performs a particular action", for example.
Does anyone know if I can still use python-crontab, or should another technology be used?
I would use:
https://github.com/celery/django-celery-beat
http://docs.celeryproject.org/en/latest/getting-started/first-steps-with-celery.html
Celery to run the background tasks, and Celery beat as the scheduler that kicks off those tasks at the specified times.
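For the one-time, condition-triggered case, one way with django-celery-beat is to create a one-off PeriodicTask backed by a ClockedSchedule from the view or signal handler where the condition is met. A rough sketch (the task path and helper name are placeholders) might look like this:

import json
from datetime import datetime, time, timedelta

from django_celery_beat.models import ClockedSchedule, PeriodicTask


def schedule_post_deletion(post_id):
    # 10 am tomorrow -- adapt to your own timezone handling (USE_TZ, etc.).
    run_at = datetime.combine(datetime.now().date() + timedelta(days=1), time(10, 0))
    clocked, _ = ClockedSchedule.objects.get_or_create(clocked_time=run_at)
    PeriodicTask.objects.create(
        name=f"delete-post-{post_id}",
        task="myapp.tasks.delete_post",  # placeholder task path
        clocked=clocked,
        one_off=True,  # run once, then the entry is disabled
        args=json.dumps([post_id]),
    )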
I have a heavy function (a lot of calculations are done) which outputs an individual number for each user in my Django project. This number changes only slightly over time, so to minimize server load I thought about running the function once a day, saving the output, and just referencing the saved output. I know these kinds of things are usually handled with Celery, but that package requires a lot of site packages and extra modules, so I thought about writing a simple function like:
from datetime import datetime, timedelta

x0 = ...  # last time the function was called
x1 = datetime.now()

def whatever():
    global x0
    # ... the heavy calculation ...
    x0 = datetime.now()
    return ...

if x1 - x0 > timedelta(days=1):
    whatever()
I like to keep my code clean and avoid installing packages that aren't really required, so I would like to know whether there are any downsides to "just" using Python, or any gains if I did this with Celery instead. The task does not need to be asynchronous, so I don't care about that.
Is there a clear use case for when Celery should be used and when not? Is there a performance loss/gain?
I hope somebody can explain that properly.
Celery is a clear winner but I would like to explain this with pros and cons.
Pros:
You can control Celery from Django very easily. Running a task, cancelling a task, and checking a task's state/progress can all be done from within Django.
Running a periodic task with Celery is very simple: just register the task from Django, run the Celery worker and beat, and voilà, you are done. No need to mess around with crontab or background processes.
Celery is very easy to set up and run. You might already know that if you have gone through the Celery introduction.
Cons
One of the cons is that you need at least one broker, such as Redis or RabbitMQ, running alongside Celery for queuing purposes. Although RabbitMQ is not heavy, you do need to install and run it.
Another is that the Celery worker itself takes some memory. That won't be an issue on a server, but locally the memory consumption might seem high to you.
I would suggest Celery because it gives you more control over your tasks than a simple background process would.
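As a rough sketch of how the daily per-user calculation could look (the model, field and function names are made up for illustration, and the settings entry assumes the usual config_from_object(..., namespace='CELERY') setup):

# tasks.py
from celery import shared_task

from myapp.models import UserProfile  # placeholder model holding the cached number


@shared_task
def recalculate_user_numbers():
    for profile in UserProfile.objects.all():
        profile.cached_number = compute_heavy_number(profile)  # your heavy calculation
        profile.save(update_fields=["cached_number"])

# settings.py
from celery.schedules import crontab

CELERY_BEAT_SCHEDULE = {
    "recalculate-user-numbers-daily": {
        "task": "myapp.tasks.recalculate_user_numbers",
        "schedule": crontab(hour=3, minute=0),  # once a day at 03:00
    },
}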
In my Django project, I need to collect data from about 50 remote servers into the local database every minute or every 30 seconds. Though it currently works with crontab on the remote servers, I want to do this within the project. At first I considered django-celery; however, it is designed for asynchronous processing and the data-collection task must not be delayed, so I think it may not be a good fit. What if I did this with a Python timer, and what would I need to pay attention to? Excuse my ignorance of Python and Django. I'd appreciate any other advice or ideas. Many thanks.
Basically, you can use Celery's periodic tasks with the expires option, which makes sure that your tasks will not be executed twice.
You could also run your own script with an infinite loop that performs the collection. If your calculation runs for more than a minute, you can spawn your tasks using eventlet or gevent. Another option is to create Celery tasks from that script and be sure your tasks execute every N seconds, as you prefer.
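A minimal sketch of the expires option on a beat schedule entry (the task path is a placeholder, and app is assumed to be your Celery application instance):

app.conf.beat_schedule = {
    "collect-remote-data": {
        "task": "myapp.tasks.collect_data",  # placeholder task path
        "schedule": 30.0,                    # run every 30 seconds
        "options": {"expires": 25},          # a run still queued after 25 seconds is discarded
    },
}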
We need a service that we can use to schedule events. For instance, we might have a task that needs to run at 3 o'clock (one time) or that runs every 2 hours (multiple times). Preferably each task could be configured with an AMQP queue that it would publish to.
We could easily implement this by creating an OS timer event. My concern is how to recover if this service ever went down. We could use cron if it allowed scheduling on the fly.
I was looking for a way to avoid reinventing the wheel. If there isn't a project out there that does this already, we will just create one. This is a pretty common thing, though, so I'd be surprised if no one's put one out there by now.
Celery solves this problem.
celery.schedules lets you define periodic tasks, and you can override is_due to do things like scheduling once a month. You can schedule tasks to execute at a specific time using periodic_task or Celery beat (which I believe is now the standard approach). Yet another way is to use the eta argument of Task.apply_async.
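For the one-time 3 o'clock example, a minimal sketch of the eta approach (the task and payload are placeholders) would be:

from datetime import datetime, timedelta, timezone

from myapp.tasks import publish_event  # placeholder task

# Run once at 03:00 UTC tomorrow; eta expects an absolute (ideally timezone-aware) datetime.
run_at = (datetime.now(timezone.utc) + timedelta(days=1)).replace(hour=3, minute=0, second=0, microsecond=0)
publish_event.apply_async(args=["payload"], eta=run_at)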