I'm wondering if there is a way to make a request to my Django backend that runs asynchronously. At page load, I need to kick off a process that takes ~30 seconds, but while it is running I cannot perform any other actions on the page that require a response from Django (specifically, waiting on data for jqGrids).
Is there an easy way to tell Django that certain methods should be run asynchronously?
Django does not have a native way to run asynchronous tasks, but you could look at Celery and use a django-celery task.
Celery web: http://www.celeryproject.org/
Django-Celery web: https://pypi.python.org/pypi/django-celery
You need to use Celery. Celery is an asynchronous task queue/job queue based on distributed message passing. You can read more about it on the Celery website linked above, and the project's documentation includes a good tutorial for setting it up.
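For illustration, here is a minimal sketch of what that looks like, assuming a Celery app (or django-celery) is already configured with a broker such as RabbitMQ; the module and function names below are placeholders:

    # myapp/tasks.py -- placeholder module holding the ~30 second job
    from celery import shared_task

    @shared_task
    def long_running_process():
        # do the slow work here; it runs in a Celery worker,
        # not inside the request/response cycle
        pass

    # myapp/views.py
    from django.http import HttpResponse
    from myapp.tasks import long_running_process

    def start_process(request):
        long_running_process.delay()     # returns immediately with an AsyncResult
        return HttpResponse("started")   # Django stays free to serve the jqGrid requests

The page can then poll a separate endpoint (for example, checking the task's AsyncResult by id) to find out when the 30-second job has finished.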
Related
I have several Python programs doing long polling on different machines, and I'm thinking of a queue-based mechanism to manage the load and provide asynchronous job functionality.
These programs are standalone, and aren't part of any framework.
I'm primarily considering Celery for its multiprocessing support and its ability to share tasks across multiple Celery workers. Is Celery a good choice here, or am I better off simply using an event-based system with RabbitMQ directly?
I would say yes - Celery is definitely a good choice! We have tasks that sometimes run for over 20 hours, and Celery handles them just fine. Furthermore, it is extremely simple to set up and use (Celery + Redis is super simple).
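As a rough sketch of how little that setup involves (the module name, Redis URL, and task body below are just placeholders):

    # jobs.py -- a minimal Celery + Redis setup
    from celery import Celery

    app = Celery('jobs',
                 broker='redis://localhost:6379/0',
                 backend='redis://localhost:6379/0')

    @app.task
    def long_poll(endpoint):
        # the long-running polling work for one endpoint goes here
        pass

    # Producers on any machine enqueue work with:
    #     long_poll.delay('http://example.com/feed')
    # Workers are started on one or more machines with:
    #     celery -A jobs worker --concurrency=4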
I have a Django project with several Celery tasks. I also have several tests (using Django's TestCase) in which I mock the Celery tasks, because I don't want to run Celery for the tests.
I have searched a lot on the internet, but with no luck. So I want to ask: is there any way to avoid mocking these functions and instead have the task code executed inside the Django runserver?
More info (I can't update them right now):
python 2.7
django 1.11
celery 4.3
Thank you so much for your help! :)
You could use apply instead of apply_async when you call the task, so that it is executed locally:
    apply(args=None, kwargs=None, link=None, link_error=None, task_id=None, retries=None, throw=None, logfile=None, loglevel=None, headers=None, **options)
Execute this task locally, by blocking until the task returns.
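As a sketch, assuming a task named add in myapp/tasks.py (both names are placeholders):

    from myapp.tasks import add

    # production code enqueues the task asynchronously:
    #     add.delay(2, 3)                # or add.apply_async(args=(2, 3))

    # in a test, run it synchronously in the current process instead:
    result = add.apply(args=(2, 3))      # blocks until the task returns
    assert result.get() == 5             # no worker or broker involved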
I want to send tasks from a web server (running Django) to a remote machine that hosts a RabbitMQ server and some workers that I implemented with Celery.
If I follow the Celery way of doing things, it seems I have to share the code between both machines, which means replicating the workers' logic in the web app code.
So:
Is there a best practice for this? Since the code is redundant, I am thinking about using a git submodule (replicated in both the web app code repo and the workers code repo).
Or should I use something other than Celery?
Am I missing something?
One way to manage this is to keep your workers inside your Django project. Django and Celery play nicely together, allowing you to use parts of your Django project in your Celery app: http://celery.readthedocs.org/en/latest/django/first-steps-with-django.html
Deploying this way means that your web application simply never uses the modules involved with your Celery workers, and on your Celery machine your Django views and such are never used. This usually only costs a couple of megabytes of unused Django application code...
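A minimal sketch of that layout, following the guide linked above ('proj' is a placeholder project name, and the exact calls differ slightly between Celery versions):

    # proj/proj/celery.py
    import os
    from celery import Celery

    os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'proj.settings')

    app = Celery('proj')
    app.config_from_object('django.conf:settings')  # broker/task settings live in Django settings
    app.autodiscover_tasks()                        # pick up tasks.py from the installed apps

    # The web machine only imports the tasks to call .delay()/.apply_async();
    # the worker machine runs:  celery -A proj worker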
You can use send_task. It takes the same parameters as apply_async, but you only have to give the task name. You can send tasks without loading the task module in Django:

    app.send_task('tasks.add', args=[2, 2], kwargs={})
http://celery.readthedocs.org/en/latest/reference/celery.html#celery.Celery.send_task
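A sketch of that approach, with the broker URL and task name as placeholders; only the workers on the remote machine need the actual task code:

    from celery import Celery

    # point the producer at the remote RabbitMQ broker; no task modules are imported here
    app = Celery(broker='amqp://guest@remote-rabbitmq-host//')

    # the string name is resolved by whichever worker has 'tasks.add' registered
    result = app.send_task('tasks.add', args=[2, 2], kwargs={})
    # calling result.get() would additionally require a shared result backend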
I want to give Celery a try. I'm interested in a simple way to schedule crontab-like tasks, similar to Quartz in the Spring world.
I see from Celery's documentation that it requires running celeryd as a daemon process. Is there a way to avoid running another external process and simply run this embedded in my Django instance? Since I'm not interested in distributing the work at the moment, I'd rather keep it simple.
Add the CELERY_ALWAYS_EAGER = True option to your Django settings file and all your tasks will be executed locally. It seems that for periodic tasks you still have to run celery beat as well.
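In the settings file that looks roughly like this (note that newer Celery releases rename the setting to task_always_eager / CELERY_TASK_ALWAYS_EAGER):

    # settings.py
    CELERY_ALWAYS_EAGER = True                  # run tasks synchronously, in-process
    CELERY_EAGER_PROPAGATES_EXCEPTIONS = True   # optional: raise task errors immediately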
I've written a Python app that reads a database of tasks, and schedule.enter()s those tasks at various intervals. Each task reschedules itself as it executes.
I'd like to integrate this app with a WSGI framework, so that tasks can be added or deleted in response to HTTP requests. I assume I could use XML-RPC to communicate between the framework process and the task engine, but I'd like to know if there's a framework that has built-in event scheduling which can be modified via HTTP.
Sounds like what you really want is something like Celery. It's a Python-based distributed task queue which has various task behaviours including periodic and crontab.
Prior to version 2.0, it had a dependency on Django, but that has now been reduced to an integration plugin.
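For a sense of what that looks like, here is a sketch of a crontab-style periodic task; the names, broker URL, and schedule are placeholders, and the configuration key is CELERYBEAT_SCHEDULE in older releases:

    from celery import Celery
    from celery.schedules import crontab

    app = Celery('scheduler', broker='amqp://guest@localhost//')

    @app.task
    def refresh_tasks():
        # re-read the task database and reschedule work here
        pass

    app.conf.beat_schedule = {
        'refresh-every-morning': {
            'task': refresh_tasks.name,              # the registered task name
            'schedule': crontab(hour=7, minute=30),  # 07:30 every day
        },
    }

    # run the scheduler and a worker with:
    #     celery -A scheduler beat
    #     celery -A scheduler worker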