django-mailer - how to send_mail() asynchronously in Django

Is there a way to send a mail using django-mailer instantly?
As I see it now, django-mailer's send_mail() puts the mail on a queue, and I need to run a periodic task every X minutes to send all the mails in the queue.
I have a small app that needs to send only a handful of emails. I want each email to go out as soon as it is triggered, but from a separate worker thread (so it doesn't hold up the web thread), which is why I am trying to use django-mailer.
Thanks
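As the question itself notes, django-mailer's send_mail() only enqueues the message for a periodic job. If a cron-driven queue is overkill for a handful of emails, one lightweight alternative (not django-mailer itself, just plain Django) is to call Django's built-in send_mail from a background thread; a minimal sketch, with an illustrative helper name:

# views.py - fire off Django's built-in send_mail in a daemon thread
# so the request/response cycle is not blocked.
import threading
from django.core.mail import send_mail

def send_mail_async(subject, message, from_email, recipient_list):
    # Daemon thread: it won't block process shutdown, but an email
    # still in flight can be lost if the process dies.
    threading.Thread(
        target=send_mail,
        args=(subject, message, from_email, recipient_list),
        kwargs={"fail_silently": True},
        daemon=True,
    ).start()

Unlike a queue, this gives no persistence or retries; if the process crashes before the SMTP call completes, the email is gone.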

Related

How to send partial status of request to frontend by django python?

Suppose I have sent a POST request from React to a Django REST API, and that request takes a long time. How can I report what percentage of it has been processed to the frontend, without sending the real response?
There are two broad ways to approach this.
Polling (which I would recommend to start with): break the request up. The initial request doesn't start the work; it sends a message to an async task queue (such as Celery) to do the work. The response to the initial request is the ID of the Celery task that was spawned. The frontend can then use that task ID to poll the backend periodically, check whether the task is finished, and grab the results when they are ready (see the sketch after this list).
Websockets, wherein the connection to the backend is kept open across many requests, and either side can initiate sending data. I wouldn't recommend this to start with, since it's not really how Django is built, but with a higher level of investment it will give an even smoother experience.
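A minimal sketch of the polling approach, assuming Celery is already configured; the task and view names are illustrative:

# tasks.py
from celery import shared_task

@shared_task(bind=True)
def long_running_job(self, payload):
    total = 10
    for i in range(total):
        # ... do one chunk of the slow work ...
        # Publish progress so the frontend can show a percentage.
        self.update_state(state="PROGRESS", meta={"percent": (i + 1) * 100 // total})
    return {"percent": 100}

# views.py
from celery.result import AsyncResult
from django.http import JsonResponse

def start_job(request):
    # Kick off the work and return immediately with the task ID.
    result = long_running_job.delay(request.POST.dict())
    return JsonResponse({"task_id": result.id})

def job_status(request, task_id):
    # The frontend polls this endpoint with the task ID it was given.
    result = AsyncResult(task_id)
    # result.info carries the progress meta while the state is PROGRESS,
    # and the task's return value once it has succeeded.
    return JsonResponse({"state": result.state, "info": result.info})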

Trouble with understanding how RabbitMQ can be used

I'm currently working on a Python web app that needs to use RabbitMQ.
The app is structured like this:
The client connects to an HTTP server
The connection is sent to a message queue that is connected to the main service of my app
The main service receives the message and returns the user's information
I understand how to make RabbitMQ work using the documentation and tutorials on the website, but I have trouble seeing how it can work with real tasks like displaying a web page or printing a file. How will my service, connected to the message queue, read the received message and say: "oh, I'm going to display this web page"?
Sorry if this is confusing; if you need further explanation of what I'm trying to do, just tell me.
Thanks for reading!
RabbitMQ is good for sending messages to a service which can execute a long-running process, e.g. downloading a big file or generating a complex animation. A web server can't (or shouldn't) execute long-running processes itself.
The web page sends a message to RabbitMQ (e.g. with the parameters for the long-running process) and gets back a unique number. When the service has a free worker, it checks whether there is a new message in the queue, takes it (with its unique number) and starts a worker. When the worker finishes the job, the service sends the result back to RabbitMQ under the same unique number.
At the same time, the web page uses JavaScript to run a loop which periodically checks whether there is a result with this unique number. While there is no result it may display a progress bar; once there is a result it may display it.
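A minimal sketch of that pattern with the pika client; the queue name, message shape, and handler are illustrative:

# producer.py - the web app publishes a job with a unique number
import json, uuid
import pika

connection = pika.BlockingConnection(pika.ConnectionParameters("localhost"))
channel = connection.channel()
channel.queue_declare(queue="jobs")

job_id = str(uuid.uuid4())  # the unique number handed back to the web page
channel.basic_publish(
    exchange="",
    routing_key="jobs",
    body=json.dumps({"job_id": job_id, "params": {"file": "big.zip"}}),
)
connection.close()

# worker.py - the service consumes jobs and runs the long process
import json
import pika

def handle(ch, method, properties, body):
    job = json.loads(body)
    # ... run the long process with job["params"], then store/publish
    # the result under job["job_id"] so the web page can find it ...
    ch.basic_ack(delivery_tag=method.delivery_tag)

connection = pika.BlockingConnection(pika.ConnectionParameters("localhost"))
channel = connection.channel()
channel.queue_declare(queue="jobs")
channel.basic_consume(queue="jobs", on_message_callback=handle)
channel.start_consuming()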
Example: Celery - Distributed Task Queue.
Celery can use RabbitMQ to communicate with Django or Flask.
(but it can also use other brokers, e.g. Redis)
Using Celery with Django.
Flask - Celery Background Tasks
From Celery repo
Celery is usually used with a message broker to send and receive messages.
The RabbitMQ, Redis transports are feature complete, but there's also
experimental support for a myriad of other solutions, including using
SQLite for local development.

Why do I need to use asynchronous tools for sending Django email

While using Django, I noticed there is a delay when I send an email, and to overcome this I had to use Celery. So I wanted to know what is actually happening under the hood: why does Django/Python need an async tool to accomplish this task? I have learned about multithreading and multiprocessing in Python, but could somebody give me a brief idea of what exactly is happening when Django tries to send an email without using Celery?
Think of sending an email like sending a request; in a synchronous context the process goes as follows:
Send request
Wait for response..........
Receive response
The whole time you're waiting for the response, that thread cannot do anything else; it's wasting CPU cycles that could be used by something else (such as serving other users' requests).
I'd like to make a distinction here between your usage of "asynchronous" and Celery.
Python's actual asynchronous implementation uses an "event loop" to dispatch and receive messages. The "waiting" is done in a separate thread/process which is used exclusively to receive messages and dispatch them to receivers; this way the thread that sent the request is no longer wasting CPU cycles waiting, and will be invoked by the event loop when the response is ready. This is a very rough description of how Python's async works. It won't necessarily make the whole process faster for the user unless there are a lot of emails being sent.
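A toy asyncio sketch of that idea, with asyncio.sleep standing in for the network wait an SMTP call would cause:

import asyncio

async def send_one(n):
    # While this coroutine "waits" on I/O, the event loop is free
    # to run the other coroutines instead of blocking a thread.
    await asyncio.sleep(1)
    return f"email {n} sent"

async def main():
    # The three sends overlap, so this takes ~1 second, not ~3.
    print(await asyncio.gather(*(send_one(i) for i in range(3))))

asyncio.run(main())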
Celery, on the other hand, is an asynchronous task queue, in which you have producers (your web application) sending messages, a broker (data store) which stores and distributes messages, and consumers (workers) which pull messages from the broker and process them. The consumers are a totally separate process (often a totally separate server) from your web application, which frees the web application up to focus on returning the response to the client as soon as possible. The process when sending emails through Celery would look like:
Web application sends a message to the broker and returns the response to the user. Here's a JSON pseudo-message (the broker actually stores the messages as either pickled objects or JSON):
{
    "task": "my_app.send_email",
    "args": ["Subject Line", "Hello, World! This is your email contents", "to@example.com", "from@example.com"],
    "kwargs": {}
}
The Celery worker constantly checks with the broker for new messages to process whenever it is not already processing. Sometimes the worker will pull in batches of messages so there is less overhead; this is configurable.
The Celery worker executes a function (defined by the "task" in the message), using the arguments and keyword arguments.
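A sketch of the worker-side task such a message would invoke, assuming a standard Celery + Django setup; the name matches the pseudo-message above:

# my_app/tasks.py
from celery import shared_task
from django.core.mail import send_mail

@shared_task(name="my_app.send_email")
def send_email(subject, body, to_email, from_email):
    # Runs in the worker process, not in the web request.
    send_mail(subject, body, from_email, [to_email])

# Producer side, in the web app (this is what writes the message above):
# send_email.delay("Subject Line", "Hello, World! ...", "to@example.com", "from@example.com")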
That is a very simple example of why you may want to use Celery to send emails: so you can return the response to the user as fast as possible! It's also well suited to longer-running tasks, such as processing image thumbnails:
User uploads an image, which you store somewhere (Amazon S3 for example)
You send a message to the broker saying "execute my process_image_thumbnails task with the file's S3 URL as the argument"
You return the response to your user. It's nice and quick from the user's perspective.
A worker picks up the message, downloads the file from S3, and processes it into thumbnails of varying sizes (a sketch of such a task follows).
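A sketch of what such a worker task could look like, assuming Pillow for the resizing and boto3 for S3 access; the bucket, paths, and size choices are illustrative:

# tasks.py
import boto3
from celery import shared_task
from PIL import Image

@shared_task
def process_image_thumbnails(bucket, key):
    s3 = boto3.client("s3")
    local_path = "/tmp/original.jpg"
    s3.download_file(bucket, key, local_path)
    for size in (128, 256, 512):
        img = Image.open(local_path)
        img.thumbnail((size, size))  # resizes in place, keeping aspect ratio
        thumb_path = f"/tmp/thumb_{size}.jpg"
        img.save(thumb_path)
        s3.upload_file(thumb_path, bucket, f"thumbnails/{size}/{key}")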
As you use Celery for more use cases you will encounter new problems. For example, what do we do if someone requests the thumbnail while it's still processing? I'll leave that to your imagination.

How To Perform asynchronous task in django?

I am a bit lost here. I want to send email using Django on a POST request, but I don't want the user to wait for a reply, so I am sending it in a background thread. The emails are also to be sent every 30 seconds, in a sequential order provided by the user.
But there is an option for the user to cancel sending the emails, i.e. a different POST request to handle the cancel input.
How do I listen for the cancel operation in the thread sending the email?
How do I implement the 30-second delay?
I thought of using sleep, but if I do that, will it miss the cancel signal?
As mentioned above, this is probably a case for using a Celery task to send the e-mails asynchronously. Then, to cancel sending the e-mails, your view handling the cancel POST request could revoke the tasks responsible for sending that sequence of e-mails. This question/answer shows how to do it.
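A sketch of both pieces, assuming Celery: each e-mail is scheduled 30 seconds after the previous one with apply_async(countdown=...), and cancelling revokes the stored task IDs. Where you persist the IDs (database, cache, session) is up to you; the names here are illustrative:

# tasks.py
from celery import shared_task
from django.core.mail import send_mail

@shared_task
def send_one_email(subject, body, from_email, to_email):
    send_mail(subject, body, from_email, [to_email])

# Sending: schedule one task per email, 30 seconds apart.
def schedule_emails(emails):
    task_ids = []
    for i, e in enumerate(emails):
        result = send_one_email.apply_async(
            args=[e["subject"], e["body"], e["from"], e["to"]],
            countdown=30 * i,  # 0 s, 30 s, 60 s, ...
        )
        task_ids.append(result.id)
    return task_ids  # persist these so the cancel view can find them

# Cancelling: revoke the tasks that have not started yet.
from celery import current_app

def cancel_emails(task_ids):
    for task_id in task_ids:
        current_app.control.revoke(task_id)

Because each e-mail is its own task with its own countdown, there is no sleeping thread to interrupt; revoking simply stops the not-yet-started tasks from running.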

Django: How to periodically and asynchronously publish new data to the respective client

The following is the sequence of events:
Part 1:
Client sends a form.
Django receives a form, validates it and creates a Task for Celery or Django-rq to run.
Returns results.html to the user.
Part 2:
The task is run by the workers, which generate JSON data every second.
This needs to be sent to the right client asynchronously, as part of results.html.
The client should see the updated results without doing any refresh.
How do I solve this?
After some research, the following are some of the ways I thought of:
Write the updated data to the database and have Django poll it with a scheduler. I'm not really sure how I can send it to the right client.
Have the client subscribe for events and use django-websocket-redis to publish the data. I'm not sure this is possible, because each client would require a unique websocket channel to subscribe to.
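For the second option, Django Channels is the more common route today than django-websocket-redis; each client subscribes to a group keyed by its task ID, so a unique channel per client is not a problem. A minimal sketch, with illustrative group naming and routing:

# consumers.py (Django Channels)
import json
from channels.generic.websocket import AsyncWebsocketConsumer

class ResultsConsumer(AsyncWebsocketConsumer):
    async def connect(self):
        # One group per task, so each client only gets its own updates.
        task_id = self.scope["url_route"]["kwargs"]["task_id"]
        self.group = f"task_{task_id}"
        await self.channel_layer.group_add(self.group, self.channel_name)
        await self.accept()

    async def disconnect(self, code):
        await self.channel_layer.group_discard(self.group, self.channel_name)

    async def task_update(self, event):
        # Forward each JSON payload pushed by the worker to the browser.
        await self.send(text_data=json.dumps(event["data"]))

# In the Celery/django-rq worker, push each second's JSON to the group:
from asgiref.sync import async_to_sync
from channels.layers import get_channel_layer

def push_update(task_id, data):
    layer = get_channel_layer()
    async_to_sync(layer.group_send)(
        f"task_{task_id}", {"type": "task_update", "data": data}
    )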
