Asynchronous Payments with Django and Celery - python

I was wondering if anyone knows a way to take payments asynchronously using Django and Celery. My desired process would be this:
Client submits payment form
Request is sent to Django
The request is passed to celery to do the work
Django closes the request by returning a response
Client is then notified when the payment has completed
I'd like to do this without polling.
How would I get the "payment complete" signal back to the client if the original request was closed? Would some kind of node/tornado integration be possible here?
Thanks.
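
For reference, a minimal sketch of the flow described above, assuming Django Channels is used for the no-polling notification; `charge_with_gateway()` and the `payments_<user_id>` group name are hypothetical placeholders, not a real gateway API:

```python
# tasks.py -- charge_with_gateway() and the payments_<user_id> group name are
# hypothetical placeholders for your gateway call and notification channel.
from asgiref.sync import async_to_sync
from celery import shared_task
from channels.layers import get_channel_layer


@shared_task
def process_payment(payment_id, user_id):
    result = charge_with_gateway(payment_id)  # hypothetical gateway call
    # Push a "payment complete" event to this user's WebSocket group.
    channel_layer = get_channel_layer()
    async_to_sync(channel_layer.group_send)(
        f"payments_{user_id}",
        {"type": "payment.complete", "status": result},
    )


# views.py
from django.http import JsonResponse


def submit_payment(request):
    task = process_payment.delay(request.POST["payment_id"], request.user.id)
    # Close the HTTP request right away; the client hears back over the socket.
    return JsonResponse({"task_id": task.id}, status=202)
```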

Related

How to send partial status of a request to the frontend with Django (Python)?

Suppose I have sent a POST request from React to a Django REST API and that request takes a long time to process. How can I report what percentage of it has been processed to the frontend before sending the real response?
There are two broad ways to approach this.
1. (which I would recommend to start with): Break the request up. The initial request doesn't start the work; it sends a message to an async task queue (such as Celery) to do the work. The response to the initial request is the ID of the Celery task that was spawned. The frontend can now use that task ID to poll the backend periodically, check whether the task is finished, and grab the results when they are ready (see the sketch after this list).
2. WebSockets, wherein the connection to the backend is kept open across many requests and either side can initiate sending data. I wouldn't recommend this to start with, since it's not really how Django is built, but with a higher level of investment it will give an even smoother experience.
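
A minimal sketch of option 1, assuming Celery is configured with a result backend; `process_chunks()` and `handle()` are hypothetical stand-ins for the real work:

```python
# tasks.py -- option 1: return a task ID, then let the frontend poll for it.
from celery import shared_task


@shared_task(bind=True)
def long_running_job(self, payload):
    chunks = process_chunks(payload)   # hypothetical helper
    total = len(chunks)
    for i, chunk in enumerate(chunks, start=1):
        handle(chunk)                  # hypothetical helper
        # Report progress so the polling endpoint can expose a percentage.
        self.update_state(state="PROGRESS", meta={"percent": 100 * i // total})
    return {"done": True}


# views.py
from celery.result import AsyncResult
from django.http import JsonResponse


def start_job(request):
    task = long_running_job.delay(request.POST.dict())
    return JsonResponse({"task_id": task.id}, status=202)


def job_status(request, task_id):
    result = AsyncResult(task_id)
    body = {"state": result.state,
            "progress": result.info if result.state == "PROGRESS" else None}
    if result.successful():
        body["result"] = result.result
    return JsonResponse(body)
```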

How to handle high response time

There are two different services. One service (Django) receives the request from the front end and then calls an API in the other service (Flask).
But the response time of the Flask service is high, and if the user navigates to another page, that request will be cancelled.
Should this be a background task or a pub/sub pattern? If so, how do I run it in the background and then tell the user "here is your last result"?
You have two main options:
1. Make an initial request to a "simple" Django view, which loads a skeleton HTML page with a spinner, where some JS triggers an XHR request to a second Django view that contains the call to the other service (Flask). This way you can properly tell your user that the loading takes time and handle the exit on the browser side (ask for confirmation before leaving, abort the request, ...).
2. If possible, cache the result of the Flask service, so you don't need to call it on each page load (see the sketch below).
You can combine these two solutions by calling the service in an asynchronous request and caching its result (depending on the context, you may need to vary the cache per connected user, for example).
The first solution can also be implemented with pub/sub, WebSockets, or whatever you like, but a classic XHR seems fine for your case.
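
A sketch of the caching idea (option 2), assuming the Flask service is reachable over HTTP via `requests`; the URL, cache key, and timeouts are illustrative assumptions:

```python
# views.py -- cache the slow Flask-service call so it isn't made on every load.
import requests
from django.core.cache import cache
from django.http import JsonResponse

FLASK_SERVICE_URL = "http://flask-service/api/report"  # placeholder URL


def report(request):
    cache_key = f"flask-report-{request.user.id}"  # vary per user if needed
    data = cache.get(cache_key)
    if data is None:
        # Only hit the slow service on a cache miss.
        data = requests.get(FLASK_SERVICE_URL, timeout=30).json()
        cache.set(cache_key, data, timeout=60 * 15)  # keep for 15 minutes
    return JsonResponse(data)
```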
On our project, we have a couple of time-expensive endpoints. Our solution was similar to a previous answer:
Once we receive a request, we call a Celery task that does its expensive work asynchronously. We do not wait for its results and return a quick response to the user. The Celery task sends its progress/results to the user via WebSockets, and the frontend handles this WS message. The benefit of this approach is that we do not spend the CPU of our backend; we spend the CPU of the Celery worker that is running on another machine.
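
For illustration, a sketch of the WebSocket side of such a setup, assuming Django Channels (the answer doesn't name a specific WebSocket stack); the `progress_<task_id>` group name must match whatever the Celery worker passes to `group_send()`, and all names here are illustrative:

```python
# consumers.py -- relays worker progress messages to the connected client.
from channels.generic.websocket import AsyncJsonWebsocketConsumer


class ProgressConsumer(AsyncJsonWebsocketConsumer):
    async def connect(self):
        # One group per task, so only the right client receives the updates.
        task_id = self.scope["url_route"]["kwargs"]["task_id"]
        self.group_name = f"progress_{task_id}"
        await self.channel_layer.group_add(self.group_name, self.channel_name)
        await self.accept()

    async def disconnect(self, code):
        await self.channel_layer.group_discard(self.group_name, self.channel_name)

    # Handles messages sent by the worker with {"type": "task.progress", ...}.
    async def task_progress(self, event):
        await self.send_json({"progress": event["progress"]})
```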

Django: How to periodically and asynchronously publish new data to the respective client

The following is the sequence of events:
Part 1:
Client sends a form.
Django receives a form, validates it and creates a Task for Celery or Django-rq to run.
Returns results.html to the user.
Part 2:
The task is run by the workers, which generate JSON data every second.
This needs to be sent to the right client asynchronously, as part of results.html.
Client should see the updated results without doing any refresh.
How do I solve this?
After some research, the following are some of the ways I thought of:
Write the updated data to the database and have Django poll it with a scheduler. I'm not really sure how I can send it to the right client.
Have the client subscribe to events and use django-websocket-redis to publish the data. I'm not sure this is possible, because each client would require a unique websocket to subscribe to.
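
For what it's worth, a per-client subscription is possible by encoding an identifier in the socket URL. The question mentions django-websocket-redis, but purely as an illustration here is a routing sketch using Django Channels instead; all names are hypothetical:

```python
# asgi.py -- route each client to its own results "channel" via the task ID
# in the WebSocket URL; ResultsConsumer is a hypothetical consumer that joins
# a group named after that task ID.
from django.core.asgi import get_asgi_application
from django.urls import re_path
from channels.auth import AuthMiddlewareStack
from channels.routing import ProtocolTypeRouter, URLRouter

from myapp.consumers import ResultsConsumer  # hypothetical module

websocket_urlpatterns = [
    # Each client connects to ws/results/<task_id>/, so the worker can target
    # just that client with channel_layer.group_send().
    re_path(r"^ws/results/(?P<task_id>[\w-]+)/$", ResultsConsumer.as_asgi()),
]

application = ProtocolTypeRouter({
    "http": get_asgi_application(),
    "websocket": AuthMiddlewareStack(URLRouter(websocket_urlpatterns)),
})
```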

Flask request waiting for asynchronous background job

I have an HTTP API using Flask, and in one particular operation clients use it to retrieve information obtained from a 3rd-party API. The retrieval is done with a Celery task. Usually, my approach would be to accept the client request for that information and return a 303 See Other response with a URI that can be polled for the result once the background job is finished.
However, some clients require the operation to be done in a single request. They don't want to poll or follow redirects, which means I have to run the background job synchronously, hold on to the connection until it's finished, and return the result in the same response. I'm aware of Flask streaming, but how can I do this kind of long-polling with Flask?
Tornado would do the trick.
Flask is not designed for asynchronous work. By default, a Flask instance processes one request at a time in a single thread, so while you hold the connection open it cannot proceed to the next request.
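
For completeness, a minimal sketch of the blocking approach the question describes (dispatch to Celery, then hold the connection until the result is ready). This assumes the Flask app is served with multiple workers/threads so other requests can still be handled; the broker URL, timeout, and `fetch_third_party()` are illustrative:

```python
# app.py -- hold the HTTP connection open while waiting on the Celery result.
from celery import Celery
from celery.exceptions import TimeoutError as CeleryTimeout
from flask import Flask, jsonify, request

app = Flask(__name__)
celery = Celery(__name__,
                broker="redis://localhost:6379/0",
                backend="redis://localhost:6379/0")


@celery.task
def fetch_third_party(query):
    ...  # call the 3rd-party API here and return its payload
    return {"query": query}


@app.route("/lookup")
def lookup():
    async_result = fetch_third_party.delay(request.args.get("q", ""))
    try:
        # Block this request until the worker finishes or 30 seconds pass.
        payload = async_result.get(timeout=30)
    except CeleryTimeout:
        return jsonify({"error": "backend timed out"}), 504
    return jsonify(payload)
```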

async http request on Google App Engine Python

Does anybody know how to make an HTTP request from Google App Engine without waiting for a response?
It should be like pushing data over HTTP without paying the latency of waiting for the response.
I think that this section of the AppEngine docs is what you are looking for.
Use the taskqueue. If you're just pushing data, there's no sense in waiting for the response.
What you could do is enqueue a task in the request handler with whatever data was received (using the deferred library). As soon as the task has been enqueued successfully, you can return a '200 OK' response and be ready for the next push.
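
A minimal sketch of that pattern on the first-generation (Python 2) App Engine runtime; `store_push_data()` is a hypothetical function that does the real work inside the task:

```python
# handler.py -- enqueue the work and return immediately.
import webapp2
from google.appengine.ext import deferred


def store_push_data(payload):
    ...  # the heavy lifting happens later, on the task queue


class PushHandler(webapp2.RequestHandler):
    def post(self):
        # defer() enqueues a task that will call store_push_data(body);
        # this returns as soon as the task is queued.
        deferred.defer(store_push_data, self.request.body)
        self.response.set_status(200)
        self.response.write("OK")
```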
I've done this before by doing a URLFetch and setting a very low value for the deadline parameter. I used 0.1 as my value, i.e. 100 ms. You also need to wrap the URLFetch in a try/except, since the request will time out.
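
A rough sketch of that trick, again on the first-generation runtime; the target URL is a placeholder, and the broad `urlfetch_errors.Error` catch is used because the fetch is expected to hit its deadline:

```python
# Fire-and-forget URLFetch: set a tiny deadline and swallow the timeout.
from google.appengine.api import urlfetch
from google.appengine.api import urlfetch_errors


def push_without_waiting(payload):
    try:
        urlfetch.fetch(
            url="https://example.com/receive",  # placeholder endpoint
            payload=payload,
            method=urlfetch.POST,
            deadline=0.1,  # 100 ms: we expect this to time out
        )
    except urlfetch_errors.Error:
        # Expected: the remote side received the data; we just didn't wait.
        pass
```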
