How to send partial status of a request to the frontend with Django (Python)?

Suppose I have sent a POST request from React to a Django REST API, and that request is time-consuming. How can I find out what percentage of it has been processed and send that to the frontend before sending the real response?

There are two broad ways to approach this.
Polling (which I would recommend starting with): break the request up. The initial request doesn't start the work itself; it sends a message to an async task queue (such as Celery) to do the work. The response to the initial request is the ID of the Celery task that was spawned. The frontend can then use that task ID to poll the backend periodically, checking whether the task is finished and grabbing the results when they are ready.
WebSockets, wherein the connection to the backend is kept open across many requests, and either side can initiate sending data. I wouldn't recommend this to start with, since it's not really how Django is built, but with a higher level of investment it will give an even smoother experience.
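The polling option can be sketched without any framework. Below, a background thread and an in-memory registry stand in for Celery and its result backend; the names `start_task`, `get_status`, and the `_tasks` dict are illustrative, not Celery's API:

```python
import threading
import time
import uuid

# in-memory stand-in for Celery's result backend: task_id -> state
_tasks = {}

def _slow_work(task_id, steps=5):
    """Simulated long-running job that records its own progress."""
    for i in range(steps):
        time.sleep(0.01)  # pretend each step takes a while
        _tasks[task_id] = {"state": "PROGRESS",
                           "percent": int((i + 1) / steps * 100)}
    _tasks[task_id] = {"state": "SUCCESS", "percent": 100}

def start_task():
    """What the POST view would do: spawn the work, return only an ID."""
    task_id = str(uuid.uuid4())
    _tasks[task_id] = {"state": "PENDING", "percent": 0}
    threading.Thread(target=_slow_work, args=(task_id,), daemon=True).start()
    return task_id

def get_status(task_id):
    """What the polling view would return as JSON."""
    return _tasks[task_id]
```

With Celery proper, a bound task would report progress with `self.update_state(state="PROGRESS", meta={...})`, and the polling view would read `AsyncResult(task_id).state` and `.info`.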

Related

How to handle high response time

There are two different services. One service (Django) receives the request from the front end and then calls an API in the other service (Flask).
But the response time of the Flask service is high, and if the user navigates to another page, that request will be canceled.
Should it be a background task or a pub/sub pattern? If so, how do I run it in the background and then tell the user "here is your last result"?
You have two main options:
Make an initial request to a "simple" Django view that loads a skeleton HTML page with a spinner, where some JS triggers an XHR request to a second Django view that contains the call to the other service (Flask). That way you can properly alert the user that loading takes time, and handle exits on the browser side (ask for confirmation before leaving, abort the request, and so on).
If possible, cache the result of the Flask service, so you don't need to call it on each page load.
You can combine these two solutions by calling the service in an asynchronous request and caching its result (depending on context, you may need to vary the cache per connected user, for example).
The first solution could also be implemented with pub/sub, WebSockets, or the like, but a classical XHR seems fine for your case.
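The caching suggestion can be sketched framework-free; in Django you would normally use `django.core.cache` instead, and `cached_call` plus the TTL value here are illustrative:

```python
import time

# tiny TTL cache: key -> (expires_at, value)
_cache = {}

def cached_call(key, fetch, ttl=300):
    """Return the cached value for `key`; call `fetch()` only on a miss or expiry."""
    now = time.monotonic()
    hit = _cache.get(key)
    if hit is not None and hit[0] > now:
        return hit[1]
    value = fetch()  # e.g. the expensive call to the Flask service
    _cache[key] = (now + ttl, value)
    return value
```

A per-user variant just folds the user into the key, e.g. `cached_call(f"report:{user_id}", ...)`. With Django's cache framework this becomes `cache.get(key)` / `cache.set(key, value, timeout=300)`.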
On our project, we have a couple of time-expensive endpoints. Our solution was similar to a previous answer:
Once we receive a request, we call a Celery task that does its expensive work asynchronously. We do not wait for its results and return a quick response to the user. The Celery task sends its progress/results via WebSockets to the user, and the frontend handles this WS message. The benefit of this approach is that we do not spend the CPU of our backend; we spend the CPU of the Celery worker, which runs on another machine.
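The shape of that flow can be sketched with a plain queue standing in for the user's WebSocket connection; with Django Channels, the task would instead do a `group_send` to the user's group via the channel layer. Everything named here is illustrative:

```python
import json
import queue
import threading
import time

def run_task(ws_send, steps=4):
    """Celery-task stand-in: do the work, push progress to the 'socket' as it goes."""
    for i in range(steps):
        time.sleep(0.01)  # a slice of the expensive work
        ws_send(json.dumps({"type": "progress",
                            "percent": int((i + 1) / steps * 100)}))
    ws_send(json.dumps({"type": "done"}))

# a queue plays the role of the user's WebSocket connection
socket = queue.Queue()
worker = threading.Thread(target=run_task, args=(socket.put,), daemon=True)
worker.start()
worker.join()

messages = []
while not socket.empty():
    messages.append(json.loads(socket.get()))
```

The frontend side is symmetric: a WebSocket `onmessage` handler that updates a progress bar from `percent` and swaps in the result on `done`.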

Pending requests with Python's SimpleHTTPServer

I'm making an anonymous chat application, similar to Omegle. My method of approach instead of using sockets is to use a REST API, but to add a bit of a twist. When a user makes a request, such as POST /search (find a match), the request is held by the server until another user sends a POST /search. Once two people have done this, both requests are responded to which lets the client know to switch to a chat page. This is also done with a pending GET /events request, which is only responded to by the server if there's any new events to be sent.
This works very well in theory with the flow of this application; however, since I'm using SimpleHTTPServer - which is a very basic library - requests are not handled asynchronously. This means that if I block one request until its information requirements are fulfilled, no other requests can be accepted. For this kind of project I really don't want to take the time to learn an entirely new library/sub-language for asynchronous request handling, so I'm trying to figure out how I can do this.
def waitForMatch(client):
    # if no other clients are available to match:
    if not pendingQueue:
        # add this client to the pending queue
        pendingQueue.append(client)
        client["pending"] = True
        while client["pending"]:
            time.sleep(1)
        # break out and let the other client's waitForMatch call create the chat session
        return
    # now there's a client available to match
    otherClient = pendingQueue.pop()
    otherClient["pending"] = False
    # chat session created
    createChatSession(otherClient, client)
This is the code I currently have, which won't work with non-async requests.

Asynchronous Payments with Django and Celery

I was wondering if anyone knows a way to take payments asynchronously using Django and Celery. My desired process would be this:
Client submits payment form
Request is sent to Django
The request is passed to celery to do the work
Django closes the request by returning a response
Client is then notified when the payment is complete
I'd like to do this without polling.
How would I get the "payment complete" signal back to the client if the original request was closed? Would some kind of node/tornado integration be possible here?
Thanks.
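Without polling, the missing piece is a completion hook. The stdlib's `concurrent.futures` shows the shape: `add_done_callback` fires when the background job finishes, rather than anyone asking. Here `charge` and `notify_client` are made-up stand-ins; in the question's stack, the Celery task itself would push the "payment complete" message to the client over a WebSocket (e.g. via Django Channels) instead of a callback:

```python
import threading
from concurrent.futures import ThreadPoolExecutor

def charge(amount_cents):
    """Stand-in for the slow payment-provider call."""
    return {"status": "paid", "amount": amount_cents}

notified = []
done = threading.Event()

def notify_client(future):
    """Runs automatically once the payment finishes -- no polling involved."""
    notified.append(future.result())
    done.set()

executor = ThreadPoolExecutor(max_workers=1)
future = executor.submit(charge, 1999)   # "Django closes the request" here
future.add_done_callback(notify_client)  # push, don't poll
done.wait(timeout=5)
```

The node/Tornado idea in the question is one way to hold that push channel open; today Django Channels fills the same role without leaving Django.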

Django: How to periodically and asynchronously publish new data to the respective client

Following is a sequence of events that is happening
Part 1:
Client sends a form.
Django receives a form, validates it and creates a Task for Celery or Django-rq to run.
Returns results.html to the user.
Part 2:
Task is run by the workers which generates JSON data every second.
This needs to be sent to the right client asynchronously and as a part of result.html.
Client should see the updated results without doing any refresh.
How do I solve this?
After some amount of research following are some of the ways I thought of:
Write the updated data to the database and have Django poll it with a scheduler. I'm not really sure how I would send it to the right client, though.
Have the client subscribe to events and use django-websocket-redis to publish the data. I'm not sure this works here, since each client would need a unique WebSocket to subscribe to, and I don't know whether that is possible.
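The "right client" part is just routing by a per-client key. Sketched below with in-memory queues standing in for the WebSocket connections; in django-websocket-redis or Channels, the key would be the facility/channel or group name, and all names here are illustrative:

```python
import json
import queue

# client_id -> the queue standing in for that client's WebSocket
_connections = {}

def connect(client_id):
    """Called when a client's WebSocket opens."""
    _connections[client_id] = queue.Ueue() if False else queue.Queue()
    return _connections[client_id]

def publish(client_id, payload):
    """Called by the worker every second: deliver JSON only to the matching client."""
    conn = _connections.get(client_id)
    if conn is not None:
        conn.put(json.dumps(payload))

alice = connect("alice")
bob = connect("bob")
publish("alice", {"progress": 40})
publish("alice", {"progress": 100})
```

A natural key is the task ID returned when the form was submitted: the results.html page opens a socket scoped to that ID, and the worker publishes to the same ID.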

Flask request waiting for asynchronous background job

I have an HTTP API using Flask, and in one particular operation clients use it to retrieve information obtained from a 3rd-party API. The retrieval is done with a Celery task. Usually, my approach would be to accept the client request for that information and return a 303 See Other response with a URI that can be polled for the result once the background job is finished.
However, some clients require the operation to be done in a single request. They don't want to poll or follow redirects, which means I have to run the background job synchronously, hold on to the connection until it's finished, and return the result in the same response. I'm aware of Flask streaming, but how do I do such long-polling with Flask?
Tornado would do the trick.
Flask is not designed for asynchronous handling. By default, a Flask instance processes one request at a time in one thread; therefore, while you hold the connection open, it will not proceed to the next request.
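Holding the connection amounts to blocking on the background job with a timeout, which is what Celery's `AsyncResult.get(timeout=...)` does. The same shape with the stdlib, as a sketch (the function names are illustrative, and this assumes the WSGI server runs with enough worker threads that one blocked request doesn't starve the rest):

```python
import time
from concurrent.futures import ThreadPoolExecutor, TimeoutError

def fetch_from_third_party():
    """Stand-in for the Celery task that calls the 3rd-party API."""
    time.sleep(0.05)
    return {"data": "payload"}

executor = ThreadPoolExecutor(max_workers=1)

# What the single-request Flask view would do: start the job, then block
# this request (and only this request) until it finishes or times out.
future = executor.submit(fetch_from_third_party)
try:
    result = future.result(timeout=2)  # cf. AsyncResult.get(timeout=2)
except TimeoutError:
    result = None  # the view would turn this into a 504
```

With Celery, the view can simply call `task.delay(...).get(timeout=...)` for these clients and fall back to the 303-and-poll flow for everyone else.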
