Flask: asynchronous response to client - python

I'm using Flask to develop a web server in a Python app. I'm trying to achieve this scenario: the client (it won't be a browser) sends a request, the server does some long task in the background and, on completion, sends the response back to the client asynchronously. Is it possible to do that?

What you ask cannot be done with the HTTP protocol: each request receives a response synchronously. The closest you can get to what you want would be this:
The client sends the request and the server responds with a job id immediately, while it also starts a background task for this long calculation.
The client can then poll the server for status by sending the job id in a new request. The response is again immediate and contains a job status, such as "in progress", "completed", "failed", etc. The server can also return a progress percentage, which the client can use to render a progress bar.
You could also implement WebSockets, but that will require a WebSocket-capable server and client.
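For illustration, a bare-bones sketch of the job-id-plus-polling approach could look like this (the route names, the uuid-based job ids and the in-memory jobs dict are my own assumptions, not something the answer above prescribes):
import threading
import uuid

from flask import Flask, jsonify

app = Flask(__name__)
jobs = {}  # job_id -> {"status": ..., "result": ...}

def long_task(job_id):
    # ... the long calculation goes here ...
    jobs[job_id] = {"status": "completed", "result": 42}

@app.route('/start', methods=['POST'])
def start():
    job_id = str(uuid.uuid4())
    jobs[job_id] = {"status": "in progress", "result": None}
    threading.Thread(target=long_task, args=(job_id,), daemon=True).start()
    return jsonify(job_id=job_id), 202  # respond immediately with the job id

@app.route('/status/<job_id>')
def status(job_id):
    # the client polls this endpoint with the job id it received
    return jsonify(jobs.get(job_id, {"status": "unknown"}))
In a real deployment the job store would live in something shared (Redis, a database) rather than a module-level dict, so it survives restarts and multiple worker processes.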

Related

How to store a message from the client in the server and send it further via some route in Flask

From a web application, a request is made to the backend application (Python with Flask and flask-socketio). From this route on the backend, an emit is done to a standalone socketio client application. This works fine, but when the client app sends back a message directly afterwards, I want to retrieve this message and return it from my route to the web application. The message I get back from the client via a callback arrives asynchronously, so how, in the simplest manner, could this be achieved? Each time I fetch the message from the client, the route has already sent back a reply to the web app without the message.
I fully understand that this flow is not the usual one, but can this be achieved without saving the message into a database, storing it somewhere on the backend instead and sending it back to the web app?
You can use an Event object from the Python standard library.
from threading import Event
my_event = Event()
In your Flask route:
my_event.wait() # block until the event is signaled
return socketio_response
In your Socket.IO callback function:
socketio_response = data
my_event.set() # alert the route that a result is now available
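Put together, the pieces could look roughly like the sketch below. The 'do_work' event name, the connect handler that remembers the standalone client's sid, the ten-second timeout and the module-level holder dict are all my own illustrative assumptions, and whether blocking on the Event inside a route is safe depends on the async mode you run Flask-SocketIO with:
from threading import Event

from flask import Flask, jsonify, request
from flask_socketio import SocketIO

app = Flask(__name__)
socketio = SocketIO(app)

my_event = Event()
holder = {'response': None}   # shared spot for the client's reply
clients = {}                  # sid of the standalone client, recorded on connect

@socketio.on('connect')
def on_connect():
    clients['worker'] = request.sid  # assume the standalone app is the one connecting

def on_client_reply(data):
    # acknowledgement callback, invoked when the standalone client answers
    holder['response'] = data
    my_event.set()  # alert the route that a result is now available

@app.route('/ask-client')
def ask_client():
    my_event.clear()
    socketio.emit('do_work', {'payload': '...'},
                  to=clients['worker'], callback=on_client_reply)
    got_reply = my_event.wait(timeout=10)  # block until the callback fires, or give up
    return jsonify(client_said=holder['response'] if got_reply else None)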

Bottle-WebSocket: How to ensure an HTTP request is from the same session as ws connection?

I built a web application using the Python Bottle framework.
I used the bottle-websocket plugin for WebSocket communication with clients.
Here is a part of my code.
from bottle import Bottle, request, run
from bottle.ext.websocket import GeventWebSocketServer, websocket

class MyHandler():
    ...

class MyServer(Bottle):
    ...

    def _serve_websocket(self, ws):
        handler = MyHandler()
        some_data = request.cookies.get('some_key')  # READ SOME DATA FROM HTTP REQUEST
        while True:
            msg = ws.receive()
            handler.do_sth_on(msg, some_data)  # USE THE DATA FROM HTTP REQUEST
            ws.send(msg)
        del(handler)

if __name__ == '__main__':
    run(app=MyServer(), server=GeventWebSocketServer, host=HOST, port=PORT)
As the code shows, I need to read some data from the browser (cookies or anything in the HTTP request headers) and use it for WebSocket message processing.
How can I ensure the request is from the same browser session as the one where WebSocket connection comes?
NOTE
As I do not have much knowledge of HTTP and WebSocket, I'd love to hear as detailed an answer as possible.
How can I ensure the request is from the same browser session as the one where WebSocket connection comes?
"Browser session" is a bit abstract, since HTTP does not have a concept of sessions. HTTP and RESTful APIs are designed to be stateless, but there are options.
Usually, what you want to know is which user the request comes from. This is typically solved by authentication, e.g. using OpenID Connect and letting the user send their JWT token in the Authorization: header; this works for all HTTP requests, including the one that sets up a WebSocket connection.
bottle-oauthlib seems to be a library for authenticating end users using OAuth2 / OpenID Connect.
Another option is to identify the "browser session" using cookies, but this depends on state somewhere on the server side and is harder to implement on cloud-native platforms like Kubernetes that prefer stateless workloads.
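As a rough sketch of the cookie option in Bottle (the cookie name, the SECRET value and the routes are invented for illustration): the same signed cookie can be read both in an ordinary route and in the websocket handler, because the websocket upgrade is itself an HTTP request that carries the browser's cookies:
import uuid

from bottle import Bottle, request, response, run
from bottle.ext.websocket import GeventWebSocketServer, websocket

SECRET = 'replace-me'
app = Bottle()

@app.route('/')
def index():
    session_id = request.get_cookie('session_id', secret=SECRET)
    if not session_id:
        session_id = str(uuid.uuid4())
        response.set_cookie('session_id', session_id, secret=SECRET, httponly=True)
    return 'your session id is %s' % session_id

@app.route('/ws', apply=[websocket])
def serve_websocket(ws):
    # the upgrade request carries the same cookies as any other HTTP request
    session_id = request.get_cookie('session_id', secret=SECRET)
    while True:
        msg = ws.receive()
        if msg is None:
            break
        ws.send('[%s] %s' % (session_id, msg))

if __name__ == '__main__':
    run(app=app, server=GeventWebSocketServer, host='localhost', port=8080)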

Making asynchronous HTTP requests from a flask service

I have a couple of different needs for asynchrony in my Python 3.6 Flask RESTful web service running under Gunicorn.
1) I'd like for one of my service's routes to be able to send an HTTP request to another HTTP service and, without waiting for the response, send a response back to the client that called my service.
Some example code:
#route
def fire_and_forget():
    # Send request to other server without waiting
    # for it to send a response.
    # Return my own response.
2) I'd like for another one of my service's routes to be able to send 2 or more asynchronous HTTP requests to other HTTP services and wait for them all to reply before my service sends a response.
Some example code:
#route
def combine_results():
    # Send request to service A
    # Send request to service B
    # Wait for both to return.
    # Do something with both responses
    # Return my own response.
Thanks in advance.
EDIT: I am trying to avoid the additional complexity of using a queue (e.g. celery).
You can use eventlet for the second use case. It's pretty easy to do:
import eventlet

providers = [EventfulPump(), MeetupPump()]
try:
    pool = eventlet.GreenPool()
    pile = eventlet.GreenPile(pool)
    for each in providers:
        pile.spawn(each.get, [], 5, loc)  # call the interface method
except (PumpFailure, PumpOverride):
    return abort(503)

results = []
for res in pile:
    results += res
You can wrap each of your API endpoints in a class that implements a "common interface" (in the above it is the get method) and make the calls in parallel. I just place them all in a list.
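If you do not want to write wrapper classes, the same GreenPile pattern also works with plain functions. The URLs below are placeholders, and eventlet's monkey patching is assumed so that urllib yields cooperatively:
import eventlet
eventlet.monkey_patch()  # make sockets (and therefore urllib) cooperative

from urllib.request import urlopen

def fetch(url):
    with urlopen(url, timeout=5) as resp:
        return resp.read()

def combine_results():
    pool = eventlet.GreenPool()
    pile = eventlet.GreenPile(pool)
    for url in ('http://service-a.local/data', 'http://service-b.local/data'):
        pile.spawn(fetch, url)
    # iterating the pile blocks until every spawned call has returned
    return [body for body in pile]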
Your other use case is harder to accomplish in straight python. At least a few years ago you would be forced to introduce some sort of worker process like celery to get something like that done. This question seems to cover all the issues:
Making an asynchronous task in Flask
Perhaps things have changed in flask land?

HTTP REST Gateway to AMQP Request-Response, Without Web Sockets Or Polling

I've struggled for two days to understand how a REST API gateway should return responses to browsers' GET requests when the backend service runs on AMQP (without using WebSockets or polling).
I have successfully done RPC between AMQP services (with RabbitMQ's reply_to & correlation_id), but with the Flask HTTP request waiting I'm still lost.
gateway.py - Response Handler Inside The HTTP Handler, Times out
import time

import pika

# broker: AMQP helper object defined elsewhere in the app

def products_get():
    def handler(ch=None, method=None, properties=None, body=None):
        if body:
            return body
        return False

    return_queue = 'products.get.return'
    broker.channel.queue_declare(return_queue)
    broker.channel.basic_consume(handler, return_queue)
    broker.publish(exchange='', routing_key='products.get', body='Request data',
                   properties=pika.BasicProperties(reply_to=return_queue))

    now = time.time()  # for timeout. Not having this returns 'no content' immediately
    while time.time() < now + 1:
        if handler():
            return handler()
    return 'Time out'
POST/PUT can simply send the AMQP message, return 200/201/202 immediately, and let the service work at its own pace. A separate REST interface just for GET requests seems implausible, but I don't know the other options.
Regards
I think what you're asking is "how to perform asynchronous GET requests", and I reckon the answer is: you can't, and you shouldn't. It's bad practice or bad design, and it does not scale.
Why are you trying to get your GET response payload from AMQP?
If the payload (the content of the response) can be pulled from some DB, just pull it from there. That's called a synchronous request.
If the payload must be processed in some backend, send it away and don't have the requester wait for a response. You could assign some ID and have the requester ask again later (or collect a callback URL from the requester and have your backend POST the response once it's ready, which is a less common design).
EDIT:
So, given that you have to work with an AMQP-backed backend, I would do something a little more elaborate: spawn a thread or a process in your front end that constantly consumes from AMQP and stores the results locally or in some DB, and serve GET results based on the data you stored locally. If the data isn't available yet, just return 404. Ideally you'll need to reshape your API: split it into "post" requests (that trigger work at the backend) and "get" requests (that return the results if they're available).
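A rough sketch of that consumer-thread idea, using pika 1.x style calls (the queue name follows the question; the in-memory results dict and the route shape are my own assumptions):
import threading

import pika
from flask import Flask, abort, jsonify

app = Flask(__name__)
results = {}  # correlation_id -> payload, an in-memory stand-in for a real store

def consume_results():
    conn = pika.BlockingConnection(pika.ConnectionParameters('localhost'))
    ch = conn.channel()
    ch.queue_declare(queue='products.get.return')

    def on_message(ch, method, properties, body):
        # stash the reply so the HTTP side can serve it later
        results[properties.correlation_id] = body
        ch.basic_ack(delivery_tag=method.delivery_tag)

    ch.basic_consume(queue='products.get.return', on_message_callback=on_message)
    ch.start_consuming()

threading.Thread(target=consume_results, daemon=True).start()

@app.route('/products/<correlation_id>')
def products_get(correlation_id):
    if correlation_id not in results:
        abort(404)  # not processed yet
    return jsonify(data=results[correlation_id].decode())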

Socket Connection between django servers

I am a newbie to Django and Python. What I need is to connect more than one Django server with sockets. One of these servers (the main server) will get a request from a mobile client through the Django REST API and should then transmit it to one of the other Django servers, chosen by a server ID (e.g. when the main server gets data with ID 1, it should transmit the data to server #1; if it gets data with ID 2, it should transmit the data to server #2).
I am looking forward to your advice.
p.s. HTTP requests cannot be sent to the Django servers except the main one. Each of them is an intranet application and their locations are different. The only way to send data to these servers via HTTP is to send the request to the main server with the ID of the target server.
If you are not able to send an (internal) HTTP request, not even to localhost, you can try to speak to the WSGI API of the different Django apps. The main app might create a WSGI application object and fill it with pseudo request data.
# views.py of the main server
def myview(request):
    # do some stuff
    if server_id == 1:
        from server_1_app.wsgi import application
        # a WSGI app is called with an environ dict (carrying the pseudo
        # request data) and a start_response callable
        response = application(environ, start_response)
    # ...
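To make that concrete, a minimal WSGI call could look like the sketch below; the path, method and environ keys are illustrative, and Django may require more environ entries in practice:
import io

from server_1_app.wsgi import application

def call_server_1(path, body=b''):
    environ = {
        'REQUEST_METHOD': 'POST',
        'PATH_INFO': path,
        'QUERY_STRING': '',
        'CONTENT_LENGTH': str(len(body)),
        'SERVER_NAME': 'localhost',
        'SERVER_PORT': '80',
        'wsgi.url_scheme': 'http',
        'wsgi.input': io.BytesIO(body),
        'wsgi.errors': io.StringIO(),
    }
    captured = {}

    def start_response(status, headers, exc_info=None):
        captured['status'] = status
        captured['headers'] = headers

    chunks = application(environ, start_response)  # call the other app in-process
    return captured['status'], b''.join(chunks)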
