I'm trying to build a simple REST API proxy server that receives HTTP requests and forwards them to a remote server.
I need it to be non-transparent: if the client sends a request and the proxy does not have a response ready in its cache, it should not block the client; instead it should immediately return some kind of placeholder response, and the real response can be delivered to the client later.
I am trying to use Flask to create the server, but I don't understand how I can send a response back to the client and later update it with the real response from the remote server.
My idea was to use Flask and, whenever I get a request, start a new thread that forwards the request to the remote server; when the thread finishes, it would send the response back to the client.
But what do I send to the client before starting the thread? I thought of maybe using a Future object that I would resolve in the thread, but I don't really know if that is the right way to go about this, since I am new to Python and server programming.
I would appreciate it if someone could point me in the right direction on how to implement this proxy server.
Thank you!
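To make the idea concrete, here is a very rough, untested sketch of what I have in mind: answer immediately with a job id, forward the request in a background thread, and let the client poll for the result later. The /proxy and /result routes, the UPSTREAM address, and the in-memory results dict are just placeholders, not a final design.

import threading
import uuid

import requests
from flask import Flask, jsonify, request

app = Flask(__name__)
UPSTREAM = "http://remote-server.example.com"   # placeholder remote server
results = {}                                    # job_id -> upstream response body (the "cache")

def forward(job_id, path, data):
    # Blocking call, but it runs in its own thread so the client is not held up.
    resp = requests.post(f"{UPSTREAM}/{path}", data=data)
    results[job_id] = resp.text

@app.route("/proxy/<path:path>", methods=["POST"])
def proxy(path):
    job_id = str(uuid.uuid4())
    threading.Thread(target=forward, args=(job_id, path, request.get_data()),
                     daemon=True).start()
    # Immediate reply: 202 Accepted plus a token the client can use later.
    return jsonify({"status": "pending", "job_id": job_id}), 202

@app.route("/result/<job_id>")
def result(job_id):
    if job_id not in results:
        return jsonify({"status": "pending"}), 202
    return results[job_id], 200

if __name__ == "__main__":
    app.run()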
Related
I have a server that is supposed to stream an endless set of data to a web client when the client subscribes to it using grpc-web,
but my problem is that the server keeps sending data even after the user goes to some other page and leaves the streaming area.
I'm looking for a way for the server to tell whether the user is still listening on the stream, so that I can cancel the streaming on the server side.
Note that I'm using grpc-web streaming and proto3.
The server is in Python and the client is Angular and TypeScript.
The server stubs were created with the betterproto plugin.
Any help or idea to handle this problem will be appreciated.
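For reference, this is roughly how a server-streaming handler can notice the client going away with the plain grpcio servicer API (not betterproto/grpclib, so it is only a sketch of the general idea; the service and helper names below are made up). Note that the grpc-web proxy sitting in front of the server also has to propagate the client's disconnect for this to trigger.

import time
import grpc  # assumes the standard grpcio server, not grpclib

class DataService:  # would normally subclass the generated *Servicer class
    def Subscribe(self, request, context):
        # Fires when the RPC is cancelled, e.g. when the grpc-web client disconnects.
        context.add_callback(lambda: print("client went away"))

        while context.is_active():          # stop producing once the RPC is dead
            yield self.next_chunk(request)  # hypothetical helper building one message
            time.sleep(1)

    def next_chunk(self, request):
        raise NotImplementedError           # placeholder for real data production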
I'm using a service that sends me some data from users over webhooks. If there is any user interaction on this service, it hits my URL with an HTTP request, with the data in POST/GET, and then expects a text/JSON response to show back to the user. The response has to come within a few seconds, otherwise the HTTP request times out and the service has no way of finding out what the response to the user should be.
The problem is that I'm no longer processing these data on my server with a public IP; I need to do it on my RPi, which keeps moving, meaning it has a different IP every few hours, and mostly not a public one.
I'm sure I will still need to use the server with the public IP to redirect these requests to my RPi, and I have a few ideas, but I don't know which of them is reliable or would even work.
1. Let the API talk to my server and save the data, then have the RPi constantly ask my server whether there are any new data. Probably the dumbest idea: not ideal over a metered connection, probably slower to reply, and it would be hard to return the RPi's reply inside the HTTP request made by the API.
2. Have a (Python) script running on my server that will a) act as a socket server the RPi connects to, and b) run a SimpleHTTPRequestHandler to process requests from the API, pass them over the socket, and then reply with the RPi's answer. Probably an easy way to keep a connection between my server and the RPi, allowing me to pass data in both directions (a rough sketch follows after this list).
3. Open an SSH tunnel between the RPi and my server. This way I could process the requests from the service directly on my RPi. But how reliable is this solution? (Keeping it alive, opening the tunnel automatically, etc. is probably a question for the Super User forum.)
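To make option 2 concrete, here is a rough, untested sketch of what I imagine running on the public server, assuming a single RPi connection at a time and small, newline-delimited payloads; the ports and the framing are just placeholders. On the RPi side I would connect a socket to port 9000, read one line per webhook, and write back one reply per request.

import socket
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

rpi_conn = None                 # the currently connected RPi socket
rpi_lock = threading.Lock()

def accept_rpi(listen_port=9000):
    # Wait for the RPi to connect and keep (only) the latest connection.
    global rpi_conn
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind(("", listen_port))
    srv.listen(1)
    while True:
        conn, _addr = srv.accept()
        with rpi_lock:
            rpi_conn = conn

class WebhookHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        body = self.rfile.read(int(self.headers.get("Content-Length", 0)))
        with rpi_lock:
            if rpi_conn is None:
                self.send_error(503, "RPi not connected")
                return
            rpi_conn.sendall(body + b"\n")   # forward the webhook payload
            reply = rpi_conn.recv(65536)     # wait for the RPi's answer
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(reply)

if __name__ == "__main__":
    threading.Thread(target=accept_rpi, daemon=True).start()
    HTTPServer(("", 8080), WebhookHandler).serve_forever()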
I'm thinking of going with choice 3 if it turns out to be possible, but first I'd like to hear what you guys think. Is this a good and reliable idea? Or are there better ways I don't know about? Or has anybody already faced this problem?
To sum it up:
Something sends an HTTP request to a public IP. I need to process this request (and reply to it) in a Python script on a device without a public IP. I have a server with a public IP that could be used as a bridge. I don't much care what runs on the server, as long as it can redirect these requests.
Thanks
I have managed to build a simple client-server application in Twisted that takes data from the serial port and sends it to the server. I want to know how I can add any kind of authentication for accessing the server. Right now anyone with the server's IP can send data to the server. Any help would be highly appreciated.
I can redirect you to this question.
Basically, you need to implement the client and server sides of a protocol that parses a username and password, validates them, and then either keeps the connection open (or routes it to a new address) or closes it.
Lower level approaches are also possible, but way more complicated.
Twisted also has SSL authentication built in, if that is of any interest to you.
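As a very rough sketch of the protocol-level approach (the "AUTH <password>" line format and the shared secret are made up for illustration, not anything Twisted provides):

from twisted.internet import reactor
from twisted.internet.protocol import Factory
from twisted.protocols.basic import LineReceiver

SECRET = b"change-me"                      # illustrative shared secret

class AuthedReceiver(LineReceiver):
    def connectionMade(self):
        self.authenticated = False

    def lineReceived(self, line):
        if not self.authenticated:
            # The first line must be "AUTH <password>"; otherwise drop the client.
            if line == b"AUTH " + SECRET:
                self.authenticated = True
                self.sendLine(b"OK")
            else:
                self.transport.loseConnection()
            return
        self.handleData(line)              # only reached after a successful login

    def handleData(self, line):
        print("serial data:", line)

reactor.listenTCP(8007, Factory.forProtocol(AuthedReceiver))
reactor.run()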
I'm looking to start a web project using Flask and its SocketIO plugin, which depends on gevent (something something greenlets), but I don't understand how gevent relates to the webserver. Does using gevent restrict my server choice at all? How does it relate to the different levels of web servers that we have in python (e.g. Nginx/Apache, Gunicorn)?
Thanks for the insight.
First, let's clarify what we are talking about:
gevent is a library that makes it easy to program with event loops. It is a way to immediately return responses without "blocking" the requester.
socket.io is a JavaScript library for creating clients that can maintain permanent connections to servers, which send events. The library can then react to these events.
greenlet: think of this as a lightweight thread, a way to launch multiple workers that each do some task.
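A tiny illustration of greenlets via gevent (the worker function here is made up):

import gevent

def worker(n):
    gevent.sleep(1)          # yields to the event loop instead of blocking
    print(f"worker {n} done")

# Spawn three greenlets ("workers") and wait for all of them.
gevent.joinall([gevent.spawn(worker, i) for i in range(3)])
# All three finish after about 1 second total, not 3, because the sleeps overlap.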
A highly simplified overview of the entire process follows:
Imagine you are creating a chat client.
You need a way to notify the users' screens when anyone types a message. For this to happen, you need some way to tell all the users when a new message is there to be displayed. That's what socket.io does. You can think of it like a radio that is tuned to a particular frequency. Whenever someone transmits on this frequency, the code does something. In the case of the chat program, it adds the message to the chat box window.
Of course, if you have a radio tuned to a frequency (your client), then you need a radio station/dj to transmit on this frequency. Here is where your flask code comes in. It will create "rooms" and then transmit messages. The clients listen for these messages.
You can also write the server-side ("radio station") code in socket.io using node, but that is out of scope here.
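A minimal sketch of that "radio station" side using Flask-SocketIO (the event names and the room field are illustrative, not required by the library):

from flask import Flask
from flask_socketio import SocketIO, emit, join_room

app = Flask(__name__)
socketio = SocketIO(app)          # picks up gevent automatically if it is installed

@socketio.on("join")
def on_join(data):
    join_room(data["room"])       # tune this client to a "frequency"

@socketio.on("chat message")
def on_chat(data):
    # Transmit the message to everyone tuned to the same room.
    emit("chat message", data["text"], room=data["room"])

if __name__ == "__main__":
    socketio.run(app)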
The problem here is that traditionally - a web server works like this:
1. A user types an address into a browser, and hits enter (or go).
2. The browser reads the web address, and then using the DNS system, finds the IP address of the server.
3. It creates a connection to the server, and then sends a request.
4. The webserver accepts the request.
5. It does some work, or launches some process (depending on the type of request).
6. It prepares (or receives) a response from the process.
7. It sends the response to the client.
8. It closes the connection.
Between steps 3 and 8, the client (the browser) is waiting for a response; it is blocked from doing anything else. So if there is a problem somewhere, say some server-side script is taking too long to process the request, the browser stays stuck on a white page with the loading icon spinning. It can't do anything until the entire process completes. This is just how the web was designed to work.
This kind of 'blocking' architecture works well for 1-to-1 communication. However, for multiple people to keep updated, this blocking doesn't work.
The event libraries (such as gevent) help with this because they accept the request without blocking the client: they immediately send a response and carry on working until the process is complete.
Your application, however, still needs to notify the client, but since the connection has already been closed, you don't have a way to contact the client again.
In order to notify the client and to make sure the client doesn't need to "refresh", a permanent connection should be open - that's what socket.io does. It opens a permanent connection, and is always listening for messages.
1. A work request comes in from one end and is accepted.
2. The work is executed and a response is generated by something else (it could be the same program or another program).
3. Then a notification is sent: "hey, I'm done with your request - here is the response".
4. The client from step 1 listens for this message and then does something.
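A sketch of that accept-now, notify-later flow with Flask-SocketIO (the "work"/"work done" event names and the do_the_work() helper are made up):

import time
from flask import Flask, request
from flask_socketio import SocketIO

app = Flask(__name__)
socketio = SocketIO(app)

def do_the_work(sid, payload):
    time.sleep(5)                                    # stand-in for the slow task
    # Step 3: notify only the client that asked (each client sits in a room named by its sid).
    socketio.emit("work done", {"result": payload}, room=sid)

@socketio.on("work")
def on_work(payload):
    # Steps 1-2: accept immediately and run the work in the background.
    socketio.start_background_task(do_the_work, request.sid, payload)
    return "accepted"                                # immediate acknowledgement to the client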
Underneath it all is WebSocket, a newer full-duplex protocol that enables all this radio/DJ functionality.
Things common between WebSockets and HTTP:
Work on the same port (80)
WebSocket requests start off as HTTP requests for the handshake (an upgrade header), but then shift over to the WebSocket protocol - at which point the connection is handed off to a websocket-compatible server.
All your traditional web server has to do is listen for this handshake request, acknowledge it, and then pass the request on to a websocket-compatible server - just like any other normal proxy request.
For Apache, you can use mod_proxy_wstunnel.
nginx versions 1.3+ have WebSocket support built in.
I'm trying to code a web chat using Tornado.
The client (user) sends a long-running POST request, during which I send messages back in the response. But I am having problems checking whether the user is online.
When a user signs out or simply closes the tab/browser, everything is simple: on_connection_close() is executed and I understand that they have disconnected. But if the client loses its Internet connection, on_connection_close() is not called.
How can I check whether the user is online?
You might want to take a look at tornadio2, which is Tornado + Socket.io, and implement a multiplexed connection: one endpoint to handle pushing messages, the other to ping the server so that you can check whether the client is still connected.
The multiplexing does not open multiple connections; it uses a single connection to virtually connect to different handlers. Look at multiplexed.py, line 66.
class RouterConnection(SocketConnection):
    # Map each virtual endpoint to the connection class that handles it.
    __endpoints__ = {'/chat': ChatConnection,
                     '/ping': PingConnection}
The multiplex example is also a sample chat application.
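A hypothetical PingConnection for the '/ping' endpoint above could track a last-seen timestamp per client; the online_users dict, the timeout, and the 'pong' reply are all illustrative. The client would then emit a small message on '/ping' every few seconds.

import time
from tornadio2 import SocketConnection

online_users = {}                          # connection -> timestamp of the last ping

class PingConnection(SocketConnection):
    def on_open(self, request):
        online_users[self] = time.time()

    def on_message(self, message):
        online_users[self] = time.time()   # the client is still alive
        self.send('pong')

    def on_close(self):
        online_users.pop(self, None)

def is_online(conn, timeout=15):
    # Treat a client as offline if we have not heard from it for `timeout` seconds.
    last = online_users.get(conn)
    return last is not None and time.time() - last < timeout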