I just wanted to know if the following is a good, bad, or terrible idea: I'm coding up a game (just for fun, no need to worry about scaling or performance) that will use Express.js to manage websockets via socket.io, but I'd also like to code the core of the game in Python. So, my idea is to run, on a single server (let's say EC2), the Express.js app which handles the socket connections and essentially proxies game actions to a Flask application running on the same server and exposed via some port on localhost.
In other words, a user connects via sockets on the game server. Then, when a "game action" occurs and the Express.js server sees it, it passes the JSON over to localhost:5000/my-flask-api/game-action in order for the Python code to process the action, which sends a response back to Express.js, which then is sent back to the user.
The idea is that I run both Node and Python on the same server (using Screen or some other terminal multiplexer) to avoid paying for two separate servers, reduce latency, and give myself access both to websockets via socket.io and to Python, since the game will be more fun to code in Python. Is this idea dumb?
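(For concreteness, here is a minimal sketch of what the Flask side of that proxy could look like; the /my-flask-api/game-action route comes from the question above, while the payload handling is just a placeholder.)

from flask import Flask, request, jsonify

app = Flask(__name__)

@app.route('/my-flask-api/game-action', methods=['POST'])
def game_action():
    action = request.get_json()                   # JSON forwarded by the Express proxy
    result = {'status': 'ok', 'echo': action}     # placeholder for the real game logic
    return jsonify(result)                        # Express relays this back over the socket

if __name__ == '__main__':
    app.run(port=5000)                            # bound to localhost only by default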
I have a Flask/SocketIO application which currently pairs two clients to play a game together. Right now the clients are interacting with the server through some compiled client-side JavaScript, and I am using SocketIO to define the events which the clients emit, e.g., movedForward when that client moves forward. The client-side JS similarly defines handlers for the events which the server will emit, e.g., partnerTurnedRight when the server is passing the partner's movement to the other player.
I would like to create 'dummy' clients on the server side which can interact with a normal, remote client -- basically, a python implementation of the Javascript which is spawned every time a remote client connects. The idea is to have a server-side "player" play the game with a remote, human client.
I'm not sure how to go about implementing something like this. My intuition is that I should create a separate Flask/SocketIO app (somehow), which has handlers for the messages the server sends (e.g., partnerTurnedRight) and emits the messages the server expects (e.g., movedForward). Then, when a remote client connects, spawn a stateful subprocess which has its own unique sid and is able to interact with the server through the exact same interface as the remote client. However, I'm really not sure how to put everything together or how to actually spawn a server-side client like that.
An example project which does something like this, some pseudocode, or a general structure of how to set something like this up would be greatly appreciated!
(Part of the problem is that I don't know what search terms to use, so it's been hard finding examples.)
You can use the python-socketio package to write a Socket.IO client in Python on the server side. Here's an example of the client usage:
import socketio

sio = socketio.Client()

@sio.on('connect')
def on_connect():
    print('connected')
    sio.emit('Hello')          # send an event to the server once connected

@sio.on('event')
def on_message(data):
    print('Received', data)    # react to events emitted by the server

@sio.on('disconnect')
def on_disconnect():
    print('disconnected')

sio.connect('http://localhost:5000')
sio.wait()                     # block until the connection ends
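To get the "dummy player" behaviour from the question, one rough approach (a sketch, not a tested recipe; the movedForward/partnerTurnedRight event names come from the question, everything else is assumed, and the auth argument used here needs recent Flask-SocketIO/python-socketio versions) is to start a background task from the server's connect handler that runs such a python-socketio client against the same server:

import socketio
from flask import Flask
from flask_socketio import SocketIO

app = Flask(__name__)
server_socketio = SocketIO(app)

def run_dummy_player():
    bot = socketio.Client()

    @bot.on('partnerTurnedRight')
    def on_partner_turned_right(data):
        bot.emit('movedForward', {'player': 'bot'})   # answer the human's move with our own

    bot.connect('http://localhost:5000', auth={'bot': True})  # the bot gets its own sid
    bot.wait()

@server_socketio.on('connect')
def spawn_bot_for_new_client(auth):
    if auth and auth.get('bot'):
        return                                        # don't spawn a bot for another bot
    # start_background_task picks the right primitive for the configured async mode
    server_socketio.start_background_task(run_dummy_player)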
I'm building a turn-based game and I'm hoping to implement client-server style networking. I really just need to send the position of a couple of objects and some other easily encodable data. I'm pretty new to networking, although I've coded some basic stuff in socket and twisted.
Now, though, I need to be able to send the data to a computer that isn't on my local network, and I can't do port forwarding since I don't have admin access to the router; I'm also not totally sure that would do the trick anyway, since I've never done it.
So, I was thinking of running some Flask or Bottle or Django, etc. code off PythonAnywhere. The clients would then send data to the server code on PythonAnywhere, and when the turn passed, the other client would just go look up the information it needed on the server. I guess then the server would act as just a data bank with some simple getter and setter methods. My question is how can this be implemented? Can my socket code on my client program talk to my Flask code on PythonAnywhere?
Yes, client code can talk to your project at PythonAnywhere, as you will be given a unique project URL like http://yourblogname.pythonanywhere.com/. Your server will listen on port 80 at that URL.
It depends what sort of connection your clients need to make to the server. PythonAnywhere supports WSGI, which means "normal" HTTP request/response interactions -- GET, POST, etc. That works well for "traditional" web pages or web apps.
If your client side needs dynamic, two-way connections using non-HTTP protocols, raw sockets, or even websockets, PythonAnywhere doesn't support that at present.
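(If plain HTTP request/response is enough for a turn-based game, the client side can be as simple as polling with the requests library. This is just a sketch: the URL is the placeholder from the answer above, and the /turn route and response shape are made up.)

import time
import requests

BASE = 'http://yourblogname.pythonanywhere.com'    # placeholder project URL

def send_turn(state):
    # POST the easily encodable game data to your Flask/Bottle/Django view
    requests.post(BASE + '/turn', json=state, timeout=10)

def wait_for_partner_turn():
    # the server acts as a simple data bank; poll it until the partner has moved
    while True:
        data = requests.get(BASE + '/turn', timeout=10).json()
        if data.get('ready'):
            return data
        time.sleep(2)                              # poll every couple of seconds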
I'm looking to start a web project using Flask and its SocketIO plugin, which depends on gevent (something something greenlets), but I don't understand how gevent relates to the webserver. Does using gevent restrict my server choice at all? How does it relate to the different levels of web servers that we have in python (e.g. Nginx/Apache, Gunicorn)?
Thanks for the insight.
First, let's clarify what we are talking about:
gevent is a library that makes it easy to program event loops. It is a way to return responses immediately, without "blocking" the requester.
socket.io is a JavaScript library for creating clients that can maintain permanent connections to servers, which send events. The library can then react to these events.
greenlet: think of this as a lightweight thread - a way to launch multiple workers that do some tasks.
A highly simplified overview of the entire process follows:
Imagine you are creating a chat client.
You need a way to notify the users' screens when anyone types a message. For this to happen, you need some way to tell all the users when a new message is there to be displayed. That's what socket.io does. You can think of it like a radio that is tuned to a particular frequency. Whenever someone transmits on this frequency, the code does something. In the case of the chat program, it adds the message to the chat box window.
Of course, if you have a radio tuned to a frequency (your client), then you need a radio station/dj to transmit on this frequency. Here is where your flask code comes in. It will create "rooms" and then transmit messages. The clients listen for these messages.
You can also write the server-side ("radio station") code in socket.io using node, but that is out of scope here.
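As a rough sketch of that server-side "radio station" in Flask (the event and room names here are invented, and Flask-SocketIO is one common way to do it, not the only one):

from flask import Flask
from flask_socketio import SocketIO, join_room, emit

app = Flask(__name__)
socketio = SocketIO(app, async_mode='gevent')        # gevent keeps the long-lived connections cheap

@socketio.on('join')
def on_join(data):
    join_room(data['room'])                          # tune this client to a "frequency"

@socketio.on('chat_message')
def on_chat_message(data):
    emit('chat_message', data, room=data['room'])    # broadcast to everyone tuned to that room

if __name__ == '__main__':
    socketio.run(app)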
The problem here is that traditionally - a web server works like this:
1. A user types an address into a browser, and hits enter (or go).
2. The browser reads the web address, and then using the DNS system, finds the IP address of the server.
3. It creates a connection to the server, and then sends a request.
4. The webserver accepts the request.
5. It does some work, or launches some process (depending on the type of request).
6. It prepares (or receives) a response from the process.
7. It sends the response to the client.
8. It closes the connection.
Between steps 3 and 8, the client (the browser) is waiting for a response - it is blocked from doing anything else. So if there is a problem somewhere, say some server-side script is taking too long to process the request, the browser stays stuck on the white page with the loading icon spinning. It can't do anything until the entire process completes. This is just how the web was designed to work.
This kind of 'blocking' architecture works well for 1-to-1 communication. However, for multiple people to keep updated, this blocking doesn't work.
The event libraries (like gevent) help with this because they accept the request without blocking the client: they send a response immediately and carry on with the work until the process is complete. Your application, however, still needs to notify the client - and since the connection has been closed, you don't have a way to contact the client back.
In order to notify the client and to make sure the client doesn't need to "refresh", a permanent connection should be open - that's what socket.io does. It opens a permanent connection, and is always listening for messages.
1. A work request comes in from one end and is accepted.
2. The work is executed and a response is generated by something else (it could be the same program or another program).
3. Then a notification is sent: "hey, I'm done with your request - here is the response".
4. The person from step 1 listens for this message and then does something.
Underneath it all is WebSocket, a new full-duplex protocol that enables all this radio/DJ functionality.
Things common between WebSockets and HTTP:
Work on the same port (80)
WebSocket requests start off as HTTP requests for the handshake (an upgrade header), but then shift over to the WebSocket protocol - at which point the connection is handed off to a websocket-compatible server.
All your traditional web server has to do is listen for this handshake request, acknowledge it, and then pass the request on to a websocket-compatible server - just like any other normal proxy request.
For Apache, you can use mod_proxy_wstunnel.
nginx versions 1.3+ have WebSocket support built in.
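For example, an illustrative nginx location block for handing the Socket.IO handshake and the upgraded connection through to a local websocket-capable backend might look like this (the path and port are assumptions):

location /socket.io {
    proxy_http_version 1.1;                        # WebSocket requires HTTP/1.1
    proxy_set_header Upgrade $http_upgrade;        # pass the Upgrade handshake on
    proxy_set_header Connection "upgrade";
    proxy_set_header Host $host;
    proxy_pass http://127.0.0.1:5000/socket.io;    # your websocket-capable server
}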
The scenario is
I have multiple local computers running a Python application. These are on separate networks, waiting for data to be sent to them from a web server. These computers are on networks without a static IP and are generally behind a firewall and proxy.
On the other hand, I have a web server which gets updates from the user through a form and sends the update to the correct local computer.
Question
What options do I have to enable this? Currently I am sending CSV files over FTP to achieve this, but this is not real time.
The application is built in Python, using Django for the web part.
Appreciate your help
Use a REST API. Then you can post information to your Django app over HTTP, using an authentication key if necessary.
http://www.django-rest-framework.org/ should help you get started quickly
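A rough sketch of such an endpoint with Django REST Framework (the view name and payload shape below are made up for illustration):

from rest_framework.views import APIView
from rest_framework.response import Response
from rest_framework.permissions import IsAuthenticated

class MachineUpdateView(APIView):
    permission_classes = [IsAuthenticated]          # e.g. a token per local machine

    def get(self, request):
        # a local machine polls this (outbound HTTP only, so firewalls/proxies are fine)
        # and receives any updates the web form has queued for it
        return Response({'updates': []})

    def post(self, request):
        # a local machine can report results back the same way
        return Response({'received': True})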
Sounds like you need a message queue.
You would run a separate broker server which is sent tasks by your web app. This could be on the same machine. On your two local machines you would run queue workers which connect to the broker to receive tasks (so no inbound connection required), then notify the broker in real time when they are complete.
Examples are RabbitMQ and Oracle Tuxedo. What you choose will depend on your platform & software.
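For example, with RabbitMQ as the broker, the worker on each local computer could be a small consumer written with the pika client library (pika is one option, not something this answer prescribes; the broker host, queue name, and message format are assumptions):

import json
import pika

connection = pika.BlockingConnection(pika.ConnectionParameters(host='broker.example.com'))
channel = connection.channel()
channel.queue_declare(queue='local_tasks', durable=True)

def handle_task(ch, method, properties, body):
    task = json.loads(body)                          # the web app publishes JSON tasks
    print('processing', task)
    ch.basic_ack(delivery_tag=method.delivery_tag)   # notify the broker in real time that we're done

channel.basic_consume(queue='local_tasks', on_message_callback=handle_task)
channel.start_consuming()                            # outbound connection only, so NAT/firewalls are fine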
I am creating a web app which needs to continuously poll my Django web server to get updates. Is there a way to avoid this polling? For example, the server could send push messages on update, or the client could register a callback for an event and the server would trigger the callback whenever something changes.
I know there are signaling frameworks in ASP.NET etc., but I want something that works with Django.
Thanks
Fundamentally, WebSockets (part of HTML5) were designed for this purpose, i.e. bi-directional communication between clients and servers over the HTTP protocol. But while they are highly talked about, few application servers have implemented them, and even fewer HTTP servers have actually begun supporting them.
While there are some packages:
django-websocket
django-socketio
that have enabled it in Django, they don't do anything about your HTTP server. You very rarely, if ever, run Django standalone, because Django isn't very efficient at serving static content such as images or other static files, or at distributing the workload; we rely on things like nginx or Apache for that. Unfortunately, they don't support websockets yet, so they tend to break the communication between the client and the application server even if it is initiated in the first place, depending on the implementation.
From my own personal experience, nginx would break the communication after 60 seconds, since that was the default allotted time for anything open.
As far as I know, node.js may be the best server, currently, for working with websockets.
Depending on what you are trying to achieve, if regular polling seems inefficient you can try long-polling: the connection is held open until there is new data to push back to the client, versus regular polling, which is done at some interval. Note that you may have to configure your HTTP server not to terminate prolonged open connections, and run Django multithreaded, since each open connection will use an instance.
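A rough long-polling sketch for a Django view (the check_for_update() helper and the 25-second timeout are illustrative assumptions, not part of any framework):

import time
from django.http import JsonResponse

def updates(request):
    last_seen = request.GET.get('last_seen')
    deadline = time.time() + 25                      # hold the request open for up to 25 seconds
    while time.time() < deadline:
        update = check_for_update(last_seen)         # hypothetical helper that queries your models
        if update is not None:
            return JsonResponse({'update': update})
        time.sleep(1)                                # poll the database here, not from the browser
    return JsonResponse({'update': None})            # client re-issues the request immediately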