Why use Socket.IO and not just sockets? - Python

I've built server-client programs before (both sides were written in Python so far).
Recently I started building an app in Swift, and my goal is to add a Python backend to my apps (my app is a chat app).
I searched the Internet for tutorials on how to do this, and I only found two options for communicating between a server and a mobile application. The first is to build a REST API (request-response) - I can't use this solution because I want real-time chat.
The second option was WebSockets (Socket.IO).
So, my question is: why not use plain socket programming (like I did when both the server side and the client side were Python -> import socket) instead of sockets over the web?

These are the features you get if you use Socket.IO or socketcluster.io (which is built on top of Socket.IO):
Scalability: it scales horizontally by adding more nodes (scale out) as well as vertically (scale up)
Reduced payload size, because message payloads are compressed
Authorisation via middleware functions
Automatic reconnection if the connection drops
If you want to use your own implementation, you have to take care of all of the above yourself, and these are exactly the problems that appear once your user base grows.
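For a sense of what that buys you, here is a minimal sketch of a chat-style server using the python-socketio package (my choice of library, not the answer's; the "chat_message" event name is made up). Reconnection, heartbeats and message framing are handled by the library rather than by your own socket code.

```python
# Minimal Socket.IO server sketch with python-socketio (pip install python-socketio).
# Run it behind a WSGI server such as eventlet or gunicorn.
import socketio

sio = socketio.Server(cors_allowed_origins="*")
app = socketio.WSGIApp(sio)

@sio.event
def connect(sid, environ):
    print("client connected:", sid)

@sio.on("chat_message")
def chat_message(sid, data):
    # broadcast the message to every connected client
    sio.emit("chat_message", data)

@sio.event
def disconnect(sid):
    print("client disconnected:", sid)
```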

My understanding is that Socket.IO isn't strictly necessary anymore, because all the browsers worth supporting now keep pace with each other on the WebSocket standard. Socket.IO was for the days when browsers and servers didn't support the same technology and it had to fall back to other transports. These days WebSocket support is pretty much universal, and sticking to plain WebSockets without Socket.IO is perfectly safe. More of a breakdown here - https://codeburst.io/why-you-don-t-need-socket-io-6848f1c871cd
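To illustrate the plain-WebSocket route on the Python side, here is a minimal echo server sketch using the websockets package (my choice of library, not the answer's; it assumes websockets 10+, where the handler receives only the connection object). A Swift or browser client would connect to ws://host:8765.

```python
# Minimal plain-WebSocket echo server with the `websockets` package.
import asyncio
import websockets

async def handler(ws):
    # echo every incoming message back to the sender
    async for message in ws:
        await ws.send(message)

async def main():
    async with websockets.serve(handler, "localhost", 8765):
        await asyncio.Future()  # run forever

if __name__ == "__main__":
    asyncio.run(main())
```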

Related

How could I build a plugin that provides a socketio endpoint

I have an existing software project with a relatively primitive plugin system, and I want to extend it by providing a web interface.
Since my application processes real-time data, WebSockets are the only option besides WebRTC.
My previous attempt used ZeroMQ domain sockets on the Python side and a server in Node.js that connected to the domain socket.
This solution works great and has some benefits over the plugin server, but I want to offer a simpler option for folks that don't need the benefits and don't want the extra complexity.
How would you go about implementing this and is it even possible to do so?
Otherwise, I'll still use a separate process, but build everything around the socket endpoint with FastAPI and start it via subprocess, spawning a second process that also connects to the domain socket.
I hope my question is not stupid, or an RTFM case.
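If the FastAPI route is what you end up taking, a rough sketch of mounting a Socket.IO endpoint next to a FastAPI app with python-socketio's ASGI support might look like the following (the event name "realtime_data" and the health route are hypothetical; run it with uvicorn).

```python
# Sketch: Socket.IO endpoint alongside FastAPI via python-socketio's ASGI integration.
# Run with: uvicorn this_module:app
import socketio
from fastapi import FastAPI

fastapi_app = FastAPI()
sio = socketio.AsyncServer(async_mode="asgi", cors_allowed_origins="*")
app = socketio.ASGIApp(sio, other_asgi_app=fastapi_app)  # serves /socket.io plus normal HTTP routes

@fastapi_app.get("/health")
async def health():
    return {"status": "ok"}

@sio.event
async def connect(sid, environ):
    print("web client connected:", sid)

async def forward_realtime_data(payload):
    # call this from whatever task reads your ZeroMQ domain socket in the same process
    await sio.emit("realtime_data", payload)
```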

Heartbeat connection between Node.js and Python Flask with sockets? Possible?

I am facing a situation where I have an Express server and a Flask server, each responsible for various tasks. We are piping a request from Express through to the Flask server, and would like to use sockets to provide heartbeat-style updates from the Flask server to the Express server.
Is it possible to use sockets like this? I admit to having never really used sockets for backend stuff before. I've used Socket.io to connect React-based sites with an Express backend, but I'm not sure how to connect two servers like this.
Any help would be greatly appreciated.
There is a Flask extension for Socket.io.
Even though the blurb text says "Flask-SocketIO gives Flask applications access to low latency bi-directional communications between the clients and the server", "clients" doesn't have to mean "frontend".
Since you've already used Socket.io and websockets, you might check whether that package meets your needs. It's certainly easier than reaching straight for Unix sockets if it turns out you don't have to. :)
NOTE: Flask is not concurrent by default. It handles only one request at a time, because it runs on a single thread and doesn't use async/await. This is more broadly a Python/WSGI limitation than a Flask-specific one. Depending on what you do, this may become a bottleneck in your app.
Automatic threading with Flask
WSGI is synchronous
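Here is a hedged sketch of what the Flask-SocketIO suggestion could look like: the Flask side emits a periodic "heartbeat" event that any Socket.IO client (for example the Express server using socket.io-client) can listen for. The event name and interval are my own choices for illustration.

```python
# Flask-SocketIO server that emits a periodic heartbeat event.
from flask import Flask
from flask_socketio import SocketIO

app = Flask(__name__)
socketio = SocketIO(app)

def heartbeat_loop():
    while True:
        socketio.emit("heartbeat", {"status": "alive"})
        socketio.sleep(5)  # cooperative sleep, compatible with the chosen async mode

@socketio.on("connect")
def on_connect():
    print("a client (e.g. the Express server) connected")

if __name__ == "__main__":
    socketio.start_background_task(heartbeat_loop)
    socketio.run(app, port=5000)
```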

using multiple twisted socket servers together

So I have a single twisted socket server that serves clients and eventually I'll need to add more servers. The problem is that connections to the server are unique and unable to be shared among multiple server instances.
This becomes a problem if the servers sit behind a load balancer, or if users in the same chat end up on different server instances, because a message to that chat won't reach everyone.
How would I resolve this?
It may be a difficult task, since load balancing strategies depend on the underlying protocol (like HTTP for web servers).
Are you trying to design a load balancing system for basically any socket-based application? What I mean is that it is one thing to dispatch messages between multiple servers while ensuring correct synchronization, and another thing to build a dynamic, self-balancing system for any communication protocol.
To build your load balancer, you can use a "TCP proxy" like HAProxy (http://www.haproxy.org/)
To handle the communication between your application server instances (behind the load balancing server), you can use messaging like ZeroMQ (http://zeromq.org/) or RabbitMQ (http://www.rabbitmq.com/). You'll find some common architecture patterns there.
There are Python libraries for both ZeroMQ and RabbitMQ, so the implementation within your Twisted-based server is not too hard.
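As an illustration of the messaging idea, here is a small pyzmq pub/sub sketch (my own example; the addresses and message shape are made up). Each Twisted instance would publish the chat messages it receives locally, subscribe to the other instances (or go through a broker such as RabbitMQ), and relay whatever it receives to its own connected clients. With Twisted you would poll the SUB socket from a non-blocking loop or use an integration such as txZMQ rather than blocking on recv.

```python
# pyzmq pub/sub fan-out sketch between chat server instances.
import zmq

def make_publisher(bind_addr="tcp://*:5556"):
    ctx = zmq.Context.instance()
    pub = ctx.socket(zmq.PUB)
    pub.bind(bind_addr)
    return pub

def make_subscriber(connect_addr="tcp://other-server:5556"):
    ctx = zmq.Context.instance()
    sub = ctx.socket(zmq.SUB)
    sub.connect(connect_addr)
    sub.setsockopt_string(zmq.SUBSCRIBE, "")  # receive all topics
    return sub

# publisher side:  pub.send_json({"chat_id": 42, "text": "hello"})
# subscriber side: msg = sub.recv_json()  -> relay msg to the clients on this instance
```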

Using PythonAnywhere as a game server

I'm building a turn-based game and I'm hoping to implement client-server style networking. I really just need to send the position of a couple of objects and some other easily encodable data. I'm pretty new to networking, although I've coded some basic stuff with socket and Twisted.
Now, though, I need to be able to send the data to a computer that isn't on my local network, and I can't use port forwarding since I don't have admin access to the router; I'm also not totally sure that would do the trick anyway, since I've never done it.
So, I was thinking of running some Flask, Bottle, Django, etc. code on PythonAnywhere. The clients would send data to the server code on PythonAnywhere, and when the turn passed, the other client would just go look up the information it needed on the server. The server would then act as a simple data bank with some getter and setter methods.
My question is: how can this be implemented? Can the socket code in my client program talk to my Flask code on PythonAnywhere?
Yes, client code can talk to your project at PythonAnywhere: you will be given a unique project URL like http://yourblogname.pythonanywhere.com/, and your server will listen on port 80 at that URL.
It depends what sort of connection your clients need to make to the server. PythonAnywhere supports WSGI, which means "normal" HTTP request/response interactions -- GET, POST, etc. That works well for "traditional" web pages or web apps.
If your client side needs dynamic, two-way connections using non-HTTP protocols, raw sockets, or even WebSockets, PythonAnywhere doesn't support that at present.
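A minimal sketch of the request/response "data bank" approach could look like this (the route names, game IDs and client calls are hypothetical; the server part would run as a Flask web app on PythonAnywhere, and the clients would use plain HTTP via the requests package instead of raw sockets).

```python
# Tiny Flask "data bank" with getter/setter routes.
from flask import Flask, request, jsonify

app = Flask(__name__)
game_state = {}  # in-memory store; use a database for anything beyond a toy

@app.route("/state/<game_id>", methods=["GET"])
def get_state(game_id):
    return jsonify(game_state.get(game_id, {}))

@app.route("/state/<game_id>", methods=["POST"])
def set_state(game_id):
    game_state[game_id] = request.get_json()
    return jsonify({"ok": True})

# Client side (polling between turns), using the `requests` package:
#   requests.post("http://yourblogname.pythonanywhere.com/state/game1", json={"x": 3, "y": 5})
#   state = requests.get("http://yourblogname.pythonanywhere.com/state/game1").json()
```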

Sending image to server: http POST vs custom tcp protocol

I am working out how to build a Python app to do image processing. A client (not a web browser) sends an image and some text data to the server, and the server's response is based on the received image.
One method is to use a web server + WSGI module and have clients make an HTTP POST request (using multipart/form-data). The HTTP server then unpacks the uploaded image and other data for the program to use.
Another method is to create a custom protocol that only sends the needed data and is handled entirely within the application. The application would do everything itself (listening on the port, etc.).
Is one of these a stand-out 'best' way (if yes, which one?), or is it more a matter of preference (or is there another, better way)?
I believe it's more up to your needs, the size of the images, and your general knowledge of network programming.
In terms of simplicity, posting an image to the webserver using WSGI would be fairly simple, and you wouldn't have to worry about handling connections, sockets, error handling due to busy network ports, etc.
Another argument in favor of this approach is that you can easily reuse this "feature" if you already have it working on a webserver, say, by including a browser client. It might not be one of your needs now, but the door is left open.
This would be my choice.
Also, in Python you have a plethora of web frameworks to choose from, from the likes of Django, which is probably huge overkill for your needs, to something a lot simpler like http://flask.pocoo.org/, which might just suit your needs and is really simple to set up.
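A hedged sketch of the POST approach with Flask and the requests package (the route and field names "process", "image" and "caption" are made up for illustration):

```python
# Flask endpoint that accepts an image plus text via multipart/form-data.
from flask import Flask, request, jsonify

app = Flask(__name__)

@app.route("/process", methods=["POST"])
def process():
    image = request.files["image"]           # werkzeug FileStorage object
    caption = request.form.get("caption", "")
    data = image.read()                      # hand these bytes to your image-processing code
    return jsonify({"received_bytes": len(data), "caption": caption})

# Client side:
#   import requests
#   with open("photo.jpg", "rb") as f:
#       r = requests.post("http://server:5000/process",
#                         files={"image": f}, data={"caption": "hello"})
#   print(r.json())
```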
In my opinion HTTP is an ideal protocol for sending files or large data; it is in very common use and easy to adapt to almost any situation. If you use a self-created protocol, you may find it hard to extend when other client needs come up, like a web API.
Maybe the discussions about HTTP's lack of instantaneity and agility make you hesitate about choosing it, but those concerns are mostly about instant messaging and server push, where there are better protocols. When it comes to stability and flexibility, HTTP is always a good choice.
