I want to implement the following but I'm not sure where to start / what to Google.
I'd appreciate some direction since I've never written any program that requires network connectivity and am pretty lost:
I've got 3 Raspberry Pis sitting around. I want 2 of them to be able to chat while the 3rd routes the messages (acts as a server between them).
The general flow of events should be something like this:
Server starts running on Pi #1
Pi #2 starts running and connects to the server (whose IP will be static, I guess) with a name it chooses. Pi #3 does the same as #2.
Pi #3 can then, knowing the name of Pi #2, send a message to Pi #2 addressed by that name.
This is the general outline of what I want to achieve.
I'm not sure what the server that runs on Pi #1 should be (I've heard of webserver frameworks like Flask but I don't have enough knowledge to determine if they fit my needs).
I'm also not sure on what I should be using for the client side (Pi #2,3). I could probably use sockets but I assume there is a better / easier way.
If you are on a private network, XML-RPC might be a good choice, because
It's built into Python, see this example
You can call remote functions almost as if they were local
Drawbacks:
Little network security
When sending raw data, it needs to be encoded (since it is a text protocol)
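As a rough sketch of what the built-in xmlrpc modules give you (the port, IP, function names and the message-store layout here are just illustrative choices, not anything from the question):

# Pi #1 (server): keeps a queue of pending messages per registered name
from xmlrpc.server import SimpleXMLRPCServer

messages = {}  # registered name -> list of pending messages

def register(name):
    messages.setdefault(name, [])
    return True

def send(to_name, text):
    messages.setdefault(to_name, []).append(text)
    return True

def fetch(name):
    pending = messages.get(name, [])
    messages[name] = []
    return pending

server = SimpleXMLRPCServer(("0.0.0.0", 8000), allow_none=True)
for fn in (register, send, fetch):
    server.register_function(fn)
server.serve_forever()

A client on Pi #2 / #3 would then call those functions almost as if they were local (replace 192.168.1.10 with the server's static IP):

from xmlrpc.client import ServerProxy

proxy = ServerProxy("http://192.168.1.10:8000", allow_none=True)
proxy.register("pi2")
proxy.send("pi3", "hello from pi2")
print(proxy.fetch("pi2"))  # poll for messages addressed to "pi2"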
To check if your remote server is running, you can use sockets as in this example.
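A minimal version of such a reachability check might look like this (host and port being whatever the XML-RPC server above listens on):

import socket

def server_is_up(host, port, timeout=2.0):
    # Try to open a TCP connection; success means something is listening
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

print(server_is_up("192.168.1.10", 8000))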
I am trying to create a chess game between two different computers that are not in the same LAN. I am having trouble connecting the two via a TCP connection (UDP would probably be sufficient as well if the packets are arriving, but ideally TCP).
I am new to a lot of networking and unaware of many tools that may be useful. I am also at university and therefore don't have control over the router to update firewall rules. What can I do to work around the router firewall to connect the two devices?
I am primarily using the Python socket library at the moment to implement the connection.
Any information about how I can send messages between the two computers outside of a LAN would be very useful. Thank you for your help!
I have ensured that the client side is using the public IP of the server and that the server is using "" (all interfaces) for its socket host. I also checked that the connection works without issue over a LAN. I included a batch file that enables the specific port used for the game at the start of runtime and disables it at the end of the program; if I am not mistaken, that only affects the computer's firewall rules, not the router's. I have looked into receiving the packets through port 80 and redirecting them to my specific program, but was unsuccessful in finding a solution of that type.
If the server is behind a router/firewall you'll have to use some sort of hole punching method to create the connection. STUN is one of the most common, though I've never actually used it in a Python program so I don't know what Python implementations are out there.
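That said, the STUN binding request itself is simple enough to sketch with a raw UDP socket, just to show what STUN gives you; the Google server address is one commonly cited public endpoint, and actual hole punching additionally needs a rendezvous step so both peers can exchange their discovered mappings:

import os
import socket
import struct

# RFC 5389 binding request; stun.l.google.com:19302 is one public server
STUN_SERVER = ("stun.l.google.com", 19302)
MAGIC_COOKIE = 0x2112A442

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.settimeout(3)

txn_id = os.urandom(12)
# type=0x0001 (Binding Request), length=0, magic cookie, 12-byte transaction id
sock.sendto(struct.pack("!HHI", 0x0001, 0, MAGIC_COOKIE) + txn_id, STUN_SERVER)

data, _ = sock.recvfrom(2048)
_, msg_len, _ = struct.unpack("!HHI", data[:8])
pos = 20  # attributes start after the 20-byte header
while pos < 20 + msg_len:
    attr_type, attr_len = struct.unpack("!HH", data[pos:pos + 4])
    value = data[pos + 4:pos + 4 + attr_len]
    if attr_type == 0x0020:  # XOR-MAPPED-ADDRESS
        port = struct.unpack("!H", value[2:4])[0] ^ (MAGIC_COOKIE >> 16)
        ip = bytes(b ^ m for b, m in zip(value[4:8], struct.pack("!I", MAGIC_COOKIE)))
        print("public mapping:", socket.inet_ntoa(ip), port)
    pos += 4 + attr_len + ((4 - attr_len % 4) % 4)  # attributes are 4-byte aligned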
This question is more about advice on how to approach the best solution and possible frameworks to look at.
Desired Result
I am currently attempting to create a very basic plug-and-play Raspberry Pi which can be controlled via an API to perform various tasks such as streaming a webcam or turning an LED on. The device should connect to any wifi network and automatically be controllable through the API.
Current Problem
The goal of the app is to create a very easy-to-use plug-and-play solution. However, I am running into a problem when trying to create a WebSocket server on the Pi, as I cannot seem to access it from an external network. I have looked into automatic port forwarding solutions such as UPnP, however all of the results are extremely old. I have also thought about creating the Python WebSocket server on a cloud server and using the Pi as the client to connect and pass data. Please correct me if I am wrong, however this method does not seem very scalable when there are 10s/100s/1000s of Pis connecting to the same socket.
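For reference, the "Pi dials out to a cloud endpoint" pattern I'm describing would look roughly like this (using the third-party websockets package; the URI and device id are placeholders):

import asyncio
import json
import websockets

CLOUD_URI = "wss://example.com/devices"  # placeholder cloud endpoint
DEVICE_ID = "pi-001"

async def run():
    # Outbound connection from the Pi, so no port forwarding is required
    async with websockets.connect(CLOUD_URI) as ws:
        await ws.send(json.dumps({"type": "hello", "device": DEVICE_ID}))
        async for raw in ws:
            command = json.loads(raw)
            # e.g. {"action": "led_on"} pushed down from the API server
            print("received command:", command)

asyncio.run(run())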
Any advice would be greatly appreciated.
I'm struggling to design an efficient way to exchange information between processes in my LAN.
Until now, I've been working with a single RPi, and I had a bunch of Python scripts running as services. The services communicated using sockets (multiprocessing.connection Client and Listener), and it was kind of OK.
I recently installed another RPi with some further services, and I realized that as the number of services grows, the problem scales pretty badly. In general, I don't need all the services to communicate with any other, but I'm looking for an elegant solution to enable me to scale quickly in case I need to add other services.
So essentially I thought I first need a map of where each service is, like
Service 1 -> RPi 1
Service 2 -> RPi 2
...
The first approach I came up with was the following:
I thought I could add an additional "gateway" service so that any application running in RPx would send its data/request to the gateway, and the gateway would then forward it to the proper service or the gateway running on the other device.
Later I also realized that I could just give the map to each service and let all the services manage their own connections. This would mean opening many listeners on the external address, though, and I'm not sure it's the best option.
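To make that second idea concrete, this is roughly what I have in mind (addresses and the authkey are made up; each service would also run its own Listener on its mapped port):

from multiprocessing.connection import Client

# Shared registry: service name -> (host, port)
SERVICE_MAP = {
    "service1": ("192.168.1.10", 6001),  # RPi 1
    "service2": ("192.168.1.11", 6002),  # RPi 2
}

def send_to(service_name, message, authkey=b"secret"):
    # Open a connection to the named service and send one message
    address = SERVICE_MAP[service_name]
    with Client(address, authkey=authkey) as conn:
        conn.send(message)

send_to("service2", {"from": "service1", "cmd": "status?"})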
Do you have any suggestions? I'm also interested in exploring different options for implementing the actual connection, in case the Client / Listener approach is not efficient.
Thank you for your help. I'm learning so much with this project!
So I'm building a system where I scan an RFID tag with a reader connected to a Raspberry Pi. The RFID tag ID should then be sent to another "central" RPi, where a database is checked for some info; if it matches, the central Pi sends a message to a lamp (also connected to a Pi), which will then turn on. This is just the start of a larger home-automation system.
I read about MQTT making it very easy to have multiple RPis communicate and act on events like this. The only thing I am wondering about, but can't find documented on the internet, is whether the central Pi in my case can act as the broker, but also be subscribed to the topic for the RFID tag ID, check the database, and then publish to another topic for the light.
Purely based on logical thinking I'd say yes, since the broker is running in the background. Thus I would still be able to run a Python script that subscribes/publishes to, I'm guessing, localhost instead of the central Pi's IP address and port.
Can anyone confirm this? I can't test it myself yet because I have just ordered the equipment, and am doing lots of preparation-research.
You can run as many clients as you like on the same machine as the broker (you could even run multiple brokers, as long as they listen on different ports). The only thing you need to do is ensure that each client has a different client ID.
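A rough sketch of what that looks like with paho-mqtt (1.x callback API; topic names, the client id and the database check are placeholders):

import paho.mqtt.client as mqtt

def on_connect(client, userdata, flags, rc):
    client.subscribe("home/rfid")

def on_message(client, userdata, msg):
    tag_id = msg.payload.decode()
    if tag_id == "expected-tag":          # stand-in for the database lookup
        client.publish("home/lamp", "on")

# paho-mqtt 2.x additionally requires a CallbackAPIVersion argument here
client = mqtt.Client(client_id="central-logic")  # must be unique per client
client.on_connect = on_connect
client.on_message = on_message
client.connect("localhost", 1883)  # the broker runs on this same Pi
client.loop_forever()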
Introduction
I am working on a robotic sampling application. Each robot has a cabled interface with power, TCP/IP and a gas-sensor tubing. The robots are on an ARM platform, and I intend to do most of the programming in Python. The robots move slowly and there is nothing computationally intensive running on them.
Each robot should perform these "services":
Move left/right (for manual control)
Move up/down (for manual control)
Go to next sector
Each robot reports these sensor readings or events:
Temperature
End-switch right
Docked at sector with ID: ###
Encoder count [lateral,longitudinal]
Error event
Client - Server Architecture
I view each robot as a client, and the sensor hub computer as the server.
The server will have a known ip and listening port, and allow the robots to connect.
The server will do the measurement scheduling, and command the robots to move from sector to sector.
The server may maintain and update a model of each robot with a state-vector containing:
[ position, switches, sensor-readings, status]
Questions
From debugging serial communication, I have experienced the benefits of having a human-readable communication interface with a strict poll-response structure. I am, however, not sure how we should go about designing this interface.
Are there any best practices in designing communication interfaces for devices like these?
Should I think about packet loss and corruption, or is this fully handled by TCP?
Should I design everything as services polled by the server, or should the robots broadcast their sensor readings and events?
Should I implement acknowledgment of commands, e.g. go-to-next-sector?
I apologize for the broad and vague problem formulation, this may be more a philosophy question than a software problem. However I will greatly appreciate your thoughts, experiences and advice.
TLDR
What are the guiding principles of designing TCP communication protocols for client-server architectures?
Overall I'd suggest using Python Twisted to build your server and client (robot-side) applications (https://twistedmatrix.com/trac/). Anyway, to answer your questions:
"Are there any best practices in designing communication interfaces for devices like these?"
See the answers to your other questions below.
"Should I think about packet loss and corruption, or is this fully handled by TCP?"
TCP guarantees the integrity of the data you are getting. The primary thing to worry about is whether the client and server are connected or not. You can use a ReconnectingClientFactory to make your connections a little more robust when the server is restarted (see the Twisted docs). Also be aware that TCP is a streaming protocol (you may not get the whole message at once), so make sure you have the whole message before acting on it. If you are sending messages quickly you may also have more than one message in your TCP buffer for that client.
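One common way to handle the streaming aspect is to frame each message with a length prefix, along these lines (plain-socket sketch; Twisted's Int32StringReceiver or LineReceiver do the same job for you):

import json
import struct

def send_msg(sock, obj):
    # Serialize as JSON and prefix with a 4-byte big-endian length
    payload = json.dumps(obj).encode()
    sock.sendall(struct.pack("!I", len(payload)) + payload)

def recv_msg(sock):
    # Read exactly one length-prefixed message, or None if the peer closed
    header = _recv_exact(sock, 4)
    if header is None:
        return None
    (length,) = struct.unpack("!I", header)
    payload = _recv_exact(sock, length)
    return json.loads(payload) if payload is not None else None

def _recv_exact(sock, n):
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            return None
        buf += chunk
    return buf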
"Should I design everything as services polled by the server, or should the robots broadcast it's sensor readings and events?"
Avoid polling. When the robots start up they should establish a persistent TCP connection with the server. Messages should be sent and received (handled) asynchronously.
"Should I implement Acknowledgment of commands, e.g. go-to-next-section"
Wouldn't hurt. Would be good for flow control within your application as well as recovering from situations where the server or robots are restarted and you can't be sure whether a message was processed or not.
"What are the guiding principles of designing TCP communication protocols for client-server architectures?"
Probably the thing to do for your app is to design a simple command-response protocol. Start by designing two simple message sets, one going from client to server, the other from server to client. You could use a simple human-readable XML message set as follows:
Server to Client
<SCMessage type="TurnRight"></SCMessage>
<SCMessage type="TurnLeft"></SCMessage>
<SCMessage type="NextSector"><param key="sectorName" value="B"/></SCMessage>
<SCMessage type="GetStatus"></SCMessage>
<SCMessage type="Ack"></SCMessage>
Client to Server
<SCMessage type="SensorUpdate"><param key="data" value="123"/></SCMessage>
<SCMessage type="StatusChanged"><param key="status" value="Good"/></SCMessage>
....
<SCMessage type="Ack"></SCMessage>
So when parsing these messages you can tease them apart by looking for the SCMessage start/stop tags. Once you have a message you could then use an XML parser to parse the message's contents. Alternatively you could use JSON, which would probably be a lot easier (basically you'd be sending little dictionaries back and forth).
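To illustrate, the JSON versions of the same messages would just be little dicts going back and forth, e.g.:

import json

next_sector   = {"type": "NextSector", "params": {"sectorName": "B"}}
sensor_update = {"type": "SensorUpdate", "params": {"data": 123}}

# Newline-delimited framing keeps the messages easy to split on the wire
wire = (json.dumps(next_sector) + "\n").encode()
decoded = json.loads(wire.decode())
assert decoded["type"] == "NextSector"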
You've got a lot of reading to do ;) I'd start by reading up on Python Twisted a bit and making little toy programs to get comfortable with things.
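A toy program of the kind I mean could be as small as this (an echo-style line server; the port and class names are arbitrary):

from twisted.internet import protocol, reactor
from twisted.protocols.basic import LineReceiver

class RobotLink(LineReceiver):
    def lineReceived(self, line):
        # In the real protocol this is where you'd parse an XML/JSON message
        self.sendLine(b"ack: " + line)

class RobotLinkFactory(protocol.Factory):
    def buildProtocol(self, addr):
        return RobotLink()

reactor.listenTCP(8000, RobotLinkFactory())
reactor.run()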