I have a Python script running on a RevPi which uses the Azure IoT SDK. The script basically accepts a bunch of Modbus registers from a .json file, adds a few properties and sends the result to Azure IoT Hub for analysis.
The script is currently too dependent on the network connection, and due to infrastructure limitations the connectivity is unreliable, which often causes the script to die/abort. How can I make the script function on this poor internet connection? The main libraries being used are pymodbus and iothub_client.
As per Checking network connection I'd suggest something like this:
import urllib2

def internet_on():
    # Returns True if a known host is reachable within the timeout.
    try:
        urllib2.urlopen('http://216.58.192.142', timeout=1)
        return True
    except urllib2.URLError:
        return False

if internet_on():
    CallFunction()
"216.58.192.142" is a google address but you could use anything reliable such as Azure as this is where you are sending your data.
It may be more sensible to use a while loop or add a thread sleep to stop it checking so often.
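A minimal sketch of that idea, reusing internet_on() from above (the 30-second poll interval is arbitrary):
import time

def wait_for_network(poll_seconds=30):
    # Block until internet_on() succeeds, checking every poll_seconds.
    while not internet_on():
        time.sleep(poll_seconds)

wait_for_network()
CallFunction()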
Hope this helps.
So first of all, what I really want to achieve: I want to know when an IoT device has stopped working (i.e. lost connection, shut down; basically it's no longer talking to IoT Core). I can't seem to find an implementation for this on GCP.
I have a Raspberry Pi as my IoT device, and I have configured it on IoT Core. Since this is not implemented, somewhere I read that a way to solve it is to create a logging sink which activates a Cloud Function whenever there is a CONNECT/DISCONNECT log. This would serve my purpose, and I have implemented this sink and Cloud Function to alert me.
I have been following this guide on connecting to MQTT. However, the way they explain it, they set it up such that whenever the expiration time on the JWT is exceeded, they disconnect the client and create a new one to renew the JWT. This means I will be alerted of a connection/disconnection whenever the client is renewed, so I won't be able to differentiate a real issue from routine renewals of the MQTT client.
In the same guide, I see that they mention MQTT long-term support, or LTS, and they claim that this way you can set up the client once and communicate continuously through it for the supported time, which it says is until 2030. This seems to be what I really want, but I have not been able to connect this way, and they don't explain it other than saying the hostname should be mqtt.2030.ltsapis.goog and to use primary and backup certificates, which are different from the complete root CA from the first method.
I tried using basically the same process for setting up the client:
import ssl
import paho.mqtt.client as mqtt

client = mqtt.Client(client_id=client_id)

# With Google Cloud IoT Core, the username field is ignored, and the
# password field is used to transmit a JWT to authorize the device.
client.username_pw_set(
    username='unused',
    password=create_jwt(project_id, private_key_file, algorithm))

# Enable SSL/TLS support.
client.tls_set(ca_certs=ca_certs, tls_version=ssl.PROTOCOL_TLSv1_2)
but changing the hostname and passing the primary cert where I would previously pass the complete ca_certs. It won't accept it, and I am not sure how to do it otherwise with primary and backup certificates. I am looking at the documentation on tls_set, but I don't see where these would go or how they differ from the complete CA certs. I haven't seen any other examples outside of this guide.
I am hoping to be able to connect to this MQTT LTS endpoint so that I can maintain the connection without having to constantly renew the client.
The long-term support (LTS) MQTT domain lets you use the LTS configuration (hostname and certificates) for a long period of time; it does not let a single connection stay open that long.
As you mention, for your use case the solution would be to activate and use device logs. One of the events is triggered when a device disconnects from IoT Core, and you can use that event to trigger an alert.
Keep in mind that the time limits for the connection are set for security purposes, and the client should renew the connection.
I'm trying to make a script in Python which fetches several pieces of information (such as CPU name and network adapters) about machines on my network.
The script is currently working on my machine by using wmi.WMI() (or wmi.WMI('localhost')) to connect.
But now I want to see if it works for other machines as well. For this purpose, I've installed VMware and set up a virtual machine (running Windows XP). I'd like to know how to connect to it.
I've read that you can simply use wmi.WMI([machine name or IP]), but putting in the IP that ipconfig gives me does not seem to work. I get the error "The RPC server is unavailable."
Could anybody help me please?
Thank you in advance.
One thing you can do that I found to be helpful is to use a try...except statement inside a while True loop. That will make WMI repeatedly retry the connection to the machine, and only break once the connection has been established. For example, in the case of the "The RPC server is unavailable" error:
import wmi

while True:
    try:
        # servername/username/password are placeholders for your values.
        comm = wmi.WMI(servername, user=username, password=password)
    except wmi.x_wmi:
        continue  # consider a short time.sleep() here to avoid hammering the machine
    else:
        break
That should help out a little.
Status Quo:
I have two Python apps (frontend-server and data-collector; a database sits 'between' them).
Currently I am using redis as the db, and its publish/subscribe protocol to notify the frontend when new data is available.
But I may want to use a different database (and don't want to keep redis on the system just for the pub/sub).
Are there any simple alternatives to notify my frontend if the data-collector has transacted new data to the database (without using an external message queue like beanstalkd or redis)?
ZeroMQ is a good option. It has good Python bindings, and it makes communicating between processes on the same machine and processes on different machines look almost identical.
Start by reading the guide: http://zguide.zeromq.org/page:all
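A minimal sketch of that pattern, assuming the pyzmq binding (the port and topic are arbitrary). The data-collector publishes a one-line notification after each write:
import zmq

ctx = zmq.Context()
pub = ctx.socket(zmq.PUB)
pub.bind('tcp://127.0.0.1:5556')
# after committing new data to the database:
pub.send_string('new_data')

and the frontend blocks until something arrives:
import zmq

ctx = zmq.Context()
sub = ctx.socket(zmq.SUB)
sub.connect('tcp://127.0.0.1:5556')
sub.setsockopt_string(zmq.SUBSCRIBE, 'new_data')
while True:
    sub.recv_string()   # wakes up when the collector publishes
    refresh_view()      # hypothetical frontend refresh hook

Note that PUB/SUB is fire-and-forget: a subscriber that connects after a message was published never sees it, so the frontend should still read the current state from the database on startup.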
As I mentioned in my comment, if you want something that goes across a network, then other than setting up a web service (a Flask app?) or writing your own INET socket server, there is nothing built into the operating system to communicate between machines. Beanstalk has a very simple API in Python, and I've used it for this kind of thing very successfully.
import sys
import beanstalkc

try:
    beanstalk = beanstalkc.Connection(host="my.host.com")
    beanstalk.watch("update_queue")
except beanstalkc.SocketError:
    sys.exit("Error connecting to beanstalk")

while True:
    job = beanstalk.reserve()   # blocks until a job is available
    do_something_with_job(job)
If you are only going to be working on the same machine, then read up on Linux IPC. A socket connection between processes is very fast and has practically zero overhead. They can also be part of an asynchronous program when you take advantage of epoll callbacks.
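For the same-machine case, a minimal sketch with a Unix domain socket (the socket path and payload are arbitrary). The listening process:
import os
import socket

SOCK_PATH = '/tmp/notify.sock'   # hypothetical path

if os.path.exists(SOCK_PATH):
    os.unlink(SOCK_PATH)
server = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
server.bind(SOCK_PATH)
server.listen(1)
conn, _ = server.accept()
print(conn.recv(1024))           # e.g. b'new_data'

and the notifying process:
import socket

client = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
client.connect('/tmp/notify.sock')
client.sendall(b'new_data')
client.close()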
I am developing a testbed for a cloud computing environment. I want to establish multiple client connections to a server. What I want is that the server first sends data to all the clients specifying a sending_interval, and then all the clients keep sending their data with a time gap of that interval (as specified by the server). Please help me out: how can I do this with Python socket programming? (i.e. I want multiple-client to single-server connectivity, with each client sending data at the time gap specified by the server). I will be grateful if anyone can help me. Thanks in advance.
This problem is easily solved by the ZeroMQ socket library. It is production stable. It allows you to define publisher-subscriber relationships, where a publishing process will publish data on a port regardless of how many (0 to infinite) listening processes there are. They call this the PUB-SUB model; it's in their docs (link below).
It sounds like you want to set up a bunch of clients that are all publishers. They can subscribe to a controlling channel, which will send updates to their configuration (how often to write). They also act as publishers, pushing out their own data at the interval specified by the default/config channel.
Then, you have one or more listening processes that listen to all the clients' published messages. Perhaps you could even have two listening processes, one for backup or DR, or whatever.
We're using ZeroMQ and loving the simplicity it gives; there are no connection errors, because the publisher doesn't care if anyone is listening, and the subscriber can start before the publisher; if there's nothing there to listen to, it can just loop around and wait until there is.
Bindings are available in ALL languages (it's freaky). The Python binding isn't pure Python, it does require a C compiler, but it is frighteningly fast, and the pub/sub example is a cut/paste, 'golly, it works!' experience.
Link: http://zeromq.org
There are MANY other methods available with this library, including message queues, etc. They have relatively complete documentation, too.
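A minimal sketch of that layout, assuming the pyzmq binding (the ports, topic, and 5-second default are all arbitrary). The server pushes config on a PUB socket and collects readings on a PULL socket:
import zmq

ctx = zmq.Context()
control = ctx.socket(zmq.PUB)     # config out to every client
control.bind('tcp://*:5556')
data_in = ctx.socket(zmq.PULL)    # readings back from any number of clients
data_in.bind('tcp://*:5557')

control.send_string('interval 10')   # tell all clients: send every 10 s
while True:
    print(data_in.recv_string())

Each client picks up the interval (keeping a default until told otherwise) and publishes on schedule:
import time
import zmq

ctx = zmq.Context()
sub = ctx.socket(zmq.SUB)
sub.connect('tcp://localhost:5556')
sub.setsockopt_string(zmq.SUBSCRIBE, 'interval')
push = ctx.socket(zmq.PUSH)
push.connect('tcp://localhost:5557')

interval = 5.0                     # default until the server speaks
while True:
    try:
        _, value = sub.recv_string(flags=zmq.NOBLOCK).split()
        interval = float(value)
    except zmq.Again:
        pass                       # no new config this cycle
    push.send_string('my reading')
    time.sleep(interval)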
Multi-client to single-server socket programming can be achieved with multithreading. I have implemented both methods:
Single Client and Single Server
Multiclient and Single Server
See my GitHub repo: https://github.com/shauryauppal/Socket-Programming-Python
What is Multi-threading Socket Programming?
Multithreading is a process of executing multiple threads simultaneously in a single process.
To understand this well, you can visit https://www.geeksforgeeks.org/socket-programming-multi-threading-python/ (written by me).
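A minimal sketch of that approach with the standard library's socketserver module (the port and the 10-second interval are arbitrary): ThreadingTCPServer gives every client its own thread, the server sends the interval first, then keeps reading the client's periodic data.
import socketserver

SEND_INTERVAL = b'10\n'   # seconds; sent to each client right after it connects

class ClientHandler(socketserver.BaseRequestHandler):
    def handle(self):
        # Runs in its own thread, one per connected client.
        self.request.sendall(SEND_INTERVAL)
        while True:
            data = self.request.recv(1024)
            if not data:
                break              # client went away
            print(self.client_address, data)

if __name__ == '__main__':
    server = socketserver.ThreadingTCPServer(('0.0.0.0', 9000), ClientHandler)
    server.serve_forever()

A client would read the interval line once after connecting, then loop on send + time.sleep(interval).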
I am looking for a message/queuing solution for my web based system running on Ubuntu.
The system was built on the following technologies:
Javascript (Extjs framework) - Frontend
PHP
Python (Daemon service which interacts with the encryption device)
Python pyserial - (Serial port interactions)
MySQL
Linux - Custom bash scripts (to update DB/mail reports)
The system serves the following purpose:
Capture client information on a distributed platform
Encrypt/decrypt sensitive transactions using a Hardware device
System breakdown:
The user gains access to the system using a web browser
The user captures client information and on pressing "submit" button
The data is sent to the encryption device and the system enters a wait state
The data is then encrypted on the device and sent back to the browser
The encrypted data is saved to the DB
System exits wait state and displays DONE message
Please note: I have already taken care of waiting/progress messages, so let's omit that.
What I have done so far:
I created a Python daemon which monitors a DB view for any new requests
The daemon service executes new requests on the device using pyserial and updates the requests table with a "response", i.e. the encrypted content
I created a polling service in PHP which frequently checks if there is a "response" in the requests table for the specific request
Created the Extjs frontend with appropriate wait/done status messages
The problem with the current setup:
Concurrency - We expect > 20 users at any time submitting encryption/decryption requests
Using a database as a message/queuing solution is not scalable, due to table locking and there being only 1 listening process which monitors for requests
Daemon service - Relying on a daemon service is a bit risky, and the DB overhead seems a bit high, polling the view for new requests every second
Development - It would simplify my development tasks to just send requests to an encrypt/decrypt service, instead of this whole process of inserting a request in the DB, polling for the response and processing the request in the daemon service.
My Question:
What would be the ideal message/queuing solution in this situation? Please take into account that my system exclusively runs on an Ubuntu O/S.
I have done a few Google searches and came across something called a "STOMP" server, but it proved somewhat difficult to set up and lacked documentation. Also, I prefer advice from individuals who have experience setting up something like this, rather than a "how to" guide :)
Thank You for your time
I believe the popular RabbitMQ implementation of AMQP offers a PHP extension (here) and you can definitely access AMQP in Python, e.g. via Qpid. RabbitMQ is also easy to install on Ubuntu (or Debian), see e.g. here.
Whether via RabbitMQ or otherwise, adopting an open messaging and queueing protocol such as AMQP has obvious and definite advantages over more "closed" solutions (even if technically open source, such solutions just won't offer as many implementations, and therefore as much flexibility, as a widely adopted open, standard protocol).
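A minimal sketch of the Python side, assuming the pika client (one of several Python AMQP clients; Qpid works too), a RabbitMQ broker on localhost, and an arbitrary queue name:
import pika

conn = pika.BlockingConnection(pika.ConnectionParameters('localhost'))
channel = conn.channel()
channel.queue_declare(queue='crypto_requests')

# The web layer publishes an encryption request...
channel.basic_publish(exchange='', routing_key='crypto_requests',
                      body='data to encrypt')

# ...and the daemon consumes requests instead of polling the DB view.
def on_request(ch, method, properties, body):
    print('encrypting %r' % body)          # hand off to pyserial here
    ch.basic_ack(delivery_tag=method.delivery_tag)

channel.basic_consume(queue='crypto_requests', on_message_callback=on_request)
channel.start_consuming()

In practice the publish and consume halves live in separate processes (PHP publishing via the RabbitMQ PHP extension, the Python daemon consuming).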
I would do:
The web component connects to the encryption daemon/service, sends the data and waits for the answer
The encryption daemon/service would:
On startup, start a thread (SerialThread) of each of the available serial devices
All 'serial threads' would then do a SerialQueue.get (a blocking wait for messages)
A multi-threaded TCP server; check ThreadingMixIn from http://docs.python.org/library/socketserver.html
The TCP Server threads would receive the plain data and put it on the SerialQueue
A random SerialThread (Python's Queue class manages the required multi-thread locking for you) would receive the request, encrypt it, and return the encrypted data to the TCP Server thread
The TCP Server thread would write the data back to the web component
I am using this logic on a project; you can check the source at http://bazaar.launchpad.net/~mirror-selector-devs/mirror-selector/devel/files/head:/mirrorselector/. In my case the input is a URL, the processing is scanning for an available mirror, and the output is a mirror URL.
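A minimal sketch of this design, assuming a hypothetical device object that wraps the pyserial exchange (the port number is arbitrary):
import queue
import socketserver
import threading

SerialQueue = queue.Queue()

def serial_thread(device):
    # One thread per serial device, all blocking on the shared queue.
    while True:
        data, reply_q = SerialQueue.get()
        reply_q.put(device.encrypt(data))   # device.encrypt is the pyserial part

class CryptoHandler(socketserver.BaseRequestHandler):
    def handle(self):
        data = self.request.recv(4096)
        reply_q = queue.Queue()             # private queue for this request
        SerialQueue.put((data, reply_q))
        self.request.sendall(reply_q.get()) # block until a serial thread answers

class FakeDevice:                           # stand-in for the real hardware
    def encrypt(self, data):
        return data[::-1]

if __name__ == '__main__':
    for _ in range(2):                      # one thread per available device
        threading.Thread(target=serial_thread,
                         args=(FakeDevice(),), daemon=True).start()
    socketserver.ThreadingTCPServer(('127.0.0.1', 9000),
                                    CryptoHandler).serve_forever()

Queue.get/put handle all the locking, so a request is served by whichever serial thread is free, exactly as described above.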