I am trying to write a simple Python script that reads incoming serial data (via a USB adapter) at 115200 baud. After each line is received, it must be uploaded to MySQL running on a Synology NAS on the same network.
The problem I see with doing the MySQL INSERT from within Python is that it can take from 0.5 to 1.3 seconds, and during that time any incoming messages would be lost, probably several of them.
I have tried many threading examples but can't get them to work: they normally show you how to run 1, 2 or 3 threads at the same time, whereas what I need is to create threads as the incoming data requires them.
btw: Using Raspberry Pi.
As a reference, some of the examples I have tried:
http://www.tutorialspoint.com/python/python_multithreading.htm
How about just two threads? The first monitors the serial port and appends each incoming message to a list of messages to be stored; the second monitors that list and, whenever it is not empty, stores the messages to the database and removes them from the list. A sketch of this pattern follows.
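A minimal sketch of that two-thread pattern, assuming pyserial and mysql-connector-python (the port name, credentials and table layout are placeholders, not taken from the question):

    import queue
    import threading

    import serial                  # pyserial
    import mysql.connector         # mysql-connector-python

    msg_queue = queue.Queue()      # thread-safe buffer between the two threads

    def reader():
        ser = serial.Serial('/dev/ttyUSB0', 115200, timeout=1)
        while True:
            line = ser.readline()
            if line:
                msg_queue.put(line.decode(errors='replace').strip())

    def writer():
        db = mysql.connector.connect(host='nas.local', user='user',
                                     password='secret', database='logs')
        cur = db.cursor()
        while True:
            msg = msg_queue.get()  # blocks until a message arrives
            cur.execute("INSERT INTO messages (line) VALUES (%s)", (msg,))
            db.commit()            # a slow INSERT only delays this thread

    threading.Thread(target=reader, daemon=True).start()
    threading.Thread(target=writer, daemon=True).start()
    threading.Event().wait()       # keep the main thread alive

Because queue.Queue is thread-safe, the reader never waits on the database: messages that arrive during a 0.5-1.3 second INSERT simply pile up in the queue until the writer catches up, so no threads need to be created on demand.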
Related
I have two Python programs. Program 1 displays videos in a grid with multiple controls on it, and Program 2 performs manipulations on the images and sends them back, depending on the control pressed in Program 1.
Each video in the grid is running in its own thread, and each video has a thread in Program 2 for sending results back.
I'm running this on the same machine though and I was unable to get multiple socket connections working to and from the same address (localhost). If there's a way of doing that - please stop reading and tell me how!
I currently have one socket sitting independent of all of my video threads in Program 1, and in Program 2 I have multiple threads sending data to that one socket, as an array with a flag for which video the data is for. The problem is that when multiple threads send data at the same time it seems to scramble things and stop working. Any tips on how I can achieve this?
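One common cause of that scrambling is that send calls from different threads can interleave their bytes on the wire. A hedged sketch of one way around it, with a lock so each message is sent atomically and a length prefix so the receiver can split the stream back into messages (all names here are illustrative, not from the programs above):

    import json
    import socket
    import struct
    import threading

    send_lock = threading.Lock()

    def send_message(sock, video_id, payload):
        body = json.dumps({'video': video_id, 'data': payload}).encode()
        frame = struct.pack('!I', len(body)) + body   # 4-byte length prefix
        with send_lock:                               # one writer at a time
            sock.sendall(frame)

    def recv_exact(sock, n):
        buf = b''
        while len(buf) < n:                           # TCP may return partial reads
            chunk = sock.recv(n - len(buf))
            if not chunk:
                raise ConnectionError('socket closed')
            buf += chunk
        return buf

    def recv_message(sock):
        (length,) = struct.unpack('!I', recv_exact(sock, 4))
        return json.loads(recv_exact(sock, length))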
Regarding "If there's a way of doing that - please stop reading and tell me how!":
There is a way of doing it. Assuming you are on Linux, or on Windows using WSL, you can use the hostname -I command, which will output an IP that looks like 192.168.X.X.
You can use that IP in your Python program by binding your server to it instead of localhost or 127.0.0.1.
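For example, a minimal sketch of that change (the port number is a placeholder):

    import socket
    import subprocess

    # first address reported by hostname -I, e.g. 192.168.X.X
    lan_ip = subprocess.check_output(['hostname', '-I']).decode().split()[0]

    server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    server.bind((lan_ip, 5000))    # instead of ('localhost', 5000)
    server.listen()
    print('listening on', lan_ip)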
I'm trying to combine two programs in Python 3 that use the serial ports. I'll use a different USB cable for each controller. One program reads serial data coming in once a minute. The second program will use the other USB serial port once every 15 minutes. Since both programs spend a lot of time doing nothing, I want to combine them. This will be running on both a Mac (for development) and Windows (for final operation).
My question is: if program #2 is using its serial port when data comes in for program #1, will the data still be there waiting when program #1 looks at its own serial port?
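Generally yes, as long as the port stays open and its input buffer doesn't overflow: the OS serial driver buffers incoming bytes until you read them. A minimal sketch of one loop servicing both ports, assuming pyserial (device names are placeholders):

    import time
    import serial    # pyserial

    port_a = serial.Serial('/dev/cu.usbserial-A', 9600, timeout=1)
    port_b = serial.Serial('/dev/cu.usbserial-B', 9600, timeout=1)

    while True:
        if port_a.in_waiting:            # bytes buffered on port A?
            print('A:', port_a.readline())
        if port_b.in_waiting:            # bytes buffered on port B?
            print('B:', port_b.readline())
        time.sleep(0.1)                  # both ports are idle most of the time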
Is there a way to capture and write very fast serial data to a file?
I'm using a 32 kSPS external ADC and a baud rate of 2000000, printing in the following format: adc_value (32 bits) \t millis()
This results in ~15 prints every 1 ms. Unfortunately, every single solution I have tried fails to capture and store the data to a file in real time. This includes Processing sketches, TeraTerm, Serial Port Monitor, PuTTY and some Python scripts. None of them is able to log the data in real time.
The Arduino Serial Monitor, on the other hand, is able to display the serial data in real time, but it can't log it to a file, as it lacks that function.
(A screenshot of the Arduino Serial Monitor showing the incoming data was attached here.)
One problem is probably that you write to the file each time you receive a new record. That wastes a lot of time on small writes.
Instead, try to collect the data into buffers, and when a buffer is about to overflow, write the whole buffer out in a single, as low-level as possible, write call.
And so as not to stall the receiving of the data too much, you could use threads and double-buffering: receive data in one thread and write it into a buffer; when the buffer is about to overflow, signal a second thread and switch to a second buffer. The other thread takes the full buffer, writes it to disk, and waits for the next buffer to become full.
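A minimal sketch of that double-buffering scheme in Python, assuming pyserial (buffer size, port name and baud rate are placeholders):

    import queue
    import threading
    import serial    # pyserial

    BUF_SIZE = 64 * 1024
    full_buffers = queue.Queue()             # hands full buffers to the writer

    def writer():
        with open('capture.bin', 'wb') as f:
            while True:
                buf = full_buffers.get()     # wait for a full buffer
                if buf is None:              # sentinel: reader is done
                    break
                f.write(buf)                 # one large write per buffer

    threading.Thread(target=writer, daemon=True).start()

    ser = serial.Serial('/dev/ttyUSB0', 2000000)
    buf = bytearray()
    try:
        while True:
            buf += ser.read(ser.in_waiting or 1)  # grab whatever has arrived
            if len(buf) >= BUF_SIZE:              # about to overflow: swap
                full_buffers.put(bytes(buf))
                buf = bytearray()
    finally:
        full_buffers.put(bytes(buf))
        full_buffers.put(None)

The queue of full buffers stands in for the "signal a second thread" step: the reader keeps draining the port while the writer does one large write per buffer instead of one per record.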
After trying more than 10 possible solutions to this problem, including dedicated serial capture software, Python scripts, MATLAB scripts, and some C alternatives, the only one that more or less worked for me proved to be MegunoLink Pro.
It does not achieve the ADC's full 32 kSPS potential, rather around 12-15 kSPS, but it is still much better than anything else I've tried.
Not reaching the full 32 kSPS might also be down to the Serial.print() calls I'm using to write values to the serial console. By the way, the platform I've been using is an ESP32.
Later edit: don't forget to edit the MegunoLinkPro.exe.config file in the MegunoLink Pro install directory in order to add further baud rates, such as 1000000 or 2000000. By default it is limited to 500000.
I am having an issue with a Python script that is running on a Raspberry Pi. Frankly, the script initially runs perfectly fine, and then after a certain period of time (typically more than an hour) the computer either freezes or shuts down. I am not sure if this is a software or a hardware issue. The only clue I have so far is the following error message, which appeared once when the computer froze:
[9798.371860] Unable to handle kernel paging request at virtual address e50b405c
How should this message be interpreted? And what would be a good way to keep debugging the code? Any help is appreciated, since I am fairly new to programming and have run out of ideas on how to troubleshoot this issue.
Here is also some background on what the Python code is meant to do (not sure if it makes a difference, though). In short, every other second it reads the temperature through a sensor, creates a JSON file and saves it, sends this JSON object via an HTTP request (urllib) to a web API, receives a new JSON file in response, changes switches based on the data in that file, sleeps for 2 seconds, and repeats the process.
Thanks!
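For reference, a minimal sketch of the cycle described above; read_temperature() and set_switches() are hypothetical stand-ins for the actual sensor and switch code, and the URL is a placeholder:

    import json
    import time
    import urllib.request

    API_URL = 'http://example.com/api'   # placeholder

    def read_temperature():              # hypothetical sensor read
        return 21.5

    def set_switches(state):             # hypothetical switch control
        pass

    while True:
        payload = json.dumps({'temp': read_temperature()}).encode()
        req = urllib.request.Request(API_URL, data=payload,
                                     headers={'Content-Type': 'application/json'})
        with urllib.request.urlopen(req, timeout=10) as resp:
            reply = json.load(resp)      # with-block closes the connection
        set_switches(reply)
        time.sleep(2)

Wrapping urlopen() in a with-block guarantees every response is closed, which rules out one common source of slow resource leaks in long-running scripts; whether that relates to the kernel paging error is a separate question.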
I am relatively new to programming and thus have limited experience with the logic involved in getting scripts to work the way I want.
In a nutshell, I have an Arduino connected to a Raspberry Pi (Raspbian). The Arduino controls sensors while the Raspberry Pi acts as a web server. I have created a MySQL database containing two tables. The first table needs an INSERT every 5 minutes (through script 1.py) and the second table gets an UPDATE every minute. Both tables receive values from the Arduino.
I can run each script separately, and each works fine. I can join the two scripts and that works too, BUT because I use cron they both run at the SAME time interval (e.g., every 5 minutes). If I run both scripts (one on a 5-minute interval and one on a 1-minute interval) through cron, only one works. I think it has something to do with the timing of the opening and closing of the connections?
Are there any suggestions as to how I can get this to work? Ideally I would like one script to run every 5 minutes and the other every 5 seconds (cron can't do seconds).
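One pattern that avoids cron entirely is a single long-running script that tracks both intervals itself. A minimal sketch, with hypothetical stand-ins for the two jobs:

    import time

    def update_row():                    # hypothetical: the frequent task
        pass

    def insert_row():                    # hypothetical: the 5-minute task
        pass

    INSERT_EVERY = 300                   # seconds
    last_insert = 0.0

    while True:
        update_row()                     # runs every pass, i.e. every ~5 seconds
        now = time.monotonic()
        if now - last_insert >= INSERT_EVERY:
            insert_row()
            last_insert = now
        time.sleep(5)

A single process can also hold one long-lived database connection, which may sidestep the open/close timing collisions suspected above.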