I have a program that writes serial data to a text file. I want to check the file for specific card UIDs by reading it first: if that card UID is already in the file I want to skip the serial write step (serial write 0), and if it isn't I will go ahead and serial write 1.
To check for card UIDs I have used the approach below; please have a look at my code.
import threading
import serial
import sys
import io
import codecs
import queue
from pynput import keyboard

with io.open("uid.txt", "w", encoding="utf-8") as b:
    b.write("")

q = queue.Queue()
ser = serial.Serial('COM4', baudrate=9600, timeout=5)

class SerialReaderThread(threading.Thread):
    def run(self):
        while True:
            output = ser.readline().decode('utf-8')
            print(output)
            q.put(output)

class FileWriting(threading.Thread):
    def run(self):
        while True:
            output = q.get()
            with io.open("uid.txt", "r+", encoding="utf-8") as input:
                for line in input:
                    if line.startswith("Card UID: "):
                        s = (next(input))
                        if line.startswith(s):  # ***
                            ser.write(b'0\r\n')
                        else:
                            ser.write(b'1\r\n')
            with io.open("uid.txt", "a+", encoding="utf-8") as f:
                f.write(output)

serial_thread = SerialReaderThread()
file_thread = FileWriting()
serial_thread.start()
file_thread.start()
serial_thread.join()
file_thread.join()
The FileWriting thread is what I need help with. Again, I want to first read the text file (which is initially empty, since it is created on startup), check for lines with a card UID, and see whether that specific card UID is already in the file: if it is, write 0 over serial; if it isn't, write 1.
However running this code gives me an error:
Exception in thread Thread-2:
Traceback (most recent call last):
  File "C:\Users\Tsotne\AppData\Local\Programs\Python\Python38-32\lib\threading.py", line 932, in _bootstrap_inner
    self.run()
  File "C:\Users\Tsotne\AppData\Local\Programs\Python\Python38-32\project\fff.py", line 36, in run
    s = (next(input))
StopIteration
Since you re-create the uid.txt file empty each time you run your program, you don't need a file to hold the information. Just use a set instead:
ser = serial.Serial('COM4', baudrate=9600, timeout=5)

class SerialReaderThread(threading.Thread):
    def run(self):
        uids = set()
        while True:
            output = ser.readline().decode('utf-8')
            print(output)
            response = b'0' if output in uids else b'1'
            ser.write(response + b'\r\n')
            uids.add(output)

serial_thread = SerialReaderThread()
serial_thread.start()
serial_thread.join()
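The core of this answer, deciding the response from a set of previously seen UIDs, can be sketched and tested without any serial hardware. `response_for` is an illustrative name of my own, not part of pyserial:

```python
def response_for(uid, seen):
    """Return b'0' if the UID was seen before, b'1' otherwise,
    and record the UID in the seen-set."""
    response = b'0' if uid in seen else b'1'
    seen.add(uid)
    return response

seen = set()
print(response_for("Card UID: 04 A3 1B 2C\n", seen))  # first sighting: b'1'
print(response_for("Card UID: 04 A3 1B 2C\n", seen))  # repeat: b'0'
```

In the real program you would call this with `ser.readline().decode('utf-8')` and pass the result to `ser.write()`.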
I am trying to receive/send data at the same time, and my idea to doing this was
import multiprocessing
import time
from reprint import output
import random

def receiveThread(queue):
    while True:
        queue.put(random.randint(0, 50))
        time.sleep(0.5)

def sendThread(queue):
    while True:
        queue.put(input())

if __name__ == "__main__":
    send_queue = multiprocessing.Queue()
    receive_queue = multiprocessing.Queue()
    send_thread = multiprocessing.Process(target=sendThread, args=[send_queue],)
    receive_thread = multiprocessing.Process(target=receiveThread, args=[receive_queue],)
    receive_thread.start()
    send_thread.start()
    with output(initial_len=2, interval=0) as output_lines:
        while True:
            output_lines[0] = "Received: {}".format(str(receive_queue.get()))
            output_lines[1] = "Last Sent: {}".format(str(send_queue.get()))
            #output_lines[2] = "Input: {}".format() i don't know how
            #also storing the data in a file but that's irrelevant for here
This however results in
Received: 38 Process Process-1:
Traceback (most recent call last):
  File "/Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/multiprocessing/process.py", line 315, in _bootstrap
    self.run()
  File "/Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/multiprocessing/process.py", line 108, in run
    self._target(*self._args, **self._kwargs)
  File "/Users/mge/repos/python/post_ug/manual_post/main.py", line 14, in sendThread
    queue.put(input())
EOFError: EOF when reading a line
I hope you can see what I am trying to do, but I will explain it some more: I want one thread that gets data from a server (which I have replaced here with random.randint()), and one thread that reads input while the other one is constantly checking for data. I would like it to look somewhat like:
Received: 38 Received: 21 Received: 12
Last Sent: => Last Sent: Hello World! => Last Sent: Lorem Ipsum => ...
Input: Hello Wo Input: Lore Input:
But I have no idea how to get it done. If I replace the queue.put(input()) with another queue.put(random.randint(0, 50)), the printing in the two lines works as expected, but:
how can I have an 'input field' at the bottom, and
how can I get the input without the EOFError?
According to your description ("I want one thread that gets data from a server that I have replaced with the random.randint(), and I want one thread that, while the other one is constantly checking for the data, is getting an input"), what you really want is multithreading. Your code, however, creates and runs 2 new Processes instead of 2 new Threads. If multithreading is what you want, do the following instead, replacing the use of multiprocessing by the use of multithreading:
from queue import Queue
import threading
import time
from reprint import output
import random

def receiveThread(queue):
    while True:
        queue.put(random.randint(0, 50))
        time.sleep(0.5)

def sendThread(queue):
    while True:
        queue.put(input())

if __name__ == "__main__":
    send_queue = Queue()
    receive_queue = Queue()
    send_thread = threading.Thread(target=sendThread, daemon=True, args=(send_queue,))
    receive_thread = threading.Thread(target=receiveThread, daemon=True, args=(receive_queue,))
    receive_thread.start()
    send_thread.start()
    with output(initial_len=2, interval=0) as output_lines:
        while True:
            output_lines[0] = "Received: {}".format(str(receive_queue.get()))
            output_lines[1] = "Last Sent: {}".format(str(send_queue.get()))
            #output_lines[2] = "Input: {}".format() i don't know how
            #also storing the data in a file but that's irrelevant for here
Instances of queue.Queue are thread-safe, so they can safely be shared by multithreaded code, as in the code above.
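That thread-safety can be illustrated with the standard library alone (the producer function and the counts below are my own, purely for demonstration): several threads can put into one queue.Queue concurrently, with no explicit locking, and nothing is lost or duplicated.

```python
import threading
from queue import Queue

def producer(q, worker_id, count):
    # Each worker pushes `count` items; Queue does the locking internally.
    for i in range(count):
        q.put((worker_id, i))

q = Queue()
threads = [threading.Thread(target=producer, args=(q, w, 100)) for w in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

items = []
while not q.empty():
    items.append(q.get())
print(len(items))  # 400
```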
I'm trying to read output from a serial port and write it to a text file. So far this connects and prints the output, but I can't figure out how to send the output to a text file (preferably a CSV file).
#! /usr/bin/env python3
import serial
import csv
import sys
import io

#excel stuff
#from time import gmtime, strftime
#resultFile=open('MyData.csv','wb')
#end excel stuff

def scan():
    """scan for available ports. return a list of tuples (num, name)"""
    available = []
    for i in range(256):
        try:
            s = serial.Serial(i)
            available.append((i, s.portstr))
            s.close()   # explicit close 'cause of delayed GC in java
        except serial.SerialException:
            pass
    return available

if __name__ == '__main__':
    print("Found ports:")
    for n, s in scan():
        print("(%d) %s" % (n, s))
    selection = input("Enter port number:")
    try:
        ser = serial.Serial(eval(selection), 9600, timeout=1)
        print("connected to: " + ser.portstr)
    except serial.SerialException:
        pass
    while True:
        # Read a line and convert it from b'xxx\r\n' to xxx
        line = ser.readline().decode('utf-8')[:-1]
        if line:  # If it isn't a blank line
            # f=open('testing.txt','w+')
            print(line)
            #print >> f.write('test.txt')
            f.close()
            #print(line)
            #with open('test.csv', 'w') as csv_file:
            #    writer = csv.DictWriter(csv_file, fieldnames=['header1'], lineterminator='\n')
    ser.close()
import sys
sys.stdout = open('file', 'w')
print('test')

or redirect the shell output:

$ python foo.py > file
Duplicate of this:
Redirect stdout to a file in Python?
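Since the question specifically asks for a CSV file, a more direct sketch is to append each decoded line with csv.writer instead of redirecting stdout. The function name and the canned data here are illustrative; in the real program the lines would come from ser.readline().decode('utf-8'):

```python
import csv
import os
import tempfile

def log_lines_to_csv(lines, csv_path):
    """Append each decoded serial line as one CSV row (comma-split)."""
    with open(csv_path, 'a', newline='') as csv_file:
        writer = csv.writer(csv_file, lineterminator='\n')
        for line in lines:
            writer.writerow(line.strip().split(','))

# Canned data lets the sketch run without hardware.
path = os.path.join(tempfile.mkdtemp(), 'test.csv')
log_lines_to_csv(['12.5,3.3\r\n', '12.6,3.2\r\n'], path)
with open(path, newline='') as f:
    rows = list(csv.reader(f))
print(rows)  # [['12.5', '3.3'], ['12.6', '3.2']]
```

Opening the file in 'a' mode inside the read loop (or keeping it open for the whole loop) avoids the truncation you would get from re-opening with 'w' on every line.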
I am new to Python, so I need your help with the following error message. I have two files; the first one, "test1.py", is the one I am running and it has the following code.
import sys, time, re, os, pickle
from ComLib import *

obj = Com()
obj.ComOpen()
obj.ComReset()
obj.ComClose()
and the second file is "ComLib.py" and has the following code
import serial, sys, re, pickle, time

class Com:
    def ComOpen(self):
        self = serial.Serial()
        self.port = "COM1"
        self.baudrate = 9600
        self.bytesize = serial.EIGHTBITS    #number of bits per bytes
        self.parity = serial.PARITY_NONE    #set parity check: no parity
        self.stopbits = serial.STOPBITS_ONE #number of stop bits
        self.timeout = 1                    #non-block read
        self.xonxoff = True                 #enable software flow control
        self.rtscts = False                 #disable hardware (RTS/CTS) flow control
        self.dsrdtr = False                 #disable hardware (DSR/DTR) flow control
        self.writeTimeout = 2               #timeout for write
        self.open()
        return

    def ComClose(self):
        self.close()
        return

    def ComReset(self):
        print "Executing ComReset function...!!"
        self.write("~~~~~~~~~~\r")
        i = 0
        while i < 10:
            response = self.readline()
            print "Inside first while loop...!!"
            print "response = " + response
            if (response == ':'):
                print "-->colon found...ready for next input<---"
                break
            i = i + 1
            time.sleep(0.5)
        return
While executing the above I am getting the following error:

Traceback (most recent call last):
  File "C:\Users\vgupta\Desktop\KeyAT\final\WDEAutomationTestSuite\WDETestSuite\Bootguard\TC#001.py", line 17, in <module>
    obj.ComReset()
  File "C:\Users\vgupta\Desktop\KeyAT\final\WDEAutomationTestSuite\APILib\ComLib.py", line 52, in ComReset
    self.write("~~~~~~~~~~\r")
AttributeError: Com instance has no attribute 'write'
Can anyone help me out in finding out what is wrong here.
Thanks,
Vipul
Your declaration should be:

self.sSerial = serial.Serial()
self.sSerial.port = "COM1"
self.sSerial.baudrate = 9600
.........

Then you can do self.sSerial.write("~~~~~~~~~~\r"). In your version, self = serial.Serial() only rebinds the local name self inside ComOpen; it does not store the serial object on the Com instance, which is why self.write fails later.
class Com is also missing __init__.
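The underlying pitfall is that assigning to self inside a method only rebinds a local name. A hardware-free sketch of the difference, with io.StringIO standing in for serial.Serial purely for illustration:

```python
import io

class Broken:
    def open_port(self):
        # Rebinds only the LOCAL name `self`; the instance is untouched
        # and the object is garbage-collected when the method returns.
        self = io.StringIO()

class Fixed:
    def open_port(self):
        # Stored as an attribute, so other methods can reach it.
        self.sSerial = io.StringIO()
    def write(self, data):
        self.sSerial.write(data)

b = Broken()
b.open_port()
print(hasattr(b, 'sSerial'))        # False: the object was lost

f = Fixed()
f.open_port()
f.write('~~~~~~~~~~\r')
print(repr(f.sSerial.getvalue()))   # '~~~~~~~~~~\r'
```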
I am trying to write a homework about map-reduce. I run in a terminal:
ioannis#ioannis-desktop:~$ python hw3.py
then in another terminal:
python mincemeat.py -p changeme localhost
Immediately in the former terminal, a bunch of output appears:
ioannis#ioannis-desktop:~$ python hw3.py
error: uncaptured python exception, closing channel <mincemeat.ServerChannel connected 127.0.0.1:58061 at 0x7fabef5045a8>
(<type 'exceptions.TypeError'>:'NoneType' object is not iterable
[/usr/lib/python2.7/asyncore.py|read|83]
[/usr/lib/python2.7/asyncore.py|handle_read_event|449]
[/usr/lib/python2.7/asynchat.py|handle_read|158]
[/home/ioannis/mincemeat.py|found_terminator|82]
[/home/ioannis/mincemeat.py|process_command|280]
[/home/ioannis/mincemeat.py|process_command|123]
[/home/ioannis/mincemeat.py|respond_to_challenge|106]
[/home/ioannis/mincemeat.py|post_auth_init|289]
[/home/ioannis/mincemeat.py|start_new_task|258]
[/home/ioannis/mincemeat.py|next_task|304])
^CTraceback (most recent call last):
  File "hw3.py", line 54, in <module>
    results = s.run_server(password="changeme")
  File "/home/ioannis/mincemeat.py", line 220, in run_server
    self.close_all()
  File "/usr/lib/python2.7/asyncore.py", line 421, in __getattr__
    %(self.__class__.__name__, attr))
AttributeError: Server instance has no attribute 'close_all'
ioannis#ioannis-desktop:~$ python hw3.py
the code for hw3.py:
import mincemeat
import glob
from stopwords import allStopWords

text_files = glob.glob('/home/ioannis/Web Intelligence and Big Data/Week 3: Load - I/hw3data/hw3data/*')

def file_contents(file_name):
    f = open(file_name)
    try:
        # print f.read()
        return f.read()
    except:
        print "exception!!!!!!"
    finally:
        f.close()

source = dict((file_name, file_contents(file_name))
              for file_name in text_files)

def mapfn(key, value):
    for line in value.splitlines():
        print "I have reach that point!"
        ...........
        ...........

def reducefn(k, vs):
    result = sum(vs)
    return result

s = mincemeat.Server()
s.source = source
s.mapfn = mapfn
s.reducefn = reducefn

results = s.run_server(password="changeme")
print results
In the thread Python, Asyncore and forks, the following suggestion was made:
Change your handle_accept() to return immediately when accept() returns None.
In the file mincemeat.py there is a function:
def handle_accept(self):
    conn, addr = self.accept()
    sc = ServerChannel(conn, self)
    sc.password = self.password
Is the solution to my problem to change something in that function?
s.source = source needs to be s.datasource = source.
I am writing a file processor that can (hopefully) parse arbitrary files and perform arbitrary actions on the parsed contents. The file processor needs to run continuously. The basic idea that I am following is
Each file will have two associated processes (One for reading, other for Parsing and writing somewhere else)
The reader will read a line into a common buffer(say a Queue) till EOF or buffer full. Then wait(sleep)
Writer will read from buffer, parse the stuff, write it to (say) DB till buffer not empty. Then wait(sleep)
Interrupting the main program will cause the reader/writer to exit safely (buffer can be washed away without writing)
The program runs fine, but sometimes the Writer initializes first, finds the buffer empty, and goes to sleep; the Reader then fills the buffer and sleeps too. So for one sleep_interval my code does nothing. To get around that, I tried using a multiprocessing.Event() to signal to the writer that the buffer has some entries which it may process.
My code is
import multiprocessing
import time
import sys
import signal
import Queue

class FReader(multiprocessing.Process):
    """
    A basic file reader class
    It spawns a new process that shares a queue with the writer process
    """
    def __init__(self, queue, fp, sleep_interval, read_offset, event):
        self.queue = queue
        self.fp = fp
        self.sleep_interval = sleep_interval
        self.offset = read_offset
        self.fp.seek(self.offset)
        self.event = event
        self.event.clear()
        super(FReader, self).__init__()

    def myhandler(self, signum, frame):
        self.fp.close()
        print "Stopping Reader"
        sys.exit(0)

    def run(self):
        signal.signal(signal.SIGINT, self.myhandler)
        signal.signal(signal.SIGCLD, signal.SIG_DFL)
        signal.signal(signal.SIGILL, self.myhandler)
        while True:
            sleep_now = False
            if not self.queue.full():
                print "READER:Reading"
                m = self.fp.readline()
                if not self.event.is_set():
                    self.event.set()
                if m:
                    self.queue.put((m, self.fp.tell()), block=False)
                else:
                    sleep_now = True
            else:
                print "Queue Full"
                sleep_now = True
            if sleep_now:
                print "Reader sleeping for %d seconds" % self.sleep_interval
                time.sleep(self.sleep_interval)

class FWriter(multiprocessing.Process):
    """
    A basic file writer class
    It spawns a new process that shares a queue with the reader process
    """
    def __init__(self, queue, session, sleep_interval, fp, event):
        self.queue = queue
        self.session = session
        self.sleep_interval = sleep_interval
        self.offset = 0
        self.queue_offset = 0
        self.fp = fp
        self.dbqueue = Queue.Queue(50)
        self.event = event
        self.event.clear()
        super(FWriter, self).__init__()

    def myhandler(self, signum, frame):
        #self.session.commit()
        self.session.close()
        self.fp.truncate()
        self.fp.write(str(self.offset))
        self.fp.close()
        print "Stopping Writer"
        sys.exit(0)

    def process_line(self, line):
        #Do not process comments
        if line[0] == '#':
            return None
        my_list = []
        split_line = line.split(',')
        my_list = split_line
        return my_list

    def run(self):
        signal.signal(signal.SIGINT, self.myhandler)
        signal.signal(signal.SIGCLD, signal.SIG_DFL)
        signal.signal(signal.SIGILL, self.myhandler)
        while True:
            sleep_now = False
            if not self.queue.empty():
                print "WRITER:Getting"
                line, offset = self.queue.get(False)
                #Process the line just read
                proc_line = self.process_line(line)
                if proc_line:
                    #Must write it to DB. Put it into DB Queue
                    if self.dbqueue.full():
                        #DB Queue is full, put data into DB before putting more data
                        self.empty_dbqueue()
                    self.dbqueue.put(proc_line)
                    #Keep a track of the maximum offset in the queue
                    self.queue_offset = offset if offset > self.queue_offset else self.queue_offset
            else:
                #Looks like writing queue is empty. Just check if DB Queue is empty too
                print "WRITER: Empty Read Queue"
                self.empty_dbqueue()
                sleep_now = True
            if sleep_now:
                self.event.clear()
                print "WRITER: Sleeping for %d seconds" % self.sleep_interval
                #time.sleep(self.sleep_interval)
                self.event.wait(5)

    def empty_dbqueue(self):
        #The DB Queue has many objects waiting to be written to the DB. Lets write them
        print "WRITER:Emptying DB QUEUE"
        while True:
            try:
                new_line = self.dbqueue.get(False)
            except Queue.Empty:
                #Write the new offset to file
                self.offset = self.queue_offset
                break
            print new_line[0]

def main():
    write_file = '/home/xyz/stats.offset'
    wp = open(write_file, 'r')
    read_offset = wp.read()
    try:
        read_offset = int(read_offset)
    except ValueError:
        read_offset = 0
    wp.close()
    print read_offset
    read_file = '/var/log/somefile'
    file_q = multiprocessing.Queue(100)
    ev = multiprocessing.Event()
    new_reader = FReader(file_q, open(read_file, 'r'), 30, read_offset, ev)
    new_writer = FWriter(file_q, open('/dev/null'), 30, open(write_file, 'w'), ev)
    new_reader.start()
    new_writer.start()
    try:
        new_reader.join()
        new_writer.join()
    except KeyboardInterrupt:
        print "Closing Master"
        new_reader.join()
        new_writer.join()

if __name__ == '__main__':
    main()
The dbqueue in Writer is for batching together Database writes and for each line I keep the offset of that line. The maximum offset written into DB is stored into offset file on exit so that I can pick up where I left on next run. The DB object (session) is just '/dev/null' for demo.
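The offset-persistence step described above (write the highest offset flushed to the DB, read it back on the next run) can be sketched on its own. This is Python 3 syntax, and the file name is a placeholder:

```python
import os
import tempfile

def save_offset(path, offset):
    # Overwrite the offset file with the latest position flushed to the DB.
    with open(path, 'w') as fp:
        fp.write(str(offset))

def load_offset(path):
    # A missing or corrupt offset file means "start from the beginning".
    try:
        with open(path) as fp:
            return int(fp.read())
    except (OSError, ValueError):
        return 0

path = os.path.join(tempfile.mkdtemp(), 'stats.offset')
print(load_offset(path))   # 0: no file yet
save_offset(path, 4096)
print(load_offset(path))   # 4096 on the next run
```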
Previously rather than do
self.event.wait(5)
I was doing
time.sleep(self.sleep_interval)
That worked well (apart from the little delay I mentioned), and the processes exited cleanly.
Now on doing a Ctrl-C on the main process, the reader exits but the writer throws an OSError
^CStopping Reader
Closing Master
Stopping Writer
Process FWriter-2:
Traceback (most recent call last):
  File "/usr/lib64/python2.6/multiprocessing/process.py", line 232, in _bootstrap
    self.run()
  File "FileParse.py", line 113, in run
    self.event.wait(5)
  File "/usr/lib64/python2.6/multiprocessing/synchronize.py", line 303, in wait
    self._cond.wait(timeout)
  File "/usr/lib64/python2.6/multiprocessing/synchronize.py", line 212, in wait
    self._wait_semaphore.acquire(True, timeout)
OSError: [Errno 0] Error
I know event.wait() somehow blocks the code, but I can't work out how to overcome this. I tried wrapping self.event.wait(5) and sys.exit() in a try/except OSError block, but that only makes the program hang forever.
I am using Python-2.6
I think it would be better to use the Queue's blocking timeout for the Writer class, i.e. Queue.get(True, 5). Then, if something is put into the queue during that interval, the Writer wakes up immediately. The Writer loop would then be something like:
while True:
    sleep_now = False
    try:
        print "WRITER:Getting"
        line, offset = self.queue.get(True, 5)
        #Process the line just read
        proc_line = self.process_line(line)
        if proc_line:
            #Must write it to DB. Put it into DB Queue
            if self.dbqueue.full():
                #DB Queue is full, put data into DB before putting more data
                self.empty_dbqueue()
            self.dbqueue.put(proc_line)
            #Keep a track of the maximum offset in the queue
            self.queue_offset = offset if offset > self.queue_offset else self.queue_offset
    except Queue.Empty:
        #Looks like writing queue is empty. Just check if DB Queue is empty too
        print "WRITER: Empty Read Queue"
        self.empty_dbqueue()
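The wake-up behaviour this answer relies on can be demonstrated with the standard library alone. This sketch is written in Python 3 syntax (so the module is queue rather than Queue), and the timings are illustrative:

```python
import queue
import threading
import time

q = queue.Queue()

def producer():
    time.sleep(0.2)            # simulate the Reader arriving mid-wait
    q.put("a line")

t = threading.Thread(target=producer)
t.start()

start = time.monotonic()
item = q.get(True, 5)          # block up to 5 s, wake as soon as data arrives
elapsed = time.monotonic() - start
t.join()

print(item)                    # a line
print(elapsed < 1.0)           # True: woke well before the 5 s timeout
```

Because the blocking get is interruptible and raises Queue.Empty on timeout, it replaces both the busy check `if not self.queue.empty()` and the separate event.wait(5) in one call.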