Mixing multiprocessing and serial ports - python

I've written a class that inherits from multiprocessing.Process. It holds a serial.Serial() object in an instance attribute. The method self.loop() is supposed to read from and write to the serial port. When self.loop() is called, it is supposed to run as a separate process, which is a requirement from the person who asked me to write this. However, my code produces a strange error.
This is my code:
from multiprocessing import Process
import serial
import time


class MySerialManager(Process):
    def __init__(self, serial_port, baudrate=115200, timeout=1):
        super(MySerialManager, self).__init__(target=self.loop)
        # As soon as you uncomment this, you'll get an error.
        # self.ser = serial.Serial(serial_port, baudrate=baudrate, timeout=timeout)

    def loop(self):
        # Just some simple action for simplicity.
        for i in range(3):
            print("hi")
            time.sleep(1)


if __name__ == "__main__":
    msm = MySerialManager("COM14")
    try:
        msm.start()
    except KeyboardInterrupt:
        print("caught in main")
    finally:
        msm.join()
This is the error:
Traceback (most recent call last):
File "test.py", line 22, in <module>
msm.start()
File "C:\Python\Python36\lib\multiprocessing\process.py", line 105, in start
self._popen = self._Popen(self)
File "C:\Python\Python36\lib\multiprocessing\context.py", line 223, in _Popen
return _default_context.get_context().Process._Popen(process_obj)
File "C:\Python\Python36\lib\multiprocessing\context.py", line 322, in _Popen
return Popen(process_obj)
File "C:\Python\Python36\lib\multiprocessing\popen_spawn_win32.py", line 65, in __init__
reduction.dump(process_obj, to_child)
File "C:\Python\Python36\lib\multiprocessing\reduction.py", line 60, in dump
ForkingPickler(file, protocol).dump(obj)
ValueError: ctypes objects containing pointers cannot be pickled
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "test.py", line 26, in <module>
msm.join()
File "C:\Python\Python36\lib\multiprocessing\process.py", line 120, in join
assert self._popen is not None, 'can only join a started process'
AssertionError: can only join a started process
Traceback (most recent call last):
File "<string>", line 1, in <module>
File "C:\Python\Python36\lib\multiprocessing\spawn.py", line 105, in spawn_main
exitcode = _main(fd)
File "C:\Python\Python36\lib\multiprocessing\spawn.py", line 115, in _main
self = reduction.pickle.load(from_parent)
EOFError: Ran out of input
I've also tried creating the serial port object outside of the class and passing it to the constructor. Furthermore, I've tried not inheriting from multiprocessing.Process and instead putting:
self.proc = Process(target=self.loop)
into the class and
try:
    msm.proc.start()
except KeyboardInterrupt:
    print("caught in main")
finally:
    msm.proc.join()
into the main block. Neither of them solved the problem.
Somebody pointed out that it seems like mixing multiprocessing and serial ports just doesn't work out. Is that true? If it is, could you please explain to me why this isn't working? Any help is greatly appreciated!

On Windows, a serial.Serial object cannot be shared between two processes (i.e. parent and child) once it has been created: the child process is started with spawn, so everything handed to it must be pickled, and the serial object (which wraps ctypes handles, as the traceback shows) cannot be.
So create the serial object in the child process and pass a reference to it as an argument to the other methods.
Try this:
from multiprocessing import Process
import serial
import time


class MySerialManager(Process):
    def __init__(self, serial_port, baudrate=115200, timeout=1):
        super(MySerialManager, self).__init__(target=self.loop_iterator,
                                              args=(serial_port, baudrate, timeout))
        # Do not create the serial.Serial object here; as soon as you
        # uncomment this, you'll get the pickling error again.
        # self.ser = serial.Serial(serial_port, baudrate=baudrate, timeout=timeout)

    def loop_iterator(self, serial_port, baudrate, timeout):
        # This runs in the child process, so the port is opened there.
        ser = serial.Serial(serial_port, baudrate=baudrate, timeout=timeout)
        self.loop(ser)

    def loop(self, ser):
        # Just some simple action for simplicity.
        # You can use ser here.
        for i in range(3):
            print("hi")
            time.sleep(1)


if __name__ == "__main__":
    msm = MySerialManager("COM4")
    try:
        msm.start()
    except KeyboardInterrupt:
        print("caught in main")
    finally:
        msm.join()
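The same idea also works without subclassing Process (the variant you tried with self.proc): pass only picklable values (the port name, baud rate and timeout) to the child and open the port inside the target function. A minimal sketch under that assumption:

from multiprocessing import Process
import serial
import time


def serial_loop(serial_port, baudrate=115200, timeout=1):
    # The serial.Serial object is created in the child process,
    # so it never has to be pickled.
    ser = serial.Serial(serial_port, baudrate=baudrate, timeout=timeout)
    try:
        for i in range(3):
            print("hi")  # replace with ser.read()/ser.write() as needed
            time.sleep(1)
    finally:
        ser.close()


if __name__ == "__main__":
    proc = Process(target=serial_loop, args=("COM14",))
    try:
        proc.start()
    except KeyboardInterrupt:
        print("caught in main")
    finally:
        proc.join()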

Related

Simple code using a serial port in a multiprocessing environment not working

I am trying to write a multiprocessing Python script on Windows which reads an RS-232 serial port in one process and sends the data back to the main process.
I have narrowed my problem down to creating the serial port connection in the main process and then accessing it in another process. The script consists of two files. The first file (test-main.py) is the main process; the second file (test_class.py) contains a serial port class.
test-main.py
from multiprocessing import Process, Queue
import serial
from test_class import test


def main():
    # RS232 parameters to open the serial connection
    test_serial = serial.Serial(
        port='COM9',
        baudrate=9600,
        parity=serial.PARITY_NONE,
        stopbits=serial.STOPBITS_ONE,
        bytesize=serial.EIGHTBITS,
        timeout=0.1)

    # create an instance of the test class
    test_instance = test()

    # create a new process which will measure speed and initiate the relay
    test_instance = Process(target=test_instance.test_method, args=(test_serial,))
    test_instance.daemon = True
    test_instance.start()


if __name__ == '__main__':
    main()
test_class.py
from multiprocessing import Process, Queue
import serial


class test:
    def test_method(self, rs232_connection):
        # main loop
        while True:
            print(rs232_connection)
The error I get is:
C:\Users\User\Documents\BatterUp\Python>py test-main.py
Traceback (most recent call last):
File "test-main.py", line 26, in <module>
main()
File "test-main.py", line 22, in main
test_instance.start()
File "C:\Users\User\AppData\Local\Programs\Python\Python37-32\lib\multiprocessing\process.py", line 112, in start
self._popen = self._Popen(self)
File "C:\Users\User\AppData\Local\Programs\Python\Python37-32\lib\multiprocessing\context.py", line 223, in _Popen
return _default_context.get_context().Process._Popen(process_obj)
File "C:\Users\User\AppData\Local\Programs\Python\Python37-32\lib\multiprocessing\context.py", line 322, in _Popen
return Popen(process_obj)
File "C:\Users\User\AppData\Local\Programs\Python\Python37-32\lib\multiprocessing\popen_spawn_win32.py", line 89, in __init__
reduction.dump(process_obj, to_child)
File "C:\Users\User\AppData\Local\Programs\Python\Python37-32\lib\multiprocessing\reduction.py", line 60, in dump
ForkingPickler(file, protocol).dump(obj)
ValueError: ctypes objects containing pointers cannot be pickled
C:\Users\User\Documents\BatterUp\Python>Traceback (most recent call last):
File "<string>", line 1, in <module>
File "C:\Users\User\AppData\Local\Programs\Python\Python37-32\lib\multiprocessing\spawn.py", line 99, in spawn_main
new_handle = reduction.steal_handle(parent_pid, pipe_handle)
File "C:\Users\User\AppData\Local\Programs\Python\Python37-32\lib\multiprocessing\reduction.py", line 87, in steal_handle
_winapi.DUPLICATE_SAME_ACCESS | _winapi.DUPLICATE_CLOSE_SOURCE)
PermissionError: [WinError 5] Access is denied
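This is the same Windows pickling problem as in the question above: test_serial is created in the main process and then passed to Process, so it has to be pickled for the spawned child, which fails. Following the approach from the answer above, a minimal sketch (untested, collapsed into a single file for brevity, keeping the same names) would pass only the port parameters and open the port inside the child; data could then be sent back through a multiprocessing.Queue passed the same way:

from multiprocessing import Process
import serial


class test:
    def test_method(self, port, baudrate, timeout):
        # The serial connection is opened here, inside the child process.
        rs232_connection = serial.Serial(
            port=port,
            baudrate=baudrate,
            parity=serial.PARITY_NONE,
            stopbits=serial.STOPBITS_ONE,
            bytesize=serial.EIGHTBITS,
            timeout=timeout)
        while True:
            print(rs232_connection.readline())


def main():
    test_instance = test()
    # Only picklable arguments (strings and numbers) cross the process boundary.
    reader = Process(target=test_instance.test_method, args=('COM9', 9600, 0.1))
    reader.daemon = True
    reader.start()
    reader.join()  # keep the main process alive; the real script would consume a Queue here


if __name__ == '__main__':
    main()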

Not able to implement Multiprocessing in Python (The parameter is incorrect)

EDIT
I am trying to run two functions simultaneously using multiprocessing.
import serial
import time
from multiprocessing import Process
import sys

sertx = serial.Serial('COM4', 115200)
serrx = serial.Serial('COM3', 115200)

rx_t = 0
tx_t = 0


def rx(serrx):
    global rx_t
    while True:
        print("hi")
        read_serial = serrx.readline()
        rx_t = time.time()
        print(read_serial)
        print('rx: ', rx_t)


def tx(sertx):
    print("started")
    global tx_t
    while True:
        msg = str(1)
        # print('sending: ', msg.encode())
        msgstat = 'A' + msg
        # print(msgstat)
        # print(type(msgstat))
        tx_t = time.time()
        sertx.write(msg.encode())
        print('tx: ', tx_t)


if __name__ == '__main__':
    p1 = Process(target=tx, args=(sertx,))
    p2 = Process(target=rx, args=(serrx,))
    p1.start()
    p2.start()
    p1.join()
    p2.join()
Error
Traceback (most recent call last):
File "c:/Users/ambuj/Documents/Python Scripts/wave.py", line 58, in <module>
p1.start()
File "C:\Users\ambuj\Anaconda3\lib\multiprocessing\process.py", line 112, in start
self._popen = self._Popen(self)
File "C:\Users\ambuj\Anaconda3\lib\multiprocessing\context.py", line 223, in _Popen
return _default_context.get_context().Process._Popen(process_obj)
File "C:\Users\ambuj\Anaconda3\lib\multiprocessing\context.py", line 322, in _Popen
return Popen(process_obj)
File "C:\Users\ambuj\Anaconda3\lib\multiprocessing\popen_spawn_win32.py", line 89, in __init__
reduction.dump(process_obj, to_child)
File "C:\Users\ambuj\Anaconda3\lib\multiprocessing\reduction.py", line 60, in dump
ForkingPickler(file, protocol).dump(obj)
ValueError: ctypes objects containing pointers cannot be pickled
PS C:\Users\ambuj\Documents\Python Scripts> Traceback (most recent call last):
File "<string>", line 1, in <module>
File "C:\Users\ambuj\Anaconda3\lib\multiprocessing\spawn.py", line 99, in spawn_main
new_handle = reduction.steal_handle(parent_pid, pipe_handle)
File "C:\Users\ambuj\Anaconda3\lib\multiprocessing\reduction.py", line 82, in steal_handle
_winapi.PROCESS_DUP_HANDLE, False, source_pid)
OSError: [WinError 87] The parameter is incorrect
Basically what I am doing/want to do
I have a transmitter that continuously transmits data and a receiver that continuously receives it. When I transmit data I note down the time, and when I receive data I note down its time. Both loops run continuously and I want them to run in parallel, which is why I am using multiprocessing.
Thanks
I'm fairly sure you do not need the multiprocessing module for this kind of operation; it is mainly used to speed up computations. I guess you want to read from one port and write to another in a single process, otherwise you could just write two separate programs and run them independently. For I/O work, the threading module is used.
I've never worked with the serial library, but I know how to create a process which listens for input in one thread and prints in another thread:
This is how my messenger program works.
I guess the code can work this way:
import serial
import time
from threading import Thread


def rx(_ser_rx):
    while True:
        print("hi")
        read_serial = _ser_rx.readline()
        rx_t = time.time()
        print(read_serial)
        print('rx: ', rx_t)


def tx(_ser_tx):
    print("started")
    while True:
        msg = "1"
        tx_t = time.time()
        _ser_tx.write(msg.encode())
        print('tx: ', tx_t)


if __name__ == '__main__':
    ser_tx = serial.Serial('COM4', 115200)
    ser_rx = serial.Serial('COM3', 115200)

    t1 = Thread(target=rx, args=(ser_rx,))
    t2 = Thread(target=tx, args=(ser_tx,))
    t1.start()
    t2.start()
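If multiprocessing really is required rather than threading, the same rule as in the first answer applies: do not open the ports at module level in the parent. Pass the port names to the children and let each child open its own serial.Serial. Note also that the global rx_t/tx_t variables would not be shared between processes anyway, since each process gets its own copy. A minimal sketch under those assumptions:

import time
from multiprocessing import Process

import serial


def rx(port, baudrate):
    ser_rx = serial.Serial(port, baudrate)  # opened inside the child process
    while True:
        read_serial = ser_rx.readline()
        print('rx: ', time.time(), read_serial)


def tx(port, baudrate):
    ser_tx = serial.Serial(port, baudrate)  # opened inside the child process
    while True:
        ser_tx.write(b'1')
        print('tx: ', time.time())


if __name__ == '__main__':
    p1 = Process(target=tx, args=('COM4', 115200))
    p2 = Process(target=rx, args=('COM3', 115200))
    p1.start()
    p2.start()
    p1.join()
    p2.join()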

python multiprocessing pickling/manager/misc error (from PMOTW)

I'm having some trouble getting the following code to run in Eclipse on Windows. The code is from Doug Hellmann:
import random
import multiprocessing
import time


class ActivePool:
    def __init__(self):
        super(ActivePool, self).__init__()
        self.mgr = multiprocessing.Manager()
        self.active = self.mgr.list()
        self.lock = multiprocessing.Lock()

    def makeActive(self, name):
        with self.lock:
            self.active.append(name)

    def makeInactive(self, name):
        with self.lock:
            self.active.remove(name)

    def __str__(self):
        with self.lock:
            return str(self.active)


def worker(s, pool):
    name = multiprocessing.current_process().name
    with s:
        pool.makeActive(name)
        print('Activating {} now running {}'.format(
            name, pool))
        time.sleep(random.random())
        pool.makeInactive(name)


if __name__ == '__main__':
    pool = ActivePool()
    s = multiprocessing.Semaphore(3)
    jobs = [
        multiprocessing.Process(
            target=worker,
            name=str(i),
            args=(s, pool),
        )
        for i in range(10)
    ]
    for j in jobs:
        j.start()
    for j in jobs:
        j.join()
    print('Now running: %s' % str(pool))
I get the following error, which I assume is due to some pickling issue with passing in pool as an argument to Process.
Traceback (most recent call last):
File "E:\Eclipse_Workspace\CodeExamples\FromCodes\CodeTest.py", line 50, in <module>
j.start()
File "C:\Users\Bob\AppData\Local\Programs\Python\Python36-32\lib\multiprocessing\process.py", line 105, in start
self._popen = self._Popen(self)
File "C:\Users\Bob\AppData\Local\Programs\Python\Python36-32\lib\multiprocessing\context.py", line 223, in _Popen
return _default_context.get_context().Process._Popen(process_obj)
File "C:\Users\Bob\AppData\Local\Programs\Python\Python36-32\lib\multiprocessing\context.py", line 322, in _Popen
return Popen(process_obj)
File "C:\Users\Bob\AppData\Local\Programs\Python\Python36-32\lib\multiprocessing\popen_spawn_win32.py", line 65, in __init__
reduction.dump(process_obj, to_child)
File "C:\Users\Bob\AppData\Local\Programs\Python\Python36-32\lib\multiprocessing\reduction.py", line 60, in dump
ForkingPickler(file, protocol).dump(obj)
File "C:\Users\Bob\AppData\Local\Programs\Python\Python36-32\lib\multiprocessing\connection.py", line 939, in reduce_pipe_connection
dh = reduction.DupHandle(conn.fileno(), access)
File "C:\Users\Bob\AppData\Local\Programs\Python\Python36-32\lib\multiprocessing\connection.py", line 170, in fileno
self._check_closed()
File "C:\Users\Bob\AppData\Local\Programs\Python\Python36-32\lib\multiprocessing\connection.py", line 136, in _check_closed
raise OSError("handle is closed")
OSError: handle is closed
Traceback (most recent call last):
File "<string>", line 1, in <module>
File "C:\Users\Bob\AppData\Local\Programs\Python\Python36-32\lib\multiprocessing\spawn.py", line 99, in spawn_main
new_handle = reduction.steal_handle(parent_pid, pipe_handle)
File "C:\Users\Bob\AppData\Local\Programs\Python\Python36-32\lib\multiprocessing\reduction.py", line 87, in steal_handle
_winapi.DUPLICATE_SAME_ACCESS | _winapi.DUPLICATE_CLOSE_SOURCE)
PermissionError: [WinError 5] Access is denied
A similar question's answer seems to suggest that I initialize pool with a function call at the top level, but I don't know how to apply that to this example. Do I initialize ActivePool in worker? That seems to defeat the spirit of Hellmann's example.
Another answer suggests using __getstate__ and __setstate__ to remove unpicklable objects and reconstruct them when unpickling, but I don't know a good way to do this with proxy objects like Manager, and I actually don't know what the unpicklable object is.
Is there any way I can make this example work with minimal changes? I really wish to understand what is going on under the hood. Thanks!
Edit - Problem Solved:
The pickling issue was pretty obvious in hindsight. The ActivePool's __init__ contained a Manager() object, which seems to be unpicklable. The code runs normally, as per Hellmann's example, if we remove self.mgr and initialize the list proxy in one line:
def __init__(self):
    super(ActivePool, self).__init__()
    self.active = multiprocessing.Manager().list()
    self.lock = multiprocessing.Lock()
Comment: The join() was in Hellmann's example, but I forgot to add it to the code snippet. Any other ideas?
I'm running Linux and it works as expected; Windows behaves differently. Read understanding-multiprocessing-shared-memory-management-locks-and-queues-in-pyt.
To determine which parameter of args=(s, pool) raises the error, remove one and use it as a global instead.
Change:
def worker(s):
    ...

args=(s,),
Note: There is no need to enclose a multiprocessing.Manager().list() with a Lock().
This is not the culprit of your error.
Question: Is there any way I can make this example work with minimal changes?
Your __main__ process terminates, therefore all started processes die at an unpredictable point of execution. Simply add a .join() at the end to make __main__ wait until all processes are done:
for j in jobs:
    j.join()

print('EXIT __main__')
Tested with Python: 3.4.2
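For completeness, here is the example with the OP's one-line change applied and the join loop in place; per the edit above this is reported to run normally on Windows (a sketch assembled from the pieces in this thread, not re-tested here):

import random
import multiprocessing
import time


class ActivePool:
    def __init__(self):
        super(ActivePool, self).__init__()
        # The Manager object itself is not picklable; keep only the list proxy.
        self.active = multiprocessing.Manager().list()
        self.lock = multiprocessing.Lock()

    def makeActive(self, name):
        with self.lock:
            self.active.append(name)

    def makeInactive(self, name):
        with self.lock:
            self.active.remove(name)

    def __str__(self):
        with self.lock:
            return str(self.active)


def worker(s, pool):
    name = multiprocessing.current_process().name
    with s:
        pool.makeActive(name)
        print('Activating {} now running {}'.format(name, pool))
        time.sleep(random.random())
        pool.makeInactive(name)


if __name__ == '__main__':
    pool = ActivePool()
    s = multiprocessing.Semaphore(3)
    jobs = [multiprocessing.Process(target=worker, name=str(i), args=(s, pool))
            for i in range(10)]
    for j in jobs:
        j.start()
    for j in jobs:
        j.join()
    print('Now running: %s' % str(pool))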

Python, AttributeError: RunCmd instance has no attribute 'p' for delta debug

I have a Python program that produces an error:
File "myTest.py", line 34, in run
self.output = self.p.stdout
AttributeError: RunCmd instance has no attribute 'p'
The Python code:
class RunCmd():
    def __init__(self, cmd):
        self.cmd = cmd

    def run(self, timeout):
        def target():
            self.p = sp.Popen(self.cmd[0], self.cmd[1], stdin=sp.PIPE,
                              stdout=sp.PIPE, stderr=sp.STDOUT)

        thread = threading.Thread(target=target)
        thread.start()
        thread.join(timeout)
        if thread.is_alive():
            print "process timed out"
            self.p.stdin.write("process timed out")
            self.p.terminate()
            thread.join()
        self.output = self.p.stdout  # self.p.stdout.read()?
        self.status = self.p.returncode

    def getOutput(self):
        return self.output

    def getStatus(self):
        return self.status
Here's the entire back trace.
Exception in thread Thread-1:
Traceback (most recent call last):
File "/usr/lib/python2.7/threading.py", line 552, in __bootstrap_inner
self.run()
File "/usr/lib/python2.7/threading.py", line 505, in run
self.__target(*self.__args, **self.__kwargs)
File "myTest.py", line 18, in target
self.p = sp.Popen(self.cmd, stdin=PIPE,
NameError: global name 'PIPE' is not defined
Traceback (most recent call last):
File "myTest.py", line 98, in <module>
c = mydd.ddmin(deltas) # Invoke DDMIN
File "/home/DD.py", line 713, in ddmin
return self.ddgen(c, 1, 0)
File "/home/DD.py", line 605, in ddgen
outcome = self._dd(c, n)
File "/home/DD.py", line 615, in _dd
assert self.test([]) == self.PASS
File "/home/DD.py", line 311, in test
outcome = self._test(c)
File "DD.py", line 59, in _test
test.run(3)
File "DD.py", line 30, in run
self.status = self.p.returncode
AttributeError: 'RunCmd' object has no attribute 'p'
What does this error mean and what is it trying to tell me?
You didn't give all the error messages. The code in the thread fails because your call to Popen is wrong; it should be:
def target():
    self.p = sp.Popen(self.cmd, stdin=sp.PIPE, stdout=sp.PIPE, stderr=sp.STDOUT)
Because the thread fails, it never sets the p attribute, which is why you're getting the error message you're asking about.
How to reproduce this error in Python very simply:
class RunCmd():
    def __init__(self):
        print(self.p)

r = RunCmd()
Prints:
AttributeError: 'RunCmd' object has no attribute 'p'
What's going on:
You have to learn to read and reason about the code you are dealing with. Verbalize the code like this:
I define a class called RunCmd. It has a constructor called __init__ that takes no parameters (besides self). The constructor prints out the instance attribute p.
I instantiate a new object (instance) of the RunCmd class. The constructor runs and tries to access the value of p. No such attribute p exists, so the error is raised.
The error message means exactly what it says: you need to create something before you can use it. If you don't, this AttributeError will be thrown.
Solutions:
Throw an error earlier on, when the variable has not been created.
Put the code in a try/except block so the program can stop cleanly when it has not been created.
Test whether the attribute exists before using it (see the sketch below).
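For the second and third options, a minimal sketch (using a hypothetical RunCmdLike class as a stand-in, since the real fix here is the corrected Popen call above):

class RunCmdLike(object):
    """Hypothetical stand-in: p is only set if setup() is called."""
    def setup(self):
        self.p = "some process handle"


obj = RunCmdLike()

# Option 2: wrap the access in try/except and stop cleanly.
try:
    handle = obj.p
except AttributeError:
    handle = None
    print("p was never created")

# Option 3: test whether the attribute exists before using it.
if hasattr(obj, "p"):
    print(obj.p)
else:
    print("p was never created")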

Cannot access Queue.Empty: "AttributeError: 'function' object has no attribute 'Empty'"

For some reason I can't access the Queue.Empty exception - what am I doing wrong here?
from multiprocessing import Process, Queue

# ...
try:
    action = action_queue.get(False)
    print "Action: " + action
except Queue.Empty:
    pass
The stack trace:
Traceback (most recent call last):
  File "C:\Program Files\Python27\lib\multiprocessing\process.py", line 258, in _bootstrap
    self.run()
  File "C:\Program Files\Python27\lib\multiprocessing\process.py", line 114, in run
    self._target(*self._args, **self._kwargs)
  File "D:\Development\populate.py", line 39, in permutate
    except Queue.Empty:
AttributeError: 'function' object has no attribute 'Empty'
The Queue.Empty exception lives in the Queue module, not on the multiprocessing queue (the Queue name you imported from multiprocessing is actually a factory function, which is why the error complains about a 'function' object). The multiprocessing module reuses the Empty exception class from the Queue module:
from multiprocessing import Queue
from Queue import Empty

q = Queue()
try:
    q.get(False)
except Empty:
    print "Queue was empty"
If you want to be very explicit and verbose, you can do this:
import multiprocessing
import Queue

q = multiprocessing.Queue()
try:
    q.get(False)
except Queue.Empty:
    print "Queue was empty"
Favoring the former approach is probably a better idea because there is only one Queue object to worry about and you don't have to wonder if you are working with the class or the module as in my second example.
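One extra note, going beyond the original Python 2 answer: in Python 3 the standard-library module was renamed from Queue to queue, so the first approach becomes:

from multiprocessing import Queue
from queue import Empty  # Python 3: the module is lowercase "queue"

q = Queue()
try:
    q.get(False)
except Empty:
    print("Queue was empty")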
