Checking on a thread started in a different class in Python

I'm curious how I can get the status of a thread that has been started in a separate class in Python.
So currently I have:
class VideoCapture:
    def record(self):
        Thread(name='uploading', target=self.upload, args=(upload_queue,)).start()
In a separate file, main.py, I have an instance of VideoCapture.
I want to be able to check the status of the thread "uploading" by typing something like VideoCapture.uploading.isAlive(). However, I get an error saying that VideoCapture has no attribute uploading. So how can I access it?

Store the thread as an attribute of the class; then you have a means of accessing it later on.
class VideoCapture:
    def __init__(self):
        self.uploading = None

    def record(self):
        self.uploading = Thread(name='uploading', target=self.upload, args=(upload_queue,))
        self.uploading.start()
Now somewhere else you have:
video_capture = VideoCapture()
video_capture.record()
if video_capture.uploading.is_alive():
    # do something

There's an is_alive() method on the Thread object, so basically you just need to use it:
class VideoCapture:
    def record(self):
        # don't forget to add this attribute in __init__
        self.uploading = Thread(name='uploading', target=self.upload, args=(upload_queue,))
        self.uploading.start()

tmp = VideoCapture()
tmp.record()
tmp.uploading.is_alive()  # here it is
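A minimal, self-contained sketch of the pattern, with a placeholder upload worker and queue standing in for whatever VideoCapture actually uploads:
import time
from queue import Queue
from threading import Thread

class VideoCapture:
    def __init__(self):
        self.uploading = None                 # will hold the Thread object

    def upload(self, upload_queue):           # placeholder upload worker
        while not upload_queue.empty():
            upload_queue.get()
            time.sleep(0.1)

    def record(self, upload_queue):
        # keep a reference to the Thread itself; Thread.start() returns None
        self.uploading = Thread(name='uploading', target=self.upload,
                                args=(upload_queue,))
        self.uploading.start()

q = Queue()
for i in range(5):
    q.put(i)

vc = VideoCapture()
vc.record(q)
print(vc.uploading.is_alive())   # True while the upload worker is running
vc.uploading.join()
print(vc.uploading.is_alive())   # False once it has finished
Note that the thread is reached through the instance (vc.uploading), not through the class itself.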

Related

Python multiprocessing.Pool.apply_async() not executing class function

In a custom class I have the following code:
class CustomClass():
    triggerQueue: multiprocessing.Queue

    def __init__(self):
        self.triggerQueue = multiprocessing.Queue()

    def poolFunc(queueString):
        print(queueString)

    def listenerFunc(self):
        pool = multiprocessing.Pool(5)
        while True:
            try:
                queueString = self.triggerQueue.get_nowait()
                pool.apply_async(func=self.poolFunc, args=(queueString,))
            except queue.Empty:
                break
What I intend to do is:
add a trigger to the queue (not implemented in this snippet) -> works as intended
run an endless loop within the listenerFunc that reads all triggers from the queue (if any are found) -> works as intended
pass the trigger to poolFunc, which is to be executed asynchronously -> not working
It works as soon as I move my poolFunc() outside of the class, like:
def poolFunc(queueString):
    print(queueString)

class CustomClass():
    [...]
But why is that so? Do I have to pass the self argument somehow? Is it impossible to perform it this way in general?
Thank you for any hint!
There are several problems going on here.
Your instance method, poolFunc, is missing a self parameter.
You are never properly terminating the Pool. You should take advantage of the fact that a multiprocessing.Pool object is a context manager.
You're calling apply_async, but you're never waiting for the results. Read the documentation: you need to call the get method on the AsyncResult object to receive the result; if you don't do this before your program exits your poolFunc function may never run.
By making the Queue object part of your class, you won't be able to pass instance methods to workers (the queue cannot be pickled, so neither can the instance that a bound method carries with it).
We can fix all of the above like this:
import multiprocessing
import queue

triggerQueue = multiprocessing.Queue()

class CustomClass:
    def poolFunc(self, queueString):
        print(queueString)

    def listenerFunc(self):
        results = []
        with multiprocessing.Pool(5) as pool:
            while True:
                try:
                    queueString = triggerQueue.get_nowait()
                    results.append(pool.apply_async(self.poolFunc, (queueString,)))
                except queue.Empty:
                    break
            for res in results:
                print(res.get())

c = CustomClass()
for i in range(10):
    triggerQueue.put(f"testval{i}")
c.listenerFunc()
You can, as you mention, also replace your instance method with a static method, in which case we can keep triggerQueue as part of the class:
import multiprocessing
import queue

class CustomClass:
    def __init__(self):
        self.triggerQueue = multiprocessing.Queue()

    @staticmethod
    def poolFunc(queueString):
        print(queueString)

    def listenerFunc(self):
        results = []
        with multiprocessing.Pool(5) as pool:
            while True:
                try:
                    queueString = self.triggerQueue.get_nowait()
                    results.append(pool.apply_async(self.poolFunc, (queueString,)))
                except queue.Empty:
                    break
            for r in results:
                print(r.get())

c = CustomClass()
for i in range(10):
    c.triggerQueue.put(f"testval{i}")
c.listenerFunc()
But we still need to reap the apply_async results.
Okay, I found an answer and a workaround:
The answer is based on the answer by noxdafox to this question.
Instance methods cannot be serialized that easily. What the pickle protocol does when serializing a function is simply turning it into a string.
For a child process it would be quite hard to find the right object your instance method is referring to, due to separate process address spaces.
A functioning workaround is to declare poolFunc() as a static method:
@staticmethod
def poolFunc(queueString):
    print(queueString)
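As a quick illustration of why the original version fails (a sketch using only the standard pickle module): pickling a bound method means pickling the instance it is bound to, and an instance that holds a multiprocessing.Queue cannot be pickled.
import multiprocessing
import pickle

class WithQueue:
    def __init__(self):
        self.q = multiprocessing.Queue()    # unpicklable attribute

    def work(self, msg):
        print(msg)

w = WithQueue()
try:
    # pickling the bound method drags the whole instance, including the Queue, with it
    pickle.dumps(w.work)
except Exception as exc:
    # typically a RuntimeError about Queue objects only being shareable through inheritance
    print(type(exc).__name__, exc)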

Instance sharing across classes

I have this Python script which has a listener class and a main class. A serial stream is created in the main class, and a listener instance is created in the main class. The whole purpose of the listener is to send a message on the serial port when a listened property has changed. The problem is that the listener doesn't have access to the output stream created in the main class; the listener does an abrupt return when trying to execute the outputStream.write statement. How can I give the listener access to the output stream?
import purejavacomm
import java.beans

class MyListener(java.beans.PropertyChangeListener):
    def propertyChange(self, event):
        if (< some property has changed >):
            self.outputStream.write(message)  # send notice on serial port
        return

class MainClass(jmri.jmrit.automat.AbstractAutomaton):
    def __init__(self):
        self.portID = purejavacomm.CommPortIdentifier.getPortIdentifier("COM3")
        self.port = self.portID.open("SerialCom", 50)
        self.outputStream = self.port.getOutputStream()
        return

    def init(self):
        myListener = MyListener()
        deviceList = devices.getNamedBeanSet()
        for device in deviceList:
            device.addPropertyChangeListener(myListener)
        return

a = MainClass()
a.start()
It seems that MyListener should have the stream passed in either through its constructor or through the callback (using a lambda expression).
The first, which is probably the cleanest way, would look like this:
def init(self):
    myListener = MyListener(self.outputStream)
The second version: device.addPropertyChangeListener(lambda x: myListener(x, self.outputStream)). Both need changes in the MyListener class, either to the constructor or to the definition of propertyChange.
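A sketch of the constructor-based MyListener, keeping the question's placeholder condition and message variable (nothing here beyond what the question already uses):
class MyListener(java.beans.PropertyChangeListener):
    def __init__(self, outputStream):
        self.outputStream = outputStream      # stream handed in by MainClass

    def propertyChange(self, event):
        if (< some property has changed >):
            self.outputStream.write(message)  # send notice on serial port
        return
MainClass.init() then creates it as MyListener(self.outputStream), as shown above.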
If you are set on sharing an instance, here are two options.
If you can modify the arguments of the MyListener constructor, you can pass an instance of MainClass to it:
def init(self):
    myListener = MyListener(self)
If not, a slightly more hacky way of doing this is to make MainClass a singleton (see example here), and then call its constructor within MyListener.
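A bare-bones sketch of the singleton route (simplified: it ignores the JMRI base class, and __init__ would still re-run on every call, so the port setup would need to be guarded):
class MainClass:
    _instance = None

    def __new__(cls, *args, **kwargs):
        # always hand back the same instance
        if cls._instance is None:
            cls._instance = super(MainClass, cls).__new__(cls)
        return cls._instance
With this in place, MyListener.propertyChange could reach the stream via MainClass().outputStream.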

Dump data from JSON file to a class variable and access data from outside the class

I'm trying to store some data from a JSON file into a class variable that can be accessed by any other functions outside the class. The data in the JSON file will always be different, so I'm using a thread.
This is an example of my code.
import json
import threading
import time

class fruitStand(threading.Thread):
    fruitAmount = []

    def __init__(self):
        threading.Thread.__init__(self)

    def run(self):
        with open("fruitDatafile.json", "r") as fruitDatafile:
            fruitData = json.load(fruitDatafile)
        fruitValue = [
            fruitData["apples"],
            fruitData["pears"],
            fruitData["watermelons"],
            fruitData["lemons"]]
        while True:
            self.fruitAmount.clear()
            for item in fruitValue:
                self.fruitAmount.append(item)
            time.sleep(3100)

fStand = fruitStand()
fStand.start()
print(fStand.fruitAmount)
The data seems to be stored correctly into the fruitAmount variable, but when I try to access it from outside the class, it shows as if there's nothing in it.
print(fStand.fruitAmount) is being run before the thread has finished populating the list, so it is empty. Try printing it inside the run method, or wait for the thread using the fStand.join() method.
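Since run() in this snippet loops forever, join() would block indefinitely; an alternative (not from the answer) is to have the thread signal when the first load has completed, sketched here with threading.Event and hard-coded data standing in for the JSON file:
import threading

class fruitStand(threading.Thread):
    def __init__(self):
        threading.Thread.__init__(self, daemon=True)
        self.fruitAmount = []
        self.loaded = threading.Event()     # set once the first load is done

    def run(self):
        # stand-in for reading fruitDatafile.json
        self.fruitAmount = [3, 5, 1, 7]
        self.loaded.set()

fStand = fruitStand()
fStand.start()
fStand.loaded.wait()            # block until run() has populated the data
print(fStand.fruitAmount)       # [3, 5, 1, 7]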

Passing arguments from one class to another class using threading python

I'm new to threading and Python. I would like to understand how to pass multiple arguments from one class to another class in Python using threading.
I'm using the main thread to start a class, Process; inside its run I'm doing some business logic and then starting another class, build, as a thread, passing multiple arguments.
The run of the build class is getting executed, but inside the build class I'm unable to access those arguments and hence not able to proceed further.
Not sure if my approach is right? Any suggestions will be appreciated.
Below is my main class:
from threading import Thread
import logging as log
from process import Process

if __name__ == '__main__':
    try:
        proc = Process()
        proc.start()
    except Exception as e:
        # log some error
        pass
Inside Process:
# all the dependencies are imported
class Process(Thread):
    '''
    classdocs
    '''
    def __init__(self):
        '''
        Constructor
        '''
        Thread.__init__(self)
        # other initializations

    def run(self):
        # some other logic
        self.notification(pass_some_data)

    # inside notification I'm calling another thread
    def notification(self, passed_data):
        # passed data is converted to dict1
        # tup1 is being formed from another function
        # build is a class, and if I don't pass None, I get a group name error
        th = build(None, (tup1,), (dict1,))
        th.start()
# inside build
class build(Thread):
    def _init_(self, tup1, dict1):
        super(build, self).__init__(self)
        self.tup1 = tup1
        self.dict1 = dict1

    def run(self):
        # some business logic
        # I'm unable to get the arguments being passed here
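For reference, a minimal sketch of how the build class can be written so the arguments are reachable in run(): __init__ needs double underscores, its signature has to match the call site, and Thread's __init__ must not be handed self (or a group argument) again:
from threading import Thread

class build(Thread):
    def __init__(self, tup1, dict1):     # double underscores, arguments matching the call
        super(build, self).__init__()    # no extra self, no group needed
        self.tup1 = tup1
        self.dict1 = dict1

    def run(self):
        # the arguments stored in __init__ are available here
        print(self.tup1, self.dict1)

th = build(('a', 'b'), {'key': 'value'})   # example values, just for illustration
th.start()
th.join()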

Methods on descriptors

I'm trying to implement a wrapper around a redis database that does some bookkeeping, and I thought about using descriptors. I have an object with a bunch of fields: frames, failures, etc., and I need to be able to get, set, and increment the field as needed. I've tried to implement an Int-Like descriptor:
class IntType(object):
    def __get__(self, instance, owner):
        # issue a GET database command
        return db.get(my_val)

    def __set__(self, instance, val):
        # issue a SET database command
        db.set(instance.name, val)

    def increment(self, instance, count):
        # issue an INCRBY database command
        db.hincrby(instance.name, count)

class Stream:
    _prefix = 'stream'
    frames = IntType()
    failures = IntType()
    uuid = StringType()

s = Stream()
s.frames.increment(1)  # 'float' object has no attribute 'increment'
It seems like I can't access the increment() method in my descriptor. I can't have increment defined on the object that __get__ returns, since that would require an additional db query when all I want to do is increment! I also don't want increment() on the Stream class, because later on, when I want to have additional fields like strings or sets in Stream, I'd need to type check the heck out of everything.
Does this work?
class Stream:
    _prefix = 'stream'

    def __init__(self):
        self.frames = IntType()
        self.failures = IntType()
        self.uuid = StringType()
Why not define the magic method __iadd__ as well as __get__ and __set__? This will allow you to do normal addition with assignment on the class. It will also mean you can treat the increment separately from the get function and thereby minimise the database accesses.
So change:
def increment(self, instance, count):
    # issue an INCRBY database command
    db.hincrby(instance.name, count)
to:
def __iadd__(self, other):
    # your code goes here
Try this:
class IntType(object):
    def __get__(self, instance, owner):
        class IntValue():
            def increment(self, count):
                # issue an INCRBY database command
                db.hincrby(instance.name, count)

            def getValue(self):
                # issue a GET database command
                return db.get(my_val)
        return IntValue()

    def __set__(self, instance, val):
        # issue a SET database command
        db.set(instance.name, val)
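A runnable sketch of this proxy idea, with a plain dict standing in for redis (the key scheme and the __set_name__ hook are additions for illustration, not from the answer): the proxy returned by __get__ carries the key, so increment() can go straight to an INCRBY-style update without a prior GET.
db = {}   # stand-in for the redis connection

class IntProxy:
    def __init__(self, key):
        self.key = key

    def increment(self, count):
        db[self.key] = db.get(self.key, 0) + count   # stand-in for INCRBY

    def value(self):
        return db.get(self.key, 0)                   # stand-in for GET

class IntType:
    def __set_name__(self, owner, name):
        self.name = name                             # e.g. 'frames'

    def __get__(self, instance, owner):
        return IntProxy(instance._prefix + ':' + self.name)

    def __set__(self, instance, val):
        db[instance._prefix + ':' + self.name] = val  # stand-in for SET

class Stream:
    _prefix = 'stream'
    frames = IntType()
    failures = IntType()

s = Stream()
s.frames = 10
s.frames.increment(1)
print(s.frames.value())   # 11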
