I am developing software as part of my work. As soon as I make a call to the API to fetch data, the GUI freezes. At first I understood the problem and moved the functions into threads, but once I used the join() function the app froze again.
What I would like is to wait at that point in the function until the threads finish and then continue from the same point. Is there any way to do this in Python?
threads = []
def call_api(self, query, index, return_dict):
thread = threading.Thread( target=self.get_data, args=(query, index, return_dict))
self.threads.append(thread)
thread.start()
def get_all_tickets(self, platform):
if platform == 'All':
self.call_api(query1, 0, return_dict)
self.call_api(query2, 1, return_dict)
for thread in self.threads:
thread.join()
# The app freezes here
# Is there a way to wait at this point asynchronously until the threads
# are complete and continue from that point without the GUI freezing?
One possible option would be to use the finished signal that QThread emits, which you can connect to a slot containing the remaining logic from your get_all_tickets method.
threads = []
def call_api(self, query, index, return_dict):
thread = QThread()
worker = Worker(query, index, return_dict)
worker.moveToThread(thread)
thread.started.connect(worker.run)
worker.finished.connect(thread.quit)
thread.finished.connect(self.continue_getting_all_tickets)
self.threads.append(thread)
thread.start()
def get_all_tickets(self, platform):
if platform == 'All':
self.call_api(query1, 0, return_dict)
self.call_api(query2, 1, return_dict)
def continue_getting_all_tickets(self):
# this will be called once for each and every thread created
# if you only want it to run once all the threads have completed
# you could do something like this:
if all([thread.isFinished() for thread in self.threads]):
# ... do something
The worker class could look something like this.
class Worker(QObject):
finished = Signal()
def __init__(self, query, index, return_dict):
super().__init__()
self.query = query
self.index = index
self.return_dict = return_dict
def run(self):
# this is where you would put `get_data` code
# once it is finished it should emit the finished signal
self.finished.emit()
Hopefully this will help you find the right direction.
Related
What's the proper way to tell a looping thread to stop looping?
I have a fairly simple program that pings a specified host in a separate threading.Thread class. In this class it sleeps for 60 seconds, then runs again until the application quits.
I'd like to implement a 'Stop' button in my wx.Frame to ask the looping thread to stop. It doesn't need to end the thread right away; it can just stop looping once it wakes up.
Here is my threading class (note: I haven't implemented looping yet, but it would likely fall under the run method in PingAssets)
class PingAssets(threading.Thread):
def __init__(self, threadNum, asset, window):
threading.Thread.__init__(self)
self.threadNum = threadNum
self.window = window
self.asset = asset
def run(self):
config = controller.getConfig()
fmt = config['timefmt']
start_time = datetime.now().strftime(fmt)
try:
if onlinecheck.check_status(self.asset):
status = "online"
else:
status = "offline"
except socket.gaierror:
status = "an invalid asset tag."
msg =("{}: {} is {}. \n".format(start_time, self.asset, status))
wx.CallAfter(self.window.Logger, msg)
And in my wxPython Frame I have this function called from a Start button:
def CheckAsset(self, asset):
self.count += 1
thread = PingAssets(self.count, asset, self)
self.threads.append(thread)
thread.start()
Threaded stoppable function
Instead of subclassing threading.Thread, one can modify the function to allow
stopping via a flag.
We need an object, accessible to the running function, on which we set the flag telling it to stop running.
We can use the threading.current_thread() object.
import threading
import time
def doit(arg):
t = threading.current_thread()
while getattr(t, "do_run", True):
print ("working on %s" % arg)
time.sleep(1)
print("Stopping as you wish.")
def main():
t = threading.Thread(target=doit, args=("task",))
t.start()
time.sleep(5)
t.do_run = False
if __name__ == "__main__":
main()
The trick is that additional attributes can be attached to a running thread. The solution builds
on these assumptions:
the thread has a property "do_run" with a default value of True
the driving parent thread can set the started thread's "do_run" property to False.
Running the code, we get the following output:
$ python stopthread.py
working on task
working on task
working on task
working on task
working on task
Stopping as you wish.
Pill to kill - using Event
Another alternative is to pass a threading.Event as a function argument. It is
False by default, but an external thread can "set it" (to True), and the function can
learn about it using the wait(timeout) method.
We can wait with a zero timeout, but we can also use it as the sleeping timer (as done below).
def doit(stop_event, arg):
while not stop_event.wait(1):
print ("working on %s" % arg)
print("Stopping as you wish.")
def main():
pill2kill = threading.Event()
t = threading.Thread(target=doit, args=(pill2kill, "task"))
t.start()
time.sleep(5)
pill2kill.set()
t.join()
Edit: I tried this in Python 3.6. Calling stop_event.wait() with no timeout blocks until the event is set (and so does the while loop); it is wait(timeout), as used above, that returns a boolean: True once the event is set, False otherwise. Checking stop_event.is_set() explicitly works as well.
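For completeness, here is a minimal sketch (not from the original answer) of the doit function using the explicit is_set() check plus a separate sleep, driven by the same main() as above:

import time

def doit(stop_event, arg):
    # explicit flag check, with the sleep kept separate from the event
    while not stop_event.is_set():
        print("working on %s" % arg)
        time.sleep(1)
    print("Stopping as you wish.")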
Stopping multiple threads with one pill
The advantage of the pill to kill is easier to see if we have to stop multiple threads
at once, as one pill works for all of them.
The doit function will not change at all; only main handles the threads a bit differently.
def main():
pill2kill = threading.Event()
tasks = ["task ONE", "task TWO", "task THREE"]
def thread_gen(pill2kill, tasks):
for task in tasks:
t = threading.Thread(target=doit, args=(pill2kill, task))
yield t
threads = list(thread_gen(pill2kill, tasks))
for thread in threads:
thread.start()
time.sleep(5)
pill2kill.set()
for thread in threads:
thread.join()
This has been asked before on Stack Overflow. See the following links:
Is there any way to kill a Thread in Python?
Stopping a thread after a certain amount of time
Basically you just need to set up the thread with a stop function that sets a sentinel value that the thread will check. In your case, you'll have something in your loop check the sentinel value to see if it's changed, and if it has, the loop can break and the thread can die.
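As a rough sketch of that sentinel idea applied to a ping-style loop (the stop_requested attribute and the 60-second sleep are illustrative, not from the original code):

import threading
import time

class PingLoop(threading.Thread):
    def __init__(self):
        super().__init__()
        self.stop_requested = False      # the sentinel the loop checks

    def stop(self):
        self.stop_requested = True       # ask the loop to finish when it wakes up

    def run(self):
        while not self.stop_requested:
            # ping the asset here, then sleep until the next round
            time.sleep(60)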
I read the other questions on Stack Overflow but I was still a little confused about communicating across classes. Here is how I approached it:
I use a list to hold all my threads in the __init__ method of my wxFrame class: self.threads = []
As recommended in How to stop a looping thread in Python? I use a signal in my thread class which is set to True when initializing the threading class.
class PingAssets(threading.Thread):
def __init__(self, threadNum, asset, window):
threading.Thread.__init__(self)
self.threadNum = threadNum
self.window = window
self.asset = asset
self.signal = True
def run(self):
while self.signal:
do_stuff()
sleep()
and I can stop these threads by iterating over my threads:
def OnStop(self, e):
for t in self.threads:
t.signal = False
I took a different approach. I subclassed Thread and created an Event object in the constructor. Then I wrote a custom join() method, which first sets this event and then calls the parent's version of itself.
Here is the class I'm using for serial port communication in a wxPython app:
import wx, threading, serial, Events, queue
class PumpThread(threading.Thread):
def __init__ (self, port, queue, parent):
super(PumpThread, self).__init__()
self.port = port
self.queue = queue
self.parent = parent
self.serial = serial.Serial()
self.serial.port = self.port
self.serial.timeout = 0.5
self.serial.baudrate = 9600
self.serial.parity = 'N'
self.stopRequest = threading.Event()
def run (self):
try:
self.serial.open()
except Exception as ex:
print("[ERROR]\tUnable to open port {}".format(self.port))
print("[ERROR]\t{}".format(ex))
self.stopRequest.set()
else:
print ("[INFO]\tListening port {}".format(self.port))
self.serial.write("FLOW?\r")
while not self.stopRequest.is_set():
msg = ''
if not self.queue.empty():
try:
command = self.queue.get()
self.serial.write(command)
except queue.Empty:
continue
while self.serial.inWaiting():
char = self.serial.read(1)
if '\r' in char and len(msg) > 1:
char = ''
#~ print('[DATA]\t{}'.format(msg))
event = Events.PumpDataEvent(Events.SERIALRX, wx.ID_ANY, msg)
wx.PostEvent(self.parent, event)
msg = ''
break
msg += char
self.serial.close()
def join (self, timeout=None):
self.stopRequest.set()
super(PumpThread, self).join(timeout)
def SetPort (self, serial):
self.serial = serial
def Write (self, msg):
if self.serial.is_open:
self.queue.put(msg)
else:
print("[ERROR]\tPort {} is not open!".format(self.port))
def Stop(self):
if self.is_alive():
self.join()
The Queue is used for sending messages to the port, and the main loop takes responses back. I didn't use the serial.readline() method because of a different end-of-line character, and I found the usage of the io classes to be too much fuss.
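For context, here is a hedged sketch of how this PumpThread might be wired up from a wx.Frame; the port name, command, and handler names are placeholders, and event binding is omitted:

import queue
import wx

class MainFrame(wx.Frame):
    def __init__(self):
        super(MainFrame, self).__init__(None, title="Pump control")
        self.commands = queue.Queue()
        # parent=self so the worker can wx.PostEvent() data back to this frame
        self.pump = PumpThread("COM3", self.commands, self)
        self.pump.start()

    def OnSend(self, event):
        self.pump.Write("FLOW?\r")   # queued; the worker loop writes it to the port

    def OnClose(self, event):
        self.pump.Stop()             # sets the stop event and joins the thread
        self.Destroy()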
Depends on what you run in that thread.
If that's your code, then you can implement a stop condition (see other answers).
However, if what you want is to run someone else's code, then you should fork and start a process. Like this:
import multiprocessing
proc = multiprocessing.Process(target=your_proc_function, args=())
proc.start()
Now, whenever you want to stop that process, send it a SIGTERM like this:
proc.terminate()
proc.join()
And it's not slow: fractions of a second.
Enjoy :)
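A self-contained sketch of that pattern, with busy_work standing in for code you cannot modify:

import multiprocessing
import time

def busy_work():
    # stands in for third-party code with no cooperative stop condition
    while True:
        time.sleep(0.1)

if __name__ == "__main__":
    proc = multiprocessing.Process(target=busy_work)
    proc.start()
    time.sleep(1)
    proc.terminate()                     # sends SIGTERM on POSIX
    proc.join()
    print("exit code:", proc.exitcode)   # negative of the signal number, e.g. -15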
My solution is:
import threading, time
def a():
t = threading.current_thread()
while getattr(t, "do_run", True):
print('Do something')
time.sleep(1)
def getThreadByName(name):
threads = threading.enumerate() #Threads list
for thread in threads:
if thread.name == name:
return thread
threading.Thread(target=a, name='228').start() #Init thread
t = getThreadByName('228') #Get thread by name
time.sleep(5)
t.do_run = False #Signal to stop thread
t.join()
I find it useful to have a class, derived from threading.Thread, to encapsulate my thread functionality. You simply provide your own main loop in an overridden version of run() in this class. Calling start() arranges for the object’s run() method to be invoked in a separate thread.
Inside the main loop, periodically check whether a threading.Event has been set. Such an event is thread-safe.
Inside this class, you have your own join() method that sets the stop event object before calling the join() method of the base class. It can optionally take a time value to pass to the base class's join() method to ensure your thread is terminated in a short amount of time.
import threading
import time
class MyThread(threading.Thread):
def __init__(self, sleep_time=0.1):
self._stop_event = threading.Event()
self._sleep_time = sleep_time
"""call base class constructor"""
super().__init__()
def run(self):
"""main control loop"""
while not self._stop_event.is_set():
#do work
print("hi")
self._stop_event.wait(self._sleep_time)
def join(self, timeout=None):
"""set stop event and join within a given time period"""
self._stop_event.set()
super().join(timeout)
if __name__ == "__main__":
t = MyThread()
t.start()
time.sleep(5)
t.join(1) #wait 1s max
Having a small sleep inside the main loop before checking the threading.Event is less CPU intensive than looping continuously. You can have a default sleep time (e.g. 0.1s), but you can also pass the value in the constructor.
Sometimes you don't have control over the running target. In those cases you can use signal.pthread_kill to send a stop signal.
from signal import pthread_kill, SIGTSTP
from threading import Thread
from itertools import count
from time import sleep
def target():
for num in count():
print(num)
sleep(1)
thread = Thread(target=target)
thread.start()
sleep(5)
pthread_kill(thread.ident, SIGTSTP)
result
0
1
2
3
4
[14]+ Stopped
I created a small Flask app to download images and text from pages. This can take a very long time, so
I would like to execute my requests in parallel. I create threaded tasks, and I would like these tasks to be able to download text or images from sites. I keep my tasks in a list of workers.
However, I would like to select which method the thread will execute and then start the thread.
How can I pass my method to the thread's run() method? Will this be a daemon sub-thread?
import threading
import time
workers = []
class SavePage:
def get_text(self):
print("Getting text")
def get_images(self):
print("Getting images")
class Task(threading.Thread):
def __init__(self):
super().__init__()
self.save_page = SavePage()
def get_text_from_page(self):
self.save_page.get_text()
def get_images_from_page(self):
self.save_page.get_images()
if __name__ == '__main__':
task = Task()
task.get_images_from_page() # Why this executes, when I didn't put task.start() ?
# Moreover, is this really threaded? or just uses a method from class Task?
workers.append(task) # I want this list to be empty, after job is finished
print("".join(str(worker.is_alive()) for worker in workers)) #
print(workers)
task.get_images_from_page() # Why this executes, when I didn't put task.start() ?
# Moreover, is this really threaded? or just uses a method from class Task?
It's not threaded. It's just a normal method call in the main thread.
Thread.start is the method that runs the Thread.run function inside another thread.
You could set some state in __init__ to choose which function to execute:
class Task(threading.Thread):
def __init__(self, action):
super().__init__()
self.save_page = SavePage()
self.action = action
def get_text_from_page(self):
self.save_page.get_text()
def get_images_from_page(self):
self.save_page.get_images()
def run(self):
if self.action == "text":
self.get_text_from_page()
elif self.action == "images":
self.get_images_from_page()
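A short usage sketch for this Task class (illustrative only), which also shows one way to keep the workers list clean once jobs have finished:

if __name__ == '__main__':
    task = Task("images")
    task.start()                 # now Task.run (and get_images_from_page) runs in a new thread
    workers.append(task)

    task.join()                  # or check task.is_alive() periodically instead
    workers = [t for t in workers if t.is_alive()]
    print(workers)               # [] once the job has finished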
Keep in mind that threads can be run in a simpler way by passing a target function:
def target_func():
save_page = SavePage()
save_page.get_images()
t = threading.Thread(target=target_func)
t.start()
# or in this simple case:
save_page = SavePage()
t = threading.Thread(target=save_page.get_images)
t.start()
We have an application that executes different queries. It starts up to four threads, and runs the extractions on them.
That part looks like this:
if len(self.threads) == 4:
self.__maxThreadsMsg(base)
return False
else:
self.threads.append(Extractor(self.ui, base))
self.threads[-1].start()
self.__extractionMsg(base)
return True
Our Extractor class inherits QThread:
class Extractor(QThread):
def __init__(self, ui, base):
QThread.__init__(self)
self.ui = ui
self.base = base
def run(self):
self.run_base(self.base)
and self.ui is set to Ui_MainWindow():
class Cont(QMainWindow):
def __init__(self, parent=None):
QWidget.__init__(self,parent)
self.ui = Ui_MainWindow()
self.ui.setupUi(self)
There is a specific base that sends data to the user (back to the main window) before proceeding (in this case, a pop-up with two buttons):
#This code is in the main file inside a method, not in the Extractor class
msg_box = QMessageBox()
msg_box.setText('Quantity in base: {}'.format(n))
msg_box.setInformativeText('Would you like to continue?')
msg_box.setStandardButtons(QMessageBox.Ok | QMessageBox.Cancel)
signal = msg_box.exec_()
How can I pause the thread at a certain point, display the window (which I believe would be returning to the main thread) and return to the worker thread, passing the button clicked event?
I read a bit about signals but it seems confusing as it is my first time dealing with threads.
Edit: After reading this question: Similar question, I altered the code to this:
In a method inside the Cont class:
thread = QThread(self)
worker = Worker()
worker.moveToThread(thread)
worker.bv.connect(self.bv_test)
thread.started.connect(worker.process()) # This, unlike in the linked question,
# doesn't work if I remove the parentheses of the process function.
# If I remove them, nothing happens and I get QThread: "Destroyed while thread is still running"
thread.start()
@pyqtSlot(int)
def bv_test(self, n):
k = QMessageBox()
k.setText('Quantity: {}'.format(n))
k.setStandardButtons(QMessageBox.Yes | QMessageBox.No)
ret = k.exec_()
return ret
and this is the Worker class:
class Worker(QObject):
#Signals
bv = pyqtSignal(int)
def process(self):
self.bv.emit(99)
Now I just need to figure out how to send the ret value back to the worker thread so it starts the second process. I also keep getting this error:
TypeError: connect() slot argument should be a callable or a signal, not 'NoneType'
Below is a simple demo based on the code in your question which does what you want. There is not much to say about it, really, other than that you need to communicate between the worker and the main thread via signals (in both directions). The finished signal is used to quit the thread, which stops the warning QThread: "Destroyed while thread is still running" from being shown.
The reason why you are seeing the error:
TypeError: connect() slot argument should be a callable or a signal, not 'NoneType'
is because you are trying to connect a signal with the return value of a function (which is None), rather than the function object itself. You must always pass a python callable object to the connect method - anything else will raise a TypeError.
Please run the script below and confirm that it works as expected. Hopefully it should be easy to see how to adapt it to work with your real code.
from PyQt4.QtCore import *
from PyQt4.QtGui import *
class Cont(QWidget):
confirmed = pyqtSignal()
def __init__(self):
super(Cont, self).__init__()
self.thread = QThread()
self.worker = Worker()
self.worker.moveToThread(self.thread)
self.worker.bv.connect(self.bv_test)
self.worker.finished.connect(self.thread.quit)
self.confirmed.connect(self.worker.process_two)
self.thread.started.connect(self.worker.process_one)
self.thread.start()
def bv_test(self, n):
k = QMessageBox(self)
k.setAttribute(Qt.WA_DeleteOnClose)
k.setText('Quantity: {}'.format(n))
k.setStandardButtons(QMessageBox.Yes | QMessageBox.No)
if k.exec_() == QMessageBox.Yes:
self.confirmed.emit()
else:
self.thread.quit()
class Worker(QObject):
bv = pyqtSignal(int)
finished = pyqtSignal()
def process_two(self):
print('process: two: started')
QThread.sleep(1)
print('process: two: finished')
self.finished.emit()
def process_one(self):
print('process: one: started')
QThread.sleep(1)
self.bv.emit(99)
print('process: one: finished')
app = QApplication([''])
win = Cont()
win.setGeometry(100, 100, 100, 100)
win.show()
app.exec_()
If you want the thread to wait for the action, connect to a signal from the thread using
PyQt4.QtCore.Qt.BlockingQueuedConnection
as the connection type.
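Here is a minimal, hedged sketch of that idea (PyQt4 assumed; the class and signal names are invented for illustration): the worker emits a signal connected with Qt.BlockingQueuedConnection, so the emit call blocks until the main-thread slot has returned, and the reply comes back through a mutable holder.

from PyQt4.QtCore import QObject, QThread, Qt, pyqtSignal
from PyQt4.QtGui import QApplication, QMessageBox, QWidget

class Worker(QObject):
    askUser = pyqtSignal(object)          # payload: a dict the slot fills in

    def process(self):
        holder = {}
        self.askUser.emit(holder)         # blocks here until the GUI slot returns
        print('user answered:', holder.get('answer'))

class Window(QWidget):
    def __init__(self):
        super(Window, self).__init__()
        self.thread = QThread(self)
        self.worker = Worker()
        self.worker.moveToThread(self.thread)
        # BlockingQueuedConnection: the emitting (worker) thread waits for the slot
        self.worker.askUser.connect(self.on_ask_user, Qt.BlockingQueuedConnection)
        self.thread.started.connect(self.worker.process)
        self.thread.start()

    def on_ask_user(self, holder):
        ret = QMessageBox.question(self, 'Continue?', 'Would you like to continue?',
                                   QMessageBox.Yes | QMessageBox.No)
        holder['answer'] = (ret == QMessageBox.Yes)

    def closeEvent(self, event):
        self.thread.quit()
        self.thread.wait()
        super(Window, self).closeEvent(event)

app = QApplication([])
win = Window()
win.show()
app.exec_()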
That said, I do not understand why you need threading if you make the threads wait, which brings in a lot of complexity. To me, the better solution would be to cut the task you want to perform in the threads into smaller pieces. Each time a piece is ready, you can ask whether the user wants the next one too.
I was looking for a good example of managing a worker process from a Qt GUI created in Python. I need this to be as complete as possible, including reporting progress from the process, aborting the process, and handling possible errors coming from the process.
I only found some semi-finished examples which did part of the work, and when I tried to make them complete I failed. My current design has three layers:
1) there is the main thread, in which the GUI and the ProcessScheduler reside; the scheduler ensures that only one instance of the worker process is running and can abort it
2) there is another thread, in which the ProcessObserver actually runs the process and interprets what comes from the queue (which is used for inter-process communication); this must be a non-GUI thread to keep the GUI responsive
3) there is the actual worker process, which executes a given piece of code (my future intention is to replace multiprocessing with multiprocess or pathos or something else that can pickle function objects, but this is not my current issue) and reports progress or results to the queue
Currently I have this snippet (the print functions in the code are just for debugging and will be deleted eventually):
import multiprocessing
from PySide import QtCore, QtGui
QtWidgets = QtGui
N = 10000000
# I would like this to be a function object
# but multiprocessing cannot pickle it :(
# so I will use multiprocess in the future
CODE = """
# calculates sum of numbers from 0 to n-1
# reports percent progress of finished work
sum = 0
progress = -1
for i in range(n):
sum += i
p = i * 100 // n
if p > progress:
queue.put(["progress", p])
progress = p
queue.put(["result", sum])
"""
class EvalProcess(multiprocessing.Process):
def __init__(self, code, symbols):
super(EvalProcess, self).__init__()
self.code = code
self.symbols = symbols # symbols must contain 'queue'
def run(self):
print("EvalProcess started")
exec(self.code, self.symbols)
print("EvalProcess finished")
class ProcessObserver(QtCore.QObject):
"""Resides in worker thread. Its role is to understand
to what is received from the process via the queue."""
progressChanged = QtCore.Signal(float)
finished = QtCore.Signal(object)
def __init__(self, process, queue):
super(ProcessObserver, self).__init__()
self.process = process
self.queue = queue
def run(self):
print("ProcessObserver started")
self.process.start()
try:
while True:
# this loop keeps running and listening to the queue
# even if the process is aborted
result = self.queue.get()
print("received from queue:", result)
if result[0] == "progress":
self.progressChanged.emit(result[1])
elif result[0] == "result":
self.finished.emit(result[1])
break
except Exception as e:
print(e) # QUESTION: WHAT HAPPENS WHEN THE PROCESS FAILS?
self.process.join() # QUESTION: DO I NEED THIS LINE?
print("ProcessObserver finished")
class ProcessScheduler(QtCore.QObject):
"""Resides in the main thread."""
sendText = QtCore.Signal(str)
def __init__(self):
super(ProcessScheduler, self).__init__()
self.observer = None
self.thread = None
self.process = None
self.queue = None
def start(self):
if self.process: # Q: IS THIS OK?
# should kill current process and start a new one
self.abort()
self.queue = multiprocessing.Queue()
self.process = EvalProcess(CODE, {"n": N, "queue": self.queue})
self.thread = QtCore.QThread()
self.observer = ProcessObserver(self.process, self.queue)
self.observer.moveToThread(self.thread)
self.observer.progressChanged.connect(self.onProgressChanged)
self.observer.finished.connect(self.onResultReceived)
self.thread.started.connect(self.observer.run)
self.thread.finished.connect(self.onThreadFinished)
self.thread.start()
self.sendText.emit("Calculation started")
def abort(self):
self.process.terminate()
self.sendText.emit("Aborted.")
self.onThreadFinished()
def onProgressChanged(self, percent):
self.sendText.emit("Progress={}%".format(percent))
def onResultReceived(self, result):
print("onResultReceived called")
self.sendText.emit("Result={}".format(result))
self.thread.quit()
def onThreadFinished(self):
print("onThreadFinished called")
self.thread.deleteLater() # QUESTION: DO I NEED THIS LINE?
self.thread = None
self.observer = None
self.process = None
self.queue = None
if __name__ == '__main__':
app = QtWidgets.QApplication([])
scheduler = ProcessScheduler()
window = QtWidgets.QWidget()
layout = QtWidgets.QVBoxLayout(window)
startButton = QtWidgets.QPushButton("sum(range({}))".format(N))
startButton.pressed.connect(scheduler.start)
layout.addWidget(startButton)
abortButton = QtWidgets.QPushButton("Abort")
abortButton.pressed.connect(scheduler.abort)
layout.addWidget(abortButton)
console = QtWidgets.QPlainTextEdit()
scheduler.sendText.connect(console.appendPlainText)
layout.addWidget(console)
window.show()
app.exec_()
It works kind of OK, but it still lacks proper error handling and aborting of the process. In particular I am now struggling with aborting. The main problem is that the worker thread keeps running (in the loop listening to the queue) even if the process has been aborted/terminated in the middle of the calculation (or at least it prints the error QThread: Destroyed while thread is still running in the console). Is there a way to solve this? Or any alternative approach? Or, if possible, any real-life and complete example of such a task fulfilling all the requirements mentioned above? Any comment would be much appreciated.
I have a thread which extends Thread. The code looks a little like this;
class MyThread(Thread):
def run(self):
# Do stuff
my_threads = []
while has_jobs() and len(my_threads) < 5:
new_thread = MyThread(next_job_details())
new_thread.start()
my_threads.append(new_thread)
for my_thread in my_threads:
my_thread.join()
# Do stuff
So here in my pseudocode I check to see if there are any jobs (like from a db, etc.), and if there are, and fewer than 5 threads are running, I create new threads.
From here I then check over my threads, and this is where I get stuck. I can use .join(), but my understanding is that it waits until that thread is finished, so if the first thread it checks is still in progress, it waits until it's done, even if the other threads are already finished.
So is there a way to check if a thread is done, and then remove it if so?
eg
for my_thread in my_threads:
if my_thread.done():
# process results
del (my_threads[my_thread]) ?? will that work...
As TokenMacGuy says, you should use thread.is_alive() to check if a thread is still running. To remove no longer running threads from your list you can use a list comprehension:
for t in my_threads:
if not t.is_alive():
# get results from thread
t.handled = True
my_threads = [t for t in my_threads if not t.handled]
This avoids the problem of removing items from a list while iterating over it.
mythreads = threading.enumerate()
Enumerate returns a list of all Thread objects still alive.
https://docs.python.org/3.6/library/threading.html
You need to call thread.is_alive() to find out whether the thread is still running (isAlive() is the older, deprecated spelling).
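For example, a trivial sketch:

import threading
import time

def job():
    time.sleep(2)

t = threading.Thread(target=job)
t.start()
while t.is_alive():          # True until job() returns
    print("still working...")
    time.sleep(0.5)
print("done")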
The answer has been covered, but for simplicity...
# To filter out finished threads
threads = [t for t in threads if t.is_alive()]
# Same thing but for QThreads (if you are using PyQt)
threads = [t for t in threads if t.isRunning()]
A better way is to use the Queue class:
http://docs.python.org/library/queue.html
Look at the good example code at the bottom of the documentation page:
from queue import Queue
from threading import Thread

def worker():
while True:
item = q.get()
do_work(item)
q.task_done()
q = Queue()
for i in range(num_worker_threads):
t = Thread(target=worker)
t.daemon = True
t.start()
for item in source():
q.put(item)
q.join() # block until all tasks are done
An easy, thread-safe solution for checking whether a thread has finished:
Install pyrvsignal:
pip install pyrvsignal
Example:
import time
from threading import Thread
from pyrvsignal import Signal
class MyThread(Thread):
started = Signal()
finished = Signal()
def __init__(self, target, args):
self.target = target
self.args = args
Thread.__init__(self)
def run(self) -> None:
self.started.emit()
self.target(*self.args)
self.finished.emit()
def do_my_work(details):
print(f"Doing work: {details}")
time.sleep(10)
def started_work():
print("Started work")
def finished_work():
print("Work finished")
thread = MyThread(target=do_my_work, args=("testing",))
thread.started.connect(started_work)
thread.finished.connect(finished_work)
thread.start()