Thread error in Python & PyQt

I noticed that when the function setModel is executed in a parallel thread (I tried threading.Timer and threading.Thread), I get this:
QObject: Cannot create children for a parent that is in a different thread.
(Parent is QHeaderView(0x1c93ed0), parent's thread is QThread(0xb179c0), current thread is QThread(0x23dce38)
QObject::startTimer: timers cannot be started from another thread
QObject: Cannot create children for a parent that is in a different thread.
(Parent is QTreeView(0xc65060), parent's thread is QThread(0xb179c0), current thread is QThread(0x23dce38)
QObject::startTimer: timers cannot be started from another thread
Is there any way to solve this?

It is indeed a fact of life that multithreaded use of Qt (and other rich frameworks) is a delicate and difficult job, requiring explicit attention and care -- see Qt's docs for excellent coverage of the subject (aimed at readers experienced in threading in general, with suggested readings for those who aren't yet).
If you possibly can, I would suggest what I always suggest as the soundest architecture for threading in Python: let each subsystem be owned and used by a single dedicated thread, and communicate among threads via instances of Queue.Queue, i.e., by message passing. This approach can be a bit restrictive, but it provides a solid foundation on which specifically identified and carefully architected exceptions can be built (based on thread pools, the occasional newly spawned thread, locks, condition variables, and other such finicky things ;-). In the latter category I would also place Qt-specific mechanisms such as cross-thread signal/slot communication via queued connections.
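As a concrete illustration of that message-passing architecture, here is a minimal sketch, assuming PyQt5 and Python 3 (where the module is named queue rather than Queue); the worker function and widget names are mine, not from the question. The main thread owns every Qt object and drains a queue with a QTimer, while the worker thread only ever puts plain data onto the queue.

    import queue              # named Queue in Python 2
    import threading
    from PyQt5 import QtCore, QtGui, QtWidgets

    results = queue.Queue()

    def worker():
        # Heavy, non-GUI work happens here; only plain data crosses the boundary.
        rows = ["row %d" % i for i in range(1000)]
        results.put(rows)

    class Window(QtWidgets.QMainWindow):
        def __init__(self):
            super().__init__()
            self.view = QtWidgets.QTreeView(self)
            self.setCentralWidget(self.view)
            # Poll the queue from the GUI thread; every widget/model call stays here.
            self.timer = QtCore.QTimer(self, interval=100)
            self.timer.timeout.connect(self.drain_queue)
            self.timer.start()
            threading.Thread(target=worker, daemon=True).start()

        def drain_queue(self):
            try:
                rows = results.get_nowait()
            except queue.Empty:
                return
            model = QtGui.QStandardItemModel(self)
            for text in rows:
                model.appendRow(QtGui.QStandardItem(text))
            self.view.setModel(model)    # safe: runs in the GUI thread

    app = QtWidgets.QApplication([])
    win = Window()
    win.show()
    app.exec_()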

Looks like you've stumbled on a Qt limitation there. Try using signals or events if you need objects to communicate across threads.
Or ask the Qt folk about this. It doesn't seem specific to PyQt.
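For the signal route, a minimal sketch (assuming PyQt5-style new signals; the Loader class and its names are illustrative, not part of any library): the worker thread emits a signal carrying plain data, Qt delivers it as a queued call, and the connected slot builds and sets the model in the GUI thread.

    import threading
    from PyQt5 import QtCore, QtGui, QtWidgets

    class Loader(QtCore.QObject):
        # Emitted from the worker thread; delivered queued in the GUI thread,
        # because this Loader instance lives in the GUI thread.
        loaded = QtCore.pyqtSignal(list)

        def start(self):
            threading.Thread(target=self._work, daemon=True).start()

        def _work(self):
            data = ["item %d" % i for i in range(1000)]   # no Qt calls here
            self.loaded.emit(data)

    app = QtWidgets.QApplication([])
    view = QtWidgets.QTreeView()

    def set_model(items):
        model = QtGui.QStandardItemModel()
        for text in items:
            model.appendRow(QtGui.QStandardItem(text))
        view.setModel(model)              # runs in the GUI thread

    loader = Loader()
    loader.loaded.connect(set_model)      # cross-thread: queued automatically
    loader.start()
    view.show()
    app.exec_()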

Related

Compatibility Between PyPubSub and PyQt

I have been trying to find the most elegant way to decouple my programs from the GUI, such that I can change my front-end without needing to re-write a whole lot of code.
I work with threads a lot, so I often have the need to notify the main GUI thread of asynchronous happenings, either through events (for wxPython) or signals (for PyQt). I have experimented a bit with PyPubSub, which may be what I am looking for, but while there are tons of wxPython examples (since it was originally included with wxPython early in its development), I am not aware of a 'proper' way to use it with PyQt without running into race conditions. If anyone has some insight on this, I would really appreciate it!
PyPubSub's sendMessage() will call listeners in the same thread as the sender (default Python behavior). In a multithreaded GUI app, you must ensure that listeners that interact with GUI are called in the main thread. Also threads execute independently, so you need each thread to call its own listeners, based on a timed or idle callback mechanism.
The way to call listeners in the correct thread in PyQt is via signals. PyPubSub can still be used in a multithreaded PyQt GUI, but the mechanism used to transfer the "message" from sender to listener would have to be a signal. I don't think there is one best way to do it; it depends on the details of your app design. You could, for example, have a QtPubsubHandler that derives from QObject and gets created in the main thread, and a QtPubsubSender class that also derives from QObject and gets created in each worker thread. QtPubsubSender defines a custom signal, say pubsub, which QtPubsubHandler connects to. Then to send a message, the sender does qtPubsubSender.sendMessage(topic, **data), which causes the pubsub signal to get emitted; Qt queues it and eventually delivers it to the QtPubsubHandler, which actually calls pub.sendMessage().
There are many other ways, but you get the general idea: two classes that derive from QObject, one of which does the actual sending in the same thread as the intended listener, while the other uses a signal so that everything is thread safe. Actually, you don't have to use PyQt signals, but then you would need a queue in the main thread and an idle callback that lets it process any items on the queue.
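A minimal sketch of that two-class arrangement, assuming PyQt5 and the pypubsub package; the class names follow the answer above, but the implementation details are a guess at one workable design, not the author's actual code:

    from PyQt5 import QtCore
    from pubsub import pub    # pypubsub

    class QtPubsubHandler(QtCore.QObject):
        """Created in the main thread; re-publishes queued messages there."""
        def dispatch(self, topic, data):
            pub.sendMessage(topic, **data)    # listeners now run in the main thread

    class QtPubsubSender(QtCore.QObject):
        """Created in a worker thread; does nothing but emit a signal."""
        pubsub = QtCore.pyqtSignal(str, dict)

        def __init__(self, handler):
            super().__init__()
            # Auto connection: emitting from the worker thread queues the call
            # into the handler's (main) thread.
            self.pubsub.connect(handler.dispatch)

        def sendMessage(self, topic, **data):
            self.pubsub.emit(topic, data)

A worker thread would then create a QtPubsubSender(handler) and call sender.sendMessage('progress', value=42), while the main thread subscribes with pub.subscribe(on_progress, 'progress') as usual; the listener runs safely in the main thread.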

Communicating between Processes: Python Multiprocessing

I am using PyQt to create the GUI for my application and ran into some trouble using threads for separate processes, so I started to use the multiprocessing.Process class. I was previously using signals and slots to communicate between the worker process and the GUI, but the SignalInstance class cannot be pickled and, as far as I know, can't be used with Process, so I am having to find another way to send a progress report (percent done, etc.) from the worker process to update a progress bar in the GUI. What is the best way of doing this?
Please see this answer: you can share memory between processes using the multiprocessing library. The documentation is here (see section 16.6.1.4, "Sharing state between processes").
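For a progress bar specifically, a common pattern is to have the worker process put percentages on a multiprocessing.Queue and let the GUI poll it with a QTimer. A minimal sketch under my own assumptions (PyQt5; the worker function is illustrative, not the asker's code):

    import multiprocessing as mp
    from PyQt5 import QtCore, QtWidgets

    def worker(progress_q):
        # Runs in a separate process; only picklable data goes on the queue.
        for percent in range(0, 101, 5):
            # ... do a slice of the real work here ...
            progress_q.put(percent)

    class Window(QtWidgets.QWidget):
        def __init__(self):
            super().__init__()
            self.bar = QtWidgets.QProgressBar(self)
            self.queue = mp.Queue()
            self.proc = mp.Process(target=worker, args=(self.queue,))
            self.proc.start()
            # Poll the queue from the GUI thread and update the bar there.
            self.timer = QtCore.QTimer(self, interval=100)
            self.timer.timeout.connect(self.poll)
            self.timer.start()

        def poll(self):
            while not self.queue.empty():
                self.bar.setValue(self.queue.get())

    if __name__ == "__main__":
        app = QtWidgets.QApplication([])
        win = Window()
        win.show()
        app.exec_()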

python: waiting on multiple objects (Queue, Lock, Condition, etc.)

I'm using the Python threading library. Works fine (subject to the Global Interpreter Lock, of course).
Now I have a conundrum. I have two separate sources of concurrency: either two Queues, or a Queue and a Condition. How can I wait for whichever one becomes ready first? (They have to be separate objects, since they are owned by different modular parts of my application.)
Windows has the WaitForMultipleObjects function; is there something similar for Python concurrency primitives?
There is not an already existing function that I know of that does what you ask. However, there is threading.enumerate(), which returns a list of all currently alive Thread objects, daemon or not. Once you have that list you could iterate over it looking for the condition you want. To mark a thread as a daemon, call thread.setDaemon(True) (or set thread.daemon = True) before the thread is started.
I can't say for sure that this is your answer. I don't have as much experience as you apparently do, but I looked this up in a book I have, The Python Standard Library by Example by Doug Hellmann. He has 23 pages on managing concurrent operations in the section on threading, and enumerate seemed to be something that would help.
You could create a new synchronization object (a queue, event, etc.), call it ready_event, plus one Thread for each sync object you want to watch. Each watcher thread waits on its own sync object and, when that object becomes ready, signals the fact via ready_event. After you have created and started the watcher threads, you can simply wait on ready_event.
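Here is a minimal sketch of that watcher-thread pattern, with the two sources being a Queue and a Condition; the wait_first helper and its names are mine, for illustration only:

    import queue
    import threading

    def wait_first(waiters):
        """Block until the first of several blocking wait callables returns.

        `waiters` is a list of (name, wait_callable) pairs; each callable blocks
        until its own sync object is ready, then returns a value.
        """
        ready = queue.Queue()

        def watch(name, wait):
            ready.put((name, wait()))   # report which source fired, and its value

        for name, wait in waiters:
            threading.Thread(target=watch, args=(name, wait), daemon=True).start()
        return ready.get()              # first source to become ready wins

    # Example: wait on either a Queue or a Condition, whichever is ready first.
    q = queue.Queue()
    cond = threading.Condition()

    def wait_on_condition():
        with cond:
            cond.wait()
            return "notified"

    threading.Timer(0.5, lambda: q.put("hello")).start()
    print(wait_first([("queue", q.get), ("condition", wait_on_condition)]))

Note that the losing watcher keeps blocking in the background (and may consume a later item from its source), so this sketch is only appropriate where that is acceptable.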

A multi-part/threaded downloader via python?

I've seen a few threaded downloaders online, and even a few multi-part downloaders (HTTP).
I haven't seen them together as a class/function.
If any of you have a class/function lying around, that I can just drop into any of my applications where I need to grab multiple files, I'd be much obliged.
If there is a library/framework (or a program's back-end) that does this, please direct me towards it.
Threadpool by Christopher Arndt may be what you're looking for. I've used this "easy to use object-oriented thread pool framework" for the exact purpose you describe and it works great. See the usage examples at the bottom of the linked page. And it really is easy to use: just define three functions (one of which is an optional exception handler in place of the default handler) and you are on your way.
from http://www.chrisarndt.de/projects/threadpool/:
Object-oriented, reusable design
Provides callback mechanism to process results as they are returned from the worker threads.
WorkRequest objects wrap the tasks assigned to the worker threads and allow for easy passing of arbitrary data to the callbacks.
The use of the Queue class solves most locking issues.
All worker threads are daemonic, so they exit when the main program exits, no need for joining.
Threads start running as soon as you create them. No need to start or stop them. You can increase or decrease the pool size at any time, superfluous threads will just exit when they finish their current task.
You don't need to keep a reference to a thread after you have assigned the last task to it. You just tell it: "don't come back looking for work, when you're done!"
Threads don't eat up cycles while waiting to be assigned a task, they just block when the task queue is empty (though they wake up every few seconds to check whether they are dismissed).
Also available from http://pypi.python.org/pypi/threadpool, via easy_install, or as a Subversion checkout (see the project homepage).
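If you would rather stick to the standard library, here is a rough sketch of the combined idea (my own, not based on the threadpool package, and assuming Python 3 and a server that honours HTTP Range requests; the URL is hypothetical): probe the total size, fetch byte ranges in a thread pool, and write each chunk at its offset.

    import urllib.request
    from concurrent.futures import ThreadPoolExecutor

    def download(url, path, parts=4):
        # Discover the total size; assumes the server reports Content-Length.
        head = urllib.request.Request(url, method="HEAD")
        size = int(urllib.request.urlopen(head).headers["Content-Length"])

        # Pre-size the output file so each chunk can be written at its offset.
        with open(path, "wb") as f:
            f.truncate(size)

        def fetch(start, end):
            req = urllib.request.Request(
                url, headers={"Range": "bytes=%d-%d" % (start, end)})
            data = urllib.request.urlopen(req).read()
            with open(path, "r+b") as f:
                f.seek(start)
                f.write(data)

        chunk = size // parts
        ranges = [(i * chunk, size - 1 if i == parts - 1 else (i + 1) * chunk - 1)
                  for i in range(parts)]
        with ThreadPoolExecutor(max_workers=parts) as pool:
            list(pool.map(lambda r: fetch(*r), ranges))

    download("https://example.com/big.iso", "big.iso")   # hypothetical URL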

Instance methods called in a separate thread from the instantiation thread

I'm trying to wrap my head around what is happening in this recipe, because I'm planning on implementing a wx/twisted app similar to it (i.e., wx and twisted running in separate threads). I understand that both the twisted and wx event loops need to be accessed in a thread-safe manner (i.e., reactor.callFromThread, wx.PostEvent, etc.). What I am questioning is the thread safety of passing instance methods of objects instantiated in one thread (in the case of this recipe, the GUI thread) as deferred callback and errback methods for a reactor running in a separate thread. Is that a good idea?
There is a wxreactor available in twisted, but googling reveals that there have been numerous problems with it since it was introduced to the library. Even the person who initially came up with the wxreactor technique advocates running wx and twisted in separate threads.
I haven't been able to find any other examples of this technique, but I'd love to see some.
I wouldn't say that it's a "good idea". You should just run the reactor and the GUI in the same thread with wxreactor.
The timer-driven event-loop starving approach described by Mr. Schroeder is the worst possible fail-safe way to implement event-loop integration. If you use wxreactor (not wxsupport), Twisted now uses an approach where multiplexing is shunted off to a thread internally, so that nothing needs to use a timer. Better yet would be for wxPython to expose wxSocket and have someone base a reactor on it.
However, if you're set on using a separate thread to communicate with Twisted, the one thing to keep in mind is that while you can use objects that originate from any thread you like as the value to pass to Deferred.callback, you must call Deferred.callback only in the reactor thread itself. Deferreds are not threadsafe; thanks to some debugging utilities, not even the Deferred class is threadsafe, so you need to be very careful never to leave the Twisted main thread when using them. I.e., when you have a result in the UI thread, use reactor.callFromThread(myDeferred.callback, myresult).
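A small sketch of that rule, assuming the reactor runs in a background thread as in the recipe (the function name and result value are placeholders): the Deferred's callbacks fire in the reactor thread, and the UI thread only ever hands results over via reactor.callFromThread.

    import threading
    from twisted.internet import defer, reactor

    d = defer.Deferred()
    d.addCallback(lambda result: print("handled in the reactor thread:", result))

    # Run the reactor in a background thread, as the recipe does.
    reactor_thread = threading.Thread(
        target=lambda: reactor.run(installSignalHandlers=False))
    reactor_thread.start()

    def ui_thread_work():
        result = "value produced in the UI thread"
        # Never call d.callback(result) here directly; hand it to the reactor thread.
        reactor.callFromThread(d.callback, result)

    ui_thread_work()
    reactor.callFromThread(reactor.stop)   # shut the demo down cleanly
    reactor_thread.join()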
The sole act of passing instance methods between threads is safe as long as you properly synchronize eventual destruction of those instances (threads share memory so it really doesn't matter which one did the allocation/initialization of a bit of it).
The overall thread safety depends on what those methods actually do, so just make them "play nice" and you should be ok.
