time.sleep that allows parent application to still evaluate? - python

I've run into situations as of late when writing scripts for both Maya and Houdini where I need to wait for aspects of the GUI to update before I can call the rest of my Python code. I was thinking calling time.sleep in both situations would fix my problem, but it seems that time.sleep just holds up the parent application as well. This means my script evaluates exactly the same regardless of whether or not the sleep is in there; it just pauses part way through.
I have a thought to run my script in a separate thread in Python to see if that will free up the application to still run during the sleep, but I haven't had time to test this yet.
Thought I would ask in the meantime if anybody knows of some other solution to this scenario.

Maya - or more precisely Maya Python - is not really multithreaded (Python itself has a dodgy kind of multithreading because all threads fight for the dreaded global interpreter lock, but that's not your problem here). You can run threaded code just fine in Maya using the threading module; try:
import time
import threading

def test():
    for n in range(0, 10):
        print "hello"
        time.sleep(1)

t = threading.Thread(target=test)
t.start()
That will print 'hello' to your listener 10 times at one second intervals without shutting down interactivity.
Unfortunately, many parts of maya - including most notably ALL user created UI and most kinds of scene manipulation - can only be run from the "main" thread - the one that owns the maya UI. So, you could not use the technique above in a script that changes the contents of a text box in a window (to make it worse, you'll get misleading error messages - code that works when you run it from the listener will error when you call it from a thread, and politely return completely wrong error codes). You can do things like network communication, writing to a file, or long calculations in a separate thread no problem - but UI work and many common scene tasks will fail if you try to do them from a thread.
Maya has a partial workaround for this in the maya.utils module. You can use the functions executeDeferred and executeInMainThreadWithResult. These will wait for an idle time to run (which means, for example, that they won't run if you're playing back an animation) and then fire as if you'd done them in the main thread. The example from the maya docs gives the idea:
import maya.utils
import maya.cmds

def doSphere(radius):
    maya.cmds.sphere(radius=radius)

maya.utils.executeInMainThreadWithResult(doSphere, 5.0)
This gets you most of what you want but you need to think carefully about how to break up your task into threading-friendly chunks. And, of course, running threaded programs is always harder than the single-threaded alternative, you need to design the code so that things wont break if another thread messes with a variable while you're working. Good parallel programming is a whole big kettle of fish, although boils down to a couple of basic ideas:
1) establish exclusive control over objects (for short operations) using RLocks when needed
2) put shared data into safe containers, like Queue in #dylan's example
3) be really clear about what objects are shareable (they should be few!) and which aren't - a short sketch of ideas 1 and 2 follows below
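For example, here is a minimal sketch of ideas 1 and 2; the names are made up for illustration, and the work itself is just a placeholder:

import threading
try:
    import queue            # Python 3
except ImportError:
    import Queue as queue   # Python 2, which is what older Maya versions ship

shared_results = []                  # a shared object: keep the number of these small (idea 3)
results_lock = threading.RLock()     # idea 1: exclusive control for short operations
work_queue = queue.Queue()           # idea 2: a thread-safe container for handing data between threads

def worker():
    while True:
        item = work_queue.get()      # blocks until something arrives
        if item is None:             # None is used here as a "stop" sentinel
            break
        with results_lock:           # only touch the shared list while holding the lock
            shared_results.append(item * 2)

t = threading.Thread(target=worker)
t.start()
for n in range(5):
    work_queue.put(n)
work_queue.put(None)                 # tell the worker to finish
t.join()
print(shared_results)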
Here's a decent (long) overview.
As for Houdini, I don't know for sure, but this article makes it sound like similar issues arise there.

A better solution, rather than sleep, is a while loop. Set up a while loop to check a shared value (or even a thread-safe structure like a Queue). The parent processes that you're waiting on can do their work (or children, it's not important who spawns what) and, when they finish, they send a true/false/0/1/whatever to the Queue/variable letting the other processes know that they may continue.
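A minimal sketch of that idea between two threads; the names are made up for illustration, and in Maya anything that touches the UI would still need to go through maya.utils as described above:

import threading
try:
    import queue            # Python 3
except ImportError:
    import Queue as queue   # Python 2

signal_queue = queue.Queue()

def wait_then_continue():
    # Runs in its own thread, so looping here does not hold up the parent application.
    while True:
        try:
            signal_queue.get(timeout=0.5)   # check a couple of times per second
            break
        except queue.Empty:
            continue
    print("GUI work finished; run the rest of the script here")

threading.Thread(target=wait_then_continue).start()

# ...and whoever finishes the GUI work reports back when it is done:
signal_queue.put(True)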

Related

Stop multithreaded Python script on Windows

I'm having trouble with a simple multithreaded Python looping program. It should loop infinitely and stop with Ctrl+C. Here is an implementation using threading:
from threading import Thread, Event
from time import sleep
stop = Event()
def loop():
    while not stop.is_set():
        print("looping")
        sleep(2)

try:
    thread = Thread(target=loop)
    thread.start()
    thread.join()
except KeyboardInterrupt:
    print("stopping")
    stop.set()
This MWE is extracted from a more complex code (obviously, I do not need multithreading to create an infinite loop).
It works as expected on Linux, but not on Windows: the Ctrl+C event is not intercepted and the loop continues infinitely. According to the Python Dev mailing list, the different behaviors are due to the way Ctrl+C is handled by the two OSs.
So, it appears that one cannot simply rely on Ctrl+C with threading on Windows. My question is: what are the other ways to stop a multithreaded Python script on this OS with Ctrl+C?
As explained by Nathaniel J. Smith in the link from your question, at least as of CPython 3.7, Ctrl-C cannot wake your main thread on Windows:
The end result is that on Windows, control-C almost never works to
wake up a blocked Python process, with a few special exceptions where
someone did the work to implement this. On Python 2 the only functions
that have this implemented are time.sleep() and
multiprocessing.Semaphore.acquire; on Python 3 there are a few more
(you can grep the source for _PyOS_SigintEvent to find them), but
Thread.join isn't one of them.
So, what can you do?
One option is to just not use Ctrl-C to kill your program, and instead use something that calls, e.g., TerminateProcess, such as the builtin taskkill tool, or a Python script using the os module. But you don't want that.
And obviously, waiting until they come up with a fix in Python 3.8 or 3.9 or never before you can Ctrl-C your program is not acceptable.
So, the only thing you can do is not block the main thread on Thread.join, or anything else non-interruptable.
The quick&dirty solution is to just poll join with a timeout:
while thread.is_alive():
    thread.join(0.2)
Now, your program is briefly interruptable while it's doing the while loop and calling is_alive, before going back to an uninterruptable sleep for another 200ms. Any Ctrl-C that comes in during that 200ms will just wait for you to process it, so that isn't a problem.
Except that 200ms is already long enough to be noticeable and maybe annoying.
And it may be too short as well as too long. Sure, it's not wasting much CPU to wake up every 200ms and execute a handful of Python bytecodes, but it's not nothing, and it's still getting a timeslice in the scheduler, and that may be enough to, e.g., keep a laptop from going into one of its long-term low-power modes.
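For reference, here is the question's MWE with that polling workaround dropped in (same 200 ms timeout as above):

from threading import Thread, Event
from time import sleep

stop = Event()

def loop():
    while not stop.is_set():
        print("looping")
        sleep(2)

try:
    thread = Thread(target=loop)
    thread.start()
    while thread.is_alive():
        thread.join(0.2)    # a short timeout keeps the main thread interruptable on Windows
except KeyboardInterrupt:
    print("stopping")
    stop.set()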
The clean solution is to find another function to block on. As Nathaniel J. Smith says:
you can grep the source for _PyOS_SigintEvent to find them
But there may not be anything that fits very well. It's hard to imagine how you'd design your program to block on multiprocessing.Semaphore.acquire in a way that wouldn't be horribly confusing to the reader…
In that case, you might want to drag in the Win32 API directly, whether via PyWin32 or ctypes. Look at how functions like time.sleep and multiprocessing.Semaphore.acquire manage to be interruptible, block on whatever they're using, and have your thread signal whatever it is you're blocking on at exit.
If you're willing to use undocumented internals of CPython, it looks like, at least in 3.7, the hidden _winapi module has a wrapper function around WaitForMultipleObjects that appends the magic _PyOS_SigintEvent for you when you're doing a wait-first rather than wait-all.
One of the things you can pass to WaitForMultipleObjects is a Win32 thread handle, which has the same effect as a join, although I'm not sure if there's an easy way to get the thread handle out of a Python thread.
Alternatively, you can manually create some kind of kernel sync object (I don't know the _winapi module very well, and I don't have a Windows system, so you'll probably have to read the source yourself, or at least call help() on it in the interactive interpreter, to see what wrappers it offers), WaitForMultipleObjects on that, and have the thread signal it.

Pause Execution in Python

I am implementing a Python plugin that is part of a larger C++ program. The goal of this program is to allow the user to input a command's actions in Python. It currently receives a string from the C++ function and runs it via the exec() function. The user can then use an API to effect changes on the larger C++ program.
The current feature I am working on is a pause execution feature. It needs to remember where it is in the code execution as well as the state of any local variables, and resume execution once a condition has been met. I am not very familiar with Python, and I would like some advice on how to implement this feature. My first design ideas:
1) Using the yield command.
This seemed to be a good idea at the start since when you use the next command it remembers everything I needed it to, but the problem is that yield only returns to the previous level in the call stack as far as I can tell. So if the user calls a function that yields it will simply return to the user's code, and not the larger C++ program. As far as I can tell there isn't a way to propagate the yield command up the stack???
2) Threading
Create a main python thread that creates a thread for each command. This main thread would spawn a thread for each command executed and kill it when it is done. If it needs to be suspended and restarted it could do so through a queue of locks.
Those were the only two options I came up with. I am not sure the yield function would work or is what it was designed to do. I think the Threading approach would work but might be overkill, and take a long time to develop. I also was looking for some sort of Task Module in Python, but couldn't find exactly what I was looking for. I was wondering if anyone has any other suggestions as I am not very familiar with Python.
EDIT: As mentioned in the comments I did not explain what needs to happen when the script "Pauses". The python plugin needs to allow the C++ program to continue execution. In my mind this means A) returning if we are talking about a single threaded approach, or B) Sending a message(Function call?) to C++
EDIT EDIT: As stated, I didn't fully explain the problem description. I will make another post that has a better statement of what currently exists and what needs to happen, as well as providing some pseudocode. I am new to Stack Overflow, so if this is not the appropriate response please let me know.
Whenever a signal is sent in Python, execution is immediately paused until whatever signal handler function is being used is finished executing; at that point, the execution continues right where it left off. My suggestion would be to use one of the user-defined signals (signal.SIGUSR1 and signal.SIGUSR2). Take a look at the signal documentation here:
https://docs.python.org/2/library/signal.html
At the beginning of the program, you'd define a signal handler function like so:
def signal_pause(signum, frame):
    if signum == signal.SIGUSR1:
        # Do your pause here - function processing, etc.
        pass
    else:
        pass
Then in the main program somewhere, you'll switch out the default signal handler for the one you just created:
signal.signal(signal.SIGUSR1, signal_pause)
And finally, whenever you want to pause, you'll send the SIGUSR1 signal like so:
os.kill(os.getpid(),signal.SIGUSR1)
Your code will immediately pause, saving its state, and head to the signal_pause function to do whatever you need to do. Once that function exits, normal program execution will resume.
EDIT: this assumes you want to do something sophisticated while you're pausing the program. If all you want to do is wait a few seconds or ask for some user input, there are some much easier ways (time.sleep or input respectively).
EDIT EDIT: this assumes you're on a Unix system.
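Putting the snippets above together, here is a minimal runnable sketch (Unix only, as noted; what happens inside the handler is just a placeholder):

import os
import signal
import time

def signal_pause(signum, frame):
    if signum == signal.SIGUSR1:
        print("paused - do whatever processing is needed here")
        time.sleep(3)   # placeholder for the real pause behaviour
    # returning from the handler resumes the interrupted code where it left off

signal.signal(signal.SIGUSR1, signal_pause)

for step in range(5):
    print("working on step %d" % step)
    if step == 2:
        os.kill(os.getpid(), signal.SIGUSR1)   # trigger the pause from within the program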
If you need to communicate with a C program, then sockets are probably the way to go.
https://docs.python.org/2/library/socket.html
One of your two programs acts as the socket server, and the other connects to it as the socket client. When you want the C++ program to continue, you use socket.send() to transmit a continue message. Then your Python program would use socket.recv(), which will cause it to wait around until it receives a message back from the C++ program.
If you need two programs to send signals to each other, this is probably the safest way to go about it.
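A minimal sketch of the Python side acting as the client; the address and message format are made up for illustration, and the C++ program would be the server here:

import socket

HOST, PORT = "127.0.0.1", 5005   # hypothetical address the C++ server listens on

sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
sock.connect((HOST, PORT))
sock.send(b"continue")           # tell the C++ side it may carry on
reply = sock.recv(1024)          # block here until the C++ side reports back
print("C++ side replied: %r" % reply)
sock.close()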

How to use python (maya) multithreading

I've been looking at examples from other people but I can't seem to get it to work properly.
It'll either use a single core, or basically freeze up maya if given too much to process, but I never seem to get more than one core working at once.
So for example, this is kind of what I'd like it to do, on a very basic level. Mainly just let each loop run simultaneously on a different processor with the different values (in this case, the two values would use two processors)
mylist = [50, 100, 23]
newvalue = [50, 51]

for j in range(0, len(newvalue)):
    exists = False
    for i in range(0, len(mylist)):
        # search list
        if newvalue[j] == mylist[i]:
            exists = True
    # add to list
    if exists == True:
        mylist.append(mylist)
Would it be possible to pull this off? The actual code I'm wanting to use it on can take from a few seconds to like 10 minutes for each loop, but they could theoretically all run at once, so I thought multithreading would speed it up loads
Bear in mind I'm still relatively new to python so an example would be really appreciated
Cheers :)
There are really two different answers to this.
Maya scripts are really supposed to run in the main UI thread, and there are lots of ways they can trip you up if run from a separate thread. Maya includes a module called maya.utils which includes methods for deferred evaluation in the main thread. Here's a simple example:
import maya.cmds as cmds
import maya.utils as utils
import threading
def do_in_main():
    utils.executeDeferred(cmds.sphere)

for i in range(10):
    t = threading.Thread(target=do_in_main, args=())
    t.start()
That will allow you to do things with the maya ui from a separate thread (there's another method in utils that will allow the calling thread to await a response too). Here's a link to the maya documentation on this module
However, this doesn't get you around the second aspect of the question. Maya python isn't going to split up the job among processors for you: threading will let you create separate threads, but they all share the same python interpreter, and the global interpreter lock will mean that they end up waiting for it rather than running along independently.
You can't use the multiprocessing module, at least not AFAIK, since it spawns new mayas rather than pushing script execution out into other processors in the Maya you are running within. Python aside, Maya is an old program and not very multi-core oriented in any case. Try XSI :)
Any threading stuff in Maya is tricky in any case - if you touch the main application (basically, any function from the API or a maya.whatever module) without the deferred execution above, you'll probably crash maya. Only use it if you have to.
And, BTW, you can't use executeDeferred, etc. in batch mode since they are implemented using the main UI loop.
What theodox says is still true today, six years later. However, one may go another route by spawning a new process using the subprocess module. You'll have to communicate and share data via sockets or something similar, since the new process is in a separate interpreter. The new interpreter runs on its own and doesn't know about Maya, but you can do any other work in it, benefiting from the multi-threaded environment your OS provides, before communicating it back to your Maya python script.
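A rough sketch of that route; worker.py is a hypothetical helper script, and the JSON-over-pipes protocol here is just one way to communicate (sockets would work the same way):

import json
import subprocess

# Launch the helper in its own interpreter; from inside Maya you would point this
# at a standalone interpreter (e.g. mayapy or plain python), not at Maya itself.
proc = subprocess.Popen(
    ["python", "worker.py"],             # hypothetical standalone script
    stdin=subprocess.PIPE,
    stdout=subprocess.PIPE,
)

payload = json.dumps({"values": [50, 51]}).encode("utf-8")
out, _ = proc.communicate(payload)       # blocks; call this from a thread if that matters
result = json.loads(out.decode("utf-8"))
print(result)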

using python sched module and enterabs to make a function run at a certain time

I cannot seem to find a simple example of how to schedule an event in Python.
I want to be able to pass a date and time string as an argument into a function.
For example:
String: "m/d/Y HH:MM" would set the time for a future function to run, after the code has been executed. So, like a function that is waiting to go off after I run it.
It seems like the main problem is formatting the string correctly, but a simple example would really help to see how to 'schedule' a function to run.
You don't give enough context to understand what you are trying to do in a larger frame - but, generally speaking, "this is not how it works" in Python.
An "ordinary" Python program is a single-threaded, synchronous program - it will run one task, after another, after another; when everything is done, the program exits, and the interpreter exits along with it.
So, something along these lines (with a fictitious "schedule" function):
def main():
    print("Hello World")
    schedule(60, main)

main()
would not work in Python if the call to schedule were to return immediately - the main function would exit, and the program would try to resume after the main() call and terminate. There needs to be a piece of code left running which can count time and delays, maybe receive network or user-generated events, and dispatch them to previously arranged callback functions, in order for the program to keep running.
Such a piece of code, which can account for time and dispatch calls, is usually called a "reactor" - and there is none running in a plain Python program. Unlike, say, in a JavaScript program, where the browser, or other JavaScript environment provides such hosting by default.
That is why most Python web or network frameworks, and all GUI toolkits, provide such a core - it is usually called at the end of one's main script and is a method or function named mainloop or serve_forever, start and so on. From that point on, your main script, which had set the appropriate callbacks, scheduled things and so on, stops - the reactor will be the piece of code calling things.
That is where I say your question misses the context of what you want to do: at first you just want to test some scheduling - but afterwards you will want that inside a larger system - that system should be built using an appropriate framework for your "real task" at hand, for example Django, tornado or pyramid if it is a web-server system; gtk, Qt or Tk if it is a GUI program; PyOgre, kivy or pyglet if it is a multimedia program; twisted for a generic network server of another protocol; or some other thing, like celery or camaelia - these are only general examples.
That said, Python's standard library does offer a "generic" scheduler function - it does implement such a loop, with the bare core of functionality. If you are doing nothing else, and nothing fancy, it will block there until it reaches the time to call your scheduled function, at which point it will exit and return control to your main program. If your called function schedules other things, it will continue running, and so on.
See the documentation and example at:
http://docs.python.org/2/library/sched.html
You can use functions from the datetime module instead of time.time to set the absolute timings you are asking for. Also check the documentation for threading.Timer - which in a naive way can do more or less what you have in mind, if you want to run a simple function after a given delay, in parallel to whatever other code is running, and don't want to rewrite your application to be event based - but simple as it may seem, it will have many drawbacks in a larger system - you should pick one of the frameworks listed.
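That said, here is a minimal sketch of the enterabs idea from the question, using datetime.strptime to turn an "m/d/Y HH:MM" string into the absolute timestamp the scheduler expects; the date string and the callback are placeholders:

import sched
import time
from datetime import datetime

def my_function():
    print("time to run")

scheduler = sched.scheduler(time.time, time.sleep)

# Parse the "m/d/Y HH:MM" string and convert it to an epoch timestamp,
# the same unit produced by the time.time function given to the scheduler.
when = datetime.strptime("6/1/2030 14:30", "%m/%d/%Y %H:%M")
scheduler.enterabs(time.mktime(when.timetuple()), 1, my_function, ())

scheduler.run()   # blocks here until the scheduled time, then calls my_function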

How to implement pause (and more) functionality?

My apologies beforehand for the length of the question, I didn't want to leave anything out.
Some background information
I'm trying to automate a data entry process by writing a Python application that uses the Windows API to simulate keystrokes, mouse movement and window/control manipulation. I have to resort to this method because I do not (yet) have the security clearance required to access the datastore/database directly (e.g. using SQL) or indirectly through a better suited API. Bureaucracy, it's a pain ;-)
The data entry process involves the correction of sales orders due to changes in article availability. The unavailable articles are either removed from the order or replaced by another suitable article.
Initially I want a human to be able to monitor the automatic data entry process to make sure everything goes right. To achieve this I slow down the actions on the one hand but also inform the user of what is currently going on through a pinned window.
The actual question
To allow the user to halt the automation process I'm registering the Pause/Break key as a hotkey and in the handler I want to pause the automation functionality. However, I'm currently struggling to figure out a way to properly pause the execution of the automation functionality. When the pause function is invoked I want the automation process to stop dead in its tracks, no matter what it is doing. I don't want it to even execute another keystroke.
UPDATE [23/01]: I actually want to do more than just pause, I want to be able to communicate with the automation process while it is running and request it to pause, skip the current sales order, give up completely and perhaps even more.
Can anybody show me The Right Way (TM) to achieve what I want?
Some more information
Here's an example of how the automation works (I'm using the pywinauto library):
from pywinauto import application
app = application.Application()
app.start_("notepad")
app.Notepad.TypeKeys("abcdef")
UPDATE [25/01]: After a few days of working on my application I've noticed I don't really use pywinauto that much, right now I'm only using it for finding window and then I directly use SendKeysCtypes.SendKeys to simulate keyboard input and win32api functions to simulate mouse input.
What I've found out so far
Here are a few methods I've come across so far in my search for an answer:
I could separate the automation functionality and the interface + hotkey listener in two separate processes. Let's refer to the former as "automator" and the latter as "manager". The manager can then pause the execution of the automator by sending the process a SIGSTOP signal and unpause it using the SIGCONT signal (or the Windows equivalents through SuspendThread/ResumeThread).
To be able to update the user interface the automator will need to inform the manager of its progression through some sort of an IPC mechanism.
Cons:
Would using SIGSTOP not be a little harsh? Would it even work properly? Lots of people seem to be advising against it and even calling it "dangerous".
I am worried that implementing the IPC mechanism is going to be a bit complicated. On the other hand, I have worked with DBus which wouldn't be too hard to implement.
The second method and one that lots of people seem to be suggesting involves using threads and essentially boils down to the following (simplified):
while True:
    if self.pause:
        # pause (wait here until unpaused)
        pass
    # Do the work...
However, doing it this way it seems it will only pause after there is no more work to do. The only way I see this method would work would be to divide the work (the entire automation process) into smaller work segments (i.e. tasks). Before starting on a new task the worker thread would check if it should pause and wait.
Cons:
Seems like an implementation to divide the work into smaller segments, such as the one above, would be very ugly code wise (aesthetically).
The way I imagine it, all statements would be transformed to look something like: queue.put((function, args)) (e.g. queue.put((app.Notepad.TypeKeys, "abcdef"))) and you'd have the automating process thread running through the tasks and continuously checking for the pause state before starting a task. That just can't be right... (a rough sketch of this pattern appears after this list)
The program would not actually stop dead in its tracks, but would first finish a task (however small) before actually pausing.
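For reference, the task-queue pattern from the second bullet could look roughly like this; it is a sketch only, and as noted it can only pause between tasks rather than stopping dead in its tracks:

import threading
try:
    import queue            # Python 3
except ImportError:
    import Queue as queue   # Python 2

tasks = queue.Queue()
run_flag = threading.Event()
run_flag.set()                          # set = running, cleared = paused

def automation_worker():
    while True:
        function, args = tasks.get()    # next small chunk of work
        run_flag.wait()                 # blocks here while paused
        function(*args)

worker = threading.Thread(target=automation_worker)
worker.daemon = True                    # don't keep the process alive on exit
worker.start()

# Every automation statement becomes a queued task, e.g.:
# tasks.put((app.Notepad.TypeKeys, ("abcdef",)))
# and the hotkey handler calls run_flag.clear() / run_flag.set() to pause / resume.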
Progress made
UPDATE [23/01]: I've implemented a version of my application using the first method through the mentioned SuspendThread/ResumeThread functionality. So far this seems to work very nicely and also allows me to write the automation stuff just like you'd write any other script. The only quirk I've come across is that keyboard modifiers (CTRL, ALT, SHIFT) get "stuck" while paused. Something I can probably easily work around.
I've also written a test using the second method (threads and signals/message passing) and implemented the pause functionality. However, it looks really ugly (both checking for the pause flag and everything related to the "doing the work"). So if anybody can show me a proper example of something similar to the second method I'd appreciate it.
Related questions
Pausing a process?
Pausing a thread using threading class
Alex Martelli posted an answer saying:
There is no method for other threads to forcibly pause a thread (any more than there is for other threads to kill that thread) -- the target thread must cooperate by occasionally checking appropriate "flags" (a threading.Condition might be appropriate for the pause/unpause case).
He then referred to the multiprocessing module and SIGSTOP/SIGCONT.
Is there a way to indefinitely pause a thread?
Pausing a process in Windows
An answer to this question quotes the MSDN documentation regarding SuspendThread:
This function is primarily designed for use by debuggers. It is not intended to be used for thread synchronization. Calling SuspendThread on a thread that owns a synchronization object, such as a mutex or critical section, can lead to a deadlock if the calling thread tries to obtain a synchronization object owned by a suspended thread. To avoid this situation, a thread within an application that is not a debugger should signal the other thread to suspend itself. The target thread must be designed to watch for this signal and respond appropriately.
Is there any way to kill a Thread in Python?
How do I pass an exception between threads in python
Keep in mind that although in your level of abstraction, "executing a keystroke" is a single atomic operation, it's implemented on the machine as a rather complicated sequence of machine instructions. So, pausing a thread at arbitrary points could lead to things being in an indeterminate state. Sending SIGSTOP is the same level of dangerous as pausing a thread at an arbitrary point. Depending on where you are in a particular step, though, your automation could potentially be broken. For example, if you pause in the middle of a timing-dependent step.
It seems to me that this problem would be best solved at the level of the automation library. I'm not very familiar with the automation library that you're using. It might be worth contacting the developers of the library to see if they have any suggestions for pausing the execution of automation steps at safe sub-step levels.
I don't know pywinauto. But I'll assume that you have something like an Application class which you obtain and have methods like SendKeys/SendMouseEvent/etc to do things.
Create your own MyApplication class which holds a reference to pywinauto's application class. Provide the same methods but before each method check whether a pause event has occurred. If it has, you can jump into code which handles the pause event. That way you are checking for a pause every time you cause an event, but this all is handled by the one class without putting pause all over your code.
Once you've detected the pause you can handle it any way you like. For example, you can throw an exception to force giving up on the current task.
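A rough sketch of that wrapper idea; the class, method, and attribute names here are invented for illustration, and TypeKeys stands in for whatever the wrapped object actually exposes:

import threading

class MyApplication(object):
    """Wraps the real automation object and checks for a pause request before every action."""

    def __init__(self, app):
        self._app = app                     # e.g. the pywinauto window object
        self._run_flag = threading.Event()  # set = running, cleared = paused
        self._run_flag.set()

    def pause(self):                        # called from the hotkey handler thread
        self._run_flag.clear()

    def resume(self):
        self._run_flag.set()

    def type_keys(self, keys):
        self._run_flag.wait()               # blocks here while paused
        self._app.TypeKeys(keys)            # delegate to the wrapped object

# Automation code calls my_app.type_keys("abcdef") instead of app.Notepad.TypeKeys("abcdef"),
# and the hotkey handler calls my_app.pause(); raising an exception in type_keys instead of
# waiting would implement the "give up on the current task" case.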
Separating the functionality and the interface thread/process is definitely the best option imho; the second solution is quicker and easier but definitely not better.
Perhaps using multiple threads and an exception would be a better idea than using multiple processes. But if you're using multiple processes than SIGSTOP might be your only way to get it to work.
Is there anything against using 2 threads for this?
1 thread for actually executing
1 thread for reading the user input
I use Python but not pywinauto; for this sort of task I use AutoHotKey. One way to implement a simple pause in an AutoHotkey script may be to use a "toggle" key like ScrollLock and test the key state in the script. Also, the script can restore the key state after switching the internal pause setting on/off.
