I am using subprocess in Python to call an external program on Windows. I control the processes with a ThreadPool so that at most 6 run at the same time, and a new process starts as soon as one finishes.
Code as below:
### some codes above

### Code of Subprocess Part
import subprocess
from multiprocessing.pool import ThreadPool as Pool

def FAST_worker(file):
    p = subprocess.Popen([r'E:/pyworkspace/FAST/FAST_RV_W64.exe', file],
                         cwd=r'E:/pyworkspace/FAST/',
                         shell=True)
    p.wait()

# List of *.in filenames
FAST_in_pathname_li = [
    '334.in',
    '893.in',
    '9527.in',
    ...
    '114514.in',
    '1919810.in',
]

# Limit max 6 processes at the same time
with Pool(processes=6) as pool:
    for result in pool.imap_unordered(FAST_worker, FAST_in_pathname_li):
        pass

### some codes below
I ran into a problem when the external program unexpectedly terminated and showed an error message pop-up. Although the other 5 processes kept going, the whole run eventually got stuck at the "subprocess part" and couldn't go any further (unless I came to my desk and manually clicked "Shut down the program").
What I want to know is how I can avoid the pop-up and keep the whole script going, i.e. bypass the error message or something similar rather than clicking manually, so no time is wasted.
Since we don't know enough about the program FAST_worker is calling, I'll assume you already checked there isn't any "kill on error" or "quiet" mode that would be more convenient to use in a script.
My two cents: maybe you can set up a timeout on the worker execution, so that a stuck process is killed automatically after a certain delay.
Building on the snippet provided here, here is a draft:
import subprocess
from threading import Timer

def FAST_worker(file, timeout_sec):
    def kill_proc():
        """called by the Timer thread upon expiration"""
        p.kill()
        # maybe add the task to a list of failed tasks, for traceability

    p = subprocess.Popen([r'E:/pyworkspace/FAST/FAST_RV_W64.exe', file],
                         cwd=r'E:/pyworkspace/FAST/',
                         shell=True)
    # set up a timer to kill the process after a timeout
    timer = Timer(timeout_sec, kill_proc)
    try:
        timer.start()
        p.wait()
    finally:
        timer.cancel()
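On Python 3.3+ you can get the same effect with less machinery by using the timeout parameter of Popen.wait() and catching subprocess.TimeoutExpired. This is only a sketch: the 600-second timeout is an arbitrary assumption you would tune to your FAST runs, and shell=True is dropped here since the command is already a full path (keep it if you actually need the shell):

import subprocess

def FAST_worker(file, timeout_sec=600):  # timeout value is an assumption
    p = subprocess.Popen([r'E:/pyworkspace/FAST/FAST_RV_W64.exe', file],
                         cwd=r'E:/pyworkspace/FAST/')
    try:
        # wait at most timeout_sec seconds for the run to finish
        p.wait(timeout=timeout_sec)
    except subprocess.TimeoutExpired:
        # the run is stuck (e.g. on the error pop-up): kill it and move on
        p.kill()
        p.wait()  # reap the killed process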
Note that there are also GUI automation libraries in Python that can do the clicking for you, but that is likely to be more tedious to program (a rough sketch follows the links below):
tutorial for pyAutoGui
SO question on the subject
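For example, a very rough pyAutoGUI watcher could look like the sketch below. The button screenshot filename and the polling interval are assumptions, and it blindly clicks whatever matches the image, so treat it as a last resort; you would run it in a daemon thread alongside the pool:

import time
import pyautogui  # pip install pyautogui

def dismiss_error_popups(button_image='shutdown_button.png', interval=5):
    """Poll the screen and click the error dialog's button whenever it appears."""
    while True:
        try:
            pos = pyautogui.locateCenterOnScreen(button_image)
        except pyautogui.ImageNotFoundException:
            pos = None  # newer pyautogui versions raise instead of returning None
        if pos is not None:
            pyautogui.click(pos.x, pos.y)
        time.sleep(interval)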
I am creating a small automation script to interact with a game. The game is very prone to crashing, but in an unpredictable way (after 30 minutes to 3 hours, depending on what happens in game). Because of this, I wrote the small script below that was IDEALLY going to kill the game, kill the crash-monitoring client that checks for crashes and offers to restart, and then relaunch the game and resume. The issue is that I never get past subprocess.run(): it launches the game again, but none of the code after it ever runs.
import psutil
import time
import subprocess

def main():
    '''Process kill function'''
    for proc in psutil.process_iter():
        # check whether the process name matches
        # print(proc.name())
        if any(procstr in proc.name() for procstr in ['GameName.exe']):
            print(f'Killing {proc.name()}')
            proc.kill()
    time.sleep(5)
    for proc in psutil.process_iter():
        # check whether the process name matches
        # print(proc.name())
        if any(procstr in proc.name() for procstr in ['GameCrashMonitor.exe']):
            print(f'Killing {proc.name()}')
            proc.kill()
    time.sleep(10)
    subprocess.run(r'PathToGame.exe')
    print(time.time())

if __name__ == "__main__":
    main()
This program successfully gets to subprocess.run(), launches the game again, and then Python hangs. I can't Ctrl+C in IPython to stop it. I use Spyder, and it even makes the Spyder icon on my taskbar show an error and disappear.
Try replacing subprocess.run()
with variable = subprocess.Popen([r'PathToGame.exe'])
https://docs.python.org/3/library/subprocess.html says that subprocess.run() waits for the command to complete and then returns a CompletedProcess instance. Your command isn't completing. There is also a timeout argument you could pass to run(), but by default it is None and never times out, if you still wanted to use .run().
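A minimal sketch of the suggested change (PathToGame.exe is the placeholder from the question):

import subprocess

# Popen returns immediately instead of waiting for the game to exit,
# so the rest of the script keeps running while the game is open
game = subprocess.Popen([r'PathToGame.exe'])
print('Game launched with PID', game.pid)

# later, if you ever need to check on it or stop it:
if game.poll() is None:   # still running
    game.terminate()      # or game.kill()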
I'm trying to manage a game server (a server for players to join, I didn't create the game) through a Python module. I noticed, however, that the server stops when the Python script stops to ask for input (from input()). Is there any way around this?
The server is run as a subprocess:
server = subprocess.Popen(r"D:\Windows\System32\cmd.exe", stdin=subprocess.PIPE, stdout=subprocess.PIPE), followed by server.stdin.write calls to run the server exe file.
The server seems to work fine if run without a stdout pipe, but I still need to receive output from it without it stopping, if possible.
I apologize for the vague question and my lack of Python knowledge.
It sounds like you want to do two things:
Service a subprocess's stdout.
Wait for user input from input().
And you need to do them both simultaneously, and in something close to real time: while you block reading from the subprocess, the user can't enter any commands, and while you block reading from user input, the subprocess hangs on a stalled pipe.
The simplest way to do this is to just use a thread for each.
Without seeing any code, it's hard to show a good example, but something like this:
import subprocess
import threading

def service_proc_stdout(proc):
    while True:
        buf = proc.stdout.read()
        do_proc_stuff(buf)

proc = subprocess.Popen(…)

t = threading.Thread(target=service_proc_stdout, args=(proc,))
t.start()

while True:
    command = input()
    do_command_stuff(command)
It sounds like your do_command_stuff is writing to proc.stdin. That may just work, but it's possible that proc.stdin may block if you push input into it too fast, preventing you from reading user input. If you need to solve that, just start a third thread:
import queue

def service_proc_stdin(q, proc):
    while True:
        msg = q.get()
        proc.stdin.write(msg)

q = queue.Queue()
tstdin = threading.Thread(target=service_proc_stdin, args=(q, proc))
tstdin.start()
… and now, instead of directly calling proc.stdin.write(…), you call q.put(…).
Threads aren't the only way to handle the concurrency here. For example, you could use an asyncio event loop, or a manual selectors loop around non-blocking pipes. But it's probably the simplest change, at least if you don't need to share or pass anything between the threads beyond messages you push onto a queue.
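For comparison, here is a rough asyncio sketch of the same structure; 'server.exe', do_proc_stuff, and do_command_stuff are placeholders standing in for your actual command and handlers:

import asyncio

async def service_proc_stdout(proc):
    # read the server's output line by line without blocking the event loop
    while True:
        line = await proc.stdout.readline()
        if not line:          # EOF: the server exited
            break
        do_proc_stuff(line)

async def service_user_input(proc):
    loop = asyncio.get_running_loop()
    while True:
        # input() blocks, so run it in a worker thread
        command = await loop.run_in_executor(None, input)
        proc.stdin.write(command.encode() + b'\n')
        await proc.stdin.drain()

async def main():
    proc = await asyncio.create_subprocess_exec(
        'server.exe',                      # placeholder command
        stdin=asyncio.subprocess.PIPE,
        stdout=asyncio.subprocess.PIPE)
    await asyncio.gather(service_proc_stdout(proc), service_user_input(proc))

asyncio.run(main())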
Update 2: So I piped the output of stderr, and it looks like when I include shell=True, I just get the help file for omxplayer (it lists all the command-line switches and such). Is it possible that shell=True might not play nicely with omxplayer?
Update: I came across that link before but it failed on me so I moved on without digging deeper. After Tshepang suggested it again I looked into it further. I have two problems, and I'm hoping the first is caused by the second. The first problem is that when I include shell=True as an arg, the video never plays. If I don't include it, the video plays, but is not ever killed. Updated code below.
So I am trying to write a Python app for my Raspberry Pi that plays a video on a loop (I came across Popen as a good way to accomplish this using OMXplayer) and then, on keyboard interrupt, kills that process and opens another process (playing a different video). My eventual goal is to be able to use vid1 as a sort of "screensaver" and have vid2 play when a user interacts with the system, but for now I'm simply trying to kill vid1 on keyboard input and having quite a hard time doing it. I'm hoping someone can tell me where my code is falling down.
Forewarning that I'm extremely new to Python, and Linux-based systems in general, so if I'm doing this terribly wrong, please feel free to redirect me, but this seemed to be the fastest way to get there.
Here is my code as it stands:
import subprocess
import os
import signal

vid1 = ['omxplayer', '--loop', '/home/pi/Vids/2779832.mp4']

while True:
    #vid = subprocess.Popen(['omxplayer', '--loop', '/home/pi/Vids/2779832.mp4'], stdout=subprocess.PIPE, shell=True, preexec_fn=os.setsid)
    vid = subprocess.Popen(vid1, stdout=subprocess.PIPE, preexec_fn=os.setsid)
    print 'SID is: ', vid.pid
    #vid = subprocess.Popen(['omxplayer', '--loop', '/home/pi/Vids/2779832.mp4'])
    id = raw_input()
    if not id:
        break
    os.killpg(vid.pid, signal.SIGTERM)
    print "your input: ", id
print "While loop has exited"
So I am trying to write a python app for my raspberry pi that plays a video on a loop (I came across Popen as a good way to accomplish this using OMXplayer) and then on keyboard interrupt, it kills that process and opens another process (playing a different video).
By default, SIGINT is propagated to all processes in the foreground process group, see "How Ctrl+C works". preexec_fn=os.setsid (or os.setpgrp) actually prevents it: use it only if you do not want omxplayer to receive Ctrl+C i.e., use it if you manually call os.killpg when you need to kill a process tree (assuming omxplayer children do not change their process group).
"keyboard interrupt" (sigint signal) is visible as KeyboardInterrupt exception in Python. Your code should catch it:
#!/usr/bin/env python
from subprocess import call, check_call

try:
    rc = call(['omxplayer', 'first file'])
except KeyboardInterrupt:
    check_call(['omxplayer', 'second file'])
else:
    if rc != 0:
        raise RuntimeError('omxplayer failed to play the first file, '
                           'return code: %d' % rc)
The above assumes that omxplayer exits on Ctrl+C.
You could see the help message for several reasons, e.g., omxplayer does not support the --loop option (run it manually to check), or you mistakenly use shell=True while passing the command as a list. Always pass the command as a single string if you need shell=True, and conversely always (on POSIX) pass the command as a list of arguments if shell=False (the default).
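To illustrate that last point (paths taken from the question; use the shell=True form only if you actually need shell features):

import subprocess

# shell=False (the default): pass the command as a list of arguments
subprocess.Popen(['omxplayer', '--loop', '/home/pi/Vids/2779832.mp4'])

# shell=True: pass the command as a single string, parsed by the shell
subprocess.Popen('omxplayer --loop /home/pi/Vids/2779832.mp4', shell=True)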
I've been using Python to run a video processing program on a large collection of .mp4 files. The video processing program (which I did not write and can't alter) does not exit once it reaches the final frame of the video, so using os.system(cmd) in a loop going through all the .mp4 files didn't work for me unless I sat there killing the processing program after each video ended.
I tried to solve this using a subprocess that got terminated after the video ended (a predetermined amount of time):
for file in os.listdir(myPath):
    if file.endswith(".mp4"):
        vidfile = os.path.join(myPath, file)
        command = "./Tracking " + vidfile
        p = subprocess.Popen(command, shell=True)
        sleep(840)
        p.terminate()
However, the Tracking program still doesn't exit, so I end up with tons of videos open at the same time. I can only get rid of them by force quitting each separate frame or by using kill -9 id for the id of that particular instance of the program. I've read that using shell=True isn't recommended, but I'm not sure if that would cause this behavior.
How can I kill the Tracking program after a certain amount of time? I'm extremely new to Python and am not sure how to do this. I was considering doing something like os.system("kill -9 id") after the sleep(), but I don't know how to get the id of the program either.
Drop shell=True, use p.kill() to kill the process:
import subprocess
from time import time as timer, sleep

p = subprocess.Popen(["./Tracking", vidfile])
deadline = timer() + 840
while timer() < deadline:
    if p.poll() is not None:  # already finished
        break
    sleep(1)
else:  # timeout
    try:
        p.kill()
    except EnvironmentError:
        pass  # ignore errors
p.wait()
If it doesn't help then try to create a new process group and kill it instead. See How to terminate a python subprocess launched with shell=True.
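A rough sketch of that process-group approach, assuming a POSIX system and keeping shell=True only because the question used it:

import os
import signal
import subprocess
from time import sleep

# start the command in its own session, i.e. its own process group
p = subprocess.Popen("./Tracking " + vidfile, shell=True,
                     preexec_fn=os.setsid)
sleep(840)
if p.poll() is None:  # still running after the deadline
    # kill the whole group: the shell and ./Tracking itself
    os.killpg(os.getpgid(p.pid), signal.SIGKILL)
p.wait()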
I have an application that asks for a keyboard interrupt. How can I send a keyboard interrupt programmatically? I need it for automation.
Like <C-c> or <C-x>
KeyboardInterrupt
Code running in a separate thread can cause a KeyboardInterrupt to be generated in the main thread by calling thread.interrupt_main().
See https://docs.python.org/2/library/thread.html#thread.interrupt_main
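A minimal sketch of that approach (the module is named _thread in Python 3; the 3-second delay is just for the demo):

import _thread
import threading
import time

def send_keyboard_interrupt(delay=3):
    # runs in a background thread; after `delay` seconds it raises
    # KeyboardInterrupt in the main thread
    time.sleep(delay)
    _thread.interrupt_main()

threading.Thread(target=send_keyboard_interrupt, daemon=True).start()

try:
    while True:        # stands in for the application's main work
        time.sleep(1)
except KeyboardInterrupt:
    print("Got KeyboardInterrupt")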
Since you mention automation I assume you want a SendKeys for Python. Try this: http://rutherfurd.net/python/sendkeys/
My suggestion is to use the following code pattern. I used it to programmatically start a tensorboard server and shut it down by sending a CTRL-C when the object it belongs to is deleted. Generally speaking, this should work for any case where a subprocess is supposed to be sent a KeyboardInterrupt:
Import signal and subprocess
import signal
import subprocess
Create the subprocess using subprocess.Popen. Important: set the creationflags parameter to subprocess.CREATE_NEW_PROCESS_GROUP. This is necessary to later be able to send the KeyboardInterrupt event.
command= <enter your command that is supposed to be run in different process as a string>
process = subprocess.Popen(command.split(),stdout=subprocess.PIPE,stdin=subprocess.PIPE,creationflags=subprocess.CREATE_NEW_PROCESS_GROUP)
Wherever you want to send the keyboardInterrupt, do the following:
process.send_signal(signal.CTRL_C_EVENT)
That is it! Please see the official subprocess documentation for insights on why the creationflags parameter of Popen has to be set that way.
This is how the code looks for my example in a less generic way:
import signal
import subprocess
import time

class ExperimentTracker():
    def __init__(self):
        self.tensorboard_process = None

    def __del__(self):
        # shut down the tensorboard server and terminate the process
        self.tensorboard_process.send_signal(signal.CTRL_C_EVENT)
        time.sleep(0.2)
        self.tensorboard_process.kill()

    def launch_tensorboard(self):
        # launch tensorboard
        bashCommand = "tensorboard --logdir runs"
        self.tensorboard_process = subprocess.Popen(bashCommand.split(), stdout=subprocess.PIPE, stdin=subprocess.PIPE, creationflags=subprocess.CREATE_NEW_PROCESS_GROUP)
        time.sleep(2)  # sleep for 2 seconds to give tensorboard time to be launched
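Usage would then look roughly like this; the sleep just stands in for whatever work you do while the server runs:

import time

tracker = ExperimentTracker()
tracker.launch_tensorboard()   # starts the tensorboard subprocess
time.sleep(10)                 # ... training / logging happens here ...
del tracker                    # __del__ sends CTRL_C_EVENT, then kills the process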
Say you want to run a program via the shell as ./program. On Linux, what you could do is:

# Made a function to handle more complex programs which require multiple inputs.
run(){
    ./program
}

# Run the program in a child process so you can stop it later.
run &
childPid=$!   # process id of the program you want to interrupt (via `run`)

# Time after which you want to interrupt
sleep 5

# Actual command to send the interrupt
kill -SIGINT $childPid
Let me know if it works on Windows as well.
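For reference, a rough Python equivalent of the same idea on a POSIX system (./program is the placeholder from the shell example; on Windows you would need a new process group and CTRL_C_EVENT instead, as shown in the answer above):

import signal
import subprocess
import time

p = subprocess.Popen(['./program'])   # child process you want to interrupt
time.sleep(5)                         # time after which you want to interrupt
p.send_signal(signal.SIGINT)          # same effect as kill -SIGINT on the child
p.wait()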