How can I avoid Python hanging on subprocess.run()

I am creating a small automation script to interact with a game. The game is very prone to crashing, but unpredictably (anywhere from 30 minutes to 3 hours in, depending on what happens in game). Because of this, I wrote the small script below that was ideally going to kill the game, kill the crash-monitoring client that checks for crashes and offers to restart, and then relaunch the game and resume. The issue is that I never get past subprocess.run(): it launches the game again, but no code after that ever runs.
import psutil
import time
import subprocess

def main():
    '''Process kill function'''
    for proc in psutil.process_iter():
        # check whether the process name matches
        # print(proc.name())
        if any(procstr in proc.name() for procstr in ['GameName.exe']):
            print(f'Killing {proc.name()}')
            proc.kill()
    time.sleep(5)
    for proc in psutil.process_iter():
        # check whether the process name matches
        # print(proc.name())
        if any(procstr in proc.name() for procstr in ['GameCrashMonitor.exe']):
            print(f'Killing {proc.name()}')
            proc.kill()
    time.sleep(10)
    subprocess.run(r'PathToGame.exe')
    print(time.time())

if __name__ == "__main__":
    main()
This program successfully gets to subprocess.run(), launches the game again, and then Python hangs. I can't Ctrl+C in IPython to stop it. I use Spyder, and it even makes the Spyder icon on my taskbar show an error and disappear.

Try replacing subprocess.run()
with variable = subprocess.Popen([r'PathToGame.exe'])
https://docs.python.org/3/library/subprocess.html says that subprocess.run() waits for the command to complete and then returns a CompletedProcess instance. Your command isn't completing. Furthermore, if you still wanted to use .run(), there is a timeout argument you could pass to it, but by default it is None and the call never times out.
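A minimal sketch of both options (PathToGame.exe is the placeholder path from the question; note that run() with a timeout kills the child when the timeout expires and re-raises TimeoutExpired, which is probably not what you want for a relaunch):

import subprocess

# Option 1: fire-and-forget relaunch; Popen returns as soon as the game starts
game = subprocess.Popen([r'PathToGame.exe'])
print('game relaunched, script keeps running')

# Option 2: keep run(), but stop waiting after some delay.
# Caveat: on timeout, run() kills the child before re-raising, so this only
# makes sense if you actually want the process stopped at that point.
try:
    subprocess.run([r'PathToGame.exe'], timeout=30)
except subprocess.TimeoutExpired:
    print('game did not exit within 30 seconds')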

Related

Terminate Python Process and halt its underlying os.system command line

I am running a script that launches a program via cmd and then, while the program is open, checks the program's log file for errors. If any are found, it closes the program.
I cannot use the taskkill command since I don't know the PID of the process, and the image name is the same as other processes that I don't want to kill.
Here is a code example:
import os, multiprocessing, time

def runprocess():
    os.system('"notepad.exe"')

if __name__ == '__main__':
    process = multiprocessing.Process(target=runprocess, args=[])
    process.start()
    time.sleep(5)
    # Continuously checking if errors in log file here...
    process_has_errors = True  # We suppose an error has been found for our case.
    if process_has_errors:
        process.terminate()
The problem is that I want the Notepad window to close. It seems like the terminate() method will simply disconnect the process without closing all of its tasks.
What can I do to make sure all pending tasks in a process are ended when it is terminated, instead of simply disconnecting the process from those tasks?
You can use taskkill, but you have to use the /T (and maybe /F) switch so that all child processes of the cmd process are killed too. You get the process ID of the cmd task via process.pid.
You could use a system call if you know the name of the process:
import os

...

if process_has_errors:
    processName = "notepad.exe"
    process.terminate()
    os.system(f"TASKKILL /F /IM {processName}")
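For the /T route from the first suggestion, a minimal sketch (assuming process is the multiprocessing.Process from the question):

import os

if process_has_errors:
    # /T kills the whole process tree rooted at the given PID and /F forces it,
    # so the cmd started by os.system and the notepad.exe it spawned both go away
    os.system(f"TASKKILL /F /T /PID {process.pid}")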

Python subprocess module wait method

I am learning the subprocess module in Python, and to my understanding, the wait method blocks the thread from executing the rest of the code until the launched process is closed. But when I call the wait method, it still executes the rest of the code:
import subprocess
import time

def startCalc():
    x = subprocess.Popen('C:\\Windows\\System32\\calc.exe')
    time.sleep(5)
    x.wait()
    print('finished waiting')
    print(x.poll())
    print(x.wait())

startCalc()
If I am not wrong, the "finished waiting" statement should not appear in the output until I close the calculator, but it does.
Where am I wrong?
The problem isn't with your code, but rather with the calc.exe executable. It starts the calculator and returns immediately with 0 exit status. So, from the perspective of your program, the process ran to completion successfully. As far as I know, calc.exe doesn't have a way to launch in attached mode.
Test this by opening a PowerShell or cmd terminal and launching calc.exe. You get the prompt back immediately.
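The same check in Python, assuming a standard Windows install where calc.exe is just a launcher stub:

import subprocess
import time

start = time.time()
proc = subprocess.Popen(r'C:\Windows\System32\calc.exe')
code = proc.wait()   # returns almost immediately, even though Calculator stays open
print(f'calc.exe exited with code {code} after {time.time() - start:.2f}s')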
I am not familiar with the .wait function, but if you want your code to wait for the calc.exe process to finish, you could replace Popen with call:
x = subprocess.call('C:\\Windows\\System32\\calc.exe')

How to keep sub-process running after main process has exited?

I have a requirement to use Python to start a totally independent process. That means that even after the main process has exited, the sub-process can keep running.
Just like in a Linux shell:
#./a.out &
then even if the ssh connection is lost, a.out can still keep running.
I need a similar but unified way across Linux and Windows.
I have tried the multiprocessing module:
import multiprocessing
import time

def fun():
    while True:
        print("Hello")
        time.sleep(3)

if __name__ == '__main__':
    p = multiprocessing.Process(name="Fun", target=fun)
    p.daemon = True
    p.start()
    time.sleep(6)
If I set p.daemon = True, then the print("Hello") stops after 6 seconds, just after the main process exits.
But if I set p.daemon = False, the main process won't exit on time, and if I Ctrl+C to force-quit the main process, the print("Hello") also stops.
So, is there any way to keep printing this "Hello" even after the main process has exited?
The multiprocessing module is generally used to split a huge task into multiple sub-tasks and run them in parallel to improve performance.
In this case, you would want to use the subprocess module.
You can put your fun function in a separate file (sub.py):
import time

while True:
    print("Hello")
    time.sleep(3)
Then you can call it from the main file (main.py):
from subprocess import Popen
import time

if __name__ == '__main__':
    Popen(["python", "./sub.py"])
    time.sleep(6)
    print('Parent Exiting')
The subprocess module can do it. If you have a .py file like this:
from subprocess import Popen
p = Popen([r'C:\Program Files\VideoLAN\VLC\vlc.exe'])
The file will end its run pretty quickly and exit, but vlc.exe will stay open.
In your case, because you want to run another function, you could in principle separate that function into its own .py file.
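If the child also has to survive the terminal or SSH session closing, one option is to detach it explicitly. A sketch using the standard subprocess constants (note that with DETACHED_PROCESS the child has no console, so redirect its output somewhere if you need it):

import subprocess
import sys

kwargs = {}
if sys.platform == "win32":
    # detach from the parent's console and process group on Windows
    kwargs["creationflags"] = (subprocess.DETACHED_PROCESS
                               | subprocess.CREATE_NEW_PROCESS_GROUP)
else:
    # start the child in its own session on Linux/macOS
    kwargs["start_new_session"] = True

subprocess.Popen([sys.executable, "./sub.py"],
                 stdout=subprocess.DEVNULL,
                 stderr=subprocess.DEVNULL,
                 **kwargs)
print('Parent exiting, sub.py keeps running')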

How to skip error message pop-up with subprocess in Python

I am using subprocess in Python to call an external program on Windows. I control the processes with a ThreadPool so that I can limit them to at most 6 at the same time, and a new process starts as soon as one finishes.
Code as below:
### some codes above
### Code of Subprocess Part
from multiprocessing.pool import ThreadPool as Pool

def FAST_worker(file):
    p = subprocess.Popen([r'E:/pyworkspace/FAST/FAST_RV_W64.exe', file],
                         cwd = r'E:/pyworkspace/FAST/',
                         shell = True)
    p.wait()

# List of *.in filenames
FAST_in_pathname_li = [
    '334.in',
    '893.in',
    '9527.in',
    ...
    '114514.in',
    '1919810.in',
]

# Limit max 6 processes at same time
with Pool(processes = 6) as pool:
    for result in pool.imap_unordered(FAST_worker, FAST_in_pathname_li):
        pass

### some codes below
### some codes below
I ran into a problem when the external program unexpectedly terminated and showed an error message pop-up. Although the other 5 processes kept going, the whole run eventually got stuck at the subprocess part and couldn't go forward anymore (unless I came to my desk and manually clicked "Shut down the program").
What I want to know is how I can avoid the pop-up and keep the whole script running, bypassing the error message or something rather than clicking manually, in order to avoid wasting time.
Since we don't know enough about the program FAST_worker is calling, I'll assume you already checked there isn't any "kill on error" or "quiet" mode that would be more convenient to use in a script.
My two cents: maybe you can set up a timeout on the worker execution, so that a stuck process is killed automatically after a certain delay.
Building on the snippet provided here, here is a draft:
from threading import Timer

def FAST_worker(file, timeout_sec):
    def kill_proc():
        """called by the Timer thread upon expiration"""
        p.kill()
        # maybe add the task to a list of failed tasks, for traceability

    p = subprocess.Popen([r'E:/pyworkspace/FAST/FAST_RV_W64.exe', file],
                         cwd = r'E:/pyworkspace/FAST/',
                         shell = True)
    # set up a timer to kill the process after a timeout
    timer = Timer(timeout_sec, kill_proc)
    try:
        timer.start()
        p.wait()  # wait() returns the exit code; it does not return (stdout, stderr)
    finally:
        timer.cancel()
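Alternatively, on Python 3.3+ you can drop the Timer and use the timeout parameter of Popen.wait() directly; a sketch (the 600-second limit is just a placeholder, and shell=True is dropped because kill() would only terminate the intermediate cmd shell, not FAST itself):

import subprocess

def FAST_worker(file, timeout_sec=600):
    p = subprocess.Popen([r'E:/pyworkspace/FAST/FAST_RV_W64.exe', file],
                         cwd=r'E:/pyworkspace/FAST/')
    try:
        p.wait(timeout=timeout_sec)
    except subprocess.TimeoutExpired:
        p.kill()   # kill the stuck/crashed instance
        p.wait()   # reap it so no zombie process is left behind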
Note that there are also GUI automation libraries in Python that can do the clicking for you, but that is likely to be more tedious to program:
tutorial for pyAutoGui
SO question on the subject

Starting VLC stops the rest of a python script

I am experimenting with different Python scripts to run videos and I am encountering a strange error: any time I start VLC, the rest of the script stops executing.
What am I doing wrong?
import time
import subprocess
subprocess.call(["vlc", "myVideo.mp4", "-f", "-L", "--no-audio", "&"])
print("I never print")
time.sleep(5)
subprocess.call(["killall", "-9", "vlc"])
print("I never print either!")
The list that you pass to the subprocess functions is interpreted as argv for the program. Since shell=False by default, no interpretation is done on the arguments. Specifically, & is passed as the last argument to vlc, and does not start a background process.
To start a background process, call Popen directly. call will always wait for the subprocess to complete, so it's not what you want.
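A sketch of the script rewritten with Popen (same arguments as in the question, minus the stray &):

import time
import subprocess

# Popen returns as soon as VLC has started, so the script keeps going
player = subprocess.Popen(["vlc", "myVideo.mp4", "-f", "-L", "--no-audio"])
print("I print right away")
time.sleep(5)
player.terminate()   # or player.kill(), like the killall -9 in the original
print("I print too")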
