How to kill a subprocess in Python on Windows

How would I go about killing a process on Windows?
I am starting the process with
self.p = Process(target=self.GameInitialize, args=(testProcess,))
self.p.start()
I have tried
self.p.kill()
self.p.terminate()
os.kill(self.p.pid, -1)
os.killpg(self.p.pid, signal.SIGTERM) # Send the signal to all the process groups
Errors
Process Object has no Attribute kill
Process Object has no Attribute terminate
Access Denied
I cannot use .join.

On Windows, os.killpg will not work, because it signals a POSIX process group. That is not how you kill a process on Windows; instead you have to use the Win32 API's TerminateProcess to kill a process.
So, you can kill a process by the following on windows:
import os
import signal
os.kill(self.p.pid, signal.CTRL_C_EVENT)
If the above does not work, then try signal.CTRL_BREAK_EVENT instead.
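Note that these console control events only reach processes in the target process group, so the child usually has to be started with CREATE_NEW_PROCESS_GROUP for this to work. A Windows-only sketch (the sleeping child here is just a stand-in for your game process):

```python
import os
import signal
import subprocess
import sys
import time

if sys.platform == "win32":
    # CTRL_BREAK_EVENT is delivered to a process group, so start the child
    # in its own group with CREATE_NEW_PROCESS_GROUP.
    p = subprocess.Popen(
        [sys.executable, "-c", "import time; time.sleep(60)"],
        creationflags=subprocess.CREATE_NEW_PROCESS_GROUP,
    )
    time.sleep(1)
    os.kill(p.pid, signal.CTRL_BREAK_EVENT)
    p.wait()
```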

I had to do it using this method from this link:
subprocess.call(['taskkill', '/F', '/T', '/PID', str(self._active_process.pid)])
This is because self._active_process.kill() was not adequate

You should provide a minimal, working example of the problem you are having. As shown below, this minimal, working example correctly terminates the process (tested on Python 2.7.5, 64-bit), so the error you are seeing lies in code you haven't shown.
import multiprocessing as mp
import time

def work():
    while True:
        print('work process')
        time.sleep(.5)

if __name__ == '__main__':
    p = mp.Process(target=work)
    p.start()
    for i in range(3):
        print('main process')
        time.sleep(1)
    p.terminate()
    for i in range(3):
        print('main process')
        time.sleep(.5)
Output:
main process
work process
work process
main process
work process
work process
main process
work process
work process
main process
main process
main process

os.kill(self.p.pid, -9)
works. On Windows, any signal value other than CTRL_C_EVENT and CTRL_BREAK_EVENT makes os.kill terminate the process unconditionally via the TerminateProcess API, with that value as the exit code. I am unsure why -1 returns an access-denied error but -9 does not.

Related

Terminate Python Process and halt its underlying os.system command line

I am running a script that launches a program via cmd and then, while the program is open, checks the log file of the program for errors. If any, close the program.
I cannot use taskkill command since I don't know the PID of the process and the image is the same as other processes that I don't want to kill.
Here is a code example:
import os, multiprocessing, time

def runprocess():
    os.system('"notepad.exe"')

if __name__ == '__main__':
    process = multiprocessing.Process(target=runprocess, args=[])
    process.start()
    time.sleep(5)
    # Continuously checking if errors in log file here...
    process_has_errors = True  # We suppose an error has been found for our case.
    if process_has_errors:
        process.terminate()
The problem is that I want the notepad window to close. It seems like the terminate() method will simply disconnect the process without closing all of its tasks.
What can I do to make sure to end all pending tasks in a process when terminating it, instead of simply disconnecting the process from those tasks?
You can use taskkill but you have to use the /T (and maybe /F) switch so all child processes of the cmd process are killed too. You get the process id of the cmd task via process.pid.
You could use a system call if you know the name of the process:
import os
...
if process_has_errors:
    processName = "notepad.exe"
    process.terminate()
    os.system(f"TASKKILL /F /IM {processName}")
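Since /IM matches every notepad.exe on the machine, which the question explicitly wants to avoid, here is a Windows-only sketch of the /PID variant suggested earlier, which targets just this process and its tree:

```python
import subprocess
import sys

if sys.platform == "win32":
    p = subprocess.Popen(["notepad.exe"])
    # /T kills the whole process tree, /F forces it, and /PID targets only
    # this process -- unlike /IM, which would match every notepad.exe running.
    subprocess.call(["taskkill", "/F", "/T", "/PID", str(p.pid)])
```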

How to keep sub-process running after main process has exited?

I have a requirement to use Python to start a totally independent process. That means even after the main process has exited, the sub-process can still run.
Just like the shell in Linux:
#./a.out &
then even if the SSH connection is lost, a.out keeps running.
I need a similar but unified way that works across Linux and Windows.
I have tried the multiprocessing module
import multiprocessing
import time

def fun():
    while True:
        print("Hello")
        time.sleep(3)

if __name__ == '__main__':
    p = multiprocessing.Process(name="Fun", target=fun)
    p.daemon = True
    p.start()
    time.sleep(6)
If I set p.daemon = True, the print("Hello") stops after 6 seconds, just after the main process exits.
But if I set p.daemon = False, the main process won't exit on time, and if I press CTRL+C to force-quit the main process, the print("Hello") stops as well.
So, is there any way to keep printing this "Hello" even after the main process has exited?
The multiprocessing module is generally used to split a huge task into multiple sub tasks and run them in parallel to improve performance.
In this case, you would want to use the subprocess module.
You can put your fun function in a separate file (sub.py):
import time
while True:
print("Hello")
time.sleep(3)
Then you can call it from the main file (main.py):
from subprocess import Popen
import time
if __name__ == '__main__':
    Popen(["python", "./sub.py"])
    time.sleep(6)
    print('Parent Exiting')
The subprocess module can do it. If you have a .py file like this:
from subprocess import Popen
p = Popen([r'C:\Program Files\VideoLAN\VLC\vlc.exe'])
The file will end its run pretty quickly and exit, but vlc.exe will stay open.
In your case, because you want to use another function, you could in principle separate that into another .py file
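To make the child survive the parent more reliably on both platforms, a cross-platform sketch (the sleeping child is a hypothetical stand-in for your real script): on Windows, DETACHED_PROCESS detaches the child from the parent's console, and on POSIX, start_new_session=True puts it in its own session so a CTRL+C sent to the parent no longer reaches it.

```python
import subprocess
import sys

# Hypothetical stand-in for your long-running script.
child_code = "import time; time.sleep(30)"

if sys.platform == "win32":
    # Detach from the parent's console and process group, so the child
    # survives the parent (and its console window) going away.
    flags = subprocess.DETACHED_PROCESS | subprocess.CREATE_NEW_PROCESS_GROUP
    p = subprocess.Popen([sys.executable, "-c", child_code],
                         creationflags=flags)
else:
    # start_new_session=True calls setsid() in the child, so a CTRL+C
    # delivered to the parent's process group no longer reaches it.
    p = subprocess.Popen([sys.executable, "-c", child_code],
                         start_new_session=True)

print("detached child pid:", p.pid)
```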

Kill a chain of sub processes on KeyboardInterrupt

I've run into a strange problem while writing a script to start my local JBoss instance.
My code looks something like this:
import subprocess

with open("/var/run/jboss/jboss.pid", "wb") as f:
    process = subprocess.Popen(["/opt/jboss/bin/standalone.sh", "-b=0.0.0.0"])
    f.write(str(process.pid))
try:
    process.wait()
except KeyboardInterrupt:
    process.kill()
Should be fairly simple to understand: write the PID to a file while it's running; once I get a KeyboardInterrupt, kill the child process.
The problem is that JBoss keeps running in the background after I send the kill signal, as it seems that the signal doesn't propagate down to the Java process started by standalone.sh.
I like the idea of using Python to write system management scripts, but there are a lot of weird edge cases like this where if I would have written it in Bash, everything would have just worked™.
How can I kill the entire subprocess tree when I get a KeyboardInterrupt?
You can do this using the psutil library:
import psutil
# ...
proc = psutil.Process(process.pid)
for child in proc.children(recursive=True):
    child.kill()
proc.kill()
As far as I know the subprocess module does not offer any API function to retrieve the children spawned by subprocesses, nor does the os module.
A better way of killing the processes would probably be the following:
proc = psutil.Process(process.pid)
procs = proc.children(recursive=True)
procs.append(proc)
for proc in procs:
    proc.terminate()
gone, alive = psutil.wait_procs(procs, timeout=1)
for p in alive:
    p.kill()
This gives the processes a chance to terminate correctly, and when the timeout expires the remaining processes are killed.
Note that psutil also provides a Popen class that has the same interface as subprocess.Popen, plus all the extra functionality of psutil.Process. You may want to simply use that instead of subprocess.Popen. It is also safer, because psutil checks that PIDs don't get reused when a process terminates, while subprocess doesn't.
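On POSIX systems there is also a stdlib-only alternative, not mentioned above: start the shell script in its own session, so it and everything it spawns share one process group you can signal at once. A sketch (the background sleeps stand in for the Java process standalone.sh launches):

```python
import os
import signal
import subprocess
import sys
import time

if sys.platform != "win32":
    # start_new_session=True runs the child in a new session, so the child
    # and all of its descendants share one process group.
    p = subprocess.Popen(["sh", "-c", "sleep 30 & sleep 30"],
                         start_new_session=True)
    time.sleep(0.2)
    # Signal the whole group: sh and both sleeps receive SIGTERM together.
    os.killpg(os.getpgid(p.pid), signal.SIGTERM)
    p.wait()
```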

python daemon thread is exiting when process is killed

I am starting a python script on one terminal and then from another terminal issuing a kill -9 to kill the process. I am hoping that even when the parent process is killed the thread will continue to execute and touch the file. But that is not happening. Any clue how I can achieve this?
import time, os
import threading

# Create your tests here.
def a(fname):
    print "a"
    time.sleep(20)
    touch(fname)

def touch(fname, times=None):
    with open(fname, 'a'):
        os.utime(fname, times)
    print "touched"

fname = "test.txt"
t = threading.Thread(target=a, args=(fname,))
t.setDaemon(True)
t.start()
t.join()
What you are trying is impossible, unfortunately. Threads can only run as long as their parent process runs. If you want to start executing some code from your script, but have that code continue executing after your script exits, you should move that code into a separate script and use the subprocess module to launch it. Specifically, you can use subprocess.Popen to launch the script without blocking to wait for it to complete:
subprocess.Popen(['./yourscript.py', '-a', 'opt'])

How do I start a subprocess which runs in the background until it is told to terminate gracefully?

From within my python script, I want to start another python script which will run in the background waiting for the instruction to terminate.
Host Python script (H1) starts subprocess P1.
P1 performs some short lived work & returns a sentinel to indicate that it is now going to sleep awaiting instructions to terminate.
H1 polls for this sentinel repeatedly. When it receives the sentinel, it performs some other IO bound task and when that completes, tells P1 to die gracefully (meaning close any resources that you have acquired).
Is this feasible to do with the subprocess module ?
Yes, start the process with:
p=subprocess.Popen([list for the script to execute], stdout=subprocess.PIPE, stdin=subprocess.PIPE, stderr=subprocess.PIPE)
You can then read from p.stdout and p.stderr to watch for your sentinel, and write to p.stdin to send messages to the child process. If you are running on a POSIX system, you might consider using pexpect instead; it doesn't support MS Windows, but it handles communicating with child processes better than subprocess.
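A minimal sketch of that sentinel loop, assuming a made-up "READY" sentinel and "shutdown" message (the inline child stands in for P1; its real work goes where it blocks on stdin):

```python
import subprocess
import sys

# Hypothetical child standing in for P1: it prints a sentinel, then blocks
# on stdin until told to shut down.
child_code = (
    "import sys\n"
    "print('READY', flush=True)\n"
    "line = sys.stdin.readline()\n"
    "print('child got: ' + line.strip(), flush=True)\n"
)

p = subprocess.Popen([sys.executable, "-c", child_code],
                     stdin=subprocess.PIPE, stdout=subprocess.PIPE,
                     text=True)

# H1 polls stdout until the sentinel arrives...
for line in p.stdout:
    if line.strip() == "READY":
        break

# ...performs its other IO-bound task here, then tells P1 to die gracefully.
p.stdin.write("shutdown\n")
p.stdin.close()
remaining = p.stdout.read()
p.wait()
print(remaining.strip())
```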
"""H1"""
from multiprocessing import Process, Pipe
import sys
def P1(conn):
print 'P1: some short lived work'
sys.stdout.flush()
conn.send('work done')
# wait for shutdown command...
conn.recv()
conn.close()
print 'P1: shutting down'
if __name__ == '__main__':
parent_conn, child_conn = Pipe()
p = Process(target=P1, args=(child_conn,))
p.start()
print parent_conn.recv()
print 'H1: some other IO bound task'
parent_conn.send("game over")
p.join()
Output:
P1: some short lived work
work done
H1: some other IO bound task
P1: shutting down
