Python kill parent script (subprocess)

I'm working on two scripts.
One script is perpetually running.
When it senses an update to itself, it runs the second script as a subprocess.
The second script should kill the first script, implement the changes, and run the updated script.
However, I can't find a way to kill the first script. How does the child process kill its parent?

You are doing this backwards; the child process shouldn't be the one killing the parent.
Instead, you want a parent process above your "perpetually running" script (which now becomes the subprocess). When an update is detected, the subprocess kills itself and requests that the parent implement your changes. The parent then restarts the subprocess.
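A minimal sketch of that supervisor pattern (the file name worker.py and the exit-code convention are made up for illustration):

# supervisor.py: a hypothetical parent that keeps the real script running
import subprocess
import sys

UPDATE_EXIT_CODE = 42  # invented convention: worker exits 42 to request a restart

while True:
    # run the worker until it exits; a worker that detects an update to
    # itself writes the new version to disk and exits with the code above
    rc = subprocess.call([sys.executable, "worker.py"])
    if rc != UPDATE_EXIT_CODE:
        break  # worker stopped for some other reason; don't restart it
    # looping around re-launches worker.py, now the updated version

The key design point is that restarting is cheap for the parent: it never imports the worker, it only ever re-executes the file, so the updated code is picked up on the next loop iteration.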

Related

spawn xterm shell from python, pass command to it, keep it running if parent process dies

Python beginner here! Using Python 2.7.
My question: from a Python process, I want to spawn a new xterm process (and display the GUI to the end user) and run a command in it.
NOTE: I know about subprocess.Popen("xterm"), however I could not find a way to send the command to xterm. I tried some examples using pipes, but they seem to be attached to the input/output/error streams.
My end goal is to display an xterm GUI that shows the progress of another command; also, if the parent Python process dies, it should not kill the child xterm GUI.
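The usual approach here is xterm's -e option, which runs the given command inside the new terminal window. A minimal sketch (long_running_command is a placeholder, and os.setpgrp is used so terminal signals aimed at the parent don't reach the xterm):

import os
import subprocess

# xterm -e treats everything after it as the command to run in the window;
# "read line" at the end just keeps the window open once the command finishes
subprocess.Popen(
    ["xterm", "-e", "sh", "-c", "long_running_command; read line"],
    preexec_fn=os.setpgrp,  # new process group: the parent's Ctrl+C won't kill xterm
)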

Cannot find daemon after daemonizing python script

I daemonized a python script using the daemonize python library, but now I cannot find the daemon that it spawned. I want to find the daemon and kill it to make some changes to the script.
I used the following to daemonize:
from daemonize import Daemonize

pidfile = '/tmp/filename.pid'
# 'main' is the function the daemon runs once it has detached
daemon = Daemonize(app='filename', pid=pidfile, action=main)
print("daemon started")
daemon.start()
Open a terminal window and try the following:
ps ax | grep <ScriptThatStartedTheDaemon>.py
It should return the PID and the name of the process. Once you have the PID, do:
kill <pid>
Depending on how many times you've run your script, you may have multiple daemons running, in which case you'd want to kill all of them.
To make sure the process was terminated, run the first line of code again. The process with the PID that you killed shouldn't show up if it was successfully terminated.
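Since the script above passed an explicit pidfile to Daemonize, you can also read the PID straight back from that file instead of grepping; a minimal sketch using the path from the question:

import os
import signal

# /tmp/filename.pid is the pidfile the Daemonize(...) call was given
with open('/tmp/filename.pid') as f:
    pid = int(f.read().strip())
os.kill(pid, signal.SIGTERM)  # ask the daemon to terminate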

How to stop a python script launching many subprocesses from terminal

I have a python script which launches a sequence of subprocesses in a loop:
import subprocess

for c in cmds:
    subprocess.call(c, shell=True)
When the script is running in a terminal, I try to stop it with Ctrl+C or Ctrl+D; however, it continues to launch the next subprocess. How can I terminate the whole python script at once?
What likely happens is that Ctrl+C is intercepted by your sub-process (running on the same terminal) and never gets to your Python script. What you can do is check the return value of subprocess.call to see whether the child process was killed by a signal, and decide whether you want to stop submitting new processes.
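A minimal sketch of that check (assuming POSIX; cmds is the same command list as in the question):

import signal
import subprocess

for c in cmds:
    rc = subprocess.call(c, shell=True)
    # on POSIX, call() returns the negative signal number if the child was
    # killed by a signal; some shells instead exit with 128 + signal number
    if rc in (-signal.SIGINT, 128 + signal.SIGINT):
        print("child was interrupted; stopping the loop")
        break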

Using python, how do I launch an independent python process

I am making a python program, let's say A, which is used to monitor python script B.
When program A shuts down, an exit function that was registered via atexit.register() does some cleanup; as part of that it needs to re-run python script B, which must stay running even after python script A has shut down.
Python script B can't be part of python script A.
What do I need to do to make that happen? I have already tried a few things like subprocess.Popen(programBCommand), but that doesn't seem to work as it prevents A from shutting down.
I am using a Debian operating system.
If script B needs to be launched by script A, and continue running whether or not A completes (and not prevent A from exiting), you're looking at writing a UNIX daemon process. The easiest way to do this is to use the python-daemon module to make script B daemonize itself without a lot of explicit mucking about with the details of changing the working directory, detaching from the parent, etc.
Note: The process of daemonizing, UNIX-style, detaches from the process that launched it, so you couldn't directly monitor script B from script A through the Popen object (it would appear to exit immediately). You'd need to arrange some other form of tracking, e.g. identifying or communicating the pid of the daemonized process to script A by some indirect method.
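A minimal sketch of script B daemonizing itself with python-daemon (monitor_loop stands in for B's real work):

import daemon

def monitor_loop():
    # script B's actual long-running work goes here
    pass

if __name__ == "__main__":
    # DaemonContext detaches from whatever launched B (script A included),
    # so B keeps running after A exits
    with daemon.DaemonContext():
        monitor_loop()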

Popen new process group on linux

I am spawning some processes with Popen (Python 2.7, with shell=True) and then sending SIGINT to them. It appears that the process group leader is actually the Python process, so sending SIGINT to the PID returned by Popen, which is the PID of bash, doesn't do anything.
So, is there a way to make Popen create a new process group? I can see that there is a flag called subprocess.CREATE_NEW_PROCESS_GROUP, but it is only for Windows.
I'm actually upgrading some legacy scripts which were running with Python 2.6, and it seems that for Python 2.6 the default behavior is what I want (i.e. a new process group when I do Popen).
bash does not handle signals while waiting for your foreground child process to complete. This is why sending it SIGINT does not do anything. This behaviour has nothing to do with process groups.
There are a couple of options to let your child process receive your SIGINT:
- When spawning a new process with shell=True, try prepending exec to the front of your command line, so that bash gets replaced with your child process.
- When spawning a new process with shell=True, append & wait %- to the command line. This will cause bash to react to signals while waiting for your child process to complete, but it won't forward the signal to your child process.
- Use shell=False and specify full paths to your child executables.
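A minimal sketch of the first and third options (./long_task is a placeholder executable):

import signal
import subprocess

# Option 1: exec makes bash replace itself with the child, so the PID
# returned by Popen is the child's own and the signal reaches it directly
proc = subprocess.Popen("exec ./long_task --some-flag", shell=True)
proc.send_signal(signal.SIGINT)

# Option 3: no intermediate shell at all
proc = subprocess.Popen(["/full/path/to/long_task", "--some-flag"])
proc.send_signal(signal.SIGINT)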
