Using daemontools with a Python script that spawns subprocesses - python

I am trying to set up daemontools with a large Python program that spawns various subprocesses, and I'm having trouble getting those subprocesses to spawn correctly: each one appears as a zombified process when the program is launched via daemontools.
I have provided a simplified example to demonstrate this.
/service/test/run:
#!/bin/sh
cd /script_directory/
exec envdir /service/test/env /usr/bin/python3 test_subprocess.py
/script_directory/test_subprocess.py:
import subprocess
from time import sleep
subprocess.Popen("xterm")
while True:
    sleep(1)
test_subprocess.py simply launches a GUI terminal and stays alive, so I can see if it is still running in top/htop.
If I run the script either as root or a non-root user, the script properly executes and the window is displayed. When run via daemontools/supervise, the xterm is zombified and no window is shown.
Setting the env/DISPLAY and env/XAUTHORITY variables as described here doesn't seem to work for me.
On further investigation, the subprocess is zombified even if it does not use the GUI. For example, if the command launched from test_subprocess.py is "top", it still will not run.
I've used daemontools successfully on various other projects that don't spawn subprocesses so I don't think the issue is with the basic setup here.
Can daemontools be used with scripts that spawn other processes?
If not, what are some other recommended tools for daemonising complex Python applications?

I can't quite understand what you are trying to do, but try this program:
import subprocess
p = subprocess.Popen(
    ['xterm', '-hold'], stdin=subprocess.PIPE)
p.communicate()
If you want to pass a command to xterm, use the -e option followed by the command. If there is another problem, please let me know. Thanks.
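A zombie usually just means the child has already exited and nothing has called wait() on it. A minimal diagnostic sketch (my own addition, assuming Python 3 and that the daemontools service logs stdout, e.g. via multilog) that reaps the child and shows why it died:

import subprocess

# Launch xterm and capture its stderr so the service log shows why it
# exits (under daemontools, DISPLAY/XAUTHORITY are often missing).
p = subprocess.Popen(["xterm"], stderr=subprocess.PIPE)
_, err = p.communicate()  # communicate() also reaps the child: no zombie
print("xterm exited with code", p.returncode)
print("stderr:", err.decode(errors="replace"))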

Related

Run python script from python script BUT outside of python script

It sounds like a riddle or a joke, but I actually haven't found an answer to this problem.
What actually is the problem?
I want to run two scripts. In the first script I call the other script, but I want them to continue in parallel, not as two threads within one process. Mainly, I don't want the second script to be running inside the first Python script (meaning that if I run Chrome from a Python script and then shut down the Python script, Chrome is shut down too).
What I want is like on a Linux machine: I open two terminals and run one script in each. They are not two threads; they are independent of each other, and shutting one down will NOT shut down the other. Or it can be like on a Linux machine where I can run two Python scripts in the background from a terminal with 'python xxx.py &'.
Summary:
I would like to run the 'SECOND.py' script from inside the 'FIRST.py' script, but not with the threading module, and with SECOND.py independent of FIRST.py, so that shutting down FIRST.py has no consequence for SECOND.py.
THE SOLUTION SHOULD BE WORKING ON WINDOWS, LINUX AND MAC.
BTW:
I tried on Windows:
subprocess.call(['python','second.py','&'])
subprocess.call(['python','second.py'])
os.system('python second.py') # I was desperate
They run serially, so the first.py script is blocked until second.py finishes.
I haven't tried threading with daemon=False, but it feels like a kind of demon, and I don't feel my skills are good enough to control threads living outside of my playground :)
Thanks in advance for your help.
You can use the Popen constructor from the subprocess module to launch background processes:
import subprocess
p = subprocess.Popen(["python","second.py"])
This creates a background process, and the execution of first.py is not blocked.
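One caveat (my own addition, not part of the answer above): a child started this way still shares the parent's console and process group, so closing the terminal or killing the group can take it down as well. A cross-platform sketch for a fully independent child, assuming Python 3.7+ (where subprocess.DETACHED_PROCESS is available):

import subprocess
import sys

def spawn_detached(args):
    if sys.platform != "win32":
        # POSIX: a new session detaches the child from the parent's terminal.
        return subprocess.Popen(args, start_new_session=True)
    # Windows: detach the child from the parent's console and process group.
    return subprocess.Popen(
        args,
        creationflags=subprocess.DETACHED_PROCESS
        | subprocess.CREATE_NEW_PROCESS_GROUP,
    )

spawn_detached([sys.executable, "second.py"])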

How to spawn detached background process on Linux in either bash or python

I have a long running python script on Linux, and in some situations it needs to execute a command to stop and restart itself. So, I would like to have an external script (either in bash or python) that executes command to restart the original script. Let me elaborate.
Suppose I have original_script.py. In original_script.py I have this in an infinite loop:
if some_error_condition:
    # somehow call external script external.sh or external.py
Let's suppose I can call external.sh and it contains this:
#!/bin/bash
command_to_restart_original_script
Finally, I know the command "command_to_restart_original_script"; that isn't the problem. What I need is the Python command to "somehow call external script external.sh". I need the external script (which is a child process) to keep running while the parent process original_script.py is restarting, i.e. I need the child process to be detached/daemonized. How do I do this?
I found lots of suggestions in various places, but the only answer that worked for me was this:
How to launch and run external script in background?
import subprocess
subprocess.Popen(["nohup", "python", "test.py"])
In my case I ran a script called longrun.sh so the actual command is:
import subprocess
subprocess.Popen(["nohup", "/bin/bash", "longrun.sh"])
I tested this using this run.py:
import subprocess
subprocess.Popen(["nohup", "/bin/bash", "longrun.sh"])
print "Done!"
and I verified (using ps -ax | grep longrun) that longrun.sh does indeed run in the background long after run.py exits.
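A hedged refinement of the same idea (my addition, assuming Python 3.3+ for subprocess.DEVNULL): detach the child's stdio and put it in its own process group, so it can neither block on the parent's pipes nor be hit by signals aimed at the parent.

import os
import subprocess

subprocess.Popen(
    ["nohup", "/bin/bash", "longrun.sh"],
    stdin=subprocess.DEVNULL,
    stdout=subprocess.DEVNULL,   # nohup would otherwise write nohup.out
    stderr=subprocess.DEVNULL,
    preexec_fn=os.setpgrp,       # POSIX only: new process group
)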

Toggle process with Python-script (kill when running, start when not running)

I'm currently running an OpenELEC (XBMC) installation on a Raspberry Pi and installed a tool named "Hyperion" which takes care of the connected Ambilight. I'm a total noob when it comes to Python programming, so here's my question:
How can I run a script that checks if a process with a specific string in its name is running and:
kill the process when it's running
start the process when it's not running
The goal of this is to have one script that toggles the Ambilight. Any idea how to achieve this?
You may want to have a look at the subprocess module, which can run shell commands from Python; see this answer for an example. You can then get the stdout of the shell command into a variable. I suspect you are going to need the pidof shell command.
The basic idea would be along the lines of:
import subprocess
try:
    output = subprocess.check_output(["pidof", "-s", "-x", "hyperiond"])
except subprocess.CalledProcessError:
    # spawn the process using a shell command with subprocess.Popen
    subprocess.Popen("hyperiond")
else:
    # kill the process using a shell command with subprocess.call
    subprocess.call("kill %s" % output.decode().strip(), shell=True)
I've tested this code in Ubuntu with bash as the process and it works as expected. In your comments you note that you are getting file not found errors. You can try putting the complete path to pidof in your check_output call. This can be found using which pidof from the terminal. The code for my system would then become
subprocess.check_output(["/bin/pidof", "-s", "-x", "hyperiond"])
Your path may differ. On Windows, adding shell=True to the check_output arguments fixes this issue, but I don't think that is relevant for Linux.
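If hard-coding the path feels brittle, one alternative (my own suggestion, requiring Python 3.3+) is to resolve it at runtime:

import shutil

# Find the full path to pidof, falling back to a common location.
pidof_path = shutil.which("pidof") or "/bin/pidof"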
Thanks so much for your help @will-hart, I finally got it working. I needed to change some details because the script kept saying that "output" is not defined. Here's what it looks like now:
#!/usr/bin/env python
import subprocess
from subprocess import call
try:
    subprocess.check_output(["pidof", "hyperiond"])
except subprocess.CalledProcessError:
    subprocess.Popen(["/storage/hyperion/bin/hyperiond.sh", "/storage/.config/hyperion.config.json"])
else:
    subprocess.call(["killall", "hyperiond"])

"script command" and logging in python

I have a Python app that produces lots of output on the screen, which can be used for debugging. Of all the logging techniques, the "script" command works best for me because I can see the output on the screen as well as log it. I want to invoke it at the beginning of my Python app so it runs automatically and logs everything. When I do, however, the Python program doesn't run; as soon as I type exit at the terminal (which stops script logging), the app starts working. The command I'm using is:
command="script /tmp/appdebug/debug.txt"
os.system(command)
I have also tried script -q, but the same issue is there. I would appreciate any help.
Cheers
Well, I did find the answer for anyone who is interested:
https://stackoverflow.com/questions/15507602/logging-all-bash-in-and-out-with-script-command
and
Bash script: Using "script" command from a bash script for logging a session
I will keep this question as others might have the same issue and finding those answers wasn't exactly easy :)
Cheers
Try to use subprocess, like so:
from subprocess import Popen, PIPE
p = Popen(['script', '/tmp/appdebug/debug.txt'], stderr=PIPE, stdout=PIPE)
stdout, stderr = p.communicate()
script is a wrapper for a session of interactions. Even if it appears to terminate quickly after being started in a shell, this is not so; instead it starts a new shell in which you can interact so that everything is logged to a file.
What does this mean for you?
Your approach of using script cannot work. You start script using os.system, which waits for script to terminate before the next Python statement is executed, so all of script's work happens before your program continues (i.e. during the uninteresting waiting period of your Python program).
I propose using script -c yourprog.py yourprog.log instead. This will execute yourprog.py wrapped by script, and the whole session will be stored in yourprog.log.
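If wrapping the program with script is not an option, a tee-style sketch in pure Python gives a similar effect from inside the app (my own illustration, not part of the original answer; the log path is taken from the question). Note it only captures writes that go through sys.stdout, which is one reason the script command is attractive in the first place.

import sys

class Tee:
    """Write everything both to the terminal and to a log file."""
    def __init__(self, path):
        self.log = open(path, "a")
        self.terminal = sys.stdout

    def write(self, data):
        self.terminal.write(data)
        self.log.write(data)

    def flush(self):
        self.terminal.flush()
        self.log.flush()

sys.stdout = Tee("/tmp/appdebug/debug.txt")
print("visible on screen and captured in the log")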

Running a commandline application from GUI silently

I would like to run the specific commandline application:
ffmpeg -i video.mp4 audio.mp3
I'm running the command from a GUI application, and because the GUI itself has no console window, the ffmpeg process opens in a new cmd window.
Testers find the "black window that appears" scary and not user-friendly.
How can I run the application without any visible window coming up? os.system(), subprocess.Popen() and subprocess.call() all launch the cmd window.
If it matters, I'm using pyqt4 and py2exe. I'm targeting Windows OS users.
This recipe at ActiveState may solve your problem:
http://code.activestate.com/recipes/409002/
Slight changes are required for Python 2.7. See How do I eliminate Windows consoles from spawned processes in Python (2.7)?
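For reference, the core of that recipe is roughly the following (a sketch assuming Windows, where STARTUPINFO and STARTF_USESHOWWINDOW are exposed by the standard subprocess module):

import subprocess

# Ask Windows not to show a console window for the child process.
startupinfo = subprocess.STARTUPINFO()
startupinfo.dwFlags |= subprocess.STARTF_USESHOWWINDOW

p = subprocess.Popen(
    ["ffmpeg", "-i", "video.mp4", "audio.mp3"],
    stdout=subprocess.PIPE,
    stderr=subprocess.PIPE,  # ffmpeg reports progress on stderr
    startupinfo=startupinfo,
)
out, err = p.communicate()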
Launch ffmpeg from the START command. If you use the /B switch, no command window will be shown.
Use subprocess.Popen (or call) and redirect stdout/stderr somewhere. They're currently hooked to your own process's stdout and stderr, which is why they're coming through.
If you need something that can integrate nicely with your GUI event loop, use Twisted's process-launching stuff.
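For completeness, a minimal Twisted sketch of that suggestion (untested here, and hooking the reactor into a PyQt application needs extra integration work):

import os

from twisted.internet import protocol, reactor

class FFmpegProtocol(protocol.ProcessProtocol):
    def errReceived(self, data):
        pass  # ffmpeg writes its progress to stderr; discard it silently

    def processEnded(self, reason):
        reactor.stop()

reactor.spawnProcess(
    FFmpegProtocol(), "ffmpeg",
    ["ffmpeg", "-i", "video.mp4", "audio.mp3"],
    env=os.environ,  # spawnProcess defaults to an empty environment
)
reactor.run()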
