I want to run a shell script from Python.
The shell script runs a server, which needs Ctrl+C to stop it.
How can I do that? Is there a way to run this kind of script from Python?
Just send the SIGINT signal to your app:
import signal
import subprocess

# "./run_server.sh" is a placeholder for your server script.
proc = subprocess.Popen(["./run_server.sh"], stdin=subprocess.DEVNULL,
                        stdout=subprocess.PIPE, stderr=subprocess.PIPE)
# When it needs to be stopped
proc.send_signal(signal.SIGINT)
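If the server shuts down cleanly on SIGINT, a follow-up proc.wait() collects its exit status and avoids leaving a zombie process behind.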
I want to call a shell script from Python code. The shell script I am trying to call makes multiple database (DB2) calls; that is, it connects to the DB2 database multiple times and executes different SQL statements. I tried using the subprocess.call method, like subprocess.call(['./<shell script with full path>']), but it seems the script is terminated before it can connect to the database and execute the commands within it. When I call the shell script as a standalone script from the command line, it works fine.
Is there any other way this can be handled?
subprocess: The subprocess module allows you to spawn new processes, connect to their input/output/error pipes, and obtain their return codes.
http://docs.python.org/library/subprocess.html
Usage:
import subprocess

# command is a placeholder for the shell command you want to run.
process = subprocess.Popen(command, shell=True)
process.wait()
print(process.returncode)
Side note: it is best practice to avoid using shell=True, as it is a security hazard. See: Actual meaning of 'shell=True' in subprocess.
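To illustrate the safer form, here is a minimal sketch that passes the program and its arguments as a list instead of a shell string (the script path is a placeholder):
import subprocess

# No shell is involved: the list form passes arguments directly to the
# program, so nothing in them is interpreted by a shell.
process = subprocess.Popen(["/path/to/script.sh", "arg1"])
process.wait()
print(process.returncode)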
I have a small Flask API that receives requests from a remote server. Whenever a request is received, a subprocess is started. This subprocess simply executes a second Python file in the same folder. The subprocess can run for several hours, and several of these subprocesses can run simultaneously. I redirect stdout to write the output of the Python file into a text file.
All of this works fine, but every couple of weeks the Flask API becomes unresponsive and needs to be restarted. As soon as I stop the Flask server, all running subprocesses stop. I would like to avoid this and run each subprocess independently of the Flask API.
Here is a small example that illustrates what I am doing (this code is basically inside a method that can be called through the API):
import subprocess
f = open("log.txt","wb")
subprocess.Popen(["python","job.py"],cwd = "./", stdout = f, stderr = f)
I would like the subprocess to keep running after I stop the Flask API, which is currently not the case. Somewhere else I read that the reason is my use of the stdout and stderr parameters, but even after removing those the behavior stays the same.
Any help would be appreciated.
Your sub-processes stop because their parent process dies when you restart your Flask server. You need to completely separate your sub-processes from your Flask process by running your Python call in a new shell:
from subprocess import call
# On Linux:
command = 'gnome-terminal -x bash -l -c "python job.py"'
# On Windows:
# command = 'cmd /c "python job.py"'
call(command, shell=True)
This way your Python call of job.py will run in a separate terminal window, unaffected by your Flask server process.
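If opening a terminal window is not desirable (for example on a headless server), a shell-free alternative is the following sketch, assuming Python 3 on a POSIX system: start_new_session=True runs the child in its own session, so stopping the Flask process no longer takes it down.
import subprocess

f = open("log.txt", "wb")
# start_new_session=True calls setsid() in the child, detaching it from
# the Flask server's process group so it survives a server restart.
subprocess.Popen(["python", "job.py"], stdout=f, stderr=f,
                 start_new_session=True)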
Use fork() to create a child process of the process in which you are calling this function. On success, fork() returns zero in the child process.
Below is a basic example of fork, which you can easily incorporate in your code.
import os

pid = os.fork()
if pid == 0:  # we are in the child process
    os.system("nohup python ./job.py &")
    os._exit(0)  # leave the child so it does not keep running the parent's code
Hope this helps!
I have a Python script that launches a sequence of subprocesses in a loop:
for c in cmds:
    subprocess.call(c, shell=True)
When the script is running in a terminal and I try to stop it with Ctrl+C or Ctrl+D, it just continues to launch the next subprocess. How can I terminate the whole Python script at once?
What likely happens is that Ctrl+C is intercepted by your sub-process (running on the same terminal) and never reaches your Python script. What you can do is check the return value of subprocess.call to see whether the child process was killed by a signal, and decide whether you want to stop launching new processes.
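A minimal sketch of that check, reusing the loop from the question: subprocess.call returns the negative signal number when the child itself was killed by a signal, while a shell typically reports a signal death as 128 plus the signal number (130 for SIGINT).
import signal
import subprocess

for c in cmds:
    ret = subprocess.call(c, shell=True)
    # Negative: the child was killed by that signal directly.
    # 128 + signum: the shell reporting that its command died from a signal.
    if ret == -signal.SIGINT or ret == 128 + signal.SIGINT:
        print("Child interrupted; stopping the loop.")
        break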
I'm trying to run a script that launches PuTTY and, within the PuTTY terminal that gets created, runs a command. I've been able to start PuTTY from my script using Python's subprocess module with check_call or Popen. However, I'm confused as to how I can run a command within the PuTTY terminal from my script. I need to be able to run this command in PuTTY and analyze its output. Thanks for any help.
You need to set the stdin argument to PIPE and use Popen's communicate method to send data to stdin:
from subprocess import Popen, PIPE

# text=True lets communicate() accept and return strings instead of bytes.
p = Popen('/the/command', stdin=PIPE, stdout=PIPE, stderr=PIPE, text=True)
std_out, std_err = p.communicate('command to putty\n')
That being said, it may be easier to use a Python library that implements the SSH protocol (like paramiko) rather than going through PuTTY.
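For comparison, a minimal paramiko sketch; the host, credentials, and command below are placeholders:
import paramiko

client = paramiko.SSHClient()
# Accept unknown host keys for this sketch; verify them properly in real code.
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect("host.example.com", username="user", password="secret")
stdin, stdout, stderr = client.exec_command("uname -a")
print(stdout.read().decode())
client.close()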
I am trying to use os.system (soon to be replaced with subprocess) to call a shell script (which runs a process as a daemon):
os.system('/path/to/shell_script.sh')
The shell script looks like:
nohup /path/to/program &
If I execute this shell script in my local environment, I have to hit Enter before being returned to the console, since the shell script starts a process as a daemon. If I run the above command in an interactive Python session, I also have to hit Enter before being returned to the console.
However, if I do this in a Python program, it just hangs forever.
How can I get the Python program to resume execution after calling a shell script that runs a process as a daemon?
From here -
Within a script, running a command in the background with an ampersand (&) may cause the script to hang until ENTER is hit. This seems to occur with commands that write to stdout.
You should try redirecting the output to a file, or to /dev/null if you do not need it:
nohup /path/to/program > /dev/null &
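From the Python side, the same idea looks like the sketch below, assuming the script path from the question: discard the child's output so it never blocks on a terminal that nothing is reading.
import subprocess

# Discard output so the daemonized child cannot hang on stdout.
subprocess.Popen(["/path/to/shell_script.sh"],
                 stdout=subprocess.DEVNULL,
                 stderr=subprocess.DEVNULL)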
Why don't you try using a separate thread?
Wrap your process in something like:
from threading import Thread

def run(my_arg):
    my_process(my_arg)

thread = Thread(target=run, args=(my_arg,))
thread.start()
Check out join and Lock for more control over thread execution.
https://docs.python.org/2/library/threading.html
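As a usage note, thread.join() blocks the caller until run() returns, and setting thread.daemon = True before start() means the thread will not keep the program alive on exit.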