I am trying to compile a program, execute it, and append its output to a text file. Instead of typing the same commands every time, I used a Python script to compile and execute everything in the background.
import subprocess
subprocess.call(["ifort","-openmp","mod1.f90","mod2.f90","pgm.f90","-o","op.o"])
subprocess.call(["nohup","./op.o",">","myout.txt","&"])
The program pgm.f90 compiles fine with the ifort compiler, but the output is not appended to myout.txt. Instead the output goes to nohup.out, and the program does not run in the background even though "&" is specified in the Python script.
What obvious error have I made here?
Thanks in advance
You can call a subprocess as if you were in the shell by using Popen() with the argument shell=True:
subprocess.Popen("nohup ./op.o > myout.txt &", shell=True)
The issue is that when you supply arguments as a list of elements, the subprocess library bypasses the shell and uses the exec syscall to directly run your program (in your case, "nohup"). Thus, rather than the ">" and "&" operators being interpreted by the shell to redirect your output and run in the background, they are passed as literal arguments to the nohup command.
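A quick way to see the difference (a sketch using echo as a stand-in for your op.o, on a POSIX system; not part of the original question):

import subprocess

# List form: ">" and "&" are handed to echo as literal arguments,
# so echo just prints them; no redirection happens and no file is created.
subprocess.call(["echo", "hello", ">", "out.txt"])   # prints: hello > out.txt

# Shell form: the shell parses the string, so the redirection is honoured
# and out.txt ends up containing "hello".
subprocess.call("echo hello > out.txt", shell=True)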
You can tell subprocess to execute your command via the shell, but this starts a whole extra instance of the shell and can be wasteful. As a workaround, use the built-in redirection functionality in subprocess instead of the shell primitives:
p = subprocess.Popen(['nohup', './op.o'],
                     stdout=open('myout.txt', 'w'))
# process is now running in the background.
# if you want to wait for it to finish, use:
p.wait()
# or investigate p.poll() if you want to check to see if
# your process is still running.
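Since the original question wanted to append to myout.txt rather than overwrite it, one small variation (my addition, not part of the answer above) is to open the file in append mode:

import subprocess

# 'a' appends to myout.txt instead of truncating it on every run;
# stderr=subprocess.STDOUT also folds error output into the same file.
p = subprocess.Popen(['nohup', './op.o'],
                     stdout=open('myout.txt', 'a'),
                     stderr=subprocess.STDOUT)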
For more information: http://docs.python.org/2/library/subprocess.html
I am running Spyder on Windows 10, and I attempt to run a command similar to the following:
cmd = 'python /path/to/program.py arg1 arg2'
subprocess.run(cmd,shell=True)
The script runs as expected, but I would like to see what the executed command prints to screen in the Spyder IPython console. I know from other methods (running the program from a shell) that the program does print things as expected, so there is no error in the script I am running.
How do I go about enabling printing for the subprocess?
The output comes on a stream called stdout. To capture it, you need to redirect it to a pipe that is then read in the calling process. subprocess.run(...) has built-in support for handling this:
import subprocess
cmd = 'python /path/to/program.py arg1 arg2'.split()
proc = subprocess.run(cmd, stdout=subprocess.PIPE, universal_newlines=True)
print(proc.stdout)
As can be seen, the output is caught in the CompletedProcess object (proc) and then accessed as member data. Also, to get the output as text (a string) rather than bytes, I have passed the parameter universal_newlines=True.
A caveat, though, is that subprocess.run(...) runs to completion before it returns control. Therefore, this does not allow for capturing the output "live" but rather only after the whole process has finished. If you want live capture, you must instead use subprocess.Popen(...) and then use .communicate() or some other means of communication to catch the output from the subprocess.
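For live capture, a minimal sketch (my sketch, assuming the child flushes its output line by line) looks roughly like this:

import subprocess

cmd = 'python /path/to/program.py arg1 arg2'.split()
proc = subprocess.Popen(cmd, stdout=subprocess.PIPE, universal_newlines=True)

# Echo each line as soon as the child produces it instead of waiting
# for the whole process to finish.
for line in proc.stdout:
    print(line, end='')

proc.wait()  # reap the child once its output stream is exhausted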
Another comment I'd like to make is that using shell=True is not recommended, especially when handling unknown or untrusted input. It leaves the interpretation of cmd to the shell, which can lead to all kinds of security problems and bad behavior. Instead, split cmd into a list (as I have done), pass that list to subprocess.run(...), and leave out shell=True.
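If the command only exists as a single string, shlex.split() (my suggestion, not mentioned in the answer above) splits it the way a shell would, which handles quoted arguments better than a plain str.split():

import shlex
import subprocess

# The quoted path is kept together as a single argument by shlex.split().
cmd = 'python "/path/to/my program.py" arg1 arg2'
proc = subprocess.run(shlex.split(cmd), stdout=subprocess.PIPE, universal_newlines=True)
print(proc.stdout)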
I have a python script that launches subprocesses using subprocess.Popen. The subprocess then launches an external command (in my case, it plays an mp3). The python script needs to be able to interrupt the subprocesses, so I used the method described here which gives the subprocess its own session ID. Unfortunately, when I close the python script now, the subprocess will continue to run.
How can I make sure a subprocess launched from a script, but given a different session ID still closes when the python script stops?
Have a look at Is there any way to kill a Thread in Python?
and make sure you run it as a thread:
import threading
from subprocess import call
def thread_second():
    call(["python", "secondscript.py"])

processThread = threading.Thread(target=thread_second)
processThread.start()
print('the file is run in the background')
TL;DR Change the Popen params: split up the Popen cmd (e.g. "ls -l" -> ["ls", "-l"]) and use shell=False
~~~
The best solution I've seen so far was simply not to pass shell=True to Popen. This worked because I didn't really need shell=True; I was only using it because Popen wouldn't recognize my cmd string and I was too lazy to split it into a list of args. Using the shell caused me a lot of other problems (e.g. .terminate() becomes much more complicated when a shell is involved, since the process needs its own session id; see here).
Simply splitting the cmd from a string into a list of args lets me use Popen.terminate() without having to give the process its own session id. Without a separate session id, the process is closed when the Python script is stopped.
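A rough sketch of that approach (mpg123 and song.mp3 are just assumed stand-ins for whatever player command was actually used):

import atexit
import subprocess

# A list of args and no shell=True, so terminate() reaches the player directly.
player = subprocess.Popen(['mpg123', 'song.mp3'])

# One way (my addition) to make sure the player stops when this script exits.
atexit.register(player.terminate)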
I am trying to execute a shell script (not a shell command) from Python:
main.py
-------
from subprocess import Popen
Process=Popen(['./childdir/execute.sh',str(var1),str(var2)],shell=True)
execute.sh
----------
echo $1   # does not print anything
echo $2   # does not print anything
var1 and var2 are strings that I am passing as input to the shell script. Am I missing something, or is there another way to do it?
Referred: How to use subprocess popen Python
The problem is with shell=True. Either remove that argument, or pass all arguments as a string, as follows:
Process = Popen('./childdir/execute.sh %s %s' % (str(var1), str(var2)), shell=True)
With shell=True, only what you provide in the first argument of Popen reaches the process, because the shell does the interpretation of arguments itself.
See a similar question answered here. What actually happens is your shell script gets no arguments, so $1 and $2 are empty.
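For the first option (dropping shell=True entirely), the list form passes the arguments straight through to the script, so with var1 and var2 as in the question:

from subprocess import Popen

# Each list element after the script name becomes a positional argument,
# so execute.sh sees them as $1 and $2.
process = Popen(['./childdir/execute.sh', str(var1), str(var2)])
process.wait()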
Popen will inherit stdout and stderr from the Python script, so usually there is no need to provide the stdout= and stderr= arguments to Popen (unless you run the script with output redirection, such as >). You should do this only if you need to read the output inside the Python script and manipulate it somehow.
If all you need is to get the output (and don't mind running synchronously), I'd recommend trying check_output, as it is easier to get output than Popen:
output = subprocess.check_output(['./childdir/execute.sh',str(var1),str(var2)])
print(output)
Notice that check_output and check_call have the same rules for the shell= argument as Popen.
You actually are sending the arguments; if your shell script wrote to a file instead of printing, you would see that. You need to call communicate() to see the printed output from the script:
from subprocess import Popen, PIPE
Process = Popen(['./childdir/execute.sh', str(var1), str(var2)], shell=True, stdout=PIPE, stderr=PIPE)
print(Process.communicate())  # now you should see your output
If you want to send arguments to a shell script from a Python script in a simple way, you can use Python's os module:
import os
os.system('/path/shellscriptfile.sh {} {}'.format(str(var1), str(var2)))
If you have more arguments, add more {} placeholders and pass the extra args.
In the shell script file, these arrive as positional arguments ($1, $2, ...) and you can execute your commands with them accordingly.
I am trying to use the subprocess module in Python to fetch the process id of Firefox:
cmd = "firefox &"
fire = subprocess.Popen(cmd,shell=True, stdout=subprocess.PIPE, preexec_fn=os.setsid)
fire_task_procs = find_task(fire.pid)
print "fire_task_procs",fire_task_procs
I think I am getting the pid of the command line I am executing rather than of Firefox itself. Am I doing something wrong?
I confirmed that it is not the same pid by using ps aux | grep firefox.
If you use shell=True, the pid you get is that of the started shell, not that of the process you want, especially since you use & to send the process into the background.
You should use the long (list) form of supplying the parameters, without &, as that makes little sense anyway if you combine it with output redirection.
Don't use the shell, instead just use
subprocess.Popen(['firefox'], stdout=subprocess.PIPE, preexec_fn=os.setsid)
However, if firefox is already running then this will not work either since in this case firefox will use some IPC to tell the existing process to open a new window and then terminates.
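Putting that together, a sketch (assuming no Firefox instance is already running, for the reason just mentioned):

import os
import signal
import subprocess

proc = subprocess.Popen(['firefox'], stdout=subprocess.PIPE, preexec_fn=os.setsid)
print("firefox pid:", proc.pid)   # pid of firefox itself, not of an intermediate shell

# Because preexec_fn=os.setsid put it in its own session and process group,
# the whole group can later be signalled via the group id:
os.killpg(os.getpgid(proc.pid), signal.SIGTERM)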
I'm using the os.system command to call a Python script.
example:
OS.System("call jython script.py")
In the script I'm calling, the following command is present:
x = raw_input("Waiting for input")
If I run script.py from the command line I can input data with no problem, but if I run it via the automated approach I get an EOFError. I've read in the past that this happens because the system assumes a program rather than a person is running it and therefore can never receive input data in this way.
So the question is: how can I get Python to wait for user input while being run in an automated way?
The problem is the way you run your child script. Since you use os.system() the script's input channel is closed immediately and the raw_input() prompt hits an EOF (end of file). And even if that didn't happen, you wouldn't have a way to actually send some input text to the child as I assume you'd want given that you are using raw_input().
You should use the subprocess module instead.
import subprocess
from subprocess import PIPE
p = subprocess.Popen(["jython", "script.py"], stdin=PIPE, stdout=PIPE)
print p.communicate("My input")
Your question is a bit unclear. What is the process calling your Python script and how is it being run? If the parent process has no standard input, the child won't have it either.