I'm trying to invoke a command-line utility from Python. The code is as follows:
import subprocess
import sys

class Executor:
    def executeEXE(self, executable):
        CREATE_NO_WINDOW = 0x08000000
        process = subprocess.Popen(executable, stdout=subprocess.PIPE,
                                   creationflags=CREATE_NO_WINDOW)
        while True:
            line = process.stdout.readline()
            if line == '' and process.poll() != None:
                break
            print line
The problem with the above code is that I want the real-time output of the process, which I'm not getting. What am I doing wrong here?
There are two problems in your code:
First of all, readline() will block until a new line is printed out and flushed.
That means you should execute the code
while True:
...
in a new Thread and call a callback function when the output is ready.
Since readline() is waiting for a complete line, you must use
print 'Hello World'
sys.stdout.flush()
every time in your executable.
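Putting both points together, a minimal sketch (the executable name and the callback below are placeholders, not taken from your code) could look like this:
import subprocess
import threading

def read_output(process, callback):
    # Runs in a background thread: readline() blocks until the child
    # prints a line and flushes it, so the main thread never gets stuck.
    for line in iter(process.stdout.readline, ''):
        callback(line.rstrip('\n'))
    process.stdout.close()

def on_line(line):  # placeholder callback
    print line

CREATE_NO_WINDOW = 0x08000000
process = subprocess.Popen(['your_tool.exe'],  # placeholder executable
                           stdout=subprocess.PIPE,
                           creationflags=CREATE_NO_WINDOW)
t = threading.Thread(target=read_output, args=(process, on_line))
t.daemon = True  # do not keep the program alive just for this thread
t.start()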
You can see some code and examples in my Git repository:
pyCommunicator
Instead, if your external tool buffers its output, the only thing you can try is to read stderr as a PIPE:
https://stackoverflow.com/a/11902799/2054758
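As a rough sketch of that idea (assuming the tool writes its messages to stderr, which is usually unbuffered; the tool name is just a placeholder):
import subprocess

# Hypothetical tool; stderr is typically unbuffered, so lines arrive as soon
# as the tool writes them, even if its stdout is block-buffered.
process = subprocess.Popen(['your_tool.exe'],
                           stdout=subprocess.PIPE,
                           stderr=subprocess.PIPE)
for line in iter(process.stderr.readline, ''):
    print line.rstrip('\n')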
I want code like this:
if True:
    run('ABC.PY')
else:
    if ScriptRunning('ABC.PY'):
        stop('ABC.PY')
    run('ABC.PY')
Basically, I want to run a file, let's say abc.py, and, based on some conditions, stop it and run it again from another Python script. Is this possible?
I am using Windows.
You can use Python's Popen objects to run scripts as child processes.
So run('ABC.PY') would be p = Popen(["python", "ABC.PY"]),
ScriptRunning('ABC.PY') would be p.poll() is None,
and stop('ABC.PY') would be p.kill().
Here is a very basic example of what you are trying to achieve.
Please check out the subprocess.Popen docs to fine-tune your logic for running the script.
import subprocess
import shlex
import time

def run(script):
    scriptArgs = shlex.split(script)
    commandArgs = ["python"]
    commandArgs.extend(scriptArgs)
    procHandle = subprocess.Popen(commandArgs, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
    return procHandle

def isScriptRunning(procHandle):
    return procHandle.poll() is None

def stopScript(procHandle):
    procHandle.terminate()
    time.sleep(5)
    # Forcefully terminate the script if it is still running
    if isScriptRunning(procHandle):
        procHandle.kill()

def getOutput(procHandle):
    # stderr is redirected to stdout due to the "stderr=subprocess.STDOUT" argument in the Popen call
    stdout, _ = procHandle.communicate()
    returncode = procHandle.returncode
    return returncode, stdout

def main():
    procHandle = run("main.py --arg 123")
    time.sleep(5)
    isScriptRunning(procHandle)
    stopScript(procHandle)
    print getOutput(procHandle)

if __name__ == "__main__":
    main()
One thing you should be aware of is stdout=subprocess.PIPE.
If your script produces a very large amount of output, the pipe buffer can fill up, causing the launched script to block until .communicate() is called on the handle.
To avoid this, pass a file handle as stdout, like this:
fileHandle = open("main_output.txt", "w")
subprocess.Popen(..., stdout=fileHandle)
This way, the output of the Python process will be dumped into the file. (You will have to modify the getOutput() function for this as well, as sketched below.)
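For instance, here is a rough sketch of how run() and getOutput() from the example above might be adapted to use the main_output.txt file instead of a pipe (attaching the file handle as an attribute on the process handle is just a convenience for this sketch):
import shlex
import subprocess

def run(script):
    scriptArgs = shlex.split(script)
    commandArgs = ["python"]
    commandArgs.extend(scriptArgs)
    # stdout and stderr go to the file instead of a pipe
    outputFile = open("main_output.txt", "w")
    procHandle = subprocess.Popen(commandArgs, stdout=outputFile,
                                  stderr=subprocess.STDOUT)
    procHandle.outputFile = outputFile  # keep a reference for getOutput()
    return procHandle

def getOutput(procHandle):
    # Wait for the process to finish, then read what it wrote to the file
    procHandle.wait()
    procHandle.outputFile.close()
    with open(procHandle.outputFile.name) as f:
        stdout = f.read()
    return procHandle.returncode, stdout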
import subprocess

process = None

def run_or_rerun(flag):
    global process
    if flag:
        assert process is None
        process = subprocess.Popen(['python', 'ABC.PY'])
        process.wait()  # must wait or caller will hang
    else:
        if process.poll() is None:  # it is still running
            process.terminate()  # terminate process
        process = subprocess.Popen(['python', 'ABC.PY'])  # rerun
        process.wait()  # must wait or caller will hang
I'm trying to terminate a subprocess if a string appears in its output, but it is not working. What is wrong?
import subprocess
import shlex
if "PING" in subprocess.check_call(shlex.split("ping -c 10 gogole.com")):
    subprocess.check_call(shlex.split("ping -c 10 gogole.com")).terminate()
Please refer to the documentation for the methods you call. First of all, check_call runs until the process has finished and then returns the return code from the process. I'm not sure how you intend to find "PING" in a return code, which is typically an integer.
Even if it were there, look at the body of your if statement: you fork a completely new instance of ping, wait for it to complete, and then try to call terminate on its return code.
I recommend that you work through a tutorial on subprocesses. Learn how to grab a process handle and invoke operations on it. You'll need to get a handle on the output stream, look for "PING" in that, and then call terminate on the process handle you got at invocation.
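For example, a minimal line-by-line sketch of that approach (reading the output as lines rather than waiting for the whole command to finish) might look like this:
import subprocess
import shlex

process = subprocess.Popen(shlex.split("ping -c 10 google.com"),
                           stdout=subprocess.PIPE)
for line in iter(process.stdout.readline, ''):
    print line.rstrip('\n')
    if "PING" in line:
        process.terminate()  # stop ping as soon as the string shows up
        break
process.wait()  # reap the child process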
import subprocess, os

run = "ping -c 10 google.com"
log = ""
process = subprocess.Popen(run, stdout=subprocess.PIPE, shell=True)
while True:
    out = process.stdout.read(1)
    log += out
    print log
    if out == '' and process.poll() != None:
        break
    if "PING" in log:
        print "terminated!"
        process.kill()
        process.terminate()
        break
I am working on a Python program which wraps the cmd window.
I am using subprocess with PIPE.
If, for example, I write "dir" to the process's stdin, I use communicate() to get the response from cmd, and it does work.
The problem is that inside a while True loop this doesn't work more than once; it seems like the subprocess closes itself.
Please help me.
import subprocess

process = subprocess.Popen('cmd.exe', shell=False, stdin=subprocess.PIPE,
                           stdout=subprocess.PIPE, stderr=None)
x = ""
while x != "x":
    x = raw_input("insert a command \n")
    process.stdin.write(x + "\n")
    o, e = process.communicate()
    print o
process.stdin.close()
The main problem is that trying to read from subprocess.PIPE deadlocks when the program is still running but there is nothing to read from stdout. communicate() avoids this by closing stdin and waiting for the process to finish, which is why it only works once.
A solution is to put the code that reads stdout in another thread and access it via a Queue, which allows reliable sharing of data between threads by timing out instead of deadlocking.
The new thread reads standard output continuously, stopping when there is no more data.
Each line is taken from the queue until a timeout is reached (no more data in the Queue), and then the collected lines are printed to the screen.
This approach will work for non-interactive programs.
import subprocess
import threading
import Queue

def read_stdout(stdout, queue):
    while True:
        queue.put(stdout.readline())  # this hangs when there is no I/O

process = subprocess.Popen('cmd.exe', shell=False, stdout=subprocess.PIPE, stdin=subprocess.PIPE)
q = Queue.Queue()
t = threading.Thread(target=read_stdout, args=(process.stdout, q))
t.daemon = True  # t stops when the main thread stops
t.start()

while True:
    x = raw_input("insert a command \n")
    if x == "x":
        break
    process.stdin.write(x + "\n")
    o = []
    try:
        while True:
            o.append(q.get(timeout=.1))
    except Queue.Empty:
        print ''.join(o)
I have a dtrace snippet run via a Python script; the dtrace snippet is such that it generates its data when CTRL-C is issued to it. So I defined a signal_handler in the Python script to catch CTRL-C from the user and relay it to the dtrace invocation done via subprocess.Popen, but I am unable to get any output in my log file. Here is the script:
import signal
import subprocess
import time

Proc = []
signal_posted = False

def signal_handler(sig, frame):
    print("Got CTRL-C!")
    global signal_posted
    signal_posted = True
    global Proc
    Proc.send_signal(signal.SIGINT)  # signal posting from the handler

def execute_hotkernel():
    #
    # Generate the .out output file
    #
    fileout = "hotkernel.out"
    fileo = open(fileout, "w+")
    global Proc
    Proc = subprocess.Popen(['/usr/sbin/dtrace', '-n', dtrace_script], stdout=fileo)
    while Proc.poll() is None:
        time.sleep(0.5)

def main():
    signal.signal(signal.SIGINT, signal_handler)  # change our signal handler
    execute_hotkernel()

if __name__ == '__main__':
    main()
Since I have the file hotkernel.out set as stdout in the subprocess.Popen call, I was expecting the output from dtrace to be redirected to hotkernel.out on CTRL-C, but the file is empty. What is missing here?
I have a similar issue.
In my case, it's a shell script that runs until you hit Control-C and then prints out summary information. When I run it using subprocess.Popen, whether using a PIPE or a file object for stdout, I either don't get the information (with a file object) or it hangs when I try to run stdout.readline().
I finally tried running the subprocess from the interpreter and discovered that I could get the last line of output after the SIGINT with a PIPE if I call stdout.readline() (where it hangs), hit Control-C (in the interpreter), and then call stdout.readline() again.
I do not know how to emulate this in a script, either for file output or for a PIPE. I did not try the file output in the interpreter.
EDIT:
I finally got back to this and determined that it's actually pretty easy to reproduce outside of Python and really has nothing to do with Python.
/some_cmd_that_ends_on_sigint
(enter control-c)
*data from stdout in event handler*
Works
/some_cmd_that_ends_on_sigint | tee some.log
(enter control-c)
*Nothing sent to stdout in event handler prints to the screen or the log*
Where's my log?
I ended up just adding a file stream in the event handler (in the some_cmd_that_ends_on_sigint source) that writes the data to a (possibly secondary) log. It works, if a bit awkwardly: you still get the data on the screen when running without any piping, but I can also read it when piped, or from Python, via the secondary log.
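If the long-running command happens to be a Python script, the workaround looks roughly like this (the filename and the summary text are just illustrative): a SIGINT handler that writes the summary both to stdout and to a secondary log file, so the data is still readable when stdout is piped or captured.
import signal
import sys
import time

def handle_sigint(sig, frame):
    summary = "summary statistics go here\n"  # placeholder for the real data
    sys.stdout.write(summary)
    sys.stdout.flush()
    # Secondary log: still readable when stdout is piped or redirected
    with open("summary_secondary.log", "w") as log:
        log.write(summary)
    sys.exit(0)

signal.signal(signal.SIGINT, handle_sigint)
while True:          # stand-in for the tool's real work
    time.sleep(1)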
I am making a simple IDE with a text box and a run button. It asks the user to enter a filename, writes the code from the text box to that file, and runs the file.
I want to show whatever is output to the console, such as print, input, etc., like IDEs do.
Is this possible?
Here is my code:
from Tkinter import *
import tkFileDialog
import ScrolledText
import subprocess

filepath = ""

def run():
    global filepath
    print "<<<<<<=====-------------Restart-------------=====>>>>>>"
    py = code.get(1.0, END)
    if filepath == "":
        filepath = tkFileDialog.asksaveasfilename()
        if ".py" not in filepath:
            filepath = filepath + ".py"
    script = open(filepath, "w")
    script.write(py)
    script.close()
    p = subprocess.Popen(['python', filepath],
                         stdout=subprocess.PIPE,
                         stderr=subprocess.STDOUT,
                         )
    for line in iter(p.stdout.readline, ''):
        print line
    print "<<<<<<=====-------------EXECUTION FINISHED-------------=====>>>>>>"

root = Tk()
code = ScrolledText.ScrolledText(root)
code.pack()
run = Button(root, text="Run", command=run)
run.pack()
root.mainloop()
Yes, just use the subprocess module.
Getting output in one go
import subprocess
output = subprocess.check_output(['python', filepath])
If you want to capture the standard error of the called process as well as standard output, use this instead:
output = subprocess.check_output(['python', filepath], stderr=subprocess.STDOUT)
Or if you want to capture them separately:
p = subprocess.Popen(['python', filepath],
                     stdout=subprocess.PIPE,
                     stderr=subprocess.PIPE,
                     )
out, err = p.communicate()
Getting the output as it's produced
This gives you the combined stdout and stderr output, line by line:
p = subprocess.Popen(['python', filepath],
                     stdout=subprocess.PIPE,
                     stderr=subprocess.STDOUT,
                     )
for line in iter(p.stdout.readline, ''):
    # process the line
Keeping the UI responsive
If you run the above code on the same thread as your GUI, you are essentially blocking Tk's event loop from running while you wait for each line. This means that although you are getting each line in real time and writing it to your GUI, the display won't update until the event loop gets to run again and processes all your calls to Tk.
You need to run the subprocess code on a new thread and, in your GUI thread, periodically check for new output.
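Here is a rough sketch of that pattern, assuming Python 2's Tkinter, threading and Queue modules (the script name and the 100 ms polling interval are just illustrative):
import subprocess
import threading
import Queue
import Tkinter

def reader(proc, q):
    # Background thread: blocks on readline without freezing the GUI
    for line in iter(proc.stdout.readline, ''):
        q.put(line)
    q.put(None)  # sentinel: the child process has finished

def poll_queue():
    # Runs on the GUI thread every 100 ms and drains whatever is ready
    try:
        while True:
            line = q.get_nowait()
            if line is None:
                return  # child finished, stop polling
            output.insert(Tkinter.END, line)
    except Queue.Empty:
        pass
    root.after(100, poll_queue)

root = Tkinter.Tk()
output = Tkinter.Text(root)
output.pack()

q = Queue.Queue()
proc = subprocess.Popen(['python', 'some_script.py'],  # placeholder script
                        stdout=subprocess.PIPE,
                        stderr=subprocess.STDOUT)
t = threading.Thread(target=reader, args=(proc, q))
t.daemon = True
t.start()

root.after(100, poll_queue)
root.mainloop()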
I've put together an example based on yours; you can find it here:
http://pastebin.com/FRFpaeJ2