How can I tell that a subprocess is waiting for input in a GUI - python

I am writing a top-script (Python) to control the EDA tools in an IC design flow. The top-script has a GUI (tkinter), runs the EDA tools with subprocess.Popen(), and prints their stdout in the GUI.
Sometimes an EDA tool does not exit but waits for input; then the GUI hangs waiting for my input, but the top-script cannot catch the right stdout message from subprocess.stdout and cannot pass my input to the subprocess.
Below is part of my top-script.
class runBlocks():
    ... ...

    def subprocessRun(self):
        # Run the EDA tool "runCommand" with subprocess.Popen.
        self.RC = subprocess.Popen(runCommand, shell=True, stdin=subprocess.PIPE, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
        # Save the subprocess stdout into a queue.
        self.queue = queue.Queue()
        getRunProcessThread = threading.Thread(target=self.getRunProcess)
        getRunProcessThread.daemon = True
        getRunProcessThread.start()
        # Update the GUI with the stdout (from the queue).
        self.updateRunProcess()
        # Wait for the subprocess to finish.
        (stdout, stderr) = self.RC.communicate()
        return(self.RC.returncode)

    def getRunProcess(self):
        # Non-blocking collection of subprocess.stdout.
        while self.RC.poll() is None:
            stdoutLine = str(self.RC.stdout.readline(), encoding='utf-8')
            self.queue.put(stdoutLine)

    def updateRunProcess(self):
        # While there is still something from subprocess.stdout, print it into the GUI Text.
        while self.RC.poll() is None or not self.queue.empty():
            if self.queue.empty():
                time.sleep(0.1)
                continue
            else:
                line = self.queue.get_nowait()
                if line:
                    # "self.runProcessText" is a Text widget on the GUI that shows the EDA tool output.
                    self.runProcessText.insert(END, line)
                    # "self.runProcessText" is on "self.frame6"; after inserting a new message, update the frame.
                    self.frame6.update()
                    self.runProcessText.see(END)
If I run the EDA tool in a terminal directly, it stops and waits for my input.
$ dc_shell-t -64 -topo -f ./analyze.run.tcl -o analyze.log
...
$ #quit
$ dc_shell-topo>
If I run the EDA tool through my top-script, subprocess.stdout stops at the message "#quit"; I cannot get the message "dc_shell-topo>".
...
#quit
I know the subprocess is waiting for my input, but the GUI stops at the message "#quit" and hangs on the "time.sleep(0.1)" inside the while loop.
I also tried replacing "time.sleep(0.1)" with "self.GUItop.after(100, self.updateRunProcess)"; then stdout goes past the "dc_shell-topo>" prompt without any input and the tool finishes directly.
...
dc_shell-topo>
Memory usage for main task 82 Mbytes.
CPU usage for this session 6 seconds ...
Thank you...
My expected behavior is:
When the command stops at "dc_shell-topo>" in the subprocess, I can get that message from subprocess.stdout.
The GUI does not hang while waiting for my input.
My questions are:
Why do "time.sleep()" and "self.GUItop.after()" affect the subprocess.stdout messages differently?
When the EDA tool is waiting for input with the message "dc_shell-topo>" in the subprocess, how can I get that message from subprocess.stdout?
When using self.GUItop.after, the GUI Text does not update before the subprocess finishes (it is waiting for free CPU), but without self.GUItop.after the GUI hangs on the time.sleep() call. How can I solve this?
I think this is a real headache of a problem; I have read thousands of related questions on Google, but none of them answers mine.
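As a side note on the symptom above: readline() only returns once a newline arrives, and a prompt such as "dc_shell-topo> " normally ends without one, so a line-based reader never sees it. Below is a minimal, self-contained sketch of chunk-based reading that does pick up a newline-less prompt; printf merely stands in for the EDA tool here.

```python
import os
import subprocess

# Illustrative stand-in for the EDA tool: it prints one full line, then a
# prompt with no trailing newline (like "dc_shell-topo> ").
proc = subprocess.Popen(
    "printf 'line one\\nprompt> '",
    shell=True,
    stdin=subprocess.PIPE,
    stdout=subprocess.PIPE,
    stderr=subprocess.STDOUT,
)

chunks = []
while True:
    # os.read() returns as soon as any bytes are available; it does not
    # wait for a newline the way readline() does.
    data = os.read(proc.stdout.fileno(), 4096)
    if not data:  # an empty read means the pipe reached EOF
        break
    chunks.append(data.decode("utf-8", errors="replace"))
proc.wait()

output = "".join(chunks)  # ends with the newline-less "prompt> "
```

Because os.read() hands over whatever bytes are in the pipe, the trailing "prompt> " shows up even though no newline follows it.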

Related

How to run an EXE program in the background and get the output in python

I want to run an exe program in the background.
Let's say the program is httpd.exe.
I can run it, but when I want to get the output it gets stuck, because there is no output if it starts successfully. If there is an error, it's OK.
Here is the code I'm using:
import asyncio
import os

async def run(cmd):
    proc = await asyncio.create_subprocess_exec(
        cmd,
        stdout=asyncio.subprocess.PIPE,
        stderr=asyncio.subprocess.PIPE)
    stdout, stderr = await proc.communicate()
    return (proc, stdout, stderr)

os.chdir('c:\\apache\\bin')
process, stdout, stderr = asyncio.run(run('httpd.exe'))
print(stdout, stderr)
I tried to make the following code as general as possible:
I make no assumptions as to whether the program being run writes its output to stdout alone or stderr alone. So I capture both outputs by starting two threads, one for each stream, which write the output to a common queue that can be read in real time. When end-of-stream is encountered on stdout and stderr, each thread writes a special None record to the queue to indicate end of stream. So the reader of the queue knows that after seeing two such "end of stream" indicators there will be no more lines written to the queue and that the process has effectively ended.
The call to subprocess.Popen can be made with argument shell=True so that it can also run built-in shell commands, and to make the specification of the command easier (it can now be a single string rather than a list of strings).
The function run_cmd returns the created process and the queue. You just have to loop reading lines from the queue until two None records are seen. Once that occurs, you can just wait for the process to complete, which should be immediate.
If you know that the process you are starting only writes its output to stdout or stderr (or if you only want to catch one of these outputs), then you can modify the program to start only one thread, specify the subprocess.PIPE value for only one of the outputs, and have the loop that reads lines from the queue look for only one None end-of-stream indicator.
The threads are daemon threads, so if you wish to terminate based on output from the process before all the end-of-stream records have been detected, the threads will automatically be terminated along with the main process.
run_apache, which runs Apache as a subprocess, is itself a daemon thread. If it detects any output from Apache, it sets an event that has been passed to it. The main thread that starts run_apache can periodically test this event, wait on this event, wait for the run_apache thread to end (which will only occur when Apache ends), or terminate Apache via the global variable proc.
import os
import subprocess
import sys
import threading
import queue

def read_stream(f, q):
    for line in iter(f.readline, ''):
        q.put(line)
    q.put(None)  # show no more data from stdout or stderr

def run_cmd(command, run_in_shell=True):
    """
    Run command as a subprocess. If run_in_shell is True, then
    command is a string, else it is a list of strings.
    """
    proc = subprocess.Popen(command, shell=run_in_shell, stdout=subprocess.PIPE, stderr=subprocess.PIPE, text=True)
    q = queue.Queue()
    threading.Thread(target=read_stream, args=(proc.stdout, q), daemon=True).start()
    threading.Thread(target=read_stream, args=(proc.stderr, q), daemon=True).start()
    return proc, q

def run_apache(event):
    global proc
    os.chdir('c:\\apache\\bin')
    proc, q = run_cmd(['httpd.exe'], False)
    seen_None_count = 0
    while seen_None_count < 2:
        line = q.get()
        if line is None:
            # end of stream from either stdout or stderr
            seen_None_count += 1
        else:
            event.set()  # seen an output line
            print(line, end='')
    # wait for process to terminate, which should be immediate:
    proc.wait()

# This event will be set if Apache writes output:
event = threading.Event()
t = threading.Thread(target=run_apache, args=(event,), daemon=True)
t.start()
# The main thread runs and can test the event at any time to see if Apache has produced output:
if event.is_set():
    ...
# The main thread can wait for the run_apache thread to terminate normally,
# which will occur when Apache terminates:
t.join()
# or the main thread can kill Apache via the global variable proc:
proc.terminate()  # no need to do t.join() since run_apache is a daemon thread
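The single-stream modification described above might be sketched like this (a reduced illustration, not part of the original answer; `echo` stands in for the real command, and the function name is made up):

```python
import queue
import subprocess
import threading

def read_stream(f, q):
    # Forward each line to the queue; a final None marks end-of-stream.
    for line in iter(f.readline, ''):
        q.put(line)
    q.put(None)

def run_cmd_stdout_only(command):
    # Merge stderr into stdout so a single reader thread and a single
    # None sentinel are enough.
    proc = subprocess.Popen(
        command,
        shell=True,
        stdout=subprocess.PIPE,
        stderr=subprocess.STDOUT,
        text=True,
    )
    q = queue.Queue()
    threading.Thread(target=read_stream, args=(proc.stdout, q), daemon=True).start()
    return proc, q

proc, q = run_cmd_stdout_only("echo hello")
lines = []
while True:
    line = q.get()
    if line is None:  # the single end-of-stream indicator
        break
    lines.append(line)
proc.wait()
```

The reader loop now waits for only one None record before declaring the stream finished.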

Python subprocess hangs on interaction

I am trying to write a Minecraft server wrapper that allows me to send it commands and receive output. Eventually, I'll attach a socket interface so that I can control my home server remotely to restart it, send commands, etc.
To this end, I am attempting to use the Python subprocess module to start the server, then send commands and receive the server's output. Right now I am running into an issue: I can grab the output of the server and reflect it to the screen, but the very first command I send to the process freezes the whole thing and I have to kill it. It should be noted that I have attempted to remove the process.communicate line and replace it with a print(command); this also froze the process. My very basic current code is as follows:
from subprocess import Popen, PIPE
from threading import Thread
import threading

def listen(process):
    while process.poll() is None:
        output = process.stdout.readline()
        print(str(output))

def talk(process):
    command = input("Enter command: ")
    while command != "exit_wrapper":
        # freezes on the first command sent
        parse_command(process, command)
        command = input("Enter command: ")
    print("EXITING! KILLING SERVER!")
    process.kill()

def parse_command(process, command):
    process.communicate(command.encode())

def main():
    process = Popen("C:\\Minecraft Servers\\ServerStart.bat", cwd="C:\\Minecraft Servers\\", stdout=PIPE, stdin=PIPE)
    listener = Thread(None, listen, None, kwargs={'process': process})
    listener.start()
    talker = Thread(None, talk, None, kwargs={'process': process})
    talker.start()
    listener.join()
    talker.join()

if __name__ == "__main__":
    main()
Any help offered would be greatly appreciated!
subprocess.Popen.communicate() documentation clearly states:
Interact with process: Send data to stdin. Read data from stdout and stderr, until end-of-file is reached. Wait for process to terminate.
And in your case it's doing exactly that. What you want to do instead of waiting for the process to terminate is to interact with it: much like you're reading from the STDOUT stream directly in your listen() function, you should write to the process's STDIN in order to send it commands. Something like:
def talk(process):
    command = input("Enter command: ")
    while command != "exit_wrapper" and process.poll() is None:
        # write the command to the process STDIN (it is a binary pipe
        # here, so encode the string and terminate it with a newline)
        process.stdin.write((command + "\n").encode())
        process.stdin.flush()  # flush it
        command = input("Enter command: ")  # get the next command from the user
    if process.poll() is None:
        print("EXITING! KILLING SERVER!")
        process.kill()
The problem with this approach, however, is the potential of overwriting the server's output with your Enter command: prompt, so that the user ends up typing the command over the server's output instead of in the 'prompt' you've designated.
What you might want to do instead is parse your server's output in the listen() function, determine from the collected output when the wrapped server expects user input, and then, and only then, call the talk() function (with the while loop removed, of course) to obtain user input.
You should also pipe-out STDERR as well, in case the Minecraft server is trying to tell you something over it.
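A rough sketch of that prompt-detection idea follows. The prompt string, the inline child script, and the hard-coded "stop" reply are all made up for illustration; a real wrapper would hand control to talk() at the marked point instead.

```python
import subprocess
import sys

# Hypothetical child process standing in for the server: it prints a
# prompt (no trailing newline), waits for one line, then echoes it back.
child_src = (
    "import sys\n"
    "sys.stdout.write('> '); sys.stdout.flush()\n"
    "line = sys.stdin.readline()\n"
    "sys.stdout.write('got ' + line); sys.stdout.flush()\n"
)

PROMPT = "> "
proc = subprocess.Popen(
    [sys.executable, "-c", child_src],
    stdin=subprocess.PIPE, stdout=subprocess.PIPE, text=True,
)

buf = ""
responses = []
while True:
    ch = proc.stdout.read(1)  # read(1) returns without waiting for a newline
    if not ch:                # EOF: the child has exited
        break
    buf += ch
    if buf.endswith(PROMPT):   # the child is waiting for input now
        proc.stdin.write("stop\n")  # in the real wrapper, call talk() here
        proc.stdin.flush()
        buf = ""
proc.wait()
responses.append(buf)
```

Reading one character at a time is slow but makes it possible to notice a prompt that never ends in a newline.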

start a service with popen : command not stopping

I am trying to make a backup script in Python and start/stop a service with Popen...
Stopping the service works, but unfortunately starting the service blocks the rest of the execution: the script just stays there. Why?
It seems to be somehow linked with the httpd service... :-(
the program config element is like "service;httpd;start" or "/etc/init.d/myprog;start"
class execute(actions):
    def __init__(self, config, section, logger):
        self.name = "execute"
        actions.__init__(self, config, section, logger)

    def process(self):
        try:
            program = self.config.get(self.section, "program").split(";")
            self.logger.debug("program=%s" % program)
            p = subprocess.Popen(program, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
            stdout, stderr = p.communicate()
            if stdout:
                self.logger.info(stdout)
            if stderr:
                self.logger.error(stderr)
            return p.returncode
        except Exception:
            self.logger.exception(Exception)
You have to open a stdin as a pipe as well, and then close it (if you use read() and write() instead of communicate()).
p = subprocess.Popen(..., stdin=subprocess.PIPE, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
p.stdin.close()
print "Stdout:", p.stdout.read()
print "Stderr:", p.stderr.read()
If that doesn't work, and you don't really need any checks, just close all the pipes after the call to Popen, which will cause the program to execute detached from the pipes.
Warning: This will make the program run as a daemon if it doesn't terminate on its own.
After doing this you may call wait() to see whether it blocks as well, and use the exit codes to check for errors.
There aren't many of them: just "service started" or not. Sometimes it even reports that the service is running when the service has actually crashed.
To check whether the service script is still running, but without blocking, use:
if p.poll() is None: print "Still running"
Otherwise, poll() returns the exit code.
This works neatly for starting and stopping a service:
from subprocess import Popen, PIPE
service = "brltty"
p = Popen(["service", service, "start"], stdin=PIPE, stdout=PIPE, stderr=PIPE)
# Note: using sequence uses shell=0
stdout, stderr = p.communicate()
print "Stdout:", stdout
print "Stderr:", stderr
Don't forget to change start to stop :D :D :D
The call to p.communicate() waits for the process to terminate.
Refer to: subprocess documentation
Interact with process: Send data to stdin. Read data from stdout and
stderr, until end-of-file is reached. Wait for process to terminate.
The optional input argument should be a string to be sent to the child
process, or None, if no data should be sent to the child.
You can try to use p.poll() instead. This method doesn't wait for a process to terminate.
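A short illustration of that difference (Python 3 here, with sleep standing in for the service command):

```python
import subprocess

# poll() returns immediately: None while the child is still running,
# and the exit code once it has finished.
p = subprocess.Popen(["sleep", "1"])
running = p.poll() is None  # True right after starting: no blocking
p.wait()                    # wait() (like communicate()) does block
finished = p.poll()         # now the exit code: 0 on success
```

So poll() is the non-blocking probe, while wait() and communicate() park the caller until the child exits.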

PYTHON subprocess cmd.exe closes after first command

I am working on a Python program which wraps the cmd window.
I am using subprocess with PIPE.
If, for example, I write "dir" (to stdin), I use communicate() in order to get the response from cmd, and it does work.
The problem is that in a while True loop this doesn't work more than once; it seems like the subprocess closes itself.
Help me please
import subprocess

process = subprocess.Popen('cmd.exe', shell=False, stdin=subprocess.PIPE, stdout=subprocess.PIPE, stderr=None)
x = ""
while x != "x":
    x = raw_input("insert a command \n")
    process.stdin.write(x + "\n")
    o, e = process.communicate()
    print o
process.stdin.close()
The main problem is that trying to read from subprocess.PIPE deadlocks when the program is still running but there is nothing to read from stdout. communicate() avoids this by closing stdin and waiting for the process to terminate, which is why it only works once.
A solution is to put the code that reads stdout into another thread and access it via a Queue, which allows reliable sharing of data between threads by timing out instead of deadlocking.
The new thread reads standard output continuously and stops when there is no more data.
Each line is taken from the queue until a timeout is reached (no more data in the Queue), then the collected lines are printed to the screen.
This approach works for non-interactive programs.
import subprocess
import threading
import Queue

def read_stdout(stdout, queue):
    while True:
        queue.put(stdout.readline())  # this hangs when there is no IO

process = subprocess.Popen('cmd.exe', shell=False, stdout=subprocess.PIPE, stdin=subprocess.PIPE)
q = Queue.Queue()
t = threading.Thread(target=read_stdout, args=(process.stdout, q))
t.daemon = True  # t stops when the main thread stops
t.start()

while True:
    x = raw_input("insert a command \n")
    if x == "x":
        break
    process.stdin.write(x + "\n")
    o = []
    try:
        while True:
            o.append(q.get(timeout=.1))
    except Queue.Empty:
        print ''.join(o)

Capturing stdout from subprocess after sending SIGINT

I have a dtrace snippet run via a Python script; the dtrace snippet generates data when CTRL-C is issued to it. So I defined a signal_handler in the Python script to catch CTRL-C from the user and relay it to the dtrace invocation done via subprocess.Popen, but I am unable to get any output in my log file. Here is the script:
import signal
import subprocess
import time

Proc = []
signal_posted = False

def signal_handler(sig, frame):
    print("Got CTRL-C!")
    global signal_posted
    signal_posted = True
    global Proc
    Proc.send_signal(signal.SIGINT)  # signal posting from the handler

def execute_hotkernel():
    #
    # Generate the .out output file
    #
    fileout = "hotkernel.out"
    fileo = open(fileout, "w+")
    global Proc
    Proc = subprocess.Popen(['/usr/sbin/dtrace', '-n', dtrace_script], stdout=fileo)
    while Proc.poll() is None:
        time.sleep(0.5)

def main():
    signal.signal(signal.SIGINT, signal_handler)  # change our signal handler
    execute_hotkernel()

if __name__ == '__main__':
    main()
Since I have the file hotkernel.out set as stdout in the subprocess.Popen command, I was expecting the output from dtrace to be redirected to hotkernel.out on CTRL-C, but it is empty. What is missing here?
I have a similar issue.
In my case, it's a shell script that runs until you hit Control-C and then prints out summary information. When I run this using subprocess.Popen, whether using a PIPE or a file object for stdout, I either don't get the information (with a file object) or it hangs when I try to run stdout.readline().
I finally tried running the subprocess from the interpreter and discovered that I could get the last line of output after the SIGINT with a PIPE if I call stdout.readline() (where it hangs), hit Control-C (in the interpreter), and then call stdout.readline() again.
I do not know how to emulate this in a script, for file output or for a PIPE. I did not try the file output in the interpreter.
EDIT:
I finally got back to this and determined that it's actually pretty easy to reproduce outside of Python; it really has nothing to do with Python.
/some_cmd_that_ends_on_sigint
(enter control-c)
*data from stdout in the event handler*
Works.

/some_cmd_that_ends_on_sigint | tee some.log
(enter control-c)
*nothing sent to stdout in the event handler prints to the screen or the log*
Where's my log?

I ended up just adding a file stream in the event handler (in the some_cmd_that_ends_on_sigint source) that writes the data to a (possibly secondary) log. It works, if a bit awkwardly: you get the data on the screen when running without any piping, but I can also read it when piped, or from Python, via the secondary log.
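A related trick from the Python side (a sketch under assumptions, not the original dtrace setup): start the child in its own session so a terminal-generated Ctrl-C does not reach it directly, then deliver SIGINT explicitly and collect whatever the child's handler prints. The inline child script below is a stand-in for the real command.

```python
import signal
import subprocess
import sys
import time

# Hypothetical child standing in for dtrace: on SIGINT it prints its
# summary and exits cleanly.
child_src = (
    "import signal, sys\n"
    "def h(sig, frame):\n"
    "    sys.stdout.write('summary on SIGINT\\n')\n"
    "    sys.stdout.flush()\n"
    "    sys.exit(0)\n"
    "signal.signal(signal.SIGINT, h)\n"
    "signal.pause()\n"
)

proc = subprocess.Popen(
    [sys.executable, "-c", child_src],
    stdout=subprocess.PIPE,
    text=True,
    start_new_session=True,  # own process group: terminal Ctrl-C won't reach it
)
time.sleep(0.5)              # crude: give the child time to install its handler
proc.send_signal(signal.SIGINT)
out, _ = proc.communicate()  # reads everything the handler printed
```

Because the SIGINT is sent only to the child, the parent keeps its pipe open and can read the summary after the child exits.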
