I am writing a Python script for automation.
I need to run a Linux shell command (program: dvbv5-zap) and wait for a specific line of output (DVR interface '/dev/dvb/adapter0/dvr0' can now be opened). When the command outputs this string, the Python script should run another shell program.
I don't know how to capture the subprocess CLI output; I tried .stdout.readline() and got nothing.
I run the command with subprocess.Popen(['dvbv5-zap', 'args'], stdout=subprocess.PIPE).
I found my answer here: https://fredrikaverpil.github.io/2013/10/11/catching-string-from-stdout-with-python/
Code snippet:
# Imports
import os, sys, subprocess

# Build command
command = ['python', os.path.join('/path/to', 'scriptFile.py')]

# Execute command
p = subprocess.Popen(command, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)

# Read stdout and print each new line
sys.stdout.flush()
for line in iter(p.stdout.readline, b''):
    # Decode the bytes and print the line
    line = line.decode('utf-8').rstrip()
    sys.stdout.flush()
    print(">>> " + line)
    # Look for the string 'Render done' in the stdout output
    if 'Render done' in line:
        # Write something to stdout
        sys.stdout.write('Nice job on completing the render, I am executing myFunction()\n')
        sys.stdout.flush()
        # Execute something
        myFunction()
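Adapted to the dvbv5-zap case from the question, a minimal sketch could look like this. The 'args' placeholder and the follow-up command 'other-program' are hypothetical, and dvbv5-zap may print its status line to stderr rather than stdout, which is why stderr is merged into stdout here:

import subprocess

p = subprocess.Popen(['dvbv5-zap', 'args'],     # 'args' is a placeholder
                     stdout=subprocess.PIPE,
                     stderr=subprocess.STDOUT)  # merge stderr, in case the message goes there
for raw in iter(p.stdout.readline, b''):
    line = raw.decode('utf-8', errors='replace').rstrip()
    print(">>> " + line)
    if "DVR interface '/dev/dvb/adapter0/dvr0' can now be opened" in line:
        # Launch the follow-up program; 'other-program' is a placeholder.
        subprocess.Popen(['other-program'])
        break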
I have a .war file I'd like to launch via Python. I want it to run in the background so no log messages appear in my terminal, and I would also love to have the actual log output written to a logfile. This is the Python code I use to try to solve this.
I have had no luck yet. The process is not detached (I cannot run other shell commands after executing the script), and the logfile is created but no log output is appended to it.
EDIT: To make things more clear: in the end I want this script to spawn multiple Java processes and then exit. How do I achieve exactly that, including redirecting stdout to a file?
#!/usr/bin/env python3
import subprocess
import re

platformDir = "./platform/"
fe = "frontend-webapp-0.5.0.war"

logfile = open("frontend-log", 'w')
process = subprocess.Popen(['java', '-jar', platformDir + fe],
                           stdout=subprocess.PIPE)
for line in process.stdout:
    logfile.write(line)
Here is how you can try it using Python 3:
import sys
from subprocess import Popen, PIPE, STDOUT

platformDir = "./platform/"
fe = "frontend-webapp-0.5.0.war"
logfile = open("frontend-log", 'ab')

p = Popen(['java', '-jar', platformDir + fe], stdout=PIPE, stderr=STDOUT, bufsize=1)
for line in p.stdout:
    sys.stdout.buffer.write(line)
    logfile.write(line)
I actually just had to give stdout the file handle, as in stdout=logfile. The solution with the for loop over stdout would have kept me inside the Python script process the whole time.
import sys
from subprocess import Popen, PIPE, STDOUT
platformDir = "./platform/"
fe = "frontend-webapp-0.5.0.war"
logfile = open("frontend-log", 'ab')
p = Popen(['java', '-jar', platformDir + fe], stdout=logfile, stderr=STDOUT, bufsize=1)
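For the EDIT (spawn several Java processes and let the script exit), a sketch using the same approach could look like the following. The second .war name is made up for illustration; because stdout goes straight to each file handle, no Python loop keeps the script alive, and on Linux the children keep running after the script exits:

from subprocess import Popen, STDOUT

platformDir = "./platform/"
wars = ['frontend-webapp-0.5.0.war', 'backend-webapp-0.5.0.war']  # second name is hypothetical

for war in wars:
    logfile = open(war + '.log', 'ab')
    Popen(['java', '-jar', platformDir + war], stdout=logfile, stderr=STDOUT)

# The script ends here; the Java processes continue in the background.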
I have a custom input method and a Python module to communicate with it. I'm trying to control the shell with it, so that everything from the local stdout is printed on the remote device and everything sent from the remote device goes into the local stdin. That way the remote device controls the input given to the program: if there is an input prompt inside the program, the remote device can answer it too (like in ssh).
I used Python's subprocess to control stdin and stdout:
#!/usr/bin/python
from subprocess import Popen, PIPE
import thread
from mymodule import remote_read, remote_write

def talk2proc(dap):
    while True:
        try:
            remote_write(dap.stdout.read())
            incmd = remote_read()
            dap.stdin.write(incmd)
        except Exception as e:
            print(e)
            break

while True:
    cmd = remote_read()
    if cmd != 'quit':
        p = Popen(['bash', '-c', '"%s"' % cmd], stdout=PIPE, stdin=PIPE, stderr=PIPE)
        thread.start_new_thread(talk2proc, (p,))
        p.wait()
    else:
        break
But it doesn't work. What should I do?
P.S. Is there a difference for Windows?
I had this problem; I used this for stdin:

from subprocess import call

call(['some_app', 'param'], stdin=open("a.txt", "rb"))

a.txt:

:q

I used this for a git wrapper; it feeds the data line by line whenever some_app pauses and waits for user input.
There is a difference for Windows. This line won't work in Windows:
p = Popen(['bash', '-c', '"%s"'%cmd], stdout=PIPE, stdin=PIPE, stderr=PIPE)
because the equivalent of 'bash' is 'cmd.exe'.
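If you want to try the same approach on Windows, a rough equivalent of that line could be the following sketch (cmd.exe /c runs the given command string and then exits):

from subprocess import Popen, PIPE

p = Popen(['cmd.exe', '/c', cmd], stdout=PIPE, stdin=PIPE, stderr=PIPE)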
I'm trying to write the terminal output to a file called debug.log.
I would also like to get the pid so I can kill the process; the kill is working at the moment.
But debug.log stays empty.
cmd1 = "cvlc rtp://232.0.2.183:8200 --sout file/mkv:/media/file.mkv"
with open("/home/user/.cache/debug.log", 'w') as out:
proc = subprocess.Popen(cmd1, stdout=out, shell=True, preexec_fn=os.setsid)
pid = proc.pid
with open("/home/user/.cache/pid.log", 'w') as f:
f.write(str(pid))
f.close()
Edit: I'm using this method to kill the process
and this method (from here) to write the log:
########## kill the process ##########
import os
import signal
import subprocess

# The os.setsid() is passed in the argument preexec_fn so
# it's run after the fork() and before exec() to run the shell.
pro = subprocess.Popen(cmd, stdout=subprocess.PIPE,
                       shell=True, preexec_fn=os.setsid)

os.killpg(pro.pid, signal.SIGTERM)  # Send the signal to all the process groups

######### write the log #############
import subprocess

cmd = ['ls', '-l']  # example of command
with open('output.txt', 'w') as out:
    return_code = subprocess.call(cmd, stdout=out)
In fact, I would like to combine both examples.
Thanks
You need to redirect the 'stderr' (not 'stdout') to 'out'.
proc = subprocess.Popen(cmd1, stderr=out, shell=True, preexec_fn=os.setsid)
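Combining both snippets from the question with that fix, a sketch could look like this (paths and the cvlc command are taken from the question; cvlc writes its log to stderr):

import os
import signal
import subprocess

cmd1 = "cvlc rtp://232.0.2.183:8200 --sout file/mkv:/media/file.mkv"

# Redirect stderr (where cvlc logs) into debug.log; os.setsid puts the
# shell and its children into their own process group so they can be
# killed together later.
with open("/home/user/.cache/debug.log", 'w') as out:
    proc = subprocess.Popen(cmd1, stderr=out, shell=True, preexec_fn=os.setsid)

with open("/home/user/.cache/pid.log", 'w') as f:
    f.write(str(proc.pid))

# Later, to stop the recording, send SIGTERM to the whole process group:
# os.killpg(proc.pid, signal.SIGTERM)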
My Python script uses subprocess to call a Linux utility that is very noisy. I want to store all of the output in a log file and show some of it to the user. I thought the following would work, but the output doesn't show up in my application until the utility has produced a significant amount of output.
#fake_utility.py, just generates lots of output over time
import time

i = 0
while True:
    print hex(i)*512
    i += 1
    time.sleep(0.5)
#filters output
import subprocess

proc = subprocess.Popen(['python', 'fake_utility.py'], stdout=subprocess.PIPE)
for line in proc.stdout:
    #the real code does filtering here
    print "test:", line.rstrip()
The behavior I really want is for the filter script to print each line as it is received from the subprocess, sort of like what tee does, but with Python code.
What am I missing? Is this even possible?
Update:
If a sys.stdout.flush() is added to fake_utility.py, the code has the desired behavior in Python 3.1. I'm using Python 2.6. You would think that using proc.stdout.xreadlines() would work the same as in py3k, but it doesn't.
Update 2:
Here is the minimal working code.
#fake_utility.py, just generates lots of output over time
import sys, time

for i in range(10):
    print i
    sys.stdout.flush()
    time.sleep(0.5)

#display output line by line
import subprocess
proc = subprocess.Popen(['python', 'fake_utility.py'], stdout=subprocess.PIPE)
#works in python 3.0+
#for line in proc.stdout:
for line in iter(proc.stdout.readline, ''):
    print line.rstrip()
I think the problem is with the statement for line in proc.stdout, which reads the entire input before iterating over it. The solution is to use readline() instead:
#filters output
import subprocess

proc = subprocess.Popen(['python', 'fake_utility.py'], stdout=subprocess.PIPE)
while True:
    line = proc.stdout.readline()
    if not line:
        break
    #the real code does filtering here
    print "test:", line.rstrip()
Of course you still have to deal with the subprocess' buffering.
Note: according to the documentation the solution with an iterator should be equivalent to using readline(), except for the read-ahead buffer, but (or exactly because of this) the proposed change did produce different results for me (Python 2.5 on Windows XP).
Bit late to the party, but was surprised not to see what I think is the simplest solution here:
import io
import subprocess

proc = subprocess.Popen(["prog", "arg"], stdout=subprocess.PIPE)
for line in io.TextIOWrapper(proc.stdout, encoding="utf-8"):  # or another encoding
    ...  # do something with line
(This requires Python 3.)
Indeed, if you sorted out the iterator then buffering could now be your problem. You could tell the python in the sub-process not to buffer its output.
proc = subprocess.Popen(['python','fake_utility.py'],stdout=subprocess.PIPE)
becomes
proc = subprocess.Popen(['python','-u', 'fake_utility.py'],stdout=subprocess.PIPE)
I have needed this when calling python from within python.
You want to pass these extra parameters to subprocess.Popen:
bufsize=1, universal_newlines=True
Then you can iterate as in your example. (Tested with Python 3.5)
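A sketch of what that looks like, reusing the fake_utility.py script from the question (universal_newlines=True gives text-mode pipes, which the line buffering requested by bufsize=1 requires):

import subprocess

proc = subprocess.Popen(['python', 'fake_utility.py'],
                        stdout=subprocess.PIPE,
                        bufsize=1, universal_newlines=True)
for line in proc.stdout:
    print("test:", line.rstrip())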
A function that allows iterating over both stdout and stderr concurrently, in realtime, line by line
In case you need to get the output stream for both stdout and stderr at the same time, you can use the following function.
The function uses Queues to merge both Popen pipes into a single iterator.
Here we create the function read_popen_pipes():
from queue import Queue, Empty
from concurrent.futures import ThreadPoolExecutor


def enqueue_output(file, queue):
    for line in iter(file.readline, ''):
        queue.put(line)
    file.close()


def read_popen_pipes(p):
    with ThreadPoolExecutor(2) as pool:
        q_stdout, q_stderr = Queue(), Queue()

        pool.submit(enqueue_output, p.stdout, q_stdout)
        pool.submit(enqueue_output, p.stderr, q_stderr)

        while True:
            if p.poll() is not None and q_stdout.empty() and q_stderr.empty():
                break

            out_line = err_line = ''

            try:
                out_line = q_stdout.get_nowait()
            except Empty:
                pass
            try:
                err_line = q_stderr.get_nowait()
            except Empty:
                pass

            yield (out_line, err_line)
read_popen_pipes() in use:
import subprocess as sp


def run_and_filter(my_cmd):
    with sp.Popen(my_cmd, stdout=sp.PIPE, stderr=sp.PIPE, text=True) as p:
        for out_line, err_line in read_popen_pipes(p):
            # Do stuff with each line, e.g.:
            print(out_line, end='')
            print(err_line, end='')

        return p.poll()  # return status-code
You can also read the lines without a loop. Works in Python 3.6:
import os
import subprocess
process = subprocess.Popen(command, stdout=subprocess.PIPE)
list_of_byte_strings = process.stdout.readlines()
Python 3.5 added the run() function to the subprocess module, returning a CompletedProcess object (the capture_output flag used here arrived in Python 3.7). With this you are fine using proc.stdout.splitlines():

proc = subprocess.run(command, shell=True, capture_output=True, text=True, check=True)
for line in proc.stdout.splitlines():
    print("stdout:", line)
See also How to Execute Shell Commands in Python Using the Subprocess Run Method
I tried this with Python 3 and it worked (source).
When you use Popen to spawn the new process, you tell the operating system to PIPE the stdout of the child process so the parent process can read it; here, stderr is merged into that same stream via stderr=subprocess.STDOUT.
In output_reader we read each line of the child's stdout by wrapping it in an iterator that yields the child's output line by line whenever a new line is ready.
import subprocess
import threading
import time


def output_reader(proc):
    for line in iter(proc.stdout.readline, b''):
        print('got line: {0}'.format(line.decode('utf-8')), end='')


def main():
    proc = subprocess.Popen(['python', 'fake_utility.py'],
                            stdout=subprocess.PIPE,
                            stderr=subprocess.STDOUT)

    t = threading.Thread(target=output_reader, args=(proc,))
    t.start()

    try:
        time.sleep(0.2)
        # Simulate the main thread doing its own work for a while,
        # while the reader thread keeps printing the child's output.
        i = 0
        while i < 10:
            print(hex(i) * 512)
            i += 1
            time.sleep(0.5)
    finally:
        proc.terminate()
        try:
            proc.wait(timeout=0.2)
            print('== subprocess exited with rc =', proc.returncode)
        except subprocess.TimeoutExpired:
            print('subprocess did not terminate in time')
    t.join()


main()
The following modification of Rômulo's answer works for me on Python 2 and 3 (2.7.12 and 3.6.1):
import os
import subprocess

process = subprocess.Popen(command, stdout=subprocess.PIPE)
while True:
    line = process.stdout.readline()
    if line:
        os.write(1, line)
    else:
        break
I was having a problem with the arg list of Popen when updating servers; the following code resolves this a bit.
import getpass
from subprocess import Popen, PIPE

username = 'user1'
ip = '127.0.0.1'

print('What is the password?')
password = getpass.getpass()

cmd1 = f"""sshpass -p {password} ssh {username}@{ip}"""
cmd2 = f"""echo {password} | sudo -S apt update"""
cmd3 = " && "
cmd4 = f"""echo {password} | sudo -S apt upgrade -y"""
cmd5 = " && "
cmd6 = "exit"
commands = [cmd1, cmd2, cmd3, cmd4, cmd5, cmd6]

command = " ".join(commands)
cmd = command.split()

with Popen(cmd, stdout=PIPE, bufsize=1, universal_newlines=True) as p:
    for line in p.stdout:
        print(line, end='')
And to run the update on a local computer, the following code example does this.
import getpass
from subprocess import Popen, PIPE

print('What is the password?')
password = getpass.getpass()

cmd1_local = "apt update"
cmd2_local = "apt upgrade -y"
commands = [cmd1_local, cmd2_local]

with Popen(['echo', password], stdout=PIPE) as auth:
    for cmd in commands:
        cmd = cmd.split()
        with Popen(['sudo', '-S'] + cmd, stdin=auth.stdout, stdout=PIPE,
                   bufsize=1, universal_newlines=True) as p:
            for line in p.stdout:
                print(line, end='')