Say I have this program printer.py:
#!/usr/bin/env python3
import sys
import time
sys.stdout.write("STDOUT 1\n")
time.sleep(1)
sys.stderr.write("STDERR 2\n")
time.sleep(1)
sys.stdout.write("STDOUT 3\n")
time.sleep(1)
sys.stderr.write("STDERR 4\n")
time.sleep(1)
It prints to stdout and stderr to produce:
./printer.py
STDOUT 1
STDERR 2
STDOUT 3
STDERR 4
I would like to execute printer.py inside another python script, runner.py, and print in real time both stderr and stdout. The following version of runner.py does not work:
#!/usr/bin/env python3
import sys
import subprocess
def run_command(command):
    process = subprocess.Popen(command, shell=True, stdout=subprocess.PIPE)
    while True:
        output = process.stdout.readline().decode()
        if output == '' and process.poll() is not None:
            break
        if output:
            print(output.strip())
    rc = process.poll()
    return rc
rc = run_command('./printer.py')
because it prints the stderr lines first in real-time and the stdout lines later all at once:
./runner.py
STDERR 2
STDERR 4
STDOUT 1
STDOUT 3
How can I fix it to get the correct order 1, 2, 3, 4 in real time? The closest I could get is by using:
rc = run_command('./printer.py 1>&2')
which is kind of ok, but I wonder whether I could make it do the proper thing and print to stdout and stderr in the same way as printer.py.
sys.stdout.flush() as suggested in comments makes no difference:
#!/usr/bin/env python3
import sys
import subprocess
def run_command(command):
    process = subprocess.Popen(command, shell=True, stdout=subprocess.PIPE)
    while True:
        output = process.stdout.readline().decode()
        if output == '' and process.poll() is not None:
            break
        if output:
            sys.stdout.write(output.strip() + '\n')
            sys.stdout.flush()
    rc = process.poll()
    return rc
rc = run_command('./printer.py')
./runner.py
STDERR 2
STDERR 4
STDOUT 1
STDOUT 3
The same for print(..., flush=True). Am I doing something wrong?
I'm collating the comments and adding a bit of my own. Credit goes to @Barmar and @MarkSetchell.
In the end, I think I'm going for the following solution:
rc = run_command('PYTHONUNBUFFERED=1 ./printer.py')
It should do the same as @MarkSetchell's python -u ./printer.py. However, for that I would have to explicitly set the path to printer.py, and I would rather avoid that. But I'm not sure yet about the pros and cons of each.
unbuffer solution: On my Ubuntu 18 it is not installed, so I'd rather avoid an additional dependency. As I understand it, I would use it as rc = run_command('unbuffer ./printer.py')?
Editing printer.py is not an option for me; otherwise, adding sys.stdout.flush() after each print or sys.stdout.write call should also work.
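For reference, a sketch of what runner.py could look like with that solution. Passing PYTHONUNBUFFERED via env instead of prefixing the shell command, and merging stderr into stdout with stderr=subprocess.STDOUT, are my additions, since a single pipe is the only way the reader can preserve the relative order of the two streams:
#!/usr/bin/env python3
import os
import subprocess

def run_command(command):
    # Disable the child's output buffering without editing printer.py.
    env = dict(os.environ, PYTHONUNBUFFERED='1')
    # Merging stderr into stdout is an assumption: one pipe preserves
    # the relative order of the two streams, at the cost of mixing them.
    process = subprocess.Popen(command, shell=True,
                               stdout=subprocess.PIPE,
                               stderr=subprocess.STDOUT,
                               env=env)
    while True:
        output = process.stdout.readline().decode()
        if output == '' and process.poll() is not None:
            break
        if output:
            print(output.strip())
    return process.poll()

rc = run_command('./printer.py')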
I pass an executable on the command-line to my python script. I do some calculations and then I'd like to send the result of these calculations on STDIN to the executable. When it has finished I would like to get the executable's result back from STDOUT.
ciphertext = str(hex(C1))
exe = Popen([sys.argv[1]], stdout=PIPE, stdin=PIPE)
result = exe.communicate(input=ciphertext)[0]
print(result)
When I print result I get nothing: not None, just an empty line. I'm sure that the executable works with the data, as I've repeated the same thing using the '>' on the command line with the same previously calculated result.
A working example
#!/usr/bin/env python
import subprocess
text = 'hello'
proc = subprocess.Popen(
    'md5sum', stdout=subprocess.PIPE,
    stdin=subprocess.PIPE)
proc.stdin.write(text)
proc.stdin.close()
result = proc.stdout.read()
print result
proc.wait()
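On Python 3 the same example needs bytes on the pipe (or text=True). A minimal sketch using communicate(), which writes the input, closes stdin, and reads stdout to EOF in one step:
#!/usr/bin/env python3
import subprocess

proc = subprocess.Popen('md5sum', stdout=subprocess.PIPE, stdin=subprocess.PIPE)
# Assumes Python 3: the pipe carries bytes, hence b'hello' and decode().
result, _ = proc.communicate(input=b'hello')
print(result.decode())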
To get the same thing as “executable < params.file > output.file”, do this:
#!/usr/bin/env python
import subprocess
infile, outfile = 'params.file', 'output.file'
with open(outfile, 'w') as ouf:
    with open(infile, 'r') as inf:
        proc = subprocess.Popen(
            'md5sum', stdout=ouf, stdin=inf)
        proc.wait()
I am trying to make a terminal with python subprocess. I have this code, but I don't know how to give it the inputs.
import subprocess
while True:
    command = input("cmd>")
    process = subprocess.Popen(command, stdin=subprocess.PIPE, stdout=subprocess.PIPE, shell=True)
    while True:
        line = process.stdout.readline()
        if process.poll() is not None:
            break
        if line:
            print(line.decode(), end="")
Example:
If I run this python file with it, how can I type the message?
Command:
cmd>python test.py
#test.py
message = input("message:")
print(message)
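One approach that might work here (a sketch, not a definitive answer): leave stdin un-redirected, so the child inherits the terminal and input("message:") in test.py reads what you type directly. Note that the child's own prompt may appear late, because its stdout is still a pipe and can be buffered:
import subprocess

while True:
    command = input("cmd>")
    # Sketch: no stdin=subprocess.PIPE, so the child reads from the same terminal.
    process = subprocess.Popen(command, stdout=subprocess.PIPE, shell=True)
    while True:
        line = process.stdout.readline()
        if not line and process.poll() is not None:
            break
        if line:
            print(line.decode(), end="")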
I am using the subprocess module to interact with the output of linux commands. Below is my code.
import subprocess
import sys
file_name = 'myfile.txt'
p = subprocess.Popen("grep \"SYSTEM CONTROLLER\" "+ file_name, stdout=subprocess.PIPE, shell=True)
(output, err) = p.communicate()
print output.strip()
p = subprocess.Popen("grep \"controller\|worker\" "+ file_name, stdout=subprocess.PIPE, shell=True)
(output, err) = p.communicate()
lines = output.rstrip().split("\n")
print lines
My program hangs while executing the second subprocess, i.e.
p = subprocess.Popen("grep \"controller\|worker\""+ file_name,stdout=subprocess.PIPE, shell=True)
I got to know that the reason for the hang is that the buffer behind subprocess.PIPE fills up, which blocks the child process from writing further.
I want to know if there is any way to avoid the buffer-full situation so that my program keeps executing without hanging.
The actual issue is that a space is missing between the pattern and the filename, and therefore grep waits for input on standard input (stdin).
The "buffer full" explanation is a red herring here (.communicate() is not susceptible to it), and so is p.stdout.read(): it fixes nothing, because it also loads the output into memory and, unlike .communicate(), it fails if more than one pipe is used.
Drop shell=True and use a list argument for the command:
#!/usr/bin/env python
from subprocess import Popen, PIPE
p = Popen(["grep", r"controller\|worker", file_name], stdout=PIPE)
output = p.communicate()[0]
if p.returncode == 0:
print('found')
elif p.returncode == 1:
print('not found')
else:
print('error')
As it says at https://docs.python.org/3/library/subprocess.html#subprocess.Popen.communicate:
Note: The data read is buffered in memory, so do not use this method
if the data size is large or unlimited.
Instead, use the file objects to read the text as it is produced:
output = p.stdout.read()
As long as no other pipes (e.g. stderr) fill up while you are reading, the process shouldn't be blocked.
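For example, iterating over the pipe line by line consumes the output as it is produced and keeps memory use bounded. A sketch reusing the grep command and file name from the question:
from subprocess import Popen, PIPE

# Reuses the file name 'myfile.txt' from the question.
p = Popen(["grep", r"controller\|worker", "myfile.txt"], stdout=PIPE)
# Consume each line as grep produces it instead of buffering everything.
for line in p.stdout:
    print(line.decode().rstrip())
p.wait()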
My python script uses subprocess to call a linux utility that is very noisy. I want to store all of the output to a log file and show some of it to the user. I thought the following would work, but the output doesn't show up in my application until the utility has produced a significant amount of output.
#fake_utility.py, just generates lots of output over time
import time
i = 0
while True:
    print hex(i)*512
    i += 1
    time.sleep(0.5)
#filters output
import subprocess
proc = subprocess.Popen(['python','fake_utility.py'],stdout=subprocess.PIPE)
for line in proc.stdout:
    #the real code does filtering here
    print "test:", line.rstrip()
The behavior I really want is for the filter script to print each line as it is received from the subprocess. Sorta like what tee does but with python code.
What am I missing? Is this even possible?
Update:
If a sys.stdout.flush() is added to fake_utility.py, the code has the desired behavior in python 3.1. I'm using python 2.6. You would think that using proc.stdout.xreadlines() would work the same as py3k, but it doesn't.
Update 2:
Here is the minimal working code.
#fake_utility.py, just generates lots of output over time
import sys, time
for i in range(10):
    print i
    sys.stdout.flush()
    time.sleep(0.5)
#display output line by line
import subprocess
proc = subprocess.Popen(['python','fake_utility.py'],stdout=subprocess.PIPE)
#works in python 3.0+
#for line in proc.stdout:
for line in iter(proc.stdout.readline,''):
    print line.rstrip()
I think the problem is with the statement for line in proc.stdout, which reads the entire input before iterating over it. The solution is to use readline() instead:
#filters output
import subprocess
proc = subprocess.Popen(['python','fake_utility.py'],stdout=subprocess.PIPE)
while True:
    line = proc.stdout.readline()
    if not line:
        break
    #the real code does filtering here
    print "test:", line.rstrip()
Of course you still have to deal with the subprocess' buffering.
Note: according to the documentation the solution with an iterator should be equivalent to using readline(), except for the read-ahead buffer, but (or exactly because of this) the proposed change did produce different results for me (Python 2.5 on Windows XP).
Bit late to the party, but was surprised not to see what I think is the simplest solution here:
import io
import subprocess
proc = subprocess.Popen(["prog", "arg"], stdout=subprocess.PIPE)
for line in io.TextIOWrapper(proc.stdout, encoding="utf-8"):  # or another encoding
    print(line, end='')  # do something with line
(This requires Python 3.)
Indeed, if you sorted out the iterator then buffering could now be your problem. You could tell the python in the sub-process not to buffer its output.
proc = subprocess.Popen(['python','fake_utility.py'],stdout=subprocess.PIPE)
becomes
proc = subprocess.Popen(['python','-u', 'fake_utility.py'],stdout=subprocess.PIPE)
I have needed this when calling python from within python.
You want to pass these extra parameters to subprocess.Popen:
bufsize=1, universal_newlines=True
Then you can iterate as in your example. (Tested with Python 3.5)
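For example, applied to the question's code (a sketch; bufsize=1 requests line buffering and only takes effect in text mode, i.e. together with universal_newlines=True):
import subprocess

proc = subprocess.Popen(['python', 'fake_utility.py'],
                        stdout=subprocess.PIPE,
                        bufsize=1,                 # line-buffered (text mode only)
                        universal_newlines=True)   # decode to str, enable line buffering
for line in proc.stdout:
    print("test:", line.rstrip())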
A function that allows iterating over both stdout and stderr concurrently, in realtime, line by line
In case you need to get the output stream for both stdout and stderr at the same time, you can use the following function.
The function uses Queues to merge both Popen pipes into a single iterator.
Here we create the function read_popen_pipes():
from queue import Queue, Empty
from concurrent.futures import ThreadPoolExecutor
def enqueue_output(file, queue):
    for line in iter(file.readline, ''):
        queue.put(line)
    file.close()

def read_popen_pipes(p):
    with ThreadPoolExecutor(2) as pool:
        q_stdout, q_stderr = Queue(), Queue()
        pool.submit(enqueue_output, p.stdout, q_stdout)
        pool.submit(enqueue_output, p.stderr, q_stderr)
        while True:
            if p.poll() is not None and q_stdout.empty() and q_stderr.empty():
                break
            out_line = err_line = ''
            try:
                out_line = q_stdout.get_nowait()
            except Empty:
                pass
            try:
                err_line = q_stderr.get_nowait()
            except Empty:
                pass
            yield (out_line, err_line)
read_popen_pipes() in use:
import subprocess as sp
with sp.Popen(my_cmd, stdout=sp.PIPE, stderr=sp.PIPE, text=True) as p:
    for out_line, err_line in read_popen_pipes(p):
        # Do stuff with each line, e.g.:
        print(out_line, end='')
        print(err_line, end='')
    rc = p.poll()  # status code (use return p.poll() if this runs inside a function)
You can also read all lines without a loop (note that readlines() blocks until the process closes its stdout, so this is not real-time). Works in Python 3.6.
import subprocess

process = subprocess.Popen(command, stdout=subprocess.PIPE)
list_of_byte_strings = process.stdout.readlines()
Python 3.5 added the run() function to the subprocess module, which returns a CompletedProcess object. With this you are fine using proc.stdout.splitlines() (note that capture_output requires Python 3.7):
proc = subprocess.run(command, shell=True, capture_output=True, text=True, check=True)
for line in proc.stdout.splitlines():
    print("stdout:", line)
See also How to Execute Shell Commands in Python Using the Subprocess Run Method
I tried this with Python 3 and it worked.
When you use Popen to spawn the new process, you tell the operating system to pipe the stdout of the child process so the parent process can read it; here, the child's stderr is merged into that same pipe via stderr=subprocess.STDOUT.
In output_reader we read each line of the child's stdout by wrapping it in an iterator that yields each line as soon as it is ready.
import subprocess
import threading
import time

def output_reader(proc):
    for line in iter(proc.stdout.readline, b''):
        print('got line: {0}'.format(line.decode('utf-8')), end='')

def main():
    proc = subprocess.Popen(['python', 'fake_utility.py'],
                            stdout=subprocess.PIPE,
                            stderr=subprocess.STDOUT)
    t = threading.Thread(target=output_reader, args=(proc,))
    t.start()
    try:
        # Let the reader thread stream output while the utility runs.
        time.sleep(0.2)
    finally:
        proc.terminate()
        try:
            proc.wait(timeout=0.2)
            print('== subprocess exited with rc =', proc.returncode)
        except subprocess.TimeoutExpired:
            print('subprocess did not terminate in time')
    t.join()

main()
The following modification of Rômulo's answer works for me on Python 2 and 3 (2.7.12 and 3.6.1):
import os
import subprocess
process = subprocess.Popen(command, stdout=subprocess.PIPE)
while True:
    line = process.stdout.readline()
    if not line:
        break
    os.write(1, line)
I was having a problem with the arg list of Popen when updating servers; the following code resolves this a bit.
import getpass
from subprocess import Popen, PIPE
username = 'user1'
ip = '127.0.0.1'
print ('What is the password?')
password = getpass.getpass()
cmd1 = f"""sshpass -p {password} ssh {username}#{ip}"""
cmd2 = f"""echo {password} | sudo -S apt update"""
cmd3 = " && "
cmd4 = f"""echo {password} | sudo -S apt upgrade -y"""
cmd5 = " && "
cmd6 = "exit"
commands = [cmd1, cmd2, cmd3, cmd4, cmd5, cmd6]
command = " ".join(commands)
cmd = command.split()
with Popen(cmd, stdout=PIPE, bufsize=1, universal_newlines=True) as p:
    for line in p.stdout:
        print(line, end='')
And to run the update on a local computer, the following code example does this.
import getpass
from subprocess import Popen, PIPE
print ('What is the password?')
password = getpass.getpass()
cmd1_local = f"""apt update"""
cmd2_local = f"""apt upgrade -y"""
commands = [cmd1_local, cmd2_local]
with Popen(['echo', password], stdout=PIPE) as auth:
    for cmd in commands:
        cmd = cmd.split()
        with Popen(['sudo', '-S'] + cmd, stdin=auth.stdout, stdout=PIPE, bufsize=1, universal_newlines=True) as p:
            for line in p.stdout:
                print(line, end='')