Can't get stdout data from a subprocess in Python

I'm trying to capture a command's STDOUT from the HandBrakeCLI program while it encodes a video, but I can't get Python to read anything from the standard output stream. I've tried the following code:
import subprocess
import sys
encode = subprocess.check_output("HandBrakeCLI -i video.mkv -o out.mp4", shell=True, stderr=subprocess.STDOUT, universal_newlines=True)
print(encode)
This printed nothing, and so did this alternative:
import subprocess
import sys
encode = subprocess.Popen("HandBrakeCLI -i video.mkv -o out.mp4", stdout=subprocess.PIPE, stderr = subprocess.PIPE, shell=True, universal_newlines=True)
print(encode.stdout.read())
As stated before, both result in no output. This application is the type that updates text on a single line in the terminal while it encodes; I'm not sure whether that kind of output stream causes a problem for Python.

It seems HandBrakeCLI changes its output depending on whether it is printing to a terminal. Either pass a command-line flag that forces the desired output, or trick it by providing a pseudo-terminal (if your system supports it) using pexpect or the pty module directly.
Code examples showing how to get output from a subprocess using the pexpect and pty modules:
Last unbuffered line can't be read
Python subprocess readlines() hangs
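A minimal sketch of the pty approach (Unix-only): attach the child's output to a pseudo-terminal so that programs which check isatty() behave as they do interactively. Since HandBrakeCLI is not assumed to be installed here, `echo` stands in for the real command; substitute the actual HandBrakeCLI invocation.

```python
import os
import pty
import subprocess

def run_with_pty(cmd):
    """Run cmd with stdout/stderr attached to a pseudo-terminal so programs
    that detect a tty (like HandBrakeCLI) emit their interactive output."""
    master, slave = pty.openpty()
    proc = subprocess.Popen(cmd, stdout=slave, stderr=slave, close_fds=True)
    os.close(slave)  # close our copy so reads end when the child exits
    chunks = []
    while True:
        try:
            data = os.read(master, 1024)
        except OSError:  # on Linux, EIO is raised at EOF on a pty master
            break
        if not data:
            break
        chunks.append(data)
    os.close(master)
    proc.wait()
    return b"".join(chunks).decode(errors="replace")

# Stand-in command; replace with the real HandBrakeCLI command line.
print(run_with_pty(["echo", "hello from a tty"]))
```

Note that a pty converts newlines to `\r\n`, which is also how a progress line that rewrites itself (using `\r`) will appear in the captured text.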


Python3 getting output from subprocess [duplicate]

I am trying to use Popen to scp a file from my laptop to a device on my network. The process is pretty straightforward... I can get the file to transfer, but I can't get the output from the command to display. I am specifically looking for the percentage complete. Here is what I have:
from subprocess import Popen, STDOUT, PIPE
scp_command = 'scp -i c:<local_key> <filepath to local file> <user>#<destination_device>:\path'
local_scp_command = Popen(scp_command, text=True, stout=PIPE)
output = local_scp_transfer.communicate
print(output)
I have tried a number of different combinations of stdout and printing the output; I can't even remember all the ways I have tried this. I imagine there is something fairly easy that I am missing. I am pretty new at programming, so even the easy things are complicated for me.
Thank you so much for all your help!
Use poll() to determine whether or not the process has finished and read a line:
from subprocess import Popen, STDOUT, PIPE
import shlex
scp_command = 'scp -i c:<local_key> <filepath to local file> <user>#<destination_device>:\path'
local_scp_command = Popen(shlex.split(scp_command), text=True, stdout=PIPE)
while local_scp_command.poll() is None and (line := local_scp_command.stdout.readline()):
print(line)
I added shlex.split because, without shell=True, Popen expects the command as a list of arguments rather than a single string. (The walrus assignment also needs parentheses in that position.)
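A self-contained version of that read loop, using a small Python child process as a stand-in for scp (scp typically shows its progress meter only when attached to a terminal, so in practice a pty may also be needed):

```python
import shlex
import sys
from subprocess import Popen, PIPE, STDOUT

# Stand-in for the scp command: a child that prints three "progress" lines.
fake_scp = '{} -c "print(10); print(50); print(100)"'.format(shlex.quote(sys.executable))

proc = Popen(shlex.split(fake_scp), text=True, stdout=PIPE, stderr=STDOUT)
progress = []
# readline() returns '' at EOF, so looping on it alone also avoids dropping
# any final lines that arrive after poll() first reports the process exited.
while (line := proc.stdout.readline()):
    progress.append(line.strip())
proc.wait()
print(progress)
```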

stdout of subprocess.Popen not working correctly

I am unable to save the output of subprocess.Popen correctly; I get this in the file I chose. The directory specified is correct, since just above I erase the text already in the file, which works. Any solutions to this?
Code is below
import subprocess

f = open("hunter_logs.txt", "w")
subp = subprocess.Popen(
    'docker run -p 5001-5110:5001-5110/udp -v D:\Hunter\hunter\hunter-scenarios:/hunter-scenarios europe-west3-docker.pkg.dev/hunter-all/controller-repo/hunter_controller:latest -d /hunter-scenarios -s croatia -i OPFOR', stdout=f)
Probably the process is outputting some of its logs to stderr and some to stdout. Add stderr=f as another argument to Popen() in order to capture both streams to the same file.
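A minimal sketch of capturing both streams to one file; a short Python child stands in for the docker command, which is assumed rather than run here:

```python
import subprocess
import sys

# Stand-in child that writes to both streams; substitute the real docker command.
cmd = [sys.executable, "-c",
       "import sys; print('log line'); print('error line', file=sys.stderr)"]

with open("hunter_logs.txt", "w") as f:
    # Pointing both stdout and stderr at the same file captures everything;
    # stderr=subprocess.STDOUT would instead merge stderr into the stdout pipe.
    subprocess.run(cmd, stdout=f, stderr=f)

print(open("hunter_logs.txt").read())
```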

Running bash command on server

I am trying to run the bash command pdfcrack in Python on a remote server. This is my code:
bashCommand = "pdfcrack -f pdf123.pdf > myoutput.txt"
import subprocess
process = subprocess.Popen(bashCommand.split(), stdout=subprocess.PIPE)
output, error = process.communicate()
I, however, get the following error message:
Non-option argument myoutput2.txt
Error: file > not found
Can anybody see my mistake?
The first argument to Popen is a list containing the command name and its arguments. > is not an argument to the command, though; it is shell syntax. You could simply pass the entire line to Popen and instruct it to use the shell to execute it:
process = subprocess.Popen(bashCommand, shell=True)
(Note that since you are redirecting the output of the command to a file, though, there is no reason to set its standard output to a pipe, because there will be nothing to read.)
A better solution, though, is to let Python handle the redirection.
# text=True makes process.stdout yield str, matching the text-mode file.
process = subprocess.Popen(['pdfcrack', '-f', 'pdf123.pdf'], stdout=subprocess.PIPE, text=True)
with open('myoutput.txt', 'w') as fh:
    for line in process.stdout:
        fh.write(line)
        # Do whatever else you want with line
Also, don't use str.split as a replacement for the shell's word splitting. A valid command line like pdfcrack -f "foo bar.pdf" would be split into the incorrect list ['pdfcrack', '-f', '"foo', 'bar.pdf"'], rather than the correct list ['pdfcrack', '-f', 'foo bar.pdf'].
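The difference is easy to demonstrate with shlex.split from the standard library:

```python
import shlex

cmd = 'pdfcrack -f "foo bar.pdf"'
print(cmd.split())        # naive whitespace split breaks the quoted filename
print(shlex.split(cmd))   # shell-style splitting keeps it as one argument
```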
> is interpreted by the shell, but it is not valid as a command argument otherwise.
So this works (don't split the command; use it as-is):
process = subprocess.Popen(bashCommand, shell=True)
(and stdout=subprocess.PIPE isn't useful here, since all output is redirected to the file)
But it is better to let Python handle the redirection natively and to pass the arguments as a list (which also handles quoting when needed):
import subprocess

with open("myoutput.txt", "w") as f:
    process = subprocess.Popen(["pdfcrack", "-f", "pdf123.pdf"], stdout=subprocess.PIPE, text=True)
    f.write(process.stdout.read())
    process.wait()
Your mistake is the > in the command.
It is not treated as a redirection to a file because that is normally the shell's job, and you are running the command without a shell.
Use shell=True if you want the shell to handle it. Then you also don't have to split the command into a list:
subprocess.Popen("pdfcrack -f pdf123.pdf > myoutput.txt", shell=True)

Subprocess writing stdin and reading stdout python 3.4

I am writing a script that runs a Linux command, writes a string (up to EOL) to its stdin, and reads a string (up to EOL) back from its stdout. The easiest illustration is the cat - command:
p=subprocess.Popen(['cat', '-'], stdin=subprocess.PIPE, stdout=subprocess.PIPE)
stringin="String of text\n"
p.stdin.write(stringin)
stringout=p.stdout.read()
print(stringout)
I aim to open the cat - process once and use it to write a string multiple times to its stdin every time getting a string from its stdout.
I googled quite a bit, and a lot of recipes don't work because the syntax is incompatible across different Python versions (I use 3.4). This is my first Python script from scratch, and I find the Python documentation quite confusing so far.
Thank you for your solution Salva.
Unfortunately, communicate() closes the cat - process, and I did not find a way with subprocess to talk to a single cat - instance without opening a new one for every call. I found an easy solution with pexpect, though:
import pexpect
p = pexpect.spawn('cat -')
p.setecho(False)
def echoback(stringin):
    p.sendline(stringin)
    reply = p.readline()
    return reply.decode()

i = 1
while (i < 11):
    print(echoback("Test no: "+str(i)))
    i = i + 1
In order to use pexpect Ubuntu users will have to install it through pip. If you wish to install it for python3.x, you will have to install pip3 (python3-pip) first from the Ubuntu repo.
Well you need to communicate with the process:
from subprocess import Popen, PIPE
s = Popen(['cat', '-'], stdin=PIPE, stdout=PIPE, stderr=PIPE)
input = b'hello!' # notice the input data are actually bytes and not text
output, errs = s.communicate(input)
To use unicode strings, you would need to encode() the input and decode() the output:
from subprocess import Popen, PIPE
s = Popen(['cat', '-'], stdin=PIPE, stdout=PIPE, stderr=PIPE)
input = 'España'
output, errs = s.communicate(input.encode())
output, errs = output.decode(), errs.decode()
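Since communicate() waits for the process to exit, the repeated request/response pattern the question asks for can instead be sketched by writing a line, flushing stdin, and reading one line back, keeping the process alive. This works for tools like cat that echo line by line; programs that block-buffer their output need a pty, as in the pexpect answer above.

```python
from subprocess import Popen, PIPE

# bufsize=1 requests line buffering for our pipe ends in text mode.
p = Popen(["cat", "-"], stdin=PIPE, stdout=PIPE, text=True, bufsize=1)

def echoback(text):
    p.stdin.write(text + "\n")
    p.stdin.flush()                      # push the line to the child now
    return p.stdout.readline().rstrip("\n")

replies = [echoback("Test no: " + str(i)) for i in range(1, 11)]
print(replies)

p.stdin.close()
p.wait()
```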

Redirect message from python console to txt file

I'm using a Python script to call a batch file that in turn runs generation tools and a compiler. I save the output to a file using:
os.system("run.bat >output.txt")
The problem is that warnings and compiler error are not saved in this output file - they are only displayed on the Python console.
How would I go about saving the output to a file?
You can do several things:
Use the subprocess family to get handles to the stdout and stderr files
Use a redirect of the stderr in your os.system call.
The first one is more pythonic, and allows your python script to be aware of the errors.
A simple way to redirect stderr to the same file is like this:
os.system("run.bat > output.txt 2>&1")
Don't use os.system: you get no error handling there, or even worse, no indication that an error happened.
I would solve this problem like this:
from subprocess import Popen, PIPE

# text=True makes communicate() return str, matching the text-mode files below.
proc = Popen(['run.bat'], stdout=PIPE, stderr=PIPE, text=True)
result = proc.communicate()
with open('output.txt', 'w') as output:
    output.write(result[0])
with open('errors.txt', 'w') as errors:
    errors.write(result[1])
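On Python 3.5+, subprocess.run makes the same capture-and-save pattern shorter, and check=True raises if the command fails. A short Python child stands in for run.bat here, since the batch file is assumed rather than available:

```python
import subprocess
import sys

# Stand-in for run.bat; substitute the real batch file.
cmd = [sys.executable, "-c",
       "import sys; print('build ok'); print('warning: W1', file=sys.stderr)"]

# check=True raises CalledProcessError on a non-zero exit code,
# so a failing build cannot pass silently.
result = subprocess.run(cmd, capture_output=True, text=True, check=True)

with open("output.txt", "w") as out:
    out.write(result.stdout)
with open("errors.txt", "w") as err:
    err.write(result.stderr)
```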
