Why is my output one character per line? - python

I have this python script which calls a shell script and processes the output.
$ cat cd-and-run.py
#!/usr/bin/env python
import sys, getopt
import subprocess
def run_mock_phantom(test_value):
    aid = 'unknown'
    proc = subprocess.Popen(['./mock-phanton.sh', test_value], stdout=subprocess.PIPE)
    for line in proc.communicate()[0]:
        print line
    return aid

def main(argv):
    app_id = run_mock_phantom('test-one-two')
    print "app is %s" % (app_id)

if __name__ == "__main__":
    main(sys.argv[1:])
Here is the shell script the above script calls:
$ cat mock-phanton.sh
#!/bin/sh
appid=$1
if [ -z $appid ]
then
    echo "oh man you didnt do it right ..."
    exit 0
fi
echo "APP_ID=$appid"
When I run the script I get this output:
$ ./cd-and-run.py
A
P
P
_
I
D
=
t
e
s
t
-
o
n
e
-
t
w
o
app is unknown
What I don't understand is why each character gets printed on a separate line instead of just ...
APP_ID=test-one-two
?

Try changing this:
for line in proc.communicate()[0]:
    print line
to this:
print proc.communicate()[0]
I think you're unintentionally iterating over a string: iterating over a string in Python yields one character at a time, which is why each character ends up on its own line.
EDIT:
As mentioned in this question, you can iterate over proc.communicate()[0].splitlines() for multiple lines of stdout. Perhaps a cleaner way to do it is like this, described in the same question:
for line in proc.stdout:
    print line
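For completeness, a minimal Python 3 sketch of the corrected loop (text=True makes communicate() return str, and splitlines() then yields whole lines rather than characters):
import subprocess

proc = subprocess.Popen(['./mock-phanton.sh', 'test-one-two'],
                        stdout=subprocess.PIPE, text=True)
out, _ = proc.communicate()
for line in out.splitlines():  # whole lines, not individual characters
    print(line)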

Related

run cmake command via subprocess.Popen or subprocess.run [duplicate]

If I run echo a; echo b in bash, both commands are run. However, if I use subprocess then only the first command (echo) is run, and it prints out the whole of the rest of the line as its argument.
The code below echoes a; echo b instead of a then b. How do I get it to run both commands?
import subprocess, shlex
def subprocess_cmd(command):
    process = subprocess.Popen(shlex.split(command), stdout=subprocess.PIPE)
    proc_stdout = process.communicate()[0].strip()
    print proc_stdout
subprocess_cmd("echo a; echo b")
You have to use shell=True in subprocess and not use shlex.split:
import subprocess
command = "echo a; echo b"
ret = subprocess.run(command, capture_output=True, shell=True)
# before Python 3.7:
# ret = subprocess.run(command, stdout=subprocess.PIPE, shell=True)
print(ret.stdout.decode())
returns:
a
b
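If you'd rather avoid shell=True, a sketch of an alternative is to run each command separately and concatenate the output yourself (this assumes the commands are independent, as they are here):
import subprocess

commands = ["echo a", "echo b"]
for cmd in commands:
    # each string is a single command, so a plain split is enough here
    ret = subprocess.run(cmd.split(), capture_output=True, text=True)
    print(ret.stdout, end="")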
I just stumbled on a situation where I needed to run a bunch of lines of bash code (not separated by semicolons) from within Python. In this scenario the proposed solutions do not help. One approach would be to save the code to a file and then run it with Popen, but that wasn't possible in my situation.
What I ended up doing is something like:
commands = '''
echo "a"
echo "b"
echo "c"
echo "d"
'''
process = subprocess.Popen('/bin/bash', stdin=subprocess.PIPE, stdout=subprocess.PIPE)
out, err = process.communicate(commands)
print out
So I first create the child bash process and after I tell it what to execute. This approach removes the limitations of passing the command directly to the Popen constructor.
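On Python 3 the same idea needs the pipe opened in text mode (or the commands encoded to bytes); a minimal sketch, assuming bash lives at /bin/bash:
import subprocess

commands = '''
echo "a"
echo "b"
'''
# text=True (Python 3.7+) lets communicate() accept and return str
process = subprocess.Popen('/bin/bash', stdin=subprocess.PIPE,
                           stdout=subprocess.PIPE, text=True)
out, err = process.communicate(commands)
print(out)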
Join commands with "&&".
import os
os.system('echo a > outputa.txt && echo b > outputb.txt')
If you're only running the commands in one shot then you can just use subprocess.check_output convenience function:
def subprocess_cmd(command):
    output = subprocess.check_output(command, shell=True)
    print output
>>> command = "echo a; echo b"
>>> shlex.split(command)
['echo', 'a; echo', 'b']
So the problem is that the shlex module does not handle ";".
I got an error like this when I used capture_output=True (it was only added in Python 3.7):
TypeError: __init__() got an unexpected keyword argument 'capture_output'
After making the changes below, it works fine:
import subprocess
command = '''ls'''
result = subprocess.run(command, stdout=subprocess.PIPE, shell=True)
print(result.stdout.splitlines())
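Note that result.stdout is bytes here, so splitlines() yields bytes objects; decoding first gives ordinary strings:
import subprocess

result = subprocess.run('ls', stdout=subprocess.PIPE, shell=True)
print(result.stdout.decode().splitlines())  # list of str rather than bytes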
import subprocess
cmd = "vsish -e ls /vmkModules/lsom/disks/ | cut -d '/' -f 1 | while read diskID ; do echo $diskID; vsish -e cat /vmkModules/lsom/disks/$diskID/virstoStats | grep -iE 'Delete pending |trims currently queued' ; echo '====================' ;done ;"
def subprocess_cmd(command):
    process = subprocess.Popen(command, stdout=subprocess.PIPE, shell=True)
    proc_stdout = process.communicate()[0].strip()
    for line in proc_stdout.decode().split('\n'):
        print(line)
subprocess_cmd(cmd)


How to execute if else unix command from python and get ret value

Following is the code which I am trying to execute using python
import subprocess
cmd = 'if (-e "../a.txt") then \n ln -s ../a.txt . \n else \n echo "file is not present " \n endif'
ret_val = subprocess.call(cmd, shell=True)
When executed gives following error message
/bin/sh: -c: line 5: syntax error: unexpected end of file
In sh scripts, if is terminated with fi, not endif.
http://tldp.org/LDP/Bash-Beginners-Guide/html/sect_07_01.html
Or just write the darn code in Python:
import os
if os.path.exists('../a.txt'):
    print 'Exists'
    os.symlink('../a.txt', 'a.txt')
else:
    print 'Does not exist'
See os.path.exists() and os.symlink().
If you really want to run tcsh commands, then:
import subprocess
args = ['tcsh', '-c', some_tcsh_command]  # -c takes the command as a single argument
ret = subprocess.call(args)
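Since the question also asks about the return value: subprocess.call returns the child's exit status. A hypothetical sketch of acting on it (test -e exits with 0 when the file exists):
import subprocess

# hypothetical example: an exit status of 0 means the file exists
ret = subprocess.call(['tcsh', '-c', 'test -e ../a.txt'])
if ret == 0:
    print('Exists')
else:
    print('Does not exist')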

read -p command in linux script is not displayed correctly by subprocess in python

I want to read the output line by line.
Below is my bash code (meta.sh):
#!/bin/bash
echo "hello!"
read -p "Continue?(y/n)"
[ "$REPLY" == "y" ] || exit
echo "lol"
Below is my subprocess code (test.py):
import subprocess
def create_subprocess(Command):
    proc = subprocess.Popen(
        Command,
        shell=True,
        stdout=subprocess.PIPE,
        stderr=subprocess.STDOUT,
        bufsize=1
    )
    return proc
Command = "/root/Desktop/meta.sh"
proc = create_subprocess(Command)
while True:
    line = proc.stdout.readline()
    if not line: break
    print line.strip()
Now when I run python test.py it shows:
hello!
Then, when I press y and Enter, it shows:
Continue?(y/n)
lol
What it should ideally show is:
hello!
Continue?(y/n)
and then, after I press y and Enter:
lol
As I mentioned in the comment, the problem is that Python's readline waits until it gets either an end of line or end of file, neither of which is produced by the read -p prompt (note that you type on the same line, because the prompt does not end with a newline). That means your readline doesn't give you anything back until after lol is printed. You can get around this by reading one character at a time, like so:
while True:
    cur = proc.stdout.read(1)
    if not cur:
        break
    sys.stdout.write(cur)
    sys.stdout.flush()
though you'll have to import sys as well
This will display each character from proc.stdout as soon as it's read and not wait for newlines.
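On Python 3 the pipe yields bytes by default, so the same loop would decode each read (or you could pass text=True to Popen); a sketch assuming the same meta.sh path as in the question:
import subprocess
import sys

proc = subprocess.Popen("/root/Desktop/meta.sh", shell=True,
                        stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
while True:
    cur = proc.stdout.read(1)
    if not cur:
        break
    sys.stdout.write(cur.decode())  # read(1) returns bytes in Python 3
    sys.stdout.flush()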

How to make a python script "pipeable" in bash?

I wrote a script and I want it to be pipeable in bash. Something like:
echo "1stArg" | myscript.py
Is it possible? How?
See this simple echo.py:
import sys
if __name__ == "__main__":
    for line in sys.stdin:
        sys.stderr.write("DEBUG: got line: " + line)
        sys.stdout.write(line)
running:
ls | python echo.py 2>debug_output.txt | sort
output:
echo.py
test.py
test.sh
debug_output.txt content:
DEBUG: got line: echo.py
DEBUG: got line: test.py
DEBUG: got line: test.sh
I'll complement the other answers with a grep example that uses fileinput to implement the typical behaviour of UNIX tools: 1) if no arguments are specified, it reads data from stdin; 2) many files can be specified as arguments; 3) a single argument of - means stdin.
import fileinput
import re
import sys
def grep(lines, regexp):
    return (line for line in lines if regexp.search(line))

def main(args):
    if len(args) < 1:
        print("Usage: grep.py PATTERN [FILE...]", file=sys.stderr)
        return 2
    regexp = re.compile(args[0])
    input_lines = fileinput.input(args[1:])
    for output_line in grep(input_lines, regexp):
        sys.stdout.write(output_line)

if __name__ == "__main__":
    sys.exit(main(sys.argv[1:]))
Example:
$ seq 1 20 | python grep.py "4"
4
14
In your Python script you simply read from stdin.
Everything that reads from stdin is "pipeable". A pipe simply connects the stdout of the former program to the stdin of the latter.
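If the script should also work when nothing is piped in, one common sketch is to check whether stdin is a terminal with sys.stdin.isatty():
import sys

if sys.stdin.isatty():
    # stdin is a terminal: fall back to command-line arguments
    data = ' '.join(sys.argv[1:])
else:
    # stdin is a pipe or redirection: read it all
    data = sys.stdin.read()

print(data)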
