Closing Python command subprocesses

I want to continue with further commands after closing the subprocess. I have the following code, but fsutil is never executed. How can I do it?
import os
from subprocess import Popen, PIPE, STDOUT
os.system('mkdir c:\\temp\\vhd')
p = Popen( ["diskpart"], stdin=PIPE, stdout=PIPE )
p.stdin.write("create vdisk file=c:\\temp\\vhd\\test.vhd maximum=2000 type=expandable\n")
p.stdin.write("attach vdisk\n")
p.stdin.write("create partition primary size=10\n")
p.stdin.write("format fs=ntfs quick\n")
p.stdin.write("assign letter=r\n")
p.stdin.write("exit\n")
p.stdout.close  # note: without parentheses this is a no-op; close() would be needed
os.system('fsutil file createnew r:\\dummy.txt 6553600') # this doesn't get executed

At the least, I think you need to change your code to look like this:
import os
from subprocess import Popen, PIPE
os.system('mkdir c:\\temp\\vhd')
p = Popen(["diskpart"], stdin=PIPE, stdout=PIPE, stderr=PIPE)
p.stdin.write("create vdisk file=c:\\temp\\vhd\\test.vhd maximum=2000 type=expandable\n")
p.stdin.write("attach vdisk\n")
p.stdin.write("create partition primary size=10\n")
p.stdin.write("format fs=ntfs quick\n")
p.stdin.write("assign letter=r\n")
p.stdin.write("exit\n")
results, errors = p.communicate()
os.system('fsutil file createnew r:\\dummy.txt 6553600')
From the documentation for Popen.communicate():
Interact with process: Send data to stdin. Read data from stdout and stderr, until end-of-file is reached. Wait for process to terminate. The optional input argument should be a string to be sent to the child process, or None, if no data should be sent to the child.
You could replace the p.communicate() with p.wait(), but there is this warning in the documentation for Popen.wait():
Warning This will deadlock when using stdout=PIPE and/or stderr=PIPE and the child process generates enough output to a pipe such that it blocks waiting for the OS pipe buffer to accept more data. Use communicate() to avoid that.
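Since communicate() also accepts an input argument, an alternative sketch (untested; it keeps the Python 2 string semantics of the original, on Python 3 you would pass bytes or set universal_newlines=True) is to send the whole diskpart script in one call instead of separate writes:
import os
from subprocess import Popen, PIPE

os.system('mkdir c:\\temp\\vhd')
# send every diskpart command at once; communicate() also waits for
# diskpart to exit before the script moves on to fsutil
script = ("create vdisk file=c:\\temp\\vhd\\test.vhd maximum=2000 type=expandable\n"
          "attach vdisk\n"
          "create partition primary size=10\n"
          "format fs=ntfs quick\n"
          "assign letter=r\n"
          "exit\n")
p = Popen(["diskpart"], stdin=PIPE, stdout=PIPE, stderr=PIPE)
results, errors = p.communicate(script)
os.system('fsutil file createnew r:\\dummy.txt 6553600')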

Related

How can I simulate a key press in a Python subprocess?

The scenario is: part of my Python script executes an external program using the code below:
subprocess.run(["someExternalProgram", "some options"], shell=True)
And when the external program finishes, it requires user to "press any key to exit".
Since this is just a step in my script, it would be good for me to just exit on behalf of the user.
Is it possible to achieve this and if so, how?
from subprocess import Popen, PIPE
p = Popen(["someExternalProgram", "some options"], stdin=PIPE, shell=True)
p.communicate(input=b'\n')
If you want to capture the output and error log:
from subprocess import Popen, PIPE
p = Popen(["someExternalProgram", "some options"], stdin=PIPE, stdout=PIPE, stderr=PIPE, shell=True)
output, error = p.communicate(input=b'\n')
Remember that the input has to be a bytes object.
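On Python 3.5+, the higher-level subprocess.run() wraps the same Popen/communicate pattern in one call; a minimal sketch with the same placeholder command:
from subprocess import run, PIPE

# run() feeds `input` to the child's stdin and waits for the process to exit
result = run(["someExternalProgram", "some options"], shell=True,
             input=b'\n', stdout=PIPE, stderr=PIPE)
output, error = result.stdout, result.stderr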

Displaying subprocess output to stdout and redirecting it

I'm running a script via Python's subprocess module. Currently I use:
p = subprocess.Popen('/path/to/script', stdout=subprocess.PIPE, stderr=subprocess.PIPE)
result = p.communicate()
I then print the result to stdout. This is all fine, but as the script takes a long time to complete, I also wanted real-time output from the script to stdout. The reason I pipe the output is because I want to parse it.
To save the subprocess' stdout to a variable for further processing and to display it as it arrives, while the child process is running:
#!/usr/bin/env python3
from io import StringIO
from subprocess import Popen, PIPE
with Popen('/path/to/script', stdout=PIPE, bufsize=1,
           universal_newlines=True) as p, StringIO() as buf:
    for line in p.stdout:
        print(line, end='')
        buf.write(line)
    output = buf.getvalue()
rc = p.returncode
Saving both the subprocess's stdout and stderr is more complex, because you must consume both streams concurrently to avoid a deadlock:
stdout_buf, stderr_buf = StringIO(), StringIO()
rc = teed_call('/path/to/script', stdout=stdout_buf, stderr=stderr_buf,
               universal_newlines=True)
output = stdout_buf.getvalue()
...
where teed_call() is defined here.
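The linked definition is not reproduced in this excerpt, so here is a minimal sketch of what a teed_call()-style helper might look like (my reconstruction, not the linked original, assuming text-mode pipes via universal_newlines=True as in the call above): it drains stdout and stderr on separate threads so that neither pipe can fill up and deadlock.
import sys
from subprocess import Popen, PIPE
from threading import Thread

def tee(infile, *files):
    """Copy each line from infile to every file in files, on its own thread."""
    def fanout():
        for line in iter(infile.readline, ''):  # '' sentinel assumes text mode
            for f in files:
                f.write(line)
        infile.close()
    t = Thread(target=fanout)
    t.daemon = True
    t.start()
    return t

def teed_call(cmd, stdout, stderr, **kwargs):
    p = Popen(cmd, stdout=PIPE, stderr=PIPE, **kwargs)
    threads = [tee(p.stdout, stdout, sys.stdout),   # echo + capture stdout
               tee(p.stderr, stderr, sys.stderr)]   # echo + capture stderr
    for t in threads:
        t.join()  # wait until both streams reach EOF
    return p.wait()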
Update: here's a simpler asyncio version (a modern async/await sketch also appears after the old version below).
Old version:
Here's a single-threaded solution based on the child_process.py example from tulip:
import asyncio
import sys
from asyncio.subprocess import PIPE

@asyncio.coroutine
def read_and_display(*cmd):
    """Read cmd's stdout, stderr while displaying them as they arrive."""
    # start process
    process = yield from asyncio.create_subprocess_exec(*cmd,
                                                        stdout=PIPE, stderr=PIPE)

    # read child's stdout/stderr concurrently
    stdout, stderr = [], []  # stdout, stderr buffers
    tasks = {
        asyncio.Task(process.stdout.readline()): (
            stdout, process.stdout, sys.stdout.buffer),
        asyncio.Task(process.stderr.readline()): (
            stderr, process.stderr, sys.stderr.buffer)}
    while tasks:
        done, pending = yield from asyncio.wait(tasks,
                                                return_when=asyncio.FIRST_COMPLETED)
        assert done
        for future in done:
            buf, stream, display = tasks.pop(future)
            line = future.result()
            if line:  # not EOF
                buf.append(line)     # save for later
                display.write(line)  # display in terminal
                # schedule to read the next line
                tasks[asyncio.Task(stream.readline())] = buf, stream, display

    # wait for the process to exit
    rc = yield from process.wait()
    return rc, b''.join(stdout), b''.join(stderr)
The script runs the '/path/to/script' command and reads both its stdout and stderr line by line, concurrently. The lines are printed to the parent's stdout/stderr correspondingly and saved as bytestrings for future processing. To run the read_and_display() coroutine, we need an event loop:
import os

if os.name == 'nt':
    loop = asyncio.ProactorEventLoop()  # for subprocess' pipes on Windows
    asyncio.set_event_loop(loop)
else:
    loop = asyncio.get_event_loop()
try:
    rc, *output = loop.run_until_complete(read_and_display("/path/to/script"))
    if rc:
        sys.exit("child failed with '{}' exit code".format(rc))
finally:
    loop.close()
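The "simpler asyncio version" linked in the update above is also not included in this excerpt; the following is a sketch of the same idea in modern async/await syntax (my reconstruction, assuming Python 3.8+, where the Windows event loop supports subprocess pipes by default):
import asyncio
import sys
from asyncio.subprocess import PIPE

async def _tee(stream, display):
    """Read lines from stream, echo them to display, and collect them."""
    buf = []
    while True:
        line = await stream.readline()
        if not line:  # EOF
            break
        buf.append(line)
        display.write(line)
    return b"".join(buf)

async def read_and_display(*cmd):
    process = await asyncio.create_subprocess_exec(*cmd, stdout=PIPE, stderr=PIPE)
    # drain both pipes concurrently to avoid the deadlock
    stdout, stderr = await asyncio.gather(
        _tee(process.stdout, sys.stdout.buffer),
        _tee(process.stderr, sys.stderr.buffer))
    return await process.wait(), stdout, stderr

rc, out, err = asyncio.run(read_and_display("/path/to/script"))
if rc:
    sys.exit("child failed with '{}' exit code".format(rc))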
p.communicate() waits for the subprocess to complete and then returns its entire output at once.
Have you tried something like this instead, where you read the subprocess output line-by-line?
p = subprocess.Popen('/path/to/script', stdout=subprocess.PIPE, stderr=subprocess.PIPE)
for line in p.stdout:
    # do something with this individual line
    print line
The Popen.communicate doc clearly states:
Note: The data read is buffered in memory, so do not use this method if the data size is large or unlimited.
https://docs.python.org/2/library/subprocess.html#subprocess.Popen.communicate
So if you need realtime output, you need to use something like this:
stream_p = subprocess.Popen('/path/to/script', stdout=subprocess.PIPE, stderr=subprocess.PIPE)
for stream_line in stream_p.stdout:
    # parse it the way you want
    print stream_line
This prints both stdout and stderr to the terminal as well as saving both stdout and stderr into a variable:
from subprocess import Popen, PIPE, STDOUT
with Popen(args, stdout=PIPE, stderr=STDOUT, text=True, bufsize=1) as p:
    output = "".join([print(buf, end="") or buf for buf in p.stdout])
However, depending on what exactly you're doing, this might be important to note: by using stderr=STDOUT, we can no longer differentiate between stdout and stderr, and with the call to print, your output will always be printed to stdout, regardless of whether it came from stdout or stderr.
For Python < 3.7 you will need to use universal_newlines instead of text.
New in version 3.7: text was added as a more readable alias for universal_newlines.
Source: https://docs.python.org/3/library/subprocess.html#subprocess.Popen
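For example, the same snippet spelled for older versions (a sketch, assuming Python 3 before 3.7):
from subprocess import Popen, PIPE, STDOUT

# identical behaviour; universal_newlines is the pre-3.7 spelling of text mode
with Popen(args, stdout=PIPE, stderr=STDOUT, universal_newlines=True, bufsize=1) as p:
    output = "".join([print(buf, end="") or buf for buf in p.stdout])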

subprocess or commands.getstatusoutput: print to STDOUT and store in a variable

I want something like this:
run the 'ls' command, print its output on STDOUT, and store the same output in a variable.
For a long-running process I need to see the output on screen while it executes, and also capture it in a variable at the end.
proc = subprocess.Popen(["ls"], stdout=subprocess.PIPE, shell=False)
(out, err) = proc.communicate()
print "program output:-", out
Here the output only arrives after the execution has finished.
To print the output line by line as soon as the child process flushes its stdout, and to store it in a variable:
from subprocess import Popen, PIPE
buf = []
proc = Popen([cmd], stdout=PIPE, bufsize=1)
for line in iter(proc.stdout.readline, b''):
    buf.append(line)
    print line,
proc.communicate()  # close `proc.stdout`; wait for the child process to exit
output = b"".join(buf)
There could be a buffering issue (the output appears with a delay); to fix it, you could use the pexpect or pty modules, or the stdbuf, unbuffer, or script commands.
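For example, a sketch using stdbuf (GNU coreutils on Linux assumed; cmd is the same placeholder as above) to force the child to line-buffer its stdout:
from subprocess import Popen, PIPE

# stdbuf -oL makes the child line-buffer stdout even when it is a pipe
proc = Popen(['stdbuf', '-oL', cmd], stdout=PIPE, bufsize=1)
for line in iter(proc.stdout.readline, b''):
    print line,
proc.communicate()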

Unwanted new lines using python Popen and pandoc to parse html?

I am trying to convert several pieces of html to latex using python and pandoc and I have got stuck with a couple of problems.
To make my Python script communicate with pandoc I use subprocess.Popen, redirecting stdout to a file that I save for inclusion in a LaTeX template.
If I use the classic way of invoking Popen:
from subprocess import Popen, PIPE, STDOUT
filedesc = open('myfile.tex','w')
args = ['pandoc', '-f', 'html', '-t', 'latex']
p = Popen(args, stdout=PIPE, stdin=PIPE, stderr=STDOUT)
outp, err = p.communicate(input=html)
filedesc.write(outp)
I get the lines with an additional new line where there shouldn't be any:
> \textbf{M. John Harrison} (Rugby, Warckwickshire, 1945) is a contemporary
>
> English writer.
This is (mysteriously?) easy to solve by changing stdout=PIPE to the file descriptor:
from subprocess import Popen, PIPE, STDOUT
filedesc = open('myfile.tex','w')
args = ['pandoc', '-f', 'html', '-t', 'latex']
p = Popen(args, stdout=filedesc, stdin=PIPE, stderr=STDOUT)
outp, err = p.communicate(input=html)
# not needed
# filedesc.write(outp)
But if I want to use a string buffer, the same problem occurs, since I cannot use it as the stdout parameter.
Any idea on how to stop Popen/pandoc from doing this?
Thanks!
Well, it seems to be a "kind of bug" in python's PIPE (???).
I am executing this code on a Windows system. This means that new lines come out in the CR+LF (\r\n) style rather than the (cleaner) unix-style LF (\n).
When I feed a large piece of HTML to pandoc, the output comes back through the pipe as if written to a console, so every time the standard column width is reached an ugly "new line" character is introduced, in my case a CR+LF. This was making my output look weird.
The dirty solution I have implemented is to add a replace('\r\n','\n') before writing the output, but I am not sure it's the most elegant one.
from subprocess import Popen, PIPE, STDOUT
html = '<p><b>Some random html code</b> longer than 80 columns ... </p>'
filedesc = open('myfile.tex','w')
args = ['pandoc', '-f', 'html', '-t', 'latex']
p = Popen(args, stdout=PIPE, stdin=PIPE, stderr=STDOUT)
outp, err = p.communicate(input=html)
filedesc.write(outp.replace('\r\n','\n'))
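An alternative sketch (my assumption, not the answerer's tested fix): with universal_newlines=True, Python itself translates \r\n to \n when reading the pipe, so the manual replace() becomes unnecessary:
from subprocess import Popen, PIPE, STDOUT

html = '<p><b>Some random html code</b> longer than 80 columns ... </p>'
filedesc = open('myfile.tex', 'w')
args = ['pandoc', '-f', 'html', '-t', 'latex']
# text mode: CR+LF is normalized to \n on read
p = Popen(args, stdout=PIPE, stdin=PIPE, stderr=STDOUT, universal_newlines=True)
outp, err = p.communicate(input=html)
filedesc.write(outp)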

Popen.communicate escapes a string I send to stdin

I am trying to spawn a process using Popen and send it a particular string to its stdin.
I have:
pipe = subprocess.Popen(cmd, shell=True, stdin=subprocess.PIPE)
pipe.communicate(my_stdin_str.encode(encoding='ascii'))
pipe.stdin.close()
However, the second line actually escapes the whitespace in my_stdin_str. For example, if I have:
my_stdin_str="This is a string"
The process will see:
This\ is\ a\ string
How can I prevent this behaviour?
I can't reproduce it on Ubuntu:
from subprocess import Popen, PIPE
shell_cmd = "perl -pE's/.\K/-/g'"
p = Popen(shell_cmd, shell=True, stdin=PIPE)
p.communicate("This $PATH is a string".encode('ascii'))
In this case shell=True is unnecessary:
from subprocess import Popen, PIPE
cmd = ["perl", "-pE" , "s/.\K/-/g"]
p = Popen(cmd, stdin=PIPE)
p.communicate("This $PATH is a string".encode('ascii'))
Both produce the same output:
T-h-i-s- -$-P-A-T-H- -i-s- -a- -s-t-r-i-n-g-
Unless you know you need it for some reason, don't run with shell=True in general; without testing, the shell's extra interpretation sounds like what's going on here.
