Running dd from Python and getting progress

This is my code, but it doesn't work.
The dd command is executed, but no output is printed.
Note: if I point stdout at a regular text file instead, the dd output is progressively written to the file with every progress line that dd prints.
Ideas?
Regards.
import sys
from subprocess import Popen, STDOUT, PIPE

with Popen(["dd", "if=/dev/cdrom", "of=/tmp/prova.iso", "bs=2048", "count=499472", "status=progress"], stderr=STDOUT, stdout=PIPE) as proc:
    print("ok")
    # read() only returns once dd closes the stream, so nothing shows up until the copy finishes
    print(proc.stdout.read())

I've found a solution.
import subprocess
import sys

cmd = ["dd", "if=/dev/cdrom", "of=/tmp/iso.iso", "bs=2048", "count=499472", "status=progress"]
process = subprocess.Popen(cmd, stderr=subprocess.PIPE)
line = ''
while True:
    # dd writes its progress to stderr; read it one byte at a time
    out = process.stderr.read(1)
    if out == b'' and process.poll() is not None:
        break
    if out != b'':
        s = out.decode("utf-8")
        if s == '\r':
            # dd ends each progress update with '\r', not '\n'
            print(line)
            line = ''
        else:
            line = line + s
Thank you all for your answers.
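For reference, a shorter line-oriented variant (just a sketch, not tested on every dd/Python combination): with universal_newlines=True the pipe is read in text mode, where dd's '\r'-terminated progress updates are translated to '\n' and can be consumed line by line.

import subprocess

cmd = ["dd", "if=/dev/cdrom", "of=/tmp/iso.iso", "bs=2048", "count=499472", "status=progress"]
with subprocess.Popen(cmd, stderr=subprocess.PIPE, universal_newlines=True, bufsize=1) as proc:
    for progress in proc.stderr:
        # each iteration yields one progress update from dd
        print(progress.rstrip())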

Take a look at this:
dd with progress in python

Related

Python: Why does the process returned by Popen have a None stdin?

I am trying to write a Python program to brute-force a CTF C program in which you have to find a fruit salad recipe in order to get the flag.
What I want to do: write to the stdin of the C program from Python.
Problem: the stdin of the process returned by Popen is None, while stdout and stderr are correct.
Output from my program:
start bruteforce...
<_io.BufferedReader name=3>
<_io.BufferedReader name=5>
None
Code:
As you can see, I call print and then exit before the loop to debug the process's standard streams; I don't understand why I get None when I print process.stdin.
#!/usr/bin/python3
import random
import os
import signal
import sys
from subprocess import *
from contextlib import contextmanager
from io import StringIO

fruit = ["banana", "raspberry", "orange", "lemon"]
comb = ""
found = False

print("start bruteforce...")
process = Popen(['./fruit'], stdout=PIPE, stderr=PIPE)
print(process.stdout)
print(process.stderr)
print(process.stdin)
sys.exit(1)

while True:
    for i in range(4):
        pick = random.choice(fruit)
        inp, output = process.stdin, process.stdout
        comb += pick
        comb += " "
        inp.write(pick)
        inp.write("\n")
    out = output.read().decode('utf-8')
    if "flag" in out:
        found = True
        break
    if found == True:
        print("found : " + comb)
        break
    print(comb + " : is not valid")
    comb = ""
    os.kill(process.pid, signal.CTRL_C_EVENT)
Thank you!
Fixed thanks to Ackdari. I replaced:
process = Popen(['./fruit'], stdout=PIPE, stderr=PIPE)
with
process = Popen(['./fruit'], stdout=PIPE, stdin=PIPE)
since I am not using stderr anyway.
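For completeness, a minimal sketch of the write/read round trip with the fixed Popen call (an illustration only, assuming ./fruit answers one line per line it reads): universal_newlines=True makes the pipes accept str instead of bytes, and each write has to be flushed because the pipe is buffered.

from subprocess import Popen, PIPE

process = Popen(['./fruit'], stdin=PIPE, stdout=PIPE, universal_newlines=True)
process.stdin.write("banana\n")
process.stdin.flush()                 # push the line through the pipe
print(process.stdout.readline())      # read the program's reply

If ./fruit only prints something after all of its input has been entered, communicate() with the input argument is the simpler option.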
I want to be able to write on the stdin
This is forbidden, at least by POSIX standards, and makes no sense on Linux: as its name suggests, stdin is a standard input, and your program should read from it, not write to it.
Of course, notice that pipe(7)s have a write end and a read end. What you write on the end you hold in the parent becomes the stdin of the popen-ed process.

Kill the child process when the child process output satisfies the condition?

I want Python to kill this child process if its output meets a given criterion.
For example, while_file.py below is an infinite loop: it prints 0 to 999 and then produces no further output, as if it had stopped responding.
i = 0
while 1:
    if i < 1000:
        print i
        i += 1
I want to check whether the output of the child process is 999 and then kill it.
import os
import signal
import subprocess

def run_cmd(cmd):
    pro = subprocess.Popen(cmd, stdout=subprocess.PIPE,
                           shell=True, preexec_fn=os.setsid)
    while True:
        r = pro.stdout.read()
        print r
        if r == 999:
            os.killpg(os.getpgid(pro.pid), signal.SIGTERM)

if __name__ == '__main__':
    print run_cmd('python while_file.py')
But there seems to be no response... why? Is it blocked?
There are a couple of problems:
The read function (pro.stdout.read) reads the whole stream, until EOF. In your case that never happens because stdout is never closed, so there is no EOF; you should use readline instead.
When you read, you get strings, not numbers, so you should compare against "999", not 999.
I would also recommend making sure there is no buffering (it can get nasty later if something stays stuck in a buffer).
The changed code:
import sys

i = 0
while True:
    if i < 1000:
        print i
        sys.stdout.flush()
        i += 1
and
import os
import signal
import subprocess

def run_cmd(cmd):
    pro = subprocess.Popen(cmd, stdout=subprocess.PIPE,
                           shell=True, preexec_fn=os.setsid)
    while True:
        r = pro.stdout.readline()
        r = r.strip()
        print(r)
        if r == "999":
            os.killpg(os.getpgid(pro.pid), signal.SIGTERM)
            print("Process killed")
            break

if __name__ == '__main__':
    print run_cmd('PYTHONUNBUFFERED=1 python while_file.py')
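An equivalent way to disable the child's buffering, instead of prefixing the command line, is to pass the variable through Popen's env argument (a sketch along the same lines, not something the answer above did):

import os
import subprocess

env = dict(os.environ, PYTHONUNBUFFERED='1')   # copy the environment and add the flag
pro = subprocess.Popen('python while_file.py', stdout=subprocess.PIPE,
                       shell=True, preexec_fn=os.setsid, env=env)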
I don't know if it is a good idea or not, but here is a way you can do it:
while_file.py
i = 0
while 1:
    if i < 1000:
        print(i)
        i += 1
stackoverflow.py
import os
import signal
import subprocess

def run_cmd(cmd):
    pro = subprocess.Popen(cmd, stdout=subprocess.PIPE, shell=True, preexec_fn=os.setsid)
    for line in iter(pro.stdout.readline, ''):
        r = line.rstrip()
        print(r)
        if r == b'999':
            os.killpg(os.getpgid(pro.pid), signal.SIGTERM)
            print("True !")

if __name__ == '__main__':
    print(run_cmd('python while_file.py'))
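If the same approach is run under Python 3 (an assumption; the question itself is Python 2), pro.stdout yields bytes, so the iter() sentinel also has to be a bytes literal, otherwise the loop never ends:

import os
import signal
import subprocess

pro = subprocess.Popen('python while_file.py', stdout=subprocess.PIPE,
                       shell=True, preexec_fn=os.setsid)
for line in iter(pro.stdout.readline, b''):   # b'' marks EOF on a bytes stream
    r = line.rstrip()
    print(r)
    if r == b'999':
        os.killpg(os.getpgid(pro.pid), signal.SIGTERM)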

Indentation error in Python 2.7

Hello, I am getting an indentation error in Python 2.7.
My program looks like this:
Imports:
import sys, os
import subprocess
from threading import Thread
from re import split
Actual Code:
def GetProcesses(Clean=True):
    #
    if Clean == True:
        #
        #
        x = subprocess.Popen(["sudo", "ps", "aux"], stdout=subprocess.PIPE, shell=True)
        (out, err) = x.communicate()
        print(out)
        print(out.split("\n"))
        print("--------------")
        Processes = out.split("\n")
        print(Processes)
        print("------")
        print(Processes[0])
        print("----------")
        Header = Processes[0]
        Process_list = Processes.remove(Processes[0])
        return((Header, Process_list))
    #
    else:
        #
        #
        if Clean == True: #added problem so future SE users can see it
        x = subprocess.Popen(["ps", "aux"], stdout=subprocess.PIPE, shell=True)
        (out, err) = x.communicate()
        return(out)
I am not understanding the error. I have tried dedenting every line one space, indenting back to the original, and adding one space, but it always says that it either expects an indent or that there is an unexpected indent. P.S. I am using only spaces.
Actual Error:
File "MemoryHandling.py", line 31
    x = subprocess.Popen(["ps", "aux"], stdout=subprocess.PIPE, shell=True)
    ^
IndentationError: expected an indented block
I have checked several online questions/sources (mainly SE) for this error (below), both during and before asking this question; I found that they were case-specific, downvoted/not answered, or not useful:
-- Python Indentation Error # i am using spaces
-- Python 2.7 Indentation error # i have manually checked indentation multiple times, and tried using the tab key
-- Unexpected indentation error, but indentation looks correct # once again not using tabs and spaces
-- Code Indent Error, code is indented? # not answered (bug?)
-- Python Indentation Error when there is no indent error # once more not using tabs and spaces
import sys, os
import subprocess
from threading import Thread
from re import split

#Actual Code:
def GetProcesses(Clean):
    #
    if Clean == True:
        #
        #
        x = subprocess.Popen(["sudo", "ps", "aux"], stdout=subprocess.PIPE, shell=True)
        (out, err) = x.communicate()
        print(out)
        print(out.split("\n"))
        print("--------------")
        Processes = out.split("\n")
        print(Processes)
        print("------")
        print(Processes[0])
        print("----------")
        Header = Processes[0]
        Process_list = Processes.remove(Processes[0])
        return((Header, Process_list))
    #
    else:
        #
        #
        x = subprocess.Popen(["ps", "aux"], stdout=subprocess.PIPE, shell=True)
        (out, err) = x.communicate()
        return(out)
The problem is here: after the inner if, the block that follows is not indented.
else:
    if Clean == True: #added problem so future SE users can see it
    x = subprocess.Popen(["ps", "aux"], stdout=subprocess.PIPE, shell=True)
    (out, err) = x.communicate()
    return(out)
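One possible fix (a sketch; the stray inner if can simply be dropped, since the else branch already covers Clean != True) is to give the branch a properly indented block:

else:
    #
    #
    x = subprocess.Popen(["ps", "aux"], stdout=subprocess.PIPE, shell=True)
    (out, err) = x.communicate()
    return(out)

(Unrelated to the IndentationError: passing a list together with shell=True is usually not what you want on POSIX, where shell=True expects a single command string.)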

rsync called by subprocess Popen works when running script but does not when I generate an app with py2app

This is my code:
def uploadByRSync(host, user, passwd, src, dst, statusManager):
    try:
        os.environ["RSYNC_PASSWORD"] = passwd
        print host, user, passwd, src, dst
        parameters = ["rsync", "-azP", "--partial", src, "{3}@{0}::{2}/{1}".format(host, dst, user, user)]
        print " ".join(parameters)
        process = subprocess.Popen(parameters, stdout=subprocess.PIPE, stderr=subprocess.STDOUT, universal_newlines=True)
        for line in unbuffered(process):
            if "%" in line:
                spl = line.split()
                statusManager.uploadSpeed = spl[2]
                statusManager.uploaded = spl[1]
        return not process.wait()
    except Exception as ex:
        print ex
        return False

newlines = ['\n', '\r\n', '\r']

def unbuffered(proc, stream='stdout'):
    stream = getattr(proc, stream)
    with contextlib.closing(stream):
        while True:
            out = []
            last = stream.read(1)
            # Don't loop forever
            if last == '' and proc.poll() is not None:
                break
            while last not in newlines:
                # Don't loop forever
                if last == '' and proc.poll() is not None:
                    break
                out.append(last)
                last = stream.read(1)
            out = ''.join(out)
            print out
            yield out
When running the py2app version I never get any output; when running it as a script, everything works just fine. PS: this code runs on a separate thread of a Qt app. Does anyone have any idea why this is happening?
Most likely you have a stream buffering issue. Here is how you can output all lines of your process in real time:
import subprocess
import select

p = subprocess.Popen(parameters,
                     stdout=subprocess.PIPE, stderr=subprocess.PIPE,
                     bufsize=0)
poll = [p.stdout.fileno(), p.stderr.fileno()]
while True:
    # check if the process is still running and read any remaining data
    if p.poll() is not None:
        for l in p.stdout.readlines():
            print(l)
        for l in p.stderr.readlines():
            print(l)
        break
    # blocks until data is received on either pipe
    ret = select.select(poll, [], [])
    for fd in ret[0]:
        line = p.stdout.readline() if fd == p.stdout.fileno() else p.stderr.readline()
        print(line)
I just made a test replacing the Popen call with a simple ls, but I still cannot get the output when running the py2app version; it works just fine when run as a Python script. When I kill the py2app version of the app, the output is finally printed.
process = subprocess.Popen(["ls"], stdout=subprocess.PIPE, stderr=subprocess.STDOUT, universal_newlines=True)

Print output of external command in realtime and have it in a string at the same time in python

For example:
#!/usr/bin/env python3
# cmd.py
import time
for i in range(10):
    print("Count %d" % i)
    time.sleep(1)
#!/usr/bin/env python3
import subprocess
# useCmd.py
p = subprocess.Popen(['./cmd.py'], stdout=subprocess.PIPE)
out, err = p.communicate()
out = out.decode()
print(out)
In useCmd.py I can print the output of cmd.py, but only after it has finished. How can I print it in real time and still have it stored in a string? (Sort of like tee in bash.)
If you don't have to deal with stdin, you can avoid communicate(), which blocks, and read directly from the process's stdout until it ends:
p = subprocess.Popen(['python', 'cmd.py'], stdout=subprocess.PIPE)
# out, err = p.communicate()
while True:
    line = p.stdout.readline()
    if line != b'':
        # stdout is a bytes stream here, so decode before printing
        print(line.decode(), end='')
    else:
        break
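To also keep the output in a string (the tee-like part of the question), the same loop can simply accumulate the lines. A Python 3 sketch; as with the loop above, the child has to flush its output (or be started with python -u / PYTHONUNBUFFERED=1) for the lines to actually arrive in real time:

#!/usr/bin/env python3
# echo each line as it arrives and keep a copy, like tee
import subprocess

p = subprocess.Popen(['./cmd.py'], stdout=subprocess.PIPE,
                     universal_newlines=True)
chunks = []
for line in p.stdout:       # readline() under the hood, one line at a time
    print(line, end='')     # real-time echo
    chunks.append(line)     # keep it for later
p.wait()
out = ''.join(chunks)
print("captured %d characters" % len(out))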