Run tail command as background process in Python [duplicate] - python

This question already has answers here:
Read streaming input from subprocess.communicate()
(7 answers)
Closed last year.
I have a file named test.py, and I am trying to run the command below as a background process as part of this script:
"tail -n0 -f debug.log"
I also want this process to end as soon as test.py finishes.
However, I can't get this working. I have tried the code below, but the tail command does not exit even after the main script completes.
I am new to Python; can someone help me do this in a clean way?
pro = subprocess.Popen(["tail", "-n0", "-f", log_file], stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
for line in pro.stdout:
    print(line)
os.killpg(os.getpgid(pro.pid), signal.SIGTERM)

I used this once before, so I can't remember exactly, but I think I exited 'subprocess.Popen' by using it with 'with'. I'm not sure, but I recommend giving it a try.
with subprocess.Popen(["tail", "-n0", "-f", log_file], stdout=subprocess.PIPE, stderr=subprocess.STDOUT) as pro:
    for line in pro.stdout:
        print(line)
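One pattern that reliably ends the child when the script ends is to wrap the read loop in try/finally and call terminate(). A minimal sketch; the stand-in child below is a hypothetical replacement for "tail -n0 -f debug.log" so the example is self-contained (the real command and log_file come from the question):

```python
import subprocess
import sys

# Stand-in for "tail -n0 -f debug.log": a child that prints lines forever.
child_code = (
    "import time\n"
    "while True:\n"
    "    print('log line', flush=True)\n"
    "    time.sleep(0.1)\n"
)

pro = subprocess.Popen([sys.executable, "-c", child_code],
                       stdout=subprocess.PIPE, stderr=subprocess.STDOUT,
                       text=True)
try:
    for line in pro.stdout:
        print(line, end="")
        break  # stop after the first line, just for this demo
finally:
    pro.terminate()  # guarantees the child exits when the script does
    pro.wait()

print("child return code:", pro.returncode)
```

The same try/finally (or an atexit handler) around the real tail invocation ensures tail is gone by the time test.py finishes, even if the loop raises.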

Related

Why is log.plaintext responding late in Python? [duplicate]

This question already has answers here:
Run command with PyQt5 and getting the stdout and stderr
(2 answers)
How to keep PyQt5 responsive when calling gnuplot?
(1 answer)
Closed 1 year ago.
I want to see an exe's stdout immediately as it executes, from PyQt. I implemented all the actions I want in Python code, and the rest are linked to PyQt's widgets. Everything outputs normally in the cmd window, but the log.insertPlainText output only appears all at once after the cmd execution ends.
I am sure this is not a duplicate question. What I want is not a PyQt-specific solution; I want to solve it entirely with Python code. I've tried many different things, but the log is still printed all at once at the end of the cmd execution. The reason I used log.insertPlainText instead of print is that I want to show the output in PyQt.
What I already tried:
cmd = [cmd command]
p = subprocess.Popen(cmd, shell=True, stdout=subprocess.PIPE, universal_newlines=True, bufsize=1)
for line in iter(p.stdout):
    log.insertPlainText(line)
p.stdout.close()
p.wait()
and
with Popen([cmd command], stdout=PIPE, bufsize=1,
           universal_newlines=True) as p:
    for line in p.stdout:
        log.insertPlainText(line)
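The likely root cause is that both loops above run on the GUI thread, so the Qt event loop is blocked and the widget only repaints once the child exits. A common fix is to read the pipe on a background thread and hand lines to the GUI through a queue (in PyQt one would emit a signal that calls log.insertPlainText rather than printing). A minimal stdlib-only sketch, with a hypothetical stand-in child in place of the real exe:

```python
import queue
import subprocess
import sys
import threading

# Hypothetical stand-in child that prints a few lines with small delays.
child_code = (
    "import time\n"
    "for i in range(3):\n"
    "    print('line', i, flush=True)\n"
    "    time.sleep(0.05)\n"
)

p = subprocess.Popen([sys.executable, "-u", "-c", child_code],
                     stdout=subprocess.PIPE, text=True)
q = queue.Queue()

def pump():
    # Runs off the main thread, so a GUI event loop stays responsive.
    for line in p.stdout:
        q.put(line)
    q.put(None)  # sentinel: child is done

threading.Thread(target=pump, daemon=True).start()

lines = []
while True:
    line = q.get()
    if line is None:
        break
    lines.append(line)  # in PyQt: emit a signal carrying the line instead
p.wait()
print(lines)
```

In a real PyQt application the main thread would not block on q.get(); the worker thread would emit a signal per line, and the connected slot would update the widget.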

Live output status from subprocess command Python [duplicate]

This question already has answers here:
live output from subprocess command
(21 answers)
Closed 2 years ago.
I'm writing a script to get netstat status using subprocess.check_output.
cmd = 'netstat -nlpt'
result = subprocess.check_output(cmd, shell=True, timeout=1800)
print(result.decode('utf-8'))
The above runs perfectly. Is there any way to get live-streaming output? I have heard the poll() function does this job. In live output from subprocess command they use Popen, but I'm using check_output. Please, can someone help me with this issue? Thank you!
From this answer:
The difference between check_output and Popen is that, while Popen is a non-blocking function (meaning you can continue the execution of the program without waiting for the call to finish), check_output is blocking.
Meaning that if you are using subprocess.check_output(), you cannot have live output.
Try switching to Popen().
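To illustrate the switch, here is a minimal Popen sketch that handles each line as it arrives; a stand-in command replaces "netstat -nlpt" so the example runs anywhere:

```python
import subprocess
import sys

# Stand-in for "netstat -nlpt" so this sketch is portable.
cmd = [sys.executable, "-c", "print('Active Internet connections')"]

proc = subprocess.Popen(cmd, stdout=subprocess.PIPE, text=True)
for line in proc.stdout:   # yields lines as soon as they arrive
    print(line, end="")    # process each line live instead of waiting for exit
proc.wait()
```

With the real netstat command, replace cmd with ['netstat', '-nlpt'] (and drop shell=True, since the argument list form does not need it).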

Submitting an LSF script via python's subprocess.Popen() without shell=True [duplicate]

This question already has answers here:
Using greater than operator with subprocess.call
(2 answers)
Closed 5 years ago.
Forewarning, question is probably more about a lack of understanding of the bsub command and login shells than python's Popen().
I am trying to submit a LSF script within a python script using subprocess.Popen()
pipe = subprocess.Popen(shlex.split("bsub < script.lsf"),
                        stdout=subprocess.PIPE, stderr=subprocess.PIPE)
It seems a subprocess is properly launched and the command executed, but I get an error in stderr from the bsub command that reads "Empty job. Job not submitted."
I was worried it had something to do with subprocess launching a login-shell, so I tried to use shell=True within the Popen command to alleviate that.
pipe = subprocess.Popen("bsub < script.lsf", shell=True,
                        stdout=subprocess.PIPE, stderr=subprocess.PIPE)
This successfully submits the job as expected. So my question is how can I submit this job without using shell=True?
I've tried using some bsub options such as
pipe = subprocess.Popen(shlex.split("bsub -L /bin/sh < script.lsf"),
                        stdout=subprocess.PIPE, stderr=subprocess.PIPE)
with the same "Empty job. Job not submitted." error returned. I feel this is close to what I am looking for, but I don't fully understand what "the name of the login shell (must specify an absolute path)" means exactly. Perhaps it is not /bin/sh on the system I am using. Is there a way to print this path?
< and > are not arguments to your command (in this case bsub), but are instructions to the shell about where stdin or stdout (respectively) should be directed before that command is started.
Thus, the appropriate replacement is to specify this redirection with a separate argument to Popen:
pipe = subprocess.Popen(['bsub', '-L', '/bin/sh'],
                        stdout=subprocess.PIPE, stderr=subprocess.PIPE,
                        stdin=open('script.lsf', 'r'))
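To see the same redirection in a runnable form, here is a sketch with a Python one-liner standing in for bsub and a temporary file standing in for script.lsf (both are assumptions for portability, not the real LSF tooling):

```python
import subprocess
import sys
import tempfile

# Hypothetical stand-in for script.lsf.
with tempfile.NamedTemporaryFile("w", suffix=".lsf", delete=False) as f:
    f.write("echo hello\n")
    path = f.name

# The shell's "bsub < script.lsf" becomes: pass an open file as stdin.
child = "import sys; print(sys.stdin.read().strip())"
with open(path) as script:
    proc = subprocess.run([sys.executable, "-c", child],
                          stdin=script, stdout=subprocess.PIPE, text=True)
print(proc.stdout)
```

The key point is that the child sees the file's contents on its stdin exactly as it would under the shell's `<` redirection, with no shell involved.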

Python - read output from long-running subprocess [duplicate]

This question already has answers here:
Read streaming input from subprocess.communicate()
(7 answers)
Closed 6 years ago.
Using the subprocess module (Python 2.7), I'm running a command and attempting to process its output as it runs.
I have code like the following:
process = subprocess.Popen(
    ['udevadm', 'monitor', '--subsystem=usb', '--property'],
    stdout=subprocess.PIPE)
for line in iter(process.stdout.readline, ''):
    print(line)
However, the output only gets printed after I Ctrl+C, even if I add sys.stdout.flush() after the print statement.
Why is this happening, and how can I live stream the output from this process?
Notably, this udevadm monitor command is not intended to terminate, so I can't simply wait for the process to terminate and process its output all at once.
I found live output from subprocess command but the approach in the accepted answer did not solve my problem.
You could use unbuffer (from the expect package):
process = subprocess.Popen(
    ["unbuffer", 'udevadm', 'monitor', '--subsystem=usb', '--property'],
    stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
for line in iter(process.stdout.readline, ''):
    print(line)
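A side note, assuming Python 3 rather than the question's Python 2.7: pipes yield bytes there, so the sentinel for iter(..., '') must be b'' unless the pipe is opened in text mode, or the loop never terminates. A sketch without unbuffer, using a stand-in child (in place of udevadm, so it runs anywhere) launched with Python's -u flag to disable its output buffering:

```python
import subprocess
import sys

# Stand-in child; -u makes its stdout unbuffered, similar in spirit to unbuffer.
proc = subprocess.Popen(
    [sys.executable, "-u", "-c", "print('event 1'); print('event 2')"],
    stdout=subprocess.PIPE, text=True, bufsize=1)

lines = []
for line in iter(proc.stdout.readline, ""):  # "" is correct here because text=True
    print(line, end="")
    lines.append(line)
proc.wait()
```

With text=True (or universal_newlines=True) the stream yields str, so the empty-string sentinel from the answer above keeps working unchanged.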

subprocess.Popen not printing/running properly [duplicate]

This question already has answers here:
Read streaming input from subprocess.communicate()
(7 answers)
Closed 7 years ago.
I have a script that reads from external sensors (and runs forever); when I run it as ./zwmeter /dev/ttyUSB0 300 it behaves normally and prints output continuously to stdout. I am using bash on Ubuntu. I'd like to execute this command as part of a python script. I have tried:
from subprocess import Popen, PIPE
proc = Popen(['./zwmeter', '/dev/ttyUSB0', '300'], stderr=PIPE, stdout=PIPE)
print proc.communicate()
but I get a program that runs forever without producing any output. I don't care about stderr, only stdout, and have tried splitting up the printing, but still with no success.
Thank you for any help you can provide!
I think the problem has to do with the process I'm calling not terminating. I found a good workaround on this site:
http://eyalarubas.com/python-subproc-nonblock.html
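The linked page's idea is a reader thread feeding a queue, so reads can time out instead of blocking forever on a child that never exits (which is exactly why communicate() hangs here). A rough sketch of that pattern, with a hypothetical stand-in child in place of ./zwmeter:

```python
import queue
import subprocess
import sys
import threading

class NonBlockingStreamReader:
    """Reads a stream on a background thread; readline() never blocks forever."""
    def __init__(self, stream):
        self._q = queue.Queue()
        t = threading.Thread(target=self._pump, args=(stream,), daemon=True)
        t.start()

    def _pump(self, stream):
        for line in stream:
            self._q.put(line)

    def readline(self, timeout=None):
        try:
            return self._q.get(timeout=timeout)
        except queue.Empty:
            return None  # nothing available yet

# Stand-in for ./zwmeter: prints one line, then keeps running.
child = "import time; print('reading: 42', flush=True); time.sleep(60)"
proc = subprocess.Popen([sys.executable, "-u", "-c", child],
                        stdout=subprocess.PIPE, text=True)
reader = NonBlockingStreamReader(proc.stdout)

first = reader.readline(timeout=5)       # first line arrives promptly
nothing = reader.readline(timeout=0.2)   # no further output yet -> None
proc.terminate()
proc.wait()
print(first, nothing)
```

Unlike communicate(), which waits for the child to exit, this lets the script poll for output at its own pace while the sensor process keeps running.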
