I have an exec call that runs some python code. It can take a while to run, so I would like to stream its output as it comes in.
One thought I have is something like this:
from io import StringIO
from contextlib import redirect_stdout

f = StringIO()
with redirect_stdout(f):
    exec(code)
for char in f.getvalue():
    print(char)
but this still waits for exec to finish.
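One sketch of a way around this (untested; EchoWriter is a name I made up, and it assumes the code only writes to sys.stdout): since redirect_stdout accepts any object with a write() method, you can hand it a small wrapper that forwards each chunk immediately instead of collecting everything in a StringIO:

import sys
from contextlib import redirect_stdout

class EchoWriter:
    """Minimal file-like object that passes every chunk straight through."""
    def __init__(self, target):
        self.target = target

    def write(self, s):
        self.target.write(s)
        self.target.flush()  # show output as soon as exec produces it

    def flush(self):
        self.target.flush()

# sys.__stdout__ is the real stdout, so the redirect doesn't loop on itself
with redirect_stdout(EchoWriter(sys.__stdout__)):
    exec(code)  # "code" as in the question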
I am trying to learn how to write a script control.py that runs another script test.py in a loop a certain number of times. On each run it should read test.py's output and halt it if some predefined output is printed (e.g. the text 'stop now'), after which the loop continues with its next iteration (once test.py has finished, either on its own or by force). So, something along the lines of:
for i in range(n):
    os.system('test.py someargument')
    # output here is supposed to contain what test.py prints
    if output == 'stop now':
        # stop the current test.py process and continue with next iteration
        ...
The problem with the above is that it does not check the output of test.py while it is running; instead, it waits until the test.py process has finished on its own, right?
Basically, I am trying to learn how to use a Python script to control another one as it is running (e.g. having access to what it prints, and so on).
Finally, is it possible to run test.py in a new terminal (i.e. not in control.py's terminal) and still achieve the above goals?
An attempt:
test.py is this:
from itertools import permutations
import random

perms = [''.join(p) for p in permutations('stop')]
for i in range(1000000):
    rand_ind = random.randrange(0, len(perms))
    print(perms[rand_ind])
And control.py is this (following Marc's suggestion):
import subprocess

command = ["python", "test.py"]
n = 10
for i in range(n):
    # universal_newlines=True makes readline() return text rather than bytes
    p = subprocess.Popen(command, stdout=subprocess.PIPE,
                         stderr=subprocess.STDOUT, universal_newlines=True)
    while True:
        output = p.stdout.readline().strip()
        print(output)
        # if output == '' and p.poll() is not None:
        #     break
        if output == 'stop':
            print('success')
            p.kill()
            break
    # Do whatever you want
    # rc = p.poll()  # Exit Code
You can use the subprocess module, or also os.popen:
os.popen(command[, mode[, bufsize]])
Open a pipe to or from command. The return value is an open file object connected to the pipe, which can be read or written depending on whether mode is 'r' (default) or 'w'.
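For example (a sketch; 'python test.py someargument' is just a stand-in for whatever command you actually run):

import os

# os.popen returns a file object connected to the pipe ('r' mode is the default)
pipe = os.popen('python test.py someargument')
for line in pipe:
    print(line, end='')
    if 'stop now' in line:
        break
pipe.close()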
With subprocess I would suggest:
subprocess.call(['python.exe', command])
or subprocess.Popen, which is similar to os.popen.
With Popen you can read the connected file object and check whether "stop now" is there.
os.system is not deprecated and you can use it as well, but you won't get a file object back from it; you can only check the return code at the end of execution.
With subprocess.call you can run it in a new terminal; or, if you want to call ONLY test.py multiple times, you can put your script in a def main() and run that main as often as you want until "stop now" is generated.
Hope this solves your query :-) otherwise comment again.
Looking at what you wrote above, you can also redirect the output to a file directly from the OS call, e.g. os.system('python test.py *args >> /tmp/mickey.txt'), and then check the file on each round.
As said, popen returns a file object that you can read.
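A sketch of that file-based variant (the path /tmp/mickey.txt is just the example from above):

import os

# run the script with shell redirection, then scan the log file on each round
os.system('python test.py someargument >> /tmp/mickey.txt')
with open('/tmp/mickey.txt') as f:
    if 'stop now' in f.read():
        print('found it')

Note that os.system only returns once the child has finished, so this variant checks the output after each run rather than while it is running.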
What you are hinting at in your comment to Marc Cabos' answer is threading.
There are several ways Python can use the functionality of other files. If the content of test.py can be encapsulated in a function or class, then you can import the relevant parts into your program, giving you greater access to the inner workings of that code.
As described in other answers, you can use a script's stdout by running it in a subprocess. This could give you separate terminal outputs as you require.
However, if you want to run test.py concurrently and access variables as they are changed, then you need to consider threading.
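A minimal sketch of that idea, assuming test.py can be refactored to expose a main() function (test.main and the shared results list are hypothetical):

import threading
import time

import test  # hypothetical: importing test.py must not run the loop itself

results = []  # shared list that test.main appends its output to

worker = threading.Thread(target=test.main, args=('someargument', results))
worker.start()
while worker.is_alive():
    if 'stop now' in results:  # inspect shared state while the thread runs
        break
    time.sleep(0.1)

Unlike a subprocess, a thread cannot be forcibly killed, so test.main would have to check a flag and stop cooperatively.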
Yes, you can use Python to control another program via stdin/stdout, but when consuming another process's output there is often a buffering problem: in other words, the other process doesn't really output anything until it's done.
There are even cases in which the output is buffered or not depending on whether the program was started from a terminal.
If you are the author of both programs, it is probably better to use another interprocess channel where flushing is explicitly controlled by the code, such as sockets.
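A rough sketch of the receiving side (how the two programs split the work is an assumption):

import socket

# listen on a local port; socket writes are not stdio-buffered, so each
# sendall() from the other process is readable as soon as it is sent
server = socket.socket()
server.bind(('127.0.0.1', 0))  # port 0 lets the OS pick a free port
server.listen(1)
print('child should connect to port', server.getsockname()[1])

conn, _ = server.accept()
for chunk in iter(lambda: conn.recv(4096), b''):
    print(chunk.decode(), end='')
conn.close()

The other program would use socket.create_connection() and sendall() for each message.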
You can use the "subprocess" library for that.
import subprocess

command = ["python", "test.py", "someargument"]
for i in range(n):
    p = subprocess.Popen(command, stdout=subprocess.PIPE,
                         stderr=subprocess.STDOUT, universal_newlines=True)
    while True:
        output = p.stdout.readline()
        if output == '' and p.poll() is not None:
            break
        # readline() keeps the trailing newline, so strip before comparing
        if output.strip() == 'stop now':
            # Do whatever you want
            ...
    rc = p.poll()  # Exit Code
I would like to write a program that can take input piped to it, manipulate it, and output it immediately. No matter what I try (using sys.stdout.write rather than print, explicitly flushing stdout, setting stdin to be unbuffered), the output seems to be delayed until the input program is closed.
Example input script:
import sys
import time
sys.stdout.write('1\n')
sys.stdout.flush()
time.sleep(3)
sys.stdout.write('2\n')
sys.stdout.flush()
Example output script:
import sys
import time
for line in sys.stdin:
    print(time.time())
    sys.stdout.write(line)
    sys.stdout.flush()
If I run this:
python input.py | python output.py
I get something like this:
1428023794.99
1
1428023794.99
2
Clearly the lines are making it to the second program at the same time. I've also tried running the second python script with the -u flag to unbuffer stdin, but get the same output.
How can I make my program, which pulls text from stdin, output that text immediately?
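For reference, the usual workaround (a sketch) is to loop over readline() instead of iterating the file object, since "for line in sys.stdin" goes through an internal read-ahead buffer (notably on Python 2), while readline() hands each line over as soon as it arrives:

import sys
import time

# readline() avoids the file iterator's read-ahead buffering
for line in iter(sys.stdin.readline, ''):
    print(time.time())
    sys.stdout.write(line)
    sys.stdout.flush()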
I have been trying to log a script's output with syslog-ng; however, if I use any time.sleep() call in my code, it breaks syslog-ng and stops logging the output of the script.
Here are the details.
# samplescript.py
import time

while True:
    print("hello world")
    time.sleep(5)
I pipe its output to syslog-ng with the Unix logger tool, so I'm calling the script like this:
$ python samplescript.py | logger
This is not generating any output to my log file, even though the code is simple and works.
By the way,
I don't think there is anything wrong with the syslog-ng conf file, since if I use the code below, it works as expected.
# samplescript.py
while True:
    print("hello world")
Q: Why is time.sleep() breaking syslog-ng? Is there an equivalent of the sleep() function that I could use in my code?
thanks in advance.
You need to flush the buffered stdout stream:
import sys
sys.stdout.flush()
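Applied to the sample script, that would look something like this:

# samplescript.py
import sys
import time

while True:
    print("hello world")
    sys.stdout.flush()  # push each line through the pipe before sleeping
    time.sleep(5)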
See also: How to flush output of Python print?
I've written a Python wrapper (pyprog) to run a program (someprogram), something like this:
# ...do some setup stuff in Python...
print("run [y=yes]")
CHOICE = input()
# ...do some setup stuff in Python...
if CHOICE == "y":
    status = subprocess.call(["someprogram"])
    sys.exit(status)
A user wants to use a shell script to run the program and feed it input using a here document like this:
#!/bin/sh
pyprog > pyprog.log << EOF
y
file1
file2
EOF
Is there a way to spawn the subprocess so that the here document will work (the "y" gets consumed by the Python input(), and the "file1" and "file2" continue along as stdin to someprogram)? Right now, the Python input() takes the "y", but the rest of it disappears.
You need to connect sys.stdin to the stdin of the call.
import sys
import subprocess

status = subprocess.call(["someprogram"], stdin=sys.stdin)
I've used something like this a few times before: https://gist.github.com/887225
Basically it's a Python script that accepts a number of command-line parameters, performs some transformation based on the input, then uses os.system() to invoke a shell command.
In this example I'm calling Java, passing in a classpath, then running the ProgramName.jar program.
I have a long-running Python script that I run from the command line. The script writes progress messages and results to standard output. I want to capture everything the script writes to standard output in a file, but also see it on the command line. Alternatively, I want the output to go to the file immediately, so I can use tail to view its progress. I have tried this:
python MyLongRunngingScript.py | tee log.txt
But it does not produce any output (just running the script produces output as expected). Can anyone propose a simple solution? I am using Mac OS X 10.6.4.
Edit: I am using print for output in my script.
You are on the right path, but the problem is Python buffering the output.
Fortunately there is a way to tell it not to buffer the output:
python -u MyLongRunngingScript.py | tee log.txt
The fact that you don't see anything is probably related to buffering: you only get output every 4 KB of text or so.
Instead, try something like this:
class OutputSplitter(object):
    def __init__(self, real_output, *open_files):
        self.__stdout = real_output
        self.__fds = open_files
        self.encoding = real_output.encoding

    def write(self, string):
        self.__stdout.write(string)  # don't catch exceptions on that one
        self.__stdout.flush()
        for fd in self.__fds:
            try:
                fd.write(string)
                fd.flush()
            except IOError:
                pass  # do what you want here

    def flush(self):
        pass  # already flushed
Then replace sys.stdout with an instance of that class, with some code like this:
stdout_saved = sys.stdout
logfile = open("log.txt", "a")  # check exceptions on that one
sys.stdout = OutputSplitter(stdout_saved, logfile)
That way, every output (print included) is flushed to the standard output and to the specified file. It might require tweaking because I haven't tested that implementation.
Of course, expect a (most of the time small) performance penalty when printing messages.
Another simple solution could also be
python script.py > output.log
You could try calling sys.stdout.flush() occasionally in your script, and running with tee again. When stdout is redirected through to tee, it may be buffered for longer than when it goes straight to a terminal.