Let me introduce the goal of the application I'm building: I am creating a front-end GUI using PySide (Qt) for a Fortran-based CFD application. The Fortran application is compiled as a *.exe file and, when executed, it continuously prints the simulated lapse of time and other output details (when I launch it from the console, these data continuously appear until it finishes).
For example, if I executed the external code from the console I would get
>> myCFDapplication.exe
Initializing...
Simulation start...
Time is 0.2
Time is 0.4
Time is 0.6
Time is 0.8
Time is 1.0
Simulation finished
>>
There is quite a long delay between each "Time is ..." line and the next.
The objective of the GUI is to generate the initialization files for the external application, to launch it, and finally to show the user the computation output in real time (as plain text).
From other similar topics on this site, I have been able to launch my external application from Python using the following code:
import os, sys
import subprocess

procExe = subprocess.Popen("pru", shell=True, stdout=subprocess.PIPE,
                           stderr=subprocess.PIPE, universal_newlines=True)
while procExe.poll() is None:
    line = procExe.stdout.readline()
    print("Print:" + line)
but the output is only displayed when the execution finishes, and moreover, the whole GUI freezes until that moment.
I would like to know how to launch my external application from Python, get its output in real time, and pass it to the GUI immediately. The idea would be to print the output line by line inside a QTextEdit widget using its append(each_output_line) method.
Check out Non-blocking read on a subprocess.PIPE in python and look at the use of Queues to do a non-blocking read of the subprocess. The biggest change for your Qt application is that you are probably going to have to use multiprocessing since, as you have observed, anything blocking in your application is going to freeze the GUI.
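Here is a minimal sketch of that queue-plus-reader-thread pattern from the linked answer, adapted to this case (Python 3 naming; the executable name, the text_edit widget, and the QTimer wiring are assumptions):

import subprocess
from queue import Queue, Empty
from threading import Thread

def enqueue_output(pipe, queue):
    # Runs in a background thread: readline() may block here
    # without freezing the GUI event loop.
    for line in iter(pipe.readline, ""):
        queue.put(line)
    pipe.close()

proc = subprocess.Popen(["myCFDapplication.exe"], stdout=subprocess.PIPE,
                        universal_newlines=True)
q = Queue()
Thread(target=enqueue_output, args=(proc.stdout, q), daemon=True).start()

def poll_queue():
    # Call this periodically from the GUI (e.g. via a QTimer) to drain
    # whatever lines have arrived so far without blocking.
    try:
        while True:
            text_edit.append(q.get_nowait().rstrip())
    except Empty:
        pass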
I have a program that produces a CSV file, and right at the end I call os.startfile(fileName). But because the program then finishes execution, the file that was just opened closes as well. The same happens if I add a sleep afterwards: the file loads, then once the sleep ends it closes again.
Any help would be appreciated.
From the documentation for os.startfile:
startfile() returns as soon as the associated application is launched. There is no option to wait for the application to close, and no way to retrieve the application’s exit status.
When using this function, there is no way to make your script wait for the program to complete because you have no way of knowing when it is complete. Because the program is being launched as a subprocess of your python script, the program will exit when the python script exits.
Since you don't say in your question exactly what the desired behavior is, I'm going to guess that you want the python script to block until the program finishes execution (as opposed to detaching the subprocess). There are multiple ways to do this.
Use the subprocess module
The subprocess module allows you to make a subprocess call that will not return until the subprocess completes. The exact call you make to launch the subprocess depends heavily on your specific situation, but this is a starting point:
# 'start /wait' blocks until the launched application closes
subprocess.call(['start', '/wait', '', fileName], shell=True)
Use input() to let the user close the script
You can have your script block until the user tells the python script that the external program has closed. This probably requires the least modification to your code, but I don't think it's a good solution, as it depends on user input.
os.startfile(fileName)
input('Press enter when external program has completed...')
I have a Python loop that creates a new image at a new path on each iteration. I want to send that path to a previously started process that is waiting for a path at its command prompt.
In more detail:
I am running a self-driving simulator in Python, and every 5 frames I want to feed the current frame (saved on a RAM disk) to an object-detection algorithm I have trained (it takes around 9 ms to detect my object BUT 2 seconds to open the process). Currently I execute the trained algorithm using the subprocess module, and I open the process only once, when I run the main script; it then sits waiting for an image path. I believe that with your tips I am close to the answer, but I can't figure out how to pass that image path to the waiting subprocess on every fifth-frame iteration.
PS: I am on Windows, Python 3.5.4.
Do you know what I can do?
If you're on a *nix environment, and I'm understanding what you want, pipes provide what you want to do quite elegantly. I don't know Windows, but maybe the same concept could be used there.
Here's a simple example that illustrates this:
Pipe1a.py:
#!/usr/bin/env python
import time
import sys
for i in range(10):
    time.sleep(1)  # represent some processing delay
    print("filename{}.jpg".format(i))
    sys.stdout.flush()
Pipe1b.py:
#!/usr/bin/env python
import sys
while True:
    line = sys.stdin.readline()
    if len(line) == 0:
        break
    print("Processing image '{}'".format(line.strip()))
If you made both of these scripts executable, then you could chain them together via a pipe at the command prompt:
> Pipe1a.py | Pipe1b.py
Resulting output:
Processing image 'filename0.jpg'
Processing image 'filename1.jpg'
Processing image 'filename2.jpg'
Processing image 'filename3.jpg'
Processing image 'filename4.jpg'
Processing image 'filename5.jpg'
Processing image 'filename6.jpg'
Processing image 'filename7.jpg'
Processing image 'filename8.jpg'
Processing image 'filename9.jpg'
The concept here is that one process writes data to its stdout and a second process reads that data from its stdin.
If you don't want to use the command prompt to string these together, but rather want to run a single script, you can do the same thing, with pretty much the same code, using the Python subprocess module. For this example, you could have the Pipe1b.py program run the Pipe1a.py program via subprocess.Popen and then process its output in the same way.
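A minimal sketch of that single-script variant, assuming both files sit next to each other and python is on the PATH:

import subprocess

# Launch the producer and read its stdout line by line as it arrives.
proc = subprocess.Popen(["python", "Pipe1a.py"],
                        stdout=subprocess.PIPE, universal_newlines=True)
for line in proc.stdout:
    print("Processing image '{}'".format(line.strip()))
proc.wait()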
Thank you for all your comments and for your help. I have finally got it working.
I used the 'pexpect' library, which allows you to launch a program inside a Python script with process = popen_spawn.PopenSpawn(...). A function from that library called 'send' then lets you pass an argument to that running process (process.send(arg)).
In my case, I launch the program (.exe) at the beginning of the script, defining that instance as a global variable. Then I just have to call the send function on each iteration.
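A minimal sketch of that pattern (the executable name detector.exe is a placeholder for the trained detector):

from pexpect import popen_spawn

# Launched once at script start-up; kept as a module-level (global) handle.
process = popen_spawn.PopenSpawn("detector.exe")

def send_image_path(path):
    # The trailing newline emulates pressing Enter at the process's prompt.
    process.send(path + "\n")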
First of all, a short overview of my current goal:
I want to use a scheduler to execute a simple Python program every second. This program reads some data and writes the results to a database. Because the scheduled task will run for several days on a Raspberry Pi, the process should run in the background. Therefore I want to create a Python file that can start, stop, and query the current status of the background job. Furthermore, it should be possible to exit and re-enter the control file without stopping the background job.
Currently I have tried apscheduler to execute the Python file every second. The actual problem is that I can't reach the running Python file, to control its status, from another external file. Overall I have found no real solution for how to control a subprocess from an external file, and how to find the same subprocess again after restarting the controlling Python file.
EDIT:
So overall, as far as I've got it now, I'm able to find the running process by its PID. With that I'm able to send a terminate signal to it. Inside my scheduled file I'm able to catch these signals and shut the program down cleanly.
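For reference, a minimal sketch of catching the terminate signal inside the scheduled file (the handler body is an assumption; put your own clean-up there):

import signal
import sys

def shutdown(signum, frame):
    # Close database connections / flush pending writes here,
    # then exit cleanly.
    sys.exit(0)

signal.signal(signal.SIGTERM, shutdown)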
To control (start, restart, stop, schedule) the background process, use subprocess. Here is an example of subprocess's Popen with a timeout.
To pass some data between the scheduler and the background job use one of IPC mechanisms, for example sockets.
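As a minimal sketch of the socket idea (the port number and message protocol are assumptions): the background job listens on localhost, and the control script connects to query it.

import socket

def query_status(port=50007):
    # The background job is assumed to listen on this port and to
    # reply to the message "status".
    with socket.create_connection(("127.0.0.1", port), timeout=2) as s:
        s.sendall(b"status")
        return s.recv(1024).decode()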
Question: Is there a way, using Python, to access the stdout of a running process? This process has not been started by Python.
Context: There is a program called mayabatch that renders images from 3D Maya scene files. If I run the program from the command line, I see progress messages from mayabatch. Sometimes artists close these windows, leaving the progress untraceable until the program finishes. That led me down this route of trying to read its stdout after it has been spawned by a foreign process.
Background:
OS: Windows 7 64-bit
My research so far: I have only found questions and answers of how to do this if it was a subprocess, using the subprocess module. I also looked briefly into psutil, but I could not find any way to read a process' stdout.
Any help would be really appreciated. Thank you.
I don't think you can get to the stdout of a process outside of the code that created it.
The lazy way is just to pipe the output of mayabatch to a text file, and then poll the text file periodically in your own code, so it's under your control rather than forcing you to wait on the pipe (which is especially hard on Windows, since select on Windows doesn't work with the pipes used by subprocess).
I think this is what Maya does internally too: by default mayabatch logs its results to a file called mayaRenderLog.txt in the user's Maya directory.
If you're running mayabatch from the command line or a bat file, you can funnel stdout to a file with a > character:
mayabatch.exe "file.ma" > log.txt
You should be able to poll that text file from the outside using standard python as long as you only open it for reading. The advantage of doing it this way is that you control the frequency at which you check the file.
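A minimal sketch of such a poller (the path and the one-second interval are assumptions):

import time

def follow(path, interval=1.0):
    # Open read-only and keep checking for new lines at the end of the file.
    with open(path, "r") as f:
        while True:
            line = f.readline()
            if line:
                print(line.rstrip())
            else:
                time.sleep(interval)  # no new output yet; try again later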
OTOH, if you're doing it from Python, it's a little tougher unless you don't mind having your Python script sit idle until mayabatch completes. The usual subprocess recipe, which uses popen.communicate(), waits for an end-of-process return code:
test = subprocess.Popen(["mayabatch.exe", "filename.mb"], stdout=subprocess.PIPE)
print(test.communicate()[0])
This works, but it won't report anything until the process dies. Calling readline on the process's stdout, however, reports output one line at a time while the process runs:
test = subprocess.Popen(["mayabatch.exe", "filename.mb"],
                        stdout=subprocess.PIPE, universal_newlines=True)
for line in iter(test.stdout.readline, ""):
    print(line.rstrip())
More discussion here
I'm trying to write a program in Python. What I want is a script that immediately returns a friendly message to the user, but spawns a long subprocess in the background that works with several different files and writes them to one big "granddaddy" file. I've done several tutorials on threading and processing, but no matter what I try, the program waits until the subprocess is done before it displays the aforementioned friendly message to the user. Here's what I've tried:
Threading example:
#!/usr/local/bin/python
import cgi, cgitb
import time
import threading

class TestThread(threading.Thread):
    def __init__(self):
        super(TestThread, self).__init__()

    def run(self):
        time.sleep(5)
        fileHand = open('../Documents/writable/output.txt', 'w')
        fileHand.write('Big String Goes Here.')
        fileHand.close()

print('Starting Program')
thread1 = TestThread()
#thread1.daemon = True
thread1.start()
I’ve read these SO posts on multithreading
How to use threading in Python?
running multiple threads in python, simultaneously - is it possible?
How do threads work in Python, and what are common Python-threading specific pitfalls?
The last of these says that running threads concurrently in Python is actually not possible. Fair enough. Most of those posts also mention the multiprocessing module, so I've read up on that, and it seems fairly straightforward. Here are some of the resources I've found:
How to run two functions simultaneously
Python Multiprocessing Documentation Example
https://docs.python.org/2/library/multiprocessing.html
So here’s the same example translated to multiprocessing:
#!/usr/local/bin/python
import time
from multiprocessing import Process, Pipe

def f():
    time.sleep(5)
    fileHand = open('../Documents/writable/output.txt', 'w')
    fileHand.write('Big String Goes Here.')
    fileHand.close()

if __name__ == '__main__':
    print('Starting Program')
    p = Process(target=f)
    p.start()
What I want is for these programs to immediately print ‘Starting Program’ (in the web-browser) and then a few seconds later a text file shows up in a directory to which I’ve given write privileges. However, what actually happens is that they’re both unresponsive for 5 seconds and then they print ‘Starting Program’ and create the text file at the same time. I know that my goal is possible because I’ve done it in PHP, using this trick:
//PHP
exec("php child_script.php > /dev/null &");
And I figured it would be possible in Python. Please let me know if I’m missing something obvious or if I’m thinking about this in the completely wrong way. Thanks for your time!
(System information: Python 2.7.6, Mac OSX Mavericks. Python installed with homebrew. My Python scripts are running as CGI executables in Apache 2.2.26)
OK, I think I found the answer. Part of it was my own misunderstanding. A Python script can't simply return a message to a client-side (AJAX) program and still be executing a big process: the very act of responding to the client means that the program has finished, threads and all. The solution, then, is to use the Python version of this PHP trick:
//PHP
exec("php child_script.php > /dev/null &");
And in Python:
#Python
subprocess.call(" python worker.py > /dev/null &", shell=True)
It starts an entirely new process outside the current one, and it will continue after the current one has ended. I'm going to stick with Python because at least we're using a civilized API function to start the worker script instead of the exec function, which always made me uncomfortable.
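For reference, an equivalent sketch that avoids going through the shell (a guess at a tidier variant, not the poster's code): Popen returns immediately, and redirecting the worker's output means the web server isn't left holding an open pipe.

import os
import subprocess

with open(os.devnull, 'wb') as devnull:
    # Popen does not wait; the worker keeps running after this
    # CGI script has finished responding.
    subprocess.Popen(['python', 'worker.py'],
                     stdout=devnull, stderr=devnull)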