Doing something after popen is finished - python

I want to make a background process that displays a file with an external viewer. When the process is stopped, it should delete the file.
The following piece of code does what I want to do, but it is ugly and I guess there is a more idiomatic way.
It would be perfect if it were even OS independent.
subprocess.Popen(viewer + ' ' + file + ' && rm ' + file, shell=True)

Using subprocess.call() to open the viewer and view the file will do exactly that: it blocks until the viewer is closed. Subsequently, run the command to delete the file.
If you want the script to continue while the process is running, use threading.
An example:
from threading import Thread
import subprocess
import os

def test():
    file = "/path/to/somefile.jpg"
    subprocess.call(["eog", file])
    os.remove(file)

Thread(target=test).start()
# the print command runs, no matter if the process above is finished or not
print("Banana")
This will do exactly what you describe:
open the file with eog (the viewer), wait for it to finish (eog is closed), and remove the file.
In the meantime continue the script and print "Banana".
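If the viewer might fail to start, a try/finally variant of the same idea (a sketch, still assuming eog as the viewer) ensures the file is removed either way:
from threading import Thread
import subprocess
import os

def view_and_remove(path):
    try:
        subprocess.call(["eog", path])  # blocks this thread until the viewer exits
    finally:
        os.remove(path)  # runs even if the viewer could not be started

Thread(target=view_and_remove, args=("/path/to/somefile.jpg",)).start()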

Related

Run a program from python several times without initializing different shells

I want to run a compiled Fortran numerical model from Python. It is too complex to compile it using F2PY without implementing several changes in the Fortran routines, which is why I am just calling its executable using the subprocess module.
The problem is that I have to call it a few thousand times, and I have the feeling that spawning so many shells is slowing the whole thing down.
My implementation (it is difficult to provide a reproducible example, sorry) looks like:
import os
import subprocess
foo_path = '/path/to/compiled/program/'
program_dir = os.path.join(foo_path, "FOO") #FOO is the Fortran executable
instruction = program_dir + " < nlst" #It is necessary to provide FOO a text file (nlst)
#with the configuration for the program
subprocess.call(instruction, shell=True, cwd=foo_path) #run the executable
Running it this way (inside a loop) works well, and FOO generates a text file output that I can read from Python. But I'd like to do the same while keeping the shell active and just passing it the "nlst" file path. Another nice option might be to start an empty shell and keep it waiting for the instruction string, which would look like "./FOO < nlst". But I am not sure how to do that, any ideas?
Thanks!
[Edited] Something like this should work, but .communicate() ends the process and a second call does not work:
from subprocess import Popen, PIPE
foo_path = '/path/to/FOO/'
process = Popen(['bash'], stdin=PIPE, cwd=foo_path)
process.communicate(b'./FOO < nlst')
I found this solution using the pexpect module:
import pexpect
import os.path
foo_path = '/path/to/FOO/'
out_path = '/path/to/FOO/foo_out_file' #path to output file
child = pexpect.spawn('bash', cwd=foo_path)
child.sendline('./FOO < nlst')
while not os.path.exists(out_path): #wait until out_path is created
    continue
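One small refinement (my addition, not part of the original answer): sleeping inside the polling loop avoids burning a full CPU core while waiting:
import time

while not os.path.exists(out_path):  # wait until out_path is created
    time.sleep(0.1)  # check ten times per second instead of busy-waiting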
To extend my comment, here is an example for threading with your code:
import os
import subprocess
from concurrent.futures import ThreadPoolExecutor
foo_path = '/path/to/compiled/program/'
program_dir = os.path.join(foo_path, "FOO") #FOO is the Fortran executable
instruction = program_dir + " < nlst" #It is necessary to provide FOO a text file (nlst)
#with the configuration for the program

def your_function():
    subprocess.call(instruction, shell=True, cwd=foo_path) #run the executable
# create executor object
executor = ThreadPoolExecutor(max_workers=4) # uncertain of how many workers you might need/want
# specify how often you want to run the function
for i in range(10):
# start your function as thread
executor.submit(your_function)
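If the script should block until all runs have finished, the executor can also be used as a context manager; a sketch of the same loop:
with ThreadPoolExecutor(max_workers=4) as executor:
    for i in range(10):
        executor.submit(your_function)
# leaving the with-block waits for all submitted runs to complete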
What I meant in my comment was something like the following Python script:
from subprocess import Popen, PIPE

foo_path = '/home/ronald/tmp'
# text=True makes stdin accept strings instead of bytes (Python 3)
process = Popen(['/home/ronald/tmp/shexecutor'], stdin=PIPE, cwd=foo_path, text=True)
process.stdin.write("ls\n")
process.stdin.write("echo hello\n")
process.stdin.write("quit\n")
process.stdin.flush()  # make sure the commands actually reach the shell script
And the shell script that executes the commands:
#!/bin/bash
while read cmdline; do
    if [ "$cmdline" == "quit" ]; then
        exit 0
    fi
    eval "$cmdline" >> x.output
done
Instead of doing an "eval", you can do virtually anything.
Note that this is just an outline of a real implementation.
You'd need to do some error handling. And if you are going to use this in a production environment, be sure to harden the code to the limit.

How to monitor the subprocess object?

I need to convert a lot of files using the mediapipe compiler, which runs on bazel. There are hundreds of files, so this process has to be automated. The command to be executed would normally be something like:
GLOG_logtostderr=1 bazel-bin/mediapipe/examples/desktop/multi_hand_tracking/multi_hand_tracking_cpu --input_video_path=/home/tony/Videos/HandWashDataset/Step_1/HandWash_001_A_01_G01.mp4
Here GLOG_logtostderr is the logger attached to the bazel program to output the log (the result). I used a redirect (2>a.txt) at the end to get the results written to a txt file.
I have got the program working by writing a Python script using subprocess module.
import glob
import os
import time
import subprocess
files = glob.glob("/home/tony/Videos/HandWashDataset/Step_1/*.mp4")
os.chdir("/home/tony/mediapipe")
output = ""
for i in files:
    print("process file {}".format(i))
    output = (i[:len(i)-4]) + ".txt"
    inputf = "--input_video_path=" + i
    outputf = "2>" + output
    f = open("blah.txt", "w")
    sp = subprocess.call(["GLOG_logtostderr=1 bazel-bin/mediapipe/examples/desktop/multi_hand_tracking/multi_hand_tracking_cpu " + inputf], shell=True, stderr=f)
    time.sleep(180)
    f.close()
print("process finished")
The problem I am having is that currently I seem to have no control over the process in each iteration, since the script is invoking another program. The call assigned to sp seems to return nearly instantly, but the actual conversion takes a few minutes. Without time.sleep, all the instances of bazel are launched at once, which killed my computer. Is there a way to monitor the process so I can convert one file at a time?
You should use subprocess.run:
That runs the command described by args,
waits for the command to complete,
then returns a CompletedProcess instance.
See also: https://docs.python.org/3/library/subprocess.html
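For the loop in the question, that could look like the following sketch; splitting the command into an argument list and passing GLOG_logtostderr via env= are my adaptations, not part of the original command:
import glob
import os
import subprocess

files = glob.glob("/home/tony/Videos/HandWashDataset/Step_1/*.mp4")
os.chdir("/home/tony/mediapipe")
for i in files:
    print("process file {}".format(i))
    output = i[:-4] + ".txt"
    with open(output, "w") as f:
        # run() blocks until this conversion finishes, so only one bazel
        # instance exists at a time and no time.sleep is needed
        subprocess.run(
            ["bazel-bin/mediapipe/examples/desktop/multi_hand_tracking/multi_hand_tracking_cpu",
             "--input_video_path=" + i],
            env={**os.environ, "GLOG_logtostderr": "1"},  # set the env var instead of using shell syntax
            stderr=f,
        )
print("process finished")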

subprocess run in background and write output line by line to a file

I have two files:
main.py
import subprocess
import shlex

def main():
    command = 'python test_output.py'
    logfile = open('output', 'w')
    proc = subprocess.Popen(shlex.split(command), stdout=logfile)

if __name__ == "__main__":
    main()
and test_output.py
from time import sleep
import os

for i in range(0, 30):
    print("Slept for => ", i+1, "s")
    sleep(1)
os.system("notify-send completed -t 1500")
The output of the process is written to logfile only once the child process has completed. Is there any way to:
Start the child process from main and exit (like it does now).
Keep the child process running in the background.
As the child process produces output, write it immediately to logfile (don't wait for the child process to finish, as it does now).
There are other questions (like this one) where a solution is given for reading line by line, but they make main.py wait. Is it possible to do everything in the background, without keeping main.py waiting?
Both the buffer of the file handle and that of the subprocess can be set to 'line buffering', where a newline character causes each buffer to be flushed. This is done by setting the buffering parameter of open() and the bufsize parameter of Popen to 1; see the open() documentation and subprocess.
You also need to make sure that the child process does not buffer by itself. Since the child here is also a Python script, you can arrange this in its code, for example with flush=True in the print call:
print(this_and_that, flush=True)
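Putting both suggestions together, a minimal sketch of main.py from the question (the buffering argument is the only change; pair it with flush=True in test_output.py's print call):
import subprocess
import shlex

def main():
    command = 'python test_output.py'
    # buffering=1 selects line buffering for this text-mode file object
    logfile = open('output', 'w', buffering=1)
    subprocess.Popen(shlex.split(command), stdout=logfile)

if __name__ == "__main__":
    main()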

Unable to debug python script with subprocess

I have written a Python function which runs another Python script on a remote desktop using PSTools (psexec). I have run the script successfully several times when the function is called only once. But when I call the function multiple times from another file, the subprocess does not run in the second call. In fact, it immediately quits the entire program in the second iteration without throwing any exception or traceback.
controlpc_clean_command = self.psexecpath + ' -s -i 2 -d -u ' + self.controlPClogin + ' -p ' + self.controlPCpwd + ' \\' + self.controlPCIPaddr + ' cmd.exe /k ' + self.controlPC_clean_code
logfilePath = self.psexeclogs + 'Ctrl_Clean_Log.txt'
logfile = file(logfilePath,'w')
try:
    process = subprocess.Popen(controlpc_clean_command, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
    for line in process.stderr:
        print "** INSIDE SUBPROCESS STDERR TO START PSEXEC **\n"
        sys.stderr.write(line)
        logfile.write(line)
    process.wait()
except OSError:
    print "********COULD NOT FIND PSEXEC.EXE, PLEASE REINSTALL AND SET THE PATH VARIABLE PROPERLY********\n"
The above code runs perfectly once. Even if I run it from a different Python file with different parameters, it runs fine. The problem happens when I call the function more than once from one file: in the second call the function quits after printing "** INSIDE SUBPROCESS STDERR TO START PSEXEC **\n" and it does not print anything in the main program after that.
I am unable to figure out how to debug this issue, as I am completely clueless where the program goes after printing this line. How do I debug this?
Edit:
After doing some search, I added
stdout, stderr = process.communicate()
after the subprocess.Popen line in my script. Now I am able to proceed with the code, but with one problem: nothing is written to the logfile 'Ctrl_Clean_Log.txt' any more after adding communicate()! How can I write to the file and still proceed with the code?
Maybe your first process is stuck waiting and blocking other processes.
https://docs.python.org/2/library/subprocess.html
Popen.wait()
Wait for child process to terminate. Set and return returncode attribute.
Warning This will deadlock when using stdout=PIPE and/or stderr=PIPE and the
child process generates enough output to a pipe such that it blocks waiting
for the OS pipe buffer to accept more data. Use communicate() to avoid that.
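To address the follow-up about the log file: communicate() returns the captured output instead of leaving it in the pipe, so the log file has to be written from the returned value. A sketch using the question's own names (Python 2, as in the question):
process = subprocess.Popen(controlpc_clean_command,
                           stdout=subprocess.PIPE, stderr=subprocess.PIPE)
stdout, stderr = process.communicate()  # waits for psexec to exit without risking the pipe deadlock
logfile.write(stderr)       # the captured stderr now reaches Ctrl_Clean_Log.txt
sys.stderr.write(stderr)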

how to create and destroy non-blocking subprocesses that are OS independent in python?

In a Python script I want to spawn a process that runs a file in the same directory.
I don't want the Python script to be blocked by the new process.
I then want to be able to close the spawned process from the script.
On top of it all, I need it to be OS independent.
What is the best way of doing this?
As #Keith suggested, use the subprocess module, but more specifically use Popen. For example, on Windows, this opens myfile.txt with Notepad and then terminates it after 10 seconds:
import subprocess
import time
command = "notepad myfile.txt"
pipe = subprocess.Popen(command, shell=False)
time.sleep(5)
pipe.poll()
print("%s" % pipe.returncode) #"None" when working fine
time.sleep(5)
pipe.terminate()
pipe.wait()
print("%s" % pipe.returncode) # 1 after termination
