python subprocess non-blocking and returning output

I know this has been asked many times, but I've yet to find a proper way of doing it. If I want to run a local command, the docs say I should use subprocess, as it replaces older methods such as os.system/os.popen etc.
If I call subprocess.Popen(command, shell=True, stdout=subprocess.PIPE) in my program and the command is, for example, an openvpn directive that connects my computer to a VPN, the process will hang indefinitely: openvpn prints its output ending with a newline, but keeps running while connected, and so does my program (frozen).
Some say I should remove stdout=subprocess.PIPE, which indeed works in a non-blocking way, but then everything gets printed to the console instead of me having some sort of control over the output (maybe I don't want to print it).
So is there a proper way of doing this, an example maybe, of executing commands in a non-blocking way while also having control over the output?

If you specify stdout=PIPE, then your subprocess will write to the pipe and hang when the pipe buffer is full. The Python program shouldn't hang - Popen is asynchronous, which is why Popen.wait() can be called later to wait for the subprocess to exit. Read from Popen.stdout in order to keep the subprocess happy, and print, discard, or process the output as you see fit.
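For example, a minimal sketch (the openvpn arguments here are placeholders for whatever command you actually run):

import subprocess

# Placeholder command; substitute your actual openvpn invocation.
proc = subprocess.Popen(['openvpn', '--config', 'client.ovpn'],
                        stdout=subprocess.PIPE,
                        stderr=subprocess.STDOUT,
                        universal_newlines=True)

# Reading line by line keeps the pipe drained, so the subprocess never
# blocks on a full buffer; your program only blocks between lines.
for line in proc.stdout:
    print(line, end='')   # or log it, parse it, or discard it
proc.wait()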

Consider running your process within a terminal. For example,
subprocess.Popen("xterm -e /bin/bash -c '/path/to/openvpn'", shell=True)
or you could even try:
import shlex
subprocess.Popen(shlex.split("xterm -e /bin/bash -c '/path/to/openvpn'"), shell=False)

Related

subprocess.Popen, kill process started with sudo

I am trying to start and later kill a process that requires sudo via a Python script. Even though the Python script itself is run with sudo and kill() does not give any permission errors, the process is not killed (and never receives SIGKILL).
Investigating this, I found out that Popen() returns the process id of the sudo process (I assume, at least) rather than the process I want to control. So even when I kill it correctly later, the underlying process keeps running. (Although if I kill the Python program before killing the sudo process in Python code, the underlying process is also killed, so I guess there must be a way to do this manually, too.)
I know it might be an option to use pgrep or pidof to search for the correct process, but as the process's name might not be unique, it seems unnecessarily error prone (a process with the same name might also be started around the same time, so taking the latest one might not help).
Is there any solution to reliably get the pid of the underlying process started with sudo in Python?
Using Python3.
My code for conducting the tests, taken slightly modified from https://stackoverflow.com/a/43417395/1171541:
import subprocess, time

cmd = ["sudo", "testscript.sh"]

def myfunction(action, process=None):
    if action == "start":
        process = subprocess.Popen(cmd)
        return process
    if action == "stop":
        # kill() and send_signal(signal.SIGTERM) do not work either
        process.terminate()

process = myfunction("start")
time.sleep(5)
myfunction("stop", process)
Okay, I can answer my own question here (found via https://izziswift.com/how-to-terminate-a-python-subprocess-launched-with-shelltrue/). The trick was to open the process with:
subprocess.Popen(cmd, stdout=subprocess.PIPE, shell=True, preexec_fn=os.setsid)
and then kill it:
os.killpg(os.getpgid(process.pid), signal.SIGTERM)
This time I use a shell to open the process and use the os module to kill all the processes in the process group.
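For reference, a minimal sketch putting those two pieces together (testscript.sh is a placeholder):

import os, signal, subprocess, time

# preexec_fn=os.setsid puts the child in a new process group, so the
# process sudo spawns ends up in that group as well.
process = subprocess.Popen('sudo ./testscript.sh', shell=True,
                           stdout=subprocess.PIPE,
                           preexec_fn=os.setsid)
time.sleep(5)
# Signal the whole group rather than just the sudo wrapper.
os.killpg(os.getpgid(process.pid), signal.SIGTERM)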

Send command to subprocess.Popen()

My problem
I need to send a command to a Popen process that executes a batch file. I have been searching the internet for quite a while but didn't find a solution. The "command" I have to send is stop.
Tried solutions
I have already tried .stdin.write(), which ended up with me not being able to send commands from the normal console, and the Popen process waiting until another execution of the file. Another thing I tried is .communicate(), which again ended up with the Popen process waiting until another execution of the file.
Current code
Starting code:
mcserver = subprocess.Popen('C:/Users/FlexGames/Desktop/Minecraft_Server/FTBTrident-1.4.0-1.7.10Server/ServerStart.bat',
                            stdin=subprocess.PIPE)
Current part to send a command to console:
mcserver.stdin.write(b'stop\n')
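One thing worth checking (an assumption, since the snippet doesn't show it): writes to a pipe are buffered, so the command may never actually reach the batch file until the buffer is flushed:

mcserver.stdin.write(b'stop\n')
mcserver.stdin.flush()  # push the bytes through the pipe immediately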

How to keep xfoil open using subprocess

Summary
I am automating xfoil with the subprocess module. I would like to be able to start an xfoil session with several commands and leave it open for the user to take over.
This would help debugging and also more generally to have a basic routine to start xfoil (without manually typing the same set of commands every time).
I am able to run any xfoil command using subprocess.communicate().
However, when opened with subprocess, xfoil systematically closes without user action.
Example
With the following code, you can see xfoil opening and closing quickly.
import subprocess
XFOIL_PATH = 'xfoil.exe'
xfoil = subprocess.Popen(XFOIL_PATH, stdin=subprocess.PIPE,
                         stdout=subprocess.PIPE,
                         stderr=subprocess.PIPE,
                         universal_newlines=True)
actions = 'NACA 0012\nGDES\n'
xfoil.communicate(input=actions)
Note
I've used subprocess.Popen() with Rhino and Rhino stays open until I close it manually. I do not understand why the behavior is different with xfoil.
I suspect it has something to do with the specific application's stdout but it's a wild guess. Hopefully it is possible to do something about it.
My understanding is that when you call communicate() with the input parameter, the call closes stdin, which terminates the xfoil.exe process. Try the following instead of calling communicate():
xfoil.stdin.write(actions)
xfoil.stdin.flush()
After that, the process continues until you exit your script.
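Putting it together, a minimal sketch of the pattern (stdout/stderr are left un-piped here so nothing can block on an unread pipe):

import subprocess

xfoil = subprocess.Popen('xfoil.exe', stdin=subprocess.PIPE,
                         universal_newlines=True)
# Send the commands but keep stdin open, so xfoil stays interactive.
xfoil.stdin.write('NACA 0012\nGDES\n')
xfoil.stdin.flush()
# ... later, close stdin or terminate the process when you are done:
# xfoil.stdin.close() or xfoil.terminate()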
Update
If you want the xfoil process to continue even after your script ends, please look into pexpect.

Python: run xdg-open in background

I have a console application in Python. I'm trying to use xdg-open and run it in the background, but I can't. I tried:
os.system('xdg-open http://google.com &')
subprocess.call('xdg-open http://google.com &', shell=True)
I don't know what you mean by
but I can't
because it works for me. I imagine, however, that you're complaining that the parent process does not close until the child has.
That code is, however, an outdated practice (if it ever was in favour); the modern equivalent would be
process = subprocess.Popen(['xdg-open', 'Dunno.png'])
Instead of asking the shell to fork for you, this runs in the background from the start without ever passing through a shell. This should deal with the problem above, too.
If you want to capture sys.stdout, you can use
process = subprocess.Popen(['xdg-open', 'Dunno.png'], stdout=subprocess.PIPE, stderr=subprocess.PIPE)
which redirects the process's stdout and stderr to buffers. (You can access those buffers via process.stdout and process.stderr, and communicate with the process either by reading from and writing to them or by using process.communicate.)
You can get the return code with process.returncode.
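For example, a short usage sketch:

# communicate() waits for the process to exit and returns the buffered output.
out, err = process.communicate()
print(process.returncode, out, err)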
If your problem is not this, a problem description (traceback?) would be useful. It's also worth checking that the behaviour of using xdg-open in the shell is what you expect.

Getting live output from running unix command in python

I am using the code below to run unix commands:
import commands  # Python 2 module, replaced by subprocess in Python 3
cmd = 'ls -l'
(status, output) = commands.getstatusoutput(cmd)
print output
But the problem is that it shows the output only after the command has completed, whereas I want to see the output printed as the execution progresses.
ls -l is just a dummy command; I am using a more complex command in the actual program.
Thanks!!
Since this is homework, here's what to do instead of the full solution:
Use the subprocess.Popen class to call the executable. Note that the constructor takes a named stdout argument, and take a look at subprocess.PIPE.
Read from the Popen object's stdout pipe in a separate thread to avoid deadlocks. See the threading module.
Wait until the subprocess has finished (see Popen.wait).
Wait until the thread has finished processing the output (see Thread.join). Note that this may very well happen after the subprocess has finished.
If you need more help please describe your precise problem.
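A sketch of those steps, using ls -l as a stand-in for the real command:

import subprocess
import threading

def reader(pipe):
    # Consume output as it is produced, so the pipe never fills up.
    for line in pipe:
        print(line, end='')
    pipe.close()

proc = subprocess.Popen(['ls', '-l'], stdout=subprocess.PIPE,
                        universal_newlines=True)
t = threading.Thread(target=reader, args=(proc.stdout,))
t.start()
proc.wait()   # wait for the subprocess to finish
t.join()      # then wait for the reader to drain any remaining output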
Unless there are simpler ways in Python which I'm not aware of, I believe you'll have to dig into the slightly more complex os.fork and os.pipe functions.
Basically, the idea is to fork your process, have the child execute your command, while having its standard output redirected to a pipe which will be read by the parent. You'll easily find examples of this kind of pattern.
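A rough sketch of that pattern (POSIX only; ls -l is again a placeholder):

import os

r, w = os.pipe()
pid = os.fork()
if pid == 0:
    # Child: route stdout into the write end of the pipe, then
    # replace this process with the command.
    os.close(r)
    os.dup2(w, 1)
    os.execvp('ls', ['ls', '-l'])
else:
    # Parent: read the child's output as it arrives.
    os.close(w)
    with os.fdopen(r) as pipe:
        for line in pipe:
            print(line, end='')
    os.waitpid(pid, 0)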
Most programs will use block buffered output if they are not connected to a tty, so you need to run the program connected to a pty; the easiest way is to use pexpect:
import pexpect

for line in pexpect.spawn('command arg1 arg2'):
    print(line)
