I have a simple Python program that I'm using to test asyncio with subprocesses:
import sys, time

for x in range(100):
    print("processing (%s/100) " % x)
    sys.stdout.flush()

print("enjoy")
sys.stdout.flush()
Running this from the command line produces the desired output. However, when called from asyncio, it never finishes:
process = yield from asyncio.create_subprocess_exec(
    *["python", "program.py"],
    stdout=async_subprocess.PIPE,
    stderr=async_subprocess.STDOUT,
    cwd=working_dir
)

# this never finishes
yield from process.communicate()
ps ax shows this process as <defunct>; I'm not sure what that means.
(A <defunct> process in ps is a "zombie": the child has already exited, but its parent hasn't reaped it with a wait() call yet.) I suspect your issue is just related to how you're calling asyncio.create_subprocess_exec and process.communicate(). This complete example works fine for me:
import asyncio
from asyncio import subprocess

@asyncio.coroutine
def do_work():
    process = yield from asyncio.create_subprocess_exec(
        *["python", "program.py"],
        stdout=subprocess.PIPE,
        stderr=subprocess.STDOUT
    )
    stdout, _ = yield from process.communicate()
    print(stdout)

if __name__ == "__main__":
    loop = asyncio.get_event_loop()
    loop.run_until_complete(do_work())
You have to place code that uses yield from inside a function decorated with @asyncio.coroutine, and then call it inside an event loop (using loop.run_until_complete) for it to behave the way you want.
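For reference, on Python 3.5+ the same logic can be written with the newer async/await syntax; a minimal sketch (assuming the same program.py, and asyncio.run from Python 3.7+):

import asyncio
from asyncio import subprocess

async def do_work():
    # async def replaces @asyncio.coroutine; await replaces yield from
    process = await asyncio.create_subprocess_exec(
        "python", "program.py",
        stdout=subprocess.PIPE,
        stderr=subprocess.STDOUT
    )
    stdout, _ = await process.communicate()
    print(stdout)

if __name__ == "__main__":
    asyncio.run(do_work())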
I'm using an asyncio subprocess to execute a subcommand. I want to watch the output of the long-running process and at the same time save it to a buffer for later use. I also found a related question (Getting live output from asyncio subprocess), but it mainly centers on an ssh use case.

The asyncio subprocess docs have an example for reading the output line by line, which goes in the direction of what I want to achieve (https://docs.python.org/3/library/asyncio-subprocess.html#examples):
import asyncio
import sys

async def get_date():
    code = 'import datetime; print(datetime.datetime.now())'

    # Create the subprocess; redirect the standard output
    # into a pipe.
    proc = await asyncio.create_subprocess_exec(
        sys.executable, '-c', code,
        stdout=asyncio.subprocess.PIPE)

    # Read one line of output.
    data = await proc.stdout.readline()
    line = data.decode('ascii').rstrip()

    # Wait for the subprocess exit.
    await proc.wait()
    return line

date = asyncio.run(get_date())
print(f"Current date: {date}")
I adapted this example to the following:
import asyncio
import shlex
import subprocess

async def subprocess_async(cmd, **kwargs):
    cmd_list = shlex.split(cmd)
    proc = await asyncio.create_subprocess_exec(
        *cmd_list,
        stdout=asyncio.subprocess.PIPE,
        stderr=asyncio.subprocess.STDOUT, **kwargs)
    full_log = ""
    while True:
        buf = await proc.stdout.readline()
        if not buf:
            break
        full_log += buf.decode()
        print(f' {buf.decode().rstrip()}')
    await proc.wait()
    res = subprocess.CompletedProcess(cmd, proc.returncode, stdout=full_log.encode(), stderr=b'')
    return res
The issue here is that the proc.returncode value sometimes ends up as None. I guess I have a misunderstanding of how proc.wait() works and when it is safe to stop reading the output. How do I achieve continuous output using an asyncio subprocess?
Your code is working fine as-is for me. What command are you trying to run that is causing your issue?
Two things I can think of that may help:

1. Instead of calling .wait() afterwards, use the returncode as the loop condition to keep running.
2. Don't wait for full lines to come back, in case the program is like ffmpeg, which does tricks to overwrite its own console output and doesn't actually send newline characters.
Example code:
import asyncio, shlex, subprocess, sys

async def subprocess_async(cmd, **kwargs):
    cmd_list = shlex.split(cmd)
    proc = await asyncio.create_subprocess_exec(
        *cmd_list,
        stdout=asyncio.subprocess.PIPE,
        stderr=asyncio.subprocess.STDOUT,
        **kwargs)
    full_log = b""
    while proc.returncode is None:
        buf = await proc.stdout.read(20)
        if not buf:
            break
        full_log += buf
        sys.stdout.write(buf.decode())
    # After EOF, make sure the child is reaped so returncode is actually set.
    await proc.wait()
    res = subprocess.CompletedProcess(cmd, proc.returncode, stdout=full_log, stderr=b'')
    return res

if __name__ == '__main__':
    asyncio.run(subprocess_async("ffprobe -i video.mp4"))
This is what I have so far...
from gpiozero import MotionSensor
import subprocess
import threading
import time

pir = MotionSensor(4)

while True:
    pir.wait_for_motion()
    print("Start Playing Music")
    subprocess.call(['mplayer', '-vo', 'null', '-ao', 'alsa', '-playlist', 'myplaylist', '-shuffle'])
The music-playing part works great, but as for the timing, I've tried threading and time, but all they seem to do is pause the code for a given amount of time. I want to run the subprocess for a given amount of time, then return to waiting on motion. I'm still learning. Thanks for your help.
This works on Python 2.7 - 3.x. Create your subprocess command; I have chosen Popen here. Popen doesn't block, allowing you to interact with the process while it's running or continue with other things in your Python program. The call to Popen returns a Popen object. You can read about the difference between subprocess.Popen and subprocess.call here.
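To illustrate the difference, a minimal sketch (the sleep command is just a placeholder):

import subprocess

# call() blocks until the command finishes and returns its exit code.
rc = subprocess.call(["sleep", "3"])   # returns after ~3 seconds

# Popen() returns immediately; the process keeps running in the background.
p = subprocess.Popen(["sleep", "3"])
print(p.poll())  # None while the process is still running
p.wait()         # block until it finishes
print(p.poll())  # 0 once it has exited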
You can use the shlex module to split your command string - very convenient. After that, you can call your command in a thread; from that moment on, you can manage the task running in the thread. Here is a simple example of how to do it:
import logging
import shlex
import subprocess
import sys
import threading

logging.basicConfig(filename='log.log',
                    filemode='a',
                    format='%(asctime)s,%(msecs)d %(name)s %(levelname)s %(message)s',
                    datefmt='%H:%M:%S',
                    level=logging.INFO)

log = logging.getLogger(__name__)

def exec_cmd(command):
    try:
        cmd = subprocess.Popen(shlex.split(command),  # nosec
                               shell=False,
                               stdout=subprocess.PIPE,
                               stderr=subprocess.PIPE,
                               universal_newlines=True)
        _thread_command(cmd)
        out, err = cmd.communicate()
        log.error(err) if err else log.info(out)
    except subprocess.CalledProcessError as su_err:
        log.error('Calledprocerr: %s', su_err)
    except OSError as os_error:
        log.error('Could not execute command: %s', os_error)

def _thread_command(task, timeout=5):
    """
    Thread. If the task runs longer than <timeout> - kill it.

    :param task: task to execute.
    """
    task_thread = threading.Thread(target=task.wait)
    task_thread.start()
    task_thread.join(timeout)
    if task_thread.is_alive():  # do whatever you want with your task, for example, kill it:
        task.kill()
        logging.error('Timeout! Executed time is more than: %s', timeout)
        sys.exit(1)

if __name__ == '__main__':
    exec_cmd('sleep 10')  # put your string command here
Tested on CentOS:

[kchojnowski@zabbix4-worker1 ~]$ cat log.log
11:31:48,348 root ERROR Timeout! Executed time is more than: 5
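As a side note, on Python 3.5+ the same kill-after-timeout behaviour is available without a helper thread via the timeout argument of subprocess.run; a minimal sketch:

import subprocess

try:
    # run() blocks, but kills the child and raises TimeoutExpired
    # if the command takes longer than 5 seconds.
    subprocess.run(["sleep", "10"], timeout=5)
except subprocess.TimeoutExpired:
    print("Timeout! Executed time is more than: 5")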
I want code like this:
if True:
    run('ABC.PY')
else:
    if ScriptRunning('ABC.PY'):
        stop('ABC.PY')
    run('ABC.PY')
Basically, I want to run a file, let's say abc.py, and based on some conditions I want to stop it and run it again from another Python script. Is that possible? I am using Windows.
You can use Python's Popen objects to run scripts as child processes.

So run('ABC.PY') would be p = Popen(["python", "ABC.PY"])
if ScriptRunning('ABC.PY') would be if p.poll() is None
stop('ABC.PY') would be p.kill()
This is a very basic example of what you are trying to achieve. Please check out the subprocess.Popen docs to fine-tune your logic for running the script:
import subprocess
import shlex
import time

def run(script):
    scriptArgs = shlex.split(script)
    commandArgs = ["python"]
    commandArgs.extend(scriptArgs)
    procHandle = subprocess.Popen(commandArgs, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
    return procHandle

def isScriptRunning(procHandle):
    return procHandle.poll() is None

def stopScript(procHandle):
    procHandle.terminate()
    time.sleep(5)
    # Forcefully terminate the script if it's still alive
    if isScriptRunning(procHandle):
        procHandle.kill()

def getOutput(procHandle):
    # stderr will be redirected to stdout due to the "stderr=subprocess.STDOUT" argument in the Popen call
    stdout, _ = procHandle.communicate()
    returncode = procHandle.returncode
    return returncode, stdout

def main():
    procHandle = run("main.py --arg 123")
    time.sleep(5)
    isScriptRunning(procHandle)
    stopScript(procHandle)
    print(getOutput(procHandle))

if __name__ == "__main__":
    main()
One thing you should be aware of is stdout=subprocess.PIPE. If your Python script produces very large output, the pipe's buffer can fill up, causing your script to block until .communicate() is called on the handle. To avoid this, pass a file handle as stdout, like this:
fileHandle = open("main_output.txt", "w")
subprocess.Popen(..., stdout=fileHandle)
In this way, the output of the Python process will be dumped into the file. (You will have to modify the getOutput() function for this too.)
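For example, a hypothetical revision of getOutput() along these lines (the main_output.txt name matches the snippet above):

def getOutput(procHandle, outputFile="main_output.txt"):
    # Wait for the process to exit, then read the output that was
    # redirected into the file instead of using communicate().
    returncode = procHandle.wait()
    with open(outputFile) as f:
        stdout = f.read()
    return returncode, stdout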
import subprocess

process = None

def run_or_rerun(flag):
    global process
    if flag:
        assert process is None
        process = subprocess.Popen(['python', 'ABC.PY'])
        process.wait()  # must wait or caller will hang
    else:
        if process.poll() is None:  # it is still running
            process.terminate()  # terminate process
        process = subprocess.Popen(['python', 'ABC.PY'])  # rerun
        process.wait()  # must wait or caller will hang
I have to record a wav file and at the same time analyze it with sox. I am using a fifo-type file for this operation. So I need to start two threads at the same time, but even when I use threads, one always executes first and then the other. I want them to run in parallel so that I can do some stuff.
# this should be in one thread
def test_wav(self):
    """ analyze the data """
    bashCommand = "sox {} -n stat".format(self.__rawfile)
    while self.__rec_thread.is_alive():
        process = subprocess.Popen(bashCommand.split(), stdout=subprocess.PIPE, stderr=subprocess.PIPE)
        wav_output = process.communicate()[1]  # sox outputs the details in stderr
        # do something and return

# this should be in another thread
def record_wav(self):
    bashCommand = "arecord -d 10 -c 2 -r 48000 -f S32_LE > {}".format(self.__rawfile)
    pid = subprocess.Popen(bashCommand.split())
    pid.wait()
    if pid.returncode != 0:
        raise RecordException("Failed while recording with error {}".format(pid.returncode))
I tried the following code to create the threads, but it failed: one always executes first and then the other, instead of running in parallel.
from threading import Thread

self.__rec_thread = Thread(target = self.record_wav())
amp_thread = Thread(target = self.test_wav())
self.__rec_thread.start()
amp_thread.start()
EDIT: First it executes the record function completely (it takes at least 10 seconds because of the -d 10 option) and then the test_wav function. It's like calling them one after another.
... target = self.record_wav() ...
is calling record_wav(): it executes immediately, and the program doesn't proceed until record_wav() completes. You almost always want to pass a function (or method) object to target=, almost never the result of executing the function/method. So just lose the parentheses:
... target = self.record_wav ...
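With the parentheses removed, both threads really do start in parallel; a minimal sketch of the corrected setup (mirroring the names from the question):

from threading import Thread

# Pass the method objects themselves; Thread calls them for you.
self.__rec_thread = Thread(target=self.record_wav)
amp_thread = Thread(target=self.test_wav)
self.__rec_thread.start()
amp_thread.start()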
If you are using Python 3, you can use asyncio to run shell commands in a goroutine-like way.
import asyncio
import sys

async def execute(command, cwd=None):
    # create_subprocess_exec runs the command directly (no shell);
    # passing shell=True here would raise ValueError.
    process = await asyncio.create_subprocess_exec(*command,
                                                   stdout=asyncio.subprocess.PIPE,
                                                   stderr=asyncio.subprocess.PIPE,
                                                   cwd=cwd)
    std_out, std_err = await process.communicate()
    error = std_err.decode().strip()
    result = std_out.decode().strip()
    print(result)
    print(error)
    return result

if sys.platform == "win32":
    # Before Python 3.8, the proactor loop must be selected explicitly
    # for subprocess support on Windows.
    loop = asyncio.ProactorEventLoop()
    asyncio.set_event_loop(loop)
else:
    loop = asyncio.get_event_loop()

try:
    loop.run_until_complete(
        asyncio.gather(execute(["bash", "-c", "echo hello && sleep 2"]),
                       execute(["bash", "-c", "echo ok && sleep 1"])))
finally:
    loop.close()
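On Python 3.8+, where the proactor loop is already the default on Windows, the manual loop handling can be replaced with asyncio.run; a minimal sketch reusing the execute() coroutine above:

import asyncio

async def main():
    await asyncio.gather(
        execute(["bash", "-c", "echo hello && sleep 2"]),
        execute(["bash", "-c", "echo ok && sleep 1"]),
    )

asyncio.run(main())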
I am working on a Python program which wraps the cmd window. I am using subprocess with PIPE. If, for example, I send "dir" (via stdin), I use communicate() to get the response from cmd, and it does work. The problem is that in a while True loop this doesn't work more than once; it seems like the subprocess closes itself. Help me please.
import subprocess

process = subprocess.Popen('cmd.exe', shell=False, stdin=subprocess.PIPE, stdout=subprocess.PIPE, stderr=None)

x = ""
while x != "x":
    x = raw_input("insert a command \n")
    process.stdin.write(x + "\n")
    o, e = process.communicate()
    print o

process.stdin.close()
The main problem is that trying to read from subprocess.PIPE deadlocks when the program is still running but there is nothing to read from stdout. communicate() gets around this by closing stdin and waiting for the process to exit, which is why it only works once.

A solution is to read stdout in another thread and hand the data over via a Queue, which allows reliable sharing of data between threads by timing out instead of deadlocking. The new thread reads standard output continuously, stopping when there is no more data. Each line is pulled from the queue until the timeout is reached (no more data in the Queue), then the collected lines are printed to the screen. This approach works for non-interactive programs:
import subprocess
import threading
import Queue

def read_stdout(stdout, queue):
    while True:
        queue.put(stdout.readline())  # This hangs when there is no IO

process = subprocess.Popen('cmd.exe', shell=False, stdout=subprocess.PIPE, stdin=subprocess.PIPE)

q = Queue.Queue()
t = threading.Thread(target=read_stdout, args=(process.stdout, q))
t.daemon = True  # t stops when the main thread stops
t.start()

while True:
    x = raw_input("insert a command \n")
    if x == "x":
        break
    process.stdin.write(x + "\n")
    o = []
    try:
        while True:
            o.append(q.get(timeout=.1))
    except Queue.Empty:
        print ''.join(o)