Sending to the stdin of a program in Python 3

I have two files, main.py and child.py.
I am trying to send a string to the stdin of main.py.
This is my incomplete code:
main.py
from subprocess import *
import sys
import time

def main():
    program = Popen(['python.exe', 'child.py', 'start'])
    while True:  # waiting for '1' to be sent to the stdin
        if sys.stdin == '1':
            print('text')

if __name__ == '__main__':
    main()
child.py
import sys

if sys.argv[1] == 'start':
    inp = input('Do you want to send the argument?\n').lower()
    if inp == 'no':
        sys.exit()
    elif inp == 'yes':
        # Somehow send '1' to the stdin of main.py while it is running
        pass
I have no idea how to do this.
I am running Windows 10 with Python 3.5.1.
Thanks.
EDIT:
When I send the argument back to main.py, I cannot re-open the program. os.system re-opens the program, which is not useful in my case.
These programs are a small demo of what I am trying to do. In my actual program I am not able to do that, as the two programs are "communicating" with each other and need to be open at all times.
What I need answered is a way to send an argument to main.py, perhaps using stdin, but without re-opening the program when I send it. Some approaches, like os.system, re-open the program, which is not what I am trying to do. I need main.py open at all times.
Here is my new code, which is not working: a window pops up and then closes.
main.py
from subprocess import Popen, PIPE, STDOUT

x = Popen(['python.exe', 'child.py', 'start'], stdout=PIPE, stdin=PIPE, stderr=STDOUT)
while x.poll() is None:
    if b'Do you want to send the argument?' in x.stdout.read():
        x.stdin.write(b'yes\n')
child.py
import sys
import time

time.sleep(1)
if 1 == 1:
    inp = input('Do you want to send the argument?\n').lower()
    if inp == 'no':
        sys.exit()
    elif inp == 'yes':
        sys.stdout.write('1')
        sys.stdout.flush()
That is my code.

What you need is something along the lines of (in main.py):
from subprocess import Popen, PIPE, STDOUT

x = Popen(['some_child.exe', 'parameter'], stdout=PIPE, stdin=PIPE, stderr=STDOUT)
while x.poll() is None:
    child_output = x.stdout.readline()
    print(child_output)
    if b'Do you want to send the argument?' in child_output:
        x.stdin.write(b'yes\n')
        x.stdin.flush()
x.stdout.close()
x.stdin.close()
You're assuming child.exe (in your mockup demo, python.exe) is communicating with main.py via sys.stdin/stdout; however, those streams are used to communicate with the process that spawned it.
Much like the child's stdout/stdin communicate with whatever spawned that process - in this case the pipes set up by Popen().
Each process spawned with subprocess.Popen(...) is isolated with its own stdout/stdin/stderr; otherwise every subprocess would make a huge mess of your main process's stdout/stdin. This means you'll have to check for output on that particular subprocess's pipe and write to it accordingly, as done in the example above.
One way to look at it is this:
You're starting main.py, and you communicate with it via sys.stdout and sys.stdin. Each input() in main.py writes something to sys.stdout so you can read it.
Exactly the same logic applies to child.exe, where every input() writes something to its sys.stdout - but remember, sys is not shared across processes.
import sys

if sys.argv[1] == 'start':
    inp = input('Do you want to send the argument?\n').lower()
    if inp == 'no':
        sys.exit()
    elif inp == 'yes':
        # Somehow send '1' to the stdin of main.py while it is running
        sys.stdout.write('1')
        sys.stdout.flush()
But a simple print('1') would do the same, because print essentially writes the 1 to sys.stdout for you.
Edit 2018: Don't forget to close your inputs and outputs, as they might leave open file descriptors behind, hogging resources and causing problems later on.
Other conveyors of information
Assuming you have control of the code of child.exe and can modify the communication channel in any way, some other options are:
sockets - use regular sockets to communicate; on *nix the most efficient would be Unix domain sockets (see the sketch after this list).
Some other solutions can be found here: Best way to return a value from a python script
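For instance, here is a minimal sketch of the socket idea on localhost (the port number 50007 and the b'1' message are arbitrary illustrations, not part of the original question):

# main.py - listens for a message from the child over a localhost socket
import socket
import subprocess

server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(('127.0.0.1', 50007))   # arbitrary free port
server.listen(1)

child = subprocess.Popen(['python.exe', 'child.py', 'start'])

conn, _ = server.accept()           # blocks until child.py connects
data = conn.recv(1024)              # e.g. b'1'
print('child sent:', data)
conn.close()
server.close()
child.wait()

child.py would then connect with socket.create_connection(('127.0.0.1', 50007)) and call sendall(b'1') instead of writing to its stdout.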
More cautionary tales!
.readline() assumes there's a \n somewhere in your data, most likely at the end. I switched to .readline() for two reasons: .read() will hang and wait for EOF unless you specify exactly how many bytes to read, if I'm not mistaken. And to be able to read all kinds of output you would need to incorporate select.select() into your code, or some sort of buffer where you call x.stdout.read(1) to read one byte at a time - because if you call .read(1024) and there aren't 1024 bytes in the buffer, the read will hang until there are 1024 characters.
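As an illustration of the one-byte-at-a-time idea (a sketch only, reusing the prompt text from the demo above; error handling omitted):

from subprocess import Popen, PIPE, STDOUT

x = Popen(['python.exe', 'child.py', 'start'], stdout=PIPE, stdin=PIPE, stderr=STDOUT)

buffer = b''
while x.poll() is None:
    byte = x.stdout.read(1)        # blocks until one byte arrives (or EOF), never for a full chunk
    if not byte:
        break                      # EOF: the child closed its stdout
    buffer += byte
    if b'Do you want to send the argument?' in buffer:
        x.stdin.write(b'yes\n')
        x.stdin.flush()
        buffer = b''

x.stdout.close()
x.stdin.close()

Note that select.select() only works on sockets on Windows, so the read(1) approach is the more portable of the two.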
I left a bug in your child.py code on purpose (mine works) - it's trivial, basic Python - in the hope that it's a learning experience in how to debug errors (you mentioned you're not good at it; this is a way to learn).

Related

Python: program hangs when trying to read from subprocess stdout while it is running

I am trying to communicate with a C++ program (let's call it script A) using the Python subprocess module. Script A runs alongside the Python program and is constantly interacted with. My goal is to send script A input commands and capture the output that script A then prints to STDOUT. I'm working on Windows 10.
Here is a snippet describing the logic:
import subprocess

proc = subprocess.Popen([".\\build\\bin.exe"], stdout=subprocess.PIPE, stdin=subprocess.PIPE)
terminate = False
while not terminate:
    command = input("Enter your command here: ")
    if command == "q":
        terminate = True
    else:
        proc.stdin.write(command.encode())  # send input to script A
        output = proc.stdout.readline().decode()  # problematic line, trying to capture output from script A
        print(f"Output is: {output}")
The problem is that, while script A writes output to STDOUT after each command like I expect it to, the Python script hangs when it reaches the line highlighted above. I tried to capture the output using proc.stdout.read(1) with bufsize=0 on the call to Popen, and with for line in iter(proc.stdout.readlines()), and some other ways, but the problem persists.
Would appreciate any help on this, because nothing I've tried is working for me.
Thanks in advance!
You already suggested using bufsize=0, which seems the right solution. However, this only affects buffering on the Python side. If the executable you are calling uses buffered input or output, I don't think there's anything you can do about it from Python (as also mentioned here).
If both programs are under your own control, then you can easily make this work. Here is an example. For simplicity I created two Python scripts that interact with each other in a similar way to what you are doing. Note that this doesn't differ very much from the situation with a C++ application, since in both cases an executable is started as a subprocess.
File pong.py (simple demo application that reads input and responds to it - similar to your "script A"):
while True:
    try:
        line = input()
    except EOFError:
        print('EOF')
        break
    if line == 'ping':
        print('pong')
    elif line == 'PING':
        print('PONG')
    elif line in ['exit', 'EXIT', 'quit', 'QUIT']:
        break
    else:
        print('what?')
print('BYE!')
File main.py (the main program that communicates with pong.py):
import subprocess

class Test:
    def __init__(self):
        self.proc = subprocess.Popen(['python.exe', 'pong.py'], bufsize=0, encoding='ascii',
                                     stdout=subprocess.PIPE, stdin=subprocess.PIPE)

    def talk(self, tx):
        print('TX: ' + tx)
        self.proc.stdin.write(tx + '\n')
        rx = self.proc.stdout.readline().rstrip('\r\n')
        print('RX: ' + rx)

def main():
    test = Test()
    test.talk('ping')
    test.talk('test')
    test.talk('PING')
    test.talk('exit')

if __name__ == '__main__':
    main()
Output of python main.py:
TX: ping
RX: pong
TX: test
RX: what?
TX: PING
RX: PONG
TX: exit
RX: BYE!
Of course there are other solutions as well. For example, you might use a socket to communicate between the two applications. However, this is only applicable if you can modify both applications (e.g. if you are developing both of them), not if the executable you are calling is a third-party application.
First, bufsize=0 is the right direction; however, it is not enough on its own.
In your executable program, you should also set the stdout buffer size to 0, or flush it in time.
In a C/C++ program, you can add
setbuf(stdout, nullptr);
to your source code.
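As a side note (not from the original answer): if the child happens to be a Python script like pong.py above, the equivalent precaution is to flush its stdout explicitly or run the interpreter unbuffered, for example:

# inside the child script: flush each response as it is printed
print('pong', flush=True)

# or start the child with unbuffered output from the parent:
# proc = subprocess.Popen(['python.exe', '-u', 'pong.py'], ...)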

Subprocess will not run any commands after initial args in Popen constructor, even after flushing all pipes [duplicate]

I want to achieve something which is very similar to this.
My actual goal is to run Rasa from within python.
Taken from Rasa's site:
Rasa is a framework for building conversational software: Messenger/Slack bots, Alexa skills, etc. We’ll abbreviate this as a bot in this documentation.
It is basically a chatbot which runs in the command prompt. This is how it works on cmd (screenshot not included):
Now I want to run Rasa from Python so that I can integrate it with my Django-based website, i.e. I want to keep taking inputs from the user, pass them to Rasa, Rasa processes the text, and it gives me an output which I show back to the user.
I have tried this (running it from cmd as of now):
import sys
import subprocess
from threading import Thread
from queue import Queue, Empty  # python 3.x

def enqueue_output(out, queue):
    for line in iter(out.readline, b''):
        queue.put(line)
    out.close()

def getOutput(outQueue):
    outStr = ''
    try:
        while True:  # Adds output from the Queue until it is empty
            outStr += outQueue.get_nowait()
    except Empty:
        return outStr

p = subprocess.Popen('command_to_run_rasa',
                     stdin=subprocess.PIPE,
                     stdout=subprocess.PIPE,
                     stderr=subprocess.PIPE,
                     shell=False,
                     universal_newlines=True,
                     )

outQueue = Queue()

outThread = Thread(target=enqueue_output, args=(p.stdout, outQueue))
outThread.daemon = True
outThread.start()

someInput = ""

while someInput != "stop":
    someInput = input("Input: ")  # to take input from user
    p.stdin.write(someInput)      # passing input to be processed by the rasa command
    p.stdin.flush()
    output = getOutput(outQueue)
    print("Output: " + output + "\n")
    p.stdout.flush()
But it works fine only for the first line of output, not for successive input/output cycles (output screenshot not included here).
How do I get it working for multiple cycles?
I've referred to this, and I think I understand the problem in my code from it, but I don't know how to solve it.
EDIT: I'm using Python 3.6.2 (64-bit) on Windows 10
You need to keep interacting with your subprocess - at the moment, once you pick up the output from your subprocess you're pretty much done, as you close its STDOUT stream.
Here is the most rudimentary way to keep the user input -> process output cycle going:
import subprocess
import sys
import time

if __name__ == "__main__":  # a guard from unintended usage
    input_buffer = sys.stdin  # a buffer to get the user input from
    output_buffer = sys.stdout  # a buffer to write rasa's output to
    proc = subprocess.Popen(["path/to/rasa", "arg1", "arg2", "etc."],  # start the process
                            stdin=subprocess.PIPE,  # pipe its STDIN so we can write to it
                            stdout=output_buffer,  # pipe directly to the output_buffer
                            universal_newlines=True)
    while True:  # run a main loop
        time.sleep(0.5)  # give some time for `rasa` to forward its STDOUT
        print("Input: ", end="", file=output_buffer, flush=True)  # print the input prompt
        print(input_buffer.readline(), file=proc.stdin, flush=True)  # forward the user input
You can replace input_buffer with a buffer coming from your remote user(s) and output_buffer with a buffer that forwards the data to your user(s), and you'll get essentially what you're looking for - the sub-process will get its input directly from the user (input_buffer) and print its output to the user (output_buffer).
If you need to perform other tasks while all this is running in the background, just run everything under the if __name__ == "__main__": guard in a separate thread, and I'd suggest adding a try..except block to pick up KeyboardInterrupt and exit gracefully, for example as sketched below.
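A minimal sketch of that wrapper (the interact() placeholder stands in for the loop above; it is not part of the original answer):

import threading
import time

def interact():
    # placeholder for the input/output loop from the snippet above
    while True:
        time.sleep(0.5)

worker = threading.Thread(target=interact, daemon=True)
worker.start()

try:
    while True:              # the main thread stays free for other work
        time.sleep(1.0)
except KeyboardInterrupt:
    print("Shutting down...")   # terminate the subprocess / clean up here before exiting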
But... soon enough you'll notice that it doesn't exactly work properly all the time - if rasa takes longer than the half-second wait to print its STDOUT and enter the wait-for-STDIN stage, the outputs will start to mix. This problem is considerably more complex than you might expect. The main issue is that STDOUT and STDIN (and STDERR) are separate buffers, and you cannot know when a subprocess is actually expecting something on its STDIN. This means that without a clear indication from the subprocess (like the \r\n[path]> prompt the Windows CMD writes to its STDOUT, for example) you can only send data to the subprocess's STDIN and hope it will be picked up.
Based on your screenshot, it doesn't really give a distinguishable STDIN request prompt, because the first prompt is ... :\n and then it waits for STDIN, but once the command is sent it lists options without an indication of the end of its STDOUT stream (technically making the prompt just ...\n, but that would match any preceding line as well). Maybe you can be clever and read the STDOUT line by line, then on each new line measure how much time has passed since the sub-process wrote to it, and once a threshold of inactivity is reached assume that rasa expects input and prompt the user for it. Something like:
import subprocess
import sys
import threading

# we'll be using a separate thread and a timed event to request the user input
def timed_user_input(timer, wait, buffer_in, buffer_out, buffer_target):
    while True:  # user input loop
        timer.wait(wait)  # wait for the specified time...
        if not timer.is_set():  # if the timer was not stopped/restarted...
            print("Input: ", end="", file=buffer_out, flush=True)  # print the input prompt
            print(buffer_in.readline(), file=buffer_target, flush=True)  # forward the input
        timer.clear()  # reset the 'timer' event

if __name__ == "__main__":  # a guard from unintended usage
    input_buffer = sys.stdin  # a buffer to get the user input from
    output_buffer = sys.stdout  # a buffer to write rasa's output to
    proc = subprocess.Popen(["path/to/rasa", "arg1", "arg2", "etc."],  # start the process
                            stdin=subprocess.PIPE,  # pipe its STDIN so we can write to it
                            stdout=subprocess.PIPE,  # pipe its STDOUT so we can process it
                            universal_newlines=True)
    # let's build a timer which will fire off if we don't reset it
    timer = threading.Event()  # a simple Event timer
    input_thread = threading.Thread(target=timed_user_input,
                                    args=(timer,  # pass the timer
                                          1.0,  # prompt after one second
                                          input_buffer, output_buffer, proc.stdin))
    input_thread.daemon = True  # no need to keep the input thread blocking...
    input_thread.start()  # start the timer thread
    # now we'll read the `rasa` STDOUT line by line, forward it to output_buffer and reset
    # the timer each time a new line is encountered
    for line in proc.stdout:
        output_buffer.write(line)  # forward the STDOUT line
        output_buffer.flush()  # flush the output buffer
        timer.set()  # reset the timer
You can use a similar technique to check for more complex 'expected user input' patterns. There is a whole module called pexpect designed to deal with this type of task, and I wholeheartedly recommend it if you're willing to give up some flexibility.
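A rough, hedged sketch of the pexpect approach using its PopenSpawn class (which, unlike pexpect.spawn, also works on Windows); it assumes a child that answers 'ping' with 'pong', like the pong.py demo earlier on this page, run with -u so its output is unbuffered:

import pexpect                                  # pip install pexpect
from pexpect.popen_spawn import PopenSpawn      # pty-free spawn that also works on Windows

child = PopenSpawn('python -u pong.py', encoding='utf-8', timeout=10)

child.sendline('ping')          # send a line to the subprocess
child.expect('pong')            # block until the expected output shows up
print('matched:', child.after)  # the text that matched the pattern

child.sendline('exit')
child.expect(pexpect.EOF)       # wait for the child to finish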
Now... all this being said, you are aware that Rasa is built in Python, installs as a Python module and has a Python API, right? Since you're already using Python, why would you call it as a subprocess and deal with all these STDOUT/STDIN shenanigans when you can run it directly from your Python code? Just import it and interact with it directly; they even have a very simple example that does exactly what you're trying to do: Rasa Core with minimal Python.

Controlling a python script from another script

I am trying to learn how to write a script control.py that runs another script test.py in a loop a certain number of times, reads its output in each run, and halts it if some predefined output is printed (e.g. the text 'stop now'), after which the loop continues with its next iteration (once test.py has finished, either on its own or by force). So, something along the lines of:
for i in range(n):
    os.system('test.py someargument')
    if output == 'stop now':  # stop the current test.py process and continue with next iteration
        # output here is supposed to contain what test.py prints
        ...
The problem with the above is that it does not check the output of test.py while it is running; instead, it waits until the test.py process has finished on its own, right?
Basically I am trying to learn how I can use a Python script to control another one as it is running (e.g. having access to what it prints, and so on).
Finally, is it possible to run test.py in a new terminal (i.e. not in control.py's terminal) and still achieve the above goals?
An attempt:
test.py is this:
from itertools import permutations
import random as random

perms = [''.join(p) for p in permutations('stop')]

for i in range(1000000):
    rand_ind = random.randrange(0, len(perms))
    print perms[rand_ind]
And control.py is this: (following Marc's suggestion)
import subprocess

command = ["python", "test.py"]
n = 10
for i in range(n):
    p = subprocess.Popen(command, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
    while True:
        output = p.stdout.readline().strip()
        print output
        #if output == '' and p.poll() is not None:
        #    break
        if output == 'stop':
            print 'success'
            p.kill()
            break
    #Do whatever you want
    #rc = p.poll() #Exit Code
You can use the subprocess module, or also os.popen:
os.popen(command[, mode[, bufsize]])
Open a pipe to or from command. The return value is an open file object connected to the pipe, which can be read or written depending on whether mode is 'r' (default) or 'w'.
With subprocess I would suggest
subprocess.call(['python.exe', command])
or subprocess.Popen, which is similar to os.popen.
With popen you can read the connected object/file and check whether "stop now" is there.
os.system is not deprecated and you can use it as well (but you won't get an object back from it); you can only check its return code at the end of execution.
With subprocess.call you can run it in a new terminal, or if you want to call ONLY test.py multiple times, you can put your script in a def main() and run main as often as you want until the "stop now" is generated.
Hope this solves your query :-) otherwise comment again.
Looking at what you wrote above, you can also redirect the output to a file directly from the OS call - os.system('test.py *args >> /tmp/mickey.txt') - and then check the file at each round.
As said, popen gives you a file-like object that you can access.
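A minimal sketch of that redirect-and-check idea (the file name and argument are placeholders; note this only checks after each run finishes, not while test.py is running):

import os

for i in range(10):
    # run test.py and append everything it prints to a log file
    os.system('python test.py someargument >> run_output.txt')
    with open('run_output.txt') as f:
        if 'stop now' in f.read():
            break   # predefined output seen, stop looping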
What you are hinting at in your comment to Marc Cabos' answer is threading.
There are several ways Python can use the functionality of other files. If the content of test.py can be encapsulated in a function or class, then you can import the relevant parts into your program, giving you greater access to the inner workings of that code.
As described in other answers, you can use the stdout of a script by running it in a subprocess. This can give you the separate terminal output you require.
However, if you want to run test.py concurrently and access its variables as they change, then you need to consider threading; a rough sketch follows.
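For illustration only - this assumes test.py were refactored to expose a run() function and a module-level latest_output variable that run() keeps updating (hypothetical names, not from the original post):

import threading
import time

import test  # hypothetical: test.py defines run() and updates test.latest_output

worker = threading.Thread(target=test.run, args=('someargument',), daemon=True)
worker.start()

while worker.is_alive():
    if getattr(test, 'latest_output', None) == 'stop now':
        # react here; actually stopping the thread needs a shared flag or Event
        break
    time.sleep(0.1)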
Yes, you can use Python to control another program via stdin/stdout, but when consuming another process's output there is often a problem of buffering; in other words, the other process doesn't really output anything until it's done.
There are even cases in which the output is buffered or not depending on whether the program was started from a terminal or not.
If you are the author of both programs, then it is probably better to use another inter-process channel where the flushing is explicitly controlled by the code, like sockets.
You can use the "subprocess" library for that.
import subprocess

command = ["python", "test.py", "someargument"]
for i in range(n):
    p = subprocess.Popen(command, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
    while True:
        output = p.stdout.readline()
        if output == '' and p.poll() is not None:
            break
        if output == 'stop now':
            #Do whatever you want
            rc = p.poll()  #Exit Code

How to implement Streaming Input with Subprocesses in PYTHON?

Since input() and raw_input() stop the program from doing anything else while they wait, I want to use a subprocess to run this program...
while True: print raw_input()
and get its output.
This is what I have as my reading program:
import subprocess
import sys

process = subprocess.Popen('python subinput.py', stdout=subprocess.PIPE, stderr=subprocess.PIPE)
while True:
    output = process.stdout.read(12)
    if output == '' and process.poll() != None:
        break
    if output != '':
        sys.stdout.write(output)
        sys.stdout.flush()
When I run this, the subprocess exits almost as fast as it started. How can I fix this?
I'm afraid it won't work this way.
You assume that the subprocess will attach to your console (your particular case of stdin). This does not work; the module only has two options for specifying that: PIPE and STDOUT.
When nothing is specified, the subprocess won't be able to use the corresponding stream - its output will go nowhere, or it will receive no input. The raw_input() ends because of EOF.
The way to go is to have your input in the "main" program, and the work done in a subprocess.
EDIT:
Here's an example in multiprocessing
from multiprocessing import Process, Pipe
import time

def child(conn):
    while True:
        print "Processing..."
        time.sleep(1)
        if conn.poll(0):
            output = conn.recv()
            print output
        else:
            print "I got nothing this time"

def parent():
    parent_conn, child_conn = Pipe()
    p = Process(target=child, args=(child_conn,))
    p.start()
    while True:
        data = raw_input()
        parent_conn.send(data)
    # p.join() - you have to find some way to stop all this...
    # like a specific message to quit etc.

if __name__ == '__main__':
    parent()
You of course need to make it more robust by finding a way to stop this cooperation - in my example both processes are in the same file, but you may organize it differently.
This example works on Linux; you may have some problems with pipes on Windows, but it should be altogether solvable.
The "Processing" is the part where you want to do something else, not just wait for the data from the parent.
I think the problem is that subprocesses are not directly hooked up to stdout and stdin, and therefore cannot receive keyboard input. Presumably raw_input() is throwing an exception.
If this is a practical issue and not an experiment, I recommend you use a library such as curses or pygame to handle your input. If you're experimenting and want to do it yourself, then I suppose you'll have to look at threads instead of subprocesses, though this is a fairly complex thing to try to do, so you're certain to run into other issues.
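For reference, a minimal sketch of non-blocking keyboard handling with curses (on Windows this needs the third-party windows-curses package; the 'q'-to-quit behaviour is just an illustration):

import curses

def main(stdscr):
    stdscr.nodelay(True)                    # make getch() non-blocking
    stdscr.addstr(0, 0, "Press 'q' to quit")
    while True:
        ch = stdscr.getch()                 # returns -1 immediately if no key was pressed
        if ch == ord('q'):
            break
        # ... keep doing other work here instead of blocking on input ...
        curses.napms(50)                    # small sleep so the loop doesn't spin at 100% CPU

curses.wrapper(main)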
Well, try a different architecture. You can use ZeroMQ.
The producer produces all the items (here, the output that would otherwise go to stdout) and broadcasts them via zmq.
The consumer listens on the port the producer broadcasts on and processes the messages accordingly.
Here is an example: http://code.saghul.net/implementing-a-pubsub-based-application-with
Note:
Use gevent or multiprocessing to spawn these processes.
You will have a master program which takes care of spawning the producer and the consumer. A rough sketch of the pub/sub pair follows.
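A bare-bones sketch of such a pub/sub pair with pyzmq (pip install pyzmq; the port 5556 and the message text are arbitrary choices):

# producer.py - publishes lines instead of printing them to stdout
import time
import zmq

context = zmq.Context()
pub = context.socket(zmq.PUB)
pub.bind("tcp://127.0.0.1:5556")
time.sleep(0.5)                      # give subscribers a moment to connect (PUB/SUB "slow joiner")

for i in range(10):
    pub.send_string("item %d" % i)
    time.sleep(0.1)

# consumer.py - run as a separate process; receives what the producer publishes
# import zmq
# context = zmq.Context()
# sub = context.socket(zmq.SUB)
# sub.connect("tcp://127.0.0.1:5556")
# sub.setsockopt_string(zmq.SUBSCRIBE, "")   # subscribe to everything
# while True:
#     print(sub.recv_string())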
