So I have this small program (using Linux):
import getch

while True:
    print("Hello World!")
    key = getch.getch()
    if key == 'q':
        break
So all it does is wait for the user to hit a key, and then displays "Hello World!" to the console. However, is there a way so that I can continuously display "Hello World!" to the console, and the only way to get it to end is if the user presses the "q" key?
This question is similar to this one, but it's in C++.
My first thought was to look into threading; however, I tried every threading example I could find and none of them worked. Then I came across the Global Interpreter Lock (GIL), which supposedly prevents "multiple native threads from executing Python bytecodes at once."
Then I tried to use multiprocessing, but it still didn't work for me. This is how far I got using it:
import multiprocessing
import getch

def test1():
    print("Hello World!")

def test2():
    key = getch.getch()
    if key == 'q':
        exit()

while True:
    p1 = multiprocessing.Process(target=test1, args=())
    p2 = multiprocessing.Process(target=test2, args=())
    p1.start()
    p2.start()
    p1.join()
    p2.join()
Am I missing something here? Or is there another way in which I can do something while also waiting for getch()? Or do I have to write this in another language that supports multithreading like C++?
Thanks
I was not able to install getch, probably because I am on Windows at the moment, but you can implement what you want (is there a way so that I can continuously display "Hello World!" to the console, and the only way to get it to end is if the user presses the "q" key?) the following way:
import time
from threading import Thread

def another_thread():
    while True:
        time.sleep(2)
        print("Working...\n")

def main_thread():
    while True:
        x = input("Press a key: \n")
        if x == "q":
            break

if __name__ == '__main__':
    # create another Thread object
    # daemon means that it will stop when the main thread stops
    th = Thread(target=another_thread, daemon=True)
    th.start()  # start the side thread
    main_thread()  # start the main logic
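On Linux the same pattern should also work with a single-keypress read instead of input(). Here is a minimal sketch that stays in the standard library (termios/tty, Unix only) instead of the getch package, since I could not test that package here:

import sys
import termios
import time
import tty
from threading import Thread

def printer():
    # daemon thread: keeps printing until the main thread exits
    while True:
        print("Hello World!")
        time.sleep(1)

def getch():
    # read one character without waiting for Enter and without echoing it
    fd = sys.stdin.fileno()
    old_settings = termios.tcgetattr(fd)
    try:
        tty.setcbreak(fd)
        return sys.stdin.read(1)
    finally:
        termios.tcsetattr(fd, termios.TCSADRAIN, old_settings)

if __name__ == '__main__':
    th = Thread(target=printer, daemon=True)
    th.start()
    while getch() != 'q':
        pass

The blocking read releases the GIL while it waits, so the GIL quoted in the question does not stop the printing thread from running.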
I can't get this code to accept input while another block of code is running. I want to know if there are any workarounds; my code is as follows.
import multiprocessing

def test1():
    input('hello')

def test2():
    a = True
    while a == True:
        b = 5

if __name__ == "__main__":
    p1 = multiprocessing.Process(target=test1)
    p2 = multiprocessing.Process(target=test2)
    p1.start()
    p2.start()
    p1.join()
    p2.join()
When the code is run I get an EOF error which apparently happens when the input function is interrupted.
I would have the main process create a daemon thread responsible for doing the input, in conjunction with the greatly under-utilized full-duplex Pipe, which provides two two-way Connection instances. For simplicity, the following demo creates a single Process instance that loops issuing input requests and echoing the responses until the user enters 'quit':
import multiprocessing
import threading

def test1(conn):
    while True:
        conn.send('Please enter a value: ')
        s = conn.recv()
        if s == 'quit':
            break
        print(f'You entered: "{s}"')

def inputter(conn):
    while True:
        # The content of the request is the prompt to be used:
        prompt = conn.recv()
        conn.send(input(prompt))

if __name__ == "__main__":
    conn1, conn2 = multiprocessing.Pipe(duplex=True)
    t = threading.Thread(target=inputter, args=(conn1,), daemon=True)
    p = multiprocessing.Process(target=test1, args=(conn2,))
    t.start()
    p.start()
    p.join()
That's not all of your code, because it doesn't show the multiprocessing. However, the issue is that only the main process can interact with the console. The other processes do not have a stdin. You can use a Queue to communicate with the main process if you need to, but in general you want the secondary processes to be pretty much standalone.
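A minimal sketch of that Queue idea, with the main process owning input() and forwarding each line to a worker process (the names and the 'quit' convention here are just for illustration):

import multiprocessing

def worker(q):
    # runs in a child process: it never touches stdin, it only reads the queue
    while True:
        line = q.get()
        if line == 'quit':
            break
        print(f'worker got: {line!r}')

if __name__ == "__main__":
    q = multiprocessing.Queue()
    p = multiprocessing.Process(target=worker, args=(q,))
    p.start()
    # only the main process reads from the console
    while True:
        line = input('> ')
        q.put(line)
        if line == 'quit':
            break
    p.join()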
I am trying to make a console for my Python applications, but I ran into a problem:
when printing something using the print() function, the text in the input field is also included. This is purely visual, because the program still works.
I tried searching online, but I did not even know what to search for and had no luck.
This is my code. It prints "foo" until the user types "exit":
import multiprocessing as mp
import os
import time

def f(q):
    while True:
        print(q)
        time.sleep(1)

if __name__ == "__main__":
    p = mp.Process(target=f, args=("foo",))
    p.start()
    while True:
        comm = str(input())
        if comm == "exit":
            p.terminate()
            break
When the program is running, the user can still type, but when the program prints something, it also takes whatever is in the input field at the time:
foo
foo
foo
foo
efoo
xfoo
itfoo
When pressing Enter, the program still registers the input correctly and exits.
Here is a modification of your code that only prints foo after you have finished typing your input (i.e., once you hit Enter):
import multiprocessing as mp
from multiprocessing import Queue

def f(q, queue):
    while True:
        queue.get()
        print(q)

if __name__ == "__main__":
    queue = Queue()
    p = mp.Process(target=f, args=("foo", queue))
    p.start()
    while True:
        comm = str(input())
        queue.put(None)
        if comm == "exit":
            p.terminate()
            break
If terminating the process is all you want your user to be able to do, then you can instruct them to enter Ctrl+C if they wish to stop the operation and then catch the KeyboardInterrupt exception that comes along with it.
import multiprocessing as mp
import os
import time

def f(q):
    while True:
        print(q)
        time.sleep(1)

if __name__ == "__main__":
    p = mp.Process(target=f, args=("foo",))
    print("Process starting. Use Ctrl+c anytime to stop it!")
    p.start()
    try:
        while True:
            input()  # trash command
    except KeyboardInterrupt:
        print("Terminating process...")
        p.terminate()
        print("Process terminated...")
If you want to support more complicated commands, then a GUI would be your best approach (as mentioned by John).
I really need an asyncio-compatible getkey() so I can write:
async def stuff():
    await getkey()
So when the coroutine stuff hits the await, the event loop suspends the task and continues with another one.
I am new to coding, but surely such a thing exists somewhere?
If not, is it possible to build such a coroutine?
The getkey() could return the pressed key value in any form.
But it should have cbreak and noecho on (don't wait for Enter and don't echo the pressed key).
(Clarification, no real need to continue reading.)
Please help me^^ I know this way of doing it seems unusual. Curses running in its own thread would be the right way to go, but I can only use curses for displaying. I am also really new to coding and have no time to look into threading; I just need my 100 lines to work fluently, really fast, and only once!
If you don't want to actually wait for the keypress, one way is to use a thread to detect the keypress:
from threading import Thread
import curses

key_pressed = False

def detect_key_press():
    global key_pressed
    stdscr = curses.initscr()
    key = stdscr.getch()
    key_pressed = True

def main():
    thread = Thread(target=detect_key_press)
    thread.start()
    while not key_pressed:
        print('Hello world\r')
    curses.endwin()

main()
It's not exactly nice to use global variables, but it's a quick way to do it.
Here's a solution using keyboard instead of curses:
import keyboard
import time
from threading import Thread

key_pressed = False

def detect_key_press():
    global key_pressed
    while not keyboard.is_pressed('q'):
        pass
    key_pressed = True

def main():
    thread = Thread(target=detect_key_press)
    thread.start()
    while not key_pressed:
        print("Waiting for keypress...")
        time.sleep(1)

main()
This starts a thread inside main() that waits for the keypress.
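Since the question specifically asked for an awaitable getkey() with cbreak and noecho, here is a sketch of one possible pure-asyncio approach on Unix (assuming stdin is a terminal); it uses loop.add_reader with the termios/tty modules instead of a thread, curses, or keyboard:

import asyncio
import sys
import termios
import tty

async def getkey():
    # await a single keypress: cbreak (no Enter needed) and no echo
    loop = asyncio.get_running_loop()
    fd = sys.stdin.fileno()
    old_settings = termios.tcgetattr(fd)
    fut = loop.create_future()

    def on_readable():
        # called by the event loop when a byte is available on stdin
        if not fut.done():
            fut.set_result(sys.stdin.read(1))

    tty.setcbreak(fd)  # turns off line buffering and echo
    loop.add_reader(fd, on_readable)
    try:
        return await fut
    finally:
        loop.remove_reader(fd)
        termios.tcsetattr(fd, termios.TCSADRAIN, old_settings)

async def stuff():
    print('press q to quit')
    while await getkey() != 'q':
        print('still running...')

asyncio.run(stuff())

While getkey() is awaiting, the event loop is free to run any other task, which is what the question asked for. Multi-byte keys (arrows, function keys) would need extra handling; this sketch only covers single-character keys.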
I have yet another question about Python multiprocessing.
I have a module that creates a Process and just runs in a while True loop.
This module is meant to be enabled/disabled from another Python module.
That other module will import the first one once and is also run as a process.
How would I better implement this?
So, for reference:
# foo.py
from multiprocessing import Process

def foo():
    while True:
        if enabled:
            # do something
            pass

p = Process(target=foo)
p.start()
and imagine the second module to be something like this:
# bar.py
from multiprocessing import Process
import foo, time

def bar():
    while True:
        foo.enable()
        time.sleep(10)
        foo.disable()

Process(target=bar).start()
Constantly running a process that checks a condition inside a loop seems like a waste, but I would gladly accept a solution that just lets me set the enabled value from outside.
Ideally I would prefer to be able to terminate and restart the process, again from outside of this module.
From my understanding, I would use a Queue to pass commands to the Process. If it is indeed just that, can someone show me how to set it up so that I can add something to the queue from a different module?
Can this even be easily done with Python, or is it time to abandon hope and switch to something like C or Java?
I proposed in the comments two different approaches:
using a shared variable from multiprocessing.Value
pause / resume the process with signals
Control by sharing a variable
from multiprocessing import Process, Value
import time

def target_process_1(run_statement):
    while True:
        if run_statement.value:
            print("I'm running !")
        time.sleep(1)

def target_process_2(run_statement):
    time.sleep(3)
    print("Stopping")
    run_statement.value = False
    time.sleep(3)
    print("Resuming")
    run_statement.value = True

if __name__ == "__main__":
    run_statement = Value("i", 1)

    process_1 = Process(target=target_process_1, args=(run_statement,))
    process_2 = Process(target=target_process_2, args=(run_statement,))

    process_1.start()
    process_2.start()

    time.sleep(8)
    process_1.terminate()
    process_2.terminate()
Control by sending a signal
from multiprocessing import Process
import time
import os, signal

def target_process_1():
    while True:
        print("Running !")
        time.sleep(1)

def target_process_2(target_pid):
    time.sleep(3)
    os.kill(target_pid, signal.SIGSTOP)
    time.sleep(3)
    os.kill(target_pid, signal.SIGCONT)

if __name__ == "__main__":
    process_1 = Process(target=target_process_1)
    process_1.start()

    process_2 = Process(target=target_process_2, args=(process_1.pid,))
    process_2.start()

    time.sleep(8)
    process_1.terminate()
    process_2.terminate()
Side note: if possible do not run a while True.
EDIT: if you want to manage your process from two different files, and supposing you want to control it by sharing a variable, this is one way to do it.
# file foo.py
from multiprocessing import Value, Process
import time

__all__ = ['start', 'stop', 'enable', 'disable']

_statement = None
_process = None

def _target(run_statement):
    """Target of foo's process."""
    while True:
        if run_statement.value:
            print("I'm running !")
        time.sleep(1)

def start():
    global _process, _statement
    _statement = Value("i", 1)
    _process = Process(target=_target, args=(_statement,))
    _process.start()

def stop():
    global _process, _statement
    _process.terminate()
    _statement, _process = None, None

def enable():
    _statement.value = True

def disable():
    _statement.value = False
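For reference, a bar.py using that foo.py could then look something like this (the sleep lengths are arbitrary):

# file bar.py
import time
import foo

if __name__ == "__main__":
    foo.start()    # spawn the worker process
    time.sleep(10)
    foo.disable()  # the process keeps running but skips its work
    time.sleep(10)
    foo.enable()   # work resumes
    time.sleep(10)
    foo.stop()     # terminate the worker process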
I want to create a Python script that prints out messages from one thread while still waiting for input on another. Is this possible? And if so, how?
System: Windows 7
Language: Python 2.7
I have tried this (modified from a different question):
import threading
import time

def message_loop():
    while True:
        time.sleep(1)
        print "Hello World"

thread = threading.Thread(target=message_loop)
thread.start()

while True:
    input = raw_input("Prompt> ")
But what happens is: the program waits until I have finished inputting before it outputs Hello World.
It's absolutely possible. If you have a function that prints output (let's call it print_output) you can start it up in a different thread using the threading module:
>>> import threading
>>> my_thread = threading.Thread(target=print_output)
>>> my_thread.start()
You should now start getting your output. You can then run the input bit on the main thread. You could also run it in a new thread, but there are some advantages to running input in the main thread.
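For completeness, here is a minimal self-contained version of that layout, with a hypothetical print_output() that just prints once a second (Python 2.7 syntax and raw_input to match the question):

import threading
import time

def print_output():
    # background thread: prints once a second
    while True:
        time.sleep(1)
        print "Hello World"

my_thread = threading.Thread(target=print_output)
my_thread.daemon = True  # let the program exit when the main thread ends
my_thread.start()

while True:
    text = raw_input("Prompt> ")
    if text == "quit":  # example exit condition
        break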
This works for me.
The code prints messages until you input 'q'.
import threading
import time

def run_thread():
    global stop_threads
    while True:
        print('thread running')
        time.sleep(2)
        if stop_threads:
            break

stop_threads = False
t1 = threading.Thread(target=run_thread)
t1.start()
time.sleep(0.5)

q = ''
while q != 'q':
    q = input()

stop_threads = True
t1.join()
print('finish')
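A small variant of the same idea, using a threading.Event instead of a module-level flag; event.wait(timeout) doubles as the sleep, so the thread stops as soon as the event is set instead of finishing its two-second nap first:

import threading

def run_thread(stop_event):
    # print until the event is set, waiting at most 2 seconds between messages
    while not stop_event.wait(timeout=2):
        print('thread running')

stop_event = threading.Event()
t1 = threading.Thread(target=run_thread, args=(stop_event,))
t1.start()

while input() != 'q':
    pass
stop_event.set()
t1.join()
print('finish')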