I need to launch a Python console and control its output. I am using subprocess.Popen() to create a new instance.
I saved the following code in a Python script and ran it from the Windows command prompt. When I run the script, it launches the Python instance in the current Windows command prompt rather than in a separate console.
p = subprocess.Popen(["C:\Python31\python.exe"], shell=False,
                     # stdin=subprocess.PIPE,
                     stdout=subprocess.PIPE)
out, _ = p.communicate()
print(out.decode())
In Windows you can spawn subprocesses in new console sessions by using the CREATE_NEW_CONSOLE creation flag:
from subprocess import Popen, CREATE_NEW_CONSOLE, PIPE
p = Popen(["C:\Python31\python.exe"], creationflags=CREATE_NEW_CONSOLE)
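A small usage sketch of the same flag, with an assumed child-script path: since nothing is redirected, the child's input and output go to its new console window, and wait() simply blocks until that console session ends.
from subprocess import Popen, CREATE_NEW_CONSOLE

# Hypothetical child script; adjust the paths to your installation
p = Popen([r"C:\Python31\python.exe", r"C:\path\to\child.py"],
          creationflags=CREATE_NEW_CONSOLE)
p.wait()  # block until the child's console window closes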
If you are on Windows you can use the win32console module to open a second console for your thread or subprocess output. This is the simplest and easiest way that works if you are on Windows.
Here is a sample code:
import win32console
import multiprocessing

def subprocess(queue):
    win32console.FreeConsole()   # Frees the subprocess from using the main console
    win32console.AllocConsole()  # Creates a new console; all input and output of the subprocess go to it
    while True:
        print(queue.get())
        # prints any output produced by the main script and passed to the subprocess via the queue

if __name__ == "__main__":
    queue = multiprocessing.Queue()
    multiprocessing.Process(target=subprocess, args=[queue]).start()
    while True:
        print("Hello World in main console")
        queue.put("Hello work in sub process console")
        # sends the string above to the subprocess, which prints it in its own console
        # ...and whatever else you want to do in your main process
You can also do this with threading; use the queue module if you want the queue functionality, since the threading module doesn't have its own queue.
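As a rough sketch of that threading variant (my own illustration, not from the answer above): note that threads share the process's console, so win32console.AllocConsole() does not give each thread its own window; this only shows the queue plumbing.
import threading
import queue

def worker(q):
    while True:
        item = q.get()
        if item is None:  # sentinel value tells the worker to stop
            break
        print(item)

q = queue.Queue()
t = threading.Thread(target=worker, args=(q,), daemon=True)
t.start()
q.put("Hello from the main thread")
q.put(None)  # shut the worker down
t.join()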
Here is the win32console module documentation
I'm trying to create a Python script that can do two things:
Run normally if no args are present.
If install is passed as an argument, install itself to a specific directory (/tmp), and run the installed version in the background, detached from the current process.
I've tried multiple combinations of subprocess.run, subprocess.Popen with the shell, close_fds and other options (even tried nohup), but since I do not have a good understanding of how process spawning works, I do not seem to be using the correct one.
What I'm looking for when I use the install argument is to see "Installing..." and that's it: the new process should be running in the background, detached, with my shell ready. But what I see is the child process still attached and my terminal busy printing "Running..." right after the installing message.
How should this be done?
import subprocess
import sys
import time
import os

def installAndRun():
    print('Installing...')
    scriptPath = os.path.realpath(__file__)
    scriptName = (__file__.split('/')[-1] if '/' in __file__ else __file__)
    # Copy script to new location (installation)
    subprocess.run(['cp', scriptPath, '/tmp'])
    # Now run the installed script
    subprocess.run(['python3', f'/tmp/{scriptName}'])

def run():
    for _ in range(5):
        print('Running...')
        time.sleep(1)

if __name__ == "__main__":
    if 'install' in sys.argv:
        installAndRun()
    else:
        run()
Edit: I've just realised that the process does not end when called like that.
Do not use "cp" to copy the script, but shutil.copy() instead.
Instead of "python3", use sys.executable to start the script with the same interpreter the original is started with.
subprocess.Popen() without anything else will work as long as the child process isn't writing anything to stdout and stderr and isn't requesting any input. In general, the child cannot make progress on its pipes unless communicate() is called or the PIPEs are being read from/written to. You have to use os.fork() to detach from the parent (research how daemons are made), then use:
p = subprocess.Popen([sys.executable, new_path], stdout=subprocess.PIPE, stderr=subprocess.PIPE, stdin=subprocess.PIPE)
p.stdin.close() # If you do not need it
p.communicate()
Or do not use subprocess.PIPE for stdin, stderr and stdout, and make sure that the terminal is bound to the child when forking. After os.fork() you can do whatever you want with the parent and whatever you want with the child. You can bind the child to whatever terminal you want or start a new shell, e.g.:
pid = os.fork()
if pid == 0:  # Code in this if block is the child
    # <code to change the terminal and appropriately point sys.stdout, sys.stderr and sys.stdin>
    subprocess.Popen([os.getenv("SHELL"), "-c", sys.executable, new_path]).communicate()
Note that you can point the PIPEs to file-like objects using the stdin, stderr and stdout arguments if you need to.
To detach on Windows you can use os.startfile() or use subprocess.Popen(...).communicate() in a thread. If you then sys.exit() the parent, the child should stay open. (That is how it worked on Windows XP with Python 2.x; I didn't try it with Python 3 or newer Windows versions.)
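For the POSIX side, a minimal sketch of the fork-based detach described above (my own illustration; new_path stands for the installed script and error handling is omitted):
import os
import subprocess
import sys

new_path = '/tmp/myscript.py'  # hypothetical installed location

pid = os.fork()
if pid == 0:
    # Child: start a new session so the grandchild survives the parent's exit,
    # launch the installed script with its output discarded, then exit quietly.
    os.setsid()
    subprocess.Popen([sys.executable, new_path],
                     stdin=subprocess.DEVNULL,
                     stdout=subprocess.DEVNULL,
                     stderr=subprocess.DEVNULL)
    os._exit(0)
# Parent: continue normally (print "Installing..." and return)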
It seems like the correct combination was to use Popen + subprocess.PIPE for both stdout and stderr. The code now looks like this:
import subprocess
import sys
import time
import os

def installAndRun(scriptPath, scriptName):
    print('Installing...')
    # Copy script to new location (installation)
    subprocess.run(['cp', scriptPath, '/tmp'])
    # Now run the installed script
    subprocess.Popen(['python3', f'/tmp/{scriptName}'],
                     stdout=subprocess.PIPE, stderr=subprocess.PIPE)

def run(scriptPath):
    for _ in range(5):
        print(f'Running... {scriptPath}')
        time.sleep(1)

if __name__ == "__main__":
    scriptPath = os.path.realpath(__file__)
    scriptName = (__file__.split('/')[-1] if '/' in __file__ else __file__)
    if 'install' in sys.argv:
        installAndRun(scriptPath, scriptName)
    else:
        run(scriptPath)
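One caveat worth adding (my note, not part of the original post): if the installed script writes more output than the pipe buffer holds, an unread PIPE will eventually block it, so for fire-and-forget use discarding the output is the safer default:
import subprocess
import sys

scriptName = 'myscript.py'  # hypothetical; in the code above it is derived from __file__
subprocess.Popen([sys.executable, f'/tmp/{scriptName}'],
                 stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL)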
I need to call a bash script from my Python script.
import subprocess
subprocess.call("path/to/script.sh")
This is working, but the script starts another program and therefore won't exit, so my main loop is blocked by the subprocess.
Is there a way to call the script as a thread rather than a subprocess in Python?
You're better off using Popen
Execute a child program in a new process. On Unix, the class uses
os.execvp()-like behavior to execute the child program. On Windows,
the class uses the Windows CreateProcess() function. The arguments to
Popen are as follows
But if you insist on using threads this might also work:
import subprocess
import threading

def basher():
    subprocess.call("echo hello > /tmp/test.txt", shell=True)

t = threading.Thread(target=basher)
t.start()
print('started')
# doing something else
t.join()
print('finished')
Don't use call; its only purpose is to block until the command exits. Use Popen directly:
import subprocess
p = subprocess.Popen("path/to/script.sh")
Now script.sh runs in the forked process while your Python script continues. Call p.poll() to check whether the script has finished, or p.wait() when you are ready to block until it completes.
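A small usage sketch of that non-blocking pattern (poll() returns None while the script is still running):
import subprocess
import time

p = subprocess.Popen("path/to/script.sh")
while p.poll() is None:   # still running
    time.sleep(1)         # do other useful work here instead of sleeping
print("script.sh exited with", p.returncode)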
Since you specifically asked for a separate thread, I recommend using the multiprocessing module (documentation):
from multiprocessing import Process
import subprocess

def myTask():
    subprocess.call("path/to/script.sh")

p = Process(target=myTask)  # creates a new process which will run the function myTask
p.start()                   # starts the process
# the script is now running in a separate process
# you can now continue doing what you want
If at some point in your Python script (e.g. before exiting) you want to make sure that the bash script has finished running, you can call p.join(), which blocks the Python script until the bash script has terminated.
A basic example of the multiprocessing Process class runs when executed from a file, but not from IDLE. Why is that, and can it be done?
from multiprocessing import Process

def f(name):
    print('hello', name)

p = Process(target=f, args=('bob',))
p.start()
p.join()
Yes. The following works, in that function f is run in a separate (third) process.
from multiprocessing import Process

def f(name):
    print('hello', name)

if __name__ == '__main__':
    p = Process(target=f, args=('bob',))
    p.start()
    p.join()
However, to see the print output, at least on Windows, one must start IDLE from a console like so.
C:\Users\Terry>python -m idlelib
hello bob
(Use idlelib.idle on 2.x.) The reason is that IDLE runs user code in a separate process. Currently the connection between the IDLE process and the user code process is via a socket. The fork done by multiprocessing does not duplicate or inherit the socket connection. When IDLE is started via an icon or Explorer (in Windows), there is nowhere for the print output to go. When started from a console with python (rather than pythonw), output goes to the console, as above.
I have some Python code that occasionally needs to spawn a new process to run a shell script in a "fire and forget" manner, i.e. without blocking. The shell script will not communicate with the original Python code and will in fact probably terminate the calling Python process, so the launched shell script cannot be a child process of the calling Python process. I need it to be launched as an independent process.
In other words, let's say I have mycode.py and that launches script.sh. Then mycode.py will continue processing without blocking. The script script.sh will do some things independently and will then actually stop and restart mycode.py. So the process that runs script.sh must be completely independent of mycode.py. How exactly can I do this? I think subprocess.Popen will not block, but will still create a child process that terminates as soon as mycode.py stops, which is not what I want.
Try prepending "nohup" to the command that runs script.sh. You'll probably need to decide what to do with stdout and stderr; I just drop them in the example.
import os
from subprocess import Popen
devnull = open(os.devnull, 'wb') # Use this in Python < 3.3
# Python >= 3.3 has subprocess.DEVNULL
Popen(['nohup', 'script.sh'], stdout=devnull, stderr=devnull)
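On Python 3.3+ you can get a similar effect without nohup; a minimal sketch (my suggestion, not from the original answer), assuming script.sh is executable in the current directory: start_new_session=True calls setsid() in the child, detaching it from your terminal's session.
import subprocess

subprocess.Popen(['./script.sh'],
                 stdout=subprocess.DEVNULL,
                 stderr=subprocess.DEVNULL,
                 start_new_session=True)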
Just use subprocess.Popen. The following works OK for me on Windows XP / Windows 7 with Python 2.5.4, 2.6.6, and 2.7.4, and after being converted with py2exe (not tried with 3.3). It comes from the need to delete expired test software on the client's machine.
import os
import subprocess
import sys
from tempfile import gettempdir

def ExitAndDestroy(ProgPath):
    """ Exit and destroy """
    absp = os.path.abspath(ProgPath)
    fn = os.path.join(gettempdir(), 'SelfDestruct.bat')
    script_lines = [
        '@rem Self Destruct Script',
        '@echo ERROR - Attempting to run expired test only software',
        '@pause',
        '@del /F /Q %s' % (absp),
        '@echo Deleted Offending File!',
        '@del /F /Q %s\n' % (fn),
        #'@exit\n',
    ]
    bf = open(fn, 'wt')
    bf.write('\n'.join(script_lines))
    bf.flush()
    bf.close()
    p = subprocess.Popen([fn], shell=False)
    sys.exit(-1)

if __name__ == "__main__":
    ExitAndDestroy(sys.argv[0])
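On Python 3 a hedged variant of that final Popen call (my assumption, not something the original answer tested) gives the batch file its own console window, so its @echo and @pause output stays visible after the parent has exited:
import subprocess

fn = r'C:\Temp\SelfDestruct.bat'  # hypothetical path; in the code above it comes from gettempdir()
subprocess.Popen(['cmd', '/c', fn],
                 creationflags=subprocess.CREATE_NEW_CONSOLE)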
In a Python script I want to spawn a process that runs a file in the same directory.
I don't want the Python script to be blocked by the new process.
I then want to be able to close the spawned process from the script.
On top of it all I need it to be OS independent.
What is the best way of doing this?
As #Keith suggested use subprocess module, but more specifically use Popen. For example, on Windows, this opens myfile.txt with notepad and then terminates it after a 20 seconds:
import subprocess
import time
command = "notepad myfile.txt"
pipe = subprocess.Popen(command, shell=False)
time.sleep(5)
pipe.poll()
print("%s" % pipe.returncode) #"None" when working fine
time.sleep(5)
pipe.terminate()
pipe.wait()
print("%s" % pipe.returncode) # 1 after termination