I believe this to be a very simple question, but I have been failing to find a simple answer.
I am running a python program that terminates an AWS cluster (using starcluster). I am just calling a command from my python program using subprocess, something like the following.
subprocess.call('starcluster terminate cluster', shell=True)
The actual command is largely irrelevant for my question but provides some context. This command will begin terminating the cluster, but will prompt for a yes/no input before continuing, like so:
Terminate EBS cluster (y/n)?
How do I automate typing yes from within my python program as input to this prompt?
While this is possible with subprocess alone in a somewhat limited way, I would go with pexpect for this kind of interaction, e.g.:
import pexpect
child = pexpect.spawn('starcluster terminate cluster')
child.expect(r'Terminate EBS cluster \(y/n\)\?')  # expect() takes a regex, so escape the metacharacters
child.sendline('y')
You can write to stdin using Popen:
from subprocess import Popen, PIPE
proc = Popen(['starcluster', 'terminate', 'cluster'], stdin=PIPE)
proc.stdin.write(b"y\n")  # bytes, and a newline so the prompt's read completes
proc.stdin.close()        # close stdin so the child does not keep waiting for input
proc.wait()
It might be simplest to check the documentation on the target program. Often a flag can be set to answer yes to all prompts, e.g., apt-get -y install.
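For example, with a tool like apt-get the prompt disappears entirely; a minimal sketch (the -y flag belongs to apt-get, not to starcluster):
import subprocess
# -y answers yes to apt-get's prompts, so no interaction is needed.
subprocess.call(['apt-get', '-y', 'install', 'some-package'])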
Tools like sudo read the password from /dev/tty.
I would like to avoid this.
I am using Python's subprocess module. The subprocess should not be able to read /dev/tty: it should fail immediately instead of waiting for input forever.
Remember: sudo is just an example. A fancy command-line argument to sudo does not solve my problem; this should work for all Linux command-line tools.
Question: How do I make any tool fail as soon as it tries to read from /dev/tty when called via Python's subprocess module?
Background: This is a normal Linux user process, not root.
Since Python 3.2, Popen takes a start_new_session argument, which causes the child process to be started detached from the current controlling terminal by calling setsid() before executing it.
So all you should need is to start the process with start_new_session=True.
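A minimal sketch, using sudo only because the question mentions it (and assuming sudo really does need to prompt on this machine):
import subprocess
# Started in its own session the child has no controlling terminal, so its
# attempt to open /dev/tty fails and it exits instead of hanging for input.
proc = subprocess.Popen(['sudo', 'true'], start_new_session=True)
proc.wait()
print('exit status:', proc.returncode)  # non-zero: no password could be read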
I have a Python app that produces a lot of output on the screen, which can be used for debugging. Out of all the logging techniques, the "script" command works well for me because I can see the output on the screen as well as log it. I want to include it at the beginning of my Python app so it runs automatically and logs everything. When I do, however, the Python program doesn't run; as soon as I type exit at the terminal (which stops script logging) the app starts working. The command I'm using is:
command="script /tmp/appdebug/debug.txt"
os.system(command)
I have also tried script -q, but the same issue remains. I would appreciate any help.
Cheers
Well, I did find the answer for anyone who is interested:
https://stackoverflow.com/questions/15507602/logging-all-bash-in-and-out-with-script-command
and
Bash script: Using "script" command from a bash script for logging a session
I will keep this question as others might have the same issue and finding those answers wasn't exactly easy :)
Cheers
Try to use subprocess, like so:
from subprocess import Popen, PIPE
p = Popen(['script', '/tmp/appdebug/debug.txt'], stderr=PIPE, stdout=PIPE)
stdout, stderr = p.communicate()
script is a wrapper for an interactive session. Even if it appears to return quickly after being started in a shell, this is not so; instead it starts a new shell in which you interact, and everything is logged to a file.
What does this mean for you?
Your approach of using script cannot work. You start script with os.system, which waits for script to terminate before the next Python statement is executed. All of script's work happens before it terminates, i.e., during the uninteresting waiting period of your Python program.
I propose to use script -c yourprog.py yourprog.log instead (or script -c 'python yourprog.py' yourprog.log if yourprog.py is not executable), started from the shell. This will execute and wrap yourprog.py, and the whole session will be stored in yourprog.log.
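If you do want to arrange this from inside the program rather than from the shell, one option is to have the program re-exec itself under script. A minimal sketch, assuming a Unix-like system with the script utility on PATH and no spaces in the paths involved:
import os
import sys
# If not yet running under `script`, replace this process with
# script -c "<python> <args>" /tmp/appdebug/debug.txt; the guard environment
# variable keeps the wrapped copy from wrapping itself again.
if os.environ.get('UNDER_SCRIPT') != '1':
    os.environ['UNDER_SCRIPT'] = '1'
    cmd = ' '.join([sys.executable] + sys.argv)
    os.execvp('script', ['script', '-c', cmd, '/tmp/appdebug/debug.txt'])
print('This line is shown on screen and logged to /tmp/appdebug/debug.txt')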
I may not at all understand this correctly, but I am trying to allow a Python program to interface with a subprocess that runs commands as if on a Linux shell.
For example, I want to be able to run "cd /" and then "pwd" later in the program and get "/".
I am currently trying to use subprocess.Popen and the communicate() method to send and receive data. The first command, sent with the Popen constructor, runs fine and gives proper output. But I cannot send another command via communicate(input="pwd").
My code so far:
from subprocess import Popen, PIPE
term=Popen("pwd", stdout=PIPE, stdin=PIPE)
print(flush(term.communicate()))
term.communicate(input="cd /")
print(flush(term.communicate(input="pwd")))
Is there a better way to do this? Thanks.
Also, I am running Python 3.
First of all, you need to understand that running a shell command and running a program aren't the same thing.
Let me give you an example:
>>> import subprocess
>>> subprocess.call(['/bin/echo', '$HOME'])
$HOME
0
>>> subprocess.call(['/bin/echo $HOME'], shell=True)
/home/kkinder
0
Notice that without the shell=True parameter, the text of $HOME is not expanded. That's because the /bin/echo program doesn't parse $HOME, Bash does. What's really happening in the second call is something analogous to this:
>>> subprocess.call(['/bin/bash', '-c', '/bin/echo $HOME'])
/home/kkinder
0
Using the shell=True parameter basically tells the subprocess module: "go interpret this text using a shell."
So, you could add shell=True, but then the problem is that once the command finishes, its state is lost. Each application in the stack has its own working directory, so the directories would look something like this:
bash - /foo/bar
python - /foo
bash via subprocess - /
After your command executes, the Python process's working directory stays the same, and the subprocess's is discarded once the shell finishes your command.
Basically, what you're asking for isn't practical. What you would need to do is open a pipe to Bash, interactively feed it the commands your user types, and then read the output in a non-blocking way. That's going to involve a complicated pipe, threads, etc. Are you sure there isn't a better way?
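For what it's worth, a rough sketch of that pipe-plus-thread approach (assuming a Unix-like system with bash available) might look like this:
import subprocess
import threading

def reader(pipe):
    # Print bash's output as soon as it arrives so the main thread never blocks on reads.
    for line in iter(pipe.readline, b''):
        print(line.decode().rstrip())

shell = subprocess.Popen(['bash'], stdin=subprocess.PIPE,
                         stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
t = threading.Thread(target=reader, args=(shell.stdout,))
t.start()

for cmd in ['cd /', 'pwd']:   # state such as the working directory persists between commands
    shell.stdin.write((cmd + '\n').encode())
    shell.stdin.flush()

shell.stdin.close()           # bash exits on end-of-file
t.join()                      # the reader finishes once bash closes its stdout
shell.wait()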
I am working on scripting a process in Python, and in my code I make several command-line calls using p = subprocess.Popen(['example', 'command', 'and', 'args']). I recently ran into an issue where several of the command-line calls need sudo permissions, so the user ends up having to type their sudo password several times, which is not desired.
Is there a way to only spawn one shell instance, and repeatedly use it throughout the program? Can this be done even if the different commands are run from different classes?
Thanks!
For concreteness, suppose your program looked like this:
import subprocess
import shlex
proc = subprocess.Popen(shlex.split('fdisk -l'))
proc.communicate()
proc = subprocess.Popen(shlex.split('fdisk -l'))
proc.communicate()
Running it as a normal user returns no output.
But if you run
% sudo python /path/to/test.py
then you are prompted once for the root password, and both subprocess.Popen commands run as root.
I need to write a Python script that can run another command line program and interact with its stdin and stdout streams. Essentially, the Python script will read from the target command line program, intelligently respond by writing to its stdin, and then read the results from the program again. (It would do this repeatedly.)
I've looked through the subprocess module, and I can't seem to get it to do this read/write/read/write thing that I'm looking for. Is there something else I should be trying?
To perform such detailed interaction (when, outside of your control, the other program may be buffering its output unless it thinks it's talking to a terminal) you need something like pexpect, which in turn requires pty, a Python standard library module that (on operating systems that allow it, such as Linux and Mac OS X) implements "pseudo-terminals".
Life is harder on Windows, but maybe this zipfile can help -- it's supposed to be a port of pexpect to Windows (sorry, I have no Windows machine to check it on). The project in question, called wexpect, lives here.
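In case it helps, here is a minimal pexpect sketch of that read/write/read/write loop; the program name and the prompt strings are hypothetical placeholders:
import pexpect

child = pexpect.spawn('some_interactive_tool')
child.expect('first prompt> ')    # read output until the tool asks for input
child.sendline('first answer')    # respond on its stdin
child.expect('second prompt> ')   # read the next batch of output
child.sendline('second answer')
child.expect(pexpect.EOF)         # read whatever remains until the tool exits
print(child.before.decode())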
See the question wxPython: how to create a bash shell window? where I have given a full-fledged interaction with a bash shell, reading stdout and stderr and communicating via stdin. The main part is an extension of this code:
from subprocess import Popen, PIPE
bp = Popen('bash', shell=False, stdout=PIPE, stdin=PIPE, stderr=PIPE)
bp.stdin.write(b"ls\n")   # the pipe is binary on Python 3, so write bytes
bp.stdin.flush()
print(bp.stdout.readline())
If we read all the data at once the read will block, so the script I have linked to does it in a thread. That is a complete wxPython app partially mimicking a bash shell.
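Stripped of the wxPython parts, the gist of the threaded reading is something like this sketch: drain stdout and stderr each on its own thread so neither pipe can block the caller.
import threading
from subprocess import Popen, PIPE

def drain(pipe, label):
    # Runs in the background and prints each line as bash produces it.
    for line in iter(pipe.readline, b''):
        print(label, line.decode().rstrip())

bp = Popen('bash', shell=False, stdout=PIPE, stdin=PIPE, stderr=PIPE)
for pipe, label in ((bp.stdout, 'out:'), (bp.stderr, 'err:')):
    threading.Thread(target=drain, args=(pipe, label)).start()

bp.stdin.write(b"ls\n")
bp.stdin.flush()
bp.stdin.close()   # no more commands; bash exits on end-of-file
bp.wait()          # the non-daemon reader threads finish when the pipes close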