I'm writing a python script that executes a csh script in Solaris 10. The csh script prompts the user for the root password (which I know) but I'm not sure how to make the python script answer the prompt with the password. Is this possible? Here is what I'm using to execute the csh script:
import commands
commands.getoutput('server stop')
Have a look at the pexpect module. It is designed to deal with interactive programs, which seems to be your case.
Oh, and remember that hard-coding root's password in a shell or Python script is potentially a security hole :D
Use subprocess. Call Popen() to create your process and use communicate() to send it text. Sorry, I forgot to include the PIPE.
from subprocess import Popen, PIPE
proc = Popen(['server', 'stop'], stdin=PIPE)
proc.communicate('password')
You would do better to avoid the password altogether and try a scheme like sudo and sudoers. Pexpect, mentioned elsewhere, is not part of the standard library.
import pexpect
child = pexpect.spawn('server stop')
child.expect_exact('Password:')
child.sendline('password')
print "Stopping the servers..."
index = child.expect_exact(['Server processes successfully stopped.', 'Server is not running...'], 60)
child.expect(pexpect.EOF)
Did the trick! Pexpect rules!
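For the sudo/sudoers suggestion above, a minimal sketch, assuming a sudoers entry grants the invoking user passwordless rights to the command (the entry shown in the comment is illustrative):

import subprocess

# Relies on a sudoers entry along the lines of:
#   youruser ALL = NOPASSWD: /path/to/server
# so no password prompt ever appears; -n makes sudo fail instead of prompting.
subprocess.call(['sudo', '-n', 'server', 'stop'])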
Adding input= to proc.communicate() makes it work, for those who prefer to stick to the standard library.
from subprocess import Popen, PIPE
proc = Popen(['server', 'stop'], stdin=PIPE)
proc.communicate(input='password')  # on Python 3, pass bytes or add text=True to Popen
You should be able to pass it as a parameter. Something like:
commands.getoutput('server stop -p password')
This seems to work better:
import popen2
(stdout, stdin) = popen2.popen2('server stop')
stdin.write("password")
But it's not 100% yet. Even though "password" is the correct password I'm still getting su: Sorry back from the csh script when it's trying to su to root.
To avoid having to answer the Password question in the python script I'm just going to run the script as root. This question is still unanswered but I guess I'll just do it this way for now.
Normally you can automate answers to an interactive prompt by piping stdin:
import subprocess as sp
cmd = 'rpmbuild --sign --buildroot {}/BUILDROOT -bb {}'.format(TMPDIR, specfile)
p = sp.Popen(cmd, stdout=sp.PIPE, stderr=sp.PIPE, stdin=sp.PIPE, universal_newlines=True, shell=True)
for out in p.communicate(input='my gpg passphrase\n'):
    print(out)
For whatever reason, this is not working for me. I've tried writing to p.stdin before executing p.communicate(), flushing the buffer, using bytes without universal_newlines=True, hard-coding things, etc. In all scenarios, the command is executed and hangs on:
Enter pass phrase:
My first hunch was that stdin was not the correct file descriptor and that rpmbuild was internally calling a gpg command, so maybe my input isn't being piped to it. But when I do p.stdin.close() I get an OSError about subprocess trying to write to the closed descriptor.
What is the rpmbuild command doing to stdin that prevents me from writing to it?
Is there a hack I can do? I tried echo "my passphrase" | rpmbuild .... as the command but that doesn't work.
I know I can do something with gpg like command and sign packages without a passphrase but I kind of want to avoid that.
EDIT:
After some more reading, I realize this issue is common to commands that require password input, typically via some form of getpass.
I see a solution would be to use a library like pexpect, but I want something from the standard library. I am going to keep looking, but I think maybe I can try writing to something like /dev/tty.
rpm uses getpass(3) which reopens /dev/tty.
There are two approaches to automating this:
1) create a pseudo-tty
2) (Linux) find the reopened file descriptor in /proc
If scripting, expect(1) has (or had) a short example with pseudo-ttys that can be used; a standard-library sketch of the pseudo-tty approach follows.
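The sketch below uses the pty module; the prompt text, function name, and argument list are assumptions to adapt to the actual rpmbuild invocation:

import os
import pty

def run_with_passphrase(argv, passphrase, prompt=b'Enter pass phrase:'):
    # pty.fork() gives the child a controlling terminal, so getpass(3)
    # reads the passphrase from the pseudo-tty instead of the ignored stdin pipe.
    pid, master_fd = pty.fork()
    if pid == 0:                      # child: run the command on the pty
        os.execvp(argv[0], argv)

    output = b''
    while prompt not in output:       # wait for the passphrase prompt
        output += os.read(master_fd, 1024)
    os.write(master_fd, passphrase.encode() + b'\n')

    while True:                       # drain output until the child exits
        try:
            chunk = os.read(master_fd, 1024)
        except OSError:               # pty closed (Linux raises EIO here)
            break
        if not chunk:
            break
        output += chunk
    os.waitpid(pid, 0)
    return output

It would be called with something like run_with_passphrase(['rpmbuild', ...], 'my gpg passphrase'), where the argument list is whatever the original command was.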
I have a Python script in Blender that has
subprocess.call(os.path.abspath('D:/Test/run-my-script.sh'),shell=True)
followed by more code that depends on this shell script finishing. The problem is that it doesn't wait for it to finish, and I don't know why. I even tried using Popen instead of call, as shown:
p1 = subprocess.Popen(os.path.abspath('D:/Test/run-my-script.sh'),shell=True)
p1.wait()
and I tried using communicate, but it still didn't work:
p1 = subprocess.Popen(os.path.abspath('D:/Test/run-my-script.sh'),shell=True).communicate()
This shell script works great on macOS (after changing paths) and waits when using subprocess.call(['sh', '/userA/Test/run-my-script.sh']).
On Windows, however, this is what happens: I run the Python script below in Blender, and once it reaches the subprocess line, Git Bash opens and runs the shell script, but Blender doesn't wait for it to finish; it just prints Hello in its console without waiting for Git Bash to finish. Any help?
import bpy
import os
import subprocess

subprocess.call(os.path.abspath('D:/Test/run-my-script.sh'), shell=True)
print('Hello')
You can use subprocess.call to do exactly that.
subprocess.call(args, *, stdin=None, stdout=None, stderr=None, shell=False, timeout=None)
Run the command described by args. Wait for command to complete, then return the returncode attribute.
Edit: I think I have a hunch about what's going on. The command works on your Mac because macOS, I believe, supports Bash out of the box (or at least something functionally equivalent), while on Windows it sees your attempt to run a ".sh" file and instead fires up Git Bash, which I presume performs a couple of forks when starting.
Because of this, Python thinks that your script is done: the PID it was watching is gone.
If I were you I would do this:
Generate a unique, non-existing, absolute path in your "launching" script using the tempfile module.
When launching the script, pass the path you just made as an argument.
When the script starts, have it create a file at the path. When done, delete the file.
The launching script should watch for the creation and deletion of that file to indicate the status of the script.
Hopefully that makes sense.
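A rough, hedged sketch of that sentinel-file protocol on the launching side (the paths, the argument passing, and the polling intervals are all assumptions, and the shell script itself would still need to create and delete the file):

import os
import subprocess
import tempfile
import time

# 1. Generate a unique path that does not exist yet.
tmpdir = tempfile.mkdtemp()
sentinel = os.path.join(tmpdir, 'script-running')

# 2. Pass the path to the shell script as an argument; the script is expected
#    to create the file when it starts and delete it when it finishes.
subprocess.call('D:/Test/run-my-script.sh "{}"'.format(sentinel), shell=True)

# 3./4. Watch for the file to appear and then disappear.
#       (A real version would add a timeout and cope with very fast scripts.)
while not os.path.exists(sentinel):
    time.sleep(0.5)
while os.path.exists(sentinel):
    time.sleep(0.5)

os.rmdir(tmpdir)
print('run-my-script.sh has finished')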
You can use the Popen.communicate API.
p1 = subprocess.Popen(os.path.abspath('D:/Test/run-my-script.sh'),shell=True)
sStdout, sStdErr = p1.communicate()
From the documentation:
Popen.communicate(input=None, timeout=None)
Interact with process: Send data to stdin. Read data from stdout and stderr, until end-of-file is reached. Wait for the process to terminate.
subprocess.run will by default wait for the process to finish.
Use subprocess.Popen and Popen.wait:
process = subprocess.Popen(['D:/Test/run-my-script.sh'], shell=True, executable="/bin/bash")
process.wait()
You could also use check_call() instead of Popen.
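For instance, a hedged sketch of the run-based version (Python 3.5+; invoking bash directly is an assumption about how Git Bash is installed and on the PATH):

import subprocess

# subprocess.run blocks until the command exits; check=True raises
# CalledProcessError on a non-zero exit status.
result = subprocess.run(['bash', 'D:/Test/run-my-script.sh'], check=True)
print(result.returncode)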
You can use os.system, like this:
import bpy
import os
os.system("sh "+os.path.abspath('D:/Test/run-my-script.sh'))
print('Hello')
There are apparently cases when the run command fails.
This is my workaround:
import os
import subprocess
import time

def check_has_finished(pfi, interval=1, timeout=100):
    if os.path.exists(pfi):
        if pfi.endswith('.nii.gz'):
            mustend = time.time() + timeout
            while time.time() < mustend:
                try:
                    # Command is an ad hoc one to check if the process has finished.
                    subprocess.check_output('command {}'.format(pfi), shell=True)
                except subprocess.CalledProcessError:
                    print("Caught CalledProcessError")
                else:
                    return True
                time.sleep(interval)
            msg = 'command {0} not working after {1} tests. \n'.format(pfi, timeout)
            raise IOError(msg)
        else:
            return True
    else:
        msg = '{} does not exist!'.format(pfi)
        raise IOError(msg)
A wild try, but are you running the shell as Admin while Blender as regular user or vice versa?
Long story short (very short), Windows UAC is a sort of isolated environment between admin and regular user, so random quirks like this can happen. Unfortunately I can't remember the source of this; the closest I found is this.
My problem was the exact opposite of yours: the wait() got stuck in an infinite loop because my Python REPL was fired from an admin shell and wasn't able to read the state of the regular-user subprocess. Reverting to a normal user shell fixed it. It's not the first time I've been bitten by this UAC snafu.
I am running this python code from the command line:
# run on command line as: python firstscript.py
import sys, subprocess
pid = subprocess.Popen([sys.executable, 'secondscript.py']).pid
sys.exit()
Unfortunately I can't get it to exit all the way to the command line. If I hit the enter key (on OSX) it will finally exit. Is there a way to force the script to exit all the way to the command line without lingering in this weird limbo state? Also, I don't want to redirect stdout or stderr anywhere else because if I do, I lose the ability in secondscript.py to log output to a log file.
Thanks for the help.
The changes below worked for me:
# run on command line as: python firstscript.py
import sys, subprocess
process = subprocess.Popen([sys.executable, 'secondscript.py'])
output = process.communicate()[0]
You seem to be asking if there is a better way to do this. Check out check_output. I have always found it much more convenient and foolproof compared to the lower-level stuff in subprocess.
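A hedged sketch of that here (check_output waits for the child and returns its stdout, raising CalledProcessError on a non-zero exit):

import sys
import subprocess

# Runs secondscript.py, waits for it to exit, and returns whatever it wrote
# to stdout (note this does redirect stdout, unlike the Popen version above).
output = subprocess.check_output([sys.executable, 'secondscript.py'])
print(output)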
I'm using the os.system command to call a Python script.
example:
OS.System("call jython script.py")
In the script I'm calling, the following command is present:
x = raw_input("Waiting for input")
If I run script.py from the command line I can input data no problem, if I run it via the automated approach I get an EOFError. I've read in the past that this happens because the system expects a computer to be running it and therefore could never receive input data in this way.
So the question is how can I get python to wait for user input while being run in an automated way?
The problem is the way you run your child script. Since you use os.system(), the script's input channel is closed immediately and the raw_input() prompt hits an EOF (end of file). And even if that didn't happen, you wouldn't have a way to actually send any input text to the child, which I assume you'd want, given that you are using raw_input().
You should use the subprocess module instead.
import subprocess
from subprocess import PIPE
p = subprocess.Popen(["jython", "script.py"], stdin=PIPE, stdout=PIPE)
print p.communicate("My input")
Your question is a bit unclear. What is the process calling your Python script and how is it being run? If the parent process has no standard input, the child won't have it either.
I'm hacking some support for DomainKeys and DKIM into an open source email marketing program, which uses a python script to send the actual emails via SMTP. I decided to go the quick and dirty route, and just write a perl script that accepts an email message from STDIN, signs it, then returns it signed.
What I would like to do, is from the python script, pipe the email text that's in a string to the perl script, and store the result in another variable, so I can send the email signed. I'm not exactly a python guru, however, and I can't seem to find a good way to do this. I'm pretty sure I can use something like os.system for this, but piping a variable to the perl script is something that seems to elude me.
In short: How can I pipe a variable from a python script, to a perl script, and store the result in Python?
EDIT: I forgot to include that the system I'm working with only has python v2.3
Use subprocess. Here is the Python script:
#!/usr/bin/python
import subprocess
var = "world"
pipe = subprocess.Popen(["./x.pl", var], stdout=subprocess.PIPE)
result = pipe.stdout.read()
print result
And here is the Perl script:
#!/usr/bin/perl
use strict;
use warnings;
my $name = shift;
print "Hello $name!\n";
os.popen2() will return a tuple with the stdin and stdout of the subprocess; the subprocess equivalent looks like this:
from subprocess import Popen, PIPE
p = Popen(['./foo.pl'], stdin=PIPE, stdout=PIPE)
p.stdin.write(the_input)
p.stdin.close()
the_output = p.stdout.read()
"I'm pretty sure I can use something like os.system for this, but piping a variable to the perl script is something that seems to elude me."
Correct. The subprocess module is like os.system, but provides the piping features you're looking for.
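As a hedged sketch of that piping (subprocess only arrived in Python 2.4, so this assumes a newer interpreter than the 2.3 mentioned above; sign.pl is a placeholder name for the signing script):

import subprocess
from subprocess import PIPE

message = "raw email text to be signed"  # the string to pipe to Perl
p = subprocess.Popen(['./sign.pl'], stdin=PIPE, stdout=PIPE)
# communicate() writes the message to the Perl script's STDIN and returns
# its STDOUT, avoiding the deadlocks that manual write()/read() can hit.
signed, _ = p.communicate(message)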
I'm sure there's a reason you're going down the route you've chosen, but why not just do the signing in Python?
How are you signing it? Maybe we could provide some assistance in writing a Python implementation?
I also tried to do that; the only way I could configure it to work was:
import subprocess

pipe = subprocess.Popen(
    ['someperlfile.perl', 'param(s)'],
    stdin=subprocess.PIPE,
    stdout=subprocess.PIPE  # stdout must be piped for communicate() to capture the response
)
response = pipe.communicate()[0]
I hope this helps you make it work.