scp to a remote server using pexpect - python

I'm trying to learn a little bit on pexpect: in particular I'm trying to copy a file from my laptop to a remote server.
I'm experiencing a weird behaviour: more or less the same code works if I write it line by line but it won't if I run it as a script.
Here is what I write line-by-line:
child = pexpect.spawn('scp pathdir/file.ext username@hostname:pathdir')
r=child.expect ('assword:')
r
it returns 0 and I finish the job with the password
child.sendline ('password')
When I ssh to the server I find my file there. So I collected all the steps in a script; it exits without errors, but the file was not copied... why? And more importantly, how can I fix that?
Here is the script:
child = pexpect.spawn('scp pathdir/file.ext username@hostname:pathdir')
r = child.expect('assword:')
print r
if r == 0:
    child.sendline('password')
child.close()
I'm not sure how pexpect works so I print r to be sure it is 0. And it is.

I faced the same problem recently. Here's how I solved it; I hope it helps.
Your question:
I'm not sure how pexpect works so I print r to be sure it is 0. And it is.
Yes, it is zero.
Try the code below:
import pexpect

try:
    var_password = "<YOUR PASSWORD>"  # give your password here
    var_command = "scp pathdir/file.ext username@hostname:pathdir"
    # make sure the username and hostname in the command above match your server
    var_child = pexpect.spawn(var_command)
    i = var_child.expect(["password:", pexpect.EOF])
    if i == 0:  # send the password
        var_child.sendline(var_password)
        var_child.expect(pexpect.EOF)
    elif i == 1:
        print "Got the key or connection timeout"
except Exception as e:
    print "Oops, something went wrong, buddy"
    print e
child.expect can accept more than one pattern; in that case you pass them as a list. In the scenario above, if the spawned scp prints "password:" then i will be 0, and if EOF is encountered instead then i will be 1.
I hope this would clear your doubt. If not, then let me know. I will try to enhance the explanation for you.
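For example, a minimal sketch (host, file and password are placeholders) that also matches on a timeout, so the returned index tells you exactly which case occurred:
import pexpect

child = pexpect.spawn('scp pathdir/file.ext username@hostname:pathdir')
i = child.expect(['assword:', pexpect.EOF, pexpect.TIMEOUT], timeout=30)
if i == 0:                      # password prompt seen
    child.sendline('password')
    child.expect(pexpect.EOF)   # wait for the copy to finish
elif i == 1:                    # EOF: key-based auth succeeded or scp failed early
    print child.before
else:                           # i == 2: nothing matched within 30 seconds
    print 'connection timed out'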

After sending the password, i.e.
child.sendline('password')
write:
child.expect(pexpect.EOF)
This waits until the file copy finishes.

I ran into the same problem. It happened when I specified the home directory (~/) of the remote side as the destination. Typing the scp command manually worked fine, but for some reason it did not when using pexpect. Simply using a relative or absolute destination directory path solved the problem for me.
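For example (paths and credentials here are placeholders, not the asker's), spelling out the destination directory instead of relying on ~/:
child = pexpect.spawn('scp file.ext username@hostname:/home/username/incoming/')
child.expect('assword:')
child.sendline('password')
child.expect(pexpect.EOF)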

You have to finish your code with child.interact(); then it will run all the commands you wrote before it.
It will look like this:
child = pexpect.spawn('scp pathdir/file.ext username@hostname:pathdir')
r = child.expect('assword:')
print r
if r == 0:
    child.sendline('password')
    child.interact()
child.close()


Automate stdin with Python using stdin.write()

I am trying to automate the setup of generating self-signed SSL certificate. This is my code:
#!/usr/bin/env python
import subprocess
pass_phrase = 'example'
common_name = 'example.com'
webmaster_email = 'webmaster@example.com'
proc = subprocess.Popen(['openssl', 'req', '-x509', '-newkey', 'rsa:2048', '-rand', '/dev/urandom', '-keyout', '/etc/pki/tls/private/server.key', '-out', '/etc/pki/tls/certs/server.crt', '-days', '180'], stdout=subprocess.PIPE, stdin=subprocess.PIPE, stderr=subprocess.PIPE)
for i in range(2):
    proc.stdin.write(pass_phrase)
for i in range(5):
    proc.stdin.write('.')
proc.stdin.write(common_name)
proc.stdin.write(webmaster_email)
proc.stdin.flush()
stdout, stderr = proc.communicate()
When I run it, it still prompts me for the PEM passphrase, then returns this error:
Country Name (2 letter code) [XX]:weird input :-(
problems making Certificate Request
It should feed in the passphrase above and not prompt me for anything. Any ideas what's going wrong?
PS. I know about pexpect. Please don't suggest it to me.
Edit: Upon further investigation, I've figured it out. If you don't specify -nodes, the private key will be encrypted, so OpenSSL will prompt for a PEM pass phrase immediately. This means the order of my stdin.write() calls gets messed up. I guess the alternative is to use -nodes and encrypt the private key later.
There are several errors in your code, e.g., no newlines are sent to the child process.
The main issue is that openssl expects the pass phrase directly from the terminal (like getpass.getpass() in Python). See the first reason in Why not just use a pipe (popen())?:
First an application may bypass stdout and print directly to its
controlling TTY. Something like SSH will do this when it asks you for
a password. This is why you cannot redirect the password prompt
because it does not go through stdout or stderr.
pexpect, which provides a pseudo-tty, works fine in this case:
#!/usr/bin/env python
import sys
from pexpect import spawn, EOF

pass_phrase = "dummy pass Phr6se"
common_name = "example.com"
email = "username@example.com"
keyname, certname = 'server.key', 'server.crt'
cmd = 'openssl req -x509 -newkey rsa:2048 -rand /dev/urandom '.split()
cmd += ['-keyout', keyname, '-out', certname, '-days', '180']

child = spawn(cmd[0], cmd[1:], timeout=10)
child.logfile_read = sys.stdout  # show openssl output for debugging
for _ in range(2):
    child.expect('pass phrase:')
    child.sendline(pass_phrase)
for _ in range(5):
    child.sendline('.')
child.sendline(common_name)
child.sendline(email)
child.expect(EOF)
child.close()
sys.exit(child.status)
An alternative is to try the -passin option to instruct openssl to get the pass phrase from a different source (stdin, a file, a pipe, an environment variable, or the command line). I don't know whether it works with the openssl req command.
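For what it's worth, an untested sketch of that idea: for req, the pass phrase for the newly generated key is normally supplied with -passout rather than -passin, and -subj can answer the remaining prompts, so no terminal interaction should be needed:
import subprocess

# a sketch only: flag behaviour assumed from openssl's documentation, not verified here
subprocess.check_call([
    'openssl', 'req', '-x509', '-newkey', 'rsa:2048',
    '-keyout', 'server.key', '-out', 'server.crt', '-days', '180',
    '-passout', 'pass:example',  # pass phrase for the new private key, no prompt
    '-subj', '/CN=example.com/emailAddress=webmaster@example.com',  # answers the req questions
])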
Two problems:
You are not giving it the data it expects in the order it expects. At some point it is expecting a country code and you are giving it some other data instead.
The write() method of file objects does not automatically insert a newline. You need to add "\n" to your strings or write() separate "\n" strings out after each line of input you want to feed to the program. For example: proc.stdin.write(pass_phrase + "\n")

How do I open a file in python and insert one or more inputs to it?

I've been trying to code a bit of a "game" to help others learn python, but I've run into a wall right after I jumped out of the brainstorming phase.
See, it involves making a script open another script, and then insert input to it. For example:
username = raw_input('Insert username:')
password = raw_input('Insert password:')
if username == 'user':
    if password == '1234':
        print('Congratulations, you cracked it!')
This would be my source code. Then I'd have another script, in which I'd write something to open the former script, insert "user" as if I'd typed it myself at the command prompt, and then try every number between 0 and, say, 10000. So something like:
for n in range(0, 10000):
    [Insert script to open file]
    [input 'user']
    [input n]
How would I go on about to code the last part?
The subprocess module lets you run another program—including a script—and control its input and output. For example:
import subprocess, sys
p = subprocess.Popen([sys.executable, 'thescript.py'], stdin=subprocess.PIPE)
p.stdin.write('user\n')
p.stdin.write('{}\n'.format(n))
p.wait()
If you can build all the input at once and pass it as a single string, you can use communicate.
If you also want to capture its output, add another PIPE for stdout.
import subprocess
p = subprocess.Popen(['python', 'thescript.py'],
stdin=subprocess.PIPE, stdout=subprocess.PIPE)
out, err = p.communicate('user\n{}\n'.format(n))
For details on how this works, read the documentation; it's all explained pretty well. (However, it's not organized perfectly; you might want to read the opening section, then skip down to "Replacing Older Functions", then read the "Frequently Used Arguments", then come back to the top and go through in order.)
If you need to interact with it in any way more complicated than "send all my input, then get all the output", that gets very hard to do correctly, so you should take a look at the third-party pexpect module.
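If you do go the pexpect route for the brute-force loop, a rough sketch (the prompt strings and 'thescript.py' are taken from the question; treat the rest as a starting point to adapt):
import pexpect

for n in range(10000):
    # encoding= requires pexpect >= 4; drop it on older versions
    child = pexpect.spawn('python thescript.py', encoding='utf-8')
    child.expect('Insert username:')
    child.sendline('user')
    child.expect('Insert password:')
    child.sendline(str(n))
    child.expect(pexpect.EOF)
    if 'cracked' in child.before:   # output printed after the last prompt
        print('cracked: {}'.format(n))
        break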
Would this be what you wanted?
import subprocess
for n in range(0, 10000):
    p = subprocess.Popen("python another_script.py", shell=True,
                         stdin=subprocess.PIPE, stdout=subprocess.PIPE)
    p.stdin.write("user\n" + str(n) + "\n")
    out = p.stdout.read()
    if "cracked" in out:
        print "cracked: " + str(n)
        break
Okay, I did it. Thanks for the help guys, but I settled on using modules.
I made my own small module like this:
Filename: pass1.py
def insertpassword(username, password):
    if username == 'user':
        if password == '12345':
            print('You did it!')
Then what I do is:
import pass1
pass1.insertpassword(raw_input('Insert username:'),raw_input('Insert password:'))
As for the cracking:
import pass1
for n in range(0, 100000):
    pass1.insertpassword('user', str(n))
Thanks anyway, everyone.

execute local python script over sshClient() with Paramiko in remote machine

This is my first post in StackOverflow, so I hope to do it the right way! :)
I have a task for my new job that requires connecting to several servers and executing a python script on all of them. I'm not very familiar with servers (and just started using paramiko), so I apologize for any big mistakes!
The script I want to run on them modifies the authorized_keys file but to start, I'm trying it with only one server and not yet using the aforementioned script (I don't want to make a mistake and block the server in my first task!).
I'm just trying to list the directory in the remote machine with a very simple function called getDir(). So far, I've been able to connect to the server with paramiko using the basics (I'm using pdb to debug the script by the way):
try_paramiko.py
#!/usr/bin/python
import paramiko
from getDir import get_dir
import pdb

def try_this(server):
    pdb.set_trace()
    ssh = paramiko.SSHClient()
    ssh.load_host_keys("pth/to/known_hosts")
    ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    my_key = paramiko.RSAKey.from_private_key_file("pth/to/id_rsa")
    ssh.connect(server, username="root", pkey=my_key)
    i, o, e = ssh.exec_command(get_dir())
This is the function to get the directory list:
getDir.py
#!/usr/bin/python
import os
import pdb

def get_dir():
    pdb.set_trace()
    print "Current dir list is:"
    for item in os.listdir(os.getcwd()):
        print item
While debugging I got the directory list of my local machine instead of the one from the remote machine... is there a way to pass a python function as a parameter through paramiko? I would like to just have the script locally and run it remotely like when you do it with a bash file from ssh with:
ssh -i pth/to/key username@domain.com 'bash -s' < script.sh
so as to avoid copying the python script to every machine and then running it there (I guess with the above command the script would also be copied to the remote machine and then deleted, right?). Is there a way to do that with paramiko.SSHClient()?
I have also tried to modify the code and use the standard output of the channel that exec_command creates to list the directory, leaving the scripts like this:
try_paramiko.py
#!/usr/bin/python
import paramiko
from getDir import get_dir
import pdb

def try_this(server):
    pdb.set_trace()
    ssh = paramiko.SSHClient()
    ssh.load_host_keys("pth/to/known_hosts")
    ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    my_key = paramiko.RSAKey.from_private_key_file("pth/to/id_rsa")
    ssh.connect(server, username="root", pkey=my_key)
    i, o, e = ssh.exec_command(get_dir())
    for line in o.readlines():
        print line
    for line in e.readlines():
        print line
getDir.py
import os

def get_dir():
    return ', '.join(os.listdir(os.getcwd()))
But with this, it actually tries to run the local directory listing as commands (which actually makes sense the way I have it). I had to convert the list to a string because I was getting a TypeError saying that it expects a string or a read-only character buffer, not a list... I know this was a desperate attempt to pass the function... Does anyone know how I could do such a thing (pass a local function through paramiko to execute it on a remote machine)?
If you have any corrections or tips on the code, they are very much welcome (actually, any kind of help would be very much appreciated!).
Thanks a lot in advance! :)
You cannot just execute a python function through ssh. ssh is just a tunnel with your code on one side (the client) and a shell on the other (the server). You have to execute shell commands on the remote side.
If using raw ssh code is not critical, I suggest Fabric as a library for writing administration tools. It contains tools for easy ssh handling, file transfer, sudo, parallel execution and more.
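A rough illustration of what that looks like (Fabric 1.x-style fabric.api; the host names and key path are placeholders), saved as fabfile.py and run with, e.g., fab list_dir:
from fabric.api import env, run, put

env.hosts = ['root@host1.example.com', 'root@host2.example.com']
env.key_filename = 'pth/to/id_rsa'

def list_dir():
    # runs the shell command on every host in env.hosts
    run('ls -la')

def push_and_run():
    # copy the local script up, then execute it remotely
    put('getDir.py', '/tmp/getDir.py')
    run('python /tmp/getDir.py')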
I think you might want to change the parameters you're passing into ssh.exec_command. Here's an idea:
Instead of doing:
def get_dir():
    return ', '.join(os.listdir(os.getcwd()))

i, o, e = ssh.exec_command(get_dir())
You might want to try:
i, o, e = ssh.exec_command('pwd')
print o.readlines()
And other things to explore:
Writing a bash or Python script that lives on your servers. You can use Paramiko to log onto the server and execute the script with ssh.exec_command('./some_script.sh') or ssh.exec_command('python some_script.py').
Paramiko has some FTP/SFTP utilities, so you can actually use it to put the script on the server and then execute it (sketched below).
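A minimal sketch of that second idea, reusing the connection details from the question (the server name and paths are placeholders):
import paramiko

ssh = paramiko.SSHClient()
ssh.load_host_keys("pth/to/known_hosts")
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
my_key = paramiko.RSAKey.from_private_key_file("pth/to/id_rsa")
ssh.connect("server.example.com", username="root", pkey=my_key)

# upload the local script over SFTP, run it remotely, then clean up
sftp = ssh.open_sftp()
sftp.put("getDir.py", "/tmp/getDir.py")
sftp.close()

stdin, stdout, stderr = ssh.exec_command("python /tmp/getDir.py")
print stdout.read()
print stderr.read()
ssh.exec_command("rm /tmp/getDir.py")
ssh.close()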
It is possible to do this by using a here document to feed a module into the remote server's python interpreter.
remotepypath = "/usr/bin/"
# open the module as a text file
with open("getDir.py", "r") as f:
mymodule = f.read()
# setup from OP code
ssh = paramiko.SSHClient()
ssh.load_host_keys("pth/to/known_hosts")
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
my_key = paramiko.RSAKey.from_private_key_file("pth/to/id_rsa")
ssh.connect(server, username = "root", pkey = my_key)
# use here document to feed module into python interpreter
stdin, stdout, stderr = ssh.exec_command("{p}python - <<EOF\n{s}\nEOF".format(p=remotepypath, s=mymodule))
print("stderr: ", stderr.readlines())
print("stdout: ", stdout.readlines())

How can I tell whether screen is running?

I am trying to run a Python program to see if the screen program is running. If it is, then the program should not run the rest of the code. This is what I have and it's not working:
#!/usr/bin/python
import os
var1 = os.system('screen -r > /root/screenlog/screen.log')
fd = open("/root/screenlog/screen.log")
content = fd.readline()
while content:
    if content == "There is no screen to be resumed.":
        os.system('/etc/init.d/tunnel.sh')
        print "The tunnel is now active."
    else:
        print "The tunnel is running."
fd.close()
I know there are probably several things here that don't need to be and quite a few that I'm missing. I will be running this program in cron.
from subprocess import Popen, PIPE
def screen_is_running():
    out = Popen("screen -list", shell=True, stdout=PIPE).communicate()[0]
    return not out.startswith("This room is empty")
Maybe the error message that you redirect on the first os.system call is written on the standard error instead of the standard output. You should try replacing this line with:
var1 = os.system ('screen -r 2> /root/screenlog/screen.log')
Note the 2> to redirect standard error to your file.
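Combining the two answers above into one sketch (the exact message text varies between screen versions, so treat the strings as assumptions to verify on your system):
from subprocess import Popen, PIPE, STDOUT

def screen_is_running():
    # merge stderr into stdout, since screen writes some messages to stderr
    out = Popen('screen -list', shell=True, stdout=PIPE, stderr=STDOUT).communicate()[0]
    return 'This room is empty' not in out and 'No Sockets found' not in out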

Interface with remote computers using Python

I've just become the system admin for my research group's cluster and, in this respect, am a novice. I'm trying to make a few tools to monitor the network and need help getting started implementing them with python (my native tongue).
For example, I would like to view who is logged onto remote machines. By hand, I'd ssh in and run who, but how would I get this info into a script for manipulation? Something like:
import remote_info as ri
ri.open("foo05.bar.edu")
ri.who()
Out[1]:
hutchinson tty7 2009-08-19 13:32 (:0)
hutchinson pts/1 2009-08-19 13:33 (:0.0)
Similarly for things like cat /proc/cpuinfo to get the processor information of a node. A starting point would be really great. Thanks.
Here's a simple, cheap solution to get you started
from subprocess import *
p = Popen('ssh servername who', shell=True, stdout=PIPE)
p.wait()
print p.stdout.readlines()
which returns, e.g.:
['usr pts/0 2009-08-19 16:03 (kakapo)\n',
'usr pts/1 2009-08-17 15:51 (kakapo)\n',
'usr pts/5 2009-08-17 17:00 (kakapo)\n']
and for cpuinfo:
p = Popen('ssh servername cat /proc/cpuinfo', shell=True, stdout=PIPE)
I've been using Pexpect, which lets you ssh into machines, send commands, read the output, and react to it, with success. I even started an open-source project around it, Proxpect - which hasn't been updated in ages, but I digress...
The pexpect module can help you interface with ssh. More or less, here is what your example would look like.
child = pexpect.spawn('ssh servername')
child.expect('Password:')
child.sendline('ABCDEF')
child.expect('\$')        # wait for a shell prompt (adjust the pattern to your prompt)
child.sendline('who')
child.expect('\$')
output = child.before     # everything 'who' printed before the next prompt
If your needs outgrow a simple "ssh remote-host.example.org who", there is an awesome python library called RPyC. It has a so-called "classic" mode which allows you to almost transparently execute Python code over the network with a few lines of code. It is a very useful tool for trusted environments.
Here's an example from Wikipedia:
import rpyc
# assuming a classic server is running on 'hostname'
conn = rpyc.classic.connect("hostname")
# runs os.listdir() and os.stat() remotely, printing results locally
def remote_ls(path):
    ros = conn.modules.os
    for filename in ros.listdir(path):
        stats = ros.stat(ros.path.join(path, filename))
        print "%d\t%d\t%s" % (stats.st_size, stats.st_uid, filename)

remote_ls("/usr/bin")
If you're interested, there's a good tutorial on their wiki.
But, of course, if you're perfectly fine with ssh calls using Popen, or just don't want to run a separate RPyC daemon, then this is definitely overkill.
This covers the bases. Notice the use of sudo for things that needed more privileges. We configured sudo to allow those commands for that user without needing a password typed.
Also, keep in mind that you should run ssh-agent to make this "make sense". But all in all, it works really well. Running deploy-control httpd configtest will check the apache configuration on all the remote servers.
#!/usr/local/bin/python
import subprocess
import sys
# The user@host: for the SourceURLs (NO TRAILING SLASH)
RemoteUsers = [
    "deploy@host1.example.com",
    "deploy@host2.appcove.net",
]
###################################################################################################
# Global Variables
Arg = None
# Implicitly verified below in if/else
Command = tuple(sys.argv[1:])
ResultList = []
###################################################################################################
for UH in RemoteUsers:
    print "-"*80
    print "Running %s command on: %s" % (Command, UH)
    #----------------------------------------------------------------------------------------------
    if Command == ('httpd', 'configtest'):
        CommandResult = subprocess.call(('ssh', UH, 'sudo /sbin/service httpd configtest'))
    #----------------------------------------------------------------------------------------------
    elif Command == ('httpd', 'graceful'):
        CommandResult = subprocess.call(('ssh', UH, 'sudo /sbin/service httpd graceful'))
    #----------------------------------------------------------------------------------------------
    elif Command == ('httpd', 'status'):
        CommandResult = subprocess.call(('ssh', UH, 'sudo /sbin/service httpd status'))
    #----------------------------------------------------------------------------------------------
    elif Command == ('disk', 'usage'):
        CommandResult = subprocess.call(('ssh', UH, 'df -h'))
    #----------------------------------------------------------------------------------------------
    elif Command == ('uptime',):
        CommandResult = subprocess.call(('ssh', UH, 'uptime'))
    #----------------------------------------------------------------------------------------------
    else:
        print
        print "#"*80
        print
        print "Error: invalid command"
        print
        HelpAndExit()  # defined elsewhere in the original script
    #----------------------------------------------------------------------------------------------
    ResultList.append(CommandResult)
    print
###################################################################################################
if any(ResultList):
    print "#"*80
    print "#"*80
    print "#"*80
    print
    print "ERRORS FOUND. SEE ABOVE"
    print
    sys.exit(1)  # exit non-zero when any remote command failed
else:
    print "-"*80
    print
    print "Looks OK!"
    print
    sys.exit(0)
Fabric is a simple way to automate tasks like this; the version I'm currently using allows you to wrap up commands like so:
run('whoami', fail='ignore')
you can specify config options (config.fab_user, config.fab_password) for each machine you need (if you want to automate username/password handling).
More info on Fabric here:
http://www.nongnu.org/fab/
There is a new version which is more Pythonic - I'm not sure whether that is going to be better for you in this case... works fine for me at present...
