What's the most pythonic way to scp a file in Python? The only route I'm aware of is
os.system('scp "%s" "%s:%s"' % (localfile, remotehost, remotefile) )
which is a hack, and which doesn't work outside Linux-like systems, and which needs help from the Pexpect module to avoid password prompts unless you already have passwordless SSH set up to the remote host.
I'm aware of Twisted's conch, but I'd prefer to avoid implementing scp myself via low-level ssh modules.
I'm aware of paramiko, a Python module that supports SSH and SFTP; but it doesn't support SCP.
Background: I'm connecting to a router which doesn't support SFTP but does support SSH/SCP, so SFTP isn't an option.
EDIT:
This is a duplicate of How to copy a file to a remote server in Python using SCP or SSH?. However, that question doesn't give an scp-specific answer that deals with keys from within Python. I'm hoping for a way to run code kind of like
import scp
client = scp.Client(host=host, user=user, keyfile=keyfile)
# or
client = scp.Client(host=host, user=user)
client.use_system_keys()
# or
client = scp.Client(host=host, user=user, password=password)
# and then
client.transfer('/etc/local/filename', '/etc/remote/filename')
Try the Python scp module for Paramiko. It's very easy to use. See the following example:
import paramiko
from scp import SCPClient

def createSSHClient(server, port, user, password):
    client = paramiko.SSHClient()
    client.load_system_host_keys()
    client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    client.connect(server, port, user, password)
    return client

ssh = createSSHClient(server, port, user, password)
scp = SCPClient(ssh.get_transport())
Then call scp.get() or scp.put() to do SCP operations.
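For example, a minimal transfer with the client created above might look like this (the file names are placeholder examples):

# Upload a local file to the remote host, then fetch it back again
# (file names here are hypothetical examples).
scp.put('local_file.txt', '/tmp/remote_file.txt')
scp.get('/tmp/remote_file.txt', 'downloaded_copy.txt')
scp.close()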
(SCPClient code)
You might be interested in trying Pexpect (source code). This would allow you to deal with interactive prompts for your password.
Here's a snip of example usage (for ftp) from the main website:
# This connects to the openbsd ftp site and
# downloads the recursive directory listing.
import pexpect
child = pexpect.spawn('ftp ftp.openbsd.org')
child.expect('Name .*: ')
child.sendline('anonymous')
child.expect('Password:')
child.sendline('noah@example.com')
child.expect('ftp> ')
child.sendline('cd pub')
child.expect('ftp> ')
child.sendline('get ls-lR.gz')
child.expect('ftp> ')
child.sendline('bye')
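The same approach works for scp itself. A minimal sketch that answers the password prompt (host, paths, and password below are placeholder values, and it assumes a password prompt actually appears):

import pexpect

child = pexpect.spawn('scp /etc/local/filename user@remotehost:/etc/remote/filename')
i = child.expect(['assword:', 'yes/no'])
if i == 1:
    # First connection: accept the host key, then wait for the password prompt.
    child.sendline('yes')
    child.expect('assword:')
child.sendline('my_password')
child.expect(pexpect.EOF)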
Couldn't find a straight answer, and this "scp.Client" module doesn't exist.
Instead, this suits me:
from paramiko import SSHClient
from scp import SCPClient
ssh = SSHClient()
ssh.load_system_host_keys()
ssh.connect('example.com')
with SCPClient(ssh.get_transport()) as scp:
    scp.put('test.txt', 'test2.txt')
    scp.get('test2.txt')
You could also check out paramiko. There's no scp module (yet), but it fully supports sftp.
[EDIT]
Sorry, missed the line where you mentioned paramiko.
The following module is simply an implementation of the scp protocol for paramiko.
If you don't want to use paramiko or conch (the only ssh implementations I know of for python), you could rework this to run over a regular ssh session using pipes.
scp.py for paramiko
import paramiko
client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect('<IP Address>', username='<User Name>', password='', key_filename='<.PEM file path>')
# Set up the SFTP connection and transmit this script
print("copying")
sftp = client.open_sftp()
sftp.put(<Source>, <Destination>)
sftp.close()
If you install PuTTY on win32 you get pscp (PuTTY scp), so you can use the os.system hack on win32 too (and you can use the PuTTY agent, Pageant, for key management).
Sorry, it is only a hack, but you can wrap it in a Python class, as in the sketch below.
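A rough sketch of such a wrapper using subprocess; the pscp path, key file, host, and file names are all placeholder assumptions:

import subprocess

def pscp_copy(local_path, user, host, remote_path,
              pscp_exe=r"C:\Program Files\PuTTY\pscp.exe",
              key_file=None):
    # Build a pscp command line; -i selects a PuTTY .ppk key file,
    # otherwise Pageant (the PuTTY agent) handles key management.
    cmd = [pscp_exe]
    if key_file:
        cmd += ["-i", key_file]
    cmd += [local_path, f"{user}@{host}:{remote_path}"]
    subprocess.check_call(cmd)

pscp_copy("test.txt", "admin", "192.0.2.1", "/tmp/test.txt")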
As of today, the best solution is probably AsyncSSH
https://asyncssh.readthedocs.io/en/latest/#scp-client
async with asyncssh.connect('host.tld') as conn:
    await asyncssh.scp((conn, 'example.txt'), '.', recurse=True)
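Since the snippet has to run inside a coroutine, a self-contained version might look like this (the host name and file name are placeholders; authentication falls back to your usual SSH keys/agent):

import asyncio
import asyncssh

async def copy_files():
    # Connect, then copy example.txt from the remote host into the current directory.
    async with asyncssh.connect('host.tld') as conn:
        await asyncssh.scp((conn, 'example.txt'), '.', recurse=True)

asyncio.run(copy_files())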
You can use the package subprocess and the command call to use the scp command from the shell.
from subprocess import call
cmd = "scp user1#host1:files user2#host2:files"
call(cmd.split(" "))
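If the paths can contain spaces, it is safer to pass the arguments as a list instead of splitting a string (the same placeholder user and host names):

from subprocess import check_call

check_call(["scp", "user1@host1:files", "user2@host2:files"])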
Have a look at fabric.transfer.
from fabric import Connection
with Connection(host="hostname",
                user="admin",
                connect_kwargs={"key_filename": "/home/myuser/.ssh/private.key"}
                ) as c:
    c.get('/foo/bar/file.txt', '/tmp/')
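Uploads work the same way inside that with block; a hedged one-liner with placeholder paths:

# Upload a local file to the remote host (still using the connection c).
c.put('/tmp/file.txt', '/foo/bar/file.txt')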
It has been quite a while since this question was asked, and in the meantime, another library that can handle this has cropped up:
You can use the copy function included in the Plumbum library:
import plumbum
r = plumbum.machines.SshMachine("example.net")
# this will use your ssh config as `ssh` from shell
# depending on your config, you might also need additional
# params, eg: `user="username", keyfile=".ssh/some_key"`
fro = plumbum.local.path("some_file")
to = r.path("/path/to/destination/")
plumbum.path.utils.copy(fro, to)
If you are on *nix you can use sshpass
sshpass -p password scp -o User=username -o StrictHostKeyChecking=no src dst:/path
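The same command can be driven from Python with subprocess; a sketch using the same placeholder values:

import subprocess

subprocess.check_call([
    "sshpass", "-p", "password",
    "scp", "-o", "User=username", "-o", "StrictHostKeyChecking=no",
    "src", "dst:/path",
])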
Hmmm, perhaps another option would be to use something like sshfs (there's an sshfs for Mac too). Once your router is mounted you can just copy the files outright. I'm not sure if that works for your particular application, but it's a nice solution to keep handy.
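A sketch of that approach on Linux, assuming sshfs and fusermount are installed (the mount point and remote host are placeholders):

import shutil
import subprocess

# Mount the router's filesystem, copy the file with ordinary file operations,
# then unmount again.
subprocess.check_call(["sshfs", "user@router:/", "/mnt/router"])
try:
    shutil.copy("/etc/local/filename", "/mnt/router/etc/remote/filename")
finally:
    subprocess.check_call(["fusermount", "-u", "/mnt/router"])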
A while ago I put together a Python SCP copy script that depends on paramiko. It includes code to handle connections with a private key or an SSH key agent, with a fallback to password authentication.
http://code.activestate.com/recipes/576810-copy-files-over-ssh-using-paramiko/
Related
Please pardon me as I am very new to programming. I have around 25 network devices, a combination of Cisco, Juniper, Linux, etc., which I need to access remotely and run some basic CLI commands on to get the output. SSHing into the devices individually would take a long time. Can someone tell me where to start with a basic script?
Try the following:
pip install paramiko
then in your script:
import base64
import paramiko
key = paramiko.RSAKey(data=base64.b64decode(b'AAA...'))
client = paramiko.SSHClient()
client.get_host_keys().add('ssh.example.com', 'ssh-rsa', key)
client.connect('ssh.example.com', username='strongbad', password='thecheat')
def run_command(command):
    stdin, stdout, stderr = client.exec_command(command)
    for line in stdout:
        print('... ' + line.strip('\n'))
    return True

run_command('ls')
run_command('cd ..')
run_command('apt-get update')
client.close()
You can use Netmiko or NAPALM.
These two Python libraries support devices from almost all of the different vendors.
https://napalm.readthedocs.io/en/latest/index.html
https://pynet.twb-tech.com/blog/automation/netmiko.html
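For example, a minimal Netmiko sketch that runs one show command on one device; the device details are placeholders, and device_type has to match the vendor (e.g. cisco_ios, juniper_junos, linux):

from netmiko import ConnectHandler

device = {
    "device_type": "cisco_ios",
    "host": "192.0.2.10",
    "username": "admin",
    "password": "secret",
}

# Open the SSH session, run a command, print the output, and disconnect.
conn = ConnectHandler(**device)
output = conn.send_command("show ip interface brief")
print(output)
conn.disconnect()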
Hello!
Situation: connect to destination.host over jump.host and run a command on destination.host which, in the background, connects to another.host (my SSH key is needed on that host).
Scheme: client --> jump.host --> destination.host --- remote command, which needs my SSH key on the other host --> another.host
#!/usr/bin/python
import paramiko
jumpHost=paramiko.SSHClient()
sshKey = paramiko.RSAKey.from_private_key_file('path.to.key/file', password = 'the.passphrase')
jumpHost.set_missing_host_key_policy(paramiko.AutoAddPolicy())
jumpHost.connect('jump.hostname',username='foo', pkey = sshKey)
jumpHostTransport = jumpHost.get_transport()
dest_addr = ('destination.hostname', 22)
local_addr = ('jump.hostname', 22)
jumpHostChannel = jumpHostTransport.open_channel("direct-tcpip", dest_addr, local_addr)
destHost=paramiko.SSHClient()
destHost.set_missing_host_key_policy(paramiko.AutoAddPolicy())
destHost.connect('destination.hostname', username='foo', sock=jumpHostChannel, pkey=sshKey)
destHostAgentSession = destHost.get_transport().open_session()
paramiko.agent.AgentRequestHandler(destHostAgentSession)
stdin, stdout, stderr = destHost.exec_command("my.command.which.connects.to.another.host")
print(stdout.read())
print(stderr.read())
destHost.close()
jumpHost.close()
The above code works well if I run "local" commands on destination.host - e.g. uname, whoami, hostname, ls and so on. But if I run a command which connects in the background to another host where my SSH key is needed, the code raises this error:
raise AuthenticationException("Unable to connect to SSH agent")
paramiko.ssh_exception.AuthenticationException: Unable to connect to SSH agent
If I connect via PuTTY over the same chain, it works fine.
Can anyone give me a hint on how to resolve my problem?
Thanks in advance.
Assumption: Your keys work across jump host and destination host.
Creating a local agent in that case will work. You could manually create it via the shell first and test it via IPython.
eval `ssh-agent`; ssh-add <my-key-file-path>
Programmatically, this can be done as follows:
# Using shell=True is not a great idea because it is a security risk.
# Refer this post - https://security.openstack.org/guidelines/dg_avoid-shell-true.html
subprocess.check_output("eval `ssh-agent`; ssh-add <my-key-file-path>", shell=True)
I am trying to do something similar and came across this post, I will update if I find a better solution.
EDIT: I have posted the implementation over here - https://adikrishnan.in/2018/10/25/agent-forwarding-with-paramiko/
I have been trying to scp a file to a remote computer using a password. I used this code:
import os
import scp
client = scp.Client(host="104.198.152.xxx", username="nxxx", password="xxxxxx")
client.transfer("script.py", "~/script.py")
as it's suggested in How to scp in python?, but it outputs:
File "script.py", line 5, in <module>
client = scp.Client(host="104.198.152.153", username="nazarihome", password="mohMOH13579")
AttributeError: 'module' object has no attribute 'Client'
I also tried other ways that people suggest, and it seems that none of them works. Does anybody have a suggestion that really works?
P.S. I have to use a password, not a key, in case your answer depends on that.
The scp.py GitHub page has the following example, which uses it together with the paramiko library for handling SSH:
from paramiko import SSHClient
from scp import SCPClient
ssh = SSHClient()
ssh.load_system_host_keys()
ssh.connect(hostname='ip',
            port='port',
            username='username',
            password='password',
            pkey='load_key_if_relevant')
# SCPClient takes a paramiko transport as its only argument
scp = SCPClient(ssh.get_transport())
scp.put('file_path_on_local_machine', 'file_path_on_remote_machine')
scp.get('file_path_on_remote_machine', 'file_path_on_local_machine')
scp.close()
So the actual class you want is scp.SCPClient.
This is working as of January 2019:
Install required Python packages:
pip install scp
pip install paramiko
Include library in the code:
from paramiko import SSHClient
from scp import SCPClient
I wrote a function for it:
import logging  # used below; import it alongside the paramiko/scp imports

# SSH/SCP a directory recursively
def ssh_scp_files(ssh_host, ssh_user, ssh_password, ssh_port, source_volume, destination_volume):
    logging.info("In ssh_scp_files() method, to copy the files to the server")
    ssh = SSHClient()
    ssh.load_system_host_keys()
    ssh.connect(ssh_host, username=ssh_user, password=ssh_password, look_for_keys=False)
    with SCPClient(ssh.get_transport()) as scp:
        scp.put(source_volume, recursive=True, remote_path=destination_volume)
Now call it anywhere you want in the code:
ssh_scp_files(ssh_host, ssh_user, ssh_password, ssh_port, source_volume, destination_volume)
If all of the above is implemented correctly, you will see success messages in the console/logs.
I wanted to know whether it is possible to use a Python script to connect to a router and control the interface (shut down, restart the wireless network, etc.) over an SSH connection.
So far I have written the lines below, but it still does not work. When I look at the terminal I see that everything gets stuck at the point where my script should echo the password for the router to finalize the connection. How can I correct this, please?
Here are the lines:
import os, urllib, urllib2, re
def InterfaceControl():
    #os.system("echo training")
    os.system("ssh -l root 192.168.2.1")
    os.system("echo yes")
    os.system("echo My_ROUTER_PASSWORD")
    os.system("shutdown -r")

def main():
    InterfaceControl()

if __name__=="__main__":
    main()
Thank you so much in advance
You can use paramiko, a Python library that abstracts remote shell connections over SSH, with several options that let users authenticate with RSA keys, etc. This is sample code you can reuse to solve your problem:
import paramiko
ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect('hostname', username='username', password='password')
ssh.exec_command('ls -al')
By the way paramiko can be easily added to your python environment if you're running your script from a virtual environment (virtualenv).
plumbum is what you're looking for (remote commands).
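A small sketch of running remote commands with plumbum's SshMachine; the host and user are placeholder values, and your SSH config/keys are used for authentication:

from plumbum import SshMachine

rem = SshMachine("192.168.2.1", user="root")
print(rem["uname"]("-a"))   # run a single remote command and capture its output
rem.close()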
I have 30 virtual machines, and I am doing everything manually right now, which is quite troublesome for me. I am now trying to write a script so I can connect to all of them automatically and do my desired task. I made a flow diagram of what I need to do. Can anybody give me some hints on how I can achieve this task programmatically? I am attaching the flow diagram.
Thanks in advance.
Reading a text file and getting the data is trivial:
with open('host.txt', 'r') as inf:
    lines = inf.readlines()
    hostlist = [ln.split() for ln in lines]
Now hostlist should be a list of lists;
[['192.168.0.23', 'root', 'secret'], ['192.168.0.24', 'root', 'secret2'] ...
But your list shouldn't have to contain more than the hostnames. IP addresses can be obtained from DNS, which you have to configure anyway, and ssh can log in without passwords if configured correctly.
Putting the passwords for all your virtual hosts in a plain text file has security concerns. If you want to go that route, make sure to restrict access to that file!
You can use subprocess to execute commands. I would suggest using rsync to push the desired files to the virtual machines. That minimizes network traffic. And you can deploy directly from filesystem to filesystem without having to roll a tarball. It can be as simple as
status = subprocess.check_output(['rsync', '-av', localdir, remotedir])
Where localdir is the directory where the files for the virtual machine in question are stored (it should end with a '/'), and 'remotedir' is the hostname::directory on the virtual machine where the data should land (this should not end with a '/').
For executing commands remotely, ssh is the way to go. Configure that for passwordless login, using ssh's authorized_keys file on each remote host. Then you don't need to put passwords in your list of hosts.
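Once passwordless login is set up, running a remote command from Python is a one-liner; a sketch with a placeholder host and command:

import subprocess

# Run a command on the remote host over ssh and capture its output.
output = subprocess.check_output(["ssh", "root@192.168.0.23", "uptime"])
print(output.decode())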
Fabric is the best solution for you. Fabric is built on top of paramiko; it makes it very easy to run commands on a remote host and provides functions to upload and download files from the remote host.
Here it is: http://docs.fabfile.org/en/1.5/
Docs about put function here
I don't quite get your question from the flow diagram; however, you can use paramiko as suggested. I have a large number of background utilities written on top of paramiko which enable support people to monitor remote web servers in the browser. Snippet below:
client = paramiko.SSHClient()
client.set_missing_host_key_policy( paramiko.AutoAddPolicy() )
client.load_system_host_keys()
client.connect( '192.168.100.1', port=4001, username='monitor', password='XXXX' )
cmds = [ "sed -i 's/\/bin\/date -u/\/bin\/date/g' /etc/cron.hourly/reboot" ]
for cmd in cmds:
if __DEBUG_MODE__:
print 'Executing..... ' + cmd
stdin, stdout, stderr = client.exec_command( cmd )
Also, if you want to push files, below is the snippet:
def setupSFTPClient(self, ip_add):
    print('Setting SFTP client: ' + ip_add)
    tb = 'Finished SFTP-ing. '
    try:
        client = paramiko.SSHClient()
        client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
        client.load_system_host_keys()
        client.connect(ip_add, port=4001, username='monitor', password='XXXX')
        sftp_client = client.open_sftp()
        # NB you need the filename in the remote path, otherwise
        # paramiko barfs with IOError: Failure
        sftp_client.put('/home/projects/portal_release.tgz', '/var/ND/portal_release.tgz')
        sftp_client.put('/home/projects/portal_installer.sh', '/var/ND/portal_installer.sh')
        sftp_client.close()
    except Exception as e:
        print(e)
        tb = traceback.format_exc()
    finally:
        print(tb)
        client.close()