SCP in Python using a password - python

I have been trying to scp a file to a remote computer using a password. I used this code:
import os
import scp
client = scp.Client(host="104.198.152.xxx", username="nxxx", password="xxxxxx")
client.transfer("script.py", "~/script.py")
as suggested in How to scp in python?, but it outputs:
File "script.py", line 5, in <module>
client = scp.Client(host="104.198.152.153", username="nazarihome", password="mohMOH13579")
AttributeError: 'module' object has no attribute 'Client'
I also tried other ways that people suggest, but it seems that none of them works. Does anybody have a suggestion that really works?
P.S. I have to use a password, not a key, if your answer depends on that.

The scp.py GitHub page has the following example that uses it together with the paramiko library for handling SSH:
from paramiko import SSHClient
from scp import SCPClient

ssh = SSHClient()
ssh.load_system_host_keys()
ssh.connect(hostname='ip',
            port='port',
            username='username',
            password='password',
            pkey='load_key_if_relevant')

# SCPClient takes a paramiko transport as its only argument
scp = SCPClient(ssh.get_transport())

scp.put('file_path_on_local_machine', 'file_path_on_remote_machine')
scp.get('file_path_on_remote_machine', 'file_path_on_local_machine')

scp.close()
So the actual class you want is scp.SCPClient.
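Applied to the question's password-only requirement, a minimal sketch could look like this (host and credentials are the question's own placeholders; AutoAddPolicy is an assumption so the first connection does not fail on an unknown host key):
from paramiko import SSHClient, AutoAddPolicy
from scp import SCPClient

ssh = SSHClient()
ssh.set_missing_host_key_policy(AutoAddPolicy())
ssh.connect(hostname='104.198.152.xxx', username='nxxx', password='xxxxxx')

with SCPClient(ssh.get_transport()) as scp:
    scp.put('script.py', 'script.py')  # second argument is the remote path, relative to the home directory
ssh.close()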

This is working as of Jan 2019:
Install required Python packages:
pip install scp
pip install paramiko
Include the libraries in the code:
from paramiko import SSHClient
from scp import SCPClient
Write a function for it:
# SSH/SCP a directory recursively
import logging

def ssh_scp_files(ssh_host, ssh_user, ssh_password, ssh_port, source_volume, destination_volume):
    logging.info("In ssh_scp_files() method, copying the files to the server")
    ssh = SSHClient()
    ssh.load_system_host_keys()
    ssh.connect(ssh_host, port=ssh_port, username=ssh_user,
                password=ssh_password, look_for_keys=False)
    with SCPClient(ssh.get_transport()) as scp:
        scp.put(source_volume, recursive=True, remote_path=destination_volume)
Now call it anywhere you want in the code:
ssh_scp_files(ssh_host, ssh_user, ssh_password, ssh_port, source_volume, destination_volume)
If all of the above is implemented correctly, you will see successful messages in the console/logs.

Related

How to send data from a file in a Python docker container to remote SFTP server?

I have a Python script I am trying to run in a Docker container to send a file that is on this container to an SFTP server.
I tried the following:
import paramiko
ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
session = ssh.connect(hostname="X", port=X, username='X', password="X")
stdin, stdout, stderr = ssh.exec_command('sftp -P X ldzdl@hostname', get_pty=True)
I also tried the paramiko transport method, but it didn't work from remote (Docker container) to remote SFTP.
But I get the following error: paramiko.ssh_exception.AuthenticationException: Authentication failed.
How can I do this? I don't know if my method is okay or if there is a better way to solve it (sending data from a container to an SFTP server).
The argument given to the exec_command function is not a command you would normally run in the local (client) host's shell, but rather one attempted to be run on the remote (server) host. While it is not likely to get an AuthenticationException by attempting to run remote commands, you did not post a full traceback in the question, so it is hard to tell for sure.
I suggest checking the following code:
import paramiko
ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
session = ssh.connect(hostname="X", port=X, username='X', password="X")
### so far, same code as in the question ###
print("Auth OK")
sftp_connection = ssh.open_sftp()
sftp_connection.put("local/file/path", "remote/file/path")
If you see the "Auth OK" print - then you should be good to go, just replace the file path arguments of the sftp_connection.put() method with actual local and remote file paths.
Otherwise - there is an actual authentication issue which should be resolved.
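To isolate that, here is a hedged variant of the same check that catches the failure explicitly (connection details are placeholders):
import paramiko

ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
try:
    ssh.connect(hostname="X", port=22, username="X", password="X")
    print("Auth OK")
except paramiko.AuthenticationException:
    print("Auth failed - wrong username/password, or the server expects key authentication")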

Paramiko asking for password when trying to send a file via scp [duplicate]

What's the most pythonic way to scp a file in Python? The only route I'm aware of is
os.system('scp "%s" "%s:%s"' % (localfile, remotehost, remotefile) )
which is a hack, and which doesn't work outside Linux-like systems, and which needs help from the Pexpect module to avoid password prompts unless you already have passwordless SSH set up to the remote host.
I'm aware of Twisted's conch, but I'd prefer to avoid implementing scp myself via low-level ssh modules.
I'm aware of paramiko, a Python module that supports SSH and SFTP; but it doesn't support SCP.
Background: I'm connecting to a router which doesn't support SFTP but does support SSH/SCP, so SFTP isn't an option.
EDIT:
This is a duplicate of How to copy a file to a remote server in Python using SCP or SSH?. However, that question doesn't give an scp-specific answer that deals with keys from within Python. I'm hoping for a way to run code kind of like
import scp
client = scp.Client(host=host, user=user, keyfile=keyfile)
# or
client = scp.Client(host=host, user=user)
client.use_system_keys()
# or
client = scp.Client(host=host, user=user, password=password)
# and then
client.transfer('/etc/local/filename', '/etc/remote/filename')
Try the Python scp module for Paramiko. It's very easy to use. See the following example:
import paramiko
from scp import SCPClient

def createSSHClient(server, port, user, password):
    client = paramiko.SSHClient()
    client.load_system_host_keys()
    client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    client.connect(server, port, user, password)
    return client

ssh = createSSHClient(server, port, user, password)
scp = SCPClient(ssh.get_transport())
Then call scp.get() or scp.put() to do SCP operations.
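For example (file paths are placeholders):
scp.put('local_file.txt', 'remote_file.txt')  # upload
scp.get('remote_file.txt', 'local_copy.txt')  # download
scp.close()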
(SCPClient code)
You might be interested in trying Pexpect (source code). This would allow you to deal with interactive prompts for your password.
Here's a snippet of example usage (for ftp) from the main website:
# This connects to the openbsd ftp site and
# downloads the recursive directory listing.
import pexpect
child = pexpect.spawn('ftp ftp.openbsd.org')
child.expect('Name .*: ')
child.sendline('anonymous')
child.expect('Password:')
child.sendline('noah@example.com')
child.expect('ftp> ')
child.sendline('cd pub')
child.expect('ftp> ')
child.sendline('get ls-lR.gz')
child.expect('ftp> ')
child.sendline('bye')
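The same pattern should carry over to scp itself - a hedged sketch, assuming password authentication and placeholder host/paths:
import pexpect

child = pexpect.spawn('scp /etc/local/filename user@host:/etc/remote/filename')
child.expect('password:')
child.sendline('your_password')
child.expect(pexpect.EOF)  # wait for the transfer to finish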
Couldn't find a straight answer, and this "scp.Client" module doesn't exist.
Instead, this suits me:
from paramiko import SSHClient
from scp import SCPClient

ssh = SSHClient()
ssh.load_system_host_keys()
ssh.connect('example.com')

with SCPClient(ssh.get_transport()) as scp:
    scp.put('test.txt', 'test2.txt')
    scp.get('test2.txt')
You could also check out paramiko. There's no scp module (yet), but it fully supports sftp.
[EDIT]
Sorry, missed the line where you mentioned paramiko.
The following module is simply an implementation of the scp protocol for paramiko.
If you don't want to use paramiko or conch (the only ssh implementations I know of for python), you could rework this to run over a regular ssh session using pipes.
scp.py for paramiko
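As a rough illustration of the "regular ssh session using pipes" idea mentioned above (this streams the file through ssh rather than speaking the scp protocol; host and paths are placeholders):
import subprocess

# send a local file to the remote host by piping it into `cat` over ssh
with open('/etc/local/filename', 'rb') as f:
    subprocess.check_call(['ssh', 'user@host', 'cat > /etc/remote/filename'], stdin=f)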
import paramiko

client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect('<IP Address>', username='<User Name>', password='', key_filename='<.PEM file path>')

# Set up the SFTP connection and transmit this script
print("copying")
sftp = client.open_sftp()
sftp.put('<Source>', '<Destination>')
sftp.close()
If you install PuTTY on Win32 you get pscp (PuTTY scp), so you can use the os.system hack on Win32 too (and you can use the PuTTY agent for key management).
Sorry, it is only a hack, but you can wrap it in a Python class, as sketched below.
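A minimal sketch of such a wrapper, assuming pscp.exe is on your PATH (the function name is illustrative; check pscp's own documentation for the flags):
import subprocess

def pscp_copy(local_path, remote_spec, password=None):
    # -batch disables interactive prompts; -pw passes the password
    cmd = ['pscp', '-batch']
    if password:
        cmd += ['-pw', password]
    subprocess.check_call(cmd + [local_path, remote_spec])

pscp_copy('file.txt', 'user@host:/remote/path', password='secret')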
As of today, the best solution is probably AsyncSSH
https://asyncssh.readthedocs.io/en/latest/#scp-client
import asyncssh  # note: this snippet must run inside a coroutine, e.g. via asyncio.run()

async with asyncssh.connect('host.tld') as conn:
    await asyncssh.scp((conn, 'example.txt'), '.', recurse=True)
You can use the subprocess package and its call function to run the scp command from the shell.
from subprocess import call

cmd = "scp user1@host1:files user2@host2:files"
call(cmd.split(" "))
Have a look at fabric.transfer.
from fabric import Connection

with Connection(host="hostname",
                user="admin",
                connect_kwargs={"key_filename": "/home/myuser/.ssh/private.key"}
                ) as c:
    c.get('/foo/bar/file.txt', '/tmp/')
It has been quite a while since this question was asked, and in the meantime, another library that can handle this has cropped up:
You can use the copy function included in the Plumbum library:
import plumbum

r = plumbum.machines.SshMachine("example.net")
# this will use your ssh config as `ssh` from shell
# depending on your config, you might also need additional
# params, e.g.: `user="username", keyfile=".ssh/some_key"`

fro = plumbum.local.path("some_file")
to = r.path("/path/to/destination/")

plumbum.path.utils.copy(fro, to)
If you are on *nix you can use sshpass
sshpass -p password scp -o User=username -o StrictHostKeyChecking=no src dst:/path
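Called from Python, that might look like this hedged sketch (same arguments as the shell line above):
import subprocess

subprocess.check_call(['sshpass', '-p', 'password', 'scp',
                       '-o', 'User=username', '-o', 'StrictHostKeyChecking=no',
                       'src', 'dst:/path'])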
Hmmm, perhaps another option would be to use something like sshfs (there is an sshfs for Mac too). Once your router is mounted you can just copy the files outright. I'm not sure if that works for your particular application, but it's a nice solution to keep handy.
A while ago I put together a Python SCP copy script that depends on paramiko. It includes code to handle connections with a private key or SSH key agent, with a fallback to password authentication.
http://code.activestate.com/recipes/576810-copy-files-over-ssh-using-paramiko/
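The fallback logic is roughly the following (a hedged sketch, not the recipe's exact code; host and user are placeholders):
import paramiko

ssh = paramiko.SSHClient()
ssh.load_system_host_keys()
try:
    # try the SSH agent and default private keys first
    ssh.connect('host', username='user')
except paramiko.AuthenticationException:
    # fall back to password authentication
    ssh.connect('host', username='user', password='password', look_for_keys=False)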

Upload file via SFTP with Python

I wrote some simple code to upload a file to an SFTP server in Python. I am using Python 2.7.
import pysftp

srv = pysftp.Connection(host="www.destination.com", username="root",
                        password="password", log="./temp/pysftp.log")
srv.cd('public')  # chdir to public
srv.put('C:\Users\XXX\Dropbox\test.txt')  # upload file to nodejs/
# Closes the connection
srv.close()
The file did not appear on the server. However, no error message appeared. What is wrong with the code?
I have enabled logging. I discovered that the file is uploaded to the root folder and not under the public folder. It seems like srv.cd('public') did not work.
I found the answer to my own question.
import pysftp

srv = pysftp.Connection(host="www.destination.com", username="root",
                        password="password", log="./temp/pysftp.log")
with srv.cd('public'):  # chdir to public
    srv.put('C:\Users\XXX\Dropbox\test.txt')  # upload file to nodejs/
# Closes the connection
srv.close()
Put the srv.put inside the with srv.cd block.
Do not use pysftp; it's dead. Use Paramiko directly. See also pysftp vs. Paramiko.
The code with Paramiko will be pretty much the same, except for the initialization part.
import paramiko

with paramiko.SSHClient() as ssh:
    ssh.load_system_host_keys()
    ssh.connect(host, username=username, password=password)

    sftp = ssh.open_sftp()
    sftp.chdir('public')
    sftp.put('C:\Users\XXX\Dropbox\test.txt', 'test.txt')
To answer the literal OP's question: the key point here is that pysftp Connection.cd works as a context manager (so its effect is discarded without with statement), while Paramiko SFTPClient.chdir does not.
import pysftp

with pysftp.Connection(host="www.destination.com", username="root",
                       password="password", log="./temp/pysftp.log") as sftp:
    sftp.cwd('/root/public')  # The full path
    sftp.put('C:\Users\XXX\Dropbox\test.txt')  # Upload the file
No sftp.close() is needed, because the connection is closed automatically at the end of the with-block.
I made a minor change: cd to cwd.
Syntax -
# sftp.put('/my/local/filename') # upload file to public/ on remote
# sftp.get('remote_file') # get a remote file

Python: How can I connect from my local PC to remote A to remote B to remote C using Paramiko

I would like to know how to jump from one remote server to another using paramiko. I want to ssh from my local PC to remote-A, then from remote-A to remote-B, and from remote-B to remote-C.
import paramiko

def connect(ip, usr, psw):
    client = paramiko.SSHClient()
    client.load_host_keys('/home/dgomez/.ssh/known_hosts')
    client.connect(ip, username=usr, password=psw)
    return client

host1 = connect('192.168.1.2', 'username', 'password')
# Here I'm connected to remote-A
Now I would like to know how I can connect from Remote-A to Remote-B.
Use the pexpect module; it is very useful: http://www.noah.org/wiki/pexpect
pexpect is also wrapped by the simplified pxssh module, which is very good for remote logins: http://dsnra.jpl.nasa.gov/software/Python/site-packages/Contrib/pxssh.html
Simple code:
import pxssh

host = pxssh.pxssh()
host.login('hostname', 'username', 'password')
host.sendline('command')  # e.g. 'ls'
print(host.before)
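If you want to stay within paramiko instead, the usual technique is to open a direct-tcpip channel through the first hop and pass it as the sock argument of the next connect(). A hedged sketch building on the question's connect() helper (the Remote-B address is a placeholder):
# open a tunnel through Remote-A to Remote-B, then connect over it
transport = host1.get_transport()
channel = transport.open_channel('direct-tcpip', ('192.168.1.3', 22), ('127.0.0.1', 0))

host2 = paramiko.SSHClient()
host2.load_host_keys('/home/dgomez/.ssh/known_hosts')
host2.connect('192.168.1.3', username='username', password='password', sock=channel)
# repeat the same pattern from host2's transport to reach Remote-C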

ssh connection to a router with a python script

I wanted to know whether there is a possibility to use a Python script to connect to a router and control the interface (shut down, restart the wireless network, etc.) over an ssh connection.
So far I wrote these lines, but it still does not work. When I look at the terminal I see that everything is blocked at the point where my script should echo the password for the router to finalize the connection. How can I correct this, please?
Here are the lines :
import os, urllib, urllib2, re

def InterfaceControl():
    #os.system("echo training")
    os.system("ssh -l root 192.168.2.1")
    os.system("echo yes")
    os.system("echo My_ROUTER_PASSWORD")
    os.system("shutdown -r")

def main():
    InterfaceControl()

if __name__ == "__main__":
    main()
Thank you so much in advance
You can use paramiko, which is a Python library that abstracts remote shell connections through ssh, with several options such as authentication with RSA keys. This is sample code you can reuse to solve your problem:
import paramiko

ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())  # accept the router's host key on first connect
ssh.connect('hostname', username='username', password='password')
stdin, stdout, stderr = ssh.exec_command('ls -al')
print(stdout.read())
By the way paramiko can be easily added to your python environment if you're running your script from a virtual environment (virtualenv).
plumbum is what you're looking for. (remote commands)
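For instance, a hedged sketch of running a remote command with plumbum's paramiko backend (host and credentials are placeholders):
from plumbum.machines.paramiko_machine import ParamikoMachine

rem = ParamikoMachine('192.168.2.1', user='root', password='password')
print(rem['ls']('-al'))  # runs ls -al on the router and returns its output
rem.close()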
