SFTP in Python? (platform independent)

I'm working on a simple tool that transfers files to a hard-coded location with the password also hard-coded. I'm a python novice, but thanks to ftplib, it was easy:
import ftplib

info = ('someuser', 'password')  # hard-coded

def putfile(file, site, dir, user=(), verbose=True):
    """
    upload a file by ftp to a site/directory
    login hard-coded, binary transfer
    """
    if verbose: print 'Uploading', file
    local = open(file, 'rb')
    remote = ftplib.FTP(site)
    remote.login(*user)
    remote.cwd(dir)
    remote.storbinary('STOR ' + file, local, 1024)
    remote.quit()
    local.close()
    if verbose: print 'Upload done.'

if __name__ == '__main__':
    site = 'somewhere.com'  # hard-coded
    dir = './uploads/'      # hard-coded
    import sys, getpass
    putfile(sys.argv[1], site, dir, user=info)
The problem is that I can't find any library that supports SFTP. What's the normal way to do something like this securely?
Edit: Thanks to the answers here, I've gotten it working with Paramiko and this was the syntax.
import paramiko
host = "THEHOST.com" #hard-coded
port = 22
transport = paramiko.Transport((host, port))
password = "THEPASSWORD" #hard-coded
username = "THEUSERNAME" #hard-coded
transport.connect(username = username, password = password)
sftp = paramiko.SFTPClient.from_transport(transport)
import sys
path = './THETARGETDIRECTORY/' + sys.argv[1] #hard-coded
localpath = sys.argv[1]
sftp.put(localpath, path)
sftp.close()
transport.close()
print 'Upload done.'
Thanks again!

Paramiko supports SFTP. I've used it, and I've used Twisted. Both have their place, but you might find it easier to start with Paramiko.
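If you prefer Paramiko's higher-level client API over the raw Transport shown in the question's edit, a minimal upload sketch could look like this (host, credentials, and paths are placeholders):

import paramiko

ssh = paramiko.SSHClient()
ssh.load_system_host_keys()
ssh.connect("THEHOST.com", username="THEUSERNAME", password="THEPASSWORD")

# open an SFTP session over the existing SSH connection and upload a file
sftp = ssh.open_sftp()
sftp.put("local_file.txt", "./THETARGETDIRECTORY/local_file.txt")
sftp.close()
ssh.close()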

You should check out pysftp (https://pypi.python.org/pypi/pysftp). It depends on paramiko, but wraps the most common use cases in just a few lines of code.
import pysftp
import sys

path = './THETARGETDIRECTORY/' + sys.argv[1]  #hard-coded
localpath = sys.argv[1]
host = "THEHOST.com"          #hard-coded
password = "THEPASSWORD"      #hard-coded
username = "THEUSERNAME"      #hard-coded

with pysftp.Connection(host, username=username, password=password) as sftp:
    sftp.put(localpath, path)

print 'Upload done.'

Here is a sample using pysftp and a private key.
import pysftp

def upload_file(file_path):
    private_key = "~/.ssh/your-key.pem"  # can use password keyword in Connection instead
    srv = pysftp.Connection(host="your-host", username="user-name", private_key=private_key)
    srv.chdir('/var/web/public_files/media/uploads')  # change directory on remote server
    srv.put(file_path)  # to download a file, replace put with get
    srv.close()  # close connection
pysftp is an easy-to-use SFTP module that utilizes paramiko and pycrypto. It provides a simple interface to SFTP. Other things you can do with pysftp that are quite useful:
data = srv.listdir() # Get the directory and file listing in a list
srv.get(file_path) # Download a file from remote server
srv.execute('pwd') # Execute a command on the server
More commands and details about pysftp are covered in its documentation.

If you want easy and simple, you might also want to look at Fabric. It's an automated deployment tool like Ruby's Capistrano, but simpler and of course for Python. It's built on top of Paramiko.
You might not want to do 'automated deployment', but Fabric would suit your use case perfectly nonetheless. To show you how simple Fabric is: the fabfile and command for your script would look like this (not tested, but 99% sure it will work):
fab_putfile.py:
from fabric.api import *

env.hosts = ['THEHOST.com']
env.user = 'THEUSER'
env.password = 'THEPASSWORD'

def put_file(file):
    put(file, './THETARGETDIRECTORY/')  # it's copied into the target directory
Then run the file with the fab command:
fab -f fab_putfile.py put_file:file=./path/to/my/file
And you're done! :)

fsspec is a great option for this: it offers a filesystem-like implementation of SFTP.
from fsspec.implementations.sftp import SFTPFileSystem

fs = SFTPFileSystem(host=host, username=username, password=password)

# list a directory
fs.ls("/")

# open a file
with fs.open(file_name) as file:
    content = file.read()
Also worth noting that fsspec uses paramiko in the implementation.
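Uploading works through the same file-like interface; a minimal sketch (local and remote paths below are placeholders):

from fsspec.implementations.sftp import SFTPFileSystem

fs = SFTPFileSystem(host=host, username=username, password=password)

# stream a local file to the remote host through the SFTP filesystem
with open("report.csv", "rb") as src, fs.open("/upload/report.csv", "wb") as dst:
    dst.write(src.read())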

To verify the server against a known RSA host key, you can pin the key with pysftp's CnOpts. Snippet:
import pysftp
import paramiko
from base64 import decodebytes
keydata = b"""AAAAB3NzaC1yc2EAAAADAQABAAABAQDl"""
key = paramiko.RSAKey(data=decodebytes(keydata))
cnopts = pysftp.CnOpts()
cnopts.hostkeys.add(host, 'ssh-rsa', key)
with pysftp.Connection(host=host, username=username, password=password, cnopts=cnopts) as sftp:
    with sftp.cd(directory):
        sftp.put(file_to_sent_to_ftp)

Twisted can help you with what you are doing, check out their documentation, there are plenty of examples. Also it is a mature product with a big developer/user community behind it.

There are a bunch of answers that mention pysftp, so if you want a context-manager wrapper around pysftp, here is a solution that takes even less code and ends up looking like the following when used:
path = "sftp://user:p#ssw0rd#test.com/path/to/file.txt"
# Read a file
with open_sftp(path) as f:
s = f.read()
print s
# Write to a file
with open_sftp(path, mode='w') as f:
f.write("Some content.")
The (fuller) example: http://www.prschmid.com/2016/09/simple-opensftp-context-manager-for.html
This context manager happens to have auto-retry logic baked in, in case you can't connect on the first try (which happens more often than you'd expect in a production environment...).
The context manager gist for open_sftp: https://gist.github.com/prschmid/80a19c22012e42d4d6e791c1e4eb8515
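The linked gist has the full version with retries; a stripped-down sketch of what such a context manager might look like (the URL parsing and connection defaults here are my assumptions, not the gist's actual code):

import contextlib
from urllib.parse import urlparse

import pysftp

@contextlib.contextmanager
def open_sftp(url, mode='r'):
    # split sftp://user:password@host/path into its pieces
    parts = urlparse(url)
    conn = pysftp.Connection(parts.hostname,
                             username=parts.username,
                             password=parts.password)
    try:
        # hand back a file-like object for the remote path
        with conn.open(parts.path, mode) as f:
            yield f
    finally:
        conn.close()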

Paramiko is slow in my experience. An alternative is to use subprocess and the shell; here is an example using lftp:
import subprocess
import sys

remote_file_name = "filename"
remotedir = "/remote/dir"
localpath = "/local/file/dir"

ftp_cmd_p = """
#!/bin/sh
lftp -u username,password sftp://ip:port <<EOF
cd {remotedir}
lcd {localpath}
get {filename}
EOF
"""

subprocess.call(ftp_cmd_p.format(remotedir=remotedir,
                                 localpath=localpath,
                                 filename=remote_file_name),
                shell=True, stdout=sys.stdout, stderr=sys.stderr)

PyFilesystem with its sshfs is one option. It uses Paramiko under the hood and provides a nicer platform-independent interface on top.
import fs
sf = fs.open_fs("sftp://[user[:password]@]host[:port]/[directory]")
sf.makedir('my_dir')
or
from fs.sshfs import SSHFS
sf = SSHFS(...
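Once the remote filesystem is open (with either constructor), uploading is just a call to the generic FS API; a small sketch, with host, credentials, and file names as placeholders:

import fs

sftp_fs = fs.open_fs("sftp://user:password@host/")

# upload a local file with the FS upload() helper
with open("report.csv", "rb") as local_file:
    sftp_fs.upload("report.csv", local_file)

sftp_fs.close()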

Here's a generic function that will download any given SFTP URL to a specified path.

from urllib.parse import urlparse
import paramiko

url = 'sftp://username:password@hostname/filepath.txt'

def sftp_download(url, dest):
    url = urlparse(url)
    with paramiko.Transport((url.hostname, 22)) as transport:
        transport.connect(None, url.username, url.password)
        with paramiko.SFTPClient.from_transport(transport) as sftp:
            sftp.get(url.path, dest)
Call it with
sftp_download(url, "/tmp/filepath.txt")
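An upload counterpart can follow the same pattern (a sketch along the same lines, reusing the imports above; the function name and paths are mine, not part of the original answer):

def sftp_upload(local_path, url):
    url = urlparse(url)
    with paramiko.Transport((url.hostname, url.port or 22)) as transport:
        transport.connect(None, url.username, url.password)
        with paramiko.SFTPClient.from_transport(transport) as sftp:
            sftp.put(local_path, url.path)

sftp_upload("/tmp/filepath.txt", url)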

Related

Downloading files using wildcard from SFTP server using Python Paramiko

I have working code that works fine when I specify the file name exactly as it appears in the folder. But these files have a date and time appended to their names. How can I build the file path so it still matches? Below is my code.
import paramiko
import os
paramiko.util.log_to_file('logfile.log')
host = "ftp.servername.com"
port = 22
transport = paramiko.Transport((host, port))
password = "mypass"
username = "myuser"
transport.connect(username=username, password=password)
sftp = paramiko.SFTPClient.from_transport(transport)
filepath = '/Home/user/Automation_2021-07-14_170139.csv'
localpath = 'file/dkc.csv'
sftp.get(filepath, localpath)
sftp.close()
transport.close()
How can I pass the * in the filepath? I wanted to make it like below.
filepath = '/Home/user/Automation_*.csv'
localpath = 'file/dkc.csv'
sftp.get(filepath, localpath)
You cannot pass a wildcard directly to Paramiko's SFTPClient.get.
You have to first find out the exact name and then use it with the get.
See List files on SFTP server matching wildcard in Python using Paramiko
The code below helped me fix the issue.

latest = 0
latestfile = None

for fileattr in sftp.listdir_attr():
    if fileattr.filename.startswith('Automation_DKC') and fileattr.st_mtime > latest:
        latest = fileattr.st_mtime
        latestfile = fileattr.filename
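The name found by the loop can then be passed to get; a short follow-up sketch (it assumes the listing was done in the directory that holds the files, and reuses the local path from the question):

if latestfile is not None:
    sftp.get(latestfile, 'file/dkc.csv')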

FTPHost.walk in ftputil connected to FTPS server not returning anything

I'm trying to get a list of files and paths under a directory in an FTP site using ftputil's walk method:
import ftputil
from ftplib import FTP_TLS

host = 'my_host'
user = 'my_user'
pw = 'my_pw'
folder = '/my/dir'

ftp = ftputil.FTPHost(host, user, pw, session_factory=FTP_TLS)

for root, dirs, files in ftp.walk(folder):
    print(root, dirs, files)
However, nothing is printed. ftp.walk(folder) does return a generator object, but nothing is being generated. What am I missing? Maybe I'm not handling the TLS connection right (although I don't get any error)?
I needed to run prot_p as part of setting up the session:
class TLSFTPSession(FTP_TLS):
    def __init__(self, host, userid, password):
        FTP_TLS.__init__(self)
        # self.set_debuglevel(2)
        self.connect(host, 21)
        self.login(userid, password)
        self.prot_p()

ftp = ftputil.FTPHost(host, user, pw, session_factory=TLSFTPSession)
Then it works!
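If I remember the ftputil API correctly, the library also ships a session factory helper that performs the prot_p() call for you via its encrypt_data_channel option, so the custom class above could be replaced with something like this sketch (parameter names are taken from the ftputil documentation as I recall it):

import ftplib
import ftputil
import ftputil.session

tls_session = ftputil.session.session_factory(base_class=ftplib.FTP_TLS,
                                              encrypt_data_channel=True)

with ftputil.FTPHost(host, user, pw, session_factory=tls_session) as ftp_host:
    for root, dirs, files in ftp_host.walk(folder):
        print(root, dirs, files)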

Python ftplib upload text file with commands to server: error 502 command not recognized [duplicate]

I would like to make a script to upload a file to FTP.
How would the login system work? I'm looking for something like this:
ftp.login=(mylogin)
ftp.pass=(mypass)
And any other sign in credentials.
Using ftplib, you can write it like this:
import ftplib
session = ftplib.FTP('server.address.com','USERNAME','PASSWORD')
file = open('kitten.jpg','rb') # file to send
session.storbinary('STOR kitten.jpg', file) # send the file
file.close() # close file and FTP
session.quit()
Use ftplib.FTP_TLS instead if your FTP host requires TLS.
To retrieve a file, you can use urllib.urlretrieve:
import urllib
urllib.urlretrieve('ftp://server/path/to/file', 'file')
EDIT:
To find out the current directory, use FTP.pwd():
FTP.pwd(): Return the pathname of the current directory on the server.
To change the directory, use FTP.cwd(pathname):
FTP.cwd(pathname): Set the current directory on the server.
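A quick illustration using the session from the example above (the directory name is a placeholder):

print(session.pwd())      # e.g. '/'
session.cwd('/uploads')   # change into the uploads directory
print(session.pwd())      # now '/uploads'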
ftplib now supports context managers, so I guess it can be made even easier:

from ftplib import FTP
from pathlib import Path

file_path = Path('kitten.jpg')

with FTP('server.address.com', 'USER', 'PWD') as ftp, open(file_path, 'rb') as file:
    ftp.storbinary(f'STOR {file_path.name}', file)

No need to close the file or the session.
You will most likely want to use the ftplib module for Python.

import ftplib

ftp = ftplib.FTP()
host = "ftp.site.uk"
port = 21
ftp.connect(host, port)
print(ftp.getwelcome())

try:
    print("Logging in...")
    ftp.login("yourusername", "yourpassword")
except ftplib.all_errors:
    print("failed to login")

This logs you into an FTP server. What you do from there is up to you. Your question doesn't indicate any other operations that really need doing.
Try this:
#!/usr/bin/env python
import os
import paramiko
ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect('hostname', username="username", password="password")
sftp = ssh.open_sftp()
localpath = '/home/e100075/python/ss.txt'
remotepath = '/home/developers/screenshots/ss.txt'
sftp.put(localpath, remotepath)
sftp.close()
ssh.close()
To avoid the encryption error, you can also try out the commands below:
ftp = ftplib.FTP_TLS("ftps.dummy.com")
ftp.login("username", "password")
ftp.prot_p()
file = open("filename", "rb")
ftp.storbinary("STOR filename", file)
file.close()
ftp.close()
ftp.prot_p() ensures that your connections are encrypted.
I just answered a similar question here
IMHO, if your FTP server is able to communicate with Fabric, please use Fabric. It is far better than doing raw FTP.
I have an FTP account from dotgeek.com so I am not sure if this will work for other FTP accounts.
#!/usr/bin/python
from fabric.api import run, env, sudo, put

env.user = 'username'
env.hosts = ['ftp_host_name',]  # such as ftp.google.com

def copy():
    # assuming I have wong_8066.zip in the same directory as this script
    put('wong_8066.zip', '/www/public/wong_8066.zip')
Save the file as fabfile.py and run fab copy locally.
yeukhon#yeukhon-P5E-VM-DO:~$ fab copy2
[1.ai] Executing task 'copy2'
[1.ai] Login password:
[1.ai] put: wong_8066.zip -> /www/public/wong_8066.zip
Done.
Disconnecting from 1.ai... done.
Once again, if you don't want to enter the password every time, just add:
env.password = 'my_password'
You can use the function below. I haven't tested it yet, but it should work fine. Remember the destination is a directory path, whereas the source is a complete file path.
import ftplib
import os

def uploadFileFTP(sourceFilePath, destinationDirectory, server, username, password):
    myFTP = ftplib.FTP(server, username, password)
    if destinationDirectory not in [name for name, data in list(myFTP.mlsd())]:
        print("Destination directory does not exist. Creating it first.")
        myFTP.mkd(destinationDirectory)
    # Changing working directory
    myFTP.cwd(destinationDirectory)
    if os.path.isfile(sourceFilePath):
        fh = open(sourceFilePath, 'rb')
        myFTP.storbinary('STOR %s' % os.path.basename(sourceFilePath), fh)
        fh.close()
    else:
        print("Source file does not exist")

Does File Exist on Client Python TCP server

I am trying to make a TCP port server in Python. Here is my code so far:

import socket

sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
sock.bind(('', 4000))
sock.listen(1)

while 1:
    client, address = sock.accept()
    fileexists = client.RUNCOMMAND(does the file exist?)
    if fileexists == 0:
        client.close()
    else:
        filedata = client.RUNCOMMAND(get the contents of the file)
        if filedata == "abcdefgh":
            client.send('Transfer file accepted.')
        else:
            client.send('Whoops, seems like you have a corrupted file!')
            client.close()
I just have no idea how to run a command (RUNCOMMAND) that would check whether the file exists on the client.
Also, is there a way to check what operating system the client is on, so I can run different commands (e.g. Linux will have a different file-finding command than Windows)? I totally understand if this isn't possible, but I am really hoping there is a way to do this.
Thank you very much.
XMLRPC may help you.
XML-RPC is a Remote Procedure Call method that uses XML passed via HTTP as a transport.
http://docs.python.org/2/library/xmlrpclib.html
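As a rough illustration of that idea (a sketch only, using Python 3's xmlrpc package; the port, address, and function name are placeholders): the client machine exposes a file-existence check that your server then calls remotely.

# on the client machine: expose a file-existence check over XML-RPC
import os.path
from xmlrpc.server import SimpleXMLRPCServer

rpc_server = SimpleXMLRPCServer(("0.0.0.0", 8000))
rpc_server.register_function(os.path.exists, "file_exists")
rpc_server.serve_forever()

# on your server: call the check remotely
import xmlrpc.client

proxy = xmlrpc.client.ServerProxy("http://client-address:8000/")
if proxy.file_exists("/path/to/file"):
    print("file is there")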
You might want to look at the very handy bottle.py micro server. It's great for small server tasks like this, and you get the HTTP protocol on top of it. You just include one file with your code: http://bottlepy.org
Here is code that will serve http://blah:8090/get/file or http://blah:8090/exists/file, so the contents of /etc/hosts would be at http://blah:8090/get/etc/hosts:
#!/usr/bin/python
import bottle
import os.path

@bottle.route("/get/<filepath:path>")
def index(filepath):
    filepath = "/" + filepath
    print "getting", filepath
    if not os.path.exists(filepath):
        return "file not found"
    print open(filepath).read()  # prints file
    return '<br>'.join(open(filepath).read().split("\n"))  # prints file with <br> for browser readability

@bottle.route("/exists/<filepath:path>")
def test(filepath):
    filepath = "/" + filepath
    return str(os.path.exists(filepath))

bottle.run(host='0.0.0.0', port=8090, reloader=True)
The reloader option on the run method allows you to edit the code without manually restarting the server. It's quite handy.

How can I get all .log and .txt files when I SSH into a server

I'm using the Paramiko module to log into a server (SSH on some and SFTP on others). I can get text and log files from specific folders on the server no problem, but there are many sub-directories that have .txt and .log files. I read somewhere that the get method will not accept (*.txt). Does anyone know a way around this? Here is the code that I'm currently using to log into a server and get a specific log:
import paramiko
import sys
import os
ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect('10.5.48.74', username='root', password='******')
ftp = ssh.open_sftp()
ftp.get('/var/opt/crindbios/log/crindbios.log', '.')
ftp.close()
Acquire a list of files with the following script, then iterate over the list with ftp.get:

import paramiko
import os

ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect('localhost', username='****')

apath = '/var/log'
apattern = '"*.log"'
rawcommand = 'find {path} -name {pattern}'
command = rawcommand.format(path=apath, pattern=apattern)
stdin, stdout, stderr = ssh.exec_command(command)
filelist = stdout.read().decode().splitlines()  # decode: exec_command output is bytes on Python 3

ftp = ssh.open_sftp()

for afile in filelist:
    (head, filename) = os.path.split(afile)
    print(filename)
    ftp.get(afile, './' + filename)

ftp.close()
ssh.close()
It is what dustyprogrammer proposed: on the remote server you apply shell commands to acquire the file list, then you postprocess the list with Python.
To download, you have to create a new filepath for each file; downloading to a directory as you proposed doesn't work (for me).
The filenames are easily accessible via sftp.listdir(). Therefore, I do it this way:
import os
import paramiko

rserver = "raspberrypi"
ruser = "pi"
rpassword = "<your-password>"
rdirectory_charging_log = "/home/pi/logs/"
directory_charging_log = "/Users/<your-user>/logs/"

ssh = paramiko.SSHClient()
ssh.load_host_keys(os.path.expanduser(os.path.join("~", ".ssh", "known_hosts")))
ssh.connect(rserver, username=ruser, password=rpassword)
sftp = ssh.open_sftp()

rfiles = sftp.listdir(rdirectory_charging_log)
rfile = ""

for rfile in rfiles:
    sftp.get(rdirectory_charging_log + rfile, directory_charging_log + rfile)

sftp.close()
ssh.close()
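Since the question asks specifically for .log and .txt files, the listing can be filtered before downloading; a minimal sketch using fnmatch that would replace the download loop above (variable names reused from that answer):

import fnmatch

# download only the files whose names match *.log or *.txt
for rfile in sftp.listdir(rdirectory_charging_log):
    if fnmatch.fnmatch(rfile, "*.log") or fnmatch.fnmatch(rfile, "*.txt"):
        sftp.get(rdirectory_charging_log + rfile, directory_charging_log + rfile)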
