Opening Astropy FITS file from SFTP server - python

I have a Python script that SSHes into a remote server using the Paramiko module. Below is my script:
import paramiko
ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect("host", username="McMissile")
A FITS file on a local machine is usually opened as follows:
from astropy.io import fits
hdu = fits.open('File.fits')
I was wondering how I would open a FITS file on the SFTP server machine and store it under the variable hdu on the local machine.
I cannot download the file from the server to the local machine due to the storage constraints.

The astropy.io.fits.open function accepts a file-like object in place of a file name:
name : file path, file object, file-like object or pathlib.Path object
A file-like object representing a remote file is returned by the Paramiko SFTPClient.open method:
A file-like object is returned, which closely mimics the behavior of a normal Python file object, including the ability to be used as a context manager.
So this should work:
sftp_client = ssh.open_sftp()
with sftp_client.open('remote_filename') as remote_file:
    hdu = fits.open(remote_file)

Related

Download file from SFTP server in Python Paramiko

I am using Python Paramiko to retrieve/search a file from an SFTP server. I can list all files in the directory, but what I need is a specific file from that directory. How do I get it?
Use Paramiko SFTPClient.get to download a single file:
with paramiko.SSHClient() as ssh:
    ssh.connect(host, username=username, password=password)
    with ssh.open_sftp() as sftp:
        sftp.get("/remote/path/file.txt", "/local/path/file.txt")
You will also have to deal with the server's host key verification.
What you need to do is create an SSH client, then execute ls piped through grep to find your file, such as ls /srv/ftp | grep '^FTP_' to find files in the /srv/ftp directory that start with FTP_. Then open an SFTP connection and use the get command to bring the files over.
EDIT: Martin below mentioned that there is a better way to get the directory contents using SFTPClient.listdir() - I have revised the code to use that method. More info in the docs: https://docs.paramiko.org/en/stable/api/sftp.html
Putting all that together looks like:
import os
import paramiko

host = ''
port = 22
username = ''
password = ''

with paramiko.SSHClient() as client:
    client.connect(host, port, username, password)
    with client.open_sftp() as sftp:
        files = sftp.listdir('/srv/ftp')
        for file in files:
            if file.startswith('FTP'):
                # expanduser turns '~' into the real home directory;
                # sftp.get does not expand it in the local path.
                sftp.get(f'/srv/ftp/{file}', os.path.expanduser(f'~/ftp/{file}'))
                print(f'Moved {file}')
This code is untested but should work. Hope that's clear.
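One detail worth noting when building remote paths like those above: SFTP paths are always POSIX-style, even when the client runs on Windows, so posixpath is a safe way to join them. A small self-contained sketch of the filtering and path-building logic (the directory and file names are just illustrative):

```python
import posixpath

# Remote SFTP paths use forward slashes regardless of the local OS.
remote_dir = '/srv/ftp'
files = ['FTP_report.csv', 'notes.txt', 'FTP_log.txt']

# Keep only the files whose names start with 'FTP', as in the answer above.
matches = [posixpath.join(remote_dir, f) for f in files if f.startswith('FTP')]
print(matches)  # ['/srv/ftp/FTP_report.csv', '/srv/ftp/FTP_log.txt']
```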
Here is an answer in case you need a kind of find over an SFTP connection, when you don't know the exact path and name of the file. If that is not what you were looking for, I am sorry.
I made a library named sftputil, based on paramiko, which implements advanced functionality such as glob. To find a specific file and download it, you can do it this way:
from sftputil import SFTP
sftp = SFTP("hostname", "username", password="password")
# Here we look for a file with a name starting with `foo`
found_files = sftp.glob("directory/foo*")
# Here we look for the file `bar` in any subdirectory
found_files = sftp.glob("directory/*/bar")
# But you can use other patterns of course.
# And now the files can be downloaded
for f in found_files:
    sftp.get(f, "my/local/path")
If you don't know glob, you should read the Python documentation, as this implementation works the same way.
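For readers unfamiliar with glob semantics, the standard library's fnmatch module implements the same wildcard matching, so patterns like the ones above can be tried locally (the file names here are made up):

```python
import fnmatch

names = ['directory/foo1.txt', 'directory/bar', 'directory/sub/bar']

# 'foo*' matches any name beginning with 'foo'.
print(fnmatch.fnmatch('foo1.txt', 'foo*'))  # True
print(fnmatch.fnmatch('bar', 'foo*'))       # False

# A '*' between slashes matches one path component here.
print(fnmatch.filter(names, 'directory/*/bar'))  # ['directory/sub/bar']
```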
I recently solved a similar issue of connecting to SFTP and downloading files via paramiko.
The code below assumes that you know the name of the file you are looking for (file_name). If there are multiple files you need, just add them to file_list.
import paramiko
host = 'host.name.com'
usr = 'username'
pwd = 'password'
remote_path = '/path/on/sftp/server/'
local_path = '/path/on/local/machine/'
file_name = 'specific_file_you_need.csv'
transport = paramiko.Transport((host, 22))
transport.connect(None, usr, pwd)
sftp = paramiko.SFTPClient.from_transport(transport)

file_list = [file_name]
for file in file_list:
    sftp.get(remote_path + file, local_path + file)
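One thing the snippet above omits is cleanup: the Transport is never closed. Since paramiko's Transport and SFTPClient both expose a close() method, contextlib.closing can manage them. Here is the pattern demonstrated with a stand-in object (the class is hypothetical, purely for illustration):

```python
from contextlib import closing

class FakeTransport:
    """Stand-in for any object with a close() method, e.g. paramiko.Transport."""
    def __init__(self):
        self.closed = False

    def close(self):
        self.closed = True

t = FakeTransport()
with closing(t):
    pass  # ... use the transport/sftp client here ...
print(t.closed)  # True -- close() was called on exiting the block
```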

Copy files from Linux to Windows using Samba client in Python

I am trying to copy a file from a Linux machine to a Windows shared drive. This answer shows how to perform this action using the Samba client. In my case, the connection succeeds and listShares() returns the shares on the server, but when I call connection.storeFile() I get "Unable to connect to shared device".
What am I missing?
The python code running on the Linux machine:
connection = SMBConnection(username='user1', password='password1', my_name='host1',remote_name='fs01',domain='domain', use_ntlm_v2=True)
connection.connect(ip, 139) # Returns true
file_obj = open(filename,"r")
connection.storeFile('\\\\fs01\\data\\folder1', file1, file_obj)
smb.smb_structs.OperationFailure: Failed to store data\folder1\file1 on fs01: Unable to connect to shared device
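No answer was recorded here, but a likely cause is worth noting: pysmb's storeFile takes the share ("service") name and the path within the share as separate arguments, so passing a full UNC path as the first argument makes the library look for a share with that literal name. A hypothetical helper illustrates the split (the values mirror the question's paths; the call at the end is a sketch, not tested against a real server):

```python
def split_unc(unc):
    """Split a UNC path like \\\\server\\share\\dir\\file into its parts.

    Hypothetical helper for illustration: pysmb's storeFile expects the
    share name and the in-share path as separate arguments.
    """
    parts = unc.lstrip('\\').split('\\')
    server, share = parts[0], parts[1]
    path = '/' + '/'.join(parts[2:])
    return server, share, path

print(split_unc('\\\\fs01\\data\\folder1\\file1'))
# ('fs01', 'data', '/folder1/file1')

# So the storeFile call would presumably look like:
# connection.storeFile('data', '/folder1/file1', file_obj)
```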

implement CRCCheck during SFTP file transfer [duplicate]

I use Paramiko to put a file to an SFTP server:
import paramiko
transport = paramiko.Transport((host, port))
transport.connect(username=username, password=password)
sftp = paramiko.SFTPClient.from_transport(transport)
sftp.put(local_path, remote_path)
Now, I would like to check if it worked. The idea is that I compare the checksum of the local file and the remote one (that is located on the SFTP server).
Does Paramiko provide functionality to do that? If so, how exactly does it work?
With the SFTP, running over an encrypted SSH session, there's no chance the file contents could get corrupted while transferring. So unless it gets corrupted when reading the local file or writing the remote file, you can be pretty sure that the file was uploaded correctly, if the .put does not throw any error.
try:
    sftp.put(local_path, remote_path)
except Exception:
    pass  # something went wrong
If you want to test explicitly anyway:
While there's the check-file extension to the SFTP protocol to calculate a remote file checksum, it's not widely supported. In particular, it is not supported by OpenSSH, the most widespread SFTP server implementation. See What SFTP server implementations support check-file extension.
If you are lucky to connect to another SFTP server that supports the extension, you can use the Paramiko's SFTPFile.check method.
If not, your only option is to download the file back and compare locally.
If you have a shell access to the server, you can of course try to run some shell checksum command (sha256sum) over a separate shell/SSH connection (or channel) and parse the results. But that's not an SFTP solution anymore. See Comparing MD5 of downloaded files against files on an SFTP server in Python.
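If downloading the file back is the only option, the local side of the comparison can be done with hashlib. A minimal sketch, assuming the "remote" copy has already been downloaded (e.g. with sftp.get) and both paths are placeholders:

```python
import hashlib

def sha256_of(path, block_size=65536):
    """Compute the SHA-256 digest of a file without loading it all at once."""
    digest = hashlib.sha256()
    with open(path, 'rb') as f:
        for chunk in iter(lambda: f.read(block_size), b''):
            digest.update(chunk)
    return digest.hexdigest()

# Hypothetical usage, comparing the original with the re-downloaded copy:
# same_content = sha256_of(local_path) == sha256_of(downloaded_copy)
```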
If the file doesn't upload, the method will throw an error, so you can check for the error:
import paramiko

transport = paramiko.Transport((host, port))
transport.connect(username=username, password=password)
sftp = paramiko.SFTPClient.from_transport(transport)
try:
    sftp.put(local_path, remote_path)
except (IOError, OSError):
    pass  # failure

How to transfer FileStorage object via FTP in python

I have a REST webservice (Flask) to which clients send a file object (image or PDF); now I want to store it at an FTP location. How can I store the file object at the FTP location?
I am using a Flask REST API and am able to receive a file object of type FileStorage. I then try to store the file object (image or PDF) via FTP using the Paramiko paramiko.SFTPClient class, but it does not accept the file object; it expects a local file path.
Below is the sample code used to transfer a file to the FTP server:
FTP Code:
import paramiko
transport = paramiko.Transport((host, port))
transport.connect(username = username, password = password)
sftp = paramiko.SFTPClient.from_transport(transport)
sftp.put(localpath, filepath)
Could you please suggest how I can store the file object via FTP?
A simple solution would be to save your FileStorage object to disk, and then give that path to the paramiko SFTP client (your localpath variable).
Note that this is a non-optimised solution, since it writes the data to disk and then re-reads it.
For a more optimised solution, according to paramiko's SFTP doc, you could use the method
putfo(fl, remotepath, file_size=0, callback=None, confirm=True)
That would replace your line sftp.put(localpath, filepath) with sftp.putfo(filestorage_object, filepath).
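To see why putfo can accept the FileStorage directly: putfo only needs a file-like object with a read() method, which both FileStorage and its .stream attribute provide. A stand-in with io.BytesIO illustrates the shape of the call (the remote path is hypothetical, and the sftp call is left commented since it needs a live server):

```python
import io

# Any file-like object with read() works where putfo expects `fl`.
payload = io.BytesIO(b'%PDF-1.4 fake pdf bytes')

# sftp.putfo(payload, '/upload/incoming.pdf')  # hypothetical remote path

# putfo reads the object just like a real file would be read:
data = payload.read()
print(data)  # b'%PDF-1.4 fake pdf bytes'
```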

Upload file via SFTP with Python

I wrote a simple code to upload a file to a SFTP server in Python. I am using Python 2.7.
import pysftp
srv = pysftp.Connection(host="www.destination.com", username="root",
password="password",log="./temp/pysftp.log")
srv.cd('public') #chdir to public
srv.put('C:\Users\XXX\Dropbox\test.txt') #upload file to nodejs/
# Closes the connection
srv.close()
The file did not appear on the server. However, no error message appeared. What is wrong with the code?
I have enabled logging. I discovered that the file is uploaded to the root folder and not under public folder. Seems like srv.cd('public') did not work.
I found the answer to my own question.
import pysftp

srv = pysftp.Connection(host="www.destination.com", username="root",
                        password="password", log="./temp/pysftp.log")
with srv.cd('public'):  # chdir to public
    srv.put(r'C:\Users\XXX\Dropbox\test.txt')  # upload file to nodejs/
# Closes the connection
srv.close()
Put the srv.put call inside the with srv.cd block.
Do not use pysftp it's dead. Use Paramiko directly. See also pysftp vs. Paramiko.
The code with Paramiko will be pretty much the same, except for the initialization part.
import paramiko

with paramiko.SSHClient() as ssh:
    ssh.load_system_host_keys()
    ssh.connect(host, username=username, password=password)

    sftp = ssh.open_sftp()
    sftp.chdir('public')
    sftp.put(r'C:\Users\XXX\Dropbox\test.txt', 'test.txt')
To answer the literal OP's question: the key point here is that pysftp Connection.cd works as a context manager (so its effect is discarded without with statement), while Paramiko SFTPClient.chdir does not.
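pysftp's cd behaves much like this local analogue built on os.chdir, which may make the context-manager point clearer (the helper is illustrative, not part of pysftp):

```python
import contextlib
import os

@contextlib.contextmanager
def cd(path):
    """Temporarily change the working directory, restoring it on exit."""
    old = os.getcwd()
    os.chdir(path)
    try:
        yield
    finally:
        os.chdir(old)

# Creating the context manager without a `with` block changes nothing --
# which is exactly why the bare srv.cd('public') call had no effect.
```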
import pysftp

with pysftp.Connection(host="www.destination.com", username="root",
                       password="password", log="./temp/pysftp.log") as sftp:
    sftp.cwd('/root/public')  # The full path
    sftp.put(r'C:\Users\XXX\Dropbox\test.txt')  # Upload the file
No sftp.close() is needed, because the connection is closed automatically at the end of the with-block.
Note the minor change from cd to cwd.
Syntax -
# sftp.put('/my/local/filename') # upload file to public/ on remote
# sftp.get('remote_file') # get a remote file
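A side note on the Windows paths used in these answers: under Python 2.7 a literal like 'C:\Users\XXX\Dropbox\test.txt' happens to work, but in Python 3 the \U sequence starts a unicode escape and raises a SyntaxError, so backslashes should be doubled or a raw string used:

```python
# Raw strings leave backslashes alone, so Windows paths survive intact.
local_path = r'C:\Users\XXX\Dropbox\test.txt'
print(local_path.endswith('test.txt'))  # True
print('\\' in local_path)               # True
```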
