The file download works well for individual files, but when I run it in a for loop to download multiple files at once, it keeps crashing after downloading the first file. I have tried multiple things, but the result is the same: the program stops after the first download with no error.
import pysftp

file_names = []
dir_names = []
un_name = []

def store_files_name(fname):
    file_names.append(fname)

def store_dir_name(dirname):
    dir_names.append(dirname)

def store_other_file_types(name):
    un_name.append(name)

cnopts = pysftp.CnOpts()
cnopts.hostkeys = None

conn = pysftp.Connection('', username='', password='', private_key=".ppk", cnopts=cnopts)
conn.walktree("", store_files_name, store_dir_name, store_other_file_types, recurse=True)

for f in file_names:
    print(f)
    conn.get("/" + f)
The recursive functions in pysftp are known to be poorly implemented. In particular, they do not work on Windows: they use os.sep and os.path functions on remote SFTP paths, which is wrong, as SFTP paths always use a forward slash.
And in general pysftp seems to be an abandoned project. Consider using Paramiko directly instead (pysftp is just a thin wrapper around Paramiko).
For working Paramiko code that iterates a remote folder, see:
Python pysftp get_r from Linux works fine on Linux but not on Windows
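If you switch to Paramiko directly, a minimal sketch of such a recursive download might look like this (the host, credentials and paths are made-up placeholders, not from the question):
import os
import stat
import paramiko

# Placeholder connection details; replace with your own host and credentials.
ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect("example.com", username="user", password="password")
sftp = ssh.open_sftp()

def download_dir(remote_dir, local_dir):
    # Build remote paths with forward slashes only, never os.path,
    # since SFTP paths use "/" regardless of the local OS.
    os.makedirs(local_dir, exist_ok=True)
    for entry in sftp.listdir_attr(remote_dir):
        remote_path = remote_dir + "/" + entry.filename
        local_path = os.path.join(local_dir, entry.filename)
        if stat.S_ISDIR(entry.st_mode):
            download_dir(remote_path, local_path)
        else:
            sftp.get(remote_path, local_path)

download_dir("/remote/folder", r"C:\local\folder")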
I have an SFTP server. I can get the data by transferring/downloading the files. Is there a way I can use the data without downloading the files?
My code is as below:
# Connection to the SFTP server
with pysftp.Connection(hostname, username=username, password=password, port=port) as sftp:
    with sftp.cd('directory'):
        sftp.get('filename.txt')
This code downloads the file to my local machine.
Yes and no. You can use the data from the remote (SFTP) server without storing the files to a local disk.
But you cannot use the data locally without downloading them. That's impossible. You have to transfer the data to use them, at least into the memory of the local machine.
See A way to load big data on Python from SFTP server, not using my hard disk.
My answer there talks about Paramiko. But pysftp is just a thin wrapper around Paramiko. Its Connection.open is directly mapped to the underlying Paramiko SFTPClient.open. So you can keep using pysftp:
with sftp.open('filename.txt', bufsize=32768) as f:
    # use f as if you had opened a local file with open()
Though I'd recommend you not to: pysftp vs. Paramiko.
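Either way, here is a minimal sketch of consuming the remote file purely in memory with the snippet above (the decoding and the print are only illustrative; sftp is the pysftp connection from the question):
# Assumes an open pysftp connection named sftp, as in the question
with sftp.open('filename.txt', bufsize=32768) as f:
    data = f.read()               # whole file contents as bytes, kept in memory only
    text = data.decode('utf-8')   # assumes a UTF-8 text file
    print(text.splitlines()[0])   # e.g. work with the first line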
I can't find anything about this exception. I am trying to rename a remote file on a local (Windows) SFTP server with fsspec. Paramiko behind the scenes is doing a posix_rename(). What does the error mean?
fs.rename(old_file_path, new_file_path)
Paths look like /folder/file.ext.
I can rename files with other FTP clients on that same server.
Indeed, fsspec SFTPFileSystem.mv calls Paramiko SFTPClient.posix_rename. That's imo a bad choice. The SFTPClient.posix_rename internally uses a proprietary OpenSSH posix-rename@openssh.com extension, which is naturally not supported by most other SFTP servers (such as yours).
I do not know what the best solution/workaround is. You can probably add your own "file system" implementation based on SFTPFileSystem, reimplementing SFTPFileSystem.mv to call the standard Paramiko SFTPClient.rename (which uses the standard SFTP rename request), as sketched below.
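A rough, untested sketch of that reimplementation (the class name is mine; it relies on the underlying Paramiko SFTPClient being exposed as the .ftp attribute, which the follow-up below also uses):
from fsspec.implementations.sftp import SFTPFileSystem

class StandardRenameSFTPFileSystem(SFTPFileSystem):
    def mv(self, old, new, **kwargs):
        # Issue the standard SFTP "rename" request instead of the
        # OpenSSH posix-rename@openssh.com extension
        self.ftp.rename(old, new)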
Actually, I just found that the SFTPClient is exposed through the SFTPFileSystem and I can call rename() on it directly, which worked!
fs.ftp.rename("testfile.txt", "x")
I need to connect to a server with SSH to download files. I have Ubuntu and I've set up SSH in the standard way: I have an ssh_config file in .ssh which defines a host entry (say host_key) for the server address (Hostname.com) and username, and I've set up an RSA key. So when I want to log in over SSH from the command line or bash, I just need to run ssh host_key.
I would like to do this in Python. The standard solution seems to be to use Paramiko to set up the connection. I tried this:
from paramiko import SSHClient
from scp import SCPClient
ssh = SSHClient()
ssh.load_system_host_keys()
ssh.connect('host_key')
scp = SCPClient(ssh.get_transport())
# etc...
However, it always seems to hang and time out on ssh.connect('host_key'), even when I try to include the username and password: ssh.connect('host_key', username='usrnm', password='pswd').
Are my host keys not loading properly? And would this take care of the RSA keys as well?
It only works if I use the full Hostname.com with the username and a typed-out password, which is maybe a bit insecure.
Since Paramiko has an SSHConfig class, you can use it to parse your ~/.ssh/config.
However, it is slightly messy, so I recommend using Fabric instead.
Here is a code example:
from fabric.api import put
put('local path', 'remote path')
I do not think it is common to use an ssh_config file with Paramiko (or any other code/language). ssh_config is a configuration file for the OpenSSH tools, not for SSH in general.
Usually, you specify your private key directly in your code as an argument of the SSHClient.connect method:
How to access to a remote server using Paramiko with a public key-file
If you want to keep using ssh_config, Paramiko can parse it. Check the parse_ssh_config and lookup_ssh_host_config functions. But I believe you still have to look up the key file from the config and pass it explicitly to the SSHClient.connect method.
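A rough sketch of that approach with Paramiko's SSHConfig class (the host alias host_key comes from the question; the rest is generic):
import os
import paramiko

# Parse ~/.ssh/config and resolve the "host_key" entry from the question
config = paramiko.SSHConfig()
with open(os.path.expanduser("~/.ssh/config")) as f:
    config.parse(f)
host = config.lookup("host_key")

ssh = paramiko.SSHClient()
ssh.load_system_host_keys()
ssh.connect(
    hostname=host["hostname"],
    username=host.get("user"),
    port=int(host.get("port", 22)),
    key_filename=host.get("identityfile"),  # list of private key paths, if configured
)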
We have a framework used to validate a few test cases; the results (multiple text files and images) are stored on the local machine.
I need to move these files from our local host to a server.
I have the server IP address, username and password.
So, using Python, I need to move or copy these files to the server.
If you are going for SSH, you'll have to use scp, and there are dedicated Python packages for that: Paramiko together with the scp module. See this post on Stack Overflow.
import paramiko
from scp import SCPClient

def createSSHClient(server, port, user, password):
    client = paramiko.SSHClient()
    client.load_system_host_keys()
    client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    client.connect(server, port, user, password)
    return client

ssh = createSSHClient(server, port, user, password)
scp = SCPClient(ssh.get_transport())
scp.put([file1, file2], remotePath)
Of course, you have to fill in the various variables according to their names. The scp.put function takes a list of local files and a destination path on the remote system as arguments.
I used scp to do the transfer, running the following commands through os.system():
ssh -i ~/.ssh/id_rsa intel@10.223.98.165 "mkdir < Folder created >"
scp -i ~/.ssh/id_rsa < source >*.txt < destination >
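In Python, that boils down to passing those command lines to os.system (the angle-bracket parts are placeholders, exactly as in the commands above):
import os

# Placeholders in angle brackets must be replaced with real paths/names
os.system('ssh -i ~/.ssh/id_rsa intel@10.223.98.165 "mkdir < Folder created >"')
os.system('scp -i ~/.ssh/id_rsa < source >*.txt < destination >')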
Thanks for helping
This is not really a Python problem: you say you have a username and a password for the server, but that does not tell us in what way you can access that server. Do you have SSH access? Then use scp as a command-line program or one of the numerous Python modules that make that possible.
The same goes for protocols like FTP, WebDAV, CIFS/SMB, NFS, and so on. It all depends on what means you have to access/modify/create files on the server. Hence, this answer is all I can give you for such an imprecise question.
I'm trying to send over multiple files from one server to another using Python. I've found a few ssh2 libraries, but either I can't find documentation on them (e.g. ssh), or they don't seem to support mput.
Anyone know of any sftp libraries which support mput?
Paramiko is a library that handles SSH and related things, such as SFTP, but it only supports a regular put, no mput that I can see.
What exactly does mput do? My sftp client doesn't have that command...
Guessing from the name, I'm thinking "multiple puts" or something like that, to send multiple files in one go? If that is the case, I suggest just looping over your list of files and using put, roughly as in the sketch below.
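A minimal sketch of that loop with Paramiko (host, credentials and file names are made-up placeholders):
import paramiko

# Placeholder connection details
ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect("example.com", username="user", password="password")
sftp = ssh.open_sftp()

# "mput" as a plain loop over put()
for local_path in ["a.txt", "b.txt", "c.txt"]:
    sftp.put(local_path, "/remote/dir/" + local_path)

sftp.close()
ssh.close()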
I was faced with a similar problem a long time ago and could not get a satisfactory mput method to run in Python, so the Paramiko library seemed to make the most sense to me.
To enable progress output or other actions in your Python application, it is advantageous to send the files individually in a for loop. The overhead only increases minimally with this variant.
A small example (using pysftp, which wraps Paramiko):
import pysftp
import os

list_files_to_transfer = []

# Check if the list is empty
if list_files_to_transfer:
    # Advanced connection options beyond authentication
    cnopts = pysftp.CnOpts()
    # Compression for a lower transfer load
    cnopts.compression = True
    cnopts.hostkeys = None
    # Establish a connection with the SFTP server
    with pysftp.Connection(host=host, username=username, port=port,
                           private_key=os.path.abspath(path_private_key),
                           cnopts=cnopts) as sftp:
        # Change to the specified remote directory
        with sftp.cd(path_remote_sink_folder):
            # Browse the list of files
            for file in list_files_to_transfer:
                # Upload the file to the server
                sftp.put(file)
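And for completeness, a sketch of the same per-file loop using Paramiko directly, with put()'s callback parameter providing the progress output mentioned above (connection details and file names are placeholders):
import paramiko

# Placeholder connection details
key = paramiko.RSAKey.from_private_key_file("/path/to/private_key")
transport = paramiko.Transport(("example.com", 22))
transport.connect(username="user", pkey=key)
sftp = paramiko.SFTPClient.from_transport(transport)

def progress(transferred, total):
    # Called by put() repeatedly during the transfer
    print(f"{transferred}/{total} bytes")

for file in ["result1.txt", "result2.png"]:
    sftp.put(file, "/remote/sink/" + file, callback=progress)

sftp.close()
transport.close()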