How to transfer a folder with Paramiko - python

I'm trying to download an 80 MB folder from a remote server onto my local machine. I know the file paths are right and I know that the folder exists. My current code, which works on a single file, is:
import paramiko

def begin():
    tran = paramiko.Transport(('dns.server.name', 22))
    tran.connect(username='**', password='**')
    sftp = paramiko.SFTPClient.from_transport(tran)
    sftp.get('/remote/file/path', '/local/file/path')
    sftp.close()
    tran.close()
I've tried adding sftp.listdir, but I'm afraid I can't find enough documentation on the subject to make it understandable or usable for me. Is there something available that looks like os.walk?
My question is - How do I download small folders via the ssh2 protocol available in paramiko?

What I suggest instead of transferring the whole folder: first make a temporary compressed tar file on the server programmatically, then transfer that tar file over SFTP. This reduces bandwidth, works faster, and is less error prone.
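For illustration, a rough sketch of that approach with Paramiko, reusing the transport from the question; the archive path and the tar invocation are assumptions, and it assumes tar is available on the server:

import paramiko

tran = paramiko.Transport(('dns.server.name', 22))
tran.connect(username='**', password='**')

# Pack the folder on the server first, over an SSH exec channel.
chan = tran.open_session()
chan.exec_command('tar -czf /tmp/folder.tar.gz -C /remote/file path')
chan.recv_exit_status()                      # wait for tar to finish

# Then pull the single archive and clean up the temporary file.
sftp = paramiko.SFTPClient.from_transport(tran)
sftp.get('/tmp/folder.tar.gz', '/local/file/folder.tar.gz')
sftp.remove('/tmp/folder.tar.gz')
sftp.close()
tran.close()

Locally you can then unpack the archive with Python's tarfile module.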

Related

Getting a list of files recursively using SFTP in shell or Python with no additional libraries

I am transferring files between two servers via SFTP using python's subprocess module. The only way I can connect to the remote server is via an SFTP connection.
I need to verify that the two directories on the local and remote server are identical after the transfer. This is pretty easy on the local server, a basic find command gives me what I need. However I have no clue how to get a similar result on the remote server.
Here's an example of the file structure; it's identical on both machines.
JobDirectory
    Job1
        test.txt
        tonks.txt
    Job2
        wildlife.txt
    Job3
        jackinthebox.txt
        happygilmore.txt
        sadprogrammer.txt
So I need a command that'll get the filenames from Job1, Job2, and Job3 and return them to me.
Something like
echo "ls *.txt" | sftp -q user#host.example.com:/path
doesn't track too well here, since it needs a specific path. I could get a list of folders within the directory and run the sftp command against each of them, but that's a lot of remote connections.
The only remote access tools I can use are subprocess and Python's OS module. Something like Paramiko SFTP is not available.
For an easy but inefficient solution, see the answer by @pasabaporaqui to List recursively all files on sftp.
With your obscure limitation, the only solution that uses one connection would be:
open an sftp subprocess in Python,
feed a sequence of ls commands to it, one for each directory,
parse the directory listings coming on standard output, producing more ls commands for each subdirectory found, as sketched below.
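A hedged sketch of that loop, assuming OpenSSH's sftp client in batch mode (which requires key-based authentication, since batch mode cannot prompt for a password) and file names without spaces; the host and root path are placeholders:

import subprocess

def sftp_walk(host, root):
    proc = subprocess.Popen(
        ["sftp", "-q", "-b", "-", host],
        stdin=subprocess.PIPE, stdout=subprocess.PIPE,
        universal_newlines=True)
    files = []
    queue = [root]
    while queue:
        path = queue.pop(0)
        # "ls -la" gives a long listing so directories can be told apart;
        # the cheap "pwd" after it acts as an end-of-listing sentinel.
        proc.stdin.write("ls -la %s\npwd\n" % path)
        proc.stdin.flush()
        # Skip output up to and including the echoed "sftp> ls ..." line.
        for line in iter(proc.stdout.readline, ""):
            if line.startswith("sftp> ls"):
                break
        # Collect listing lines until the echoed "sftp> pwd" sentinel.
        for line in iter(proc.stdout.readline, ""):
            if line.startswith("sftp> pwd"):
                break
            fields = line.split()
            if len(fields) < 9 or fields[-1] in (".", ".."):
                continue          # skip "total ..." lines and dot entries
            if line.startswith("d"):
                queue.append(path + "/" + fields[-1])   # subdirectory: visit later
            else:
                files.append(path + "/" + fields[-1])   # regular file
    proc.stdin.write("bye\n")
    proc.stdin.close()
    proc.wait()
    return files

print("\n".join(sftp_walk("user@host.example.com", "/path/JobDirectory")))

This relies on sftp echoing each batch command to standard output prefixed with "sftp> ", which is what marks the listing boundaries.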

Copy folder from server (Linux) to local machine (Windows) in Python

How do I copy a folder from a server (Linux) to a local machine (Windows) in Python?
I tried the following code but it did not work:
from distutils.dir_util import copy_tree
copy_tree("source_path", "destination_path")
I used copy_tree to copy a folder on my local machine, but when I used the same command to copy a folder from the server to the local machine, it did not work.
Is there any other method, or are any changes needed?
You need to use SSH, SCP, or SFTP to transfer files from host to host.
I do this a lot and like to use SSH and SCP. You can run an SSH server on your Windows machine using OpenSSH. Here is a good set of instructions from WinSCP: https://winscp.net/eng/docs/guide_windows_openssh_server.
I recommend using Paramiko for SSH with Python. Here is a good answer showing how this works with Python: https://stackoverflow.com/a/38556344/634627.
If you set up OpenSSH, you could also do this with SFTP; sometimes I find it more suitable than SCP. Here is a good answer showing how that works: https://stackoverflow.com/a/33752662/634627
The trick is getting OpenSSH running on your Windows host and setting up SSH keys so your server can authenticate to your localhost.
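For illustration, a rough sketch of the SFTP variant with Paramiko, walking the remote tree recursively the way os.walk would; the host, user, key path, and folder paths are placeholders:

import os
import stat
import paramiko

def download_dir(sftp, remote_dir, local_dir):
    os.makedirs(local_dir, exist_ok=True)
    for entry in sftp.listdir_attr(remote_dir):
        remote_path = remote_dir + "/" + entry.filename
        local_path = os.path.join(local_dir, entry.filename)
        if stat.S_ISDIR(entry.st_mode):
            download_dir(sftp, remote_path, local_path)   # recurse into subfolders
        else:
            sftp.get(remote_path, local_path)

client = paramiko.SSHClient()
client.load_system_host_keys()
client.connect("server.example.com", username="user",
               key_filename=r"C:\Users\me\.ssh\id_rsa")
sftp = client.open_sftp()
download_dir(sftp, "/remote/folder", r"C:\local\folder")
sftp.close()
client.close()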
Using copy_tree should work if:
1. the folder on the server is made available to the Windows machine as a client,
2. you have sufficient access permissions,
3. you use a raw string for the Windows path to prevent string interpretation.
Regarding 3, try print('c:\test\robot'):
In [1]: print('c:\test\robot')
obot    est
The \t became a tab and the \r a carriage return, which mangles the output.
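A raw string (note the r prefix) keeps the backslashes literal and avoids the problem:
In [2]: print(r'c:\test\robot')
c:\test\robot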

How can I automate remote deployment in python?

I want to automate the remote deployment which currently I am doing manually.
The process includes
Make the tar ball from certain folders
SFTP to the remote server
Rename the old folders
Untar the new tar file
Restart apache
The remote system is on the intranet and has no access to the outside internet.
I want to know how I can transfer the file from my Python script and then, when the transfer is complete, log in over SSH and do stuff. I am confused about how to achieve that. On localhost I can do all of this, but how can I do it on a remote host?
For simple&dirty work you can use fabric (This by no means say that you cannot use fabric to build serious product)
For heavy configuration routines, you'd better pick a CMS (e.g., ansible)
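As a hedged sketch, the five steps might look like this with Fabric 2.x; the host, paths, archive name, and service name are all placeholders:

import tarfile
from fabric import Connection

# 1. Make the tar ball from the folders to deploy.
with tarfile.open("release.tar.gz", "w:gz") as tar:
    tar.add("build")                                   # placeholder folder

c = Connection("user@intranet-server")
c.put("release.tar.gz", "/srv/app/release.tar.gz")     # 2. SFTP to the server
c.run("mv /srv/app/current /srv/app/previous")         # 3. rename the old folder
c.run("mkdir /srv/app/current && "
      "tar -xzf /srv/app/release.tar.gz -C /srv/app/current")  # 4. untar
c.sudo("systemctl restart apache2")                    # 5. restart Apache (assumes sudo rights)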

update file on ssh and link to local computer

I may be being ignorant here, but I have been researching for the past 30 minutes and have not found how to do this. I was uploading a bunch of files to my server, and then, prior to them all finishing, I edited one of those files. How can I update the file on the server to the file on my local computer?
Bonus points if you tell me how I can link the file on my local computer to auto update on the server when I connect (if possible of course)
Just use scp and copy the file back.
scp [user]@[address]:/[remotepath]/[remotefile] [localfolder] if you want to copy the file on the server back to your local machine, or
scp [localfolder]/[filename] [user]@[address]:/[remotepath] in case you want to copy the file again to the server. The elements in [] have to be exchanged with actual paths and/or file names. On the remote side it has to be an absolute path; on the local machine it can be absolute or relative. More information on scp
This of course means that the destination file will be overwritten.
Maybe rsync would be an option. It can synchronize folders over the network and works over SSH.
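For example, a hypothetical invocation pushing a local folder to the server over SSH, in the same placeholder style as the scp lines above:
rsync -avz [localfolder]/ [user]@[address]:/[remotepath]/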
Have you considered Dropbox or SVN?
I don't know your local computer's OS, but if it is Linux or OS X, you can consider LFTP. This is an FTP client which supports sftp://. It has a "mirror" functionality: with a single command you can mirror your local files against a server.
Note: what you need is a reverse mirror; in LFTP this is mirror -R.
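For illustration, a single hypothetical LFTP command mirroring a local folder up to the server might look like:
lftp -e "mirror -R /local/folder /remote/folder; bye" sftp://user@host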

Python inotify (pyinotify) over FTP

I need to listen for file events on my remote server over FTP. I've found pyinotify for detecting file changes in Python. It works well on the local file system and I think it is very good.
But when I gave it an FTP address to watch, it did not accept the path and gave me:
[pyinotify ERROR] add_watch: cannot watch ftp://myuser@myserver/home/myUser WD=-1, Errno=No such file or directory (ENOENT)
I also tried the URL ftp://myuser@myserver, but the result was the same.
So, what am I missing with pyinotify? Is it possible to listen for file changes over FTP with pyinotify?
If it is not available, could you suggest another library to do this?
Thank you
You won't be able to run pyinotify over FTP, NFS, or anything other than local file systems.
You'll need to poll the remote FTP server using an FTP library to detect changes.
Not sure, but maybe it is because ftp://... isn't a folder; it is just a URL. If you want to use pyinotify, I think you have to upload your script to the server and run it there.
Or you can use ftplib for that: you just list the directory repeatedly and compare the listings to get the changes, as sketched below.
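A minimal polling sketch using only the standard library's ftplib; the host, credentials, path, and 30-second interval are placeholder assumptions, and it assumes the server supports the MLSD command (RFC 3659) for machine-readable listings:

import time
from ftplib import FTP

def snapshot(ftp, path):
    # Map each entry name to its modification time and size.
    return {name: (facts.get("modify"), facts.get("size"))
            for name, facts in ftp.mlsd(path)
            if name not in (".", "..")}

ftp = FTP("myserver")
ftp.login("myuser", "mypassword")

previous = snapshot(ftp, "/home/myUser")
while True:
    time.sleep(30)                                # poll every 30 seconds
    current = snapshot(ftp, "/home/myUser")
    for name in current.keys() - previous.keys():
        print("created:", name)
    for name in previous.keys() - current.keys():
        print("deleted:", name)
    for name in current.keys() & previous.keys():
        if current[name] != previous[name]:
            print("modified:", name)
    previous = current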
pyinotify works only on local directories and files. To monitor FTP, you can't make use of pyinotify.
