I need to listen for file events on my remote server over FTP. I found pyinotify for detecting file changes in Python. It works well on a local file system and I think it is very good.
But when I give it an FTP address to watch, it does not accept the path and gives me:
[Pyinotify ERROR] add_watch: cannot watch ftp://myuser@myserver/home/myUser WD=-1, Errno=No such file or directory (ENOENT)
I also tried the URL ftp://myuser@myserver, but the result was the same.
So, what am I missing with pyinotify? Is it possible to listen for file changes over FTP with pyinotify?
If it is not available, could you suggest another library to do this?
Thank you
You won't be able to run pyinotify over FTP, NFS, or anything other than local file systems.
You'll need to poll the remote FTP server using an FTP library to detect changes.
Not sure, but maybe it is because ftp://... isn't a folder; it is just a URL. If you want to run pyinotify over FTP, I think you have to upload your script to the server and run it there.
Or you can use ftplib for that: just list the directory repeatedly and compare the listings to get the changes, as in the sketch below.
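A minimal polling sketch with ftplib from the standard library, just to illustrate the idea: reconnect every few seconds, list the remote directory, and diff the listing against the previous one. The host, credentials, path, and interval below are placeholders, and NLST output can differ between servers, so treat this as a starting point rather than a finished tool.

import time
from ftplib import FTP

def watch(host, user, password, path, interval=10):
    known = set()
    while True:
        ftp = FTP(host)
        ftp.login(user, password)
        current = set(ftp.nlst(path))      # names currently in the directory
        ftp.quit()
        for name in sorted(current - known):
            print('new file:', name)
        for name in sorted(known - current):
            print('removed:', name)
        known = current
        time.sleep(interval)

watch('myserver', 'myuser', 'mypassword', '/home/myUser')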
pyinotify works only on local directories and files. You cannot use pyinotify to monitor FTP.
I am transferring files between two servers via SFTP using Python's subprocess module. The only way I can connect to the remote server is via an SFTP connection.
I need to verify that the two directories on the local and remote server are identical after the transfer. This is pretty easy on the local server: a basic find command gives me what I need. However, I have no clue how to get a similar result on the remote server.
Here's an example of the file structure; it's identical on both machines.
JobDirectory
    Job1
        test.txt
        tonks.txt
    Job2
        wildlife.txt
    Job3
        jackinthebox.txt
        happygilmore.txt
        sadprogrammer.txt
So I need a command that'll get the filenames from Job1, Job2, and Job3 and return them to me.
Something like
echo "ls *.txt" | sftp -q user#host.example.com:/path
doesn't work too well here, since it needs a specific path. I could get a list of folders within the directory and run the sftp command against each of them, but that's a lot of remote connections.
The only remote access tools I can use are subprocess and Python's OS module. Something like Paramiko SFTP is not available.
For an easy but inefficient solution, see the answer by @pasabaporaqui to List recursively all files on sftp.
With your obscure limitation, the only solution that uses one connection would be:
open an sftp subprocess in Python;
feed a sequence of ls commands to it, one for each directory;
parse the directory listings coming on standard output, producing more ls commands for each subdirectory found.
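Below is a rough, untested sketch of that approach under the stated limitation (subprocess and os only). It assumes key-based authentication so sftp never prompts for a password, relies on sftp's "!" local-shell escape to mark the end of each listing, and parses ls -l output naively (it will break on file names containing spaces). The host and starting path are placeholders.

import subprocess

HOST = 'user@host.example.com'          # placeholder
START = '/path/JobDirectory'            # placeholder
MARKER = '### END OF LISTING ###'

# One sftp process, one connection, driven through its stdin/stdout pipes.
proc = subprocess.Popen(['sftp', '-q', HOST],
                        stdin=subprocess.PIPE,
                        stdout=subprocess.PIPE,
                        text=True)

def list_dir(path):
    """Return (files, subdirs) of one remote directory using a single ls -l."""
    proc.stdin.write('ls -l {}\n'.format(path))
    proc.stdin.write('!echo "{}"\n'.format(MARKER))    # local echo marks end of output
    proc.stdin.flush()
    files, subdirs = [], []
    while True:
        line = proc.stdout.readline()
        if not line or line.strip() == MARKER:
            break
        line = line.rstrip()
        if not line or line.startswith('sftp>'):       # sftp echoes piped commands
            continue
        fields = line.split()
        if len(fields) < 9:                            # not a long-listing line
            continue
        name = fields[-1]                              # naive: breaks on names with spaces
        if line[0] == 'd':
            subdirs.append(path + '/' + name)
        else:
            files.append(path + '/' + name)
    return files, subdirs

all_files, queue = [], [START]
while queue:
    files, subdirs = list_dir(queue.pop())
    all_files.extend(files)
    queue.extend(subdirs)

proc.stdin.write('bye\n')
proc.stdin.flush()
proc.wait()
print('\n'.join(sorted(all_files)))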
I am able to watch local directories using solutions based on the inotify kernel subsystem. There are also some Python projects working on top of inotify, like pyinotify, PyInotify, fsmonitor and watchdog.
I have mounted the remote FTP server in a local directory using curlftpfs, so all syncing is easy now. But inotify is not able to watch network mount points the way it watches local directories.
I want to track when new files are added to the FTP server. How can I achieve this, like I do for a local directory with an inotify-based solution?
That can hardly work. The FTP protocol has no API to notify a client about changes. curlftpfs would have to continually poll the remote folder to provide notifications for inotify or a similar tool, and it does not do that.
You have to poll the FTP folder yourself.
See for example Monitor remote FTP directory.
I may be being ignorant here, but I have been researching for the past 30 minutes and have not found how to do this. I was uploading a bunch of files to my server, and then, prior to them all finishing, I edited one of those files. How can I update the file on the server to match the file on my local computer?
Bonus points if you tell me how I can link the file on my local computer so that it auto-updates on the server when I connect (if that's possible, of course).
Just use scp and copy the file back.
scp [user]@[address]:/[remotepath]/[remotefile] [localfolder] if you want to copy the file on the server back to your local machine, or
scp [localfolder]/[filename] [user]@[address]:/[remotepath] in case you want to copy the file to the server again. The elements in [] have to be replaced with actual paths and/or file names. On the remote side it has to be an absolute path; on the local machine it can be absolute or relative. More information on scp
This of course means that the destination file will be overwritten.
Maybe rsync would be an option. It is capable of synchronizing different folders.
rsync can be used to sync folders over the network and can be combined with ssh.
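For example, something along these lines should push the local folder to the server over ssh (placeholders as in the scp examples above; note that trailing slashes change how rsync maps source to destination):
rsync -avz [localfolder]/ [user]@[address]:/[remotepath]/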
Have you considered Dropbox or SVN?
I don't know your local computer's OS, but if it is Linux or OS X, you can consider LFTP. This is an FTP client which also supports sftp://. It has a "mirror" functionality: with a single command you can mirror your local files against a server.
Note: what you need is a reverse mirror; in LFTP this is mirror -R.
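For instance, assuming lftp is installed, a one-shot reverse mirror over SFTP could look roughly like this (user, host and paths are placeholders):
lftp -u [user],[password] sftp://[host] -e "mirror -R [localfolder] [remotepath]; bye"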
I'm trying to download an 80 MB folder from a remote server onto my local machine. I know the file paths are right and I know that the folder exists. My current working code (it works on a single file) is this:
import paramiko

def begin():
    tran = paramiko.Transport(('dns.server.name', 22))
    tran.connect(username='**', password='**')
    sftp = paramiko.SFTPClient.from_transport(tran)
    sftp.get('/remote/file/path', '/local/file/path')
    sftp.close()
    tran.close()
I've tried adding sftp.listdir, but I'm afraid I can't find enough documentation on the subject to make it understandable or usable for me. Is there something available that looks like os.walk?
My question is: how do I download small folders via the SSH2 protocol available in paramiko?
What I suggest: instead of transferring the whole folder, first make a temporary compressed tar file on the server programmatically, then transfer that single tar file over SFTP. It may reduce bandwidth, will work faster, and will be less error prone.
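A rough sketch of that idea with paramiko, assuming the account also has shell access so tar can be run via exec_command (host, credentials, paths and the temporary archive name are placeholders):

import paramiko

def fetch_folder_as_tar(host, user, password, remote_dir, local_tar):
    """Compress a remote folder into a tar.gz on the server, then download it."""
    client = paramiko.SSHClient()
    client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    client.connect(host, username=user, password=password)
    try:
        remote_tar = '/tmp/folder.tar.gz'                       # hypothetical temp path
        stdin, stdout, stderr = client.exec_command(
            'tar czf {} -C {} .'.format(remote_tar, remote_dir))
        stdout.channel.recv_exit_status()                       # wait for tar to finish
        sftp = client.open_sftp()
        sftp.get(remote_tar, local_tar)                         # one file instead of many
        sftp.remove(remote_tar)                                 # clean up on the server
        sftp.close()
    finally:
        client.close()

fetch_folder_as_tar('dns.server.name', '**', '**', '/remote/folder/path', 'folder.tar.gz')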
I would like to use a Python script that I wrote for searching files to log in to a password-protected Ubuntu server (for which I have credentials) and search files on that server. Is there a straightforward way to accomplish this?
To log in and run remote terminal commands through Python, you should use either paramiko or pexpect. Pexpect is not touched very much by Noah these days... I'm starting to wonder whether he is abandoning it.
The other way is to sftp the files from the remote server to your local machine... paramiko is useful for that as well.
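If paramiko is an option, a minimal sketch of both approaches could look like this (host, credentials, search pattern and paths are placeholders):

import paramiko

client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect('my-ubuntu-server', username='me', password='secret')   # placeholders

# Run the search on the server and collect the matching file names
stdin, stdout, stderr = client.exec_command("grep -rl 'needle' /home/me/docs")
matches = stdout.read().decode().splitlines()
print(matches)

# Or pull a file back over SFTP and search it locally
if matches:
    sftp = client.open_sftp()
    sftp.get(matches[0], 'local_copy.txt')
    sftp.close()

client.close()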