I may be being ignorant here, but I have been researching for the past 30 minutes and have not found how to do this. I was uploading a bunch of files to my server, and then, prior to them all finishing, I edited one of those files. How can I update the file on the server to the file on my local computer?
Bonus points if you can tell me how to link the file on my local computer so it auto-updates on the server when I connect (if that's possible, of course)
Just use scp and copy the file back.
scp [user]@[address]:/[remotepath]/[remotefile] [localfolder] if you want to copy the file on the server back to your local machine, or
scp [localfolder]/[filename] [user]@[address]:/[remotepath] in case you want to copy the file to the server again. The elements in [] have to be replaced with actual paths and/or file names. On the remote side the path has to be absolute; on the local machine it can be absolute or relative. More information on scp
This of course means that the destination file will be overwritten.
Maybe rsync would be an option. It can synchronize whole folders, works over the network, and can be combined with ssh.
Have you considered Dropbox or SVN ?
I don't know your local computer's OS, but if it is Linux or OS X, you can consider LFTP. It is an FTP client that also supports sftp://. It has a "mirror" feature: with a single command you can mirror your local files against a server.
Note: what you need is a reverse mirror; in LFTP this is mirror -R
I am able to watch local directories using solutions based on the inotify kernel subsystem. There are also some Python projects built on top of inotify, such as pyinotify, PyInotify, fsmonitor and watchdog.
I have mounted a remote FTP server in a local directory using curlftpfs, so all syncing is easy now. But inotify cannot watch network mount points the way it watches local directories.
I want to track whether new files are added to the FTP server. How can I achieve this, like I do for a local directory with an inotify-based solution?
It can hardly work. The FTP protocol has no API to notify a client about changes. curlftpfs would have to poll the remote folder continually to provide notifications to inotify or a similar tool, and it does not do that.
You have to poll the FTP folder yourself.
See for example Monitor remote FTP directory.
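Polling boils down to taking periodic snapshots of the remote listing and diffing them. Here is a minimal sketch with the stdlib ftplib (the host name and login in the usage comment are placeholders), using MLSD listings keyed by file name and modification timestamp:

```python
from ftplib import FTP

def snapshot(ftp, path="."):
    """Return {name: modify-timestamp} for a remote directory via MLSD."""
    return {name: facts.get("modify")
            for name, facts in ftp.mlsd(path)
            if facts.get("type") == "file"}

def diff(old, new):
    """Compare two snapshots; return (added, changed, removed) name sets."""
    added = set(new) - set(old)
    removed = set(old) - set(new)
    changed = {n for n in set(old) & set(new) if old[n] != new[n]}
    return added, changed, removed

# Usage sketch (host and credentials are placeholders):
#   ftp = FTP("myserver")
#   ftp.login("myuser", "secret")
#   old = snapshot(ftp)
#   ... some time later ...
#   added, changed, removed = diff(old, snapshot(ftp))
```

MLSD needs server support; on older servers you can fall back to NLST and diff names only.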
I want to automate the remote deployment which I am currently doing manually.
The process includes
Make the tar ball from certain folders
SFTP to the remote server
Rename the old folders
Untar the new tar file
Restart apache
The remote system is on the intranet and has no access to the outside internet.
I want to know how I can transfer the file from my Python script and then, when the transfer is complete, log in over SSH and run the remaining steps. I am confused about how to achieve that. I can do all of this on localhost, but how can I do it on a remote host?
For simple & dirty work you can use fabric (this is not to say you cannot use fabric to build a serious product).
For heavy configuration routines, you'd be better off picking a configuration management system (e.g., Ansible).
I need to listen for file events on my remote server over FTP. I've found pyinotify for detecting file changes in Python. It detects them well on the local file system and I think it is very good.
But when I gave it an FTP address to watch, it did not accept the path and gave me:
[pyinotify ERROR] add_watch: cannot watch ftp://myuser@myserver/home/myUser WD=-1, Errno=No such file or directory (ENOENT)
I also tried the URL ftp://myuser@myserver, but the result was the same.
So, what am I missing with pyinotify? Is it possible to listen for file changes over FTP with pyinotify?
If it is not available, could you suggest another library to do this?
Thank you
You won't be able to run pyinotify over FTP, NFS, or anything other than local file systems.
You'll need to poll the remote FTP server using an FTP library to detect changes.
Not sure, but maybe it is because ftp://... isn't a folder; it is just a URL. If you want to run pyinotify over FTP, I think you have to upload the file to the server and run it there.
Or you can use ftplib for that. You just list the directory repeatedly and diff the results to get the changes.
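A minimal sketch of that ftplib approach (the server name and login are placeholders); it keeps the previous NLST listing around and reports names that have appeared since:

```python
import time
from ftplib import FTP

def new_names(previous, current):
    """Return the names present in current but not in previous."""
    return sorted(set(current) - set(previous))

def watch(host, user, password, path=".", interval=30):
    """Poll a remote FTP directory, printing newly appeared names."""
    ftp = FTP(host)
    ftp.login(user, password)
    seen = ftp.nlst(path)
    while True:
        time.sleep(interval)
        current = ftp.nlst(path)
        for name in new_names(seen, current):
            print("new:", name)
        seen = current

# watch("myserver", "myuser", "secret")  # placeholders; blocks forever
```

Note this only detects additions and removals; to notice modified files you would need MLSD or MDTM to compare timestamps as well.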
pyinotify works only on local directories and files. To monitor FTP, you can't make use of pyinotify.
I'm trying to download an 80 MB folder from a remote server onto my local machine. I know the file paths are right and I know that the folder exists. My current working code (it works on a single file) is this:
import paramiko

def begin():
    tran = paramiko.Transport(('dns.server.name', 22))
    tran.connect(username='**', password='**')
    sftp = paramiko.SFTPClient.from_transport(tran)
    sftp.get('/remote/file/path', '/local/file/path')
    sftp.close()
    tran.close()
I've tried adding sftp.listdir, but I'm afraid I can't find enough documentation on the subject to make it understandable or usable for me. Is there something available that works like os.walk?
My question is - How do I download small folders via the ssh2 protocol available in paramiko?
What I suggest: instead of transferring the whole folder, first make a temporary compressed tar file on the server programmatically and transfer that tar file over SFTP - it may reduce bandwidth, will work faster, and will be less error prone.
I'm building a system of apps that listen for files dumped into a folder shared on an SMB shared drive. I've mounted the SMB drives on the machine; shouldn't I be able to do a simple open()?
f = open("//drive/location/of/file/file.txt")
lines = f.readlines()
for line in lines:
    print(line)
f.close()
I'm on a Mac with Snow Leopard, I must admit that I am more used to windows development so I just may be misunderstanding something. Any help would be very appreciated.
I've seen a couple of similar Stack Overflow questions, such as this one: Using pysmbc to read files over samba
But these are about connecting programmatically, I already have the drive mapped to my machine.
If the remote SMB drive is mounted, then it'd essentially be "part" of your main file system, and it'd just be
/path/to/mount_point/path/on/remote/server/file.txt
^^^^^^^^^^^^^^^^^^^^- local on mac
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^- remote on other machine
and it would all be considered "local" until file.txt is accessed, at which point the SMB infrastructure takes over and redirects the file operations to the remote machine.
Using the // notation indicates you're trying to reach directly over the network, and the format would be
//name_of_remote_machine/name_of_share/path/to/file.txt
with this, the local file system is not involved, and all operations are immediately handled by the SMB system.