I'm trying to automate the process of transferring files from my remote server to my local directory using Python. I know of libraries like pysftp which are popular, but as far as I can tell these libraries require connections to be initiated from the local side (i.e. from a script that lives on the local machine). Is there a way to transfer files from a script that lives on the remote machine instead?
Thanks in advance.
Related
I want to write a Python script that can copy log files from a remote Windows 10 virtual machine to the script's machine (also Windows), as well as delete files. A developer in my workplace uses WMI with C# to do this kind of thing, but I haven't been able to find anything for Python on this topic.
You can use SSH for that.
Paramiko is an awesome library for running SSH from Python: http://www.paramiko.org/
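If the script has to live on the remote machine, one option is to flip the direction: have it connect back to your machine over SSH and push the files with SFTP. A minimal paramiko sketch, assuming the destination machine runs an SSH server and is reachable from the remote host; hostnames, credentials and paths are placeholders (if the destination is a Windows box running OpenSSH, the same approach works with Windows-style paths):

```python
import paramiko

DEST_HOST = "my-local-machine.example.com"   # assumption: resolvable from the server
DEST_USER = "me"
DEST_PASSWORD = "secret"                     # or use key-based auth instead

ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect(DEST_HOST, username=DEST_USER, password=DEST_PASSWORD)

sftp = ssh.open_sftp()
# put() copies a file from the machine this script runs on (the server)
# to the machine we connected to (your local box).
sftp.put("/var/log/app/app.log", "/home/me/incoming/app.log")
sftp.close()
ssh.close()
```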
I am able to watch local directories using solutions based on the inotify kernel subsystem. There are also some Python projects built on top of inotify, such as pyinotify, PyInotify, fsmonitor and watchdog.
I have mounted a remote FTP server in a local directory using curlftpfs, so syncing is easy now. But inotify cannot watch network mount points the way it watches local directories.
I want to be notified when new files are added to the FTP server. How can I achieve this, like I do for a local directory with an inotify-based solution?
That can hardly work. The FTP protocol has no mechanism for notifying a client about changes. curlftpfs would have to continually poll the remote folder to provide notifications to inotify or a similar tool, and it does not do that.
You have to poll the FTP folder yourself.
See for example Monitor remote FTP directory.
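For illustration, a rough polling sketch using the standard-library ftplib, with placeholder host, credentials and path; it simply lists the directory on an interval and reports names it has not seen before:

```python
import time
from ftplib import FTP

def list_names(host, user, password, path):
    # connect, log in, and return the set of file names in the directory
    with FTP(host) as ftp:
        ftp.login(user, password)
        ftp.cwd(path)
        return set(ftp.nlst())

known = list_names("ftp.example.com", "user", "password", "/incoming")
while True:
    time.sleep(60)  # poll once a minute
    current = list_names("ftp.example.com", "user", "password", "/incoming")
    for name in sorted(current - known):
        print("new file:", name)
    known = current
```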
What would be the best way to download/upload files or directories between a remote Windows server and a local Windows machine using Python?
Modules I have heard of are paramiko and fabric...
Apart from these, are there any other good options?
It depends on the protocol you are using: if the file is big, use UDP; if the file is small, use TCP; and if you need a secure transfer, use SSH. You don't necessarily need paramiko or fabric to communicate with another computer, since they are for SSH connections. If you know the protocol, then it is easier to communicate.
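That said, if you do go with one of the modules mentioned in the question, a short fabric (v2+) sketch might look like this, assuming the Windows server runs an SSH service; host, user, password and paths are placeholders:

```python
from fabric import Connection

conn = Connection(host="winserver.example.com", user="admin",
                  connect_kwargs={"password": "secret"})

conn.get("C:/data/report.csv", "report.csv")   # download remote -> local
conn.put("upload.zip", "C:/data/upload.zip")   # upload local -> remote
conn.close()
```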
I'm trying to run a Python script on a remote Linux machine accessed via SSH (PuTTY). I want to change into the Windows directory and run a program that converts files on the server to CSV and saves them back to the server.
Is it possible to run the program without having to move the files from remote to local, run the conversion, and then move them back from local to remote?
I am not the root user and can't install anything on the Linux machine.
My Windows machine is 64-bit and the Linux machine is 64-bit Ubuntu. Any suggestions?
I found a way to do what I wanted. My initial plan required transferring the files from the local machine to the remote machine, running the script, and transferring the results back, so ultimately it was a function of how fast my internet connection is. Since my local connection isn't that strong, I realized my initial approach was flawed. In the end, I just uploaded my data to the remote machine and ran the script there; it was the fastest solution.
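For anyone doing something similar, a rough paramiko sketch of that approach (upload the data, run the conversion on the server); host, credentials, paths and the script name are placeholders:

```python
import paramiko

ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect("ubuntu-box.example.com", username="me", password="secret")

# upload the raw input files
sftp = ssh.open_sftp()
sftp.put("data/input.dat", "/home/me/data/input.dat")
sftp.close()

# run the converter remotely and wait for it to finish
stdin, stdout, stderr = ssh.exec_command(
    "cd /home/me/data && python convert_to_csv.py input.dat")
print(stdout.read().decode())
print(stderr.read().decode())
ssh.close()
```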
I want to automate the remote deployment that I am currently doing manually.
The process includes:
Make the tar ball from certain folders
SFTP to the remote server
Rename the old folders
Untar the new tar file
Restart apache
The remote system is on the intranet and has no access to the outside internet.
I want to know how I can transfer the file from my Python script and then, once the transfer is complete, log in over SSH and run the remaining steps. I am confused about how to achieve that. On localhost I can do all of this, but how can I do it on a remote host?
For quick-and-dirty work you can use fabric (this is by no means to say that you cannot use fabric to build a serious product).
For heavy configuration routines, you'd be better off picking a configuration management system (e.g., Ansible).
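For illustration, a hedged fabric (v2+) sketch of the whole sequence; host, user, paths and the service name are placeholders, and sudo() assumes the deploy user has the required privileges:

```python
import tarfile
from fabric import Connection

# 1. make the tar ball from the folders to deploy
with tarfile.open("release.tar.gz", "w:gz") as tar:
    tar.add("app")
    tar.add("static")

conn = Connection(host="intranet-server.example.com", user="deploy",
                  connect_kwargs={"password": "secret"})

# 2. SFTP the archive to the remote server
conn.put("release.tar.gz", "/tmp/release.tar.gz")

# 3. rename the old folder, 4. untar the new one
conn.run("mv /srv/app /srv/app.old && mkdir /srv/app")
conn.run("tar -xzf /tmp/release.tar.gz -C /srv/app")

# 5. restart apache
conn.sudo("systemctl restart apache2")
conn.close()
```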