How to copy a folder from a server (Linux) to a local machine (Windows) in Python?
I tried the following code, but it did not work:
from distutils.dir_util import copy_tree
copy_tree("source_path", "destination_path")
The copy_tree command works when copying a folder locally on my machine, but when I use the same command to copy a folder from the server to my local machine, it does not work.
Is there any other method, or are changes needed?
You need to use SSH, SCP, or SFTP to transfer files from host to host.
I do this a lot and like to use SSH and SCP. You can run an SSH server on your Windows machine using OpenSSH. Here is a good set of instructions from WinSCP: https://winscp.net/eng/docs/guide_windows_openssh_server.
I recommend using Paramiko for SSH with Python. Here is a good answer showing how this works with Python: https://stackoverflow.com/a/38556344/634627.
If you set up OpenSSH, you could also do this with SFTP; sometimes I find this more suitable than SCP. Here is a good answer showing how that works: https://stackoverflow.com/a/33752662/634627
The trick is getting OpenSSH running on your Windows host and setting up SSH keys so your server can authenticate to your localhost.
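For example, here is a minimal sketch of the SFTP approach with Paramiko, pulling a remote directory down to a Windows path (the host, credentials, and paths are placeholders, and the recursion assumes the remote tree contains only plain files and directories):

import os
import stat
import paramiko

def download_dir(sftp, remote_dir, local_dir):
    # Recursively download a remote directory tree over SFTP.
    os.makedirs(local_dir, exist_ok=True)
    for entry in sftp.listdir_attr(remote_dir):
        remote_path = remote_dir + "/" + entry.filename
        local_path = os.path.join(local_dir, entry.filename)
        if stat.S_ISDIR(entry.st_mode):
            download_dir(sftp, remote_path, local_path)
        else:
            sftp.get(remote_path, local_path)

ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect('server_ip', username='user', password='password')  # placeholders
sftp = ssh.open_sftp()
download_dir(sftp, '/remote/folder', r'C:\local\folder')
sftp.close()
ssh.close()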
Using copy_tree should work if:
the folder on the server is made available to the Windows machine (e.g. as a network share);
you have sufficient access permissions;
you use a raw string for the Windows path to prevent backslash escape sequences from being interpreted.
(Note that distutils was removed in Python 3.12; shutil.copytree is the modern equivalent.)
Regarding point 3: try print('c:\test\robot') and watch \t become a tab and \r a carriage return:
In [1]: print('c:\test\robot')
obot est
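A raw string leaves the backslashes alone, so the path survives intact:
In [2]: print(r'c:\test\robot')
c:\test\robot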
How to copy a file on a remote server from /maindir/fil1.txt to a subdirectory /maindir/subdir/file1.txt? I have SFTP implemented using Paramiko, but it always expects a local path to copy from.
filename_full_path='/maindir/fil1.txt'
destfilename_full_path='/maindir/subdir/file1.txt'
sftp.put(filename_full_path, destfilename_full_path)
How do I tell SFTP that the source path is also on the remote host?
The core SFTP protocol does not support copying remote files.
There is a draft copy-data/copy-file extension to the SFTP protocol.
But in the most widespread SFTP server, OpenSSH, copy-data is supported only since the very recent version 9.0. Other servers that do support the extension are ProFTPD mod_sftp and the Bitvise SFTP server.
So even if Paramiko supported it (it does not), it would probably be of no use to you.
Alternatives:
Download the file and reupload it to the new location (a pure SFTP solution) – see the sketch after this list.
Use the cp command in an "exec" channel (not SFTP anymore; requires shell access) – use SSHClient.exec_command, as shown in the answer below.
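A minimal sketch of the first alternative, assuming sftp is an already-connected Paramiko SFTPClient and using a temporary local file as the intermediate stop:

import os
import tempfile

# Round-trip the file through the local machine: download, then reupload.
with tempfile.NamedTemporaryFile(delete=False) as tmp:
    local_copy = tmp.name
sftp.get('/maindir/fil1.txt', local_copy)
sftp.put(local_copy, '/maindir/subdir/file1.txt')
os.remove(local_copy)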
Many people confuse copying with moving. Moving a file to another folder is supported.
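If a move is actually what you want, plain SFTP can do it; a sketch, again assuming a connected Paramiko sftp client:

# Move (not copy) the remote file into the subdirectory.
# posix_rename() also exists and overwrites an existing target on servers supporting that extension.
sftp.rename('/maindir/fil1.txt', '/maindir/subdir/file1.txt')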
You can try it in the below way:

import paramiko

a = paramiko.SSHClient()
a.set_missing_host_key_policy(paramiko.AutoAddPolicy())
a.connect('ip', username='user', password='passw')
stdin, stdout, stderr = a.exec_command("cp /source/file /target/file")
# Block until the command finishes and check that it succeeded.
exit_status = stdout.channel.recv_exit_status()
You can't copy or move files between machines the way you normally would within a single operating system.
You need to do something like the following to transfer a file:
import paramiko

ssh_client = paramiko.SSHClient()
ssh_client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh_client.connect(hostname='hostname', username='something', password='something_unique')
ftp_client = ssh_client.open_sftp()
ftp_client.put('localfilepath', 'remotefilepath')    # upload a file from the local to the remote machine
# ftp_client.get('remotefilepath', 'localfilepath')  # download a file from the remote machine
ftp_client.close()
I want to write a Python script that can copy log files from a remote Windows 10 virtual machine to the script's machine (Windows), as well as delete files. A developer at my workplace uses WMI with C# for this kind of thing, but I haven't been able to find anything on this topic for Python.
You can use SSH for that.
Paramiko is an awesome library that can run SSH in Python: http://www.paramiko.org/
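Assuming the Windows 10 VM runs an OpenSSH server, here is a hedged sketch that fetches a log file over SFTP and then deletes it on the VM (host, credentials, and paths are placeholders):

import paramiko

ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect('vm_hostname', username='user', password='password')
sftp = ssh.open_sftp()
# Copy the log to this machine, then remove it from the VM.
sftp.get('C:/logs/app.log', r'C:\local\logs\app.log')
sftp.remove('C:/logs/app.log')
sftp.close()
ssh.close()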
I want to download/upload files from/to remote Windows 2008 R2 servers using my Python script. The problem is that I do not want to install anything extra on my Windows server boxes. I want to achieve this using just my normal login credentials.
Below are the different methods I have heard of:
Use Paramiko SSH: but to use this we have to install an SSH service on the remote box, which I do not want to do.
Use the Python wmi module: but I guess it does not have the functionality to download files from remote servers.
Mount drives on your local box: I also do not want to do this, as there will be a lot of machines I want to connect to.
Use WinSCP: I guess it will also require SSH?
Fabric: I have heard of this, but I am not sure what its prerequisites are.
Are there any other methods with which I can achieve this?
When in Windows, do as the Windows users do.
If you can't install additional software on the server, you need to mount the drive and interact with the remote files as if they were local files.
You mention that you have too many remote servers to connect to. Why not pick one drive letter, and reuse it for each server you need to connect to?
With net use you can mount from the command line.
Syntax for net use
net use p: /delete /yes
net use p: \\remote_host\share_name password /user:domain\user
Use Python's subprocess package to run the mount commands (see the subprocess tutorial).
import subprocess

# Make sure the drive isn't mounted.
try:
    subprocess.call('net use p: /delete /yes', shell=True)
except OSError:
    # This might fail if the drive wasn't in use.
    # As long as the next net use works, we're good.
    pass

for host in ("host1", "host2"):
    # Mount (map) the remote share; raw strings keep the backslashes literal.
    subprocess.call(r'net use p: \\%s\share_name password /user:domain\user' % host, shell=True)
    with open(r"p:\path\remote-file.txt", "r") as remote_file:
        data = remote_file.read()  # do stuff with the file
    # Dismount (unmap) the drive.
    subprocess.call('net use p: /delete /yes', shell=True)
(I don't have a Windows box and network to test this on.)
Use the win_unc library: http://covenanteyes.github.io/py_win_unc/
This will allow you to do normal modifications to file paths as well as log in as a different user.
I am using a subprocess to run a program on another machine through a mounted network file system.
For the first step in this project I have been using an sshfs mount and wget:

# wget_exec holds the full wget command line (defined elsewhere)
sys_process = subprocess.Popen(wget_exec.split(), stdout=subprocess.PIPE, stderr=subprocess.PIPE)

Using the command wget works perfectly.
Using the command /mnt/ssh_mount/wget does not execute.
System libraries:
my remote system is Arch Linux, which calls for libpcre.so.1
my local system is Ubuntu, which uses libpcre3, so libpcre.so.1 is missing
I know this because when I call the wget command through the ssh mount (/mnt/ssh_mount/bin/wget) it throws an error. I do not wish to install the needed libraries on every system using this, as it defeats the purpose of trying to run something remotely.
For good measure, checks for permissions have been made.
How do I get the command to use the local libraries?
I hope to use NFS as well, which would rule out the solutions below:
Python subprocess - run multiple shell commands over SSH
Paramiko
I have tried (with no success):
os.chdir('/mnt/ssh_mount')
It still fails with 'Error while loading shared libraries: libpcre.so.0: cannot open shared object file: No such file or directory', and hard-coding the mount point assumes it is stable, which would mean changes in two places whenever the environment changes (this seems wrong coming from a database normalization background; I would assume the same applies to code/sysadmin work).
You are not actually running the wget command on the remote machine - you are trying to run the remote machine's binary on your local system, and the command is failing due to incompatible library versions. sshfs, nfs, and other types of network mounting protocols simply mount the remote filesystem as an extension of your local one - they don't allow for remote execution. To do that, you'll need to open a remote shell using ssh tunneling and execute the Arch wget command through that.
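A minimal sketch of that last step, running the remote machine's own wget over ssh from the Python script (the host and URL are placeholders, and this assumes key-based ssh login is set up, which the question had hoped to avoid):

import subprocess

# Run wget on the remote Arch machine itself, so it links against
# the remote system's libraries instead of the local ones.
result = subprocess.run(
    ["ssh", "user@remote_host", "wget", "http://example.com/file"],
    stdout=subprocess.PIPE, stderr=subprocess.PIPE,
)
print(result.returncode, result.stderr.decode())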