Python - FileNotFoundError when dealing with DMZ

I created a Python script to copy files from a source folder to a destination folder; the script runs fine on my local machine.
However, when I changed the source to a path on a server in a DMZ and the destination to a folder on a local server, I got the following error:
FileNotFoundError: [WinError 3] The system cannot find the path specified: '\reports'
Here is the script:
import sys, os, shutil
import glob
import os.path, time

fob = open(r"C:\Log.txt", "a")
dir_src = r"\reports"
dir_dst = r"C:\Dest"
dir_bkp = r"C:\Bkp"

for w in list(set(os.listdir(dir_src)) - set(os.listdir(dir_bkp))):
    if w.endswith('.nessus'):
        pathname = os.path.join(dir_src, w)
        Date_File = "%s" % time.ctime(os.path.getmtime(pathname))
        print(Date_File)
        if os.path.isfile(pathname):
            shutil.copy2(pathname, dir_dst)
            shutil.copy2(pathname, dir_bkp)
            fob.write("File Name: %s" % os.path.basename(pathname))
            fob.write(" Last modified Date: %s" % time.ctime(os.path.getmtime(pathname)))
            fob.write(" Copied On: %s" % time.strftime("%c"))
            fob.write("\n")

fob.close()
os.system("PAUSE")

Okay, we first need to see what kind of remote folder you have.
If your remote folder is a shared Windows network folder, try mapping it as a network drive: http://windows.microsoft.com/en-us/windows/create-shortcut-map-network-drive#1TC=windows-7
Then you can just use something like Z:\reports to access your files.
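Alternatively, a UNC path can work without mapping a drive at all. A minimal sketch, assuming the DMZ machine exposes the folder as a Windows share (the server and share names below are hypothetical):

import os
import shutil

# Hypothetical UNC path: note the double backslash in \\server\share.
# A bare r"\reports" resolves against the current drive's root, which is
# why WinError 3 reports '\reports' as not found.
dir_src = r"\\dmz-server\reports"
dir_dst = r"C:\Dest"

for name in os.listdir(dir_src):
    if name.endswith('.nessus'):
        shutil.copy2(os.path.join(dir_src, name), dir_dst)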
If your remote folder is actually on a Unix server, you could use paramiko to access it and copy files from it:
import paramiko, sys, os, posixpath, re

def copyFilesFromServer(server, user, password, remotedir, localdir, filenameRegex='*', autoTrust=True):
    # Set up an SSH connection for listing the directory
    sshClient = paramiko.SSHClient()
    if autoTrust:
        sshClient.set_missing_host_key_policy(paramiko.AutoAddPolicy())  # No trust issues! (yes, this could potentially be abused by someone malicious with access to the internal network)
    sshClient.connect(server, username=user, password=password)
    # Set up an SFTP connection for copying files
    t = paramiko.Transport((server, 22))
    t.connect(username=user, password=password)
    sftpClient = paramiko.SFTPClient.from_transport(t)
    # List matching files, filtering out empty lines from the ls output
    stdin, stdout, stderr = sshClient.exec_command('cd {0}; ls | grep {1}'.format(remotedir, filenameRegex))
    fileList = [f for f in stdout.read().decode().split('\n') if f]
    for filename in fileList:
        try:
            sftpClient.get(posixpath.join(remotedir, filename), os.path.join(localdir, filename), callback=None)  # callback could report bytes transferred so far
        except IOError:
            print('Failed to download file <{0}> from <{1}> to <{2}>'.format(filename, remotedir, localdir))
If your remote folder is something served over the WebDAV protocol, I'm just as interested in an answer as you are.
If your remote folder is something else entirely, please explain. I have not yet found a solution that treats them all equally, but I'm very interested in one.

Related

Using Python, how to download multiple files from a subdirectory on an FTP server into a desired directory on the local machine?

Using a Python program, I was able to download multiple source files from an FTP server (using the ftplib and os libraries) to my local machine.
These source files reside in a particular directory on the FTP server.
I was able to download the source files only when I provided the same directory path on my local machine as the FTP directory path.
I am able to download the files into C:\data\abc\transfer, which is the same as the remote directory /data/abc/transfer. The code is insisting that I provide the same directory.
But I want to download all files into my desired directory, C:\data_download\
Below is the code:
import ftplib
import os
from ftplib import FTP

Ftp_Server_host = 'xcfgn#wer.com'
Ftp_username = 'qsdfg12'
Ftp_password = 'xxxxx'
Ftp_source_files_path = '/data/abc/transfer/'

ftp = FTP(Ftp_Server_host)
ftp.login(user=Ftp_username, passwd=Ftp_password)
local_path = 'C:\\data_download\\'
print("connected to remote server: " + Ftp_Server_host)
print()

ftp_clnt = ftp_ssh.open_sftp()  # note: ftp_ssh is never defined; leftover from an SFTP attempt
ftp_clnt.chdir(Ftp_source_files_path)
print("current directory of source file in remote server: " + ftp_clnt.getcwd())
print()

files_list = ftp.nlst(Ftp_source_files_path)
for file in files_list:
    print("local_path: " + local_path)
    local_fn = os.path.join(local_path)
    print(local_fn)
    print('Downloading files from remote server: ' + file)
    local_file = open(local_fn, "wb")
    ftp.retrbinary("RETR " + file, local_file.write)
    local_file.close()
    print()

print("respective files got downloaded")
print()
ftp_clnt.close()
You have to provide a full file path to the open function, not just a directory name.
To assemble a full local path, take the file name from each remote path returned by ftp.nlst and combine it with the target local directory path.
Like this:
local_fn = os.path.join(local_path, os.path.basename(file))
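For context, the corrected download loop then looks like this (a minimal sketch reusing the names from the question):

for file in files_list:
    # Build a full local file path for each remote file
    local_fn = os.path.join(local_path, os.path.basename(file))
    with open(local_fn, "wb") as local_file:
        ftp.retrbinary("RETR " + file, local_file.write)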

Avoid Overriding Existing File [duplicate]

I am using the pysftp library's get_r function (https://pysftp.readthedocs.io/en/release_0.2.9/pysftp.html#pysftp.Connection.get_r) to get a local copy of a directory structure from an SFTP server.
Is that the correct approach for a situation where the contents of the remote directory have changed and I would like to get only the files that changed since the last time the script was run?
The script should be able to sync the remote directory recursively and mirror its state, e.g. with a parameter controlling whether outdated local files (those no longer present on the remote server) should be removed, while any changes to existing files and any new files are fetched.
My current approach is here.
Example usage:
from sftp_sync import sync_dir
sync_dir('/remote/path/', '/local/path/')
Use pysftp.Connection.listdir_attr to get the file listing with attributes (including the file timestamps).
Then, iterate the list and compare against the local files.
import os
import pysftp
import stat

remote_path = "/remote/path"
local_path = "/local/path"

with pysftp.Connection('example.com', username='user', password='pass') as sftp:
    sftp.cwd(remote_path)
    for f in sftp.listdir_attr():
        if not stat.S_ISDIR(f.st_mode):
            print("Checking %s..." % f.filename)
            local_file_path = os.path.join(local_path, f.filename)
            if ((not os.path.isfile(local_file_path)) or
                    (f.st_mtime > os.path.getmtime(local_file_path))):
                print("Downloading %s..." % f.filename)
                sftp.get(f.filename, local_file_path)
Though these days, you should not use pysftp, as it is dead. Use Paramiko directly instead. See pysftp vs. Paramiko. The above code will work with Paramiko too with its SFTPClient.listdir_attr.
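A minimal Paramiko equivalent of the loop above (a sketch; the host and credentials are placeholders, and host-key checking is relaxed for brevity):

import os
import stat
import paramiko

ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())  # accept unknown host keys (not for production)
ssh.connect('example.com', username='user', password='pass')
sftp = ssh.open_sftp()
sftp.chdir("/remote/path")
for f in sftp.listdir_attr():
    if not stat.S_ISDIR(f.st_mode):
        local_file_path = os.path.join("/local/path", f.filename)
        # Download if the file is missing locally or the remote copy is newer
        if (not os.path.isfile(local_file_path)
                or f.st_mtime > os.path.getmtime(local_file_path)):
            sftp.get(f.filename, local_file_path)
sftp.close()
ssh.close()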

How to get a file from a Windows network directory and move it to my Python Django project directory?

I'm programming on macOS using Python + Django. I need to get some files from our private network (a Windows network) and move them to our server. There, Python/Django will read these files and save the data in a database. How can I do that?
What I have tried
source_path = "smb://server-name/GRUPOS/TECNOLOGIA_INFORMACAO/Dashboard Diretoria/"
dest_path = "./static/reports/"  # This is my static folder, where I want to move the file
file_name = "general_reports.csv"

shutil.copyfile(source_path + file_name, dest_path + file_name)
It gives the following error:
[Errno 2] No such file or directory: 'smb://server-name/GRUPOS/TECNOLOGIA_INFORMACAO/Dashboard Diretoria/general_reports.csv'
This path (source_path) I just copied and pasted from the Finder, so I think it's correct. I have already searched StackOverflow and tried other methods, like putting an "r" before the path... Nothing.
Technologies used
Python 3.6;
Django 3.0.5;
Mac OSX;
Windows Network.
Thank you for your help and patience.
You need to use an SMB client library for Python, or mount that share before you work with it.
First of all, thank you @Van de Wack.
This is the complete solution:
Install the pysmb library (https://pypi.org/project/pysmb/):
pip install pysmb
Import the library to your code:
from smb.SMBConnection import SMBConnection
The following code is an example that lists all directories:
server_ip = "10.110.10.10"              # Your server IP - I have put a fake IP :)
server_name = 'myserver'                # The server name for the IP above
share_name = "GRUPOS"                   # The top-level shared folder on the network that you want to connect to
network_username = 'myuser'             # Your network username
network_password = '***'                # Your network password
machine_name = 'myuser#mac-mc70006405'  # Your machine name

conn = SMBConnection(network_username, network_password, machine_name, server_name, use_ntlm_v2=True)
assert conn.connect(server_ip, 139)

files = conn.listPath(share_name, "/TECNOLOGIA_INFORMACAO/Dashboard Diretoria/")
for item in files:
    print(item.filename)
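From there, to actually copy the CSV into the Django static folder, pysmb's retrieveFile can be used. A short sketch reusing the names above and the paths from the question:

# Download the remote CSV into the local static/reports folder
with open("./static/reports/general_reports.csv", "wb") as local_file:
    conn.retrieveFile(share_name, "/TECNOLOGIA_INFORMACAO/Dashboard Diretoria/general_reports.csv", local_file)
conn.close()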

Python - download all folders, subfolders, and files with the python ftplib module

I have been working all day trying to figure out how to use the python ftplib module to download folders, subfolders, and files from an FTP server, but I could only come up with this.
from ftplib import FTP
import sys, ftplib

sys.tracebacklimit = 0    # Does not display traceback errors
sys.stderr = "/dev/null"  # Does not display attribute errors

Host = "ftp.debian.org"
Port = 21
Username = ""
Password = ""

def MainClass():
    global ftp
    global con
    ftp = FTP()
    con = ftp.connect(Host, Port)  # Connects to the host on the specified port

def grabfile():
    source = "/debian/"
    filename = "README.html"
    ftp.cwd(source)
    localfile = open(filename, 'wb')
    ftp.retrbinary('RETR ' + filename, localfile.write)
    ftp.quit()
    localfile.close()

try:
    MainClass()
except Exception:
    print "Not Connected"
    print "Check the address", Host + ":" + str(Port)
else:
    print "Connected"
    if ftplib.error_perm and not Username == "" and Password == "":
        print "Please check your credentials\n", Username, "\n", Password
    credentials = ftp.login(Username, Password)
    grabfile()
This python script will download a README.html file from ftp.debian.org, but I would like to be able to download whole folders, with files and subfolders in them, and I cannot seem to figure that out. I have searched around for different python scripts using this module but cannot find any that do what I want.
Any suggestions or help would be greatly appreciated.
Note:
I would still like to use python for this job but it could be a different module such as ftputil or any other one out there.
Thanks in advance,
Alex
The short solution:
You could possibly just run "wget -r ftp://username:password@ftp.debian.org/debian/*" to get all the files under the debian directory.
Then you can process the files in Python.
The long solution:
You can walk the tree with ftplib: get a directory listing, parse it, fetch each file, and recurse into subdirectories, as sketched below.
If you search the web you'll find previous posts on Stack Overflow that solve this issue.
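For illustration, a minimal recursive-download sketch with ftplib. It assumes the server returns Unix-style LIST output, where a leading 'd' marks a directory; that is common but not guaranteed by the FTP protocol:

import os
from ftplib import FTP

def download_tree(ftp, remote_dir, local_dir):
    os.makedirs(local_dir, exist_ok=True)
    entries = []
    ftp.dir(remote_dir, entries.append)        # collect raw LIST lines
    for line in entries:
        name = line.split(maxsplit=8)[-1]      # the last field is the entry name
        remote_path = remote_dir.rstrip('/') + '/' + name
        local_path = os.path.join(local_dir, name)
        if line.startswith('d'):               # 'd' marks a directory in Unix-style listings
            download_tree(ftp, remote_path, local_path)
        else:
            with open(local_path, 'wb') as f:
                ftp.retrbinary('RETR ' + remote_path, f.write)

ftp = FTP('ftp.debian.org')
ftp.login()                                    # anonymous login
download_tree(ftp, '/debian/doc', 'debian_doc')
ftp.quit()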

Python script to get files from one server to another and store them in separate directories?

I am working on server 1. I need to write a Python script that connects to server 2, gets certain files (those whose names begin with the letters 'HM') from a directory, and puts them into another directory on server 1, which needs to be created at run time (because each run of the program has to create a new directory and dump the files there).
I need to do this in Python and I'm relatively new to the language. I have no idea where to start with the code. Is there a solution that doesn't involve tarring the files? I have looked at Paramiko, but to my knowledge that transfers only one file at a time. I have even looked at glob, but I cannot figure out how to use it.
To transfer the files, you might want to check out paramiko:
import os
import paramiko

localpath = '~/pathNameForToday/'
os.system('mkdir ' + localpath)
# server, username, password, and remotepath are placeholders for your own values
ssh = paramiko.SSHClient()
ssh.load_host_keys(os.path.expanduser(os.path.join("~", ".ssh", "known_hosts")))
ssh.connect(server, username=username, password=password)
sftp = ssh.open_sftp()
sftp.get(remotepath, localpath)
sftp.close()
ssh.close()
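Since Paramiko's get() does transfer one file at a time, you can list the remote directory over SFTP and call it in a loop for each matching file. A minimal sketch (the hostname, credentials, and paths below are placeholders):

import os
import paramiko

server = 'server2'        # placeholder hostname
username = 'user'         # placeholder credentials
password = 'secret'
remotedir = '/server2/filedir'
localpath = os.path.expanduser('~/pathNameForToday')
os.makedirs(localpath, exist_ok=True)   # new directory for this run

ssh = paramiko.SSHClient()
ssh.load_host_keys(os.path.expanduser(os.path.join("~", ".ssh", "known_hosts")))
ssh.connect(server, username=username, password=password)
sftp = ssh.open_sftp()
for name in sftp.listdir(remotedir):    # one get() per matching file
    if name.startswith('HM'):
        sftp.get(remotedir + '/' + name, os.path.join(localpath, name))
sftp.close()
ssh.close()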
If you want to use glob, you can do this:
import os
import re
import glob

filesiwant = re.compile('^HM.+')  # if your files follow a more specific pattern and you don't know regular expressions, give me a sample name and I'll give you the regex for it
path = '/server2/filedir/'
for infile in glob.glob(os.path.join(path, '*')):
    if filesiwant.match(os.path.basename(infile)):  # match against the name, not the full path
        print "current file is: " + infile
Otherwise, an easier alternative is to use os.listdir():
import os
for infile in os.listdir('/server2/filedir/'):
    ...
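A filled-in version of that alternative might look like this (the startswith('HM') check stands in for the regex above):

import os
src = '/server2/filedir/'
for infile in os.listdir(src):
    if infile.startswith('HM'):
        print("current file is: " + os.path.join(src, infile))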
Does that answer your question? If not, leave comments.
Python wouldn't be my first choice for this task, but you can make calls to the system and run mkdir and rsync. In particular, you could do:
import os
os.system("mkdir DIRECTORY")
os.system("rsync -cav user@server2:/path/to/files/HM* DIRECTORY/")
Just use ssh and tar; no need to get Python involved:
$ ssh server2 tar cf - HM* | tar xf -
The remote tar pipes straight into the local tar.
You could use fabric. Create fabfile.py on server1:
import os
from fabric.api import get, hosts

@hosts('server2')
def download(localdir):
    os.makedirs(localdir)  # create dir or raise an error if it already exists
    return get('/remote/dir/HM*', localdir)  # download HM files to localdir
And run: fab download:/to/dir from the same directory in a shell (fabfile.py is to fab as Makefile is to make).
