I am working on a script to copy images from a local directory to a remote directory. I am trying to use scp.
In my directory there is a folder that contains subfolders with images in them. There are also images sitting directly in that main folder, not in any subfolder. I am trying to iterate through the subfolders and the individual images, sort them by company, and create corresponding company folders on the remote directory so the images end up organized when copied over.
I am having problems creating the new company folder in the remote directory.
This is what I have:
def imageSync():
    path = os.path.normpath("Z:\Complete")
    folders = os.listdir(path)
    subfolder = []
    # separates subfolders from just images in the Complete folder
    for folder in folders:
        if folder[len(folder)-3:] == "jpg":
            pass
        else:
            subfolder.append(folder)
    p = dict()
    for x in range(len(subfolder)):
        p[x] = os.path.join(path, subfolder[x])
    sub = []
    for location in p.items():
        sub.append(location[1])
    noFold = []
    for s in sub:
        path1 = os.path.normpath(s)
        images = os.listdir(path1)
        for image in images:
            name = image.split("-")
            comp = name[0]
            pathway = os.path.join(path1, image)
            path2 = "scp " + pathway + " blah#192.168.1.10: /var/files/ImageSync/" + comp
            pathhh = os.system(path2)
            if not os.path.exists(pathhh):
                noFold.append(image)
There's more to the code, but I figured the top part would help explain what I am trying to do.
I have created an SSH key in hopes of making os.system work, but os.system(path2) is returning 1 when I would like it to be the path to the remote server. (I tried this: How to store the return value of os.system that it has printed to stdout in python?)
Also how do I properly check to see if the company folder in the remote directory already exists?
I have looked at Secure Copy File from remote server via scp and os module in Python and How to copy a file to a remote server in Python using SCP or SSH? but I guess I am doing something wrong.
I'm new to Python so thanks for any help!
Try this to copy directories and nested subdirectories from local to remote:
cmd = "sshpass -p {} scp -r {}/* root@{}://{}".format(
    remote_root_pass,
    local_path,
    remote_ip,
    remote_path)
os.system(cmd)
Don't forget to import os.
You can check the exit code returned (0 means success).
You might also need to "yum install sshpass".
And in /etc/ssh/ssh_config, change
StrictHostKeyChecking ask
to:
StrictHostKeyChecking no
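A minimal sketch of that exit-code check, capturing the value os.system returns for the cmd built above:
exit_code = os.system(cmd)   # 0 means the scp copy succeeded
if exit_code != 0:
    print("copy failed with exit code %s" % exit_code)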
How do I get a list of every single directory under an Azure container using Python? I can't seem to find documentation on how to do this, and at the same time I'm new to Azure and its terminology, so that probably doesn't help either.
For example, I may have a container named "sales_data" and underneath it I may have:
product1/usa
product1/international
sales_reps/data
stores/50_different_subfolders_here
new_folders_created_all_the_time_here/new_sub_folders_here
I'd like a list that lays out the full path of each directory.
Where do I go from here?
blobService = BlockBlobService(account_name = my_account_name, token_credential=my_azure_creds)
I've figured out how to get the listing of all files in a terminal directory such as what is below, but again I can't find the pathing instructions...
prefix_objects = blobService.list_blobs('sales_data', prefix='/product1/usa/')
for each_file in prefix_objects:
    print(each_file.name)
Here's a full example (for version 12.0.0 of the SDK) which will find the full path of all files under a certain container.
To run the following code, you will need to retrieve the connection string for the storage account you are interested in.
import os
from azure.storage.blob import BlobServiceClient

def ls_files(client, path, recursive=False):
    '''
    List files under a path, optionally recursively
    '''
    if not path == '' and not path.endswith('/'):
        path += '/'

    blob_iter = client.list_blobs(name_starts_with=path)
    files = []
    for blob in blob_iter:
        relative_path = os.path.relpath(blob.name, path)
        if recursive or not '/' in relative_path:
            files.append(relative_path)
    return files

# Connection string for the storage account.
connection_string = ''

# Name of the container you are interested in.
container_name = 'sales_data'

blob_service_client = BlobServiceClient.from_connection_string(connection_string)
client = blob_service_client.get_container_client(container_name)
files = ls_files(client, '', recursive=True)
Note: The function ls_files comes from this repo.
All credits for that source code go to the author, rakshith91.
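Since the original question asked for directory paths rather than file names, here is a minimal follow-up sketch (assuming the files list returned by ls_files above) that derives every virtual "directory" prefix from the blob names; blob storage has no real folders, only name prefixes:
directories = set()
for f in files:
    parts = f.split('/')[:-1]                 # drop the file name, keep the folder parts
    for i in range(1, len(parts) + 1):
        directories.add('/'.join(parts[:i]))  # record every intermediate prefix

for d in sorted(directories):
    print(d)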
I am trying to create a directory in the home path, checking with os.path.exists() whether it already exists there before re-creating it, but it's not working as expected.
if os.access("./", os.W_OK) is not True:
    print("Folder not writable")
    dir_name_tmp = subprocess.Popen('pwd', stdout=subprocess.PIPE, shell=True)
    dir_name_tmp = dir_name_tmp.stdout.read()
    dir_name = dir_name_tmp.split('/')[-1]
    dir_name = dir_name.rstrip()
    os.system('ls ~/')
    print "%s"%dir_name
    if not os.path.exists("~/%s"%(dir_name)):
        print "Going to create a new folder %s in home path\n"%(dir_name)
        os.system('mkdir ~/%s'%(dir_name))
    else:
        print "Folder %s Already Exists\n"%(dir_name)
        os.system('rm -rf ~/%s & mkdir ~/%s'%(dir_name, dir_name))
else:
    print("Folder writable")
Output for the first time:
Folder not writable
Desktop Downloads Perforce bkp doc project
hello.list
Going to create a new folder hello.list in home path
Output for the 2nd time:
Folder not writable
Desktop Downloads Perforce bkp doc hello.list project
hello.list
Going to create a new folder hello.list in home path
mkdir: cannot create directory `/home/desperado/hello.list': File exists
It's not going into the else branch even though the directory exists. Am I missing something? Please share your inputs!
Updated working code with the suggestions provided, using the $HOME directory and os.path.expanduser:
if os.access("./", os.W_OK) is not True:
    log.debug("Folder Is Not writable")
    dir_name_tmp = subprocess.Popen('pwd', stdout=subprocess.PIPE, shell=True)
    dir_name_tmp = dir_name_tmp.stdout.read()
    dir_name = dir_name_tmp.split('/')[-1]
    dir_name = dir_name.rstrip()
    log.debug("dir_name is %s"%dir_name)
    dir_name_path = (os.path.expanduser('~/%s'%(dir_name))).rstrip()
    log.debug("dir_name_path is %s"%(dir_name_path))
    # if not os.path.exists('~/%s'%(dir_name)):
    if not os.path.exists('%s'%(dir_name_path)):
        log.debug("Going to create a new folder %s in home path\n"%(dir_name))
        os.system('mkdir $HOME/%s'%(dir_name))
    else:
        log.debug("Folder %s Already Exists\n"%(dir_name))
        os.system('rm -rf %s'%(dir_name_path))
        os.system('mkdir $HOME/%s'%(dir_name))
else:
    log.debug("Folder Is Writable")
The tilde symbol ~ representing the home directory is a shell convention. It is expanded by the shell when you call os.system, but it is taken literally by Python functions such as os.path.exists.
So you create <HOME>/<DIR>, but test for ~/<DIR>.
As mentioned by VPfB, the tilde symbol is understood literally by Python. To fix this, you need to get your actual home directory.
Now, on different platforms, there are different paths for the home directory.
To get the home directory, os.path.expanduser will be useful.
>>> import os
>>> os.path.expanduser("~")
'/Users/ashish'
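Applied to the code in the question, a minimal sketch (assuming dir_name has already been computed as above) expands the home directory before testing for and creating the folder:
import os

# Build the real absolute path instead of a literal "~/..." string.
target = os.path.join(os.path.expanduser("~"), dir_name)

if not os.path.exists(target):
    os.makedirs(target)   # create the folder under the home directory
else:
    print("Folder %s already exists" % target)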
I need to upload an image file to a certain folder. It works fine on localhost, but if I push the app to Heroku it tells me:
IOError: [Errno 2] No such file or directory: 'static/userimg/1-bild-1.jpg'
Which means it can't find the folder?
I need to store the image files there for a few seconds to perform some actions on them. After that they will be sent to AWS and deleted from the folder.
This is the code where I save the images into the folder:
i = 1
for key, file in request.files.iteritems():
    if file:
        filename = secure_filename(str(current_user.id) + "-bild-" + str(i) + ".jpg")
        file.save(os.path.join(app.config['UPLOAD_FOLDER'], filename))
        i = i + 1
Later I get the files from the folder like this:
for item in os.listdir(os.path.join(app.config['UPLOAD_FOLDER'])):
    if item.startswith(str(current_user.id)) and item.split(".")[0].endswith("1"):
        with open(os.path.join(app.config['UPLOAD_FOLDER'], item), "rb") as thefile:
            data = base64.b64encode(thefile.read())
            upload_image_to_aws_from_image_v3('MYBUCKET', "userimg/", data, new_zimmer, "hauptbild", new_zimmer.stadt, new_zimmer.id)
            os.remove(str(thefile.name))
That is the upload folder in app config:
UPLOAD_FOLDER = "static/userimg/"
Everything works fine on localhost.
I had to add the absolute path to the directory, because the path on the server is different.
You can run heroku run bash for your app through the CLI and check the directories with dir. Then use cd .. to go back or cd <directory name> to go into a directory.
I had to add
MYDIR = os.path.dirname(__file__)
and replace all
os.path.join(app.config['UPLOAD_FOLDER']
with
os.path.join(MYDIR + "/" + app.config['UPLOAD_FOLDER']
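A minimal sketch of what the save call looks like after that change (assuming app, UPLOAD_FOLDER, file, and filename from the snippets above; os.path.join is used here instead of string concatenation):
import os

# Directory of the module that defines the Flask app.
MYDIR = os.path.dirname(__file__)

# Resolve the upload folder against the app's location, not the process's working directory.
upload_path = os.path.join(MYDIR, app.config['UPLOAD_FOLDER'])
file.save(os.path.join(upload_path, filename))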
Some information on the topic can also be found here:
Similar question
GITHUB
Hi,
This is a pretty basic script I have started in Python.
I intend eventually, in years to come lol, to use it to back up from my server (FTP) to my local backup drive (E:), then back up again to my other local backup drive (D:).
So far I'm buzzing.
I managed to copy, using copytree, from a directory I specify in the script to another directory that I specify in the script,
local E: to local D:.
My next task, well, next mission, is to be able to input the directory I wish to copy from and copy to.
I assumed this would be as simple as changing the
src =
to
src = input()
but alas this causes an error.
Anyway, here is the script, and I would appreciate any input, as Python, though fun, is still an alien language to me lol.
import shutil, os, datetime, time, sys
newFolderName = datetime.datetime.now()
newFolder = newFolderName.strftime('%m%d%Y')
src = input('type the src dir: ')
dst = input('type the dest dir: ' +
            newFolderName.strftime('%d-%B-%y\\').upper())
shutil.copytree(src, dst)
I have to download files from the web over several requests. The downloaded files for each request have to be put in a folder with the same name as the request number.
For example:
My script is now running to download files for request number 87665. So all the downloaded files are to be put in the destination folder Current Download\Attachment87665. So how do I do that?
What I have tried so far:
my_dir = "D:\Current Download"
my_dir = os.path.expanduser(my_dir)
if not os.path.exists(my_dir):
    os.makedirs(my_dir)
But it doesn't meet my original requirement. Any idea how to achieve this?
Just create a path beforehand, via os.path.join:
import os

request_number = 82673

# base dir
_dir = "D:\Current Download"

# create dynamic name, like "D:\Current Download\Attachment82673"
_dir = os.path.join(_dir, 'Attachment%s' % request_number)

# create 'dynamic' dir, if it does not exist
if not os.path.exists(_dir):
    os.makedirs(_dir)
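A hypothetical usage sketch (the file name and content below are placeholders, not from the original post) showing how a downloaded attachment would then be written into that per-request folder:
# Hypothetical example: write one downloaded attachment into the per-request folder.
file_name = 'attachment_%s.pdf' % request_number   # placeholder file name
downloaded_bytes = b''                             # placeholder for the downloaded content
with open(os.path.join(_dir, file_name), 'wb') as f:
    f.write(downloaded_bytes)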