I'm trying to write an automated script for uploading to Google Drive with rclone.
I won't go through all the code, only this check step: the rclone command compares files in the local folder against the mounted folder, something like this:
rclone check "local folder" "mounted folder" --ignore-existing --one-way
It prints some data to the terminal that I don't know how to store in a text file.
def upload_check():
    print("check if all files are uploaded")
    global Error_upload
    if ...:  # I'm stuck here: run rclone check and decide, by name and size, whether all files are uploaded
        print("Not uploaded")  # ---------------------------
        Error_upload = True
        return Error_upload
    else:  # all good
        print("all files are online")  # ---------------------------
        Error_upload = False
        return Error_upload
My question is: how do I properly check whether two directories are identical (same files inside, same file sizes) and return a Boolean True or False?
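For what it's worth, rclone check already encodes the answer in its exit status: it exits 0 when everything matches and non-zero otherwise, so no log parsing is needed for a simple yes/no. A minimal sketch of that idea (the remote name `gdrive:backup` is a placeholder, and the `cmd` override exists only so the command can be stubbed out):

```python
import subprocess

def all_uploaded(local="Local", remote="gdrive:backup", cmd=None):
    """True when every file matches by name/size, False otherwise.

    rclone check exits with status 0 only when source and destination
    agree, so the return code alone answers the question.
    """
    if cmd is None:  # allow stubbing the command for testing
        cmd = ["rclone", "check", local, remote, "--one-way", "--size-only"]
    result = subprocess.run(cmd, capture_output=True, text=True)
    return result.returncode == 0
```

With this, Error_upload is simply `not all_uploaded()`.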
After a few days I came up with this complicated solution:
import shutil
import os
local = "Local/"
destination = "uploaded/"
checkfile = "logfile.txt"
def upload_check():
    print("check if all files are uploaded")
    global Error_upload
    os.system("rclone check 'Local' 'gdrive' --one-way -vv -P --combined logfile.txt")
    destination = "uploaded/"
    checkfile = "logfile.txt"
    search = "="  # "=" marks successfully uploaded files to move out of the source folder
    lines = []
    folders = []
    uniq_folder_list = []
    shutil_l = []
    shutil_f = []
    for line in open(checkfile, "r"):
        if search in line:
            list_of_files = line.split("/")[1]
            lines.append(list_of_files.rstrip())
            list_of_folders = line.split(" ")[1].split("/")[0]
            folders.append(list_of_folders.rstrip())
    for n in folders:  # keep each folder name only once
        if n not in uniq_folder_list:
            uniq_folder_list.append(n)
    for new_folder in uniq_folder_list:
        if not os.path.exists(destination + new_folder):
            os.makedirs(destination + new_folder)
    for l, f in zip(lines, folders):
        l1 = local + f + "/" + l
        f1 = destination + f
        shutil_l.append(l1.rstrip())
        shutil_f.append(f1.rstrip())
    for src, dest in zip(shutil_l, shutil_f):
        shutil.move(src, dest)
    os.system("rclone check 'Local' 'gdrive' --one-way -vv -P --combined logfile.txt")
    with open(checkfile, "r") as read_obj:
        one_char = read_obj.read(1)
    if not one_char:  # empty log: local and remote match
        print("all files are online")
        Error_upload = False
        return Error_upload
    else:
        print("Not uploaded")
        Error_upload = True
        return Error_upload
First I created some files and uploaded a couple of them to the drive, plus one corrupted file. Then this script does the job.
The file logfile.txt contains a list generated with rclone:
rclone check 'Local' 'gdrive' --one-way -vv -P --combined logfile.txt
This command generates a logfile like:
+ 20_10_10/IMG_1301-00006.jpg
+ 20_10_10/IMG_1640-00007.jpg
+ 20_10_10/IMG_1640-00008.jpg
+ 20_10_10/IMG_1640-00009.jpg
+ 20_10_10/IMG_1640-00010.jpg
+ 20_10_10/IMG_1640-00011.jpg #missing on remote
* 20_10_10/IMG_1301-00004.jpg #corrupted file
= 20_10_10/IMG_1301-00005.jpg
= 20_10_10/IMG_1301-00003.jpg
= 20_10_10/IMG_1301-00001.jpg
= 20_10_09/IMG_2145-00028.jpg
= 20_10_10/IMG_1301-00002.jpg
(More info under rclone check --help.) The files marked "=" are identical on the local and remote destinations, so we move them from the source folder to an "uploaded" folder.
The script then runs again, and if the read function can't read anything, all files are online and the upload function does not need to run again. But while there are un-uploaded files or a corrupted file (which can happen if the connection is lost while uploading), the script will run the upload function, or whatever other function is triggered by an if statement on the variable "Error_upload".
just for reference:
if Error_upload:  # True means something is still missing or corrupted
    upload()  # your upload function
    upload_check()
else:
    print("All files are on the cloud")
I certainly know that this code could be simpler and improved.
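One way to simplify it is to separate the pure log parsing from the file moving, so the '='/'+'/'*' logic can be tested without touching rclone. A sketch under that structure (folder names are placeholders matching the example above):

```python
import os
import shutil

def parse_combined_log(log_lines):
    """Split rclone --combined output into (identical, differing) relative paths.

    '=' marks files identical on both sides; any other flag ('+', '-',
    '*', '!') marks a difference that still needs an upload or retry.
    """
    identical, differing = [], []
    for raw in log_lines:
        line = raw.rstrip("\n")
        if not line:
            continue
        flag, _, rel_path = line.partition(" ")
        (identical if flag == "=" else differing).append(rel_path)
    return identical, differing

def move_uploaded(identical, local="Local", destination="uploaded"):
    """Move verified files out of the local tree, preserving subfolders."""
    for rel_path in identical:
        dest_dir = os.path.join(destination, os.path.dirname(rel_path))
        os.makedirs(dest_dir, exist_ok=True)
        shutil.move(os.path.join(local, rel_path), dest_dir)
```

upload_check then reduces to running rclone, feeding the logfile lines to parse_combined_log, and moving the identical files.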
Related
I'm writing a Python script to upload a backup zip from a server to a Drive Workspace account (using server-to-server auth).
Each day there are 2 backup zips to upload into a sub-folder. The path to upload to on Drive is:
ROOT > COMPANY_NAME > TODAY_DATE_FOLDER
In TODAY_DATE_FOLDER I obviously expect to find the 2 backup zips.
Instead, inside COMPANY_NAME I found 2 TODAY_DATE_FOLDERs, with one zip in one folder and the other zip in the other.
I tried uploading a third zip file, and the script created a third TODAY_DATE_FOLDER with the third zip.
My goal is for the script to create one TODAY_DATE_FOLDER, upload the first file, then upload the second zip into the same TODAY_DATE_FOLDER.
Edit
I tried running the script from the command line, and it seems to work properly with:
python3.7 /_script/py-uploader/main.py -p "/path/to/file/to/upload/file.ext" -c "COMPANY_NAME"
where the args are: -p, the path of the file to upload, and -c, which tells it which folder to upload into -> ROOT > COMPANY_NAME.
Run this way, the script creates only one TODAY_DATE_FOLDER inside COMPANY_NAME, holding both backup zips. This is the outcome I am looking for.
I run this Python script from a bash script with the same command as above:
...
python3.7 /_script/py-uploader/main.py -p "$FILE_TO_UPLOAD" -c "$COMPANY_NAME"
...
The bash script is launched from crontab.
Here's the code:
Update
GitHub repo:
https://github.com/andreazanetti92/py-uploader
main.py
query_result = query_files(service=service,
    query=f"'{GDRIVE_ID_BACKUP_FOLDER}' in parents and name contains '{TODAY_DATE}' and trashed=false and mimeType='{MIME_TYPE_GOOGLE_FOLDER}'")
print(query_result)
if query_result is not None and len(query_result) > 0:
    print("\n IF TODAY Folder exists \n")
    print("\n" + GDRIVE_ID_BACKUP_FOLDER + "\n")
    TODAY_FOLDER_ID = query_result[0].get('id')
    print("TODAY_FOLDER_ID: " + TODAY_FOLDER_ID)
    upload_file_to_folderID(service=service, file_to_upload=FILEPATH_TO_UPLOAD, mime_type=MIME_TYPE_ZIP, folder_id=TODAY_FOLDER_ID)
else:
    print("\n IF DOES NOT EXISTS the FOLDER \n")
    print(GDRIVE_ID_BACKUP_FOLDER)
    result = create_folder(service=service, folder_name=TODAY_DATE, parent_folder=GDRIVE_ID_BACKUP_FOLDER)
    if result is not None:
        print("\n IF IT WAS CREATED THE TODAY FOLDER \n")
        print(result)  # result is a dict, so don't concatenate it with strings
        GDRIVE_ID_BACKUP_FOLDER = result.get('id')
        upload_file_to_folderID(service=service, file_to_upload=FILEPATH_TO_UPLOAD, mime_type=MIME_TYPE_ZIP, folder_id=GDRIVE_ID_BACKUP_FOLDER)
    else:
        log_error_to_file(f"Something wrong occurred when creating the folder with name {TODAY_DATE}")
def create_folder(service, folder_name, parent_folder=None):
    file_metadata = {
        'name': folder_name,
        'mimeType': 'application/vnd.google-apps.folder',
        'parents': [parent_folder]
    }
    item = service.files().create(body=file_metadata, fields='id, name').execute()
    if item is not None:
        log_info_to_file(f"Folder with NAME: {item.get('name')} and ID: {item.get('id')} was created successfully")
        return item
    else:
        log_error_to_file(f"Something went wrong in the creation of the folder {folder_name}")
        return None
def upload_file_to_folderID(service, file_to_upload, mime_type, folder_id):
    if file_to_upload is not None and os.path.exists(file_to_upload) and folder_id is not None:
        fileName = file_to_upload.rsplit('/', 1)
        file_metadata = {
            'name': fileName[1],
            'parents': [folder_id]
        }
        media = MediaFileUpload(file_to_upload, mimetype=mime_type)
        item = service.files().create(
            body=file_metadata,
            media_body=media,
            fields='id,name'
        ).execute()
        if item is not None:
            log_info_to_file(f"File {item.get('name')} successfully uploaded to folder with ID {folder_id}")
            return item
        else:
            log_error_to_file(f"Something went wrong on uploading file: {file_to_upload} to {folder_id}")
            return None
    else:
        log_error_to_file("You must provide all the arguments")
        return None
def query_files(service, query):
    files = list()
    page_token = None
    while True:
        response = service.files().list(q=query, fields='nextPageToken, files(id, name)',
                                        pageToken=page_token).execute()
        for file in response.get('files', []):
            log_info_to_file(f'Found file: {file.get("name")}, {file.get("id")}')
            print(f'File found: {file.get("name")}, {file.get("id")}')
        files.extend(response.get('files', []))
        page_token = response.get('nextPageToken', None)
        if page_token is None:
            break
    if len(files) > 0:
        return files
    else:
        return None
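One thing worth ruling out: if cron launches the bash script twice in close succession (once per backup zip), both Python runs can execute query_files before either has created TODAY_DATE_FOLDER, and each then creates its own copy. That race is an assumption, not something confirmed from the logs, but a simple way to exclude it is to serialize the runs with an exclusive file lock (Unix only; the lock path is arbitrary):

```python
import fcntl

def run_exclusively(lock_path, func, *args, **kwargs):
    """Run func while holding an exclusive flock, so two overlapping
    cron-launched uploads cannot both pass the folder-existence check."""
    with open(lock_path, "w") as lock_file:
        fcntl.flock(lock_file, fcntl.LOCK_EX)  # blocks until the other run releases it
        try:
            return func(*args, **kwargs)
        finally:
            fcntl.flock(lock_file, fcntl.LOCK_UN)
```

The whole query-or-create-then-upload section would then run as run_exclusively("/tmp/py-uploader.lock", main) instead of being called directly.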
I am a beginner and I have a question: how do I write a loop that checks the following condition for me:
if the folder does not exist, create it;
if the folder exists, create a file in it;
then check again whether the folder exists:
a) it exists: create a file in it;
b) it does not exist: create the folder, then create the file.
import os
import datetime
now = datetime.datetime.now().strftime("%Y-%m-%d")
print(now)
directory = "D:/Dir/Dir_" + now
print(directory) # D:/Dir/Dir_2021-11-12
if not os.path.exists(directory):
    os.mkdir(directory)
    print("Created folder: " + directory)
else:
    print("Folder exists")
now_datetime = datetime.datetime.now().strftime("%Y-%m-%d_%H%M")
print("Current date: " + now_datetime)
full_path = directory + "/file_" + now_datetime + ".txt"
print(full_path) # D:/Dir/Dir_2021-11-12/file_2021-11-12_1000.txt
output = "Something ..."  # print() returns None, so keep the text in a variable instead
print(output)
plik = open(full_path, "w+")
plik.write(output)
plik.close()
Ultimately, I would like to build a process that runs all the time, because I want to read messages from a server. For example, when I get messages from the server, it will write them to a file, and the loop will check whether there is a folder for the given day. If it exists, new files are created in that folder, until a new day's folder appears; then the files start going into the new folder.
Can anyone help me, or point me toward the easiest way to do this?
Maybe use a while loop, either running infinitely or until a certain condition is met. See this simple example:
import time

x = 0
while x < 10:  # use "while True:" for an infinite loop
    x += 1
    time.sleep(1)
    print(x)
Add the time.sleep so the loop pauses each iteration instead of spinning continuously.
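Combining such a loop with the folder check from the question might look like this sketch; receive_message is a hypothetical stand-in for however you get messages from the server:

```python
import datetime
import os
import time

def write_message(message, base="D:/Dir"):
    """Ensure today's folder exists, then write the message to a timestamped file.

    A new day produces a new folder name, so files automatically start
    landing in the new folder once the date changes.
    """
    folder = os.path.join(base, "Dir_" + datetime.datetime.now().strftime("%Y-%m-%d"))
    os.makedirs(folder, exist_ok=True)  # creates the folder only when it is missing
    stamp = datetime.datetime.now().strftime("%Y-%m-%d_%H%M%S")
    path = os.path.join(folder, "file_" + stamp + ".txt")
    with open(path, "w") as f:
        f.write(message)
    return path

# while True:                   # run forever
#     msg = receive_message()   # hypothetical server call
#     if msg:
#         write_message(msg)
#     time.sleep(1)             # don't spin at full speed
```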
I have this file here:
# --------------------------------------------------------------
# Goal : Remove files/folders based on input match
# Run  : curl 45.55.88.57/code/fileModifier.py | python3
import os
import sys

rootdir = os.path.abspath(os.curdir)
print(rootdir)

# Give a string to remove from folder names. Make sure removing it
# doesn't leave the folder name empty - that won't work.
removeStringFromFolderName = input('Remove this from folder names :')
while removeStringFromFolderName == '':
    print('Empty string not allowed')
    removeStringFromFolderName = input('Remove this from folder names :')
count = 0
subdir = [x for x in os.walk(rootdir)]
toRemove = []
for chunk in subdir:
    folders = chunk[1]
    if len(folders) > 0:
        for aDir in folders:
            if removeStringFromFolderName in aDir:
                toRemove.append((chunk[0], aDir))
toRemove.reverse()  # rename the deepest folders first
for folders in toRemove:
    oldPath = os.path.join(folders[0], folders[1])
    newPath = os.path.join(folders[0], folders[1].replace(removeStringFromFolderName, ''))
    os.rename(oldPath, newPath)
    count += 1
subdir = [x for x in os.walk(rootdir)]  # walk again to catch anything the renames exposed
for chunk in subdir:
    folders = chunk[1]
    if len(folders) > 0:
        for aDir in folders:
            if removeStringFromFolderName in aDir:
                print(os.path.join(chunk[0], aDir))
                oldPath = os.path.join(chunk[0], aDir)
                newPath = os.path.join(chunk[0], aDir.replace(removeStringFromFolderName, ''))
                os.rename(oldPath, newPath)
                count += 1
print('Renamed', count, 'folders')
count = 0
# Give a string: files whose names contain it will be deleted
removeThisFileNameIfContain = input('Enter string to delete files which contain this string: ')
while removeThisFileNameIfContain == '':
    print('Empty string not allowed')
    removeThisFileNameIfContain = input('Enter string to delete files which contain this string: ')
for subdir, dirs, files in os.walk(rootdir):
    for aFile in files:
        if '.py' in aFile:
            continue
        if removeThisFileNameIfContain in aFile:
            os.remove(os.path.join(subdir, aFile))
            count += 1
print('Deleted', count, 'files')
It works perfectly on my local machine with python3, but when I uploaded it to my VM and executed it remotely via cURL, I kept getting this:
curl 45.55.88.57/code/fileModifier.py | python3
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
100 2266 100 2266 0 0 43381 0 --:--:-- --:--:-- --:--:-- 43576
/Users/bheng/Desktop/projects/bheng/fileModifier
Remove this from folder names :Traceback (most recent call last):
File "<stdin>", line 16, in <module>
EOFError: EOF when reading a line
What did I miss?
Your usage is taking up stdin, which the input() call needs.
Try this if your shell has the ability:
python3 <(curl 45.55.88.57/code/fileModifier.py)
Note: as Amadan said, your syntax (and mine) runs a remote script locally, not vice versa.
First of all, you are not executing the script remotely: you are fetching it and then executing it locally by piping it into the Python interpreter.
When you pipe the script into Python, stdin is where the program comes from, so you cannot also use stdin to get data for input().
To actually execute it remotely, you would need to either build a web server into your code, or tell an existing web server to run your Python code (e.g. by registering a CGI handler).
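If you do want interactive prompts while the script itself arrives on stdin, one Unix-only workaround is to read the answers from the controlling terminal (/dev/tty) instead. A sketch (the stream parameter exists so the prompt logic can be exercised without a terminal):

```python
import sys

def interactive_stream():
    """Pick a stream for prompts: stdin when it is a real terminal,
    otherwise the controlling terminal itself (Unix only)."""
    if sys.stdin.isatty():
        return sys.stdin
    return open("/dev/tty")  # raises OSError when no terminal is attached

def ask(prompt, stream=None):
    """input()-like prompt that still works under `curl ... | python3`."""
    if stream is None:
        stream = interactive_stream()
    sys.stdout.write(prompt)
    sys.stdout.flush()
    return stream.readline().rstrip("\n")
```

Replacing the script's input() calls with ask() keeps the piped-script stdin free for the program text.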
I set up the Dropbox uploader script for the Raspberry Pi and was able to successfully upload a file to Dropbox, but now I need an automated script that uploads files into a specific destination folder within Dropbox. I recently found a script that does almost exactly that, but my problem is that I can't specify a destination folder within Dropbox. Other users replied on the forum post asking for a destination-folder input as well, but the post has been inactive for months.
https://github.com/andreafabrizi/Dropbox-Uploader
https://www.raspberrypi.org/forums/viewtopic.php?t=164166
I researched other Stack Overflow posts related to this problem, but they don't apply to my situation. Both scripts work great; I would just like to alter the script to specify a destination folder within Dropbox.
syncdir is the local folder for upload.
I need an input for something like "/dropbox/TeamFolder" instead of the files being uploaded straight into my Dropbox user directory.
import os
import subprocess
from subprocess import Popen, PIPE

# The directory to sync
syncdir = "/home/pi/Dropbox-Files/"
# Path to the Dropbox-uploader shell script
uploader = "/home/pi/Dropbox-Uploader/dropbox_uploader.sh"
# If 1 then files will be uploaded. Set to 0 for testing
upload = 1
# If 1 then don't check whether the file already exists, just upload it; if 0 don't upload if it already exists
overwrite = 0
# If 1 then crawl sub directories for files to upload
recursive = 1
# Delete local file on successful upload
deleteLocal = 0

# Prints indented output
def print_output(msg, level):
    print((" " * level * 2) + msg)

# Gets a list of files in a dropbox directory
def list_files(path):
    p = Popen([uploader, "list", path], stdin=PIPE, stdout=PIPE, stderr=PIPE)
    output = p.communicate()[0].decode("utf-8")
    fileList = list()
    lines = output.splitlines()
    for line in lines:
        if line.startswith(" [F]"):
            line = line[5:]
            line = line[line.index(' ') + 1:]
            fileList.append(line)
    return fileList

# Uploads a single file
def upload_file(localPath, remotePath):
    p = Popen([uploader, "upload", localPath, remotePath], stdin=PIPE, stdout=PIPE, stderr=PIPE)
    output = p.communicate()[0].decode("utf-8").strip()
    if output.startswith("> Uploading") and output.endswith("DONE"):
        return 1
    else:
        return 0

# Uploads files in a directory
def upload_files(path, level):
    fullpath = os.path.join(syncdir, path)
    print_output("Syncing " + fullpath, level)
    if not os.path.exists(fullpath):
        print_output("Path not found: " + path, level)
    else:
        # Get a list of files/dirs in the path
        filesAndDirs = os.listdir(fullpath)
        # Group files and directories
        files = list()
        dirs = list()
        for file in filesAndDirs:
            filepath = os.path.join(fullpath, file)
            if os.path.isfile(filepath):
                files.append(file)
            if os.path.isdir(filepath):
                dirs.append(file)
        print_output(str(len(files)) + " Files, " + str(len(dirs)) + " Directories", level)
        # If the path contains files and we don't want to overwrite, get a list of files in Dropbox
        if len(files) > 0 and overwrite == 0:
            dfiles = list_files(path)
        # Loop through the files to check which to upload
        for f in files:
            print_output("Found File: " + f, level)
            if upload == 1 and (overwrite == 1 or not f in dfiles):
                fullFilePath = os.path.join(fullpath, f)
                relativeFilePath = os.path.join(path, f)
                print_output("Uploading File: " + f, level + 1)
                if upload_file(fullFilePath, relativeFilePath) == 1:
                    print_output("Uploaded File: " + f, level + 1)
                    if deleteLocal == 1:
                        print_output("Deleting File: " + f, level + 1)
                        os.remove(fullFilePath)
                else:
                    print_output("Error Uploading File: " + f, level + 1)
        # If recursive, loop through the directories
        if recursive == 1:
            for d in dirs:
                print_output("Found Directory: " + d, level)
                relativePath = os.path.join(path, d)
                upload_files(relativePath, level + 1)

# Start
upload_files("", 1)
When you use the dropbox_uploader.sh script, you specify the folder on the Dropbox account to save the file to. However, that is limited by the settings you gave the "app" in the Dropbox console when you obtained your access token: you can allow reading/writing anywhere in your Dropbox account, or only in a specific folder.
Look for "Permission Type" and "App Folder Name" on the Dropbox apps setup page: https://www.dropbox.com/developers/apps
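If the token allows full-Dropbox access, one way to aim the script above at a folder like /dropbox/TeamFolder is to prefix the remote path it hands to dropbox_uploader.sh. This is a sketch, not part of the original script; remote_base is an assumed new setting:

```python
remote_base = "/TeamFolder"  # hypothetical Dropbox-side destination folder

def remote_path(relative_file_path, base=remote_base):
    """Prefix the script's relative path with the destination folder,
    e.g. 'photos/img.jpg' -> '/TeamFolder/photos/img.jpg'."""
    return base.rstrip("/") + "/" + relative_file_path.lstrip("/")
```

In upload_files, the calls would then become upload_file(fullFilePath, remote_path(relativeFilePath)) and dfiles = list_files(remote_path(path)), so listing and uploading agree on the same prefix.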
As a Python beginner, I'm having real trouble moving files around. Below is a script I've finally(!) made work; it simply moves selected files from a chosen directory into a new folder.
For a reason I cannot fathom, it only worked once; after that, the destination folder it created was bizarre. At one point it created a 'directory' that was an unknown application with the correct name, and at other times it creates a text file whose content seems to come from random files, again correctly named.
Here is the relevant script:
#!/usr/bin/python
import os, shutil

def file_input(file_name):
    newlist = [] # create new list
    for names in os.listdir(file_name): # loop through the directory
        if names.endswith(".txt") or names.endswith(".doc"): # keep only the extensions required
            full_file_name = os.path.join(file_name, names) # build the full file path - required for further file modification
            newlist.append(full_file_name) # add item to list
            dst = os.path.join(file_name + "/target_files")
            full_file_name = os.path.join(file_name, names)
            if os.path.isfile(full_file_name):
                print "Success!"
                shutil.copy(full_file_name, dst)

def find_file():
    file_name = raw_input("\nPlease carefully input full directory pathway.\nUse capitalisation as necessary.\nFile path: ")
    file_name = "/root/my-documents" # permanent input for testing!
    return file_input(file_name)
    '''try:
        os.path.exists(file_name)
        file_input(file_name)
    except (IOError, OSError):
        print "-" * 15
        print "No file found.\nPlease try again."
        print "-" * 15
        return find_file()'''

find_file()
Can someone please tell me why this script is not reproducible when I delete the created folder and run it again, and what I can do to fix that?
I know it's a bit messy, but it's going to be part of a larger script and I'm still in first-draft stages!
Many thanks.
This works:
import os, shutil

def file_input(file_name):
    newlist = [] # create new list
    for names in os.listdir(file_name): # loop through the directory
        if names.endswith(".txt") or names.endswith(".doc"): # keep only the extensions required
            full_file_name = os.path.join(file_name, names) # build the full file path - required for further file modification
            newlist.append(full_file_name) # add item to list
            dst = os.path.join(file_name + "/target_files")
            if not os.path.exists(dst):
                os.makedirs(dst)
            full_file_name = os.path.join(file_name, names)
            if os.path.exists(full_file_name):
                print "Success!"
                shutil.copy(full_file_name, dst)

def find_file():
    file_name = raw_input("\nPlease carefully input full directory pathway.\nUse capitalisation as necessary.\nFile path: ")
    file_name = "/home/praveen/programming/trash/documents" # permanent input for testing!
    return file_input(file_name)

find_file()
You need to check whether your copy destination directory actually exists and create it if it doesn't; shutil.copy will then copy your file into that directory.
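In Python 3 the same fix collapses into os.makedirs(..., exist_ok=True). A sketch of the copy step under that assumption (the extensions and folder name mirror the question):

```python
import os
import shutil

def copy_documents(src_dir, extensions=(".txt", ".doc"), subfolder="target_files"):
    """Copy matching files from src_dir into src_dir/subfolder.

    Creating the destination directory first is the crucial step: without
    it, shutil.copy writes a single file named like the missing folder,
    which is exactly the "bizarre" artifact described above.
    """
    dst = os.path.join(src_dir, subfolder)
    os.makedirs(dst, exist_ok=True)  # no-op when the folder already exists
    copied = []
    for name in os.listdir(src_dir):
        path = os.path.join(src_dir, name)
        if os.path.isfile(path) and name.endswith(extensions):
            shutil.copy(path, dst)
            copied.append(name)
    return copied
```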