I want to upload a file to multiple folders in Google Drive via the API, but so that only one file is stored, not a separate copy per folder (one file appearing in several folders).
An example of doing this by hand: Add the Same File to Multiple Folders in Google Drive without Copying
Could you please help me? Thank you!
To insert a file into a folder, you need to specify the correct folder ID in the parents property of the file. Using Python:
from googleapiclient.http import MediaFileUpload

folder_id = '0BwwA4oUTeiV1TGRPeTVjaWRDY1E'
file_metadata = {
    'name': 'photo.jpg',
    'parents': [folder_id]
}
media = MediaFileUpload('files/photo.jpg',
                        mimetype='image/jpeg',
                        resumable=True)
file = drive_service.files().create(body=file_metadata,
                                    media_body=media,
                                    fields='id').execute()
print('File ID: %s' % file.get('id'))
As further mentioned in Files: insert, setting the parents[] property in the request body will put the file in all of the provided folders. If no folders are provided in the parents[] field, the file will be placed in the root folder.
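For the multiple-folder case in the question, the same create call can list several folder IDs, assuming the Drive API version in use still accepts multiple parents on create as described above. A minimal sketch (the two folder IDs are placeholders, and the media object from the snippet above is reused):
# Hypothetical folder IDs; the file is created once and appears in both folders.
folder_id_1 = '<FOLDER_ID_1>'
folder_id_2 = '<FOLDER_ID_2>'
file_metadata = {
    'name': 'photo.jpg',
    'parents': [folder_id_1, folder_id_2]
}
file = drive_service.files().create(body=file_metadata,
                                    media_body=media,
                                    fields='id, parents').execute()
print('File ID: %s' % file.get('id'))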
Hope that helps!
I've written the following code to upload a file to Azure Blob Storage using Python:
from azure.storage.blob import ContainerClient

blob_service_client = ContainerClient(account_url="https://{}.blob.core.windows.net".format(ACCOUNT_NAME),
                                      credential=ACCOUNT_KEY,
                                      container_name=CONTAINER_NAME)
blob_service_client.upload_blob("my_file.txt", open("my_file.txt", "rb"))
This works fine. How can I upload an entire folder, with all its files and subfolders, while keeping the structure of my local folder intact?
After reproducing this on my end, I was able to achieve your requirement using the os module. Below is the complete code that worked for me.
import os
from azure.storage.blob import ContainerClient

# Create the container client once, outside the loop
blob_service_client = ContainerClient(account_url=ACCOUNT_URL,
                                      credential=ACCOUNT_KEY,
                                      container_name=CONTAINER_NAME)

dir_path = r'<YOUR_LOCAL_FOLDER>'
for path, subdirs, files in os.walk(dir_path):
    for name in files:
        fullPath = os.path.join(path, name)
        print("FullPath : " + fullPath)
        # Use the path relative to dir_path as the blob name, with forward
        # slashes so the folder structure is preserved in the container
        fileName = os.path.relpath(fullPath, dir_path).replace(os.sep, "/")
        print("File Name : " + fileName)
        print("\nUploading to Azure Storage as blob:\n\t" + fileName)
        with open(fullPath, "rb") as data:
            blob_service_client.upload_blob(fileName, data)
Below is the folder structure on my local machine.
├───g.txt
├───h.txt
├───Folder1
│   ├───z.txt
│   └───y.txt
└───Folder2
    ├───a.txt
    ├───b.txt
    └───SubFolder1
        ├───c.txt
        └───d.txt
RESULTS: the files were uploaded as blobs named g.txt, h.txt, Folder1/z.txt, Folder1/y.txt, Folder2/a.txt, Folder2/b.txt, Folder2/SubFolder1/c.txt and Folder2/SubFolder1/d.txt, mirroring the local folder structure.
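To double-check the result, the uploaded blob names can be listed with the same client; a minimal sketch:
# List the blobs in the container to confirm the preserved structure
for blob in blob_service_client.list_blobs():
    print(blob.name)  # e.g. Folder1/z.txt, Folder2/SubFolder1/c.txt, ...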
I have a problem which was hard to describe in the title. I have this script, written with a lot of help from @Tanaike. This script does basically two things:
Deletes files from a Google Drive folder, matching the filenames in the local folder CSVtoGD (using spreadsheet IDs)
then:
Uploads the list of CSV files from the local folder "CSVtoGD" to the Google Drive folder
I have a big problem now and cannot work it out. The script deletes old files in Google Drive when the same filenames exist in CSVtoGD. But when I add a new file to the local CSVtoGD folder, I get the error "list index out of range" and the script prints "No files found.", as coded. I tried some modifications, but they were shots in the dark. What I want this script to do is: delete from the Google Drive folder ONLY the files that are also in the local CSVtoGD folder, and simply upload the rest of the files in CSVtoGD. Does anyone have an answer to that? Thank you :)
import gspread
import os
from googleapiclient.discovery import build

gc = gspread.oauth(credentials_filename='/users/user/credentials.json')
service = build("drive", "v3", credentials=gc.auth)

def getSpreadsheetId(filename):
    q = "name='" + filename + "' and mimeType='application/vnd.google-apps.spreadsheet' and trashed=false"
    res = service.files().list(q=q, fields="files(id)", corpora="allDrives", includeItemsFromAllDrives=True, supportsAllDrives=True).execute()
    items = res.get("files", [])
    if not items:
        print("No files found.")
        exit()
    return items[0]["id"]

os.chdir('/users/user/CSVtoGD2')
files = os.listdir()
for filename in files:
    fname = filename.split(".")
    if fname[1] == "csv":
        folder_id = '1z_pUvZyt5AoTNy-aKCKLmlNjdR2OPo'
        oldSpreadsheetId = getSpreadsheetId(fname[0])
        # print(oldSpreadsheetId)
        sh = gc.del_spreadsheet(oldSpreadsheetId)
        # If the same filenames exist in the CSVtoGD folder on my Mac
        # and in the Google Drive folder, these lines work well.
        # The problem is when there are new files in the local CSVtoGD folder on the Mac.
        sh = gc.create(fname[0], folder_id)
        content = open(filename, "r").read().encode("utf-8")
        gc.import_csv(sh.id, content)
I believe your goal is as follows.
For example, when sample.csv exists on your local PC and a Spreadsheet named sample exists in your Google Drive, you want to delete the Spreadsheet sample from your Google Drive.
When sample1.csv exists on your local PC and a Spreadsheet named sample1 does NOT exist in your Google Drive, you want to upload sample1.csv to Google Drive.
In this case, how about the following modification?
Modified script:
import gspread
import os
from googleapiclient.discovery import build
from googleapiclient.http import MediaFileUpload

gc = gspread.oauth(credentials_filename='/users/user/credentials.json')
service = build("drive", "v3", credentials=gc.auth)
folder_id = '1z_pUvZyt5AoTNy-aKCKLmlNjdR2OPo'  # Please set the ID of the folder you want to upload the files to.

def getSpreadsheetId(filename, filePath):
    q = "name='" + filename + "' and mimeType='application/vnd.google-apps.spreadsheet' and trashed=false"
    res = service.files().list(q=q, fields="files(id)", corpora="allDrives", includeItemsFromAllDrives=True, supportsAllDrives=True).execute()
    items = res.get("files", [])
    if not items:
        print("No files found.")
        file_metadata = {
            "name": filename,
            "parents": [folder_id],
            "mimeType": "application/vnd.google-apps.spreadsheet",
        }
        media = MediaFileUpload(filePath + "/" + filename + ".csv")
        file = service.files().create(body=file_metadata, media_body=media, fields="id").execute()
        id = file.get("id")
        print("File was uploaded. The file ID is " + id)
        return None  # Nothing to delete; let the caller move on to the next file.
    return items[0]["id"]

filePath = '/users/user/CSVtoGD2'
os.chdir(filePath)
files = os.listdir()
for filename in files:
    fname = filename.split(".")
    if fname[1] == "csv":
        oldSpreadsheetId = getSpreadsheetId(fname[0], filePath)
        if oldSpreadsheetId is None:
            continue  # The new file was already uploaded above.
        print(oldSpreadsheetId)
        sh = gc.del_spreadsheet(oldSpreadsheetId)
        sh = gc.create(fname[0], folder_id)
        content = open(filename, "r").read().encode("utf-8")
        gc.import_csv(sh.id, content)
When this script is run, the flow above is executed.
Note:
In this modification, the CSV file is uploaded as a Google Spreadsheet. From your question, I thought that this might be your expected result. But if you want to upload the CSV file as a plain CSV file, please remove "mimeType": "application/vnd.google-apps.spreadsheet", from file_metadata, as sketched below.
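A minimal sketch of file_metadata for that plain-CSV case (same folder_id as above):
file_metadata = {
    "name": filename,
    "parents": [folder_id],
    # no "mimeType" entry: the file is stored as a plain CSV file
}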
If an error related to the scope occurs, please add the scope https://www.googleapis.com/auth/drive, authorize the scopes again, and test it again.
Reference:
Upload file data
I have created a Service Account and made this account (XXXX#XXXX.iam.gserviceaccount.com) be a Manager in a Shared Drive.
I am able to retrieve the shared drive ID by running service.drives().list(pageSize=10).execute()
But if I run:
folder_id = '0ACNaJE1nx6YwXXXXXXX' # Same folder ID as above
query = "'%s' in parents" % folder_id
response = service.files().list(q=query, spaces='drive', fields='files(id, name, parents)').execute()
It returns {'files': []} even though the shared drive contains some files. What am I doing wrong?
I believe your goal and your current situation are as follows.
You want to retrieve the file list of a specific folder in the shared drive using googleapis for Python.
Your service account in your script has permission to retrieve the file list from the shared drive.
In this case, how about including includeItemsFromAllDrives=True, supportsAllDrives=True and corpora="allDrives" in the query parameters of the files.list method of the Drive API? When your script is modified, it becomes as follows.
Modified script:
folder_id = '0ACNaJE1nx6YwXXXXXXX' # Same folder ID as above
query = "'%s' in parents" % folder_id
response = service.files().list(q=query, pageSize=1000, includeItemsFromAllDrives=True, supportsAllDrives=True, corpora="allDrives", fields='files(id, name, parents)').execute()
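Note that files.list returns at most one page of results; if the folder holds more items than pageSize, the nextPageToken has to be followed. A minimal sketch of that loop, using the same query as above:
# Collect all pages of results from files.list
files = []
page_token = None
while True:
    response = service.files().list(
        q=query,
        pageSize=1000,
        includeItemsFromAllDrives=True,
        supportsAllDrives=True,
        corpora="allDrives",
        fields="nextPageToken, files(id, name, parents)",
        pageToken=page_token,
    ).execute()
    files.extend(response.get("files", []))
    page_token = response.get("nextPageToken")
    if page_token is None:
        break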
Note:
In this modified script, it is assumed that you have permission to retrieve file metadata from the shared drive. Please be careful about this.
Reference:
Files: list
I'm currently trying to download a file from Google Drive using PyDrive, but I am only able to download the file to the same location as my Python program. Is there a way to specify the file's download location? This is how I am downloading files currently:
if file1['title'] == file_name:
    file2 = drive.CreateFile({'id': file1['id']})
    print('Downloading file %s from Google Drive' % file2['title'])
    file2.GetContentFile(file_name)  # Save Drive file as a local file
Try the following:
import os

if file1['title'] == file_name:
    location = "Path/where/you/want/to/save/"
    # On Linux, to save into Documents use e.g.:
    #   location = os.path.expanduser("~/Documents/")
    # (a bare "~" is not expanded automatically, so expanduser is needed)
    full_path = os.path.join(location, file_name)  # join handles the trailing slash for you
    file2 = drive.CreateFile({'id': file1['id']})
    print('Downloading file %s from Google Drive' % file2['title'])
    file2.GetContentFile(full_path)
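If the target folder might not exist yet, it can be created first; a small sketch using the standard library:
import os
os.makedirs(location, exist_ok=True)  # create the download folder if it is missing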
I also want to move this file to another folder and write an output file to a third folder. All folders are on Egnyte, and I am using Python.
client = egnyte.EgnyteClient({"domain": "apidemo.egnyte.com",
                              "User_Name": "Password"})
folder = client.folder("/Shared/Data/Individuals/Input")
The client is used to log in to Egnyte.
The folder object holds the path of the folder that contains the file, which is named "abc.txt".
How can I read this file and then move it to "/Shared/Data/Individuals/Checked"?
After the data is processed, the output file should be saved to "/Shared/Data/Individuals/output".
It's an old post, but here is how you would download/read files from a folder:
import egnyte

client = egnyte.EgnyteClient({"domain": "apidemo.egnyte.com",
                              "access_token": "OAuth token"})
file = client.file("/Shared/Data/Individuals/Input/abc.txt")
file_resource = file.download()
file_resource.read()
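For the move and output part of the question, something like the following should work, assuming the egnyte SDK's move() and upload() methods behave as described in its README (the processing step and the output filename are placeholders):
import io
import egnyte

client = egnyte.EgnyteClient({"domain": "apidemo.egnyte.com",
                              "access_token": "OAuth token"})

# Read the input file
src = client.file("/Shared/Data/Individuals/Input/abc.txt")
data = src.download().read()

# ... process `data` here; `result` stands in for the processed bytes ...
result = data

# Move the original file to the Checked folder
# (assumption: file objects expose move(destination) in the egnyte SDK)
src.move("/Shared/Data/Individuals/Checked/abc.txt")

# Save the processed output to the output folder
# (assumption: upload() accepts a file-like object)
out = client.file("/Shared/Data/Individuals/output/abc.txt")
out.upload(io.BytesIO(result))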
Egnyte also offers Desktop Connect (2.0), which I downloaded onto my system. After installation it mounts an Egnyte drive on the PC, so the Egnyte files can be read from code just like any other local files.