I want to copy an existing template ppt present in my google drive. Then I want to change the placeholder text to some other text.
here is what I am trying.
from google.oauth2 import service_account
from googleapiclient import discovery
SCOPES = (
'https://www.googleapis.com/auth/drive',
'https://www.googleapis.com/auth/presentations',
)
SERVICE_ACCOUNT_FILE = 'cred.json'
credentials = service_account.Credentials.from_service_account_file(
SERVICE_ACCOUNT_FILE, scopes=SCOPES)
SLIDES = discovery.build('slides', 'v1', credentials=credentials)
DRIVE = discovery.build('drive', 'v3', credentials=credentials)
TMPLFILE = 'title slide template'
rsp = DRIVE.files().list(q="name='%s'" % TMPLFILE).execute().get('files')[0]
print(rsp)
DATA = {'name': 'Google Slides API template DEMO'}
print('** Copying template %r as %r' % (rsp['name'], DATA['name']))
DECK_ID = DRIVE.files().copy(body=DATA, fileId=rsp['id']).execute().get('id')
print(DECK_ID)
print('** Replacing placeholder text')
reqs = [
{'replaceAllText': {
'containsText': {'text': '{{text}}'},
'replaceText': final_til[0]
}},
]
SLIDES.presentations().batchUpdate(body={'requests': reqs},
presentationId=DECK_ID).execute()
print('DONE')
But it is not working. I don't get any error; everything runs fine, but I don't see the new presentation in my Drive.
Output:
{'kind': 'drive#file', 'id': '15mVjkrT7PkckKetK_q9aYRVxaDcwDdHpAh7xjrAWB6Q', 'name': 'title slide template', 'mimeType': 'application/vnd.google-apps.presentation'} <--- rsp
** Copying template 'title slide template' as 'Google Slides API template DEMO'
11O97tySSNaboW6YRVD62Q7HLs8aVuS2pWyLYXImdSec <-- DECK_ID
** Replacing placeholder text
DONE
If I change
SLIDES.presentations().batchUpdate(body={'requests': reqs},
presentationId=DECK_ID).execute()
to
SLIDES.presentations().batchUpdate(body={'requests': reqs},
presentationId=rsp.get('id')).execute()
then it does replace the text, but in my template file, which I don't want.
Why is this happening?
I believe your current situation and goal are as follows.
From your script:
You are using the googleapis client library for Python.
You have already been able to use the Drive API and Slides API with the service account.
The existing template presentation is a Google Slides file, not a PowerPoint file.
Modification points:
From your script and the statement "But it is not working. I don't get any error. everything works fine but I don't see the new ppt.", I understood that you want to see the Google Slides file copied by the service account in your own Google Drive.
When a Google Slides file is copied by the service account, the copy is placed in the Drive of the service account, which is different from your Google Drive. Because of this, you cannot see the copied Google Slides file in your Drive. I think that this is the reason for your issue.
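If you want to confirm this, a quick check is to list the files the service account itself can see, using the DRIVE client from your script; the copied deck should show up there. A minimal sketch (the fields value is just for illustration):
# List the files visible to the service account; the copied presentation should appear here.
sa_files = DRIVE.files().list(fields='files(id, name)').execute().get('files', [])
for f in sa_files:
    print(f['name'], f['id'])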
In order to see the Google Slides file copied by the service account in your Google Drive, for example, the following workarounds can be used.
Share the copied Google Slides file with the email address of your Google account.
In this case, you can see the shared file under "Shared with me" in your Drive.
Alternatively, first create a new folder in your Google Drive and share it with the email address of the service account. Then, when the Google Slides file is copied, set the shared folder as the destination folder.
Workaround 1:
In this workaround, the copied Google Slides file is shared with the email address of your Google account. When this is reflected in your script, it becomes as follows.
Modified script:
In this case, a new permission is created on the copied file using "Permissions: create".
From:
print(DECK_ID)
print('** Replacing placeholder text')
To:
print(DECK_ID)
permission = {
'type': 'user',
'role': 'writer',
    'emailAddress': '###@gmail.com'  # <--- Please set the email address of your Google account.
}
DRIVE.permissions().create(fileId=DECK_ID, body=permission).execute()
print('** Replacing placeholder text')
Workaround 2:
In this workaround, the Google Slides file is copied into a shared folder in your Google Drive. Before you use this script, please create a new folder and share it with the email address of the service account. When this is reflected in your script, it becomes as follows.
Modified script:
In this case, the parent folder metadata is added to the request body of DRIVE.files().copy().
From:
DATA = {'name': 'Google Slides API template DEMO'}
To:
DATA = {'name': 'Google Slides API template DEMO', 'parents': ['###']}
Please set the folder ID of the shared folder in place of ###.
References:
Permissions: create
Files: copy
@Tanaike's answer is great, but there is one other option too:
Account Impersonation
credentials = service_account.Credentials.from_service_account_file(
SERVICE_ACCOUNT_FILE, scopes=SCOPES)
delegated_credentials = credentials.with_subject(<email>)
DRIVE = build('drive', 'v3', credentials=delegated_credentials)
Here is a good overview: Using OAuth 2.0 for Server to Server Applications; specifically, this section goes through the code.
Remember to set Domain Wide Delegation in both the GCP console and the Admin console.
The project initialized in the GCP console must also be granted the required scopes in the Admin console under Security > API Controls > Domain-wide delegation > Add new.
The first thing the script does is build the credentials using from_service_account_file:
credentials = service_account.Credentials.from_service_account_file(
SERVICE_ACCOUNT_FILE, scopes=SCOPES)
Then it builds the delegated credentials, that is, the user to be impersonated:
delegated_credentials = credentials.with_subject('<EMAIL>')
From there it can build the service as normal. You can save to the user's drive as if it were the user doing it themselves.
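Applied to the question above, a minimal sketch of the copy-and-replace flow with impersonation might look like the following; the impersonated address, template file ID, and replacement text are placeholders, not values from the original script:
from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ['https://www.googleapis.com/auth/drive',
          'https://www.googleapis.com/auth/presentations']
credentials = service_account.Credentials.from_service_account_file(
    'cred.json', scopes=SCOPES)
# Impersonate a real user so the copy lands in that user's My Drive.
delegated_credentials = credentials.with_subject('user@yourdomain.com')

DRIVE = build('drive', 'v3', credentials=delegated_credentials)
SLIDES = build('slides', 'v1', credentials=delegated_credentials)

# Copy the template, then replace the placeholder text in the copy.
deck_id = DRIVE.files().copy(
    fileId='TEMPLATE_FILE_ID',
    body={'name': 'Google Slides API template DEMO'}).execute()['id']
SLIDES.presentations().batchUpdate(
    presentationId=deck_id,
    body={'requests': [{'replaceAllText': {
        'containsText': {'text': '{{text}}'},
        'replaceText': 'some other text'}}]}).execute()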
References
Service Accounts
Using OAuth 2.0 for Server to Server Applications
I have looked all over the forums and documentation to try to understand what I am missing here, but I'm also very new to Python, so I could be making a simple mistake. I am trying to create a spreadsheet in a shared folder, then share that sheet with my main account (using a service account to create the sheet, because I'm not sure if there is another way). I can create the sheet in the shared drive, but I can't see the newly created sheets even after applying shared permissions, and I don't receive any errors. Here is what I have so far:
from __future__ import print_function
from oauth2client.service_account import ServiceAccountCredentials
from googleapiclient.discovery import build
SCOPES = [
'https://www.googleapis.com/auth/drive.metadata',
'https://www.googleapis.com/auth/spreadsheets',
'https://www.googleapis.com/auth/drive',
'https://www.googleapis.com/auth/drive.file',
]
# Get credentials
creds = ServiceAccountCredentials.from_json_keyfile_name('creds.json', SCOPES)
service = build('drive', 'v3', credentials=creds)
# Name sheet and provide parent ID of shared folder within shared drive
sheet_metadata = {
'name': 'This will be cools',
'mimeType': 'application/vnd.google-apps.spreadsheet',
    'parents': ['SHARED FOLDER ID INSIDE SHARED DRIVE'],
}
# Sharing permissions
shared_permissions = {
'role': 'writer',
'type': 'user',
'emailAddress': 'MY PERSONAL EMAIL'
}
results = service.files().create(body=sheet_metadata,fields='id').execute()
permission = service.permissions().create(
fileId=results.get('id'),
body=shared_permissions
)
I was having trouble figuring out how to pass the ID of the newly created sheet, given that I don't have it until it is created, but that part seems to be working. However, I still cannot see the new sheet when I access the shared folder in the shared drive. Any insight is greatly appreciated.
I was missing .execute() after adjusting permissions. Here is the correct permissions portion, with a slight adjustment to grab the sheet ID right before it, which works perfectly to create a sheet and make it visible.
sheet_id = results.get('id')
permissions = service.permissions().create(
fileId=sheet_id, body=shared_permissions).execute()
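Putting the whole thing together, a minimal sketch of the create-and-share flow might look like this; the folder ID and email are placeholders, and supportsAllDrives=True is included because the Drive API requires it for operations on shared drive items:
from oauth2client.service_account import ServiceAccountCredentials
from googleapiclient.discovery import build

SCOPES = ['https://www.googleapis.com/auth/drive']
creds = ServiceAccountCredentials.from_json_keyfile_name('creds.json', SCOPES)
service = build('drive', 'v3', credentials=creds)

sheet_metadata = {
    'name': 'This will be cools',
    'mimeType': 'application/vnd.google-apps.spreadsheet',
    'parents': ['SHARED_FOLDER_ID'],  # parents must be a list
}
shared_permissions = {
    'role': 'writer',
    'type': 'user',
    'emailAddress': 'MY PERSONAL EMAIL'
}

# Create the sheet inside the shared folder, then share it; both calls are executed.
results = service.files().create(
    body=sheet_metadata, fields='id', supportsAllDrives=True).execute()
sheet_id = results.get('id')
service.permissions().create(
    fileId=sheet_id, body=shared_permissions, supportsAllDrives=True).execute()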
I wrote a Python script to upload a file to Google Drive, but the script redirects to Chrome for user authentication.
Is there any way to avoid redirecting to the browser for authentication?
I'm running Python 3.9.
Here is my sample code:
from pydrive.auth import GoogleAuth
from pydrive.drive import GoogleDrive
gauth = GoogleAuth()
drive = GoogleDrive(gauth)
upload_file_list = ['myfile.pdf']
for upload_file in upload_file_list:
gfile = drive.CreateFile({'parents': [{'id': '1B8ttlQMRUkjbrscevfa1DablIayzObh2'}]})
# Read file and set it as the content of this instance.
gfile.SetContentFile(upload_file)
gfile.Upload() # Upload the file.
The behaviour you are reporting is totally normal with OAuth 2.0 and the official Google APIs library.
What @Tanaike said is a good solution. You could use a service account to access Google Drive files without granting consent every time the token expires. With service accounts there are 2 options to achieve that:
Share the file/folder with the email address of the service account.
Use domain-wide delegation of authority to allow the service account to impersonate any user in your domain. Requires a domain using Google Workspace or Cloud Identity and Super Admin access to configure domain-wide delegation.
General information on how to make API calls with domain-wide delegation is available on this page https://developers.google.com/identity/protocols/oauth2/service-account#authorizingrequests.
Here is a working code sample:
from google.oauth2 import service_account
from googleapiclient.discovery import build
from googleapiclient.errors import HttpError
# Scopes required by this endpoint
# https://developers.google.com/drive/api/v3/reference/permissions/list
SCOPES = ["https://www.googleapis.com/auth/drive.readonly"]
# Variable that holds the file ID
DOCUMENT_ID = "i0321LSy8mmkx_Bw-XlDyzQ_b3Ny9m74u"
# Service account Credential file downloaded with domain-wide delegation of authority
# or with shared access to the file.
SERVICE_ACCOUNT_FILE = "serviceaccount.json"
# Creation of the credentials
credentials = service_account.Credentials.from_service_account_file(
SERVICE_ACCOUNT_FILE,
scopes=SCOPES)
# [Impersonation] the service account will take action on behalf of the user,
# requires domain-wide delegation of authority.
delegated_credentials = credentials.with_subject('user@domain.com')
# The API call is attempted
try:
service = build('drive', 'v3', credentials=delegated_credentials)
# Retrieve the documents contents from the Docs service.
document = service.files().get(fileId=DOCUMENT_ID).execute()
print('The title of the document is: {}'.format(document.get('name')))
except HttpError as err:
print(err)
Keep in mind that to use user impersonation you will need to configure domain-wide delegation in the Admin console of the domain that has the files (this will also work for external files shared with users in the domain).
If you want to use this with regular consumer accounts, you can't use user impersonation; instead, share the file with the service account (read or write access) and then make the API calls. The line delegated_credentials = credentials.with_subject('user@domain.com') creates the delegated credentials; it needs to be removed if you use this other approach.
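Coming back to the original upload question, a minimal sketch using the service account directly (the sharing-based approach, no impersonation) might look like this; it reuses the folder ID from the question and assumes that folder has been shared with the service account, while 'serviceaccount.json' and 'myfile.pdf' are placeholders:
from google.oauth2 import service_account
from googleapiclient.discovery import build
from googleapiclient.http import MediaFileUpload

SCOPES = ['https://www.googleapis.com/auth/drive']
credentials = service_account.Credentials.from_service_account_file(
    'serviceaccount.json', scopes=SCOPES)
service = build('drive', 'v3', credentials=credentials)

# Upload into a folder that has been shared with the service account.
file_metadata = {'name': 'myfile.pdf',
                 'parents': ['1B8ttlQMRUkjbrscevfa1DablIayzObh2']}
media = MediaFileUpload('myfile.pdf', resumable=True)
uploaded = service.files().create(
    body=file_metadata, media_body=media, fields='id').execute()
print('Uploaded file ID:', uploaded.get('id'))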
This code prints the ID of the created folder, but the folder does not show up in my Google Drive. My goal is to create the folder and have it visible in Google Drive. How do I do that?
import httplib2
from googleapiclient.discovery import build
from oauth2client.service_account import ServiceAccountCredentials
scope = 'https://www.googleapis.com/auth/drive'
credentials = ServiceAccountCredentials.from_json_keyfile_name('credentials.json', scope)
http = httplib2.Http()
drive_service = build('drive', 'v3', http=credentials.authorize(http))
def createFolder(name):
file_metadata = {
'name': name,
'mimeType': 'application/vnd.google-apps.folder'
}
file = drive_service.files().create(body=file_metadata,
fields='id').execute()
print('Folder ID: %s' % file.get('id'))
createFolder('folder_name')
In your script, the folder is created by the service account. I think that this is the reason for your issue. The Google Drive of the service account is different from your Google Drive, so the folder created by the service account cannot be seen in your Google Drive using the browser. When you want to see the folder created by the service account in the browser, how about the following flow?
Create a new folder in the Google Drive of your account.
Please create the new folder in your Google Drive. In this case, you can create it using the browser.
Share the created folder with the email address of the service account.
By this, the folder you created in your Google Drive can be accessed by the service account.
The email address of the service account can be found in the credentials.json file.
Create a folder inside the shared folder using the service account with the script.
By this flow, I think that your goal can be achieved. For this, please modify your script as follows.
From:
file_metadata = {
'name': name,
'mimeType': 'application/vnd.google-apps.folder'
}
To:
file_metadata = {
'name': name,
'mimeType': 'application/vnd.google-apps.folder',
'parents': ['### folder ID ###']
}
Please replace ### folder ID ### with the ID of the folder you created in your Google Drive.
With the above modification, the folder created by the script can be seen in your Google Drive using your browser.
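Putting it together, the whole modified script might look like this; '### folder ID ###' remains a placeholder for the ID of your shared folder:
import httplib2
from googleapiclient.discovery import build
from oauth2client.service_account import ServiceAccountCredentials

scope = 'https://www.googleapis.com/auth/drive'
credentials = ServiceAccountCredentials.from_json_keyfile_name('credentials.json', scope)
http = httplib2.Http()
drive_service = build('drive', 'v3', http=credentials.authorize(http))

def createFolder(name):
    file_metadata = {
        'name': name,
        'mimeType': 'application/vnd.google-apps.folder',
        # Create inside the folder that was shared with the service account.
        'parents': ['### folder ID ###']
    }
    file = drive_service.files().create(body=file_metadata, fields='id').execute()
    print('Folder ID: %s' % file.get('id'))

createFolder('folder_name')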
Reference:
Files: create of Drive API v3
I have been trying to access some simple information on Google Shared Drive files from a Python 3.7 script:
The last time a Google Sheets file on a shared drive was modified.
I have created a service account in the GCP Drive API menu and it can access/edit/etc. Google Sheets without any problem via the Sheets API.
However, when I use the same service account for the Drive API, it does not return any info on files outside its own folder (which contains only one file: "Getting Started"). The account has access to all Cloud APIs, has Domain-wide Delegation with all scopes related to Drive API included in the API control menu in GSuite.
The email address of the service account has been properly added to all folders in the shared drive.
Any idea? Basically all I need is to know when is the last time a sheet was modified by any given user.
secret_cred_file = ...
SCOPES = ['https://www.googleapis.com/auth/drive']
credentials = service_account.Credentials.from_service_account_file(secret_cred_file, scopes=SCOPES)
service = discovery.build('drive', 'v3', credentials=credentials)
results = service.files().list(pageSize=10, fields="nextPageToken, files(id, name,modifiedTime)").execute()
items = results.get('files', [])
PS: I have seen this: Getting files from shared folder but it does not help
I was able to list shared drive files without impersonating a user by adding some parameters to the list method, as stated in the Google documentation:
Implement shared drive support
Shared drives follow different organization, sharing, and ownership models from My Drive. If your app is going to create and manage files on shared drives, you must implement shared drive support in your app. To begin, you need to include the supportsAllDrives=true query parameter in your requests when your app performs these operations:
files.get, files.list, files.create, files.update, files.copy, files.delete, changes.list, changes.getStartPageToken, permissions.list, permissions.get, permissions.create, permissions.update, permissions.delete
Search for content on a shared drive
Use the files.list method to search for shared drives. This section covers shared drive-specific fields in the files.list method. To search for shared drive, refer to Search for files and folders.
The files.list method contains the following shared drive-specific fields and query modes:
driveId — ID of shared drive to search.
includeItemsFromAllDrives — Whether shared drive items should be included in results. If not present or set to false, then shared drive items are not returned.
corpora — Bodies of items (files/documents) to which the query applies. Supported bodies are user, domain, drive, and allDrives. Prefer user or drive to allDrives for efficiency.
supportsAllDrives — Whether the requesting application supports both My Drives and shared drives. If false, then shared drive items are not included in the response.
Example
service.files().list(includeItemsFromAllDrives=True, supportsAllDrives=True, pageSize=10, fields="nextPageToken, files(id, name,modifiedTime)").execute()
It is worth remembering that the folder or files need to be shared with the service account.
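For example, restricting the search to a single specific shared drive combines corpora='drive' with driveId; a sketch, where SHARED_DRIVE_ID is a placeholder:
results = service.files().list(
    corpora='drive',
    driveId='SHARED_DRIVE_ID',
    includeItemsFromAllDrives=True,
    supportsAllDrives=True,
    pageSize=10,
    fields='nextPageToken, files(id, name, modifiedTime)'
).execute()
for f in results.get('files', []):
    print(f['name'], f['modifiedTime'])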
You need to impersonate your users.
It is not possible to make an API call to get all the files in your domain in one go.
In the Service Accounts article it says:
Service accounts are not members of your Google Workspace domain, unlike user accounts. For example, if you share assets with all members in your Google Workspace domain, they will not be shared with service accounts...This doesn't apply when using domain-wide delegation, because API calls are authorized as the impersonated user, not the service account itself.
So unfortunately you can't just share a file with a service account. To get all the files in your domain you would need to:
Impersonate an admin account and get a list of all the users.
Impersonate each user and make Drive API request for each.
Here is a good quick start for the Python Library, specifically this section
Remember to set permissions in both the GCP console and the Admin console though it seems like you have done this correctly.
Example script
from google.oauth2 import service_account
from googleapiclient.discovery import build
def main():
SCOPES = ['https://www.googleapis.com/auth/drive.metadata.readonly',
'https://www.googleapis.com/auth/admin.directory.user.readonly']
SERVICE_ACCOUNT_FILE = 'credentials.json'
credentials = service_account.Credentials.from_service_account_file(
SERVICE_ACCOUNT_FILE, scopes=SCOPES)
# Admin SDK to get users
admin_delegated_credentials = credentials.with_subject('[ADMIN_EMAIL]')
admin_service = build(
'admin',
'directory_v1',
credentials=admin_delegated_credentials
)
admin_results = admin_service.users().list(customer='my_customer', maxResults=10,
orderBy='email').execute()
users = admin_results.get('users', [])
if not users:
print('No users in the domain.')
else:
for user in users:
print(u'{0} ({1})'.format(user['primaryEmail'],
user['name']['fullName']))
# Drive to get files for each user
delegated_credentials = credentials.with_subject(user['primaryEmail'])
drive_service = build(
'drive',
'v3',
credentials=delegated_credentials
)
drive_results = drive_service.files().list(
pageSize=10,
fields="nextPageToken, files(id, name,modifiedTime)"
).execute()
items = drive_results.get('files', [])
if not items:
print('No files found.')
else:
print('Files:')
for item in items:
print(u'{0} ({1})'.format(item['name'],
item['id']))
if __name__ == '__main__':
main()
Explanation
This script has two scopes:
'https://www.googleapis.com/auth/drive.metadata.readonly'
'https://www.googleapis.com/auth/admin.directory.user.readonly'
The project initialized in the GCP Cloud console has also been granted these scopes from within the Admin console > Security > API Controls > Domain wide delegation > Add new
The first thing the script does is build the credentials using from_service_account_file:
credentials = service_account.Credentials.from_service_account_file(
SERVICE_ACCOUNT_FILE, scopes=SCOPES)
Then it builds the delegated credentials, that is, the user to be impersonated:
admin_delegated_credentials = credentials.with_subject('[ADMIN_EMAIL]')
From there it can build the service as normal. It gets a list of the users, loops through the users and lists their files. You could adapt this to your needs.
References
Service Accounts
Using OAuth 2.0 for Server to Server Applications
I'm trying to use the Google Drive API. I created service account credentials and downloaded them from the Cloud console. The problem is that I'm part of an organization in G Suite, and when I try to list my files, the result is empty, even though I have files in my Drive.
from apiclient.discovery import build
from oauth2client.service_account import ServiceAccountCredentials
credentials = ServiceAccountCredentials.from_json_keyfile_name(
"credentials.json", scopes=['https://www.googleapis.com/auth/drive'])
service = build('drive', 'v3', credentials=credentials)
print(service.files().list().execute())
What could it be?
Actually, you are not providing a lot of information, but make sure that on the API credentials you issued you selected the 'Other UI' option for the field 'Where will you be calling the API from' and chose 'User data' instead of 'Application data'. Also, the scope should be 'https://www.googleapis.com/auth/drive.metadata.readonly' for listing metadata.
'https://www.googleapis.com/auth/drive' is correct too, but given that it is a G Suite account there can be limitations on broad scopes even for your own data.
Also, you should do
files = service.files().list().execute().get('files', [])
for f in files:
    print(f['name'])
and iterate over that files array to get the files.
If that doesn't work, have a look at the API docs, and if you can't figure it out, please post more details, do some debugging, and post the results here.
Edit: Try using the REST API too, with the appropriate credentials, and see whether the files are fetched successfully there: https://developers.google.com/drive/api/v2/reference/files/list