I'm hoping to use the Google Sheets API in a cloud function, which will run from my account's default service account, and I'm working in Python. However, I've only ever authenticated the Sheets library locally, using this bit of code:
import os.path

from googleapiclient.discovery import build
from google_auth_oauthlib.flow import InstalledAppFlow
from google.auth.transport.requests import Request
from google.oauth2.credentials import Credentials


def gen_creds(path_to_secret: str, rw_vs_ro: str):
    """
    Generate the needed credentials to work with the Sheets v4 API based on your secret
    json credentials file.
    :param path_to_secret: The file path to your credentials json file
    :param rw_vs_ro: A string, 'r_o' or 'r_w', representing whether creds should be readonly or readwrite
    :return: The built service variable
    """
    if rw_vs_ro == 'r_o':
        scopes = ['https://www.googleapis.com/auth/spreadsheets.readonly']
        creds_nm = 'readonly_token.json'
    else:
        scopes = ['https://www.googleapis.com/auth/spreadsheets']
        creds_nm = 'readwrite_token.json'
    creds = None
    # The file token.json stores the user's access and refresh tokens, and is
    # created automatically when the authorization flow completes for the first
    # time.
    if os.path.exists(creds_nm):
        creds = Credentials.from_authorized_user_file(creds_nm, scopes)
    # If there are no (valid) credentials available, let the user log in.
    if not creds or not creds.valid:
        if creds and creds.expired and creds.refresh_token:
            creds.refresh(Request())
        else:
            flow = InstalledAppFlow.from_client_secrets_file(
                path_to_secret, scopes)
            creds = flow.run_local_server(port=0)
        # Save the credentials for the next run
        with open(creds_nm, 'w') as token:
            token.write(creds.to_json())
    return build('sheets', 'v4', credentials=creds)
And I'm not entirely sure how to translate this to something that a cloud function would understand, since the cloud function won't be running as me and doesn't have the same local filesystem paths I rely on. I'd appreciate any insight into what the translation process looks like here--I was only able to find examples in JS, which wasn't quite what I was going for. I'd also love to understand how to actually implement this code in a cloud function in GCP. Thanks!
When you deploy a cloud function, your main code will have access to all the files deployed within that function. This means all you need to do is include your readwrite_token.json/readonly_token.json files when deploying the package.
Once that's done, don't just pass the token file names as bare strings: the function's working directory can differ from the directory your source files are deployed to, so resolve the file paths as described in the GCP Function Filesystem documentation.
Also note that you can't use InstalledAppFlow in the Cloud Function environment, since that flow is designed for desktop environments. You should either ensure that branch of the code is never executed, or replace it with a different flow.
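If you do bundle token files, one way to make their location robust is to resolve them relative to the deployed source file rather than the working directory. A minimal sketch (resolve_token_path is a hypothetical helper, not part of any Google library):

```python
from pathlib import Path

def resolve_token_path(token_name: str) -> Path:
    """Resolve a bundled token file relative to this source file's directory,
    since a Cloud Function's working directory may differ from where the
    deployed files live."""
    return Path(__file__).resolve().parent / token_name
```

Then something like Credentials.from_authorized_user_file(str(resolve_token_path('readwrite_token.json')), scopes) should find the file regardless of the process's current directory.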
Actually, I found a simple answer to this question in the end--it's very easy to generate these credentials in GCP for Python! The exact replacement method for gen_creds is:
import google.auth
from googleapiclient.discovery import build


def gen_creds(rw_vs_ro: str):
    """
    Generate the service credentials to be used to query a google sheet
    :param rw_vs_ro: A string, 'r_o' or 'r_w', representing whether creds should be readonly or readwrite
    :return: The built service variable
    """
    if rw_vs_ro == 'r_o':
        scopes = ['https://www.googleapis.com/auth/spreadsheets.readonly']
    elif rw_vs_ro == 'r_w':
        scopes = ['https://www.googleapis.com/auth/spreadsheets']
    else:
        raise ValueError("rw_vs_ro must be 'r_o' or 'r_w'")
    # In a Cloud Function, this picks up the function's runtime service account.
    creds, project = google.auth.default(scopes=scopes)
    service = build('sheets', 'v4', credentials=creds)
    return service
Hope this is as helpful to others as it is to me!
Related
Brand new to anything regarding programming! Please treat me like I know absolutely nothing. I followed the Google Docs API quickstart for python. I am getting the error of
FileNotFoundError: [Errno 2] No such file or directory: 'credentials.json'
From my understanding, this is because the file path is incorrect, but I have no clue how to fix it. Any help is much appreciated. Here is the source code.
from __future__ import print_function
import os.path

from google.auth.transport.requests import Request
from google.oauth2.credentials import Credentials
from google_auth_oauthlib.flow import InstalledAppFlow
from googleapiclient.discovery import build
from googleapiclient.errors import HttpError

# If modifying these scopes, delete the file token.json.
SCOPES = ['https://www.googleapis.com/auth/documents.readonly']

# The ID of a sample document.
DOCUMENT_ID = '195j9eDD3ccgjQRttHhJPymLJUCOUjs-jmwTrekvdjFE'


def main():
    """Shows basic usage of the Docs API.
    Prints the title of a sample document.
    """
    creds = None
    # The file token.json stores the user's access and refresh tokens, and is
    # created automatically when the authorization flow completes for the first
    # time.
    if os.path.exists('token.json'):
        creds = Credentials.from_authorized_user_file('token.json', SCOPES)
    # If there are no (valid) credentials available, let the user log in.
    if not creds or not creds.valid:
        if creds and creds.expired and creds.refresh_token:
            creds.refresh(Request())
        else:
            flow = InstalledAppFlow.from_client_secrets_file(
                'credentials.json', SCOPES)
            creds = flow.run_local_server(port=0)
        # Save the credentials for the next run
        with open('token.json', 'w') as token:
            token.write(creds.to_json())
    try:
        service = build('docs', 'v1', credentials=creds)
        # Retrieve the documents contents from the Docs service.
        document = service.documents().get(documentId=DOCUMENT_ID).execute()
        print('The title of the document is: {}'.format(document.get('title')))
    except HttpError as err:
        print(err)


if __name__ == '__main__':
    main()
I had the same issue but one step back, I didn't know how to create the credentials.json file.
Create OAuth credentials using the steps below (note that an OAuth client ID is different from the other kinds of API keys):
1. Open the Google Cloud console.
2. At the top-left, click Menu > APIs & Services > Credentials.
3. Click + Create Credentials > OAuth client ID.
4. Click Application type > Desktop app. In the "Name" field, type a name for the credential. This name is only shown in the Google Cloud console. NOTE: Users WILL see this name the first time they give permission to the app in Google's warning dialog.
5. Click Create. The "OAuth client created" screen appears, showing your new Client ID and Client secret. Before you click OK, you can download the json file from this screen.
6. Click OK. The newly created credential appears under "OAuth 2.0 Client IDs."
Finally, rename the json to credentials.json and move it to the folder where your app is running from (or give your app the right path to the file).
I think you need to add the absolute path to the file. Try with a variable like this.
CLIENT_SECRET_FILE = r'C:\Users\ME\client_secret.json'
Then add that to your code.
flow = InstalledAppFlow.from_client_secrets_file(
    CLIENT_SECRET_FILE, SCOPES)
If you copy files or project folders, hard-coded paths like this will break.
Try getting the current directory and supplying that info to the variable looking for credentials.json. (that file is in the working directory right?)
if not creds or not creds.valid:
    if creds and creds.expired and creds.refresh_token:
        creds.refresh(Request())
    else:
        CURR_DIR = os.path.dirname(os.path.realpath(__file__))
        # os.path.join handles the path separator on Windows too
        credential_file = os.path.join(CURR_DIR, 'credentials.json')
        flow = InstalledAppFlow.from_client_secrets_file(
            credential_file, SCOPES)
        creds = flow.run_local_server(port=0)
I was writing a Python script to automate uploading some files to Google Drive. Since I'm still a newbie Python programmer and this is an exercise as much as anything else, I started following the Google Quickstart and decided to use their quickstart.py as a basis on which to base my own script. In the part where it talks about how to create credentials for your Python script, it refers to the "Create credentials" link, at https://developers.google.com/workspace/guides/create-credentials
I follow the link, get into one of my Google Cloud projects, and try to set up the OAuth consent screen, using an "Internal" project, as they tell you... but I can't. Google says:
“Because you’re not a Google Workspace user, you can only make your
app available to external (general audience) users. ”
So I try to create an "External" project, and then proceed to create a new client ID, using a Desktop application. Then I download the JSON credentials and put them in the same folder as my Python script, as "credentials.json". I then execute the Python script in order to authenticate it: the browser opens, I log into my Google account, give it my permissions... and then the browser hangs, because it's redirecting to a localhost URL and obviously my little Python script isn't listening in my computer at all.
I believe they must have changed this recently, because a year ago I started following the same Python tutorial and could create credentials without problems, but the Google Drive API docs haven't been updated yet. So... how do I create credentials for a Python script now?
EDIT: adding here the source code for my script. As I said, it's very similar to Google's "quickstart.py":
from __future__ import print_function
import pickle
import os.path

from googleapiclient.discovery import build
from google_auth_oauthlib.flow import InstalledAppFlow
from google.auth.transport.requests import Request
from googleapiclient.errors import HttpError

# If modifying these scopes, delete the file token.pickle.
SCOPES = ['https://www.googleapis.com/auth/drive.metadata', 'https://www.googleapis.com/auth/drive']


def main():
    """Shows basic usage of the Drive v3 API.
    Prints the names and ids of the first 10 files the user has access to.
    """
    creds = None
    # The file token.pickle stores the user's access and refresh tokens, and is
    # created automatically when the authorization flow completes for the first
    # time.
    if os.path.exists('token_myappname.pickle'):
        with open('token_myappname.pickle', 'rb') as token:
            creds = pickle.load(token)
    # If there are no (valid) credentials available, let the user log in.
    if not creds or not creds.valid:
        if creds and creds.expired and creds.refresh_token:
            creds.refresh(Request())
        else:
            flow = InstalledAppFlow.from_client_secrets_file(
                'credentials.json', SCOPES)
            creds = flow.run_local_server(port=0)
        # Save the credentials for the next run
        with open('token_myappname.pickle', 'wb') as token:
            pickle.dump(creds, token)

    service = build('drive', 'v3', credentials=creds)
    # Call the Drive v3 API (owners and parents must be requested in
    # `fields` for the print below to find them)
    results = service.files().list(
        pageSize=10, fields="nextPageToken, files(id, name, owners, parents)").execute()
    items = results.get('files', [])
    if not items:
        print('No files found.')
    else:
        print('Files:')
        for item in items:
            print(u'{0} {1} {2}'.format(item['name'], item['owners'], item['parents']))
I propose you use a service account to access the Drive.
For that, you need to share the drive (or the folder) with the service account email. And then use this code
from googleapiclient.discovery import build
import google.auth

SCOPES = ['https://www.googleapis.com/auth/drive.metadata', 'https://www.googleapis.com/auth/drive']


def main():
    credentials, project_id = google.auth.default(scopes=SCOPES)
    service = build('drive', 'v3', credentials=credentials)
    # Call the Drive v3 API
    results = service.files().list(
        q="'1YJ6gMgACOqVVbcgKviJKtVa5ITgsI1yP' in parents",
        pageSize=10, fields="nextPageToken, files(id, name, owners, parents)").execute()
    items = results.get('files', [])
    if not items:
        print('No files found.')
    else:
        print('Files:')
        for item in items:
            print(u'{0} {1} {2}'.format(item['name'], item['owners'], item['parents']))
If you run your code on Google Cloud, in a compute engine instance for example, you need to customize the VM with the service account that you authorized in your drive. (Don't use the compute engine default service account, else you will need extra configuration on your VM)
If you run your script outside GCP, you need to generate a service account key file and to store it on your local server. Then, create an environment variable GOOGLE_APPLICATION_CREDENTIALS that reference the full path of the stored key file.
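For the local case, that looks something like this (the key-file path below is a placeholder; use wherever you actually stored the key):

```shell
# Point Application Default Credentials at the downloaded key file.
export GOOGLE_APPLICATION_CREDENTIALS="$HOME/keys/my-service-account.json"
```

Any Python script run from that shell afterwards will let google.auth.default() pick up the key automatically.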
Aside from the other solution posted by Guillaume Blaquiere, I also found another one on my own, which I wanted to post here in case it's helpful. All I had to do was... erm, actually read the code I was copying and pasting, in particular this line:
creds = flow.run_local_server(port=0)
I checked Google's documentation outside of the Quickstart and found the following: https://google-auth-oauthlib.readthedocs.io/en/latest/reference/google_auth_oauthlib.flow.html
It turns out, the example code was opening a local port in my computer to listen to the request, and it wasn't working probably due to the "port 0" part, or some other network problem.
So the workaround I found was to use a different auth method found in the docs:
creds = flow.run_console()
In this case, you paste manually in the command line the auth code given to you by Google. I just tried it, and have my credentials happily stored in my local pickle file.
I read the Google API documentation pages (Drive API, pyDrive) and created a databricks notebook to connect to the Google drive. I used the sample code in the documentation page as follow:
from __future__ import print_function
import pickle
import os.path

from googleapiclient.discovery import build
from google_auth_oauthlib.flow import InstalledAppFlow
from google.auth.transport.requests import Request

# If modifying these scopes, delete the file token.pickle.
SCOPES = ['https://www.googleapis.com/auth/drive.metadata.readonly']


def main():
    """Shows basic usage of the Drive v3 API.
    Prints the names and ids of the first 10 files the user has access to.
    """
    creds = None
    # The file token.pickle stores the user's access and refresh tokens, and is
    # created automatically when the authorization flow completes for the first
    # time.
    if os.path.exists('token.pickle'):
        with open('token.pickle', 'rb') as token:
            creds = pickle.load(token)
    # If there are no (valid) credentials available, let the user log in.
    if not creds or not creds.valid:
        if creds and creds.expired and creds.refresh_token:
            creds.refresh(Request())
        else:
            flow = InstalledAppFlow.from_client_secrets_file(
                CRED_PATH, SCOPES)
            creds = flow.run_local_server()
        # Save the credentials for the next run
        with open('token.pickle', 'wb') as token:
            pickle.dump(creds, token)

    service = build('drive', 'v3', credentials=creds)
    # Call the Drive v3 API
    results = service.files().list(
        pageSize=10, fields="nextPageToken, files(id, name)").execute()
    items = results.get('files', [])
    if not items:
        print('No files found.')
    else:
        print('Files:')
        for item in items:
            print(u'{0} ({1})'.format(item['name'], item['id']))


if __name__ == '__main__':
    main()
The CRED_PATH includes the credential file path in /dbfs/FileStore/shared_uploads. The script prompts me the URL to authorize the application but immediately after allowing access it redirects to the page that says "This site can’t be reached: localhost refused to connect."
The localhost is listening on the default port (8080).
I checked the redirect URI of the registered app in Google API Services and it includes the localhost.
I'm not sure what should I check/set to have access the Google API in databricks. Any thought is appreciated
Although I'm not sure whether this is the best workaround for your situation, how about using a service account instead of the OAuth2 flow you are using? With a service account, the access token can be retrieved without opening a URL to obtain an authorization code, and the Drive API can still be used with googleapis for Python. I thought this might resolve your issue.
The method for using the service account with your script is as follows.
Usage:
1. Create service account.
About this, you can see the following official document.
Creating and managing service accounts
and/or
Create a service account
When the service account is created, the credential file of JSON data is downloaded. This file is used for the script.
2. Sample script:
The sample script for using the service account with googleapis for python is as follows.
from oauth2client.service_account import ServiceAccountCredentials
from googleapiclient.discovery import build

credentialFileOfServiceAccount = '###.json'  # Please set the file path of the credential file of the service account.

creds = ServiceAccountCredentials.from_json_keyfile_name(
    credentialFileOfServiceAccount,
    ['https://www.googleapis.com/auth/drive.metadata.readonly'])
service = build('drive', 'v3', credentials=creds)

results = service.files().list(pageSize=10, fields="nextPageToken, files(id, name)").execute()
items = results.get('files', [])
if not items:
    print('No files found.')
else:
    print('Files:')
    for item in items:
        print(u'{0} ({1})'.format(item['name'], item['id']))
Note:
The Google Drive of the service account is different from your Google Drive. So in this case, share a folder on your Google Drive with the email address of the service account (this email address can be seen in the credential file). That way, the service account can get and put files in that folder, and you can see and edit those files in your browser.
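To find that email address programmatically, you can read the client_email field out of the downloaded key file. A small sketch (the file name is whatever you saved the key as):

```python
import json

def service_account_email(key_file_path: str) -> str:
    # Service account key files are JSON and carry the account's
    # address in the "client_email" field.
    with open(key_file_path) as f:
        return json.load(f)["client_email"]
```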
References:
Creating and managing service accounts
Create a service account
I am trying to fetch gsuite alerts via API. I have created a service account as per their docs and I have assigned that service account to my google cloud function.
I do not want to use environment variables or upload credentials along with source code but I want leverage default service account used by function.
from googleapiclient.discovery import build


def get_credentials():
    # If you know the credentials file location (when you upload the json
    # credentials file or specify it in an environment variable), you can easily
    # get the credentials by specifying the path:
    # credentials = ServiceAccountCredentials.from_json_keyfile_name(key_file_location)
    # In Cloud Functions, at least, I couldn't find that path, as
    # GOOGLE_APPLICATION_CREDENTIALS is empty in the Python runtime.
    credentials = < how to get default credentials object for default service account? >
    delegated_credentials = credentials.create_delegated('admin#alertcenter1.bigr.name').create_scoped(SCOPES)
    return delegated_credentials


def get_alerts(api_name, api_version, key_file_location=None):
    delegated_credentials = get_credentials()
    alertcli = build(api_name, api_version, credentials=delegated_credentials)
    resp = alertcli.alerts().list(pageToken=None).execute()
    print(resp)
Is there any way I can create a default credentials object? I have tried using from google.auth import credentials, but this does not contain a create_delegated function, and I have also tried ServiceAccountCredentials(), but this requires a signer.
Here is an example to use the Gmail API with delegated credentials. The service account credentials will need "Enable G Suite Domain-wide Delegation" enabled.
from google.oauth2 import service_account
from googleapiclient.discovery import build

# credentials_file is the path to your downloaded service account key file.
credentials = service_account.Credentials.from_service_account_file(
    credentials_file,
    scopes=['https://www.googleapis.com/auth/gmail.send'])

impersonate = 'username#example.com'
credentials = credentials.with_subject(impersonate)

service = build('gmail', 'v1', credentials=credentials)
You can use the google.auth.default function to get the default credentials and use them to make an IAM signer, which can be used to create new service account credentials that have the delegated email address as subject. I have a more detailed answer for a similar question.
There is also Google Cloud Platform Github repository with some documentation about this method.
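As a sketch of that approach (untested here; it assumes the function's runtime service account has the iam.serviceAccounts.signBlob permission and domain-wide delegation enabled, and the scopes/subject values are up to you):

```python
import google.auth
from google.auth import iam
from google.auth.transport import requests
from google.oauth2 import service_account

TOKEN_URI = 'https://oauth2.googleapis.com/token'

def delegated_credentials(scopes, subject):
    # Start from the function's default service account credentials.
    source_credentials, _ = google.auth.default()
    request = requests.Request()
    source_credentials.refresh(request)
    # Build a signer backed by the IAM signBlob API, then mint new
    # service-account credentials with the delegated user as subject.
    signer = iam.Signer(request, source_credentials,
                        source_credentials.service_account_email)
    return service_account.Credentials(
        signer,
        source_credentials.service_account_email,
        TOKEN_URI,
        scopes=scopes,
        subject=subject)
```

The returned credentials can then be passed straight to build() as in the earlier answer.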