I want to retrieve realtime user data from Google Analytics, and this article by Google outlines how to do this (and works as expected).
gapi.client.analytics.data.realtime.get({
"ids": "ga:21660971",
"metrics": "rt:activeUsers"
})
However, I am following the server-side authentication route because I want this data to always be visible to users who do not themselves have access. But overnight the access token has become invalid. What can I do, or how can I rework this, so that it keeps working without the need for constant manual intervention to refresh the token?
Python script (as given in the Google article).
# service-account.py
import time
from oauth2client.service_account import ServiceAccountCredentials
# The scope for the OAuth2 request.
SCOPE = 'https://www.googleapis.com/auth/analytics.readonly'
# The location of the key file with the key data.
KEY_FILEPATH = 'MY-JSON-FILE.json'
# Defines a method to get an access token from the ServiceAccount object.
def get_access_token():
    return ServiceAccountCredentials.from_json_keyfile_name(
        KEY_FILEPATH, SCOPE).get_access_token().access_token
print(get_access_token())
JS Authentication
/**
* Authorize the user with an access token obtained server side.
*/
gapi.analytics.auth.authorize({
'serverAuth': {
'access_token': 'TOKEN-FROM-PY-SCRIPT-ABOVE'
}
});
This is the error given in the console, and it only appears after 60 minutes or so.
{domain: "global", reason: "authError", message: "Invalid Credentials", locationType: "header", location: "Authorization"}
Related
I have a public Cloud Run service, authenticated by a JWT token. It works 100%.
The logic inside the Cloud Run service to decode the token is in Python:
import time

import jwt  # PyJWT
from jwt.exceptions import InvalidTokenError

# JWT_SECRET and JWT_ALGORITHM are defined elsewhere in the service.
def decode_jwt(token: str) -> dict:
    try:
        decoded_token = jwt.decode(
            token, JWT_SECRET, algorithms=[JWT_ALGORITHM])
        return decoded_token if decoded_token["expires"] >= time.time() else None
    except Exception:
        raise InvalidTokenError
The Cloud Run is publicly available using a custom domain.
Now, I want to make some requests to the Cloud Run service using Cloud Tasks (each request has different parameters, created previously by a Cloud Function).
In Cloud Tasks, I create each task with a "Bearer {token}" Authorization header.
Cloud Task Headers Code:
task["http_request"]["headers"] = \
{"Authorization": f"Bearer {token}",
"Accept": "application/json"}
First situation:
When I create the task without the "oidc_token" parameter in the http_request,
Cloud Run returns "403 Forbidden", and the request never reaches the decode_jwt function inside Cloud Run.
Cloud Task http_request Code:
task = {
"http_request": {
"http_method": tasks_v2.HttpMethod.POST,
"url": url,
}
}
Second situation:
I add an "oidc_token".
task = {
    "http_request": {
        "http_method": tasks_v2.HttpMethod.POST,
        "url": url,
        "oidc_token": {
            "service_account_email": "service-task@xxxxx.iam.gserviceaccount.com",
        }
    }
}
Now the request reaches the Cloud Run decode_jwt function, and the log in Cloud Run shows "InvalidTokenError".
Extra: I added a logging.info call to expose the token received in Cloud Run, and it is not the token I passed in when creating the Cloud Task.
Problem Summary:
You have a public (allUsers) Cloud Run service.
You have created your own authorization mechanism (HS256, i.e. HMAC with SHA-256).
You want to assign a custom token for the HTTP Authorization Bearer value.
Cloud Run authorization is managed by IAP.
Authorization for the Cloud Run service is managed by the Identity-Aware Proxy (IAP). If you add an HTTP Authorization Bearer token, IAP will verify that token. That step fails for your custom token, which results in an HTTP 403 Forbidden error.
Cloud Tasks supports two types of HTTP Authorization Bearer tokens. OAuth Access tokens and OIDC Identity tokens. You cannot use your own token value to replace the supported types.
That leaves you with two options:
Enhance your code to support Google-signed OIDC identity tokens (a sketch follows the note below).
Use a custom HTTP header that supports your custom token format.
Note: I do not recommend using HS256. HS256 is a symmetric algorithm which means the secret must be known to both sides in order to validate the payload. RS256 is an asymmetric algorithm which uses private/public key pairs. To verify only requires the public key. This is one of the strong design features of Google's use of private keys for service accounts and identities. If you switch to Google's method, all of the hard work is done for you.
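As an illustration of the first option, here is a minimal sketch (assuming the google-auth package; EXPECTED_AUDIENCE is a placeholder for your service's URL) of verifying a Google-signed OIDC identity token inside the Cloud Run service:
from google.auth.transport import requests as google_requests
from google.oauth2 import id_token

EXPECTED_AUDIENCE = "https://my-service.example.com"  # placeholder audience

def verify_google_oidc(token: str) -> dict:
    # Checks the signature, expiry, issuer and audience; raises ValueError on failure.
    return id_token.verify_oauth2_token(
        token, google_requests.Request(), audience=EXPECTED_AUDIENCE)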
You have to specify the audience of your Cloud Run service, like this:
task = {
    "http_request": {  # Specify the type of request.
        "http_method": tasks_v2.HttpMethod.POST,
        "url": url,  # The full URL path that the task will be sent to.
        "oidc_token": {
            "service_account_email": "service-task@xxxxx.iam.gserviceaccount.com",
            "audience": audience,  # base URL of the Cloud Run service, no /sub/path
        }
    }
}
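Tying the pieces together, a sketch of creating such a task with the Cloud Tasks client; the project, location, queue, URLs and the X-Custom-Auth header name are placeholders, and the custom token travels in that custom header (option 2 above) rather than in Authorization:
from google.cloud import tasks_v2

client = tasks_v2.CloudTasksClient()
parent = client.queue_path("my-project", "us-central1", "my-queue")  # placeholders

url = "https://my-service.example.com/endpoint"  # placeholder task target
audience = "https://my-service.example.com"      # base URL of the Cloud Run service
token = "my-custom-jwt"                          # placeholder custom token

task = {
    "http_request": {
        "http_method": tasks_v2.HttpMethod.POST,
        "url": url,
        "headers": {
            "X-Custom-Auth": f"Bearer {token}",  # custom header, not inspected by Google
            "Accept": "application/json",
        },
        "oidc_token": {
            "service_account_email": "service-task@xxxxx.iam.gserviceaccount.com",
            "audience": audience,
        },
    }
}

response = client.create_task(request={"parent": parent, "task": task})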
I am trying to use Google Cloud to get the stats of my YouTube channel, but I don't want to have to complete an OAuth flow every time and enter a key into the console. I have an API key and am wondering if it's possible to just get the stats using the API key instead of having to go through the OAuth process, as I plan to hook this up to a display and don't want to constantly have to do the OAuth.
Here is the code I copied from GitHub, but once again, it does the OAuth flow, which is kind of a pain; I wish to just use an API key so I don't have to interact with it to get my stats.
import os

import google.oauth2.credentials
import google_auth_oauthlib.flow
from googleapiclient.discovery import build
from googleapiclient.errors import HttpError
from google_auth_oauthlib.flow import InstalledAppFlow

SCOPES = ['https://www.googleapis.com/auth/yt-analytics.readonly']
API_SERVICE_NAME = 'youtubeAnalytics'
API_VERSION = 'v2'
CLIENT_SECRETS_FILE = 'YTClientSecrets.json'

def get_service():
    flow = InstalledAppFlow.from_client_secrets_file(CLIENT_SECRETS_FILE, SCOPES)
    credentials = flow.run_console()
    return build(API_SERVICE_NAME, API_VERSION, credentials=credentials)

def execute_api_request(client_library_function, **kwargs):
    response = client_library_function(
        **kwargs
    ).execute()
    print(response)

os.environ['OAUTHLIB_INSECURE_TRANSPORT'] = '1'
youtubeAnalytics = get_service()
execute_api_request(
    youtubeAnalytics.reports().query,
    ids='channel==MINE',
    startDate='2017-01-01',
    endDate='2021-12-31',
    metrics='estimatedMinutesWatched,views,likes,subscribersGained',
    dimensions='day',
    sort='day'
)
Answer: no, you cannot use an API key to access private user data.
How to see if you need authorization for a method:
If you check the documentation for jobs.reports.get, you will notice that it has an authorization section. Every method which requests private user data has one; it tells you which authorization scope is required in order to access the private data.
API keys only allow you to access public data. For example, public videos uploaded to YouTube are public data, so you do not need to be authorized to access them; you can use an API key.
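For contrast, a short sketch (the API key value and channel ID below are placeholders, not from the question) of the kind of call an API key can make on its own, because it only touches public data via the YouTube Data API:
from googleapiclient.discovery import build

youtube = build('youtube', 'v3', developerKey='YOUR_API_KEY')  # placeholder key
response = youtube.channels().list(
    part='statistics',
    id='UCxxxxxxxxxxxxxxxxxxxxxx'  # placeholder public channel ID
).execute()
print(response['items'][0]['statistics'])  # public view/subscriber/video counts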
Service accounts
There is another type of authorization, called service account authorization, which allows you to pre-authorize a service account to access private user data.
However, there are limitations on which APIs support service account authentication.
The YouTube Analytics API does not support the service account flow.
The YouTube Reporting API only supports the service account flow for YouTube content owners that own and manage multiple YouTube channels. Specifically, content owners can use service accounts in API requests that set a value for the onBehalfOfContentOwner request parameter.
If you cannot use service account authentication, your only option is to make a single-user type system where you authorize your script once, store your refresh token, and have the script use that refresh token in the future to request new access tokens.
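A minimal sketch of that pattern, assuming a local token.json cache file (my own naming, not part of the original code); something like this could replace the get_service() above so that the interactive consent happens only once:
import os

from google.oauth2.credentials import Credentials
from google.auth.transport.requests import Request
from google_auth_oauthlib.flow import InstalledAppFlow
from googleapiclient.discovery import build

SCOPES = ['https://www.googleapis.com/auth/yt-analytics.readonly']
CLIENT_SECRETS_FILE = 'YTClientSecrets.json'
TOKEN_FILE = 'token.json'  # where the refresh token is cached between runs

def get_service():
    creds = None
    if os.path.exists(TOKEN_FILE):
        creds = Credentials.from_authorized_user_file(TOKEN_FILE, SCOPES)
    if not creds or not creds.valid:
        if creds and creds.expired and creds.refresh_token:
            creds.refresh(Request())  # silent refresh, no browser involved
        else:
            flow = InstalledAppFlow.from_client_secrets_file(CLIENT_SECRETS_FILE, SCOPES)
            creds = flow.run_local_server(port=0)  # one-time interactive consent
        with open(TOKEN_FILE, 'w') as f:
            f.write(creds.to_json())  # persist the refresh token for next time
    return build('youtubeAnalytics', 'v2', credentials=creds)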
I have been trying (with little success) to have a Google Cloud Function triggered via an HTTP request from a Google Sheet (Google Apps Script), and it seemingly won't work. A few important things:
The function should only run if the user comes from my organization
The user should not have to be invited to the GCP project
I know this can be done very easily in Google Colab with Python. The following script will let a user in my organization who is not in the GCP project trigger the cloud function:
import requests
import google.auth
from google.auth.transport.requests import Request
from google.colab import auth

auth.authenticate_user()  # sign the Colab user in so default() returns their credentials
credentials, project_id = google.auth.default()
request = Request()
credentials.refresh(request=request)

GCF_URL = 'https://project_location-project_id.cloudfunctions.net/name-of-your-function'
resp = requests.get(GCF_URL, headers={'Authorization': f'Bearer {credentials.id_token}'})
This works and triggers the cloud function for any user inside my organization, but it does not work for my personal email, for example.
Now, I would like to replicate this behaviour inside a Google Apps Script so that an end user with access to that sheet can trigger the cloud function as long as they are a member of my organization.
I have tried some things I have seen online, such as this example:
function callExternalUrl() {
var url = 'https://project_location-project_id.cloudfunctions.net/name-of-your-function';
var oauthToken = ScriptApp.getOAuthToken(); // get the user who's logged into Google Sheet's OAuth token
const data = {
oauthToken, // 1. add the oauth token to the payload
activeUser: param.user // 2. this here is important as it adds the userinfo.email scope to the token
// any other data you need to send to the Cloud Function can be added here
};
var options = {
'method' : 'get', // or post, depending on how you set up your Cloud Function
'contentType': 'application/json',
// Convert the JavaScript object to a JSON string.
'payload' : JSON.stringify(data)
};
const response = UrlFetchApp.fetch(url, options);
Logger.log('Response Code: ' + response.getResponseCode());
}
This gives a 403 error, but if I change it so that it sends the OAuth token in the correct format, like this:
function callExternalUrl() {
var url = 'https://project_location-project_id.cloudfunctions.net/name-of-your-function';
var oauthToken = ScriptApp.getOAuthToken(); // get the user who's logged into Google Sheet's OAuth token
var response = UrlFetchApp.fetch(url, {
headers: {
Authorization: 'Bearer ' + oauthToken
}
});
// const response = UrlFetchApp.fetch(url, options);
Logger.log('Response Code: ' + response.getResponseCode());
}
I get a 401 (i.e. the authorization failed). Now, it seems that I simply have to get the correct authentication from the users to send with this request for it to work. I have seen this GitHub repo that focuses on getting OAuth2 tokens from Google Apps Script (https://github.com/gsuitedevs/apps-script-oauth2), but I can't seem to get that to work either; it would have to be adapted to Cloud Functions in some way I am unaware of.
I have read
Securely calling a Google Cloud Function via a Google Apps Script,
which is very similar, but it did not seem to get to the root of the problem. Any input on how to make this process possible?
I created a Python application that is using the YouTube API (so the examples are in Python, but it doesn't really matter; the concepts should be the same). I managed to get it working where I can connect and make API calls. However, when I connect to the API, I have to define a flow that checks if the credentials storage file exists. If it doesn't, then I have to manually sign in using the flow. After signing in, the file (main.py-oauth2.json) is created with the token. I would like to be able to download the credentials without having to manually sign in. I was hoping there was a way to make a POST request for that token, like I have seen here, but I have not been able to do this with the YouTube API. Does anyone know how to implement the desired feature?
main.py
flow = flow_from_clientsecrets(CLIENT_SECRETS_FILE,
                               scope=YOUTUBE_UPLOAD_SCOPE,
                               message=MISSING_CLIENT_SECRETS_MESSAGE)
storage = Storage(OAUTH_CREDENTIALS)
credentials = storage.get()
if credentials is None or credentials.invalid:
    # manual / UI login
    credentials = run_flow(flow, storage, args)
Trying to use a google service account throws 401 errors on upload.
credentials = Credentials.from_service_account_file(
    SERVICE_ACCOUNT_FILE, scopes=YOUTUBE_UPLOAD_SCOPES)
if credentials is None or credentials.expired:
    raise ValueError('Invalid credentials')
return build(YOUTUBE_API_SERVICE_NAME, YOUTUBE_API_VERSION,
             credentials=credentials)
...
status, response = insert_request.next_chunk()
# <HttpError 401 "Unauthorized">
Evidence this can be done
The oauth2client.service_account.ServiceAccountCredentials class is
only used with OAuth 2.0 Service Accounts. No end-user is involved
for these server-to-server API calls, so you can create this object
directly without using a Flow object.
youtube api
Oauth flow docs
https://developers.google.com/identity/protocols/OAuth2#serviceaccount
The problem is that most YouTube data is private user data. Because it is private user data, you must be authenticated as a user who has access to the data in question in order to access it. To do that we use OAuth2: we log in to the account and get back an access token and a refresh token.
The access token can be used to request data from the YouTube API; the refresh token can be used to request a new access token whenever the access token expires (after an hour).
Normally I would say that you should consider using a service account. Service accounts are dummy users which can be preconfigured with access to user data. Unfortunately, the YouTube API does not support service accounts.
What you should do, and what I have done a number of times in the past, is authenticate your code once, get the refresh token, and save it. In the future, whenever you wish to run your application, you simply use the refresh token to request a new access token and you will be able to access the API. You won't have to manually type your login and password and consent to the access anymore; everything can be done in the background using the refresh token.
Note: you will need to watch out, as there are some cases that can cause a refresh token to expire, but for the most part you shouldn't worry; they are good for as long as you continue to use them regularly.
I am not a Python dev, but I found this:
import httplib2

from oauth2client import client, GOOGLE_TOKEN_URI

CLIENT_ID = "client_id"
CLIENT_SECRET = "client_secret"
REFRESH_TOKEN = "refresh_token"

credentials = client.OAuth2Credentials(
    access_token=None,
    client_id=CLIENT_ID,
    client_secret=CLIENT_SECRET,
    refresh_token=REFRESH_TOKEN,
    token_expiry=None,
    token_uri=GOOGLE_TOKEN_URI,
    user_agent=None,  # required positional argument of OAuth2Credentials
    id_token=None,
    revoke_uri=None)

http = credentials.authorize(httplib2.Http())
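Since oauth2client is deprecated, here is a minimal sketch of the same refresh-token pattern with the newer google-auth library (the upload scope and variable names reuse the placeholders above and are assumptions, not code from the question):
import google.auth.transport.requests
from google.oauth2.credentials import Credentials
from googleapiclient.discovery import build

# Rebuild credentials from the stored refresh token; no browser interaction needed.
creds = Credentials(
    token=None,  # no access token yet; it is fetched via the refresh token below
    refresh_token=REFRESH_TOKEN,
    token_uri='https://oauth2.googleapis.com/token',
    client_id=CLIENT_ID,
    client_secret=CLIENT_SECRET,
    scopes=['https://www.googleapis.com/auth/youtube.upload'])
creds.refresh(google.auth.transport.requests.Request())  # exchange refresh token for an access token
youtube = build('youtube', 'v3', credentials=creds)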
I'm trying to create circles with the Google+ API, but I'm kind of stuck. This is my code; it was more or less copied from the official API documentation (yes, I know it doesn't create a circle, but the issue is the same):
import httplib2
from apiclient.discovery import build
from oauth2client.client import OAuth2WebServerFlow
import json
with open('client_secrets.json', 'r') as f:
    json_data = json.load(f)

data = json_data['web']
CLIENT_ID = data['client_id']
CLIENT_SECRET = data['client_secret']
# List the scopes your app requires:
SCOPES = ['https://www.googleapis.com/auth/plus.me',
'https://www.googleapis.com/auth/plus.circles.write']
# The following redirect URI causes Google to return a code to the user's
# browser that they then manually provide to your app to complete the
# OAuth flow.
REDIRECT_URI = 'http://localhost/oauth2callback'
# For a breakdown of OAuth for Python, see
# https://developers.google.com/api-client-library/python/guide/aaa_oauth
# CLIENT_ID and CLIENT_SECRET come from your APIs Console project
flow = OAuth2WebServerFlow(client_id=CLIENT_ID,
client_secret=CLIENT_SECRET,
scope=SCOPES,
redirect_uri=REDIRECT_URI)
auth_uri = flow.step1_get_authorize_url()
# This command-line server-side flow example requires the user to open the
# authentication URL in their browser to complete the process. In most
# cases, your app will use a browser-based server-side flow and your
# user will not need to copy and paste the authorization code. In this
# type of app, you would be able to skip the next 3 lines.
# You can also look at the client-side and one-time-code flows for other
# options at https://developers.google.com/+/web/signin/
print 'Please paste this URL in your browser to authenticate this program.'
print auth_uri
code = raw_input('Enter the code it gives you here: ')
# Set authorized credentials
credentials = flow.step2_exchange(code)
# Create a new authorized API client.
http = httplib2.Http()
http = credentials.authorize(http)
service = build('plusDomains', 'v1', http=http)
from apiclient import errors
try:
    people_service = service.people()
    people_document = people_service.get(userId='me').execute()
except errors.HttpError, e:
    print e.content
My output:
{
  "error": {
    "errors": [
      {
        "domain": "global",
        "reason": "forbidden",
        "message": "Forbidden"
      }
    ],
    "code": 403,
    "message": "Forbidden"
  }
}
I searched for an answer but didn't really find any. In the API console I have the Google+ API and
Google+ Domains API services added; also, my client ID and secret are okay (otherwise the whole script would fail sooner). The auth is also successful; my app's name is shown under https://accounts.google.com/IssuedAuthSubTokens. What did I miss?
The problem lies with your REDIRECT_URI variable. When you are using OAuth 2.0 in a purely server-side flow, the redirect URI MUST be 'urn:ietf:wg:oauth:2.0:oob'.
Try changing the variable like so (and be sure to update your client ID in the API Console):
REDIRECT_URI = 'urn:ietf:wg:oauth:2.0:oob'
Edit: Also, make sure that you are making your API call for a user within a domain. The Google+ Domains API only permits API calls that are restricted to users and content within that domain.