I'm trying to set up a deployment pipeline in Google Cloud Build. To do this, I want to:
Run unit tests
Deploy to Cloud Run without traffic
Run integration tests
Migrate traffic in Cloud Run
I've got this mostly set up, but my integration tests include a couple of calls to Cloud Run to validate that authenticated calls return 200 and unauthenticated calls return 401. What I'm having difficulty with is making signed requests from Cloud Build. When I deploy by hand and run the integration tests, they work, but not from Cloud Build.
Ideally, I would like to use the Cloud Build service account to invoke Cloud Run, like I usually do in AWS, but I can't find a way to get access to it from inside the build step. So instead, I retrieve a credentials file from Secret Manager. This credentials file belongs to a newly created service account with the Cloud Run Invoker role:
steps:
- name: gcr.io/cloud-builders/gcloud
  id: get-github-ssh-secret
  entrypoint: 'bash'
  args: [ '-c', 'gcloud secrets versions access latest --secret=name-of-secret > /root/service-account/credentials.json' ]
  volumes:
  - name: 'service-account'
    path: /root/service-account
...
- name: python:3.8.7
  id: integration-tests
  entrypoint: /bin/sh
  args:
  - '-c'
  - |-
    if [ $_STAGE != "prod" ]; then
      python -m pip install -r requirements.txt
      python -m pytest test/integration --disable-warnings ;
    fi
  volumes:
  - name: 'service-account'
    path: /root/service-account
For the integration tests, I've created a class called Authorizer and I have __get_authorized_header_for_cloud_build and __get_authorized_header_for_cloud_build2 as attempts:
import json
import time
import urllib
from typing import Optional
import google.auth
import requests
from google import auth
from google.auth.transport.requests import AuthorizedSession
from google.oauth2 import service_account
import jwt
class Authorizer(object):
    cloudbuild_credential_path = "/root/service-account/credentials.json"

    # Permissions to request for Access Token
    scopes = ["https://www.googleapis.com/auth/cloud-platform"]

    def get_authorized_header(self, receiving_service_url) -> dict:
        auth_header = self.__get_authorized_header_for_current_user() \
            or self.__get_authorized_header_for_cloud_build(receiving_service_url)
        return auth_header

    def __get_authorized_header_for_current_user(self) -> Optional[dict]:
        credentials, _ = auth.default()
        auth_req = google.auth.transport.requests.Request()
        credentials.refresh(auth_req)
        if hasattr(credentials, "id_token"):
            authorized_header = {"Authorization": f'Bearer {credentials.id_token}'}
            auth_req.session.close()
            print("Got auth header for current user with auth.default()")
            return authorized_header

    def __get_authorized_header_for_cloud_build2(self, receiving_service_url) -> dict:
        credentials = service_account.Credentials.from_service_account_file(
            self.cloudbuild_credential_path, scopes=self.scopes)
        auth_req = google.auth.transport.requests.Request()
        credentials.refresh(auth_req)
        return {"Authorization": f'Bearer {credentials.token}'}

    def __get_authorized_header_for_cloud_build(self, receiving_service_url) -> dict:
        with open(self.cloudbuild_credential_path, 'r') as f:
            data = f.read()
        credentials_json = json.loads(data)
        signed_jwt = self.__create_signed_jwt(credentials_json, receiving_service_url)
        token = self.__exchange_jwt_for_token(signed_jwt)
        return {"Authorization": f'Bearer {token}'}

    def __create_signed_jwt(self, credentials_json, run_service_url):
        iat = time.time()
        exp = iat + 3600
        payload = {
            'iss': credentials_json['client_email'],
            'sub': credentials_json['client_email'],
            'target_audience': run_service_url,
            'aud': 'https://www.googleapis.com/oauth2/v4/token',
            'iat': iat,
            'exp': exp
        }
        additional_headers = {
            'kid': credentials_json['private_key_id']
        }
        signed_jwt = jwt.encode(
            payload,
            credentials_json['private_key'],
            headers=additional_headers,
            algorithm='RS256'
        )
        return signed_jwt

    def __exchange_jwt_for_token(self, signed_jwt):
        body = {
            'grant_type': 'urn:ietf:params:oauth:grant-type:jwt-bearer',
            'assertion': signed_jwt
        }
        token_request = requests.post(
            url='https://www.googleapis.com/oauth2/v4/token',
            headers={
                'Content-Type': 'application/x-www-form-urlencoded'
            },
            data=urllib.parse.urlencode(body)
        )
        return token_request.json()['id_token']
So when running locally, __get_authorized_header_for_current_user is used and works. When running in Cloud Build, __get_authorized_header_for_cloud_build is used. But I keep getting 401s, even when I temporarily disable __get_authorized_header_for_current_user and point cloudbuild_credential_path to a JSON file on my local PC, and even when I give the service account from the credentials file the Owner role. __get_authorized_header_for_cloud_build2 is another attempt, where I let the google-auth library fetch the token instead of constructing it myself, but it also gets a 401.
For completeness, the integration test looks somewhat like this:
class NameOfViewIntegrationTestCase(unittest.TestCase):
    base_url = "https://**.a.run.app"
    name_of_call_url = base_url + "/name-of-call"

    def setUp(self) -> None:
        self._authorizer = Authorizer()

    def test_name_of_call__authorized__ok_result(self) -> None:
        # Arrange
        url = self.name_of_call_url

        # Act
        response = requests.post(url, headers=self._authorizer.get_authorized_header(url))

        # Assert
        self.assertTrue(response.ok, msg=f'{response.status_code}: {response.text}')
Any idea what I'm doing wrong here? Let me know if you need any clarification on something. Thanks in advance!
First of all, your code is too complex. If you want to leverage Application Default Credentials (ADC) according to the runtime environment, these lines are enough:
from google.oauth2.id_token import fetch_id_token
from google.auth.transport import requests
r = requests.Request()
print(fetch_id_token(r,"<AUDIENCE>"))
On Google Cloud Platform, the runtime environment's service account is used, thanks to the metadata server. In your local environment, you need to set the GOOGLE_APPLICATION_CREDENTIALS environment variable to the path of a service account key file.
Note: you can generate an id_token only with service account credentials (on GCP or in your local environment); it's not possible with your user account.
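For example, the fetched ID token is then passed as a Bearer header when calling the Cloud Run service; a minimal sketch, where the service URL is a placeholder:

import requests
from google.auth.transport.requests import Request
from google.oauth2.id_token import fetch_id_token

# Placeholder: the deployed Cloud Run URL, also used as the token audience.
service_url = "https://my-service-xxxxx-ew.a.run.app"
token = fetch_id_token(Request(), service_url)
response = requests.post(service_url, headers={"Authorization": f"Bearer {token}"})
print(response.status_code)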
The problem here is that this doesn't work on Cloud Build. I don't know why, but it's not possible to generate an id_token with the Cloud Build metadata server. So I wrote an article on this with a possible workaround.
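I won't reproduce the article here, but for context, one workaround that is commonly used (a sketch under assumptions, not necessarily the article's exact approach) is to mint the ID token through service account impersonation with the IAM Credentials API; the service account email and Cloud Run URL below are placeholders, and the Cloud Build service account needs roles/iam.serviceAccountTokenCreator on the impersonated account:

from google.auth import default, impersonated_credentials
from google.auth.transport.requests import Request

# Sketch only: the service account email and audience URL are placeholders.
source_credentials, _ = default()
target_credentials = impersonated_credentials.Credentials(
    source_credentials=source_credentials,
    target_principal="run-invoker@my-project.iam.gserviceaccount.com",
    target_scopes=["https://www.googleapis.com/auth/cloud-platform"],
)
id_token_credentials = impersonated_credentials.IDTokenCredentials(
    target_credentials,
    target_audience="https://my-service-xxxxx-ew.a.run.app",
    include_email=True,
)
id_token_credentials.refresh(Request())
print(id_token_credentials.token)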
My requirement: I have written a Lambda function in AWS that automatically creates a repository in GitHub using the GitHub API, with PAT token authentication.
def create_automatic_repo(repo_name):
    query_url = f"https://api.github.com/api/v3/orgs/{org_name}/repos"
    params = {
        "name": repo_name
    }
    headers = {
        'Authorization': f'token {secret[secretKey]}',
    }
    response = requests.post(query_url, headers=headers, data=json.dumps(params))
    print("creating new repository response ", response)
    print("creating new repository response content ", response.content)
We successfully created a repo using the GitHub API with a PAT token. Now we need to change authentication from the PAT token to a GitHub App.
I am trying to authenticate the GitHub App using the AppId and PrivateKey. I have generated the JWT with the private key (using the jwt library). I am trying to hit "https://api.github.com/app/installations/installation_id/access_tokens", the GitHub API for getting an access_token. I am getting a 200 response, but it is redirecting to the SAML authentication page.
$ curl -i \
-H "Authorization: token YOUR_INSTALLATION_ACCESS_TOKEN" \
-H "Accept: application/vnd.github+json" \
https://api.github.com/api/v3/orgs/{org_name}/repos
This is the curl command I have found in the official documentation. If I have an access_token, I can use the GitHub API to create a repo through the Lambda function in AWS.
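For illustration, a sketch (my own, with a placeholder installation token) of how the existing create_automatic_repo call would look once an installation access token is available; the endpoint stays the same, only the token value changes:

import json
import requests

def create_repo_with_installation_token(installation_token, org_name, repo_name):
    # Same endpoint as the PAT-based function; only the token changes.
    query_url = f"https://api.github.com/api/v3/orgs/{org_name}/repos"
    headers = {
        "Authorization": f"token {installation_token}",
        "Accept": "application/vnd.github+json",
    }
    return requests.post(query_url, headers=headers, data=json.dumps({"name": repo_name}))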
I am attaching the flow I have followed for GitHub App authentication. Here is the official document I followed: https://docs.github.com/en/developers/apps/building-github-apps/authenticating-with-github-apps
Created a GitHub App, giving the homepage URL as the GitHub organization URL
Installed the GitHub App at the organization level
Wrote Python code to generate the JWT
Here is the Python code for generating the JWT and calling the GitHub API for the installation ID. I am getting a 200 response, but it redirects to the SAML authentication page.
import json
import os
import time

import jwt
import requests
from cryptography.hazmat.backends import default_backend

cert_bytes = open(r'first.txt', "r").read().encode()
print("printing cert_bytes ", cert_bytes)
private_key = default_backend().load_pem_private_key(cert_bytes, None)

time_since_epoch_in_seconds = int(time.time())

payload = {
    # issued at time, 60 seconds in the past to allow for clock drift
    "iat": time_since_epoch_in_seconds - 60,
    # JWT expiration time (10 minute maximum)
    "exp": time_since_epoch_in_seconds + (10 * 60),
    # GitHub App's identifier
    "iss": 231726,
}

encoded_payload = jwt.encode(payload, private_key, algorithm="RS256")
print("printing encoded_payload ", encoded_payload)

headers = {
    'Authorization': f'Bearer {encoded_payload}'
}
resp = requests.get("https://api.github.com/app/installations/installation_id/access_tokens", headers=headers)
print('Code: ', resp.status_code)
print('Content: ', resp.content)
This is the image of the SAML authentication page I am redirected to.
I read the official GitHub documentation, and it mentions that an active SAML session is needed to authenticate with GitHub Apps:
https://docs.github.com/en/enterprise-cloud#latest/authentication/authenticating-with-saml-single-sign-on/about-authentication-with-saml-single-sign-on#about-oauth-apps-github-apps-and-saml-sso
But I didn't see the option to enable SAML SSO authentication as mentioned in the document: https://docs.github.com/en/enterprise-cloud#latest/organizations/managing-saml-single-sign-on-for-your-organization/enabling-and-testing-saml-single-sign-on-for-your-organization#enabling-and-testing-saml-single-sign-on-for-your-organization
This is the image of the settings page where I did not find the option for enabling SAML authentication.
Can you please help us with enabling SAML authentication for the GitHub App authentication process without a PAT token? Or is there any other way to authenticate to GitHub from a Lambda function in AWS using the GitHub APIs, apart from a PAT token?
I have prepared automation in GCP; the automation is written with the Python SDK. The script deploys VPC firewall rules (I used the documentation to prepare it - GCP Python SDK firewall deployment). The automation works as expected and the firewall rules are created in the Google environment, but how can I check whether the deployment completed successfully? I know that I can use the list method to build a list of existing firewall rules and then compare it with the rules I wanted to deploy, but is there any native method to verify the deployment status?
OK, I prepared that code, and it works in my environment.
import json
import time

import googleapiclient
from googleapiclient import discovery
from oauth2client.client import GoogleCredentials
from colormodule import color

credentials = GoogleCredentials.get_application_default()


def create_vpc_firewall_rule(credentials, discovery, project, firewall_body):
    service = discovery.build('compute', 'v1', credentials=credentials)
    request = service.firewalls().insert(project=project, body=firewall_body)
    response = request.execute()
    return response


def firewall_body(traffic_direction_name, network_name, ports, json, project_name):
    rule_name_input = f'firewall-rule-test-{traffic_direction_name}'
    network_input = f'projects/{project_name}/global/networks/{network_name}'
    json_string = {
        "name": rule_name_input,
        "allowed": [
            {
                "IPProtocol": "tcp",
                "ports": ports
            }
        ],
        "network": network_input,
        "direction": "EGRESS",
        "destinationRanges": ["192.168.0.23/32"],
        "priority": 1000,
        "targetTags": [
            "testwindows"
        ]
    }
    data = json.dumps(json_string)
    return data


def wait_for_operation_fw_deployment(compute, project_id, operation, fw_rule_name, colors):
    print('')
    print('Waiting for operation to finish...')
    print('')
    while True:
        result = compute.globalOperations().get(
            project=project_id,
            operation=operation).execute()
        if result['status'] == 'DONE':
            print(f'Deployment of {colors.OKCYAN}{fw_rule_name}{colors.ENDC} firewall rule has been completed.')
            print('')
            time.sleep(2)
            if 'error' in result:
                raise Exception(result['error'])
            return result
        time.sleep(2)


firewall_body = eval(firewall_body(traffic_direction_name, network_name, ports, json, project_name))
operation = create_vpc_firewall_rule(credentials, discovery, project_name, firewall_body)
compute = googleapiclient.discovery.build('compute', 'v1')
wait_for_operation_fw_deployment(compute, project_id, operation['name'], fw_rule_name, colors)
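As an alternative to the hand-rolled polling loop, the Compute API also exposes a wait method on global operations that blocks until the operation finishes (or a server-side timeout elapses); a minimal sketch, assuming the same compute client and variables as above:

def wait_for_global_operation(compute, project_id, operation_name):
    # Blocks server-side until the operation reaches DONE, then returns it.
    result = compute.globalOperations().wait(
        project=project_id, operation=operation_name).execute()
    if 'error' in result:
        raise Exception(result['error'])
    return result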
I have tried downloading a file from Google Drive to my local system using a Python script, but I am facing a "forbidden" issue when running it. The script is as follows:
import requests
url = "https://www.googleapis.com/drive/v3/files/1wPxpQwvEEOu9whmVVJA9PzGPM2XvZvhj?alt=media&export=download"
querystring = {"alt":"media","export":"download"}
headers = {
'Authorization': "Bearer TOKEN",
'Host': "www.googleapis.com",
'Accept-Encoding': "gzip, deflate",
'Connection': "keep-alive",
}
response = requests.request("GET", url, headers=headers, params=querystring)
print(response.url)
#
import wget
import os
from os.path import expanduser
myhome = expanduser("/home/sunarcgautam/Music")
### set working dir
os.chdir(myhome)
url = "https://www.googleapis.com/drive/v3/files/1wPxpQwvEEOu9whmVVJA9PzGPM2XvZvhj?alt=media&export=download"
print('downloading ...')
wget.download(response.url)
With this script, I get the forbidden error. Am I doing anything wrong in the script?
I have also tried another script that I found on a Google Developer page, which is as follows:
import io

import auth
import httplib2
from googleapiclient import discovery
from googleapiclient.http import MediaIoBaseDownload

SCOPES = "https://www.googleapis.com/auth/drive.scripts"
CLIENT_SECRET_FILE = "client_secret.json"
APPLICATION_NAME = "test_Download"

authInst = auth.auth(SCOPES, CLIENT_SECRET_FILE, APPLICATION_NAME)
credentials = authInst.getCredentials()

http = credentials.authorize(httplib2.Http())
drive_serivce = discovery.build('drive', 'v3', http=http)

file_id = '1Af6vN0uXj8_qgqac6f23QSAiKYCTu9cA'
request = drive_serivce.files().export_media(fileId=file_id,
                                             mimeType='application/pdf')
fh = io.BytesIO()
downloader = MediaIoBaseDownload(fh, request)
done = False
while done is False:
    status, done = downloader.next_chunk()
    print("Download %d%%." % int(status.progress() * 100))
This script gives me a URL mismatch error.
So what should be given as the redirect URL in the Google console credentials, or is there any other solution for the issue? Do I have to authorise my Google console app in both scripts? If so, what is the process for authorising the app? I haven't found any documentation about that.
To make requests to Google APIs the work flow is in essence the following:
Go to developer console, log in if you haven't.
Create a Cloud Platform project.
Enable, for your project, the APIs you are interested in using with your project's apps (for example: the Google Drive API).
Create and download OAuth 2.0 Client IDs credentials that will allow your app to gain authorization for using your enabled APIs.
Head over to the OAuth consent screen and add your scope using the corresponding button (scope: https://www.googleapis.com/auth/drive.readonly for you). Choose Internal/External according to your needs, and for now ignore the warnings, if any.
To get a valid token for making API requests, the app will go through the OAuth flow to receive the authorization token (since it needs consent).
During the OAuth flow the user will be redirected to the OAuth consent screen, where they will be asked to approve or deny access to your app's requested scopes.
If consent is given, your app will receive an authorization token.
Pass the token in your request to your authorized API endpoints.[2]
Build a Drive Service to make API requests (You will need the valid token)[1]
NOTE:
The available methods for the Files resource for Drive API v3 are here.
When using the Python Google APIs Client, then you can use export_media() or get_media() as per Google APIs Client for Python documentation
IMPORTANT:
Also, check that the scope you are using actually allows you to do what you want (downloading files from the user's Drive) and set it accordingly. At the moment you have an incorrect scope for your goal. See OAuth 2.0 API Scopes.
Sample Code References:
Building a Drive Service:
import google_auth_oauthlib.flow
from google.auth.transport.requests import Request
from google_auth_oauthlib.flow import InstalledAppFlow
from googleapiclient.discovery import build


class Auth:
    def __init__(self, client_secret_filename, scopes):
        self.client_secret = client_secret_filename
        self.scopes = scopes
        self.flow = google_auth_oauthlib.flow.Flow.from_client_secrets_file(self.client_secret, self.scopes)
        self.flow.redirect_uri = 'http://localhost:8080/'
        self.creds = None

    def get_credentials(self):
        flow = InstalledAppFlow.from_client_secrets_file(self.client_secret, self.scopes)
        self.creds = flow.run_local_server(port=8080)
        return self.creds


# The scope your app will use.
# (NEEDS to be among the ones enabled in your OAuth consent screen)
SCOPES = "https://www.googleapis.com/auth/drive.readonly"
CLIENT_SECRET_FILE = "credentials.json"

credentials = Auth(client_secret_filename=CLIENT_SECRET_FILE, scopes=SCOPES).get_credentials()

drive_service = build('drive', 'v3', credentials=credentials)
Making the request to export or get a file
import io
import shutil

from googleapiclient.http import MediaIoBaseDownload

request = drive_service.files().export(fileId=file_id, mimeType='application/pdf')
fh = io.BytesIO()
downloader = MediaIoBaseDownload(fh, request)
done = False
while done is False:
    status, done = downloader.next_chunk()
    print("Download %d%%" % int(status.progress() * 100))

# The file has been downloaded into RAM, now save it in a file
fh.seek(0)
with open('your_filename.pdf', 'wb') as f:
    shutil.copyfileobj(fh, f, length=131072)
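Note: export/export_media only apply to Google Workspace documents. For a regular binary file (like the one in the question, fetched with alt=media), the call would be files().get_media() instead; a minimal sketch, reusing the imports, drive_service, and file_id from above:

# Download a regular (non-Google-Docs) file in its original format.
request = drive_service.files().get_media(fileId=file_id)
fh = io.BytesIO()
downloader = MediaIoBaseDownload(fh, request)
done = False
while done is False:
    status, done = downloader.next_chunk()
    print("Download %d%%" % int(status.progress() * 100))
fh.seek(0)
with open('your_filename', 'wb') as f:
    shutil.copyfileobj(fh, f)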
I have tried uploading a file to Google Drive from my local system using a Python script, but I keep getting HttpError 403. The script is as follows:
from googleapiclient.http import MediaFileUpload
from googleapiclient import discovery
import httplib2
import auth

SCOPES = "https://www.googleapis.com/auth/drive"
CLIENT_SECRET_FILE = "client_secret.json"
APPLICATION_NAME = "test"

authInst = auth.auth(SCOPES, CLIENT_SECRET_FILE, APPLICATION_NAME)
credentials = authInst.getCredentials()

http = credentials.authorize(httplib2.Http())
drive_serivce = discovery.build('drive', 'v3', credentials=credentials)

file_metadata = {'name': 'gb1.png'}
media = MediaFileUpload('./gb.png',
                        mimetype='image/png')
file = drive_serivce.files().create(body=file_metadata,
                                    media_body=media,
                                    fields='id').execute()
print('File ID: %s' % file.get('id'))
The error is :
googleapiclient.errors.HttpError: <HttpError 403 when requesting
https://www.googleapis.com/upload/drive/v3/files?uploadType=multipart&alt=json&fields=id
returned "Insufficient Permission: Request had insufficient authentication scopes.">
Am I using the right scope in the code or missing anything ?
I also tried a script I found online and it is working fine but the issue is that it takes a static token, which expires after some time. So how can I refresh the token dynamically?
Here is my code:
import json
import requests

headers = {
    "Authorization": "Bearer TOKEN"
}
para = {
    "name": "account.csv",
    "parents": ["FOLDER_ID"]
}
files = {
    'data': ('metadata', json.dumps(para), 'application/json; charset=UTF-8'),
    'file': ('mimeType', open("./test.csv", "rb"))
}
r = requests.post(
    "https://www.googleapis.com/upload/drive/v3/files?uploadType=multipart",
    headers=headers,
    files=files
)
print(r.text)
Answer:
Delete your token.pickle file and re-run your application.
More Information:
As long as you have the correct set of credentials, all that is required when you update the scopes of your application is to re-obtain a token. Delete the token file located in the application's root folder, then run the application again. If you have the https://www.googleapis.com/auth/drive scope, and the Drive API enabled in the developer console, you should be good.
References:
Google Drive API - Files: create method
"Insufficient Permission: Request had insufficient authentication scopes."
Means that the user you have authenticated with has not granted your application permission to do what you are trying to do.
The files.create method requires that you have authenticated the user with one of the scopes listed in its documentation.
Your code does appear to be using the full drive scope. What I suspect has happened is that you authenticated your user, then changed the scope in your code, and did not prompt the user to log in again and grant consent. You need to remove the user's consent from your app, either by having them remove it directly in their Google account or by deleting the credentials you have stored in your app. This will force the user to log in again.
There is also an option to force the approval prompt on the Google login, but I am not a Python dev, so I'm not exactly sure how to force that. It should be something like the prompt='consent' line below.
flow = OAuth2WebServerFlow(client_id=CLIENT_ID,
client_secret=CLIENT_SECRET,
scope='https://spreadsheets.google.com/feeds '+
'https://docs.google.com/feeds',
redirect_uri='http://example.com/auth_return',
prompt='consent')
Consent screen
If done correctly, the user should see a consent screen like this, prompting them to grant your app full access to their Drive account.
Token pickle
If you are following Google's tutorial here https://developers.google.com/drive/api/v3/quickstart/python, you need to delete the token.pickle file that contains the user's stored consent.
if os.path.exists('token.pickle'):
with open('token.pickle', 'rb') as token:
creds = pickle.load(token)
You can use the google-api-python-client to build a Drive service for using Drive API.
Get your authorization by following the first 10 steps of this answer.
If you want the user to go through the consent screen only once, store the credentials in a file. They include a refresh token that the app can use to request a new access token after the old one expires (see the example sketch below).
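A minimal sketch of that persistence, assuming the credentials.json/token.pickle filenames from Google's Python quickstart (the names are conventions, not requirements):

import os
import pickle

from google.auth.transport.requests import Request
from google_auth_oauthlib.flow import InstalledAppFlow

SCOPES = ['https://www.googleapis.com/auth/drive.file']

def load_credentials(client_secret_file='credentials.json', token_file='token.pickle'):
    creds = None
    # Reuse previously stored credentials if they exist.
    if os.path.exists(token_file):
        with open(token_file, 'rb') as token:
            creds = pickle.load(token)
    if not creds or not creds.valid:
        if creds and creds.expired and creds.refresh_token:
            # The stored refresh token lets us get a new access token silently.
            creds.refresh(Request())
        else:
            flow = InstalledAppFlow.from_client_secrets_file(client_secret_file, SCOPES)
            creds = flow.run_local_server(port=0)
        with open(token_file, 'wb') as token:
            pickle.dump(creds, token)
    return creds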
With a valid Drive Service you can upload a file by calling a function like the following upload_file:
def upload_file(drive_service, filename, mimetype, upload_filename, resumable=True, chunksize=262144):
    media = MediaFileUpload(filename, mimetype=mimetype, resumable=resumable, chunksize=chunksize)
    # Add all the writable properties you want the file to have in the body!
    body = {"name": upload_filename}
    # Note: don't call .execute() here; for a resumable upload, next_chunk() performs the upload.
    request = drive_service.files().create(body=body, media_body=media)
    if getFileByteSize(filename) > chunksize:
        # Large file: upload it chunk by chunk.
        response = None
        while response is None:
            status, response = request.next_chunk()
            if status:
                print("Uploaded %d%%." % int(status.progress() * 100))
    else:
        # Small file: upload it in a single request.
        response = request.execute()
    print("Upload Complete!")
Now pass in the parameters and call the function...
# Upload file
upload_file(drive_service, 'my_local_image.png', 'image/png', 'my_imageination.png' )
You will see the file with the name: my_imageination.png in your Google Drive root folder.
More about the Drive API v3 service and available methods here.
getFileByteSize() function:
def getFileByteSize(filename):
    # Get file size in python
    from os import stat
    file_stats = stat(filename)
    print('File Size in Bytes is {}'.format(file_stats.st_size))
    return file_stats.st_size
Uploading to certain folder(s) in your drive is easy...
Just add the parent folder Id(s) in the body of the request.
Here are the properties of a File.
Example:
request_body = {
"name": "getting_creative_now.png",
"parents": ['myFiRsTPaRentFolderId',
'MyOtherParentId',
'IcanTgetEnoughParentsId'],
}
To use the scope 'https://www.googleapis.com/auth/drive' you need to submit your Google app for verification.
So use the scope 'https://www.googleapis.com/auth/drive.file' instead of 'https://www.googleapis.com/auth/drive' to upload files without verification.
Also, use SCOPES as a list.
ex: SCOPES = ['https://www.googleapis.com/auth/drive.file']
I can successfully upload and download the files to google drive by using the above SCOPE.
I found the solution for uploading a file to Google Drive. Here it is:
import requests
import json
url = "https://www.googleapis.com/oauth2/v4/token"
payload = "{\n\"" \
"client_id\": \"CLIENT_ID" \
"\",\n\"" \
"client_secret\": \"CLIENT SECRET" \
"\",\n\"" \
"refresh_token\": \"REFRESH TOKEN" \
"\",\n\"" \
"grant_type\": \"refresh_token\"\n" \
"}"
headers = {
'grant_type': 'authorization_code',
'Content-Type': 'application/json'
}
response = requests.request("POST", url, headers=headers, data=payload)
res = json.loads(response.text.encode('utf8'))
headers = {
"Authorization": "Bearer %s" % res['access_token']
}
para = {
"name": "file_path",
"parents": "google_drive_folder_id"
}
files = {
'data': ('metadata', json.dumps(para), 'application/json; charset=UTF-8'),
# 'file': open("./gb.png", "rb")
'file': ('mimeType', open("file_path", "rb"))
}
r = requests.post(
"https://www.googleapis.com/upload/drive/v3/files?uploadType=multipart",
headers=headers,
files=files
)
print(r.text)
For generating the client ID, client secret, and refresh token, you can follow Google's OAuth 2.0 documentation.
Maybe the question is a little bit outdated, but I found an easy way to upload files to Google Drive from Python:
pip install gdrive-python
Then you have to allow the script to upload files on your Google account with this command and follow the instructions:
python -m drive about
Finally, upload the file:
from gdrive import GDrive
drive = GDrive()
drive.upload('path/to/file')
More info on the GitHub repo: https://github.com/vittoriopippi/gdrive-python
Hi, I want to use the Google API client to create service accounts.
Here is my current code:
base_url = f"https://iam.googleapis.com/v1/projects/{project}/serviceAccounts"
auth = f"?access_token={access_token}"
data = {"accountId": name,
"serviceAccount": {
"displayName": name
}}
Create a service Account
r = requests.post(base_url + auth, json=data)
try:
r.raise_for_status()
except requests.HTTPError:
if r.status_code != 409:
raise
This works, but it uses the requests package.
I want to use googleapiclient
from googleapiclient.discovery import build
credentials = GoogleCredentials.get_application_default()
api = build(service, version, credentials=credentials)
Then, where do I find information on how to use this api object?
I've tried:
api.projects().serviceAccounts.create(name=name).execute()
But this does not work, and I don't know how to find what arguments are expected or required.
You can find the GCP IAM API documentation here.
The arguments required and values are documented there.
For anyone else who is struggling:
Check out the API Explorer to get the format of the request.
For example, if the endpoint is iam.projects.serviceAccounts.get
and you need to provide name = "projects/project/serviceAccounts/sa@gsc.googleserviceaccounts.com",
Then your call will look like:
from googleapiclient.discovery import build
credentials = GoogleCredentials.get_application_default()
api = build(service, version, credentials=credentials)
sa = api.projects().serviceAccounts().get(name="projects/project/serviceAccounts/sa@gsc.googleserviceaccounts.com").execute()
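And for the original create call, the IAM client takes the parent project as name and the account details in body; a sketch based on the IAM REST reference, where the project and account IDs are placeholders:

from googleapiclient.discovery import build
from oauth2client.client import GoogleCredentials

credentials = GoogleCredentials.get_application_default()
iam = build('iam', 'v1', credentials=credentials)

# "name" is the parent project; "accountId" becomes the part before the @ in the SA email.
response = iam.projects().serviceAccounts().create(
    name='projects/my-project-id',
    body={
        'accountId': 'my-service-account',
        'serviceAccount': {'displayName': 'my-service-account'},
    },
).execute()
print(response['email'])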
Hope this helps someone.