Is there a way to authenticate gspread with the default service account? - python

If I want to create/read/update spreadsheets using gspread, I know I first have to authenticate, like so:
import gspread
gc = gspread.service_account()
I can also specify a filename there to point to a service account JSON key, but is there a way to tell gspread to use the default service account credentials without pointing to a JSON file?
My use case is that I want to run gspread in a VM (or Cloud Function) that already comes with an IAM role, and I can't figure out where to get the JSON file from. I also don't want to copy the JSON to the VM unnecessarily.

You can use google.auth to get the credentials of the default service account.
Then you can use gspread.authorize() with these credentials:
import google.auth
import gspread
credentials, project_id = google.auth.default(
    scopes=[
        'https://spreadsheets.google.com/feeds',
        'https://www.googleapis.com/auth/drive'
    ]
)
gc = gspread.authorize(credentials)

As for your use case, Cloud Functions has a runtime service account feature, which lets you either use the default service account or attach any other service account to the function.
With the google-auth library you can then get its credentials without a JSON key file:
import google.auth
credentials, project_id = google.auth.default()

Related

GCP Python Compute Engine - list VM's

I have the following Python3 script:
import os, json
import googleapiclient.discovery
from google.oauth2 import service_account
from google.cloud import storage

storage_client = storage.Client.from_service_account_json('gcp-sa.json')
buckets = list(storage_client.list_buckets())
print(buckets)

compute = googleapiclient.discovery.build('compute', 'v1')

def list_instances(compute, project, zone):
    result = compute.instances().list(project=project, zone=zone).execute()
    return result['items'] if 'items' in result else None

list_instances(compute, "my-project", "my-zone")
Listing the buckets on its own works fine, which tells me that my service account (which has read access to the whole project) should work. How can I now list VMs? Using the code above, I get
raise exceptions.DefaultCredentialsError(_HELP_MESSAGE)
google.auth.exceptions.DefaultCredentialsError: Could not automatically determine credentials. Please set GOOGLE_APPLICATION_CREDENTIALS or explicitly create credentials and re-run the application. For more information, please see https://cloud.google.com/docs/authentication/getting-started
So that tells me that I somehow have to pass the service account json. How is that possible?
Thanks!!
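For what it's worth, the error comes from the Compute client, which never sees the key file the storage client was given. One likely fix, sketched under the assumption that 'gcp-sa.json' sits next to the script, is to export that same file as Application Default Credentials before calling googleapiclient.discovery.build:

```python
import os

# Point the Google auth libraries at the same key file the storage
# client already loads explicitly. Any client created afterwards,
# including googleapiclient.discovery.build, picks it up as
# Application Default Credentials.
os.environ['GOOGLE_APPLICATION_CREDENTIALS'] = 'gcp-sa.json'
```

Alternatively, discovery.build accepts a credentials keyword directly, e.g. build('compute', 'v1', credentials=service_account.Credentials.from_service_account_file('gcp-sa.json')).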

query google groups api from gcp instance/function python

I'm trying to create a script to query the Google Groups API from a GCP instance. The instance has an SA attached to it; this SA has the scope 'https://www.googleapis.com/auth/admin.directory.group.readonly' allowed in GSuite, and the user is also set up in GSuite with a custom role attached to it (list groups).
For the SA I created a key file in the GCP console. Then I get the credentials as the documentation says:
from googleapiclient.discovery import build
from google.oauth2 import service_account
creds = service_account.Credentials.from_service_account_file(
    SERVICE_ACCOUNT_FILE, scopes=SCOPES)
Then I add the user to act as:
creds = creds.with_subject('user@domain.com')
service = build('admin', 'directory_v1', credentials=creds)
results = service.groups().list(domain=tenant, maxResults=10,
                                orderBy='email',
                                query='email:{}*'.format(group_name)).execute()
Then I query API, everything works perfect and I get the groups.
So my question is:
Is there a way to use the SA attached to the instance without generating the JSON key file, e.g. by getting the compute instance's default credentials from the instance metadata and then somehow authenticating to the GSuite API with them?
Or is there a way to query groups without hitting the GSuite API at all, by calling something from within GCP?
You should read the corresponding article in the official GCP documentation.
Here is an example of how to bind a service account to a VM:
gcloud compute instances create example-vm \
    --service-account my-sa@my-project.iam.gserviceaccount.com \
    --scopes https://www.googleapis.com/auth/admin.directory.group.readonly
The answers I got from Google:
No, it's not possible without generating a private key for the SA for impersonation.
No, the right way of getting the groups is to query GSuite's APIs.

How to get delegated credentials objects for invoking google apis?

I am trying to fetch GSuite alerts via the API. I have created a service account as per their docs and I have assigned that service account to my Google Cloud Function.
I do not want to use environment variables or upload credentials along with the source code; I want to leverage the default service account used by the function.
from googleapiclient.discovery import build

def get_credentials():
    # If you know the credentials file location (when you upload the JSON
    # credentials file or specify it in an environment variable), you can
    # easily get the credentials by specifying the path.
    # In the case of Cloud Functions, at least, I couldn't find that path,
    # as GOOGLE_APPLICATION_CREDENTIALS is empty in the Python runtime.
    # The code below works fine if you uncomment the following line:
    # credentials = ServiceAccountCredentials.from_json_keyfile_name(key_file_location)
    credentials = <how to get the default credentials object for the default service account?>
    delegated_credentials = credentials.create_delegated('admin@alertcenter1.bigr.name').create_scoped(SCOPES)
    return delegated_credentials

def get_alerts(api_name, api_version, key_file_location=None):
    delegated_credentials = get_credentials()
    alertcli = build(api_name, api_version, credentials=delegated_credentials)
    resp = alertcli.alerts().list(pageToken=None).execute()
    print(resp)
Is there any way I can create a default credentials object? I have tried from google.auth import credentials, but that module does not contain a create_delegated function, and I have also tried ServiceAccountCredentials(), but that requires a signer.
Here is an example to use the Gmail API with delegated credentials. The service account credentials will need "Enable G Suite Domain-wide Delegation" enabled.
from google.oauth2 import service_account
from googleapiclient.discovery import build

credentials = service_account.Credentials.from_service_account_file(
    credentials_file,
    scopes=['https://www.googleapis.com/auth/gmail.send'])
impersonate = 'username@example.com'
credentials = credentials.with_subject(impersonate)
service = build('gmail', 'v1', credentials=credentials)
You can use the google.auth.default function to get the default credentials and use them to make an IAM signer, which can then be used to create new service account credentials with the delegated email address as subject. I have a more detailed answer for a similar question.
There is also a Google Cloud Platform GitHub repository with some documentation about this method.
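A minimal sketch of that IAM-signer approach, assuming the function's runtime service account has the Service Account Token Creator role on itself and domain-wide delegation enabled (the subject and scopes below are placeholders):

```python
import google.auth
from google.auth import iam
from google.auth.transport.requests import Request
from google.oauth2 import service_account

TOKEN_URI = 'https://oauth2.googleapis.com/token'

def make_delegated_credentials(subject, scopes):
    # Default credentials of the attached service account; no key file.
    source_credentials, _ = google.auth.default(
        scopes=['https://www.googleapis.com/auth/cloud-platform'])
    # Refresh so service_account_email is populated on compute credentials.
    source_credentials.refresh(Request())
    # Signer backed by the IAM signBlob API instead of a private key.
    signer = iam.Signer(Request(), source_credentials,
                        source_credentials.service_account_email)
    # Mint service-account credentials impersonating `subject`.
    return service_account.Credentials(
        signer,
        source_credentials.service_account_email,
        TOKEN_URI,
        scopes=scopes,
        subject=subject)
```

This only runs inside GCP (or wherever default credentials resolve), since the signing happens via the IAM API at token time.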

How do I authenticate with the Google Storage SDK using a JSON service account key?

I'm having a look at the API reference located at:
https://googlecloudplatform.github.io/google-cloud-python/latest/storage/client.html
How am I meant to authenticate when using a service account? I don't see how to create the necessary Credentials object.
https://google-auth.readthedocs.io/en/stable/user-guide.html#service-account-private-key-files
from google.oauth2 import service_account

credentials = service_account.Credentials.from_service_account_file(
    '/path/to/key.json')
scoped_credentials = credentials.with_scopes(
    ['https://www.googleapis.com/auth/cloud-platform'])

how to obtain GCR access token with python / listing docker images

The access token I'm getting with gcloud auth print-access-token is obviously a different access token than the one I can get with some basic Python code:
export GOOGLE_APPLICATION_CREDENTIALS=/the-credentials.json
from oauth2client.client import GoogleCredentials
credentials = GoogleCredentials.get_application_default()
credentials.get_access_token()
What I am trying to do is get a token that would work with:
curl -u _token:<mytoken> https://eu.gcr.io/v2/my-project/my-docker-image/tags/list
I'd prefer not to install the gcloud utility as a dependency for my app, hence my attempts to obtain the access token programmatically via OAuth Google credentials.
I know this is a very old question, but I just faced the exact same problem of needing an access token in Python and not being able to generate it, and managed to make it work.
What you need is the credentials.token attribute, except it won't be set when you first create the credentials object (it is None). To generate a token, the credentials must first be used by a Google client library, which in my case meant calling the googleapiclient.discovery.build method:
sqladmin = googleapiclient.discovery.build('sqladmin', 'v1beta4', credentials=credentials)
response = sqladmin.instances().get(project=PROJECT_ID, instance=INSTANCE_ID).execute()
print(json.dumps(response))
After which the ACCESS_TOKEN could be properly generated using
access_token = credentials.token
I've also tested it using google-cloud-storage as a way to exercise the credentials, and it also worked, by just trying to access a bucket in GCS through the appropriate Python library:
from google.oauth2 import service_account
from google.cloud import storage

PROJECT_ID = your_project_id_here
SCOPES = ['https://www.googleapis.com/auth/sqlservice.admin']
SERVICE_ACCOUNT_FILE = '/path/to/service.json'

credentials = service_account.Credentials.from_service_account_file(
    SERVICE_ACCOUNT_FILE, scopes=SCOPES)

try:
    list(storage.Client(project=PROJECT_ID, credentials=credentials).bucket('random_bucket').list_blobs())
except Exception:
    print("Failed because no bucket named 'random_bucket' exists in your project... "
          "but that doesn't matter: the library tried to use the credentials and, "
          "in doing so, generated an access token, which is what we're interested in.")

access_token = credentials.token
print(access_token)
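As an aside, you can also mint the token directly by refreshing the credentials yourself, rather than making a throwaway API call; a sketch, with key path and scopes as placeholders:

```python
from google.auth.transport.requests import Request
from google.oauth2 import service_account

def fetch_access_token(key_path, scopes):
    # refresh() fills in credentials.token immediately, without routing
    # a request through any other Google client library first.
    credentials = service_account.Credentials.from_service_account_file(
        key_path, scopes=scopes)
    credentials.refresh(Request())
    return credentials.token
```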
So I think there are a few questions:
gcloud auth print-access-token vs GoogleCredentials.get_application_default()
gcloud doesn't set application default credentials anymore when performing a gcloud auth login, so the access token you're getting from gcloud auth print-access-token corresponds to the user you used to log in.
As long as you follow the instructions to create ADCs for a service account, that account has the necessary permissions, and the environment from which you are executing the script has access to the env var and the adc.json file, you should be fine.
How to make curl work
The Docker Registry API specifies that a token exchange should happen, swapping your Basic auth (i.e. Authorization: Basic base64(_token:<gcloud_access_token>)) for a short-lived Bearer token. This process can be a bit involved, but is documented here under "How to authenticate" and "Requesting a Token". Replace auth.docker.io/token with eu.gcr.io/v2/token and service=registry.docker.io with service=eu.gcr.io, etc. Use curl -u oauth2accesstoken:<mytoken> here.
See also: How to list images and tags from the gcr.io Docker Registry using the HTTP API?
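The exchange described above can be sketched as a small helper that builds the token-endpoint URL and Basic header; the registry and repository names are examples, and the endpoint layout follows the Docker Registry v2 auth spec:

```python
import base64

def gcr_token_request(access_token, repository, registry='eu.gcr.io'):
    # URL of the registry's token endpoint, asking for pull access
    # to one repository.
    url = ('https://{r}/v2/token?service={r}&scope=repository:{p}:pull'
           .format(r=registry, p=repository))
    # Basic auth: the literal username "oauth2accesstoken" and the
    # gcloud/google-auth access token as password.
    basic = base64.b64encode(
        'oauth2accesstoken:{}'.format(access_token).encode()).decode()
    return url, {'Authorization': 'Basic ' + basic}
```

GETting that URL with that header should return a JSON body whose token field is the short-lived Bearer token for the /v2/ API.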
Avoid the question entirely
We have a Python lib that might be relevant to your needs:
https://github.com/google/containerregistry
