Query the Google Groups API from a GCP instance/function with Python

I'm trying to create a script that queries the Google Groups API from a GCP instance. The instance has a service account (SA) attached to it; this SA has the scope 'https://www.googleapis.com/auth/admin.directory.group.readonly' allowed in GSuite, and the user is also set up in GSuite with a custom role attached to it (list groups).
For the SA I created a key file in the GCP console. Then I get the credentials as the documentation says:
from googleapiclient.discovery import build
from google.oauth2 import service_account

creds = service_account.Credentials.from_service_account_file(
    SERVICE_ACCOUNT_FILE, scopes=SCOPES)
Then I add the user to act as:
creds = creds.with_subject('user@domain.com')
service = build('admin', 'directory_v1', credentials=creds)
results = service.groups().list(domain=tenant, maxResults=10,
                                orderBy='email',
                                query='email:{}*'.format(group_name)).execute()
Then I query the API; everything works perfectly and I get the groups.
So my question is:
Is there a way to use the SA attached to the instance without generating the JSON key file? For example, get the compute instance / default credentials from the instance metadata and then somehow authenticate them to the GSuite API?
Or is there a way to query groups without hitting the GSuite API at all, by calling something from within GCP?

You should read the following article on the official GCP documentation page.
Here is an example of how to bind a service account to a VM:
gcloud compute instances create example-vm \
    --service-account my-sa@my-project.iam.gserviceaccount.com \
    --scopes https://www.googleapis.com/auth/admin.directory.group.readonly
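Once the SA is attached, code running on that VM can pick up its credentials straight from the metadata server, with no key file. Here is a minimal sketch; these default credentials work for regular GCP APIs, but, as the answers below note, they cannot be used with with_subject for GSuite domain-wide delegation:

import google.auth
from google.auth.transport.requests import Request

# Picks up the attached service account from the instance metadata server.
credentials, project_id = google.auth.default(
    scopes=['https://www.googleapis.com/auth/admin.directory.group.readonly'])
credentials.refresh(Request())
print(credentials.service_account_email)
print(credentials.token)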

The answers I got from Google:
No, it's not possible without generating a private key for the SA for impersonation.
No, the right way to get the groups is to query GSuite's APIs.

Related

GCP user list using python

How can I get a list of users in a GCP account using Python? I can't find how to authorize with Python in the account and get the list. Can anybody help me?
I am assuming that you are just getting started with Google Cloud and the Python SDKs. If you are already experienced, skip to the bottom of my answer for the actual example code.
The documentation for the Google Cloud Python SDKs can be hard to figure out. The key detail is that Google documents the APIs using automated tools: Google publishes a discovery document that SDKs can read to automatically build the APIs. This might appear strange at first, but it is very clever when you think about it, because the SDKs automatically update themselves to support the latest API implementation.
Start with the root document: Google API Client Library for Python Docs
Near the bottom is the link for documentation:
Library reference documentation by API
For your case, listing users with IAM bindings in a project, scroll down to cloudresourcemanager. Sometimes there are multiple API versions. Usually, pick the latest version. In this case, v3.
Knowing which API to use comes with experience. As you develop more software in Google Cloud, the logic of the architecture becomes second nature.
Cloud Resource Manager API
The API provides multiple Instance Methods. In your case, the instance method is projects.
Cloud Resource Manager API - projects
Within projects are Instance Methods. In your case, getIamPolicy().
getIamPolicy(resource, body=None, x__xgafv=None)
Sometimes you need to review the REST API to understand parameters and returned values.
Resource Manager REST API: Method: projects.getIamPolicy
For example, to understand the response from the Python SDK API, review the response documented by the REST API which includes several examples:
Resource Manager REST API: Policy
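For orientation, the policy returned by getIamPolicy() is shaped roughly like the dictionary below (illustrative only; the roles, members and etag are made up):

policy = {
    'version': 1,
    'etag': 'BwX...',
    'bindings': [
        {'role': 'roles/owner',
         'members': ['user:alice@example.com']},
        {'role': 'roles/viewer',
         'members': ['serviceAccount:my-sa@development-123456.iam.gserviceaccount.com',
                     'group:devs@example.com']},
    ],
}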
Now that I have covered the basics of discovering how to use the documentation, let's create an example that will list the roles and IAM members.
Import the required Python libraries:
from google.oauth2 import service_account
import googleapiclient.discovery
Create a variable with your Project ID. Note: do not use Project Name.
PROJECT_ID='development-123456'
Note: In the following explanation, I use a service account. Later in this answer, I show an example using ADC (Application Default Credentials) set up by the Google Cloud CLI (gcloud).
Create a variable with the full pathname to your Google Cloud Service Account JSON Key file:
SA_FILE='/config/service-account.json'
Create a variable for the required Google Cloud IAM Scopes. Typically I use the following scope as I prefer to control permissions via IAM Roles assigned to the service account:
SCOPES=['https://www.googleapis.com/auth/cloud-platform']
Create OAuth credentials from the service account:
credentials = service_account.Credentials.from_service_account_file(
    filename=SA_FILE,
    scopes=SCOPES)
Now we are at the point to start using the API documentation. The following code builds the API discovery document and loads the APIs for cloudresourcemanager:
service = googleapiclient.discovery.build(
    'cloudresourcemanager',
    'v3',
    credentials=credentials)
Now call the API, which returns a JSON response detailing the roles and the members bound to the project:
resource = 'projects/' + PROJECT_ID
response = service.projects().getIamPolicy(resource=resource, body={}).execute()
The following is simple code to print part of the returned JSON:
for binding in response['bindings']:
    print('Role:', binding['role'])
    for member in binding['members']:
        print(member)
Complete example that uses ADC (Application Default Credentials):
import googleapiclient.discovery
PROJECT_ID='development-123456'
service = googleapiclient.discovery.build('cloudresourcemanager', 'v3')
resource = 'projects/' + PROJECT_ID
response = service.projects().getIamPolicy(resource=resource, body={}).execute()
for binding in response['bindings']:
    print('Role:', binding['role'])
    for member in binding['members']:
        print(member)
Complete example using a service account:
from google.oauth2 import service_account
import googleapiclient.discovery
PROJECT_ID='development-123456'
SA_FILE='/config/service-account.json'
SCOPES=['https://www.googleapis.com/auth/cloud-platform']
credentials = service_account.Credentials.from_service_account_file(
    filename=SA_FILE,
    scopes=SCOPES)
service = googleapiclient.discovery.build(
    'cloudresourcemanager', 'v3', credentials=credentials)
resource = 'projects/' + PROJECT_ID
response = service.projects().getIamPolicy(resource=resource, body={}).execute()
for binding in response['bindings']:
    print('Role:', binding['role'])
    for member in binding['members']:
        print(member)

How to get Google OAUTH Token without using Gcloud Command Line

I am currently using the following code to get the OAUTH Token
import subprocess

command = 'gcloud auth print-access-token'
result = str(subprocess.Popen(command, universal_newlines=True, shell=True,
                              stdout=subprocess.PIPE,
                              stderr=subprocess.PIPE).communicate())
The result variable holds the OAuth token. This technique uses my currently logged-in gcloud config.
However, I am looking for a way to get the OAuth token without using the command line.
I am using this OAuth token to make CDAP calls to get the Google Dataflow pipeline execution details.
I checked some Google blogs. This is the one I think I should try, but it asks to create a consent screen, and it requires a one-time step to grant consent to the defined scopes; after that it should work.
Google Document
Shall I follow the steps in the above document, or is there any other way to get the OAuth token?
Is there a way to authenticate with a service account instead of a Google user account and get the OAuth token?
For an automated process, a service account is the recommended way. You can use the google-auth library for this. You can generate an access token like this:
import google.auth
from google.auth.transport import requests
# from google.oauth2 import service_account  # only needed for the key-file variant

# With the default credential (your user account or the Google Cloud component's
# service account, or the service account key file defined in the
# GOOGLE_APPLICATION_CREDENTIALS env var -> for platforms outside GCP)
credentials, project_id = google.auth.default(scopes=["https://www.googleapis.com/auth/cloud-platform"])

# With a service account key file (not recommended)
# credentials = service_account.Credentials.from_service_account_file(
#     'service-account.json', scopes=["https://www.googleapis.com/auth/cloud-platform"])

credentials.refresh(requests.Request())
print(credentials.token)
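If you just need to pass that token to a plain HTTP call (for example the CDAP endpoint mentioned in the question), a minimal sketch with the standard requests library (not google.auth.transport.requests) looks like this; the URL is a placeholder, not a real endpoint:

import requests

headers = {'Authorization': 'Bearer {}'.format(credentials.token)}
# hypothetical endpoint; replace with your CDAP / Dataflow URL
response = requests.get('https://example.com/api/v3/namespaces', headers=headers)
print(response.status_code)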
However, if you want to call Google Cloud APIs, I recommend using an authorized request object.
Here is an example of a BigQuery call. You can use a service account key file to generate your credentials as in my previous example.
import google.auth
from google.auth.transport.requests import AuthorizedSession

base_url = 'https://bigquery.googleapis.com'
credentials, project_id = google.auth.default(scopes=['https://www.googleapis.com/auth/cloud-platform'])
project_id = 'MyProjectId'  # override the auto-detected project if needed

authed_session = AuthorizedSession(credentials)
response = authed_session.request('GET', f'{base_url}/bigquery/v2/projects/{project_id}/jobs')
print(response.json())
EDIT
When you want to use Google APIs, a service account key file is not needed (and I recommend that you not use one) on your computer or on a GCP component. The Application Default Credentials are always sufficient.
When you are in your local environment, you must run the command gcloud auth application-default login. With this command, you register your personal account as the default credential when you run your app locally. (Of course, your user account email needs to be authorized on the component that you call.)
When you are in a GCP environment, each component has a default service account (or you can specify one when you configure the component). Thanks to this component "identity", you can use the default credential. (Of course, the service account email needs to be authorized on the component that you call.)
ONLY when you run an app automatically and outside GCP do you need a service account key file (for example, in a CI/CD system other than Cloud Build, or in an app deployed on another cloud provider or on premises).
Why is a service account key file not recommended? It's at least my recommendation, because this file is ..... a file!! That's the problem. You have a way to authenticate a service account in a simple file: you have to store it securely (it's a secret and an authentication method!!), you can copy it, you can send it by email, you can even commit it to a public Git repository... In addition, Google recommends rotating keys every 90 days, so they are a nightmare to rotate, trace and manage.

Using Google People API with Service Account

I'm using the Google People API to access my contacts.
I activated it in the Google Developers Console and created a project, a service account (ending with ....iam.gserviceaccount.com), and a key for authentication, which is stored in JSON format.
When I access the contacts, it seems to take the contacts of my service account rather than my Google account, which results in an empty list.
How can I tell the API to use my account rather than the service account?
This is the code I have so far:
from google.oauth2 import service_account
from googleapiclient.discovery import build
# pip install google-auth google-auth-httplib2 google-api-python-client
SCOPES = ['https://www.googleapis.com/auth/contacts.readonly']
KEY = '~/private.json'
credentials = service_account.Credentials.from_service_account_file(
    KEY, scopes=SCOPES)
service = build(
    serviceName='people', version='v1', credentials=credentials)
connections = service.people().connections().list(
    resourceName='people/me', personFields='names').execute()
print(connections)
# result: {}
A service account is NOT you. A service account is a dummy user: it has its own Google Drive account, Google Calendar and, apparently, Google Contacts. The reason you are seeing an empty result set is that you have not added any contacts to the service account's account.
Service accounts are most often used to grant access to data that the developer owns. For example, you can take the service account's email address and share one of your Google Drive folders with it; it will then have access to that folder in your Google Drive account. You can do the same with Google Calendar.
There are some APIs that do not give you the ability to share your data with other users: YouTube, AdWords, Blogger and Google Contacts, to name a few.
You can't use a service account to access your personal Google contacts. Your best bet would be to authenticate your application with OAuth2 and access them that way (see the sketch below).
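For completeness, here is a minimal sketch of that OAuth2 route using the installed-app flow (this assumes google-auth-oauthlib is installed and that client_secret.json is an OAuth client ID file downloaded from the console, not the service account key):

from google_auth_oauthlib.flow import InstalledAppFlow
from googleapiclient.discovery import build

SCOPES = ['https://www.googleapis.com/auth/contacts.readonly']

# One-time browser consent, signed in as your own Google account
flow = InstalledAppFlow.from_client_secrets_file('client_secret.json', SCOPES)
creds = flow.run_local_server(port=0)

service = build('people', 'v1', credentials=creds)
connections = service.people().connections().list(
    resourceName='people/me', personFields='names').execute()
print(connections)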
Note about Google Workspace
If you have a Google Workspace account, a service account can be configured to act on behalf of a user on the domain, but only a user on the domain. See: Perform Google Workspace domain-wide delegation of authority
I'm not a Python expert, but I've just performed the task the OP is talking about in .NET and I am pretty sure it's feasible with Python too.
So it looks like all that needs to be done is delegating domain-wide authority to the SA, i.e. assigning the required scopes to your SA; in my case it was https://www.googleapis.com/auth/contacts.readonly.
Then you should make your call and specify the account you're trying to impersonate (I took the Python example from here):
from google.oauth2 import service_account

SCOPES = ['https://www.googleapis.com/auth/sqlservice.admin']
SERVICE_ACCOUNT_FILE = '/path/to/service.json'

credentials = service_account.Credentials.from_service_account_file(
    SERVICE_ACCOUNT_FILE, scopes=SCOPES)

# this is the line you apparently were missing
delegated_credentials = credentials.with_subject('user@example.org')
Then you'll be able to do the people/me calls. It worked for me in .NET, as I said.
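For reference, a minimal sketch of that call in Python, using delegated_credentials from the snippet above (and assuming SCOPES was set to the contacts.readonly scope for this use case rather than the sqlservice.admin scope of the generic example):

from googleapiclient.discovery import build

service = build('people', 'v1', credentials=delegated_credentials)
connections = service.people().connections().list(
    resourceName='people/me', personFields='names').execute()
print(connections)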

Authorizing a python script to access the GData API without the OAuth2 user flow

I'm writing a small Python script that will retrieve a list of my Google Contacts (using the Google Contacts API) and will randomly suggest one person for me to contact (a good way to automate keeping in touch with friends!).
This is just a standalone script that I plan to schedule on a cron job. The problem is that Google seems to require OAuth2-style authentication, where the user (me) has to approve the access, after which the app receives an authorization token it can use to query the user's (my) contacts.
Since I'm only accessing my own data, is there a way to "pre-authorize" myself? Ideally I'd love to be able to retrieve some authorization token once and then run the script, passing that token as an environment variable:
AUTH_TOKEN=12345 python my_script.py
That way it wouldn't require any user input/interaction beyond that one-time authorization.
The implementation you're describing invokes the full "three-legged" OAuth handshake, which requires explicit user consent. If you don't need user consent, you can instead utilize "two-legged" OAuth via a Google service account, which is tied to an application, rather than a user. Once you've granted permission to your service account to access your contacts, you can use the oauth2client ServiceAccountCredentials class to directly access GData without requiring user consent.
Here's the two-legged authentication example from the Google service account documentation:
import json
from httplib2 import Http
from oauth2client.service_account import ServiceAccountCredentials
from apiclient.discovery import build

scopes = ['https://www.googleapis.com/auth/sqlservice.admin']
credentials = ServiceAccountCredentials.from_json_keyfile_name(
    'service-account.json', scopes)
sqladmin = build('sqladmin', 'v1beta3', credentials=credentials)
response = sqladmin.instances().list(project='examinable-example-123').execute()
print(response)
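A rough sketch adapting the same pattern to contacts via the People API, still using the old oauth2client style shown above (the key file name is a placeholder; for a Google Workspace domain you would additionally impersonate a real user with create_delegated(), as the previous answer explains):

from oauth2client.service_account import ServiceAccountCredentials
from apiclient.discovery import build

scopes = ['https://www.googleapis.com/auth/contacts.readonly']
credentials = ServiceAccountCredentials.from_json_keyfile_name(
    'service-account.json', scopes)
# For a Google Workspace domain, impersonate a real user:
# credentials = credentials.create_delegated('you@yourdomain.com')

people = build('people', 'v1', credentials=credentials)
connections = people.people().connections().list(
    resourceName='people/me', personFields='names').execute()
print(connections)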

Google AppEngine to Fusion Tables with Service Accounts

Late to the game on migrating to the /v1 Fusion Tables API, but there's no holding off any longer.
I'm using Python on App Engine and trying to connect to Google Fusion Tables with Google service accounts (the more complicated cousin of OAuth2 for server-side apps, which uses JSON Web Tokens).
I found another question that pointed me to some documentation for using service accounts with the Google Prediction API:
Fusion Table and Google Service Accounts
So far I've got
import httplib2
from google.appengine.api import memcache
from oauth2client.appengine import AppAssertionCredentials
from apiclient.discovery import build

credentials = AppAssertionCredentials(scope='https://www.googleapis.com/auth/fusiontables')
http = credentials.authorize(httplib2.Http(memcache))
service = build("fusiontables", "v1", http=http)

# list the tables
tables = service.table().list().execute()  # <-- ERROR 401 invalid credentials here
Does anyone have an example of connecting to Fusion Tables on AppEngine using Service Accounts they might be able to share? Or something nice online?
Thanks
This actually does work. The important part is that you have to give the App Engine service account access to your Fusion Table. If you are writing, the account needs write access. For help see: https://developers.google.com/api-client-library/python/start/installation (look for Getting started: Quickstart).
Your App Engine service account will be something like your-app-id@appspot.gserviceaccount.com.
You must also make the App Engine service account a team member in the API console and give it "can edit" privileges.
import logging
import httplib2
from time import gmtime, strftime
from google.appengine.api import app_identity
from oauth2client.appengine import AppAssertionCredentials
from apiclient.discovery import build

SCOPE = 'https://www.googleapis.com/auth/fusiontables'
PROJECT_NUMBER = 'XXXXXXXX'  # REPLACE WITH YOUR Project ID

# Create a new API service for interacting with Fusion Tables
credentials = AppAssertionCredentials(scope=SCOPE)
http = credentials.authorize(httplib2.Http())
logging.info('QQQ: accountname: %s' % app_identity.get_service_account_name())
service = build('fusiontables', 'v1', http=http, developerKey='YOUR KEY HERE FROM API CONSOLE')

def log(value1, value2=None):
    tableid = 'YOUR TABLE ID FROM FUSION TABLES'
    now = strftime("%Y-%m-%d %H:%M:%S", gmtime())
    service.query().sql(sql="INSERT INTO %s (Temperature,Date) values(%s,'%s')" % (tableid, value1, now)).execute()
To clarify Ralph Yozzo's answer: you need to take the value of 'client_email' from the JSON file you downloaded when you created your service account credentials (the same file you load when using ServiceAccountCredentials.from_json_keyfile_name('service_acct.json') with the new oauth2client library) and enter it as the email address in your table's sharing dialog.
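If you're not sure which address to share the table with, the email can be read straight out of the key file (the path here is just an example):

import json

with open('service_acct.json') as f:
    # 'client_email' is the address to paste into the sharing dialog
    print(json.load(f)['client_email'])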
Since tables in Fusion Tables are owned by individual Gmail accounts rather than the service account associated with an API console project, AppAssertionCredentials probably won't work. It would make for an interesting feature request, though:
http://code.google.com/p/fusion-tables/issues/list
The best online resource I have found for help connecting Python App Engine to the Fusion Tables API with OAuth2 is:
Google APIs Client Library for Python
The slide presentation is helpful for understanding the online samples and why decorators are used (see the sketch at the end of this answer).
Also useful for understanding whether to use the app's service account or user accounts to authenticate is:
Using OAuth 2.0 to Access Google APIs
Consider installing the Google APIs Client Library for Python.
Apart from the scope, OAuth2 is more or less common to all Google APIs, not just Fusion Tables.
Once OAuth2 is working, see the Google Fusion Tables API.
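As a rough illustration of the decorator approach mentioned earlier in this answer (user-account OAuth2 with webapp2 on App Engine; the client ID and secret are placeholders, and this is a sketch rather than the canonical sample):

import webapp2
from apiclient.discovery import build
from oauth2client.appengine import OAuth2Decorator

decorator = OAuth2Decorator(
    client_id='YOUR-CLIENT-ID.apps.googleusercontent.com',
    client_secret='YOUR-CLIENT-SECRET',
    scope='https://www.googleapis.com/auth/fusiontables')

service = build('fusiontables', 'v1')

class MainHandler(webapp2.RequestHandler):
    @decorator.oauth_required
    def get(self):
        # decorator.http() is an httplib2.Http already authorized as the signed-in user
        tables = service.table().list().execute(http=decorator.http())
        self.response.write(repr(tables))

app = webapp2.WSGIApplication(
    [('/', MainHandler),
     (decorator.callback_path, decorator.callback_handler())],
    debug=True)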
In case you want it to work from a host other than Google App Engine or Google Compute Engine (e.g. from localhost for testing), you should use ServiceAccountCredentials created from a JSON key file that you can generate and download from your service account page.
from httplib2 import Http
from oauth2client.service_account import ServiceAccountCredentials
from apiclient.discovery import build

scopes = ['https://www.googleapis.com/auth/fusiontables']
keyfile = 'PATH TO YOUR SERVICE ACCOUNT KEY FILE'
FTID = 'FUSION TABLE ID'

credentials = ServiceAccountCredentials.from_json_keyfile_name(keyfile, scopes)
http_auth = credentials.authorize(Http())  # no memcache needed outside App Engine
service = build('fusiontables', 'v2', http=http_auth)

def insert(title, description):
    sqlInsert = "INSERT INTO {0} (Title,Description) values('{1}','{2}')".format(FTID, title, description)
    service.query().sql(sql=sqlInsert).execute()
Refer to Google's page on service accounts for explanations.
