Let it be stated that this is the first time I'm working with Google Cloud, so this is probably a noob question.
I want to use the Google Cloud Speech library to do some speech to text on an audio file. I also want to explicitly state within the code which Google Cloud service account credentials to use by giving the private key file. How do I do so?
It seems that what I want is a mix of this quickstart for speech recognition and this example of how to set credentials within the code (the explicit() part).
I tried doing so, but explicit() uses google.cloud.storage to set the client,
from google.cloud import storage
storage_client = storage.Client.from_service_account_json('service_account.json')
in order to make the API request.
Setting
client = storage.Client.from_service_account_json('service_account.json')
and then running
client.recognize(config, audio)
obviously throws an error saying that client doesn't have that attribute. My guess is that what I need is something similar, but for google.cloud.speech? I have tried looking through the documentation - am I missing something?
Almost all of the Google Client libraries follow the same design pattern for credentials. In the example below, the code loads the credentials and then creates the client using those credentials.
Link to the Speech Client
from google.oauth2 import service_account
from google.cloud import speech_v1
from google.cloud.speech_v1 import enums
SCOPES = ["https://www.googleapis.com/auth/cloud-platform"]
SERVICE_ACCOUNT_FILE = 'service-account.json'
cred = service_account.Credentials.from_service_account_file(
SERVICE_ACCOUNT_FILE, scopes=SCOPES)
client = speech_v1.SpeechClient(credentials=cred)
OR (note that this form does not specify scopes, which is usually OK):
client = speech_v1.SpeechClient.from_service_account_file(SERVICE_ACCOUNT_FILE)
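For completeness, here is a minimal sketch that ties the credential loading above to an actual recognition call. The bucket URI, encoding, and sample rate are placeholders you would replace with your own audio, and the types/enums classes are the ones shipped with the 1.x google-cloud-speech releases that the imports above correspond to.
from google.oauth2 import service_account
from google.cloud import speech_v1
from google.cloud.speech_v1 import enums, types

SERVICE_ACCOUNT_FILE = 'service-account.json'

# Load the credentials explicitly and hand them to the Speech client.
cred = service_account.Credentials.from_service_account_file(SERVICE_ACCOUNT_FILE)
client = speech_v1.SpeechClient(credentials=cred)

# Placeholder audio file and settings - substitute your own.
audio = types.RecognitionAudio(uri='gs://your-bucket/your-audio.flac')
config = types.RecognitionConfig(
    encoding=enums.RecognitionConfig.AudioEncoding.FLAC,
    sample_rate_hertz=16000,
    language_code='en-US')

response = client.recognize(config, audio)
for result in response.results:
    print(result.alternatives[0].transcript)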
Related
How can I get a list of users in a GCP account using Python? I can't figure out how to authorize with Python and retrieve the list. Can anybody help me?
I am assuming that you are just getting started with Google Cloud and the Python SDKs. If you are already experienced, skip to the bottom of my answer for the actual example code.
The documentation for the Google Cloud Python SDKs can be hard to figure out. The key detail is that Google documents the APIs using automated tools: Google publishes a discovery document that the SDKs read to build the API surface automatically. This might appear strange at first, but it is very clever when you think about it: the SDKs automatically update themselves to support the latest API implementation.
Start with the root document: Google API Client Library for Python Docs
Near the bottom is the link for documentation:
Library reference documentation by API
For your case, listing users with IAM bindings in a project, scroll down to cloudresourcemanager. Sometimes there are multiple API versions. Usually, pick the latest version. In this case, v3.
Knowing which API to use comes from experience. As you develop more and more software in Google Cloud, the logic of the architecture becomes second nature.
Cloud Resource Manager API
The API provides multiple Instance Methods. In your case, the instance method is projects.
Cloud Resource Manager API - projects
Within projects are Instance Methods. In your case, getIamPolicy().
getIamPolicy(resource, body=None, x__xgafv=None)
Sometimes you need to review the REST API to understand parameters and returned values.
Resource Manager REST API: Method: projects.getIamPolicy
For example, to understand the response from the Python SDK API, review the response documented by the REST API which includes several examples:
Resource Manager REST API: Policy
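To give a rough idea of what to expect, the policy comes back as a dictionary shaped roughly like the following. This is only an illustration; the values shown here are made-up placeholders.
# Illustrative shape of a getIamPolicy() response (values are placeholders).
policy = {
    'version': 1,
    'etag': 'BwXhqDhc...',
    'bindings': [
        {
            'role': 'roles/owner',
            'members': [
                'user:someone@example.com',
                'serviceAccount:build@development-123456.iam.gserviceaccount.com'
            ]
        }
    ]
}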
Now that I have covered the basics of discovering how to use the documentation, let's create an example that will list the roles and IAM members.
Import the required Python libraries:
from google.oauth2 import service_account
import googleapiclient.discovery
Create a variable with your Project ID. Note: do not use Project Name.
PROJECT_ID='development-123456'
Note: In the following explanation, I use a service account. Later in this answer, I show an example using ADC (Application Default Credentials) set up by the Google Cloud CLI (gcloud).
Create a variable with the full pathname to your Google Cloud Service Account JSON Key file:
SA_FILE='/config/service-account.json'
Create a variable for the required Google Cloud IAM Scopes. Typically I use the following scope as I prefer to control permissions via IAM Roles assigned to the service account:
SCOPES=['https://www.googleapis.com/auth/cloud-platform']
Create OAuth credentials from the service account:
credentials = service_account.Credentials.from_service_account_file(
filename=SA_FILE,
scopes=SCOPES)
Now we are at the point where we can start using the API documentation. The following code builds the API discovery document and loads the APIs for cloudresourcemanager:
service = googleapiclient.discovery.build(
'cloudresourcemanager',
'v3',
credentials=credentials)
Now call the API, which will return a JSON response detailing the roles and members with bindings to the project:
resource = 'projects/' + PROJECT_ID
response = service.projects().getIamPolicy(resource=resource, body={}).execute()
The following is simple code to print part of the returned JSON:
for binding in response['bindings']:
    print('Role:', binding['role'])
    for member in binding['members']:
        print(member)
Complete example that uses ADC (Application Default Credentials):
import googleapiclient.discovery
PROJECT_ID='development-123456'
service = googleapiclient.discovery.build('cloudresourcemanager', 'v3')
resource = 'projects/' + PROJECT_ID
response = service.projects().getIamPolicy(resource=resource, body={}).execute()
for binding in response['bindings']:
    print('Role:', binding['role'])
    for member in binding['members']:
        print(member)
Complete example using a service account:
from google.oauth2 import service_account
import googleapiclient.discovery
PROJECT_ID='development-123456'
SA_FILE='/config/service-account.json'
SCOPES=['https://www.googleapis.com/auth/cloud-platform']
credentials = service_account.Credentials.from_service_account_file(
filename=SA_FILE,
scopes=SCOPES)
service = googleapiclient.discovery.build(
'cloudresourcemanager', 'v3', credentials=credentials)
resource = 'projects/' + PROJECT_ID
response = service.projects().getIamPolicy(resource=resource, body={}).execute()
for binding in response['bindings']:
    print('Role:', binding['role'])
    for member in binding['members']:
        print(member)
I am trying to use the Google DoubleClick Bid Manager (DBM) API to download reports. I want to make this automatic, without manual authentication, but all I can find is the GitHub repo of DBM samples: https://github.com/googleads/googleads-bidmanager-examples
This sample opens up a browser for manual authentication.
Is there any way to do it automatically using Python?
You can use a Google Cloud Platform service account for authentication as well.
Create a service account and create/download the JSON key
Add the service account to the DBM (now Display & Video 360) account you want to access
Use the Python Google API client library (also see this Google DV360 tutorial, the authentication part is the same):
from googleapiclient import discovery
from oauth2client.service_account import ServiceAccountCredentials
# SETTINGS - GOOGLE GENERAL
GOOGLE_JSON_KEYFILE = "<your-keyfile>.json" # Google Cloud Platform Service Account JSON keyfile
# SETTINGS - GOOGLE DV360 API
GOOGLE_DV360_API_VERSION = 'v1'
GOOGLE_DV360_API_SCOPES = ['https://www.googleapis.com/auth/display-video']
# Google D&V360 API service
def get_dv360_service():
    credentials = ServiceAccountCredentials.from_json_keyfile_name(
        GOOGLE_JSON_KEYFILE,
        scopes=GOOGLE_DV360_API_SCOPES)
    return discovery.build('displayvideo', GOOGLE_DV360_API_VERSION, credentials=credentials, cache_discovery=False)
dv360_service = get_dv360_service()
#dv360_service.-> get your reports
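As a rough sketch of what you can do with the returned service object: the Display & Video 360 API exposes resources such as advertisers, so a first smoke test could look like the following (the partner ID here is a placeholder you would replace with your own):
# Placeholder partner ID - replace with your own.
response = dv360_service.advertisers().list(partnerId='1234567').execute()
for advertiser in response.get('advertisers', []):
    print(advertiser['advertiserId'], advertiser['displayName'])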
If I want to create/read/update spreadsheets using gspread, I know I first have to authenticate like so:
import gspread
gc = gspread.service_account()
Where I can also specify the filename to point to a service account json key, but is there a way I can tell gspread to use the default service account credentials without pointing to a json?
My use-case is that I want to run gspread in a vm (or cloud function) that already comes with an IAM role and I can't seem to figure out where to get the json file from. I also don't want to copy the json to the vm unnecessarily.
You can use google.auth to get the credentials of the default service account.
Then you can use gspread.authorize() with these credentials:
import google.auth
import gspread
credentials, project_id = google.auth.default(
scopes=[
'https://spreadsheets.google.com/feeds',
'https://www.googleapis.com/auth/drive'
]
)
gc = gspread.authorize(credentials)
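Once authorized, the client works the same as with a JSON key file. A small usage sketch (the spreadsheet key is a placeholder, and the sheet must be shared with the default service account's email address):
# Placeholder spreadsheet key - the sheet must be shared with the
# default service account's email address.
sh = gc.open_by_key('your-spreadsheet-key')
worksheet = sh.sheet1
print(worksheet.get_all_values())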
As per the use case, Cloud Functions has a runtime service account feature which allows you to use the default service account, or you can attach any service account to the Cloud Function.
Using the google.auth library you can do this without a JSON file.
import google.auth
credentials, project_id = google.auth.default()
The access token I'm getting with gcloud auth print-access-token is obviously a different access token than the one I can get with some basic Python code:
export GOOGLE_APPLICATION_CREDENTIALS=/the-credentials.json
from oauth2client.client import GoogleCredentials
credentials = GoogleCredentials.get_application_default()
credentials.get_access_token()
What i am trying to do is get a token that would work with:
curl -u _token:<mytoken> https://eu.gcr.io/v2/my-project/my-docker-image/tags/list
I'd prefer not to install the gcloud utility as a dependency for my app, hence my attempts to obtain the access token programmatically via OAuth Google credentials.
I know this is a very old question, but I was just faced with the exact same problem of needing an ACCESS_TOKEN in Python and not being able to generate it, and managed to make it work.
What you need to do is use the variable credentials.token, except it won't exist right after you create the credentials object; it returns None. In order to generate a token, the credentials must first be used by a Google Cloud library, which in my case was done by using the googleapiclient.discovery.build method:
sqladmin = googleapiclient.discovery.build('sqladmin', 'v1beta4', credentials=credentials)
response = sqladmin.instances().get(project=PROJECT_ID, instance=INSTANCE_ID).execute()
print(json.dumps(response))
After which the ACCESS_TOKEN could be properly generated using
access_token = credentials.token
I've also tested it using google.cloud.storage as a way to exercise the credentials, and it also worked, by just trying to access a bucket in GCS through the appropriate Python library:
from google.oauth2 import service_account
from google.cloud import storage
PROJECT_ID = 'your_project_id_here'
SCOPES = ['https://www.googleapis.com/auth/sqlservice.admin']
SERVICE_ACCOUNT_FILE = '/path/to/service.json'
credentials = service_account.Credentials.from_service_account_file(
SERVICE_ACCOUNT_FILE, scopes=SCOPES)
try:
    list(storage.Client(project=PROJECT_ID, credentials=credentials).bucket('random_bucket').list_blobs())
except:
    print("Failed because no bucket exists named 'random_bucket' in your project... but that doesn't matter, what matters is that the library tried to use the credentials and in doing so generated an access_token, which is what we're interested in right now")
access_token = credentials.token
print(access_token)
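As an alternative to forcing a token by making an API call, you can also refresh the credentials directly with google.auth's transport, which populates credentials.token without touching any other service. A minimal sketch, assuming the google-auth package and the same service account file as above:
from google.auth.transport.requests import Request
from google.oauth2 import service_account

SCOPES = ['https://www.googleapis.com/auth/cloud-platform']
credentials = service_account.Credentials.from_service_account_file(
    '/path/to/service.json', scopes=SCOPES)

# Explicitly refresh the credentials; this fetches and stores an access token.
credentials.refresh(Request())
print(credentials.token)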
So I think there are a few questions:
gcloud auth print-access-token vs GoogleCredentials.get_application_default()
gcloud doesn't set application default credentials by default anymore when performing a gcloud auth login, so the access_token you're getting from gcloud auth print-access-token is going to be the one corresponding to the user you used to log in.
As long as you follow the instructions to create ADC's for a service account, that account has the necessary permissions, and the environment from which you are executing the script has access to the ENV var and the adc.json file, you should be fine.
How to make curl work
The Docker Registry API specifies that a token exchange should happen, swapping your Basic auth (i.e. Authorization: Basic base64(_token:<gcloud_access_token>)) for a short-lived Bearer token. This process can be a bit involved, but is documented here under "How to authenticate" and "Requesting a Token". Replace auth.docker.io/token with eu.gcr.io/v2/token and service=registry.docker.io with service=eu.gcr.io, etc. Use curl -u oauth2accesstoken:<mytoken> here.
See also: How to list images and tags from the gcr.io Docker Registry using the HTTP API?
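To make the token exchange concrete, here is a sketch in Python. It assumes the third-party requests package; the project and image names are placeholders, and the access token is whatever you obtained above.
import requests

access_token = '<access token from gcloud or google-auth>'
scope = 'repository:my-project/my-docker-image:pull'

# Exchange the access token (Basic auth) for a short-lived Bearer token.
token_response = requests.get(
    'https://eu.gcr.io/v2/token',
    params={'service': 'eu.gcr.io', 'scope': scope},
    auth=('oauth2accesstoken', access_token))
bearer_token = token_response.json()['token']

# Use the Bearer token to call the registry API.
tags = requests.get(
    'https://eu.gcr.io/v2/my-project/my-docker-image/tags/list',
    headers={'Authorization': 'Bearer ' + bearer_token})
print(tags.json())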
Avoid the question entirely
We have a Python lib that might be relevant to your needs:
https://github.com/google/containerregistry
I'm starting to look at the Google Analytics Core Reporting API, which is now in version 3.
According to the documentation, I could use one of the client libraries listed in the link http://code.google.com/apis/analytics/docs/gdata/v3/gdataLibraries.html.
I'm using Python, so I was looking for an example of using the Core Reporting API in Python, but I could not find one using this library. None of the examples at http://code.google.com/p/google-api-python-client/wiki/SampleApps include an example of the Core Reporting API.
One other option seems to be using the library at http://code.google.com/p/gdata-python-client/, but I'm not sure this library is using the latest version of the Core Reporting API (v3.0).
I'm looking for a Python library (with documentation / examples) that is compliant with http://code.google.com/apis/analytics/docs/gdata/v3/reference.html
Thanks
I did not find any example or good documentation, but I was able to mix general OAuth2 authentication with the Java example and the Python library source code to find an answer. So, here it goes:
Authentication:
from oauth2client.file import Storage
from oauth2client.client import AccessTokenRefreshError
from oauth2client.client import OAuth2WebServerFlow
from oauth2client.tools import run
import httplib2
FLOW = OAuth2WebServerFlow(
client_id=CLIENT_ID,
client_secret=CLIENT_SECRET,
scope='https://www.googleapis.com/auth/analytics.readonly')
storage = Storage('file_name.dat')
credentials = storage.get()
if credentials is None or credentials.invalid:
    credentials = run(FLOW, storage)
http = credentials.authorize(httplib2.Http())
Connecting to the Core Reporting API (I'm not sure the verb "connect" is adequate)
from apiclient.discovery import build
service = build('analytics', 'v3', http=http)
Making a query:
query = service.data().ga().get(
    ids='ga:%d' % PROFILE_ID,
    start_date=START_DATE,
    end_date=END_DATE,
    metrics='ga:pageviews')
results = query.execute()
The full list of parameters to pass to the get method when creating the query can be found at http://api-python-client-doc.appspot.com/analytics/v3/data/ga.
The results come in a Python dict exactly as described in http://code.google.com/apis/analytics/docs/gdata/v3/reference.html#data_response
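Since the response is a plain dict, iterating over the report rows is straightforward; for example (assuming the query above returned at least one row):
# Print the column headers and the returned rows.
print([h['name'] for h in results.get('columnHeaders', [])])
for row in results.get('rows', []):
    print(row)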