I'm using Python with the Google Cloud Speech API. I followed all the steps in "How to use google speech recognition api in python?" on Ubuntu and on Windows as well, and when I try to run the simple script from here - "https://github.com/GoogleCloudPlatform/python-docs-samples/blob/master/speech/api/speech_rest.py"
I get the following error:
<HttpError 403 when requesting https://speech.googleapis.com/$discovery/rest?version=v1beta1 returned "Google Cloud Speech API has not been used in project google.com:cloudsdktool before or it is disabled. Enable it by visiting https://console.developers.google.com/apis/api/speech.googleapis.com/overview?project=google.com:cloudsdktool then retry. If you enabled this API recently, wait a few minutes for the action to propagate to our systems and retry.">
What is weird is that I don't have a project named "cloudsdktool".
I ran "gcloud init" and linked the JSON file that I got when I created the service account key with the "gcloud auth activate-service-account --key-file=jsonfile" command.
On Linux I also tried creating the Google credentials environment variable, and I still get the same message.
So I found two ways to fix that problem:
1 - if you are using the Google Cloud SDK and Cloud Speech is in beta, you need to run 'gcloud beta init' instead of 'gcloud init' and then provide the JSON file
2 - if you don't want to use the Cloud SDK from Google, you can pass the JSON file straight to the Python app
Here is the method for that:
from oauth2client.client import GoogleCredentials
GoogleCredentials.from_stream('path/to/your/json')
Then you just create a scope on the credentials and authorize (a minimal REST sketch is shown just below), or if you are using gRPC (streaming) you pass the credentials in the header, just like in the example.
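For the REST path, a minimal sketch of that step (assuming the cloud-platform scope used by the sample) could look like this:

from oauth2client.client import GoogleCredentials
from httplib2 import Http

SPEECH_SCOPE = 'https://www.googleapis.com/auth/cloud-platform'  # assumed scope
creds = GoogleCredentials.from_stream('path/to/your/json').create_scoped([SPEECH_SCOPE])
http_auth = creds.authorize(Http())  # pass http_auth to the discovery-built service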
Here is the changed script for gRPC:
from grpc.beta import implementations
from oauth2client.client import GoogleCredentials

SPEECH_SCOPE = 'https://www.googleapis.com/auth/cloud-platform'


def make_channel(host, port):
    """Creates an SSL channel with auth credentials from the service account JSON file."""
    # In order to make an https call, use an ssl channel with defaults
    ssl_channel = implementations.ssl_channel_credentials(None, None, None)

    # Grab credentials from the service account JSON file
    creds = GoogleCredentials.from_stream('path/to/your/json').create_scoped([SPEECH_SCOPE])

    # Add a plugin to inject the creds into the header
    auth_header = (
        'Authorization',
        'Bearer ' + creds.get_access_token().access_token)
    auth_plugin = implementations.metadata_call_credentials(
        lambda _, cb: cb([auth_header], None),
        name='google_creds')

    # compose the two together for both ssl and google auth
    composite_channel = implementations.composite_channel_credentials(
        ssl_channel, auth_plugin)

    return implementations.secure_channel(host, port, composite_channel)
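To use the channel, pass it to the generated Speech stub from the sample (the module and stub names in the comments are the ones from the v1beta1 streaming sample and are assumptions here, so adjust them to your copy):

channel = make_channel('speech.googleapis.com', 443)
# assumed generated module/stub names from the v1beta1 streaming sample:
# from google.cloud.speech.v1beta1 import cloud_speech_pb2 as cloud_speech
# service = cloud_speech.beta_create_Speech_stub(channel)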
Related
I'm hosting a Flask web app on Cloud Run. I'm also using Secret Manager to store Service Account keys. (I previously downloaded a JSON file with the keys)
In my code, I'm accessing the payload and then using os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = payload to authenticate. When I deploy the app and try to visit the page, I get an Internal Server Error. Reviewing the logs, I see:
File "/usr/local/lib/python3.10/site-packages/google/auth/_default.py", line 121, in load_credentials_from_file
raise exceptions.DefaultCredentialsError(
google.auth.exceptions.DefaultCredentialsError: File {"
I can access the secret through gcloud just fine with: gcloud secrets versions access 1 --secret="<secret_id>" while acting as the Service Account.
Here is my Python code:
from google.cloud import secretmanager

# Grabbing keys from Secret Manager
def access_secret_version():
    # Create the Secret Manager client.
    client = secretmanager.SecretManagerServiceClient()

    # Build the resource name of the secret version.
    name = "projects/{project_id}/secrets/{secret_id}/versions/1"

    # Access the secret version.
    response = client.access_secret_version(request={"name": name})
    payload = response.payload.data.decode("UTF-8")

    return payload
@app.route('/page/page_two')
def some_random_func():
    # New way
    payload = access_secret_version()  # <---- calling the payload
    os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = payload

    # Old way
    # os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = "service-account-keys.json"
I'm not technically accessing a JSON file like I was before; the payload variable is storing the entire key. Is this why it's not working?
Your approach is incorrect.
When you run on a Google compute service like Cloud Run, the code runs under the identity of the compute service.
In this case, by default, Cloud Run uses the Compute Engine default service account, but it's good practice to create a service account for your service and specify it when you deploy to Cloud Run (see Service accounts).
This mechanism is one of the "legs" of Application Default Credentials. When your code is running on Google Cloud, you don't specify the environment variable (you also don't need to create a key): the Cloud Run service acquires the credentials from the metadata service:
import google.auth
credentials, project_id = google.auth.default()
See google.auth package
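For example, the Secret Manager client from your question picks up these credentials automatically on Cloud Run, so no key file or environment variable is needed (a minimal sketch):

from google.cloud import secretmanager

# On Cloud Run, the client library obtains credentials from the metadata service
# via Application Default Credentials; no key file or env var is required.
client = secretmanager.SecretManagerServiceClient()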
It is bad practice to define or set an environment variable within code. By their nature, environment variables should be provided by the environment. Doing this with GOOGLE_APPLICATION_CREDENTIALS means that your code always sets this value, when it should only do so when the code is running off Google Cloud.
For completeness, if you need to create credentials from a JSON string rather than from a file containing a JSON string, you can use from_service_account_info (see google.oauth2.service_account).
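A rough sketch, assuming payload holds the JSON key string you read from Secret Manager:

import json
from google.oauth2 import service_account

# payload is the JSON key string retrieved from Secret Manager
info = json.loads(payload)
credentials = service_account.Credentials.from_service_account_info(info)
# pass `credentials` to your client library constructor instead of setting
# GOOGLE_APPLICATION_CREDENTIALS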
I am currently using the following code to get the OAuth token:
import subprocess

command = 'gcloud auth print-access-token'
result = str(subprocess.Popen(command, universal_newlines=True, shell=True,
                              stdout=subprocess.PIPE,
                              stderr=subprocess.PIPE).communicate())
The result variable has the OAuth token. This technique uses my currently logged-in gcloud config.
However, I am looking for a way to get the OAuth token without using the command line.
I am using this OAuth token to make CDAP calls to get the Google Dataflow pipeline execution details.
I checked some Google blogs. This is the one I think I should try, but it asks to create a consent screen and requires a one-time activity to provide consent to the defined scopes, after which it should work:
Google Document
Shall I follow the steps in the above document, or is there any other way to get the OAuth token?
Is there a way to authenticate with a service account instead of a Google user account and get the OAuth token?
For an automated process, a service account is the recommended way. You can use the google-auth library for this. You can generate an access token like this:
import google.auth
# from google.oauth2 import service_account  # only needed for the key-file variant below

# With default credentials (your user account or the Google Cloud component's service account,
# or the service account key file defined in the GOOGLE_APPLICATION_CREDENTIALS env var
# -> for platforms outside GCP)
credentials, project_id = google.auth.default(scopes=["https://www.googleapis.com/auth/cloud-platform"])

# With a service account key file (not recommended)
# credentials = service_account.Credentials.from_service_account_file('service-account.json',
#     scopes=["https://www.googleapis.com/auth/cloud-platform"])

from google.auth.transport import requests
credentials.refresh(requests.Request())
print(credentials.token)
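You can then send this token as a Bearer header on your CDAP calls; the endpoint URL below is a placeholder for your instance:

import requests as http_requests  # aliased to avoid clashing with google.auth.transport.requests above

cdap_endpoint = 'https://<your-cdap-endpoint>'  # placeholder for your CDAP/Data Fusion instance
headers = {'Authorization': 'Bearer {}'.format(credentials.token)}
response = http_requests.get('{}/v3/namespaces'.format(cdap_endpoint), headers=headers)
print(response.status_code)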
However, if you want to call Google Cloud APIs, I recommend you use an authorized request object.
Here is an example of a BigQuery call. You can use a service account key file to generate your credentials, as in my previous example.
import google.auth
from google.auth.transport.requests import AuthorizedSession

base_url = 'https://bigquery.googleapis.com'
credentials, project_id = google.auth.default(scopes=['https://www.googleapis.com/auth/cloud-platform'])
project_id = 'MyProjectId'  # override the detected project if needed

authed_session = AuthorizedSession(credentials)
response = authed_session.request('GET', f'{base_url}/bigquery/v2/projects/{project_id}/jobs')
print(response.json())
EDIT
When you want to use Google APIs, a service account key file is not needed (and I recommend that you not use one), either on your computer or on a GCP component. Application Default Credentials are always sufficient.
When you are in your local environment, you must run the command gcloud auth application-default login. With this command, you register your personal account as the default credential when you run your app locally. (Of course, your user account email needs to be authorized on the component that you call.)
When you are in a GCP environment, each component has a default service account (or you can specify one when you configure your component). Thanks to this component "identity", you can use the default credentials. (Of course, the service account email needs to be authorized on the component that you call.)
ONLY when you run an app automatically and outside GCP do you need a service account key file (for example, in a CI/CD system other than Cloud Build, or in an app deployed on another cloud provider or on premises).
Why is a service account key file not recommended? It's at least my recommendation, because this file is..... a file!! That's the problem. You have a way to authenticate a service account in a simple file: you have to store it securely (it's a secret and an authentication method!!), you can copy it, you can send it by email, you can even commit it to a public Git repository... In addition, Google recommends rotating keys every 90 days, so they are a nightmare to rotate, trace, and manage.
I have a Flask app in the Google App Engine Standard Environment (Python), and I also have a Cloud Function with an HTTP trigger which accepts a JSON body including the URL of a file. The CF downloads the file at that URL and saves it to a GCS bucket. The GAE service account has the Cloud Function Invoker permission, yet when I use urlfetch.fetch() in my GAE code to trigger the CF, the App Engine code gets a 403 Forbidden error unless I make the CF trigger callable by anyone.
How do I successfully call/trigger CFs from GAE in Python? I assume the answer is one of these:
Set IAM permissions on GAE service account to {enlighten me here}
Add authentication headers in urlfetch.fetch() like so {different enlightenment}
Make CF triggerable from anywhere, but hard code some secret key so the CF code itself handles authentication.
It's well documented here: Cloud Functions Authentication
In short, you have to provide your service account's identity token in the Authorization header.
To get your credentials, use the Google Auth client library. If you are testing locally, you should create a service account JSON key and point the GOOGLE_APPLICATION_CREDENTIALS environment variable at it; on App Engine it will work out of the box.
After you have gotten your token, pass it as an auth header like so:
import requests
import google.auth.transport.requests
import google.oauth2.id_token

auth_req = google.auth.transport.requests.Request()
auth_token = google.oauth2.id_token.fetch_id_token(auth_req, cloud_function_url)
response = requests.post(cloud_function_url, json=payload, headers={"Authorization": f"Bearer {auth_token}"})
I'm trying to connect to my Google Cloud Endpoints API that is running as an Appengine app:
@endpoints.api(name='helloworldendpoints', allowed_client_ids=["1234", "12345"], version='v1', auth_level=endpoints.AUTH_LEVEL.REQUIRED)
class HelloWorldApi(remote.Service):
    ...
The API request is as follows:
from oauth2client.service_account import ServiceAccountCredentials
from httplib2 import Http
from apiclient.discovery import build

scopes = ["https://www.googleapis.com/auth/userinfo.email"]
credentials = ServiceAccountCredentials.from_json_keyfile_name("CloudEndpointsClient.json", scopes)
http_auth = credentials.authorize(Http())

api_root = 'https://myapp.appspot.com/_ah/api'
api = 'helloworldendpoints'
version = 'v1'
discovery_url = '%s/discovery/v1/apis/%s/%s/rest' % (api_root, api, version)

service = build(api, version, discoveryServiceUrl=discovery_url)
response = service.myFunction(myparameter="123456").execute(http=http_auth)
print(response)
The requests work well if I remove the authentication requirements.
I know that authentication works, since the error changes after authenticating.
The error message I'm getting is:
googleapiclient.errors.HttpError: https://my-app.appspot.com/_ah/api/helloworldendpoints/v1/obtainScoreFromEmail?myparameter=1234&alt=json returned "Access Not Configured. has not been used in project 123456789 before or it is disabled. Enable it by visiting https://console.developers.google.com/apis/api/helloworldendpoints/overview?project=123456789 then retry. If you enabled this API recently, wait a few minutes for the action to propagate to our systems and retry.">
I cannot enable the API in my Google Cloud Project, since the API does not exist.
What I found to work is a workaround. I used user authentication (instead of server authentication) on the same project for the same API, which worked (https://cloud.google.com/endpoints/docs/frameworks/python/access_from_python).
After I switched back to my initial server auth method, it started working.
To use the new Google Directory API we created an OAuth2 "service account" (see Using OAuth 2.0 for Server to Server Applications). This is basically a PKCS #12 file. All of our Directory API scripts work fine with this service account.
We also use the EmailSettings API (Developer's Guide Email Settings API) to manage some of our Google account settings. These scripts did not move to the new API format, so we continue to use the old OAuth1 authentication method. This has, until recently, worked fine. However, it appears that Google is no longer supporting OAuth1 authentication.
So, we need to move the EmailSettings scripts from OAuth1 to our OAuth2 service account. We are using the gdata Python libraries (GitHub google/gdata-python-client) to make calls to the EmailSettings API. This is how we currently authenticate to make EmailSettings API calls:
import gdata.apps.emailsettings.service
import gdata.auth

# self is an EmailSettingsService object (gdata.apps.emailsettings.service)
self.domain = "mydomain.com"
self.source = "my application name"

token = ...  # get OAuth1 token string from a file

self.SetOAuthInputParameters(
    gdata.auth.OAuthSignatureMethod.HMAC_SHA1,
    consumer_key=token.oauth_input_params._consumer.key,
    consumer_secret=token.oauth_input_params._consumer.secret
)
token.oauth_input_params = self._oauth_input_params
self.SetOAuthToken(token)
Using these Python gdata libraries, how do I authenticate using our OAuth2 service account, i.e., the PKCS #12 file, to use the EmailSettings API?
There's another SO question which shows you how to do this but uses the slightly newer gdata.apps.emailsettings.client methods.
If you must stay with gdata.apps.emailsettings.service then you can "hack" OAuth 2.0 onto the object with something like:
1 - Build your OAuth 2.0 credentials object as you are already doing for your Admin SDK Directory API calls.
2 - Build your GData client object as you are already doing in the first few lines of your code.
3 - Grab the access token from your credentials object and apply it as a header to your client object:
client.additional_headers = {
    'Authorization': u'Bearer %s' % credentials.access_token}
If you get a 401 response (access token expired), repeat 1 and 3 to get and apply a new access token.
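A rough sketch of that refresh-and-reapply step, assuming an oauth2client credentials object (as built for your Directory API calls) and an httplib2.Http instance:

import httplib2

# Force a refresh of the OAuth 2.0 access token, then re-apply it as a header
credentials.refresh(httplib2.Http())
client.additional_headers = {
    'Authorization': u'Bearer %s' % credentials.access_token}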