I am trying to import the google-cloud and BigQuery libraries and I am running into a default credentials error. I have attempted to set the credentials by downloading the JSON key file from the Cloud Console and specifying the path to the file.
## Google Big Query
%reload_ext google.cloud.bigquery
from google.cloud import bigquery
bqclient = bigquery.Client(project = "dat-exp")
os.environ.setdefault("GCLOUD_PROJECT", "dat-exp")
os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = "path/xxxxx.json"
---------------------------------------------------------------------------
DefaultCredentialsError Traceback (most recent call last)
/tmp/ipykernel_2944/2163850103.py in <cell line: 81>()
79 get_ipython().run_line_magic('reload_ext', 'google.cloud.bigquery')
80 from google.cloud import bigquery
---> 81 bqclient = bigquery.Client(project = "dat-exp")
82 os.environ.setdefault("GCLOUD_PROJECT", "dat-exp")
DefaultCredentialsError: Could not automatically determine credentials. Please set GOOGLE_APPLICATION_CREDENTIALS or explicitly create credentials and re-run the application. For more information, please see https://cloud.google.com/docs/authentication/getting-started
Application Default Credentials (ADC) is a strategy for locating credentials: the client libraries check the GOOGLE_APPLICATION_CREDENTIALS environment variable first, then the gcloud user credentials, and finally the compute metadata service.
Your code sets the environment variables after the client has already tried to locate credentials, which means the lookup failed before your credentials were in place. A quick solution is to move the bigquery.Client(...) line after the os.environ(...) lines:
os.environ.setdefault("GCLOUD_PROJECT", "dat-exp")
os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = "path/xxxxx.json"
bqclient = bigquery.Client(project = "dat-exp")
I do not recommend the method you are using (modifying the environment from inside the program). Either set the environment variables before the program starts, or pass the credentials explicitly when creating the client with bigquery.Client():
from google.cloud import bigquery
from google.oauth2 import service_account
key_path = "path/to/service_account.json"
credentials = service_account.Credentials.from_service_account_file(
    key_path, scopes=["https://www.googleapis.com/auth/cloud-platform"])
client = bigquery.Client(credentials=credentials, project='dat-exp')
Provide credentials for Application Default Credentials
However, the correct method of specifying credentials depends on where you are deploying your code. For example, applications can fetch credentials from the compute metadata service when deployed in Google Cloud.
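For instance, when the code is deployed on Google Cloud you can let ADC pick up credentials from the metadata service and drop the key file entirely. A minimal sketch, reusing the dat-exp project from the question (the cloud-platform scope and the project fallback are assumptions):
import google.auth
from google.cloud import bigquery

# ADC lookup order: GOOGLE_APPLICATION_CREDENTIALS, gcloud user credentials,
# then the compute metadata service when running on Google Cloud.
credentials, project = google.auth.default(
    scopes=["https://www.googleapis.com/auth/cloud-platform"])
bqclient = bigquery.Client(credentials=credentials, project=project or "dat-exp")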
Related
I am trying to run SQL queries against Google BigQuery from a Jupyter notebook.
I do everything as written here https://cloud.google.com/bigquery/docs/bigquery-storage-python-pandas#download_query_results_using_the_client_library.
I created a service account and downloaded the JSON key file.
Now I am trying to run this script:
from google.cloud import bigquery
bqclient = bigquery.Client('c://folder/client_account.json')
# Download query results.
query_string = """
SELECT * from `project.dataset.table`
"""
dataframe = (
    bqclient.query(query_string)
    .result()
    .to_dataframe(
        # Optionally, explicitly request to use the BigQuery Storage API. As of
        # google-cloud-bigquery version 1.26.0 and above, the BigQuery Storage
        # API is used by default.
        create_bqstorage_client=True,
    )
)
print(dataframe.head())
But I keep getting an error:
DefaultCredentialsError: Could not automatically determine credentials. Please set GOOGLE_APPLICATION_CREDENTIALS or explicitly create credentials and re-run the application. For more information, please see https://cloud.google.com/docs/authentication/getting-started
I do not understand what I am doing wrong, because the JSON file looks fine and the path to the file is correct.
The error means the client library could not find Application Default Credentials. Note that the first positional argument of bigquery.Client() is the project, not a key file path, so passing 'c://folder/client_account.json' there does not configure credentials and the library still falls back to ADC.
To authenticate with a service account key file, use the approach below:
from google.cloud import bigquery
from google.oauth2 import service_account
# TODO(developer): Set key_path to the path to the service account key
# file.
key_path = "path/to/service_account.json"
credentials = service_account.Credentials.from_service_account_file(
    key_path, scopes=["https://www.googleapis.com/auth/cloud-platform"],
)
bqclient = bigquery.Client(credentials=credentials, project=credentials.project_id,)
query_string = """
SELECT * from `project.dataset.table`
"""
dataframe = (
    bqclient.query(query_string)
    .result()
    .to_dataframe(
        # Optionally, explicitly request to use the BigQuery Storage API. As of
        # google-cloud-bigquery version 1.26.0 and above, the BigQuery Storage
        # API is used by default.
        create_bqstorage_client=True,
    )
)
print(dataframe.head())
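If you would rather keep pointing the client at the key file directly, as the original snippet attempted, the client class also exposes a constructor for that. A short sketch, assuming the same key file path from the question:
from google.cloud import bigquery

# Builds the client straight from the key file; the project is read from the key itself.
bqclient = bigquery.Client.from_service_account_json("c://folder/client_account.json")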
I have the following Python 3 script:
import os, json
import googleapiclient.discovery
from google.oauth2 import service_account
from google.cloud import storage
storage_client = storage.Client.from_service_account_json('gcp-sa.json')
buckets = list(storage_client.list_buckets())
print(buckets)
compute = googleapiclient.discovery.build('compute', 'v1')
def list_instances(compute, project, zone):
    result = compute.instances().list(project=project, zone=zone).execute()
    return result['items'] if 'items' in result else None

list_instances(compute, "my-project", "my-zone")
Listing only the buckets (without the rest) works fine, which tells me that my service account (which has read access to the whole project) should work. How can I now list VMs? Using the code above, I get
raise exceptions.DefaultCredentialsError(_HELP_MESSAGE)
google.auth.exceptions.DefaultCredentialsError: Could not automatically determine credentials. Please set GOOGLE_APPLICATION_CREDENTIALS or explicitly create credentials and re-run the application. For more information, please see https://cloud.google.com/docs/authentication/getting-started
So that tells me that I somehow have to pass the service account JSON to the compute client as well. How can I do that?
Thanks!!
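One way to do this is to build explicit credentials from the key file and pass them to discovery.build(). A sketch, reusing the gcp-sa.json key from the question (the cloud-platform scope is an assumption):
from google.oauth2 import service_account
import googleapiclient.discovery

# Load the same key file used for the storage client and attach it to the compute client.
credentials = service_account.Credentials.from_service_account_file(
    'gcp-sa.json',
    scopes=['https://www.googleapis.com/auth/cloud-platform'])
compute = googleapiclient.discovery.build('compute', 'v1', credentials=credentials)
The list_instances(compute, ...) call from the question should then authenticate with that service account.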
My scenario: I need to call the Google DBM API using google-api-python-client from a Cloud Function using a service account.
Update
Here is the sample code I am trying to run:
import google.auth
from googleapiclient.discovery import build
from google.auth.transport import requests
API_NAME = 'doubleclickbidmanager'
API_VERSION = 'v1.1'
SCOPES = ['https://www.googleapis.com/auth/doubleclickbidmanager']
credentials, project_id = google.auth.default(scopes=SCOPES)
print(credentials.service_account_email)
# prints "default"
credentials.refresh(requests.Request())
print(credentials.service_account_email)
# prints "[service-account]#[project].iam.gserviceaccount.com"
service = build(API_NAME, API_VERSION, credentials=credentials)
service.queries().listqueries().execute()
# local: return the queries
# in CF: Encountered 403 Forbidden with reason "insufficientPermissions"
When I run locally with the environment variable GOOGLE_APPLICATION_CREDENTIALS=keyfile.json set, it works fine.
But when running in the Cloud Function I get the 403 error.
The keyfile.json uses the same service account [service-account]@[project].iam.gserviceaccount.com that is set on the Cloud Function.
You have to force the library to generate a token so that it fills in all of its fields. To do this, simply call refresh(), like this:
import google.auth
from google.auth.transport import requests

# SCOPES, API_NAME, API_VERSION and build() come from the snippet in the question.
credentials, project_id = google.auth.default(scopes=SCOPES)
credentials.refresh(requests.Request())
print(credentials.service_account_email)
build(API_NAME, API_VERSION, credentials=credentials)
OK, this is my first question on the site, so I am going to try to be clear.
I am trying to build a speech recognition application on a Raspberry Pi with Python and the Cloud Speech-to-Text API. While trying to set the credentials for the application by defining a variable in the terminal (following the steps shown here: https://cloud.google.com/speech-to-text/docs/reference/libraries#client-libraries-usage-python), I get the following error:
Traceback (most recent call last):
  File "/home/pi/Documents/pythonPrograms/GoogleSpeech.py", line 15, in <module>
    client = speech.SpeechClient()
  File "/usr/local/lib/python3.5/dist-packages/google/cloud/speech_v1/gapic/speech_client.py", line 137, in __init__
    credentials=credentials,
  File "/usr/local/lib/python3.5/dist-packages/google/cloud/speech_v1/gapic/transports/speech_grpc_transport.py", line 63, in __init__
    credentials=credentials,
  File "/usr/local/lib/python3.5/dist-packages/google/cloud/speech_v1/gapic/transports/speech_grpc_transport.py", line 98, in create_channel
    scopes=cls._OAUTH_SCOPES,
  File "/usr/local/lib/python3.5/dist-packages/google/api_core/grpc_helpers.py", line 177, in create_channel
    credentials, _ = google.auth.default(scopes=scopes)
  File "/usr/local/lib/python3.5/dist-packages/google/auth/_default.py", line 306, in default
    raise exceptions.DefaultCredentialsError(_HELP_MESSAGE)
google.auth.exceptions.DefaultCredentialsError: Could not automatically determine credentials. Please set GOOGLE_APPLICATION_CREDENTIALS or explicitly create credentials and re-run the application. For more information, please see https://cloud.google.com/docs/authentication/getting-started
As that didn't work, I tried to set the credentials manually inside the code. The problem is, I keep getting the same error (probably because I'm not doing it right). Here is my code right now:
import io
import os
# Imports the Google Cloud client library
from google.oauth2 import service_account
credentials = service_account.Credentials.from_service_account_file("/root/Downloads/key.json")
scoped_credentials = credentials.with_scopes(["https://www.googleapis.com/auth/cloud-platform"])
from google.cloud import speech
from google.cloud.speech import enums
from google.cloud.speech import types
# Instantiates a client
client = speech.SpeechClient()
# The name of the audio file to transcribe
file_name = os.path.join(
    os.path.dirname(__file__),
    'resources',
    'audio.raw')
# Loads the audio into memory
with io.open(file_name, 'rb') as audio_file:
    content = audio_file.read()
audio = types.RecognitionAudio(content=content)
config = types.RecognitionConfig(
    encoding=enums.RecognitionConfig.AudioEncoding.LINEAR16,
    sample_rate_hertz=16000,
    language_code='en-US')
# Detects speech in the audio file
response = client.recognize(config, audio)
for result in response.results:
    print('Transcript: {}'.format(result.alternatives[0].transcript))
While searching for a solution I tried eliminating the part of the code that says:
from google.cloud import speech
from google.cloud.speech import enums
from google.cloud.speech import types
After which I get:
Traceback (most recent call last):
  File "/home/pi/Documents/pythonPrograms/GoogleSpeech.py", line 12, in <module>
    client = speech.SpeechClient()
NameError: name 'speech' is not defined
Thus, I suppose the problem is in the way I imported it, and not in the credentials themselves. It is important to add that I have enabled the API in my Google Cloud project.
Any help would be really appreciated.
This post is over 2 years old, but I bumped into the same problem (I wanted to explicitly set the credentials in the code for testing).
The only thing you need to do is pass the credentials as the credentials argument to SpeechClient, as follows:
import io
import os
# Imports the Google Cloud client library
from google.oauth2 import service_account
credentials = service_account.Credentials.from_service_account_file("/path/to/key.json")
scoped_credentials = credentials.with_scopes(["https://www.googleapis.com/auth/cloud-platform"])
from google.cloud import speech
# Instantiates a client
client = speech.SpeechClient(credentials=credentials)
Alternatively, you could also use the from_service_account_file method directly on SpeechClient:
from google.cloud import speech
# Instantiates a client
client = speech.SpeechClient.from_service_account_file("/path/to/key.json")
If your credentials are set up correctly, either option should work without a problem.
The error message clearly indicates that the environment variable GOOGLE_APPLICATION_CREDENTIALS is not set.
The README file for the reference code mentions that you have to set up authentication. Basically, you need to create a service account, give it the necessary permissions (i.e. assign a role, which depends on what you want to do with the account), download the credentials JSON file, and set the GOOGLE_APPLICATION_CREDENTIALS environment variable to point to that JSON file. The actual steps vary depending on your OS; the whole process is documented here.
I used the os module to solve the Google credentials issue: set the GOOGLE_APPLICATION_CREDENTIALS environment variable in the Python code as follows.
import os
os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = "path of your JSON credential file"
This should resolve the issue, as long as the variable is set before the client is created.
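For completeness, a minimal sketch of the ordering this relies on, assuming the Speech client from this question and a hypothetical key path:
import os
from google.cloud import speech

# The variable must be set before the client is instantiated,
# otherwise ADC resolution has already run and failed.
os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = "/path/to/key.json"
client = speech.SpeechClient()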
I found these docs: https://firebase.google.com/docs/storage/gcp-integration#apis
Here is the code from those docs:
# Import gcloud
from google.cloud import storage
# Enable Storage
client = storage.Client()
# Reference an existing bucket.
bucket = client.get_bucket('my-existing-bucket')
# Upload a local file to a new file to be created in your bucket.
zebraBlob = bucket.get_blob('zebra.jpg')
zebraBlob.upload_from_filename(filename='/photos/zoo/zebra.jpg')
# Download a file from your bucket.
giraffeBlob = bucket.get_blob('giraffe.jpg')
giraffeBlob.download_as_string()
On the line client = storage.Client() it said:
Could not automatically determine credentials. Please set GOOGLE_APPLICATION_CREDENTIALS or explicitly create credential and re-run the application
As a next step I put:
from oauth2client.client import GoogleCredentials
GOOGLE_APPLICATION_CREDENTIALS = 'credentials.json'
credentials = GoogleCredentials.get_application_default()
Which said:
The Application Default Credentials are not available. They are available if running in Google Compute Engine
And my final question is how to authenticate in Google Compute Engine.
The problem is solved.
1.) Install the Google Cloud SDK: https://cloud.google.com/sdk/
2.) Log in to the Cloud SDK with your Google account.
3.) Choose your Firebase project.
4.) Run gcloud auth application-default login in the console.
5.) You can see the credentials here:
C:\Users\storks\AppData\Roaming\gcloud\application_default_credentials.json
6.) For more info, see "How the Application Default Credentials work":
https://developers.google.com/identity/protocols/application-default-credentials
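Alternatively, if you prefer not to rely on gcloud, the storage client can be pointed directly at a service account key file, the same way the earlier script does. A sketch, assuming a credentials.json key downloaded for the project:
from google.cloud import storage

# Builds the client from an explicit key file instead of Application Default Credentials.
client = storage.Client.from_service_account_json('credentials.json')
bucket = client.get_bucket('my-existing-bucket')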