I'm trying to execute a SQL query on a BigQuery table, but I keep getting a DefaultCredentialsError when trying to instantiate a BigQuery client object. For example, by doing this:
from google.cloud import bigquery
client = bigquery.Client.from_service_account_json('service_account_key.json')
Or by doing this:
from oauth2client.service_account import ServiceAccountCredentials
key = open('service_account_key.json', 'rb').read()
credentials = ServiceAccountCredentials(
    'my_email',
    key,
    scope='https://www.googleapis.com/auth/bigquery')
client = bigquery.Client(credentials=credentials)
Could there be a problem with my .json credentials file? I created a service account key:
Any other suggestions?
You are most likely hitting a bug with the from_service_account_json method.
Instead, try the recommended way of authenticating: export the GOOGLE_APPLICATION_CREDENTIALS environment variable as described here.
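If you prefer to stay in Python, a minimal sketch that sets the variable before the client is created (the key path is the one from the question):
import os
from google.cloud import bigquery

# Point Application Default Credentials at the key file before any
# client is created; the path is the one from the question.
os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = "service_account_key.json"

client = bigquery.Client()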
I am trying to get data from a Google BigQuery table using Python. I don't have service account access, but I have individual access to BigQuery using gcloud, and I have an application default credentials JSON file. I need to know how to make a connection to BigQuery using ADC.
code snippet:
from google.cloud import bigquery
conn = bigquery.Client()
query = "select * from my_data.test1"
conn.query(query)
When I run the above code snippet, I get an error saying:
NewConnectionError: <urllib3.connection.HttpsConnection object at 0x83dh46bdu640>: Failed to establish a new connection:[Error -2] Name or Service not known
Note: the GOOGLE_APPLICATION_CREDENTIALS environment variable is not set (it is empty).
Your script works for me because I authenticated using end-user credentials from the Google Cloud SDK. Once you have the SDK installed, you can simply run:
gcloud auth application-default login
The credentials from your JSON file are not being passed to the BigQuery client, e.g.:
client = bigquery.Client(project=project, credentials=credentials)
To set that up, you can follow these steps: https://cloud.google.com/bigquery/docs/authentication/end-user-installed
Alternatively, this thread has some good details on setting the credentials environment variable: Setting GOOGLE_APPLICATION_CREDENTIALS for BigQuery Python CLI
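If you would rather load the ADC file explicitly than rely on the environment, here is a sketch; the file path and fallback project ID are placeholders, and google.auth.load_credentials_from_file accepts both service-account and authorized-user JSON files:
import google.auth
from google.cloud import bigquery

# Placeholder path; for authorized-user files the returned project_id
# may be None, so fall back to an explicit project.
credentials, project_id = google.auth.load_credentials_from_file(
    'application_default_credentials.json'
)
client = bigquery.Client(project=project_id or 'my-project',
                         credentials=credentials)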
import pandas as pd
import firebase_admin
from firebase_admin import credentials, firestore
cred = credentials.Certificate("crt")
firebase_admin.initialize_app(cred, {
    'databaseURL': 'url'
})
db = firestore.client()
actor_ref = db.collection('actors')
# Import data
df = pd.read_csv("./hw1_datasets/actor.csv")
tmp = df.to_dict(orient='records')
list(map(lambda x: actor_ref.add(x), tmp))
I'm running this script to import a CSV into my Firebase Realtime Database, and it keeps saying the project does not exist or does not contain an active Cloud Datastore or Cloud Firestore database. I created a Firebase Realtime Database using the same Google account, so I'm not sure why it says there is no database. Does Google Cloud not support Firebase's Realtime Database? Any help would be strongly appreciated.
Your question seems to be mixing up Firebase Realtime Database and Firestore. While you mention that you are using the Firebase Realtime Database, your Python script imports data into Firestore. Please note that Firestore and the Firebase Realtime Database are two different databases.
The error message you are getting suggests that the project doesn't have a Firestore database. To resolve the error, go to the Firebase Console and create a database in Firestore. After creating the Firestore database, add a collection named 'actors'. You can follow the steps mentioned here to create a Firestore database in the Firebase Console.
If you want to use the Firebase Realtime Database, you have to initialize it differently in the Python script. You may use the following as a reference for how to initialize the Firebase Realtime Database; a consolidated sketch follows at the end of these steps.
First, import 'db' from firebase_admin as follows:
from firebase_admin import db
Then create a credentials object from the serviceAccountKey.json file, which you can generate on the Project Overview > Project Settings page in the Firebase console:
cred = credentials.Certificate('path/to/serviceAccountKey.json')
Next, initialize the database as follows:
firebase_admin.initialize_app(cred, {
    'databaseURL': 'https://databaseName.firebaseio.com'
})
Now, to access the Firebase Realtime Database, create a reference; note that the argument is a path to a node within the database, not the database name:
ref = db.reference('actors')
More details on how to initialize the Firebase Realtime Database are here.
To read and save data to the Firebase Realtime Database you may refer to this document and this document respectively.
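Putting those pieces together, a minimal sketch for the Realtime Database; the key path, database URL, and node name are placeholders:
import firebase_admin
from firebase_admin import credentials, db

# Placeholder key path and database URL.
cred = credentials.Certificate('path/to/serviceAccountKey.json')
firebase_admin.initialize_app(cred, {
    'databaseURL': 'https://databaseName.firebaseio.com'
})

ref = db.reference('actors')         # reference a node by path
ref.push({'first_name': 'example'})  # save data under the node
print(ref.get())                     # read it back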
The problem is in:
firebase_admin.initialize_app(cred, {
    'databaseURL': 'url'
})
The databaseURL is a reference to the Realtime Database, and it seems that is not enough for your code to find the project's Firestore database. As shown in the documentation on setting up Python access to Firestore, you will (also) need to pass the project ID before you can access Firestore:
firebase_admin.initialize_app(cred, {
    'projectId': project_id,
})
db = firestore.client()
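For completeness, a sketch of the full initialization; the key path and project ID are placeholders:
import firebase_admin
from firebase_admin import credentials, firestore

# Placeholder key path and project ID.
cred = credentials.Certificate('path/to/serviceAccountKey.json')
firebase_admin.initialize_app(cred, {
    'projectId': 'my-project-id',
})

db = firestore.client()
actor_ref = db.collection('actors')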
If I want to create/read/update spreadsheets using gspread, I know I first have to authenticate like so:
import gspread
gc = gspread.service_account()
Here I can also specify a filename pointing to a service account JSON key, but is there a way I can tell gspread to use the default service account credentials without pointing to a JSON file?
My use case is that I want to run gspread in a VM (or Cloud Function) that already comes with an IAM role, and I can't seem to figure out where to get the JSON file from. I also don't want to copy the JSON to the VM unnecessarily.
You can use google.auth to get the credentials of the default service account.
Then you can use gspread.authorize() with these credentials:
import google.auth
import gspread
credentials, project_id = google.auth.default(
    scopes=[
        'https://spreadsheets.google.com/feeds',
        'https://www.googleapis.com/auth/drive'
    ]
)
gc = gspread.authorize(credentials)
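With the authorized client you can then work with spreadsheets as usual, for example (the spreadsheet title is a placeholder):
sh = gc.open('my-spreadsheet')  # placeholder spreadsheet title
print(sh.sheet1.get_all_records())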
As per the use case, Cloud Functions has a runtime service account feature which allows you to use the default service account or attach any service account to the Cloud Function.
Using this library you can do it without a JSON file:
import google.auth
credentials, project_id = google.auth.default()
I have the following Python 3 script:
import os, json
import googleapiclient.discovery
from google.oauth2 import service_account
from google.cloud import storage
storage_client = storage.Client.from_service_account_json('gcp-sa.json')
buckets = list(storage_client.list_buckets())
print(buckets)
compute = googleapiclient.discovery.build('compute', 'v1')
def list_instances(compute, project, zone):
    result = compute.instances().list(project=project, zone=zone).execute()
    return result['items'] if 'items' in result else None
list_instances(compute, "my-project", "my-zone")
Listing only the buckets, without the rest, works fine; that tells me that my service account (which has read access to the whole project) should work. How can I now list VMs? Using the code above, I get
raise exceptions.DefaultCredentialsError(_HELP_MESSAGE)
google.auth.exceptions.DefaultCredentialsError: Could not automatically determine credentials. Please set GOOGLE_APPLICATION_CREDENTIALS or explicitly create credentials and re-run the application. For more information, please see https://cloud.google.com/docs/authentication/getting-started
So that tells me that I somehow have to pass the service account JSON. How is that possible?
Thanks!!
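One approach that should work, as a sketch: googleapiclient.discovery.build accepts a credentials argument, so you can load the same key file explicitly instead of relying on ADC (the file name is taken from the question):
import googleapiclient.discovery
from google.oauth2 import service_account

# Build the compute client with explicit credentials instead of ADC;
# 'gcp-sa.json' is the key file from the question.
credentials = service_account.Credentials.from_service_account_file('gcp-sa.json')
compute = googleapiclient.discovery.build('compute', 'v1', credentials=credentials)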
I am trying to get authentication to use the Google Translation API. Currently, on my local machine, I simply do this:
from google.cloud import translate_v2 as translate
os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = path_to_cred_json_file
translate_client = translate.Client()
which works fine. However, I wish to do this on AWS, where I have stored the credential JSON file in AWS secrets. In the documentation for translate.Client I see this:
Init signature:
translate.Client(
target_language='en',
credentials=None,
...
)
...
:type credentials: :class:`~google.auth.credentials.Credentials`
However, if I read in the JSON file and try to pass it in as the credentials argument, it throws an error.
The only answer I have for now on AWS is to read the secret, write it out as a JSON file, and then set os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = path_to_cred_json_file, which will work, but I was told by a data engineer that this is a security risk.
So the question is: how do I get this google.auth.credentials.Credentials object without reading a physical file? I have access to the plain-text version of the JSON file in memory (via AWS secrets). I'm really new to AWS in general, so go easy on me.
Thanks to @miles-budnek and this GitHub comment, I found the answer.
Supposing I have the JSON string as a dictionary called secret:
from google.cloud import translate_v2 as translate
from google.oauth2 import service_account
credentials = service_account.Credentials.from_service_account_info(secret)
t_client = translate.Client(credentials=credentials)
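For completeness, a sketch of the full flow that pulls the secret from AWS Secrets Manager first; the secret name is a placeholder, and this assumes the secret value is the raw service-account JSON string:
import json

import boto3
from google.cloud import translate_v2 as translate
from google.oauth2 import service_account

# Placeholder secret name; the secret value is assumed to be the
# service-account JSON.
sm = boto3.client('secretsmanager')
secret = json.loads(sm.get_secret_value(SecretId='gcp/translate-sa')['SecretString'])

credentials = service_account.Credentials.from_service_account_info(secret)
t_client = translate.Client(credentials=credentials)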