I found this documentation: https://firebase.google.com/docs/storage/gcp-integration#apis
Here is the code from those docs:
# Import gcloud
from google.cloud import storage
# Enable Storage
client = storage.Client()
# Reference an existing bucket.
bucket = client.get_bucket('my-existing-bucket')
# Upload a local file to a new blob to be created in your bucket.
# (bucket.blob() creates a reference to a new object; get_blob() would
# return None for an object that does not exist yet.)
zebraBlob = bucket.blob('zebra.jpg')
zebraBlob.upload_from_filename(filename='/photos/zoo/zebra.jpg')
# Download a file from your bucket.
giraffeBlob = bucket.get_blob('giraffe.jpg')
giraffeBlob.download_as_string()
On the line client = storage.Client(), it fails with:
Could not automatically determine credentials. Please set GOOGLE_APPLICATION_CREDENTIALS or explicitly create credential and re-run the application
As the next step, I tried:
from oauth2client.client import GoogleCredentials
GOOGLE_APPLICATION_CREDENTIALS = 'credentials.json'
credentials = GoogleCredentials.get_application_default()
This fails with:
The Application Default Credentials are not available. They are available if running in Google Compute Engine
And my final question is: how do I authenticate in Google Compute Engine?
The problem is solved:
1. Install the Google Cloud SDK: https://cloud.google.com/sdk/
2. Log in to the Cloud SDK with your Google account (gcloud auth login).
3. Choose your Firebase project (gcloud config set project <project-id>).
4. Run gcloud auth application-default login in the console.
5. You can see the generated credentials here:
C:\Users\storks\AppData\Roaming\gcloud\application_default_credentials.json
6. For more info, see "How the Application Default Credentials work":
https://developers.google.com/identity/protocols/application-default-credentials
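Once those steps are done, storage.Client() picks up the Application Default Credentials automatically. A minimal check, assuming a bucket you own (the bucket name here is a placeholder):
from google.cloud import storage
# No key file or environment variable needed once ADC is configured.
client = storage.Client()
bucket = client.get_bucket('my-existing-bucket')  # placeholder bucket name
print(bucket.name)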
I'm hosting a Flask web app on Cloud Run. I'm also using Secret Manager to store service account keys. (I previously downloaded a JSON file with the keys.)
In my code, I access the payload and then use os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = payload to authenticate. When I deploy the app and try to visit the page, I get an Internal Server Error. Reviewing the logs, I see:
File "/usr/local/lib/python3.10/site-packages/google/auth/_default.py", line 121, in load_credentials_from_file
raise exceptions.DefaultCredentialsError(
google.auth.exceptions.DefaultCredentialsError: File {"
I can access the secret through gcloud just fine with: gcloud secrets versions access 1 --secret="<secret_id>" while acting as the Service Account.
Here is my Python code:
import os
from flask import Flask
from google.cloud import secretmanager

app = Flask(__name__)

# Grabbing keys from Secret Manager
def access_secret_version():
    # Create the Secret Manager client.
    client = secretmanager.SecretManagerServiceClient()
    # Build the resource name of the secret version
    # (project_id and secret_id are defined elsewhere).
    name = f"projects/{project_id}/secrets/{secret_id}/versions/1"
    # Access the secret version.
    response = client.access_secret_version(request={"name": name})
    payload = response.payload.data.decode("UTF-8")
    return payload

@app.route('/page/page_two')
def some_random_func():
    # New way
    payload = access_secret_version()  # <---- calling the payload
    os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = payload
    # Old way
    # os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = "service-account-keys.json"
I'm not technically accessing a JSON file like I was before; the payload variable is storing the entire key. Is this why it's not working?
Your approach is incorrect.
When you run on a Google compute service like Cloud Run, the code runs under the identity of the compute service.
In this case, by default, Cloud Run uses the Compute Engine default service account, but it's good practice to create a service account for your service and specify it when you deploy to Cloud Run (see Service accounts).
This mechanism is one of the "legs" of Application Default Credentials. When your code is running on Google Cloud, you don't specify the environment variable (and you don't need to create a key); the Cloud Run service acquires the credentials from the metadata service:
import google.auth
# On Cloud Run this resolves to the service's identity via the metadata service.
credentials, project_id = google.auth.default()
See the google.auth package.
It is bad practice to define or set an environment variable within code. By their nature, environment variables should be provided by the environment. Doing this with GOOGLE_APPLICATION_CREDENTIALS means that your code always sets this value, when it should only be set when the code is running off Google Cloud.
For completeness, if you need to create credentials from a JSON string rather than from a file containing a JSON string, you can use from_service_account_info (see google.oauth2.service_account).
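A minimal sketch of that approach, reusing the access_secret_version() helper from the question (pass the resulting credentials explicitly to any client that needs them, rather than via the environment variable):
import json
from google.oauth2 import service_account
# The secret payload is the JSON key as a string; parse it into a dict first.
info = json.loads(access_secret_version())
credentials = service_account.Credentials.from_service_account_info(info)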
I am writing a script that will authenticate to the Gmail API, pull some emails, and transform some email data. I can get this working locally, since I have the service account file from which I create a credentials object that I then pass to the Gmail API. However, since this will be running in Google Cloud Platform (GCP), the credentials are stored in the environment.
I need to either modify the code not to reference credentials or retrieve a credentials object from the environment.
The code below works locally; however, where I create the credentials object, I either need to retrieve it from the environment or authenticate in a different way when creating the delegated_credentials and service objects.
from googleapiclient.discovery import build
from google.oauth2 import service_account
SCOPES = ['https://www.googleapis.com/auth/gmail.readonly']
SERVICE_ACCOUNT_FILE = 'secret.json'
credentials = service_account.Credentials.from_service_account_file(
    SERVICE_ACCOUNT_FILE, scopes=SCOPES)
delegated_credentials = credentials.with_subject('someone@gmail.com')
service = build('gmail', 'v1', credentials=delegated_credentials)
As an example, when using the Google Cloud Storage API, I create the client in GCP as follows:
storage_client = storage.Client()
I wrote a Python script to upload a file to Google Drive, but the script redirects to Chrome for user authentication.
Is there any way to avoid the redirect to Chrome for authentication?
I'm running Python 3.9.
Here is my sample code:
from pydrive.auth import GoogleAuth
from pydrive.drive import GoogleDrive
gauth = GoogleAuth()
drive = GoogleDrive(gauth)
upload_file_list = ['myfile.pdf']
for upload_file in upload_file_list:
    gfile = drive.CreateFile({'parents': [{'id': '1B8ttlQMRUkjbrscevfa1DablIayzObh2'}]})
    # Read the file and set it as the content of this instance.
    gfile.SetContentFile(upload_file)
    gfile.Upload()  # Upload the file.
The behaviour you are reporting is totally normal with OAuth 2.0 and the official Google APIs library.
What @Tanaike said is a good solution: you could use a service account to access Google Drive files without granting consent every time the token expires. With service accounts there are two options to achieve that:
Share the file/folder with the email address of the service account.
Use domain-wide delegation of authority to allow the service account to impersonate any user in your domain. Requires a domain using Google Workspace or Cloud Identity and Super Admin access to configure domain-wide delegation.
General information on how to make API calls with domain-wide delegation is available on this page https://developers.google.com/identity/protocols/oauth2/service-account#authorizingrequests.
Here is a working code sample:
from google.oauth2 import service_account
from googleapiclient.discovery import build
from googleapiclient.errors import HttpError
# Scopes required by this endpoint
# https://developers.google.com/drive/api/v3/reference/permissions/list
SCOPES = ["https://www.googleapis.com/auth/drive.readonly"]
# Variable that holds the file ID
DOCUMENT_ID = "i0321LSy8mmkx_Bw-XlDyzQ_b3Ny9m74u"
# Service account Credential file downloaded with domain-wide delegation of authority
# or with shared access to the file.
SERVICE_ACCOUNT_FILE = "serviceaccount.json"
# Creation of the credentials
credentials = service_account.Credentials.from_service_account_file(
    SERVICE_ACCOUNT_FILE,
    scopes=SCOPES)
# [Impersonation] The service account will take action on behalf of the user;
# this requires domain-wide delegation of authority.
delegated_credentials = credentials.with_subject('user@domain.com')
# The API call is attempted
try:
    service = build('drive', 'v3', credentials=delegated_credentials)
    # Retrieve the document's metadata from the Drive service.
    document = service.files().get(fileId=DOCUMENT_ID).execute()
    print('The title of the document is: {}'.format(document.get('name')))
except HttpError as err:
    print(err)
Keep in mind that to use user impersonation you will need to configure domain-wide delegation in the Admin console of the domain that owns the files (this will also work for external files shared with users in the domain).
If you want to use this with regular consumer accounts, you can't use user impersonation; instead, you share the file with the service account (read or write access) and then make the API calls. The credentials.with_subject(...) line in the sample creates the delegated credentials, and it needs to be removed if you use this other approach, as sketched below.
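For that consumer-account variant, a minimal sketch (assuming the file was shared with the service account's email address) simply drops the impersonation step:
from google.oauth2 import service_account
from googleapiclient.discovery import build
credentials = service_account.Credentials.from_service_account_file(
    "serviceaccount.json",
    scopes=["https://www.googleapis.com/auth/drive.readonly"])
# No with_subject() call: the service account acts as itself on files shared with it.
service = build('drive', 'v3', credentials=credentials)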
My login to the AWS console uses MFA, for which I am using Google Authenticator.
I have an S3 DEV bucket, and to access it I have to switch roles; after switching, I can access the DEV bucket.
I need help doing the same in Python with boto3.
There are many CSV files that I need to open in a dataframe, and without resolving that access I cannot proceed.
I tried configuring AWS credentials & config and using them in my Python code, but that didn't help.
The AWS documentation is not clear about how to switch roles in Python.
import boto3
import s3fs
import pandas as pd
access_key = 'XXXXXXXXXXX'
secret_key = 'XXXXXXXXXXXXXXXXX'
# bucketName = 'XXXXXXXXXXXXXXXXX'
s3 = boto3.resource('s3')
for bucket in s3.buckets.all():
    print(bucket.name)
The expected result is to access that bucket after switching roles in the Python code, along with MFA.
In general, it is bad for security to put credentials in your program code. It is better to store them in a configuration file. You can do this by using the AWS Command Line Interface (CLI) aws configure command.
Once the credentials are stored this way, any AWS SDK (eg boto3) will automatically retrieve the credentials without having to reference them in code.
See: Configuring the AWS CLI - AWS Command Line Interface
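For example, once a default profile exists the client needs no keys in code:
import boto3
# boto3 resolves credentials from ~/.aws/credentials automatically.
s3 = boto3.client('s3')
print([b['Name'] for b in s3.list_buckets()['Buckets']])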
There is an additional capability with the configuration file, that allows you to store a role that you wish to assume. This can be done by specifying a profile with the Role ARN:
# In ~/.aws/credentials:
[development]
aws_access_key_id=foo
aws_secret_access_key=bar

# In ~/.aws/config
[profile crossaccount]
role_arn=arn:aws:iam:...
source_profile=development
The source_profile points to the profile that contains credentials that will be used to make the AssumeRole() call, and role_arn specifies the Role to assume.
See: Assume Role Provider
Finally, you can tell boto3 to use that particular profile for credentials:
session = boto3.Session(profile_name='crossaccount')
# Any clients created from this session will use credentials
# from the [crossaccount] section of ~/.aws/credentials.
dev_s3_client = session.client('s3')
An alternative to all of the above (which boto3 otherwise does for you) is to call assume_role() yourself, then use the temporary credentials that are returned to define a new session for connecting to a service. However, the profile-based method above is a lot easier. Since your console login uses MFA, the manual call also lets you pass an MFA token, as sketched below.
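A minimal sketch of that manual route; the role ARN, MFA device serial, and token code are placeholders to substitute with your own values:
import boto3
sts = boto3.client('sts')
response = sts.assume_role(
    RoleArn='arn:aws:iam::123456789012:role/dev-role',     # placeholder role ARN
    RoleSessionName='dev-session',
    SerialNumber='arn:aws:iam::123456789012:mfa/my-user',  # placeholder MFA device
    TokenCode='123456')                                    # code from Google Authenticator
creds = response['Credentials']
session = boto3.Session(
    aws_access_key_id=creds['AccessKeyId'],
    aws_secret_access_key=creds['SecretAccessKey'],
    aws_session_token=creds['SessionToken'])
dev_s3_client = session.client('s3')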
I am trying to fetch G Suite alerts via the API. I have created a service account as per the docs and assigned it to my Google Cloud Function.
I do not want to use environment variables or upload credentials along with the source code; I want to leverage the default service account used by the function.
from googleapiclient.discovery import build
def get_credentials():
    # If one knows the credentials file location (when the JSON credentials file
    # is uploaded or specified in an environment variable), one can easily get
    # the credentials by specifying the path.
    # In Google Cloud Functions, at least, I couldn't find that path, as
    # GOOGLE_APPLICATION_CREDENTIALS is empty in the Python runtime.
    # The code below works fine if one uncomments the following line:
    # credentials = ServiceAccountCredentials.from_json_keyfile_name(key_file_location)
    credentials = <how to get a default credentials object for the default service account?>
    delegated_credentials = credentials.create_delegated('admin@alertcenter1.bigr.name').create_scoped(SCOPES)
    return delegated_credentials

def get_alerts(api_name, api_version, key_file_location=None):
    delegated_credentials = get_credentials()
    alertcli = build(api_name, api_version, credentials=delegated_credentials)
    resp = alertcli.alerts().list(pageToken=None).execute()
    print(resp)
Is there any way I can create a default credentials object? I have tried using from google.auth import credentials, but this does not contain a create_delegated function, and I have also tried ServiceAccountCredentials(), but this requires a signer.
Here is an example of using the Gmail API with delegated credentials. The service account credentials will need "Enable G Suite Domain-wide Delegation" enabled.
from google.oauth2 import service_account
from googleapiclient.discovery import build
credentials = service_account.Credentials.from_service_account_file(
    credentials_file,
    scopes=['https://www.googleapis.com/auth/gmail.send'])
impersonate = 'username@example.com'
credentials = credentials.with_subject(impersonate)
service = build('gmail', 'v1', credentials=credentials)
You can use the google.auth.default function to get the default credentials and use them to create an IAM signer, which in turn can be used to create new service account credentials with the delegated email address as the subject. I have a more detailed answer for a similar question.
There is also a Google Cloud Platform GitHub repository with some documentation about this method.
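For the Cloud Function in the question, a sketch of that signer-based approach could look like the following; the scope is an assumption, and the function's runtime service account needs permission to sign as itself (roles/iam.serviceAccountTokenCreator):
import google.auth
from google.auth import iam
from google.auth.transport import requests
from google.oauth2 import service_account

TOKEN_URI = 'https://accounts.google.com/o/oauth2/token'
SCOPES = ['https://www.googleapis.com/auth/apps.alerts']  # assumed Alert Center scope

# Default credentials of the function's runtime service account.
source_credentials, project_id = google.auth.default()
request = requests.Request()
# Refresh so the service account email is populated on the credentials.
source_credentials.refresh(request)
# Signer backed by the IAM Credentials API, so no key file is needed.
signer = iam.Signer(request, source_credentials,
                    source_credentials.service_account_email)
delegated_credentials = service_account.Credentials(
    signer,
    source_credentials.service_account_email,
    TOKEN_URI,
    scopes=SCOPES,
    subject='admin@alertcenter1.bigr.name')  # admin user to impersonate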