Check if a Firebase app is already initialized in Python

I get the following error:
ValueError: The default Firebase app already exists. This means you called initialize_app() more than once without providing an app name as the second argument. In most cases you only need to call initialize_app() once. But if you do want to initialize multiple apps, pass a second argument to initialize_app() to give each app a unique name.
How can I check whether the default Firebase app has already been initialized in Python?

The best approach is to control your app's workflow so that initialization is called only once. That said, idempotent code is also a good thing, so here is what you can do to avoid that error:
import firebase_admin
from firebase_admin import credentials
if not firebase_admin._apps:
    cred = credentials.Certificate('path/to/serviceAccountKey.json')
    default_app = firebase_admin.initialize_app(cred)

Initialize the app in the constructor
cred = credentials.Certificate('/path/to/serviceAccountKey.json')
firebase_admin.initialize_app(cred)
Then, in your method, retrieve it with
firebase_admin.get_app()
https://firebase.google.com/docs/reference/admin/python/firebase_admin
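A minimal sketch of that pattern, assuming a class-based setup and an illustrative key path (neither is from the original answer):
import firebase_admin
from firebase_admin import credentials, firestore


class FirebaseClient:
    def __init__(self):
        # Initialize the default app exactly once.
        if not firebase_admin._apps:
            cred = credentials.Certificate('/path/to/serviceAccountKey.json')
            firebase_admin.initialize_app(cred)

    def get_client(self):
        # Reuse the already-initialized default app instead of calling
        # initialize_app() again.
        app = firebase_admin.get_app()
        return firestore.client(app)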

I've found the following to work for me.
For the default app:
import firebase_admin
from firebase_admin import credentials
if firebase_admin._DEFAULT_APP_NAME in firebase_admin._apps:
    # do something.
I have been using it in this way with a named app:
import firebase_admin
from firebase_admin import credentials
if 'my_app_name' not in firebase_admin._apps:
    cred = credentials.Certificate('path/to/serviceAccountKey.json')
    firebase_admin.initialize_app(cred, {
        'databaseURL': 'https://{}.firebaseio.com'.format(project_id),
        'storageBucket': '{}.appspot.com'.format(project_id)
    }, name='my_app_name')
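Once the named app exists, later code can fetch it by name instead of re-initializing it; a small sketch (the database reference is only illustrative):
import firebase_admin
from firebase_admin import db

# Fetch the previously initialized named app...
my_app = firebase_admin.get_app(name='my_app_name')
# ...and pass it explicitly wherever an app is needed.
ref = db.reference('/', app=my_app)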

I use this try / except block to handle initialisation of the app
try:
    app = firebase_admin.get_app()
except ValueError:
    cred = credentials.Certificate(CREDENTIALS_FIREBASE_PATH)
    firebase_admin.initialize_app(cred)

If you ended up here because you're building Cloud Functions in GCP with Python that interact with Firestore, this is what worked for me:
The reason for using an exception for control flow here is that firebase_admin._apps is a protected member of the module, so accessing it directly is not best practice either.
import firebase_admin
from firebase_admin import credentials, firestore
def init_with_service_account(file_path):
    """
    Initialize the Firestore DB client using a service account.

    :param file_path: path to the service account key file
    :return: Firestore client
    """
    cred = credentials.Certificate(file_path)
    try:
        firebase_admin.get_app()
    except ValueError:
        firebase_admin.initialize_app(cred)
    return firestore.client()


def init_with_project_id(project_id):
    """
    Initialize the Firestore DB client using a GCP project ID.

    :param project_id: the GCP project ID
    :return: Firestore client
    """
    cred = credentials.ApplicationDefault()
    try:
        firebase_admin.get_app()
    except ValueError:
        # Pass the project ID so the Application Default credentials resolve to the right project.
        firebase_admin.initialize_app(cred, {'projectId': project_id})
    return firestore.client()
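Usage is then a single call that is safe to repeat (the key path is a placeholder):
# Safe to call repeatedly; only the first call actually initializes the app.
db = init_with_service_account('path/to/serviceAccountKey.json')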

You can use
firebase_admin.delete_app(firebase_admin.get_app())
and then run the initialization code again.
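A small sketch of that reset pattern, assuming the usual service account key path; this is handy in notebooks or auto-reloading dev servers:
import firebase_admin
from firebase_admin import credentials

# Tear down the existing default app, if any, then initialize it fresh.
if firebase_admin._apps:
    firebase_admin.delete_app(firebase_admin.get_app())
cred = credentials.Certificate('path/to/serviceAccountKey.json')
default_app = firebase_admin.initialize_app(cred)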

You can also use default credentials
if not firebase_admin._apps:
    cred = credentials.ApplicationDefault()
    firebase_admin.initialize_app(cred, {
        'projectId': 'yourprojectid'
    })

In my case, I faced a similar error. My issue was that I had initialized the app twice in my Python file, so the whole script crashed and returned:
ValueError: The default Firebase app already exists. This means you called initialize_app() more than once without providing an app name as the second argument. In most cases you only need to call initialize_app() once. But if you do want to initialize multiple apps, pass a second argument to initialize_app() to give each app a unique name.
I solved it by removing one of the Firebase app initializations. I hope this helps someone!

Initializing more than one Firebase app in Python:
You need to give each app a different name. Every initialization returns an app object; store those objects in a list and access them one by one later.
import re

import firebase_admin
from firebase_admin import credentials, db


def getprojectid(proj_url):
    # Extract the project ID from a database URL like https://<project-id>.firebaseio.com
    p = r'//(.*)\.firebaseio'
    x = re.findall(p, proj_url)
    return x[0]


objects = []
count = 0
details = dict()


def addtofirebase(json_path, url):
    global objects, count, details
    my_app_name = getprojectid(url)  # function which returns the project ID
    if my_app_name not in firebase_admin._apps:
        cred = credentials.Certificate(json_path)
        # Pass the database URL as options so db.reference() knows which database to use.
        obj = firebase_admin.initialize_app(cred, {'databaseURL': url}, name=my_app_name)
        objects.append(obj)  # store the initialized app objects in one list
        details[my_app_name] = count  # store each object's index by project ID to access it later
        count += 1
        ref = db.reference('/', app=objects[details[my_app_name]])  # use this reference to change the database
    else:
        # From the next call onwards the existing app is reused; it is not initialized again.
        ref = db.reference('/', app=objects[details[my_app_name]])
    return ref

Make the app global; don't put initialize_app() inside the function, because every call to the function would call initialize_app() again.
CRED = credentials.Certificate('path/to/serviceAccountKey.json')
DEFAULT_APP = firebase_admin.initialize_app(CRED)


def function():
    """Call DEFAULT_APP and process data here."""

You don't need a key.json file. You can use default gcloud credentials to authenticate:
gcloud auth application-default login --project="yourproject"
Python code:
import firebase_admin

app_options = {'projectId': 'yourproject'}
default_app = firebase_admin.initialize_app(options=app_options)

Related

List Google Cloud Compute Engine active instances

I'm looking to find all the active resources (like Compute Engine, GKE, etc.) and their respective zones.
I tried the Python code below to print that, but it prints information for every zone where Compute Engine is available. Can someone please guide me on which functions are available to do this?
compute = googleapiclient.discovery.build('compute', 'v1')
request = compute.instances().aggregatedList(project=project)
while request is not None:
    response = request.execute()
    for name, instances_scoped_list in response['items'].items():
        pprint((name, instances_scoped_list))
    request = compute.instances().aggregatedList_next(previous_request=request, previous_response=response)
You can list all the instances in your project using the Google Cloud Console, the gcloud compute instances list command, or the instances.list() method.
To list all instances in a project in table form, run:
gcloud compute instances list
You will get something like :
NAME ZONE MACHINE_TYPE PREEMPTIBLE INTERNAL_IP EXTERNAL_IP STATUS
instance-1 us-central1-a n1-standard-1 10.128.0.44 xx.xx.xxx.xx RUNNING
instance-2 us-central1-b n1-standard-1 10.128.0.49 xx.xx.xxx.xx RUNNING
Edit 1
As you mentioned, aggregatedList() is the correct method, and to get the required information you need to go through the JSON response body.
If you need some specific fields, you can check the response body documentation.
Also, you can use this code as a guide; it retrieves all the information from the instances.
from pprint import pprint
from googleapiclient import discovery
from oauth2client.client import GoogleCredentials

credentials = GoogleCredentials.get_application_default()
service = discovery.build('compute', 'v1', credentials=credentials)

# Project ID for this request.
project = "{Project-ID}"  # TODO: Update placeholder value.

request = service.instances().aggregatedList(project=project)
while request is not None:
    response = request.execute()
    zones = response.get('items', {})
    for scoped_list in zones.values():
        for instance in scoped_list.get('instances', []):
            pprint(instance)
    request = service.instances().aggregatedList_next(previous_request=request, previous_response=response)
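If you only need the running instances and their zones (which is what the question asks), a filtered variant along these lines should work; this is a sketch reusing service and project from the snippet above, and the name and status fields come from the Compute API instance resource:
request = service.instances().aggregatedList(project=project)
while request is not None:
    response = request.execute()
    for zone_name, scoped_list in response.get('items', {}).items():
        for instance in scoped_list.get('instances', []):
            if instance.get('status') == 'RUNNING':
                # zone_name looks like 'zones/us-central1-a'
                print(instance['name'], zone_name.replace('zones/', ''))
    request = service.instances().aggregatedList_next(previous_request=request, previous_response=response)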

GCP Python Compute Engine - list VM's

I have the following Python3 script:
import os, json
import googleapiclient.discovery
from google.oauth2 import service_account
from google.cloud import storage
storage_client = storage.Client.from_service_account_json('gcp-sa.json')
buckets = list(storage_client.list_buckets())
print(buckets)
compute = googleapiclient.discovery.build('compute', 'v1')
def list_instances(compute, project, zone):
    result = compute.instances().list(project=project, zone=zone).execute()
    return result['items'] if 'items' in result else None


list_instances(compute, "my-project", "my-zone")
Listing only the buckets (without the rest) works fine, which tells me that my service account (which has read access to the whole project) should work. How can I now list VMs? Using the code above, I get
raise exceptions.DefaultCredentialsError(_HELP_MESSAGE)
google.auth.exceptions.DefaultCredentialsError: Could not automatically determine credentials. Please set GOOGLE_APPLICATION_CREDENTIALS or explicitly create credentials and re-run the application. For more information, please see https://cloud.google.com/docs/authentication/getting-started
So that tells me that I somehow have to pass the service account json. How is that possible?
Thanks!!
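Not from the original thread, but one way to do this, assuming the same gcp-sa.json key file, is to build explicit credentials with google.oauth2.service_account and pass them to discovery.build:
import googleapiclient.discovery
from google.oauth2 import service_account

# Build credentials from the same service-account key file used for Storage.
credentials = service_account.Credentials.from_service_account_file('gcp-sa.json')
compute = googleapiclient.discovery.build('compute', 'v1', credentials=credentials)


def list_instances(compute, project, zone):
    result = compute.instances().list(project=project, zone=zone).execute()
    return result.get('items')


print(list_instances(compute, "my-project", "my-zone"))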

Python REST API gives Internal Server Error

I'm trying to create a REST API in Python. I have decent experience with Python but am relatively new to REST APIs. When I run my Python script I get an "Internal Server Error" in my browser,
something like this:
Error on the browser
and on my console I see this:
Error as displayed on my console
Here's my code:
from flask import Flask, request
from flask_restful import Resource, Api
import firebase_admin
# For connecting to the Firestore database and authentication
from firebase_admin import credentials, firestore
# For database connectivity
from firebase_admin import db
from flask import jsonify

app = Flask(__name__)
api = Api(app)


class Firebase_Data(Resource):
    def getData(self):
        # Setting up credentials to connect
        cred = credentials.Certificate(../Path)
        # Setting up a secure connection to the Firestore real-time database
        app = firebase_admin.initialize_app(cred, {
            'projectId': 'project_ID'
        })
        # Connecting to the Firestore client
        db_ref = firestore.client()
        # Referring to the '** section of the data
        ref_inc = db_ref.collection(u'name of column')
        # Fetching all the records under that particular section and
        # converting them to a list of dictionaries
        docs = list(ref_inc.get())
        lat_long = []
        for doc in docs:
            data = doc.to_dict()
            lat_long.append(
                {'Latitude:': data['latitude'], 'Longitude': data['longitude']})
        return jsonify(lat_long)


api.add_resource(Firebase_Data, '/Firebase_Data')  # Route_1

if __name__ == '__main__':
    app.run(port=5002)
I'm basically trying to fetch some data from a Firestore database and display it in the browser. I don't think the Firestore part has anything to do with my error; I think I'm missing something about how the "get" function of my Python class gets executed, which I'm not able to figure out. Any help is highly appreciated. Thanks in advance.
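Not an answer from the original thread, but one likely cause: flask_restful dispatches an HTTP GET to a method literally named get on the Resource subclass, so a handler named getData is never called. A minimal sketch of the expected shape (Firestore access omitted):
from flask import Flask, jsonify
from flask_restful import Resource, Api

app = Flask(__name__)
api = Api(app)


class Firebase_Data(Resource):
    def get(self):  # must be named after the HTTP verb it handles
        # ... fetch data from Firestore here ...
        return jsonify([])


api.add_resource(Firebase_Data, '/Firebase_Data')

if __name__ == '__main__':
    app.run(port=5002)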

Connecting aws backend to firebase database

I'm currently running Python code on my AWS server and trying to connect to my friend's Firebase database. I read the documentation provided by Firebase on connecting from a server.
https://firebase.google.com/docs/admin/setup
I have followed every step, but I get an error when I try to connect from my server. I added google-services.json as the credential.
Error that I get :
ValueError: Invalid service account certificate. Certificate must
contain a "type" field set to "service_account".
Do I need to modify the google-services.json ?
My code:
import firebase_admin
from firebase_admin import credentials
cred = credentials.Certificate('/home/ec2-user/google-services.json')
#default_app = firebase_admin.initialize_app(cred)
other_app = firebase_admin.initialize_app(cred, name='other')
default_app = firebase_admin.initialize_app()
google-services.json is typically the name of an Android app configuration file. That's not the same as a service account. To get a hold of the credentials for a service account for your project, you'll need to generate one from the Firebase console from Project Settings -> Service Accounts. The documentation is here. Once you have this file, you can initialize the Admin SDK with it to begin accessing the data in your project.
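For illustration, once the service account key has been downloaded (the filename and path below are placeholders), initialization looks like this:
import firebase_admin
from firebase_admin import credentials

# Path to the key generated under Project Settings -> Service Accounts.
cred = credentials.Certificate('/home/ec2-user/serviceAccountKey.json')
default_app = firebase_admin.initialize_app(cred)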
A better way would be to store the credentials on S3 (encrypted) with an IAM role attached to the Lambda function.
import os
import firebase_admin
from firebase_admin import credentials
import boto3
from settings.local_settings import AWS_REGION, ENVIRONMENT
import json

firebase_config_file = 'app-admin-config-{}.json'.format(ENVIRONMENT)
firebase_admin_creds_file = 'app-admin-sdk-{}.json'.format(ENVIRONMENT)

current_dir = os.path.abspath(os.path.dirname(__file__))
files = [f for f in os.listdir(current_dir) if os.path.isfile(f)]

if firebase_config_file not in files and firebase_admin_creds_file not in files:
    s3 = boto3.resource('s3', region_name=AWS_REGION)
    bucket = s3.Bucket('app-s3-secrets')
    firebase_config = json.loads(
        bucket.Object('app-admin-config-{}.json'.format(ENVIRONMENT)).get()['Body'].read())
    firebase_admin_creds = json.loads(
        bucket.Object('app-admin-sdk-{}.json'.format(ENVIRONMENT)).get()['Body'].read().decode())


class Firebase:
    @staticmethod
    def get_connection():
        cred = credentials.Certificate(firebase_admin_creds)
        return firebase_admin.initialize_app(cred, firebase_config)


app = Firebase.get_connection()

Boto3 uses old credentials

I am using tkinter to create a GUI application that returns the security groups. Currently, if you want to change your credentials (e.g. if you accidentally entered the wrong ones), you have to restart the application; otherwise boto3 carries on using the old credentials.
I'm not sure why it keeps using the old credentials because I am running everything again using the currently entered credentials.
This is a snippet of the code that sets the environment variables and launches boto3. It works perfectly fine if you enter the right credentials the first time.
os.environ['AWS_ACCESS_KEY_ID'] = self.accessKey
os.environ['AWS_SECRET_ACCESS_KEY'] = self.secretKey

self.sts_client = boto3.client('sts')
self.assumedRoleObject = self.sts_client.assume_role(
    RoleArn=self.role,
    RoleSessionName="AssumeRoleSession1"
)
self.credentials = self.assumedRoleObject['Credentials']
self.ec2 = boto3.resource(
    'ec2',
    region_name=self.region,
    aws_access_key_id=self.credentials['AccessKeyId'],
    aws_secret_access_key=self.credentials['SecretAccessKey'],
    aws_session_token=self.credentials['SessionToken'],
)
The credentials variables are set using:
self.accessKey = str(self.AWS_ACCESS_KEY_ID_Form.get())
self.secretKey = str(self.AWS_SECRET_ACCESS_KEY_Form.get())
self.role = str(self.AWS_ROLE_ARN_Form.get())
self.region = str(self.AWS_REGION_Form.get())
self.instanceID = str(self.AWS_INSTANCE_ID_Form.get())
Is there a way to use different credentials in boto3 without restarting the program?
You need boto3.session.Session to override the access credentials.
Just do this (reference: http://boto3.readthedocs.io/en/latest/reference/core/session.html):
import boto3

# Assign your own access keys
mysession = boto3.session.Session(aws_access_key_id='foo1', aws_secret_access_key='bar1')

# If you want to use a different profile, e.g. [foobar] inside .aws/credentials
mysession = boto3.session.Session(profile_name="foobar")

# Afterwards, just declare your AWS client/resource services from the session
sqs_resource = mysession.resource("sqs")
# or a client
s3_client = mysession.client("s3")
Basically, it's only a small change to your code: you just use the session instead of calling boto3.client/boto3.resource directly.
self.sts_client = mysession.client('sts')
Sure, just create a different boto3.session.Session object for each set of credentials:
import boto3
s1 = boto3.session.Session(aws_access_key_id='foo1', aws_secret_access_key='bar1')
s2 = boto3.session.Session(aws_access_key_id='foo2', aws_secret_access_key='bar2')
You can also leverage the set_credentials method to keep one session and change credentials on the fly:
import botocore

session = botocore.session.Session()
session.set_credentials('foo', 'bar')
client = session.create_client('s3')
print(client._request_signer._credentials.access_key)
# u'foo'
session.set_credentials('foo1', 'bar')
client = session.create_client('s3')
print(client._request_signer._credentials.access_key)
# u'foo1'
The answers given by @mootmoot and @Vor clearly state the way of dealing with multiple credentials using a session.
@Vor's answer:
import boto3
s1 = boto3.session.Session(aws_access_key_id='foo1', aws_secret_access_key='bar1')
s2 = boto3.session.Session(aws_access_key_id='foo2', aws_secret_access_key='bar2')
But some of you might be curious why the boto3 client or resource behaves this way in the first place.
Let's clear up a few points about Session and Client, as they lead us to the answer to that question.
Session
A 'Session' stores configuration state and allows you to create service clients and resources
Client
if the credentials are not passed explicitly as arguments to the boto3.client method, then the credentials configured for the session will automatically be used. You only need to provide credentials as arguments if you want to override the credentials used for this specific client
Now let's get to the code and see what actually happens when you call boto3.client()
def client(*args, **kwargs):
    return _get_default_session().client(*args, **kwargs)


def _get_default_session():
    if DEFAULT_SESSION is None:
        setup_default_session()
    return DEFAULT_SESSION


def setup_default_session(**kwargs):
    DEFAULT_SESSION = Session(**kwargs)
Learnings from the above
The function boto3.client() is really just a proxy for the boto3.Session.client() method
Once you use the client, the DEFAULT_SESSION is set up, and subsequent client creations keep using that DEFAULT_SESSION.
The credentials configured for the DEFAULT_SESSION are used if the credentials are not explicitly passed as arguments while creating the boto3 client.
Answer
The first call to boto3.client() sets up the DEFAULT_SESSION and configures it with oldCredsAccessKey and oldCredsSecretKey, the values already set for the environment variables AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY respectively.
So even if you set new credential values in the environment, i.e. do this:
os.environ['AWS_ACCESS_KEY_ID'] = newCredsAccessKey
os.environ['AWS_SECRET_ACCESS_KEY'] = newCredsSecretKey
subsequent boto3.client() calls still pick up the old credentials configured for the DEFAULT_SESSION.
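One way out, then (a sketch, not part of the original answer), is either to create an explicit Session with the new keys or to rebuild the default session with boto3.setup_default_session() after updating the environment:
import boto3

# Option 1: an explicit session that ignores the cached DEFAULT_SESSION.
session = boto3.session.Session(
    aws_access_key_id=newCredsAccessKey,
    aws_secret_access_key=newCredsSecretKey,
)
sts_client = session.client('sts')

# Option 2: rebuild the default session so plain boto3.client() picks up
# whatever credentials are currently in the environment.
boto3.setup_default_session()
sts_client = boto3.client('sts')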
NOTE
Throughout this answer, a boto3.client() call means a call with no arguments passed to the client method.
References
https://boto3.amazonaws.com/v1/documentation/api/latest/_modules/boto3.html#client
https://boto3.amazonaws.com/v1/documentation/api/latest/_modules/boto3/session.html#Session
https://ben11kehoe.medium.com/boto3-sessions-and-why-you-should-use-them-9b094eb5ca8e
