I want to download an image from a resource folder in Firebase Storage.
Can anyone show me how to do this?
My English is not good, please forgive me.
This code will download an image from Firebase Storage and save it in your local folder as "img.png".
The credentials.json file should be in the same folder you run this code from; you can download it from the Firebase console, and it is specific to your account.
In my case my storage bucket is called 'aisparkdev-tinder.appspot.com'.
When creating the blob variable, the string is just the path to your image inside the bucket.
import datetime
import urllib.request

import firebase_admin
from firebase_admin import credentials
from firebase_admin import storage

cred = credentials.Certificate("credentials.json")

# Initialize the app with a service account, granting admin privileges
appF = firebase_admin.initialize_app(
    cred,
    {'storageBucket': 'aisparkdev-tinder.appspot.com'},
    name='storage')

bucket = storage.bucket(app=appF)
# profilePic holds the image name (without extension) from elsewhere in the app
blob = bucket.blob("profileImages/" + profilePic + ".jpg")
urllib.request.urlretrieve(
    blob.generate_signed_url(datetime.timedelta(seconds=300), method='GET'),
    ".\\img.png")
I want to create a CSV file from a pandas DataFrame in a Google Storage bucket using the Colab tool.
Right now we use our Gmail authentication to load the CSV file into the storage bucket with the command below:
df.to_csv("gs://Jobs/data.csv")
I have checked the Google link below:
https://cloud.google.com/docs/authentication/production#linux-or-macos
Currently, we use the code below to get credentials from a service account:
from google.oauth2 import service_account

def getCredentialsFromServiceAccount(path: str) -> service_account.Credentials:
    # from_service_account_file takes a path;
    # from_service_account_info expects an already-loaded dict
    return service_account.Credentials.from_service_account_file(path)

Kindly suggest.
There are two main approaches you can use to upload a CSV to Cloud Storage from within Colab. Both store the CSV locally first and then upload it.
1. Use gsutil from within Colab.
2. Use the Cloud Storage Python Client Library.
The first approach is the easiest, but it authenticates as the user signed into Colab rather than as a service account.
from google.colab import auth
auth.authenticate_user()
bucket_name = 'my-bucket'
df.to_csv('data.csv', index = False)
!gsutil cp 'data.csv' 'gs://{bucket_name}/'
The next approach uses the client library and authenticates with a service account.
from google.cloud import storage
bucket_name = 'my-bucket'
# store csv locally
df.to_csv('data.csv', index = False)
# start storage client
client = storage.Client.from_service_account_json("path/to/key.json")
# get bucket
bucket = client.bucket(bucket_name)
# create blob (where you want to store csv within bucket)
blob = bucket.blob("jobs/data.csv")
# upload blob to bucket
blob.upload_from_filename("data.csv")
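If you'd rather skip the intermediate local file entirely (a variation I'm adding, not something the original answer covers), the CSV can be serialized in memory and uploaded as a string; the bucket and object path below are placeholders:

```python
import pandas as pd

# Sample frame standing in for the real jobs data.
df = pd.DataFrame({"job": ["analyst", "engineer"], "salary": [50000, 70000]})

# Serialize the DataFrame to CSV text in memory -- no local file needed.
csv_text = df.to_csv(index=False)

# Upload it with the client/bucket set up as in the snippet above:
# bucket.blob("jobs/data.csv").upload_from_string(csv_text, content_type="text/csv")
```

This avoids cluttering the Colab filesystem and works the same way with either authentication approach.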
I am trying to upload my data to Firebase using Python. The data is an image file in .png format, and it is being pushed to the cloud: when I view it there, the size matches the local image file exactly, but the image itself never displays; a wait spinner just keeps rotating.
Here's my code from the server side (Python):
import cv2
import firebase_admin
from firebase_admin import credentials
from firebase_admin import storage

cred = credentials.Certificate('certificate.json')
firebase_admin.initialize_app(cred, {
    'storageBucket': 'test-cd462.appspot.com'
})
bucket = storage.bucket()

image_data = cv2.imread('1.png')
imageBlob = bucket.blob('Images.png')
imageBlob.upload_from_string(
    image_data,
    content_type='image/png'
)
print(imageBlob)
Here's the screenshot of what I am getting in my firebase window:
Please help me and let me know where I have gone wrong. Thanks in advance!
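A likely cause (my diagnosis, not a posted answer): cv2.imread returns a NumPy pixel array, not encoded PNG bytes, so the uploaded object is not a valid PNG even though its size looks plausible. A sketch of a fix is to upload the raw file bytes instead; the PNG-signature check is just a sanity guard:

```python
def load_png_bytes(path):
    """Read an image file as raw bytes and verify the PNG signature."""
    with open(path, "rb") as f:
        data = f.read()
    if not data.startswith(b"\x89PNG\r\n\x1a\n"):
        raise ValueError(path + " does not look like a PNG file")
    return data

# With bucket and imageBlob set up as in the question:
# imageBlob = bucket.blob('Images.png')
# imageBlob.upload_from_string(load_png_bytes('1.png'), content_type='image/png')
```

Simpler still, `imageBlob.upload_from_filename('1.png')` uploads the file as-is without reading it yourself.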
I'm building an API that uploads images to Firebase Storage, and everything works as expected in that regard. The problem is that the syntax makes me specify the file name in each upload, and in production the API will receive upload requests from multiple devices. So I need the code to check for an available id (or just a random name, I don't care, as long as it doesn't overwrite another picture), set it on the "blob()" object, and then do a normal upload, but I have no idea how to do that.
Here is my current code:
from flask_pymongo import PyMongo
import firebase_admin
from firebase_admin import credentials, auth, storage, firestore
import os
import io
cred = credentials.Certificate('service_account_key.json')
firebase_admin.initialize_app(cred, {'storageBucket': 'MY-DATABASE-NAME.appspot.com'})
bucket = storage.bucket()
blob = bucket.blob("images/newimage.png")  # here is where I'm guessing I should put the next available name

# "apple.png" is a sample image for testing in my directory
with open("apple.png", "rb") as f:
    blob.upload_from_file(f)
As "Klaus D."'s comment said, the solution was to use the "uuid" module:
import uuid
.....
.....
blob = bucket.blob("images/" + str(uuid.uuid4()))
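One refinement (my own suggestion, not part of Klaus D.'s comment): uuid4() alone drops the file extension, so a small helper can keep it while still guaranteeing a collision-resistant name:

```python
import os
import uuid

def unique_blob_name(original_filename, prefix="images/"):
    """Build a collision-resistant object name that keeps the file extension."""
    _, ext = os.path.splitext(original_filename)
    return prefix + uuid.uuid4().hex + ext

# e.g. blob = bucket.blob(unique_blob_name("apple.png"))
```

Keeping the extension means Firebase serves the object with a sensible content type when clients fetch it by name.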
I'm currently running Python code on my AWS server and trying to connect to my friend's Firebase database. I read the documentation provided by Firebase on connecting from a server:
https://firebase.google.com/docs/admin/setup
I have followed every step, but I get an error when I try to connect from my server. I have added google-services.json as the credential.
Error that I get :
ValueError: Invalid service account certificate. Certificate must
contain a "type" field set to "service_account".
Do I need to modify the google-services.json file?
My code:
import firebase_admin
from firebase_admin import credentials

cred = credentials.Certificate('/home/ec2-user/google-services.json')
#default_app = firebase_admin.initialize_app(cred)
other_app = firebase_admin.initialize_app(cred, name='other')
default_app = firebase_admin.initialize_app()
google-services.json is typically the name of an Android app configuration file. That's not the same as a service account. To get a hold of the credentials for a service account for your project, you'll need to generate one from the Firebase console from Project Settings -> Service Accounts. The documentation is here. Once you have this file, you can initialize the Admin SDK with it to begin accessing the data in your project.
A better way would be to store the credentials on S3 (encrypted), with an IAM role attached to the Lambda function.
import json
import os

import boto3
import firebase_admin
from firebase_admin import credentials
from settings.local_settings import AWS_REGION, ENVIRONMENT

firebase_config_file = 'app-admin-config-{}.json'.format(ENVIRONMENT)
firebase_admin_creds_file = 'app-admin-sdk-{}.json'.format(ENVIRONMENT)

current_dir = os.path.abspath(os.path.dirname(__file__))
files = [f for f in os.listdir(current_dir)
         if os.path.isfile(os.path.join(current_dir, f))]

if firebase_config_file not in files and firebase_admin_creds_file not in files:
    s3 = boto3.resource('s3', region_name=AWS_REGION)
    bucket = s3.Bucket('app-s3-secrets')
    firebase_config = json.loads(
        bucket.Object('app-admin-config-{}.json'.format(ENVIRONMENT)).get()['Body'].read())
    firebase_admin_creds = json.loads(
        bucket.Object('app-admin-sdk-{}.json'.format(ENVIRONMENT)).get()['Body'].read().decode())

class Firebase:
    @staticmethod
    def get_connection():
        cred = credentials.Certificate(firebase_admin_creds)
        return firebase_admin.initialize_app(cred, firebase_config)

app = Firebase.get_connection()
I found these docs: https://firebase.google.com/docs/storage/gcp-integration#apis
Here is the code from those docs:
# Import gcloud
from google.cloud import storage
# Enable Storage
client = storage.Client()
# Reference an existing bucket.
bucket = client.get_bucket('my-existing-bucket')
# Upload a local file to a new file to be created in your bucket.
zebraBlob = bucket.blob('zebra.jpg')
zebraBlob.upload_from_filename(filename='/photos/zoo/zebra.jpg')
# Download a file from your bucket.
giraffeBlob = bucket.get_blob('giraffe.jpg')
giraffeBlob.download_as_string()
On the line client = storage.Client(), it said:
Could not automatically determine credentials. Please set GOOGLE_APPLICATION_CREDENTIALS or explicitly create credential and re-run the application
As a next step, I put:
from oauth2client.client import GoogleCredentials
GOOGLE_APPLICATION_CREDENTIALS = 'credentials.json'
credentials = GoogleCredentials.get_application_default()
It said:
The Application Default Credentials are not available. They are available if running in Google Compute Engine
And my final question is how to authenticate in Google Compute Engine.
Problem is solved.
1.) Install the Google Cloud SDK: https://cloud.google.com/sdk/
2.) Log in to the Cloud SDK with your Gmail account.
3.) Choose your Firebase project.
4.) Run gcloud auth application-default login in the console.
5.) You can find the credentials here:
C:\Users\storks\AppData\Roaming\gcloud\application_default_credentials.json
6.) For more info, see "How the Application Default Credentials work":
https://developers.google.com/identity/protocols/application-default-credentials
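Alternatively (my suggestion, not part of the steps above), you can point Application Default Credentials straight at a key file by setting GOOGLE_APPLICATION_CREDENTIALS before creating the client; the path below is a placeholder for the file from step 5:

```python
import os

# Point Application Default Credentials at the downloaded key file.
# The path is a placeholder -- use the location from step 5 above.
os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = "application_default_credentials.json"

# storage.Client() now picks up the credentials automatically:
# from google.cloud import storage
# client = storage.Client()
```

Setting the variable in the shell before launching Python works just as well and keeps credentials out of the code.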