Pyrebase .download() not getting files from Firebase Storage - python

I am attempting to download some image files from Firebase Storage via the Pyrebase .download() function, but am having trouble actually getting the files. It supposedly finds the files in the bucket fine, as indicated by print(file.name), but when I print the result of the actual download call, it returns None and no files are downloaded into the specified folder. There are no errors as far as I can see; the code exits with exit code 0.
As suggested in other solutions I found, I have already tried adding 'serviceAccount' to the config passed to pyrebase.initialize_app(config), and passing 'filename' and 'path' to the .download() call.
import pyrebase

config = {
    'apiKey': "...",
    'authDomain': "...",
    'projectId': "...",
    'storageBucket': "...",
    'messagingSenderId': "...",
    'appId': "...",
    'measurementId': "...",
    'databaseURL': "",
    'serviceAccount': "C:/Users/Dell/PycharmProjects/FYP/creds.json"
}

firebase = pyrebase.initialize_app(config)
storage = firebase.storage()

path_local = "C:/Users/Dell/PycharmProjects/FYP/identDatabase/"
all_files = storage.child("C:/Users/Dell/PycharmProjects/FYP/identDatabase").list_files()

for file in all_files:
    print(file.name)
    storage.child(file.name).download(filename=file.name.split('/')[6], path=path_local)

# example of full path inside Firebase Storage
# C:/Users/Dell/PycharmProjects/FYP/identDatabase/William Engel.jpg
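A minimal sketch of one way to fetch the files, assuming (as the serviceAccount entry in the config suggests) that list_files() yields google-cloud-storage Blob objects, which expose download_to_filename() directly; the path given to .child() would normally be the object's path inside the bucket, not a local Windows path. The image-only filter and the object naming are assumptions based on the example path above.

# Hedged sketch, not confirmed working code for this bucket.
for blob in storage.list_files():                           # iterates every blob in the bucket
    if blob.name.lower().endswith(".jpg"):                  # keep only image objects
        local_name = path_local + blob.name.split("/")[-1]  # e.g. "William Engel.jpg"
        blob.download_to_filename(local_name)               # google-cloud-storage Blob method
        print("saved", local_name)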

Related

How to transfer a csv file from notebook folder to a datastore

I want to transfer a generated csv file test_df.csv from my Azure ML notebook folder, which has the path /Users/Ankit19.Gupta/test_df.csv, to a datastore which has the web path https://abc.blob.core.windows.net/azureml/LocalUpload/f3db18b6. I have written the Python code as:
from azureml.core import Workspace

ws = Workspace.from_config()
datastore = ws.get_default_datastore()
datastore.upload_files('/Users/Ankit19.Gupta/test_df.csv',
                       target_path='https://abc.blob.core.windows.net/azureml/LocalUpload/f3db18b6',
                       overwrite=True)
But it is showing the following error message:
UserErrorException: UserErrorException:
Message: '/' does not point to a file. Please upload the file to cloud first if running in a cloud notebook.
InnerException None
ErrorResponse
{
  "error": {
    "code": "UserError",
    "message": "'/' does not point to a file. Please upload the file to cloud first if running in a cloud notebook."
  }
}
I have tried this, but it is not working for me. Can anyone please help me resolve this issue? Any help would be appreciated.
The way the path is specified is not accurate; the datastore path needs to be given differently.
Replace the code below, noting the small change in how the paths are passed.
from azureml.core import Workspace

ws = Workspace.from_config()
datastore = ws.get_default_datastore()
datastore.upload_files(['./Users/foldername/filename.csv'],
                       target_path='your targetfolder',
                       overwrite=True)
We need to include all of the parent folders in the path before the file name, and the "./" prefix is how the file is referenced relative to the current working directory.
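As a hedged sketch of what that could look like with azureml-core (assuming upload_files takes a list of local paths and a target_path relative to the datastore's container rather than a full blob URL; the target folder name below is taken from the URL in the question and is an assumption):

# Hedged sketch, not a verified solution for this workspace.
from azureml.core import Workspace

ws = Workspace.from_config()
datastore = ws.get_default_datastore()
datastore.upload_files(
    files=['./Users/Ankit19.Gupta/test_df.csv'],  # list of local file paths
    target_path='LocalUpload/f3db18b6',           # folder inside the datastore, not a URL
    overwrite=True,
    show_progress=True,
)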

How to dynamically set the output name for a blob in azure functions for python

I created an Azure Function in Python that gets triggered after a blob upload happens.
I would like to copy and rename the blob to another storage location. This is what I have right now:
// function.json
{
  "scriptFile": "main.py",
  "bindings": [
    {
      "name": "myblob",
      "type": "blobTrigger",
      "direction": "in",
      "path": "ingress/upload/{name}",
      "connection": "conn_STORAGE"
    },
    {
      "name": "myblobout",
      "type": "blob",
      "direction": "out",
      "path": "ingress/test/{name}",
      "connection": "conn_STORAGE"
    }
  ]
}
# main.py
import logging

import azure.functions as func


def main(myblob: func.InputStream, myblobout: func.Out[bytes]):
    myblobout.set(myblob.read())
This works fine, but it only copies the file. How can I rename the file dynamically at runtime?
Thanks!
Since the file already gets copied, you can rename the copied file using the workaround below.
from azure.storage.blob import ContainerClient
from azure.core.exceptions import ResourceExistsError

# conn_str, the container name and data are placeholders to be filled in.
blob_name = "abcd.zip"
container_client = ContainerClient.from_connection_string(conn_str, "container_name")

try:
    blob_client = container_client.get_blob_client(blob_name)
    # upload the blob if it doesn't exist
    blob_client.upload_blob(data)
except ResourceExistsError:
    # count the blobs with the same prefix,
    # e.g. this will return [abcd.zip, abcd(1).zip, abcd(2).zip]
    blobs = list(container_client.list_blobs(name_starts_with=blob_name.split('.')[0]))
    length = len(blobs)
    if length == 1:
        # there is only one blob - the one from the previous upload
        blob_client.upload_blob(data, overwrite=True)
    else:
        # if there are 10 files whose names start with abcd, the 11th file becomes abcd(10).zip
        name = blob_name.split('.')[0] + '(' + str(length) + ').' + blob_name.split('.')[1]
        blob_client = container_client.get_blob_client(name)
        blob_client.upload_blob(data)
To pick the name up dynamically, replace the hard-coded blob_name with the name taken from the trigger's InputStream, as sketched below.
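A small sketch of what that could look like inside main.py (the "copy_of_" rename rule and the wiring to the upload code above are hypothetical):

# Hedged sketch: derive the blob name from the trigger's InputStream.name,
# e.g. "ingress/upload/report.csv" -> "report.csv", then rename before uploading.
import azure.functions as func

def main(myblob: func.InputStream):
    original_name = myblob.name.split("/")[-1]  # strip the container/folder prefix
    blob_name = "copy_of_" + original_name      # hypothetical rename rule
    data = myblob.read()
    # ... hand blob_name and data to the ContainerClient upload logic above ...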
REFERENCES:
Dynamic rename Azure Blob if already uploaded

Read text file from Firebase Storage in python

I am trying to read a file from Firebase storage under the sub-folder called Transcripts. When I try to read a text file which is in the root folder it works perfectly. However, it fails to read any text file under the sub-folder called "Transcripts".
Here is the structure of my Firebase Storage bucket:
Transcripts/
    Audio 1.txt
    Audio 2.txt
Audio 1.amr
Audio 2.amr
Audio Name.txt
Here is the Python code where I try to read the file in the root folder:
import pyrebase
import urllib.request

config = {
    "apiKey": "XXXXXXXX",
    "authDomain": "XXXXXXXX.firebaseapp.com",
    "databaseURL": "https://XXXXXXXX.firebaseio.com",
    "projectId": "XXXXXXXX",
    "storageBucket": "XXXXXXXX.appspot.com",
    "messagingSenderId": "XXXXXXXX",
    "appId": "XXXXXXXX",
    "measurementId": "XXXXXXXX",
    "serviceAccount": "/Users/faizarahman/Desktop/MY-PROJECT.json"
}

firebase = pyrebase.initialize_app(config)  # initializing firebase
storage = firebase.storage()  # getting storage reference 1
storage2 = firebase.storage()  # getting storage reference 2 (to avoid overwriting storage reference 1)

url = storage.child("Audio Name").get_url(None)  # getting the url from storage
print(url)  # printing the url
text_file = urllib.request.urlopen(url).read()  # reading the text file

name_list = storage.child("Transcripts/").list_files()  # getting the list of all files inside the Transcripts folder
folder_name = "Transcripts/ "

for file in name_list:  # iterating through all the files in the list
    try:
        if folder_name in file.name:  # check if the path contains "Transcripts"
            transcript_name = file.name.replace("Transcripts/ ", "")  # extract the name from "Transcripts/ Audio Number"
            unicode_text = text_file.decode("utf-8")  # convert the content of the Audio Name file to a string
            if transcript_name == unicode_text:  # if the content of the Audio Name file (a file name) matches this file's name, read that file
                text_file1 = storage2.child("Audio Name").get_url(None)  # for testing purposes "Audio Name" works here...
                print(text_file1)
    except:
        print('Download Failed')
The link that it gives me looks like this:
https://firebasestorage.googleapis.com/v0/b/MY-PROJECT-ID.appspot.com/o/Audio%20Name?alt=media
Here is what I get when I click the link:
Reading Audio Name text file successful.
Here is the Python code where I try to read the file in the "Transcripts" folder:
import pyrebase
import urllib.request

config = {
    "apiKey": "XXXXXXXX",
    "authDomain": "XXXXXXXX.firebaseapp.com",
    "databaseURL": "https://XXXXXXXX.firebaseio.com",
    "projectId": "XXXXXXXX",
    "storageBucket": "XXXXXXXX.appspot.com",
    "messagingSenderId": "XXXXXXXX",
    "appId": "XXXXXXXX",
    "measurementId": "XXXXXXXX",
    "serviceAccount": "/Users/faizarahman/Desktop/MY-PROJECT.json"
}

firebase = pyrebase.initialize_app(config)  # initializing firebase
storage = firebase.storage()  # getting storage reference 1
storage2 = firebase.storage()  # getting storage reference 2 (to avoid overwriting storage reference 1)

url = storage.child("Audio Name").get_url(None)  # getting the url from storage
print(url)  # printing the url
text_file = urllib.request.urlopen(url).read()  # reading the text file

name_list = storage.child("Transcripts/").list_files()  # getting the list of all files inside the Transcripts folder
folder_name = "Transcripts/ "

for file in name_list:  # iterating through all the files in the list
    try:
        if folder_name in file.name:  # check if the path contains "Transcripts"
            transcript_name = file.name.replace("Transcripts/ ", "")  # extract the name from "Transcripts/ Audio Number"
            unicode_text = text_file.decode("utf-8")  # convert the content of the Audio Name file to a string
            if transcript_name == unicode_text:  # if the content of the Audio Name file (a file name) matches this file's name, read that file
                text_file1 = storage2.child("Transcripts/" + unicode_text).get_url(None)  # "Audio Name" works here, but reading the file under Transcripts does not...
                print(text_file1)
    except:
        print('Download Failed')
The link that it gives me looks like this:
https://firebasestorage.googleapis.com/v0/b/MY-PROJECT-ID.appspot.com/o/Transcripts%2FAudio%202?alt=media
Here is what I get when I try to read the file which is inside the "Transcripts" sub folder:
Reading Audio 2 under transcript sub folder failed.
I believe the error is in this line:
text_file1 = storage2.child("Transcripts/" + unicode_text).get_url(None)
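One hedged guess at a fix, assuming the objects are actually stored as "Transcripts/Audio 2.txt" (with the ".txt" extension shown in the bucket listing) and that the Audio Name file may carry a trailing newline:

# Hedged sketch: build the full object key, including the extension, and strip
# whitespace from the decoded name before requesting the download URL.
import urllib.request

transcript_key = "Transcripts/" + unicode_text.strip() + ".txt"  # assumed object naming
transcript_url = storage2.child(transcript_key).get_url(None)
transcript_text = urllib.request.urlopen(transcript_url).read().decode("utf-8")
print(transcript_text)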

Upload File to Google-drive Teamdrive folder with PyDrive

I have been successfully uploading files to a Google Drive folder with PyDrive. But when it comes to uploading files to a folder in a Google Drive Team Drive that is shared with me, the following code is not working.
from pydrive.auth import GoogleAuth
from pydrive.drive import GoogleDrive

gauth = GoogleAuth()
gauth.LocalWebserverAuth()
drive = GoogleDrive(gauth)

location_to_save = "D:\\images"
mImageLoc = location_to_save + "\\abcd.jpg"
# [...code to fetch and save the file as abcd.jpg...]

gfolder_id = "1H1gjBKcpiHJtnXKVxWQEC1CS8t4Gswjj"  # a Google Drive folder id; replacing it with a Team Drive folder id does not work
gfile_title = mImageLoc.split("\\")[-1]  # returns abcd.jpg

http = drive.auth.Get_Http_Object()
f = drive.CreateFile({"parents": [{"kind": "drive#fileLink", "id": gfolder_id}],
                      "title": gfile_title})
f.SetContentFile(mImageLoc)
f.Upload(param={"http": http})
The error message I am receiving is: pydrive.files.ApiRequestError: <HttpError 404 when requesting https://www.googleapis.com/upload/drive/v2/files?alt=json&uploadType=resumable returned "File not found: 0AG-N4DqGC1nbUk9PVA">
'0AG-N4DqGC1nbUk9PVA' is the Team Drive's folder id here.
I have been searching for ways to upload files to Team Drives with PyDrive, but in vain. I see on PyDrive's GitHub pages that they added Team Drive support approximately 8 months ago, but I cannot find any documentation on how to use it. Can anyone suggest where I am going wrong?
For uploading, try making a file called "settings.yaml" and saving it in your working directory, as per the instructions here:
https://pythonhosted.org/PyDrive/oauth.html
You will need the client id and client secret found in the client_secrets.json file which should also be in your directory after you authorised access to the Google API.
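As a hedged sketch (the file name and its location in the working directory are assumptions), GoogleAuth can be pointed at that settings file explicitly:

# Hedged sketch: load OAuth settings from settings.yaml and authenticate via the
# local webserver flow before creating the GoogleDrive client.
from pydrive.auth import GoogleAuth
from pydrive.drive import GoogleDrive

gauth = GoogleAuth(settings_file="settings.yaml")
gauth.LocalWebserverAuth()
drive = GoogleDrive(gauth)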
Test it out with the following code to make a text file in a folder in the team drive:
team_drive_id = 'XXXX'
parent_folder_id = 'YYYY'
# where XXXX and YYYY are the Team Drive and target folder ids found at the end of the URLs when you open them in your browser

f = drive.CreateFile({
    'title': 'test.txt',
    'parents': [{
        'kind': 'drive#fileLink',
        'teamDriveId': team_drive_id,
        'id': parent_folder_id
    }]
})
f.SetContentString('Hello World')
f.Upload(param={'supportsTeamDrives': True})

Empty file stored on Firebase with Python

My goal is to generate certain files (txt/pdf/excel) on my Python server and subsequently push them to Firebase Storage.
For the Firebase Storage integration I use the pyrebase package.
So far I have managed to generate the file locally and store it at the right path in Firebase Storage.
However, the files I store are always empty. What is the reason for this?
1. Generating the localFile
import os

def save_templocalfile(specs):
    # Random something
    localFileName = "test.txt"
    localFile = open(localFileName, "w+")
    for i in range(1000):
        localFile.write("This is line %d\r\n" % (i + 1))
    return {
        'localFileName': localFileName,
        'localFile': localFile
    }
2. Storing the localFile
# Required Libraries
import pyrebase
import time

# Firebase Setup & Admin Auth
config = {
    "apiKey": "<PARAMETER>",
    "authDomain": "<PARAMETER>",
    "databaseURL": "<PARAMETER>",
    "projectId": "<PARAMETER>",
    "storageBucket": "<PARAMETER>",
    "messagingSenderId": "<PARAMETER>"
}

firebase = pyrebase.initialize_app(config)
storage = firebase.storage()

def fb_upload(localFile):
    # Define the child reference
    childRef = "/test/test.txt"
    storage.child(childRef).put(localFile)
    # Get the file url
    fbResponse = storage.child(childRef).get_url(None)
    return fbResponse
The problem was that I opened my file with write permissions only:
localFile = open(localFileName, "w+")
The solution was to close the write handle and reopen the file with read permissions:
# close (write)
localFile.close()

# open (read)
my_file = open(localFileName, "rb")
my_bytes = my_file.read()

# store on Firebase (childRef is the storage path used in fb_upload above)
fbUploadObj = storage.child(childRef).put(my_bytes)
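As a hedged alternative (assuming pyrebase's put() also accepts a local file path string and opens the file in binary mode itself), the explicit re-open step can be skipped:

# Hedged sketch: close the write handle, then hand put() the path instead of the bytes.
localFile.close()
fbUploadObj = storage.child(childRef).put(localFileName)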
