Can anyone give me the documentation of "Get started: write and deploy your first functions" with Python?

There is a document from Firebase about how to write and deploy a Cloud Function in Node.js, but can anyone help me find that same document for Python? I am getting confused, as I am a newbie in this field.
I tried to write my cloud function, which looks like the code below, but I constantly get the error that I mention afterwards:
import json
import firebase_admin
from firebase_admin import credentials
from firebase_admin import db

def go_firebase(request):
    # Initialize the Admin SDK with the service account key file
    cred = credentials.Certificate('firebasesdk.json')
    firebase_admin.initialize_app(cred, {
        'databaseURL': 'https://firebaseio.com/'
    })
    # Fetch every child under the 'agents' node, ordered by key
    ref = db.reference('agents')
    snapshot = ref.order_by_key().get()
    for key, val in snapshot.items():
        kw = val
    # Keep only the wanted keys from the record held in kw
    dictfilt = lambda x, y: dict([(i, x[i]) for i in x if i in set(y)])
    wanted_keys = ("address", "name", "phone", "uid")
    result = dictfilt(kw, wanted_keys)
    data = json.dumps(result, sort_keys=True)
    return data
After deploying the function with an HTTP trigger, the log says:
severity: "ERROR"
textPayload: "Traceback (most recent call last):
File "/env/lib/python3.7/site-packages/google/cloud/functions/worker.py", line 313, in run_http_function
result = _function_handler.invoke_user_function(flask.request)
File "/env/lib/python3.7/site-packages/google/cloud/functions/worker.py", line 215, in invoke_user_function
return call_user_function(request_or_event)
File "/env/lib/python3.7/site-packages/google/cloud/functions/worker.py", line 208, in call_user_function
return self._user_function(request_or_event)
File "/user_code/main.py", line 6, in go_firebase
cred = credentials.Certificate('firebasesdk.json')
File "/env/lib/python3.7/site-packages/firebase_admin/credentials.py", line 83, in __init__
with open(cert) as json_file:
FileNotFoundError: [Errno 2] No such file or directory: 'firebasesdk.json'
I have no idea why it says the file is not found, because I have that JSON file in the same path from which I am deploying the function! I am using Google Cloud Shell!
Can anyone be kind enough to tell me where I am going wrong?

The .json file that you are referring to is probably the dependencies file for a Node.js Cloud Function.
How it works
Every Google Cloud Function has an extra file, in addition to the main code, that lists all the libraries to be installed. For example, if you use the requests library in your code, there is no way to run pip install requests before the main code executes; instead, you declare the library in this additional file, and during deployment the Cloud Function first reads that file and installs all the libraries mentioned there.
For Node.js code, the file with the libraries is a .json file (package.json). For Python, it is a requirements.txt file. For more information, you can refer to The Python Runtime documentation.
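As an illustration, for the function above the extra file would be a requirements.txt sitting next to main.py, listing at least the one third-party dependency the code imports (a minimal sketch; pin whichever version you have tested):
# requirements.txt, deployed alongside main.py
firebase-admin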

Related

Can't deploy python script to google cloud functions due to issue with flairnlp import

I am trying to deploy a Google Cloud Function that performs sentiment analysis on tweets using a flair nlp model. The code deploys perfectly fine without the line 'import flair' or alternatives like 'from flair import x,y,z'. As soon as I include the import statement for flair the function fails to deploy. Below is the error I get when deploying with the import statement (error is copied from Firebase logs). This is my first time posting on StackOverflow so please pardon me if the post looks ugly.
{"#type":"type.googleapis.com/google.cloud.audit.AuditLog","status":{"code":3,"message":"Function failed on loading user code. This is likely due to a bug in the user code. Error message: Code in file main.py can't be loaded.\nDetailed stack trace:\nTraceback (most recent call last):\n File \"/env/local/lib/python3.7/site-packages/google/cloud/functions/worker_v2.py\", line 359, in check_or_load_user_function\n _function_handler.load_user_function()\n File \"/env/local/lib/python3.7/site-packages/google/cloud/functions/worker_v2.py\", line 236, in load_user_function\n spec.loader.exec_module(main_module)\n File \"<frozen importlib._bootstrap_external>\", line 728, in exec_module\n File \"<frozen importlib._bootstrap>\", line 219, in _call_with_frames_removed\n File \"/user_code/main.py\", line 5, in <module>\n from flair import models, data\n File \"/env/local/lib/python3.7/site-packages/flair/__init__.py\", line 20, in <module>\n from . import models\n File \"/env/local/lib/python3.7/site-packages/flair/models/__init__.py\", line 1, in <module>\n from .sequence_tagger_model import SequenceTagger, MultiTagger\n File \"/env/local/lib/python3.7/site-packages/flair/models/sequence_tagger_model.py\", line 21, in <module>\n from flair.embeddings import TokenEmbeddings, StackedEmbeddings, Embeddings\n File \"/env/local/lib/python3.7/site-packages/flair/embeddings/__init__.py\", line 6, in <module>\n from .token import TokenEmbeddings\n File \"/env/local/lib/python3.7/site-packages/flair/embeddings/token.py\", line 10, in <module>\n from transformers import AutoTokenizer, AutoConfig, AutoModel, CONFIG_MAPPING, PreTrainedTokenizer\nImportError: cannot import name 'AutoModel' from 'transformers' (unknown location)\n. Please visit https://cloud.google.com/functions/docs/troubleshooting for in-depth troubleshooting documentation."},"authenticationInfo":
And here is the script I am trying to deploy, as well as the requirements.txt file
main.py
import firebase_admin
from firebase_admin import credentials
from firebase_admin import firestore
from datetime import datetime, timedelta
import flair
# or from flair import models, data

class FirestoreHandler():
    cred = credentials.Certificate("serviceAccountKey.json")
    firebase_admin.initialize_app(cred)
    db = firestore.client()

    def analysis_on_create(self):
        docs = self.db.collection('tweets').order_by(u'time', direction=firestore.Query.DESCENDING).limit(1).get()
        data = docs[0].to_dict()
        most_recent_tweet = data['full-text']
        sentiment_model = flair.models.TextClassifier.load('en-sentiment')
        sentence = flair.data.Sentence(str(most_recent_tweet))
        sentiment_model.predict(sentence)
        result = sentence.labels[0]
        if result.value == "POSITIVE":
            val = 1 * result.score
        else:
            val = -1 * result.score
        self.db.collection('sentiment').add({'sentiment': val, 'timestamp': datetime.now() + timedelta(hours=3)})

    def add_test(self):
        self.db.collection('test3').add({"status": "success", 'timestamp': datetime.now() + timedelta(hours=3)})

def hello_firestore(event, context):
    """Triggered by a change to a Firestore document.
    Args:
        event (dict): Event payload.
        context (google.cloud.functions.Context): Metadata for the event.
    """
    resource_string = context.resource
    # print out the resource string that triggered the function
    print(f"Function triggered by change to: {resource_string}.")
    # now print out the entire event object
    print(str(event))
    fire = FirestoreHandler()
    fire.add_test()
    fire.analysis_on_create()
requirements.txt
# Function dependencies, for example:
# package>=version
firebase-admin==5.0.1
https://download.pytorch.org/whl/cpu/torch-1.0.1.post2-cp37-cp37m-linux_x86_64.whl
flair
I included the URL of the PyTorch wheel because flair is built on PyTorch, and the function would not deploy without that URL (even when I didn't import flair in main.py). I have also tried specifying different versions of flair, to no avail.
Any intuition as to what may be causing this issue would be greatly appreciated! I am new to the Google Cloud ecosystem, this being my first project. If there is any additional information I can provide, please let me know.
Edit: I am deploying from the website (not using the CLI).
I am not sure that the provided requirements.txt is OK for a GCP Cloud Functions deployment; in particular, I am not sure that an explicit https URL is going to be handled correctly.
The Specifying dependencies in Python documentation page describes how dependencies are to be stated: using the pip package manager's requirements.txt file, or packaging local dependencies alongside your function.
Can you simply list flair, with the necessary version, in the requirements.txt file? Does it work then?
In addition, the error you provided highlights that the transformers package is required. Could it be that some specific version of it is needed?
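For example, the dependencies could be expressed with explicit pins only, and no wheel URL (the flair and transformers version numbers below are illustrative assumptions on my part, not tested pins):
# requirements.txt, all packages pinned; versions are only an example
firebase-admin==5.0.1
flair==0.8
transformers==4.5.1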
====
As a side comment: I don't know your context and requirements, but I am not sure that, in order to work with Firestore from inside a Cloud Function, all of
import firebase_admin
from firebase_admin import credentials
from firebase_admin import firestore
is required. It might be better to avoid using serviceAccountKey.json at all and simply assign the relevant IAM roles to the service account under which the given Cloud Function executes.
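A minimal sketch of that approach, assuming the function's runtime service account has already been granted a suitable Firestore role (which role is up to your setup):
# No key file: firebase_admin falls back to the runtime's default credentials.
import firebase_admin
from firebase_admin import firestore

firebase_admin.initialize_app()  # uses Application Default Credentials
db = firestore.client()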

Getting KeyError: 'Endpoint' error in Python when calling Custom Vision API

from azure.cognitiveservices.vision.customvision.prediction import CustomVisionPredictionClient
from msrest.authentication import CognitiveServicesCredentials
from azure.cognitiveservices.vision.customvision import prediction
from PIL import Image

endpoint = "https://southcentralus.api.cognitive.microsoft.com/"
project_id = "projectidhere"
prediction_key = "predictionkeyhere"

predict = CustomVisionPredictionClient(prediction_key, endpoint)

with open("c:/users/paul.barbin/pycharmprojects/hw3/TallowTest1.jpg", mode="rb") as image_data:
    tallowresult = predict.detect_image(project_id, "test1", image_data)
Python 3.7, with azure-cognitiveservices-vision-customvision 3.1.0.
Note that I've seen the same question on SO, but with no real solution; the posted answer on the other question says to use the REST API instead.
I believe the error is in the endpoint (as stated in the error), and I've tried a few variants: with the trailing slash, without it, using an environment variable, and appending various strings to my endpoint, but I keep getting the same message. Any help is appreciated.
Full error here:
Traceback (most recent call last):
  File "GetError.py", line 15, in <module>
    tallowresult = predict.detect_image(project_id, "test1", image_data)
  File "C:\Users\paul.barbin\PycharmProjects\hw3\.venv\lib\site-packages\azure\cognitiveservices\vision\customvision\prediction\operations\_custom_vision_prediction_client_operations.py", line 354, in detect_image
    request = self._client.post(url, query_parameters, header_parameters, form_content=form_data_content)
  File "C:\Users\paul.barbin\PycharmProjects\hw3\.venv\lib\site-packages\msrest\service_client.py", line 193, in post
    request = self._request('POST', url, params, headers, content, form_content)
  File "C:\Users\paul.barbin\PycharmProjects\hw3\.venv\lib\site-packages\msrest\service_client.py", line 108, in _request
    request = ClientRequest(method, self.format_url(url))
  File "C:\Users\paul.barbin\PycharmProjects\hw3\.venv\lib\site-packages\msrest\service_client.py", line 155, in format_url
    base = self.config.base_url.format(**kwargs).rstrip('/')
KeyError: 'Endpoint'
CustomVisionPredictionClient takes two required positional parameters, endpoint and credentials, and endpoint needs to be passed in before credentials. Try swapping the order:
predict = CustomVisionPredictionClient(endpoint, prediction_key)
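Note also that in the 3.x SDK the credentials parameter is normally a credentials object rather than a bare key string; here is a sketch combining both points (the ApiKeyCredentials usage is my assumption, not part of the answer above):
# Endpoint first, then an ApiKeyCredentials object carrying the prediction key.
from azure.cognitiveservices.vision.customvision.prediction import CustomVisionPredictionClient
from msrest.authentication import ApiKeyCredentials

endpoint = "https://southcentralus.api.cognitive.microsoft.com/"
credentials = ApiKeyCredentials(in_headers={"Prediction-key": "predictionkeyhere"})
predict = CustomVisionPredictionClient(endpoint, credentials)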

Python: Uploading video to youtube using google provided code snippet does not work

I'm trying to upload a video to YouTube using a Python script.
The code given here (upload_video.py) is supposed to work, and I've followed the setup, which includes enabling the YouTube API and getting OAuth secret keys and whatnot. You may notice that the code is in Python 2, so I used 2to3 to make it run with Python 3.7. The issue is that, for some reason, I'm asked to log in when I execute upload_video.py:
This should not be occurring, as that's the whole point of having a client_secrets.json file: you don't need to explicitly log in. Once I exit this in-shell browser, here's what I see:
Here's the first line:
/usr/lib/python3.7/site-packages/oauth2client/_helpers.py:255: UserWarning: Cannot access upload_video.py-oauth2.json: No such file or directory
  warnings.warn(_MISSING_FILE_MESSAGE.format(filename))
Now, I don't understand why upload_video.py-oauth2.json is needed, since in the upload_video.py file the OAuth2 secrets file is set to "client_secrets.json".
Anyway, I created the file upload_video.py-oauth2.json and copied the contents of client_secrets.json into it. I didn't get the weird login then, but I got another error:
Traceback (most recent call last):
  File "upload_video.py", line 177, in <module>
    youtube = get_authenticated_service(args)
  File "upload_video.py", line 80, in get_authenticated_service
    credentials = storage.get()
  File "/usr/lib/python3.7/site-packages/oauth2client/client.py", line 407, in get
    return self.locked_get()
  File "/usr/lib/python3.7/site-packages/oauth2client/file.py", line 54, in locked_get
    credentials = client.Credentials.new_from_json(content)
  File "/usr/lib/python3.7/site-packages/oauth2client/client.py", line 302, in new_from_json
    module_name = data['_module']
KeyError: '_module'
So basically now I've hit a dead end. Any ideas about what to do now?
See the code of the function get_authenticated_service in upload_video.py: you should not create the file upload_video.py-oauth2.json by yourself! This file is created upon completion of the OAuth2 flow, via the call to run_flow within get_authenticated_service. It holds the stored credentials, whose JSON format differs from that of client_secrets.json, which is why copying one into the other ends in KeyError: '_module'.
Also, you may read the doc OAuth 2.0 for Mobile & Desktop Apps for thorough info about the authorization flow on standalone computers.
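For reference, a simplified sketch of the oauth2client pattern that get_authenticated_service follows (a reconstruction under my assumptions, not the sample's exact code):
# Storage() points at the token cache that run_flow() creates automatically
# after the first successful authorization; it is never written by hand.
from oauth2client.client import flow_from_clientsecrets
from oauth2client.file import Storage
from oauth2client.tools import argparser, run_flow

flow = flow_from_clientsecrets("client_secrets.json",
                               scope="https://www.googleapis.com/auth/youtube.upload")
storage = Storage("upload_video.py-oauth2.json")
credentials = storage.get()
if credentials is None or credentials.invalid:
    credentials = run_flow(flow, storage, argparser.parse_args([]))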

Using google_rest API

I am trying to use the Google Drive API to download publicly available files; however, whenever I try to proceed, I get an error.
For reference, I have successfully set up OAuth2, so I have a client id, a client secret, and a redirect URL, but when I try the exchange I get an error saying the object has no attribute urlencode:
>>> from apiclient.discovery import build
>>> from oauth2client.client import OAuth2WebServerFlow
>>> flow = OAuth2WebServerFlow(client_id='not_showing_client_id', client_secret='not_showing_secret_id', scope='https://www.googleapis.com/auth/drive', redirect_uri='https://www.example.com/oauth2callback')
>>> auth_uri = flow.step1_get_authorize_url()
>>> code = '4/E4h7XYQXXbVNMfOqA5QzF-7gGMagHSWm__KIH6GSSU4#'
>>> credentials = flow.step2_exchange(code)
And then I get the error:
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/Library/Python/2.7/site-packages/oauth2client/util.py", line 137, in positional_wrapper
    return wrapped(*args, **kwargs)
  File "/Library/Python/2.7/site-packages/oauth2client/client.py", line 1980, in step2_exchange
    body = urllib.parse.urlencode(post_data)
AttributeError: 'Module_six_moves_urllib_parse' object has no attribute 'urlencode'
Any help would be appreciated. Also, would someone mind enlightening me as to how to instantiate a drive_file? According to https://developers.google.com/drive/web/manage-downloads, I need to instantiate one, and I am unsure of how to do so.
Edit: So I figured out why I was getting the error. If anyone else is having the same problem, try running:
sudo pip install -I google-api-python-client==1.3.2
However, I am still unclear about the drive instance, so any help with that would be appreciated.
Edit 2: Okay, so I figured out the answer to my whole question. The drive instance is just the metadata that results when we use the API to search for a file by its id.
So, as I said in my edits, try the sudo pip install, and a file instance is just a dictionary of metadata.
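To make that concrete, here is a minimal sketch of fetching such a metadata dictionary with the Drive v2 API of that era (the file ID is a placeholder, and credentials come from the OAuth2 flow shown earlier):
# A "drive_file" is just the metadata dict returned by files().get().
import httplib2
from apiclient.discovery import build

http = credentials.authorize(httplib2.Http())
drive_service = build('drive', 'v2', http=http)

drive_file = drive_service.files().get(fileId='FILE_ID_HERE').execute()  # placeholder ID
download_url = drive_file.get('downloadUrl')  # v2 metadata field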

Google Cloud Storage using Python

I set up the required environment for Google Cloud Storage according to the manual.
I have installed gsutil and set up all paths.
My gsutil works perfectly; however, when I try to run the code below,
#!/usr/bin/python
import StringIO
import os
import shutil
import tempfile
import time
from oauth2_plugin import oauth2_plugin
import boto

# URI scheme for Google Cloud Storage.
GOOGLE_STORAGE = 'gs'
# URI scheme for accessing local files.
LOCAL_FILE = 'file'

uri = boto.storage_uri('sangin2', GOOGLE_STORAGE)
try:
    uri.create_bucket()
    print "done!"
except boto.exception.StorageCreateError, e:
    print "failed"
It gives "403 Access denied" error.
Traceback (most recent call last):
  File "/Volumes/WingIDE-101-4.0.0/WingIDE.app/Contents/MacOS/src/debug/tserver/_sandbox.py", line 23, in <module>
  File "/Users/lsangin/gsutil/boto/boto/storage_uri.py", line 349, in create_bucket
    return conn.create_bucket(self.bucket_name, headers, location, policy)
  File "/Users/lsangin/gsutil/boto/boto/gs/connection.py", line 91, in create_bucket
    response.status, response.reason, body)
boto.exception.GSResponseError: GSResponseError: 403 Forbidden
<?xml version='1.0' encoding='UTF-8'?><Error><Code>AccessDenied</Code><Message>Access denied.</Message></Error>
Since I am new to this, it is kinda hard for me to figure out why.
Can someone help me?
Thank you.
The boto library should automatically find and use your $HOME/.boto file. One thing to check: make sure the project you're using is set as your default project for legacy access (at the API console, click on "Storage Access" and verify that it says "This is your default project for legacy access"). When I have that set incorrectly and I follow the create-bucket example you referenced, I also get a 403 error; however, it doesn't make sense that this would work for you in gsutil but not with direct use of boto.
Try adding debug=2 when you instantiate the storage_uri object, like this:
uri = boto.storage_uri(name, GOOGLE_STORAGE, debug=2)
That will generate additional debugging information on stdout, which you can then compare with the debug output from an analogous, working gsutil example (via gsutil -D mb).
