How to read the HTTP response from the Azure Python SDK

I'm trying to push a file (a Put Blob request) to the Azure CDN Blob storage using the Python SDK. The upload itself works fine; I just can't figure out how to read the header information in the response. According to the docs, it's supposed to send back a 201 status if it is successful.
http://msdn.microsoft.com/en-us/library/azure/dd179451.aspx
http://azure.microsoft.com/en-us/documentation/articles/storage-python-how-to-use-blob-storage/
from azure.storage import BlobService

blob_service = BlobService(account_name='accountnamehere', account_key='apikeyhere')

# Open the image in binary mode; text mode can corrupt binary data.
file_contents = open('path/to/image.jpg', 'rb').read()

blob_service.put_blob(CONTAINER, 'filename.jpg', file_contents,
                      x_ms_blob_type='BlockBlob',
                      x_ms_blob_content_type='image/jpeg')
Any help is greatly appreciated.
Thanks

You can't read the response code.
The source code for the SDK is available on GitHub, and in the current version the put_blob() function does not return anything.
Do you need to read it, though? If put_blob completes successfully, your code simply continues with the next statement. If it fails, the SDK raises an exception which you can then catch.
You can verify your exception/error handling by using a wrong access key, for example.
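A minimal sketch of that pattern, assuming the classic azure SDK where failures surface as azure.WindowsAzureError (newer releases move this to azure.common.AzureException):

from azure import WindowsAzureError
from azure.storage import BlobService

blob_service = BlobService(account_name='accountnamehere', account_key='apikeyhere')

try:
    with open('path/to/image.jpg', 'rb') as f:
        blob_service.put_blob(CONTAINER, 'filename.jpg', f.read(),
                              x_ms_blob_type='BlockBlob',
                              x_ms_blob_content_type='image/jpeg')
    # No exception means the service answered with 201 Created.
except WindowsAzureError as e:
    # Non-success statuses (e.g. 403 for a bad key) raise instead of returning.
    print('Upload failed:', e)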

Related

Google Cloud Function succeeds, but not showing expected output

I am testing out a Cloud Function and I have things set up, but the output is not populating correctly: nothing is saved into Cloud Storage and my print statements do not appear. My code and requirements are below. I set up the Cloud Function with an HTTP request trigger, unauthenticated invocations, and a runtime service account that has write access to Cloud Storage. I have verified that I am calling the correct entry point.
logs
2022-03-22T18:52:02.749482564Z test-example vczj9p85h5m2 Function execution started
2022-03-22T18:52:04.148507183Z test-example vczj9p85h5m2 Function execution took 1399 ms. Finished with status code: 200
main.py
import requests
from google.cloud import storage
import json
def upload_to_gsc(data):
    print("saving to cloud storage")
    client = storage.Client(project="my-project-id")
    bucket = client.bucket("my-bucket-name")
    blob = bucket.blob("subfolder/name_of_file")
    blob.upload_from_string(data)
    print("data uploaded to cloud storage")

def get_pokemon(request):
    url = "https://pokeapi.co/api/v2/pokemon?limit=100&offset=200"
    data = requests.get(url).json()
    output = [i.get("name") for i in data["results"]]
    data = json.dumps(output)
    upload_to_gsc(data=data)
    print("saved data!")
requirements.txt
google-cloud-storage
requests==2.26.0
As @JackWotherspoon mentioned, be sure to double-check your project ID, bucket name, and entry point if you hit a case like this. In my case, I recreated the Cloud Function, tested it, and it worked again.
As @dko512 mentioned in the comments, the issue was resolved by recreating and redeploying the Cloud Function.
Posting the answer as community wiki for the benefit of anyone who encounters this case in the future.
Feel free to edit this answer with additional information.
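One extra thing worth checking in this situation: the Python runtime expects an HTTP-triggered function to return a value Flask can turn into a response, so returning explicitly (rather than falling off the end of the function) makes success and failure easier to see in the logs. A sketch reusing the names from the question, including its upload_to_gsc helper:

import json
import requests

def get_pokemon(request):
    url = "https://pokeapi.co/api/v2/pokemon?limit=100&offset=200"
    data = requests.get(url).json()
    output = [i.get("name") for i in data["results"]]
    upload_to_gsc(data=json.dumps(output))  # upload helper from the question
    # An explicit (body, status) return gives Flask a valid response to log.
    return "saved data!", 200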

Calling External API with Requests keeps returning Error: could not handle the request

I am trying to write a Google Cloud Function in Python which will just get some data from an API and return it:
from flask import escape
import requests
import json
def randomFact(request):
    URL = "https://uselessfacts.jsph.pl/random.json?language=en"
    r = requests.get(URL).json()
    return r
This is the requirements.txt:
# Function dependencies, for example:
# package>=version
functions-framework==3.0.0
flask==2.0.2
flask-restful==0.3.9
requests==2.11.0
However, every time I run it I just get a response saying: Error: could not handle the request.
I have enabled billing for the project, so that is not the issue. It works perfectly when I test it locally, but not when I deploy it as a Google Cloud Function.
Please help if you know any way to solve this issue, or just how to call an external API from a Google Cloud Function; I have been trying for 3 days.
I was able to reproduce your issue. If you look at the logs inside Cloud Functions, it says there's an SSL certificate error.
I changed your requirements.txt to not pin a version number on requests, and it worked:
functions-framework==3.0.0
requests
flask==2.0.2
flask-restful==0.3.9
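If you would rather keep the dependency pinned, the likely culprit is that requests==2.11.0 is old enough to ship outdated SSL/certificate handling, so pinning any recent release should also work. Separately, returning the payload through flask.jsonify keeps the response explicitly well-formed; a sketch of the same function with nothing else changed:

from flask import jsonify
import requests

def randomFact(request):
    url = "https://uselessfacts.jsph.pl/random.json?language=en"
    r = requests.get(url).json()
    # jsonify wraps the dict in a proper application/json response.
    return jsonify(r)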

Connect Cloud Storage to Cloud Function

I'm having issues executing a Cloud Function on GCP which tries to update some Google Sheets of mine. I got this script working in Jupyter but have struggled to deploy it as a Cloud Function. My issue seems to be authorizing the Cloud Function to post to Google Sheets.
I've tried many things over 6+ hours: most of the related questions on Stack Overflow, Medium articles, GitHub, but haven't found a working solution. I don't think it's a roles/permissions issue. I understand how some of these approaches work outside Cloud Functions, but not inside one.
Ultimately, from what I've seen, the recommended way seems to be hosting my JSON secret key inside a storage bucket and calling that. I've tried this without success, and it seems somewhat convoluted given that everything is a Google service.
I've gone back to my original code, so I'm back to the first error: my JSON key cannot be found. When I was running the script in Jupyter, the key was in the same directory, which is why I created a Google Storage bucket to try to link to.
import pandas as pd
import gspread
from oauth2client.service_account import ServiceAccountCredentials
import google.cloud
from df2gspread import df2gspread as d2g
from df2gspread import gspread2df as g2d
import datetime
import time
import numpy as np
def myGet(event, context):
    scope = ['https://spreadsheets.google.com/feeds', 'https://www.googleapis.com/auth/drive']
    credentials = ServiceAccountCredentials.from_json_keyfile_name('my-key-name.json', scope)
    gc = gspread.authorize(credentials)
    spreadsheet_key = '--removed actual key/id--'
ERROR:
  File "/env/local/lib/python3.7/site-packages/oauth2client/service_account.py", line 219, in from_json_keyfile_name
    with open(filename, 'r') as file_obj:
FileNotFoundError: [Errno 2] No such file or directory: 'my-key-name.json'
Thanks very much for any guidance and support on this. I have thoroughly looked and tried to solve this on my own. EDIT: Please keep in mind, this is not a .py file living in a directory; that's part of my issue. I don't know where to link to, as it's an isolated Cloud Function as far as I can tell.
Some links I've looked at in my 20+ attempts to fix this issue just to name a few:
How can I get access to Google Cloud Storage using an access and a secret key
Accessing google cloud storage bucket from cloud functions throws 500 error
https://cloud.google.com/docs/authentication/getting-started#auth-cloud-implicit-python
https://cloud.google.com/docs/authentication/getting-started#setting_the_environment_variable
UPDATE:
I realized you can upload a zip of your files so that all three files show in the inline editor. At the beginning I was not doing this, so I could not figure out where to put the JSON key. Now I have it viewable and need to figure out how to call it in the method.
When I do a test run of the Cloud Function, I get a non-descript error which doesn't show up in the logs, and I can't test it from Cloud Scheduler like I could previously. I found the question below on Stack Overflow and feel like I need the same approach but for Python, and need to figure out what calls to make from the Google docs.
Cloud Functions: how to upload additional file for use in code?
My advice is to not use a JSON key file in your Cloud Functions (or in any GCP product). With Cloud Functions, as with other GCP products, you have the capability to load a service account automatically during deployment.
The advantage of Cloud Function identity is that you have no key file to store secretly, no key file to rotate for security, and no risk of a leaked key file.
So, use the default service account in your code.
If you need the credential object, you can use the google-auth Python library for this:
import google.auth
credentials, project_id = google.auth.default()
You'll need to specify a relative filename instead, e.g. ./my-key-name.json, assuming the file is in the same directory as your main.py file.
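In other words, a sketch built from the question's own snippet, assuming the key file was deployed alongside main.py:

from oauth2client.service_account import ServiceAccountCredentials

scope = ['https://spreadsheets.google.com/feeds', 'https://www.googleapis.com/auth/drive']
# A file deployed with the function sits next to main.py, so a relative path finds it.
credentials = ServiceAccountCredentials.from_json_keyfile_name('./my-key-name.json', scope)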
I had the same problem and solved it like this:
import google.auth
credentials, _ = google.auth.default()
gc = gspread.authorize(credentials)
That should work for you.
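If the default credentials are rejected for missing scopes, note that google.auth.default() accepts a scopes argument; a sketch requesting the Sheets and Drive scopes explicitly:

import google.auth
import gspread

# Request the scopes gspread needs; the default credentials may not carry them otherwise.
credentials, _ = google.auth.default(
    scopes=['https://www.googleapis.com/auth/spreadsheets',
            'https://www.googleapis.com/auth/drive'])
gc = gspread.authorize(credentials)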

How to debug persistent 'SpreadsheetNotFound' errors using python gspread

After 2 days of reading the gspread docs, gspread blog posts, and following the most recent docs for using gspread, I'm still not able to open even one Google Spreadsheet. I set up a Google Drive API service account. My OAuth2 credentials appear to be working, but I'm still getting constant SpreadsheetNotFound errors.
My code looks something like this, and runs with no error message until it tries to open the spreadsheet:
import gspread
from oauth2client.service_account import ServiceAccountCredentials
scope = ['https://spreadsheets.google.com/feeds']
jsonfile = '/path/Python-data-GSpreadsheets-dbbc-99.json'
credentials = ServiceAccountCredentials.from_json_keyfile_name(jsonfile,scope)
# get authorized to access data
gc = gspread.authorize(credentials)
# gc returns with no error messages, so far OK
wks = gc.open('SSTEst123').sheet1 #fails here
#throws SpreadsheetNotFound error
Yes, I've added my Gmail address as authorized to edit (tried it with and without), since the service account should be able to access all my data anyway; I'm effectively re-sharing my spreadsheet with myself.
What else can I try to discover why it is failing?
Any ideas on how to debug why it can't open any spreadsheet?
OK, here's how I solved it.
I had to share my spreadsheets with the Google Drive API service account's email address (e.g. my.test.data@python-xyz-gspreadsheets.iam.gserviceaccount.com). I had thought it was my 'real' Gmail ID/email, but it wasn't.
For debugging the feed:
I added a few print() lines in the gspread client.py source to see if I was getting anything in the feed, and to check the values coming back. I noticed I was getting a stale value (from a previous script?).
I shut down and restarted my IPython notebook, and it's working now.
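A quick way to see exactly which address the sheet must be shared with is to read the client_email field out of the key file; a sketch using the filename from the question:

import json

# Service-account key files carry the account's address in 'client_email'.
with open('/path/Python-data-GSpreadsheets-dbbc-99.json') as f:
    print(json.load(f)['client_email'])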

Can I read from the AppEngine BlobStore using the remote api

I am trying to read (and subsequently save) blobs from the blobstore using the remote API. I get the error "No api proxy found for service "blobstore"" when I execute the read.
Here is the stub code:
from google.appengine.ext import blobstore as bs

for b in bs.BlobInfo.all().fetch(100):
    blob_reader = bs.BlobReader(str(b.key()))  # BlobInfo.key() is a method
    file = blob_reader.read()
The error occurs on the line: file = blob_reader.read()
I am reading the file from my personal appspot via terminal with:
python tools/remote_api/blobstore_download.py --servername=myinstance.appspot.com --appid=myinstance
So, is reading from the blobstore possible via the remote API? Or is my code bad? Any suggestions?
We recently added blobstore support to remote_api. Make sure you're using the latest version of the SDK, and your error should go away.
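For completeness, remote_api calls only work once the stub has been configured in the process. A minimal sketch of the classic SDK setup, with the servername and app ID taken from the question (the exact ConfigureRemoteApi arguments depend on your SDK version):

from google.appengine.ext.remote_api import remote_api_stub
from google.appengine.ext import blobstore as bs

def auth_func():
    # The classic remote_api stub asks for (email, password) style credentials.
    return ('user@example.com', 'password')

# Route all API calls, including blobstore, through the remote endpoint.
remote_api_stub.ConfigureRemoteApi(
    'myinstance', '/_ah/remote_api', auth_func,
    servername='myinstance.appspot.com')

for b in bs.BlobInfo.all().fetch(100):
    data = bs.BlobReader(str(b.key())).read()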
