AuthorizationPermissionMismatch when Python Function App attempts to read file

When I try to read a txt-file from blob storage using a Function App, it returns this error in the log:
Result: Failure Exception: HttpResponseError: This request is not authorized to perform this operation using this permission. RequestId:00000-0000-00000-00000-000000000000 Time:2021-07-28T13:14:46.7803762Z ErrorCode:AuthorizationPermissionMismatch Error:None
In the Access Control menu of the storage account, the role 'Storage Blob Data Contributor' has been given to the system-assigned-identity of the Function App.
This is my code:
import logging
import azure.functions as func
from azure.storage.blob import BlobServiceClient, BlobClient
from azure.identity import DefaultAzureCredential

def main(req: func.HttpRequest) -> func.HttpResponse:
    blob_url = "https://my-storage-account.blob.core.windows.net"
    blob_credential = DefaultAzureCredential()
    blob_client = BlobClient(account_url=blob_url, container_name='tests', blob_name='file.txt', credential=blob_credential)
    download_stream = blob_client.download_blob()
    logging.info('Contents of the download_stream: %s', download_stream)
    return func.HttpResponse("OK", status_code=200)
Why do I get the error instead of the contents of the 'file.txt'?

The system-assigned identity also needs the role 'Storage Queue Data Contributor'. And to show the contents of the file in the logging, 'download_stream' should be replaced by 'download_stream.readall()'.
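For reference, a minimal sketch of the corrected function, assuming the same account URL, container, and blob names as in the question:

import logging
import azure.functions as func
from azure.storage.blob import BlobClient
from azure.identity import DefaultAzureCredential

def main(req: func.HttpRequest) -> func.HttpResponse:
    blob_client = BlobClient(
        account_url="https://my-storage-account.blob.core.windows.net",  # account URL from the question
        container_name='tests',
        blob_name='file.txt',
        credential=DefaultAzureCredential())
    # readall() drains the StorageStreamDownloader and returns the blob's bytes
    contents = blob_client.download_blob().readall()
    logging.info('Contents of file.txt: %s', contents)
    return func.HttpResponse("OK", status_code=200)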


Webhook verification fails with Shopify triggering an Azure function

I'm trying to create a webhook consumer:
- Shopify posts the data
- an Azure Function written in Python is triggered by HTTP
- the sample code below just logs the result of the webhook verification
There's an example by Shopify, but as I'm not using Flask (like in this SO question), I modified the code a bit:
import azure.functions as func
import logging
import hmac
import hashlib
import base64
import json

app = func.FunctionApp()

@app.function_name(name="webhook-trigger")
@app.route(route="webhook", auth_level=func.AuthLevel.ANONYMOUS)
def webhook_process(req: func.HttpRequest) -> func.HttpResponse:
    logging.info('Python HTTP trigger function is processing a request.')
    verified = verify(req)
    logging.info("Webhook verified: {}".format(verified))
    return func.HttpResponse(
        "This HTTP triggered function executed successfully.",
        status_code=200
    )

def verify(req: func.HttpRequest) -> bool:
    CLIENT_SECRET = 'xxxxx'
    header = req.headers
    data = req.get_json()
    encoded_data = json.dumps(data).encode('utf-8')
    digest = hmac.new(CLIENT_SECRET.encode('utf-8'), encoded_data, digestmod=hashlib.sha256).digest()
    computed_hmac = base64.b64encode(digest)
    logging.info("computed_hmac: {}".format(computed_hmac))
    logging.info("header['X-Shopify-Hmac-Sha256'] encoded: {}".format(header['X-Shopify-Hmac-Sha256'].encode('utf-8')))
    verified = hmac.compare_digest(computed_hmac, header['X-Shopify-Hmac-Sha256'].encode('utf-8'))
    return verified
The logging output (screenshot omitted) shows the verification failing.
I don't understand why the verification fails (possibly due to some encoding mistake on my part). Any ideas?
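One likely culprit, not confirmed in this thread: Shopify computes the HMAC over the raw request body, and re-serializing the parsed JSON with json.dumps can produce different bytes (key order, separators, whitespace). A sketch of verify() that hashes the raw body instead, keeping the CLIENT_SECRET placeholder from the question:

import hmac
import hashlib
import base64
import azure.functions as func

def verify(req: func.HttpRequest) -> bool:
    CLIENT_SECRET = 'xxxxx'  # placeholder, as in the question
    # Hash the exact bytes Shopify sent, not a re-serialized copy of them
    digest = hmac.new(CLIENT_SECRET.encode('utf-8'), req.get_body(), digestmod=hashlib.sha256).digest()
    computed_hmac = base64.b64encode(digest)
    received_hmac = req.headers['X-Shopify-Hmac-Sha256'].encode('utf-8')
    return hmac.compare_digest(computed_hmac, received_hmac)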

Can Azure Function Bindings take a folder instead of a file as path

Currently I need to read data from different files in a folder and return the result. There are quite a large number of files in the folder. So is it possible to use the folder as the path in Azure Function bindings?
Below is a workaround that is possible using an HTTP trigger.
import logging
import azure.functions as func
from azure.storage.blob import BlockBlobService  # legacy azure-storage-blob (pre-v12) SDK

ACCOUNT_NAME = "<YOUR ACCOUNT NAME>"
SAS_TOKEN = '<YOUR SAS TOKEN>'

blob_service = BlockBlobService(account_name=ACCOUNT_NAME, account_key=None, sas_token=SAS_TOKEN)

def main(req: func.HttpRequest) -> func.HttpResponse:
    logging.info('Python HTTP trigger function processed a request.')
    blobnames = ""
    name = req.params.get('name')
    if not name:
        try:
            req_body = req.get_json()
        except ValueError:
            return func.HttpResponse("Enter the container you are searching for")
        else:
            name = req_body.get('name')
    print("\nList blobs in the container")
    generator = blob_service.list_blobs(container_name=name)  # lists the blobs inside the container
    for blob in generator:
        blobnames += blob.name + "\n"
    if name:
        return func.HttpResponse(f"{blobnames}")
    else:
        return func.HttpResponse(
            "There is no container present with this name",
            status_code=200
        )
RESULT: (screenshot omitted)
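On the original question: a blob input binding points at a single blob path, so enumerating a "folder" has to go through the SDK as above. For reference, a sketch of the same listing with the current v12 azure-storage-blob package, assuming a connection string; the container and folder names here are hypothetical:

import azure.functions as func
from azure.storage.blob import ContainerClient

def main(req: func.HttpRequest) -> func.HttpResponse:
    container = ContainerClient.from_connection_string(
        conn_str="<YOUR CONNECTION STRING>",
        container_name="tests")  # hypothetical container name
    # name_starts_with filters on the virtual folder prefix
    names = [b.name for b in container.list_blobs(name_starts_with="myfolder/")]
    return func.HttpResponse("\n".join(names), status_code=200)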

How to create a blob in an Azure Storage Container using Python & Azure Functions

I am having a lot of difficulty writing an API response as JSON to a blob within an Azure Storage container. I have tried multiple solutions online but have not managed to see any through to success. I would like to share two attempts I have made, and hopefully someone out there can assist me in getting at least one of them working.
Attempt/Method 1
I have tried to use a service principal to authenticate my BlobServiceClient from azure-storage-blob. My service principal has been assigned the role of Storage Blob Data Contributor on the container within which I am trying to create the blob. However, on execution of the script I receive an error along the lines of "Unsupported credential".
My script and the resulting error are:
import azure.functions as func
import requests
import json
import uuid
from azure.storage.blob import BlobServiceClient, BlobClient, ContainerClient
from msrestazure.azure_active_directory import ServicePrincipalCredentials
from azure.storage.common import TokenCredential

# Initialise parameters to obtain data from the REST API
url = "https://api.powerbi.com/v1.0/myorg/admin/groups?$top=1000&$expand=datasets,dataflows,reports,users,dashboards"
headers = {'Authorization': get_access_token()}  # get_access_token() is defined elsewhere

# Get response. I want to save the response output to a blob.
response = requests.get(url, headers=headers)
response = response.json()

# Initialise parameters for credentials
CLIENT = "bxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxx7"  # Azure App/Service Principal ID
KEY = "Gxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx1"  # Azure App/Service Principal Key
TENANT_ID = "cxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxx7"  # Tenant where the Storage Account is, which is different to the tenant the app resides in
RESOURCE = "https://storageaccountxxxxxxxxx.blob.core.windows.net"

# Create credentials & token
credentials = ServicePrincipalCredentials(
    client_id=CLIENT,
    secret=KEY,
    # tenant=TENANT_ID,
    resource=RESOURCE
)
tokenCre = TokenCredential(credentials.token["access_token"])

# Initialise parameters for BlobServiceClient
ACCOUNT_URL = "https://storageaccountxxxxxxxxx.blob.core.windows.net/pbiactivity"  # includes container name at end of URL
CONTAINER_NAME = "pbiactivity"  # inferred from ACCOUNT_URL; not defined in the original snippet

# Create BlobServiceClient
blobService = BlobServiceClient(account_url=ACCOUNT_URL, token_credential=tokenCre)

# Create BlobClient
blobClient = BlobClient(account_url=RESOURCE, container_name=CONTAINER_NAME, blob_name="response.json", credential=tokenCre)

# Upload response JSON as blob
blobClient.upload_blob(response, blob_type="BlockBlob")
(Screenshot of the error after the upload_blob call omitted; the message is the "Unsupported credential" error mentioned above.)
Attempt/Method 2
In my second attempt I tried to create my BlobServiceClient from azure-storage-blob using my storage account connection string. This method actually allows me to create containers; however, when I try to upload a blob as in the script below, I get a 403 Forbidden response.
My script and resulting error are:
import requests
import json
import uuid
from azure.storage.blob import BlobServiceClient, BlobClient, ContainerClient

# Initialise parameters to obtain data from the REST API
url = "https://api.powerbi.com/v1.0/myorg/admin/groups?$top=1000&$expand=datasets,dataflows,reports,users,dashboards"
headers = {'Authorization': get_access_token()}  # get_access_token() is defined elsewhere

# Get response. I want to save the response output to a blob.
response = requests.get(url, headers=headers)
response = response.json()

# Initialise parameters
CONNECTION_STRING = "DefaultEndpointsProtocol=https;AccountName=storageaccountxxxxxxxxx;AccountKey=rxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxQ==;EndpointSuffix=core.windows.net"

# Create BlobServiceClient from connection string
blobServiceClient = BlobServiceClient.from_connection_string(conn_str=CONNECTION_STRING)

# Create BlobClient
blobClient = blobServiceClient.get_blob_client(container="pbiactivity", blob="response.json")

# Upload response JSON to blob
blobClient.upload_blob(response, blob_type="BlockBlob")
(Screenshot of the 403 Forbidden error after the upload_blob call omitted.)
Here is a workaround that worked for me:
import os
import json
import logging
import requests
from azure.storage.blob import BlobServiceClient, BlobClient

# Initialise parameters
url = "<YourURL>"
headers = {'Authorization': get_access_token()}  # get_access_token() is defined elsewhere

# Get response
response = requests.get(url, headers=headers)
response = response.json()

connectionString = "<Your_Connection_String>"
containerName = "<Name_of_your_container>"

blobServiceClient = BlobServiceClient.from_connection_string(connectionString)
blobContainerClient = blobServiceClient.get_container_client(containerName)

# To create the container (if the container has already been created you can ignore this)
# blobContainerClient.create_container()

# Create BlobClient
blobClient = blobServiceClient.get_blob_client(container=containerName, blob="response.json")

# Serialize the JSON response and upload it as the blob's contents
blobClient.upload_blob(data=json.dumps(response))
In my Storage Account: (screenshot omitted)
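On Attempt 1: the "Unsupported credential" error is consistent with passing the legacy msrestazure/azure-storage-common credential types to the v12 azure-storage-blob client, which expects an azure-identity credential. A sketch of the service-principal route with azure-identity, reusing the placeholder IDs from the question:

import json
from azure.identity import ClientSecretCredential
from azure.storage.blob import BlobClient

# Placeholders from the question
CLIENT = "bxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxx7"
KEY = "Gxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx1"
TENANT_ID = "cxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxx7"

response = {"example": "payload"}  # stands in for the parsed API response from the question

credential = ClientSecretCredential(tenant_id=TENANT_ID, client_id=CLIENT, client_secret=KEY)

blobClient = BlobClient(
    account_url="https://storageaccountxxxxxxxxx.blob.core.windows.net",  # account URL only, no container suffix
    container_name="pbiactivity",
    blob_name="response.json",
    credential=credential)

blobClient.upload_blob(json.dumps(response))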

Generate shared access signature through Python

I'm trying to generate a shared access signature link through Python for my files, which are already in blob storage, but something goes wrong. I receive this message when I open the generated link in a web browser:
"Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature."
I'm generating the key by right-clicking the container and choosing 'Get Shared Access Signature', but I can't get any further.
from datetime import datetime
from datetime import timedelta
#from azure.storage.blob import BlobService
from azure.storage.blob import generate_blob_sas, AccountSasPermissions, AccessPolicy

def generate_link():
    account_name = 'my_account_name_storage'
    container_name = 'container_name'
    blob_name = 'file_name.xsl'
    account_key = '?sv=2019-12-12&ss=bfqt&srt=sco&sp=rwdlacupx&se=2020-09-17T05:49:57Z&st=2020-09-16T21:49:57Z&spr=https&sig=sdfsdhgbjgnbdkfnglfkdnhklfgnhklgf%30'
    url = f"https://{account_name}.blob.core.windows.net/{container_name}/{blob_name}"
    sas_token = generate_blob_sas(
        account_name=account_name,
        account_key=account_key,
        container_name=container_name,
        blob_name=blob_name,
        permission=AccountSasPermissions(read=True),
        expiry=datetime.utcnow() + timedelta(hours=1)
    )
    print(sas_token)
    url_with_sas = f"{url}?{sas_token}"
    print(url_with_sas)

generate_link()
The account_key in your code is wrong: the value you pasted is a SAS token (it starts with '?sv='), not an account key.
To find the account_key of your storage account, navigate to the Azure portal -> your storage account -> Settings -> Access keys, where you can see the account_key. (Screenshot omitted.)
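A sketch of the corrected call, assuming a real account key and the blob-level BlobSasPermissions class in place of AccountSasPermissions:

from datetime import datetime, timedelta
from azure.storage.blob import generate_blob_sas, BlobSasPermissions

account_name = 'my_account_name_storage'
container_name = 'container_name'
blob_name = 'file_name.xsl'
account_key = '<key from Access keys; a base64 string, not a "?sv=..." SAS token>'

sas_token = generate_blob_sas(
    account_name=account_name,
    account_key=account_key,
    container_name=container_name,
    blob_name=blob_name,
    permission=BlobSasPermissions(read=True),
    expiry=datetime.utcnow() + timedelta(hours=1)
)
url_with_sas = f"https://{account_name}.blob.core.windows.net/{container_name}/{blob_name}?{sas_token}"
print(url_with_sas)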

Redirect URI error while uploading file to Google Drive

I am trying to upload a file to my Google Drive via my web application.
I created the client ID for my web application as follows:
Client ID: 916885716524-1qvrrridktedn50pasooe1ndepe1oefp.apps.googleusercontent.com
Email address: 916885716524-1qvrrridktedn50pasooe1ndepe1oefp@developer.gserviceaccount.com
Client secret: 6an3xatjgt7sU4Y5v61er7hd
Redirect URIs: http://localhost:9000/
JavaScript origins: http://localhost:9000/
I downloaded the JSON file and saved it.
Now, whenever a user tries to upload from the web app, it goes to the authentication window. When I select the account, it says:
Error: redirect_uri_mismatch
The redirect URI in the request: http://localhost:8080/ did not match a registered redirect URI
Request Details
from_login=1
scope=https://www.googleapis.com/auth/drive.readonly https://www.googleapis.com/auth/drive.apps.readonly https://www.googleapis.com/auth/drive https://www.googleapis.com/auth/drive.metadata.readonly https://www.googleapis.com/auth/drive.file
response_type=code
access_type=offline
redirect_uri=http://localhost:8080/
as=36ff9556bb7c2164
display=page
pli=1
client_id=916885716524-1qvrrridktedn50pasooe1ndepe1oefp.apps.googleusercontent.com
authuser=0
hl=en
As you can see, I have not mentioned 8080 in my redirect URI, but it is still trying to redirect to that URI.
My code is as follows:
In my handler:
class Upload(tornado.web.RequestHandler):
    def post(self, *args, **kwargs):
        # some logic here by which I am getting the file path,
        # then calling the following function from another file
        file_path = "/home/user/filename.txt"
        upload_to_drive(file_path)
        self.finish(json.dumps({"status": "success"}))
The other file, where I write the logic for uploading to Google Drive, is:
# A helpful link: https://developers.google.com/drive/quickstart-python#step_1_enable_the_drive_api
import os
import sys
import socket
import logging
import httplib2
from mimetypes import guess_type
from apiclient.discovery import build
from apiclient.http import MediaFileUpload
from oauth2client.client import OAuth2WebServerFlow
from oauth2client.file import Storage
import apiclient
from oauth2client.client import flow_from_clientsecrets
from oauth2client.tools import run

# Log only oauth2client errors
logging.basicConfig(level="ERROR")

token_file = os.path.join(os.path.dirname(__file__), 'sample.dat')
CLIENT_SECRETS = os.path.join(os.path.dirname(__file__), 'client_secrets.json')

# Helpful message to display if the CLIENT_SECRETS file is missing.
MISSING_CLIENT_SECRETS_MESSAGE = """
WARNING: Please configure OAuth 2.0

To make this sample run you will need to download the client_secrets.json file
and save it at:

%s
""" % os.path.join(os.path.dirname(__file__), CLIENT_SECRETS)

FLOW = flow_from_clientsecrets(CLIENT_SECRETS,
                               scope=[
                                   'https://www.googleapis.com/auth/drive',
                                   'https://www.googleapis.com/auth/drive.apps.readonly',
                                   'https://www.googleapis.com/auth/drive.file',
                                   'https://www.googleapis.com/auth/drive.readonly',
                                   'https://www.googleapis.com/auth/drive.metadata.readonly',
                               ],
                               message=MISSING_CLIENT_SECRETS_MESSAGE)

def authorize(token_file, storage):
    if storage is None:
        storage = Storage(token_file)
    credentials = storage.get()
    if credentials is None or credentials.invalid:
        credentials = run(FLOW, storage)
    # Create an httplib2.Http object and authorize it with credentials
    http = httplib2.Http()
    credentials.refresh(http)
    http = credentials.authorize(http)
    return http

def upload_file(file_path, file_name, mime_type, http):
    # Create Google Drive service instance with the authorized http object
    drive_service = build('drive', 'v2', http=http)
    media_body = MediaFileUpload(file_path,
                                 mimetype=mime_type,
                                 resumable=False)
    body = {
        'title': file_name,
        'description': 'backup',
        'mimeType': mime_type,
    }
    permissions = {
        'role': 'reader',
        'type': 'anyone',
        'value': None,
        'withLink': True
    }
    # Insert a file
    file = drive_service.files().insert(body=body, media_body=media_body).execute()
    # Insert new permissions on the created file
    drive_service.permissions().insert(fileId=file['id'], body=permissions).execute()
    print('file uploaded !!')

def file_properties(file_path):
    mime_type = guess_type(file_path)[0]
    file_name = file_path.split('/')[-1]
    return file_name, mime_type

def upload_to_drive(file_path):
    try:
        with open(file_path) as f:
            pass
    except IOError as e:
        print(e)
        sys.exit(1)
    http = authorize(token_file, None)
    file_name, mime_type = file_properties(file_path)
    upload_file(file_path, file_name, mime_type, http)
I am not able to understand where I am going wrong. Please can somebody explain a way out of this?
Thanks
At the end of the Upload class you have
self.redirect("/")
If you're running this on a local development server, it expects there to be something at http://localhost:8080/, the default host/address for your development server.
I'm not intimately familiar with the Python library, but whichever call is constructing the authentication URI is apparently putting http://localhost:8080 in the parameters, as you can see in your post. So either you need to figure out how to change the behavior of the Python library to use localhost:9000, or you need to change the registration in the developer console to allow localhost:8080.
I find that as I work my way through developing, staging, and productizing an app, I end up with a half-dozen different redirects building up in the dev console. No apparent harm in it that I can see.
We need to modify oauth2client.tools a bit.
You can specify your ports like the following, and then everything will work fine:
gflags.DEFINE_multi_int('auth_host_port', [8080, 8090, 9000], ...)
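For what it's worth, later oauth2client releases dropped the gflags dependency in favor of argparse, so the ports can be overridden without editing library code. A minimal sketch under that assumption:

from oauth2client import tools
from oauth2client.file import Storage

storage = Storage('sample.dat')  # token cache, as in the question

# Parse the standard oauth2client flags, overriding the ports the
# local redirect server is allowed to listen on
flags = tools.argparser.parse_args(['--auth_host_port', '9000'])

# run_flow() supersedes the older run() and honors flags.auth_host_port;
# FLOW is the flow_from_clientsecrets(...) object from the question
credentials = tools.run_flow(FLOW, storage, flags)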
