I am trying to create an Azure Function (implemented in Python) to delete an item in a CosmosDB container. Using Azure Cosmos DB Input & Output bindings, I was able to add, query and update items but I was not able to find a method that could delete one. Is it possible to delete an item using the binding methods?
The following code is what I am currently using to do a simple update.
__init__.py file
import logging

import azure.functions as func


def main(req: func.HttpRequest, doc: func.Out[func.Document]) -> func.HttpResponse:
    logging.info('Python HTTP trigger function processed a request.')

    departure_time = ""
    arrival_time = ""

    try:
        req_body = req.get_json()
    except ValueError:
        pass
    else:
        bc_id_no = req_body.get('bc_id_no')
        trip_id = req_body.get('trip_id')
        departure_time = req_body.get('departure_time')
        arrival_time = req_body.get('arrival_time')

    if bc_id_no and trip_id:
        newdocs = func.DocumentList()
        input_dict = {
            "bc_id_no": bc_id_no,
            "id": trip_id,
            "departure_time": departure_time,
            "arrival_time": arrival_time
        }
        newdocs.append(func.Document.from_dict(input_dict))
        doc.set(newdocs)
        return func.HttpResponse("This HTTP triggered function executed successfully.")
    else:
        return func.HttpResponse(
            "bc_id_no or trip_id not available",
            status_code=200
        )
function.json
{
  "scriptFile": "__init__.py",
  "bindings": [
    {
      "authLevel": "anonymous",
      "type": "httpTrigger",
      "direction": "in",
      "name": "req",
      "methods": [
        "get",
        "post"
      ],
      "route": "update_rec"
    },
    {
      "type": "cosmosDB",
      "direction": "out",
      "name": "doc",
      "databaseName": "mockDB",
      "collectionName": "mockCollection",
      "connectionStringSetting": "AzureCosmosDBConnectionString"
    },
    {
      "type": "http",
      "direction": "out",
      "name": "$return"
    }
  ]
}
I understand that it may be possible to use the sqlQuery configuration property of the input binding to specify a delete statement (I am not even sure that is good practice), but I am wondering whether another method for deletion is available.
The bindings only support querying and reading (input binding) or adding (output binding); there is no delete support.
There is no configuration you can pass that would make the binding execute a Delete, it's just not there in the code: https://github.com/Azure/azure-webjobs-sdk-extensions/tree/cosmos/v3.x/src/WebJobs.Extensions.CosmosDB
The only alternative I can think of is if you used the Python SDK directly inside the Function to perform the delete: https://learn.microsoft.com/azure/cosmos-db/sql-api-sdk-python
Just make sure that the instance is created and maintained outside of the execution scope: https://learn.microsoft.com/azure/azure-functions/manage-connections#static-clients
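Since the binding cannot delete, a minimal sketch of doing it with the azure-cosmos SDK could look like the following. The database, container, and app-setting names are taken from the question's function.json; the helper functions themselves are assumptions for illustration, not part of the binding's API:

```python
# Hypothetical sketch: deleting one item with the azure-cosmos SDK
# (pip install azure-cosmos). Names mirror the question's setup.
import os


def get_container():
    # In a real Function, call this once at module scope so the client is
    # reused across invocations (see the manage-connections link above).
    from azure.cosmos import CosmosClient  # deferred: requires azure-cosmos
    client = CosmosClient.from_connection_string(
        os.environ["AzureCosmosDBConnectionString"])
    return client.get_database_client("mockDB").get_container_client("mockCollection")


def delete_trip(container, trip_id: str, bc_id_no: str) -> None:
    # `container` is a ContainerProxy (or anything exposing the same method).
    # Assumes bc_id_no is the container's partition key - adjust if yours differs.
    container.delete_item(item=trip_id, partition_key=bc_id_no)
```

You would then call `delete_trip(get_container(), trip_id, bc_id_no)` from inside `main` after parsing the request body, instead of setting the output binding.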
Related
I am able to write a document to Cosmos DB with the help of the output binding, but what I need to know is how to overwrite an existing document that is already in Cosmos DB.
My code looks like this
import azure.functions as func


def main(req: func.HttpRequest, doc: func.Out[func.Document]) -> func.HttpResponse:
    request_body = req.get_body()
    doc.set(func.Document.from_json(request_body))
    return 'OK'
and my output binding looks like this
{
  "scriptFile": "__init__.py",
  "bindings": [
    {
      "authLevel": "function",
      "type": "httpTrigger",
      "direction": "in",
      "name": "req",
      "methods": [
        "get",
        "post"
      ]
    },
    {
      "type": "cosmosDB",
      "direction": "out",
      "name": "doc",
      "databaseName": "demodb",
      "collectionName": "data",
      "createIfNotExists": "true",
      "connectionStringSetting": "AzureCosmosDBConnectionString"
    },
    {
      "type": "http",
      "direction": "out",
      "name": "$return"
    }
  ]
}
All I want to know is how I can overwrite an existing document in Cosmos DB. Please help me with some sample code. Thanks.
I am able to write a document to Cosmos DB with the help of the output binding, but what I need to know is how to overwrite an existing document that is already in Cosmos DB.
As far as I know, there is currently no way to overwrite an existing document through the output binding alone.
If you want to update an existing document in Cosmos DB, you need to query (or read) it, modify it, and then replace it.
You can refer to the SDK for Azure Cosmos DB.
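A minimal sketch of that query, modify, replace flow with the azure-cosmos SDK might look like this (the helper name and the merge-by-`update` behavior are assumptions for illustration; `read_item`/`replace_item` are the v4 SDK's methods):

```python
# Hypothetical sketch of read -> modify -> replace with the azure-cosmos SDK
# (pip install azure-cosmos). `container` is a ContainerProxy.

def overwrite_document(container, doc_id: str, partition_key, new_fields: dict) -> dict:
    # Read the current document, merge in the new fields, then replace it.
    doc = container.read_item(item=doc_id, partition_key=partition_key)
    doc.update(new_fields)
    return container.replace_item(item=doc["id"], body=doc)
```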
I followed this blog created by Evan Wong and was able to return the document details:
import json
import logging

import azure.functions as func


def main(req: func.HttpRequest, cosmos: func.DocumentList) -> func.HttpResponse:
    logging.info('Python HTTP trigger function processed a request.')

    user_details = []
    for user in cosmos:
        # collect the fields we care about from each document
        user_details.append({
            "id": user['id'],
            "name": user['name']
        })

    return func.HttpResponse(
        json.dumps(user_details),
        status_code=200,
        mimetype="application/json"
    )
I have a Python Azure Function that is one file and one main function:
def main(req: func.HttpRequest) -> func.HttpResponse:
    # [bunch of code]
    return func.HttpResponse("the file will be deleted in 10 minutes", status_code=200)
It creates a file inside Azure Blob Storage for the user and deletes it after 10 minutes; I use time.sleep(600) to do this. However, the response only arrives at the end of this timer, after the file has already been deleted.
How can I make the HttpResponse return the message immediately, then wait 10 minutes before deleting the file?
I've tried adding func.HttpResponse('the file will be deleted in 10 minutes') before the time.sleep(600) but it doesn't return anything.
For a Function with an HTTP output binding like this, you have to return the HTTP response at the end for the response to work, so with a single Function you cannot achieve this. Continue reading for an alternative solution.
This is a typical "asynchronous" processing scenario: you want to respond immediately with "ok, I am going to do this" while queuing the further processing to be continued in the backend. To achieve this in Azure Functions you need 2 functions, as below:
Function 1: HTTP trigger, HTTP output, and queue output binding (for simplicity I will use a storage queue).
Function 2: queue trigger (triggered by the message queued by function 1).
Function 1 (update according to your need):
JSON:
{
  "scriptFile": "__init__.py",
  "bindings": [
    {
      "authLevel": "function",
      "type": "httpTrigger",
      "direction": "in",
      "name": "req",
      "methods": [
        "get",
        "post"
      ]
    },
    {
      "type": "http",
      "direction": "out",
      "name": "$return"
    },
    {
      "type": "queue",
      "direction": "out",
      "name": "msg",
      "queueName": "outqueue",
      "connection": "AzureStorageQueuesConnectionString"
    }
  ]
}
Code:
import azure.functions as func


def main(req: func.HttpRequest, msg: func.Out[str]) -> func.HttpResponse:
    # [bunch of code]
    input_msg = "<create the message body required by function 2>"
    msg.set(input_msg)
    return func.HttpResponse("the file will be deleted in 10 minutes", status_code=201)
Function 2 (update according to your need):
JSON:
{
  "scriptFile": "__init__.py",
  "bindings": [
    {
      "name": "msg",
      "type": "queueTrigger",
      "direction": "in",
      "queueName": "outqueue",
      "connection": "AzureStorageQueuesConnectionString"
    }
  ]
}
Code:
import json

import azure.functions as func


def main(msg: func.QueueMessage):
    # Below is just an example of parsing the message; in your case it might
    # be extracting the blob info required for deleting.
    message = json.dumps({
        'id': msg.id,
        'body': msg.get_body().decode('utf-8'),
        'expiration_time': (msg.expiration_time.isoformat()
                            if msg.expiration_time else None),
        'insertion_time': (msg.insertion_time.isoformat()
                           if msg.insertion_time else None),
        'time_next_visible': (msg.time_next_visible.isoformat()
                              if msg.time_next_visible else None),
        'pop_receipt': msg.pop_receipt,
        'dequeue_count': msg.dequeue_count
    })
    # [bunch of code]
NOTE: You can also look at Durable Functions, which handle complex workflows so you would not need to manage the queueing yourself. But since your scenario is quite simple, I did not cover it here.
This is because the actual response isn't sent until the function itself returns something to the pipeline; the pipeline then returns the result to the caller.
And instead of doing this wonky 10-minute wait inside a function app (which is something you really never should do), I'd create a queue message, set its initial invisibility to 10 minutes, and add it to e.g. a delete-file-queue. Have a QueueTrigger somewhere listening to delete-file-queue that does the deletion of the file.
So instead, do something like this (I'm not super familiar with Functions in python, so treat this as pseudo code):
def main(req: func.HttpRequest) -> func.HttpResponse:
    # handle whatever you have to, but do NOT include time.sleep
    queue_client.send_message("path/to/blob", visibility_timeout=600)
    # the message will end up at the back of the queue, and
    # it'll stay invisible for 600 seconds.
    # this is something we don't have to wait for, and thus, the following
    # will return immediately
    return func.HttpResponse("file will be deleted in 10 minutes")
Your QueueTrigger would then be something like this:
def main(filename: func.QueueMessage, inputblob: func.InputStream) -> None:
    # check if inputblob is None; if not, delete it
In your function.json, you should include bindings for filename and inputblob:
{
  "name": "filename",
  "type": "queueTrigger",
  "direction": "in",
  "queueName": "delete-file-queue",
  "connection": "MyStorageConnectionString"
},
{
  "name": "inputblob",
  "type": "blob",
  "path": "{queueTrigger}",
  "connection": "MyStorageConnectionString",
  "direction": "in"
}
Guide to initializing a queue_client.
And more info here.
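A sketch of what that queue_client setup could look like with azure-storage-queue v12 (the queue name and connection-string setting match the bindings above; the helper names are assumptions for illustration):

```python
# Hypothetical sketch of the queue_client used in the pseudo code above
# (pip install azure-storage-queue, v12).
import os


def make_queue_client():
    from azure.storage.queue import QueueClient  # deferred: requires the SDK
    return QueueClient.from_connection_string(
        os.environ["MyStorageConnectionString"], "delete-file-queue")


def schedule_delete(queue_client, blob_path: str, delay_seconds: int = 600) -> None:
    # visibility_timeout keeps the message hidden from the queue trigger
    # until the delay has elapsed, so the deletion runs ~10 minutes later.
    queue_client.send_message(blob_path, visibility_timeout=delay_seconds)
```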
Right now I'm trying to figure out how to work with Azure, and I'm stuck on a problem while storing my data in the storage account.
I have three strings and want to store each of them in a separate blob. With the first two, my code works fine, but the third one causes some retries and ends with a timeout.
My code is running within an Azure function.
Here is a minimal example:
from azure.storage.blob import BlobClient

blob_client = BlobClient.from_connection_string(
    conn_str='<STORAGE_ACCOUNT_CONNECTION_STRING>',
    container_name='<CONTAINER_NAME>',
    blob_name='<NAME_OF_BLOB>',
)

dic_blob_props = blob_client.upload_blob(
    data='<INFORMATION_THAT_SHOULD_GO_TO_THE_BLOB>',
    blob_type="BlockBlob",
    overwrite=True,
)
For the first two strings everything works fine, but the third one fails. The strings have the following lengths:
len(s_1) = 1246209
len(s_2) = 8794086
len(s_3) = 24518001
Most likely it is because the third string is too long, but there must be a way to save it, right?
I have already tried to set the timeout within the .upload_blob method via timeout=600, but this has not changed the result at all, nor the time until a new write attempt is made.
The error is:
Exception: ServiceResponseError: ('Connection aborted.', timeout('The write operation timed out'))
If you have any ideas about the problem, please let me know. :-)
In my case, the problem disappeared after I deployed the function in the cloud. It seems that there was a problem debugging with Visual Studio code.
On my side, I don't have the problem. You can have a look at my code:
__init__.py
import logging

import azure.functions as func


def main(req: func.HttpRequest, outputblob: func.Out[func.InputStream]) -> func.HttpResponse:
    logging.info('This code is to upload a string to a blob.')

    s_3 = "x" * 24518001
    outputblob.set(s_3)

    return func.HttpResponse(
        "The string has been uploaded to a blob.",
        status_code=200
    )
function.json
{
  "scriptFile": "__init__.py",
  "bindings": [
    {
      "authLevel": "anonymous",
      "type": "httpTrigger",
      "direction": "in",
      "name": "req",
      "route": "{test}",
      "methods": [
        "get",
        "post"
      ]
    },
    {
      "type": "http",
      "direction": "out",
      "name": "$return"
    },
    {
      "name": "outputblob",
      "type": "blob",
      "path": "test1/{test}.txt",
      "connection": "str",
      "direction": "out"
    }
  ]
}
local.settings.json
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "",
    "FUNCTIONS_WORKER_RUNTIME": "python",
    "str": "DefaultEndpointsProtocol=https;AccountName=0730bowmanwindow;AccountKey=xxxxxx==;EndpointSuffix=core.windows.net"
  }
}
Then I hit the endpoint http://localhost:7071/api/bowman, and it uploads the string to the blob without a timeout error.
So I think the problem is related to the method you use.
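If you want to keep calling upload_blob directly rather than switching to the binding, another thing worth trying (an assumption based on the SDK's tuning options, not verified against the asker's account) is lowering max_single_put_size so the ~24 MB string is uploaded block-wise instead of in one request, and raising the connection timeout:

```python
# Hypothetical sketch: tune the BlobClient so large strings are uploaded in
# blocks rather than a single put (pip install azure-storage-blob, v12).

def make_blob_client(conn_str: str, container: str, blob: str):
    from azure.storage.blob import BlobClient  # deferred: requires the SDK
    return BlobClient.from_connection_string(
        conn_str,
        container_name=container,
        blob_name=blob,
        max_single_put_size=4 * 1024 * 1024,  # anything bigger is split into blocks
        connection_timeout=600,               # per-connection socket timeout
    )


def upload_string(blob_client, data: str) -> None:
    blob_client.upload_blob(data, blob_type="BlockBlob", overwrite=True)
```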
I have a python azure function with a queue output binding. I am successfully using this binding to queue messages from within the function. Is it possible to set the message TTL on the underlying queue or on the message itself? I don't need to set it on a per message basis, but will do it that way if that is the only option.
host.json
{
  "version": "2.0",
  "extensionBundle": {
    "id": "Microsoft.Azure.Functions.ExtensionBundle",
    "version": "[1.*, 2.0.0)"
  }
}
function.json
{
  "scriptFile": "__init__.py",
  "bindings": [
    {
      "authLevel": "anonymous",
      "type": "httpTrigger",
      "direction": "in",
      "name": "req",
      "methods": [
        "get",
        "post"
      ]
    },
    {
      "type": "http",
      "direction": "out",
      "name": "$return"
    },
    {
      "type": "queue",
      "direction": "out",
      "name": "msg",
      "queueName": "predictions",
      "connection": "AzureWebJobsStorage"
    }
  ]
}
function code
import json
import logging

import azure.functions as func
from graphene import Schema

from .helpers import responses
from .schema.Query import Query


def main(req: func.HttpRequest, msg: func.Out[func.QueueMessage]) -> func.HttpResponse:
    logging.info('Executing GraphQL function.')

    try:
        query = req.get_body().decode()
    except ValueError:
        pass

    if query:
        schema = Schema(Query)
        results = schema.execute(query)
        response = responses.graphql(results)

        # Write response to Azure Queue Storage
        message = responses.storage(query, response)
        if message:
            msg.set(message)

        return response
    else:
        return responses.bad_request(
            'Please pass a GraphQL query in the request body.')
For now this is only supported in C#, where you can bind to the CloudQueue type. In other languages you have to use the SDK to implement it. If you need this feature, you can go to this GitHub issue and comment with your requirements.
And below is my test code to set the TTL in an HTTP trigger function with azure-storage-queue 2.1.0.
import logging
import os

import azure.functions as func
from azure.storage.queue import QueueService


def main(req: func.HttpRequest) -> func.HttpResponse:
    logging.info('Python HTTP trigger function processed a request.')

    queue_service = QueueService(connection_string=os.environ['AzureWebJobsStorage'])

    message = req.params.get('message')
    if not message:
        try:
            req_body = req.get_json()
        except ValueError:
            pass
        else:
            message = req_body.get('message')

    if message:
        # time_to_live is in seconds (here 300 = 5 minutes)
        queue_service.put_message('myqueue', message, time_to_live=300)
        return func.HttpResponse(f" {message}!")
    else:
        return func.HttpResponse(
            "Please pass message in the request body",
            status_code=400
        )
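With the newer azure-storage-queue v12 SDK, the same TTL can be set through send_message. A sketch under the assumption that the "predictions" queue from the binding already exists:

```python
# Hypothetical v12 equivalent of the put_message call above
# (pip install azure-storage-queue).
import os


def make_client():
    from azure.storage.queue import QueueClient  # deferred: requires the SDK
    return QueueClient.from_connection_string(
        os.environ["AzureWebJobsStorage"], "predictions")


def send_with_ttl(queue_client, body: str, ttl_seconds: int = 300) -> None:
    # time_to_live controls how long the message survives in the queue
    # before the service discards it.
    queue_client.send_message(body, time_to_live=ttl_seconds)
```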
Is there a way to start/stop an Azure Function app using Python by passing a parameter to the HTTP trigger URL of the function?
If I understand the question correctly, you want to access a query parameter in your function file. Here is the binding for a sample Azure function:
{
  "scriptFile": "__init__.py",
  "bindings": [
    {
      "authLevel": "function",
      "type": "httpTrigger",
      "direction": "in",
      "name": "req",
      "methods": [
        "get",
        "post"
      ]
    },
    {
      "type": "http",
      "direction": "out",
      "name": "$return"
    }
  ]
}
and suppose this is the function
def main(req: func.HttpRequest) -> func.HttpResponse:
    logging.info('Python HTTP trigger function processed a request.')

    name = req.params.get('name')
    if not name:
        try:
            req_body = req.get_json()
        except ValueError:
            pass
        else:
            name = req_body.get('name')

    if name:
        return func.HttpResponse(f"Hello {name}!")
    else:
        return func.HttpResponse(
            "Please pass a name on the query string or in the request body",
            status_code=400
        )
The sample function above accepts an HTTP GET or POST request and looks for a "name" either on the query string or as a property of a JSON request body.
To access URL query parameters, use req.params: for example, if you send a query parameter "foo", you can access it via req.params.get('foo').
Then, based on the query parameter, you can start and stop the function.
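As a sketch, a hypothetical helper that turns such a query parameter into a start/stop decision could look like this (the parameter name "action" is an assumption; you would call it from main with req.params):

```python
def decide_action(params) -> str:
    # `params` is a mapping like req.params; returns 'start', 'stop',
    # or 'unknown' for anything else, so main can branch on the result.
    action = (params.get('action') or '').lower()
    return action if action in ('start', 'stop') else 'unknown'
```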