Start/Stop Azure Function App using Python

Is there a way to start/stop an Azure Function App programmatically from Python by passing a parameter to the HTTP trigger URL of the function?

If I understand the question correctly, you want to access a query parameter in your function file. Here is the binding for a sample Azure Function:
{
  "scriptFile": "__init__.py",
  "bindings": [
    {
      "authLevel": "function",
      "type": "httpTrigger",
      "direction": "in",
      "name": "req",
      "methods": [
        "get",
        "post"
      ]
    },
    {
      "type": "http",
      "direction": "out",
      "name": "$return"
    }
  ]
}
and suppose this is the function:
import logging
import azure.functions as func

def main(req: func.HttpRequest) -> func.HttpResponse:
    logging.info('Python HTTP trigger function processed a request.')
    name = req.params.get('name')
    if not name:
        try:
            req_body = req.get_json()
        except ValueError:
            pass
        else:
            name = req_body.get('name')
    if name:
        return func.HttpResponse(f"Hello {name}!")
    else:
        return func.HttpResponse(
            "Please pass a name on the query string or in the request body",
            status_code=400
        )
The sample function above accepts GET and POST requests and looks for a "name" value first in the query string and then in a JSON request body.
To access URL query parameters, use req.params; for example, a query parameter "foo" is available as req.params.get('foo').
Then, based on the query parameter, you can branch into your start or stop logic.
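A minimal sketch of that branching, with everything Azure-specific stubbed out. The "action" parameter name and the placeholder return strings are my assumptions, not from the question; in a real function the dict below would be req.params, and each branch would call the Azure management API (e.g. via the azure-mgmt-web package) to start or stop the app:

```python
# Hypothetical dispatcher: decide what to do from an "action" query parameter.
# A plain dict stands in for req.params; the returned strings are placeholders
# for the actual management-API start/stop calls.
def dispatch(params: dict) -> str:
    action = params.get("action", "").lower()
    if action == "start":
        return "starting app"   # placeholder for a management-API start call
    if action == "stop":
        return "stopping app"   # placeholder for a management-API stop call
    return "unknown action"
```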

Related

How to overwrite a file in Azure Cosmos DB using an Azure Functions (Python) HTTP trigger

I am able to write a file to Cosmos DB with the help of an output binding, but what I need to know is how to overwrite a file that already exists in Cosmos DB.
My code looks like this:
import azure.functions as func

def main(req: func.HttpRequest, doc: func.Out[func.Document]) -> func.HttpResponse:
    request_body = req.get_body()
    doc.set(func.Document.from_json(request_body))
    return func.HttpResponse('OK')
and my output binding looks like this
{
  "scriptFile": "__init__.py",
  "bindings": [
    {
      "authLevel": "function",
      "type": "httpTrigger",
      "direction": "in",
      "name": "req",
      "methods": [
        "get",
        "post"
      ]
    },
    {
      "type": "cosmosDB",
      "direction": "out",
      "name": "doc",
      "databaseName": "demodb",
      "collectionName": "data",
      "createIfNotExists": "true",
      "connectionStringSetting": "AzureCosmosDBConnectionString"
    },
    {
      "type": "http",
      "direction": "out",
      "name": "$return"
    }
  ]
}
All I want to know is how I can overwrite an existing file in Cosmos DB.
Please help me with some sample code.
Thanks.
The Cosmos DB output binding performs an upsert, so writing a document with the same id (and partition key) as an existing one replaces it; beyond that, there is no dedicated "overwrite file" operation.
If you want finer control over updating an existing document in Cosmos DB, you need to query (or read), modify, and then replace the document.
You can refer to the Python SDK for Azure Cosmos DB.
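A sketch of that read-modify-replace flow. The commented lines assume the azure-cosmos SDK's read_item/replace_item calls; only the pure merge step runs here:

```python
# Pure "modify" step of read-modify-replace: merge changes into a copy of the
# document while keeping fields such as "id" intact.
def update_document(existing: dict, changes: dict) -> dict:
    updated = dict(existing)
    updated.update(changes)
    return updated

# Assumed azure-cosmos usage around it (not executed here):
# container = client.get_database_client("demodb").get_container_client("data")
# doc = container.read_item(item="item-id", partition_key="pk")         # read
# container.replace_item(item=doc, body=update_document(doc, changes))  # replace
```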
I have followed this blog post by Evan Wong and I am able to append the file details:
import json
import logging
import azure.functions as func

def main(req: func.HttpRequest, cosmos: func.DocumentList) -> func.HttpResponse:
    logging.info('Python HTTP trigger function processed a request.')
    users_json = []
    for user in cosmos:
        user_details = {
            "id": user['id'],
            "name": user['name']
        }
        users_json.append(user_details)
    return func.HttpResponse(
        json.dumps(users_json),
        status_code=200,
        mimetype="application/json"
    )

Delete CosmosDB Container Items

I am trying to create an Azure Function (implemented in Python) to delete an item in a Cosmos DB container. Using the Azure Cosmos DB input and output bindings, I was able to add, query and update items, but I could not find a method to delete one. Is it possible to delete an item using the binding methods?
The following code is what I am currently using to do a simple update.
__init__.py file
import logging
import azure.functions as func

def main(req: func.HttpRequest, doc: func.Out[func.Document]) -> func.HttpResponse:
    logging.info('Python HTTP trigger function processed a request.')
    bc_id_no = None
    trip_id = None
    departure_time = ""
    arrival_time = ""
    try:
        req_body = req.get_json()
    except ValueError:
        pass
    else:
        bc_id_no = req_body.get('bc_id_no')
        trip_id = req_body.get('trip_id')
        departure_time = req_body.get('departure_time')
        arrival_time = req_body.get('arrival_time')
    if bc_id_no and trip_id:
        newdocs = func.DocumentList()
        input_dict = {
            "bc_id_no": bc_id_no,
            "id": trip_id,
            "departure_time": departure_time,
            "arrival_time": arrival_time
        }
        newdocs.append(func.Document.from_dict(input_dict))
        doc.set(newdocs)
        return func.HttpResponse("This HTTP triggered function executed successfully.")
    else:
        return func.HttpResponse(
            "bc_id_no or trip_id not available",
            status_code=200
        )
function.json
{
  "scriptFile": "__init__.py",
  "bindings": [
    {
      "authLevel": "anonymous",
      "type": "httpTrigger",
      "direction": "in",
      "name": "req",
      "methods": [
        "get",
        "post"
      ],
      "route": "update_rec"
    },
    {
      "type": "cosmosDB",
      "direction": "out",
      "name": "doc",
      "databaseName": "mockDB",
      "collectionName": "mockCollection",
      "connectionStringSetting": "AzureCosmosDBConnectionString"
    },
    {
      "type": "http",
      "direction": "out",
      "name": "$return"
    }
  ]
}
I understand that it may be possible to use the sqlQuery configuration property of the input binding to specify a delete statement (not too sure if that is even good practice), but I am wondering if another method for deletion is available.
The bindings only support querying and reading (input binding) or adding (output binding); there is no delete support.
There is no configuration you can pass that would make the binding execute a delete; it's just not there in the code: https://github.com/Azure/azure-webjobs-sdk-extensions/tree/cosmos/v3.x/src/WebJobs.Extensions.CosmosDB
The only alternative I can think of is to use the Python SDK directly inside the function to perform the delete: https://learn.microsoft.com/azure/cosmos-db/sql-api-sdk-python
Just make sure that the client instance is created and maintained outside of the execution scope: https://learn.microsoft.com/azure/azure-functions/manage-connections#static-clients

Azure Functions how to return an HttpResponse or display a message before the script finishes

I have a Python Azure Function that is one file and one main function:
def main(req: func.HttpRequest) -> func.HttpResponse:
    # [bunch of code]
    return func.HttpResponse("the file will be deleted in 10 minutes", status_code=200)
It creates a file in Azure Blob Storage for the user and deletes it after 10 minutes; I use time.sleep(600) to do this. However, the message only arrives at the end of this timer, after the file has already been deleted.
How can I make the HttpResponse show the message before the script ends, and then wait 10 minutes before deleting the file?
I've tried adding func.HttpResponse('the file will be deleted in 10 minutes') before the time.sleep(600), but it doesn't return anything.
For a function with an HTTP output binding like this, you have to return the HTTP response at the end for the response to work, so with a single function you cannot achieve this. Continue reading for an alternative solution.
This problem is a typical 'asynchronous' processing example, where you want to respond immediately ("ok, I am going to do this") while queuing further processing to be continued in the backend. To achieve this in Azure Functions you will need two functions, as below:
Function 1: HTTP trigger, HTTP output and queue output binding (for simplicity I will use a storage queue).
Function 2: Queue trigger (will get triggered by the message queued by Function 1).
Function 1 (update according to your need):
JSON:
{
  "scriptFile": "__init__.py",
  "bindings": [
    {
      "authLevel": "function",
      "type": "httpTrigger",
      "direction": "in",
      "name": "req",
      "methods": [
        "get",
        "post"
      ]
    },
    {
      "type": "http",
      "direction": "out",
      "name": "$return"
    },
    {
      "type": "queue",
      "direction": "out",
      "name": "msg",
      "queueName": "outqueue",
      "connection": "AzureStorageQueuesConnectionString"
    }
  ]
}
Code:
import azure.functions as func

def main(req: func.HttpRequest, msg: func.Out[str]) -> func.HttpResponse:
    # [bunch of code]
    input_msg = "<create the message body required by function 2>"
    msg.set(input_msg)
    return func.HttpResponse("the file will be deleted in 10 minutes", status_code=201)
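The exact shape of input_msg is up to you; one option (my assumption, not from the answer) is a small JSON payload naming the blob that Function 2 should delete:

```python
import json

# Hypothetical message shape: just enough for Function 2 to locate the blob.
def build_message(container: str, blob_name: str) -> str:
    return json.dumps({"container": container, "blob": blob_name})
```

Function 2 would then json.loads the message body and delete that blob.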
Function 2 (update according to your need):
JSON:
{
  "scriptFile": "__init__.py",
  "bindings": [
    {
      "name": "msg",
      "type": "queueTrigger",
      "direction": "in",
      "queueName": "outqueue",
      "connection": "AzureStorageQueuesConnectionString"
    }
  ]
}
Code:
import json
import azure.functions as func

def main(msg: func.QueueMessage):
    # below is just an example of parsing the message; in your case it might be
    # taking the blob info required for deleting
    message = json.dumps({
        'id': msg.id,
        'body': msg.get_body().decode('utf-8'),
        'expiration_time': (msg.expiration_time.isoformat()
                            if msg.expiration_time else None),
        'insertion_time': (msg.insertion_time.isoformat()
                           if msg.insertion_time else None),
        'time_next_visible': (msg.time_next_visible.isoformat()
                              if msg.time_next_visible else None),
        'pop_receipt': msg.pop_receipt,
        'dequeue_count': msg.dequeue_count
    })
    # [bunch of code]
NOTE: You can also look at Durable Functions where you can handle complex workflow and would not need to manage the queueing yourself. But since your scenario in this case is quite simple, I did not cover it.
This is because the actual response isn't sent until the function itself returns something to the pipeline; the pipeline then returns the result to the caller.
And instead of doing this wonky 10-minute wait inside a Function App (which is something you really never should do), I'd create a queue message, set its initial invisibility to 10 minutes, and add it to e.g. a delete-file-queue. Then have a QueueTrigger somewhere listening to delete-file-queue that does the deletion of the file.
So instead, do something like this (I'm not super familiar with Functions in Python, so treat this as pseudo code):
def main(req: func.HttpRequest) -> func.HttpResponse:
    # handle whatever you have to, but do NOT include time.sleep
    queue_client.send_message("path/to/blob", visibility_timeout=600)
    # the message will end up in the back of the queue, and
    # it'll stay invisible for 600 seconds
    # this is something we don't have to wait for, and thus, the following
    # will return immediately
    return func.HttpResponse("file will be deleted in 10 minutes")
Your QueueTrigger would then be something like this:
def main(filename: func.QueueMessage, inputblob: func.InputStream) -> None:
    # check if inputblob is None; if not, delete it
In your function.json, you should include bindings for filename and inputblob:
{
  "name": "filename",
  "type": "queueTrigger",
  "direction": "in",
  "queueName": "delete-file-queue",
  "connection": "MyStorageConnectionString"
},
{
  "name": "inputblob",
  "type": "blob",
  "path": "{queueTrigger}",
  "connection": "MyStorageConnectionString",
  "direction": "in"
}
See the azure-storage-queue documentation for a guide to initializing a queue_client, and the Azure Functions docs for more info.
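For reference, here is a sketch of that initialization with azure-storage-queue v12. The SDK lines are commented out (they need a real connection string to run), and only the small delay helper executes; the environment-variable name is my assumption:

```python
# Assumed v12-style initialization (commented; requires azure-storage-queue):
#
# import os
# from azure.storage.queue import QueueClient
#
# queue_client = QueueClient.from_connection_string(
#     conn_str=os.environ["MyStorageConnectionString"],
#     queue_name="delete-file-queue",
# )
# queue_client.send_message("path/to/blob", visibility_timeout=minutes(10))

def minutes(n: int) -> int:
    # visibility_timeout is expressed in seconds
    return n * 60
```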

upload hyperlink data to azure binding

I am trying to stream data from a hyperlink destination to Azure Storage. I have to do this via a binding, since I want to run this from an Azure Function App.
file -- function.json:
{
  "scriptFile": "__init__.py",
  "bindings": [
    {
      "authLevel": "anonymous",
      "type": "httpTrigger",
      "direction": "in",
      "name": "req",
      "methods": [
        "get",
        "post"
      ]
    },
    {
      "type": "http",
      "direction": "out",
      "name": "$return"
    },
    {
      "type": "blob",
      "direction": "out",
      "name": "outputBlob",
      "path": "samples-workitems/{rand-guid}",
      "connection": ""
    }
  ]
}
file -- __init__.py:
import logging
import cdsapi
import azure.functions as func
from azure.storage.blob import BlobServiceClient, BlobClient, ContainerClient

def main(req: func.HttpRequest, outputBlob: func.Out[func.InputStream]) -> func.HttpResponse:
    logging.info('Python HTTP trigger function is about to process request.')
    try:
        source_blob = "http://www.africau.edu/images/default/sample.pdf"
        with open(source_blob, "rb") as data:
            print(data)
            outputBlob.set(data)
    except Exception as ex:
        logging.info(" error!", ex, "occurred.")
    return func.HttpResponse(
        "This HTTP triggered function executed successfully.",
        status_code=200
    )
I have tested the binding and it works: when I simply do outputBlob.set("sample string"), data is streamed as it should be.
I am stuck converting the data from the hyperlink to bytes (or a blob). While running the code above, I get the error Exception: TypeError: not all arguments converted during string formatting. Any help in converting this and uploading it to Azure Storage is appreciated.
The problem is that you were trying to read the file from a URL with open(source_blob, "rb") as data:, which of course won't work since open is for local files only. (The TypeError most likely comes from the logging.info(" error!", ex, "occurred.") call in the except block, which passes extra arguments to a format string without placeholders.) I have changed your code as below, using the requests module to get the remote URL response and set its content to the blob.
import requests

source_url = "http://www.africau.edu/images/default/sample.pdf"
with requests.get(source_url, stream=True) as r:
    r.raise_for_status()
    outputBlob.set(r.content)
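Note that r.content buffers the whole response in memory. For large files, a chunked variant may be preferable; here is a sketch where a pure helper joins the chunks, and the actual requests/binding calls are shown as comments (iter_content and its chunk_size parameter are the standard requests API):

```python
# Accumulate a streamed download chunk by chunk instead of one r.content read.
# The iterable stands in for requests' Response.iter_content, so the joining
# logic can be shown without a network call.
def join_chunks(chunks) -> bytes:
    buf = bytearray()
    for chunk in chunks:
        if chunk:               # skip keep-alive chunks
            buf.extend(chunk)
    return bytes(buf)

# with requests.get(source_url, stream=True) as r:
#     r.raise_for_status()
#     outputBlob.set(join_chunks(r.iter_content(chunk_size=8192)))
```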

How can I set Queue Storage message TTL in the context of an Azure Function output binding in Python?

I have a Python Azure Function with a queue output binding, and I am successfully using this binding to queue messages from within the function. Is it possible to set the message TTL on the underlying queue, or on the message itself? I don't need to set it per message, but will do it that way if that is the only option.
host.json
{
  "version": "2.0",
  "extensionBundle": {
    "id": "Microsoft.Azure.Functions.ExtensionBundle",
    "version": "[1.*, 2.0.0)"
  }
}
function.json
{
  "scriptFile": "__init__.py",
  "bindings": [
    {
      "authLevel": "anonymous",
      "type": "httpTrigger",
      "direction": "in",
      "name": "req",
      "methods": [
        "get",
        "post"
      ]
    },
    {
      "type": "http",
      "direction": "out",
      "name": "$return"
    },
    {
      "type": "queue",
      "direction": "out",
      "name": "msg",
      "queueName": "predictions",
      "connection": "AzureWebJobsStorage"
    }
  ]
}
function code
import json
import logging
import azure.functions as func
from graphene import Schema
from .helpers import responses
from .schema.Query import Query

def main(req: func.HttpRequest, msg: func.Out[func.QueueMessage]) -> func.HttpResponse:
    logging.info('Executing GraphQL function.')
    query = None
    try:
        query = req.get_body().decode()
    except ValueError:
        pass
    if query:
        schema = Schema(Query)
        results = schema.execute(query)
        response = responses.graphql(results)
        # Write response to azure queue storage
        message = responses.storage(query, response)
        if message:
            msg.set(message)
        return response
    else:
        return responses.bad_request(
            'Please pass a GraphQL query in the request body.')
For now, setting the TTL through the binding is only supported in C#, where you can bind to the CloudQueue type. In other languages you have to use the SDK instead. If you need this feature, you can go to this GitHub issue and comment with your requirements.
And below is my test code to set the TTL in an HTTP trigger function with azure-storage-queue 2.1.0.
import logging
import os
import azure.functions as func
from azure.storage.queue import QueueService

def main(req: func.HttpRequest) -> func.HttpResponse:
    logging.info('Python HTTP trigger function processed a request.')
    queue_service = QueueService(connection_string=os.environ['AzureWebJobsStorage'])
    message = req.params.get('message')
    if not message:
        try:
            req_body = req.get_json()
        except ValueError:
            pass
        else:
            message = req_body.get('message')
    if message:
        # time_to_live is in seconds; 300 = 5 minutes
        queue_service.put_message('myqueue', message, time_to_live=300)
        return func.HttpResponse(f" {message}!")
    else:
        return func.HttpResponse(
            "Please pass message in the request body",
            status_code=400
        )
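For context, the original positional call queue_service.put_message('myqueue', message, None, 300, None) puts 300 into the time_to_live slot of the 2.x signature. The helper below just names those positional slots to make the intent explicit; the commented lines sketch the equivalent call with the newer azure-storage-queue v12 API, which is my assumption and is not executed here:

```python
# v12 equivalent (commented; assumes azure-storage-queue >= 12):
# from azure.storage.queue import QueueClient
# queue_client = QueueClient.from_connection_string(conn_str, "predictions")
# queue_client.send_message(message, time_to_live=300)

# The 2.1.0 call maps positionally to this signature:
#   put_message(queue_name, content, visibility_timeout, time_to_live, timeout)
def put_message_kwargs(*args):
    # name the positional slots so the intent of the call is explicit
    names = ["queue_name", "content", "visibility_timeout", "time_to_live", "timeout"]
    return dict(zip(names, args))
```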
