How can I create a Python webhook sender app?

This is a follow-up question to this post.
I have a data warehouse table exposed via an API endpoint at xxx.com\data.
I have been querying this table and parsing the result into a dataframe with the following code:
import requests
import json
import pandas as pd

url = "xxx.com\data?q=Active%20%3D1%20and%20LATITUDE%20%3D%20%20%220.000000%22%20and%20LONGITUDE%20%3D%20%220.000000%22&pageSize =300"
headers = {'Authorization': access_token}  # access_token obtained elsewhere
response = requests.get(url, headers=headers)
j = response.json()
df = pd.json_normalize(j['DataSet'])
The warehouse table is updated periodically, and I am required to create a webhook that is listened to by the following Azure HTTP trigger:
import logging
import pandas as pd
import azure.functions as func

def main(req: func.HttpRequest) -> func.HttpResponse:
    logging.info('Python HTTP trigger function processed a request.')
    d = {
        'Date': ['2016-10-30', '2016-10-30', '2016-11-01', '2016-10-30'],
        'Time': ['09:58:11', '10:05:34', '10:07:57', '11:15:32'],
        'Transaction': [2, 3, 1, 1]
    }
    df = pd.DataFrame(d, columns=['Date', 'Time', 'Transaction'])
    output = df.to_csv(index_label="idx", encoding="utf-8")
    return func.HttpResponse(output)
When run, the HTTP trigger successfully receives posts from the following webhook sender, which I created and run locally:
import json
import requests

data = {'Lat': '0.000000',
        'Long': '0.000000',
        'Status': '1',
        'Channel URL': "xxx.com\data"}
webhook_url = "http://localhost:7071/api/HttpTrigger1"
r = requests.post(webhook_url, headers={'Content-Type': 'application/json'}, data=json.dumps(data))
My question is:
How can I deploy the webhook sender to the cloud as an app, so that every time "xxx.com\data" is updated with Lat == 0, Long == 0 and Status == 1, a message is sent to my webhook listener?
The app can be based on Azure, Flask, Postman or any other Python-based webhook builder.

A simple approach is to wrap your sender code in a Timer Trigger function that polls your xxx.com\data every x seconds (or whatever frequency you decide) and calls your webhook (another HTTP-triggered function) if there is any change. The timer binding in function.json looks like this:
{
    "name": "mytimer",
    "type": "timerTrigger",
    "direction": "in",
    "schedule": "0 */5 * * * *"
}
And the timer-triggered function itself:
import datetime
import json
import logging
import requests
import azure.functions as func

def main(mytimer: func.TimerRequest) -> None:
    utc_timestamp = datetime.datetime.utcnow().replace(
        tzinfo=datetime.timezone.utc).isoformat()
    if mytimer.past_due:
        logging.warning('The timer is past due!')
    # "xxx.com\data" polling in real scenario
    l = {'Lat': '0.000000',
         'Long': '0.000000',
         'Status': '1',
         'Channel URL': "xxx.com\data"}
    webhook_url = "{function app base url}/api/HttpTrigger1"
    r = requests.post(webhook_url, headers={'Content-Type': 'application/json'}, data=json.dumps(l))
    logging.info('Python timer trigger function ran at %s', utc_timestamp)
At the end of the day, you can deploy both your webhook function (HTTP trigger) and the sender (timer-triggered poller) into a single Function App.
You can also think of getting rid of the webhook function altogether (to save one intermediate hop) and doing your work in the same timer-triggered function, as sketched below.
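A minimal sketch of that merged version, assuming a requests-based poll; the endpoint, token and 'DataSet' key are placeholders carried over from the question:
import logging
import requests
import pandas as pd
import azure.functions as func

ACCESS_TOKEN = "..."  # hypothetical; read from app settings in practice

def main(mytimer: func.TimerRequest) -> None:
    # Poll the warehouse endpoint directly instead of notifying a separate webhook.
    response = requests.get(
        "xxx.com\data?q=...",  # placeholder endpoint from the question
        headers={"Authorization": ACCESS_TOKEN},
    )
    df = pd.json_normalize(response.json().get("DataSet", []))
    if not df.empty:
        # Do here whatever the HTTP-triggered webhook used to do with the data.
        logging.info("Processed %d rows", len(df))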

You currently have some polling logic (under "querying this table using the following code"). If you want to move that to the cloud, create a TimerTrigger function and put all your poller code in it.
If you want to leave that poller code untouched, but want to call some code in the cloud whenever the poller detects a change (updated with Lat == 0, Long == 0 and Status == 1), then create an HttpTrigger function and invoke it from the poller whenever it detects the change.
The confusing part is this: how do you detect this change today? Where is the poller code hosted, and how often is it executed?
If the data in the DB is changing, the only ways you can execute "some code" whenever the data changes are:
poll the DB periodically, say every 1 minute, and if there is a change execute "some code", OR
some feature of this DB allows you to configure a REST API (HTTP webhook) that is called by the DB whenever there is a change. Implement a REST API (e.g. as an HttpTrigger function) and put the "some code" that you want executed inside it. Now whenever there is a change the DB calls your webhook/REST API and "some code" is executed.
If the only way to read the data is to call a REST API (xxx.com/data?q=...), then polling is the only way you can detect a change; a sketch of that detection step follows.
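If the API response itself is your only change signal, a hedged way to implement the detection is to fingerprint each poll result and compare it with the previous one; URL and token are placeholders, and the digest should live in durable storage rather than a module variable if the poller can restart:
import hashlib
import json
import requests

_last_digest = None  # in-memory state; persist it (e.g. in a blob) across restarts

def poll_and_detect_change(url, token):
    global _last_digest
    body = requests.get(url, headers={"Authorization": token}).json()
    digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    changed = _last_digest is not None and digest != _last_digest
    _last_digest = digest
    return changed, body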

Related

Azure durable function - Python SDK - trigger Activity Function from Client Function

I am preparing an automation solution in Azure and decided to use Azure Durable Functions. As per the Durable Functions design I have created: a Client Function (triggered by a Service Bus message), an Activity Function, and an Orchestrator Function. The Service Bus message is in JSON format. Once the Client Function gets the Service Bus message, it has to run the Orchestrator Function. I have prepared code in Python, but it does not work: in the Azure "Code + Test" window I get a 500 Internal Server Error. My code is below. The main problem is running the Orchestrator Function from the Client Function code presented below. The piece of code for receiving the Service Bus JSON message is OK; I tested it in other functions.
import json
import logging
import azure.functions as func
import azure.durable_functions as df

async def main(msg: func.ServiceBusMessage, starter: str):
    result = {
        'body': json.loads(msg.get_body().decode('utf-8'))
    }
    try:
        account_name = result.get('body', {}).get('accountName')
        client = df.DurableOrchestrationClient(starter)
        instance_id = await client.start_new(msg.route_params["Orchestrator"], None, None)
        logging.info(f"Started orchestration with ID = '{instance_id}'.")
    except Exception as e:
        logging.info(e)
Solution workflow:
When starting an instance of the orchestration function with the start_new method, you need to pass the payload as well as the orchestrator name.
You have given only the name in the following code:
instance_id = await client.start_new(msg.route_params["Orchestrator"], None, None)
Adding the payload might work, and by payload I mean this:
payload = msg.get_body().decode('utf-8')
The code will look like:
instance_id = await client.start_new(msg.route_params["Orchestrator"], client_input=payload)
Refer to the start_new documentation, and also this article by Ajit Patra.
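Putting that together, a hedged sketch of the corrected Client Function, assuming the azure-durable-functions signature start_new(name, instance_id=None, client_input=None):
import logging
import azure.functions as func
import azure.durable_functions as df

async def main(msg: func.ServiceBusMessage, starter: str):
    payload = msg.get_body().decode('utf-8')
    client = df.DurableOrchestrationClient(starter)
    # client_input carries the payload; the second positional argument
    # of start_new is the instance ID, not the input.
    instance_id = await client.start_new(
        msg.route_params["Orchestrator"], client_input=payload
    )
    logging.info(f"Started orchestration with ID = '{instance_id}'.")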

How to write unit tests for Durable Azure Functions?

I'm writing an Azure Durable Function, and I would like to write some unit tests for this whole Azure Function.
I tried to trigger the Client function (the "Start" function, as it is often called), but I can't make it work.
I'm doing this for two reasons:
It's frustrating to run the Azure Function code by running "func host start" (or pressing F5), then going to my browser, finding the right tab, going to http://localhost:7071/api/orchestrators/FooOrchestrator and going back to VS Code to debug my code.
I'd like to write some unit tests to ensure the quality of my project's code. Therefore I'm open to suggestions, maybe it would be easier to only test the execution of Activity functions.
Client Function code
This is the code of my Client function, mostly boilerplate code like this one:
import logging
import azure.functions as func
import azure.durable_functions as df

async def main(req: func.HttpRequest, starter: str) -> func.HttpResponse:
    # 'starter' seems to contain the JSON data about
    # the URLs to monitor, stop, etc., the Durable Function
    client = df.DurableOrchestrationClient(starter)
    # The Client function knows which orchestrator to call
    # according to 'function_name'
    function_name = req.route_params["functionName"]
    # This part fails with a ClientConnectorError
    # with the message: "Cannot connect to host 127.0.0.1:17071 ssl:default"
    instance_id = await client.start_new(function_name, None, None)
    logging.info(f"Orchestration '{function_name}' started with ID = '{instance_id}'.")
    return client.create_check_status_response(req, instance_id)
Unit test try
Then I tried to write some code to trigger this Client function, like I did for some "classic" Azure Functions:
import asyncio
import json
import azure.functions as func
# 'main' is the Client function above; import it from your function's module
# (the exact import path depends on your project layout)

if __name__ == "__main__":
    # Build a simple request to trigger the Client function
    req = func.HttpRequest(
        method="GET",
        body=None,
        url="don't care?",
        # What orchestrator do you want to trigger?
        route_params={"functionName": "FooOrchestrator"},
    )
    # I copy-pasted the data that I obtained when I ran the Durable Function
    # with "func host start"
    starter = {
        "taskHubName": "TestHubName",
        "creationUrls": {
            "createNewInstancePostUri": "http://localhost:7071/runtime/webhooks/durabletask/orchestrators/{functionName}[/{instanceId}]?code=aakw1DfReOkYCTFMdKPaA1Q6bSfnHZ/0lzvKsS6MVXCJdp4zhHKDJA==",
            "createAndWaitOnNewInstancePostUri": "http://localhost:7071/runtime/webhooks/durabletask/orchestrators/{functionName}[/{instanceId}]?timeout={timeoutInSeconds}&pollingInterval={intervalInSeconds}&code=aakw1DfReOkYCTFMdKPaA1Q6bSfnHZ/0lzvKsS6MVXCJdp4zhHKDJA==",
        },
        "managementUrls": {
            "id": "INSTANCEID",
            "statusQueryGetUri": "http://localhost:7071/runtime/webhooks/durabletask/instances/INSTANCEID?taskHub=TestHubName&connection=Storage&code=aakw1DfReOkYCTFMdKPaA1Q6bSfnHZ/0lzvKsS6MVXCJdp4zhHKDJA==",
            "sendEventPostUri": "http://localhost:7071/runtime/webhooks/durabletask/instances/INSTANCEID/raiseEvent/{eventName}?taskHub=TestHubName&connection=Storage&code=aakw1DfReOkYCTFMdKPaA1Q6bSfnHZ/0lzvKsS6MVXCJdp4zhHKDJA==",
            "terminatePostUri": "http://localhost:7071/runtime/webhooks/durabletask/instances/INSTANCEID/terminate?reason={text}&taskHub=TestHubName&connection=Storage&code=aakw1DfReOkYCTFMdKPaA1Q6bSfnHZ/0lzvKsS6MVXCJdp4zhHKDJA==",
            "rewindPostUri": "http://localhost:7071/runtime/webhooks/durabletask/instances/INSTANCEID/rewind?reason={text}&taskHub=TestHubName&connection=Storage&code=aakw1DfReOkYCTFMdKPaA1Q6bSfnHZ/0lzvKsS6MVXCJdp4zhHKDJA==",
            "purgeHistoryDeleteUri": "http://localhost:7071/runtime/webhooks/durabletask/instances/INSTANCEID?taskHub=TestHubName&connection=Storage&code=aakw1DfReOkYCTFMdKPaA1Q6bSfnHZ/0lzvKsS6MVXCJdp4zhHKDJA==",
            "restartPostUri": "http://localhost:7071/runtime/webhooks/durabletask/instances/INSTANCEID/restart?taskHub=TestHubName&connection=Storage&code=aakw1DfReOkYCTFMdKPaA1Q6bSfnHZ/0lzvKsS6MVXCJdp4zhHKDJA==",
        },
        "baseUrl": "http://localhost:7071/runtime/webhooks/durabletask",
        "requiredQueryStringParameters": "code=aakw1DfReOkYCTFMdKPaA1Q6bSfnHZ/0lzvKsS6MVXCJdp4zhHKDJA==",
        "rpcBaseUrl": "http://127.0.0.1:17071/durabletask/",
    }
    # I need to use async methods because the "main" of the Client
    # uses async.
    response = asyncio.get_event_loop().run_until_complete(
        main(req, starter=json.dumps(starter))
    )
But unfortunately the Client function still fails in the await client.start_new(function_name, None, None) part.
How could I write some unit tests for my Durable Azure Function in Python?
Technical information
Python version: 3.9
Azure Functions Core Tools version 4.0.3971
Function Runtime Version: 4.0.1.16815
Not sure if this will help, but there is a sample project on unit testing Durable Functions for Python that covers what you are looking for: https://github.com/kemurayama/durable-functions-for-python-unittest-sample
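Another pragmatic option is to mock DurableOrchestrationClient so the Client function runs without a Functions host at all. A sketch with unittest.mock; the FooStarter module path is hypothetical, adjust it to your project layout:
import asyncio
import unittest
from unittest import mock
import azure.functions as func
from FooStarter import main  # hypothetical module containing the Client function

class TestClientFunction(unittest.TestCase):
    @mock.patch("FooStarter.df.DurableOrchestrationClient")
    def test_starts_orchestration(self, client_cls):
        client = client_cls.return_value
        client.start_new = mock.AsyncMock(return_value="abc123")
        client.create_check_status_response.return_value = func.HttpResponse("ok")
        req = func.HttpRequest(
            method="GET",
            url="/api/orchestrators/FooOrchestrator",
            body=b"",
            route_params={"functionName": "FooOrchestrator"},
        )
        asyncio.run(main(req, starter="{}"))
        client.start_new.assert_awaited_once()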

Azure Functions for python, structured logging to Application Insights?

When using Python (3.8) in Azure Functions, is there a way to send structured logs to Application Insights? More specifically, I'm trying to send custom dimensions with a log message. All I could find about logging is this very brief section.
Update 0127:
It's solved, as per this GitHub issue. Here is the sample code:
# Change Instrumentation Key and Ingestion Endpoint before you run this function app
import logging
import azure.functions as func
from opencensus.ext.azure.log_exporter import AzureLogHandler

logger_opencensus = logging.getLogger('opencensus')
logger_opencensus.addHandler(
    AzureLogHandler(
        connection_string='InstrumentationKey=aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee;IngestionEndpoint=https://eastus-6.in.applicationinsights.azure.com/'
    )
)

def main(req: func.HttpRequest) -> func.HttpResponse:
    properties = {
        'custom_dimensions': {
            'key_1': 'value_1',
            'key_2': 'value_2'
        }
    }
    logger_opencensus.info('logger_opencensus.info Custom Dimension', extra=properties)
    logger_opencensus.info('logger_opencensus.info Statement')
    return func.HttpResponse("OK")
Please try OpenCensus Python SDK.
The example code is in the Logs section, step 5:
Description: You can also add custom properties to your log messages in the extra keyword argument by using the custom_dimensions field. These properties appear as key-value pairs in customDimensions in Azure Monitor.
The sample:
import logging
from opencensus.ext.azure.log_exporter import AzureLogHandler
logger = logging.getLogger(__name__)
# TODO: replace the all-zero GUID with your instrumentation key.
logger.addHandler(AzureLogHandler(
connection_string='InstrumentationKey=00000000-0000-0000-0000-000000000000')
)
properties = {'custom_dimensions': {'key_1': 'value_1', 'key_2': 'value_2'}}
# Use properties in logging statements
logger.warning('action', extra=properties)

Dynamic updates in real time to a django template

I'm building a Django app that will provide real-time data. I'm fairly new to Django, and now I'm focusing on how to update my data in real time, without having to reload the whole page.
Some clarification: the real-time data should be updated regularly, not only through a user input.
View
def home(request):
    symbol = "BTCUSDT"
    tst = client.get_ticker(symbol=symbol)  # 'client' is an API client (e.g. python-binance) created elsewhere
    test = tst['lastPrice']
    context = {"test": test}
    return render(request, "main/home.html", context)
Template
<h3> var: {{test}} </h3>
I already asked this question, but I'm having some doubts:
I've been told to use Ajax, and that's OK, but is Ajax good for this case, where the page's data is updated in real time every x seconds?
I have also been told to use DRF (Django Rest Framework). I've been digging through it a lot, but it's not clear to me how it works in this particular case.
Below is a checklist of the actions needed to implement a solution based on Websocket and Django Channels, as suggested in a previous comment.
The motivations for this choice are given at the end.
1) Connect to the Websocket and prepare to receive messages
On the client, you need to execute the following JavaScript code:
<script language="javascript">
    var ws_url = 'ws://' + window.location.host + '/ws/ticks/';
    var ticksSocket = new WebSocket(ws_url);
    ticksSocket.onmessage = function(event) {
        var data = JSON.parse(event.data);
        console.log('data', data);
        // do whatever required with received data ...
    };
</script>
Here we open the Websocket, then process the notifications sent by the server in the onmessage callback.
Possible improvements:
support SSL connections
use ReconnectingWebSocket: a small wrapper on WebSocket API that automatically reconnects
<script language="javascript">
    var prefix = (window.location.protocol == 'https:') ? 'wss://' : 'ws://';
    var ws_url = prefix + window.location.host + '/ws/ticks/';
    var ticksSocket = new ReconnectingWebSocket(ws_url);
    ...
</script>
2) Install and configure Django Channels and Channel Layers
To configure Django Channels, follow these instructions:
https://channels.readthedocs.io/en/latest/installation.html
Channel Layers is an optional component of Django Channels which provides a "group" abstraction which we'll use later; you can follow the instructions given here:
https://channels.readthedocs.io/en/latest/topics/channel_layers.html#
3) Publish the Websocket endpoint
Routing provides, for Websocket (and other protocols), a mapping between the published endpoints and the associated server-side code, much as urlpatterns does for HTTP in a traditional Django project.
file routing.py
from django.urls import path
from channels.routing import ProtocolTypeRouter, URLRouter
from . import consumers

application = ProtocolTypeRouter({
    "websocket": URLRouter([
        path("ws/ticks/", consumers.TicksSyncConsumer),
    ]),
})
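One version caveat, as an aside: this routing targets Channels 2.x. On Channels 3.x and later, the consumer class must be wrapped with as_asgi():
path("ws/ticks/", consumers.TicksSyncConsumer.as_asgi()),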
4) Write the consumer
The Consumer is a class which provides handlers for standard (and, possibly, custom) Websocket events. In a sense, it does for Websocket what a Django view does for HTTP.
In our case:
websocket_connect(): we accept the connection and register the incoming client to the "ticks" group
websocket_disconnect(): clean up by removing the client from the group
new_ticks(): our custom handler, which forwards the received ticks to its Websocket client
I assume TICKS_GROUP_NAME is a constant string value defined in the project's settings.
file consumers.py:
from django.conf import settings
from asgiref.sync import async_to_sync
from channels.consumer import SyncConsumer

class TicksSyncConsumer(SyncConsumer):

    def websocket_connect(self, event):
        self.send({
            'type': 'websocket.accept'
        })
        # Join ticks group
        async_to_sync(self.channel_layer.group_add)(
            settings.TICKS_GROUP_NAME,
            self.channel_name
        )

    def websocket_disconnect(self, event):
        # Leave ticks group
        async_to_sync(self.channel_layer.group_discard)(
            settings.TICKS_GROUP_NAME,
            self.channel_name
        )

    def new_ticks(self, event):
        self.send({
            'type': 'websocket.send',
            'text': event['content'],
        })
5) And finally: broadcast the new ticks
For example:
ticks = [
    {'symbol': 'BTCUSDT', 'lastPrice': 1234, ...},
    ...
]
broadcast_ticks(ticks)
where:
import json
from asgiref.sync import async_to_sync
from django.conf import settings
import channels.layers

def broadcast_ticks(ticks):
    channel_layer = channels.layers.get_channel_layer()
    async_to_sync(channel_layer.group_send)(
        settings.TICKS_GROUP_NAME, {
            "type": 'new_ticks',
            "content": json.dumps(ticks),
        })
We need to enclose the call to group_send() in the async_to_sync() wrapper, as channels.layers provides only the async implementation and we're calling it from a sync context. Many more details on this are given in the Django Channels documentation.
Notes:
make sure that the "type" attribute matches the name of the consumer's handler (that is: 'new_ticks'); this is required
every client has its own consumer; so when we wrote self.send() in the consumer's handler, that meant: send the data to a single client
here, we send the data to the "group" abstraction instead, and Channel Layers in turn will deliver it to every registered consumer (a sketch of a possible caller follows)
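As a hedged illustration of where broadcast_ticks() could be called from (the steps above leave this open), a polling loop in a Django management command; client.get_ticker and broadcast_ticks come from the questioner's project and are assumed to be importable:
# file management/commands/poll_ticks.py (hypothetical location)
import time
from django.core.management.base import BaseCommand
# broadcast_ticks and the API client 'client' are assumed importable
# from the questioner's project.

class Command(BaseCommand):
    help = "Poll the ticker and broadcast changed ticks over Channels"

    def handle(self, *args, **options):
        last_price = None
        while True:
            tick = client.get_ticker(symbol="BTCUSDT")
            if tick['lastPrice'] != last_price:
                last_price = tick['lastPrice']
                broadcast_ticks([tick])
            time.sleep(1)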
Motivations
Polling is still the most appropriate choice in some cases, being simple and effective.
However, on some occasions you might suffer a few limitations:
you keep querying the server even when no new data are available
you introduce some latency (in the worst case, the full period of the polling). The tradeoff is: less latency = more traffic.
With Websocket, you can instead notify the clients only when (and as soon as) new data are available, by sending them a specific message.
AJAX calls and REST APIs are the combination you are looking for. For real-time updates of data, polling a REST API at regular intervals is the best option you have. Something like:
function doPoll() {
    $.post('<api_endpoint_here>', function(data) {
        // Do operation to update the data here
        setTimeout(doPoll, <how_much_delay>);
    });
}
Now add Django Rest Framework to your project; they have a simple tutorial here. Create an API endpoint which will return the data as JSON, and use that URL in the AJAX call; a minimal sketch follows.
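A hedged sketch of such an endpoint; the client.get_ticker call is the one from the question and is assumed to be importable:
# views.py -- a minimal DRF endpoint returning the latest price as JSON
from rest_framework.decorators import api_view
from rest_framework.response import Response

@api_view(['GET'])
def last_price(request):
    tst = client.get_ticker(symbol="BTCUSDT")  # the question's API client, assumed available
    return Response({"test": tst['lastPrice']})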
Now you might be confused, because you passed the data into the template as context while rendering the page from your home view. That's not going to work anymore. You'll have to add a script to update the value of the element, like:
document.getElementById("element_id").value = "New Value";
where element_id is the id you give to the element, and "New Value" is the data you get from the response of the AJAX call.
I hope this gives you a basic context.

python stackdriver google functions webhook listener

I'm trying to create a Stackdriver webhook listener in Google Cloud Functions, using the following script:
import sys
import logging
import json
from flask import Flask
from flask import Response, request

def webhook(request):
    logging.info("Stackdriver ga360_merge_ready starting up on %s" % (str.replace(sys.version, '\n', ' ')))
    app = Flask(__name__)

    @app.route('/', methods=['POST'])
    def simple_handler():
        """ Handle a webhook post with no authentication method """
        json_data = json.loads(request.data)
        logging.info(json.dumps(json_data, indent=4))
        return Response("OK")
For the above, I have the following URL:
https://xxxxx.cloudfunctions.net/webhook
"webhook" is the cloud functions name. when I put this URL in with an ending slash, as per the code, it doesn't seem to send across the message in from stackdriver, essentially, I want the message to also come through, presently, all I get is the below three log entries:
Not sure what I'm missing, I'm new to the python/webhooks world
Your simple_handler is never being called because the request is never being routed to the app you've created.
Is there a reason your function is set up like that? I would expect it to be something like this instead:
import sys
import logging
import json
from flask import Response

logging.info("Stackdriver ga360_merge_ready starting up on %s" % (str.replace(sys.version, '\n', ' ')))

def webhook(request):
    """ Handle a webhook post with no authentication method """
    logging.info(json.dumps(request.get_json(), indent=4))
    return Response("OK")
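To sanity-check the listener once deployed, a quick hedged test post; the URL is the questioner's placeholder, and the body shape is just an example rather than the exact Stackdriver payload:
import requests

r = requests.post(
    "https://xxxxx.cloudfunctions.net/webhook",  # placeholder URL from the question
    json={"incident": {"state": "open"}},        # example body, not the exact Stackdriver schema
)
print(r.status_code, r.text)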
