According to the Azure ServiceBus docs here:
The ServiceBusReceiver class defines a high level interface for receiving messages from the Azure Service Bus Queue or Topic Subscription. The two primary channels for message receipt are receive() to make a single request for messages, and async for message in receiver: to continuously receive incoming messages in an ongoing fashion.
I have been attempting to use the async for message in receiver: advice to trigger a function every time a message comes in, but I'm unsure how to do it right, as I have little experience working with async functions. Could someone familiar with async/Service Bus explain how the code should be structured?
Edit: Let me provide some more context. I am creating a python flask service, and on start-up, I need it to start listening to messages on a topic/subscription_name. Whenever it receives a message, it will execute some code, then send a message back. So... how do I start an async listener on startup, and have it execute some code whenever it is triggered? It should also be able to process each message in a non-blocking way. So if two messages are received at once, both should be processed at the same time.
Note: I cannot use Azure Functions.
Assuming you are using a topic subscription, you can use the code below:
#!/usr/bin/env python
# --------------------------------------------------------------------------------------------
# Copyright (c) Microsoft Corporation. All rights reserved.
# Licensed under the MIT License. See License.txt in the project root for license information.
# --------------------------------------------------------------------------------------------
"""
Example to show receiving batch messages from a Service Bus Subscription under specific Topic asynchronously.
"""
# pylint: disable=C0111
import os
import asyncio
from azure.servicebus.aio import ServiceBusClient
CONNECTION_STR = os.environ['SERVICE_BUS_CONNECTION_STR']
TOPIC_NAME = os.environ["SERVICE_BUS_TOPIC_NAME"]
SUBSCRIPTION_NAME = os.environ["SERVICE_BUS_SUBSCRIPTION_NAME"]
async def main():
    servicebus_client = ServiceBusClient.from_connection_string(conn_str=CONNECTION_STR)
    async with servicebus_client:
        receiver = servicebus_client.get_subscription_receiver(
            topic_name=TOPIC_NAME,
            subscription_name=SUBSCRIPTION_NAME
        )
        async with receiver:
            received_msgs = await receiver.receive_messages(max_message_count=10, max_wait_time=5)
            for msg in received_msgs:
                print(str(msg))
                await receiver.complete_message(msg)

loop = asyncio.get_event_loop()
loop.run_until_complete(main())
Complete tutorial: Send messages to an Azure Service Bus topic and receive messages from subscriptions to the topic (Python)
Further, you can explore these samples (both sync and async versions): Azure Service Bus client library for Python Samples
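If you want the continuous async for message in receiver: pattern from the question, rather than a single batched receive, here is a minimal sketch (it reuses the CONNECTION_STR/TOPIC_NAME/SUBSCRIPTION_NAME variables above; handle_message is a placeholder for your own processing and reply logic):
import asyncio
from azure.servicebus.aio import ServiceBusClient

async def handle_message(receiver, msg):
    # Your processing and reply logic would go here.
    print(str(msg))
    await receiver.complete_message(msg)  # settle the message once processing succeeded

async def listen():
    client = ServiceBusClient.from_connection_string(conn_str=CONNECTION_STR)
    async with client:
        receiver = client.get_subscription_receiver(
            topic_name=TOPIC_NAME,
            subscription_name=SUBSCRIPTION_NAME
        )
        async with receiver:
            # Runs until cancelled; each message is handled in its own task,
            # so two messages received close together are processed concurrently.
            async for msg in receiver:
                asyncio.create_task(handle_message(receiver, msg))

asyncio.run(listen())
Since Flask itself is synchronous, one option is to run this coroutine on a background event loop (for example, in a thread started at app startup); that part is independent of the Service Bus SDK and depends on how you deploy the app.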
Related
I am new to the OCPP protocol and I am building a Python OCPP server that can communicate with an EV charger using OCPP. This server has the feature "Authenticate user via RFID". I have created two Python files. The first is Charge_Station.py:
# Charge_Station.py
import asyncio
import logging
import websockets

from ocpp.v201 import call
from ocpp.v201 import ChargePoint as cp

logging.basicConfig(level=logging.INFO)


class ChargePoint(cp):
    async def authentication(self):
        request = call.AuthorizePayload(
            id_token={'id_token': 'AA12345',
                      'type': 'ISO14443'})
        response = await self.call(request)
        print(response)


async def main():
    async with websockets.connect(
        'ws://localhost:9000/CP_1',
        subprotocols=['ocpp2.0.1']
    ) as ws:
        cp = ChargePoint('CP_1', ws)
        await asyncio.gather(cp.start(), cp.authentication())


if __name__ == '__main__':
    asyncio.run(main())
and Central_System.py:
# Central_System.py
import asyncio
import logging
from datetime import datetime

import websockets

from ocpp.routing import on
from ocpp.v201 import ChargePoint as cp
from ocpp.v201 import call_result
from ocpp.v201.enums import AuthorizationStatusType, Action

logging.basicConfig(level=logging.INFO)


class ChargePoint(cp):
    @on('BootNotification')
    async def on_boot_notification(self, charging_station, reason, **kwargs):
        return call_result.BootNotificationPayload(
            current_time=datetime.utcnow().isoformat(),
            interval=10,
            status='Accepted'
        )

    @on(Action.Authorize)
    async def on_authorize(self, id_token):
        return call_result.AuthorizePayload(
            id_token_info={"status": AuthorizationStatusType.accepted}
        )


async def on_connect(websocket, path):
    """ For every new charge point that connects, create a ChargePoint
    instance and start listening for messages.
    """
    try:
        requested_protocols = websocket.request_headers[
            'Sec-WebSocket-Protocol']
    except KeyError:
        logging.info("Client hasn't requested any Subprotocol. "
                     "Closing Connection")
        return await websocket.close()

    if websocket.subprotocol:
        logging.info("Protocols Matched: %s", websocket.subprotocol)
    else:
        # In the websockets lib if no subprotocols are supported by the
        # client and the server, it proceeds without a subprotocol,
        # so we have to manually close the connection.
        logging.warning('Protocols Mismatched | Expected Subprotocols: %s,'
                        ' but client supports %s | Closing connection',
                        websocket.available_subprotocols,
                        requested_protocols)
        return await websocket.close()

    charge_point_id = path.strip('/')
    cp = ChargePoint(charge_point_id, websocket)
    logging.info("abcxyz: %s", charge_point_id)

    await cp.start()


async def main():
    server = await websockets.serve(
        on_connect,
        '0.0.0.0',
        9000,
        subprotocols=['ocpp2.0.1']
    )
    logging.info("WebSocket Server Started")
    await server.wait_closed()


if __name__ == '__main__':
    asyncio.run(main())
Following the document here, I understand that the user must present an RFID card first, then the Charge Station will send an AuthorizeRequest containing the idToken from this RFID card to the Central System, and then the Central System will send an AuthorizeResponse back to the Charge Station. In the two Python files above, I have implemented the process where the Charge Station sends an AuthorizeRequest to the Central System and the Central System sends back an AuthorizeResponse to the Charge Station. This picture demonstrates these processes.
My questions are:
How can I implement the process where the EV driver presents an RFID card to the Charge Station? Should I create two other Python files to represent the EV driver and the RFID card?
How can I know whether the Central System accepts this authentication, and how do I implement this?
Any help will be appreciated.
This is a simple flow:
The EV owner registers himself as an EV client on some server, where the server provides a unique id, like "unique-client-id", and stores this value as idTag in a database.
When this client goes to charge at some charging station, he enters that unique id into the charging device, which sends the id in the following form over the websocket connection:
[3, "unique-id-representing-the-current-msg", "Authorize", {"idTag": "unique-client-id"}]
The OCPP server receives that message and looks up the received idTag in the database; if it exists, it sends back a response like the one below:
[4, "unique-id-representing-the-current-msg", {"idTagInfo": {"status": "Accepted"}}]
I recommend using the Sanic framework since it has both websocket and HTTP support by default.
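To your second question (how the Charge Station can know whether the Central System accepted the authentication): the response returned by self.call() in your authentication() method carries the idTokenInfo, so you can branch on its status. A rough sketch, assuming the ocpp library hands the nested payload back as a dict (the prints are just placeholders):
# Inside the ChargePoint class in Charge_Station.py
    async def authentication(self):
        request = call.AuthorizePayload(
            id_token={'id_token': 'AA12345', 'type': 'ISO14443'})
        response = await self.call(request)

        # The AuthorizeResponse carries idTokenInfo; its status field tells you
        # whether the Central System accepted the token.
        if response.id_token_info.get('status') == 'Accepted':
            print('Authorization accepted, charging can start')
        else:
            print('Authorization rejected:', response.id_token_info)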
After importing the AWSIoTMQTTClient module with
from AWSIoTPythonSDK.MQTTLib import AWSIoTMQTTClient
I went ahead and configured the MQTT client connection
myMQTTClient = AWSIoTMQTTClient("my-clientid")
myMQTTClient.configureEndpoint("123abc-ats.iot.us-east-1.amazonaws.com", 8883)
myMQTTClient.configureCredentials(ROOT_KEY, PRIVATE_KEY, CERT)
myMQTTClient.connect()
I defined a helloworld function that I want to use as a callback to catch the messages from the topic:
def helloworld(client, params, packet):
    print('...topic:', packet.topic)
    print('...payload:', packet.payload)
    myMQTTClient.publish(topic="home/fromserver", QoS=1, payload="{'message':'hello from server'}")
Please note that in the last line of the helloworld function I publish a message back to MQTT on the "home/fromserver" topic.
I added two more lines to the script and ran it:
myMQTTClient.subscribe("home/to-server", 1, helloworld)

while True:
    time.sleep(1)
I can fetch the messages from the to-server topic, but publishing the message to the from-server topic crashes with AWSIoTExceptions.publishTimeoutException.
How can I publish a message back to MQTT without raising the publishTimeoutException?
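A sketch of one possible workaround, assuming the timeout happens because the blocking QoS 1 publish is issued from inside the subscribe callback (the callback runs on the client's network thread, so the PUBACK the publish waits for never gets processed): hand the outgoing message to a worker thread instead of publishing from the callback. The queue and thread names here are made up for illustration.
import queue
import threading

outbox = queue.Queue()

def publisher_worker():
    # Publishes from a separate thread so the MQTT network thread is never blocked.
    while True:
        topic, payload = outbox.get()
        myMQTTClient.publish(topic=topic, QoS=1, payload=payload)

threading.Thread(target=publisher_worker, daemon=True).start()

def helloworld(client, params, packet):
    print('...topic:', packet.topic)
    print('...payload:', packet.payload)
    # Queue the reply instead of publishing inside the callback.
    outbox.put(("home/fromserver", "{'message':'hello from server'}"))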
I've been looking for resources but I can't seem to find what I need. I have an Azure function with a Service Bus trigger. From this, I make an HTTP call with one of the values found in the Service Bus message.
An additional requirement for me is to dead-letter a message if the HTTP call fails. But as I understand it, the message is no longer present in the subscription because it was properly received. Is there a way for me to keep the message in the subscription, and then dispose of it once the call succeeds (or transfer it to the DLQ if not)?
I found this piece of code, but I'm not sure how it sends messages to the DLQ:
https://github.com/Azure/azure-sdk-for-python/blob/azure-servicebus_7.3.0/sdk/servicebus/azure-servicebus/samples/sync_samples/receive_deadlettered_messages.py
"""
Example to show receiving dead-lettered messages from a Service Bus Queue.
"""
# pylint: disable=C0111
import os
from azure.servicebus import ServiceBusClient, ServiceBusMessage, ServiceBusSubQueue
CONNECTION_STR = os.environ['SERVICE_BUS_CONNECTION_STR']
QUEUE_NAME = os.environ["SERVICE_BUS_QUEUE_NAME"]
servicebus_client = ServiceBusClient.from_connection_string(conn_str=CONNECTION_STR)
with servicebus_client:
sender = servicebus_client.get_queue_sender(queue_name=QUEUE_NAME)
messages = [ServiceBusMessage("Message to be deadlettered") for _ in range(10)]
with sender:
sender.send_messages(messages)
print('dead lettering messages')
receiver = servicebus_client.get_queue_receiver(queue_name=QUEUE_NAME)
with receiver:
received_msgs = receiver.receive_messages(max_message_count=10, max_wait_time=5)
for msg in received_msgs:
print(str(msg))
receiver.dead_letter_message(msg)
print('receiving deadlettered messages')
dlq_receiver = servicebus_client.get_queue_receiver(queue_name=QUEUE_NAME, sub_queue=ServiceBusSubQueue.DEAD_LETTER)
with dlq_receiver:
received_msgs = dlq_receiver.receive_messages(max_message_count=10, max_wait_time=5)
for msg in received_msgs:
print(str(msg))
dlq_receiver.complete_message(msg)
print("Receive is done.")
Here is a code snippet from my function:
async def main(msg: func.ServiceBusMessage):
    try:
        logging.info('Python ServiceBus queue trigger processed message: %s',
                     msg.get_body().decode('utf-8'))
        await asyncio.gather(wait(), wait())
        result = json.dumps({
            'message_id': msg.message_id,
            'metadata': msg.metadata
        })
        msgobj = json.loads(result)
        val = msgobj['metadata']['value']
        run_pipeline(val, msg)
    except Exception as e:
        logging.error(f"trigger failed: {e}")
TLDR; How do I keep the message in the subscription and either dispose them (if successful) or send them to the DLQ if not?
The code that you pasted is for receiving dead-lettered messages from the dead-letter queue.
I found some code in the docs. You can use this snippet from their example:
from azure.servicebus import ServiceBusClient
import os

connstr = os.environ['SERVICE_BUS_CONNECTION_STR']
queue_name = os.environ['SERVICE_BUS_QUEUE_NAME']

with ServiceBusClient.from_connection_string(connstr) as client:
    with client.get_queue_receiver(queue_name) as receiver:
        for msg in receiver:
            print(str(msg))
            receiver.dead_letter_message(msg)
You can look at using the above code in your exception handler.
There are four methods to settle a message after receipt:
Complete:
Declares the message processing to be successfully completed, removing the message from the queue.
receiver.complete_message(msg)
Abandon:
Abandon processing of the message for the time being, returning the message immediately back to the queue to be picked up by another (or the same) receiver.
receiver.abandon_message(msg)
DeadLetter:
Transfer the message from the primary queue into the DLQ.
receiver.dead_letter_message(msg)
Defer:
Defer is subtly different from the prior settlement methods. It prevents the message from being directly received from the queue by setting it aside.
receiver.defer_message(msg)
To answer your question "How do I keep the message in the subscription and either dispose them (if successful) or send them to the DLQ if not?":
keep the message in the subscription: use abandon_message
dispose them (if successful): use complete_message
send them to the DLQ: use dead_letter_message
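Putting those together for your case, here is a rough sketch using the sync client like the snippets above (call_http, topic_name and subscription_name are placeholders; the receiver is assumed to use the default PEEK_LOCK receive mode, which is what makes settlement possible):
with ServiceBusClient.from_connection_string(connstr) as client:
    with client.get_subscription_receiver(topic_name, subscription_name) as receiver:
        for msg in receiver:
            try:
                call_http(str(msg))                # placeholder for your HTTP call
                receiver.complete_message(msg)     # success: remove from the subscription
            except Exception:
                receiver.dead_letter_message(msg)  # failure: move to the DLQ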
My Paho MQTT client does the following:
Subscribe to mytopic/#
Do something
Publish to mytopic/#
Problem:
The published message in step 3 arrives at step 1. I'd like to avoid adding a sender-attribute to the payload.
Is there a proper way of ignoring self-published messages? Something like the following (pseudocode):
def on_message(self, client, userdata, message):
    if client.id == message.sender_client_id:  # Is there anything like the sender_client_id?
        return
Any idea? Thanks!
As of the MQTT v5 spec, you can tell the broker not to send your own messages back to you as part of the subscription request.
This removes the need to add an identifier to the payload just so you can ignore your own messages.
This does of course rely on both the broker and the MQTT client supporting MQTT v5.
This logic should work (a minimal sketch follows the list):
Assign an id to every client
Every client publishes on mytopic/{id}
Every client subscribes to mytopic/#
Ignore messages where message.topic starts with mytopic/{id}
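A minimal sketch of that scheme with paho-mqtt (the id client-42, the broker address, and the topic names are made up for illustration):
import paho.mqtt.client as mqtt

MY_ID = 'client-42'                        # unique id assigned to this client
MY_PREFIX = 'mytopic/' + MY_ID

def on_connect(client, userdata, flags, rc):
    client.subscribe('mytopic/#')          # listen to every client's subtopic
    client.publish(MY_PREFIX, b'hello')    # publish only on this client's own subtopic

def on_message(client, userdata, message):
    if message.topic.startswith(MY_PREFIX):
        return                             # ignore messages this client published itself
    print(message.topic, message.payload)

client = mqtt.Client(client_id=MY_ID)
client.on_connect = on_connect
client.on_message = on_message
client.connect('localhost', 1883)
client.loop_forever()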
If you are using MQTT v5, you can pass the noLocal option to the paho client when subscribing. This option tells the broker not to send back your own messages.
from paho.mqtt.subscribeoptions import SubscribeOptions
...
options = SubscribeOptions(qos=1, noLocal=True)
client.subscribe('mytopic/#', options=options)
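Note that, as far as I know, this only takes effect if the client actually connects with the MQTT v5 protocol, e.g.:
import paho.mqtt.client as mqtt
client = mqtt.Client(client_id='my-client', protocol=mqtt.MQTTv5)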
def on_message(self, client, userdata, message):
    if client.id == message.sender_client_id:  # Is there anything like the sender_client_id?
        return
In your pseudocode, you are asking for the client's identity, but that goes against the MQTT specification. In MQTT, two different clients are unaware of each other's identity; they only communicate via the MQTT broker by subscribing to topics.
I have my software running on a bunch of clients around my network. I've been playing around with RabbitMQ as a solution for passing messages between each client.
My test code is this:
#!/usr/bin/python2
import pika
import time

connection = pika.AsyncoreConnection(pika.ConnectionParameters(
        'localhost'))
channel = connection.channel()

def callback(ch, method, properties, body):
    # send messages back on certain events
    if body == '5':
        channel.basic_publish(exchange='',
                              routing_key='test',
                              body='works')
    print body

channel.queue_declare(queue='test')
channel.basic_consume(callback, queue='test', no_ack=True)

for i in range(0, 8):
    channel.basic_publish(exchange='',
                          routing_key='test',
                          body='{}'.format(i))
    time.sleep(0.5)

channel.close()
Picture this as kind of a 'chat program'. Each client will need to constantly listen for messages. At times, the client will need to send messages back to the server.
This code works, but I've run into an issue. When the code above sends out the message "works", it then retrieves that message again from the RabbitMQ queue. Is there a way to have my client, which is both a producer and a consumer, not receive the message it just sent?
I can't see this functionality built into RabbitMQ so I figured I'd send messages in the form of:
body='{"client_id" : 1, "message" : "this is the message"}'
Then I can parse that string and check the client_id. The client can then ignore all messages not destined for it.
Is there a better way? Should I look for an alternative to RabbitMQ?
You can have as many queues as you want in RabbitMQ. Why not have a queue for messages to the server as well as a queue for each client?
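For illustration, here is a rough sketch of that layout using the current pika BlockingConnection API (the queue names and the client id are made up; the original question used the older AsyncoreConnection API):
import pika

CLIENT_ID = 'client-1'

connection = pika.BlockingConnection(pika.ConnectionParameters('localhost'))
channel = connection.channel()

# One queue for messages addressed to the server, plus one queue per client.
channel.queue_declare(queue='to-server')
channel.queue_declare(queue='to-' + CLIENT_ID)

def callback(ch, method, properties, body):
    # Reply on the server's queue rather than the queue this client consumes from,
    # so the client never receives its own messages.
    if body == b'5':
        ch.basic_publish(exchange='', routing_key='to-server', body=b'works')
    print(body)

# Each client consumes only its own queue.
channel.basic_consume(queue='to-' + CLIENT_ID, on_message_callback=callback, auto_ack=True)
channel.start_consuming()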