RabbitMQ combination of topic exchange and RPC works only once (Python)

I'm experimenting with creating a combination of the topic exchange mentioned in tutorial #5 and RPC mentioned in tutorial #6, and while it works once, it doesn't work again unless I restart the consumer code.
In the client code, which runs on the machine with the RabbitMQ server, I have register_request(), which receives a message (from a higher layer built with Flask), adds it to the exchange based on a routing key, and then waits for a response. The callback reply_queue_callback() adds responses to a dictionary keyed by correlation ID.
import atexit
import uuid
from ast import literal_eval

import pika

# EXCHANGE_NAME is a constant defined elsewhere in the module
class QueueManager(object):

    def __init__(self):
        """
        Initializes an exchange and a reply queue.
        """
        self.responses = {}
        self.connection = pika.BlockingConnection(pika.ConnectionParameters(host="localhost"))
        atexit.register(self.close_connection)
        self.channel = self.connection.channel()
        self.channel.exchange_declare(exchange=EXCHANGE_NAME, type="topic")
        result = self.channel.queue_declare(exclusive=True)
        self.reply_queue = result.method.queue
        self.channel.basic_consume(self.reply_queue_callback, no_ack=True, queue=self.reply_queue)

    def close_connection(self):
        """
        Closes the connection to RabbitMQ. Runs upon destruction of the instance.
        """
        print "*** Closing queue connection..."
        self.connection.close()

    def reply_queue_callback(self, ch, method, props, body):
        """
        A callback that is executed when there's a new message in the reply queue.
        """
        self.responses[props.correlation_id] = literal_eval(body)

    def register_request(self, routing_key, message):
        """
        Adds a message to the exchange.
        """
        corr_id = str(uuid.uuid4())
        self.channel.basic_publish(exchange=EXCHANGE_NAME, routing_key=routing_key,
                                   properties=pika.BasicProperties(
                                       reply_to=self.reply_queue,
                                       correlation_id=corr_id),
                                   body=message)
        print "*** Sent request with correlation ID", corr_id
        return corr_id

    def fetch_response(self, corr_id):
        """
        A polling function that waits for a message in the reply queue.
        """
        print "Waiting for a response..."
        while not self.responses.get(corr_id):
            self.connection.process_data_events()
        return self.responses.pop(corr_id)
In the consumer's code, which runs on a separate machine, receive_requests() is the main function and request_callback() is the callback function for a new message.
import pika

# RABBITMQ_IP, EXCHANGE_NAME, BINDING_KEYS and produce_response() are defined elsewhere

def request_callback(ch, method, props, message):
    """
    A callback that is executed when a relevant message is found in the exchange.
    """
    print "Pulled a request with correlation ID %s" % props.correlation_id
    response = produce_response(message)
    print "Produced a response, publishing..."
    ch.basic_publish(exchange="",
                     routing_key=props.reply_to,
                     properties=pika.BasicProperties(correlation_id=props.correlation_id),
                     body=response)
    ch.basic_ack(delivery_tag=method.delivery_tag)
    print " [*] Waiting for new messages\n"

def receive_requests():
    """
    The main loop. Opens a connection to the RabbitMQ server and consumes messages from the exchange.
    """
    connection = pika.BlockingConnection(pika.ConnectionParameters(host=RABBITMQ_IP))
    channel = connection.channel()
    channel.exchange_declare(exchange=EXCHANGE_NAME, type="topic")
    result = channel.queue_declare(exclusive=True)
    queue_name = result.method.queue
    for binding_key in BINDING_KEYS:
        channel.queue_bind(exchange=EXCHANGE_NAME, queue=queue_name, routing_key=binding_key)
    channel.basic_consume(request_callback, queue=queue_name, no_ack=True)
    try:
        print(" [*] Waiting for messages. To exit press CTRL+C\n")
        channel.start_consuming()
    except KeyboardInterrupt:
        print "Aborting..."
When I produce a message the first time, the consumer handles it and I get a response back, but with the second message it seems that nothing reaches the consumer (the client prints that it added the new message to the exchange, but the consumer doesn't print anything). I assume something's wrong with the consumer, because if after the first message I restart the consumer's code and keep the client running as is, a second message works fine.
Any idea what the problem is? Perhaps I'm missing something in the consumer's callback?

Related

AWS ActiveMQ fetch messages from a Consumer and send to a queue

I am trying to fetch messages from a consumer and send them to a queue. For this I am using stomp.py. After going through articles and posts, I wrote the code below:
import ssl
import time

import stomp

stompurl = "xxxxxxxx.mq.us-west-2.amazonaws.com"
stompuser = "stomuser"
stomppass = "password"

class MyListener(stomp.ConnectionListener):
    def __init__(self):
        self.msg_list = []

    def on_error(self, frame):
        self.msg_list.append('(ERROR) ' + frame.body)

    def on_message(self, frame):
        self.msg_list.append(frame.body)

conn = stomp.Connection(host_and_ports=[(stompurl, "61614")], auto_decode=True)
conn.set_ssl(for_hosts=[(stompurl, "61614")], ssl_version=ssl.PROTOCOL_TLS)
lst = MyListener()
conn.set_listener('', lst)
conn.connect(stompuser, stomppass, wait=True)
# conn.send(body='Test message', destination='Test_QUEUE')
conn.subscribe('Test_QUEUE', '102')
print(lst.msg_list)  # was print(listener.message_list); set_listener() does not return the listener
time.sleep(2)
messages = lst.msg_list
# conn.disconnect()
print(messages)
With this code I am able to send messages to Test_QUEUE, but I can't fetch all the messages from the consumer. How can I pull all messages from a consumer and post them to a queue for processing?
I'm not a Python + STOMP expert, but in every other language I've used, when you create an asynchronous (i.e. non-blocking) message listener as you have done, you must prevent your application from exiting. You have a time.sleep(2) in there, but is that realistically enough time to fetch all the messages from the queue?
It appears your application will exit after print(messages), which means that if you don't get all the messages during the time.sleep(2), your application will simply terminate.
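As a sketch of that idea (assuming the lst listener from the question; the one-second poll and five-second idle cutoff are arbitrary choices, not part of stomp.py):
import time

# keep the script alive until no new messages have arrived for 5 seconds,
# instead of hoping a fixed 2-second sleep is long enough
idle = 0
while idle < 5:
    seen = len(lst.msg_list)
    time.sleep(1)
    idle = idle + 1 if len(lst.msg_list) == seen else 0

print(lst.msg_list)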

How to keep a message within a string instead of showing it on CMD? (Python)

What does the channel.basic_consume function return?
How can I access the message through a variable? I want to take the consumed message and show it in the browser.
I'm building a Django application that sends messages to RabbitMQ and consumes them, so the messages can be shown in the browser like a chat.
import pika, sys

global message

def consume(room, username):
    credentials = pika.PlainCredentials('admin', 'admin')
    parameters = pika.ConnectionParameters('192.168.1.14', 5672, '/', credentials)
    connection = pika.BlockingConnection(parameters)
    channel = connection.channel()
    channel.exchange_declare(exchange='topic_exchange', exchange_type='topic')
    result = channel.queue_declare('', exclusive=True)
    queue_name = result.method.queue
    arr = [room, username]
    binding_key = '.'.join([str(i) for i in arr])
    channel.queue_bind(exchange='topic_exchange', queue=queue_name, routing_key=binding_key)
    print(' [*] Waiting for logs. To exit press CTRL+C')

    def callback(ch, method, properties, body):
        print(" [x] %r:%r" % (method.routing_key, body))

    channel.basic_consume(queue=queue_name, on_message_callback=callback, auto_ack=True)
    global message
    #message =
    channel.start_consuming()
    return message
This isn't going to work. You are running a process that is consuming the messages and printing them. Some other process (django) is listening for requests from your browser.
Not sure how you hope it will work, but consider these alternatives:
your consumer writes the messages to a file, and django reads that file when it gets a request (a sketch of this follows the list below)
django connects to the message bus and reads all waiting messages when it gets a request
your consumer writes the messages to a database
You build a websocket application that can push messages to currently connected browsers when it receives a message
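A minimal sketch of that first, file-based alternative (the file path and view wiring are illustrative assumptions, not from the original answer):
# consumer side: append each message to a file
def callback(ch, method, properties, body):
    with open("/tmp/chat_messages.txt", "a") as f:
        f.write(body.decode() + "\n")

# django side: a view that reads the file on each request
from django.http import JsonResponse

def messages_view(request):
    try:
        with open("/tmp/chat_messages.txt") as f:
            lines = f.read().splitlines()
    except FileNotFoundError:
        lines = []
    return JsonResponse({"messages": lines})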

Move a message to subscription deadletter for failed HTTP request

I've been looking for resources but I can't seem to find what I need. I have an Azure Function with a Service Bus trigger. From this, I make an HTTP call with one of the values found in the Service Bus message.
An additional requirement for me is to dead-letter a message if the HTTP call fails. But as I understand it, the message is no longer present in the subscription because it was properly received. Is there a way for me to keep the message in the subscription, and then dispose of it once it is successful (transferring it to the DLQ if not)?
I found this piece of code, but I'm not sure how it's sending to the DLQ:
https://github.com/Azure/azure-sdk-for-python/blob/azure-servicebus_7.3.0/sdk/servicebus/azure-servicebus/samples/sync_samples/receive_deadlettered_messages.py
"""
Example to show receiving dead-lettered messages from a Service Bus Queue.
"""
# pylint: disable=C0111
import os
from azure.servicebus import ServiceBusClient, ServiceBusMessage, ServiceBusSubQueue
CONNECTION_STR = os.environ['SERVICE_BUS_CONNECTION_STR']
QUEUE_NAME = os.environ["SERVICE_BUS_QUEUE_NAME"]
servicebus_client = ServiceBusClient.from_connection_string(conn_str=CONNECTION_STR)
with servicebus_client:
sender = servicebus_client.get_queue_sender(queue_name=QUEUE_NAME)
messages = [ServiceBusMessage("Message to be deadlettered") for _ in range(10)]
with sender:
sender.send_messages(messages)
print('dead lettering messages')
receiver = servicebus_client.get_queue_receiver(queue_name=QUEUE_NAME)
with receiver:
received_msgs = receiver.receive_messages(max_message_count=10, max_wait_time=5)
for msg in received_msgs:
print(str(msg))
receiver.dead_letter_message(msg)
print('receiving deadlettered messages')
dlq_receiver = servicebus_client.get_queue_receiver(queue_name=QUEUE_NAME, sub_queue=ServiceBusSubQueue.DEAD_LETTER)
with dlq_receiver:
received_msgs = dlq_receiver.receive_messages(max_message_count=10, max_wait_time=5)
for msg in received_msgs:
print(str(msg))
dlq_receiver.complete_message(msg)
print("Receive is done.")
Here is a code snippet of mine:
import asyncio
import json
import logging

import azure.functions as func

# wait() and run_pipeline() are defined elsewhere in the function app

async def main(msg: func.ServiceBusMessage):
    try:
        logging.info('Python ServiceBus queue trigger processed message: %s',
                     msg.get_body().decode('utf-8'))
        await asyncio.gather(wait(), wait())
        result = json.dumps({
            'message_id': msg.message_id,
            'metadata': msg.metadata
        })
        msgobj = json.loads(result)
        val = msgobj['metadata']['value']
        run_pipeline(val, msg)
    except Exception as e:
        logging.error(f"trigger failed: {e}")
TL;DR: How do I keep messages in the subscription and either dispose of them (if successful) or send them to the DLQ (if not)?
The code that you pasted is for receiving dead-lettered messages from the dead-letter queue.
I found some code in the docs. You can use this snippet from their example:
from azure.servicebus import ServiceBusClient
import os

connstr = os.environ['SERVICE_BUS_CONNECTION_STR']
queue_name = os.environ['SERVICE_BUS_QUEUE_NAME']

with ServiceBusClient.from_connection_string(connstr) as client:
    with client.get_queue_receiver(queue_name) as receiver:
        for msg in receiver:
            print(str(msg))
            receiver.dead_letter_message(msg)
You can look at using the above code in your exception handler.
There are four methods to settle a message after receipt:
Complete:
Declares the message processing to be successfully completed, removing the message from the queue.
receiver.complete_message(msg)
Abandon:
Abandon processing of the message for the time being, returning the message immediately back to the queue to be picked up by another (or the same) receiver.
receiver.abandon_message(msg)
DeadLetter:
Transfer the message from the primary queue into the DLQ.
receiver.dead_letter_message(msg)
Defer:
Defer is subtly different from the prior settlement methods. It prevents the message from being directly received from the queue by setting it aside.
receiver.defer_message(msg)
To answer your question "How do I keep the message in the subscription and either dispose them (if successful) or send them to the DLQ if not?":
keep the message in the subscription: use abandon_message
dispose them (if successful): use complete_message
send them to the DLQ: use dead_letter_message
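Putting that together for the scenario in the question, here is a minimal sketch (the topic and subscription names and call_endpoint() are placeholder assumptions, not from the original post):
import os
from azure.servicebus import ServiceBusClient

def call_endpoint(payload):
    # placeholder for the HTTP call; raise an exception on failure
    raise NotImplementedError

connstr = os.environ['SERVICE_BUS_CONNECTION_STR']

with ServiceBusClient.from_connection_string(connstr) as client:
    receiver = client.get_subscription_receiver(topic_name='my-topic',
                                                subscription_name='my-subscription')
    with receiver:
        for msg in receiver:
            try:
                call_endpoint(str(msg))            # HTTP call with the message value
                receiver.complete_message(msg)     # success: remove from the subscription
            except Exception:
                receiver.dead_letter_message(msg)  # failure: move to the DLQ
Note that this sketch uses a plain receiver rather than the Functions trigger, since the trigger normally settles the message on its own (which matches the "not present in the subscription anymore" observation in the question).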

How to trigger an event when the consumer is defined in a separate class

I'm a newbie to rabbitMQ. I'm trying to implement a simple queue where new messages trigger some logic in a tool I'm running. Right now I have defined the consumer in a separate class, and created an instance of the consumer class in my tool class. I'm not sure this is allowed. Currently, any time a message is received, it is printed to std.out. What I would like to happen is that any time a message is received, the method "do_something" is called with the message body as an input. What should I be doing in the callback method to make that happen?
import pika

class MyTool:
    def __init__(self):
        self.consumer = None

    def start_tool(self, some_input):
        queue_name = some_method(some_input)
        self.consumer = Consumer(queue_name)

    def do_something(self, queue_message):
        # ... do things depending on content of queue_message
        return new_result

class Consumer:
    def __init__(self, queue_name):
        connection = pika.BlockingConnection(pika.ConnectionParameters(host='localhost'))
        channel = connection.channel()
        channel.queue_declare(queue=queue_name)
        channel.basic_consume(queue=queue_name, on_message_callback=self.callback, auto_ack=True)
        print(' [*] Waiting for messages. To exit press CTRL+C')
        channel.start_consuming()

    def callback(self, ch, method, properties, body):
        print(" [x] Received %r" % body)
        return body
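One common way to get do_something() called (a sketch, not from the original thread) is to inject the tool's bound method into the consumer and invoke it from the callback:
import pika

class Consumer:
    def __init__(self, queue_name, on_message):
        # on_message is any callable that accepts the message body,
        # e.g. the tool's bound method my_tool.do_something
        self.on_message = on_message
        connection = pika.BlockingConnection(pika.ConnectionParameters(host='localhost'))
        channel = connection.channel()
        channel.queue_declare(queue=queue_name)
        channel.basic_consume(queue=queue_name, on_message_callback=self.callback, auto_ack=True)
        channel.start_consuming()

    def callback(self, ch, method, properties, body):
        # hand the body to whatever callable was registered
        result = self.on_message(body)
        print(" [x] Processed message, result: %r" % (result,))
MyTool.start_tool() would then create the consumer as Consumer(queue_name, self.do_something). Keep in mind that start_consuming() blocks, so the tool may need to run the consumer in a separate thread.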

Receive a message with RabbitMQ then process it then send back the results

I would like to send a message (directly) from a script, then process it and send back the results.
So it's like a double publish-subscribe.
I have 2 scripts:
Processer
Client
The Client sends a message directly (a simple string) to the Processer, and then the Processer script counts the characters in the string and sends the results back to the Client.
This is how I tried to do it:
The Processer waits for a message, calculates something and then answers back to the original sender.
#Processer.py:
import pika
import sys

#Sends back the score
#addr: Connection address
#exchName: Exchange name (where to send)
#rKey: Name of the queue for direct messages
#score: The detected score
def SendActualScore(addr, exchName, rKey, score):
    #Send the score thru the created channel with the given routing key (queue name)
    channel.basic_publish(exchange=exchName, routing_key=rKey, body=score)
    print "(*) Sent: " + score

#When we receive something this is called
def CallbackImg(ch, method, properties, body):
    print "(*) Received: " + str(body)
    score = str(len(body))
    #Send back the score
    SendActualScore('localhost', 'valami', rKey, score)

#Subscribe connection
#Receive messages thru this
connection = pika.BlockingConnection(pika.ConnectionParameters('localhost'))
channel = connection.channel()

#RECEIVE MESSAGES - Subscribe
channel.exchange_declare(exchange='valami', type='direct')

#Define a queue, where we don't need the name
#After we disconnected delete the queue (exclusive flag)
result = channel.queue_declare(exclusive=True)
#We need the name of our temporary queue
queue_name = result.method.queue

rKeys = sys.argv[1:]
for rKey in rKeys:
    channel.queue_bind(exchange='valami', queue=queue_name, routing_key=rKey)

channel.basic_consume(CallbackImg, queue=queue_name, no_ack=True)
print(' [*] Waiting for messages. To exit press CTRL+C')
channel.start_consuming()
The Client just sends the message and then waits for the answer.
#Client.py:
import pika
import sys

connAddr = 'localhost'

#Establish connection
connection = pika.BlockingConnection(pika.ConnectionParameters(connAddr))
channel = connection.channel()

#Define an exchange channel, we don't need a queue
channel.exchange_declare(exchange='valami', type='direct')

#Send the message thru the created channel
channel.basic_publish(exchange='valami', routing_key='msg', body='Message in the body')
print "[*] Sent"

def Callback(ch, method, properties, body):
    print "(*) Received: " + str(body)

result = channel.queue_declare(exclusive=True)
#We need the name of our temporary queue
queue_name = result.method.queue
channel.queue_bind(exchange='valami', queue=queue_name)
channel.basic_consume(Callback, queue=queue_name, no_ack=True)
print(' [*] Waiting for messages. To exit press CTRL+C')
channel.start_consuming()
There could be multiple Clients and I don't know how to send back the messages directly to them.
Have you checked the tutorials for RPC in RabbitMQ w/ python and pika? http://www.rabbitmq.com/tutorials/tutorial-six-python.html
The gist of what you need to do in your client, is found in the RPC tutorial, but with a few modifications.
In your client, you will need to create an exclusive queue - the same way you did in your server.
When you send your message from the client, you need to set the reply_to to the name of the client's exclusive queue,
from the tutorial:
channel.basic_publish(exchange='',
                      routing_key='rpc_queue',
                      properties=pika.BasicProperties(
                          reply_to=callback_queue,
                      ),
                      body=request)
On the server, when you receive a message, you need to read the reply_to header from the message and then basic_publish the reply to that queue.
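As a minimal sketch of that server-side step (produce_response() is a hypothetical stand-in for your processing; the correlation_id propagation follows the RPC tutorial):
def on_request(ch, method, props, body):
    response = produce_response(body)  # hypothetical processing step
    # publish the reply to the queue named in the reply_to header
    ch.basic_publish(exchange='',
                     routing_key=props.reply_to,
                     properties=pika.BasicProperties(correlation_id=props.correlation_id),
                     body=response)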
Rather than thinking about "client" and "server", it may be helpful to frame this in terms of "message producer" and "message consumer".
In your scenario, you need both of your processes to be both a publisher and consumer. The "client" will publish the original message and consume the response. The "server" will consume the original message and publish a response.
The only real difference in your code will be the use of the reply_to header on the original message. This is the name of the queue to which you should publish the response.
Hope that helps!
P.S. I cover the core outline of this in my RabbitMQ Patterns eBook - both RPC and request/reply like you are needing. The book talks in principles and patterns, not in a specific programming language (though I mostly write node.js and don't really know Python).
