Receive multiple AMQP queues in Python / pika

I'm trying to receive from multiple queues. I tried the code from https://stackoverflow.com/a/42351395/3303330, but it's necessary to declare the queue with queue_declare. Hope you can help me, guys. Here is my code:
import pika
import time
from zeep import Client

parameters = pika.URLParameters('amqp://user:pass@theurl:5672/%2F')
connection = pika.BlockingConnection(parameters)
channel = connection.channel()

channel.queue_declare(queue='queue1', passive=True, durable=True,
                      exclusive=False, auto_delete=False)

print(' [*] Waiting for messages. To exit press CTRL+C')

def callback(ch, method, header, body):
    print(" [x] Received %r" % body)
    time.sleep(body.count(b'.'))
    ch.basic_ack(delivery_tag=method.delivery_tag)

channel.basic_consume(callback, queue='queue1')
channel.start_consuming()

It is not necessary to declare a queue more than once as long as you declare it durable. You can declare more than one queue in your client code or via the RabbitMQ admin interface.
You can use a single channel to consume messages from more than one queue: just execute channel.basic_consume more than once, with different queue parameter values.
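For instance, a minimal sketch assuming pika 1.x and two queues that already exist (the queue names are illustrative):

import pika

parameters = pika.URLParameters('amqp://user:pass@theurl:5672/%2F')
connection = pika.BlockingConnection(parameters)
channel = connection.channel()

def callback(ch, method, properties, body):
    print(" [x] Received %r from %s" % (body, method.routing_key))
    ch.basic_ack(delivery_tag=method.delivery_tag)

# One channel, two consumers: call basic_consume once per queue
channel.basic_consume(queue='queue1', on_message_callback=callback)
channel.basic_consume(queue='queue2', on_message_callback=callback)
channel.start_consuming()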

Related

How to keep a message within a string instead of showing it on CMD? (Python)

What does the channel.basic_consume function return?
How can I access the message using a variable? I want to consume the message and show it in the browser. I am building a Django application that sends messages to RabbitMQ and consumes messages from it, to show them in the browser like a chat.
import pika, sys

global message

def consume(room, username):
    credentials = pika.PlainCredentials('admin', 'admin')
    parameters = pika.ConnectionParameters('192.168.1.14', 5672, '/', credentials)
    connection = pika.BlockingConnection(parameters)
    channel = connection.channel()
    channel.exchange_declare(exchange='topic_exchange', exchange_type='topic')
    result = channel.queue_declare('', exclusive=True)
    queue_name = result.method.queue
    arr = [room, username]
    binding_key = '.'.join([str(i) for i in arr])
    channel.queue_bind(exchange='topic_exchange', queue=queue_name, routing_key=binding_key)
    print(' [*] Waiting for logs. To exit press CTRL+C')

    def callback(ch, method, properties, body):
        print(" [x] %r:%r" % (method.routing_key, body))

    channel.basic_consume(queue=queue_name, on_message_callback=callback, auto_ack=True)
    global message
    #message =
    channel.start_consuming()
    return message
This isn't going to work: you are running one process that consumes the messages and prints them, while some other process (Django) listens for requests from your browser.
Not sure how you hope it will work, but consider these alternatives (a sketch of the first one follows below):
your consumer writes the messages to a file, and Django reads that file when it gets a request
Django connects to the message bus and reads all waiting messages when it gets a request
your consumer writes the messages to a database
you build a websocket application that can push messages to currently connected browsers when it receives a message
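A minimal sketch of the first alternative, assuming a file path of /tmp/chat_messages.txt; the view name is illustrative:

# consumer process: append each message to a file instead of printing it
def callback(ch, method, properties, body):
    with open('/tmp/chat_messages.txt', 'a') as f:
        f.write(body.decode('utf-8') + '\n')

# Django side (hypothetical view): read the file when a request arrives
from django.http import JsonResponse

def messages_view(request):
    try:
        with open('/tmp/chat_messages.txt') as f:
            lines = f.read().splitlines()
    except FileNotFoundError:
        lines = []
    return JsonResponse({'messages': lines})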

Move a message to subscription deadletter for failed HTTP request

I've been looking for resources but I can't seem to find what I need. I have an Azure Function with a Service Bus trigger. From this, I make an HTTP call with one of the values found in the Service Bus message.
An additional requirement for me is to dead-letter a message if the HTTP call fails. But as I understand it, the message is no longer present in the subscription because it was successfully received. Is there a way for me to keep the message in the subscription, and then dispose of it once it is successful (or transfer it to the DLQ if not)?
I found this piece of code, but I'm not sure how it sends to the DLQ:
https://github.com/Azure/azure-sdk-for-python/blob/azure-servicebus_7.3.0/sdk/servicebus/azure-servicebus/samples/sync_samples/receive_deadlettered_messages.py
"""
Example to show receiving dead-lettered messages from a Service Bus Queue.
"""
# pylint: disable=C0111
import os
from azure.servicebus import ServiceBusClient, ServiceBusMessage, ServiceBusSubQueue
CONNECTION_STR = os.environ['SERVICE_BUS_CONNECTION_STR']
QUEUE_NAME = os.environ["SERVICE_BUS_QUEUE_NAME"]
servicebus_client = ServiceBusClient.from_connection_string(conn_str=CONNECTION_STR)
with servicebus_client:
sender = servicebus_client.get_queue_sender(queue_name=QUEUE_NAME)
messages = [ServiceBusMessage("Message to be deadlettered") for _ in range(10)]
with sender:
sender.send_messages(messages)
print('dead lettering messages')
receiver = servicebus_client.get_queue_receiver(queue_name=QUEUE_NAME)
with receiver:
received_msgs = receiver.receive_messages(max_message_count=10, max_wait_time=5)
for msg in received_msgs:
print(str(msg))
receiver.dead_letter_message(msg)
print('receiving deadlettered messages')
dlq_receiver = servicebus_client.get_queue_receiver(queue_name=QUEUE_NAME, sub_queue=ServiceBusSubQueue.DEAD_LETTER)
with dlq_receiver:
received_msgs = dlq_receiver.receive_messages(max_message_count=10, max_wait_time=5)
for msg in received_msgs:
print(str(msg))
dlq_receiver.complete_message(msg)
print("Receive is done.")
Here is a code snippet of mine:
import asyncio
import json
import logging

import azure.functions as func

# wait() and run_pipeline() are defined elsewhere in the function app
async def main(msg: func.ServiceBusMessage):
    try:
        logging.info('Python ServiceBus queue trigger processed message: %s',
                     msg.get_body().decode('utf-8'))
        await asyncio.gather(wait(), wait())
        result = json.dumps({
            'message_id': msg.message_id,
            'metadata': msg.metadata
        })
        msgobj = json.loads(result)
        val = msgobj['metadata']['value']
        run_pipeline(val, msg)
    except Exception as e:
        logging.error(f"trigger failed: {e}")
TL;DR: How do I keep the message in the subscription and either dispose of it (if successful) or send it to the DLQ (if not)?
The code that you pasted is for receiving dead-lettered messages from the dead-letter queue.
I found some code in the docs. You can use this snippet from their example:
from azure.servicebus import ServiceBusClient
import os

connstr = os.environ['SERVICE_BUS_CONNECTION_STR']
queue_name = os.environ['SERVICE_BUS_QUEUE_NAME']

with ServiceBusClient.from_connection_string(connstr) as client:
    with client.get_queue_receiver(queue_name) as receiver:
        for msg in receiver:
            print(str(msg))
            receiver.dead_letter_message(msg)
You can use the code above in your exception handler.
There are four methods to settle a message after receipt:
Complete:
Declares the message processing to be successfully completed, removing the message from the queue.
receiver.complete_message(msg)
Abandon:
Abandon processing of the message for the time being, returning the message immediately back to the queue to be picked up by another (or the same) receiver.
receiver.abandon_message(msg)
DeadLetter:
Transfer the message from the primary queue into the DLQ.
receiver.dead_letter_message(msg)
Defer:
Defer is subtly different from the prior settlement methods. It prevents the message from being directly received from the queue by setting it aside.
receiver.defer_message(msg)
To answer your question "How do I keep the message in the subscription and either dispose them (if successful) or send them to the DLQ if not?" (a combined sketch follows below):
keep the message in the subscription: use abandon_message
dispose them (if successful): use complete_message
send them to the DLQ: use dead_letter_message
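Putting these together, here is a minimal sketch that receives via the azure-servicebus SDK in the default PEEK_LOCK mode, so the message stays locked in the subscription until it is settled, rather than letting the Functions trigger auto-complete it. The endpoint URL and environment variable names are illustrative:

import os
import requests
from azure.servicebus import ServiceBusClient

CONNECTION_STR = os.environ['SERVICE_BUS_CONNECTION_STR']
TOPIC_NAME = os.environ['SERVICE_BUS_TOPIC_NAME']
SUBSCRIPTION_NAME = os.environ['SERVICE_BUS_SUBSCRIPTION_NAME']
ENDPOINT = 'https://example.com/api/process'  # hypothetical HTTP endpoint

with ServiceBusClient.from_connection_string(CONNECTION_STR) as client:
    receiver = client.get_subscription_receiver(
        topic_name=TOPIC_NAME, subscription_name=SUBSCRIPTION_NAME)
    with receiver:
        for msg in receiver:
            try:
                # make the HTTP call with the message body
                resp = requests.post(ENDPOINT, data=str(msg))
                resp.raise_for_status()
                receiver.complete_message(msg)      # success: dispose of the message
            except Exception:
                receiver.dead_letter_message(msg)   # failure: move it to the DLQ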

Multiple consumer Rabbitmq through multiprocessing

New to Python.
I am trying to create multiple consumers for a RabbitMQ client.
I am using pika and trying to do it with multiprocessing.
It seems to connect, but it is not able to sustain the loop.
Can you please help?
The code should also take care of the writer part through the callback.
It should start the loop and consume continuously.
import multiprocessing
import time
import pika

# this is the writer part
def callback(ch, method, properties, body):
    print(" [x] %r received %r" % (multiprocessing.current_process(), body,))
    time.sleep(body.count('.'))
    # print " [x] Done"
    ch.basic_ack(delivery_tag=method.delivery_tag)

def consume():
    credentials = pika.PlainCredentials(userid, password)
    parameters = pika.ConnectionParameters(url, port, '/', credentials)
    connection = pika.BlockingConnection(parameters=parameters)
    channel = connection.channel()
    channel.queue_declare(queue='queuename', durable=True)
    channel.basic_consume('queuename', callback)
    print(' [*] Waiting for messages. To exit press CTRL+C')
    channel.start_consuming()

userid = "user"
password = "pwd"
url = "localhost"
port = 5672

if __name__ == "__main__":
    workers = 5
    pool = multiprocessing.Pool(processes=workers)
    for i in range(0, workers):
        pool.apply_async(consume)

    # Stay alive
    try:
        while True:
            continue
    except KeyboardInterrupt:
        pool.terminate()
        pool.join()
You aren't doing any exception handling in your sub-processes, so my guess is that exceptions are being thrown that you don't expect. This code works fine in my environment, using Pika 1.1.0 and Python 3.7.3.
Before I added exception handling, body.count('.') threw a TypeError because body is bytes, not a str, in that case.
Please note that I'm using the correct method to wait for sub-processes, according to these docs.
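A minimal sketch along those lines, assuming Pika 1.x (where body arrives as bytes) and illustrative connection settings:

import multiprocessing
import time
import traceback
import pika

def callback(ch, method, properties, body):
    # body is bytes under Pika 1.x, so count a bytes pattern
    print(" [x] %r received %r" % (multiprocessing.current_process(), body))
    time.sleep(body.count(b'.'))
    ch.basic_ack(delivery_tag=method.delivery_tag)

def consume():
    try:
        credentials = pika.PlainCredentials('user', 'pwd')
        parameters = pika.ConnectionParameters('localhost', 5672, '/', credentials)
        connection = pika.BlockingConnection(parameters)
        channel = connection.channel()
        channel.queue_declare(queue='queuename', durable=True)
        channel.basic_consume('queuename', callback)
        channel.start_consuming()
    except Exception:
        # surface errors that would otherwise vanish inside the pool worker
        traceback.print_exc()

if __name__ == '__main__':
    workers = 5
    pool = multiprocessing.Pool(processes=workers)
    for _ in range(workers):
        pool.apply_async(consume)
    pool.close()
    pool.join()  # the documented way to wait for pool workers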
NOTE: the RabbitMQ team monitors the rabbitmq-users mailing list and only sometimes answers questions on StackOverflow.

Receive a message with RabbitMQ then process it then send back the results

I would like to send a message (directly) from a script, then process it and send back the results.
So it's like a double publish-subscribe.
I have 2 scripts:
Processer
Client
The Client sends a message directly (a simple string) to the Processer, then the Processer script counts the characters in the string and sends back the results to the Client.
This is how I tried to do it:
The Processer waits for a message, calculates something and then answers back to the original sender.
#Processer.py:

import pika
import sys

#Sends back the score
#addr: Connection address
#exchName: Exchange name (where to send)
#rKey: Name of the queue for direct messages
#score: The detected score
def SendActualScore(addr, exchName, rKey, score):
    #Send the image thru the created channel with the given routing key (queue name)
    channel.basic_publish(exchange=exchName, routing_key=rKey, body=score)
    print "(*) Sent: " + score

#When we receive something this is called
def CallbackImg(ch, method, properties, body):
    print "(*) Received: " + str(body)
    score = str(len(body))
    #Send back the score
    SendActualScore('localhost', 'valami', rKey, score)

#Subscribe connection
#Receive messages thru this
connection = pika.BlockingConnection(pika.ConnectionParameters('localhost'))
channel = connection.channel()

#RECEIVE MESSAGES - Subscribe
channel.exchange_declare(exchange='valami', type='direct')

#Define a queue, where we don't need the name
#After we disconnected delete the queue (exclusive flag)
result = channel.queue_declare(exclusive=True)

#We need the name of our temporary queue
queue_name = result.method.queue

rKeys = sys.argv[1:]
for rKey in rKeys:
    channel.queue_bind(exchange='valami', queue=queue_name, routing_key=rKey)

channel.basic_consume(CallbackImg, queue=queue_name, no_ack=True)
print(' [*] Waiting for messages. To exit press CTRL+C')
channel.start_consuming()
The Client just sends the message and then waits for the answer.
#Client.py:

import pika
import sys

connAddr = 'localhost'

#Establish connection
connection = pika.BlockingConnection(pika.ConnectionParameters(connAddr))
channel = connection.channel()

#Define an exchange channel, we don't need a queue
channel.exchange_declare(exchange='valami', type='direct')

#Send the image thru the created channel
channel.basic_publish(exchange='valami', routing_key='msg', body='Message in the body')
print "[*] Sent"

def Callback(ch, method, properties, body):
    print "(*) Received: " + str(body)

result = channel.queue_declare(exclusive=True)

#We need the name of our temporary queue
queue_name = result.method.queue

channel.queue_bind(exchange='valami', queue=queue_name)
channel.basic_consume(Callback, queue=queue_name, no_ack=True)
print(' [*] Waiting for messages. To exit press CTRL+C')
channel.start_consuming()
There could be multiple Clients and I don't know how to send back the messages directly to them.
Have you checked the tutorials for RPC in RabbitMQ w/ python and pika? http://www.rabbitmq.com/tutorials/tutorial-six-python.html
The gist of what you need to do in your client is found in the RPC tutorial, but with a few modifications.
In your client, you will need to create an exclusive queue, the same way you did in your server.
When you send your message from the client, you need to set the reply_to property to the name of the client's exclusive queue.
from the tutorial:
channel.basic_publish(exchange='',
                      routing_key='rpc_queue',
                      properties=pika.BasicProperties(
                          reply_to=callback_queue,
                      ),
                      body=request)
On the server, when you receive a message, you need to read the reply_to header from the message and then basic_publish the reply to that queue.
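A minimal sketch of that server-side step, following the tutorial's conventions (the on_request name is illustrative):

def on_request(ch, method, properties, body):
    response = str(len(body))  # e.g. the character count
    # publish the reply to the queue named in the reply_to header
    ch.basic_publish(exchange='',
                     routing_key=properties.reply_to,
                     properties=pika.BasicProperties(
                         correlation_id=properties.correlation_id),
                     body=response)
    ch.basic_ack(delivery_tag=method.delivery_tag)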
Rather than thinking about "client" and "server", it may be helpful to frame this in terms of "message producer" and "message consumer".
In your scenario, you need both of your processes to be both a publisher and consumer. The "client" will publish the original message and consume the response. The "server" will consume the original message and publish a response.
The only real difference in your code will be the use of the reply_to header on the original message. This is the name of the queue to which you should publish the response.
Hope that helps!
P.S. I cover the core outline of this in my RabbitMQ Patterns eBook - both RPC and request / reply like you are needing. The book talks in principles and patterns, not in specific programming language (though I mostly write node.js and don't really know python).

Is RabbitMQ capable of passing messages to specific clients? Or must I perform those checks client-side?

I have my software running on a bunch of clients around my network. I've been playing around with RabbitMQ as a solution for passing messages between each client.
My test code is this:
#!/usr/bin/python2

import pika
import time

connection = pika.AsyncoreConnection(pika.ConnectionParameters('localhost'))
channel = connection.channel()

def callback(ch, method, properties, body):
    # send messages back on certain events
    if body == '5':
        channel.basic_publish(exchange='',
                              routing_key='test',
                              body='works')
    print body

channel.queue_declare(queue='test')
channel.basic_consume(callback, queue='test', no_ack=True)

for i in range(0, 8):
    channel.basic_publish(exchange='',
                          routing_key='test',
                          body='{}'.format(i))
    time.sleep(0.5)

channel.close()
Picture this as kind of a chat program. Each client needs to constantly listen for messages. At times, a client needs to send messages back to the server.
This code works, but I've run into an issue: when the code above sends out the message 'works', it then retrieves that same message again from the RabbitMQ queue. Is there a way to have my client, which is both a producer and a consumer, not receive the message it just sent?
I can't see this functionality built into RabbitMQ, so I figured I'd send messages in the form of:
body='{"client_id" : 1, "message" : "this is the message"}'
Then I can parse that string and check the client_id, and each client can ignore all messages not destined for it.
Is there a better way? Should I look for an alternative to RabbitMQ?
You can have as many queues as you like in RabbitMQ. Why not have a queue for messages to the server, as well as a queue for each client?
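A minimal sketch of that layout, assuming a direct exchange; the exchange, queue, and client id names are illustrative:

import pika

connection = pika.BlockingConnection(pika.ConnectionParameters('localhost'))
channel = connection.channel()
channel.exchange_declare(exchange='chat', exchange_type='direct')

# the server listens on its own queue
channel.queue_declare(queue='to_server')
channel.queue_bind(exchange='chat', queue='to_server', routing_key='server')

# each client gets its own exclusive queue, bound with its client id
result = channel.queue_declare(queue='', exclusive=True)
channel.queue_bind(exchange='chat', queue=result.method.queue, routing_key='client-1')

# publishing with a specific routing key reaches only that party, so a
# client never consumes the messages it sends to the server
channel.basic_publish(exchange='chat', routing_key='server', body='works')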
