Creating producer and consumer applications in Python

I am trying to write producer and consumer code in Python using pika for RabbitMQ. For my specific case, I need to run the producer on one host and the consumer on another.
I have already written a producer code as:
import pika
credentials = pika.PlainCredentials('username', 'password')
parameters = pika.ConnectionParameters('ip add of another host', 5672, '/', credentials)
connection = pika.BlockingConnection()
channel = connection.channel()
channel.queue_declare(queue='test')
channel.basic_publish(exchange='', routing_key='test', body='hello all!')
print (" [x] sent 'Hello all!")
connection.close()
The above producer code runs without any error. I also created a new user and gave it administrator privileges on the rabbitmq-server. However, when I run the consumer code on the other host running rabbitmq-server, I do not see any output:
import pika
credentials = pika.PlainCredentials('username', 'password')
parameters = pika.ConnectionParameters('localhost', 5672, '/', credentials)
connection = pika.BlockingConnection()
channel = connection.channel()
channel.queue_declare(queue='test')
def callback(ch, method, properties, body):
    print(" [x] Received %r" % body)
channel.basic_consume(
    queue='test', on_message_callback=callback, auto_ack=True)
print(' [x] waiting for messages. To exit press ctrl+c')
channel.start_consuming()
So, here I have two hosts on the same network, both with RabbitMQ installed; one runs version 3.7.10 and the other 3.7.16.
The producer is able to send the text without error, but the consumer on the other host does not receive any text.
I do not get any problem when both run on the same machine, as I just replace the connection settings with localhost. Since the guest user is only allowed to connect on localhost by default, I created a new user on the consumer host running rabbitmq-server.
Can anyone help me out here?

I have a couple of questions when I look at your problem:
Are you 100% sure that in your RabbitMQ management monitoring you see two connections, one from your local host and another from the other host? This will help with debugging.
Second, did you check that port 5672 is open on the server that hosts RabbitMQ? Maybe your producer does not manage to connect. What is your cloud provider?
If you don't want to manage those kinds of issues, you can use a service like https://zenaton.com. They host everything for you, and you get integrated monitoring, error handling, etc.

Your consumer and producer applications must connect to the same RabbitMQ server. If you have two instances of RabbitMQ running, they are independent. Messages do not move from one instance of RabbitMQ to another unless you configure Shovel or Federation.
NOTE: the RabbitMQ team monitors the rabbitmq-users mailing list and only sometimes answers questions on StackOverflow.

You don't seem to be passing the parameters to the BlockingConnection instance.
import pika
rmq_server = "ip_address_of_rmq_server"
credentials = pika.PlainCredentials('username', 'password')
parameters = pika.ConnectionParameters(rmq_server, 5672, '/', credentials)
connection = pika.BlockingConnection(parameters)
channel = connection.channel()
Also, your consumer is connecting to the localhost hostname. Make sure this actually resolves and that your RabbitMQ service is listening on the localhost address (127.0.0.1); it may not be bound to that address. I believe RabbitMQ binds to all interfaces (and thus all addresses) by default, but I'm not sure.
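For reference, here is a minimal consumer sketch that points at the same remote server the producer uses (assuming pika 1.x; the host, username and password are placeholders):
import pika

# Connect to the same RabbitMQ server the producer publishes to.
# "ip_address_of_rmq_server", "username" and "password" are placeholders.
rmq_server = "ip_address_of_rmq_server"
credentials = pika.PlainCredentials('username', 'password')
parameters = pika.ConnectionParameters(rmq_server, 5672, '/', credentials)

connection = pika.BlockingConnection(parameters)  # pass the parameters in
channel = connection.channel()
channel.queue_declare(queue='test')

def callback(ch, method, properties, body):
    print(" [x] Received %r" % body)

channel.basic_consume(queue='test', on_message_callback=callback, auto_ack=True)
print(' [*] Waiting for messages. To exit press CTRL+C')
channel.start_consuming()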

Related

How to send messages from Flask server to RabbitMQ server queue?

I'm trying to send a message from a Flask server, which acts as a producer, to a RabbitMQ server queue. The port I'm using to produce the messages is '5672'.
I've created an exchange and a queue in RabbitMQ's management server, and the main goal is to receive a message in the server's queue.
This is the code that I have at the moment; it does not produce any errors and reports that the message has been sent:
#app.route("/create/<message>")
def create_bucket(message):
credentials = pika.PlainCredentials("guest", "guest")
connection = pika.BlockingConnection(pika.ConnectionParameters(host="localhost", credentials=credentials))
channel = connection.channel()
channel.queue_declare(queue="TestQueue", durable=True)
channel.basic_publish(exchange="TestExchange", routing_key="TestQueue", body=message, properties=pika.BasicProperties(delivery_mode=2))
connection.close()
return "[x] Message sent %s" % message
if __name__ == "__main__":
app.run(debug=True, port=5672)
However, the message does not appear in the RabbitMQ server's queue.
Are there any resources or ways to send a message from a Flask server to a RabbitMQ server queue?
I managed to solve the problem by deleting routing_key="TestQueue", as that routing key was not declared on the RabbitMQ server.
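An alternative sketch (my illustration, not the original solution) keeps the routing key but binds TestQueue to TestExchange so routed messages actually reach the queue; it assumes TestExchange is, or can be declared as, a durable direct exchange:
import pika

credentials = pika.PlainCredentials("guest", "guest")
connection = pika.BlockingConnection(pika.ConnectionParameters(host="localhost", credentials=credentials))
channel = connection.channel()

# Declare the exchange and queue, then bind them so that messages published to
# TestExchange with routing key "TestQueue" are routed into TestQueue.
channel.exchange_declare(exchange="TestExchange", exchange_type="direct", durable=True)
channel.queue_declare(queue="TestQueue", durable=True)
channel.queue_bind(exchange="TestExchange", queue="TestQueue", routing_key="TestQueue")

channel.basic_publish(exchange="TestExchange", routing_key="TestQueue", body="hello",
                      properties=pika.BasicProperties(delivery_mode=2))
connection.close()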

Allow RabbitMQ and Pika to keep the connection always open

I have a Python script which reads from a stream, and when a new string is read, it pushes its content (a string) to a RabbitMQ queue.
The thing is that the stream might not send messages for 1, 2 or 9 hours or so, so I would like to have the RabbitMQ connection always open.
The problem is that when I create the conection and the channel:
self.connection = pika.BlockingConnection(pika.ConnectionParameters(host=self.host, credentials=self.credentials))
channel = self.connection.channel()
channel.exchange_declare(exchange=self.exchange_name, exchange_type='fanout')
... and if after an hour a message arrives, I get this error:
File "/usr/local/lib/python3.7/asyncio/events.py", line 88, in _run
self._context.run(self._callback, *self._args)
File "/var/opt/rabbitmq-agent.py", line 34, in push_to_queue
raise Exception("Error sending the message to the queue: " + format(e))
Exception: Error sending the message to the queue: Send message to publisher error: Channel allocation requires an open connection: <SelectConnection CLOSED socket=None params=<ConnectionParameters host=x port=xvirtual_host=/ ssl=False>>
Which I suppose means that the connection between the RabbitMQ server and the client has been closed.
How can I avoid this? I would like a "please, keep the connection alive always" option. Maybe setting a super-big heartbeat in the connection parameters of Pika? Something like this:
self.connection = pika.BlockingConnection(pika.ConnectionParameters(host=self.host, credentials=self.credentials, heartbeat=6000))
Any other cooler solutions would be highly appreciated.
Thanks in advance
I would suggest you check the connection every time before sending a message, and if the connection is closed, simply reconnect:
if not self.connection or self.connection.is_closed:
    self.connection = pika.BlockingConnection(pika.ConnectionParameters(host=self.host, credentials=self.credentials))
    channel = self.connection.channel()
    channel.exchange_declare(exchange=self.exchange_name, exchange_type='fanout')
You could try adding a heartbeat to your ConnectionParameters. This creates light traffic by sending a heartbeat every specified number of seconds, which keeps the connection exercised. Some firewalls or proxies tend to drop idle connections, and even RabbitMQ has a timeout on idle connections.
import pika

# Set the connection parameters to connect to rabbit-server1 on port 5672
# on the / virtual host using the username "guest" and password "guest"
credentials = pika.PlainCredentials('guest', 'guest')
parameters = pika.ConnectionParameters('rabbit-server1',
                                       5672,
                                       '/',
                                       credentials,
                                       heartbeat=60)
See the pika documentation for details.
Additionally, you should have code in place that mitigates network disconnections; they can and will happen. So apart from the heartbeat, have some exception handling ready to re-open closed connections in a graceful way.
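For illustration only (not the original answer's code), here is a sketch of such exception handling around the publish, assuming a BlockingConnection stored on the object as in the question:
import pika
from pika.exceptions import AMQPConnectionError

class Publisher:
    def __init__(self, host, credentials, exchange_name):
        self.host = host
        self.credentials = credentials
        self.exchange_name = exchange_name
        self.connection = None
        self.channel = None

    def publish_with_retry(self, body, retries=3):
        """Publish a message, reconnecting if the connection has dropped."""
        for attempt in range(retries):
            try:
                # Re-open the connection and channel if they were closed in the meantime.
                if self.connection is None or self.connection.is_closed:
                    self.connection = pika.BlockingConnection(
                        pika.ConnectionParameters(host=self.host,
                                                  credentials=self.credentials,
                                                  heartbeat=60))
                    self.channel = self.connection.channel()
                    self.channel.exchange_declare(exchange=self.exchange_name,
                                                  exchange_type='fanout')
                self.channel.basic_publish(exchange=self.exchange_name,
                                           routing_key='',
                                           body=body)
                return
            except AMQPConnectionError:
                # Drop the stale connection and retry with a fresh one.
                self.connection = None
        raise RuntimeError("Could not publish after %d attempts" % retries)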

Unable to connect to remote rabbitmq server using pika

I am trying to connect to my remote RabbitMQ server using pika, but I am getting a ConnectionClosed() error. I have made the required changes in rabbit.config for the guest user to allow all connections, and the same connection works from my Java code. I even tried creating a new user with all the permissions and connecting with it, but it still doesn't work. The same code works fine on my localhost, though. Can anyone please let me know what I might be doing wrong here?
def queue_message(message, queue):
    credentials = pika.PlainCredentials('xxxx', 'xxxx')
    parameters = pika.ConnectionParameters('remote-server',
                                           5672,
                                           '/',
                                           credentials)
    connection = pika.BlockingConnection(parameters)
    channel = connection.channel()
    channel.queue_declare(queue='python_update_queue')
    channel.basic_publish(exchange='update.fanout',
                          routing_key='',
                          body=message)
    logger.info("Sent message: {} to queue: {}".format(message, queue))
    print 'message sent'
    connection.close()
Below is the error I get:
app/project/rabbitmq.py" in queue_message
    connection = pika.BlockingConnection(parameters)
env/lib/python2.7/site-packages/pika/adapters/blocking_connection.py" in __init__
    self._process_io_for_connection_setup()
env/lib/python2.7/site-packages/pika/adapters/blocking_connection.py" in _process_io_for_connection_setup
    self._open_error_result.is_ready)
env/lib/python2.7/site-packages/pika/adapters/blocking_connection.py" in _flush_output
    raise exceptions.ConnectionClosed
Add a connection timeout to your connection parameters - you're probably running into a timeout issue where the connection isn't happening fast enough across the network.
Also, your code is explicitly calling connection.close(), so that may be why your connection is closing.
It was indeed a timeout issue. After increasing the timeout in the connection parameters, the connection was established properly.
parameters = pika.ConnectionParameters('remote-server',
                                       5672,
                                       '/',
                                       socket_timeout=2)
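For completeness, here is a sketch (my addition) that combines the credentials from the question with an explicit socket timeout; the host, user and timeout value are illustrative:
import pika

credentials = pika.PlainCredentials('xxxx', 'xxxx')
parameters = pika.ConnectionParameters('remote-server',
                                       5672,
                                       '/',
                                       credentials,
                                       socket_timeout=10)  # seconds to wait for the TCP connection
connection = pika.BlockingConnection(parameters)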
If you connect to a remote RabbitMQ server, check this:
the remote server port is open in the firewall
the remote server has a public IP and the RabbitMQ user has access to that server
the RabbitMQ server is actively running
add your user admin to the administrator tag:
rabbitmqctl set_user_tags admin administrator
add enough permissions to the user admin:
rabbitmqctl set_permissions -p / admin ".*" ".*" ".*"

Python pubsub / message queue over HTTP?

I have a Python script that will run on a local machine and needs to access a message queue (RabbitMQ) or receive subscribed events over HTTP. I've researched several solutions, but none seem natively designed to allow desktop clients to access them over HTTP. I'm thinking that using Twisted as a proxy is an option as well. Any guidance or recommendations would be greatly appreciated. Thanks in advance.
I've read this tutorial on the RabbitMQ site, and they provide the names of some libraries that could handle receiving messages.
Sender: send.py
#!/usr/bin/env python
import pika

connection = pika.BlockingConnection(pika.ConnectionParameters(
    host='localhost'))
channel = connection.channel()

channel.queue_declare(queue='hello')

channel.basic_publish(exchange='',
                      routing_key='hello',
                      body='Hello World!')
print " [x] Sent 'Hello World!'"
connection.close()
Receiver: receive.py
#!/usr/bin/env python
import pika

connection = pika.BlockingConnection(pika.ConnectionParameters(
    host='localhost'))
channel = connection.channel()

channel.queue_declare(queue='hello')

print ' [*] Waiting for messages. To exit press CTRL+C'

def callback(ch, method, properties, body):
    print " [x] Received %r" % (body,)

channel.basic_consume(callback,
                      queue='hello',
                      no_ack=True)

channel.start_consuming()
Now we can try out our programs in a terminal. First, let's send a message using our send.py program:
$ python send.py
[x] Sent 'Hello World!'
The producer program send.py will stop after every run. Let's receive it:
$ python receive.py
[*] Waiting for messages. To exit press CTRL+C
[x] Received 'Hello World!'
Hurray! We were able to send our first message through RabbitMQ. As you might have noticed, the receive.py program doesn't exit. It will stay ready to receive further messages, and may be interrupted with Ctrl-C.
Try to run send.py again in a new terminal.
We've learned how to send and receive a message from a named queue. It's time to move on to part 2 and build a simple work queue.
I've decided to use wamp http://wamp.ws/. Still experimenting with it, but it's working quite well at the moment.
Choice #1
You may be interested in this RabbitHub
Choice #2
If you want it to be on port 80, can't you do port forwarding using a proxy? It could be challenging, but it should be possible.
Choice #3
If your script is not tightly coupled to the RMQ message format, you can try Celery (which uses RMQ underneath); then you can try the Celery HTTP gateway or Celery webhooks if you want any other application to be triggered directly.
It might be time consuming to get it up and running. However, Celery opens up loads of flexibility.
Choice #4
For one of my projects, I developed an intermediate web service (Flask Service) to use RMQ
Not ideal, but it served the purpose at that time.
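To illustrate Choice #4, here is a minimal sketch (my example, not the original author's service) of a Flask endpoint that a desktop client can call over HTTP and that forwards the payload to a local RabbitMQ queue; the route and queue name are made up:
import pika
from flask import Flask, request

app = Flask(__name__)

@app.route("/publish", methods=["POST"])
def publish():
    # Forward the HTTP request body to a RabbitMQ queue on localhost.
    connection = pika.BlockingConnection(pika.ConnectionParameters(host="localhost"))
    channel = connection.channel()
    channel.queue_declare(queue="events")
    channel.basic_publish(exchange="", routing_key="events", body=request.get_data())
    connection.close()
    return "queued", 202

if __name__ == "__main__":
    app.run(port=8080)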

Rabbitmq connection issue when using a username and password

I am trying to start some background processing through RabbitMQ, but when I send the request, I get the error below in the RabbitMQ log. I think I am providing the right credentials, as my Celery workers are able to connect to the RabbitMQ server using the same username/password combination.
=ERROR REPORT==== 12-Jun-2012::20:50:29 ===
exception on TCP connection from 127.0.0.1:41708
{channel0_error,starting,
{amqp_error,access_refused,
"AMQPLAIN login refused: user 'guest' - invalid credentials",
'connection.start_ok'}}
To resolve connection issues with RabbitMQ, you need to inspect the points below:
Connectivity from the client machine to the RabbitMQ server machine (in case client and server run on separate machines), including the port.
Credentials (username and password): a user must be onboarded into RabbitMQ and used to connect to RabbitMQ.
Permissions must be given to the user (permissions may be attached to a vhost as well, so grant them carefully).
The best way to debug permission issues in the AMQP protocol is to look at the connection URL:
transport://userid:password@hostname:port/virtual_host
from http://docs.celeryproject.org/en/latest/configuration.html#conf-broker-settings
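As an illustration (my addition, with made-up credentials and host), the same URL form can be passed directly to pika, which makes it obvious which user is actually being sent instead of falling back to guest:
import pika

# amqp://user:password@host:port/vhost - "%2F" is the URL-encoded default vhost "/"
params = pika.URLParameters("amqp://myuser:mypassword@rabbit-host:5672/%2F")
connection = pika.BlockingConnection(params)
channel = connection.channel()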
