I'm using a service that publishes messages to Amazon SQS, but my messages come out garbled when I do the following in Python, via boto:
queue = SQS_CONNECTION.get_queue(QUEUE_NAME)
messages = queue.get_messages()
The messages are returned as strings of what appears to be Base64-encoded data.
With help from this discussion (https://groups.google.com/forum/#!topic/boto-users/Pv5fUc_RdVU),
the solution is as follows:
from boto.sqs.message import RawMessage

queue = SQS_CONNECTION.get_queue(QUEUE_NAME)
# boto's default Message class Base64-decodes every body on read; since the
# publisher never Base64-encoded the bodies, that decode step is what garbles
# them. RawMessage returns the body untouched.
queue.set_message_class(RawMessage)
messages = queue.get_messages()
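For completeness, a minimal sketch of reading the raw bodies after the fix (assuming the publisher sends JSON; adjust the parsing to your actual payload format):
import json

for m in messages:
    body = m.get_body()      # now the original string, not a mangled Base64 decode
    data = json.loads(body)  # assumption: the publisher sends JSON
    print(data)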
Related
Unable to fetch all SQS messages with Lambda (Python); it reads the queue only partially:
import json

import boto3
import pymssql

sqs_client = boto3.client("sqs")

def lambda_handler(event, context):
    response = sqs_client.receive_message(QueueUrl="queurlXXX.fifo",
                                          MaxNumberOfMessages=10,
                                          WaitTimeSeconds=20,
                                          VisibilityTimeout=0)
    for message in response.get("Messages", []):
        message_body = message["Body"]
        ip_json = json.loads(message_body)   # parse, then re-serialise to normalise the JSON
        op_json = json.dumps(ip_json)
        if op_json:
            conn = pymssql.connect(host='DB Credentials', database='DbNAme', port='1433')
            cursor = conn.cursor()
            # Parameterise both values; note the params must be a tuple,
            # not a bare string as in the original.
            cursor.execute("INSERT INTO [table]([MessageId],[Document],[IsProcessed],[CreatedUtc],[CreatedBy],[ModifiedUtc],[ModifiedBy]) "
                           "VALUES(%s, %s, 1, GETUTCDATE(), 'system', GETUTCDATE(), 'system');",
                           (message["MessageId"], op_json))
            conn.commit()
    return "Queue loaded"
It appears you are saying that your receive_message() call did not return all of the messages in the Amazon SQS queue. This is normal behaviour for Amazon SQS due to its highly distributed architecture.
From Amazon SQS short and long polling - Amazon Simple Queue Service:
Consuming messages using short polling
When you consume messages from a queue using short polling, Amazon SQS samples a subset of its servers (based on a weighted random distribution) and returns messages from only those servers. Thus, a particular ReceiveMessage request might not return all of your messages. However, if you have fewer than 1,000 messages in your queue, a subsequent request will return your messages. If you keep consuming from your queues, Amazon SQS samples all of its servers, and you receive all of your messages.
The following diagram shows the short-polling behavior of messages returned from a standard queue after one of your system components makes a receive request. Amazon SQS samples several of its servers (in gray) and returns messages A, C, D, and B from these servers. Message E isn't returned for this request, but is returned for a subsequent request.
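In practice this means a single receive_message() call is not enough: keep polling until an empty response comes back. Note also that VisibilityTimeout=0 in the question leaves every received message immediately visible again, so repeated polls can return duplicates; the sketch below drops it and instead deletes each message after processing (the process() handler is hypothetical; your database insert goes there):
import boto3

sqs_client = boto3.client("sqs")
queue_url = "queurlXXX.fifo"

while True:
    response = sqs_client.receive_message(QueueUrl=queue_url,
                                          MaxNumberOfMessages=10,
                                          WaitTimeSeconds=20)   # long polling
    messages = response.get("Messages", [])
    if not messages:
        break                                 # empty response: queue drained
    for message in messages:
        process(message["Body"])              # hypothetical handler
        sqs_client.delete_message(QueueUrl=queue_url,
                                  ReceiptHandle=message["ReceiptHandle"])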
I am using ejabberd from Python. I found a method to send messages, but how do I receive those messages in my Python console? Please suggest a method or way to do this.
To send a message, my code is:
import xmlrpc.client as xmlrpclib
server_url = 'http://127.0.0.1:5180/xmlrpc/'
server = xmlrpclib.ServerProxy(server_url)
EJABBERD_XMLRPC_LOGIN = {'user':'yatish', 'server':'localhost', 'password':'1234', 'admin':False}
def ejabberdctl(command, data):
    fn = getattr(server, command)
    print(fn.__dict__, '>>>>>>>>>>')
    return fn(EJABBERD_XMLRPC_LOGIN, data)

result = ejabberdctl('send_message', {"type": "chat", "from": "yatish@localhost", "to": "1@localhost",
                                      "subject": "backend subject", "body": "Hey this is message from python1"})
Here I can send messages from yatish@localhost to 1@localhost. I want to get all the messages received by 1@localhost. Can you please suggest a method? I have checked all the docs and searched Google, but I am unable to find a way to receive those messages in Python. Ideally the client stays connected and receives the messages in real time.
Thanks
You wrote an XML-RPC client that uses ejabberd's send_message administrative command to perform this task.
But there is no admin command in ejabberd to check or read XMPP messages.
I suggest a different approach: forget about using XML-RPC or ejabberd commands. Instead, write a small XMPP client (there are Python libraries for that, see https://xmpp.org/software/libraries/).
Your XMPP client should:
login to the FROM account
send the message
logout
Then write another small client that:
logs in to the TO account, with a positive presence priority
ejabberd will immediately send it the offline messages that were stored
does whatever you need with those messages, and logs out
If you are able to write those XMPP clients in your preferred language (Python or whatever), you can use them with any XMPP server: ejabberd, or any other that you may want to install on other machines, or in the future.
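For example, a minimal sketch of the receiving client using the slixmpp library (one of the Python libraries listed on that page; the JID is the one from the question, the password is assumed):
import slixmpp

class ReceiveBot(slixmpp.ClientXMPP):
    def __init__(self, jid, password):
        super().__init__(jid, password)
        self.add_event_handler("session_start", self.on_start)
        self.add_event_handler("message", self.on_message)

    async def on_start(self, event):
        # A positive presence priority makes ejabberd deliver any
        # stored offline messages right away.
        self.send_presence(ppriority=1)
        await self.get_roster()

    def on_message(self, msg):
        if msg["type"] in ("chat", "normal"):
            print(f"{msg['from']}: {msg['body']}")

xmpp = ReceiveBot("1@localhost", "1234")
xmpp.connect()
xmpp.process(forever=True)

Because the client stays connected, it also receives new messages in real time, which is what the question asked for.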
I am new to Kafka and am trying to read messages from Kafka topics using Python. I am using the piece of code below to read the messages.
from kafka import KafkaConsumer

topic = 'topic'
bootstrap_servers = 'server'

# Note: the topic must be passed to the consumer (it was defined but never
# used in the original snippet).
consumer = KafkaConsumer(topic,
                         bootstrap_servers=[bootstrap_servers],
                         auto_offset_reset='earliest',
                         enable_auto_commit=True,
                         security_protocol='SASL_PLAINTEXT',
                         sasl_mechanism='GSSAPI',
                         consumer_timeout_ms=1000)
When I run this, I get the error message 'Could not find KfW installation' and it fails to connect to Kafka. After installing the Kerberos for Windows MSI and rerunning, it is able to establish connectivity.
However, I would like to avoid the KfW installation on the local system and instead find a way to pass the keytab file and principal to use in the authentication process and read the data from the Kafka topic (if that is possible).
But I am not sure which argument of KafkaConsumer holds the keytab file.
Please suggest any better way available.
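For what it's worth, KafkaConsumer has no keytab argument; GSSAPI authentication picks its credentials up from the Kerberos ticket cache. A common workaround (a sketch, assuming MIT Kerberos tools such as kinit are available on the host, and with a placeholder keytab path and principal) is to populate the cache from the keytab before creating the consumer:
import subprocess

# Assumption: kinit is on PATH; replace the keytab path and principal with yours.
subprocess.run(["kinit", "-kt", "/path/to/service.keytab", "user@EXAMPLE.COM"],
               check=True)

# ...then create the KafkaConsumer exactly as above; GSSAPI will find the
# ticket that kinit placed in the default credential cache.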
I have a two-part Python standalone application: a publisher and a subscriber.
The publisher generates fake JSON device objects and publishes them on a channel called "devices." And as you would guess, the subscriber subscribes to the channel "devices."
(Additionally, given optional command-line arguments, the publisher or subscriber can write JSON objects to a socket or a local directory, where an Apache Spark Streaming context picks up the JSON objects and processes them. For now, this is not in the picture, as it's optional.)
However, my problem is that when my subscriber runs after the publisher has finished, I get "ERROR: Forbidden".
Here are the respective Python code snippets. For the publisher:
pubnub = Pubnub(publish_key="my_key", subscribe_key="my_key")
....
pubnub.publish(ch, device_msg)
In the subscriber python file I have the following init code:
def receive(message, channel):
json.dumps(message)
def on_error(message):
print ("ERROR: " + str(message))
....
pubnub = Pubnub(publish_key="my_keys", subscribe_key="my_keys")
# subscribe to a channel and invoke the appropriate callback when a message arrives on that
# channel
#
pubnub.subscribe(channels=ch, callback=receive, error=on_error)
pubnub.start()
The publisher, when run, seems to publish all 120 JSON messages in its loop, whereas the subscriber, when run, fails with the following error message:
ERROR: Forbidden
My attempts to use the "demo" keys have made no difference. Note that I'm using a trial account for PubNub.
Since this is one of my first apps using this API, has anyone seen this problem before? Surely something very obvious or trivial is amiss here.
The answer was that there was a copy/paste error in the pub/sub keys.
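One way to rule out this class of mistake (a sketch; the key values are placeholders, and the legacy Pubnub constructor is the one used in the question): keep both keys in a single shared module so the publisher and subscriber can never drift apart:
# pubnub_keys.py -- single source of truth for both scripts (placeholder values)
PUBLISH_KEY = "pub-c-xxxxxxxx"
SUBSCRIBE_KEY = "sub-c-xxxxxxxx"

# used by both publisher.py and subscriber.py
from pubnub import Pubnub
from pubnub_keys import PUBLISH_KEY, SUBSCRIBE_KEY

pubnub = Pubnub(publish_key=PUBLISH_KEY, subscribe_key=SUBSCRIBE_KEY)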
My system has about 10,000 iOS users and I want to send them a push notification without it taking long, as I may send another message to the same users after 5 minutes or less.
I read this answer before, which is also found on Apple's site:
Push Notification Throughput and Error Checking
There are no caps or batch size limits for using APNs. The iOS 6.1 press release stated that APNs has sent over 4 trillion push notifications since it was established. It was announced at WWDC 2012 that APNs is sending 7 billion notifications daily.
If you're seeing throughput lower than 9,000 notifications per second, your server might benefit from improved error handling logic.
But I don't know how to send 9,000 messages per second while I'm sending the notifications one by one.
I'm using Python (PyAPNs) and this is my code:
from apns import APNs, Payload

result = execute("SELECT token_hex FROM `Users`")
for row in result:
    token_hex = row['token_hex']
    apns = APNs(use_sandbox=False, cert_file='Cert.pem', key_file='CertKey.pem')
    payload = Payload(alert="Message", badge=1, sound='default')
    apns.gateway_server.send_notification(token_hex, payload)
Sending to all 10,000 users takes more than 30 minutes...
So what is the problem in my code, and what can I do to send the notifications in less time?
Thanks in advance,
I don't know Python, but looking at your code it seems you are duplicating work unnecessarily: a new APNs connection and a new payload are created for every token. You should use the same connection for sending all the notifications.
Perhaps you should try something like this :
from apns import APNs, Payload

result = execute("SELECT token_hex FROM `Users`")

# Open the connection and build the payload once, outside the loop.
apns = APNs(use_sandbox=False, cert_file='Cert.pem', key_file='CertKey.pem')
payload = Payload(alert="Message", badge=1, sound='default')

for row in result:
    token_hex = row['token_hex']
    apns.gateway_server.send_notification(token_hex, payload)
This is assuming you are sending the same notification payload to all your devices.
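If a single connection is still too slow, PyAPNs also provides a Frame class for batching many notifications into one write on that connection. A sketch based on the PyAPNs README (the identifier/expiry/priority values are illustrative):
import time
from apns import APNs, Frame, Payload

apns = APNs(use_sandbox=False, cert_file='Cert.pem', key_file='CertKey.pem')
payload = Payload(alert="Message", badge=1, sound='default')

frame = Frame()
identifier = 1
expiry = time.time() + 3600   # give APNs an hour to deliver
priority = 10                 # send immediately

for row in execute("SELECT token_hex FROM `Users`"):
    frame.add_item(row['token_hex'], payload, identifier, expiry, priority)
    identifier += 1

apns.gateway_server.send_notification_multiple(frame)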