How to connect a Kafka consumer to a topic using a keytab in Python

I am new to Kafka and am trying to read messages from Kafka consumer topics using Python. I am using the piece of code below to read the messages:
from kafka import KafkaConsumer

topic = 'topic'
bootstrap_servers = 'server'

consumer = KafkaConsumer(topic,
                         bootstrap_servers=[bootstrap_servers],
                         auto_offset_reset='earliest',
                         enable_auto_commit=True,
                         security_protocol='SASL_PLAINTEXT',
                         sasl_mechanism='GSSAPI',
                         consumer_timeout_ms=1000)
When I ran this, I got the error message 'Could not find KfW installation' and it failed to connect to Kafka. After installing the Kerberos for Windows (KfW) MSI and rerunning, it was able to establish connectivity.
However, I would like to avoid the KfW installation on the local system and instead find a way to pass the keytab file and principal to use in the authentication process and read the data from the Kafka topic (if that is possible).
I am not sure which argument of KafkaConsumer holds the keytab file.
Is there a better way to do this?
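For what it's worth, kafka-python has no keytab parameter; with sasl_mechanism='GSSAPI' it delegates credential handling to the system Kerberos/GSSAPI library, so the keytab is normally supplied through the Kerberos environment rather than through the consumer. A minimal sketch of that approach, assuming MIT Kerberos (the keytab path and principal below are placeholders, and KRB5_CLIENT_KTNAME is an MIT krb5 variable, so behavior on Windows may differ):
import os
from kafka import KafkaConsumer

# Hypothetical keytab path for illustration; MIT krb5 reads the client
# keytab named here when it needs to acquire credentials.
os.environ['KRB5_CLIENT_KTNAME'] = '/path/to/client.keytab'
# Alternatively, run `kinit -kt /path/to/client.keytab user@REALM` beforehand
# so a ticket is already present in the credential cache.

consumer = KafkaConsumer('topic',
                         bootstrap_servers=['server'],
                         security_protocol='SASL_PLAINTEXT',
                         sasl_mechanism='GSSAPI',
                         # kafka-python does accept this Kerberos-related option:
                         sasl_kerberos_service_name='kafka',
                         consumer_timeout_ms=1000)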

Related

Retrieve Azure IoT connection string using Python

I have the following code
conn_str = "HostName=my_host.azure-devices.net;DeviceId=MY_DEVICE;SharedAccessKey=MY_KEY"
device_conn = IoTHubDeviceClient.create_from_connection_string(conn_str)
await device_conn.connect()
This works fine, but only because I've manually retrieved this from the IoT hub and pasted it into the code. We are going to have hundreds of these devices, so is there a way to retrieve this connection string programmatically?
It would be the equivalent of the following:
az iot hub device-identity connection-string show --device-id MY_DEVICE --hub-name MY_HUB --subscription ABCD1234
How do I do this?
The device ID and key are things you give to each device, and you choose where to store them and how to load them. The connection string is just a convenience to make getting started easy; it has no special meaning at the technical level.
You can use create_from_symmetric_key(symmetric_key, hostname, device_id, **kwargs) to pass the key, device ID, and hub URI directly to the SDK.
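A minimal sketch of that call, assuming you already have the device's primary key (the host, device ID, and key values here are placeholders):
from azure.iot.device import IoTHubDeviceClient

# Placeholder values for illustration.
HUB_HOST = "my_host.azure-devices.net"
DEVICE_ID = "MY_DEVICE"
DEVICE_KEY = "MY_KEY"  # the device's symmetric primary key

client = IoTHubDeviceClient.create_from_symmetric_key(
    symmetric_key=DEVICE_KEY,
    hostname=HUB_HOST,
    device_id=DEVICE_ID)
client.connect()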
I found it's not possible to retrieve the actual connection string, but a connection string can be built from the device primary key
from azure.iot.hub import IoTHubRegistryManager
from azure.iot.device import IoTHubDeviceClient

# HUB_HOST is YOURHOST.azure-devices.net
# SHARED_ACCESS_KEY is from the registryReadWrite connection string
reg_str = "HostName={0};SharedAccessKeyName=registryReadWrite;SharedAccessKey={1}".format(
    HUB_HOST, SHARED_ACCESS_KEY)

# Look up the device in the registry and read its primary key
device = IoTHubRegistryManager(reg_str).get_device("MY_DEVICE_ID")
device_key = device.authentication.symmetric_key.primary_key

# Build the device connection string from the key and connect
conn_str = "HostName={0};DeviceId={1};SharedAccessKey={2}".format(
    HUB_HOST, "MY_DEVICE_ID", device_key)
client = IoTHubDeviceClient.create_from_connection_string(conn_str)
client.connect()
# Remaining code here...
Other options you could consider include:
Use the Device Provisioning service to manage provisioning and connecting your device to your IoT hub. You won't need to generate your connection strings manually in this case.
Use X.509 certificates (recommended for production environments instead of SAS). Each device has an X.509 cert derived from the root cert in your hub. See: https://learn.microsoft.com/azure/iot-hub/tutorial-x509-introduction
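As a rough sketch of the X.509 route (the file paths and names below are placeholders; the X509 helper and client class are from the azure-iot-device package):
from azure.iot.device import IoTHubDeviceClient, X509

# Placeholder paths for illustration.
x509 = X509(
    cert_file="/path/to/device-cert.pem",
    key_file="/path/to/device-key.pem",
    pass_phrase="optional-passphrase")  # omit if the key is not encrypted

client = IoTHubDeviceClient.create_from_x509_certificate(
    x509=x509,
    hostname="my_host.azure-devices.net",
    device_id="MY_DEVICE")
client.connect()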

Receive message of any user in python ejabberd using xmlrpc.client

I am using ejabberd from Python. I found a method to send messages, but how do I receive messages in my Python console? Please suggest a method or way to do this.
To send a message, my code is:
import xmlrpc.client as xmlrpclib

server_url = 'http://127.0.0.1:5180/xmlrpc/'
server = xmlrpclib.ServerProxy(server_url)
EJABBERD_XMLRPC_LOGIN = {'user': 'yatish', 'server': 'localhost', 'password': '1234', 'admin': False}

def ejabberdctl(command, data):
    fn = getattr(server, command)
    print(fn.__dict__, '>>>>>>>>>>')
    return fn(EJABBERD_XMLRPC_LOGIN, data)

result = ejabberdctl('send_message', {"type": "chat", "from": "yatish@localhost", "to": "1@localhost",
                                      "subject": "backend subject", "body": "Hey this is message from python1"})
Here I can send messages from yatish@localhost to the user 1@localhost. I want to get all the messages received by 1@localhost. Can you please suggest a method? I have checked the docs and Google but have been unable to find a way to receive those messages in Python. The client should stay connected and receive the messages in real time.
Thanks.
You wrote an XML-RPC client that uses ejabberd's "send_message" administrative command to perform this task.
But there is no admin command in ejabberd to check or read XMPP messages.
I suggest a different approach: forget about using XML-RPC or ejabberd commands. Instead, write a small XMPP client (there are Python libraries for that; see https://xmpp.org/software/libraries/).
Your XMPP client should:
log in to the FROM account
send the message
log out
Then write another small client that:
logs in to the TO account with a positive presence number
(ejabberd will immediately send it the offline messages that were stored)
does whatever you need with those messages, and logs out
If you are able to write those XMPP clients in your preferred language (Python or whatever), you can use those clients with any XMPP server: ejabberd, or any other that you may want to install on other machines, or in the future.
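As a minimal sketch of the receiving client, here is roughly what it could look like with the slixmpp library (the JID and password are taken from the question; the rest is an illustrative starting point, not ejabberd-specific API):
import slixmpp

class ReceiveBot(slixmpp.ClientXMPP):
    def __init__(self, jid, password):
        super().__init__(jid, password)
        self.add_event_handler("session_start", self.on_start)
        self.add_event_handler("message", self.on_message)

    async def on_start(self, event):
        # Sending presence marks the account online; the server then
        # delivers any stored offline messages.
        self.send_presence()
        await self.get_roster()

    def on_message(self, msg):
        if msg["type"] in ("chat", "normal"):
            print("From {}: {}".format(msg["from"], msg["body"]))

xmpp = ReceiveBot("1@localhost", "1234")
xmpp.connect()
xmpp.process(forever=True)  # stays connected and receives messages in real time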

Can I call the Bluemix message hub service from Python?

The kafka-python client supports Kafka 0.9 but doesn't obviously include the new authentication and encryption features, so my guess is that it only works with open servers (as in previous releases). In any case, even the Java client needs a special Message Hub login module to connect (or so it would seem from the example), which suggests that nothing will work unless there is a similar module available for Python.
My specific scenario is that I want to use the Message Hub service from a Jupyter notebook also hosted in Bluemix (the Apache Spark service).
I was able to connect using the kafka-python library:
$ pip install --user kafka-python
Then ...
from kafka import KafkaProducer
from kafka.errors import KafkaError
import logging
import ssl

log = logging.getLogger(__name__)

############################################
# Service credentials from Bluemix UI:
############################################
bootstrap_servers = # kafka_brokers_sasl
sasl_plain_username = # user
sasl_plain_password = # password
############################################

sasl_mechanism = 'PLAIN'
security_protocol = 'SASL_SSL'

# Create a new context using system defaults, disable all but TLS1.2
# (the option flags must be OR-ed in; &= would clear flags instead of setting them)
context = ssl.create_default_context()
context.options |= ssl.OP_NO_TLSv1
context.options |= ssl.OP_NO_TLSv1_1

producer = KafkaProducer(bootstrap_servers=bootstrap_servers,
                         sasl_plain_username=sasl_plain_username,
                         sasl_plain_password=sasl_plain_password,
                         security_protocol=security_protocol,
                         ssl_context=context,
                         sasl_mechanism=sasl_mechanism,
                         api_version=(0, 10))

# Asynchronous by default
future = producer.send('my-topic', b'raw_bytes')

# Block for 'synchronous' sends
try:
    record_metadata = future.get(timeout=10)
except KafkaError:
    # Decide what to do if the produce request failed...
    log.exception("produce request failed")
else:
    # Successful result returns assigned partition and offset
    print(record_metadata.topic)
    print(record_metadata.partition)
    print(record_metadata.offset)
This worked for me from the Bluemix Spark-as-a-service Jupyter notebook. Note, however, that this approach is not using Spark; the code is just running on the driver host.
SASL support in the Kafka Python client has been requested: https://github.com/dpkp/kafka-python/issues/533 but until the username/password login method used by Message Hub is supported, it won't work.
Until this is natively supported by the Bluemix Apache Spark Service, you can follow the same approach as the Realtime Sentiment Analysis project. Helper code for this can be found on the cds labs spark samples github repo.
We've added some text to our documentation on non-Java language support - see the "CONNECTING AND AUTHENTICATING IN A NON-JAVA APPLICATION" section:
https://www.ng.bluemix.net/docs/services/MessageHub/index.html
Our current authentication method is non-standard and not supported by the Apache project; it is a temporary solution. The Message Hub team is working with the Apache Kafka community to develop KIP-43. Once this is finalised, we'll change the Message Hub authentication implementation to match, and it will be possible to implement clients to that spec in any language.

Subscribe emits Forbidden error

I have a two-part standalone Python application: a publisher and a subscriber.
The publisher generates fake JSON device objects and publishes them on a channel called "devices." And as you would guess, the subscriber subscribes to the channel "devices."
(Additionally, given optional command-line arguments, the publisher or subscriber can write JSON objects to a socket or a local directory, where an Apache Spark Streaming context picks up the JSON objects and processes them. For now, this is not in the picture, as it's optional.)
However, my problem is that when my subscriber runs after the publisher has finished, I get "ERROR: Forbidden".
Here are the respective python code snippets for the publisher:
pubnub = Pubnub(publish_key="my_key", subscribe_key="my_key")
....
pubnub.publish(ch, device_msg)
In the subscriber python file I have the following init code:
def receive(message, channel):
    print(json.dumps(message))

def on_error(message):
    print("ERROR: " + str(message))

....
pubnub = Pubnub(publish_key="my_keys", subscribe_key="my_keys")

# subscribe to a channel and invoke the appropriate callback when a message arrives on that
# channel
#
pubnub.subscribe(channels=ch, callback=receive, error=on_error)
pubnub.start()
The publisher, when run, seems to publish all 120 JSON messages in a loop, whereas the subscriber, when run, fails with the following error message:
ERROR: Forbidden
My attempts to use the "demo" keys have made no difference. Note that I'm using a trial account for PubNub.
Since this is one of my first apps using the PubNub API, has anyone seen this problem before? Surely something very obvious or trivial is amiss here.
The answer was that there was a copy/paste error with the pub/sub keys (note the mismatch between "my_key" in the publisher and "my_keys" in the subscriber snippets above): both sides must use the same publish/subscribe keyset, or the subscribe call is rejected as Forbidden.
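One way to avoid this class of error is to define the keyset once and import it in both scripts. A minimal sketch (the config module name and key values are hypothetical):
# pubnub_config.py (hypothetical shared module)
PUBLISH_KEY = "pub-c-your-publish-key"      # from the PubNub admin portal
SUBSCRIBE_KEY = "sub-c-your-subscribe-key"  # must belong to the same keyset

# in both the publisher and the subscriber:
from pubnub_config import PUBLISH_KEY, SUBSCRIBE_KEY
pubnub = Pubnub(publish_key=PUBLISH_KEY, subscribe_key=SUBSCRIBE_KEY)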

Amazon SQS messages coming through boto garbled

I'm using a service that publishes messages to Amazon SQS, but my messages come out garbled when I do the following in Python, via boto:
queue = SQS_CONNECTION.get_queue(QUEUE_NAME)
messages = queue.get_messages()
The messages are returned as strings of what appears to be base64-encoded data.
As suggested in this discussion, https://groups.google.com/forum/#!topic/boto-users/Pv5fUc_RdVU , the solution is to switch the queue's message class to RawMessage. boto's default Message class assumes bodies were base64-encoded by boto and decodes them on receipt, which garbles messages published by other services; RawMessage returns the body exactly as stored:
from boto.sqs.message import RawMessage

queue = SQS_CONNECTION.get_queue(QUEUE_NAME)
queue.set_message_class(RawMessage)  # return bodies verbatim, no base64 decoding
messages = queue.get_messages()
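A small usage sketch on top of that (get_body() and delete_message() are standard boto SQS calls):
for m in messages:
    print(m.get_body())      # with RawMessage, this is the payload exactly as published
    queue.delete_message(m)  # remove it from the queue once processed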
