I am able to connect to my local Kafka server with the Python kafka package, but I am not able to connect to an external SSL-enabled Kafka server. My Java code, however, communicates with that same server using these parameters:
props.put("security.protocol", kafkaProtocol);
props.put(SslConfigs.SSL_PROTOCOL_CONFIG, kafkaProtocol);
props.put(SslConfigs.SSL_TRUSTSTORE_LOCATION_CONFIG, kafkaCertLocation);
props.put(SslConfigs.SSL_TRUSTSTORE_PASSWORD_CONFIG, kafkaCertPassword);
I don't know what the equivalent parameters are in the Python kafka package. Can someone suggest how to set this up?
I have tried this code:
producer = KafkaProducer(
    value_serializer=lambda m: json.dumps(m).encode('utf-8'),
    bootstrap_servers='YYYYY.KAKFASERVER.com:9094',
    security_protocol='SSL',
    ssl_certfile='cacerts',
    ssl_password='xxxxxxx')
I am getting the following error message:
failed to connect to YYYYY.KAKFASERVER.com:9094 unknown error (_ssl.c:3715)
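For reference, the kafka-python equivalents of those Java properties can be sketched as follows. This is a minimal sketch: kafka-python takes PEM files rather than a Java JKS truststore, so the CA certificate usually has to be exported first, and a truststore maps to `ssl_cafile`, not `ssl_certfile`. The keytool alias and file names below are hypothetical.

```python
# Export the CA certificate from the Java truststore to PEM once, in a shell
# (alias and file names are examples, not values from the original setup):
#   keytool -exportcert -alias caroot -keystore cacerts \
#           -storepass xxxxxxx -rfc -file ca.pem

# kafka-python arguments corresponding to the Java SSL properties:
ssl_config = {
    'bootstrap_servers': 'YYYYY.KAKFASERVER.com:9094',
    'security_protocol': 'SSL',   # security.protocol
    'ssl_cafile': 'ca.pem',       # ssl.truststore.location, exported as PEM
    'ssl_check_hostname': True,   # verify the broker certificate's hostname
}

# The producer would then be built as (requires kafka-python and a reachable
# broker, so it is left commented out here):
# producer = KafkaProducer(
#     value_serializer=lambda m: json.dumps(m).encode('utf-8'),
#     **ssl_config)
```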
I am attempting to create a simple producer and consumer with two Python scripts, using Kafka deployed on Microk8s. However, when running the producer.py script, I get the following error on repeat:
...|FAIL|rdkafka#producer-1| [thrd:...:9092/bootstrap]: ...:9092/bootstrap: Connect to ipv4#localhost:9092 failed: Connection refused (after 0ms in state CONNECT, ... identical error(s) suppressed
I am fairly confident that this issue is a result of the listeners not being configured correctly, but I have so far been unable to figure out what I need to do to fix them, due to what I assume is my complete lack of any knowledge in this area. I have reviewed these resources, in addition to several others from this site, but have been unable to find a solution, or at least a solution I can understand enough to act upon.
Steps to Reproduce:
The Python scripts to generate the producer and consumer can be found here.
For Microk8s installation, I followed these instructions. I also installed Helm, since my project requirements dictate that I use Helm charts.
I then installed Kafka using:
helm repo add bitnami https://charts.bitnami.com/bitnami
helm install kafka-release bitnami/kafka
The Python code in the linked post uses 'localhost:9092', as the error also shows - Connect to ipv4#localhost:9092 failed
If you are trying to run that code in a k8s pod, then you need to give the external broker DNS addresses, not the local pod address.
If you run the Python code from outside the k8s cluster, you need to expose an external service (NodePort or LoadBalancer) or an Ingress, as the linked Strimzi post shows; plus, you can still use the Strimzi Operator with Helm, so you don't really need the Bitnami charts.
At a high level, advertised.listeners tells clients how to connect to a specific broker. If you advertise localhost, the pod will try to connect to itself, even if the bootstrap connection worked (that part is set up by just listeners). If you advertise kafka.svc.cluster.local, then it will try to connect to the kafka service in the default namespace... but you still need to actually set bootstrap.servers = kafka.svc.cluster.local:9092, for example.
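To illustrate the listener split described above, a broker configuration along these lines separates the in-cluster view from the external one. The host names, node IP placeholder, and ports are illustrative, not taken from the original setup:

```properties
# In-cluster clients connect via the Service DNS name; external clients
# connect via a NodePort. All names and ports here are examples.
listeners=INTERNAL://0.0.0.0:9092,EXTERNAL://0.0.0.0:9094
advertised.listeners=INTERNAL://kafka.default.svc.cluster.local:9092,EXTERNAL://<node-ip>:30094
listener.security.protocol.map=INTERNAL:PLAINTEXT,EXTERNAL:PLAINTEXT
inter.broker.listener.name=INTERNAL
```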
This question already exists:
How to fix not receiving kafka messages in python but receiving the same messages in shell?
Closed 3 years ago.
I have a Debezium setup that uses Kafka. I am able to consume messages from the Kafka console consumer as described in the doc. However, when I create a Kafka consumer using Python on my local machine, I am unable to consume any messages, even though the console consumer works just fine!
I tried looking into this issue, but I was unable to find reports with a similar environment/situation.
My python code to connect is:
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    'dbserver1.inventory.customers',
    group_id='my-group',
    bootstrap_servers=['localhost:9092'],
    auto_offset_reset='earliest')

for message in consumer:
    print(message)
The loop just blocks and prints nothing, regardless of whether messages already exist or new ones are pushed to the topic.
I am sure the messages exist, because when I open up a console consumer, I can see them.
Just to be clear on the whole setup:
I have followed this doc (https://github.com/debezium/debezium-examples/tree/master/tutorial#using-mongodb) for each step (except the last one).
Everything works but my Python code.
I also tried creating a consumer with the kafka:9092 bootstrap server, but that ends in an error:
kafka.errors.NoBrokersAvailable: NoBrokersAvailable
My local machine runs macOS.
FYI:
I am able to get everything else, like topics:
>>> consumer = KafkaConsumer('dbserver1.inventory.customers', group_id='my-group', bootstrap_servers=['localhost:9092'], auto_offset_reset='earliest')
>>> consumer.topics()
{'my_connect_offsets', 'my_connect_configs', 'dbserver1.inventory.orders', 'connect-status', 'dbserver1.inventory.customers', 'dbserver1.inventory.products'}
I am starting the console consumer via this command:
docker-compose -f debezium-mongodb.yaml exec kafka /kafka/bin/kafka-console-consumer.sh \
--bootstrap-server kafka:9092 \
--from-beginning \
--property print.key=true \
--topic dbserver1.inventory.customers
Without seeing your compose file: based on your docker command, localhost:9092 will likely not work in your Python code.
If your Python code is not running in a container, it needs to connect on a different (published) port. If it is running in a container, you must use kafka:9092.
Which port to use depends on the advertised listeners of the container.
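A common way to support both cases is to advertise two listeners: one for other containers and one for the host. Below is a hypothetical compose fragment; the environment-variable names follow the generic KAFKA_* convention used by many Kafka images and may differ for the Debezium image, and the host port 29092 is an example:

```yaml
kafka:
  ports:
    - "29092:29092"   # publish the host-facing listener
  environment:
    KAFKA_LISTENERS: INTERNAL://0.0.0.0:9092,EXTERNAL://0.0.0.0:29092
    KAFKA_ADVERTISED_LISTENERS: INTERNAL://kafka:9092,EXTERNAL://localhost:29092
    KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: INTERNAL:PLAINTEXT,EXTERNAL:PLAINTEXT
    KAFKA_INTER_BROKER_LISTENER_NAME: INTERNAL
```

With a setup like this, Python on the host would use bootstrap_servers=['localhost:29092'], while other containers keep using kafka:9092.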
Connect to Kafka running in Docker from local machine
I am trying to connect to a gRPC server in a Celery task. I have the following piece of code:
import sys
import grpc

timeout = 1
host = '0.tcp.ngrok.io'
port = '7145'
channel = grpc.insecure_channel('{0}:{1}'.format(host, port))
try:
    grpc.channel_ready_future(channel).result(timeout=timeout)
except grpc.FutureTimeoutError:
    sys.exit(1)
stub = stub(channel)  # 'stub' here is the generated service stub class
When I run this snippet through the Python shell, I can establish the connection and execute the gRPC methods. However, when I run it in the Celery task, I get grpc.FutureTimeoutError and the connection is not established.
The Celery worker runs on the same machine as the gRPC server. I tried using the socket library to ping the gRPC server, and that worked (it returned some junk response).
I am using Python 2.7 with grpcio==1.6.0 installed. The Celery version is 4.1.0. Any pointers would be helpful.
I believe Celery uses fork under the hood, and gRPC 1.6 did not support any forking behavior.
Try updating to gRPC 1.7.
I'm trying to connect to an OPC server using the Python OpenOPC library. It works fine with a Matrikon OPC Simulator; however, when I try to connect to the actual server, the client seems to hang in the OpenOPC.open_client method. I added some debug messages to this API and found that the following code in OpenOPC.py is hanging:
import Pyro.core

Pyro.core.initClient(banner=0)
server_obj = Pyro.core.getProxyForURI("PYROLOC://%s:%s/opc" % (host, port))
return server_obj.create_client()  # this call is hanging
So if anyone has used OpenOPC to interface with OPC servers and has run into a similar problem, please let me know. Cheers!
When using OpenOPC on Linux, you can't use DCOM, so you need to use the OpenOPC Gateway Service and the open_client method.
The Gateway Service must be installed and running on the actual OPC server machine for your client to reach it.
I am looking to implement JSON-RPC using the Python Twisted framework. I've installed the latest txJSON-RPC (v0.3.1) and simplejson (v2.3.2) and am trying the JSON examples with Twisted v12.0.0 on Python 2.7.
The server from this StackOverflow post, Python Twisted JSON RPC, runs without errors, but the client errors out:
error [Failure instance: Traceback (failure with no frames): <class
'twisted.internet.error.ConnectionRefusedError'>: Connection was refused by other side:
10061: No connection could be made because the target machine actively refused it..
]
Which, from the look of it, means the server isn't actually listening. However, after starting the server, I can connect to its port via telnet.
I also tried the examples included in the /examples directory in the txJSON-RPC tarball, but those didn't run correctly either.
Any ideas about where I can find up-to-date information on how to successfully run a JSON-RPC server with Twisted?