I need to secure my ZooKeeper with SSL and then configure the Kafka broker to access ZooKeeper over SSL.
Is it possible?
And if yes, how?
Thanks
Alain
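(For reference, a minimal sketch only: Kafka brokers from 2.5 onward can talk TLS to ZooKeeper directly through server.properties entries along these lines; the paths and passwords are placeholders, and on older broker versions the zookeeper.ssl.* system properties shown further down this thread are the usual route.)
zookeeper.ssl.client.enable=true
zookeeper.clientCnxnSocket=org.apache.zookeeper.ClientCnxnSocketNetty
zookeeper.ssl.keystore.location=/path/to/keystore.jks
zookeeper.ssl.keystore.password=keystore_password
zookeeper.ssl.truststore.location=/path/to/truststore.jks
zookeeper.ssl.truststore.password=truststore_password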
I want to set up a Shovel in which the destination RabbitMQ is configured to be TLS enabled.
I am unable to create a Shovel; it stays in the 'starting' state.
I have two different RabbitMQ instances in two separate Docker containers; one of them is exposed from the host machine via ports 5671 (SSL) and 6671 (SSL).
I am using the RabbitMQ management plugin to establish the shovel.
Below are the connection details
Source AMQP URI:
amqp://admin:pass@localhost:5672 (non-SSL)
Target AMQP URI:
amqps://localhost:6671?cacertfile=/data/shared-file/certificates/ca_certificate.pem&certfile=/data/shared-file/certificates/client_certificate.pem&keyfile=/data/shared-file/certificates/client_key.pem&verify=verify_peer&server_name_indication=MyTestCA
What could be the problem here?
Kindly help
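For comparison, the same shovel can be declared as a dynamic shovel from the CLI; the shovel name and queue names below (my-tls-shovel, q-source, q-dest) are placeholders, not from the original setup:
rabbitmqctl set_parameter shovel my-tls-shovel \
  '{"src-uri": "amqp://admin:pass@localhost:5672", "src-queue": "q-source",
    "dest-uri": "amqps://localhost:6671?cacertfile=/data/shared-file/certificates/ca_certificate.pem&certfile=/data/shared-file/certificates/client_certificate.pem&keyfile=/data/shared-file/certificates/client_key.pem&verify=verify_peer&server_name_indication=MyTestCA",
    "dest-queue": "q-dest"}'
When a shovel stays in 'starting', the broker log on the node running it usually shows the underlying TLS/connection error, which is the first place to look.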
I am new to Apache Kafka, and here is what I have done so far:
Downloaded kafka_2.12-2.1.0
Made a batch file to run the ZooKeeper server:
start kafka_2.12-2.1.0.\bin\windows\zookeeper-server-start.bat kafka_2.12-2.1.0.\config\zookeeper.properties
Made a batch file for the Apache Kafka server:
start kafka_2.12-2.1.0\bin\windows\kafka-server-start.bat kafka_2.12-2.1.0\config\server.properties
Started a producer using a batch file:
start kafka_2.12-2.1.0.\bin\windows\kafka-console-producer.bat --broker-list localhost:9092 --topic 3drocket-player
It is running fine, but now I am looking into authentication, as I have to implement a consumer with specific auth settings (a requirement from the client): the security protocol is SASL_SSL and the SASL mechanism is GSSAPI.
For this reason, I searched and found the Confluent documentation, but the problem is that it is too abstract about how to take each and every step.
I am looking for detailed configuration steps for my setup: how do I configure my Kafka server with SASL_SSL and the GSSAPI mechanism? Initially I found that GSSAPI/Kerberos needs a separate server, so do I need to install another server, or is there any built-in solution within Confluent Kafka?
Configure a SASL_SSL port in server.properties, e.g.:
listeners=SASL_SSL://host.name:port
security.inter.broker.protocol=SASL_SSL
sasl.mechanism.inter.broker.protocol=GSSAPI
sasl.enabled.mechanisms=GSSAPI
sasl.kerberos.service.name=kafka
ssl.keystore.location=/path/to/keystore.jks
ssl.keystore.password=keystore_password
ssl.truststore.location=/path/to/truststore.jks
ssl.truststore.password=truststore_password
ssl.enabled.protocols=TLSv1.2,TLSv1.1,TLSv1
https://kafka.apache.org/documentation/#security_configbroker
https://kafka.apache.org/documentation/#security_sasl_config
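The broker side also needs a JAAS entry of its own (passed, for example, via -Djava.security.auth.login.config); a sketch, where the keytab path and principal are placeholders your Kerberos admin would provide:
KafkaServer {
    com.sun.security.auth.module.Krb5LoginModule required
    useKeyTab=true
    storeKey=true
    keyTab="/etc/security/keytabs/kafka_server.keytab"
    principal="kafka/broker.host.name@EXAMPLE.COM";
};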
Client:
When you run the Kafka client, you need to set these properties.
security.protocol=SASL_SSL
ssl.truststore.location=/path/to/truststore.jks
ssl.truststore.password=truststore_password
sasl.mechanism=GSSAPI
sasl.kerberos.service.name=kafka
https://kafka.apache.org/documentation/#security_configclients
https://kafka.apache.org/documentation/#security_sasl_kerberos_clientconfig
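As a sketch, the same client settings can also be wired into a Java consumer programmatically (the bootstrap address and group id below are placeholders, and the JAAS setup described next is still required):
import java.time.Duration;
import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class SaslSslConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Assumed SASL_SSL listener address; replace with your broker's host:port
        props.put("bootstrap.servers", "localhost:9093");
        props.put("group.id", "test-group");
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        // Security settings matching the broker's SASL_SSL/GSSAPI listener
        props.put("security.protocol", "SASL_SSL");
        props.put("ssl.truststore.location", "/path/to/truststore.jks");
        props.put("ssl.truststore.password", "truststore_password");
        props.put("sasl.mechanism", "GSSAPI");
        props.put("sasl.kerberos.service.name", "kafka");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            // Topic from the producer example above
            consumer.subscribe(Collections.singletonList("3drocket-player"));
            ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(5));
            for (ConsumerRecord<String, String> record : records) {
                System.out.printf("%s -> %s%n", record.key(), record.value());
            }
        }
    }
}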
Then set up the JAAS configuration:
KafkaClient {
com.sun.security.auth.module.Krb5LoginModule required
useKeyTab=true
keyTab="path/to/kafka_client.keytab"
storeKey=true
useTicketCache=false
principal="kafka-client-1#EXAMPLE.COM";
};
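The JAAS file is then typically handed to the client JVM, e.g.:
-Djava.security.auth.login.config=/path/to/kafka_client_jaas.conf
(Newer clients can alternatively embed the same entry via the sasl.jaas.config client property.)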
...
SASL/GSSAPI is for organizations using Kerberos (for example, by using Active Directory). You don’t need to install a new server just for Apache Kafka®. Ask your Kerberos administrator for a principal for each Kafka broker in your cluster and for every operating system user that will access Kafka with Kerberos authentication (via clients and tools).
https://docs.confluent.io/current/kafka/authentication_sasl/authentication_sasl_gssapi.html#kafka-sasl-auth-gssapi
....
I am trying to migrate from an SSL Kafka listener to a SASL_SSL Kafka listener without disturbing ongoing traffic on the SSL listener/port. Is there any way to do this on Kafka version 1.1.1?
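One possible approach (a sketch only; ports 9093/9094 are placeholders): Kafka, including 1.1.1, supports multiple listeners, so a SASL_SSL listener can be added on a new port alongside the existing SSL one, clients moved over gradually, and the old listener removed later. This still requires a rolling broker restart, but existing SSL clients keep working throughout:
listeners=SSL://host.name:9093,SASL_SSL://host.name:9094
advertised.listeners=SSL://host.name:9093,SASL_SSL://host.name:9094
security.inter.broker.protocol=SSL
Keeping security.inter.broker.protocol=SSL during the migration leaves inter-broker traffic on the existing listener untouched.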
I am trying to use ZooKeeper for node discovery with Apache Ignite. I have configured ZooKeeper to only accept SSL/TLS connections. How do I provide the ZooKeeper keystore details to the Apache Ignite ZookeeperDiscoverySpi? I have checked the documentation and source code of ignite-zookeeper.jar and I do not see any options to supply these details. Should I be providing them elsewhere in the Ignite config?
Solution:
Replace the ignite-zookeeper.jar dependency zookeeper-3.4.6.jar with the latest zookeeper-3.5.x.jar for proper SSL/Netty support.
Supply the SSL config details as JVM arguments (there are no options for this in the Ignite SPI API).
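For example (paths and passwords are placeholders), the ZooKeeper 3.5 client picks up these system properties from the Ignite node's JVM, matching the properties in the ZooKeeper SSL User Guide:
-Dzookeeper.client.secure=true
-Dzookeeper.clientCnxnSocket=org.apache.zookeeper.ClientCnxnSocketNetty
-Dzookeeper.ssl.keyStore.location=/path/to/keystore.jks
-Dzookeeper.ssl.keyStore.password=keystore_password
-Dzookeeper.ssl.trustStore.location=/path/to/truststore.jks
-Dzookeeper.ssl.trustStore.password=truststore_password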
I have a NiFi cluster with one ZooKeeper node and five NiFi nodes. I want to have SSL encryption from the ZooKeeper server to the NiFi client.
The NiFi documentation says:
Support for SSL in ZooKeeper is being actively developed and is expected to be available in the 3.5.x release version.
The new ZooKeeper 3.5.3-beta has SSL capabilities.
I installed ZooKeeper 3.5.3 but I am unable to secure the connection with SSL: I am getting NotSslRecordException.
How can I run NiFi with a secure ZooKeeper using SSL?
Thank you
It requires more than just running ZooKeeper 3.5.x. There is code in NiFi that uses the ZooKeeper client, and that code is not based on the 3.5.x client, so there is no way for NiFi to make an SSL connection.
Note that you also need to set up ZooKeeper to use SSL, for example:
zookeeper.ssl.keyStore.location="/path/to/your/keystore"
zookeeper.ssl.keyStore.password="keystore_password"
zookeeper.ssl.trustStore.location="/path/to/your/truststore"
zookeeper.ssl.trustStore.password="truststore_password"
Full documentation here: https://cwiki.apache.org/confluence/display/ZOOKEEPER/ZooKeeper+SSL+User+Guide
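On the ZooKeeper server side, that guide boils down to roughly this sketch (the port and paths are placeholders): enable a secure client port in zoo.cfg,
secureClientPort=2281
and point the server JVM at the Netty connection factory plus its keystore and truststore (e.g. via SERVER_JVMFLAGS):
-Dzookeeper.serverCnxnFactory=org.apache.zookeeper.server.NettyServerCnxnFactory
-Dzookeeper.ssl.keyStore.location=/path/to/your/keystore
-Dzookeeper.ssl.keyStore.password=keystore_password
-Dzookeeper.ssl.trustStore.location=/path/to/your/truststore
-Dzookeeper.ssl.trustStore.password=truststore_password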