Below is the error we received when trying to read the stream:
Caused by: kafkashaded.org.apache.kafka.common.KafkaException: Failed to load SSL keystore /dbfs/FileStore/Certs/client.keystore.jks
Caused by: java.nio.file.NoSuchFileException: /dbfs/FileStore/Certs/client.keyst
When trying to read a stream from Kafka, Databricks is unable to find keystore files.
df = spark.readStream \
    .format("kafka") \
    .option("kafka.bootstrap.servers", "kafka server with port") \
    .option("kafka.security.protocol", "SSL") \
    .option("kafka.ssl.truststore.location", "/dbfs/FileStore/Certs/client.truststore.jks") \
    .option("kafka.ssl.keystore.location", "/dbfs/FileStore/Certs/client.keystore.jks") \
    .option("kafka.ssl.keystore.password", keystore_pass) \
    .option("kafka.ssl.truststore.password", truststore_pass) \
    .option("kafka.ssl.keystore.type", "JKS") \
    .option("kafka.ssl.truststore.type", "JKS") \
    .option("subscribe", "sports") \
    .option("startingOffsets", "earliest") \
    .load()
The files exist in DBFS and we are also able to read them. We have also mounted the blob storage in Databricks and tried to read the files from ADLS Gen2.
The driver logs also contain an additional error: 22/11/04 12:18:07 ERROR DefaultSslEngineFactory: Modification time of key store could not be obtained.
We are trying to read a Kafka stream, authenticating with SSL keystores.
The connection does not work because Databricks is unable to access the keystores.
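For reference, where the files actually sit can be checked from a notebook both through the DBFS API and through the local /dbfs mount that the Kafka client reads from; a minimal sketch, using the paths from the question:

# DBFS view of the directory
display(dbutils.fs.ls("dbfs:/FileStore/Certs/"))

# Local FUSE path that the kafka.ssl.*.location options point at
import os
print(os.path.exists("/dbfs/FileStore/Certs/client.keystore.jks"))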
I was able to access the keystore files by adding a dbfs prefix to the original path. So, instead of using the path /dbfs/FileStore/Certs/client.truststore.jks, I used /dbfs/dbfs/FileStore/Certs/client.truststore.jks.
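For completeness, a sketch of the reader with only the two location options adjusted (everything else unchanged, with keystore_pass and truststore_pass defined as before):

df = spark.readStream \
    .format("kafka") \
    .option("kafka.bootstrap.servers", "kafka server with port") \
    .option("kafka.security.protocol", "SSL") \
    .option("kafka.ssl.truststore.location", "/dbfs/dbfs/FileStore/Certs/client.truststore.jks") \
    .option("kafka.ssl.keystore.location", "/dbfs/dbfs/FileStore/Certs/client.keystore.jks") \
    .option("kafka.ssl.keystore.password", keystore_pass) \
    .option("kafka.ssl.truststore.password", truststore_pass) \
    .option("kafka.ssl.keystore.type", "JKS") \
    .option("kafka.ssl.truststore.type", "JKS") \
    .option("subscribe", "sports") \
    .option("startingOffsets", "earliest") \
    .load()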
But I am now receiving an SSL handshake error, even though the truststore I created is based on the server certificate and the fingerprint in the certificate matches the truststore fingerprint.
kafkashaded.org.apache.kafka.common.errors.SslAuthenticationException: SSL handshake failed
Caused by: sun.security.validator.ValidatorException: PKIX path validation failed: java.security.cert.CertPathValidatorException: signature check failed
Caused by: java.security.cert.CertPathValidatorException: signature check failed
Caused by: java.security.SignatureException: Signature does not match.
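A signature-check failure during PKIX path validation usually means the certificate the broker actually presents is not the one (or is not signed by the one) sitting in the truststore. One way to double-check what the broker presents at runtime, as a sketch with a placeholder host and port, is to pull the certificate with the standard library and compare fingerprints:

import ssl, hashlib

# Fetch the certificate the broker presents (no validation is done here)
# and print its SHA-256 fingerprint for comparison with the truststore entry.
pem = ssl.get_server_certificate(("kafka-broker-host", 9093))  # placeholder host/port
der = ssl.PEM_cert_to_DER_cert(pem)
print(hashlib.sha256(der).hexdigest())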
Related
I am trying to run a Kafka producer client to publish messages to a Kafka broker. I have given the path to the keystore/truststore along with the password. I was able to send the message to the broker when I deployed this on Apache Tomcat. However, when I tried to deploy the same application on WebSphere, I get the error "Failed to load SSL keystore". I have given those directories read/write/execute permissions. Is there something in WebSphere that needs a different configuration or settings?
Caused by: org.apache.kafka.common.KafkaException: org.apache.kafka.common.KafkaException: Failed to load SSL keystore /home/avaya/tcr/uc-ivr-nar-dev.dbplatform.portal.com.jks of type JKS
at org.apache.kafka.common.security.ssl.SslEngineBuilder.createSSLContext(SslEngineBuilder.java:160)
at org.apache.kafka.common.security.ssl.SslEngineBuilder.<init>(SslEngineBuilder.java:102)
at org.apache.kafka.common.security.ssl.SslFactory.configure(SslFactory.java:93)
at org.apache.kafka.common.network.SslChannelBuilder.configure(SslChannelBuilder.java:71)
... 37 more
Caused by: org.apache.kafka.common.KafkaException: Failed to load SSL keystore /home/avaya/tcr/uc-ivr-nar-dev.dbplatform.portal.com.jks of type JKS
at org.apache.kafka.common.security.ssl.SslEngineBuilder$SecurityStore.load(SslEngineBuilder.java:289)
at org.apache.kafka.common.security.ssl.SslEngineBuilder.createSSLContext(SslEngineBuilder.java:142)
... 40 more
Caused by: java.nio.file.AccessDeniedException: /home/avaya/tcr/uc-ivr-nar-dev.dbplatform.portal.com.jks
at sun.nio.fs.UnixException.translateToIOException(UnixException.java:96)
at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:114)
at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:119)
at sun.nio.fs.UnixFileSystemProvider.newByteChannel(UnixFileSystemProvider.java:226)
at java.nio.file.Files.newByteChannel(Files.java:372)
at java.nio.file.Files.newByteChannel(Files.java:418)
at java.nio.file.spi.FileSystemProvider.newInputStream(FileSystemProvider.java:395)
at java.nio.file.Files.newInputStream(Files.java:163)
at org.apache.kafka.common.security.ssl.SslEngineBuilder$SecurityStore.load(SslEngineBuilder.java:282)
... 41 more
OpenJDK for some reason does not like JKS keystore files. Converted to PKCS12 format and it worked. Nothing to do with the WebSphere container.
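For reference, the conversion itself is a one-off keytool step; a sketch, scripted here with subprocess (the .p12 output path is hypothetical, and keytool will prompt for the store passwords):

import subprocess

# keytool does the actual conversion from JKS to PKCS12.
subprocess.run([
    "keytool", "-importkeystore",
    "-srckeystore", "/home/avaya/tcr/uc-ivr-nar-dev.dbplatform.portal.com.jks",
    "-srcstoretype", "JKS",
    "-destkeystore", "/home/avaya/tcr/uc-ivr-nar-dev.dbplatform.portal.com.p12",  # hypothetical output path
    "-deststoretype", "PKCS12",
], check=True)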
I have Kafka brokers in a cluster. We use SASL authentication. How can I request, for example, the list of topics using kafka-topics.sh?
I assume that I should run
kafka-topics.sh \
--bootstrap-server kafka.broker:9092 \
--command-config config.properties \
--list
And pass these values in config.properties:
security.protocol=SASL_SSL
sasl.mechanism=SCRAM-SHA-256
sasl.username=user-name
sasl.password=password
ssl.key.location=/path/to/certs/key.pem
ssl.certificate.location=/path/to/certs/crt.pem
ssl.ca.location=/path/to/certs/ca.pem
When I run it, I get:
Exception in thread "main" org.apache.kafka.common.KafkaException: Failed to create new KafkaAdminClient
at org.apache.kafka.clients.admin.KafkaAdminClient.createInternal(KafkaAdminClient.java:553)
at org.apache.kafka.clients.admin.KafkaAdminClient.createInternal(KafkaAdminClient.java:485)
at org.apache.kafka.clients.admin.Admin.create(Admin.java:134)
at kafka.admin.TopicCommand$TopicService$.createAdminClient(TopicCommand.scala:205)
at kafka.admin.TopicCommand$TopicService$.apply(TopicCommand.scala:209)
at kafka.admin.TopicCommand$.main(TopicCommand.scala:50)
at kafka.admin.TopicCommand.main(TopicCommand.scala)
Caused by: java.lang.IllegalArgumentException: Could not find a 'KafkaClient' entry in the JAAS configuration. System property 'java.security.auth.login.config' is not set
at org.apache.kafka.common.security.JaasContext.defaultContext(JaasContext.java:131)
at org.apache.kafka.common.security.JaasContext.load(JaasContext.java:96)
at org.apache.kafka.common.security.JaasContext.loadClientContext(JaasContext.java:82)
at org.apache.kafka.common.network.ChannelBuilders.create(ChannelBuilders.java:167)
at org.apache.kafka.common.network.ChannelBuilders.clientChannelBuilder(ChannelBuilders.java:81)
at org.apache.kafka.clients.ClientUtils.createChannelBuilder(ClientUtils.java:105)
at org.apache.kafka.clients.admin.KafkaAdminClient.createInternal(KafkaAdminClient.java:524)
We use the same values to connect from a Go service that uses the segmentio Kafka driver. What should the config be?
To pass SASL credentials you need to use the sasl.jaas.config setting. sasl.username and sasl.password are not valid settings with kafka-topics.sh (and the Java client).
For example:
security.protocol=SASL_SSL
sasl.mechanism=SCRAM-SHA-256
sasl.jaas.config=org.apache.kafka.common.security.scram.ScramLoginModule required \
username="user-name" \
password="password";
Similarly, ssl.key.location, ssl.certificate.location, and ssl.ca.location are not valid settings; you need to use ssl.keystore.location and ssl.truststore.location instead. See the full list of configurations: https://kafka.apache.org/documentation/#adminclientconfigs
See the SCRAM client configuration section in the Kafka docs if you want more details.
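As an aside, key names such as sasl.username, sasl.password and ssl.ca.location are the ones used by librdkafka-based clients, which may be where they came from. For contrast, a minimal sketch with the confluent-kafka Python AdminClient (hypothetical credentials and CA path), where those names are valid:

from confluent_kafka.admin import AdminClient

# librdkafka-style configuration; these key names are not understood by the Java clients or CLI tools.
admin = AdminClient({
    "bootstrap.servers": "kafka.broker:9092",
    "security.protocol": "SASL_SSL",
    "sasl.mechanism": "SCRAM-SHA-256",
    "sasl.username": "user-name",
    "sasl.password": "password",
    "ssl.ca.location": "/path/to/certs/ca.pem",
})

# Rough equivalent of kafka-topics.sh --list
print(sorted(admin.list_topics(timeout=10).topics))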
I'm trying to publish a message to a Tibco queue on an SSL-enabled Tibco server through JMeter 5.4.1 using the JMS Point-to-Point Logic Controller.
[Screenshot: JMS Point-to-Point Controller configuration]
But I'm getting the following error message:
2021-06-13 12:25:46,278 ERROR o.a.j.p.j.s.JMSSampler: Not permitted: Failed to connect to any server at: ssl://[server-name]:7352, ssl://[server-name]:7352 [Error: Failed to connect via SSL to [ssl://[server-name]:7352]: Received fatal alert: protocol_version: url that returned this exception = SSL://[server-name]:7352 ]
javax.naming.AuthenticationException: Not permitted: Failed to connect to any server at: ssl://[server-name]:7352, ssl://[server-name]:7352 [Error: Failed to connect via SSL to [ssl://[server-name]:7352]: Received fatal alert: protocol_version: url that returned this exception = SSL://[server-name] ]
at com.tibco.tibjms.naming.TibjmsContext.lookup(TibjmsContext.java:670) ~[tibjms.jar:8.0.0]
at com.tibco.tibjms.naming.TibjmsContext.lookup(TibjmsContext.java:491) ~[tibjms.jar:8.0.0]
at javax.naming.InitialContext.lookup(InitialContext.java:417) ~[?:1.8.0_291]
at org.apache.jmeter.protocol.jms.sampler.JMSSampler.threadStarted(JMSSampler.java:638) [ApacheJMeter_jms.jar:5.4.1]
at org.apache.jmeter.threads.JMeterThread$ThreadListenerTraverser.addNode(JMeterThread.java:784) [ApacheJMeter_core.jar:5.4.1]
at org.apache.jorphan.collections.HashTree.traverseInto(HashTree.java:993) [jorphan.jar:5.4.1]
at org.apache.jorphan.collections.HashTree.traverse(HashTree.java:976) [jorphan.jar:5.4.1]
at org.apache.jmeter.threads.JMeterThread.threadStarted(JMeterThread.java:752) [ApacheJMeter_core.jar:5.4.1]
at org.apache.jmeter.threads.JMeterThread.initRun(JMeterThread.java:740) [ApacheJMeter_core.jar:5.4.1]
at org.apache.jmeter.threads.JMeterThread.run(JMeterThread.java:252) [ApacheJMeter_core.jar:5.4.1]
I tried:
openssl s_client -connect [server-name]:7352
It gave the following output:
SSL-Session:
Protocol : TLSv1.2
Cipher : ECDHE-RSA-AES256-GCM-SHA384
Session-ID:
Session-ID-ctx:
So I added the following line to the jmeter.properties file:
https.default.protocol=TLSv1.2
I also commented out jdk.tls.disabledAlgorithms in the JDK's java.security file (I'm using jdk1.8.0_291):
# jdk.tls.disabledAlgorithms=SSLv3, TLSv1, TLSv1.1, RC4, DES, MD5withRSA, \
# DH keySize < 1024, EC keySize < 224, 3DES_EDE_CBC, anon, NULL, \
# include jdk.disabled.namedCurves
But I'm still getting the same error. Can someone please help?
I think you're using the wrong property (and in the wrong place as well): you're setting the default protocol for HTTPS, while you need to set it for TLS. Add the following line to the system.properties file:
jdk.tls.client.protocols=TLSv1.2
JMeter restart will be required to apply this property.
If that doesn't help, or you get different errors, consider adding the following line there as well:
javax.net.debug=all
and then check the jmeter.log file and stdout for any suspicious entries.
More information:
Configuring JMeter
Apache JMeter Properties Customization Guide
I resolved it by using the latest tibjms.jar in JMeter's lib directory, as the Tibco server had been upgraded a few hours before I raised this issue.
I have a 3-node Kafka cluster. I have enabled SASL_PLAINTEXT and it is working fine on port 6667. Now I want to enable SSL on a different port in the same cluster. I have created the truststore and keystore certificates and did the following configuration on the broker side:
listeners : SSL://localhost:6668
security.inter.broker.protocol : SSL
ssl.key.password : xxxx
ssl.keystore.location : /root/kafka.server.keystore.jks
ssl.keystore.password : xxxxx
ssl.truststore.location : /root/kafka.server.truststore.jks
ssl.truststore.password : xxxxxx
ssl.keystore.type : JKS
ssl.truststore.type : JKS
I have also given permissions. I am getting the errors below:
Caused by: org.apache.kafka.common.KafkaException: org.apache.kafka.common.KafkaException: Failed to load SSL keystore /root/kafka.server.keystore.jks of type JKS
Caused by: org.apache.kafka.common.KafkaException: Failed to load SSL keystore /root/kafka.server.keystore.jks of type JKS
Caused by: java.io.FileNotFoundException: /root/kafka.server.keystore.jks (Permission denied)
The error trace is fairly clear: /root/kafka.server.keystore.jks cannot be accessed by the broker process. Note that the process typically runs as a different user, and I suspect the keystore was created by another user.
Make sure that the user running the process has sufficient access rights to read /root/kafka.server.keystore.jks. One way of achieving this is to change the ownership of the file:
sudo chown -R userWhoRunsTheProcess:userGroup /root/kafka.server.keystore.jks
Regarding the listeners question, listeners takes a list of addresses, e.g.:
listeners : SSL://0.0.0.0:6668,SASL_PLAINTEXT://0.0.0.0:6667
You already have SASL, so I would suggest using SASL_SSL.
I am receiving the following error while trying to connect to my REST web service using an HTTP adapter in IBM MobileFirst:
"errors": [
"Runtime: Http request failed: javax.net.ssl.SSLException: java.lang.RuntimeException: Unexpected error: java.security.InvalidAlgorithmParameterException: the trustAnchors parameter must be non-empty"
I am passing the user id and password in a base64 encoded format in the headers section of my input.
How do I resolve this error?
Yoel's answer got me on track: your adapter is doing an SSL request to a server that is not trusted by the keystore in your MobileFirst server.
You need to import into your server's keystore the certificate chain of the server that you are trying to reach. What I did was:
From Firefox, export the certificate chain in PEM format (.crt extension).
In the server/conf folder of your project, import the certificate chain file. If you are using the defaults from the worklight.properties file, this will do it:
keytool -import -keystore default.keystore -storepass worklight -file remoteServer.crt
This bizarre message means that the truststore you specified was not found, or couldn't be opened due to access permissions, for example.
Quote from: Error - trustAnchors parameter must be non-empty
Author: #EJP
Similar question:
got java.security.InvalidAlgorithmParameterException: the trustAnchors parameter must be non-empty when using cas