Apache Curator Unimplemented Errors When Trying to Create zNodes

I'm attempting to use Apache Curator with a dockerized ZooKeeper instance, and no matter how I attempt to connect I always end up with an
org.apache.zookeeper.KeeperException$UnimplementedException: KeeperErrorCode = Unimplemented for...
error. I've tried making sense of the documentation but I'm not getting anywhere. I've logged into the ZooKeeper CLI and confirmed that the port number is correct:
snerd#powerglove:~$ docker ps
CONTAINER ID   IMAGE               COMMAND                 CREATED       STATUS              PORTS                                                                        NAMES
31f1093495ba   compose_zookeeper   "/opt/zookeeper/bin/   3 weeks ago   Up About a minute   0.0.0.0:32770->2181/tcp, 0.0.0.0:32769->2888/tcp, 0.0.0.0:32768->3888/tcp   zookeeper
here is the code I'm trying to use:
public class App {
    public static void main( String[] args ) {
        CuratorFramework client = CuratorFrameworkFactory.newClient("0.0.0.0:32770", new RetryUntilElapsed(3000, 1000));
        client.start();
        try {
            client.create().forPath("/larry-smells/foop", "tuna?".getBytes());
        } catch (Exception e) {
            System.out.println(e.toString());
        }
    }
}
As far as I can tell from the Curator getting started page, this should work. What am I missing?
Edit 1:
I just figured out that I'm able to pull data out of the ZooKeeper ensemble like this:
System.out.println(new String(curatorFramework.getData().forPath("/larry-smells")));
but the create command is still blowing up.
Edit 2:
stacktrace of the error:
org.apache.zookeeper.KeeperException$UnimplementedException: KeeperErrorCode = Unimplemented for /larry-smells/foop
    at org.apache.zookeeper.KeeperException.create(KeeperException.java:103)
    at org.apache.zookeeper.KeeperException.create(KeeperException.java:51)
    at org.apache.zookeeper.ZooKeeper.create(ZooKeeper.java:1297)
    at org.apache.curator.framework.imps.CreateBuilderImpl$17.call(CreateBuilderImpl.java:1040)
    at org.apache.curator.framework.imps.CreateBuilderImpl$17.call(CreateBuilderImpl.java:1023)
    at org.apache.curator.connection.StandardConnectionHandlingPolicy.callWithRetry(StandardConnectionHandlingPolicy.java:67)
    at org.apache.curator.RetryLoop.callWithRetry(RetryLoop.java:99)
    at org.apache.curator.framework.imps.CreateBuilderImpl.pathInForeground(CreateBuilderImpl.java:1020)
    at org.apache.curator.framework.imps.CreateBuilderImpl.protectedPathInForeground(CreateBuilderImpl.java:501)
    at org.apache.curator.framework.imps.CreateBuilderImpl.forPath(CreateBuilderImpl.java:491)
    at org.apache.curator.framework.imps.CreateBuilderImpl$4.forPath(CreateBuilderImpl.java:367)
    at org.apache.curator.framework.imps.CreateBuilderImpl$4.forPath(CreateBuilderImpl.java:309)
    at com.mycompany.app.App.main(App.java:35)

Edit: Apparently this error can occur if you're using an incompatible combination of Curator and ZooKeeper. From curator.apache.org:
Curator 2.x.x - compatible with both ZooKeeper 3.4.x and ZooKeeper 3.5.x
Curator 3.x.x - compatible only with ZooKeeper 3.5.x and includes support for new features such as dynamic reconfiguration, etc.
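The Curator site also describes a soft-compatibility mode for running newer Curator versions against a 3.4.x server: exclude Curator's transitive ZooKeeper dependency and declare the 3.4.x client yourself. A minimal Maven sketch of that idea (the artifact choice and version numbers here are only examples):
<dependency>
    <groupId>org.apache.curator</groupId>
    <artifactId>curator-framework</artifactId>
    <version>4.0.1</version>
    <exclusions>
        <exclusion>
            <groupId>org.apache.zookeeper</groupId>
            <artifactId>zookeeper</artifactId>
        </exclusion>
    </exclusions>
</dependency>
<dependency>
    <groupId>org.apache.zookeeper</groupId>
    <artifactId>zookeeper</artifactId>
    <version>3.4.6</version>
</dependency>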
It's hard to pinpoint your problem with only that error code and no stack trace, but here are some improvements I would suggest to make your application more stable:
public class App {
    public static void main( String[] args ) {
        CuratorFramework client = CuratorFrameworkFactory.newClient("0.0.0.0:32770", new RetryUntilElapsed(3000, 1000));
        client.start();
        try {
            // make sure you're connected to zookeeper.
            client.blockUntilConnected();
            // make sure the parents are created.
            client.create().creatingParentsIfNeeded().forPath("/larry-smells/foop", "tuna?".getBytes());
        } catch (Exception e) {
            System.out.println(e.toString());
        }
    }
}
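A further tweak (my own suggestion, not something the fix above depends on): an exponential backoff retry policy is often preferred over RetryUntilElapsed, for example:
// retry up to 3 times, starting at 1 second and backing off exponentially
RetryPolicy retryPolicy = new ExponentialBackoffRetry(1000, 3);
CuratorFramework client = CuratorFrameworkFactory.newClient("0.0.0.0:32770", retryPolicy);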

I also faced a similar exception. I used the dependencies below, which are compatible and helped me resolve it.
<dependency>
    <groupId>org.apache.zookeeper</groupId>
    <artifactId>zookeeper</artifactId>
    <version>3.4.6</version>
</dependency>
<dependency>
    <groupId>org.apache.curator</groupId>
    <artifactId>curator-framework</artifactId>
    <version>4.0.1</version>
</dependency>
<dependency>
    <groupId>org.apache.curator</groupId>
    <artifactId>curator-x-discovery</artifactId>
    <version>4.0.1</version>
</dependency>

I had the same problem.
I tried to use inTransaction() as explained in example 6 here: http://www.programcreek.com/java-api-examples/index.php?api=org.apache.curator.framework.CuratorFramework
and it seems to work:
client.inTransaction().create().forPath("/larry-smells/foop", "tuna?".getBytes()).and().commit();

The issue is caused by a version incompatibility.
To fix it, you need to change the version as explained here:
https://curator.apache.org/zk-compatibility.html
If that doesn't work, just use the newest Curator version that depends on a 3.4.x ZooKeeper version (currently 2.12.0).
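For example, assuming Maven and the curator-framework artifact used elsewhere in this thread, the downgrade would look something like:
<dependency>
    <groupId>org.apache.curator</groupId>
    <artifactId>curator-framework</artifactId>
    <version>2.12.0</version>
</dependency>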

@Massimo Da Ros's solution works, but in the newer Curator 4.0.0 inTransaction() is deprecated; it's recommended to use the transaction() method instead, as below:
// build the create operation, then commit it in a transaction
CuratorOp op = client.transactionOp().create()
        .withMode(CreateMode.PERSISTENT)
        .withACL(Ids.OPEN_ACL_UNSAFE)
        .forPath("/test", "Data".getBytes());
String result = client.transaction().forOperations(op).get(0).toString();

I faced a similar problem. I was using spring-cloud-starter-zookeeper-discovery, which of course brings in compatible ZooKeeper and Curator versions by itself.
<dependency>
    <groupId>org.springframework.cloud</groupId>
    <artifactId>spring-cloud-starter-zookeeper-discovery</artifactId>
</dependency>
I checked the dependency tree: spring-cloud-starter-zookeeper-discovery version 3.1.1 was using ZooKeeper version 3.6.0.
The problem was that in my docker-compose.yml I was using ZooKeeper version 3.4!
So make sure the ZooKeeper version in your docker-compose.yml matches your Maven ZooKeeper version.
version: "3.8"
services:
zookeeper:
container_name: zookeeper
image: zookeeper:3.6 <----------------- zookeeper version
ports:
- "2181:2181"

Related

Caused by: java.lang.NoClassDefFoundError: Could not initialize class io.confluent.kafka.serializers.KafkaAvroSerializerConfig since 6.0.0

I'm working on a Flink (v1.13.2) application which should publish some objects to my Kafka broker.
For schema validation I use the Confluent Schema Registry.
I previously used the library in version 5.2.0 (also tried other 5.x.x versions):
<dependency>
    <groupId>io.confluent</groupId>
    <artifactId>kafka-avro-serializer</artifactId>
    <!--<version>5.x.x</version>-->
    <version>6.2.0</version>
</dependency>
This seems to work, but there was a strange behaviour while registering the schema with the registry: the schema was just "bytes". After investigating, I found that the suspect part in AvroSchemaUtils had been changed:
https://github.com/confluentinc/schema-registry/blob/a2f80f30d6713c50ee54c47885bcde2945932660/client/src/main/java/io/confluent/kafka/schemaregistry/avro/AvroSchemaUtils.java#L88
So I tried to update the library to the next working version.
After updating to 6.x.x, I got the following error:
Caused by: java.lang.NoClassDefFoundError: Could not initialize class io.confluent.kafka.serializers.KafkaAvroSerializerConfig
at io.confluent.kafka.serializers.KafkaAvroSerializer.configure(KafkaAvroSerializer.java:50)
at org.apache.kafka.clients.producer.KafkaProducer.<init>(KafkaProducer.java:369)
... 23 more
How can I find out what's wrong here?
This may be the problem: after the Kafka Avro Serializer is upgraded, its dependent Kafka client is upgraded from kafka_2.12 to kafka_2.13:
https://mvnrepository.com/artifact/io.confluent/kafka-avro-serializer/6.2.0
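To verify this in your own build (assuming Maven), you can list the Kafka artifacts that come in transitively:
mvn dependency:tree -Dincludes=org.apache.kafka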

ssl certificate issue for jgit in javafx app image versus javafx app runtime

I have developed a simple JavaFX app using JGit that allows users to play with Git.
When the app was started from IntelliJ IDEA, I was able to clone the GitHub repo using my JGit implementation of the "git clone" command without any issues. But as soon as I created an image of my app and started the app from that image, I get an SSL certificate issue:
Exception:org.eclipse.jgit.api.errors.TransportException:
Secure connection to https://my-github-repo.git could not be established because of SSL problems.
I am trying to understand why I am getting the SSL certificate issue only when running the app from the image. Can somebody explain that? I understand that I can disable SSL verification (there are some answered questions on that topic), but I want to know why it works from the IDE and not from the created image...
Here is my simplified git clone implementation for HTTP:
try {
    CloneCommand command = Git.cloneRepository();
    command.setCredentialsProvider(new UsernamePasswordCredentialsProvider(httpUsername, httpPassword));
    // run the clone command
    command.setURI(repositoryUrl);
    command.setDirectory(dirFramework);
    git = command.call();
} catch (Exception e) {
    LOG.error("Error occurred during task: Git clone: " + e);
}
For creating the image I am using the "org.beryx.runtime" plugin with the Gradle task "runtime".
Here is the build.gradle content:
plugins {
    id 'java'
    id 'application'
    id 'org.openjfx.javafxplugin' version '0.0.8'
    id 'org.beryx.runtime' version '1.11.4'
}
group 'org.sovap'
version '1.0-SNAPSHOT'
sourceCompatibility = 11
targetCompatibility = 11
repositories {
    mavenCentral()
}
dependencies {
    // xml stuff
    compile 'jakarta.xml.bind:jakarta.xml.bind-api:2.3.3'
    compile 'org.glassfish.jaxb:jaxb-runtime:2.3.2'
    compile 'jakarta.activation:jakarta.activation-api:1.2.2'
    // logger
    compile 'org.apache.logging.log4j:log4j-core:2.13.3'
    compile 'org.slf4j:slf4j-api:1.7.30'
    compile 'org.slf4j:slf4j-simple:1.7.30'
    // cucumber
    compile 'io.cucumber:gherkin:15.0.2'
    // jgit
    compile 'org.eclipse.jgit:org.eclipse.jgit:5.9.0.202009080501-r'
    compile 'org.eclipse.jgit:org.eclipse.jgit.archive:5.9.0.202009080501-r'
    compile 'org.eclipse.jgit:org.eclipse.jgit.ssh.jsch:5.9.0.202009080501-r'
    // file utils
    compile 'commons-io:commons-io:2.7'
    // controlsfx
    compile 'org.controlsfx:controlsfx:11.0.2'
}
javafx {
    version = "15"
    modules = ['javafx.controls', 'javafx.fxml', 'javafx.web', "javafx.graphics"]
}
application {
    mainClassName = 'org.sovap.taman.Launcher'
    applicationName = 'taman'
}
runtime {
    options = ['--strip-debug', '--compress', '2', '--no-header-files', '--no-man-pages']
    imageDir = file("$buildDir/taman")
}
Edit 1:
How is the image created? It is created with the plugin 'org.beryx.runtime' version '1.11.4', via the Gradle task called 'runtime'. At the end of the build.gradle content above you can see the specific configuration for the runtime task.
Also, it is possible to include modules there, as described here: https://badass-runtime-plugin.beryx.org/releases/latest/
I have tested the config with the modules you mention for the runtime task in build.gradle (also one by one):
runtime {
    options = ['--strip-debug', '--compress', '2', '--no-header-files', '--no-man-pages']
    imageDir = file("$buildDir/taman")
    modules = ['jdk.crypto.cryptoki', 'jdk.crypto.ec', 'jdk.crypto.mscapi']
}
But with that I am getting the following exception when starting the app from the image:
Exception in thread "main" java.lang.NoClassDefFoundError: javax/xml/stream/XMLStreamException
at org.apache.logging.log4j.core.config.builder.api.ConfigurationBuilderFactory.newConfigurationBuilder(ConfigurationBuilderFactory.java:38)
at org.apache.logging.log4j.core.config.properties.PropertiesConfigurationBuilder.<init>(PropertiesConfigurationBuilder.java:72)
at org.apache.logging.log4j.core.config.properties.PropertiesConfigurationFactory.getConfiguration(PropertiesConfigurationFactory.java:52)
at org.apache.logging.log4j.core.config.properties.PropertiesConfigurationFactory.getConfiguration(PropertiesConfigurationFactory.java:35)
at org.apache.logging.log4j.core.config.ConfigurationFactory$Factory.getConfiguration(ConfigurationFactory.java:551)
at org.apache.logging.log4j.core.config.ConfigurationFactory$Factory.getConfiguration(ConfigurationFactory.java:475)
at org.apache.logging.log4j.core.config.ConfigurationFactory.getConfiguration(ConfigurationFactory.java:323)
at org.apache.logging.log4j.core.LoggerContext.reconfigure(LoggerContext.java:687)
at org.apache.logging.log4j.core.LoggerContext.reconfigure(LoggerContext.java:708)
at org.apache.logging.log4j.core.LoggerContext.start(LoggerContext.java:263)
at org.apache.logging.log4j.core.impl.Log4jContextFactory.getContext(Log4jContextFactory.java:153)
at org.apache.logging.log4j.core.impl.Log4jContextFactory.getContext(Log4jContextFactory.java:45)
at org.apache.logging.log4j.LogManager.getContext(LogManager.java:194)
at org.apache.logging.log4j.LogManager.getLogger(LogManager.java:602)
at org.sovap.taman.App.<clinit>(App.java:15)
at org.sovap.taman.Launcher.main(Launcher.java:6)
Caused by: java.lang.ClassNotFoundException: javax.xml.stream.XMLStreamException
at java.base/jdk.internal.loader.BuiltinClassLoader.loadClass(Unknown Source)
at java.base/jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(Unknown Source)
at java.base/java.lang.ClassLoader.loadClass(Unknown Source)
... 16 more
What version of Java 11 are you using? - java version "11.0.8" 2020-07-14 LTS
Regarding the different JDK versions for the image and for the IntelliJ IDEA environment - both should be using the same installation on my machine.
I guess I need to add modules to the runtime task, but I do not know which ones or how to identify them... Any direction you can point me in?
Edit 2:
According to the accepted answer, I was missing modules in my runtime task configuration. Here is how I figured out which ones to add:
The plugin 'org.beryx.runtime' provides a task for suggesting modules for the runtime configuration (task: suggestModules).
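It runs like any other Gradle task, for example:
./gradlew suggestModules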
After running it, I put together the list of modules in the following runtime config (including the ones from the accepted answer):
runtime {
    options = ['--strip-debug', '--compress', '2', '--no-header-files', '--no-man-pages']
    imageDir = file("$buildDir/taman")
    modules = [
            'java.desktop',
            'java.logging',
            'java.xml',
            'java.compiler',
            'java.datatransfer',
            'java.rmi',
            'java.sql',
            'java.naming',
            'java.scripting',
            'java.management',
            'java.security.jgss',
            'jdk.jfr',
            'java.net.http',
            'jdk.jsobject',
            'jdk.xml.dom',
            'jdk.unsupported',
            'jdk.crypto.cryptoki',
            'jdk.crypto.ec',
            'jdk.crypto.mscapi']
}
Now everything works as expected.
The most likely situation is that you are missing a module in the JRE image related to crypto that is required to authenticate the SSL connection.
What jdk.crypto.* modules are in the final image?
Perhaps if one of these is missing it will affect the ability to handle the SSL certificates?
jdk.crypto.cryptoki
jdk.crypto.ec
jdk.crypto.mscapi
Since some aspects of the security/crypto code are loaded via service providers, perhaps when you generate the JRE image you should pass the "--bind-services" option to "Link in service provider modules and their dependences".
You will need to share more details about how the image is created and what specific errors are reported. Try to include the full stack trace of any reported exceptions.
What version of Java 11 are you using?
Could you be running into this: JDK 11 SSL Error on valid certificate (working in previous versions)
(It is unlikely if you are running with the same JDK version in the IDE and the packaged image, but thought I would mention it just in case it gives you a hint.)

Pact provider tests broken: pactVerificationTestTemplate » PreconditionViolation

I'm quite new to CDC testing and am only making my first steps. I've deployed the Pact Broker (docker-compose), running at localhost:80. The consumer sends the generated pacts to the broker successfully, but it seems that the provider can't get a valid contract (though this is only an assumption).
I'm using Spring Boot, Maven, and JUnit 5. The application tests are running on Ubuntu.
Using @PactFolder with the consumer-generated pact contract in a local directory results in successful tests.
When I switch to the @PactBroker annotation, the provider is able to connect to the broker and receives the following response (taken from the debug logs):
{"_links":
{"self":{
"href":"http://localhost/pacts/provider/provider- name/latest","title":"Latest pact versions for the provider provider-name"},
"pb:provider":{"href":"http://localhost/pacticipants/provider-name",
"name":"provider-name"},
"pb:pacts":[
{"href":"http://localhost/pacts/provider/provider-name/consumer/consumer-name/version/1.0.0",
"title":"Pact between consumer-name (v1.0.0) and provider-name",
"name":"consumer-name"}
],
"provider":{
"href":"http://localhost/pacticipants/provider-name",
"title":"provider-name",
"name":"DEPRECATED - please use the pb:provider relation"
},
"pacts":[
{"href":"http://localhost/pacts/provider/provider-name/consumer/consumer-name/version/1.0.0",
"title":"DEPRECATED - please use the pb:pacts relation. Pact between consumer-name (v1.0.0) and provider-name",
"name":"consumer-name"
}
]
}
}
And the test run results in the following:
[ERROR] Tests run: 1, Failures: 0, Errors: 1, Skipped: 0, Time elapsed: 9.758 s
FAILURE! - in com.tis.payment.mapper.PaymentMapperApplicationTests
[ERROR] pactVerificationTestTemplate{PactVerificationContext}
Time elapsed: 9.752 s
ERROR!
org.junit.platform.commons.util.PreconditionViolationException:
No supporting TestTemplateInvocationContextProvider provided an invocation context
[INFO]
[INFO] Results:
[INFO]
[ERROR] Errors:
[ERROR] PaymentMapperApplicationTests.pactVerificationTestTemplate » PreconditionViolation
[INFO]
[ERROR] Tests run: 1, Failures: 0, Errors: 1, Skipped: 0
Since using the local pact file makes the tests green, I suppose the reason is not in the code of my test class, but in case it is helpful, here it is:
@ExtendWith(SpringExtension.class)
@SpringBootTest(webEnvironment = SpringBootTest.WebEnvironment.DEFINED_PORT,
        properties = "server.port=8082")
@Provider("provider-name")
@PactBroker(host = "localhost", port = "80", tags = "latest")
//@PactFolder("target/pacts") // uncomment to use local pact files
public class ApplicationTests {
    @MockBean
    private ProviderServiceClient providerServiceClient;
    @BeforeEach
    void setupTestTarget(PactVerificationContext context) {
        context.setTarget(new HttpTestTarget("localhost", 8082, "/"));
    }
    @TestTemplate
    @ExtendWith(PactVerificationInvocationContextProvider.class)
    void pactVerificationTestTemplate(PactVerificationContext context) {
        context.verifyInteraction();
    }
    @State({"valid payment file"})
    public void toValid() {
        ServiceResponse response = new ServiceResponse();
        response.setBatchId("test");
        response.setId(1L);
        when(providerServiceClient.save(any())).thenReturn(response);
    }
    @State({"invalid payment file"})
    public void toInvalid() {
    }
}
Since using local pact files is not an option, I really wonder how to fix this error and would be grateful for any helpful comments.
Maven Pact dependencies:
<dependency>
    <groupId>au.com.dius</groupId>
    <artifactId>pact-jvm-model</artifactId>
    <version>3.5.22</version>
    <scope>test</scope>
</dependency>
<dependency>
    <groupId>au.com.dius</groupId>
    <artifactId>pact-jvm-provider-junit5_2.12</artifactId>
    <version>3.5.22</version>
    <scope>test</scope>
</dependency>
<dependency>
    <groupId>au.com.dius</groupId>
    <artifactId>pact-jvm-consumer-junit5_2.12</artifactId>
    <version>3.5.22</version>
    <scope>test</scope>
</dependency>
Maven plugin to publish the consumer's pacts:
<plugin>
    <groupId>au.com.dius</groupId>
    <artifactId>pact-jvm-provider-maven_2.12</artifactId>
    <version>3.5.22</version>
    <configuration>
        <pactBrokerUrl>http://localhost:80</pactBrokerUrl>
        <trimSnapshot>true</trimSnapshot>
        <!-- Defaults to false -->
    </configuration>
</plugin>
The Pact Broker docker-compose.yml:
version: '2'
services:
  postgres:
    image: postgres
    restart: always
    # healthcheck:
    #   test: psql postgres --command "select 1" -U postgres
    ports:
      - "5432:5432"
    environment:
      POSTGRES_USER: postgres
      POSTGRES_PASSWORD: password
      POSTGRES_DB: postgres
  broker_app:
    image: dius/pact-broker
    depends_on:
      - postgres
    ports:
      - "80:80"
    links:
      - postgres
    environment:
      PACT_BROKER_DATABASE_USERNAME: postgres
      PACT_BROKER_DATABASE_PASSWORD: password
      PACT_BROKER_DATABASE_HOST: postgres
      PACT_BROKER_DATABASE_NAME: postgres
The JUnit 5 error org.junit.platform.commons.util.PreconditionViolationException: No supporting TestTemplateInvocationContextProvider provided an invocation context means that no invocation context was provided, so the templated test method could not be invoked. This is most likely because there were no pacts to verify (each pact results in an invocation context).
Now to address the actual issue of why you are not getting any pacts to verify from the broker. The Pact Broker is essentially a repository, and the JUnit 5 verification framework uses all the annotations on the test class to build a query to send to the Pact Broker. This query is not returning any pacts, so there must be a mismatch somewhere.
The only thing I can see from the information you have provided is that the URL "http://localhost/pacts/provider/provider- name/latest" in the JSON has an issue (there is whitespace in the provider name). If that is not just a formatting issue with SO, then it won't match (the broker will probably return a 404 for that URL).
If that is not the issue, then check that when you run the verification from Maven you can access the broker in the same way the test framework does. Enabling DEBUG level logging will show you all the requests being made. Use something like curl and try the same requests to see what you get.
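For example, you could request the 'latest pacts' link from the JSON above directly and compare (the provider name must match exactly what the broker has registered):
curl -v http://localhost/pacts/provider/provider-name/latest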

java.lang.ClassNotFoundException: com.google.api.client.json.JsonFactory

I want to create a web application running on Tomcat 7.0 with JRE JavaSE-1.6 on OS X 10.8.
I am using a tutorial from the developers' site, and the error occurs when I try to call
clientSecrets = GoogleClientSecrets.load(new JacksonFactory(), reader);
I added the JAR google-http-client-jackson-1.16.0-rc.jar to my build path and still get the following error:
java.lang.NoClassDefFoundError: com/google/api/client/json/JsonFactory
java.lang.ClassNotFoundException: com.google.api.client.json.JsonFactory
My classpath specifically points to this JAR too.
I just fixed this by changing the following lines:
Original: import com.google.api.client.json.jackson.JacksonFactory;
Modified: import com.google.api.client.json.jackson2.JacksonFactory;
I had a similar problem and solved it by manually adding the required JARs to my WEB-INF\lib folder outside Eclipse.
From this page here, it says you need 3 libraries:
1) The Generated Java client library for BigQuery
2) The Google HTTP Client Library for Java
3) The Google OAuth Client Library for Java
Do you have them all? It sounds like you have the Jackson add-on from #2, but you may be missing the core Google HTTP client library.
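If the project is (or can be) Maven-based, a sketch of pulling in the core HTTP client would be something like the following (the version is an assumption, matched to the Jackson JAR mentioned above); otherwise, drop the corresponding google-http-client JAR into WEB-INF/lib next to the Jackson one:
<dependency>
    <groupId>com.google.http-client</groupId>
    <artifactId>google-http-client</artifactId>
    <version>1.16.0-rc</version>
</dependency>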
1. In your POM, add the dependencies below:
<dependency>
    <groupId>com.google.api-client</groupId>
    <artifactId>google-api-client-gson</artifactId>
    <version>2.1.1</version>
</dependency>
<dependency>
    <groupId>com.google.oauth-client</groupId>
    <artifactId>google-oauth-client-jetty</artifactId>
    <version>1.34.1</version>
</dependency>
Note: the versions in the above dependencies can be changed.
2. Import statements:
import com.google.api.client.json.JsonFactory;
import com.google.api.client.json.gson.GsonFactory;
3. Usage:
private static final JsonFactory JSON_FACTORY = GsonFactory.getDefaultInstance();
GoogleClientSecrets clientSecrets = GoogleClientSecrets.load(JSON_FACTORY, new InputStreamReader(in));

RabbitMQ tutorials code not working

I am in the process of learning RabbitMQ. I started with the basic RabbitMQ tutorials on their website; unfortunately, I am not able to compile them due to the following errors:
ConnectionFactory factory = new ConnectionFactory();
factory.setHost("localhost");
Error: "The method newConnection(Address[]) in the type ConnectionFactory is not applicable for the arguments ()"
Connection connection = factory.newConnection();
Error: The method newConnection(Address[]) in the type ConnectionFactory is not applicable for the arguments ()
The maven dependency I have is:
<dependency>
    <groupId>com.rabbitmq</groupId>
    <artifactId>rabbitmq-client</artifactId>
    <version>0.9.1</version>
</dependency>
What exactly am I doing wrong here? Any help would be appreciated!
Thanks!
Your tutorial seems to be "old"; try a more up-to-date version. The current release of the RabbitMQ Java AMQP library is 3.1.3. But also have a look at the Maven Repository and try version 3.1.1, the newest version in the mvn repo.
<dependency>
    <groupId>com.rabbitmq</groupId>
    <artifactId>amqp-client</artifactId>
    <version>3.1.1</version>
</dependency>