I am writing an application that processes incoming coordinate records and monitors them for entry into or exit from defined geofences. As an intermediate step, I maintain state on whether the current coordinate is inside or outside a specific geofence, using an aggregation.
Below is the stack trace of the error while running this:
Exception in thread "geofence-events-990be707-d44d-410d-a8a7-b2a3b90cb84f-StreamThread-1" org.apache.kafka.streams.errors.StreamsException: Exception caught in process. taskId=0_0, processor=KSTREAM-SOURCE-0000000003, topic=vehicle-sensor-data, partition=0, offset=0
at org.apache.kafka.streams.processor.internals.StreamTask.process(StreamTask.java:367)
at org.apache.kafka.streams.processor.internals.AssignedStreamsTasks.process(AssignedStreamsTasks.java:104)
at org.apache.kafka.streams.processor.internals.TaskManager.process(TaskManager.java:413)
at org.apache.kafka.streams.processor.internals.StreamThread.runOnce(StreamThread.java:862)
at org.apache.kafka.streams.processor.internals.StreamThread.runLoop(StreamThread.java:777)
at org.apache.kafka.streams.processor.internals.StreamThread.run(StreamThread.java:747)
Caused by: org.apache.kafka.streams.errors.StreamsException: A serializer (key: org.apache.kafka.common.serialization.StringSerializer / value: org.apache.kafka.common.serialization.StringSerializer) is not compatible to the actual key or value type (key type: java.lang.String / value type: com.loconav.GeoFenceDTO). Change the default Serdes in StreamConfig or provide correct Serdes via method parameters.
at org.apache.kafka.streams.processor.internals.SinkNode.process(SinkNode.java:94)
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:146)
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:129)
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:93)
at org.apache.kafka.streams.kstream.internals.KStreamFilter$KStreamFilterProcessor.process(KStreamFilter.java:43)
at org.apache.kafka.streams.processor.internals.ProcessorNode.process(ProcessorNode.java:115)
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:146)
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:129)
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:93)
at org.apache.kafka.streams.kstream.internals.KStreamMap$KStreamMapProcessor.process(KStreamMap.java:42)
at org.apache.kafka.streams.processor.internals.ProcessorNode.process(ProcessorNode.java:115)
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:146)
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:129)
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:93)
at org.apache.kafka.streams.kstream.internals.KStreamPeek$KStreamPeekProcessor.process(KStreamPeek.java:44)
at org.apache.kafka.streams.processor.internals.ProcessorNode.process(ProcessorNode.java:115)
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:146)
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:129)
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:93)
at org.apache.kafka.streams.kstream.internals.KStreamFlatMapValues$KStreamFlatMapValuesProcessor.process(KStreamFlatMapValues.java:42)
at org.apache.kafka.streams.processor.internals.ProcessorNode.process(ProcessorNode.java:115)
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:146)
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:129)
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:93)
at org.apache.kafka.streams.kstream.internals.KStreamKTableJoinProcessor.process(KStreamKTableJoinProcessor.java:73)
at org.apache.kafka.streams.processor.internals.ProcessorNode.process(ProcessorNode.java:115)
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:146)
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:129)
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:93)
at org.apache.kafka.streams.kstream.internals.KStreamFlatMapValues$KStreamFlatMapValuesProcessor.process(KStreamFlatMapValues.java:42)
at org.apache.kafka.streams.processor.internals.ProcessorNode.process(ProcessorNode.java:115)
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:146)
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:129)
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:93)
at org.apache.kafka.streams.kstream.internals.KStreamMap$KStreamMapProcessor.process(KStreamMap.java:42)
at org.apache.kafka.streams.processor.internals.ProcessorNode.process(ProcessorNode.java:115)
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:146)
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:129)
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:93)
at org.apache.kafka.streams.processor.internals.SourceNode.process(SourceNode.java:84)
at org.apache.kafka.streams.processor.internals.StreamTask.process(StreamTask.java:351)
... 5 more
Caused by: java.lang.ClassCastException: class com.loconav.GeoFenceDTO cannot be cast to class java.lang.String (com.loconav.GeoFenceDTO is in unnamed module of loader 'app'; java.lang.String is in module java.base of loader 'bootstrap')
at org.apache.kafka.common.serialization.StringSerializer.serialize(StringSerializer.java:28)
at org.apache.kafka.common.serialization.Serializer.serialize(Serializer.java:60)
at org.apache.kafka.streams.processor.internals.RecordCollectorImpl.send(RecordCollectorImpl.java:157)
at org.apache.kafka.streams.processor.internals.RecordCollectorImpl.send(RecordCollectorImpl.java:101)
at org.apache.kafka.streams.processor.internals.SinkNode.process(SinkNode.java:89)
... 45 more
I added a peek after the flatMapValues: the first record gets printed, and then this exception occurs. Below is the topology code:
vehicleGeoFences
    .flatMapValues(
        (readOnlyKey, value) -> {
            List<GeoFenceDTO> geoFenceDTOS = new ArrayList<>();
            // some logic
            return geoFenceDTOS;
        })
    .groupBy((key, value) -> value.getGeofenceId() + "," + value.getVehicleId())
    .aggregate(
        GeoFenceDTO::new,
        (key, value, aggregate) -> {
            // setting values on the aggregate
            return aggregate;
        },
        Materialized.with(Serdes.String(), getGeofenceDTOValueSerde()))
    .toStream()
    .mapValues(
        (readOnlyKey, value) -> {
            // changing object
            return value;
        })
    .to(
        Topics.VEHICLE_EVENTS.name(),
        Produced.with(vehicleEventsKeySerde, vehicleEventsValueSerde));
I have written a custom Serde for GeoFenceDTO that uses a Jackson ObjectMapper to convert the object to bytes and vice versa. Somehow it looks like the parameters provided in Materialized are not being picked up. The application's default key and value Serde is StringSerde.
Help appreciated. TIA. I tried changing the custom object to an Avro-generated class, but the same error persisted.
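For what it's worth, the stack trace points at a repartition sink (the SinkNode fed by the KStreamMap/KStreamFilter nodes that groupBy inserts), not at the state store, so Materialized never gets a chance to apply: the internal repartition topic created by groupBy falls back to the application's default StringSerde unless grouping serdes are supplied. A minimal sketch of that fix, assuming Kafka Streams 2.1+ (older releases use Serialized.with instead of Grouped.with):
// Sketch: give the repartition topic explicit serdes so GeoFenceDTO values
// are not handed to the default StringSerializer. expandGeofences() is a
// placeholder for the elided flatMapValues logic.
vehicleGeoFences
    .flatMapValues((readOnlyKey, value) -> expandGeofences(value))
    .groupBy(
        (key, value) -> value.getGeofenceId() + "," + value.getVehicleId(),
        Grouped.with(Serdes.String(), getGeofenceDTOValueSerde()))
    .aggregate(
        GeoFenceDTO::new,
        (key, value, aggregate) -> aggregate, // aggregation logic elided
        Materialized.with(Serdes.String(), getGeofenceDTOValueSerde()));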
Trying to write a simple POC using Apache Beam and Hive:
public static void main(String[] args) {
    PipelineOptions options = PipelineOptionsFactory
        .fromArgs(args)
        .withValidation()
        .as(PVAOptions.class);
    Pipeline p = Pipeline.create(options);
    p
        .apply(TextIO.read().from("src/test/resources/words.txt"))
        .apply(ParDo.of(new PukeHive()))
        .apply(HCatalogIO.write()
            .withBatchSize(100)
            .withConfigProperties(getHiveConfigProperties())
            .withTable(getHiveTable()));
    p.run().waitUntilFinish();
}
static class PukeHive extends DoFn<String, HCatRecord> {
    @ProcessElement
    public void processElement(ProcessContext c) throws IOException {
        DefaultHCatRecord rec = new DefaultHCatRecord(1);
        rec.set(0, c.element());
        c.output(rec);
    }
}
This results in the following exception. Debugging reveals that this happens because Beam's WritableCoder tries to call newInstance() on the abstract class HCatRecord.
org.apache.beam.sdk.Pipeline$PipelineExecutionException: java.lang.RuntimeException: org.apache.beam.sdk.coders.CoderException: unable to deserialize record
at org.apache.beam.runners.direct.DirectRunner$DirectPipelineResult.waitUntilFinish (DirectRunner.java:349)
at org.apache.beam.runners.direct.DirectRunner$DirectPipelineResult.waitUntilFinish (DirectRunner.java:319)
at org.apache.beam.runners.direct.DirectRunner.run (DirectRunner.java:210)
at org.apache.beam.runners.direct.DirectRunner.run (DirectRunner.java:66)
at org.apache.beam.sdk.Pipeline.run (Pipeline.java:311)
at org.apache.beam.sdk.Pipeline.run (Pipeline.java:297)
at com.comp.beam.Main.main (Main.java:48)
at sun.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke (NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke (DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke (Method.java:498)
at org.codehaus.mojo.exec.ExecJavaMojo$1.run (ExecJavaMojo.java:282)
at java.lang.Thread.run (Thread.java:748)
Caused by: java.lang.RuntimeException: org.apache.beam.sdk.coders.CoderException: unable to deserialize record
at org.apache.beam.runners.direct.ImmutabilityCheckingBundleFactory$ImmutabilityEnforcingBundle.add (ImmutabilityCheckingBundleFactory.java:114)
at org.apache.beam.runners.direct.ParDoEvaluator$BundleOutputManager.output (ParDoEvaluator.java:242)
at org.apache.beam.repackaged.beam_runners_direct_java.runners.core.SimpleDoFnRunner.outputWindowedValue (SimpleDoFnRunner.java:219)
at org.apache.beam.repackaged.beam_runners_direct_java.runners.core.SimpleDoFnRunner.access$700 (SimpleDoFnRunner.java:69)
at org.apache.beam.repackaged.beam_runners_direct_java.runners.core.SimpleDoFnRunner$DoFnProcessContext.output (SimpleDoFnRunner.java:517)
at org.apache.beam.repackaged.beam_runners_direct_java.runners.core.SimpleDoFnRunner$DoFnProcessContext.output (SimpleDoFnRunner.java:505)
at com.comp.beam.Main$PukeHive.processElement (Main.java:61)
Caused by: org.apache.beam.sdk.coders.CoderException: unable to deserialize record
at org.apache.beam.sdk.io.hadoop.WritableCoder.decode (WritableCoder.java:92)
at org.apache.beam.sdk.io.hadoop.WritableCoder.decode (WritableCoder.java:54)
at org.apache.beam.sdk.coders.Coder.decode (Coder.java:170)
at org.apache.beam.sdk.util.CoderUtils.decodeFromSafeStream (CoderUtils.java:122)
at org.apache.beam.sdk.util.CoderUtils.decodeFromByteArray (CoderUtils.java:105)
at org.apache.beam.sdk.util.CoderUtils.decodeFromByteArray (CoderUtils.java:99)
at org.apache.beam.sdk.util.CoderUtils.clone (CoderUtils.java:148)
at org.apache.beam.sdk.util.MutationDetectors$CodedValueMutationDetector.<init> (MutationDetectors.java:117)
at org.apache.beam.sdk.util.MutationDetectors.forValueWithCoder (MutationDetectors.java:46)
at org.apache.beam.runners.direct.ImmutabilityCheckingBundleFactory$ImmutabilityEnforcingBundle.add (ImmutabilityCheckingBundleFactory.java:112)
at org.apache.beam.runners.direct.ParDoEvaluator$BundleOutputManager.output (ParDoEvaluator.java:242)
at org.apache.beam.repackaged.beam_runners_direct_java.runners.core.SimpleDoFnRunner.outputWindowedValue (SimpleDoFnRunner.java:219)
at org.apache.beam.repackaged.beam_runners_direct_java.runners.core.SimpleDoFnRunner.access$700 (SimpleDoFnRunner.java:69)
at org.apache.beam.repackaged.beam_runners_direct_java.runners.core.SimpleDoFnRunner$DoFnProcessContext.output (SimpleDoFnRunner.java:517)
at org.apache.beam.repackaged.beam_runners_direct_java.runners.core.SimpleDoFnRunner$DoFnProcessContext.output (SimpleDoFnRunner.java:505)
at com.comp.beam.Main$PukeHive.processElement (Main.java:61)
at com.comp.beam.Main$PukeHive$DoFnInvoker.invokeProcessElement (Unknown Source)
at org.apache.beam.repackaged.beam_runners_direct_java.runners.core.SimpleDoFnRunner.invokeProcessElement (SimpleDoFnRunner.java:185)
at org.apache.beam.repackaged.beam_runners_direct_java.runners.core.SimpleDoFnRunner.processElement (SimpleDoFnRunner.java:149)
at org.apache.beam.repackaged.beam_runners_direct_java.runners.core.SimplePushbackSideInputDoFnRunner.processElementInReadyWindows (SimplePushbackSideInputDoFnRunner.java:78)
at org.apache.beam.runners.direct.ParDoEvaluator.processElement (ParDoEvaluator.java:189)
at org.apache.beam.runners.direct.DoFnLifecycleManagerRemovingTransformEvaluator.processElement (DoFnLifecycleManagerRemovingTransformEvaluator.java:55)
at org.apache.beam.runners.direct.DirectTransformExecutor.processElements (DirectTransformExecutor.java:161)
at org.apache.beam.runners.direct.DirectTransformExecutor.run (DirectTransformExecutor.java:125)
at java.util.concurrent.Executors$RunnableAdapter.call (Executors.java:511)
at java.util.concurrent.FutureTask.run (FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker (ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run (ThreadPoolExecutor.java:624)
at java.lang.Thread.run (Thread.java:748)
Caused by: java.lang.InstantiationException
at sun.reflect.InstantiationExceptionConstructorAccessorImpl.newInstance (InstantiationExceptionConstructorAccessorImpl.java:48)
at java.lang.reflect.Constructor.newInstance (Constructor.java:423)
at org.apache.beam.sdk.io.hadoop.WritableCoder.decode (WritableCoder.java:85)
at org.apache.beam.sdk.io.hadoop.WritableCoder.decode (WritableCoder.java:54)
at org.apache.beam.sdk.coders.Coder.decode (Coder.java:170)
at org.apache.beam.sdk.util.CoderUtils.decodeFromSafeStream (CoderUtils.java:122)
at org.apache.beam.sdk.util.CoderUtils.decodeFromByteArray (CoderUtils.java:105)
at org.apache.beam.sdk.util.CoderUtils.decodeFromByteArray (CoderUtils.java:99)
at org.apache.beam.sdk.util.CoderUtils.clone (CoderUtils.java:148)
at org.apache.beam.sdk.util.MutationDetectors$CodedValueMutationDetector.<init> (MutationDetectors.java:117)
at org.apache.beam.sdk.util.MutationDetectors.forValueWithCoder (MutationDetectors.java:46)
at org.apache.beam.runners.direct.ImmutabilityCheckingBundleFactory$ImmutabilityEnforcingBundle.add (ImmutabilityCheckingBundleFactory.java:112)
at org.apache.beam.runners.direct.ParDoEvaluator$BundleOutputManager.output (ParDoEvaluator.java:242)
at org.apache.beam.repackaged.beam_runners_direct_java.runners.core.SimpleDoFnRunner.outputWindowedValue (SimpleDoFnRunner.java:219)
at org.apache.beam.repackaged.beam_runners_direct_java.runners.core.SimpleDoFnRunner.access$700 (SimpleDoFnRunner.java:69)
at org.apache.beam.repackaged.beam_runners_direct_java.runners.core.SimpleDoFnRunner$DoFnProcessContext.output (SimpleDoFnRunner.java:517)
at org.apache.beam.repackaged.beam_runners_direct_java.runners.core.SimpleDoFnRunner$DoFnProcessContext.output (SimpleDoFnRunner.java:505)
at com.comp.beam.Main$PukeHive.processElement (Main.java:61)
at com.comp.beam.Main$PukeHive$DoFnInvoker.invokeProcessElement (Unknown Source)
at org.apache.beam.repackaged.beam_runners_direct_java.runners.core.SimpleDoFnRunner.invokeProcessElement (SimpleDoFnRunner.java:185)
at org.apache.beam.repackaged.beam_runners_direct_java.runners.core.SimpleDoFnRunner.processElement (SimpleDoFnRunner.java:149)
at org.apache.beam.repackaged.beam_runners_direct_java.runners.core.SimplePushbackSideInputDoFnRunner.processElementInReadyWindows (SimplePushbackSideInputDoFnRunner.java:78)
at org.apache.beam.runners.direct.ParDoEvaluator.processElement (ParDoEvaluator.java:189)
at org.apache.beam.runners.direct.DoFnLifecycleManagerRemovingTransformEvaluator.processElement (DoFnLifecycleManagerRemovingTransformEvaluator.java:55)
at org.apache.beam.runners.direct.DirectTransformExecutor.processElements (DirectTransformExecutor.java:161)
at org.apache.beam.runners.direct.DirectTransformExecutor.run (DirectTransformExecutor.java:125)
at java.util.concurrent.Executors$RunnableAdapter.call (Executors.java:511)
at java.util.concurrent.FutureTask.run (FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker (ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run (ThreadPoolExecutor.java:624)
at java.lang.Thread.run (Thread.java:748)
How can I feed my data into Hive using Beam?
I believe you need to register the coder for HCatRecord, which would be:
Pipeline p = Pipeline.create(options);
p.getCoderRegistry()
    .registerCoderForClass(HCatRecord.class, WritableCoder.of(DefaultHCatRecord.class));
To test this, I added the following to the HCatalogIOTest class of the Beam project. It uses a different schema, but it should demonstrate a complete example:
@Test
@NeedsEmptyTestTables
public void testSOKricket() {
    // Register the coder
    defaultPipeline
        .getCoderRegistry()
        .registerCoderForClass(HCatRecord.class, WritableCoder.of(DefaultHCatRecord.class));
    defaultPipeline
        .apply(TextIO.read().from("/tmp/words.txt"))
        .apply(ParDo.of(new PukeHive()))
        .apply(
            HCatalogIO.write()
                .withConfigProperties(getConfigPropertiesAsMap(service.getHiveConf()))
                .withDatabase(TEST_DATABASE)
                .withTable(TEST_TABLE)
                .withPartition(new java.util.HashMap<>())
                .withBatchSize(1L));
    defaultPipeline.run();
}

static class PukeHive extends DoFn<String, HCatRecord> {
    @ProcessElement
    public void processElement(ProcessContext c) throws Exception {
        // our test schema is (mycol1 string, mycol2 int)
        DefaultHCatRecord rec = new DefaultHCatRecord(2);
        rec.set(0, c.element());
        rec.set(1, 1);
        c.output(rec);
    }
}
I have a cache that stores BinaryObject values in a cluster (2 nodes). The Ignite version is 2.1.0.
If I don't use any StreamReceiver (including StreamTransformer), there is no problem adding lots of BinaryObject data with the following code:
IgniteDataStreamer<Long, BinaryObject> ds = ignite.dataStreamer(CACHE_NAME);
// builder is created once up front; the type name here is illustrative
BinaryObjectBuilder builder = ignite.binary().builder("Person");
SecureRandom random = new SecureRandom();
long i = 0;
long count = 1000000;
while (i++ < count) {
    builder.setField("id", i);
    builder.setField("name", "Test" + i);
    builder.setField("age", random.nextInt(30));
    builder.setField("score", random.nextDouble() * 100d);
    builder.setField("birthday", new Date());
    ds.addData(i, builder.build());
    if (i % 10000 == 0) {
        System.out.println(i + " added...");
    }
}
But now I want to modify my BinaryObject values before adding them, so I tried a StreamTransformer like this:
ds.receiver(new StreamTransformer<Long, BinaryObject>() {
    @Override
    public Object process(MutableEntry<Long, BinaryObject> entry, Object... arguments)
            throws EntryProcessorException {
        Long key = entry.getKey();
        BinaryObject value = entry.getValue();
        BinaryObjectBuilder builder = value.toBuilder();
        // want to change the value of the "name" field
        builder.setField("name", "Modify" + builder.getField("name"));
        entry.setValue(builder.build());
        return null;
    }
});
while (...) {
    // ... original code that builds the BinaryObject data and calls ds.addData
}
Unfortunately, the following exceptions occurred:
[09:52:36] Topology snapshot [ver=61, servers=2, clients=0, CPUs=8, heap=2.7GB]
10000 added...
20000 added...
30000 added...
40000 added...
[09:52:39,174][SEVERE][data-streamer-#54%null%][DataStreamerImpl] DataStreamer operation failed.
class org.apache.ignite.IgniteCheckedException: Failed to finish operation (too many remaps): 32
at org.apache.ignite.internal.processors.datastreamer.DataStreamerImpl$5.apply(DataStreamerImpl.java:869)
at org.apache.ignite.internal.processors.datastreamer.DataStreamerImpl$5.apply(DataStreamerImpl.java:834)
at org.apache.ignite.internal.util.future.GridFutureAdapter.notifyListener(GridFutureAdapter.java:382)
at org.apache.ignite.internal.util.future.GridFutureAdapter.unblock(GridFutureAdapter.java:346)
at org.apache.ignite.internal.util.future.GridFutureAdapter.unblockAll(GridFutureAdapter.java:334)
at org.apache.ignite.internal.util.future.GridFutureAdapter.onDone(GridFutureAdapter.java:494)
at org.apache.ignite.internal.util.future.GridFutureAdapter.onDone(GridFutureAdapter.java:473)
at org.apache.ignite.internal.util.future.GridFutureAdapter.onDone(GridFutureAdapter.java:461)
at org.apache.ignite.internal.processors.datastreamer.DataStreamerImpl$Buffer$2.apply(DataStreamerImpl.java:1572)
at org.apache.ignite.internal.processors.datastreamer.DataStreamerImpl$Buffer$2.apply(DataStreamerImpl.java:1562)
at org.apache.ignite.internal.util.future.GridFutureAdapter.notifyListener(GridFutureAdapter.java:382)
at org.apache.ignite.internal.util.future.GridFutureAdapter.unblock(GridFutureAdapter.java:346)
at org.apache.ignite.internal.util.future.GridFutureAdapter.unblockAll(GridFutureAdapter.java:334)
at org.apache.ignite.internal.util.future.GridFutureAdapter.onDone(GridFutureAdapter.java:494)
at org.apache.ignite.internal.util.future.GridFutureAdapter.onDone(GridFutureAdapter.java:473)
at org.apache.ignite.internal.util.future.GridFutureAdapter.onDone(GridFutureAdapter.java:461)
at org.apache.ignite.internal.processors.closure.GridClosureProcessor$2.body(GridClosureProcessor.java:967)
at org.apache.ignite.internal.util.worker.GridWorker.run(GridWorker.java:110)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: class org.apache.ignite.IgniteCheckedException: Unknown pair [platformId=0, typeId=-1496463502]
at org.apache.ignite.internal.util.IgniteUtils.cast(IgniteUtils.java:7229)
... 5 more
Caused by: java.lang.ClassNotFoundException: Unknown pair [platformId=0, typeId=-1496463502]
at org.apache.ignite.internal.MarshallerContextImpl.getClassName(MarshallerContextImpl.java:392)
at org.apache.ignite.internal.MarshallerContextImpl.getClass(MarshallerContextImpl.java:342)
at org.apache.ignite.internal.binary.BinaryContext.descriptorForTypeId(BinaryContext.java:686)
at org.apache.ignite.internal.binary.BinaryReaderExImpl.deserialize0(BinaryReaderExImpl.java:1755)
at org.apache.ignite.internal.binary.BinaryReaderExImpl.deserialize(BinaryReaderExImpl.java:1714)
at org.apache.ignite.internal.binary.BinaryObjectImpl.deserializeValue(BinaryObjectImpl.java:797)
at org.apache.ignite.internal.binary.BinaryObjectImpl.value(BinaryObjectImpl.java:143)
at org.apache.ignite.internal.processors.cache.CacheObjectUtils.unwrapBinary(CacheObjectUtils.java:161)
at org.apache.ignite.internal.processors.cache.CacheObjectUtils.unwrapBinaryIfNeeded(CacheObjectUtils.java:41)
at org.apache.ignite.internal.processors.cache.CacheObjectContext.unwrapBinaryIfNeeded(CacheObjectContext.java:125)
at org.apache.ignite.internal.processors.datastreamer.DataStreamerEntry$1.getValue(DataStreamerEntry.java:96)
at org.apache.ignite.stream.StreamTransformer.receive(StreamTransformer.java:45)
at org.apache.ignite.internal.processors.datastreamer.DataStreamerUpdateJob.call(DataStreamerUpdateJob.java:137)
at org.apache.ignite.internal.util.IgniteUtils.wrapThreadLoader(IgniteUtils.java:6608)
at org.apache.ignite.internal.processors.closure.GridClosureProcessor$2.body(GridClosureProcessor.java:959)
... 4 more
What should I do to fix it?
You get the ClassNotFoundException because the DataStreamer internally tries to deserialize the stored BinaryObject. To make it work with BinaryObjects directly, invoke ds.keepBinary(true) before using the streamer.
Another problem in your code is the way you use the result of entry.getValue(). The entry passed to the process method represents the record previously stored in the cache, so you will most likely get a null value there. If you want the newly streamed value, use arguments[0] instead.
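Putting both fixes together, a minimal sketch of the corrected receiver under those assumptions (the new value arriving as arguments[0], everything kept in binary form):
IgniteDataStreamer<Long, BinaryObject> ds = ignite.dataStreamer(CACHE_NAME);
ds.keepBinary(true); // keep values as BinaryObject; no deserialization on the receiving node
ds.receiver(new StreamTransformer<Long, BinaryObject>() {
    @Override
    public Object process(MutableEntry<Long, BinaryObject> entry, Object... arguments)
            throws EntryProcessorException {
        // entry.getValue() is the value already stored in the cache (likely null here);
        // the newly streamed value is passed in as arguments[0].
        BinaryObject value = (BinaryObject) arguments[0];
        BinaryObjectBuilder builder = value.toBuilder();
        builder.setField("name", "Modify" + builder.<String>getField("name"));
        entry.setValue(builder.build());
        return null;
    }
});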
I'm writing a stream UDF for Aerospike 3.6.2, and I'd like to put some code in a separate Lua module. I followed the example exactly and created a file mymodule.lua with the following contents:
local exports = {}

function exports.one()
    return 1
end

function exports.two()
    return 2
end

return exports
and put my UDF in a file testUdf.lua:
local MM = require('mymodule')

local function three()
    return MM.one() + MM.two()
end

function testUdf(stream)
    local type = three()
    local testFilter = function(record)
        return record.campaignType == type
    end
    return stream : filter(testFilter)
end
I register both modules and execute a query from the Java client:
LuaConfig.SourceDirectory = this.udfPath;
List<RegisterTask> tasks = new ArrayList<>();
for (String udfName : new String[] { "mymodule.lua", "testUdf.lua" }) {
    File udf = new File(this.udfPath, udfName);
    tasks.add(this.aerospike.register(null, udf.getPath(), udfName, Language.LUA));
}
tasks.stream().forEach(RegisterTask::waitTillComplete);

Statement stmt = new Statement();
stmt.setNamespace(getRawFactNamespace());
stmt.setSetName(SET_FACT);
stmt.setFilters(Filter.equal(BIN_PRODUCT_CODE, "UX"));
stmt.setBinNames(FACT_BINS);

int count = 0;
try (ResultSet rs = this.aerospike.queryAggregate(null, stmt, "testUdf", "testUdf")) {
    while (rs.next()) {
        count++;
    }
}
I can see the Lua files, with the correct contents, on both of my test Aerospike servers in aerospike/var/udf/lua, which is where mod-lua.user-path points in aerospike.conf. I logged package.path, and it includes the aerospike/var/udf/lua directory as well.
But when I invoke my UDF, I get the following error:
Exception in thread "main" com.aerospike.client.AerospikeException: org.luaj.vm2.LuaError: testUdf:1 module 'mymodule' not found: mymodule
no field package.preload['mymodule']
mymodule.lua
no class 'mymodule'
stack traceback:
testUdf:1: in main chunk
[Java]: in ?
at com.aerospike.client.query.QueryExecutor.checkForException(QueryExecutor.java:122)
at com.aerospike.client.query.ResultSet.next(ResultSet.java:78)
...
Caused by: org.luaj.vm2.LuaError: testUdf:1 module 'mymodule' not found: mymodule
no field package.preload['mymodule']
mymodule.lua
no class 'mymodule'
stack traceback:
testUdf:1: in main chunk
[Java]: in ?
at org.luaj.vm2.LuaValue.error(Unknown Source)
at org.luaj.vm2.lib.PackageLib$require.call(Unknown Source)
at org.luaj.vm2.LuaClosure.execute(Unknown Source)
at org.luaj.vm2.LuaClosure.onInvoke(Unknown Source)
at org.luaj.vm2.LuaClosure.invoke(Unknown Source)
at org.luaj.vm2.LuaValue.invoke(Unknown Source)
at com.aerospike.client.lua.LuaInstance.loadPackage(LuaInstance.java:113)
at com.aerospike.client.query.QueryAggregateExecutor.runThreads(QueryAggregateExecutor.java:92)
at com.aerospike.client.query.QueryAggregateExecutor.run(QueryAggregateExecutor.java:77)
What am I doing wrong?
I have created a custom rule plugin using the custom rule code snippet. Now I have added another rule that should report an issue when an anonymous class has more than 30 lines.
I built the custom rule plugin jar with the mvn package command, put it in the Sonar server's extensions/plugins directory, and restarted the server.
My rule is visible in the Sonar server, but it does not flag anything, even though the analyzed code contains an anonymous class with more than 30 lines. My custom rule is given below:
@Rule(
    key = AnonymousClassesTooBigCheck.KEY,
    name = "Avoid too big classes",
    description = "Avoid too big classes",
    tags = {"brain-overload"},
    priority = Priority.MAJOR)
@BelongsToProfile(title = "Sonar way", priority = Priority.MAJOR)
public class AnonymousClassesTooBigCheck extends BaseTreeVisitor implements JavaFileScanner {

    public static final String KEY = "AD0001";
    private static final RuleKey RULE_KEY = RuleKey.of(MyJavaRulesDefinition.REPOSITORY_KEY, KEY);
    private static final int DEFAULT_MAX = 30;

    @RuleProperty(defaultValue = "" + DEFAULT_MAX)
    public int max = DEFAULT_MAX;

    private JavaFileScannerContext context;

    /**
     * Flag to skip the check for class bodies of enum constants.
     */
    private boolean isEnumConstantBody;

    @Override
    public void scanFile(JavaFileScannerContext context) {
        this.context = context;
        isEnumConstantBody = false;
        scan(context.getTree());
    }

    @Override
    public void visitNewClass(NewClassTree tree) {
        if (tree.classBody() != null && !isEnumConstantBody) {
            int lines = getNumberOfLines(tree.classBody());
            if (lines > max) {
                context.addIssue(tree, RULE_KEY, "Reduce this anonymous class number of lines from " + lines + " to at most " + max + ", or make it a named class.");
            }
        }
        isEnumConstantBody = false;
        super.visitNewClass(tree);
    }

    @Override
    public void visitEnumConstant(EnumConstantTree tree) {
        isEnumConstantBody = true;
        super.visitEnumConstant(tree);
    }

    @Override
    public void visitLambdaExpression(LambdaExpressionTree lambdaExpressionTree) {
        int lines = getNumberOfLines(((JavaTree) lambdaExpressionTree.body()).getAstNode());
        if (lines > max) {
            context.addIssue(lambdaExpressionTree, RULE_KEY, "Reduce this lambda expression number of lines from " + lines + " to at most " + max + ".");
        }
        super.visitLambdaExpression(lambdaExpressionTree);
    }

    private int getNumberOfLines(ClassTree classTree) {
        int startLine = ((InternalSyntaxToken) classTree.openBraceToken()).getLine();
        int endLine = ((InternalSyntaxToken) classTree.closeBraceToken()).getLine();
        return endLine - startLine + 1;
    }

    private int getNumberOfLines(AstNode node) {
        return node.getLastToken().getLine() - node.getTokenLine() + 1;
    }
}
Now, do I need to do anything more to get the rule to trigger?
One more thing: after putting the plugin jar in the plugin folder, running the analyzer with mvn sonar:sonar produces an error:
Embedded error: org/sonar/java/model/InternalSyntaxToken
org.sonar.java.model.InternalSyntaxToken
[INFO] ------------------------------------------------------------------------
[INFO] Trace
org.apache.maven.lifecycle.LifecycleExecutionException: Can not execute Sonar
at org.apache.maven.lifecycle.DefaultLifecycleExecutor.executeGoals(DefaultLifecycleExecutor.java:719)
at org.apache.maven.lifecycle.DefaultLifecycleExecutor.executeStandaloneGoal(DefaultLifecycleExecutor.java:569)
at org.apache.maven.lifecycle.DefaultLifecycleExecutor.executeGoal(DefaultLifecycleExecutor.java:539)
at org.apache.maven.lifecycle.DefaultLifecycleExecutor.executeGoalAndHandleFailures(DefaultLifecycleExecutor.java:387)
at org.apache.maven.lifecycle.DefaultLifecycleExecutor.executeTaskSegments(DefaultLifecycleExecutor.java:284)
at org.apache.maven.lifecycle.DefaultLifecycleExecutor.execute(DefaultLifecycleExecutor.java:180)
at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:328)
at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:138)
at org.apache.maven.cli.MavenCli.main(MavenCli.java:362)
at org.apache.maven.cli.compat.CompatibleMain.main(CompatibleMain.java:60)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.codehaus.classworlds.Launcher.launchEnhanced(Launcher.java:315)
at org.codehaus.classworlds.Launcher.launch(Launcher.java:255)
at org.codehaus.classworlds.Launcher.mainWithExitCode(Launcher.java:430)
at org.codehaus.classworlds.Launcher.main(Launcher.java:375)
Caused by: org.apache.maven.plugin.MojoExecutionException: Can not execute Sonar
at org.codehaus.mojo.sonar.Bootstraper.executeMojo(Bootstraper.java:103)
at org.codehaus.mojo.sonar.Bootstraper.start(Bootstraper.java:79)
at org.codehaus.mojo.sonar.SonarMojo.execute(SonarMojo.java:88)
at org.apache.maven.plugin.DefaultPluginManager.executeMojo(DefaultPluginManager.java:490)
at org.apache.maven.lifecycle.DefaultLifecycleExecutor.executeGoals(DefaultLifecycleExecutor.java:694)
... 17 more
Caused by: org.apache.maven.plugin.MojoExecutionException: org/sonar/java/model/InternalSyntaxToken
at org.sonar.maven.ExceptionHandling.handle(ExceptionHandling.java:37)
at org.sonar.maven.SonarMojo.execute(SonarMojo.java:175)
at org.apache.maven.plugin.DefaultPluginManager.executeMojo(DefaultPluginManager.java:490)
at org.codehaus.mojo.sonar.Bootstraper.executeMojo(Bootstraper.java:98)
... 21 more
Caused by: java.lang.NoClassDefFoundError: org/sonar/java/model/InternalSyntaxToken
at org.sonar.samples.java.AnonymousClassesTooBigCheck.getNumberOfLines(AnonymousClassesTooBigCheck.java:78)
at org.sonar.samples.java.AnonymousClassesTooBigCheck.visitNewClass(AnonymousClassesTooBigCheck.java:53)
at org.sonar.java.model.expression.NewClassTreeImpl.accept(NewClassTreeImpl.java:126)
at org.sonar.plugins.java.api.tree.BaseTreeVisitor.scan(BaseTreeVisitor.java:42)
at org.sonar.plugins.java.api.tree.BaseTreeVisitor.visitVariable(BaseTreeVisitor.java:291)
at org.sonar.java.model.declaration.VariableTreeImpl.accept(VariableTreeImpl.java:180)
at org.sonar.plugins.java.api.tree.BaseTreeVisitor.scan(BaseTreeVisitor.java:42)
at org.sonar.plugins.java.api.tree.BaseTreeVisitor.scan(BaseTreeVisitor.java:36)
at org.sonar.plugins.java.api.tree.BaseTreeVisitor.visitClass(BaseTreeVisitor.java:69)
at org.sonar.java.model.declaration.ClassTreeImpl.accept(ClassTreeImpl.java:201)
at org.sonar.plugins.java.api.tree.BaseTreeVisitor.scan(BaseTreeVisitor.java:42)
at org.sonar.plugins.java.api.tree.BaseTreeVisitor.scan(BaseTreeVisitor.java:36)
at org.sonar.plugins.java.api.tree.BaseTreeVisitor.visitClass(BaseTreeVisitor.java:69)
at org.sonar.java.model.declaration.ClassTreeImpl.accept(ClassTreeImpl.java:201)
at org.sonar.plugins.java.api.tree.BaseTreeVisitor.scan(BaseTreeVisitor.java:42)
at org.sonar.plugins.java.api.tree.BaseTreeVisitor.scan(BaseTreeVisitor.java:36)
at org.sonar.plugins.java.api.tree.BaseTreeVisitor.visitCompilationUnit(BaseTreeVisitor.java:55)
at org.sonar.java.model.JavaTree$CompilationUnitTreeImpl.accept(JavaTree.java:202)
at org.sonar.plugins.java.api.tree.BaseTreeVisitor.scan(BaseTreeVisitor.java:42)
at org.sonar.samples.java.AnonymousClassesTooBigCheck.scanFile(AnonymousClassesTooBigCheck.java:47)
at org.sonar.java.model.VisitorsBridge.visitFile(VisitorsBridge.java:123)
at com.sonar.sslr.impl.ast.AstWalker.walkAndVisit(AstWalker.java:67)
at org.sonar.java.ast.AstScanner.simpleScan(AstScanner.java:107)
at org.sonar.java.ast.AstScanner.scan(AstScanner.java:75)
at org.sonar.java.JavaSquid.scanSources(JavaSquid.java:122)
at org.sonar.java.JavaSquid.scan(JavaSquid.java:115)
at org.sonar.plugins.java.JavaSquidSensor.analyse(JavaSquidSensor.java:91)
at org.sonar.batch.phases.SensorsExecutor.executeSensor(SensorsExecutor.java:79)
at org.sonar.batch.phases.SensorsExecutor.execute(SensorsExecutor.java:70)
at org.sonar.batch.phases.PhaseExecutor.execute(PhaseExecutor.java:119)
at org.sonar.batch.scan.ModuleScanContainer.doAfterStart(ModuleScanContainer.java:194)
at org.sonar.api.platform.ComponentContainer.startComponents(ComponentContainer.java:92)
at org.sonar.api.platform.ComponentContainer.execute(ComponentContainer.java:77)
at org.sonar.batch.scan.ProjectScanContainer.scan(ProjectScanContainer.java:233)
at org.sonar.batch.scan.ProjectScanContainer.scanRecursively(ProjectScanContainer.java:228)
at org.sonar.batch.scan.ProjectScanContainer.doAfterStart(ProjectScanContainer.java:221)
at org.sonar.api.platform.ComponentContainer.startComponents(ComponentContainer.java:92)
at org.sonar.api.platform.ComponentContainer.execute(ComponentContainer.java:77)
at org.sonar.batch.scan.ScanTask.scan(ScanTask.java:64)
at org.sonar.batch.scan.ScanTask.execute(ScanTask.java:51)
at org.sonar.batch.bootstrap.TaskContainer.doAfterStart(TaskContainer.java:125)
at org.sonar.api.platform.ComponentContainer.startComponents(ComponentContainer.java:92)
at org.sonar.api.platform.ComponentContainer.execute(ComponentContainer.java:77)
at org.sonar.batch.bootstrap.BootstrapContainer.executeTask(BootstrapContainer.java:173)
at org.sonar.batch.bootstrapper.Batch.executeTask(Batch.java:95)
at org.sonar.batch.bootstrapper.Batch.execute(Batch.java:67)
at org.sonar.runner.batch.IsolatedLauncher.execute(IsolatedLauncher.java:48)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.sonar.runner.impl.BatchLauncher$1.delegateExecution(BatchLauncher.java:87)
at org.sonar.runner.impl.BatchLauncher$1.run(BatchLauncher.java:75)
at java.security.AccessController.doPrivileged(Native Method)
at org.sonar.runner.impl.BatchLauncher.doExecute(BatchLauncher.java:69)
at org.sonar.runner.impl.BatchLauncher.execute(BatchLauncher.java:50)
at org.sonar.runner.api.EmbeddedRunner.doExecute(EmbeddedRunner.java:102)
at org.sonar.runner.api.Runner.execute(Runner.java:100)
at org.sonar.maven.SonarMojo.execute(SonarMojo.java:173)
... 23 more
Caused by: java.lang.ClassNotFoundException: org.sonar.java.model.InternalSyntaxToken
at org.codehaus.plexus.classworlds.strategy.SelfFirstStrategy.loadClass(SelfFirstStrategy.java:50)
at org.codehaus.plexus.classworlds.realm.ClassRealm.unsynchronizedLoadClass(ClassRealm.java:259)
at org.codehaus.plexus.classworlds.realm.ClassRealm.loadClass(ClassRealm.java:235)
at org.codehaus.plexus.classworlds.realm.ClassRealm.loadClass(ClassRealm.java:227)
... 82 more
The reason you encounter this error is that AnonymousClassesTooBigCheck uses internal APIs: InternalSyntaxToken is not supposed to be used directly, and thus the check fails at runtime (you might notice that none of the examples in the custom rule code snippet use this class).
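As a sketch of how to avoid the internal class, assuming a sonar-java version whose public SyntaxToken interface exposes line() (verify against the API version your plugin targets):
// SyntaxToken is the declared return type of openBraceToken()/closeBraceToken(),
// so the line numbers are available without casting to the internal
// InternalSyntaxToken class.
private int getNumberOfLines(ClassTree classTree) {
    int startLine = classTree.openBraceToken().line();
    int endLine = classTree.closeBraceToken().line();
    return endLine - startLine + 1;
}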
I have two JLists: one holds categories of weapons (I'm writing role-playing character creation software) and the other holds the items in the selected category. I want the items to change according to the category.
At the moment, the item JList swaps its DefaultListModel every time the user changes category (each DefaultListModel variable holds a different set of items). This somehow produces a NullPointerException whenever the user clicks an item after changing category.
I've searched online and read the JList examples, but I haven't been able to solve the problem.
Here is the related code:
private void jList1ValueChanged(javax.swing.event.ListSelectionEvent evt) { // category list
    String selection = jList1.getSelectedValue().toString();
    if (selection.equals(settings.weapon_classes[0])) {
        jList2.clearSelection();
        jList2.setModel(model_l_pistols);
        jList2.validate();
    }
    if (selection.equals(settings.weapon_classes[1])) {
        jList2.clearSelection();
        jList2.setModel(model_m_pistols);
        jList2.validate();
    }
}
private void wpn_available_valuechanged_listener(javax.swing.event.ListSelectionEvent evt) {
    Weapon wpn = new Weapon(); // details of the selected weapon are stored here for display
    try {
        String wpn_name = jList2.getSelectedValue().toString();
        if (jList1.getSelectedValue().equals("Light pistols")) {
            // stores the right item details from an array in the settings class
            wpn = controller.get_Weapon(wpn_name, settings.light_pistols);
        }
        if (jList1.getSelectedValue().equals("Medium pistols")) {
            wpn = controller.get_Weapon(wpn_name, settings.medium_pistols);
        }
    } catch (Exception e) {
        System.out.println(e + " String wpn_name = jList2.getSelectedValue().toString();");
    }
}
And here is the error message I get when running my program in the NetBeans IDE (the first line is the exception printed by my try/catch block):
java.lang.NullPointerExceptionString wpn_name = jList2.getSelectedValue().toString();
Exception in thread "AWT-EventQueue-0" java.lang.NullPointerException
at charcreation.GUI.wpn_available_valuechanged_listener(Unknown Source)
at charcreation.GUI.access$6900(Unknown Source)
at charcreation.GUI$67.valueChanged(Unknown Source)
at javax.swing.JList.fireSelectionValueChanged(JList.java:1765)
at javax.swing.JList$ListSelectionHandler.valueChanged(JList.java:1779)
at javax.swing.DefaultListSelectionModel.fireValueChanged(DefaultListSelectionModel.java:167)
at javax.swing.DefaultListSelectionModel.fireValueChanged(DefaultListSelectionModel.java:147)
at javax.swing.DefaultListSelectionModel.fireValueChanged(DefaultListSelectionModel.java:194)
at javax.swing.DefaultListSelectionModel.changeSelection(DefaultListSelectionModel.java:388)
at javax.swing.DefaultListSelectionModel.changeSelection(DefaultListSelectionModel.java:398)
at javax.swing.DefaultListSelectionModel.removeSelectionIntervalImpl(DefaultListSelectionModel.java:559)
at javax.swing.DefaultListSelectionModel.clearSelection(DefaultListSelectionModel.java:403)
at javax.swing.JList.clearSelection(JList.java:2013)
at charcreation.GUI.jList1ValueChanged(Unknown Source)
at charcreation.GUI.access$6800(Unknown Source)
at charcreation.GUI$66.valueChanged(Unknown Source)
at javax.swing.JList.fireSelectionValueChanged(JList.java:1765)
at javax.swing.JList$ListSelectionHandler.valueChanged(JList.java:1779)
at javax.swing.DefaultListSelectionModel.fireValueChanged(DefaultListSelectionModel.java:167)
at javax.swing.DefaultListSelectionModel.fireValueChanged(DefaultListSelectionModel.java:147)
at javax.swing.DefaultListSelectionModel.fireValueChanged(DefaultListSelectionModel.java:194)
at javax.swing.DefaultListSelectionModel.changeSelection(DefaultListSelectionModel.java:388)
at javax.swing.DefaultListSelectionModel.changeSelection(DefaultListSelectionModel.java:398)
at javax.swing.DefaultListSelectionModel.setSelectionInterval(DefaultListSelectionModel.java:442)
at javax.swing.JList.setSelectionInterval(JList.java:2035)
at javax.swing.plaf.basic.BasicListUI$Handler.adjustSelection(BasicListUI.java:2731)
at javax.swing.plaf.basic.BasicListUI$Handler.mousePressed(BasicListUI.java:2687)
at java.awt.AWTEventMulticaster.mousePressed(AWTEventMulticaster.java:263)
at java.awt.Component.processMouseEvent(Component.java:6260)
at javax.swing.JComponent.processMouseEvent(JComponent.java:3267)
at java.awt.Component.processEvent(Component.java:6028)
at java.awt.Container.processEvent(Container.java:2041)
at java.awt.Component.dispatchEventImpl(Component.java:4630)
at java.awt.Container.dispatchEventImpl(Container.java:2099)
at java.awt.Component.dispatchEvent(Component.java:4460)
at java.awt.LightweightDispatcher.retargetMouseEvent(Container.java:4574)
at java.awt.LightweightDispatcher.processMouseEvent(Container.java:4235)
at java.awt.LightweightDispatcher.dispatchEvent(Container.java:4168)
at java.awt.Container.dispatchEventImpl(Container.java:2085)
at java.awt.Window.dispatchEventImpl(Window.java:2475)
at java.awt.Component.dispatchEvent(Component.java:4460)
at java.awt.EventQueue.dispatchEvent(EventQueue.java:599)
at java.awt.EventDispatchThread.pumpOneEventForFilters(EventDispatchThread.java:269)
at java.awt.EventDispatchThread.pumpEventsForFilter(EventDispatchThread.java:184)
at java.awt.EventDispatchThread.pumpEventsForHierarchy(EventDispatchThread.java:174)
at java.awt.EventDispatchThread.pumpEvents(EventDispatchThread.java:169)
at java.awt.EventDispatchThread.pumpEvents(EventDispatchThread.java:161)
The error is most likely in this line:
String selection = jList1.getSelectedValue().toString();
jList1.getSelectedValue() can return null, and calling toString() on null throws the exception.
Use this instead to avoid it:
String selection = String.valueOf(jList1.getSelectedValue());
That way, if the selection is null, you simply get back the string "null".
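The same guard matters for the second listener: your stack trace shows it firing from clearSelection()/setModel() with an empty selection. A defensive sketch:
private void wpn_available_valuechanged_listener(javax.swing.event.ListSelectionEvent evt) {
    if (evt.getValueIsAdjusting()) {
        return; // wait for the final event of a selection change
    }
    Object selected = jList2.getSelectedValue();
    if (selected == null) {
        return; // selection was just cleared, nothing to look up
    }
    String wpnName = selected.toString();
    // ... look up the weapon as before
}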