The implementation of the FlinkKafkaConsumer010 is not serializable error - serialization

I created a custom class that is based on Apache Flink. The following are some parts of the class definition:
public class StreamData {
    private StreamExecutionEnvironment env;
    private DataStream<byte[]> data;
    private Properties properties;

    public StreamData() {
        env = StreamExecutionEnvironment.getExecutionEnvironment();
    }

    public StreamData(StreamExecutionEnvironment e, DataStream<byte[]> d) {
        env = e;
        data = d;
    }

    public StreamData getDataFromESB(String id, int from) {
        final Pattern TOPIC = Pattern.compile(id);
        Properties properties = new Properties();
        properties.setProperty("bootstrap.servers", "localhost:9092");
        properties.setProperty("group.id", Long.toString(System.currentTimeMillis()));
        properties.setProperty("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        properties.setProperty("value.deserializer", "org.apache.kafka.common.serialization.ByteArrayDeserializer");
        properties.put("metadata.max.age.ms", 30000);
        properties.put("enable.auto.commit", "false");
        if (from == 0)
            properties.setProperty("auto.offset.reset", "earliest");
        else
            properties.setProperty("auto.offset.reset", "latest");
        StreamExecutionEnvironment e = StreamExecutionEnvironment.getExecutionEnvironment();
        DataStream<byte[]> stream = env
                .addSource(new FlinkKafkaConsumer011<>(TOPIC, new AbstractDeserializationSchema<byte[]>() {
                    @Override
                    public byte[] deserialize(byte[] bytes) {
                        return bytes;
                    }
                }, properties));
        return new StreamData(e, stream);
    }

    public void print() {
        data.print();
    }

    public void execute() throws Exception {
        env.execute();
    }
}
Using the StreamData class, I try to get some data from Apache Kafka and print it in the main function:
StreamData stream = new StreamData();
stream.getDataFromESB("original_data", 0);
stream.print();
stream.execute();
I got the error:
Exception in thread "main" org.apache.flink.api.common.InvalidProgramException: The implementation of the FlinkKafkaConsumer010 is not serializable. The object probably contains or references non serializable fields.
Caused by: java.io.NotSerializableException: StreamData
As mentioned here, I think it's because some data type used in the getDataFromESB function is not serializable, but I don't know how to solve the problem!

Your AbstractDeserializationSchema is an anonymous inner class, which as a result contains a reference to the outer StreamData class which isn't serializable. Either let StreamData implement Serializable, or define your schema as a top-level class.
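For illustration, a minimal sketch of the second option: the schema pulled out into its own top-level class (the name RawByteSchema is made up here), so it no longer captures the enclosing StreamData instance:
public class RawByteSchema extends AbstractDeserializationSchema<byte[]> {
    // No reference to StreamData, so this schema is serializable on its own.
    @Override
    public byte[] deserialize(byte[] bytes) {
        return bytes;
    }
}
Inside getDataFromESB you would then pass new RawByteSchema() to the FlinkKafkaConsumer011 constructor instead of the anonymous inner class.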

It seems that you are importing FlinkKafkaConsumer010 in your code but using FlinkKafkaConsumer011. Please use the following dependency in your sbt file:
"org.apache.flink" %% "flink-connector-kafka-0.11" % flinkVersion

Related

JUnit 5 Parameterized test @ArgumentsSource parameters not loading

I have created the JUnit 5 parameterized test below with an ArgumentsSource for loading arguments for the test:
public class DemoModelValidationTest {
    public ParamsProvider paramsProvider;

    public DemoModelValidationTest() {
        try {
            paramsProvider = new ParamsProvider();
        }
        catch (Exception iaex) {
        }
    }

    @ParameterizedTest
    @ArgumentsSource(ParamsProvider.class)
    void testAllConfigurations(int configIndex, String a) throws Exception {
        paramsProvider.executeSimulation(configIndex);
    }
}
and the ParamsProvider class looks like below:
public class ParamsProvider implements ArgumentsProvider {
    public static final String modelPath = System.getProperty("user.dir") + File.separator + "demoModels";
    YAMLDeserializer deserializedYAML;
    MetaModelToValidationModel converter;
    ValidationRunner runner;
    List<Configuration> configurationList;
    List<Arguments> listOfArguments;

    public ParamsProvider() throws Exception {
        configurationList = new ArrayList<>();
        listOfArguments = new LinkedList<>();
        deserializedYAML = new YAMLDeserializer(modelPath);
        deserializedYAML.load();
        converter = new MetaModelToValidationModel(deserializedYAML);
        runner = converter.convert();
        configurationList = runner.getConfigurations();
        for (int i = 0; i < configurationList.size(); i++) {
            listOfArguments.add(Arguments.of(i, configurationList.get(i).getName()));
        }
    }

    public void executeSimulation(int configListIndex) throws Exception {
        final Configuration config = runner.getConfigurations().get(configListIndex);
        runner.run(config);
        runner.getReporter().consolePrintReport();
    }

    @Override
    public Stream<? extends Arguments> provideArguments(ExtensionContext context) {
        return listOfArguments.stream().map(Arguments::of);
        // return Stream.of(Arguments.of(0, "Actuator Power"), Arguments.of(1, "Error Logging"));
    }
}
In the provideArguments() method, the commented out code is working fine, but the first line of code
listOfArguments.stream().map(Arguments::of)
is returning the following error:
org.junit.platform.commons.PreconditionViolationException: Configuration error: You must configure at least one set of arguments for this #ParameterizedTest
I am not sure whether I have a casting problem with the stream in the provideArguments() method, but I guess it somehow cannot map the elements of listOfArguments into a stream of the form below:
Stream.of(Arguments.of(0, "Actuator Power"), Arguments.of(1, "Error Logging"))
Am I missing a proper stream mapping of listOfArguments?
provideArguments(…) is called before your test is invoked.
Your ParamsProvider class is instantiated by JUnit. Whatever you’re doing in desiralizeAndCreateValidationRunnerInstance should be done in the ParamsProvider constructor.
Also, you're already wrapping the values from the deserialized configurations in Arguments, and then double-wrapping them in provideArguments.
Do this:
@Override
public Stream<? extends Arguments> provideArguments(ExtensionContext context) {
    return listOfArguments.stream();
}

@JsonIdentityReference does not recognize equal values

I'm trying to serialize an object (Root) with some duplicated entries of MyObject. Since I just want to store each whole object once, I'm using @JsonIdentityReference, which works pretty well.
However, I realized that it will generate un-deserializable JSON if there are equal objects with different references. I wonder if there's a configuration in Jackson to change this behavior. Thanks!
@Value
@AllArgsConstructor
@NoArgsConstructor(force = true)
class Root {
    private List<MyObject> allObjects;
    private Map<String, MyObject> objectMap;
}

@Value
@AllArgsConstructor
@NoArgsConstructor(force = true)
@JsonIdentityReference
@JsonIdentityInfo(generator = ObjectIdGenerators.PropertyGenerator.class, property = "id")
class MyObject {
    private String id;
    private int value;
}
public class Main {
    public static void main(String[] args) throws JsonProcessingException {
        // Constructing equal objects
        val obj1 = new MyObject("a", 1);
        val obj2 = new MyObject("a", 1);
        assert obj1.equals(obj2);
        val root = new Root(
                Lists.newArrayList(obj1),
                ImmutableMap.of(
                        "lorem", obj2
                )
        );
        val objectMapper = new ObjectMapper();
        val json = objectMapper.writeValueAsString(root);
        // {"allObjects":[{"id":"a","value":1}],"objectMap":{"lorem":{"id":"a","value":1}}}
        // Note here both obj1 and obj2 are expanded.
        // Exception: Already had POJO for id
        val deserialized = objectMapper.readValue(json, Root.class);
        assert root.equals(deserialized);
    }
}
I'm using Jackson 2.10.
Full stacktrace:
Exception in thread "main" com.fasterxml.jackson.databind.JsonMappingException: Already had POJO for id (java.lang.String) [[ObjectId: key=a, type=com.fasterxml.jackson.databind.deser.impl.PropertyBasedObjectIdGenerator, scope=java.lang.Object]] (through reference chain: Root["objectMap"]->java.util.LinkedHashMap["lorem"]->MyObject["id"])
at com.fasterxml.jackson.databind.JsonMappingException.wrapWithPath(JsonMappingException.java:394)
at com.fasterxml.jackson.databind.JsonMappingException.wrapWithPath(JsonMappingException.java:353)
at com.fasterxml.jackson.databind.deser.BeanDeserializerBase.wrapAndThrow(BeanDeserializerBase.java:1714)
at com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:371)
at com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeWithObjectId(BeanDeserializerBase.java:1257)
at com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:157)
at com.fasterxml.jackson.databind.deser.std.MapDeserializer._readAndBindStringKeyMap(MapDeserializer.java:527)
at com.fasterxml.jackson.databind.deser.std.MapDeserializer.deserialize(MapDeserializer.java:364)
at com.fasterxml.jackson.databind.deser.std.MapDeserializer.deserialize(MapDeserializer.java:29)
at com.fasterxml.jackson.databind.deser.impl.FieldProperty.deserializeAndSet(FieldProperty.java:138)
at com.fasterxml.jackson.databind.deser.BeanDeserializer.vanillaDeserialize(BeanDeserializer.java:288)
at com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:151)
at com.fasterxml.jackson.databind.ObjectMapper._readMapAndClose(ObjectMapper.java:4202)
at com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:3205)
at com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:3173)
at Main.main(Main.java:53)
Caused by: java.lang.IllegalStateException: Already had POJO for id (java.lang.String) [[ObjectId: key=a, type=com.fasterxml.jackson.databind.deser.impl.PropertyBasedObjectIdGenerator, scope=java.lang.Object]]
at com.fasterxml.jackson.annotation.SimpleObjectIdResolver.bindItem(SimpleObjectIdResolver.java:24)
at com.fasterxml.jackson.databind.deser.impl.ReadableObjectId.bindItem(ReadableObjectId.java:57)
at com.fasterxml.jackson.databind.deser.impl.ObjectIdValueProperty.deserializeSetAndReturn(ObjectIdValueProperty.java:101)
at com.fasterxml.jackson.databind.deser.impl.ObjectIdValueProperty.deserializeAndSet(ObjectIdValueProperty.java:83)
at com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:369)
... 14 more
As I mentioned earlier, this setup only works if obj1 == obj2, as two objects with the same ID are expected to be identity-equal. In that case, the second object would also not get expanded during serialization (alwaysAsId = false only expands the first object).
However, if you want to have this setup and are fine with the serialization, you could use a custom Resolver for deserialization that stores a single instance per key:
@JsonIdentityReference(alwaysAsId = false)
@JsonIdentityInfo(generator = ObjectIdGenerators.PropertyGenerator.class, property = "id", resolver = CustomScopeResolver.class)
static class MyObject {
    private String id;
    // ...
}

class CustomScopeResolver implements ObjectIdResolver {
    Map<String, MyObject> data = new HashMap<>();

    @Override
    public void bindItem(final IdKey id, final Object pojo) {
        data.put(id.key.toString(), (MyObject) pojo);
    }

    @Override
    public Object resolveId(final IdKey id) {
        return data.get(id.key);
    }

    @Override
    public ObjectIdResolver newForDeserialization(final Object context) {
        return new CustomScopeResolver();
    }

    @Override
    public boolean canUseFor(final ObjectIdResolver resolverType) {
        return false;
    }
}
NEW EDIT: Apparently, it's very easy: just turn on objectMapper.configure(SerializationFeature.USE_EQUALITY_FOR_OBJECT_ID, true); so that the DefaultSerializerProvider uses a regular HashMap instead of an IdentityHashMap to manage the serialized beans.
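For completeness, a minimal sketch of that setting applied to the question's mapper (assuming the Root and MyObject classes from the question):
ObjectMapper objectMapper = new ObjectMapper();
// Compare object ids with equals() instead of reference identity, so the
// equal-but-distinct obj1 and obj2 share one id and the JSON round-trips.
objectMapper.configure(SerializationFeature.USE_EQUALITY_FOR_OBJECT_ID, true);
String json = objectMapper.writeValueAsString(root);
Root deserialized = objectMapper.readValue(json, Root.class);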
DEPRECATED: Update for serialization: it is possible to achieve this by adding a custom SerializerProvider:
class CustomEqualObjectsSerializerProvider extends DefaultSerializerProvider {
    private final Collection<MyObject> data = new HashSet<>();
    private final SerializerProvider src;
    private final SerializationConfig config;
    private final SerializerFactory f;

    public CustomEqualObjectsSerializerProvider(
            final SerializerProvider src,
            final SerializationConfig config,
            final SerializerFactory f) {
        super(src, config, f);
        this.src = src;
        this.config = config;
        this.f = f;
    }

    @Override
    public DefaultSerializerProvider createInstance(final SerializationConfig config, final SerializerFactory jsf) {
        return new CustomEqualObjectsSerializerProvider(src, this.config, f);
    }

    @Override
    public WritableObjectId findObjectId(final Object forPojo, final ObjectIdGenerator<?> generatorType) {
        // check if there is an equivalent pojo, use it if it exists
        final Optional<MyObject> equivalentObject = data.stream()
                .filter(forPojo::equals)
                .findFirst();
        if (equivalentObject.isPresent()) {
            return super.findObjectId(equivalentObject.get(), generatorType);
        } else {
            if (forPojo instanceof MyObject) {
                data.add((MyObject) forPojo);
            }
            return super.findObjectId(forPojo, generatorType);
        }
    }
}
@Test
public void main() throws IOException {
    // Constructing equal objects
    final MyObject obj1 = new MyObject();
    obj1.setId("a");
    final MyObject obj2 = new MyObject();
    obj2.setId("a");
    assert obj1.equals(obj2);
    final Root root = new Root();
    root.setAllObjects(Collections.singletonList(obj1));
    root.setObjectMap(Collections.singletonMap(
            "lorem", obj2));
    final ObjectMapper objectMapper = new ObjectMapper();
    objectMapper.setSerializerProvider(
            new CustomEqualObjectsSerializerProvider(
                    objectMapper.getSerializerProvider(),
                    objectMapper.getSerializationConfig(),
                    objectMapper.getSerializerFactory()));
    final String json = objectMapper.writeValueAsString(root);
    System.out.println(json); // second object is not expanded!
}

Storm Kafkaspout KryoSerialization issue for java bean from kafka topic

Hi, I am new to Storm and Kafka.
I am using Storm 1.0.1 and Kafka 0.10.0.
We have a KafkaSpout that receives a Java bean from a Kafka topic.
I have spent several hours digging to find the right approach for that.
I found a few articles which are useful, but none of the approaches has worked for me so far.
Following is my code:
StormTopology:
public class StormTopology {
    public static void main(String[] args) throws Exception {
        //Topo test /zkroot test
        if (args.length == 4) {
            System.out.println("started");
            BrokerHosts hosts = new ZkHosts("localhost:2181");
            SpoutConfig kafkaConf1 = new SpoutConfig(hosts, args[1], args[2],
                    args[3]);
            kafkaConf1.zkRoot = args[2];
            kafkaConf1.useStartOffsetTimeIfOffsetOutOfRange = true;
            kafkaConf1.startOffsetTime = kafka.api.OffsetRequest.LatestTime();
            kafkaConf1.scheme = new SchemeAsMultiScheme(new KryoScheme());
            KafkaSpout kafkaSpout1 = new KafkaSpout(kafkaConf1);
            System.out.println("started");
            ShuffleBolt shuffleBolt = new ShuffleBolt(args[1]);
            AnalysisBolt analysisBolt = new AnalysisBolt(args[1]);
            TopologyBuilder topologyBuilder = new TopologyBuilder();
            topologyBuilder.setSpout("kafkaspout", kafkaSpout1, 1);
            //builder.setBolt("counterbolt2", countbolt2, 3).shuffleGrouping("kafkaspout");
            //This is for field grouping; we need a second bolt for field grouping or it won't work
            topologyBuilder.setBolt("shuffleBolt", shuffleBolt, 3).shuffleGrouping("kafkaspout");
            topologyBuilder.setBolt("analysisBolt", analysisBolt, 5).fieldsGrouping("shuffleBolt", new Fields("trip"));
            Config config = new Config();
            config.registerSerialization(VehicleTrip.class, VehicleTripKyroSerializer.class);
            config.setDebug(true);
            config.setNumWorkers(1);
            LocalCluster cluster = new LocalCluster();
            cluster.submitTopology(args[0], config, topologyBuilder.createTopology());
            // StormSubmitter.submitTopology(args[0], config,
            //         builder.createTopology());
        } else {
            System.out.println("Insufficient arguments - topologyName kafkaTopic ZKRoot ID");
        }
    }
}
I am serializing the data to Kafka using Kryo.
KafkaProducer:
public class StreamKafkaProducer {
private static Producer producer;
private final Properties props = new Properties();
private static final StreamKafkaProducer KAFKA_PRODUCER = new StreamKafkaProducer();
private StreamKafkaProducer(){
props.put("bootstrap.servers", "localhost:9092");
props.put("acks", "all");
props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
props.put("value.serializer", "com.abc.serializer.MySerializer");
producer = new org.apache.kafka.clients.producer.KafkaProducer(props);
}
public static StreamKafkaProducer getStreamKafkaProducer(){
return KAFKA_PRODUCER;
}
public void produce(String topic, VehicleTrip vehicleTrip){
ProducerRecord<String,VehicleTrip> producerRecord = new ProducerRecord<>(topic,vehicleTrip);
producer.send(producerRecord);
//producer.close();
}
public static void closeProducer(){
producer.close();
}
}
Kryo serializer:
public class DataKyroSerializer extends Serializer<Data> implements Serializable {
    @Override
    public void write(Kryo kryo, Output output, Data data) {
        output.writeLong(data.getStartedOn().getTime());
        output.writeLong(data.getEndedOn().getTime());
    }

    @Override
    public Data read(Kryo kryo, Input input, Class<Data> aClass) {
        Data data = new Data();
        data.setStartedOn(new Date(input.readLong()));
        data.setEndedOn(new Date(input.readLong()));
        return data;
    }
}
I need to get the data back into the Data bean.
As per a few articles, I need to provide a custom scheme and make it part of the topology, but so far I have had no luck.
Code for Bolt and Scheme
Scheme:
public class KryoScheme implements Scheme {
    private ThreadLocal<Kryo> kryos = new ThreadLocal<Kryo>() {
        protected Kryo initialValue() {
            Kryo kryo = new Kryo();
            kryo.addDefaultSerializer(Data.class, new DataKyroSerializer());
            return kryo;
        };
    };

    @Override
    public List<Object> deserialize(ByteBuffer ser) {
        return Utils.tuple(kryos.get().readObject(new ByteBufferInput(ser.array()), Data.class));
    }

    @Override
    public Fields getOutputFields() {
        return new Fields("data");
    }
}
and bolt:
public class AnalysisBolt implements IBasicBolt {
/**
*
*/
private static final long serialVersionUID = 1L;
private String topicname = null;
public AnalysisBolt(String topicname) {
this.topicname = topicname;
}
public void prepare(Map stormConf, TopologyContext topologyContext) {
System.out.println("prepare");
}
public void execute(Tuple input, BasicOutputCollector collector) {
System.out.println("execute");
Fields fields = input.getFields();
try {
JSONObject eventJson = (JSONObject) JSONSerializer.toJSON((String) input
.getValueByField(fields.get(1)));
String StartTime = (String) eventJson.get("startedOn");
String EndTime = (String) eventJson.get("endedOn");
String Oid = (String) eventJson.get("_id");
int V_id = (Integer) eventJson.get("vehicleId");
//call method getEventForVehicleWithinTime(Long vehicleId, Date startTime, Date endTime)
System.out.println("==========="+Oid+"| "+V_id+"| "+StartTime+"| "+EndTime);
} catch (Exception e) {
e.printStackTrace();
}
}
but if I submit the storm topology i am getting error:
java.lang.IllegalStateException: Spout 'kafkaspout' contains a
non-serializable field of type com.abc.topology.KryoScheme$1, which
was instantiated prior to topology creation.
com.minda.iconnect.topology.KryoScheme$1 should be instantiated within
the prepare method of 'kafkaspout at the earliest.
I'd appreciate help debugging the issue and guidance to the right path.
Thanks
Your ThreadLocal is not Serializable. The preferable solution would be to make your serializer both Serializable and thread-safe. If this is not possible, then I see two alternatives, since there is no prepare method as you would get in a bolt:
1. Declare it as static, which is inherently transient.
2. Declare it transient and access it via a private get method, initializing the variable on first access (a sketch of this option follows below).
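A minimal sketch of the second option, adapted from the question's KryoScheme (assuming deserialize is only ever called from the single spout thread, since the per-thread isolation of the ThreadLocal is dropped here):
public class KryoScheme implements Scheme {
    // transient: not captured when the topology is serialized to ZooKeeper
    private transient Kryo kryo;

    private Kryo getKryo() {
        if (kryo == null) {   // lazily created on the worker, on first use
            kryo = new Kryo();
            kryo.addDefaultSerializer(Data.class, new DataKyroSerializer());
        }
        return kryo;
    }

    @Override
    public List<Object> deserialize(ByteBuffer ser) {
        return Utils.tuple(getKryo().readObject(new ByteBufferInput(ser.array()), Data.class));
    }

    @Override
    public Fields getOutputFields() {
        return new Fields("data");
    }
}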
Within the Storm lifecycle, the topology is instantiated and then serialized to byte format to be stored in ZooKeeper, prior to the topology being executed. Within this step, if a spout or bolt within the topology has an initialized unserializable property, serialization will fail.
If there is a need for a field that is unserializable, initialize it within the bolt or spout's prepare method, which is run after the topology is delivered to the worker.
Source: Best Practices for implementing Apache Storm
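As a hypothetical illustration of that pattern in a bolt (the bolt name and field here are made up, not from the question), the non-serializable object is created in prepare() rather than in the constructor:
public class EnrichmentBolt implements IBasicBolt {
    // Hypothetical non-serializable helper; transient so topology serialization skips it.
    private transient Kryo kryo;

    @Override
    public void prepare(Map stormConf, TopologyContext context) {
        // Runs on the worker after the topology has been deserialized there.
        kryo = new Kryo();
    }

    @Override
    public void execute(Tuple input, BasicOutputCollector collector) {
        // kryo is already initialized by the time tuples arrive
    }

    @Override
    public void cleanup() {
    }

    @Override
    public void declareOutputFields(OutputFieldsDeclarer declarer) {
    }

    @Override
    public Map<String, Object> getComponentConfiguration() {
        return null;
    }
}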

@RabbitListener not receiving messages from queue

I am using the @RabbitListener annotation to receive messages from a RabbitMQ queue.
Although I have done all the steps required for this (i.e. added the @EnableRabbit annotation to my config class) and declared a SimpleRabbitListenerContainerFactory as a bean, my method is still not receiving messages from the queue. Can anybody suggest what I am missing?
I am using Spring Boot to launch my application
My launch class
@Configuration
@EnableAutoConfiguration
@EnableRabbit
@EnableConfigurationProperties
@EntityScan("persistence.mysql.domain")
@EnableJpaRepositories("persistence.mysql.dao")
@ComponentScan(excludeFilters = { @ComponentScan.Filter(type = FilterType.ASSIGNABLE_TYPE, value = ApiAuthenticationFilter.class), @ComponentScan.Filter(type = FilterType.ASSIGNABLE_TYPE, value = ApiVersionValidationFilter.class)}, basePackages = {"common", "mqclient", "apache", "dispatcher"})
public class Application {
    public static void main(final String[] args) {
        final SpringApplicationBuilder appBuilder = new SpringApplicationBuilder(
                Application.class);
        appBuilder.application().setWebEnvironment(false);
        appBuilder.profiles("common", "common_mysql_db", "common_rabbitmq")
                .run(args);
    }

    @Bean
    @Primary
    @ConfigurationProperties(prefix = "spring.datasource")
    public DataSource primaryDataSource() {
        return DataSourceBuilder.create().build();
    }
}
Here is my Bean to define SimpleRabbitListenerContainerFactory inside a component class
@Component(value = "inputQueueManager")
public class InputQueueManagerImpl extends AbstractQueueManagerImpl {
    ..///..

    @Bean(name = "inputListenerContainerFactory")
    public SimpleRabbitListenerContainerFactory rabbitListenerContainerFactory() {
        SimpleRabbitListenerContainerFactory factory = new SimpleRabbitListenerContainerFactory();
        factory.setConnectionFactory(this.rabbitConnectionFactory);
        factory.setConcurrentConsumers(Integer.parseInt(this.concurrentConsumers));
        factory.setMaxConcurrentConsumers(Integer.parseInt(this.maxConcurrentConsumers));
        factory.setMessageConverter(new Jackson2JsonMessageConverter());
        return factory;
    }
}
And finally my Listener inside another Controller component
@Controller
public class RabbitListner {
    @RabbitListener(queues = "Storm1", containerFactory = "inputListenerContainerFactory")
    @Override
    public void processMessage(QueueMessage message) {
        String topic = message.getTopic();
        String payload = message.getPayload();
        dispatcher.bean.EventBean eventBean = new dispatcher.bean.EventBean();
        System.out.println("Data read from the queue");
Unfortunately, I am sending messages to the queue, but the code inside processMessage is never executed.
I am not sure what the problem is here. Can anybody help?
By default, the Json message converter requires hints in the message properties as to what type of object to create.
If your producer does not set those properties, it won't be able to do the conversion without some help.
You can inject a ClassMapper into the converter.
The framework provides a DefaultClassMapper which can be customized, for example to look at a different message property than the default __TypeId__ property.
If you always want to convert the json to the same object, you can simply set the default type:
DefaultClassMapper classMapper = new DefaultClassMapper();
classMapper.setDefaultType(QueueMessage.class);
Jackson2JsonMessageConverter converter = new Jackson2JsonMessageConverter();
converter.setClassMapper(classMapper);
factory.setMessageConverter(converter);
The documentation already shows how to configure this.

Jackson vector serialization exception

I have the following code with a simple class and a method for writing and then reading:
ObjectMapper mapper = new ObjectMapper();
try {
    DataStore testOut = new DataStore();
    DataStore.Checklist ch1 = testOut.addChecklist();
    ch1.SetTitle("Checklist1");
    String output = mapper.writeValueAsString(testOut);
    JsonNode rootNode = mapper.readValue(output, JsonNode.class);
    Map<String, Object> userData = mapper.readValue(output, Map.class);
} catch (IOException e) {
    e.printStackTrace();
}
public class DataStore {
    public static class Checklist {
        public Checklist() {
        }

        private String _title;

        public String GetTitle() {
            return _title;
        }

        public void SetTitle(String title) {
            _title = title;
        }
    }

    //Checklists
    private Vector<Checklist> _checklists = new Vector<Checklist>();

    public Checklist addChecklist() {
        Checklist ch = new Checklist();
        ch.SetTitle("New Checklist");
        _checklists.add(ch);
        return ch;
    }

    public Vector<Checklist> getChecklists() {
        return _checklists;
    }

    public void setChecklists(Vector<Checklist> checklists) {
        _checklists = checklists;
    }
}
The line:
String output = mapper.writeValueAsString(testOut);
causes an exception that has had me baffled for hours; I'm about to abandon using this at all.
Any hints are appreciated.
Here is the exception:
No serializer found for class DataStore$Checklist and no properties discovered to create BeanSerializer (to avoid exception, disable SerializationConfig.Feature.FAIL_ON_EMPTY_BEANS) ) (through reference chain: DataStore["checklists"]->java.util.Vector[0])
There are multiple ways to do it, but I will start with what you are doing wrong: your naming of the getter and setter methods is wrong -- in Java one uses camel case, so you should be using "getTitle". Because of this, the properties are not found.
Besides renaming methods to use Java-style names, there are alternatives:
You can use the annotation @JsonProperty("title") on GetTitle(), so that the property is recognized (a sketch of this option follows below).
If you don't want the wrapper object, you could alternatively just add @JsonValue to GetTitle(), in which case the value used for the whole object would be the return value of that method.
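For illustration, a minimal sketch of the @JsonProperty option applied to the question's Checklist class, leaving the method names as they are:
public static class Checklist {
    private String _title;

    @JsonProperty("title")   // expose the non-conventional getter as property "title"
    public String GetTitle() {
        return _title;
    }

    @JsonProperty("title")   // likewise for the setter, so deserialization works too
    public void SetTitle(String title) {
        _title = title;
    }
}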
The answer seems to be: You can't do that with Json. I've seen comments in the Gson tutorial as well, that state that some serialization just doesn't work. I downloaded XStream and spat it out with XML in a few minutes of work and a lot less construction around what I really wanted to persist. In the process, I was able to delete a lot of code.