How to get an ArrayList of POJOs from Amazon Lambda (getting only LinkedTreeMap)

I am trying to call my AWS Lambda function (serverless backend) from my Android mobile app client. The AWS Lambda function returns an ArrayList of POJOs (as JSON).
The problem is that the Android client's AWS Lambda (JSON) data binder does not deserialize to my ArrayList of POJOs; I get an ArrayList of LinkedTreeMap instead (see the code at onPostExecute() below).
On the Android client side I'm using the AWS Android SDK: com.amazonaws:aws-android-sdk-core:2.6
Here is some code:
public void readSurveyList(String strUuid, int intLanguageID) {
// Create an instance of CognitoCachingCredentialsProvider
// You have to configure at least an AWS identity pool to get access to your lambda function
CognitoCachingCredentialsProvider credentialsProvider = new CognitoCachingCredentialsProvider(
this.getApplicationContext(),
IDENTITY_POOL_ID,
Regions.EU_CENTRAL_1);
LambdaInvokerFactory factory = LambdaInvokerFactory.builder()
.context(this.getApplicationContext())
.region(Regions.EU_CENTRAL_1)
.credentialsProvider(credentialsProvider)
.build();
// Create the Lambda proxy object with default Json data binder.
myInterface = factory.build(MyInterface.class);
//create a request object (depends on your lambda function)
SurveyListRequest surveyListRequest = new SurveyListRequest(strUuid, intLanguageID);
// Lambda function in async task with definiton of
// request object (-> SurveyListRequest)
// response object (-> ArrayList<SurveyListItem>>)
new AsyncTask<SurveyListRequest, Void, ArrayList<SurveyListItem>>() {
@Override
protected ArrayList<SurveyListItem> doInBackground(SurveyListRequest... params) {
try {
return myInterface.ReadSurveyList(params[0]);
} catch (LambdaFunctionException lfe) {
Log.e("TAG", String.format("echo method failed: error [%s], details [%s].", lfe.getMessage(), lfe.getDetails()));
return null;
}
}
@Override
protected void onPostExecute(ArrayList<SurveyListItem> surveyList) {
// PROBLEM: here I get an ArrayList of LinkedTreeMap
}
}.execute(surveyListRequest);
}
Here is the code of my lambda function Interface:
public interface MyInterface {
@LambdaFunction
ArrayList<SurveyListItem> ReadSurveyList (SurveyListRequest surveyListRequest);
}
I would expect to get a list of my POJO objects. I found a lot of discussions about Gson and the ArrayList type, and solutions based on TypeToken (e.g. Gson TypeToken with dynamic ArrayList item type). Maybe it is the same problem...
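For illustration, here is a minimal, self-contained sketch (plain Gson, with a stand-in SurveyListItem POJO, not the AWS SDK itself) of why Gson falls back to LinkedTreeMap when it only sees the raw ArrayList class, and how a TypeToken keeps the element type:
import com.google.gson.Gson;
import com.google.gson.reflect.TypeToken;
import java.lang.reflect.Type;
import java.util.ArrayList;

public class TypeTokenDemo {
    static class SurveyListItem {
        String title;
    }

    public static void main(String[] args) {
        String json = "[{\"title\":\"Survey A\"},{\"title\":\"Survey B\"}]";
        Gson gson = new Gson();

        // Without the element type, Gson can only build generic maps.
        ArrayList<?> raw = gson.fromJson(json, ArrayList.class);
        System.out.println(raw.get(0).getClass()); // class com.google.gson.internal.LinkedTreeMap

        // With a TypeToken the element type survives erasure and real POJOs come back.
        Type listType = new TypeToken<ArrayList<SurveyListItem>>() {}.getType();
        ArrayList<SurveyListItem> typed = gson.fromJson(json, listType);
        System.out.println(typed.get(0).title); // Survey A
    }
}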

I found a solution using a custom LambdaDataBinder. I specify the type of my POJO class "SurveyListItem" in the deserialize function. Gson uses the TypeToken definition and converts the JSON string correctly to the list of POJOs (in my case, "SurveyListItem" objects).
Here is the source code of MyLambdaDataBinder:
public class MyLambdaDataBinder implements LambdaDataBinder {
private final Gson gson;
Type mType;
//CUSTOMIZATION: pass typetoken via class constructor
public MyLambdaDataBinder(Type type) {
this.gson = new Gson();
mType = type;
}
@Override
public <T> T deserialize(byte[] content, Class<T> clazz) {
if (content == null) {
return null;
}
Reader reader = new BufferedReader(new InputStreamReader(new ByteArrayInputStream(content)));
//CUSTOMIZATION: Original line of code: return gson.fromJson (reader, clazz);
return gson.fromJson(reader, mType);
}
@Override
public byte[] serialize(Object object) {
return gson.toJson(object).getBytes(StringUtils.UTF8);
}
}
Here is how to use the custom MyLambdaDataBinder. Use your POJO instead of "SurveyListItem":
myInterface = factory.build(MyInterface.class, new MyLambdaDataBinder(new TypeToken<ArrayList<SurveyListItem>>() {}.getType()));

Related

Google Cloud Functions - Realtime Database Trigger - how to deserialize data JSON to POJO?

As described on the Google Cloud Functions docs, it is possible to trigger a Function based on Firebase Realtime Database events (write/create/update/delete).
The following docs sample explains how to get the delta snapshot.
public class FirebaseRtdb implements RawBackgroundFunction {
private static final Logger logger = Logger.getLogger(FirebaseRtdb.class.getName());
// Use GSON (https://github.com/google/gson) to parse JSON content.
private static final Gson gson = new Gson();
@Override
public void accept(String json, Context context) {
logger.info("Function triggered by change to: " + context.resource());
JsonObject body = gson.fromJson(json, JsonObject.class);
boolean isAdmin = false;
if (body != null && body.has("auth")) {
JsonObject authObj = body.getAsJsonObject("auth");
isAdmin = authObj.has("admin") && authObj.get("admin").getAsBoolean();
}
logger.info("Admin?: " + isAdmin);
if (body != null && body.has("delta")) {
logger.info("Delta:");
logger.info(body.get("delta").toString());
}
}
}
The sample works perfectly but the question is: How can I deserialize this delta to a POJO?
I tried:
val mObject = gson.fromJson(body.get("delta").toString(), MyCustomObject::class.java)
But I am getting:
com.google.gson.JsonSyntaxException: java.lang.IllegalStateException: Expected BEGIN_ARRAY but was BEGIN_OBJECT
As far as I know, this is because my MyCustomObject class has a List<T> field, and Firebase Database always converts Lists to Maps with integer keys.
I would prefer not to change every List<T> to Map<Int, T>, because I have a lot of classes :(
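For context, here is a tiny, hedged reproduction (illustrative field names, reusing the myPaymentList field mentioned below) of the mismatch: a POJO that declares a List fails with the quoted exception when the JSON stores that list as an index-keyed object, which is the shape described above:
import com.google.gson.Gson;
import java.util.ArrayList;
import java.util.List;

public class BeginArrayRepro {
    static class Payment { int amount; }
    static class MyCustomObject { List<Payment> myPaymentList = new ArrayList<>(); }

    public static void main(String[] args) {
        Gson gson = new Gson();

        // A real JSON array deserializes fine.
        String asArray = "{\"myPaymentList\":[{\"amount\":10},{\"amount\":20}]}";
        System.out.println(gson.fromJson(asArray, MyCustomObject.class).myPaymentList.size()); // 2

        // The map-with-integer-keys shape fails with:
        // JsonSyntaxException: Expected BEGIN_ARRAY but was BEGIN_OBJECT
        String asMap = "{\"myPaymentList\":{\"0\":{\"amount\":10},\"1\":{\"amount\":20}}}";
        gson.fromJson(asMap, MyCustomObject.class);
    }
}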
Thanks in advance!
So, here is what I ended up doing (maybe not the best solution!):
1) Create a custom JSON deserializer for lists coming from Firebase:
class ListFirebaseDeserializer<T> : JsonDeserializer<ArrayList<T>> {
override fun deserialize(json: JsonElement?, typeOfT: Type?, context: JsonDeserializationContext?): ArrayList<T> {
val result = ArrayList<T>()
val typeOfElement = (typeOfT as ParameterizedType).actualTypeArguments[0]
json?.let {
json.asJsonObject.entrySet().forEach {
entry->
result.add(Gson().fromJson(entry.value, typeOfElement))
}
}
return result
}
}
This takes the lists that Firebase turned into maps and converts them back to actual lists.
2) Annotate every list in my POJO with @JsonAdapter(ListFirebaseDeserializer::class), for instance:
class MyCustomObject {
@JsonAdapter(ListFirebaseDeserializer::class)
var myPaymentList = ArrayList<Payment>()
}
It could be a pain if you already have lots of lists to annotate, but it is better than having to use maps instead.
Hope it helps!

Storm KafkaSpout Kryo serialization issue for Java bean from Kafka topic

Hi, I am new to Storm and Kafka.
I am using Storm 1.0.1 and Kafka 0.10.0.
We have a KafkaSpout that receives a Java bean from a Kafka topic.
I have spent several hours digging to find the right approach for that.
I found a few useful articles, but none of the approaches has worked for me so far.
Following is my code:
StormTopology:
public class StormTopology {
public static void main(String[] args) throws Exception {
//Topo test /zkroot test
if (args.length == 4) {
System.out.println("started");
BrokerHosts hosts = new ZkHosts("localhost:2181");
SpoutConfig kafkaConf1 = new SpoutConfig(hosts, args[1], args[2],
args[3]);
kafkaConf1.zkRoot = args[2];
kafkaConf1.useStartOffsetTimeIfOffsetOutOfRange = true;
kafkaConf1.startOffsetTime = kafka.api.OffsetRequest.LatestTime();
kafkaConf1.scheme = new SchemeAsMultiScheme(new KryoScheme());
KafkaSpout kafkaSpout1 = new KafkaSpout(kafkaConf1);
System.out.println("started");
ShuffleBolt shuffleBolt = new ShuffleBolt(args[1]);
AnalysisBolt analysisBolt = new AnalysisBolt(args[1]);
TopologyBuilder topologyBuilder = new TopologyBuilder();
topologyBuilder.setSpout("kafkaspout", kafkaSpout1, 1);
//builder.setBolt("counterbolt2", countbolt2, 3).shuffleGrouping("kafkaspout");
//This is for fields grouping; we need two bolts for fields grouping or it won't work
topologyBuilder.setBolt("shuffleBolt", shuffleBolt, 3).shuffleGrouping("kafkaspout");
topologyBuilder.setBolt("analysisBolt", analysisBolt, 5).fieldsGrouping("shuffleBolt", new Fields("trip"));
Config config = new Config();
config.registerSerialization(VehicleTrip.class, VehicleTripKyroSerializer.class);
config.setDebug(true);
config.setNumWorkers(1);
LocalCluster cluster = new LocalCluster();
cluster.submitTopology(args[0], config, topologyBuilder.createTopology());
// StormSubmitter.submitTopology(args[0], config,
// builder.createTopology());
} else {
System.out.println("Insufficient Arguments - topologyName kafkaTopic ZKRoot ID");
}
}
}
I am serializing the data to Kafka using Kryo.
KafkaProducer:
public class StreamKafkaProducer {
private static Producer producer;
private final Properties props = new Properties();
private static final StreamKafkaProducer KAFKA_PRODUCER = new StreamKafkaProducer();
private StreamKafkaProducer(){
props.put("bootstrap.servers", "localhost:9092");
props.put("acks", "all");
props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
props.put("value.serializer", "com.abc.serializer.MySerializer");
producer = new org.apache.kafka.clients.producer.KafkaProducer(props);
}
public static StreamKafkaProducer getStreamKafkaProducer(){
return KAFKA_PRODUCER;
}
public void produce(String topic, VehicleTrip vehicleTrip){
ProducerRecord<String,VehicleTrip> producerRecord = new ProducerRecord<>(topic,vehicleTrip);
producer.send(producerRecord);
//producer.close();
}
public static void closeProducer(){
producer.close();
}
}
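The producer above points value.serializer at com.abc.serializer.MySerializer, which is not shown in the question. Purely as a rough idea (the real class may differ), a Kafka value serializer that delegates to the question's VehicleTripKyroSerializer could look something like this:
import java.io.ByteArrayOutputStream;
import java.util.Map;

import org.apache.kafka.common.serialization.Serializer;

import com.esotericsoftware.kryo.Kryo;
import com.esotericsoftware.kryo.io.Output;

public class MySerializer implements Serializer<VehicleTrip> {

    @Override
    public void configure(Map<String, ?> configs, boolean isKey) {
        // nothing to configure
    }

    @Override
    public byte[] serialize(String topic, VehicleTrip vehicleTrip) {
        // A real implementation would reuse Kryo instances instead of creating one per record.
        Kryo kryo = new Kryo();
        kryo.addDefaultSerializer(VehicleTrip.class, new VehicleTripKyroSerializer());
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        Output output = new Output(bytes);
        kryo.writeObject(output, vehicleTrip);
        output.close(); // flushes the Kryo output into the byte stream
        return bytes.toByteArray();
    }

    @Override
    public void close() {
        // nothing to release
    }
}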
Kryo serializer:
public class DataKyroSerializer extends Serializer<Data> implements Serializable {
@Override
public void write(Kryo kryo, Output output, Data data) {
output.writeLong(data.getStartedOn().getTime());
output.writeLong(data.getEndedOn().getTime());
}
@Override
public Data read(Kryo kryo, Input input, Class<Data> aClass) {
Data data = new Data();
data.setStartedOn(new Date(input.readLong()));
data.setEndedOn(new Date(input.readLong()));
return data;
}
}
I need to get the data back as the Data bean.
As per a few articles, I need to provide a custom scheme and make it part of the topology, but so far I have had no luck.
Code for the bolt and scheme:
Scheme:
public class KryoScheme implements Scheme {
private ThreadLocal<Kryo> kryos = new ThreadLocal<Kryo>() {
protected Kryo initialValue() {
Kryo kryo = new Kryo();
kryo.addDefaultSerializer(Data.class, new DataKyroSerializer());
return kryo;
};
};
@Override
public List<Object> deserialize(ByteBuffer ser) {
return Utils.tuple(kryos.get().readObject(new ByteBufferInput(ser.array()), Data.class));
}
@Override
public Fields getOutputFields( ) {
return new Fields( "data" );
}
}
and bolt:
public class AnalysisBolt implements IBasicBolt {
/**
*
*/
private static final long serialVersionUID = 1L;
private String topicname = null;
public AnalysisBolt(String topicname) {
this.topicname = topicname;
}
public void prepare(Map stormConf, TopologyContext topologyContext) {
System.out.println("prepare");
}
public void execute(Tuple input, BasicOutputCollector collector) {
System.out.println("execute");
Fields fields = input.getFields();
try {
JSONObject eventJson = (JSONObject) JSONSerializer.toJSON((String) input
.getValueByField(fields.get(1)));
String StartTime = (String) eventJson.get("startedOn");
String EndTime = (String) eventJson.get("endedOn");
String Oid = (String) eventJson.get("_id");
int V_id = (Integer) eventJson.get("vehicleId");
//call method getEventForVehicleWithinTime(Long vehicleId, Date startTime, Date endTime)
System.out.println("==========="+Oid+"| "+V_id+"| "+StartTime+"| "+EndTime);
} catch (Exception e) {
e.printStackTrace();
}
}
But if I submit the Storm topology I am getting this error:
java.lang.IllegalStateException: Spout 'kafkaspout' contains a
non-serializable field of type com.abc.topology.KryoScheme$1, which
was instantiated prior to topology creation.
com.minda.iconnect.topology.KryoScheme$1 should be instantiated within
the prepare method of 'kafkaspout at the earliest.
I would appreciate help debugging the issue and guidance toward the right path.
Thanks
Your ThreadLocal is not Serializable. The preferable solution would be to make your serializer both Serializable and thread-safe. If this is not possible, then I see two alternatives, since there is no prepare method as you would get in a bolt.
Declare it as static, which is inherently transient.
Declare it transient and access it via a private get method. Then you can initialize the variable on first access.
Within the Storm lifecycle, the topology is instantiated and then serialized to byte format to be stored in ZooKeeper, prior to the topology being executed. Within this step, if a spout or bolt within the topology has an initialized unserializable property, serialization will fail.
If there is a need for a field that is unserializable, initialize it within the bolt or spout's prepare method, which is run after the topology is delivered to the worker.
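As a rough illustration of the second alternative (not the poster's exact code, and reusing the Scheme, Data, DataKyroSerializer, Utils, Fields and ByteBufferInput types from the question), the scheme can keep the ThreadLocal out of serialization by marking it transient and creating it lazily on first access:
public class KryoScheme implements Scheme {

    // Not serialized with the topology; recreated lazily on the worker.
    private transient ThreadLocal<Kryo> kryos;

    private ThreadLocal<Kryo> kryos() {
        if (kryos == null) {
            kryos = ThreadLocal.withInitial(() -> {
                Kryo kryo = new Kryo();
                kryo.addDefaultSerializer(Data.class, new DataKyroSerializer());
                return kryo;
            });
        }
        return kryos;
    }

    @Override
    public List<Object> deserialize(ByteBuffer ser) {
        return Utils.tuple(kryos().get().readObject(new ByteBufferInput(ser.array()), Data.class));
    }

    @Override
    public Fields getOutputFields() {
        return new Fields("data");
    }
}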
Source: Best Practices for implementing Apache Storm

Hazelcast 3.6.1 "There is no suitable de-serializer for type" exception

I am using Hazelcast 3.6.1 to read from a Map. The object class stored in the map is called Schedule.
I have configured a custom serializer on the client side like this.
ClientConfig config = new ClientConfig();
SerializationConfig sc = config.getSerializationConfig();
sc.addSerializerConfig(add(new ScheduleSerializer(), Schedule.class));
...
private SerializerConfig add(Serializer serializer, Class<? extends Serializable> clazz) {
SerializerConfig sc = new SerializerConfig();
sc.setImplementation(serializer).setTypeClass(clazz);
return sc;
}
The map is created like this
private final IMap<String, Schedule> map = client.getMap("schedule");
If I get from the map using the schedule id as the key, the map returns the correct value, e.g.
return map.get("zx81");
If I try to use an SQL predicate e.g.
return new ArrayList<>(map.values(new SqlPredicate("statusActive")));
then I get the following error
Exception in thread "main" com.hazelcast.nio.serialization.HazelcastSerializationException: There is no suitable de-serializer for type 2. This exception is likely to be caused by differences in the serialization configuration between members or between clients and members.
The custom serializer is using Kryo to serialize (based on this blog http://blog.hazelcast.com/comparing-serialization-methods/)
public class ScheduleSerializer extends CommonSerializer<Schedule> {
@Override
public int getTypeId() {
return 2;
}
@Override
protected Class<Schedule> getClassToSerialize() {
return Schedule.class;
}
}
The CommonSerializer is defined as
public abstract class CommonSerializer<T> implements StreamSerializer<T> {
protected abstract Class<T> getClassToSerialize();
@Override
public void write(ObjectDataOutput objectDataOutput, T object) {
Output output = new Output((OutputStream) objectDataOutput);
Kryo kryo = KryoInstances.get();
kryo.writeObject(output, object);
output.flush(); // do not close!
KryoInstances.release(kryo);
}
@Override
public T read(ObjectDataInput objectDataInput) {
Input input = new Input((InputStream) objectDataInput);
Kryo kryo = KryoInstances.get();
T result = kryo.readObject(input, getClassToSerialize());
input.close();
KryoInstances.release(kryo);
return result;
}
@Override
public void destroy() {
// empty
}
}
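KryoInstances is referenced above but not shown. Purely as an illustration (the actual implementation may differ), it could be a small pool that hands out reusable Kryo instances:
import java.util.Queue;
import java.util.concurrent.ConcurrentLinkedQueue;

import com.esotericsoftware.kryo.Kryo;

public final class KryoInstances {

    private static final Queue<Kryo> POOL = new ConcurrentLinkedQueue<>();

    private KryoInstances() {
    }

    public static Kryo get() {
        Kryo kryo = POOL.poll();
        return kryo != null ? kryo : new Kryo();
    }

    public static void release(Kryo kryo) {
        POOL.offer(kryo);
    }
}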
Do I need to do any configuration on the server side? I thought that the client config would be enough.
I am using Hazelcast client 3.6.1 and have one node/member running.
Queries require the nodes to know about the classes as the bytestream has to be deserialized to access the attributes and query them. This means that when you want to query on objects you have to deploy the model classes (and serializers) on the server side as well.
With key-based access, on the other hand, we do not need to look into the values (nor into the keys, since we compare the byte arrays of the keys) and just send the result back. That way neither the model classes nor the serializers have to be available on the Hazelcast nodes.
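For example, a member-side registration could look roughly like this (a sketch only; Schedule and ScheduleSerializer from the question must also be on the member's classpath):
import com.hazelcast.config.Config;
import com.hazelcast.config.SerializerConfig;
import com.hazelcast.core.Hazelcast;

public class MemberStartup {
    public static void main(String[] args) {
        // Register the same Kryo-based serializer on the member so predicates
        // can deserialize Schedule values when evaluating queries.
        Config config = new Config();
        config.getSerializationConfig().addSerializerConfig(
                new SerializerConfig()
                        .setImplementation(new ScheduleSerializer())
                        .setTypeClass(Schedule.class));
        Hazelcast.newHazelcastInstance(config);
    }
}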
I hope that makes sense.

@RabbitListener not receiving messages from queue

I am using the @RabbitListener annotation to receive messages from a RabbitMQ queue.
Although I have done all the steps required for this (i.e. added the @EnableRabbit annotation to my config class) and declared SimpleRabbitListenerContainerFactory as a bean, my method is still not receiving messages from the queue. Can anybody suggest what I am missing?
I am using Spring Boot to launch my application
My launch class
@Configuration
@EnableAutoConfiguration
@EnableRabbit
@EnableConfigurationProperties
@EntityScan("persistence.mysql.domain")
@EnableJpaRepositories("persistence.mysql.dao")
@ComponentScan(excludeFilters = { @ComponentScan.Filter(type = FilterType.ASSIGNABLE_TYPE, value = ApiAuthenticationFilter.class),@ComponentScan.Filter(type = FilterType.ASSIGNABLE_TYPE, value = ApiVersionValidationFilter.class)},basePackages = {"common", "mqclient","apache", "dispatcher" })
public class Application {
public static void main(final String[] args) {
final SpringApplicationBuilder appBuilder = new SpringApplicationBuilder(
Application.class);
appBuilder.application().setWebEnvironment(false);
appBuilder.profiles("common", "common_mysql_db", "common_rabbitmq")
.run(args);
}
@Bean
@Primary
@ConfigurationProperties(prefix = "spring.datasource")
public DataSource primaryDataSource() {
return DataSourceBuilder.create().build();
}
}
Here is my bean that defines the SimpleRabbitListenerContainerFactory inside a component class:
@Component(value = "inputQueueManager")
public class InputQueueManagerImpl extends AbstractQueueManagerImpl {
..///..
@Bean(name = "inputListenerContainerFactory")
public SimpleRabbitListenerContainerFactory rabbitListenerContainerFactory()
{
SimpleRabbitListenerContainerFactory factory = new
SimpleRabbitListenerContainerFactory();
factory.setConnectionFactory(this.rabbitConnectionFactory);
factory.setConcurrentConsumers(Integer.parseInt(this.concurrentConsumers));
factory.setMaxConcurrentConsumers(Integer.parseInt(this.maxConcurrentConsumers));
factory.setMessageConverter(new Jackson2JsonMessageConverter());
return factory;
}
}
And finally my listener inside another controller component:
@Controller
public class RabbitListner{
@RabbitListener(queues = "Storm1", containerFactory = "inputListenerContainerFactory")
@Override
public void processMessage(QueueMessage message) {
String topic = message.getTopic();
String payload = message.getPayload();
dispatcher.bean.EventBean eventBean = new dispatcher.bean.EventBean();
System.out.println("Data read from the queue");
Unfortunately, I am sending the messages to the queue but the code inside processMessage never gets executed.
I am not sure what the problem is here. Can anybody help?
By default, the Json message converter requires hints in the message properties as to what type of object to create.
If your producer does not set those properties, it won't be able to do the conversion without some help.
You can inject a ClassMapper into the converter.
The framework provides a DefaultClassMapper which can be customized, for example to look at a different message property than the default __TypeId__ property.
If you always want to convert the json to the same object, you can simply set the default type:
DefaultClassMapper classMapper = new DefaultClassMapper();
classMapper.setDefaultType(QueueMessage.class);
Jackson2JsonMessageConverter converter = new Jackson2JsonMessageConverter();
converter.setClassMapper(classMapper);
factory.setMessageConverter(converter);
The documentation already shows how to configure this.
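For context, here is a hedged sketch of how that could slot into the questioner's inputListenerContainerFactory bean (field and type names are taken from the question; adjust as needed):
@Bean(name = "inputListenerContainerFactory")
public SimpleRabbitListenerContainerFactory rabbitListenerContainerFactory() {
    SimpleRabbitListenerContainerFactory factory = new SimpleRabbitListenerContainerFactory();
    factory.setConnectionFactory(this.rabbitConnectionFactory);

    // Tell the Json converter what type to map incoming messages to when the
    // producer does not set the __TypeId__ header.
    DefaultClassMapper classMapper = new DefaultClassMapper();
    classMapper.setDefaultType(QueueMessage.class);

    Jackson2JsonMessageConverter converter = new Jackson2JsonMessageConverter();
    converter.setClassMapper(classMapper);
    factory.setMessageConverter(converter);
    return factory;
}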

Creating JSON without quotes

A library is using a Map to carry some extra information. This map is eventually converted to a JSON object, and I need to set request information on it for debugging display purposes, like this:
map.put("request", requestString);
I am considering using Jackson specifically to create JSON without quotes, and I want to set that as requestString.
I am building the necessary information about the request into a Map, including request headers, parameters, method, etc.
Jackson creates perfectly valid JSON with quotes, but when I set this generated value inside the map, it is displayed in an ugly way because the quotes get escaped.
So Jackson is creating this:
{
method : "POST",
path : "/register"
}
When I set this in the map, it turns into this:
{
method : \"POST\",
path : \"/register\"
}
Consider this as a huge map including all parameters and other information about the request.
What I would like is this:
{
method : POST,
path : /register
}
I know that this is not valid JSON, but I am using it as a String value in a Map that accepts String values. Here is the custom serializer I wrote:
public class UnQuotesSerializer extends NonTypedScalarSerializerBase<String>
{
public UnQuotesSerializer() { super(String.class); }
/**
* For Strings, both null and Empty String qualify for emptiness.
*/
@Override
public boolean isEmpty(String value) {
return (value == null) || (value.length() == 0);
}
@Override
public void serialize(String value, JsonGenerator jgen, SerializerProvider provider) throws IOException {
jgen.writeRawValue(value);
}
@Override
public JsonNode getSchema(SerializerProvider provider, Type typeHint) {
return createSchemaNode("string", true);
}
@Override
public void acceptJsonFormatVisitor(JsonFormatVisitorWrapper visitor, JavaType typeHint) throws JsonMappingException {
if (visitor != null) visitor.expectStringFormat(typeHint);
}
}
and
ObjectMapper objectMapper = new ObjectMapper();
SimpleModule module = new SimpleModule("UnQuote");
module.addSerializer(new UnQuotesSerializer());
objectMapper.configure(JsonGenerator.Feature.QUOTE_FIELD_NAMES, false);
objectMapper.configure(JsonParser.Feature.ALLOW_UNQUOTED_FIELD_NAMES, true);
objectMapper.configure(JsonParser.Feature.ALLOW_UNQUOTED_CONTROL_CHARS, true);
objectMapper.registerModule(module);
This generates strings without quotes.
The following test passes (Jackson 2.5.0):
@Test
public void test() throws Exception {
ObjectMapper mapper = new ObjectMapper();
Map map = new HashMap();
map.put("method", "POST");
map.put("request", "/register");
String s = mapper.writeValueAsString(map);
Map map2 = mapper.readValue(s, Map.class);
Assert.assertEquals(map, map2);
}
so your pseudo-JSON without quotes does not seem to be the way to go.
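A small, hedged demo of where the escaped quotes come from: nesting an already-serialized JSON String means its quotes get re-escaped, while nesting the parsed structure itself keeps the output as plain, valid JSON:
import java.util.HashMap;
import java.util.Map;

import com.fasterxml.jackson.databind.ObjectMapper;

public class EscapingDemo {
    public static void main(String[] args) throws Exception {
        ObjectMapper mapper = new ObjectMapper();

        Map<String, Object> request = new HashMap<>();
        request.put("method", "POST");
        request.put("path", "/register");
        String requestString = mapper.writeValueAsString(request);

        Map<String, Object> outer = new HashMap<>();
        outer.put("request", requestString); // nested JSON String -> \" escapes in the output
        System.out.println(mapper.writeValueAsString(outer));

        outer.put("request", request);       // nested Map -> plain quotes, valid JSON
        System.out.println(mapper.writeValueAsString(outer));
    }
}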