GenericJackson2Json Deserialization error - serialization

I am able to save data to Redis in JSON format, but I am not able to retrieve it as an object. I recently upgraded from Jackson 2.10.x to 2.13.4.
@Bean
public RedisTemplate<String, String> redisTemplate(
RedisConnectionFactory redisConnectionFactory
) {
RedisTemplate<String, String> template = new RedisTemplate<>();
template.setConnectionFactory(redisConnectionFactory);
template.setEnableTransactionSupport(false);
template.setEnableDefaultSerializer(false);
template.setKeySerializer(new StringRedisSerializer(StandardCharsets.UTF_8));
template.setHashKeySerializer(new StringRedisSerializer(StandardCharsets.UTF_8));
ObjectMapper objectMapper = new ObjectMapper()
.findAndRegisterModules()
.registerModule(new JavaTimeModule())
.setVisibility(PropertyAccessor.ALL, JsonAutoDetect.Visibility.ANY)
.activateDefaultTypingAsProperty(
BasicPolymorphicTypeValidator.builder().allowIfBaseType(Example.class).build(),
ObjectMapper.DefaultTyping.NON_FINAL,
"#class"
)
.disable(SerializationFeature.WRITE_DATES_AS_TIMESTAMPS);
template.setHashValueSerializer(
new GenericJackson2JsonRedisSerializer(
objectMapper
)
);
return template;
}
Caused by: com.fasterxml.jackson.databind.exc.InvalidTypeIdException: Could not resolve type id 'com.xxx.xxx.xxx.xxx.xxx.xxx.Example' as a subtype of java.lang.Object: Configured PolymorphicTypeValidator (of type com.fasterxml.jackson.databind.jsontype.BasicPolymorphicTypeValidator) denied resolution
at [Source: (byte[])"{"#class":"com.xxx.xxx.xxx.xxx.xxx.xxx.Example","status":["com.xxx.xxx.xxx.xxx.xxx.xxx.Status","NEW"],"rawQuery":"abcdefghijklmnopqrstuvwxyz","createdAt":["java.time.Instant","2023-01-30T14:25:06.938329600Z"]}"; line: 1, column: 11]
at com.fasterxml.jackson.databind.exc.InvalidTypeIdException.from(InvalidTypeIdException.java:43)
at com.fasterxml.jackson.databind.DeserializationContext.invalidTypeIdException(DeserializationContext.java:2073)
at com.fasterxml.jackson.databind.DatabindContext._throwSubtypeClassNotAllowed(DatabindContext.java:287)
at com.fasterxml.jackson.databind.DatabindContext.resolveAndValidateSubType(DatabindContext.java:244)
at com.fasterxml.jackson.databind.jsontype.impl.ClassNameIdResolver._typeFromId(ClassNameIdResolver.java:72)
at com.fasterxml.jackson.databind.jsontype.impl.ClassNameIdResolver.typeFromId(ClassNameIdResolver.java:66)
at com.fasterxml.jackson.databind.jsontype.impl.TypeDeserializerBase._findDeserializer(TypeDeserializerBase.java:159)
at com.fasterxml.jackson.databind.jsontype.impl.AsPropertyTypeDeserializer._deserializeTypedForId(AsPropertyTypeDeserializer.java:125)
at com.fasterxml.jackson.databind.jsontype.impl.AsPropertyTypeDeserializer.deserializeTypedFromObject(AsPropertyTypeDeserializer.java:110)
at com.fasterxml.jackson.databind.jsontype.impl.AsPropertyTypeDeserializer.deserializeTypedFromAny(AsPropertyTypeDeserializer.java:213)
at com.fasterxml.jackson.databind.deser.std.UntypedObjectDeserializer$Vanilla.deserializeWithType(UntypedObjectDeserializer.java:781)
at com.fasterxml.jackson.databind.deser.impl.TypeWrappedDeserializer.deserialize(TypeWrappedDeserializer.java:74)
at com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:323)
at com.fasterxml.jackson.databind.ObjectMapper._readMapAndClose(ObjectMapper.java:4674)
at com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:3690)
at org.springframework.data.redis.serializer.GenericJackson2JsonRedisSerializer.deserialize(GenericJackson2JsonRedisSerializer.java:163)
... 81 more
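For context, GenericJackson2JsonRedisSerializer deserializes into java.lang.Object, so the validator has to resolve the "#class" type id against the base type Object rather than Example; an allowlist built only with allowIfBaseType(Example.class) never matches that base type, so the id is denied, which is exactly what the stack trace shows. A minimal sketch of a wider validator follows; the package prefix is hypothetical and this only illustrates how the allowlist is consulted, it is not a confirmed fix for this setup:
BasicPolymorphicTypeValidator ptv = BasicPolymorphicTypeValidator.builder()
    // allow application types by (hypothetical) package prefix
    .allowIfSubType("com.xxx.")
    // the stored JSON also embeds typed java.time values such as Instant
    .allowIfSubType("java.time.")
    .build();
ObjectMapper objectMapper = new ObjectMapper()
    .registerModule(new JavaTimeModule())
    .activateDefaultTypingAsProperty(ptv, ObjectMapper.DefaultTyping.NON_FINAL, "#class");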

Related

Google Cloud Memory Store (Redis), can't connect to redis when instance is just started

I have a problem connecting to Redis right after my instance has started.
I use:
runtime: java
env: flex
runtime_config:
jdk: openjdk8
I get the following exception:
Caused by: redis.clients.jedis.exceptions.JedisConnectionException: java.net.SocketTimeoutException: connect timed out
RedisConnectionFailureException: Cannot get Jedis connection; nested exception is redis.clients.jedis.exceptions.JedisConnectionException: Could not get a resource from the pool
java.net.SocketTimeoutException: connect timed out
After 2-3 minutes it works smoothly.
Do I need to add some check in my code, or how should I fix this properly?
P.S.
I also use Spring Boot, with the following configuration:
@Value("${spring.redis.host}")
private String redisHost;
@Bean
JedisConnectionFactory jedisConnectionFactory() {
// https://cloud.google.com/memorystore/docs/redis/quotas
RedisStandaloneConfiguration config = new RedisStandaloneConfiguration(redisHost, 6379);
return new JedisConnectionFactory(config);
}
@Bean
public RedisTemplate<String, Object> redisTemplate(
@Autowired JedisConnectionFactory jedisConnectionFactory
) {
RedisTemplate<String, Object> template = new RedisTemplate<>();
template.setConnectionFactory(jedisConnectionFactory);
template.setKeySerializer(new StringRedisSerializer());
template.setValueSerializer(new GenericJackson2JsonRedisSerializer(newObjectMapper()));
return template;
}
in pom.xml
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-redis</artifactId>
<version>2.1.2.RELEASE</version>
I solved this problem as follows: in short, I added a "ping" method that tries to set and get a value from Redis; if that works, the application is ready.
Implementation:
First, you need to update app.yaml by adding the following:
readiness_check:
path: "/readiness_check"
check_interval_sec: 5
timeout_sec: 4
failure_threshold: 2
success_threshold: 2
app_start_timeout_sec: 300
Second, in your rest controller:
@GetMapping("/readiness_check")
public ResponseEntity<?> readiness_check() {
if (!cacheConfig.ping()) {
return ResponseEntity.notFound().build();
}
return ResponseEntity.ok().build();
}
Third, in the CacheConfig class:
public boolean ping() {
long prefix = System.currentTimeMillis();
try {
redisTemplate.opsForValue().set("readiness_check_" + prefix, Boolean.TRUE, 100, TimeUnit.SECONDS);
Boolean val = (Boolean) redisTemplate.opsForValue().get("readiness_check_" + prefix);
return Boolean.TRUE.equals(val);
} catch (Exception e) {
LOGGER.info("ping failed for " + System.currentTimeMillis());
return false;
}
}
P.S.
Also if somebody needs the full implementation of CacheConfig:
@Configuration
public class CacheConfig {
private static final Logger LOGGER = Logger.getLogger(CacheConfig.class.getName());
@Value("${spring.redis.host}")
private String redisHost;
private final RedisTemplate<String, Object> redisTemplate;
@Autowired
public CacheConfig(@Lazy RedisTemplate<String, Object> redisTemplate) {
this.redisTemplate = redisTemplate;
}
@Bean
JedisConnectionFactory jedisConnectionFactory(
@Autowired JedisPoolConfig poolConfig
) {
// https://cloud.google.com/memorystore/docs/redis/quotas
RedisStandaloneConfiguration config = new RedisStandaloneConfiguration(redisHost, 6379);
JedisClientConfiguration clientConfig = JedisClientConfiguration
.builder()
.usePooling()
.poolConfig(poolConfig)
.build();
return new JedisConnectionFactory(config, clientConfig);
}
@Bean
public RedisTemplate<String, Object> redisTemplate(
@Autowired JedisConnectionFactory jedisConnectionFactory
) {
RedisTemplate<String, Object> template = new RedisTemplate<>();
template.setConnectionFactory(jedisConnectionFactory);
template.setKeySerializer(new StringRedisSerializer());
template.setValueSerializer(new GenericJackson2JsonRedisSerializer(newObjectMapper()));
return template;
}
/**
* Example: https://github.com/PengliuIBM/pws_demo/blob/1becdca1bc19320c2742504baa1cada3260f8d93/redisData/src/main/java/com/pivotal/wangyu/study/springdataredis/config/RedisConfig.java
*/
@Bean
redis.clients.jedis.JedisPoolConfig jedisPoolConfig() {
final redis.clients.jedis.JedisPoolConfig poolConfig = new redis.clients.jedis.JedisPoolConfig();
// Maximum active connections to Redis instance
poolConfig.setMaxTotal(16);
// Number of connections to Redis that just sit there and do nothing
poolConfig.setMaxIdle(16);
// Minimum number of idle connections to Redis - these can be seen as always open and ready to serve
poolConfig.setMinIdle(8);
// Tests whether connection is dead when returning a connection to the pool
poolConfig.setTestOnBorrow(true);
// Tests whether connection is dead when connection retrieval method is called
poolConfig.setTestOnReturn(true);
// Tests whether connections are dead during idle periods
poolConfig.setTestWhileIdle(true);
return poolConfig;
}
public boolean ping() {
long prefix = System.currentTimeMillis();
try {
redisTemplate.opsForValue().set("readiness_check_" + prefix, Boolean.TRUE, 100, TimeUnit.SECONDS);
Boolean val = (Boolean) redisTemplate.opsForValue().get("readiness_check_" + prefix);
return Boolean.TRUE.equals(val);
} catch (Exception e) {
LOGGER.info("ping failed for " + System.currentTimeMillis());
return false;
}
}
}

Create and query a binary cache in ignite

I am trying to use BinaryObjects to create the cache at runtime. For example, instead of writing a POJO class such as Employee and configuring it as the cache value type, I need to be able to dynamically configure the cache with the field names and field types for that particular cache.
Here is some sample code:
public class EmployeeQuery {
public static void main(String[] args) throws Exception {
Ignition.setClientMode(true);
try (Ignite ignite = Ignition.start("examples/config/example-ignite.xml")) {
if (!ExamplesUtils.hasServerNodes(ignite))
return;
CacheConfiguration<Integer, BinaryObject> cfg = getbinaryCache("emplCache", 1);
ignite.destroyCache(cfg.getName());
try (IgniteCache<Integer, BinaryObject> emplCache = ignite.getOrCreateCache(cfg)) {
SqlFieldsQuery top5Qry = new SqlFieldsQuery("select * from Employee where salary > 500 limit 5", true);
while (true) {
QueryCursor<List<?>> top5qryResult = emplCache.query(top5Qry);
System.out.println(">>> Employees ");
List<List<?>> all = top5qryResult.getAll();
for (List<?> list : all) {
System.out.println("Top 5 query result : "+list.get(0) + " , "+ list.get(1) + " , " + list.get(2));
}
System.out.println("..... ");
Thread.sleep(5000);
}
}
finally {
ignite.destroyCache(cfg.getName());
}
}
}
private static QueryEntity createEmployeeQueryEntity() {
QueryEntity employeeEntity = new QueryEntity();
employeeEntity.setTableName("Employee");
employeeEntity.setValueType(BinaryObject.class.getName());
employeeEntity.setKeyType(Integer.class.getName());
LinkedHashMap<String, String> fields = new LinkedHashMap<>();
fields.put("id", Integer.class.getName());
fields.put("firstName", String.class.getName());
fields.put("lastName", String.class.getName());
fields.put("salary", Float.class.getName());
fields.put("gender", String.class.getName());
employeeEntity.setFields(fields);
employeeEntity.setIndexes(Arrays.asList(
new QueryIndex("id"),
new QueryIndex("firstName"),
new QueryIndex("lastName"),
new QueryIndex("salary"),
new QueryIndex("gender")
));
return employeeEntity;
}
public static CacheConfiguration<Integer, BinaryObject> getbinaryCache(String cacheName, int duration) {
CacheConfiguration<Integer, BinaryObject> cfg = new CacheConfiguration<>(cacheName);
cfg.setCacheMode(CacheMode.PARTITIONED);
cfg.setName(cacheName);
cfg.setStoreKeepBinary(true);
cfg.setAtomicityMode(CacheAtomicityMode.ATOMIC);
cfg.setIndexedTypes(Integer.class, BinaryObject.class);
cfg.setExpiryPolicyFactory(FactoryBuilder.factoryOf(new CreatedExpiryPolicy(new Duration(SECONDS, duration))));
cfg.setQueryEntities(Arrays.asList(createEmployeeQueryEntity()));
return cfg;
}
}
I am trying to configure the cache with the employeeId (Integer) as key and the whole employee record (BinaryObject) as value. When I run the above class, I get the following exception:
Caused by: org.h2.jdbc.JdbcSQLException: Table "EMPLOYEE" not found; SQL statement:
select * from "emplCache".Employee where salary > 500 limit 5
What am I doing wrong here? Is anything more needed besides this line:
employeeEntity.setTableName("Employee");
Next, I am trying to stream data into the cache. Is this the right way to do it?
public class CsvStreamer {
public static void main(String[] args) throws IOException {
Ignition.setClientMode(true);
try (Ignite ignite = Ignition.start("examples/config/example-ignite.xml")) {
if (!ExamplesUtils.hasServerNodes(ignite))
return;
CacheConfiguration<Integer, BinaryObject> cfg = EmployeeQuery.getbinaryCache("emplCache", 1);
try (IgniteDataStreamer<Integer, BinaryObject> stmr = ignite.dataStreamer(cfg.getName())) {
while (true) {
InputStream in = new FileInputStream(new File(args[0]));
try (LineNumberReader rdr = new LineNumberReader(new InputStreamReader(in))) {
int count =0;
for (String line = rdr.readLine(); line != null; line = rdr.readLine()) {
String[] words = line.split(",");
BinaryObject emp = getBinaryObject(words);
stmr.addData(new Integer(words[0]), emp);
System.out.println("Sent data "+count++ +" , sal : "+words[6]);
}
}
}
}
}
}
private static BinaryObject getBinaryObject(String[] rawData) {
BinaryObjectBuilder builder = Ignition.ignite().binary().builder("Employee");
builder.setField("id", new Integer(rawData[0]));
builder.setField("firstName", rawData[1]);
builder.setField("lastName", rawData[2]);
builder.setField("salary", new Float(rawData[6]));
builder.setField("gender", rawData[4]);
BinaryObject binaryObj = builder.build();
return binaryObj;
}
}
Note: I am running this in cluster mode. I run both EmployeeQuery and CsvStreamer from one machine, and I have Ignite running in server mode on two other machines. Ideally I want to avoid the use of a POJO class in my application and make things as dynamic and generic as possible.
You are getting this exception because you didn't configure the SQL schema. In your case (since you don't want to create a POJO class, etc.) I recommend using the SQL-like (DDL) syntax that was added in Apache Ignite 2.0. I'm sure the following example will help you with the configuration: https://github.com/apache/ignite/blob/master/examples/src/main/java/org/apache/ignite/examples/datagrid/CacheQueryDdlExample.java
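As a rough sketch of that approach (assuming Ignite 2.x; the cache and column names below mirror the question and are otherwise hypothetical), the table can be created, populated and queried purely through SqlFieldsQuery, with no POJO or QueryEntity:
// inside the same try-with-resources block as in the question, with an Ignite instance at hand
IgniteCache<?, ?> cache = ignite.getOrCreateCache("default"); // any cache works as an SQL entry point
cache.query(new SqlFieldsQuery(
    "CREATE TABLE IF NOT EXISTS Employee (id INT PRIMARY KEY, firstName VARCHAR, " +
    "lastName VARCHAR, salary FLOAT, gender VARCHAR) WITH \"template=partitioned\"")).getAll();
cache.query(new SqlFieldsQuery(
    "INSERT INTO Employee (id, firstName, lastName, salary, gender) VALUES (?, ?, ?, ?, ?)")
    .setSchema("PUBLIC")
    .setArgs(1, "Jane", "Doe", 1000f, "F")).getAll();
List<List<?>> top5 = cache.query(new SqlFieldsQuery(
    "SELECT * FROM Employee WHERE salary > 500 LIMIT 5").setSchema("PUBLIC")).getAll();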

How to use IBM MobileFirst java adapter to update existing entity?

The JAX-RS resource method can receive JSON that is only part of a document.
My issue is that I have to update an existing object (entity), so I decided to create a JAX-RS ContainerRequestFilter. This filter has to fetch the existing object, replace its properties with the new ones, and put it back into the stream, so that I get the complete entity in my resource method.
At first I have to get the data for the authenticated user, but 'securityContext.getAuthenticatedUser()' returns only partially provided JSON data.
Is there any way to get the authenticated user data in a JAX-RS filter (on the IBM MobileFirst platform)?
Here is the code of my filter:
@Provider
//@ManagedBean
public class UpdateFilter implements ContainerRequestFilter {
//ReaderInterceptor {
//@Inject
//ExistingObjectDao existingObjectDao;
@Context
AdapterSecurityContext securityContext;
@Override
@OAuthSecurity(scope = "protected") //doesn't work
public void filter(ContainerRequestContext context) throws IOException {
//context.getSecurityContext().getUserPrincipal() // is null
AuthenticatedUser user = securityContext.getAuthenticatedUser(); //is null
Map<String, String> authParams = (Map<String, String>) user.getAttributes().get("lotusCredentials");
InputStream inputStream = context.getEntityStream();
byte[] bytes = new byte[inputStream.available()];
inputStream.read(bytes);
String responseContent = new String(bytes);
String id = context.getUriInfo().getPathParameters().getFirst("id");
Object existingObject = null;
try {
existingObject = existingObjectDao.get(id, authParams);
} catch (Exception e) {
e.printStackTrace();
}
if (existingObject != null) {
ObjectMapper objectMapper = new ObjectMapper();
ObjectReader reader = objectMapper.readerForUpdating(existingObject );
JsonNode r = reader.readTree(responseContent);
responseContent = objectMapper.writer().writeValueAsString(r);
}
context.setEntityStream(new ByteArrayInputStream(responseContent.getBytes()));
}
}

How To update google-cloud-dataflow running in app engine without clearing bigquery tables

I have a google-cloud-dataflow process running on App Engine.
It listens to messages sent via Pub/Sub and streams them to BigQuery.
I updated my code and I am trying to rerun the app.
But I receive this error:
Exception in thread "main" java.lang.IllegalArgumentException: BigQuery table is not empty
Is there any way to update the dataflow job without deleting the table?
My code might change quite often, and I do not want to delete the data in the table.
Here is my code:
public class MyPipline {
private static final Logger LOG = LoggerFactory.getLogger(MyPipline.class);
private static String name;
public static void main(String[] args) {
List<TableFieldSchema> fields = new ArrayList<>();
fields.add(new TableFieldSchema().setName("a").setType("string"));
fields.add(new TableFieldSchema().setName("b").setType("string"));
fields.add(new TableFieldSchema().setName("c").setType("string"));
TableSchema tableSchema = new TableSchema().setFields(fields);
DataflowPipelineOptions options = PipelineOptionsFactory.as(DataflowPipelineOptions.class);
options.setRunner(BlockingDataflowPipelineRunner.class);
options.setProject("my-data-analysis");
options.setStagingLocation("gs://my-bucket/dataflow-jars");
options.setStreaming(true);
Pipeline pipeline = Pipeline.create(options);
PCollection<String> input = pipeline
.apply(PubsubIO.Read.subscription(
"projects/my-data-analysis/subscriptions/myDataflowSub"));
input.apply(ParDo.of(new DoFn<String, Void>() {
@Override
public void processElement(DoFn<String, Void>.ProcessContext c) throws Exception {
LOG.info("json" + c.element());
}
}));
String fileName = UUID.randomUUID().toString().replaceAll("-", "");
input.apply(ParDo.of(new DoFn<String, String>() {
@Override
public void processElement(DoFn<String, String>.ProcessContext c) throws Exception {
JSONObject firstJSONObject = new JSONObject(c.element());
firstJSONObject.put("a", firstJSONObject.get("a").toString()+ "1000");
c.output(firstJSONObject.toString());
}
}).named("update json")).apply(ParDo.of(new DoFn<String, TableRow>() {
@Override
public void processElement(DoFn<String, TableRow>.ProcessContext c) throws Exception {
JSONObject json = new JSONObject(c.element());
TableRow row = new TableRow().set("a", json.get("a")).set("b", json.get("b")).set("c", json.get("c"));
c.output(row);
}
}).named("convert json to table row"))
.apply(BigQueryIO.Write.to("my-data-analysis:mydataset.mytable").withSchema(tableSchema)
);
pipeline.run();
}
}
You need to specify withWriteDisposition on your BigQueryIO.Write - see documentation of the method and of its argument. Depending on your requirements, you need either WRITE_TRUNCATE or WRITE_APPEND.
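For example, with the Dataflow 1.x SDK used in the question, the final apply might look like this (a sketch; WRITE_APPEND keeps the rows already in the table, WRITE_TRUNCATE replaces them):
.apply(BigQueryIO.Write
    .to("my-data-analysis:mydataset.mytable")
    .withSchema(tableSchema)
    // append to the existing table instead of requiring it to be empty
    .withWriteDisposition(BigQueryIO.Write.WriteDisposition.WRITE_APPEND));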

JSON mapping error in Spring integration

I have configured an inbound HTTP gateway which takes POST requests (JSON), does some work, and returns a JSON response; the request and response payload is the same POJO.
I created beans for the JSON converters as follows:
@Bean
public Jackson2ObjectMapperBuilder jacksonBuilder() {
Jackson2ObjectMapperBuilder builder = new Jackson2ObjectMapperBuilder();
builder.indentOutput(true);
return builder;
}
@Bean
public List<HttpMessageConverter<?>> getConverters(){
List<HttpMessageConverter<?>> converters = new ArrayList<HttpMessageConverter<?>>();
converters.add(new MappingJackson2HttpMessageConverter(jacksonBuilder().build()));
return converters;
}
I then wired them up to the gateway definition in the same Java config class; snippet as follows:
@Bean
public HttpRequestHandlingMessagingGateway gateway(){
HttpRequestHandlingMessagingGateway gateway = new HttpRequestHandlingMessagingGateway(true);
RequestMapping requestMapping = new RequestMapping();
requestMapping.setMethods(HttpMethod.POST);
requestMapping.setPathPatterns("/appliance/v1/status");
requestMapping.setConsumes("application/json");
requestMapping.setProduces("application/json");
gateway.setRequestMapping(requestMapping);
gateway.setRequestChannel(requestChannel());
gateway.setReplyChannel(replyChannel());
gateway.setMessageConverters(getConverters());
return gateway;
}
The POJO I intend to transform is pretty straightforward:
public class ApplianceStatus {
private String gatewayId;
private String applianceId;
private boolean running;
public String getGatewayId() {
return gatewayId;
}
public void setGatewayId(String gatewayId) {
this.gatewayId = gatewayId;
}
public String getApplianceId() {
return applianceId;
}
public void setApplianceId(String applianceId) {
this.applianceId = applianceId;
}
public boolean isRunning() {
return running;
}
public void setRunning(boolean running) {
this.running = running;
}
}
However, a POST request with the Content-Type header set to application/json returns 400. The JSON I send is:
{
"gatewayId": 1,
"applianceId": 123,
"running": false
}
I get the response
{
"timestamp" : 1434615561240,
"status" : 400,
"error" : "Bad Request",
"exception" : "org.springframework.http.converter.HttpMessageNotReadableException",
"message" : "Bad Request",
"path" : "/appliance/v1/status"
}
and in the logs:
2015-06-18 14:55:30.501 DEBUG 3447 --- [tp1023996917-22] .w.s.m.a.ResponseStatusExceptionResolver : Resolving exception from handler [gateway]: org.springframework.http.converter.HttpMessageNotReadableException: Could not read JSON: Can not deserialize instance of byte[] out of START_OBJECT token
at [Source: HttpInputOverHTTP@55c2d2c5; line: 1, column: 1]; nested exception is com.fasterxml.jackson.databind.JsonMappingException: Can not deserialize instance of byte[] out of START_OBJECT token
at [Source: HttpInputOverHTTP@55c2d2c5; line: 1, column: 1]
2015-06-18 14:55:30.501 DEBUG 3447 --- [tp1023996917-22] .w.s.m.s.DefaultHandlerExceptionResolver : Resolving exception from handler [gateway]: org.springframework.http.converter.HttpMessageNotReadableException: Could not read JSON: Can not deserialize instance of byte[] out of START_OBJECT token
at [Source: HttpInputOverHTTP@55c2d2c5; line: 1, column: 1]; nested exception is com.fasterxml.jackson.databind.JsonMappingException: Can not deserialize instance of byte[] out of START_OBJECT token
at [Source: HttpInputOverHTTP@55c2d2c5; line: 1, column: 1]
The problem was that the request payload type was not set on the gateway:
gateway.setRequestPayloadType(ApplianceStatus.class);
That solved it.
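For reference, a sketch of the tail of the gateway() bean above with that one line added; it tells the Jackson message converter which type to bind the request body to, instead of the default byte[]:
gateway.setRequestChannel(requestChannel());
gateway.setReplyChannel(replyChannel());
gateway.setMessageConverters(getConverters());
// bind the incoming JSON to ApplianceStatus rather than byte[]
gateway.setRequestPayloadType(ApplianceStatus.class);
return gateway;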