Spring Data Redis Reactive read LocalDateTime cast error - Jackson

I use the Spring Data Redis Reactive framework, and my Spring Boot version is 2.3.0.
Here is my Redis configuration:
@Bean("reactiveRedisTemplate")
public ReactiveRedisTemplate<String, Object> reactiveRedisTemplateString(ReactiveRedisConnectionFactory connectionFactory) {
    Jackson2JsonRedisSerializer<Object> jacksonSerializer = new Jackson2JsonRedisSerializer<>(Object.class);
    ObjectMapper objectMapper = new ObjectMapper();
    objectMapper.setVisibility(PropertyAccessor.ALL, JsonAutoDetect.Visibility.ANY);
    objectMapper.disable(SerializationFeature.WRITE_DATES_AS_TIMESTAMPS);
    JavaTimeModule timeModule = new JavaTimeModule();
    timeModule.addSerializer(Date.class, new DateSerializer(Boolean.FALSE, new SimpleDateFormat("yyyy-MM-dd HH:mm:ss")));
    timeModule.addSerializer(LocalDateTime.class, new LocalDateTimeSerializer(DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss")));
    timeModule.addSerializer(LocalDate.class, new LocalDateSerializer(DateTimeFormatter.ofPattern("yyyy-MM-dd")));
    timeModule.addSerializer(LocalTime.class, new LocalTimeSerializer(DateTimeFormatter.ofPattern("HH:mm:ss")));
    timeModule.addDeserializer(LocalDateTime.class, new LocalDateTimeDeserializer(DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss")));
    timeModule.addDeserializer(LocalDate.class, new LocalDateDeserializer(DateTimeFormatter.ofPattern("yyyy-MM-dd")));
    timeModule.addDeserializer(LocalTime.class, new LocalTimeDeserializer(DateTimeFormatter.ofPattern("HH:mm:ss")));
    objectMapper.setDefaultPropertyInclusion(JsonInclude.Include.NON_NULL);
    // activateDefaultTyping replaces the deprecated enableDefaultTyping
    objectMapper.activateDefaultTyping(LaissezFaireSubTypeValidator.instance,
            ObjectMapper.DefaultTyping.NON_FINAL, JsonTypeInfo.As.PROPERTY);
    objectMapper.registerModule(timeModule).registerModule(new ParameterNamesModule()).registerModules(ObjectMapper.findModules());
    jacksonSerializer.setObjectMapper(objectMapper);
    final StringRedisSerializer stringRedisSerializer = new StringRedisSerializer();
    RedisSerializationContext<String, Object> serializationContext = RedisSerializationContext
            .<String, Object>newSerializationContext()
            .key(stringRedisSerializer)
            .value(jacksonSerializer)
            .hashKey(stringRedisSerializer)
            .hashValue(jacksonSerializer)
            .build();
    return new ReactiveRedisTemplate<>(connectionFactory, serializationContext);
}
I store a LocalDateTime value with the hashOperations.put method so that it appears in Redis in the following form:
key hashKey "2020-06-23 16:42:44"
When I use Mono<LocalDateTime> mono = hashOperations.get(key, hashKey) to read the value back, the following exception occurs:
java.lang.ClassCastException: class java.lang.String cannot be cast to class java.time.LocalDateTime (java.lang.String and java.time.LocalDateTime are in module java.base of loader 'bootstrap')
at reactor.core.publisher.FluxFilter$FilterSubscriber.onNext(FluxFilter.java:93)
at reactor.core.publisher.MonoNext$NextSubscriber.onNext(MonoNext.java:76)
But when the LocalDateTime is a property of an object, there is no problem.
I don't know how to solve it. Thank you in advance for any answers.

I've solved it: change DefaultTyping.NON_FINAL to DefaultTyping.EVERYTHING.
With NON_FINAL, Jackson does not write type information for final classes such as java.time.LocalDateTime, so the value is stored as a plain JSON string and comes back as a java.lang.String, which causes the cast to fail. EVERYTHING makes Jackson record the type of java.time.LocalDateTime as well.
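For reference, this is a one-line change in the bean above; everything else in the mapper configuration stays the same:

```java
// EVERYTHING also writes a type id for final JDK classes such as
// java.time.LocalDateTime; with NON_FINAL the value was stored as a plain
// JSON string, which is why the read side returned java.lang.String.
objectMapper.activateDefaultTyping(LaissezFaireSubTypeValidator.instance,
        ObjectMapper.DefaultTyping.EVERYTHING, JsonTypeInfo.As.PROPERTY);
```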

Related

Error when running my first pact-jvm test

I'm new to contract testing automation and I've written my first test using pact-jvm with JUnit 5.
Below is the code
@ExtendWith(PactConsumerTestExt.class)
@PactTestFor(providerName = "testProvider", port = "8081")
public class ConsumerTests {
public static final String EXPECTED_BODY = "/integration/stubs/team_members/SingleTeamMember.json";
@Pact(consumer = "testConsumer", provider = "testProvider")
public RequestResponsePact singleTeamMemberSuccess(PactDslWithProvider builder) {
    Map<String, String> headers = new HashMap<>();
    headers.put("Content-Type", "application/json");
    return builder
            .given("I have at least one team member")
            .uponReceiving("a request for a single team member")
            .path("/team-members/1")
            .method("GET")
            .willRespondWith()
            .status(200)
            .headers(headers)
            .body(EXPECTED_BODY)
            .toPact();
}
@Test
@PactTestFor(pactMethod = "singleTeamMemberSuccess")
void testSingleTeamMemberSuccess(MockServer mockServer) throws IOException {
    HttpResponse httpResponse = (HttpResponse) Request.Get(mockServer.getUrl() + "/team-members/1")
            .execute().returnResponse();
    assertThat(httpResponse.getStatusLine().getStatusCode(), is(equalTo(200)));
    //assertThat(httpResponse.getEntity().getContent(), is(equalTo(TeamMemberSingle200.EXPECTED_BODY_SINGLE_TEAM_MEMBER)));
}
I'm getting the error below when running mvn install:
ConsumerTests The following methods annotated with @Pact were not executed during the test: ConsumerTests.singleTeamMemberSuccess. If these are currently a work in progress, add a @Disabled annotation to the method.
[ERROR] ConsumerTests.singleTeamMemberSuccess:42 » NoClassDefFound Could not initialize class org.codehaus.groovy.reflection.ReflectionCache
Please can someone take a look and advise if I'm missing anything important to run the test successfully.
Thanks,
Poonam

Collect list of Integer (List<Integer>) to map with Java 8 Stream API

I tried to convert a simple List<Integer> to a Map using the Java 8 Stream API and got the following compile-time error:
The method toMap(Function<? super T,? extends K>, Function<? super T,?
extends U>) in the type Collectors is not applicable for the arguments
(Function<Object,Object>, boolean)
My code:
ArrayList<Integer> m_list = new ArrayList<Integer>();
m_list.add(1);
m_list.add(2);
m_list.add(3);
m_list.add(4);
Map<Integer, Boolean> m_map = m_list.stream().collect(
Collectors.toMap(Function.identity(), true));
I also tried the second method below but got the same error.
Map<Integer, Boolean> m_map = m_list.stream().collect(
Collectors.toMap(Integer::intValue, true));
What is the correct way to do this using Java 8 stream API?
You are passing a boolean for the value mapper, but you should pass a Function<Integer, Boolean>, i.e. a lambda that ignores its argument and returns true.
It should be:
Map<Integer, Boolean> m_map = m_list.stream().collect(
Collectors.toMap(Function.identity(), e -> true));
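Putting it together, a self-contained sketch of the fix (the class and method names here are just for illustration):

```java
import java.util.Arrays;
import java.util.List;
import java.util.Map;
import java.util.function.Function;
import java.util.stream.Collectors;

public class ToMapDemo {
    // Maps every element of the list to the constant value TRUE.
    // The key mapper is the identity; the value mapper is a Function
    // (a lambda), not a bare boolean, which is what the compiler rejected.
    static Map<Integer, Boolean> toFlagMap(List<Integer> list) {
        return list.stream()
                .collect(Collectors.toMap(Function.identity(), e -> Boolean.TRUE));
    }

    public static void main(String[] args) {
        Map<Integer, Boolean> map = toFlagMap(Arrays.asList(1, 2, 3, 4));
        System.out.println(map); // {1=true, 2=true, 3=true, 4=true}
    }
}
```

Note that Collectors.toMap will throw an IllegalStateException on duplicate keys, which cannot happen here since every list element becomes its own key only once per occurrence; with duplicates in the list you would need the merge-function overload.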

Hazelcast No DataSerializerFactory registered for namespace: 0 on standalone process

I'm trying to set up a Hazelcast cluster with tcp-ip discovery enabled on a standalone process.
My class looks like this:
public class Person implements Serializable {
    private static final long serialVersionUID = 1L;
    int personId;
    String name;
    Person() {}
    // getters and setters
}
Hazelcast is loaded as:
final Config config = createNewConfig(mapName);
HazelcastInstance node = Hazelcast.newHazelcastInstance(config);

Config createNewConfig(String mapName) {
    final PersonStore personStore = new PersonStore();
    XmlConfigBuilder configBuilder = new XmlConfigBuilder();
    Config config = configBuilder.build();
    config.setClassLoader(LoadAll.class.getClassLoader());
    MapConfig mapConfig = config.getMapConfig(mapName);
    MapStoreConfig mapStoreConfig = new MapStoreConfig();
    mapStoreConfig.setImplementation(personStore);
    mapConfig.setMapStoreConfig(mapStoreConfig); // attach the store, otherwise it is never used
    return config;
}
and my Hazelcast config has this:
<tcp-ip enabled="true">
<member>machine-1</member>
<member>machine-2</member>
</tcp-ip>
Do I need to populate this tag in my XML?
I get this error when a second instance is brought up:
com.hazelcast.nio.serialization.HazelcastSerializationException: No DataSerializerFactory registered for namespace: 0
at com.hazelcast.nio.serialization.DataSerializer.read(DataSerializer.java:98)
at com.hazelcast.nio.serialization.DataSerializer.read(DataSerializer.java:39)
at com.hazelcast.nio.serialization.StreamSerializerAdapter.read(StreamSerializerAdapter.java:41)
at com.hazelcast.nio.serialization.SerializationServiceImpl.toObject(SerializationServiceImpl.java:276)
Any help is highly appreciated.
Solved my problem: my pom.xml pulled in hazelcast-wm, so the actual hazelcast jar was missing from my bundled jar. Including it fixed my issue.
Note that this same "No DataSerializerFactory registered for namespace: 0" error message can also occur in an OSGi environment when you attempt to use more than one Hazelcast instance within the same VM but initialize the instances from different bundles. The reason is that the com.hazelcast.util.ServiceLoader.findHighestReachableClassLoader() method will sometimes pick the wrong class loader during Hazelcast initialization (it won't always pick the class loader you set on the config), and then it ends up with an empty list of DataSerializerFactory instances, hence the error that it can't find the requested factory with id 0. The following shows a way to work around that problem by taking advantage of Java's context class loader:
private HazelcastInstance createHazelcastInstance() {
    // Use the following if you're only using the Hazelcast data serializers
    final ClassLoader classLoader = Hazelcast.class.getClassLoader();
    // Use the following if you have custom data serializers that you need
    // final ClassLoader classLoader = this.getClass().getClassLoader();
    final com.hazelcast.config.Config config = new com.hazelcast.config.Config();
    config.setClassLoader(classLoader);
    final ClassLoader previousContextClassLoader = Thread.currentThread().getContextClassLoader();
    try {
        Thread.currentThread().setContextClassLoader(classLoader);
        return Hazelcast.newHazelcastInstance(config);
    } finally {
        if (previousContextClassLoader != null) {
            Thread.currentThread().setContextClassLoader(previousContextClassLoader);
        }
    }
}

How to generate JSONP in Badgerfish format

I'm trying to create a Spring MVC controller that will return JSONP in BadgerFish format. My code currently creates the JSONP correctly using Jackson, but I do not know how to specify the BadgerFish format. Assuming that callback is the name of the callback function and summary is my JAXB object, my code is currently:
ObjectMapper objectMapper = new ObjectMapper();
return objectMapper.writeValueAsString(new JSONPObject(callback,summary));
Is there any way to do this using Jackson, or do I have to use another framework? I have found an approach to generate BadgerFish output using RESTEasy, but only for JSON.
I actually managed to solve this with Jettison (I did not find a way to do it with Jackson). The required code is:
Writer writer = new StringWriter();
AbstractXMLStreamWriter xmlStreamWriter = new BadgerFishXMLStreamWriter(writer);
try {
    Marshaller marshaller = jaxbContextSummary.createMarshaller();
    marshaller.marshal(myObject, xmlStreamWriter);
} catch (JAXBException e) {
    logger.error("Could not construct JSONP response", e);
}
// writer.toString() now holds the BadgerFish JSON for myObject

OutOfMemory while using Jackson 1.9

I am using Jackson 1.9 in my web application, where I need to convert complex objects (e.g. Spring's ModelMap, BindingResult, java.util.Map) to JSON strings.
Please consider the following code snippet where I am attempting one such conversion:
Map<String, Object> methodArgsMap = new HashMap<String, Object>();
methodArgsMap.put("map", map); /* map is an instance of org.springframework.ui.ModelMap */
methodArgsMap.put("command", command); /* command is an instance of a custom POJO, viz. ReportBeanParam */
methodArgsMap.put("result", result); /* result is an instance of org.springframework.validation.BindingResult */
The method JSONProcessUtil.getObjectsAsJSONString(...) is implemented as follows:
public final class JSONProcessUtil {

    private static ObjectMapper objectMapper;

    static {
        objectMapper = new ObjectMapper();
        /* Start: configs suggested by the Jackson docs to avoid OutOfMemoryError */
        SerializationConfig serConfig = objectMapper.getSerializationConfig();
        serConfig.disable(SerializationConfig.Feature.FAIL_ON_EMPTY_BEANS);
        objectMapper.getJsonFactory().configure(JsonParser.Feature.INTERN_FIELD_NAMES, false);
        objectMapper.getJsonFactory().configure(JsonParser.Feature.CANONICALIZE_FIELD_NAMES, false);
        /* End: configs suggested by the Jackson docs to avoid OutOfMemoryError */
    }

    public static Map<String, String> getObjectsAsJSONString(Map<String, Object> argsMap)
            throws JsonGenerationException, JsonMappingException, IOException {
        log.info("Source app. In JSONProcessUtil.getObjectsAsJSONString(...)");
        Map<String, String> jsonStrMap = null;
        if (!(argsMap == null || argsMap.isEmpty())) {
            jsonStrMap = new HashMap<String, String>();
            for (String argName : argsMap.keySet()) {
                log.info("Source app. argName = {}, arg = {} ", argName, argsMap.get(argName));
                jsonStrMap.put(argName,
                        objectMapper.writeValueAsString(argsMap.get(argName))); /* the line giving the error */
                log.info("Proceeding to the next arg!");
            }
        }
        log.info("Source app. Exit from JSONProcessUtil.getObjectsAsJSONString(...)");
        return jsonStrMap;
    }
}
I am getting an OutOfMemoryError as follows :
INFO [http-8080-7] (JSONProcessUtil.java:73) - Source app. argName = result, arg = org.springframework.validation.BeanPropertyBindingResult: 0 errors
DEBUG [http-8080-7] (SecurityContextPersistenceFilter.java:89) - SecurityContextHolder now cleared, as request processing completed
Feb 20, 2012 5:03:30 PM org.apache.catalina.core.StandardWrapperValve invoke
SEVERE: Servlet.service() for servlet saas threw exception
java.lang.OutOfMemoryError: Java heap space
at org.codehaus.jackson.util.TextBuffer._charArray(TextBuffer.java:674)
at org.codehaus.jackson.util.TextBuffer.expand(TextBuffer.java:633)
at org.codehaus.jackson.util.TextBuffer.append(TextBuffer.java:438)
at org.codehaus.jackson.io.SegmentedStringWriter.write(SegmentedStringWriter.java:69)
at org.codehaus.jackson.impl.WriterBasedGenerator._flushBuffer(WriterBasedGenerator.java:1810)
at org.codehaus.jackson.impl.WriterBasedGenerator._writeFieldName(WriterBasedGenerator.java:345)
at org.codehaus.jackson.impl.WriterBasedGenerator.writeFieldName(WriterBasedGenerator.java:217)
at org.codehaus.jackson.map.ser.BeanPropertyWriter.serializeAsField(BeanPropertyWriter.java:426)
at org.codehaus.jackson.map.ser.BeanSerializer.serializeFields(BeanSerializer.java:175)
at org.codehaus.jackson.map.ser.BeanSerializer.serialize(BeanSerializer.java:142)
at org.codehaus.jackson.map.ser.impl.ObjectArraySerializer.serializeContents(ObjectArraySerializer.java:121)
at org.codehaus.jackson.map.ser.impl.ObjectArraySerializer.serializeContents(ObjectArraySerializer.java:28)
at org.codehaus.jackson.map.ser.ArraySerializers$AsArraySerializer.serialize(ArraySerializers.java:56)
at org.codehaus.jackson.map.ser.BeanPropertyWriter.serializeAsField(BeanPropertyWriter.java:428)
at org.codehaus.jackson.map.ser.BeanSerializer.serializeFields(BeanSerializer.java:175)
at org.codehaus.jackson.map.ser.BeanSerializer.serialize(BeanSerializer.java:142)
at org.codehaus.jackson.map.ser.BeanPropertyWriter.serializeAsField(BeanPropertyWriter.java:428)
at org.codehaus.jackson.map.ser.BeanSerializer.serializeFields(BeanSerializer.java:175)
at org.codehaus.jackson.map.ser.BeanSerializer.serialize(BeanSerializer.java:142)
at org.codehaus.jackson.map.ser.MapSerializer.serializeFields(MapSerializer.java:287)
at org.codehaus.jackson.map.ser.MapSerializer.serialize(MapSerializer.java:212)
at org.codehaus.jackson.map.ser.MapSerializer.serialize(MapSerializer.java:23)
at org.codehaus.jackson.map.ser.BeanPropertyWriter.serializeAsField(BeanPropertyWriter.java:428)
at org.codehaus.jackson.map.ser.BeanSerializer.serializeFields(BeanSerializer.java:175)
at org.codehaus.jackson.map.ser.BeanSerializer.serialize(BeanSerializer.java:142)
at org.codehaus.jackson.map.ser.MapSerializer.serializeFields(MapSerializer.java:287)
at org.codehaus.jackson.map.ser.MapSerializer.serialize(MapSerializer.java:212)
at org.codehaus.jackson.map.ser.MapSerializer.serialize(MapSerializer.java:23)
at org.codehaus.jackson.map.ser.BeanPropertyWriter.serializeAsField(BeanPropertyWriter.java:428)
at org.codehaus.jackson.map.ser.BeanSerializer.serializeFields(BeanSerializer.java:175)
at org.codehaus.jackson.map.ser.BeanSerializer.serialize(BeanSerializer.java:142)
at org.codehaus.jackson.map.ser.MapSerializer.serializeFields(MapSerializer.java:287)
Please guide me in resolving this.
Thanks and regards!
It sounds like you are producing a huge JSON output, which gets buffered in memory; that is what the error message indicates.
Your choices are either:
Use streaming output to avoid buffering the whole document in memory (however, I am not sure if Spring allows you to do this), or
Increase the heap size so you have enough memory.
The features to disable interning and canonicalization are only relevant for parsing, and you are generating JSON, not parsing it.
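For the streaming option, the Jackson 1.x API (the org.codehaus.jackson classes in the stack trace) can serialize directly to an OutputStream, e.g. the servlet response's stream, instead of going through writeValueAsString, which buffers the entire document in a SegmentedStringWriter. A minimal sketch, assuming the Jackson 1.9 jars from the question are on the classpath; StreamingJsonWriter is a hypothetical helper:

```java
import java.io.IOException;
import java.io.OutputStream;
import org.codehaus.jackson.map.ObjectMapper;

public class StreamingJsonWriter {
    private static final ObjectMapper MAPPER = new ObjectMapper();

    // Serializes 'value' incrementally to 'out' instead of materializing the
    // whole JSON document as a String first, so heap use stays bounded by
    // Jackson's internal write buffer rather than by the document size.
    public static void writeJson(OutputStream out, Object value) throws IOException {
        MAPPER.writeValue(out, value);
    }
}
```

This only helps if the caller can consume a stream; if you must return a String (as getObjectsAsJSONString does), the full text has to exist in memory anyway and increasing the heap is the remaining option.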