Jackson deserialization @JsonAnySetter NPE

@JsonInclude(JsonInclude.Include.NON_NULL)
public class Payer {
    private String name;
    @JsonIgnore
    private Map<String, Object> additionalProperties = new HashMap<String, Object>();
    @JsonAnyGetter
    public Map<String, Object> getAdditionalProperties() {
        return this.additionalProperties;
    }
    @JsonAnySetter
    public void setAdditionalProperty(String name, Object value) {
        this.additionalProperties.put(name, value);
    }
}
When I call objectMapper.readValue(json_string, Payer.class) with the following JSON string:
{
    "name": "fakeName",
    "state": "verifird"
}
I get an NPE. Since I have @JsonAnySetter, the "state" value should be put into additionalProperties, so I'd like to know why I get an NPE here.

Include a getter and setter for name in the Payer class:
@JsonInclude(JsonInclude.Include.NON_NULL)
class Payer {
    private String name;
    @JsonIgnore
    private Map<String, Object> additionalProperties = new HashMap<String, Object>();
    @JsonAnyGetter
    public Map<String, Object> getAdditionalProperties() {
        return this.additionalProperties;
    }
    @JsonAnySetter
    public void setAdditionalProperty(String name, Object value) {
        this.additionalProperties.put(name, value);
    }
    public String getName() {
        return name;
    }
    public void setName(String name) {
        this.name = name;
    }
}
Parse the JSON:
public static void main(String[] args) throws JsonParseException, JsonMappingException, IOException {
    ObjectMapper mapper = new ObjectMapper();
    String json_string = "{\n" +
            "  \"name\": \"fakeName\",\n" +
            "  \"state\": \"verifird\"\n" +
            "}";
    Payer readValue = mapper.readValue(json_string, Payer.class);
    System.out.println(readValue.getName());
    System.out.println(readValue.getAdditionalProperties());
}
Output:
fakeName
{state=verifird}
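Alternatively, if you would rather not expose public accessors for name, annotating the private field with @JsonProperty also makes it visible to Jackson; a minimal sketch, assuming Jackson 2.x:
@JsonInclude(JsonInclude.Include.NON_NULL)
public class Payer {
    // @JsonProperty lets Jackson read/write the private field without public accessors
    @JsonProperty
    private String name;
    @JsonIgnore
    private Map<String, Object> additionalProperties = new HashMap<String, Object>();
    @JsonAnyGetter
    public Map<String, Object> getAdditionalProperties() {
        return this.additionalProperties;
    }
    @JsonAnySetter
    public void setAdditionalProperty(String name, Object value) {
        this.additionalProperties.put(name, value);
    }
}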

Jackson deserialiser affected by the Deadbolt "restrict" annotation

I am receiving the exception Could not resolve type id 'path.to.MyClass' as a subtype of [simple type, class java.lang.Object]: no such class found on a Play 2.7 server with Deadbolt (2.6.3 and 2.7.0) when I try to deserialise JSON to Map<String, MyClass> inside a route action that has a @Restrict annotation. Everything works fine if I remove the annotation.
MyClass.java
@JsonTypeInfo(use = JsonTypeInfo.Id.CLASS, property = "class")
public class MyClass implements Serializable {
    public String name;
    public Integer age;
    public MyClass() {}
    public MyClass(String name, Integer age) {
        this.name = name;
        this.age = age;
    }
}
Serialise a Map<String, MyClass>:
Map<String, MyClass> value = new HashMap<>();
value.put("first", new MyClass("Bob", 10));
value.put("second", new MyClass("Rob", 20));
ObjectMapper mapper = Json.newDefaultMapper();
mapper.enableDefaultTypingAsProperty(ObjectMapper.DefaultTyping.OBJECT_AND_NON_CONCRETE, "class");
String json = null;
try {
    json = mapper.writeValueAsString(value);
} catch (JsonProcessingException e) {
    e.printStackTrace();
}
Output:
{
    "first": {
        "class": "path.to.MyClass",
        "name": "Bob",
        "age": 10
    },
    "second": {
        "class": "path.to.MyClass",
        "name": "Rob",
        "age": 20
    }
}
The JSON looks like this for backward compatibility with an old server that uses FlexJson.
Deserialise:
@Restrict({@Group({"Admin"})})
public CompletionStage<Result> action(long id) {
    String json = getJsonFromStorage();
    Map<String, MyClass> result = new HashMap<>();
    try {
        ObjectMapper mapper = new ObjectMapper();
        mapper.enableDefaultTypingAsProperty(ObjectMapper.DefaultTyping.OBJECT_AND_NON_CONCRETE, "class");
        JsonFactory factory = mapper.getFactory();
        JsonParser parser = factory.createParser(new ByteArrayInputStream(json.getBytes(Charset.forName("UTF-8"))));
        JavaType type = mapper.getTypeFactory().constructType(result.getClass());
        result = mapper.readValue(parser, type);
    } catch (IOException e) {
        e.printStackTrace();
    }
    return ok("ok");
}
I have a temporary workaround: I set the current thread's context class loader to the class loader from play.Environment.
public class MyController extends Controller {
    @Inject
    private Environment environment;

    @Restrict({@Group({"Admin"})})
    public CompletionStage<Result> action(long id) {
        Thread.currentThread().setContextClassLoader(environment.classLoader());
        String json = getJsonFromStorage();
        Map<String, MyClass> result = new HashMap<>();
        try {
            ObjectMapper mapper = new ObjectMapper();
            mapper.enableDefaultTypingAsProperty(ObjectMapper.DefaultTyping.OBJECT_AND_NON_CONCRETE, "class");
            JsonFactory factory = mapper.getFactory();
            JsonParser parser = factory.createParser(new ByteArrayInputStream(json.getBytes(Charset.forName("UTF-8"))));
            JavaType type = mapper.getTypeFactory().constructType(result.getClass());
            result = mapper.readValue(parser, type);
        } catch (IOException e) {
            e.printStackTrace();
        }
        return ok("ok");
    }
}
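An alternative that avoids touching the thread's context class loader is to hand Jackson the Play class loader directly through its TypeFactory; a sketch, assuming Jackson 2.4+ (where TypeFactory.withClassLoader is available):
ObjectMapper mapper = new ObjectMapper();
mapper.enableDefaultTypingAsProperty(ObjectMapper.DefaultTyping.OBJECT_AND_NON_CONCRETE, "class");
// Resolve "class" type ids against Play's application class loader instead of
// whatever context class loader Deadbolt's executor thread happens to carry.
mapper.setTypeFactory(TypeFactory.defaultInstance().withClassLoader(environment.classLoader()));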

Spring Cloud: testing S3 client with Testcontainers

I use Spring Cloud's ResourceLoader to access S3, e.g.:
public class S3DownUpLoader {
    private final ResourceLoader resourceLoader;
    private String bucket; // configured elsewhere, omitted in the original snippet

    @Autowired
    public S3DownUpLoader(ResourceLoader resourceLoader) {
        this.resourceLoader = resourceLoader;
    }

    public String storeOnS3(String filename, byte[] data) throws IOException {
        String location = "s3://" + bucket + "/" + filename;
        WritableResource writeableResource = (WritableResource) this.resourceLoader.getResource(location);
        FileCopyUtils.copy(data, writeableResource.getOutputStream());
        return filename;
    }
}
It works okay, and I need help testing the code with Localstack/Testcontainers. I've tried the following test, but it does not work: my production profile gets picked up (the S3 client with the Localstack config is not injected):
@RunWith(SpringRunner.class)
@SpringBootTest
public class S3DownUpLoaderTest {
    @ClassRule
    public static LocalStackContainer localstack = new LocalStackContainer().withServices(S3);

    @Autowired
    S3DownUpLoader s3DownUpLoader;

    @Test
    public void testA() {
        s3DownUpLoader.storeOnS3(...);
    }

    @TestConfiguration
    @EnableContextResourceLoader
    public static class S3Configuration {
        @Primary
        @Bean(destroyMethod = "shutdown")
        public AmazonS3 amazonS3() {
            return AmazonS3ClientBuilder
                    .standard()
                    .withEndpointConfiguration(localstack.getEndpointConfiguration(S3))
                    .withCredentials(localstack.getDefaultCredentialsProvider())
                    .build();
        }
    }
}
As we discussed on GitHub, we solve this problem in a slightly different way. I've actually never seen the WritableResource approach you use, which looks very interesting. Nonetheless, this is how we solve the issue:
@RunWith(SpringRunner.class)
@SpringBootTest(properties = "spring.profiles.active=test")
@ContextConfiguration(classes = AbstractAmazonS3Test.S3Configuration.class)
public abstract class AbstractAmazonS3Test {
    private static final String REGION = Regions.EU_WEST_1.getName();

    /**
     * Configure S3.
     */
    @TestConfiguration
    public static class S3Configuration {
        @Bean
        public AmazonS3 amazonS3() {
            // localstack docker image is running locally on port 4572 for S3
            final String serviceEndpoint = String.format("http://%s:%s", "127.0.0.1", "4572");
            return AmazonS3Client.builder()
                    .withEndpointConfiguration(new AwsClientBuilder.EndpointConfiguration(serviceEndpoint, REGION))
                    .withCredentials(new AWSStaticCredentialsProvider(new BasicAWSCredentials("dummyKey", "dummySecret")))
                    .build();
        }
    }
}
And a sample test:
public class CsvS3UploadServiceIntegrationTest extends AbstractAmazonS3Test {
    private static final String SUCCESS_CSV = "a,b";
    private static final String STANDARD_STORAGE = "STANDARD";

    @Autowired
    private AmazonS3 s3;
    @Autowired
    private S3ConfigurationProperties properties;
    @Autowired
    private CsvS3UploadService service;

    @Before
    public void setUp() {
        s3.createBucket(properties.getBucketName());
    }

    @After
    public void tearDown() {
        final String bucketName = properties.getBucketName();
        s3.listObjects(bucketName).getObjectSummaries().stream()
                .map(S3ObjectSummary::getKey)
                .forEach(key -> s3.deleteObject(bucketName, key));
        s3.deleteBucket(bucketName);
    }

    @Test
    public void uploadSuccessfulCsv() {
        service.uploadSuccessfulCsv(SUCCESS_CSV);
        final S3ObjectSummary s3ObjectSummary = getOnlyFileFromS3();
        assertThat(s3ObjectSummary.getKey(), containsString("-success.csv"));
        assertThat(s3ObjectSummary.getETag(), is("b345e1dc09f20fdefdea469f09167892"));
        assertThat(s3ObjectSummary.getStorageClass(), is(STANDARD_STORAGE));
        assertThat(s3ObjectSummary.getSize(), is(3L));
    }

    private S3ObjectSummary getOnlyFileFromS3() {
        final ObjectListing listing = s3.listObjects(properties.getBucketName());
        final List<S3ObjectSummary> objects = listing.getObjectSummaries();
        assertThat(objects, iterableWithSize(1));
        return Iterables.getOnlyElement(objects);
    }
}
And the code under test:
@Service
@RequiredArgsConstructor
@EnableConfigurationProperties(S3ConfigurationProperties.class)
public class CsvS3UploadServiceImpl implements CsvS3UploadService {
    private static final String CSV_MIME_TYPE = CSV_UTF_8.toString();

    private final AmazonS3 amazonS3;
    private final S3ConfigurationProperties properties;
    private final S3ObjectKeyService s3ObjectKeyService;

    @Override
    public void uploadSuccessfulCsv(final String source) {
        final String key = s3ObjectKeyService.getSuccessKey();
        doUpload(source, key, getObjectMetadata(source));
    }

    private void doUpload(final String source, final String key, final ObjectMetadata metadata) {
        try (ReaderInputStream in = new ReaderInputStream(new StringReader(source), UTF_8)) {
            final PutObjectRequest request = new PutObjectRequest(properties.getBucketName(), key, in, metadata);
            amazonS3.putObject(request);
        } catch (final IOException ioe) {
            throw new CsvUploadException("Unable to upload " + key, ioe);
        }
    }

    private ObjectMetadata getObjectMetadata(final String source) {
        final ObjectMetadata metadata = new ObjectMetadata();
        metadata.setContentType(CSV_MIME_TYPE);
        metadata.setContentLength(source.getBytes(UTF_8).length);
        metadata.setContentMD5(getMD5ChecksumAsBase64(source));
        metadata.setSSEAlgorithm(SSEAlgorithm.KMS.getAlgorithm());
        return metadata;
    }

    private String getMD5ChecksumAsBase64(final String source) {
        final HashCode md5 = Hashing.md5().hashString(source, UTF_8);
        return Base64.getEncoder().encodeToString(md5.asBytes());
    }
}
It seems the only way to provide a custom AmazonS3 bean for the ResourceLoader is to register it manually. The test looks like this:
@RunWith(SpringRunner.class)
@SpringBootTest
@ContextConfiguration(classes = S3DownUpLoaderTest.S3Configuration.class)
public class S3DownUpLoaderTest implements ApplicationContextAware {
    private static final String BUCKET_NAME = "bucket";

    @ClassRule
    public static LocalStackContainer localstack = new LocalStackContainer().withServices(S3);

    @Autowired
    S3DownUpLoader s3DownUpLoader;
    @Autowired
    SimpleStorageProtocolResolver resourceLoader;
    @Autowired
    AmazonS3 amazonS3;

    @Before
    public void setUp() {
        amazonS3.createBucket(BUCKET_NAME);
    }

    @Test
    public void someTestA() throws IOException {
        ....
    }

    @After
    public void tearDown() {
        ObjectListing object_listing = amazonS3.listObjects(BUCKET_NAME);
        while (true) {
            for (S3ObjectSummary summary : object_listing.getObjectSummaries()) {
                amazonS3.deleteObject(BUCKET_NAME, summary.getKey());
            }
            // more object_listing to retrieve?
            if (object_listing.isTruncated()) {
                object_listing = amazonS3.listNextBatchOfObjects(object_listing);
            } else {
                break;
            }
        }
        amazonS3.deleteBucket(BUCKET_NAME);
    }

    @Override
    public void setApplicationContext(ApplicationContext applicationContext) throws BeansException {
        if (applicationContext instanceof ConfigurableApplicationContext) {
            ConfigurableApplicationContext configurableApplicationContext = (ConfigurableApplicationContext) applicationContext;
            configurableApplicationContext.addProtocolResolver(this.resourceLoader);
        }
    }

    public static class S3Configuration {
        @Bean
        public S3DownUpLoader s3DownUpLoader(ResourceLoader resourceLoader) {
            return new S3DownUpLoader(resourceLoader);
        }

        @Bean(destroyMethod = "shutdown")
        public AmazonS3 amazonS3() {
            return AmazonS3ClientBuilder
                    .standard()
                    .withEndpointConfiguration(localstack.getEndpointConfiguration(S3))
                    .withCredentials(localstack.getDefaultCredentialsProvider())
                    .build();
        }

        @Bean
        public SimpleStorageProtocolResolver resourceLoader() {
            return new SimpleStorageProtocolResolver(amazonS3());
        }
    }
}
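For completeness, a hypothetical body for someTestA (a sketch only: it assumes the loader's bucket field is configured to BUCKET_NAME and verifies the upload with a round trip through the raw client):
@Test
public void someTestA() throws IOException {
    // hypothetical round trip: store a file via the loader, read it back via the raw client
    byte[] data = "hello".getBytes(StandardCharsets.UTF_8);
    String key = s3DownUpLoader.storeOnS3("test.txt", data);
    assertEquals("hello", amazonS3.getObjectAsString(BUCKET_NAME, key));
}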

Orika: convert String to Date when mapping data from a HashMap to a bean

I use Orika 1.5.2. When I map data from a HashMap to a bean and need to convert a String to java.util.Date, I get this exception:
java.lang.ClassCastException: java.lang.String cannot be cast to java.util.Date
DateToStringConverter also does not take effect. The exception does not occur when I map data from bean A to bean B. How can I convert a String to a Date when mapping from a Map to a bean?
Code:
public class UserA {
    private Date birthday;

    public Date getBirthday() {
        return birthday;
    }

    public void setBirthday(Date birthday) {
        this.birthday = birthday;
    }

    @Override
    public String toString() {
        return "UserA [birthday=" + birthday + "]";
    }
}
public static void main(String[] args) {
    HashMap map = new HashMap();
    map.put("birthday", "2014-04-28");
    UserA ua = new UserA();
    MapperFactory mapF = new DefaultMapperFactory.Builder().mapNulls(false).build();
    mapF.getConverterFactory().registerConverter(new DateToStringConverter("yyyy-MM-dd"));
    mapF.getMapperFacade().map(map, ua);
}
Exception:
java.lang.ClassCastException: java.lang.String cannot be cast to java.util.Date
You need an Object-to-Date converter, not a DateToStringConverter.
User:
@Data
public class UserA {
    private Date birthday;
}
Add the converter:
public class ObjectToDateConverter extends CustomConverter<Object, Date> {
    @Override
    public Date convert(Object source, Type<? extends Date> destinationType, MappingContext context) {
        if (source instanceof String) {
            try {
                return new SimpleDateFormat("yyyy-MM-dd", Locale.getDefault()).parse((String) source);
            } catch (ParseException e) {
                return null;
            }
        }
        return null;
    }
}
Test:
@Test
public void testMap() {
    HashMap map = new HashMap();
    map.put("birthday", "2018-10-14");
    UserA ua = new UserA();
    MapperFactory mapF = new DefaultMapperFactory.Builder().mapNulls(false).build();
    mapF.getConverterFactory().registerConverter(new ObjectToDateConverter());
    mapF.getMapperFacade().map(map, ua);
}
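To see that the conversion actually happened, you can assert on the mapped field at the end of the test (a small addition, not in the original answer):
// after mapF.getMapperFacade().map(map, ua);
assertNotNull(ua.getBirthday());
System.out.println(ua); // e.g. UserA(birthday=Sun Oct 14 00:00:00 ... 2018); exact output depends on timezone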

How to test a PUT RestController in Spring Boot

How can I test a PUT request with Spring Boot?
I have this method:
@RequestMapping(method = RequestMethod.PUT, value = "/")
public NaturezaTitulo save(@RequestBody NaturezaTitulo naturezaTitulo) {
    return naturezaTituloService.save(naturezaTitulo);
}
and this test class:
@RunWith(SpringJUnit4ClassRunner.class)
@SpringApplicationConfiguration(classes = Application.class)
@WebAppConfiguration
public class NaturezaTituloControllerTest {
    private MediaType contentType = new MediaType(MediaType.APPLICATION_JSON.getType(),
            MediaType.APPLICATION_JSON.getSubtype(),
            Charset.forName("utf8"));
    private MockMvc mockMvc;
    private HttpMessageConverter mappingJackson2HttpMessageConverter;
    private List<NaturezaTitulo> naturezaTituloList = new ArrayList<>();

    @Autowired
    private WebApplicationContext webApplicationContext;

    @Autowired
    void setConverters(HttpMessageConverter<?>[] converters) {
        this.mappingJackson2HttpMessageConverter = Arrays.asList(converters).stream().filter(
                hmc -> hmc instanceof MappingJackson2HttpMessageConverter).findAny().get();
        Assert.assertNotNull("the JSON message converter must not be null",
                this.mappingJackson2HttpMessageConverter);
    }

    @Before
    public void setup() throws Exception {
        this.mockMvc = webAppContextSetup(webApplicationContext).build();
    }

    @Test
    public void naturezaTituloNotFound() throws Exception {
        mockMvc.perform(get("/naturezatitulo/55ce2dd6222e629f4b8d6fe0"))
                .andExpect(status().is4xxClientError());
    }

    @Test
    public void naturezaTituloSave() throws Exception {
        NaturezaTitulo naturezaTitulo = new NaturezaTitulo();
        naturezaTitulo.setNatureza("Testando");
        mockMvc.perform(put("/naturezatitulo/").content(this.json(naturezaTitulo))
                .contentType(contentType))
                .andExpect(jsonPath("$.id", notNullValue()));
    }

    protected String json(Object o) throws IOException {
        MockHttpOutputMessage mockHttpOutputMessage = new MockHttpOutputMessage();
        this.mappingJackson2HttpMessageConverter.write(
                o, MediaType.APPLICATION_JSON, mockHttpOutputMessage);
        return mockHttpOutputMessage.getBodyAsString();
    }
}
but I get this error:
java.lang.IllegalArgumentException: json can not be null or empty at
com.jayway.jsonpath.internal.Utils.notEmpty(Utils.java:259)
How can I pass an object in the body of a PUT test? Thanks.
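One way to narrow this down (a debugging sketch, not a confirmed fix): jsonPath fails with "json can not be null or empty" when the response body is empty, so print the exchange and assert the status before asserting on the body. Note also that the test PUTs /naturezatitulo/ while the mapping shown is value = "/", which is worth double-checking:
mockMvc.perform(put("/naturezatitulo/")
        .content(this.json(naturezaTitulo))
        .contentType(contentType))
        .andDo(print())             // dumps the full request/response to the console
        .andExpect(status().isOk()) // fails here first if the URL mapping is wrong
        .andExpect(jsonPath("$.id", notNullValue()));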

Hazelcast: Does Portable Serialization need objects to be shared between client and server?

I am getting the exception below:
com.hazelcast.nio.serialization.HazelcastSerializationException: Could not find PortableFactory for factory-id: 1
On the client side I have the following code:
public class ClientTest {
    public static void main(String[] args) {
        List<String> nodes = new ArrayList<String>();
        nodes.add("localhost:5701");
        ClientConfig clientConfig = new ClientConfig();
        ClientNetworkConfig networkConfig = new ClientNetworkConfig();
        networkConfig.setAddresses(nodes);
        clientConfig.setNetworkConfig(networkConfig);
        SerializationConfig serCong = clientConfig.getSerializationConfig();
        serCong.addPortableFactory(1, new UserFactoryImpl());
        serCong.setPortableVersion(1);
        HazelcastInstance hzClient1 = HazelcastClient.newHazelcastClient(clientConfig);
        IMap<String, User> map = hzClient1.getMap("user");
        System.out.println(map.size() + "hiten");
        User user1 = new User();
        user1.setFirstName("hiten");
        user1.setLastName("singh");
        map.put("1", user1);
        //hz1.getLifecycleService().terminate();
        System.out.println(map.size() + "after");
        User user2 = new User();
        user2.setFirstName("hiten1");
        user2.setLastName("singh1");
        map.put("2", user2);
        UserEntryProcessor entryProc = new UserEntryProcessor();
        User userRes = (User) map.executeOnKey("1", entryProc);
    }

    static class UserEntryProcessor implements EntryProcessor<String, User>, HazelcastInstanceAware {
        private transient HazelcastInstance hazelcastInstance;

        @Override
        public Object process(Entry<String, User> entry) {
            User user = entry.getValue();
            if (user != null) {
                System.out.println(user.getFirstName());
            }
            return user;
        }

        @Override
        public EntryBackupProcessor<String, User> getBackupProcessor() {
            return null;
        }

        @Override
        public void setHazelcastInstance(HazelcastInstance hazelcastInstance) {
            this.hazelcastInstance = hazelcastInstance;
        }
    }

    static class UserFactoryImpl implements PortableFactory {
        public final static int USER_PORTABLE_ID = 1;
        public final static int FACTORY_ID = 1;

        public Portable create(int classId) {
            switch (classId) {
                case USER_PORTABLE_ID:
                    return new User();
            }
            return null;
        }
    }

    static class User implements Portable {
        private String firstName;
        private String lastName;

        public String getLastName() {
            return lastName;
        }

        public void setLastName(String lastName) {
            this.lastName = lastName;
        }

        public String getFirstName() {
            return firstName;
        }

        public void setFirstName(String firstName) {
            this.firstName = firstName;
        }

        @Override
        public int getFactoryId() {
            return UserFactoryImpl.FACTORY_ID;
        }

        @Override
        public int getClassId() {
            return UserFactoryImpl.USER_PORTABLE_ID;
        }

        @Override
        public void writePortable(PortableWriter writer) throws IOException {
            writer.writeUTF("first_name", firstName);
            writer.writeUTF("last_name", lastName);
        }

        @Override
        public void readPortable(PortableReader reader) throws IOException {
            firstName = reader.readUTF("first_name");
            lastName = reader.readUTF("last_name");
        }
    }
}
Yes, it does. Just as you figured out, the factory and the classes need to be available on both sides. Currently there is no built-in way to avoid sharing classes for use cases more sophisticated than simple gets/puts. I have JSON support and some other ideas cooking, but nothing is really done yet.
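For reference, registering the same factory on the member side would look roughly like this (a sketch; the entry processor above also requires the User and UserFactoryImpl classes on the member's classpath):
// Member-side registration of the same portable factory
Config config = new Config();
config.getSerializationConfig()
        .addPortableFactory(UserFactoryImpl.FACTORY_ID, new UserFactoryImpl())
        .setPortableVersion(1);
HazelcastInstance member = Hazelcast.newHazelcastInstance(config);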