Currently I'm using WildFly 21.0.2 with the JSON-B and JSON-P APIs. The Yasson version in the WildFly modules is 1.0.5. I have the following JSON coming from a REST endpoint:
{
"circuitInfoResponseList": [
{
"org.my.company.dto.FiberCircuitInfoResponse": {
"attendanceType": "BY_RADIUS",
"index": 0,
...
This is my JsonbDeserializer implementation:
public CircuitInfoResponse deserialize(JsonParser jsonParser, DeserializationContext deserializationContext, Type type) {
jsonParser.next();
String className = jsonParser.getString();
jsonParser.next();
try {
return deserializationContext.deserialize(Class.forName(className).asSubclass(CircuitInfoResponse.class), jsonParser);
} catch (ClassNotFoundException e) {
e.printStackTrace();
throw new JsonbException("Cannot deserialize object.");
}
//return deserializationContext.deserialize(FiberCircuitInfoResponse.class, jsonParser);
}
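For reference, here is a slightly more defensive variant of the same method that validates the parser events before reading the class name (a sketch with the same logic; it does not fix the missing-wrapper problem described below, but it fails loudly instead of silently reading the wrong key):
public CircuitInfoResponse deserialize(JsonParser jsonParser, DeserializationContext deserializationContext, Type type) {
    // The parser sits on START_OBJECT; the next event should be the KEY_NAME
    // holding the fully qualified class name of the concrete subtype.
    JsonParser.Event event = jsonParser.next();
    if (event != JsonParser.Event.KEY_NAME) {
        throw new JsonbException("Expected a type-discriminator key but got: " + event);
    }
    String className = jsonParser.getString();
    jsonParser.next(); // advance onto the nested object holding the actual payload
    try {
        return deserializationContext.deserialize(Class.forName(className).asSubclass(CircuitInfoResponse.class), jsonParser);
    } catch (ClassNotFoundException e) {
        throw new JsonbException("Cannot deserialize object of type " + className, e);
    }
}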
This method gets the SECOND entry from the JSON, attendanceType, and NOT the desired org.my.company.dto.FiberCircuitInfoResponse. BTW... when I serialize the JSON object I can see the string org.my.company.dto.FiberCircuitInfoResponse, however when it arrives at the client side it does NOT contain that string. It comes like this:
[
{
"circuitInfoResponseList": [
{
"attendanceType": "BY_RADIUS",
"index": 0,
Without that information I cannot tell which subclass to create. I've already tried to follow these tips, but without success:
https://javaee.github.io/jsonb-spec/users-guide.html
https://github.com/m0mus/JavaOne2016-JSONB-Demo/blob/4ecc22f69d57fda765631237d897b0a487f58d90/src/main/java/com/oracle/jsonb/demo/serializer/AnimalDeserializer.java
https://javaee.github.io/javaee-spec/javadocs/javax/json/bind/serializer/JsonbDeserializer.html
These are my POJO classes.
Parent class:
import lombok.*;
import lombok.experimental.SuperBuilder;
import javax.json.bind.annotation.JsonbTypeDeserializer;
@Data
@SuperBuilder
@NoArgsConstructor(access = AccessLevel.PROTECTED)
@JsonbTypeDeserializer(CircuitInfoResponseJsonbXerializer.class)
public class CircuitInfoResponse {
...
}
Child class:
import lombok.AccessLevel;
import lombok.Data;
import lombok.EqualsAndHashCode;
import lombok.NoArgsConstructor;
import lombok.experimental.SuperBuilder;
@Data
@SuperBuilder
@EqualsAndHashCode(callSuper = false)
@NoArgsConstructor(access = AccessLevel.PROTECTED)
public class FiberCircuitInfoResponse extends CircuitInfoResponse {
...
}
Serialize code:
Type responseListType = new ArrayList<SimulationServiceResponse>() {}.getClass().getGenericSuperclass();
JsonbConfig config = new JsonbConfig()
.withSerializers(new CircuitInfoResponseJsonbXerializer());
Jsonb jsonb = JsonbBuilder.create(config);
String json = jsonb.toJson(response, responseListType);
System.out.println(json);
return Response.status(Response.Status.OK).entity(json).build();
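To make sure WildFly's own JSON-B provider uses the same configuration when it serializes the entity (a default Jsonb would drop the class-name wrapper), I understand the config can also be registered with the JAX-RS runtime via a ContextResolver. A sketch, assuming the container's JSON-B provider consults it (the resolver class name is mine):
import javax.json.bind.Jsonb;
import javax.json.bind.JsonbBuilder;
import javax.json.bind.JsonbConfig;
import javax.ws.rs.ext.ContextResolver;
import javax.ws.rs.ext.Provider;
@Provider
public class JsonbContextResolver implements ContextResolver<Jsonb> {
    @Override
    public Jsonb getContext(Class<?> type) {
        // Same serializer/deserializer pair as in the manual toJson()/fromJson() calls.
        JsonbConfig config = new JsonbConfig()
                .withSerializers(new CircuitInfoResponseJsonbXerializer())
                .withDeserializers(new CircuitInfoResponseJsonbXerializer());
        return JsonbBuilder.create(config);
    }
}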
Deserialize code:
String restJsonResponse = restResponse.readEntity(String.class);
JsonbConfig config = new JsonbConfig()
.withDeserializers(new CircuitInfoResponseJsonbXerializer());
Jsonb jsonbCustom = JsonbBuilder.create(config);
List<SimulationServiceResponse> restResponseEntity = jsonbCustom.fromJson(restJsonResponse, new ArrayList<SimulationServiceResponse>() {}.getClass().getGenericSuperclass());
This is the class that contains a list of Parent class above:
import lombok.AccessLevel;
import lombok.Data;
import lombok.Getter;
import lombok.Setter;
@Data
public class SimulationServiceResponse {
...
@Getter(AccessLevel.NONE)
@Setter(AccessLevel.NONE)
private List<CircuitInfoResponse> circuitInfoResponseList;
public List<CircuitInfoResponse> getCircuitInfoResponseList() {
if (circuitInfoResponseList == null) {
circuitInfoResponseList = new ArrayList<>();
}
return circuitInfoResponseList;
}
public void setCircuitInfoResponseList(List<CircuitInfoResponse> list) {
this.circuitInfoResponseList = list;
}
}
Do you guys have any idea of what I'm doing wrong?
Finally, after extensive stack-overflowing ;-) and debugging, I made it work:
My Feign client can make requests against Spring Data REST's API, and I get a Resource<Something> back with its links filled.
My code so far...
The FeignClient:
@FeignClient(name = "serviceclient-hateoas",
url = "${service.url}",
decode404 = true,
path = "${service.basepath:/api/v1}",
configuration = MyFeignHateoasClientConfig.class)
public interface MyFeignHateoasClient {
@RequestMapping(method = RequestMethod.GET, path = "/bookings/search/findByBookingUuid?bookingUuid={uuid}")
Resource<Booking> getBookingByUuid(@PathVariable("uuid") String uuid);
}
The client-config:
@Configuration
public class MyFeignHateoasClientConfig {
@Value("${service.user.name:bla}")
private String serviceUser;
@Value("${service.user.password:blub}")
private String servicePassword;
@Bean
public BasicAuthRequestInterceptor basicAuth() {
return new BasicAuthRequestInterceptor(serviceUser, servicePassword);
}
@Bean
public Decoder decoder() {
return new JacksonDecoder(getObjectMapper());
}
@Bean
public Encoder encoder() {
return new JacksonEncoder(getObjectMapper());
}
public ObjectMapper getObjectMapper() {
return new ObjectMapper()
.configure(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES, false)
.registerModule(new Jackson2HalModule());
}
@Bean
public Logger logger() {
return new Slf4jLogger(MyFeignHateoasClient.class);
}
@Bean
public Logger.Level logLevel() {
return Logger.Level.FULL;
}
}
And in the application using the client via a jar dependency:
@SpringBootApplication
@EnableAutoConfiguration
@EnableFeignClients(basePackageClasses=MyFeignHateoasClient.class)
@EnableHypermediaSupport(type = EnableHypermediaSupport.HypermediaType.HAL)
@ComponentScan(excludeFilters = @Filter(type = ... ), basePackageClasses= {....class}, basePackages="...")
public class Application {
...
Now this is working:
@Autowired
private MyFeignHateoasClient serviceClient;
...
void test() {
Resource<Booking> booking = serviceClient.getBookingByUuid(id);
Link link = booking.getLink("relation-name");
}
Now my question:
How do I go on from here, i.e. navigate to the resource behind the Link?
The Link contains the URL of the resource I want to request.
Do I really have to parse the ID out of the URL and add a method to the FeignClient like getRelationById(id)?
Is there at least a way to pass the complete resource URL to a method of a FeignClient?
I have found no examples which demonstrate how to proceed from here (apart from POST/modify). Any hints appreciated!
Thx
My current solution:
I added an additional request in the Feign client, taking the whole resource path:
...
public interface MyFeignHateoasClient {
...
@RequestMapping(method = RequestMethod.GET, path = "{resource}")
Resource<MyLinkedEntity> getMyEntityByResource(@PathVariable("resource") String resource);
}
Then I implemented some kind of "HAL-Tool":
...
import java.lang.reflect.Field;
import java.lang.reflect.InvocationHandler;
import java.lang.reflect.Proxy;
import org.springframework.hateoas.Link;
import feign.Target;
import lombok.SneakyThrows;
public class HalTool {
private Object feignClient;
public static HalTool forClient( Object feignClient ) {
return new HalTool(feignClient);
}
private HalTool( Object feignClient ) {
this.feignClient = feignClient;
}
@SneakyThrows
private String getUrl() {
InvocationHandler invocationHandler = Proxy.getInvocationHandler(feignClient);
Field target = invocationHandler.getClass().getDeclaredField("target");
target.setAccessible(true);
Target<?> value = (Target<?>) target.get(invocationHandler);
return value.url();
}
public String toPath( Link link ) {
String href = link.getHref();
String url = getUrl();
int idx = href.indexOf(url);
if (idx >= 0 ) {
idx += url.length();
}
return href.substring(idx);
}
}
And then I could request a linked resource like this:
Link link = booking.getLink("relation-name");
Resource<MyLinkedEntity> entity = serviceClient.getMyEntityByResource(
HalTool.forClient(serviceClient).toPath(link));
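For what it's worth, a plain java.net.URI parameter might avoid the reflection in HalTool entirely: Feign treats a URI method argument as an override of the request target, so the absolute href from the link can be passed through as-is. A sketch, assuming a Feign version that supports the URI override:
import java.net.URI;
public interface MyFeignHateoasClient {
    // The URI argument replaces the configured target URL for this call only.
    @RequestMapping(method = RequestMethod.GET)
    Resource<MyLinkedEntity> getByLink(URI resourceUrl);
}
// Usage:
// Link link = booking.getLink("relation-name");
// Resource<MyLinkedEntity> entity = serviceClient.getByLink(URI.create(link.getHref()));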
We have an issue with one of our Kafka topics, which is consumed by the DefaultKafkaConsumerFactory & ConcurrentMessageListenerContainer combination described here, with a JsonDeserializer used by the factory. Unfortunately someone got a little enthusiastic and published some invalid messages onto the topic. It appears that spring-kafka silently fails to process past the first of these messages.
Is it possible to have spring-kafka log an error and continue? Looking at the error messages which are logged, it seems that perhaps the Apache kafka-clients library should deal with the case that, when iterating a batch of messages, one or more of them may fail to parse?
The below code is an example test case illustrating this issue:
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.common.serialization.Serializer;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.apache.kafka.common.serialization.StringSerializer;
import org.junit.ClassRule;
import org.junit.Test;
import org.springframework.kafka.core.DefaultKafkaConsumerFactory;
import org.springframework.kafka.core.DefaultKafkaProducerFactory;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.listener.KafkaMessageListenerContainer;
import org.springframework.kafka.listener.MessageListener;
import org.springframework.kafka.listener.config.ContainerProperties;
import org.springframework.kafka.support.SendResult;
import org.springframework.kafka.support.serializer.JsonDeserializer;
import org.springframework.kafka.support.serializer.JsonSerializer;
import org.springframework.kafka.test.rule.KafkaEmbedded;
import org.springframework.kafka.test.utils.ContainerTestUtils;
import org.springframework.util.concurrent.ListenableFuture;
import java.util.HashMap;
import java.util.Map;
import java.util.Objects;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.TimeoutException;
import static org.junit.Assert.assertEquals;
import static org.junit.Assert.assertThat;
import static org.springframework.kafka.test.hamcrest.KafkaMatchers.hasKey;
import static org.springframework.kafka.test.hamcrest.KafkaMatchers.hasValue;
/**
* @author jfreedman
*/
public class TestSpringKafka {
private static final String TOPIC1 = "spring.kafka.1.t";
@ClassRule
public static KafkaEmbedded embeddedKafka = new KafkaEmbedded(1, true, 1, TOPIC1);
@Test
public void submitMessageThenGarbageThenAnotherMessage() throws Exception {
final BlockingQueue<ConsumerRecord<String, JsonObject>> records = createListener(TOPIC1);
final KafkaTemplate<String, JsonObject> objectTemplate = createPublisher("json", new JsonSerializer<JsonObject>());
sendAndVerifyMessage(records, objectTemplate, "foo", new JsonObject("foo"), 0L);
// push some garbage text to Kafka which cannot be marshalled, this should not interrupt processing
final KafkaTemplate<String, String> garbageTemplate = createPublisher("garbage", new StringSerializer());
final SendResult<String, String> garbageResult = garbageTemplate.send(TOPIC1, "bar","bar").get(5, TimeUnit.SECONDS);
assertEquals(1L, garbageResult.getRecordMetadata().offset());
sendAndVerifyMessage(records, objectTemplate, "baz", new JsonObject("baz"), 2L);
}
private <T> KafkaTemplate<String, T> createPublisher(final String label, final Serializer<T> serializer) {
final Map<String, Object> producerProps = new HashMap<>();
producerProps.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, embeddedKafka.getBrokersAsString());
producerProps.put(ProducerConfig.CLIENT_ID_CONFIG, "TestPublisher-" + label);
producerProps.put(ProducerConfig.ACKS_CONFIG, "all");
producerProps.put(ProducerConfig.RETRIES_CONFIG, 2);
producerProps.put(ProducerConfig.MAX_IN_FLIGHT_REQUESTS_PER_CONNECTION, 1);
producerProps.put(ProducerConfig.REQUEST_TIMEOUT_MS_CONFIG, 5000);
producerProps.put(ProducerConfig.MAX_BLOCK_MS_CONFIG, 5000);
producerProps.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
producerProps.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, serializer.getClass());
final DefaultKafkaProducerFactory<String, T> pf = new DefaultKafkaProducerFactory<>(producerProps);
pf.setValueSerializer(serializer);
return new KafkaTemplate<>(pf);
}
private BlockingQueue<ConsumerRecord<String, JsonObject>> createListener(final String topic) throws Exception {
final Map<String, Object> consumerProps = new HashMap<>();
consumerProps.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, embeddedKafka.getBrokersAsString());
consumerProps.put(ConsumerConfig.GROUP_ID_CONFIG, "TestConsumer");
consumerProps.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, true);
consumerProps.put(ConsumerConfig.AUTO_COMMIT_INTERVAL_MS_CONFIG, "100");
consumerProps.put(ConsumerConfig.SESSION_TIMEOUT_MS_CONFIG, 15000);
consumerProps.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
consumerProps.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, JsonDeserializer.class);
final DefaultKafkaConsumerFactory<String, JsonObject> cf = new DefaultKafkaConsumerFactory<>(consumerProps);
cf.setValueDeserializer(new JsonDeserializer<>(JsonObject.class));
final KafkaMessageListenerContainer<String, JsonObject> container = new KafkaMessageListenerContainer<>(cf, new ContainerProperties(topic));
final BlockingQueue<ConsumerRecord<String, JsonObject>> records = new LinkedBlockingQueue<>();
container.setupMessageListener((MessageListener<String, JsonObject>) records::add);
container.setBeanName("TestListener");
container.start();
ContainerTestUtils.waitForAssignment(container, embeddedKafka.getPartitionsPerTopic());
return records;
}
private void sendAndVerifyMessage(final BlockingQueue<ConsumerRecord<String, JsonObject>> records,
final KafkaTemplate<String, JsonObject> template,
final String key, final JsonObject value,
final long expectedOffset) throws InterruptedException, ExecutionException, TimeoutException {
final ListenableFuture<SendResult<String, JsonObject>> future = template.send(TOPIC1, key, value);
final ConsumerRecord<String, JsonObject> record = records.poll(5, TimeUnit.SECONDS);
assertThat(record, hasKey(key));
assertThat(record, hasValue(value));
assertEquals(expectedOffset, future.get(5, TimeUnit.SECONDS).getRecordMetadata().offset());
}
public static final class JsonObject {
private String value;
public JsonObject() {}
JsonObject(final String value) {
this.value = value;
}
public String getValue() {
return value;
}
public void setValue(final String value) {
this.value = value;
}
@Override
public boolean equals(final Object o) {
if (this == o) { return true; }
if (o == null || getClass() != o.getClass()) { return false; }
final JsonObject that = (JsonObject) o;
return Objects.equals(value, that.value);
}
@Override
public int hashCode() {
return Objects.hash(value);
}
@Override
public String toString() {
return "JsonObject{" +
"value='" + value + '\'' +
'}';
}
}
}
I have a solution, but I don't know if it's the best one. I extended JsonDeserializer as follows, which results in a null value being consumed by spring-kafka and requires the necessary downstream changes to handle that case.
class SafeJsonDeserializer[A >: Null](targetType: Class[A], objectMapper: ObjectMapper) extends JsonDeserializer[A](targetType, objectMapper) with Logging {
override def deserialize(topic: String, data: Array[Byte]): A = try {
super.deserialize(topic, data)
} catch {
case e: Exception =>
logger.error("Failed to deserialize data [%s] from topic [%s]".format(new String(data), topic), e)
null
}
}
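Wiring it in mirrors the factory setup from the test above (a sketch; the Scala class is instantiated from Java here):
// Replace the stock JsonDeserializer with the exception-swallowing variant, so a
// poison message is consumed as a null value instead of stalling the container.
final DefaultKafkaConsumerFactory<String, JsonObject> cf = new DefaultKafkaConsumerFactory<>(consumerProps);
cf.setValueDeserializer(new SafeJsonDeserializer<>(JsonObject.class, new ObjectMapper()));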
Starting from spring-kafka 2.x.x, we now have the comfort of declaring beans in the config file for the KafkaListenerErrorHandler interface, with an implementation something like:
@Bean
public ConsumerAwareListenerErrorHandler listen3ErrorHandler() {
return (m, e, c) -> {
this.listen3Exception = e;
MessageHeaders headers = m.getHeaders();
c.seek(new org.apache.kafka.common.TopicPartition(
headers.get(KafkaHeaders.RECEIVED_TOPIC, String.class),
headers.get(KafkaHeaders.RECEIVED_PARTITION_ID, Integer.class)),
headers.get(KafkaHeaders.OFFSET, Long.class));
return null;
};
}
More resources can be found at https://docs.spring.io/spring-kafka/reference/htmlsingle/#annotation-error-handling. There are also other questions covering similar issues: Spring Kafka error handling - v1.1.x and How to handle SerializationException after deserialization.
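The handler bean is then referenced by name from the listener itself (a sketch, assuming an @KafkaListener-based consumer and the JsonObject payload type from the test above):
// The errorHandler attribute points at the ConsumerAwareListenerErrorHandler bean above.
@KafkaListener(id = "listen3", topics = "spring.kafka.1.t", errorHandler = "listen3ErrorHandler")
public void listen3(JsonObject payload) {
    // normal processing; listener errors are routed to the handler bean
}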
Use ErrorHandlingDeserializer2. This is a delegating key/value deserializer that catches exceptions, returning them in the headers as serialized Java objects.
Under consumer configuration, add/update the below lines:
import org.apache.kafka.clients.consumer.ConsumerConfig
import org.springframework.kafka.support.serializer.ErrorHandlingDeserializer2
configProps.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG,
  classOf[ErrorHandlingDeserializer2[_]].getName)
configProps.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, classOf[ErrorHandlingDeserializer2[_]].getName)
configProps.put(ErrorHandlingDeserializer2.KEY_DESERIALIZER_CLASS, classOf[StringDeserializer].getName)
configProps.put(ErrorHandlingDeserializer2.VALUE_DESERIALIZER_CLASS, classOf[JsonDeserializer].getName)
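In Java, the equivalent wiring looks roughly like this (a sketch assuming spring-kafka 2.2.x, where JsonDeserializer.VALUE_DEFAULT_TYPE names the type the delegate binds to):
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.springframework.kafka.support.serializer.ErrorHandlingDeserializer2;
import org.springframework.kafka.support.serializer.JsonDeserializer;
// Kafka instantiates the error-handling deserializers...
configProps.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, ErrorHandlingDeserializer2.class);
configProps.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, ErrorHandlingDeserializer2.class);
// ...and these properties tell them which real deserializers to delegate to.
configProps.put(ErrorHandlingDeserializer2.KEY_DESERIALIZER_CLASS, StringDeserializer.class.getName());
configProps.put(ErrorHandlingDeserializer2.VALUE_DESERIALIZER_CLASS, JsonDeserializer.class.getName());
configProps.put(JsonDeserializer.VALUE_DEFAULT_TYPE, JsonObject.class.getName());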
I have a problem using MOXy to convert a JSON string to an XML object.
Here is the exception I get when I do this conversion:
Exception [EclipseLink-43] (Eclipse Persistence Services - 2.6.2.v20151217-774c696): org.eclipse.persistence.exceptions.DescriptorException
Exception Description: Missing class for indicator field value [TENANT] of type [class java.lang.String].
Descriptor: XMLDescriptor(fr.niji.nates.webservices.macd.ws.COMPONENTTYPE --> [])
at org.eclipse.persistence.exceptions.DescriptorException.missingClassForIndicatorFieldValue(DescriptorException.java:940)
at org.eclipse.persistence.internal.oxm.QNameInheritancePolicy.classFromRow(QNameInheritancePolicy.java:278)
[...]
Here is the class COMPONENTTYPE:
import javax.xml.bind.annotation.XmlAccessType;
import javax.xml.bind.annotation.XmlAccessorType;
import javax.xml.bind.annotation.XmlAttribute;
import javax.xml.bind.annotation.XmlSeeAlso;
import javax.xml.bind.annotation.XmlType;
@XmlAccessorType(XmlAccessType.FIELD)
@XmlType(name = "COMPONENT_TYPE")
@XmlSeeAlso({
COMPONENTDETAILTYPE.class,
MACDRESULTTYPE.Created.class
})
public class COMPONENTTYPE {
@XmlAttribute(name = "type", required = true)
protected String type;
@XmlAttribute(name = "dbid", required = true)
protected int dbid;
public String getType() {
return type;
}
public void setType(String value) {
this.type = value;
}
public int getDbid() {
return dbid;
}
public void setDbid(int value) {
this.dbid = value;
}
}
The problem seems to be only with the "type" attribute.
Does anyone have an idea?
Thanks,
The solution I found is to add the annotation @XmlDiscriminatorNode to the class:
import javax.xml.bind.annotation.XmlAccessType;
import javax.xml.bind.annotation.XmlAccessorType;
import javax.xml.bind.annotation.XmlAttribute;
import javax.xml.bind.annotation.XmlSeeAlso;
import javax.xml.bind.annotation.XmlType;
import org.eclipse.persistence.oxm.annotations.XmlDiscriminatorNode;
@XmlAccessorType(XmlAccessType.FIELD)
@XmlType(name = "COMPONENT_TYPE")
@XmlSeeAlso({
COMPONENTDETAILTYPE.class,
fr.niji.nates.webservices.macd.ws.MACDRESULTTYPE.Created.class
})
@XmlDiscriminatorNode("@type")
public class COMPONENTTYPE {
[...]
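For completeness, each concrete subclass then declares which indicator value maps to it via @XmlDiscriminatorValue. A sketch (the TENANT value is taken from the exception message above; the subclass name is hypothetical):
import org.eclipse.persistence.oxm.annotations.XmlDiscriminatorValue;
// Tells MOXy to unmarshal a "type" indicator of "TENANT" as this subclass.
@XmlDiscriminatorValue("TENANT")
public class TENANTCOMPONENTTYPE extends COMPONENTTYPE {
[...]
}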
I'm using Spring 4.2 to create some RESTful web services.
But we realized that when a user mistypes one of the non-mandatory @RequestParam names, we do not get an error that the param he passed is unknown.
For example, we have @RequestParam(required=false, value="valueA") String valueA, and in the call the user passes '?valuueA=AA' -> we want an error.
But I do not seem to find a way to do this; the value is just ignored and the user is unaware of this.
One possible solution would be to create an implementation of HandlerInterceptor which will verify that all request parameters passed to the handler method are declared in its @RequestParam annotated parameters.
However, you should consider the disadvantages of such a solution. There might be situations where you want to allow certain parameters to be passed in and not be declared as request params. For instance, if you have a request like GET /foo?page=1&offset=0 and a handler with the following signature:
@RequestMapping
public List<Foo> listFoos(PagingParams page);
and PagingParams is a class containing page and offset properties, it will normally be mapped from the request parameters. An implementation of the solution you want would interfere with this Spring MVC functionality.
That being said, here is a sample implementation I had in mind:
public class UndeclaredParamsHandlerInterceptor extends HandlerInterceptorAdapter {
@Override
public boolean preHandle(HttpServletRequest request, HttpServletResponse response,
Object handler) throws Exception {
if (handler instanceof HandlerMethod) {
HandlerMethod handlerMethod = (HandlerMethod) handler;
checkParams(request, getDeclaredRequestParams(handlerMethod));
}
return true;
}
private void checkParams(HttpServletRequest request, Set<String> allowedParams) {
request.getParameterMap().entrySet().forEach(entry -> {
String param = entry.getKey();
if (!allowedParams.contains(param)) {
throw new UndeclaredRequestParamException(param, allowedParams);
}
});
}
private Set<String> getDeclaredRequestParams(HandlerMethod handlerMethod) {
Set<String> declaredRequestParams = new HashSet<>();
MethodParameter[] methodParameters = handlerMethod.getMethodParameters();
ParameterNameDiscoverer parameterNameDiscoverer = new DefaultParameterNameDiscoverer();
for (MethodParameter methodParameter : methodParameters) {
if (methodParameter.hasParameterAnnotation(RequestParam.class)) {
RequestParam requestParam = methodParameter.getParameterAnnotation(RequestParam.class);
if (StringUtils.hasText(requestParam.value())) {
declaredRequestParams.add(requestParam.value());
} else {
methodParameter.initParameterNameDiscovery(parameterNameDiscoverer);
declaredRequestParams.add(methodParameter.getParameterName());
}
}
}
return declaredRequestParams;
}
}
Basically this will do what I described above. You can then add an exception handler for the exception it throws and translate it to an HTTP 400 response. I've put a more complete sample on GitHub, which includes a way to selectively enable this behavior for individual handler methods via an annotation.
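For reference, the exception handler mentioned above could look like this (a sketch; UndeclaredRequestParamException is the exception thrown by the interceptor):
import org.springframework.http.HttpStatus;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.ControllerAdvice;
import org.springframework.web.bind.annotation.ExceptionHandler;
@ControllerAdvice
public class UndeclaredParamsExceptionHandler {
    // Translates the interceptor's exception into an HTTP 400 for the caller.
    @ExceptionHandler(UndeclaredRequestParamException.class)
    public ResponseEntity<String> handleUndeclaredParam(UndeclaredRequestParamException ex) {
        return ResponseEntity.status(HttpStatus.BAD_REQUEST)
                .body("Unknown request parameter: " + ex.getMessage());
    }
}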
I translated Bohuslav Burghardt's solution for Spring WebFlux applications.
I dropped the @DisallowUndeclaredRequestParams annotation class from the GitHub sample because I didn't need it; the filter here just applies to all HandlerMethods. But someone else could update this answer and put it back.
package com.example.springundeclaredparamerror;
import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.databind.ObjectMapper;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.core.DefaultParameterNameDiscoverer;
import org.springframework.core.MethodParameter;
import org.springframework.core.ParameterNameDiscoverer;
import org.springframework.http.HttpStatus;
import org.springframework.http.server.reactive.ServerHttpRequest;
import org.springframework.stereotype.Component;
import org.springframework.util.StringUtils;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.method.HandlerMethod;
import org.springframework.web.reactive.result.method.annotation.RequestMappingHandlerMapping;
import org.springframework.web.server.ServerWebExchange;
import org.springframework.web.server.WebFilter;
import org.springframework.web.server.WebFilterChain;
import reactor.core.publisher.Mono;
import java.nio.charset.StandardCharsets;
import java.util.HashSet;
import java.util.Optional;
import java.util.Set;
/**
* Handler interceptor used for ensuring that no request params other than those explicitly
* declared via {@link RequestParam} parameters of the handler method are passed in.
*/
// Implementation translated into WebFlux WebFilter from:
// https://github.com/bohuslav-burghardt/spring-sandbox/tree/master/handler-interceptors/src/main/java/handler_interceptors
@Component
public class DisallowUndeclaredParamsFilter implements WebFilter {
private static final Logger LOGGER = LoggerFactory.getLogger(DisallowUndeclaredParamsFilter.class);
@Autowired
@Qualifier("requestMappingHandlerMapping")
RequestMappingHandlerMapping mapping;
@Autowired
ObjectMapper mapper;
@Override
public Mono<Void> filter(ServerWebExchange serverWebExchange, WebFilterChain webFilterChain) {
Object o = mapping.getHandler(serverWebExchange).toFuture().getNow(null);
Optional<String> undeclaredParam = Optional.empty();
if (o != null && o instanceof HandlerMethod) {
var handlerMethod = (HandlerMethod) o;
undeclaredParam = checkParams(serverWebExchange.getRequest(),
getDeclaredRequestParams(handlerMethod));
}
return undeclaredParam.map((param) -> RespondWithError(serverWebExchange, param))
.orElseGet(() -> webFilterChain.filter(serverWebExchange));
}
/** Responds to the request with an error message for the given undeclared parameter. */
private Mono<Void> RespondWithError(ServerWebExchange serverWebExchange, String undeclaredParam) {
final HttpStatus status = HttpStatus.BAD_REQUEST;
serverWebExchange.getResponse().setStatusCode(status);
serverWebExchange.getResponse().getHeaders().add(
"Content-Type", "application/json");
UndeclaredParamErrorResponse response = new UndeclaredParamErrorResponse();
response.message = "Parameter not expected: " + undeclaredParam;
response.statusCode = status.value();
String error = null;
try {
error = mapper.writeValueAsString(response);
} catch (JsonProcessingException e) {
error = "Parameter not expected; error generating JSON response";
LOGGER.warn("Error generating JSON response for undeclared argument", e);
}
return serverWebExchange.getResponse().writeAndFlushWith(
Mono.just(Mono.just(serverWebExchange.getResponse().bufferFactory().wrap(
error.getBytes(StandardCharsets.UTF_8)))));
}
/** Structure for generating error JSON. */
static class UndeclaredParamErrorResponse {
public String message;
public int statusCode;
}
/**
* Check that all of the request params of the specified request are contained within the specified set of allowed
* parameters.
*
* @param request Request whose params to check.
* @param allowedParams Set of allowed request parameters.
* @return Name of a param in the request that is not allowed, or empty if all params in the request are allowed.
*/
private Optional<String> checkParams(ServerHttpRequest request, Set<String> allowedParams) {
return request.getQueryParams().keySet().stream().filter(param ->
!allowedParams.contains(param)
).findFirst();
}
/**
* Extract all request parameters declared via {@link RequestParam} for the specified handler method.
*
* @param handlerMethod Handler method to extract declared params for.
* @return Set of declared request parameters.
*/
private Set<String> getDeclaredRequestParams(HandlerMethod handlerMethod) {
Set<String> declaredRequestParams = new HashSet<>();
MethodParameter[] methodParameters = handlerMethod.getMethodParameters();
ParameterNameDiscoverer parameterNameDiscoverer = new DefaultParameterNameDiscoverer();
for (MethodParameter methodParameter : methodParameters) {
if (methodParameter.hasParameterAnnotation(RequestParam.class)) {
RequestParam requestParam = methodParameter.getParameterAnnotation(RequestParam.class);
if (StringUtils.hasText(requestParam.value())) {
declaredRequestParams.add(requestParam.value());
} else {
methodParameter.initParameterNameDiscovery(parameterNameDiscoverer);
declaredRequestParams.add(methodParameter.getParameterName());
}
}
}
return declaredRequestParams;
}
}
Here's the unit test I wrote for it. I recommend checking it into your codebase as well.
package com.example.springundeclaredparamerror;
import com.github.tomakehurst.wiremock.junit.WireMockRule;
import org.junit.Rule;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.autoconfigure.web.reactive.WebFluxTest;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Import;
import org.springframework.test.context.junit4.SpringRunner;
import org.springframework.test.web.reactive.server.WebTestClient;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;
import reactor.core.publisher.Mono;
import static com.github.tomakehurst.wiremock.core.WireMockConfiguration.wireMockConfig;
@RunWith(SpringRunner.class)
@WebFluxTest(controllers = {DisallowUndeclaredParamFilterTest.TestController.class})
public class DisallowUndeclaredParamFilterTest {
private static final String TEST_ENDPOINT = "/disallowUndeclaredParamFilterTest";
@Rule
public final WireMockRule wireMockRule = new WireMockRule(wireMockConfig().dynamicPort());
@Autowired
private WebTestClient webClient;
@Configuration
@Import({TestController.class, DisallowUndeclaredParamsFilter.class})
static class TestConfig {
}
@RestController
static class TestController {
@GetMapping(TEST_ENDPOINT)
public Mono<String> retrieveEntity(@RequestParam(name = "a", required = false) final String a) {
return Mono.just("ok");
}
}
@Test
public void testAllowsNoArgs() {
webClient.get().uri(TEST_ENDPOINT).exchange().expectBody(String.class).isEqualTo("ok");
}
@Test
public void testAllowsDeclaredArg() {
webClient.get().uri(TEST_ENDPOINT + "?a=1").exchange().expectBody(String.class).isEqualTo("ok");
}
@Test
public void testDisallowsUndeclaredArg() {
webClient.get().uri(TEST_ENDPOINT + "?b=1").exchange().expectStatus().is4xxClientError();
}
}
I've just started evaluating Neo4j to see how well it fits our use case.
I'm using the embedded Java API to insert edges and nodes into a graph.
After creating around 5000 nodes, I get the following error (using Neo4j 2.1.6 and 2.1.7 on OS X Yosemite):
org.neo4j.graphdb.TransactionFailureException: Unable to commit transaction
Caused by: javax.transaction.xa.XAException
Caused by: org.neo4j.kernel.impl.nioneo.store.UnderlyingStorageException: java.io.FileNotFoundException: /Users/mihir.k/IdeaProjects/Turant/target/neo4j-hello-db/schema/label/lucene/_8zr.frq (Too many open files)
Caused by: java.io.FileNotFoundException: /Users/mihir.k/IdeaProjects/Turant/target/neo4j-hello-db/schema/label/lucene/_8zr.frq (Too many open files)
I've looked at numerous similar Stack Overflow questions and other related threads online. They all suggest increasing the max open files limit.
I've tried doing that.
These are my settings:
kern.maxfiles: 65536
kern.maxfilesperproc: 65536
However this hasn't fixed the error.
While the Neo4j code runs, I tried using the lsof | wc -l command. The code always breaks when around 10,000 files are open.
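One way to confirm which limit the JVM actually inherited (a high kern.maxfiles does not guarantee the per-process ulimit was raised) is to read the process-level numbers from inside the running code. A sketch using the HotSpot-specific com.sun.management extension:
import java.lang.management.ManagementFactory;
import com.sun.management.UnixOperatingSystemMXBean;
// Prints the JVM's view of its current and maximum file descriptors; if "max"
// reports ~10240, the per-process limit, not kern.maxfiles, is what is binding.
UnixOperatingSystemMXBean os = (UnixOperatingSystemMXBean) ManagementFactory.getOperatingSystemMXBean();
System.out.println("open fds: " + os.getOpenFileDescriptorCount() + " / max: " + os.getMaxFileDescriptorCount());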
The following is the main class that deals with Neo4j:
import java.io.File;
import java.io.Serializable;
import java.util.Map;
import java.util.concurrent.TimeUnit;
import org.neo4j.cypher.internal.compiler.v1_9.commands.True;
import org.neo4j.cypher.internal.compiler.v2_0.ast.False;
import org.neo4j.graphdb.*;
import org.neo4j.graphdb.factory.GraphDatabaseFactory;
import org.neo4j.graphdb.schema.Schema;
import org.neo4j.graphdb.schema.IndexDefinition;
import org.neo4j.graphdb.index.UniqueFactory;
import org.neo4j.graphdb.index.Index;
import org.neo4j.graphdb.index.IndexHits;
public class Neo4jDB implements Serializable {
private static final String DB_PATH = "target/neo4j-hello-db-spark";
IndexDefinition indexDefinition;
private static GraphDatabaseFactory dbFactory;
public static GraphDatabaseService db;
public void main(String[] args) {
System.out.println("Life is a disease, sexually transmitted and irrevocably fatal. Stop coding and read some Neil Gaiman.");
}
public void startDbInstance() {
db =new GraphDatabaseFactory().newEmbeddedDatabase(DB_PATH);
}
public Node createOrGetNode(LabelsUser360 label, String key, String nodeName, Map<String, Object> propertyMap)
{
System.out.println("Creating/Getting node");
try ( Transaction tx = db.beginTx() ) {
Node node;
if (db.findNodesByLabelAndProperty(label, key, nodeName).iterator().hasNext()) {
node = db.findNodesByLabelAndProperty(label, key, nodeName).iterator().next();
} else {
node = db.createNode(label);
node.setProperty(key, nodeName);
}
for (Map.Entry<String, Object> entry : propertyMap.entrySet()) {
node.setProperty(entry.getKey(), entry.getValue());
}
tx.success();
return node;
}
}
public void createUniquenessConstraint(LabelsUser360 label , String property)
{
try ( Transaction tx = db.beginTx() )
{
db.schema()
.constraintFor(label)
.assertPropertyIsUnique(property)
.create();
tx.success();
}
}
public void createOrUpdateRelationship(RelationshipsUser360 relationshipType, Node startNode, Node endNode, Map<String, Object> propertyMap)
{
try ( Transaction tx = db.beginTx() ) {
if (startNode.hasRelationship(relationshipType, Direction.OUTGOING)) {
Relationship relationship = startNode.getSingleRelationship(relationshipType, Direction.OUTGOING);
for (Map.Entry<String, Object> entry : propertyMap.entrySet()) {
relationship.setProperty(entry.getKey(), entry.getValue());
}
} else {
Relationship relationship = startNode.createRelationshipTo(endNode, relationshipType);
for (Map.Entry<String, Object> entry : propertyMap.entrySet()) {
relationship.setProperty(entry.getKey(), entry.getValue());
}
}
tx.success();
}
}
public void registerShutdownHook( final GraphDatabaseService graphDb )
{
Runtime.getRuntime().addShutdownHook( new Thread()
{
@Override
public void run()
{
db.shutdown();
}
} );
}
}
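One detail worth noting in the class above: registerShutdownHook is defined but never invoked in the code shown, so the embedded database may never release its store files cleanly. Wiring it in at startup would be a one-line change (a sketch):
public void startDbInstance() {
    db = new GraphDatabaseFactory().newEmbeddedDatabase(DB_PATH);
    // Make sure the store (and its open file handles) is released on JVM exit.
    registerShutdownHook(db);
}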
There is another Neo4jAdapter class that is used to implement domain-specific logic. It uses the Neo4jDB class to add/update nodes/properties/relationships:
import org.apache.lucene.index.IndexWriter;
import org.codehaus.jackson.map.ObjectMapper;
import org.json.*;
import org.neo4j.graphdb.*;
import org.neo4j.graphdb.factory.GraphDatabaseFactory;
import org.neo4j.graphdb.schema.IndexDefinition;
import java.io.*;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.HashMap;
import java.util.Iterator;
import java.util.Map;
import java.util.Set;
public class Neo4jAdapter implements Serializable {
static Neo4jDB n4j = new Neo4jDB();
public static GraphDatabaseService db = Neo4jDB.db ;
public void begin() {
n4j.startDbInstance();
}
public static void main(String[] args) {}
public String graphPut(String jsonString) {
System.out.println("graphput called");
HashMap<String, Object> map = jsonToMap(jsonString); //Json deserializer
Node startNode = n4j.createOrGetNode(...);
Node endNode = n4j.createOrGetNode(...);
Map<String, Object> propertyMap = new HashMap<String, Object>();
propertyMap.put(....);
try (Transaction tx = Neo4jDB.db.beginTx()) {
Relationship relationship = startNode.getSingleRelationship(...);
if (relationship != null) {
Integer currentCount = (Integer) relationship.getProperty("count");
Integer updatedCount = currentCount + 1;
propertyMap.put("count", updatedCount);
} else {
Integer updatedCount = 1;
propertyMap.put("count", updatedCount);
}
tx.success();
}
n4j.createOrUpdateRelationship(RelationshipsUser360.BLAH, startNode, endNode, propertyMap);
return "Are you sponge worthy??";
}
}
Finally, there is a Spark app that calls the graphPut method of the Neo4jAdapter class. The relevant code snippet is (the following is Scala + Spark code):
val graphdb : Neo4jAdapter = new Neo4jAdapter()
graphdb.begin()
linesEnriched.foreach(a => graphdb.graphPut(a))
where 'a' is a JSON string and linesEnriched is a Spark RDD (basically a set of strings)