Spring AOP Pointcut if condition - aop

I am facing an issue with pointcuts. I am trying to enable the @Around advice only when log.isDebugEnabled() is true, and for this I am trying the following code:
#Pointcut("within(org.apache.commons.logging.impl.Log4JLogger..*)")
public boolean isDebugEnabled() {
return log.isDebugEnabled();
}
and for testing purposes, I have two advice methods configured:
@AfterThrowing(value = "!isDebugEnabled()", throwing = "exception")
and
@Around(value = "isDebugEnabled()")
But whenever I execute the code it always goes to @AfterThrowing, and it is not clear to me what I am doing wrong!
I am using aspectjweaver 1.8.9 with Spring MVC 4.3.
Here is a sample class emulating the issue:
import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import org.aspectj.lang.JoinPoint;
import org.aspectj.lang.ProceedingJoinPoint;
import org.aspectj.lang.annotation.AfterThrowing;
import org.aspectj.lang.annotation.Around;
import org.aspectj.lang.annotation.Aspect;
import org.aspectj.lang.annotation.Pointcut;
import org.springframework.stereotype.Component;
import org.springframework.util.StopWatch;
@Component
@Aspect
public class SampleAspect {
private static final Log log = LogFactory.getLog(SampleAspect.class);
#Pointcut("within(org.apache.commons.logging.impl.Log4JLogger..*)")
public boolean isDebugEnabled() {
return log.isDebugEnabled();
}
@AfterThrowing(value = " !isDebugEnabled()", throwing = "exception")
public void getCalledOnException(JoinPoint joinPoint, Exception exception) {
log.error("Method " + joinPoint.getSignature() + " Throws the exception " + exception.getStackTrace());
}
// The around advice never executes, even when log.isDebugEnabled() == true
@Around(value = "isDebugEnabled()")
public Object aroundTest(ProceedingJoinPoint proceedingJoinPoint) throws Throwable {
StopWatch stopWatch = new StopWatch();
stopWatch.start();
final Object proceed;
try {
proceed = proceedingJoinPoint.proceed();
} catch (Exception e) {
throw e;
}
stopWatch.stop();
log.debug("It took " + stopWatch.getTotalTimeSeconds() + " seconds to be proceed");
return proceed;
}
}
Edit:
I tried to use if() from AspectJ, but it didn't work in my project either.
#Pointcut("call(* *.*(int)) && args(i) && if()")
public static boolean someCallWithIfTest(int i) {
return i > 0;
}
I am not sure whether I need a different import or something else, but I did not manage to make it work.

A couple of points from the documentation:
== Spring AOP Capabilities and Goals
Spring AOP currently supports only method execution join points
(advising the execution of methods on Spring beans)
=== Declaring a Pointcut
In the @AspectJ annotation-style of AOP, a pointcut signature is provided
by a regular method definition, and the pointcut expression is
indicated by using the @Pointcut annotation (the method serving as the
pointcut signature must have a void return type).
Apache Commons classes are not managed by the Spring container, so the following will not be honoured:
@Pointcut("within(org.apache.commons.logging.impl.Log4JLogger..*)")
The following pointcut method is not valid:
public boolean isDebugEnabled() {
return log.isDebugEnabled();
}


How to properly test flux from sink (processor)?

I have a processor-like class which internally uses a sink. I have made an extremely simplified one to showcase my question:
import reactor.core.publisher.Sinks;
import reactor.test.StepVerifier;
import java.time.Duration;
public class TestBed {
public static void main(String[] args) {
class StringProcessor {
public final Sinks.Many<String> sink = Sinks.many().multicast().directBestEffort();
public void httpPostWebhookController(String inputData) {
sink.emitNext(
inputData.toLowerCase() + " " + inputData.toUpperCase(),
(signalType, emitResult) -> {
System.out.println("error, signalType=" + signalType + "; emitResult=" + emitResult);
return false;
}
);
}
}
final StringProcessor stringProcessor = new StringProcessor();
final StepVerifier stepVerifier = StepVerifier.create(stringProcessor.sink.asFlux())
.expectSubscription()
.expectNext("asdf ASDF")
.expectNext("qw QW")
.thenCancel();
stringProcessor.httpPostWebhookController("asdf");
stringProcessor.httpPostWebhookController("Qw");
stepVerifier.verify(Duration.ofSeconds(2));
}
}
My StepVerifier does not subscribe, and when it does subscribe (upon the verify(Duration) call), it misses the test signals. I cannot move the verify call before the httpPostWebhookController calls, because verify is blocking and would fail since no signal comes.
How do I use StepVerifier in such a scenario?
As I learned after asking on a Udemy course (instructor Vinoth Selvaraj), the solution is to use the verifyLater call. It triggers the subscription and does not block. Fixed test code:
final StringProcessor stringProcessor = new StringProcessor();
final StepVerifier stepVerifier = StepVerifier.create(stringProcessor.sink.asFlux().log())
.expectSubscription()
.expectNext("asdf ASDF")
.expectNext("qw QW")
.thenCancel()
.verifyLater();
stringProcessor.httpPostWebhookController("asdf");
stringProcessor.httpPostWebhookController("Qw");
stepVerifier.verify(Duration.ofSeconds(2));
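An alternative sketch that avoids deferred verification, using StepVerifier's then(Runnable) step so the emissions happen only after the verifier has subscribed (same StringProcessor and imports as above):
final StringProcessor stringProcessor = new StringProcessor();
StepVerifier.create(stringProcessor.sink.asFlux())
        .expectSubscription()
        // Emit only after the subscription is in place, as part of the expectation script.
        .then(() -> stringProcessor.httpPostWebhookController("asdf"))
        .expectNext("asdf ASDF")
        .then(() -> stringProcessor.httpPostWebhookController("Qw"))
        .expectNext("qw QW")
        .thenCancel()
        .verify(Duration.ofSeconds(2));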

BinaryInvalidTypeException in Ignite Remote Filter

The following code is based on a combination of Ignite's CacheQueryExample and CacheContinuousQueryExample.
The code starts a fat Ignite client. Three organizations are created in the cache and we are listening to the updates to the cache. The remote filter is set to trigger the continuous query if the organization name is "Google". Peer class loading is enabled by the default examples xml config file (example-ignite.xml), so the expectation is that the remote node is aware of the Organization class.
However, the following exceptions are shown in the Ignite server's console (one for each cache entry), and all three records are returned to the client in the continuous query's event handler instead of just the "Google" record. If the filter is changed to check the key instead of the value, the correct behavior is observed and a single record is returned to the local listener.
[08:28:43,302][SEVERE][sys-stripe-1-#2][query] CacheEntryEventFilter failed: class o.a.i.binary.BinaryInvalidTypeException: o.a.i.examples.model.Organization
[08:28:51,819][SEVERE][sys-stripe-2-#3][query] CacheEntryEventFilter failed: class o.a.i.binary.BinaryInvalidTypeException: o.a.i.examples.model.Organization
[08:28:52,692][SEVERE][sys-stripe-3-#4][query] CacheEntryEventFilter failed: class o.a.i.binary.BinaryInvalidTypeException: o.a.i.examples.model.Organization
To run the code:
Start an Ignite server using examples/config/example-ignite.xml as the configuration file.
Replace the content of Ignite's CacheContinuousQueryExample.java with the following code. You may have to change the path to the configuration file to an absolute path.
package org.apache.ignite.examples.datagrid;
import javax.cache.Cache;
import javax.cache.configuration.Factory;
import javax.cache.event.CacheEntryEvent;
import javax.cache.event.CacheEntryEventFilter;
import javax.cache.event.CacheEntryUpdatedListener;
import org.apache.ignite.Ignite;
import org.apache.ignite.IgniteCache;
import org.apache.ignite.Ignition;
import org.apache.ignite.cache.CacheMode;
import org.apache.ignite.cache.affinity.AffinityKey;
import org.apache.ignite.cache.query.ContinuousQuery;
import org.apache.ignite.cache.query.QueryCursor;
import org.apache.ignite.cache.query.ScanQuery;
import org.apache.ignite.configuration.CacheConfiguration;
import org.apache.ignite.examples.ExampleNodeStartup;
import org.apache.ignite.examples.model.Organization;
import org.apache.ignite.examples.model.Person;
import org.apache.ignite.lang.IgniteBiPredicate;
import java.util.Collection;
/**
* This example demonstrates the continuous query API.
* <p>
* Remote nodes should always be started with the special configuration file which
* enables P2P class loading: {@code 'ignite.{sh|bat} examples/config/example-ignite.xml'}.
* <p>
* Alternatively you can run {@link ExampleNodeStartup} in another JVM, which will
* start a node with the {@code examples/config/example-ignite.xml} configuration.
*/
public class CacheContinuousQueryExample {
/** Organizations cache name. */
private static final String ORG_CACHE = CacheQueryExample.class.getSimpleName() + "Organizations";
/**
* Executes example.
*
* @param args Command line arguments, none required.
* @throws Exception If example execution failed.
*/
public static void main(String[] args) throws Exception {
Ignition.setClientMode(true);
try (Ignite ignite = Ignition.start("examples/config/example-ignite.xml")) {
System.out.println();
System.out.println(">>> Cache continuous query example started.");
CacheConfiguration<Long, Organization> orgCacheCfg = new CacheConfiguration<>(ORG_CACHE);
orgCacheCfg.setCacheMode(CacheMode.PARTITIONED); // Default.
orgCacheCfg.setIndexedTypes(Long.class, Organization.class);
// Auto-close cache at the end of the example.
try {
ignite.getOrCreateCache(orgCacheCfg);
// Create new continuous query.
ContinuousQuery<Long, Organization> qry = new ContinuousQuery<>();
// Callback that is called locally when update notifications are received.
qry.setLocalListener(new CacheEntryUpdatedListener<Long, Organization>() {
@Override public void onUpdated(Iterable<CacheEntryEvent<? extends Long, ? extends Organization>> evts) {
for (CacheEntryEvent<? extends Long, ? extends Organization> e : evts)
System.out.println("Updated entry [key=" + e.getKey() + ", val=" + e.getValue() + ']');
}
});
// This filter will be evaluated remotely on all nodes.
// Entry that pass this filter will be sent to the caller.
qry.setRemoteFilterFactory(new Factory<CacheEntryEventFilter<Long, Organization>>() {
@Override public CacheEntryEventFilter<Long, Organization> create() {
return new CacheEntryEventFilter<Long, Organization>() {
@Override public boolean evaluate(CacheEntryEvent<? extends Long, ? extends Organization> e) {
//return e.getKey() == 3;
return e.getValue().name().equals("Google");
}
};
}
});
ignite.getOrCreateCache(ORG_CACHE).query(qry);
// Populate caches.
initialize();
Thread.sleep(2000);
}
finally {
// Distributed cache could be removed from cluster only by #destroyCache() call.
ignite.destroyCache(ORG_CACHE);
}
}
}
/**
* Populate cache with test data.
*/
private static void initialize() {
IgniteCache<Long, Organization> orgCache = Ignition.ignite().cache(ORG_CACHE);
// Clear cache before running the example.
orgCache.clear();
// Organizations.
Organization org1 = new Organization("ApacheIgnite");
Organization org2 = new Organization("Apple");
Organization org3 = new Organization("Google");
orgCache.put(org1.id(), org1);
orgCache.put(org2.id(), org2);
orgCache.put(org3.id(), org3);
}
}
Here is an interim workaround that involves using and deserializing binary objects. Hopefully, someone can post a proper solution.
Here is the main() function modified to work with BinaryObjects instead of the Organization object:
public static void main(String[] args) throws Exception {
Ignition.setClientMode(true);
try (Ignite ignite = Ignition.start("examples/config/example-ignite.xml")) {
System.out.println();
System.out.println(">>> Cache continuous query example started.");
CacheConfiguration<Long, Organization> orgCacheCfg = new CacheConfiguration<>(ORG_CACHE);
orgCacheCfg.setCacheMode(CacheMode.PARTITIONED); // Default.
orgCacheCfg.setIndexedTypes(Long.class, Organization.class);
// Auto-close cache at the end of the example.
try {
ignite.getOrCreateCache(orgCacheCfg);
// Create new continuous query.
ContinuousQuery<Long, BinaryObject> qry = new ContinuousQuery<>();
// Callback that is called locally when update notifications are received.
qry.setLocalListener(new CacheEntryUpdatedListener<Long, BinaryObject>() {
@Override public void onUpdated(Iterable<CacheEntryEvent<? extends Long, ? extends BinaryObject>> evts) {
for (CacheEntryEvent<? extends Long, ? extends BinaryObject> e : evts) {
Organization org = e.getValue().deserialize();
System.out.println("Updated entry [key=" + e.getKey() + ", val=" + org + ']');
}
}
});
// This filter will be evaluated remotely on all nodes.
// Entry that pass this filter will be sent to the caller.
qry.setRemoteFilterFactory(new Factory<CacheEntryEventFilter<Long, BinaryObject>>() {
@Override public CacheEntryEventFilter<Long, BinaryObject> create() {
return new CacheEntryEventFilter<Long, BinaryObject>() {
@Override public boolean evaluate(CacheEntryEvent<? extends Long, ? extends BinaryObject> e) {
//return e.getKey() == 3;
//return e.getValue().name().equals("Google");
return e.getValue().field("name").equals("Google");
}
};
}
});
ignite.getOrCreateCache(ORG_CACHE).withKeepBinary().query(qry);
// Populate caches.
initialize();
Thread.sleep(2000);
}
finally {
// Distributed cache could be removed from cluster only by #destroyCache() call.
ignite.destroyCache(ORG_CACHE);
}
}
}
Peer class loading is enabled ... so the expectation is that the remote node is aware of the Organization class.
This is the problem. You can't peer class load "model" objects, i.e., objects used to create the table.
Two solutions:
Deploy the model class(es) to the server ahead of time. The rest of the code (the filters) can be peer-class-loaded.
As @rgb1380 demonstrates, you can use BinaryObjects, which is the underlying data format.
Another small point: to use "autoclose" you need to structure your code like this:
// Auto-close cache at the end of the example.
try (var cache = ignite.getOrCreateCache(orgCacheCfg)) {
// do stuff
}

Customized parameter logging when using aspect oriented programming

All the examples I've seen that use aspect-oriented programming for logging either log just the class, method name and duration, or, if they log parameters and return values, they simply use ToString(). I need more control over what is logged. For example, I want to skip passwords, or in some cases log all properties of an object but in other cases just the id property.
Any suggestions? I looked at AspectJ in Java and Unity interception in C# and could not find a solution.
You could try introducing parameter annotations to augment your parameters with some attributes. One of those attributes could signal to skip logging the parameter, another one could be used to specify a converter class for the string representation.
With the following annotations:
@Documented
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.METHOD)
public @interface Log {
}
@Documented
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.PARAMETER)
public @interface SkipLogging {
}
@Documented
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.PARAMETER)
public @interface ToStringWith {
Class<? extends Function<?, String>> value();
}
the aspect could look like this:
import java.lang.reflect.Parameter;
import java.util.function.Function;
import java.util.stream.Collectors;
import java.util.stream.IntStream;
import org.aspectj.lang.reflect.MethodSignature;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
public aspect LoggingAspect {
private final static Logger logger = LoggerFactory.getLogger(LoggingAspect.class);
pointcut loggableMethod(): execution(@Log * *..*.*(..));
before(): loggableMethod() {
MethodSignature signature = (MethodSignature) thisJoinPoint.getSignature();
Parameter[] parameters = signature.getMethod()
.getParameters();
String message = IntStream.range(0, parameters.length)
.filter(i -> this.isLoggable(parameters[i]))
.<String>mapToObj(i -> toString(parameters[i], thisJoinPoint.getArgs()[i]))
.collect(Collectors.joining(", ",
"method execution " + signature.getName() + "(", ")"));
Logger methodLogger = LoggerFactory.getLogger(
thisJoinPointStaticPart.getSignature().getDeclaringType());
methodLogger.debug(message);
}
private boolean isLoggable(Parameter parameter) {
return parameter.getAnnotation(SkipLogging.class) == null;
}
private String toString(Parameter parameter, Object value) {
ToStringWith toStringWith = parameter.getAnnotation(ToStringWith.class);
if (toStringWith != null) {
Class<? extends Function<?, String>> converterClass =
toStringWith.value();
try {
@SuppressWarnings("unchecked")
Function<Object, String> converter = (Function<Object, String>)
converterClass.newInstance();
String str = converter.apply(value);
return String.format("%s='%s'", parameter.getName(), str);
} catch (Exception e) {
logger.error("Couldn't instantiate toString converter for logging "
+ converterClass.getName(), e);
return String.format("%s=<error converting to string>",
parameter.getName());
}
} else {
return String.format("%s='%s'", parameter.getName(), String.valueOf(value));
}
}
}
Test code:
public static class SomethingToStringConverter implements Function<Something, String> {
@Override
public String apply(Something something) {
return "Something nice";
}
}
@Log
public static void test(
@ToStringWith(SomethingToStringConverter.class) Something something,
String string,
@SkipLogging Class<?> cls,
Object object) {
}
public static void main(String[] args) {
// execution of this method should log the following message:
// method execution test(something='Something nice', string='some string', object='null')
test(new Something(), "some string", Object.class, null);
}
I used the Java 8 Streams API in my answer for its compactness; you could convert the code to plain loops if you don't use Java 8 features or need better efficiency. It's just to give you an idea.
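If you prefer the @AspectJ annotation style, for example so the advice can also run with Spring AOP on Spring beans, the same idea can be expressed roughly as follows. This is a minimal sketch that omits the ToStringWith handling for brevity; the class name is illustrative:
import java.lang.reflect.Parameter;
import java.util.stream.Collectors;
import java.util.stream.IntStream;
import org.aspectj.lang.JoinPoint;
import org.aspectj.lang.annotation.Aspect;
import org.aspectj.lang.annotation.Before;
import org.aspectj.lang.reflect.MethodSignature;
import org.slf4j.LoggerFactory;
@Aspect
public class AnnotationStyleLoggingAspect {
    @Before("execution(@Log * *..*.*(..))")
    public void logParameters(JoinPoint joinPoint) {
        MethodSignature signature = (MethodSignature) joinPoint.getSignature();
        Parameter[] parameters = signature.getMethod().getParameters();
        Object[] args = joinPoint.getArgs();
        // Skip @SkipLogging parameters, then join the rest as name='value' pairs.
        String message = IntStream.range(0, parameters.length)
                .filter(i -> parameters[i].getAnnotation(SkipLogging.class) == null)
                .<String>mapToObj(i -> parameters[i].getName() + "='" + args[i] + "'")
                .collect(Collectors.joining(", ",
                        "method execution " + signature.getName() + "(", ")"));
        LoggerFactory.getLogger(signature.getDeclaringType()).debug(message);
    }
}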

Spring restController: how to error when unknown @RequestParam is in url

I'm using Spring 4.2 to create some RESTful web services.
But we realized that when a user mistypes one of the non-mandatory @RequestParam names, we do not get an error saying that the parameter he passed is unknown.
For example, we have @RequestParam(required=false, value="valueA") String valueA, and in the call he uses '?valuueA=AA' -> we want an error.
But I cannot seem to find a way to do this; the value is just ignored and the user is unaware of it.
One possible solution would be to create an implementation of HandlerInterceptor which verifies that all request parameters passed to the handler method are declared in its @RequestParam-annotated parameters.
However, you should consider the disadvantages of such a solution. There might be situations where you want to allow certain parameters to be passed in without declaring them as request params. For instance, if you have a request like GET /foo?page=1&offset=0 and a handler with the following signature:
@RequestMapping
public List<Foo> listFoos(PagingParams page);
and PagingParams is a class containing page and offset properties, it will normally be mapped from the request parameters. An implementation of the solution you want would interfere with this Spring MVC functionality.
That being said, here is a sample implementation I had in mind:
public class UndeclaredParamsHandlerInterceptor extends HandlerInterceptorAdapter {
@Override
public boolean preHandle(HttpServletRequest request, HttpServletResponse response,
Object handler) throws Exception {
if (handler instanceof HandlerMethod) {
HandlerMethod handlerMethod = (HandlerMethod) handler;
checkParams(request, getDeclaredRequestParams(handlerMethod));
}
return true;
}
private void checkParams(HttpServletRequest request, Set<String> allowedParams) {
request.getParameterMap().entrySet().forEach(entry -> {
String param = entry.getKey();
if (!allowedParams.contains(param)) {
throw new UndeclaredRequestParamException(param, allowedParams);
}
});
}
private Set<String> getDeclaredRequestParams(HandlerMethod handlerMethod) {
Set<String> declaredRequestParams = new HashSet<>();
MethodParameter[] methodParameters = handlerMethod.getMethodParameters();
ParameterNameDiscoverer parameterNameDiscoverer = new DefaultParameterNameDiscoverer();
for (MethodParameter methodParameter : methodParameters) {
if (methodParameter.hasParameterAnnotation(RequestParam.class)) {
RequestParam requestParam = methodParameter.getParameterAnnotation(RequestParam.class);
if (StringUtils.hasText(requestParam.value())) {
declaredRequestParams.add(requestParam.value());
} else {
methodParameter.initParameterNameDiscovery(parameterNameDiscoverer);
declaredRequestParams.add(methodParameter.getParameterName());
}
}
}
return declaredRequestParams;
}
}
Basically this will do what I described above. You can then add an exception handler for the exception it throws and translate it into an HTTP 400 response. I've put a more complete sample on GitHub, which includes a way to selectively enable this behavior for individual handler methods via an annotation.
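For completeness, a minimal sketch of such an exception handler, assuming the UndeclaredRequestParamException thrown by the interceptor above carries a useful message (the advice class name is illustrative):
import org.springframework.http.HttpStatus;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.ControllerAdvice;
import org.springframework.web.bind.annotation.ExceptionHandler;
@ControllerAdvice
public class UndeclaredParamExceptionAdvice {
    // Translate the interceptor's exception into an HTTP 400 response.
    @ExceptionHandler(UndeclaredRequestParamException.class)
    public ResponseEntity<String> handleUndeclaredParam(UndeclaredRequestParamException ex) {
        return ResponseEntity.status(HttpStatus.BAD_REQUEST).body(ex.getMessage());
    }
}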
I translated Bohuslav Burghardt's solution for Spring WebFlux applications.
I dropped the @DisallowUndeclaredRequestParams annotation class from the GitHub sample because I didn't need it; without it, the filter simply applies to all HandlerMethods. But someone else could update this answer and put it back.
package com.example.springundeclaredparamerror;
import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.databind.ObjectMapper;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.core.DefaultParameterNameDiscoverer;
import org.springframework.core.MethodParameter;
import org.springframework.core.ParameterNameDiscoverer;
import org.springframework.http.HttpStatus;
import org.springframework.http.server.reactive.ServerHttpRequest;
import org.springframework.stereotype.Component;
import org.springframework.util.StringUtils;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.method.HandlerMethod;
import org.springframework.web.reactive.result.method.annotation.RequestMappingHandlerMapping;
import org.springframework.web.server.ServerWebExchange;
import org.springframework.web.server.WebFilter;
import org.springframework.web.server.WebFilterChain;
import reactor.core.publisher.Mono;
import java.nio.charset.StandardCharsets;
import java.util.HashSet;
import java.util.Optional;
import java.util.Set;
/**
* WebFilter used for ensuring that no request params other than those explicitly
* declared via {@link RequestParam} parameters of the handler method are passed in.
*/
// Implementation translated into WebFlux WebFilter from:
// https://github.com/bohuslav-burghardt/spring-sandbox/tree/master/handler-interceptors/src/main/java/handler_interceptors
@Component
public class DisallowUndeclaredParamsFilter implements WebFilter {
private static final Logger LOGGER = LoggerFactory.getLogger(DisallowUndeclaredParamsFilter.class);
@Autowired
@Qualifier("requestMappingHandlerMapping")
RequestMappingHandlerMapping mapping;
@Autowired
ObjectMapper mapper;
@Override
public Mono<Void> filter(ServerWebExchange serverWebExchange, WebFilterChain webFilterChain) {
Object o = mapping.getHandler(serverWebExchange).toFuture().getNow(null);
Optional<String> undeclaredParam = Optional.empty();
if (o != null && o instanceof HandlerMethod) {
var handlerMethod = (HandlerMethod) o;
undeclaredParam = checkParams(serverWebExchange.getRequest(),
getDeclaredRequestParams(handlerMethod));
}
return undeclaredParam.map((param) -> RespondWithError(serverWebExchange, param))
.orElseGet(() -> webFilterChain.filter(serverWebExchange));
}
/** Responds to the request with an error message for the given undeclared parameter. */
private Mono<Void> RespondWithError(ServerWebExchange serverWebExchange, String undeclaredParam) {
final HttpStatus status = HttpStatus.BAD_REQUEST;
serverWebExchange.getResponse().setStatusCode(status);
serverWebExchange.getResponse().getHeaders().add(
"Content-Type", "application/json");
UndeclaredParamErrorResponse response = new UndeclaredParamErrorResponse();
response.message = "Parameter not expected: " + undeclaredParam;
response.statusCode = status.value();
String error = null;
try {
error = mapper.writeValueAsString(response);
} catch (JsonProcessingException e) {
error = "Parameter not expected; error generating JSON response";
LOGGER.warn("Error generating JSON response for undeclared argument", e);
}
return serverWebExchange.getResponse().writeAndFlushWith(
Mono.just(Mono.just(serverWebExchange.getResponse().bufferFactory().wrap(
error.getBytes(StandardCharsets.UTF_8)))));
}
/** Structure for generating error JSON. */
static class UndeclaredParamErrorResponse {
public String message;
public int statusCode;
}
/**
* Check that all of the request params of the specified request are contained within the specified set of allowed
* parameters.
*
* @param request Request whose params to check.
* @param allowedParams Set of allowed request parameters.
* @return Name of a param in the request that is not allowed, or empty if all params in the request are allowed.
*/
private Optional<String> checkParams(ServerHttpRequest request, Set<String> allowedParams) {
return request.getQueryParams().keySet().stream().filter(param ->
!allowedParams.contains(param)
).findFirst();
}
/**
* Extract all request parameters declared via {@link RequestParam} for the specified handler method.
*
* @param handlerMethod Handler method to extract declared params for.
* @return Set of declared request parameters.
*/
private Set<String> getDeclaredRequestParams(HandlerMethod handlerMethod) {
Set<String> declaredRequestParams = new HashSet<>();
MethodParameter[] methodParameters = handlerMethod.getMethodParameters();
ParameterNameDiscoverer parameterNameDiscoverer = new DefaultParameterNameDiscoverer();
for (MethodParameter methodParameter : methodParameters) {
if (methodParameter.hasParameterAnnotation(RequestParam.class)) {
RequestParam requestParam = methodParameter.getParameterAnnotation(RequestParam.class);
if (StringUtils.hasText(requestParam.value())) {
declaredRequestParams.add(requestParam.value());
} else {
methodParameter.initParameterNameDiscovery(parameterNameDiscoverer);
declaredRequestParams.add(methodParameter.getParameterName());
}
}
}
return declaredRequestParams;
}
}
Here's the unit test I wrote for it. I recommend checking it into your codebase as well.
package com.example.springundeclaredparamerror;
import com.github.tomakehurst.wiremock.junit.WireMockRule;
import org.junit.Rule;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.autoconfigure.web.reactive.WebFluxTest;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Import;
import org.springframework.test.context.junit4.SpringRunner;
import org.springframework.test.web.reactive.server.WebTestClient;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;
import reactor.core.publisher.Mono;
import static com.github.tomakehurst.wiremock.core.WireMockConfiguration.wireMockConfig;
@RunWith(SpringRunner.class)
@WebFluxTest(controllers = {DisallowUndeclaredParamFilterTest.TestController.class})
public class DisallowUndeclaredParamFilterTest {
private static final String TEST_ENDPOINT = "/disallowUndeclaredParamFilterTest";
@Rule
public final WireMockRule wireMockRule = new WireMockRule(wireMockConfig().dynamicPort());
@Autowired
private WebTestClient webClient;
@Configuration
@Import({TestController.class, DisallowUndeclaredParamsFilter.class})
static class TestConfig {
}
@RestController
static class TestController {
@GetMapping(TEST_ENDPOINT)
public Mono<String> retrieveEntity(@RequestParam(name = "a", required = false) final String a) {
return Mono.just("ok");
}
}
@Test
public void testAllowsNoArgs() {
webClient.get().uri(TEST_ENDPOINT).exchange().expectBody(String.class).isEqualTo("ok");
}
@Test
public void testAllowsDeclaredArg() {
webClient.get().uri(TEST_ENDPOINT + "?a=1").exchange().expectBody(String.class).isEqualTo("ok");
}
@Test
public void testDisallowsUndeclaredArg() {
webClient.get().uri(TEST_ENDPOINT + "?b=1").exchange().expectStatus().is4xxClientError();
}
}

AspectJ, Intertype Definition

I am receiving this error when I compile:
The type XXX must implement the inherited abstract method
I have three files
A default implementation [com.SafeReaderIMPL.java]
public class SafeReaderIMPL implements ISafeReader {
private boolean successfulRead;
public SafeReaderIMPL() {
successfulRead = true;
}
protected void fail() {
successfulRead = false;
}
@Override
public boolean isSuccessfulRead() {
return successfulRead;
}
}
An interface file [com.ISafeReader.java]
public interface ISafeReader {
public boolean isSuccessfulRead();
}
An aspect (using annotations) [com.SafeReaderAspect.java]
@Aspect
public class SafeReaderAspect {
@DeclareParents(value = "com.BadReader", defaultImpl = SafeReaderIMPL.class)
public ISafeReader implementedInterface;
@AfterThrowing(pointcut = "execution(* *.*(..)) && this(m)", throwing = "e")
public void handleBadRead(JoinPoint joinPoint, ISafeReader m, Throwable e) {
((SafeReaderIMPL)m).fail();
}
}
And a Test Class [com.BadReader]
public class BadReader {
public void fail() throws Throwable {
throw new Throwable();
}
}
I compile the first three files into a separate jar using
ajc -source 1.8 -sourceroots . -outjar aspectLib.jar
I then compile the test class (BadReader) into a second jar, using the aspectLib, like so
ajc -source 1.8 -sourceroots . -aspectpath ./aspectLib.jar -outjar common.jar
When I go to compile the second jar, I get the error. I am using the latest stable version of AspectJ, 1.8.3.
BadReader.java:10 [error] The type BadReader must implement the
inherited abstract method ISafeReader.isSuccessfulRead() public class
BadReader {
^^^^^^^^
The problem is not the two-step compilation as such, but the fact that @DeclareParents in @AspectJ syntax is not 100% compatible with declare parents in native syntax. Actually, @DeclareParents for introducing default interface implementations is superseded by @DeclareMixin (see this bug ticket), but the downside of the mixin approach is that you do not have a real A implements B scenario there, i.e. you cannot cast as you wish in your after-throwing advice, so this is also not a good option in your case.
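For illustration, a minimal sketch of what the @DeclareMixin variant would look like with the same types (the aspect name is illustrative); since the mixin is backed by a delegate rather than real inheritance, the cast to SafeReaderIMPL in the advice would fail, which is exactly the limitation mentioned above:
import org.aspectj.lang.annotation.Aspect;
import org.aspectj.lang.annotation.DeclareMixin;
@Aspect
public class SafeReaderMixinAspect {
    // Factory method: AspectJ calls this to create the delegate that backs
    // the ISafeReader interface on com.BadReader instances.
    @DeclareMixin("com.BadReader")
    public static ISafeReader createSafeReader() {
        return new SafeReaderIMPL();
    }
}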
So what do you do if you want to keep the two-step compilation approach? Just use native syntax:
Interface:
package com;
public interface ISafeReader {
boolean isSuccessfulRead();
}
Default implementation:
package com;
public class SafeReaderIMPL implements ISafeReader {
private boolean successfulRead;
public SafeReaderIMPL() { successfulRead = true; }
public void fail() { successfulRead = false; }
@Override public boolean isSuccessfulRead() { return successfulRead; }
}
ITD aspect:
package com;
public aspect SafeReaderAspect {
declare parents : com.BadReader extends SafeReaderIMPL;
after(ISafeReader safeReader) throwing : execution(* *(..)) && this(safeReader) {
System.out.println(thisJoinPoint + " - calling 'fail()' before rethrowing error");
((SafeReaderIMPL) safeReader).fail();
}
}
ITD target class with sample main method:
package com;
public class BadReader {
public void doSomething() {
throw new RuntimeException("my error");
}
public static void main(String[] args) {
BadReader badReader = new BadReader();
System.out.println("badReader.isSuccessfulRead() = " + badReader.isSuccessfulRead());
try { badReader.doSomething(); }
catch(Throwable t) { System.out.println(t); }
System.out.println("badReader.isSuccessfulRead() = " + badReader.isSuccessfulRead());
}
}
Now you can use the two-stage compilation approach.
Console output:
badReader.isSuccessfulRead() = true
execution(void com.BadReader.doSomething()) - calling 'fail()' before rethrowing error
java.lang.RuntimeException: my error
badReader.isSuccessfulRead() = false
The problem is due to the two-step compilation. During the second step, ajc needs the source code of SafeReaderIMPL to be able to weave BadReader, but it cannot find it in aspectLib.jar.
In fact, if you try compiling in a single step (I did), it compiles and runs.
Unfortunately I don't know a way to fix this without providing the source code during the second compile step, which I suppose would render the whole two-step approach a bit pointless.