Around annotation executed twice using WebFlux - spring-webflux

I'm facing a weird behaviour while using AOP with AspectJ.
Basically, the @Around method is called either once or twice, and while debugging I can't find out what triggers the second execution of the method.
Here is some code:
@Aspect
@Slf4j
public class ReactiveRedisCacheAspect {

    @Pointcut("@annotation(com.xxx.xxx.cache.aop.annotations.ReactiveRedisCacheable)")
    public void cacheablePointCut() {}

    @Around("cacheablePointCut()")
    public Object cacheableAround(final ProceedingJoinPoint proceedingJoinPoint) {
        log.debug("ReactiveRedisCacheAspect cacheableAround.... - {}", proceedingJoinPoint);
        MethodSignature methodSignature = (MethodSignature) proceedingJoinPoint.getSignature();
        Method method = methodSignature.getMethod();
        Class<?> returnType = method.getReturnType();
        Duration duration = Duration.ofHours(getDuration(method));
        String redisKey = getKey(method, proceedingJoinPoint);
        if (returnType.isAssignableFrom(Flux.class)) {
            log.debug("returning Flux");
            return cacheRepository.hasKey(redisKey)
                    .filter(found -> found)
                    .flatMapMany(found -> cacheRepository.findByKey(redisKey))
                    .flatMap(found -> saveFlux(proceedingJoinPoint, redisKey, duration));
        } else if (returnType.isAssignableFrom(Mono.class)) {
            log.debug("Returning Mono");
            return cacheRepository.hasKey(redisKey)
                    .flatMap(found -> {
                        if (found) {
                            return cacheRepository.findByKey(redisKey);
                        } else {
                            return saveMono(proceedingJoinPoint, redisKey, duration);
                        }
                    });
        } else {
            throw new RuntimeException("only reactive return types are supported (Mono, Flux)");
        }
    }

    private String getKey(final Method method, final ProceedingJoinPoint proceedingJoinPoint) {
        ReactiveRedisCacheable annotation = method.getAnnotation(ReactiveRedisCacheable.class);
        String cacheName = annotation.cacheName();
        String key = annotation.key();
        cacheName = (String) AspectSupportUtils.getKeyValue(proceedingJoinPoint, cacheName);
        key = (String) AspectSupportUtils.getKeyValue(proceedingJoinPoint, key);
        return cacheName + "_" + key;
    }
}
public class AspectSupportUtils {

    private static final ExpressionEvaluator evaluator = new ExpressionEvaluator();

    public static Object getKeyValue(JoinPoint joinPoint, String keyExpression) {
        if (keyExpression.contains("#") || keyExpression.contains("'")) {
            return getKeyValue(joinPoint.getTarget(), joinPoint.getArgs(), joinPoint.getTarget().getClass(),
                    ((MethodSignature) joinPoint.getSignature()).getMethod(), keyExpression);
        }
        return keyExpression;
    }

    private static Object getKeyValue(Object object, Object[] args, Class<?> clazz, Method method, String keyExpression) {
        if (StringUtils.hasText(keyExpression)) {
            EvaluationContext evaluationContext = evaluator.createEvaluationContext(object, clazz, method, args);
            AnnotatedElementKey methodKey = new AnnotatedElementKey(method, clazz);
            return evaluator.key(keyExpression, methodKey, evaluationContext);
        }
        return SimpleKeyGenerator.generateKey(args);
    }
}
@Target({ElementType.METHOD})
@Retention(RetentionPolicy.RUNTIME)
@Documented
public @interface ReactiveRedisCacheable {
    String key();
    String cacheName();
    long duration() default 1L;
}
@RestController
@RequestMapping("api/pub/v1")
public class TestRestController {

    @ReactiveRedisCacheable(cacheName = "test-cache", key = "#name", duration = 1L)
    @GetMapping(value = "test")
    public Mono<String> getName(@RequestParam(value = "name") String name) {
        return Mono.just(name);
    }
}
@Configuration
public class Config {

    @Bean
    public ReactiveRedisCacheAspect reactiveRedisCache(ReactiveRedisCacheAspect reactiveRedisCacheAspect) {
        return reactiveRedisCacheAspect;
    }
}
logs:
ReactiveRedisCacheAspect cacheableAround.... - {}execution(Mono com.abc.def.xxx.rest.TestRestcontroller.getName(String))
2021-06-04 15:36:23.096 INFO [fo-bff,f688025287be7e7c,f688025287be7e7c] 20060 --- [ctor-http-nio-3] c.m.s.c.a.i.ReactiveRedisCacheAspect : Returning Mono
2021-06-04 15:36:23.097 INFO [fo-bff,f688025287be7e7c,f688025287be7e7c] 20060 --- [ctor-http-nio-3] c.m.s.c.repository.CacheRepositoryImpl : searching key: (bff_pippo)
ReactiveRedisCacheAspect cacheableAround.... - {}execution(Mono com.abc.def.xxx.rest.TestRestcontroller.getName(String))
2021-06-04 15:36:23.236 INFO [fo-bff,f688025287be7e7c,f688025287be7e7c] 20060 --- [ioEventLoop-7-2] c.m.s.c.a.i.ReactiveRedisCacheAspect : Returning Mono
2021-06-04 15:36:23.236 INFO [fo-bff,f688025287be7e7c,f688025287be7e7c] 20060 --- [ioEventLoop-7-2] c.m.s.c.repository.CacheRepositoryImpl : searching key: (bff_pippo)
2021-06-04 15:36:23.250 INFO [fo-bff,f688025287be7e7c,f688025287be7e7c] 20060 --- [ioEventLoop-7-2] c.m.s.c.repository.CacheRepositoryImpl : saving obj: (key:bff_pippo) (expiresIn:3600s)
2021-06-04 15:36:23.275 INFO [fo-bff,f688025287be7e7c,f688025287be7e7c] 20060 --- [ioEventLoop-7-2] c.m.s.c.repository.CacheRepositoryImpl : saving obj: (key:bff_pippo) (expiresIn:3600s)
I would have expected cacheableAround to be executed only once, but what happens is a bit weird: if the object is present in Redis, the method is executed only once, but if it is not present, the method is executed twice, which doesn't make sense. Moreover, it should be the business logic inside the method that decides what to do.
Thanks in advance!

You did not mention whether you use native AspectJ via load- or compile-time weaving, or simply Spring AOP. Because I see no @Component annotation on your aspect, it might as well be native AspectJ, unless you configure your beans via @Bean factory methods in a configuration class or XML.
Assuming that you are using full AspectJ, a common problem for newbies coming from Spring AOP is that they are not used to the fact that AspectJ intercepts not only execution joinpoints but also call ones. This leads to the superficial perception that the same joinpoint is intercepted twice. In reality it is intercepted once for the method call (in the class from which the call is made) and once for the method execution (in the class where the target method resides). This is easy to determine: at the beginning of your advice method, simply log the joinpoint. In your case:
System.out.println(proceedingJoinPoint);
If then on the console you see something like
call(public void org.acme.MyClass.myMethod())
execution(public void org.acme.MyClass.myMethod())
then you know what is happening.
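If that is what you see, one fix is to restrict the pointcut to execution joinpoints only. A minimal sketch, reusing the annotation type from your code:

// Limit matching to method executions, so the advice no longer
// also fires for the matching call joinpoint.
@Pointcut("execution(* *(..)) && @annotation(com.xxx.xxx.cache.aop.annotations.ReactiveRedisCacheable)")
public void cacheablePointCut() {}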
In case you use Spring AOP instead, it is probably an issue with the aspect itself, or with the Redis caching behaving differently from what you expect.
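For the Spring AOP case, a minimal sketch of a typical setup (my assumption, not your actual configuration) would be to register the aspect as a regular bean and enable auto-proxying, rather than declaring a @Bean method that receives the aspect as its own parameter:

@Configuration
@EnableAspectJAutoProxy
public class Config {
    // The aspect is instantiated here; Spring AOP then applies it to other beans.
    @Bean
    public ReactiveRedisCacheAspect reactiveRedisCacheAspect() {
        return new ReactiveRedisCacheAspect();
    }
}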

Related

Unable to add mutator for an existing field of a class

I'm trying to add a mutator for an existing private final field. I can transform the field modifiers to remove the final specification and add an accessor method:
// accessor interface
public interface UniqueIdAccessor {
    Serializable getUniqueId();
}

// mutator interface
public interface UniqueIdMutator {
    void setUniqueId(Serializable uniqueId);
}
...
// fragment of Java agent implementation
return new AgentBuilder.Default()
    .type(hasSuperType(named("org.junit.runner.Description")))
    .transform(new Transformer() {
        @Override
        public DynamicType.Builder<?> transform(DynamicType.Builder<?> builder, TypeDescription typeDescription,
                ClassLoader classLoader, JavaModule module) {
            return builder.field(named("fUniqueId")).transform(ForField.withModifiers(FieldManifestation.PLAIN))
                .implement(UniqueIdAccessor.class).intercept(FieldAccessor.ofField("fUniqueId"))
                // .implement(UniqueIdMutator.class).intercept(FieldAccessor.ofField("fUniqueId"))
                .implement(Hooked.class);
        }
    })
    .installOn(instrumentation);
...
Here's a method that uses reflection to check the modifiers of the target field and calls the accessor to get the value of the field.
private static void injectProxy(Description description) {
    try {
        Field bar = Description.class.getDeclaredField("fUniqueId");
        System.out.println("isFinal: " + ((bar.getModifiers() & Modifier.FINAL) != 0));
    } catch (NoSuchFieldException | SecurityException e) {
        e.printStackTrace();
    }
    Serializable uniqueId = ((UniqueIdAccessor) description).getUniqueId();
    System.out.println("uniqueId: " + uniqueId);
}
// isFinal: false
// uniqueId: <description-unique-id>
... but if I uncomment the second "implement" expression to add the mutator, the transform blows up:
// isFinal: true
// java.lang.ClassCastException:
// class org.junit.runner.Description cannot be cast to class com.nordstrom.automation.junit.UniqueIdAccessor
// (org.junit.runner.Description and com.nordstrom.automation.junit.UniqueIdAccessor
// are in unnamed module of loader 'app')
I could set the field value with reflection, but that defeats the purpose of using Byte Buddy in the first place!
The problem with this approach is that the field accessor validates against the field's type and modifiers as they were prior to the modification. Byte Buddy prohibits the write because it does not consider the mutation legal, not knowing about the removed final modifier. As a result, the transformation fails in its entirety, and you get the error you are seeing. (Register a listener to see this error.)
To avoid this, you can implement a custom Implementation using FieldAccess directly, rather than the more convenient FieldAccessor. You can have a look at FieldAccessor to see how this is implemented; you only need to drop the validity checks.
Thanks for pointing me in the right direction! I assemble the StackManipulation object that defines the mutator method with this:
final TypeDescription description = TypePool.Default.ofSystemLoader().describe("org.junit.runner.Description").resolve();
final Generic _void_ = TypeDescription.VOID.asGenericType();
final Generic serializable = TypePool.Default.ofSystemLoader().describe("java.io.Serializable").resolve().asGenericType();

final MethodDescription.Token setUniqueIdToken = new MethodDescription.Token("setUniqueId", Modifier.PUBLIC, _void_, Arrays.asList(serializable));
final MethodDescription setUniqueId = new MethodDescription.Latent(description, setUniqueIdToken);
final FieldDescription.Token fUniqueIdToken = new FieldDescription.Token("fUniqueId", Modifier.PRIVATE, serializable);
final FieldDescription fUniqueId = new FieldDescription.Latent(description, fUniqueIdToken);

final StackManipulation setUniqueIdImpl = new StackManipulation.Compound(
    MethodVariableAccess.loadThis(),
    MethodVariableAccess.load(setUniqueId.getParameters().get(0)),
    Assigner.DEFAULT.assign(serializable, serializable, Typing.STATIC),
    FieldAccess.forField(fUniqueId).write(),
    MethodReturn.VOID
);
... and I transform the target class with this:
return new AgentBuilder.Default()
    .type(hasSuperType(named("org.junit.runner.Description")))
    .transform(new Transformer() {
        @Override
        public DynamicType.Builder<?> transform(DynamicType.Builder<?> builder, TypeDescription typeDescription,
                ClassLoader classLoader, JavaModule module) {
            return builder.field(named("fUniqueId")).transform(ForField.withModifiers(FieldManifestation.PLAIN))
                .implement(AnnotationsAccessor.class).intercept(FieldAccessor.ofField("fAnnotations"))
                .implement(UniqueIdAccessor.class).intercept(FieldAccessor.ofField("fUniqueId"))
                .implement(UniqueIdMutator.class).intercept(new Implementation.Simple(setUniqueIdImpl));
        }
    })
    .installOn(instrumentation);
Here are the definitions of the three interfaces used in the transform:
// annotations accessor interface
public interface AnnotationsAccessor {
    Annotation[] annotations();
}

// unique ID accessor interface
public interface UniqueIdAccessor {
    Serializable getUniqueId();
}

// unique ID mutator interface
public interface UniqueIdMutator {
    void setUniqueId(Serializable uniqueId);
}

ByteBuddy: AbstractMethodError when setting interceptor

The following is my learning code. When I execute it, this exception is thrown:
Exception in thread "main"
java.lang.AbstractMethodError: org.learning.UserRepository$ByteBuddy$etz0xUhc.$$_pharos_set_interceptor(Lorg/learning/Interceptor;)V
ByteBuddy version: 1.10.14
final TypeCache.SimpleKey cacheKey = getCacheKey(learningClazz, interceptor.getClass());
Class proxyClass = load(learningClazz, proxyCache, cacheKey, byteBuddy ->
    byteBuddy
        .subclass(learningClazz)
        .defineField(ProxyConfiguration.INTERCEPTOR_FIELD_NAME, Interceptor.class, Visibility.PRIVATE)
        .method(not(isDeclaredBy(Object.class)))
        .intercept(MethodDelegation.to(ProxyConfiguration.InterceptorDispatcher.class))
        .implement(ProxyConfiguration.class)
        .intercept(FieldAccessor.ofField(ProxyConfiguration.INTERCEPTOR_FIELD_NAME)
            .withAssigner(Assigner.DEFAULT, Assigner.Typing.DYNAMIC)));
final ProxyConfiguration proxy = (ProxyConfiguration) proxyClass.getDeclaredConstructor().newInstance();
proxy.$$_pharos_set_interceptor(interceptor);
return (T) proxy;
public interface ProxyConfiguration {

    String INTERCEPTOR_FIELD_NAME = "$$_pharos_interceptor";

    void $$_pharos_set_interceptor(Interceptor interceptor);

    class InterceptorDispatcher {

        @RuntimeType
        public static Object intercept(
                @This final Object instance,
                @Origin final Method method,
                @AllArguments final Object[] arguments,
                @StubValue final Object stubValue,
                @FieldValue(INTERCEPTOR_FIELD_NAME) Interceptor interceptor,
                @SuperMethod Method superMethod
        ) throws Throwable {
            if (interceptor == null) {
                return stubValue;
            } else {
                return interceptor.intercept(instance, method, arguments, superMethod);
            }
        }
    }
}
Package-private methods are overridden, but the JVM will not dispatch them dynamically if the subclass is loaded in a different class loader. If you declare the method public, the problem should be solved. Alternatively, inject the generated class into the target class loader.
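A sketch of the second alternative, adapted from the question's builder chain (make, load, getLoaded, and ClassLoadingStrategy.Default.INJECTION are standard Byte Buddy API; the surrounding names are taken from the question):

// Load the generated subclass into the same class loader as the target type,
// so that package-private overrides are dispatched dynamically.
Class<?> proxyClass = new ByteBuddy()
    .subclass(learningClazz)
    .defineField(ProxyConfiguration.INTERCEPTOR_FIELD_NAME, Interceptor.class, Visibility.PRIVATE)
    .method(not(isDeclaredBy(Object.class)))
    .intercept(MethodDelegation.to(ProxyConfiguration.InterceptorDispatcher.class))
    .implement(ProxyConfiguration.class)
    .intercept(FieldAccessor.ofField(ProxyConfiguration.INTERCEPTOR_FIELD_NAME)
        .withAssigner(Assigner.DEFAULT, Assigner.Typing.DYNAMIC))
    .make()
    .load(learningClazz.getClassLoader(), ClassLoadingStrategy.Default.INJECTION)
    .getLoaded();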

Customized parameter logging when using aspect-oriented programming

All the examples I've seen that use aspect-oriented programming for logging either log just the class, method name, and duration, or, if they log parameters and return values, they simply use toString(). I need more control over what is logged: for example, I want to skip passwords, or in some cases log all properties of an object but in other cases just the id property.
Any suggestions? I looked at AspectJ in Java and Unity interception in C# and could not find a solution.
You could try introducing parameter annotations to augment your parameters with some attributes. One of those attributes could signal to skip logging the parameter, another one could be used to specify a converter class for the string representation.
With the following annotations:
@Documented
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.METHOD)
public @interface Log {
}

@Documented
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.PARAMETER)
public @interface SkipLogging {
}

@Documented
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.PARAMETER)
public @interface ToStringWith {
    Class<? extends Function<?, String>> value();
}
the aspect could look like this:
import java.lang.reflect.Parameter;
import java.util.function.Function;
import java.util.stream.Collectors;
import java.util.stream.IntStream;
import org.aspectj.lang.reflect.MethodSignature;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public aspect LoggingAspect {

    private final static Logger logger = LoggerFactory.getLogger(LoggingAspect.class);

    pointcut loggableMethod(): execution(@Log * *..*.*(..));

    before(): loggableMethod() {
        MethodSignature signature = (MethodSignature) thisJoinPoint.getSignature();
        Parameter[] parameters = signature.getMethod().getParameters();
        String message = IntStream.range(0, parameters.length)
            .filter(i -> this.isLoggable(parameters[i]))
            .<String>mapToObj(i -> toString(parameters[i], thisJoinPoint.getArgs()[i]))
            .collect(Collectors.joining(", ",
                "method execution " + signature.getName() + "(", ")"));
        Logger methodLogger = LoggerFactory.getLogger(
            thisJoinPointStaticPart.getSignature().getDeclaringType());
        methodLogger.debug(message);
    }

    private boolean isLoggable(Parameter parameter) {
        return parameter.getAnnotation(SkipLogging.class) == null;
    }

    private String toString(Parameter parameter, Object value) {
        ToStringWith toStringWith = parameter.getAnnotation(ToStringWith.class);
        if (toStringWith != null) {
            Class<? extends Function<?, String>> converterClass = toStringWith.value();
            try {
                @SuppressWarnings("unchecked")
                Function<Object, String> converter = (Function<Object, String>) converterClass.newInstance();
                String str = converter.apply(value);
                return String.format("%s='%s'", parameter.getName(), str);
            } catch (Exception e) {
                logger.error("Couldn't instantiate toString converter for logging "
                    + converterClass.getName(), e);
                return String.format("%s=<error converting to string>", parameter.getName());
            }
        } else {
            return String.format("%s='%s'", parameter.getName(), String.valueOf(value));
        }
    }
}
Test code:
public static class SomethingToStringConverter implements Function<Something, String> {
    @Override
    public String apply(Something something) {
        return "Something nice";
    }
}

@Log
public void test(
        @ToStringWith(SomethingToStringConverter.class) Something something,
        String string,
        @SkipLogging Class<?> cls,
        Object object) {
}

public static void main(String[] args) {
    // execution of this method should log the following message:
    // method execution test(something='Something nice', string='some string', object='null')
    test(new Something(), "some string", Object.class, null);
}
I used the Java 8 Streams API in my answer for its compactness; you could convert the code to plain Java if you don't use Java 8 features or need better efficiency. It's just to give you an idea.

Storm KafkaSpout KryoSerialization issue for a Java bean from a Kafka topic

Hi, I am new to Storm and Kafka.
I am using Storm 1.0.1 and Kafka 0.10.0. We have a KafkaSpout that receives a Java bean from a Kafka topic.
I have spent several hours digging to find the right approach for this. I found a few useful articles, but none of the approaches has worked for me so far.
Following is my code:
StormTopology:
public class StormTopology {

    public static void main(String[] args) throws Exception {
        //Topo test /zkroot test
        if (args.length == 4) {
            System.out.println("started");
            BrokerHosts hosts = new ZkHosts("localhost:2181");
            SpoutConfig kafkaConf1 = new SpoutConfig(hosts, args[1], args[2], args[3]);
            kafkaConf1.zkRoot = args[2];
            kafkaConf1.useStartOffsetTimeIfOffsetOutOfRange = true;
            kafkaConf1.startOffsetTime = kafka.api.OffsetRequest.LatestTime();
            kafkaConf1.scheme = new SchemeAsMultiScheme(new KryoScheme());
            KafkaSpout kafkaSpout1 = new KafkaSpout(kafkaConf1);
            System.out.println("started");

            ShuffleBolt shuffleBolt = new ShuffleBolt(args[1]);
            AnalysisBolt analysisBolt = new AnalysisBolt(args[1]);
            TopologyBuilder topologyBuilder = new TopologyBuilder();
            topologyBuilder.setSpout("kafkaspout", kafkaSpout1, 1);
            //builder.setBolt("counterbolt2", countbolt2, 3).shuffleGrouping("kafkaspout");
            //This is for field grouping; we need two bolts for field grouping or it won't work
            topologyBuilder.setBolt("shuffleBolt", shuffleBolt, 3).shuffleGrouping("kafkaspout");
            topologyBuilder.setBolt("analysisBolt", analysisBolt, 5).fieldsGrouping("shuffleBolt", new Fields("trip"));

            Config config = new Config();
            config.registerSerialization(VehicleTrip.class, VehicleTripKyroSerializer.class);
            config.setDebug(true);
            config.setNumWorkers(1);
            LocalCluster cluster = new LocalCluster();
            cluster.submitTopology(args[0], config, topologyBuilder.createTopology());
            // StormSubmitter.submitTopology(args[0], config, builder.createTopology());
        } else {
            System.out.println("Insufficient arguments - topologyName kafkaTopic ZKRoot ID");
        }
    }
}
I am serializing the data for Kafka using Kryo.
KafkaProducer:
public class StreamKafkaProducer {

    private static Producer producer;
    private final Properties props = new Properties();
    private static final StreamKafkaProducer KAFKA_PRODUCER = new StreamKafkaProducer();

    private StreamKafkaProducer() {
        props.put("bootstrap.servers", "localhost:9092");
        props.put("acks", "all");
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "com.abc.serializer.MySerializer");
        producer = new org.apache.kafka.clients.producer.KafkaProducer(props);
    }

    public static StreamKafkaProducer getStreamKafkaProducer() {
        return KAFKA_PRODUCER;
    }

    public void produce(String topic, VehicleTrip vehicleTrip) {
        ProducerRecord<String, VehicleTrip> producerRecord = new ProducerRecord<>(topic, vehicleTrip);
        producer.send(producerRecord);
        //producer.close();
    }

    public static void closeProducer() {
        producer.close();
    }
}
Kryo serializer:
public class DataKyroSerializer extends Serializer<Data> implements Serializable {

    @Override
    public void write(Kryo kryo, Output output, Data data) {
        output.writeLong(data.getStartedOn().getTime());
        output.writeLong(data.getEndedOn().getTime());
    }

    @Override
    public Data read(Kryo kryo, Input input, Class<Data> aClass) {
        Data data = new Data();
        data.setStartedOn(new Date(input.readLong()));
        data.setEndedOn(new Date(input.readLong()));
        return data;
    }
}
I need to get the data back into the Data bean.
According to a few articles, I need to provide a custom scheme and make it part of the topology, but so far I have had no luck.
Code for the bolt and scheme:
Scheme:
public class KryoScheme implements Scheme {

    private ThreadLocal<Kryo> kryos = new ThreadLocal<Kryo>() {
        protected Kryo initialValue() {
            Kryo kryo = new Kryo();
            kryo.addDefaultSerializer(Data.class, new DataKyroSerializer());
            return kryo;
        };
    };

    @Override
    public List<Object> deserialize(ByteBuffer ser) {
        return Utils.tuple(kryos.get().readObject(new ByteBufferInput(ser.array()), Data.class));
    }

    @Override
    public Fields getOutputFields() {
        return new Fields("data");
    }
}
and bolt:
public class AnalysisBolt implements IBasicBolt {

    private static final long serialVersionUID = 1L;
    private String topicname = null;

    public AnalysisBolt(String topicname) {
        this.topicname = topicname;
    }

    public void prepare(Map stormConf, TopologyContext topologyContext) {
        System.out.println("prepare");
    }

    public void execute(Tuple input, BasicOutputCollector collector) {
        System.out.println("execute");
        Fields fields = input.getFields();
        try {
            JSONObject eventJson = (JSONObject) JSONSerializer.toJSON(
                (String) input.getValueByField(fields.get(1)));
            String startTime = (String) eventJson.get("startedOn");
            String endTime = (String) eventJson.get("endedOn");
            String oid = (String) eventJson.get("_id");
            int vId = (Integer) eventJson.get("vehicleId");
            //call method getEventForVehicleWithinTime(Long vehicleId, Date startTime, Date endTime)
            System.out.println("===========" + oid + "| " + vId + "| " + startTime + "| " + endTime);
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
But if I submit the Storm topology, I am getting this error:
java.lang.IllegalStateException: Spout 'kafkaspout' contains a
non-serializable field of type com.abc.topology.KryoScheme$1, which
was instantiated prior to topology creation.
com.minda.iconnect.topology.KryoScheme$1 should be instantiated within
the prepare method of 'kafkaspout at the earliest.
I would appreciate help debugging the issue and guidance to the right path.
Thanks
Your ThreadLocal is not Serializable. The preferable solution would be to make your serializer both Serializable and thread-safe. If this is not possible, then I see two alternatives, since there is no prepare method as you would get in a bolt:
1. Declare it as static, which is inherently transient.
2. Declare it transient and access it via a private getter method, initializing the variable on first access (see the sketch at the end of this answer).
Within the Storm lifecycle, the topology is instantiated and then serialized to byte format to be stored in ZooKeeper, prior to the topology being executed. Within this step, if a spout or bolt within the topology has an initialized unserializable property, serialization will fail.
If there is a need for a field that is unserializable, initialize it within the bolt or spout's prepare method, which is run after the topology is delivered to the worker.
Source: Best Practices for implementing Apache Storm
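For example, the second alternative applied to the KryoScheme above might look like the following minimal sketch (my assumption, not tested against your topology; ThreadLocal.withInitial is standard Java 8 API):

public class KryoScheme implements Scheme {

    // transient: skipped when the topology is serialized to ZooKeeper,
    // recreated lazily on the worker after deserialization
    private transient ThreadLocal<Kryo> kryos;

    private ThreadLocal<Kryo> kryos() {
        if (kryos == null) {
            kryos = ThreadLocal.withInitial(() -> {
                Kryo kryo = new Kryo();
                kryo.addDefaultSerializer(Data.class, new DataKyroSerializer());
                return kryo;
            });
        }
        return kryos;
    }

    @Override
    public List<Object> deserialize(ByteBuffer ser) {
        return Utils.tuple(kryos().get().readObject(new ByteBufferInput(ser.array()), Data.class));
    }

    @Override
    public Fields getOutputFields() {
        return new Fields("data");
    }
}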

Spring JDBC and Java 8 - JDBCTemplate: retrieving SQL statement and parameters for debugging

I am using Spring JDBC and some nice Java 8 lambda syntax to execute queries with the JdbcTemplate.
The reason for choosing Spring's JdbcTemplate is the implicit resource handling that spring-jdbc offers (I do NOT want an ORM framework for my simple use cases).
My problem is that I want to debug the whole SQL statements with their parameters. Spring prints the SQL by default, but not the parameters. Therefore I have subclassed JdbcTemplate and overridden a query method.
An example usage of the JDBCTemplate:
public List<Product> getProductsByModel(String modelName) {
    List<Product> productList = jdbcTemplate.query(
        "select * from product p, productmodel m " +
        "where p.modelId = m.id " +
        "and m.name = ?",
        (rs, rowNum) -> new Product(
            rs.getInt("id"),
            rs.getString("stc_number"),
            rs.getString("version"),
            getModelById(rs.getInt("modelId")), // method not shown
            rs.getString("displayName"),
            rs.getString("imageUrl")
        ),
        modelName);
    return productList;
}
To get hold of the parameters I have, as mentioned, overridden the JdbcTemplate class. By doing a cast and using reflection, I get the Object[] field with the parameters from an instance of ArgumentPreparedStatementSetter.
I suspect this implementation could be dangerous, as the actual implementation of the PreparedStatementSetter may not always be ArgumentPreparedStatementSetter (yes, I should do an instanceof check). Also, the reflection code may not be elegant, but that is beside the point for now :).
Here's my custom implementation:
public class CustomJdbcTemplate extends JdbcTemplate {

    private static final Logger log = LoggerFactory.getLogger(CustomJdbcTemplate.class);

    public CustomJdbcTemplate(DataSource dataSource) {
        super(dataSource);
    }

    public <T> T query(PreparedStatementCreator psc, final PreparedStatementSetter pss, final ResultSetExtractor<T> rse)
            throws DataAccessException {
        if (log.isDebugEnabled()) {
            ArgumentPreparedStatementSetter aps = (ArgumentPreparedStatementSetter) pss;
            try {
                Field args = aps.getClass().getDeclaredField("args");
                args.setAccessible(true);
                Object[] parameters = (Object[]) args.get(aps);
                log.debug("Parameters for SQL query: " + Arrays.toString(parameters));
            } catch (NoSuchFieldException | IllegalAccessException e) {
                throw new GenericException(e.toString(), e);
            }
        }
        return super.query(psc, pss, rse);
    }
}
So, when I execute the log.debug(...) statement, I would also like to have the original SQL query logged on the same line. Has anyone done something similar, or are there better suggestions as to how this can be achieved?
I run quite a few queries through this CustomJdbcTemplate and all my tests pass, so I think it may be an acceptable solution for most debugging purposes.
Kind regards,
Thomas
I found a way to get the SQL statement, so I will answer my own question :)
The PreparedStatementCreator has the following implementation:
private static class SimplePreparedStatementCreator implements PreparedStatementCreator, SqlProvider
So SqlProvider has a getSql() method which does exactly what I need.
Posting the improved CustomJdbcTemplate class, should anyone ever need to do the same :)
public class CustomJdbcTemplate extends JdbcTemplate {

    private static final Logger log = LoggerFactory.getLogger(CustomJdbcTemplate.class);

    public CustomJdbcTemplate(DataSource dataSource) {
        super(dataSource);
    }

    public <T> T query(PreparedStatementCreator psc, final PreparedStatementSetter pss, final ResultSetExtractor<T> rse)
            throws DataAccessException {
        if (log.isDebugEnabled()) {
            if (pss instanceof ArgumentPreparedStatementSetter) {
                ArgumentPreparedStatementSetter aps = (ArgumentPreparedStatementSetter) pss;
                try {
                    Field args = aps.getClass().getDeclaredField("args");
                    args.setAccessible(true);
                    Object[] parameters = (Object[]) args.get(aps);
                    log.debug("SQL query: [{}]\tParams: {} ", getSql(psc), Arrays.toString(parameters));
                } catch (NoSuchFieldException | IllegalAccessException e) {
                    throw new GenericException(e.toString(), e);
                }
            }
        }
        return super.query(psc, pss, rse);
    }

    private static String getSql(Object sqlProvider) { // this code is also found in the JdbcTemplate class
        if (sqlProvider instanceof SqlProvider) {
            return ((SqlProvider) sqlProvider).getSql();
        } else {
            return null;
        }
    }
}
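For illustration, a call like getProductsByModel("someModel") from the question should then produce a debug line roughly like this (hypothetical output, derived from the format string above; the \t tab is shown as spaces):

SQL query: [select * from product p, productmodel m where p.modelId = m.id and m.name = ?]    Params: [someModel]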