Changing Hibernate SQL logging level in Logback at runtime through JMX

I have configured Hibernate to use the Logback logging library, and created an appender that captures logging data from the "org.hibernate.SQL" and "org.hibernate.type" loggers. By default, both are set to INFO level.
As the next step I try to change the level of those two loggers to DEBUG using Logback's JMX interface. But it does not work: the log file contains no data. It only works if I set the logging level to DEBUG in the configuration file and then restart the server.
Should I do anything additional to make Hibernate start logging?
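For reference, the change I am attempting via JMX is equivalent to invoking the `setLoggerLevel` operation on Logback's JMXConfigurator MBean. A minimal sketch, assuming the default context name "default" and that logback-classic has registered its JMXConfigurator on the platform MBean server (the MBean name and operation are Logback's real JMX API; the wrapper class below is hypothetical):

```java
import java.lang.management.ManagementFactory;
import javax.management.MBeanServer;
import javax.management.MalformedObjectNameException;
import javax.management.ObjectName;

public class LogbackJmxLevelChanger {

    // Builds the JMXConfigurator ObjectName for a given logback context name.
    // The context name is "default" unless <contextName> is set in logback.xml.
    static ObjectName configuratorName(String contextName) {
        try {
            return new ObjectName(
                "ch.qos.logback.classic:Name=" + contextName
                + ",Type=ch.qos.logback.classic.jmx.JMXConfigurator");
        } catch (MalformedObjectNameException e) {
            throw new IllegalArgumentException(e);
        }
    }

    // Invokes the JMXConfigurator's setLoggerLevel operation in-process.
    static void setLoggerLevel(String logger, String level) throws Exception {
        MBeanServer server = ManagementFactory.getPlatformMBeanServer();
        server.invoke(configuratorName("default"), "setLoggerLevel",
            new Object[] { logger, level },
            new String[] { "java.lang.String", "java.lang.String" });
    }

    public static void main(String[] args) throws Exception {
        setLoggerLevel("org.hibernate.SQL", "DEBUG");
        setLoggerLevel("org.hibernate.type", "DEBUG");
    }
}
```

In jconsole the same operation shows up under the `ch.qos.logback.classic` domain as `setLoggerLevel(loggerName, levelStr)`.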
Here goes the appender/logger configuration:
<configuration debug="false" scan="true" scanPeriod="5 minutes">
<jmxConfigurator />
...
<property name="SQL_LOG_LEVEL" value="DEBUG" />
<appender name="SQL_LOG" class="ch.qos.logback.core.rolling.RollingFileAppender">
<file>${LOG_DIRECTORY}/sql_${weblogic.Name}.log</file>
<rollingPolicy class="ch.qos.logback.core.rolling.FixedWindowRollingPolicy">
<fileNamePattern>${ROTATION_DIRECTORY}/sql_${weblogic.Name}.%i.log.zip</fileNamePattern>
<minIndex>1</minIndex>
<maxIndex>5</maxIndex>
</rollingPolicy>
<triggeringPolicy class="ch.qos.logback.core.rolling.SizeBasedTriggeringPolicy">
<maxFileSize>50MB</maxFileSize>
</triggeringPolicy>
<encoder>
<pattern>${LOG_PATTERN}</pattern>
</encoder>
</appender>
<logger name="org.hibernate.SQL" level="${SQL_LOG_LEVEL}" additivity="false">
<appender-ref ref="SQL_LOG" />
</logger>
<logger name="org.hibernate.type" level="${SQL_LOG_LEVEL}" additivity="false">
<appender-ref ref="SQL_LOG" />
</logger>
...
</configuration>
EDIT: I have several application (EAR) files deployed on the same container. All applications use the same logging configuration.

The problem turns out to be that I deploy several applications on one server, and each application's class loader has its own copy of the Logback libraries. As a result, several logging contexts are created, but because they all share the same name ("default"), only one gets registered with the MBean server.
The solution is either to move the Logback libraries higher in the class loader hierarchy or to use logger context separation as proposed by the Logback documentation.
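One way to apply the second option is to give each application's logging context a unique name via `<contextName>`, so that each application's JMXConfigurator registers under a distinct ObjectName. A sketch (the context name "app1" is just an example):

```xml
<configuration debug="false" scan="true" scanPeriod="5 minutes">
  <!-- Unique per application; the JMXConfigurator ObjectName includes this name -->
  <contextName>app1</contextName>
  <jmxConfigurator />
  <!-- ... appenders and loggers as before ... -->
</configuration>
```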

I was not able to log any output from org.hibernate.SQL and friends until I set the log level to TRACE instead of DEBUG, using Logback over SLF4J.
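Applied to the logger configuration from the question, that would look like the following sketch (note that in newer Hibernate versions the parameter-binding output lives under org.hibernate.type.descriptor.sql; the original logger names are kept here):

```xml
<logger name="org.hibernate.SQL" level="TRACE" additivity="false">
  <appender-ref ref="SQL_LOG" />
</logger>
<logger name="org.hibernate.type" level="TRACE" additivity="false">
  <appender-ref ref="SQL_LOG" />
</logger>
```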

Related

Running Gatling tests not showing testFile.log on IntelliJ

We have an issue where, whenever our Gatling performance tests are run, the .log file that should be generated at the root of the folder is not there.
This is my whole logback file, if anyone is able to help.
<contextListener class="ch.qos.logback.classic.jul.LevelChangePropagator">
<resetJUL>true</resetJUL>
</contextListener>
<appender name="CONSOLE" class="ch.qos.logback.core.ConsoleAppender">
<!-- <filter class="ch.qos.logback.classic.filter.ThresholdFilter">
<level>INFO</level>
</filter> -->
<encoder>
<pattern>%d{HH:mm:ss.SSS} [%-5level] %logger{15} - %msg%n%rEx</pattern>
<immediateFlush>true</immediateFlush>
</encoder>
</appender>
<appender name="FILE" class="ch.qos.logback.core.FileAppender">
<file>testFile.log</file>
<append>false</append>
<!-- encoders are assigned the type
ch.qos.logback.classic.encoder.PatternLayoutEncoder by default -->
<encoder>
<pattern>%d{HH:mm:ss.SSS} [%-5level] %logger{15} - %msg%n%rEx</pattern>
</encoder>
</appender>
<!-- uncomment and set to DEBUG to log all failing HTTP requests -->
<!-- uncomment and set to TRACE to log all HTTP requests -->
<logger name="io.gatling.http.engine.response" level="TRACE" />
<root level="WARN">
<appender-ref ref="CONSOLE" />
<appender-ref ref="FILE" />
</root>
Thank you very much.
Update
It seems the issue may be with IntelliJ itself, as we noticed we can see the file when going directly to the Finder.
Disabling custom plugins should help; it seems one of the configurations was corrupted.
There's a good chance the file is simply not generated where you expect it. Try setting an absolute path instead to verify.
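One way to verify is to point the FileAppender at an absolute path (the path below is purely an illustration) and see whether the file appears there:

```xml
<appender name="FILE" class="ch.qos.logback.core.FileAppender">
  <!-- Hypothetical absolute path, just to verify where the file lands -->
  <file>/tmp/gatling-logs/testFile.log</file>
  <append>false</append>
  <encoder>
    <pattern>%d{HH:mm:ss.SSS} [%-5level] %logger{15} - %msg%n%rEx</pattern>
  </encoder>
</appender>
```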

Enable perf4j profiled annotation in intellij

I'm trying to enable Perf4J annotations in IntelliJ, but I'm struggling to configure AspectJ correctly. More specifically, the log file is created correctly but lacks any data from the annotated method.
These are the relevant extracts of configuration:
logback.xml
<configuration debug="true">
<appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
<!-- encoders are assigned the type
ch.qos.logback.classic.encoder.PatternLayoutEncoder by default -->
<encoder>
<pattern>%d{HH:mm:ss} [%thread] %-5level %logger{36} - %msg%n</pattern>
</encoder>
</appender>
<appender name="statistics" class="ch.qos.logback.core.FileAppender">
<file>./target/statisticsLogback.log</file>
<append>false</append>
<layout>
<pattern>%msg%n</pattern>
</layout>
</appender>
<appender name="coalescingStatistics" class="org.perf4j.logback.AsyncCoalescingStatisticsAppender">
<!-- encoders are assigned the type
ch.qos.logback.classic.encoder.PatternLayoutEncoder by default -->
<timeSlice>1000</timeSlice>
<appender-ref ref="statistics"/>
</appender>
<appender name="listAppender" class="ch.qos.logback.core.read.ListAppender">
<!-- encoders are assigned the type
ch.qos.logback.classic.encoder.PatternLayoutEncoder by default -->
<timeSlice>1000</timeSlice>
</appender>
<logger name="org.perf4j.TimingLogger" level="info">
<appender-ref ref="coalescingStatistics" />
<appender-ref ref="listAppender"/>
</logger>
<root level="debug">
<appender-ref ref="STDOUT" />
</root>
aop.xml
<?xml version="1.0" encoding="UTF-8"?>
<aspectj>
<!--
We only want to weave in the log4j TimingAspect into the @Profiled classes.
Note that Perf4J provides TimingAspects for the most popular Java logging
frameworks and facades: log4j, java.util.logging, Apache Commons Logging
and SLF4J. The TimingAspect you specify here will depend on which logging
framework you wish to use in your code.
-->
<aspects>
<aspect name="org.perf4j.slf4j.aop.TimingAspect"/>
<!-- if SLF4J/logback use org.perf4j.slf4j.aop.TimingAspect instead -->
</aspects>
<weaver options="-verbose -showWeaveInfo">
<!--
Here is where we specify the classes to be woven. You can specify package
names like com.company.project.*
-->
<include within="com.mycode.myproject.mypackage.*"/>
<include within="org.perf4j.slf4j.aop.*"/>
</weaver>
</aspectj>
Finally, the related test method is tagged with the @Profiled annotation; it is part of the package defined in the aop.xml.
This configuration results in the log file being produced (which suggests that the logback.xml is configured correctly); however, it only contains headers and no statistics from the tagged method.
The main question I have is where the AspectJ configuration should go within IntelliJ. I have included the aop.xml under a manually created META-INF folder in the src folder, but I'm not sure this is detected by AspectJ at all.
Thanks in advance
UPDATE
I have made some progress on this since my initial post, specifically introducing two changes:
i) included -javaagent:lib\aspectjweaver.jar
ii) moved the aop.xml into a META-INF folder.
The aop configuration is now being picked up as it logs the configuration details and it also mentions the method being profiled.
The issue now is that the thread being profiled crashes. It doesn't log any exceptions, but via debugging the issue seems to be related to a ClassNotFoundException in org.aspectj.runtime.reflect.Factory when trying to instantiate org.aspectj.runtime.reflect.JoinPointImpl.
To isolate the issue I have removed all the Maven imports of AspectJ and used the jars provided by the installation package, but the issue persists; the fact that the application crashes without any logging also makes tracking it down harder.
UPDATE
To clarify:
After reading more about this including the manual in the wayback link (thanks for that) I realised I was mixing up load-time / compile-time approach. Since then I tried both methods as described in the guide but with the same results described in my earlier update.
As per above, I do start the application with aspectj weaver option (-javaagent)
The build is done via IDE, as per above at the moment I have removed the aspectj / perf4j dependencies from Maven and linked to local jars
As mentioned in the update, the aop.xml does get picked up, with no errors or warnings, just confirmation of the woven method
Okay, I have added a full Maven example to a GitHub repo which you can just clone and play around with.
Some basic things to consider:
For compile-time weaving (CTW) you need aspectjrt.jar on the classpath when compiling and running the code. You also need to use the AspectJ compiler to build the project, a normal Java compiler is not enough.
For load-time weaving (LTW) you need aspectjweaver.jar as a Java agent on the command line when running the code: -javaagent:/path/to/aspectjweaver.jar. You also need to add it as a VM argument to your LTW run configuration in IDEA.
For LTW you also need META-INF/aop.xml in your resources folder. Please also note that in order to encompass subpackages you should use the ..* notation, not just .*, e.g. <include within="de.scrum_master..*"/>.
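Applying the `..*` note to the weaver section of the asker's aop.xml would give something like:

```xml
<weaver options="-verbose -showWeaveInfo">
  <!-- ..* also matches subpackages; .* matches only the package itself -->
  <include within="com.mycode.myproject..*"/>
  <include within="org.perf4j.slf4j.aop.*"/>
</weaver>
```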
You can find more information in my project's read-me file.
P.S.: The Perf4J documentation is outdated and the project unmaintained. Thus, it still mentions AspectJ 1.6.x as necessary dependencies. I built and ran everything with the latest AspectJ 1.8.10 and it runs just fine, both from Maven and IDEA.

multi log4net instances using different configurations from the same config file

I am writing an application that will require two different loggers, each logging in a totally different way. When I create each instance of the log4net logger, how can I get it to read from its own config section within the same app.config file? Is this possible? All I have seen so far is it taking the default configuration.
You can log two or more things independently without using separate config files.
var log1 = LogManager.GetLogger("Log1");
var log2 = LogManager.GetLogger("Log2");
Then in your config file you can create them like this
<logger name="Log1" additivity="false">
<level value="INFO" />
<appender-ref ref="LogFileAppender1" />
</logger>
<logger name="Log2" additivity="false">
<level value="INFO" />
<appender-ref ref="LogFileAppender2" />
</logger>
With additivity set to false, they will log separately. You can then configure their appenders to write the output as needed.
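A minimal sketch of the two appenders referenced above (the file names and pattern are just examples):

```xml
<appender name="LogFileAppender1" type="log4net.Appender.FileAppender">
  <file value="log1.log" />
  <layout type="log4net.Layout.PatternLayout">
    <conversionPattern value="%date %-5level %logger - %message%newline" />
  </layout>
</appender>
<appender name="LogFileAppender2" type="log4net.Appender.FileAppender">
  <file value="log2.log" />
  <layout type="log4net.Layout.PatternLayout">
    <conversionPattern value="%date %-5level %logger - %message%newline" />
  </layout>
</appender>
```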

How to log quartz and nhibernate in different log files using log4net

I use NHibernate and Quartz in my application, and I would like log4net to write their logs to different files: the NHibernate logs to "nhibernate.log" and the Quartz logs to "quartz.log".
How do I need to configure the log4net config file to get this result?
Thanks, Avi.
You can configure which appender the nhibernate logger has to use:
<logger name="NHibernate">
<level value="ERROR" />
<appender-ref ref="NHibernateAppender"/>
</logger>
<logger name="NHibernate.SQL">
<level value="ERROR" />
<appender-ref ref="NHibernateAppender"/>
</logger>
Configure a different appender for your other loggers and you have separate log files.
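A sketch of the Quartz side, assuming Quartz.NET's loggers live under the "Quartz" namespace (the file names follow the question):

```xml
<logger name="Quartz">
  <level value="ERROR" />
  <appender-ref ref="QuartzAppender" />
</logger>
<appender name="QuartzAppender" type="log4net.Appender.FileAppender">
  <file value="quartz.log" />
  <layout type="log4net.Layout.PatternLayout">
    <conversionPattern value="%date %-5level %logger - %message%newline" />
  </layout>
</appender>
```

The NHibernateAppender referenced earlier would be configured the same way, pointing at "nhibernate.log".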

Intermittent log4net RollingFileAppender locked file issue

We are seeing an intermittent issue on development and production machines whereby our log files are not getting logged to.
When running in development and debugging using Visual Studio we get the following log4net error messages in the VS output window:
log4net:ERROR [RollingFileAppender] Unable to acquire lock on file C:\folder\file.log.
The process cannot access the file 'C:\folder\file.log' because it is being used by another process.
log4net:ERROR XmlConfigurator: Failed to find configuration section 'log4net' in the application's .config file.
Check your .config file for the <log4net> and <configSections> elements.
The configuration section should look like:
<section
name="log4net"
type="log4net.Config.Log4NetConfigurationSectionHandler,log4net" />
Our current workaround for the issue is to rename the last log file. We would of course expect this to fail (due to the aforementioned file lock), but it normally doesn't. Once or twice the rename has failed due to a lock from the aspnet_wp.exe process.
Our log4net configuration section is shown below:
<log4net>
<appender name="RollingLogFileAppender" type="log4net.Appender.RollingFileAppender">
<file value="C:\folder\file.log"/>
<appendToFile value="true" />
<datePattern value="yyyyMMdd" />
<rollingStyle value="Date" />
<maximumFileSize value="10MB" />
<maxSizeRollBackups value="100" />
<layout type="log4net.Layout.PatternLayout">
<header value="[Header]
"/>
<footer value="[Footer]
"/>
<conversionPattern value="%date %-5level %logger ${COMPUTERNAME} %property{UserHostAddress} [%property{SessionID}] - %message%newline"/>
</layout>
</appender>
<root>
<level value="INFO"/>
<appender-ref ref="RollingLogFileAppender"/>
</root>
</log4net>
As mentioned, we are seeing this intermittently on machines, but once the issue happens it persists.
Try adding
<lockingModel type="log4net.Appender.FileAppender+MinimalLock" />
to your <appender /> element. There is some performance impact because this means that log4net will lock the file, write to it, and unlock it for each write operation (as opposed to the default behavior, which acquires and holds onto the lock for a long time).
One implication of the default behavior is that if you're using it under a Web site that is being executed under multiple worker processes running on the same machine, each one will try to acquire and hold onto that lock indefinitely, and two of them are just going to lose. Changing the locking model to the minimal lock works around this issue.
(When debugging, ungraceful terminations and spinning up lots of new worker processes is exactly the type of thing that's likely to happen.)
Good luck!
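In context, the appender from the question would look like the following sketch (only the lockingModel line is new):

```xml
<appender name="RollingLogFileAppender" type="log4net.Appender.RollingFileAppender">
  <file value="C:\folder\file.log"/>
  <appendToFile value="true" />
  <!-- Acquire the lock only for the duration of each write -->
  <lockingModel type="log4net.Appender.FileAppender+MinimalLock" />
  <datePattern value="yyyyMMdd" />
  <rollingStyle value="Date" />
  <maximumFileSize value="10MB" />
  <maxSizeRollBackups value="100" />
  <!-- layout as before -->
</appender>
```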
Also be aware of the log4net FAQ:
How do I get multiple process to log to the same file?
Before you even start trying any of the alternatives provided, ask
yourself whether you really need to have multiple processes log to the
same file, then don't do it ;-).
FileAppender offers pluggable locking models for this usecase but all
existing implementations have issues and drawbacks.
By default the FileAppender holds an exclusive write lock on the log
file while it is logging. This prevents other processes from writing
to the file. This model is known to break down with (at least on some
versions of) Mono on Linux and log files may get corrupted as soon as
another process tries to access the log file.
MinimalLock only acquires the write lock while a log is being written.
This allows multiple processes to interleave writes to the same file,
albeit with a considerable loss in performance.
InterProcessLock doesn't lock the file at all but synchronizes using a
system wide Mutex. This will only work if all processes cooperate (and
use the same locking model). The acquisition and release of a Mutex
for every log entry to be written will result in a loss of
performance, but the Mutex is preferable to the use of MinimalLock.
If you use RollingFileAppender things become even worse as several
process may try to start rolling the log file concurrently.
RollingFileAppender completely ignores the locking model when rolling
files, rolling files is simply not compatible with this scenario.
A better alternative is to have your processes log to
RemotingAppenders. Using the RemoteLoggingServerPlugin (or
IRemoteLoggingSink) a process can receive all the events and log them
to a single log file. One of the examples shows how to use the
RemoteLoggingServerPlugin.
If you have
<staticLogFileName value="true" />
<rollingStyle value="Date" />
<datePattern value="yyyyMMdd" />
and add
<lockingModel type="log4net.Appender.FileAppender+MinimalLock" />
then there will be an error while the rolling happens.
The first process will create the new file and then rename the current file.
The next process will do the same, taking the newly created file and overwriting the newly renamed file.
This results in the log file for the last day being empty.