We are seeing an intermittent issue on development and production machines whereby our log files are not being written to.
When running in development and debugging using Visual Studio we get the following log4net error messages in the VS output window:
log4net:ERROR [RollingFileAppender] Unable to acquire lock on file C:\folder\file.log.
The process cannot access the file 'C:\folder\file.log' because it is being used by another process.
log4net:ERROR XmlConfigurator: Failed to find configuration section 'log4net' in the application's .config file.
Check your .config file for the <log4net> and <configSections> elements.
The configuration section should look like:
<section
name="log4net"
type="log4net.Config.Log4NetConfigurationSectionHandler,log4net" />
Our current workaround for the issue is to rename the last log file. We would of course expect this to fail (due to the aforementioned file lock), but it normally doesn't. Once or twice the rename has failed due to a lock from the aspnet_wp.exe process.
Our log4net configuration section is shown below:
<log4net>
<appender name="RollingLogFileAppender" type="log4net.Appender.RollingFileAppender">
<file value="C:\folder\file.log"/>
<appendToFile value="true" />
<datePattern value="yyyyMMdd" />
<rollingStyle value="Date" />
<maximumFileSize value="10MB" />
<maxSizeRollBackups value="100" />
<layout type="log4net.Layout.PatternLayout">
<header value="[Header]
"/>
<footer value="[Footer]
"/>
<conversionPattern value="%date %-5level %logger ${COMPUTERNAME} %property{UserHostAddress} [%property{SessionID}] - %message%newline"/>
</layout>
</appender>
<root>
<level value="INFO"/>
<appender-ref ref="RollingLogFileAppender"/>
</root>
</log4net>
As mentioned, we are seeing this intermittently on machines, but once the issue happens it persists.
Try adding
<lockingModel type="log4net.Appender.FileAppender+MinimalLock" />
to your <appender /> element. There is some performance impact because this means that log4net will lock the file, write to it, and unlock it for each write operation (as opposed to the default behavior, which acquires and holds onto the lock for a long time).
One implication of the default behavior is that if you're using it under a web site that is being executed under multiple worker processes running on the same machine, each one will try to acquire and hold that lock indefinitely, and all but one of them will lose. Changing the locking model to the minimal lock works around this issue.
(When debugging, ungraceful terminations and the spinning up of lots of new worker processes are exactly the kind of thing that's likely to happen.)
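For illustration, applied to the configuration in the question the appender would look like this; only the lockingModel line is new:
<appender name="RollingLogFileAppender" type="log4net.Appender.RollingFileAppender">
  <lockingModel type="log4net.Appender.FileAppender+MinimalLock" />
  <file value="C:\folder\file.log"/>
  <appendToFile value="true" />
  <datePattern value="yyyyMMdd" />
  <rollingStyle value="Date" />
  ...
</appender>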
Good luck!
Also be aware of the log4net FAQ:
How do I get multiple processes to log to the same file?
Before you even start trying any of the alternatives provided, ask
yourself whether you really need to have multiple processes log to the
same file, then don't do it ;-).
FileAppender offers pluggable locking models for this use case, but all
existing implementations have issues and drawbacks.
By default the FileAppender holds an exclusive write lock on the log
file while it is logging. This prevents other processes from writing
to the file. This model is known to break down with (at least on some
versions of) Mono on Linux and log files may get corrupted as soon as
another process tries to access the log file.
MinimalLock only acquires the write lock while a log is being written.
This allows multiple processes to interleave writes to the same file,
albeit with a considerable loss in performance.
InterProcessLock doesn't lock the file at all but synchronizes using a
system wide Mutex. This will only work if all processes cooperate (and
use the same locking model). The acquisition and release of a Mutex
for every log entry to be written will result in a loss of
performance, but the Mutex is preferable to the use of MinimalLock.
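In configuration terms each model is a one-line choice inside the appender; the first is the default, so it is rarely spelled out:
<!-- default: exclusive write lock held for the lifetime of the appender -->
<lockingModel type="log4net.Appender.FileAppender+ExclusiveLock" />
<!-- lock acquired and released around every write; slower, but allows interleaving -->
<lockingModel type="log4net.Appender.FileAppender+MinimalLock" />
<!-- no file lock; cooperating processes synchronize on a system-wide Mutex -->
<lockingModel type="log4net.Appender.FileAppender+InterProcessLock" />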
If you use RollingFileAppender things become even worse, as several
processes may try to start rolling the log file concurrently.
RollingFileAppender completely ignores the locking model when rolling
files; rolling files is simply not compatible with this scenario.
A better alternative is to have your processes log to
RemotingAppenders. Using the RemoteLoggingServerPlugin (or
IRemoteLoggingSink) a process can receive all the events and log them
to a single log file. One of the examples shows how to use the
RemoteLoggingServerPlugin.
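As a rough sketch of that alternative (adapted from the log4net examples; the sink URL is an illustrative assumption), each process would send events to a central listener instead of writing the file itself:
<appender name="RemotingAppender" type="log4net.Appender.RemotingAppender">
  <!-- the receiving process exposes an IRemoteLoggingSink at this URL
       (e.g. via RemoteLoggingServerPlugin) and writes the single log file -->
  <sink value="tcp://localhost:8085/LoggingSink" />
  <lossy value="false" />
  <bufferSize value="95" />
</appender>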
If you have
<staticLogFileName value="true" />
<rollingStyle value="Date" />
<datePattern value="yyyyMMdd" />
and add
<lockingModel type="log4net.Appender.FileAppender+MinimalLock" />
then there will be an error when the rolling happens.
The first process will create the new file and then rename the current file.
The next process will do the same, taking the newly created file and overwriting the newly renamed file.
The result is that the log file for the last day is empty.
We are using the RollingFileAppender in a web application running behind a load balancer on multiple nodes.
We noticed that the logs of previous days are often very small, only a few lines, while the current log is large. It is not 100% consistent; about 1 in 5 previous logs appear to be a full log (using 2 nodes).
We figured that both nodes must be renaming log.log to the previous date with a minimal timespan between them. The last node to do so will actually overwrite the previous log with the new logfile created moments earlier by the first node.
This is our curated config:
<log4net>
<appender name="RollingLogFileAppender" type="log4net.Appender.RollingFileAppender">
<lockingModel type="log4net.Appender.FileAppender+MinimalLock"/>
<file value="\\shared-path\log.log" />
<datePattern value="yyyy-MM-dd.'txt'"/>
<staticLogFileName value="true"/>
<appendToFile value="true"/>
<rollingStyle value="Date"/>
<maxSizeRollBackups value="10"/>
<maximumFileSize value="20MB"/>
<layout type="log4net.Layout.PatternLayout">
<conversionPattern value="%date [%thread] %message%newline"/>
</layout>
</appender>
<root>
<level value="ALL"/>
<appender-ref ref="RollingLogFileAppender"/>
</root>
</log4net>
Is there any simple way to prevent this from happening?
The problem can be solved by setting <staticLogFileName value="false"/>.
In that case it will not rename log.log to the formatted archive name, but simply create a new file for the new log.
A disadvantage of this solution is that any Tail or monitoring tool will now have to check the directory for the newest file instead of just going for the static log.log.
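Concretely, it is a one-line change to the appender above; with a non-static file name the appender writes straight to the date-patterned file, so there is nothing to rename when the date rolls over:
<appender name="RollingLogFileAppender" type="log4net.Appender.RollingFileAppender">
  <lockingModel type="log4net.Appender.FileAppender+MinimalLock"/>
  <file value="\\shared-path\log.log" />
  <datePattern value="yyyy-MM-dd.'txt'"/>
  <staticLogFileName value="false"/>
  <appendToFile value="true"/>
  <rollingStyle value="Date"/>
  ...
</appender>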
Edit:
This is not a viable solution after all. We are missing chunks of logs, and timestamps sometimes jump back a minute, so log4net seems to be buffering before flushing to the file...
See also this answer to the question Logging web services operations behind load balancer
I'm trying to set up a kiosk on a Surface Go using the following AssignedAccess.xml file in my provisioning package:
<?xml version="1.0" encoding="utf-8" ?>
<AssignedAccessConfiguration
xmlns="https://schemas.microsoft.com/AssignedAccess/2017/config"
xmlns:r1809="https://schemas.microsoft.com/AssignedAccess/201810/config"
>
<Profiles>
<Profile Id="{f46cfb9f-044f-4d96-bb33-ea1c1c18a354}">
<AllAppsList>
<AllowedApps>
<App AppUserModelId="Microsoft.Windows.Explorer" r1809:AutoLaunch="true" />
<App AppUserModelId="Microsoft.WindowsCalculator_8wekyb3d8bbwe!App" />
<App DesktopAppPath="C:\Program Files\SumatraPDF\SumatraPDF.exe" />
</AllowedApps>
</AllAppsList>
<r1809:FileExplorerNamespaceRestrictions>
<r1809:AllowedNamespace Name="Downloads" />
</r1809:FileExplorerNamespaceRestrictions>
<StartLayout>
<![CDATA[<LayoutModificationTemplate xmlns:defaultlayout="http://schemas.microsoft.com/Start/2014/FullDefaultLayout" xmlns:start="http://schemas.microsoft.com/Start/2014/StartLayout" Version="1" xmlns="http://schemas.microsoft.com/Start/2014/LayoutModification">
<LayoutOptions StartTileGroupCellWidth="6" />
<DefaultLayoutOverride>
<StartLayoutCollection>
<defaultlayout:StartLayout GroupCellWidth="6">
<start:Group Name="Apps">
<start:Tile Size="4x2" Column="0" Row="2" AppUserModelID="Microsoft.WindowsCalculator_8wekyb3d8bbwe!App" />
<start:DesktopApplicationTile Size="2x2" Column="0" Row="0" DesktopApplicationLinkPath="%APPDATA%\Microsoft\Windows\Start Menu\Programs\SumatraPDF.lnk" />
<start:DesktopApplicationTile Size="2x2" Column="2" Row="0" DesktopApplicationLinkPath="%APPDATA%\Microsoft\Windows\Start Menu\Programs\System Tools\File Explorer.lnk" />
</start:Group>
</defaultlayout:StartLayout>
</StartLayoutCollection>
</DefaultLayoutOverride>
</LayoutModificationTemplate>
]]>
</StartLayout>
<Taskbar ShowTaskbar="false" />
</Profile>
</Profiles>
<Configs>
<Config>
<Account>CouncilKiosk</Account>
<DefaultProfile Id="{f46cfb9f-044f-4d96-bb33-ea1c1c18a354}"/>
</Config>
</Configs>
</AssignedAccessConfiguration>
I took a look at the logs, and the recurring error code is '0xC00CE223'. According to my research, this means "Validate failed because the document does not contain exactly one root node" (XML DOM Error Messages doc). I'm not sure where this is going wrong.
The provisioning package is also setting 2 user accounts (local admin and local user), hiding OOBE, enabling tablet mode as default, and running a provisioning command script that installs a single application and sets registry keys necessary for autologin.
UPDATE: I re-imaged the Surface Go with Windows 10 Pro and it still fails, but now I get an error '0x8000FFFF', which appears to be related to Windows Update and the Windows Store. I only have one USB port on this thing, so it isn't connected to the internet at this time.
UPDATE 2: I re-imaged with a more up-to-date ISO of Windows 10 Pro and I'm back to the original errors listed in the post above. I have updated the XML file, changing the tag as well as the xmlns from rs5 to r1809. I am not seeing any changes, and this continues to be a frustrating problem to have.
Try changing this:
https://schemas.microsoft.com/AssignedAccess/2017/config
to the following:
http://schemas.microsoft.com/AssignedAccess/2017/config
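That is, the root element of the AssignedAccess.xml would become (note http rather than https; presumably the r1809 namespace needs the same treatment):
<AssignedAccessConfiguration
  xmlns="http://schemas.microsoft.com/AssignedAccess/2017/config"
  xmlns:r1809="http://schemas.microsoft.com/AssignedAccess/201810/config"
>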
I'm trying to enable Perf4J annotations in IntelliJ, but I'm struggling to configure AspectJ correctly. More specifically, the log file is created correctly but lacks any data from the annotated method.
These are the relevant extracts of configuration:
logback.xml
<configuration debug="true">
<appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
<!-- encoders are assigned the type
ch.qos.logback.classic.encoder.PatternLayoutEncoder by default -->
<encoder>
<pattern>%d{HH:mm:ss} [%thread] %-5level %logger{36} - %msg%n</pattern>
</encoder>
</appender>
<appender name="statistics" class="ch.qos.logback.core.FileAppender">
<file>./target/statisticsLogback.log</file>
<append>false</append>
<layout>
<pattern>%msg%n</pattern>
</layout>
</appender>
<appender name="coalescingStatistics" class="org.perf4j.logback.AsyncCoalescingStatisticsAppender">
<!-- encoders are assigned the type
ch.qos.logback.classic.encoder.PatternLayoutEncoder by default -->
<timeSlice>1000</timeSlice>
<appender-ref ref="statistics"/>
</appender>
<appender name="listAppender" class="ch.qos.logback.core.read.ListAppender">
<!-- encoders are assigned the type
ch.qos.logback.classic.encoder.PatternLayoutEncoder by default -->
<timeSlice>1000</timeSlice>
</appender>
<logger name="org.perf4j.TimingLogger" level="info">
<appender-ref ref="coalescingStatistics" />
<appender-ref ref="listAppender"/>
</logger>
<root level="debug">
<appender-ref ref="STDOUT" />
</root>
aop.xml
<?xml version="1.0" encoding="UTF-8"?>
<aspectj>
<!--
We only want to weave in the log4j TimingAspect into the @Profiled classes.
Note that Perf4J provides TimingAspects for the most popular Java logging
frameworks and facades: log4j, java.util.logging, Apache Commons Logging
and SLF4J. The TimingAspect you specify here will depend on which logging
framework you wish to use in your code.
-->
<aspects>
<aspect name="org.perf4j.slf4j.aop.TimingAspect"/>
<!-- if SLF4J/logback use org.perf4j.slf4j.aop.TimingAspect instead -->
</aspects>
<weaver options="-verbose -showWeaveInfo">
<!--
Here is where we specify the classes to be woven. You can specify package
names like com.company.project.*
-->
<include within="com.mycode.myproject.mypackage.*"/>
<include within="org.perf4j.slf4j.aop.*"/>
</weaver>
</aspectj>
Finally, the related test method is tagged with the @Profiled annotation and is part of the package defined in the aop.xml.
This configuration results in the log file being produced (which suggests that the logback.xml is configured correctly); however, it only contains headers and no statistics from the tagged method.
The main question I have is where the AspectJ configuration should go within IntelliJ. I have included the aop.xml under a manually created META-INF folder in the src folder, but I'm not sure this is detected by AspectJ at all.
Thanks in advance
UPDATE
I have made some progress on this since my initial post, specifically introducing two changes:
i) included -javaagent:lib\aspectjweaver.jar
ii) moved the aop.xml into a META-INF folder.
The aop configuration is now being picked up as it logs the configuration details and it also mentions the method being profiled.
The issue now is that the thread being profiled crashes. It doesn't log any exceptions, but debugging suggests the problem is a ClassNotFoundException in org.aspectj.runtime.reflect.Factory when trying to instantiate org.aspectj.runtime.reflect.JoinPointImpl.
To isolate the issue I have removed all the Maven imports of AspectJ and used the jars provided by the installation package, but the issue persists; the fact that the application crashes without any logging also makes tracking it down harder.
UPDATE
To clarify:
After reading more about this, including the manual in the Wayback link (thanks for that), I realised I was mixing up the load-time and compile-time approaches. Since then I have tried both methods as described in the guide, but with the same results described in my earlier update.
As per the above, I do start the application with the AspectJ weaver option (-javaagent).
The build is done via the IDE; at the moment I have removed the AspectJ / Perf4J dependencies from Maven and linked local jars instead.
The aop.xml does get picked up, as mentioned in the update, with no errors or warnings, just confirmation of the woven method.
Okay, I have added a full Maven example to a GitHub repo which you can just clone and play around with.
Some basic things to consider:
For compile-time weaving (CTW) you need aspectjrt.jar on the classpath when compiling and running the code. You also need to use the AspectJ compiler to build the project, a normal Java compiler is not enough.
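If the project is built with Maven rather than the IDE, CTW is usually wired in through the AspectJ Maven plugin; a rough sketch (the version numbers are assumptions, match them to your AspectJ runtime):
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>aspectj-maven-plugin</artifactId>
  <version>1.10</version>
  <configuration>
    <complianceLevel>1.8</complianceLevel>
    <!-- weave aspects shipped in the perf4j jar (e.g. TimingAspect) into our classes -->
    <aspectLibraries>
      <aspectLibrary>
        <groupId>org.perf4j</groupId>
        <artifactId>perf4j</artifactId>
      </aspectLibrary>
    </aspectLibraries>
  </configuration>
  <executions>
    <execution>
      <goals>
        <goal>compile</goal>
        <goal>test-compile</goal>
      </goals>
    </execution>
  </executions>
</plugin>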
For load-time weaving (LTW) you need aspectjweaver.jar as a Java agent on the command line when running the code: -javaagent:/path/to/aspectjweaver.jar. You also need to add it as a VM argument to your LTW run configuration in IDEA.
For LTW you also need META-INF/aop.xml in your resources folder. Please also note that in order to encompass subpackages you should use the ..* notation, not just .*, e.g. <include within="de.scrum_master..*"/>.
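For LTW, a minimal META-INF/aop.xml along those lines would be a corrected version of the one in the question (the package name is a placeholder; note the ..* include):
<?xml version="1.0" encoding="UTF-8"?>
<aspectj>
  <aspects>
    <aspect name="org.perf4j.slf4j.aop.TimingAspect"/>
  </aspects>
  <weaver options="-verbose -showWeaveInfo">
    <!-- ..* also matches subpackages; .* only matches direct children -->
    <include within="com.mycode.myproject..*"/>
    <include within="org.perf4j.slf4j.aop.*"/>
  </weaver>
</aspectj>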
You find more information in my project's read-me file.
P.S.: The Perf4J documentation is outdated and the project unmaintained. Thus, it still mentions AspectJ 1.6.x as necessary dependencies. I built and ran everything with the latest AspectJ 1.8.10 and it runs just fine, both from Maven and IDEA.
I'm pretty new to build servers but have been asked by my employer to do some testing (because F5 is not a build process, as the excellent article by Jeff Atwood says). At this stage, I'm working on getting some sample builds and test reports up and running on a CruiseControl.NET server. So far, I've gotten a build up and running (the configuration file will need some tidying up before adding new builds/projects but the proof of concept is there) but the reporting is causing something of a headache.
The main report I'm looking for is for our NUnit tests and SpecFlow integration tests. The tests run fine (a sensible-looking XML file is generated), and I'm looking to merge that into the main build results so that I can show the results of the NUnit/SpecFlow tests.
Whenever the build completes, the following is reported in the messages (in ViewFarmReport.aspx): "Failing Tasks : XmlLogPublisher "
This, combined with the following error reported in the Windows application log (source: CC.Net):
2015-03-24 08:36:52,987 [Initech.SuperCrm-DEV] ERROR CruiseControl.NET [(null)] - Publisher threw exception: ThoughtWorks.CruiseControl.Core.CruiseControlException: Unable to read the contents of the file: C:\CCNet\BuildArtifacts\Initech.SuperCrm-DEV\msbuild-results-7c657954-2c3e-405f-b0f1-7da1299788fd.xml ---> System.IO.FileNotFoundException: Could not find file 'C:\CCNet\BuildArtifacts\Initech.SuperCrm-DEV\msbuild-results-7c657954-2c3e-405f-b0f1-7da1299788fd.xml'.
(company/application name "censored")
This leads me to suspect that the failure to merge in the msbuild results (which I believe CruiseControl.NET automatically scrapes since version... 1.5 or 1.6?) is preventing the NUnit results from being merged in.
There is no msbuild-results file in the BuildArtifacts folder, which does not surprise me, as I do not believe my current MSBuild configuration allows for XML-based logging since I am using the ThoughtWorks.CruiseControl.MsBuild.dll logger.
According to the online documentation for CruiseControl.NET there is an XML-enabled custom logger, ThoughtWorks.CruiseControl.MsBuild.XmlLogger, which can be used; however, the download location given for this logger no longer appears to exist.
Can anyone say whether I'm thinking along the right lines here and what my options are?
For reference, here is my complete configuration:
<cruisecontrol xmlns:cb="urn:ccnet.config.builder">
<cb:define MSBuildPath="C:\Windows\Microsoft.NET\Framework\v4.0.30319" />
<cb:define WorkingBaseDir="C:\CCNet\Builds" />
<cb:define ArtifactBaseDir="C:\CCNet\BuildArtifacts" />
<cb:define MSBuildLogger="C:\Program Files (x86)\CruiseControl.NET\server\ThoughtWorks.CruiseControl.MsBuild.dll" />
<cb:define NUnitExe="C:\Jenkins\Nunit\nunit-console.exe" />
<cb:define name="vsts_ci">
<executable>C:\Jenkins\tf.exe</executable>
<server>http://tfs-srv:8080/tfs/LEEDS/</server>
<domain>CONTOSO</domain>
<autoGetSource>true</autoGetSource>
<cleanCopy>true</cleanCopy>
<force>true</force>
<deleteWorkspace>true</deleteWorkspace>
</cb:define>
<project name="Initech.Libraries" description="Shared libraries used in all Initech projects"
queue="Q1">
<state type="state" directory="C:\CCNet\State"/>
<artifactDirectory>$(ArtifactBaseDir)\Initech.Libraries</artifactDirectory>
<workingDirectory>$(WorkingBaseDir)\Initech.Libraries</workingDirectory>
<triggers>
<intervalTrigger
name="continuous"
seconds="30"
buildCondition="IfModificationExists"
initialSeconds="5"/>
</triggers>
<sourcecontrol type="vsts">
<cb:vsts_ci/>
<workspace>CCNET_Initech.Libraries</workspace>
<project>$/InitechLibraries/Initech.Libraries</project>
</sourcecontrol>
</project>
<project name="Initech.SuperCrm-DEV" description="Initech.SuperCrm Application, Development
Version" queue="Q1">
<cb:define ArtifactDirectory="$(ArtifactBaseDir)\Initech.SuperCrm-DEV" />
<cb:define WorkingDirectory="$(WorkingBaseDir)\Initech.SuperCrm-DEV" />
<cb:define OutputDirectory="$(WorkingDirectory)\Initech.SuperCrm\bin\Debug" />
<cb:define ProjectFile="Initech.SuperCrm.sln" />
<cb:define NUnitLog="$(WorkingDirectory)\NunitResults.xml" />
<state type="state" directory="C:\CCNet\State"/>
<artifactDirectory>$(ArtifactDirectory)</artifactDirectory>
<workingDirectory>$(WorkingDirectory)</workingDirectory>
<triggers>
<!-- check the source control every X time for changes,
and run the tasks if changes are found -->
<intervalTrigger
name="continuous"
seconds="30"
buildCondition="IfModificationExists"
initialSeconds="5"/>
</triggers>
<sourcecontrol type="vsts">
<cb:vsts_ci/>
<workspace>CCNET_Initech.SuperCrm-DEV</workspace>
<project>$/InitechSuperCrm/SuperCrm/Initech.SuperCrm-DEV</project>
</sourcecontrol>
<tasks>
<exec>
<executable>C:\Program Files (x86)\DXperience 12.1\Tools\DXperience\ProjectConverter-console.exe</executable>
<buildArgs>$(WorkingDirectory)</buildArgs>
</exec>
<msbuild>
<executable>$(MSBuildPath)\MSBuild.exe</executable>
<workingDirectory>$(WorkingDirectory)</workingDirectory>
<projectFile>$(ProjectFile)</projectFile>
<timeout>900</timeout>
<logger>$(MSBuildLogger)</logger>
</msbuild>
<exec>
<executable>$(NUnitExe)</executable>
<buildArgs>/xml=$(NUnitLog) /nologo $(WorkingDirectory)\$(ProjectFile)</buildArgs>
</exec>
</tasks>
<publishers>
<buildpublisher>
<sourceDir>$(OutputDirectory)</sourceDir>
<useLabelSubDirectory>true</useLabelSubDirectory>
<alwaysPublish>false</alwaysPublish>
<cleanPublishDirPriorToCopy>true</cleanPublishDirPriorToCopy>
</buildpublisher>
<merge>
<files>
<file>$(NUnitLog)</file>
</files>
</merge>
<xmllogger logDir="C:\CCNet\BuildArtifacts\Initech.SuperCrm-DEV\buildlogs" />
<artifactcleanup cleanUpMethod="KeepLastXBuilds" cleanUpValue="50" />
</publishers>
</project>
</cruisecontrol>
I've been tearing my hair out trying to figure this out, and I don't have much to begin with, so any help would be greatly appreciated.
After a prolonged period of banging my head against the wall, I seem to have finally found the solution (well solutions).
1) Kobush.Build.dll (https://www.nuget.org/packages/Kobush.Build/) can be used as the logger for MSBuild. Looking at the attributions in CruiseControl.NET's documentation, it appears to have been written by the same developer (but extended).
2) Some tweaks were needed due to the default location of the msbuild-results output: because it is dumped to the build artifacts folder by default, it is susceptible to being deleted prematurely.
I no longer clean the publish directory prior to copying (in the buildpublisher), and I perform the merge and xmllogger portions of the publisher before the artifact cleanup.
As a result, I now have msbuild and nunit output/results integrated in to the main build log and these can be consumed through the CruiseControl.NET dashboard.
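Based on the changes described, the publishers section ends up looking roughly like this (cleanPublishDirPriorToCopy turned off, and the merge and xmllogger kept ahead of the artifact cleanup):
<publishers>
  <buildpublisher>
    <sourceDir>$(OutputDirectory)</sourceDir>
    <useLabelSubDirectory>true</useLabelSubDirectory>
    <alwaysPublish>false</alwaysPublish>
    <cleanPublishDirPriorToCopy>false</cleanPublishDirPriorToCopy>
  </buildpublisher>
  <merge>
    <files>
      <file>$(NUnitLog)</file>
    </files>
  </merge>
  <xmllogger logDir="C:\CCNet\BuildArtifacts\Initech.SuperCrm-DEV\buildlogs" />
  <!-- cleanup runs last so the merged results are not deleted before publishing -->
  <artifactcleanup cleanUpMethod="KeepLastXBuilds" cleanUpValue="50" />
</publishers>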
There's probably a tidier way of handling this, but at the moment I'm just getting a proof of concept going.
I have configured Hibernate to use the logback logging library, and created an appender that catches logging data from the "org.hibernate.SQL" and "org.hibernate.type" loggers. By default, those are set to the INFO level.
As the next step, I try to change the level of those two loggers to DEBUG using the JMX interface of logback, but it does not work and the log file contains no data. It only works if I set the logging level to DEBUG in the configuration file and then restart the server.
Should I do anything additional in order to make Hibernate start logging?
Here goes the appender/logger configuration:
<configuration debug="false" scan="true" scanPeriod="5 minutes">
<jmxConfigurator />
...
<property name="SQL_LOG_LEVEL" value="DEBUG" />
<appender name="SQL_LOG" class="ch.qos.logback.core.rolling.RollingFileAppender">
<file>${LOG_DIRECTORY}/sql_${weblogic.Name}.log</file>
<rollingPolicy class="ch.qos.logback.core.rolling.FixedWindowRollingPolicy">
<fileNamePattern>${ROTATION_DIRECTORY}/sql_${weblogic.Name}.%i.log.zip</fileNamePattern>
<minIndex>1</minIndex>
<maxIndex>5</maxIndex>
</rollingPolicy>
<triggeringPolicy class="ch.qos.logback.core.rolling.SizeBasedTriggeringPolicy">
<maxFileSize>50MB</maxFileSize>
</triggeringPolicy>
<encoder>
<pattern>${LOG_PATTERN}</pattern>
</encoder>
</appender>
<logger name="org.hibernate.SQL" level="${SQL_LOG_LEVEL}" additivity="false">
<appender-ref ref="SQL_LOG" />
</logger>
<logger name="org.hibernate.type" level="${SQL_LOG_LEVEL}" additivity="false">
<appender-ref ref="SQL_LOG" />
</logger>
...
</configuration>
EDIT: I have several application (EAR) files deployed on the same container. All applications are using the same logging configuration.
The problem appears to be that I deploy several applications on one server, and each application's class loader has its own copy of the logback libraries. That's why several logging contexts are created, but because they all share the same name ("default"), only one gets registered with the MBean server.
The solution could be either to move the logback libraries higher in the class loader hierarchy or to use logger context separation, as proposed by the logback documentation.
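If moving the libraries is not practical, the context-separation route boils down to giving each application its own context name, so each JMXConfigurator MBean is registered under a distinct name; a sketch of the per-application change (the context name is a placeholder):
<configuration debug="false" scan="true" scanPeriod="5 minutes">
  <!-- distinct per application; the JMX MBean is registered under this name instead of "default" -->
  <contextName>crm-ear</contextName>
  <jmxConfigurator />
  ...
</configuration>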
I was not able to log any output from org.hibernate.SQL and friends until I set my log level to TRACE instead of DEBUG using logback over slf4j.
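In terms of the configuration above, that is just a change to the logger level; org.hibernate.type in particular emits its bind-parameter output at TRACE:
<logger name="org.hibernate.type" level="TRACE" additivity="false">
  <appender-ref ref="SQL_LOG" />
</logger>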