NLog variable and log file issue with multiple log files - VB.NET

I created a Log File class that uses NLog with the hopes of writing to multiple log files at the same time. This seems to work fine until I add variables to the mix.
Problem
Changing the variable seems to change that variable setting for all instances of the log file instead of the particular instance I am working with.
My Code
Here is how I have the programming structured:
LogClass - Basically a wrapper to give me some additional functionality. The 'SetVariable' method is what I am using to set the particular variable (called dqwAlertName).
With this log class, I am passing in the specific logger that I want to use like this:
Public iLogger As New Logger(NLog.LogManager.GetLogger("dataQualityWatcher"), True)
That statement instantiates the logging class with the "dataQualityWatcher" logger and sets Debug=True (which I simply use to allow a more verbose logging that can be turned on and off).
With that said... The statement above is ALSO within another class object:
dataQualityWatcher Class - This is a 'watcher' that is called many times over and runs continuously. If you're familiar with FileSystemWatcher, it works similarly to that. It basically watches data for a specific value and raises an event.
Inside THIS class is where I instantiate the logger as mentioned above with the following code:
Public iLogger As New Logger(NLog.LogManager.GetLogger("dataQualityWatcher"), True)
iLogger.SetVariable("dqwAlertName", _AlertName)
The first line instantiates the logger; the second line sets the variable. The Logging Class SetVariable method is pretty basic:
Public Sub SetVariable(variableName As String, value As String)
    'Set variable context for the logger
    NLog.LogManager.Configuration.Variables(variableName) = value
End Sub
I am using that variable within the NLog.config file in the following manner:
<variable name="LogLayout" value="[${date:format=MM/dd/yyyy h\:mm\:ss.fff tt}] [${gdc:item=location}] | ${level} | ${message}" />
<variable name="InfoLayout" value="[${date:format=MM/dd/yyyy h\:mm\:ss.fff tt}] ${gdc:item=SoftwareName} Version ${gdc:item=SoftwareVersion} - ${message}" />
<variable name="DebugInfoLayout" value="[${date:format=MM/dd/yyyy h\:mm\:ss.fff tt}] ${message}" />
<variable name="logDir" value="C:/Log/PWTester/" />
<variable name="dqwAlertName" value="" />
<targets>
<target name="dataQualityWatcher" xsi:type="File" fileName="${logDir}/LogFiles/${var:dqwAlertName}-DataQualityWatcher.log" layout="${LogLayout}" />
</targets>
<rules>
<logger name="dataQualityWatcher" minlevel="Trace" writeTo="dataQualityWatcher" />
</rules>
THE PROBLEM:
I run multiple 'watchers' (as I call them) with the following code to create that object and assign properties:
dataWatch.Add(New dataQualityWatcher(True) With {.Tags = lstTags, .AlertTimerInterval = Convert.ToInt64(intTimerMilliseconds), .AlertGroupID = Convert.ToInt64(CARow(0)), .EmailGroupID = Convert.ToInt64(CARow(1)), .CustomSubject = CARow(3), .CustomMessage = CARow(4), .AlertName = DataAlertGroupName, .Debug = blnVerboseLogging, .HistorianServer = SH})
Multiple Version Example
I run the code above where .AlertName = {"Test1", "Test2", "Test3"}. Other parameters also change, and a new object is instantiated each time. In this example there are 3 dataQualityWatcher objects instantiated, which in turn instantiate 3 Logger objects.
Each time a new dataQualityWatcher object is instantiated, it instantiates a Logger, which then writes to the file. The AlertName variable is passed on through the SetVariable method above.
I would expect 3 log files to be written:
Test1-DataQualityWatcher.log
Test2-DataQualityWatcher.log
Test3-DataQualityWatcher.log
This DOES happen. However, the last dataQualityWatcher object that is created runs SetVariable with "Test3" (in this example). Now that variable is set, and all 3 Loggers begin logging to that file (i.e., Test3-DataQualityWatcher.log).
I can only assume that there is a better way to do this with variables such that they are scoped to the life of that particular logger instance, but I can't seem to figure it out!
Thanks in advance and sorry for the VERY, VERY long post.

As far as I understand, you are trying to log to multiple files with one target.
This won't work well with configuration variables, as they are static (Shared in VB.NET) - so it isn't thread-safe.
Other options to do this are:
Create multiple file targets in your nlog.config and set up the right <rules>, or
Pass extra properties with every message and use event-properties in the target's fileName: fileName="${logDir}/LogFiles/${event-properties:dqwAlertName}-DataQualityWatcher.log". The VB.NET call:
Dim theEvent As New LogEventInfo(LogLevel.Debug, "", "Pass my custom value")
theEvent.Properties("MyValue") = "My custom string"
' then write the event through the underlying NLog logger instance
logger.Log(theEvent)
You could write a subclass of Logger to make it less verbose. Or
Create the targets & rules programmatically (in VB.NET); see the tutorial (in C#) and the sketch below.
If performance is very important, choose option 1 or 3.
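For option 3, a minimal VB.NET sketch of what the per-watcher setup could look like (assuming NLog 4.x; the CreateWatcherLogger name, the hard-coded path and the layout below are placeholders for illustration, not your existing config):

Imports NLog
Imports NLog.Config
Imports NLog.Targets

Public Module WatcherLogging
    Public Function CreateWatcherLogger(alertName As String) As NLog.Logger
        Dim config = LogManager.Configuration

        ' One FileTarget per watcher, so every alert writes to its own file.
        Dim target As New FileTarget()
        target.Name = "dataQualityWatcher-" & alertName
        target.FileName = "C:/Log/PWTester/LogFiles/" & alertName & "-DataQualityWatcher.log"
        target.Layout = "[${date:format=MM/dd/yyyy h\:mm\:ss.fff tt}] | ${level} | ${message}"
        config.AddTarget(target.Name, target)

        ' One rule per watcher; the logger name pattern matches only this watcher's logger.
        config.LoggingRules.Add(New LoggingRule("dataQualityWatcher." & alertName, LogLevel.Trace, target))

        ' Re-apply the configuration so the new target and rule take effect.
        LogManager.Configuration = config

        Return LogManager.GetLogger("dataQualityWatcher." & alertName)
    End Function
End Module

Each dataQualityWatcher would then wrap the logger returned by CreateWatcherLogger(_AlertName) instead of GetLogger("dataQualityWatcher"), and SetVariable is no longer needed.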

Related

How to resolve Checkstyle error: 'method def modifier ' has incorrect indentation

Got a checkstyle error that states a member def modifier has incorrect indentation level 4 and is expected to be level 2.
Apart from having Checkstyle as a plugin, you should download its jar file as well, because there you will be able to see what the Google and Sun check files are doing to your code. Truth be told, the Checkstyle documentation is kind of hard to understand, and having those files at hand will make it easier to see what is going on.
Getting back to your question, there is a module called Indentation which has a basicOffset property that sets the amount of indentation it expects to find when scanning your code. I'll show you an example:
<module name="TreeWalker">
  <module name="Indentation">
    <property name="basicOffset" value="2"/>
    <property name="caseIndent" value="2"/>
  </module>
</module>
The XML above is a simple example of this module; I added another property, caseIndent, for the same Indentation module, which resides inside TreeWalker. As you can see, the basicOffset property has the value 2, so you could say: wait, the message I got said 4 and not 2. I'll explain:
class Foo {                  // no space at the left side
  private void fooMethod() { // a tab or 2 spaces at the left side
    int a = 0;               // 2 spaces from the method's declaration plus 2 for this is 4
  }
}
Definitely you ought to look at the XML I mentioned before; replicate it on your own with basic settings and play around to get a better understanding. You can get more info from here.

jboss-cli property format for path attribute

As explained in JBoss EAP 7 documentation, one can pass in a properties file to the CLI instance with the --properties flag.
I'm trying to create a generic script for logging profiles.
This is my properties file:
profilename=myProfileName
filepath=/some/dir/somefile.log
And this is my script:
set profilename=${profilename}
set filepath=${filepath}
/profile=full-ha/subsystem=logging/logging-profile=$profilename:add
/profile=full-ha/subsystem=logging/logging-profile=$profilename/periodic-size-rotating-file-handler=myHandler:add(file={"relative-to" => "some.dir","path" => $filepath},suffix=.yyyy-MM-dd,max-backup-index=50,rotate-on-boot=true,rotate-size=20m)
The script doesn't generate any error and completes successfully, and the $profilename variable is correctly replaced by its value.
But the $filepath variable seems to be a problem:
<logging-profile name="myProfileName">
<periodic-size-rotating-file-handler name="myHandler" rotate-on-boot="true">
<file relative-to="some.dir" path="$filepath}"/>
<rotate-size value="20m"/>
<max-backup-index value="50"/>
<suffix value=".yyyy-MM-dd"/>
</periodic-size-rotating-file-handler>
</logging-profile>
What is the specific format to use so that a variable can be used for the path attribute?
Edit: tested with JBoss EAP 7.2, and now it works as expected, so I guess it was indeed a bug.
I know this is a very late answer, but is the filepath variable the last one in your list?
This looks like a line-ending issue; if you add a new line at the end of the properties file, the value should get picked up correctly.
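If the line-ending theory is right, the quickest check (this is only an assumption about your file, nothing in the question shows it) is to make sure the properties file ends with a newline after the last entry, for example by adding a trailing comment line:

profilename=myProfileName
filepath=/some/dir/somefile.log
# trailing comment line, just to guarantee the filepath line ends with a newline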

Issues with overriding the Mule Watermark (SF Polling) in flow via ObjectStore:store

I have defined the object store as following:
<objectstore:config name="objectStore" objectStore-ref="_defaultUserObjectStore"/>
And I am trying to modify the watermark variable defined by the name "lastmodified" in the object store via a flow which calls
<objectstore:store key="lastmodified" value-ref="#[payload.lastmodified]" overwrite="true" config-ref="objectStore" doc:name="Default User Object Store"/>
Note: payload.lastmodified has appropriate value of "2016-06-29T15:08:45.000Z" in it.
I am not seeing any error on the console, but the next time the Poll executes, it doesn't read the updated value of the watermark.
Any pointer would be surely helpful.
Thanks.
Instead of the method used above, try using poll watermarking. You can set your update expression in the poll's watermark and, if needed, use an object store as well.
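For reference, a rough sketch of what poll watermarking could look like in Mule 3.5+ (the flow name, frequency, default expression and the placement of your Salesforce query are assumptions for illustration, not taken from your project):

<flow name="sfPollFlow">
    <poll doc:name="Poll">
        <fixed-frequency-scheduler frequency="60000"/>
        <watermark variable="lastmodified"
                   default-expression="#['2016-06-29T15:08:45.000Z']"
                   update-expression="#[payload.lastmodified]"
                   object-store-ref="_defaultUserObjectStore"/>
        <!-- the polled processor goes here, e.g. your Salesforce query, which can reference #[flowVars.lastmodified] -->
    </poll>
    <!-- rest of the flow -->
</flow>

With this approach the runtime persists the updated watermark itself after each poll, instead of you writing it back with objectstore:store.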
I fixed it by changing the object store config to: <objectstore:config name="objectStore" partition="mule.watermark" doc:name="ObjectStore: Connector"/>

Using a Team City environment variable to override a project property

I have a C# project property called Version defined as
<Version Condition="$(Version)==''">1.2.3.4</Version>
1.2.3.4 is the default value.
I have a Team City system property, also called Version, set up to override. So in the custom run dialog in Team City, I can specify a value for Version and that value gets used. This works fine.
If I leave the parameter blank in Team City, however, the default value is still overwritten with blank (null?). If I delete the Team City parameter, the default value is used.
Is the condition incorrect? How can I set up the Team City property to be blank, and only override if I enter some value?
Updated answer after OP's comment:
From the docs:
MSBuild allows you to set properties from the command line using the /property or /p command line switch. Property values received from the command line override property values set in the project file and property values inherited from environment variables.
So you can just set a property $(VersionTC) in the TeamCity configuration, then check whether that property is empty and set Version accordingly:
<Version>$(VersionTC)</Version>
<Version Condition="'$(VersionTC)'==''">1.2.3.4</Version>
(so you set Version to VersionTC first, then check whether it is empty and set the default)
Have a look at this blog post explaining all this.
Try something like below:
<Version Condition=" '$(Version)'=='' ">1.2.3.4</Version>
Note the ' ' (single quotes) around $(Version)
Team City is probably still passing the parameter on the command line, just with a blank value, as in,
/p:Version=""
or something similar. The symptom you are seeing is due to how MSBuild deals with overridden properties. When specified on a command line, a property will take that value whether or not it is also declared in a static (global in the file, not inside a target) PropertyGroup declaration. So your declaration of Version, with the Condition being checked for the empty string, is being skipped entirely.
One way around this is to move your PropertyGroup containing the declaration of $(Version), with its Condition, inside the target where it is first used. MSBuild will allow overwriting the value of a command line property from a "dynamic" property created at runtime from within a target.
If you run this command line...
> msbuild My.proj /t:Ver /p:Version=""
...and have this target...
<Target Name="Ver">
  <PropertyGroup>
    <Version Condition="'$(Version)' == ''">1.2.3.4</Version>
  </PropertyGroup>
  <Message Text="Version: '$(Version)'" />
</Target>
... you will get Version showing 1.2.3.4, whereas with the PropertyGroup outside the target, it will retain the empty value.

Birt data source parameters from a property file

I have multiple BIRT reports that obtain their data from the same JDBC data source.
Is it possible to obtain the connection parameters (Driver URL, User Name, and Password) from an external property file or similar?
Once you create a functional data source, you can add that data source to a report library that can be imported and used by all BIRT reports in your system. The source inside the library can have static connection attributes, or you can abstract them using externalized properties.
If you want to externalize the connection info, you will need to tweak the Data source itself. Inside the Data Source Editor, there is a "Property Binding" section that allows you to abstract all the values governing the data connection. From there you can bind the values (using the expression editor) to either report parameters or a properties file.
To bind to a report parameter, use this syntax: params[parametername].value as the expression.
To bind to a properties file, set the Resource file in the Report's top-level properties. From there you can just use the property key value to bind the entry to the Data Source.
Good Luck!
An alternative to @Mystik's good "Property binding" solution is externalizing to a connection profile.
Create a data source (say "DS"), setting up a correct configuration of the parameters to connect to a DB.
Right click on "DS" > Externalize to Connection Profile... > check both options, set a name for the Connection Profile, Ok > set the path and filename where to save the Connection Profile Store (say "reportName.cps"), uncheck Encrypt... (in this way we can modify information in the XML file by hand).
Now we have "reportName.cps", an XML file that we can modify according to the environment where we place our report (development, production, ...). The problem is that "DS" loaded that info statically from "reportName.cps". It loads it dynamically only if it can find "reportName.cps" at the absolute path we specified. So when the environment changes, the file path will be different and the report won't find our file. To tell the report the correct location of the file and load it dynamically, let's write a script:
Set up a beforeOpen script to use the connection profile that is deployed in the resource folder, which can be different for every environment:
var myresourcefolder = reportContext.getDesignHandle().getResourceFolder();
this.setExtensionProperty("OdaConnProfileStorePath", myresourcefolder + "/reportName.cps");
For those struggling to configure a connection profile, the files must look as follows (using PostgreSQL as an example):
db-config-birt.xml (or whatever name)
<?xml version="1.0"?>
<DataTools.ServerProfiles version="1.0">
<profile autoconnect="No" desc="" id="uuid" name="MyPostgreSQL"
providerID="org.eclipse.birt.report.data.oda.jdbc">
<baseproperties>
<property name="odaDriverClass" value="org.postgresql.Driver"/>
<property name="odaURL" value="jdbc:postgresql://XX:5432/YY"/>
<property name="odaPassword" value="zzz"/>
<property name="odaUser" value="abc"/>
</baseproperties>
</profile>
</DataTools.ServerProfiles>
The key points here are:
The XML MUST start with <?xml version="1.0"?> (or <?xml version="1.0" encoding="UTF-8" standalone="no"?>, but when I was using that, I was getting a parsing exception while deploying on Tomcat)
The properties keys MUST be odaDriverClass, odaURL, odaPassword, odaUser (order doesn't matter)
This file must be readable by the server, e.g. chmod 664 this file
If these conditions aren't met, BIRT will throw a laconic:
org.eclipse.birt.report.engine.api.EngineException: An exception occurred during processing. Please see the following message for details:
Cannot open the connection for the driver: org.eclipse.birt.report.data.oda.jdbc.
org.eclipse.birt.report.data.oda.jdbc.JDBCException: Missing properties in Connection.open(Properties). ;
org.eclipse.datatools.connectivity.oda.OdaException: Unable to find or access the named profile (MyPostgreSQL) in profile store path (/opt/tomcat/mytomcat/conf/db-config-birt.xml). ;
org.eclipse.datatools.connectivity.oda.OdaException ;
Then in the report (myreport.rptdesign), the data source in the XML must look like this:
myreport.rptdesign (or whatever name)
<data-sources>
<oda-data-source extensionID="org.eclipse.birt.report.data.oda.jdbc" name="MyPostgreSQL" id="320">
<property name="OdaConnProfileName">MyPostgreSQL</property>
<property name="OdaConnProfileStorePath">/opt/tomcat/mytomcat/conf/db-config-birt.xml</property>
</oda-data-source>
</data-sources>
Obviously, you will adapt the OdaConnProfileStorePath to suit your needs.