How to get SQL with parameter values on an exception - nhibernate

Hard to believe, but I can't seem to find a straight answer for this: how can I get the SQL statement, including the parameter values, when the statement generates an exception, and only when it generates an exception? I know how to log the statement plus parameters for every SQL statement generated, but that's way too much. When there's a System.Data.SqlClient.SqlException, though, it only provides the SQL, not the parameter values. How can I catch it at a point where I have access to that data so that I can log it?

Based on a number of responses to various questions (not just mine), I've cobbled something together that does the trick. I think it could be useful to others as well, so I'm including a good deal of it here:
The basic idea is to:
1. have NH log all queries, pretty-printed and with the parameter values in situ;
2. throw all those logs out except the one just prior to the exception.
I use Log4Net, and the setup is like this:
<?xml version="1.0"?>
<log4net>
  <appender name="RockAndRoll" type="Util.PrettySqlRollingFileAppender, Util">
    <file type="log4net.Util.PatternString">
      <conversionPattern value="%env{Temp}\\%property{LogDir}\\MyApp.log" />
    </file>
    <DatePattern value="MM-dd-yyyy" />
    <appendToFile value="true" />
    <immediateFlush value="true" />
    <rollingStyle value="Composite" />
    <maxSizeRollBackups value="10" />
    <maximumFileSize value="100MB" />
    <staticLogFileName value="true" />
    <layout type="log4net.Layout.PatternLayout">
      <conversionPattern value="%date %-5level %logger - %message%newline" />
    </layout>
  </appender>
  <appender name="ErrorBufferingAppender" type="log4net.Appender.BufferingForwardingAppender">
    <bufferSize value="2" />
    <lossy value="true" />
    <evaluator type="log4net.Core.LevelEvaluator">
      <threshold value="ERROR" />
    </evaluator>
    <appender-ref ref="RockAndRoll" />
    <Fix value="0" />
  </appender>
  <logger name="NHibernate.SQL" additivity="false">
    <appender-ref ref="ErrorBufferingAppender" />
    <level value="debug" />
  </logger>
  <logger name="error-buffer" additivity="false">
    <appender-ref ref="ErrorBufferingAppender" />
    <level value="debug" />
  </logger>
  <root>
    <level value="info" />
    <appender-ref ref="RockAndRoll" />
  </root>
</log4net>
The NHibernate.SQL logger logs all queries to the ErrorBufferingAppender, which keeps discarding them and only holds the most recent one in its buffer. When I catch an exception, I log one line at ERROR level to the error-buffer logger, which passes it to the ErrorBufferingAppender; because the event is at ERROR level, the appender pushes it, along with the last query, out to RockAndRoll, my RollingFileAppender.
I implemented a subclass of RollingFileAppender called PrettySqlRollingFileAppender (which I'm happy to provide if anyone's interested) that takes the parameters from the end of the query and substitutes them inside the query itself, making it much more readable.
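For reference, the catch side is plain log4net usage. A minimal sketch (the repository class and method are just illustrative; the only thing that matters is that the logger name matches the error-buffer logger in the config above):

using System;
using log4net;
using NHibernate;

public class OrderRepository
{
    // The name must match the <logger name="error-buffer"> entry in the config above.
    private static readonly ILog ErrorBufferLog = LogManager.GetLogger("error-buffer");

    public void Save(ISession session, object entity)
    {
        try
        {
            session.Save(entity);
            session.Flush();
        }
        catch (Exception ex)
        {
            // An ERROR-level event trips the LevelEvaluator, so the
            // BufferingForwardingAppender flushes this line together with the
            // last buffered NHibernate.SQL statement out to RockAndRoll.
            ErrorBufferLog.Error("Database operation failed", ex);
            throw;
        }
    }
}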

If you are using NHibernate for querying the DB (as the tag on your question suggests), and your SQL dialect/driver relies on ADO.NET, you should get a GenericADOException from the failing query.
Its Message property normally already includes the parameter values.
For example, executing the following failing query (provided you have at least one row in the DB):
var result = session.Query<Entity>()
    .Where(e => e.Name.Length / 0 == 1)
    .ToList();
Yields a GenericADOException with message:
could not execute query
[ select entity0_.Id as Id1_0_, entity0_.Name as Name2_0_ from Entity entity0_ where len(entity0_.Name)/#p0=#p1 ]
Name:p1 - Value:0 Name:p2 - Value:1
The two literals, 0 and 1, in the query have been parameterized and their values are included in the message (note the index base mismatch: in the parameter listing they are 1-based, while in the generated SQL, with my setup, they end up 0-based).
So there is nothing special to do to get them: just log the exception message.
Have you just missed it, or were you asking something else?
Your question was not explicit enough, in my opinion. You should include an MCVE; it would have shown me more precisely in which cases you were not able to get those parameter values.
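To make the point concrete, here is a minimal sketch of catching the exception and logging its message (Entity is the mapped class from the example above; the surrounding class and logger name are invented for illustration):

using System.Linq;
using log4net;
using NHibernate;
using NHibernate.Exceptions;
using NHibernate.Linq;

public static class FailingQueryDemo
{
    private static readonly ILog Log = LogManager.GetLogger(typeof(FailingQueryDemo));

    public static void Run(ISession session)
    {
        try
        {
            // The same deliberately failing query as above.
            var result = session.Query<Entity>()
                .Where(e => e.Name.Length / 0 == 1)
                .ToList();
        }
        catch (GenericADOException ex)
        {
            // ex.Message already contains the generated SQL and the parameter
            // values, so logging the message is all that is needed.
            Log.Error(ex.Message, ex);
            throw;
        }
    }
}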

Related

Name of the user who has processed the cube

There is a piece of code in which I have to get the username of the user who has processed the cube or has made any changes to the cube structure.
I have searched the SSAS DMVs, but didn't find what I needed; I only found the last processed time, not the name of the user.
Any suggestions?
You can track this using an Extended Event. Add the ProgressReportBegin and ProgressReportEnd events which are for processing. These events include the NTUserName and StartTime fields which you can use to find who processed the cube and when. The Extended Event will need to be running when the cube is processed to capture this. The following is an example XMLA command which can be run when connected to your SSAS database in SSMS to create an Extended Event that tracks cube processing and outputs the results to a file. Of course many of these options are just defaults and you may want to make adjustments as necessary.
https://learn.microsoft.com/en-us/sql/analysis-services/instances/monitor-analysis-services-with-sql-server-extended-events?view=sql-server-2017
<Create xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <ObjectDefinition>
    <Trace>
      <ID>XE_Cube_Process</ID>
      <Name>XE_Cube_Process</Name>
      <XEvent xmlns="http://schemas.microsoft.com/analysisservices/2011/engine/300/300">
        <event_session name="XE_Cube_Process" dispatchLatency="0" maxEventSize="0" maxMemory="4" memoryPartition="none" eventRetentionMode="AllowSingleEventLoss" trackCausality="true" xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
          <event package="AS" name="ProgressReportEnd" />
          <event package="AS" name="ProgressReportBegin" />
          <target package="package0" name="event_file">
            <parameter name="filename" value="C:\Test\XE_Cube_Process.xel" />
            <parameter name="max_file_size" value="4096" />
            <parameter name="max_rollover_files" value="10" />
            <parameter name="increment" value="1024" />
          </target>
        </event_session>
      </XEvent>
    </Trace>
  </ObjectDefinition>
</Create>

Mondrian/Saiku - Closure Table - Null Pointer Exception

I am currently doing a PoC and facing a problem with a closure table. I am using Saiku CE and the database is Postgres. Everything works until I add a closure table.
If I remove the closure table hierarchy, I don't get any error; if I keep it, I get the error. I created my demo schema from the Foodmart.xml that I downloaded from Saiku itself.
Some forums suggested that it's an open bug in Mondrian, but if it is, why does the same syntax work with Foodmart? Is it a problem with Saiku CE? If I use Saiku EE (trial version) for my PoC, will it work?
11:54:17,900 WARN [RolapUtil] Mondrian: Warning: JDBC driver sun.jdbc.odbc.JdbcOdbcDriver not found
11:54:17,902 WARN [RolapUtil] Mondrian: Warning: JDBC driver oracle.jdbc.OracleDriver not found
11:54:18,728 ERROR [SecurityAwareConnectionManager] Error connecting: ersdemods
java.lang.NullPointerException
<Dimension name="Organisation" key="Org Id">
  <Attributes>
    <Attribute name="Par Org" table="org_organisation" keyColumn="parent_id" />
    <Attribute name="Org Id" table="org_organisation" keyColumn="id" nameColumn="name" />
    <Attribute name='Country Name' table='org_organisation' keyColumn='country' hasHierarchy='false' />
    <Attribute name='County Name' table='org_organisation' hasHierarchy='false'>
      <Key>
        <Column name='country' />
        <Column name='county' />
      </Key>
      <Name>
        <Column name='county' />
      </Name>
    </Attribute>
    <Attribute name='City Name' table='org_organisation' keyColumn='city' hasHierarchy='false' />
  </Attributes>
  <Hierarchies>
    <Hierarchy name="Organisations" allMemberName="All Organisations">
      <Level attribute="Org Id" parentAttribute="Par Org" nullParentValue="NULL">
        <Closure table='organisation_closure' parentColumn="closure_parent_org_id" childColumn="org_id" />
      </Level>
    </Hierarchy>
    <Hierarchy name='Oragnisation Location' allMemberName='All Org Location'>
      <Level attribute='Country Name' />
      <Level attribute='County Name' />
      <Level attribute='City Name' />
    </Hierarchy>
  </Hierarchies>
</Dimension>
Managed to fix this issue. The dimension definition was correct; however, a dimension with a closure table should be declared within the cube.
If you declare it outside of the cube, you get this unhelpful error.
Looks like a bug in Mondrian 4.
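For illustration, a rough sketch of that placement (schema and cube names are invented and the elided parts are placeholders; only the nesting of the Dimension inside the Cube matters):

<Schema name='Demo' metamodelVersion='4.0'>
  <PhysicalSchema>
    <!-- table definitions, including org_organisation and organisation_closure -->
  </PhysicalSchema>
  <Cube name='Organisations'>
    <Dimensions>
      <!-- the Organisation dimension from the question goes here, unchanged -->
      <Dimension name='Organisation' key='Org Id'>
        <!-- Attributes and Hierarchies as posted above, including the Closure level -->
      </Dimension>
    </Dimensions>
    <!-- MeasureGroups etc. -->
  </Cube>
</Schema>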

Logging query execution time in Eclipselink

I'd like to configure EclipseLink to log executed SQL queries and query execution time.
In persistence.xml I have the following properties set:
<property name="eclipselink.logging.level" value="ALL" />
<property name="eclipselink.logging.level.sql" value="ALL"/>
<property name="eclipselink.logging.parameters" value="true"/>
<property name="eclipselink.logging.timestamp" value="true" />
I can see the SQL queries, bound parameters, and the execution timestamp, but not the execution time.
Here is an example of the messages I see in the log:
[EL Fine]: sql: 2016-07-14 17:22:40.114--Connection(1201360998)--SELECT ID, a, b, c FROM my_table WHERE (id = ?)
bind => [4]
Is there any way to get this information into the logs?

Logback: One file with maximum file size

My system support team needs one simple log file, with a maximum size of 10MB. Older log lines can be deleted when the file reaches 10MB, so the oldest lines roll out.
What is a good appender for this?
I have an appender, but it still creates a second file and then starts again with an empty new file. This is not what my support team wants.
Help is appreciated.
<configuration>
  <appender name="TEST" class="ch.qos.logback.core.rolling.RollingFileAppender">
    <file>${LOG_HOME}/test.log</file>
    <rollingPolicy class="ch.qos.logback.core.rolling.FixedWindowRollingPolicy">
      <fileNamePattern>${LOG_HOME}/test.%i.log</fileNamePattern>
      <minIndex>1</minIndex>
      <maxIndex>1</maxIndex>
    </rollingPolicy>
    <triggeringPolicy class="ch.qos.logback.core.rolling.SizeBasedTriggeringPolicy">
      <maxFileSize>10MB</maxFileSize>
    </triggeringPolicy>
    <encoder>
      <pattern>%date %-5level [%thread] - %mdc{loginName} - [%logger]- %msg%n</pattern>
    </encoder>
  </appender>
  <root level="DEBUG">
    <appender-ref ref="TEST" />
  </root>
</configuration>
Keeping everything in a single file, constantly adding the most recent lines while deleting the oldest, is going to perform really, really poorly. I suspect that logback can't be made to do this.
What I suggest is that you use the regular size-based policy, configure it to stay inside your 10MB limit overall, and then just concatenate the files when you grab them.
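For example, a rough sketch of that approach: four archived files plus the active one, each capped at roughly 2MB, so the total stays near the 10MB budget (paths, names and sizes are just placeholders):

<configuration>
  <appender name="TEST" class="ch.qos.logback.core.rolling.RollingFileAppender">
    <file>${LOG_HOME}/test.log</file>
    <rollingPolicy class="ch.qos.logback.core.rolling.FixedWindowRollingPolicy">
      <fileNamePattern>${LOG_HOME}/test.%i.log</fileNamePattern>
      <minIndex>1</minIndex>
      <!-- 4 archives + the active file = 5 files of about 2MB each -->
      <maxIndex>4</maxIndex>
    </rollingPolicy>
    <triggeringPolicy class="ch.qos.logback.core.rolling.SizeBasedTriggeringPolicy">
      <maxFileSize>2MB</maxFileSize>
    </triggeringPolicy>
    <encoder>
      <pattern>%date %-5level [%thread] - %msg%n</pattern>
    </encoder>
  </appender>
  <root level="DEBUG">
    <appender-ref ref="TEST" />
  </root>
</configuration>

When the support team needs a single file, something like cat test.4.log test.3.log test.2.log test.1.log test.log > combined.log stitches the pieces back together, oldest lines first.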

How to get real sql from NHibernate with Log4Net

I decided to learn how to use an ORM to avoid having to learn much SQL (a mistake -- the only real way to avoid SQL would be to go NoSQL).
I have been able to get the NHibernate "SQL" using Log4Net, following the instructions that are duplicated in quite a few blogs. I get "SQL" like this:
NHibernate.Loader.Loader: 2011-11-11 15:03:14,348 [9] INFO NHibernate.Loader.Loader [(null)] - SELECT this_.RegionID as RegionID9_0_, this_.RegionDescription as RegionDe2_9_0_ FROM Region this_
Now correct me if I am wrong, but that is not SQL, and I can't understand why all of these blogs talk as if it is.
The strange thing is that earlier, when I was messing around with log4net, I am sure I was able to get ordinary SQL logged to a log file. When I basically did a GetAll() of an entity (read a whole table), all of the individual queries were listed there with the id in the query - one for each row (entity). I definitely didn't imagine this. Can anyone tell me how this is done with log4net? Here is my config right now:
<log4net>
  <appender name="DebugSQL" type="log4net.Appender.TraceAppender">
    <layout type="log4net.Layout.PatternLayout">
      <conversionPattern value="%date [%thread] %-5level %logger [%property{NDC}] - %message%newline" />
    </layout>
  </appender>
  <appender name="rollingFile" type="log4net.Appender.RollingFileAppender,log4net">
    <param name="File" value="log.txt" />
    <param name="AppendToFile" value="true" />
    <param name="DatePattern" value="yyyy.MM.dd" />
    <layout type="log4net.Layout.PatternLayout,log4net">
      <conversionPattern value="%d %p %m%n" />
    </layout>
  </appender>
  <logger name="NHibernate.Loader.Loader" additivity="false">
    <level value="All" />
    <appender-ref ref="DebugSQL" />
  </logger>
Edit: I now know that it was SQL, and I couldn't reproduce the SQL I had seen emitted earlier because lazy loading had been on before:
NHibernate.SQL: SELECT region_.RegionDescription as RegionDe2_9_ FROM Region region_ WHERE region_.RegionID=#p0;#p0 = 1 [Type: Int32 (0)]
NHibernate.SQL: SELECT region_.RegionDescription as RegionDe2_9_ FROM Region region_ WHERE region_.RegionID=#p0;#p0 = 2 [Type: Int32 (0)]
NHibernate.SQL: SELECT region_.RegionDescription as RegionDe2_9_ FROM Region region_ WHERE region_.RegionID=#p0;#p0 = 3 [Type: Int32 (0)]
NHibernate.SQL: SELECT region_.RegionDescription as RegionDe2_9_ FROM Region region_ WHERE region_.RegionID=#p0;#p0 = 4 [Type: Int32 (0)]
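For reference, per-statement output like the above comes from the NHibernate.SQL logger; a minimal logger entry for it (wired here to the rollingFile appender defined earlier -- adjust as needed) might look like this:

<logger name="NHibernate.SQL" additivity="false">
  <level value="DEBUG" />
  <appender-ref ref="rollingFile" />
</logger>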
What you see after the dash in the log is indeed SQL. It is syntactically and semantically correct, but it just looks plain ugly. This is commonplace when SQL is not written manually: code generators use names such as RegionID9_0_ for disambiguation, making the output look unusual to a human reader.
Change the conversionPattern on the 4th line from:
%date [%thread] %-5level %logger [%property{NDC}] - %message%newline
to
%message%newline
and you will log only the SQL (it's the value of %message).
The following line, from your log, is SQL:
SELECT this_.RegionID as RegionID9_0_, this_.RegionDescription as RegionDe2_9_0_ FROM Region this_