NHibernate - How to log a named parameterised query with parameter values?

I have a parameterised named query like this:
Query moveOutQuery = session.createSQLQuery(moveOutQueryStr.toString())
.addEntity(MyClass.class)
.setParameter("assignmentStatus", Constants.CHECKED_OUT)
I want to see the actual SQL query with the parameter values filled in. However, while debugging I only get the following query:
Select * from my_assignment WHERE assignment_status in ( :assignmentStatus )
Why isn't the assignmentStatus being substituted for its real value?

Why isn't the assignmentStatus being substituted for its real value?
This is because NHibernate uses query parameters to pass in values. This is efficient in many cases and also helps guard against SQL injection attacks. The SQL text and the parameter values are sent to the database separately; when SQL logging is enabled as explained below, you will find the values listed at the bottom of each logged statement.
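Conceptually, the command NHibernate builds is equivalent to the following plain ADO.NET sketch; the connection variable and the literal value standing in for Constants.CHECKED_OUT are assumptions for illustration only:
// Requires System.Data; 'connection' is assumed to be an open IDbConnection.
// The SQL text keeps the placeholder; the value travels separately as a command parameter.
using (IDbCommand cmd = connection.CreateCommand())
{
    cmd.CommandText = "SELECT * FROM my_assignment WHERE assignment_status IN (@p0)";
    IDbDataParameter p = cmd.CreateParameter();
    p.ParameterName = "@p0";
    p.Value = "CHECKED_OUT"; // hypothetical value of Constants.CHECKED_OUT
    cmd.Parameters.Add(p);
    // Executing the command sends the statement and the value as separate pieces,
    // which is why the debugger shows only ':assignmentStatus' in the SQL string.
}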
You can log each SQL statement to a file as explained below.
This is implemented through log4net.dll; you need to add a reference to it.
Add namespaces as below:
using log4net;
using log4net.Appender;
using log4net.Core;
using log4net.Layout;
using log4net.Repository.Hierarchy;
Configure log4net in NHibernate as below:
// Get log4net's repository and remove any existing appenders
Hierarchy hierarchy = (Hierarchy)LogManager.GetRepository();
hierarchy.Root.RemoveAllAppenders();
// File appender that writes each logged statement to logFilePath
FileAppender fileAppender = new FileAppender();
fileAppender.Name = "NHFileAppender";
fileAppender.File = logFilePath;
fileAppender.AppendToFile = true;
fileAppender.LockingModel = new FileAppender.MinimalLock();
fileAppender.Layout = new PatternLayout("%d{yyyy-MM-dd HH:mm:ss}:%m%n%n");
fileAppender.ActivateOptions();
// Attach the appender to the NHibernate.SQL logger at DEBUG level
Logger logger = hierarchy.GetLogger("NHibernate.SQL") as Logger;
logger.Additivity = false;
logger.Level = Level.Debug;
logger.AddAppender(fileAppender);
hierarchy.Configured = true;
You also need to set ShowSql in the NHibernate configuration, as below:
configuration.SetProperty(NHibernate.Cfg.Environment.ShowSql, "true");
configuration.SetProperty(NHibernate.Cfg.Environment.FormatSql, "true");
You need to call this code once at the startup of your application. The output log includes the parameter values as well.
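For illustration, here is a minimal sketch of that startup wiring; ConfigureSqlFileLogging is a hypothetical helper that wraps the log4net code shown above, and the log file path is only an example:
// At application startup, before building the session factory:
var configuration = new NHibernate.Cfg.Configuration().Configure();
configuration.SetProperty(NHibernate.Cfg.Environment.ShowSql, "true");
configuration.SetProperty(NHibernate.Cfg.Environment.FormatSql, "true");
ConfigureSqlFileLogging(@"C:\logs\nhibernate-sql.log"); // hypothetical helper containing the log4net setup above
ISessionFactory sessionFactory = configuration.BuildSessionFactory();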
Following is an example query:
session.CreateSQLQuery("SELECT * FROM MyEntity WHERE MyProperty = :MyProperty")
.AddEntity(typeof(MyEntity))
.SetParameter("MyProperty", "filterValue")
.UniqueResult<MyEntity>();
Following is the logged query:
2020-01-09 14:25:39:
SELECT
*
FROM
MyEntity
WHERE
MyProperty = @p0;
@p0 = 'filterValue' [Type: String (4000:0:0)]
As you can see, the parameter value filterValue is listed at the bottom.
This works for all query APIs like IQueryOver, IQuery, ISQLQuery etc.
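For instance, a QueryOver version of the same lookup produces SQL that is logged in exactly the same way; this sketch assumes MyEntity is a mapped class with a string MyProperty:
MyEntity result = session.QueryOver<MyEntity>()
    .Where(e => e.MyProperty == "filterValue")
    .SingleOrDefault();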
This logs both successful and failed statements. You can play with the FileAppender and Logger classes to meet any additional requirements.
Also refer to PatternLayout in the log4net documentation.
Following Q/A may also help:
Get executed SQL from nHibernate
Using log4net to write to different loggers
How to log SQL calls with NHibernate to the console of Visual Studio?
As you can see, this logs the parameter values at the bottom of the query rather than substituted into the SQL text itself.

Related

Groovy SQL Multiple ResultSets

I am calling a stored procedure from my Groovy code. The stored proc looks like this:
SELECT * FROM blahblahblah
SELECT * FROM suchAndsuch
So basically, two SELECT statements and therefore two ResultSets.
sql.eachRow("dbo.testing 'param1'"){ rs ->
println rs
}
This works fine for a single ResultSet. How can I get the second one (or an arbitrary number of ResultSets, for that matter)?
You would need callWithAllRows() or its variant.
The return type of this method is List<List<GroovyRowResult>>.
Use this when calling a stored procedure that utilizes both output
parameters and returns multiple ResultSets.
This question is kind of old, but I will answer since I came across the same requirement recently, and it may be useful for future reference for me and others.
I'm working on a Spring application with SphinxSearch. When you run a query in Sphinx, you get results, and then you need to run a second query to get metadata such as the number of records found.
// the query
String query = """
SELECT * FROM INDEX_NAME WHERE MATCH('SEARCHTERM')
LIMIT 0,25 OPTION MAX_MATCHES=25;
SHOW META LIKE 'total_found';
"""
// create an instance of our groovy sql (sphinx doesn't use a username or password, jdbc url is all we need)
// connection can be created from java, don't have to use groovy for it
Sql sql = Sql.newInstance('jdbc:mysql://127.0.0.1:9306/?characterEncoding=utf8&maxAllowedPacket=512000&allowMultiQueries=true','sphinx','sphinx123','com.mysql.jdbc.Driver')
// create a prepared statement so we can execute multiple resultsets
PreparedStatement ps = sql.getConnection().prepareStatement(query)
// execute the prepared statement
ps.execute()
// get the first result set and pass to GroovyResultSetExtension
GroovyResultSetExtension rs1 = new GroovyResultSetExtension(ps.getResultSet())
rs1.eachRow {
println it
}
// call getMoreResults on the prepared statement to activate the 2nd set of results
ps.getMoreResults()
// get the second result set and pass to GroovyResultSetExtension
GroovyResultSetExtension rs2 = new GroovyResultSetExtension(ps.getResultSet())
rs2.eachRow {
println it
}
This is just some test code and needs some improvement; you can loop over the result sets and do whatever processing you need.
The comments should be self-explanatory. Hope it helps others in the future!

Is it possible to use PostgreSQL's EXPLAIN to analyse queries generated by Grails GORM?

I found and have used a closure that temporarily turns hibernate.SQL logging up to TRACE, which lets me see the exact queries that are generated. However, I would like to have PostgreSQL's EXPLAIN run automatically instead of having to pull out queries individually for analysis.
logging closure:
(found here: http://www.intelligrape.com/blog/2011/10/21/log-sql-in-grails-for-a-piece-of-code/)
public static def execute(Closure closure) {
Logger sqlLogger = Logger.getLogger("org.hibernate.SQL");
Logger transactionLogger = Logger.getLogger("org.hibernate.transaction");
Level currentLevel = sqlLogger.level
Level transLevel = transactionLogger.level
sqlLogger.setLevel(Level.TRACE)
transactionLogger.setLevel(Level.TRACE)
def result = closure.call()
sqlLogger.setLevel(currentLevel)
transactionLogger.setLevel(transLevel)
result
}
usage:
def result
execute{
result=Dog.createCriteria().list{
eq("breed","Greyhound")
}
}
I would like something that can be used in a similar way.
Is this something I could do with a sub-class of Criteria or Hibernate.Restrictions ?
Or is there something I'm missing in the docs on how to modify the SQL statement that is sent to the DB from GORM?
Thanks for any info.
Assuming you don't want all queries, but only the longer-running ones, you might find PostgreSQL's auto_explain module useful:
http://www.postgresql.org/docs/9.1/static/auto-explain.html

NHibernate not finding named query result sets in 2nd level cache

I have a simple unit test where I execute the same NHibernate named query two times (in a different session each time) with an identical parameter. It's a simple int parameter, and since my query is a named query, I assume these two calls are identical and the results should be cached.
In fact, I can see in my log that the results ARE being cached, but with different keys. So, my 2nd query results are never found in cache.
Here's a snippet from my log (note how the keys are different):
(first query)
DEBUG NHibernate.Caches.SysCache2.SysCacheRegion [(null)] <(null)> -
adding new data: key= [snipped]... parameters: ['809']; named
parameters: {}#743460424 &
value=System.Collections.Generic.List`1[System.Object]
(second query)
DEBUG NHibernate.Caches.SysCache2.SysCacheRegion [(null)] <(null)> -
adding new data: key=[snipped]... parameters: ['809']; named
parameters: {}#704749285 &
value=System.Collections.Generic.List`1[System.Object]
I have NHibernate set up to use the query cache, and I have these queries set to cacheable=true. I don't know where else to look. Does anyone have any suggestions?
Thanks
-Mike
Okay, I figured this out. I was executing my named query using the following syntax:
IQuery q = session.GetNamedQuery("MyQuery")
.SetResultTransformer(Transformers.AliasToBean(typeof(MyDTO)))
.SetCacheable(true)
.SetCacheRegion("MyCacheRegion");
(which, I might add, is EXACTLY how the NHibernate docs tell you to do it... but I digress ;) )
If you create a new AliasToBean transformer for every query, then each query object (which is the key to the cache) will be unique and you will never get a cache hit. So, in short, if you do it the way the NHibernate docs say, caching won't work.
Instead, create your transformer once in a static member variable and then use that for your query, and caching will work, like this:
private static IResultTransformer myTransformer = Transformers.AliasToBean(typeof(MyDTO));
...
IQuery q = session.GetNamedQuery("MyQuery")
.SetResultTransformer(myTransformer)
.SetCacheable(true)
.SetCacheRegion("MyCacheRegion");

Can I pretty-print the DBIC_TRACE output in DBIx::Class?

Setting the DBIC_TRACE environment variable to true:
BEGIN { $ENV{DBIC_TRACE} = 1 }
generates very helpful output, especially showing the SQL query that is being executed, but the SQL query is all on one line.
Is there a way to push it through some kind of "SQL tidy" routine to format it better, perhaps breaking it up over multiple lines? Failing that, could anyone give me a nudge towards where in the code I'd need to hook in to add such a feature? And what is the best tool to accept a badly formatted SQL query and emit a nicely formatted one?
"Nice formatting" in this context simply means better than "all on one line". I'm not particularly fussed about specific styles of formatting queries.
Thanks!
As of DBIx::Class 0.08124 it's built in.
Just set $ENV{DBIC_TRACE_PROFILE} to console or console_monochrome.
From the documentation of DBIx::Class::Storage
If DBIC_TRACE is set then trace information is produced (as when the
debug method is set). ...
debug Causes trace information to be emitted on the debugobj
object. (or STDERR if debugobj has not specifically been set).
debugobj Sets or retrieves the object used for metric collection.
Defaults to an instance of DBIx::Class::Storage::Statistics that is
compatible with the original method of using a coderef as a callback.
See the aforementioned Statistics class for more information.
In other words, you should set debugobj in that class to an object that subclasses DBIx::Class::Storage::Statistics. In your subclass, you can reformat the query the way you want it to be.
First, thanks for the pointers! Partial answer follows ....
What I've got so far ... first some scaffolding:
# Connect to our db through DBIx::Class
my $schema = My::Schema->connect('dbi:SQLite:/home/me/accounts.db');
# See also BEGIN { $ENV{DBIC_TRACE} = 1 }
$schema->storage->debug(1);
# Create an instance of our subclassed (see below)
# DBIx::Class::Storage::Statistics class
my $stats = My::DBIx::Class::Storage::Statistics->new();
# Set the debugobj object on our schema's storage
$schema->storage->debugobj($stats);
And the definition of My::DBIx::Class::Storage::Statistics being:
package My::DBIx::Class::Storage::Statistics;
use base qw<DBIx::Class::Storage::Statistics>;
use Data::Dumper qw<Dumper>;
use SQL::Statement;
use SQL::Parser;
sub query_start {
my ($self, $sql_query, @params) = @_;
print "The original sql query is\n$sql_query\n\n";
my $parser = SQL::Parser->new();
my $stmt = SQL::Statement->new($sql_query, $parser);
#printf "%s\n", $stmt->command;
print "The parameters for this query are:";
print Dumper \@params;
}
This solves the problem of how to hook in and get hold of the SQL query so I can pretty-print it.
Then I run a query:
my $rs = $schema->resultset('SomeTable')->search(
{
'email' => $email,
'others.some_col' => 1,
},
{ join => 'others' }
);
$rs->count;
However SQL::Parser barfs on the SQL generated by DBIx::Class:
The original sql query is
SELECT COUNT( * ) FROM some_table me LEFT JOIN others other_table ON ( others.some_col_id = me.id ) WHERE ( others.some_col_id = ? AND email = ? )
SQL ERROR: Bad table or column name '(others' has chars not alphanumeric or underscore!
SQL ERROR: No equijoin condition in WHERE or ON clause
So ... is there a better parser than SQL::Parser for the job?

SQL Server Profiler not showing LINQ to SQL queries

I am trying to view SQL generated by Linq to SQL in the SQL Server Profiler (2005).
I can see the SQL sent to the server from everything except LINQ to SQL.
I'm betting that I need to change the event selections for the trace, but I'm not sure what else to select.
I am currently only selecting this:
SQL:StmtCompleted - TextData & SPID
I don't want to use data context logging nor the SQL Debug Visualizer. I need to use the profiler.
Why can I not see the LINQ to SQL queries?
Thanks.
EDIT
I added SQL:BatchCompleted and that hasn't helped.
EDIT 2
I added the event RPC:Completed which is found under the Stored Procedures category in event selection. This worked!
You need the RPC:Completed event; the queries are executed via sp_executesql.
Are you including enough of the options in the SQL Profiler to see the BatchCompleted events, too?
Marc
There is also an option on the DataContext class to enable logging on the client side. When logging is enabled, it is possible to see the queries.
See this link:
http://www.davidhayden.com/blog/dave/archive/2007/08/17/DataContextLogLoggingLINQToSQLOutputConsoleDebuggerOuputWindow.aspx
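As a minimal sketch of that approach: Log is the standard DataContext property that accepts a TextWriter, while NorthwindDataContext and Customer are hypothetical types used here only for illustration:
using (var db = new NorthwindDataContext()) // hypothetical DataContext subclass
{
    db.Log = Console.Out; // generated SQL and parameter values are written to the console
    var londonCustomers = db.GetTable<Customer>()
        .Where(c => c.City == "London") // requires System.Linq
        .ToList();
}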
I had the same problem and none of the solutions above worked for me.
What worked for me was adding a ToList() call to the query, which forces it to execute immediately.
Before:
var data = null == id ?
(from ...
select new
{
...
})
:
(from ..
select new
{
...
});
After:
var data = null == id ?
(from ...
select new
{
...
}).ToList()
:
(from ..
select new
{
...
}).ToList();
foreach (var obj in data)
{
    xxx = obj.somename; // now you can see the SQL query in Profiler
}