From an ICriteria, is it possible to retrieve a string containing the SQL that NHibernate is planning on executing?
I know that it is possible to receive a trace, but I was wondering if there is a method that can be called that generates the SQL (for example, so you don't have to actually flush to the database).
It's not directly exposed anywhere. Keep in mind the generated SQL is dialect, driver and batcher dependent, so generation of the final SQL occurs late in the pipeline.
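That said, if you don't mind taking a dependency on NHibernate internals (which change between versions), people have pulled the SQL string out of a CriteriaLoader along these lines. This is a rough sketch, not an official API, and the exact types and constructor signature depend on your NHibernate version:

    using NHibernate;
    using NHibernate.Impl;
    using NHibernate.Loader.Criteria;
    using NHibernate.Persister.Entity;

    // Rough sketch relying on NHibernate internals; verify against your version.
    public static string GetGeneratedSql(ICriteria criteria)
    {
        var criteriaImpl = (CriteriaImpl)criteria;
        var sessionImpl = (SessionImpl)criteriaImpl.Session;
        var factory = (SessionFactoryImpl)sessionImpl.Factory;

        // Resolve the persister for the entity the criteria targets.
        var implementors = factory.GetImplementors(criteriaImpl.EntityOrClassName);
        var persister = (IOuterJoinLoadable)factory.GetEntityPersister(implementors[0]);

        // The loader holds the SQL that would be sent to the database.
        var loader = new CriteriaLoader(persister, factory, criteriaImpl,
                                        implementors[0], sessionImpl.EnabledFilters);

        return loader.SqlString.ToString();
    }

Note this gives you the SQL before any driver/batcher processing, and parameter values are not substituted in.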
NHibernate Profiler works great for us. There's a trial available at nhprof.com/
Edit: NHProf attaches itself to the connection between your machine and the database and captures any SQL passing by, along with the number of results and the time spent on fetching and processing. NHProf also gives you all sorts of advice that will improve performance.
I'm designing a UWP app that uses an SQLite database to store its information. From previous research I have learnt that the SQLiteConnection.Update() and SQLiteConnection.Insert() functions are safe to use, as the inputs are sanitised before entering the database.
The next step I need to do is sync that data with an online database - in this case SQL Server - using a service layer as my go-between. Given that the data was previously sanitised by the SQLite insert, do I still need to parameterise the object values in the service layer before they are passed to my SQL Server database?
The simple assumption says yes because, despite being sanitised on the SQLite side, they are technically still raw strings that could affect the main database if not parameterised when sent there.
Should I just simply employ the idea of "If in doubt, parameterise" ?
I would say that you should always use SQL parameters. There are a few reasons why you should do so:
Security. Parameterised commands protect against SQL injection regardless of any sanitising done earlier in the pipeline.
Performance. If you use parameters the reuse of execution plans could increase. For details see this article.
Reliability. It is always easier to make a mistake if you build SQL commands by concatenating strings.
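For illustration, a minimal sketch of what the service-layer write might look like with parameters (the table and column names here are made up):

    using System.Data.SqlClient;

    // Minimal sketch; table and column names are illustrative only.
    public static void SaveNotification(string connectionString, int userId, string message)
    {
        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(
            "INSERT INTO Notifications (UserId, Message) VALUES (@userId, @message);",
            connection))
        {
            // Values travel as typed parameters and are never concatenated into the SQL text.
            command.Parameters.AddWithValue("@userId", userId);
            command.Parameters.AddWithValue("@message", message);

            connection.Open();
            command.ExecuteNonQuery();
        }
    }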
When I troubleshoot a large .NET app that uses only stored procedures, I capture the SQL (which includes the SP name) from SQL Server Profiler, and then it's easy to do a global search for the SP in the source files and find the exact line that produced the SQL.
When using Entity Framework this is not possible, due to the dynamic creation of SQL statements. However, there are times when I capture problematic SQL statements from production and want to know where in the code they were generated.
I know one can have EF generate logs and tracing on demand. This would probably be taxing for a busy server and produce too many logs. I read some stuff about using MiniProfiler, but I'm not sure it fits my needs as I don't have access to the production server. I do, however, have access to attach SQL Server Profiler to the database server.
My idea is to have EF attach/inject a unique code into the generated SQL without affecting the outcome of the SQL. I can then use it to cross-reference the statement to the line of code that injected it. The unique code is static, meaning a distinct static code is used for each EF LINQ statement. Maybe it could be sent as a dummy SQL statement or as a comment along with the real one.
I know this will add some extra traffic, but in my case it will add extra flexibility and cut a lot of troubleshooting time.
Any ideas of how to do this or any alternatives?
One very simple approach would be to execute something via ExecuteStoreCommand(): Refresh data from stored procedure. I'm not sure if you can "execute" just a comment, but at the very least you should be able to do something like:
ExecuteStoreCommand("DECLARE #MyTag VARCHAR(100) = 'some_unique_id';");
This is very simple, but you would have to find the association in two steps:
Get the SessionID (i.e. SPID) from the poorly performing query in SQL Server Profiler
Search the Profiler entries for the prior SQL statement for that same SPID
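For context, a sketch of how the tag call might sit next to the query it labels. The tag value and the Order entity are made up, and this assumes an ObjectContext since that is where ExecuteStoreCommand lives (with DbContext the rough equivalent is Database.ExecuteSqlCommand):

    using System.Collections.Generic;
    using System.Data.Objects;   // ObjectContext (System.Data.Entity.Core.Objects in EF6)
    using System.Linq;

    // Illustrative sketch: opening the connection explicitly keeps the tag command
    // and the query on the same connection, and therefore the same SPID in the trace.
    public static List<Order> FindLargeOrders(ObjectContext objectContext)
    {
        objectContext.Connection.Open();

        objectContext.ExecuteStoreCommand(
            "DECLARE @MyTag VARCHAR(100) = 'OrderSearch_v1';");

        return objectContext.CreateObjectSet<Order>()
                            .Where(o => o.Total > 1000)
                            .ToList();
    }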
Another option that might be a little more complicated but would remove that additional step when it comes to making that association is to "intercept" the commands before they get executed and inject a comment with your unique id. Please see the following S.O. Answer for details. You shouldn't need the full extent of what they did, but even if you do, it seems like all of the code (or all the relevant stuff) is there:
Adding a query hint when calling Table-Valued Function
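If you're on EF6, the interception hook that the linked answer builds on looks roughly like this. It is a sketch only; the tag here is a single hard-coded string, and mapping tags back to call sites is something you'd have to design yourself:

    using System.Data.Common;
    using System.Data.Entity.Infrastructure.Interception;

    // Sketch of an EF6 interceptor that prepends a harmless comment to each command.
    public class TaggingCommandInterceptor : DbCommandInterceptor
    {
        private const string Tag = "/* tag: OrderSearch_v1 */ ";   // illustrative value

        public override void ReaderExecuting(
            DbCommand command,
            DbCommandInterceptionContext<DbDataReader> interceptionContext)
        {
            command.CommandText = Tag + command.CommandText;
            base.ReaderExecuting(command, interceptionContext);
        }

        public override void NonQueryExecuting(
            DbCommand command,
            DbCommandInterceptionContext<int> interceptionContext)
        {
            command.CommandText = Tag + command.CommandText;
            base.NonQueryExecuting(command, interceptionContext);
        }
    }

    // Registered once at startup:
    // DbInterception.Add(new TaggingCommandInterceptor());

The comment survives into the Profiler trace, so you can search for it there much as you would for a stored procedure name.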
By the way, this situation is a point in favor of using Stored Procedures instead of an ORM. And, what do you expect to be able to do in terms of performance tuning once you do find the offending app code? (another point in favor of using Stored Procedures instead of an ORM ;-).
I have Windows Server 2008 R2 with Microsoft SQL Server installed.
In my application, I am currently designing a tool for my users that queries the database to see if a user has any notifications. Since my users can access the application multiple times in a short timespan, I was thinking about putting some kind of cache on my query logic. But then I thought that MS SQL Server probably already does that for me. Am I right? Or do I need to configure something to make it happen? If it does, for how long does it keep the cache?
It's safe to assume that MSSQL has the caching worked out pretty well =)
Don't bother trying to build anything yourself on top of it, simply make sure that the method you use to query for changes is efficient (eg. don't query on non-indexed columns).
PS: wouldn't caching locally defeat the whole purpose of checking for changes on the database?
Internally the database does all sorts of things, including 'caching', but at all times it works incredibly hard to make sure your users see up-to-date data. So it has to do some work each time your application makes a request.
If you want to reduce the workload by keeping static data in your application then you have to implement it yourself.
Later versions of the .NET Framework have caching features built in, so you should take a look at those (building your own caching can get very complex).
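For example, a small sketch using the built-in System.Runtime.Caching.MemoryCache (the key, the five-minute lifetime and LoadNotificationCountFromDatabase are all illustrative choices):

    using System;
    using System.Runtime.Caching;

    // Small sketch of application-side caching; adjust the lifetime to how stale
    // a notification count is allowed to be.
    public static int GetNotificationCount(int userId)
    {
        var cache = MemoryCache.Default;
        string key = "notification-count-" + userId;

        if (cache.Get(key) is int cached)
            return cached;

        int count = LoadNotificationCountFromDatabase(userId);   // your existing query
        cache.Set(key, count, DateTimeOffset.UtcNow.AddMinutes(5));
        return count;
    }

    private static int LoadNotificationCountFromDatabase(int userId)
    {
        // Placeholder for the real database call.
        return 0;
    }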
SQL Server will handle caching for you, yes. When you create a query or a stored procedure SQL Server will cache that execution plan and reuse it accordingly. From MSDN:
SQL Server execution plans have the following main components:
Query Plan: The bulk of the execution plan is a re-entrant, read-only data structure used by any number of users. This is referred to as the query plan. No user context is stored in the query plan. There are never more than one or two copies of the query plan in memory: one copy for all serial executions and another for all parallel executions. The parallel copy covers all parallel executions, regardless of their degree of parallelism.
Execution Context: Each user that is currently executing the query has a data structure that holds the data specific to their execution, such as parameter values. This data structure is referred to as the execution context. The execution context data structures are reused. If a user executes a query and one of the structures is not being used, it is reinitialized with the context for the new user.
If you wish to clear this cache you can execute sp_recompile or DBCC FREEPROCCACHE.
My boss and I have been trying to work out what sort of auditing plan we could use for our stored procedures. Currently there are two external applications taking information from our database through stored procedures, and we're interested in auditing when they're being executed and what values are passed as parameters. So far what I've done is simply create a table for the stored procedures one of the apps is using; as they use the same input parameters, it has one column per parameter. Obviously this isn't the best choice, but we wanted quick information to see whether they were running batch processes and when they were running them. I've tried SQL Server Audit, but it doesn't catch the parameters unless you're executing an SP in a query.
SQL Server Profiler will do this for you; it's included for free. Set up a trace and let it run.
You can also apply quite a bit of filtering to the trace, so you don't need to track everything; you can also direct the output to a file or a SQL table for later analysis. This is probably your best bet for a time-limited audit.
I think I've used the SQL Server Profiler (http://msdn.microsoft.com/en-us/library/ms181091.aspx) in the past to audit SQL execution. It's not something you would run all the time, but you can get a snapshot of what's running and how it's being executed.
I haven't tried using them, but you might look at event notifications and see if they will work for you.
From BOL
Event notifications can be used to do the following:
Log and review changes or activity occurring on the database.
I'm trying to log Hibernate activity (only DML operations) to an SQL script file.
My goal is to have a way to reconstruct the database from a given starting point to the current state by executing the generated script.
I can get the SQL queries from the log4j logs, but they contain more information than the raw SQL statements, and I would need to parse them and extract only the useful parts.
So I'm looking for a programmatic way, maybe by listening to the persist/merge/delete operations and accessing the Hibernate-generated SQL statements.
I don't like to reinvent the wheel, so if anybody knows a way of doing this I would appreciate it very much.
Thanks in advance
Generally the best way to do this is to just turn on logging on your SQL server. All the major RDBMSes support logging all the SQL statements that they run. This has the added advantage of catching things that happened outside of Hibernate.
You could also try NHProf, which will intercept/record Hibernate traffic to the database and dump it into an XML file. You might have to parse the file by hand, but all the information will be there.
You could also hook at the JDBC level directly and record the JDBC statements that are performed.
P6Spy is a great tool to inspect what's going on. It can log the queries, though I don't know if you can replay them as is.
I'm sure there are other such tools (or at worst you could try subclassing the DataSource, Connection and PreparedStatement implementations of your choice to do that yourself).