Using SQL Profiler, I was able to see that the query generated by NHibernate was executed in the
EXEC sp_executesql N'select ...'
fashion.
I am wondering if there is any way to force NHibernate to generate the plain
Select ...
statement instead.
The reason I want this is that SQL Server apparently generates different execution plans for the two forms, and in my scenario the plain "select ..." runs MUCH faster.
-----Update----- Nov 30, 2012
I just found this link: Why does sp_executesql run slower when parameters are passed as arguments
And I believe the popular answer (with 4 upvotes as of now) explains the reason well.
So now the question is:
Can I generate a straight query instead of a parametrized one using NHibernate?
No, NHibernate issues commands to SQL Server using sp_executesql and this cannot be changed. However, you should be able to troubleshoot any slow queries and resolve the performance issues. The first thing I would look at is whether the parameters supplied with the sp_executesql call have the same data types as the columns they reference.
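For example (hypothetical table and column names), an nvarchar parameter compared against a varchar column forces an implicit conversion on the column side, which can rule out an index seek:
-- Hypothetical: dbo.Orders.OrderNumber is varchar(20). An nvarchar
-- parameter makes SQL Server convert the column side, often preventing
-- an index seek:
EXEC sp_executesql
    N'SELECT * FROM dbo.Orders WHERE OrderNumber = @p0',
    N'@p0 nvarchar(4000)',   -- type does not match the column
    @p0 = N'ABC123';

-- Declaring the parameter with the column's exact type avoids the
-- conversion and lets the optimizer seek:
EXEC sp_executesql
    N'SELECT * FROM dbo.Orders WHERE OrderNumber = @p0',
    N'@p0 varchar(20)',
    @p0 = 'ABC123';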
In your Session Factory configuration you can enable ShowSql. This will output the generated SQL queries to the output window while you are debugging. You'll need to set your BatchSize to 0 to see all the queries; if batching is enabled, you won't be able to see the queries it groups up (to optimize performance).
NHibernate Profiler is also an invaluable (but commercial) tool for debugging your code. http://www.hibernatingrhinos.com/products/NHProf
You should clear the execution plans on your server and then try again:
DBCC FREEPROCCACHE
And/or you can force a recompilation of the execution plan by injecting option(recompile) into your query.
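For example (a sketch with a hypothetical table), the hint is appended to the end of the statement:
-- OPTION (RECOMPILE) forces a fresh plan on each execution, so a plan
-- compiled for unrepresentative parameter values is never reused.
DECLARE @customerId int;
SET @customerId = 42;
SELECT *
FROM dbo.Orders
WHERE CustomerId = @customerId
OPTION (RECOMPILE);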
NHibernate uses log4net, so if you are using log4net you just need to add an appender, as described here:
https://devio.wordpress.com/2010/04/14/logging-sql-statements-generated-by-nhibernate/
For example:
<appender name="DebugSQL" type="log4net.Appender.FileAppender">
<param name="File" value=".\nhdb.log"/>
<param name="AppendToFile" value="true" />
<layout type="log4net.Layout.PatternLayout">
<conversionPattern
value="%date [%thread] %-5level %logger [%property{NDC}] - %message%newline" />
</layout>
</appender>
<logger name="NHibernate.SQL" additivity="false">
<level value="DEBUG" />
<appender-ref ref="DebugSQL" />
</logger>
I have some SQL scripts that I'd like to convert into Liquibase XML changelogs. Is this possible?
What exactly do you mean by "convert into changelogs"?
If you don't need cross-database compatibility, you could just use the sql executor:
<changeSet id="testing123" author="me">
<sql splitStatements="false">
<!-- paste all your existing SQL here -->
</sql>
</changeSet>
The splitStatements="false" setting makes sure the complete SQL is sent as one command; otherwise, Liquibase would split it into multiple commands at the semicolons.
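For example (a hypothetical T-SQL snippet), a procedure body is the classic case where splitting would break things:
-- The semicolon inside the procedure body would cause the default
-- splitting to send two invalid fragments; splitStatements="false"
-- keeps the CREATE PROCEDURE intact as a single command.
CREATE PROCEDURE dbo.Touch
AS
BEGIN
    UPDATE dbo.Demo SET UpdatedAt = GETDATE();
END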
You could use a detour via a database: apply your script to a database of your choice (one that is supported by Liquibase), then use generateChangeLog to generate an XML changelog from that database. You will then have the XML changelog for all the SQL scripts you applied to your database.
Alternatively, have a look at the answer on this SO post.
I am having some performance trouble on our production server.
1) The principal query generally takes 0.5 secs, but sometimes it takes 20 secs (then the client goes crazy, because it seems the app has crashed). The query is over a view and has no eager or lazy objects attached, only plain attributes. The query returns 100 paginated records (out of over 65,000 records).
Why is it so variable? How can I improve the query, given that it works fine most of the time?
2) Generally, the JavaMelody stats show that the server uses between 5 and 10 connections at the same time. But sometimes this goes up to the max (100), and then the server becomes busy and unstable.
We have between 1,800 and 2,000 sessions working in the app.
This is my config:
Tomcat Server: AWS Linux EC2 Instance t2.medium
MS SQL Server: AWS Windows EC2 Instance c4.large (it runs SQL Server Express, and we are now moving to SQL Server Web Edition to get more power [maybe this is the problem?])
JDBC Connection Pooling Configuration:
<bean class="org.apache.commons.dbcp2.BasicDataSource" id="dataSource">
<property name="url" value="jdbc:sqlserver://url..."/>
<property name="username" value="username..."/>
<property name="password" value="password..."/>
<property name="initialSize" value="10"/>
<property name="maxTotal" value="100"/>
<property name="maxIdle" value="50"/>
<property name="minIdle" value="10"/>
</bean>
Should I change the connection pooling config? How can I fix these leaks?
Thank you, and sorry for my English.
I can confirm that 5 to 10 JDBC connections in use at the same time, with queries taking about 0.5 s each, is a lot. That is a clear indication that the database is heavily loaded, all the more so with only 2 vCPUs.
Queries sometimes taking 20 s instead of 0.5 s is simply what happens when the database has severe difficulty keeping up with the load: queries are put in a wait queue.
And the number of JDBC connections in use climbing to 100 is the point where the database just "gives up": imagine 100 queries at the same time when it already struggles to keep up with 5 to 10.
In this case, the SQL queries should be optimized or the database server should be upgraded.
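As a starting point for that investigation (a sketch using standard SQL Server DMVs; interpret the numbers against your own baseline), you can check what the server is waiting on and which requests are queued:
-- Top wait types accumulated since the last restart:
SELECT TOP (10) wait_type, waiting_tasks_count, wait_time_ms
FROM sys.dm_os_wait_stats
ORDER BY wait_time_ms DESC;

-- Requests currently executing or waiting, with their wait reasons:
SELECT session_id, status, command, wait_type, wait_time, total_elapsed_time
FROM sys.dm_exec_requests
WHERE session_id > 50; -- user sessions usually have ids above 50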
I am troubleshooting an application that uses a SQL Server database, and I am seeing a lot of sp_execute calls.
I cannot seem to find the sp_prepare calls.
How can you inspect all of the prepared SQL statements in memory?
I was looking for a way to see the actual SQL statements executed by sp_execute in SQL Server 2008 R2 Profiler.
To do this, I created a new trace, and clicked on the "Events Selection" tab. I selected "Show all events" and checked Stored Procedures > SP:StmtCompleted. Running the trace, I was then able to see the actual SQL statements.
I ran into this issue as well. SQL Profiler was not capturing the sp_prepare statement because it occurred before the SQL Profiler trace had started to run. The various postings that rely on sys.dm_exec_sql_text did not help because I could not find the correct sql_handle or plan_handle value to provide for that stored procedure.
I found a solution from this blog post: in SQL Profiler, click the "Show all events" checkbox and then under the "Stored Procedures" heading choose "SP:CacheHit".
In the resulting SQL Profiler output, you'll see an "SP:CacheHit" row containing the cached SQL statement near your "RPC:Starting ... sp_execute" statement.
You can then reconstruct and reexecute the full SQL statement in SSMS if you wish using:
exec sp_executesql @stmt=N'{statement from SP:CacheHit}',
@params=N'{parameter declaration from SP:CacheHit}',
@param1={value}, {...parameters from RPC:Starting sp_execute statement}
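For example, a hypothetical reconstruction might look like this (statement text and parameter declaration copied from the SP:CacheHit row, the value from the RPC:Starting sp_execute row):
-- sp_executesql's real parameter names are @stmt and @params:
EXEC sp_executesql
    @stmt   = N'SELECT * FROM dbo.Customers WHERE Id = @P1',
    @params = N'@P1 int',
    @P1     = 42;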
Following up on my comment above, I found a number of relevant links:
How can I find out what command sp_execute is running (without using Profiler)
SP_EXECUTE executing... what?
See the query in sp_execute
Microsoft has documentation, but it may be challenging to piece things together (as always). If the plan handle is known, you can use this:
sys.dm_exec_sql_text (Transact-SQL)
This is a table-valued function. You can see a blog article here that exploits such table-valued functions to retrieve object dependencies for a valid handle of a compiled (prepared) plan.
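If you do not have a specific handle to start from, a sketch like the following lists the text of every cached prepared plan (sys.dm_exec_cached_plans and sys.dm_exec_sql_text are standard DMVs):
-- List cached prepared statements, most heavily reused first:
SELECT cp.usecounts, cp.objtype, st.text
FROM sys.dm_exec_cached_plans AS cp
CROSS APPLY sys.dm_exec_sql_text(cp.plan_handle) AS st
WHERE cp.objtype = 'Prepared'
ORDER BY cp.usecounts DESC;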
I just received a project that hasn't been touched in a while. It was written in VS2005 and SQL 2000. I upgraded the project to VS2010 with no problems. However, when I tried to modify the table adapters, etc., I got the error "You must have Microsoft SQL 2005 or greater".
This project has hundreds of datasets and table adapters, all referencing SQL 2000.
I guess I have 2 questions:
Should I take all these adapters out, make a data layer, and connect to the DB that way?
Or can I upgrade the DB to SQL 2008 and have it all work the way it is? I'm not sure what the best approach is on this one.
And this is a desktop app if it matters.
Any suggestions would be great.
Thanks
I was able to connect VS 2010 to a MSSQL 2000 db by selecting a SQL db and then the OLE option, and then choosing to remember the password (yes, I know it's not encrypted). That produced a connection string in the projectname.exe.config file that looks like:
<add name="projectname.My.MySettings.dbnameConnectionString"
connectionString="Provider=SQLOLEDB;Data Source=ip.ip.ip.ip; Persist Security Info=TRUE;Password=xxxxxxx;User ID=username; Inital Catalog=dbname"
providerName="System.Data.OleDb" />
<startup>
<supportedRuntime version="v4.0" sku=".NETFramework, Version=4.0,Profile=Client" />
</startup>
With this, I get messages on the last line here:
Could not find schema information for the attribute 'sku' (and for 'version' and the element 'supportedRuntime')
I have a MS SQL database running (MS SQL 2005) and am connecting to it via the net.sourceforge.jtds.jdbc.Driver.
The query works fine for all the columns except one that is a varchar(max). Any ideas how to get around this issue?
I am using the JDBC driver to index data into a Solr implementation.
(I do not control the database, so the first-prize solution would be one where I can tweak the SQL command to get the desired results.)
Thanks
I have found what looks to be an answer. When setting up the driver for the connection to SQL Server, I did not specify useLOBs=false. I am a bit worried about what this will mean for performance, but at least for now it works.
<dataSource
driver="net.sourceforge.jtds.jdbc.Driver"
url="jdbc:jtds:sqlserver://server/database;useLOBs=false"
user="user"
password="password" />
I had the same problem connecting to MS SQL 2K3. useLOBs=false did not work for me, but changing the SELECT to CAST(Name AS varchar(255)) 'Name' did.
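For example (hypothetical table and column names), the cast in the SELECT list looks like this:
-- Cast the varchar(max) column down to a bounded varchar so the driver
-- returns a plain string; note that values longer than 255 characters
-- will be truncated.
SELECT CAST(Name AS varchar(255)) AS Name
FROM dbo.SomeTable;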