Analysis Services CommitTransaction causes AS to hang - ssas

After I process a cube and commit the transaction, something goes wrong with Analysis Services: there is no response when I connect to it using Management Studio.
The cube has two partitions, each containing 100,000,000 fact records.
Processing completes, and I have confirmed that with SQL Server Profiler, but the CommitTransaction event never finishes. Worse still, Analysis Services stops responding altogether.
Hope someone can help me.
Thanks,

Related

SQL Server Management Studio : execution timeout expired

I keep getting a timeout in Microsoft SQL Server Management Studio. I'm not running any code; all I am doing is browsing the tables of a database in Object Explorer. I always get:
Execution Timeout Expired
I've looked at some of my settings, and the execution time-out is set to 0, which should mean unlimited time. I'm not running any query at all, just trying to understand what's in my database by going through Object Explorer.
Thanks!
It depends on your work environment, but in all cases I suspect the problem is related to the database rather than to the Studio itself.
If you are working on a server that is reached over the network by many other clients, then:
It could be a transient network problem,
a high load of requests on the server, or
a deadlock or other problem related to multiprocess contention.
I suggest you troubleshoot your server during idle time and, if possible, detach the databases one by one to see which one is causing the problem. For that database, go through its stored procedures and functions and try to improve their performance.
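To see which sessions and which database are involved while the timeout is happening, a quick look at the request DMVs can help. A minimal sketch (run it in a separate SSMS window while Object Explorer is hanging):

```sql
-- Sketch: list active requests, their wait types, and any blocking session
SELECT r.session_id,
       DB_NAME(r.database_id) AS database_name,
       r.status,
       r.wait_type,
       r.blocking_session_id,
       t.text AS running_sql
FROM sys.dm_exec_requests AS r
CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) AS t
WHERE r.session_id <> @@SPID;
```

A non-zero blocking_session_id points at contention on the server rather than a problem with the Studio.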

Query timeout expired while running a request from a packaged .NET tool

I am working on an application built with a packaged .NET configuration tool. An out-of-the-box request fires a DELETE SQL statement that tries to delete a row from the database. For the past few months we have been receiving the following error: [Microsoft][ODBC SQL Server Driver]Query timeout expired.
We are unable to reproduce this scenario in lower environments and can see the issue only in production.
Our best guess is a deadlock: the delete request may be deadlocking with a simultaneous insert into the same table (we have a scheduled job that inserts data into a relational database by splitting large XML into nodes).
What can be done to resolve this? Note that indexing was done on the tables recently, which reduced the frequency for a while, but it has spiked again.
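If a deadlock is the suspect, you don't have to reproduce it in a lower environment: SQL Server's built-in system_health Extended Events session keeps recent deadlock graphs. A minimal sketch to pull them from the ring buffer on the production server:

```sql
-- Sketch: extract recent deadlock graphs recorded by the system_health session
SELECT CAST(t.target_data AS xml)
         .query('//RingBufferTarget/event[@name="xml_deadlock_report"]') AS deadlocks
FROM sys.dm_xe_sessions AS s
JOIN sys.dm_xe_session_targets AS t
  ON s.address = t.event_session_address
WHERE s.name = N'system_health'
  AND t.target_name = N'ring_buffer';
```

The resulting XML names the victim statement and the resources both sides were waiting on, which would confirm (or rule out) the delete/insert theory.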

SSAS Cube processing logs

Where are SSAS cube processing (not error, not flight recorder) logs stored?
We have a SQL Agent job running a SQL Server Analysis Services command. It contains some DMX which processes each dimension and then processes the cube database (containing two cubes).
I want to know how long each of the various queries takes; there is one query per dimension and one per measure group.
The cube used to process in 20 minutes; now it takes 2 hours.
We are using SSAS 2008 R2
I have searched long and hard and as far as I can tell there is no such log.
questions which are not duplicates of this:
Error Log records in SSAS
get output of last Process on SSAS cube
I don't want to use Profiler. I want to see how long each query took in the last cube build at least. I can see all this info if I run interactively. How do I make it log this info when run from a job?
There are a couple of options. You could continue processing the cube just the way you are, but start logging all processing events. Other than the Profiler GUI, there are three main ways to do that:
A server side trace writes a .trc file to disk on the SSAS server with little overhead:
http://blogs.msdn.com/b/karang/archive/2009/11/02/sql-2005-sql-2008-analysis-services-server-side-tracing.aspx
You can then load the file into SQL Server via PowerShell and analyze it later:
http://www.bp-msbi.com/2012/02/counting-number-of-queries-executed-in-ssas/
Install ASTrace, a community-maintained service that uses the Profiler APIs (without the GUI) and writes the Profiler events you choose directly to SQL Server in real time:
https://github.com/Microsoft/Analysis-Services/tree/master/AsTrace
Log XEvents and analyze them later:
http://blog.crossjoin.co.uk/2012/05/05/using-xevents-in-ssas-2012/
Or:
https://francescodechirico.wordpress.com/2012/08/03/identify-storage-engine-and-formula-engine-bottlenecks-with-new-ssas-xevents-5/
All of these options log all processing transactions, and you get to choose exactly which events to log (processing events but not queries, for example).
Another alternative is a session trace. You could stop using the "SQL Server Analysis Services Command" step type in SQL Agent, switch to a PowerShell step type, and do something like the following. This example runs an SSAS backup from PowerShell with a SessionTrace watching all the "profiler" events for just that one session:
https://gallery.technet.microsoft.com/scriptcenter/Backup-Ssas-Databases-with-da62b084
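If all you need is the duration of each processing command, the PowerShell step can also simply time each command itself and write the result to a file. A rough sketch, assuming the SqlServer module's Invoke-ASCmd cmdlet is available; the instance name, log path, database ID, and dimension IDs below are placeholders for your own:

```powershell
Import-Module SqlServer            # provides Invoke-ASCmd (assumption: module installed)

$server = "MYSSAS\INST1"           # hypothetical SSAS instance name
$log    = "C:\Logs\CubeProcessing.log"

foreach ($dimId in @("Dim Date", "Dim Customer")) {   # hypothetical dimension IDs
    $xmla = @"
<Process xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Object>
    <DatabaseID>MyCubeDb</DatabaseID>
    <DimensionID>$dimId</DimensionID>
  </Object>
  <Type>ProcessUpdate</Type>
</Process>
"@
    # Time each processing command and append the duration to the log file
    $elapsed = Measure-Command { Invoke-ASCmd -Server $server -Query $xmla }
    Add-Content $log "$(Get-Date -Format s) $dimId took $($elapsed.TotalSeconds)s"
}
```

Repeating the same pattern for each measure group gives you the per-query timings you see interactively, but written to a log when run from the job.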

Log shipping vs replication vs mirroring in SQL Server 2012

I have a SQL Server 2012 database that currently serves as both the transactional database and the reporting database. The application reads and writes against it, and the reports are also generated against it.
Due to performance issues, I have decided to maintain two copies of the database. One will be the transactional database accessed by the application; the other will be an exact copy of it, used only by the reporting service.
Following are the requirements:
The reporting database should be synchronized with the transactional database every hour; that is, the reporting database may have stale data for at most 1 hour.
It must be read-only database.
The main intention is NOT recovery or availability.
I am not sure which strategy, transaction log shipping, mirroring, or replication, will be best suited to my case. Also, if I run the synchronization more frequently (say, every 10 minutes), will there be any impact on the transactional database or the reporting service?
Thanks
I strongly recommend using a standby database in read-only state, with a SQL Server Agent scheduled job that every 15 minutes: a) generates a new .trn log backup from the main database, and b) restores it into the standby one (your reporting database). The only issue with this technique is that sessions on the standby are disconnected while the Agent restores the .trn file. If you can stop the restore job, run your reports, and then reactivate it, there is no problem; and if your reports run quickly, they will probably not be disconnected at all. If I am not wrong, the restore job can also be configured either to wait for open sessions to finish or to close them. This seems to be exactly what you need.
Since it runs in the same SQL Server instance, you don't have to worry about extra licensing.
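A sketch of the two Agent job steps; the database names and paths are placeholders, and WITH STANDBY is what keeps the restored copy readable between restores:

```sql
-- Step 1, on the primary: back up the log
BACKUP LOG SalesDb
  TO DISK = N'\\fileshare\logs\SalesDb_latest.trn';

-- Step 2, on the reporting copy: disconnect readers, restore, reopen read-only
ALTER DATABASE SalesDb_Report SET SINGLE_USER WITH ROLLBACK IMMEDIATE;

RESTORE LOG SalesDb_Report
  FROM DISK = N'\\fileshare\logs\SalesDb_latest.trn'
  WITH STANDBY = N'C:\SQLData\SalesDb_undo.bak';   -- readable standby mode

ALTER DATABASE SalesDb_Report SET MULTI_USER;
```

The STANDBY undo file is what lets the reporting database stay queryable between restores while remaining ready to accept the next log backup.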

SQL Azure - Intermittent Long Running Query

I have a query that intermittently runs very slowly in SQL Azure:
It runs in under 1 second in SSMS.
It is submitted through Entity Framework from the application.
It is NOT calling a stored procedure.
It is passed parameters.
Using DMVs, I determined that it runs and then enters a "Suspended" state.
The wait type is IO_COMPLETION, pointing towards waiting on a resource.
This is Azure, so the normal server-level monitoring is not available.
The execution plan has been reviewed, with no scans indicated.
The issue will occur all day one day and then, with no changes to the database, be cleared up the next morning.
Once, while the issue was occurring, I couldn't log in to the database for several seconds; when I finally could, the issue had corrected itself and the query performed as expected.
Any ideas on what could be causing this or other things that can be checked would be greatly appreciated!
Thanks in advance!
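For reference, the DMV check described above looks something like the following sketch, run while the query is slow:

```sql
-- Sketch: see the suspended request and what it is currently waiting on
SELECT r.session_id,
       r.status,
       r.command,
       r.wait_type,
       r.wait_time,
       r.last_wait_type,
       t.text AS query_text
FROM sys.dm_exec_requests AS r
CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) AS t
WHERE r.status = N'suspended';
```

Capturing wait_type and wait_time over several samples shows whether the query is consistently stuck on IO_COMPLETION or cycling through different waits.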