I have two reports in SSRS 2008, Dashboard and Drillthrough.
Dashboard contains many datasets (all stored procedures), and takes about 4-5 seconds to run.
Clicking an aggregated value in one of Dashboard's tables takes the user to Drillthrough, which has a single dataset: a stored procedure accepting two parameters (an int and a char(1), passed from Dashboard) that runs very quickly in SSMS.
The Drillthrough dataset is large, averaging around 10,000 rows, which are displayed in a table. The report is configured to show 200 rows per page, so it can span many pages.
The problem:
When I click a link in Dashboard, nothing happens for about a minute. There are several issues I have with this:
The fact that the screen does not immediately switch to the 'Report is being generated' view confuses the user, who sees no response (in cases where the report is embedded in a web page). Is this normal behaviour?
The Drillthrough query itself runs very quickly in SSMS, so why is it taking so long on the Report Server? Where is the hold-up likely to be? (I read up on 'parameter sniffing' in relation to this, but since the query runs quickly in SSMS, it seems my problem wouldn't be due to that.)
Related
I am performing an analysis on the frequency of SSRS Report executions.
However, I need a method of separating reports ran 'manually' (by user interaction) and those that occur due to an 'Auto Refresh' (Auto Refresh Parameter on SSRS Report).
Is there any method of separating these out when querying the ReportServer Database, or at least ignoring any executions which were due to an Auto-Refresh event?
Thanks in advance.
You can query the ExecutionLog views in the ReportServer database. The fields you'll likely need to pay attention to are ItemAction and Source, but you'll need to determine which combinations you consider to be an execution. I'd start with ItemAction = 'Render' and Source = 'Live', possibly also looking at Format (Web vs. PDF, etc.).
The best thing to do is to play with a report, see what data it generates in the log, and then determine which rows you are interested in capturing.
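As a starting point, a query along these lines could separate out manual renders. This is only a sketch: it assumes the ExecutionLog3 view (use ExecutionLog2 on SSRS 2008) and that an auto-refresh re-uses the existing session rather than appearing as a fresh 'Live' render, which you should verify against your own log data before relying on it.

```sql
-- Sketch: count interactive ("manual") report executions, assuming
-- auto-refresh events do not show up as ItemAction = 'Render' with
-- Source = 'Live'. Verify this assumption by watching the log while
-- you interact with a test report that has Auto Refresh enabled.
SELECT ItemPath,
       COUNT(*) AS ManualExecutions
FROM   dbo.ExecutionLog3          -- ExecutionLog2 on SSRS 2008
WHERE  ItemAction = 'Render'
  AND  Source     = 'Live'
GROUP BY ItemPath
ORDER BY ManualExecutions DESC;
```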
I am fairly new to SSRS reports, so I am looking for guidance. I have SSRS reports that have 3 visible parameters: Manager, Director, and VP. The report displays data based on the parameters selected. Initially, the report was taking a very long time to load, and my research led me to create a snapshot of the report.
The initial load of the report is really quick (~5 secs) but the parameters are set to "Select All" in all sections. When the report is later filtered to say, only 1 VP, the load time can vary anywhere between 20 to 90 seconds. Because this report will be used by all aspects of management within the organization, load time is critical.
Is it possible to load the filtered data quicker? Is there anything I can do?
Any help will be much appreciated.
Thank you!
This is a pretty broad efficiency issue. One of the big questions is whether the query itself is slow in the database or only in SSRS. Ideally you would start with optimizing the query and indexing, but that's not always enough. The work has to be done somewhere; all you can do is shift it so it happens before the report is run. Here are a couple of options:
Caching
Turn on caching for the report.
Schedule a subscription to run with each possible value for the parameter. This will cause the report to still load quickly once an individual value is specified.
Intermediate Table
Schedule a SQL stored procedure to aggregate and index the data in a new table in your database.
Point the report to run from this data for quick reads.
Each option has its pros and cons, because you have to balance where the data preparation work is done. Sometimes you have to try a few options to see what works best for your situation.
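The intermediate-table option might look roughly like this, scheduled as a nightly SQL Server Agent job; all table and column names here are hypothetical stand-ins for your own schema:

```sql
-- Sketch: a scheduled procedure that rebuilds a pre-aggregated
-- reporting table so the report only does cheap reads.
-- dbo.SalesDetail and dbo.SalesSummary are hypothetical names.
CREATE PROCEDURE dbo.RefreshSalesSummary
AS
BEGIN
    SET NOCOUNT ON;

    TRUNCATE TABLE dbo.SalesSummary;

    INSERT INTO dbo.SalesSummary (Manager, Director, VP, TotalAmount, OrderCount)
    SELECT Manager, Director, VP, SUM(Amount), COUNT(*)
    FROM   dbo.SalesDetail
    GROUP BY Manager, Director, VP;
END;
```

With an index on the columns the report filters by (e.g. VP), filtering to a single VP becomes an index seek against a small summary table instead of a scan over the detail data.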
I have a stored procedure as a source connection in Tableau 8.1. It takes a long time (about 1 minute) to fetch and display 40,000 records (there are no bar charts, pie charts, etc.).
The stored procedure just selects those 40,000 records, with some 6-7 table joins.
However, the same stored procedure executes and displays the records in SQL Server Management Studio within 3 seconds.
After using SQL Server Profiler, it shows that some 45,000 inserts into a Tableau temp table occur, which takes a long time. The log file also shows that the inserts account for a high percentage of the time, while the execution of the stored procedure itself takes only about 4-5 seconds. Is this the problem? Any suggestions on how to overcome this issue?
Regards
Gautam S
A few places to start:
First, check out the Tableau log file in your Tableau repository directory after trying to access your data. There will be a lot of information in there, but you should be able to see the actual SQL that Tableau sends to your database -- and that may give you some clues about what it is doing that is taking so long. If nothing else, you can cut and paste the SQL into your database tools and try to isolate the problem SQL without Tableau in the mix.
If the log file doesn't give you an idea about how to restructure your system to avoid the long query, then send it along with info about your schema to Tableau support. They may be able to help.
Simplify whatever you can to reduce the problem to its core, get rid of everything in your visualization but a total, and then slowly build it back up to see what causes the behavior. For example, make a test version and remove one table at a time from your query to see what causes the problem.
Avoid using quick filters if you see performance problems (or minimize them). They're a nice feature, but they come with a performance cost.
Try the Tableau performance monitoring (record and analysis) features
Work with a smaller data set during testing so you can more quickly experiment with different approaches
Try replacing your stored procedure with a view. That's usually better if at all possible.
Add indices to speed the joins
If there is no way around the long operation and if updates are infrequent, make a Tableau extract so that you only pay that cost periodically
If none of these things helps, cut the problem down to its simplest version and post a schema and the problem SQL. Otherwise, people can only give you generic advice.
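On the view suggestion above: when Tableau reads from a stored procedure it has to materialize the full result set into a temp table first (the 45,000 inserts seen in Profiler), whereas with a view it can push filters and aggregations down to the database. A sketch, with hypothetical object names standing in for the real 6-7-way join:

```sql
-- Sketch: expose the stored procedure's SELECT as a view so Tableau
-- can query it directly instead of buffering every row into a temp
-- table first. All object names here are hypothetical placeholders.
CREATE VIEW dbo.vw_ReportData
AS
SELECT o.OrderID,
       o.OrderDate,
       c.CustomerName,
       p.ProductName,
       o.Amount
FROM   dbo.Orders    AS o
JOIN   dbo.Customers AS c ON c.CustomerID = o.CustomerID
JOIN   dbo.Products  AS p ON p.ProductID  = o.ProductID;
```

Once connected to the view, any filter applied in Tableau can become a WHERE clause in the generated SQL rather than a post-hoc filter over 40,000 buffered rows.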
Goal:
Display the result based on the picture below in reporting Service 2008 R2.
Problem:
How should I do it?
You also have to remember that in reality the list contains a lot of data, maybe millions of rows.
In terms of the report itself, this should be a fairly standard implementation.
You'll need to create one Tablix, with one Group for Customer (one row), one Group for Artist (two rows, one for the headers and one for the Artist name), then a detail row for the Title.
It looks like you need more formatting options for the Customers Textbox - you could merge the cells in the Customer header row, then insert a Rectangle, which will give you more options to move objects around in the row.
For large reports you've got a few options:
Processing large reports: http://msdn.microsoft.com/en-us/library/ms159638(v=sql.105).aspx
Report Snapshots: http://msdn.microsoft.com/en-us/library/ms156325(v=sql.105).aspx
Report Caching: http://msdn.microsoft.com/en-us/library/ms155927(v=sql.105).aspx
I would recommend scheduling a Snapshot overnight to offload the processing to a quiet time, then making sure the report has sensible pagination set up so that not too much data has to be handled at once (i.e. not trying to view thousands of rows at one time in Report Manager).
Another option would be to set up an overnight Subscription that could save the report to a fileshare or send it as an email.
Basically you're thinking about reducing the amount of processing that needs to be done at peak times and processing the report once for future use to reduce overall resource usage.
I would use a List with text boxes inside for that kind of display.
In addition, you may consider adding a page break after each customer.
Personally, I experienced lots of performance issues when dealing with thousands of rows, not to mention millions.
My advice is to reconsider the report's main purpose: if the report is for exporting, then don't use SSRS for that.
If the report is for viewing, then perhaps it is possible to narrow down the data using parameters, based on the user's choice.
Last thing: I wish you good luck :)
I have 10 reports and 250 customers. All the reports are run by my customers, and each report takes parameters. Depending on the parameters, the same report connects to a different database and gets its results. I know that with different parameters, caching is not an option. But I don't want to run these reports on live data during the daytime. Is there anything I can do (snapshot, subscription) that can run overnight and either send these reports or save a snapshot that could be used for the next 24 hours?
Thanks in advance.
As M Fredrickson suggests, subscriptions might work here depending on the number of different reports to be sent.
Another approach is to consolidate your data query into a single shared dataset. Shared datasets can have caching enabled, and there are several options for refreshing that cache, such as on first access or on a timed schedule. See MSDN for more details.
The challenge with a cached dataset is to figure out how to remove all parameters from the actual data query by moving them elsewhere, usually into the dataset filter in the report, or into the filters of the individual data elements, such as your tablixes.
I use this approach to refresh a 10 minute query overnight, and then return the report all day long in less than 30 seconds, with many different possible parameters filtering the dataset.
You can also mix this approach with others by using multiple datasets in your report, some cached and some not.
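Concretely, moving a parameter out of the query and into a report-side filter might look like this; the dataset query loses its WHERE clause so one cached copy serves every user (table and column names are hypothetical):

```sql
-- Sketch: cache-friendly dataset query with no parameters.
-- Per-user filtering happens in the report, not in the database.
SELECT Manager, Director, VP, Amount
FROM   dbo.SalesSummary;   -- hypothetical pre-aggregated table
```

Then, on the dataset's Filters page, you would add something like Expression = `=Fields!VP.Value`, Operator = In, Value = `=Parameters!VP.Value`, so the parameter only trims the cached result set instead of triggering a fresh query.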
I would suggest going the route of subscriptions. While you could do some fancy hack to get multiple snapshots of a single report, it would be cleaner to use subscriptions.
However, since you've got 250 customers and 10 different reports, I doubt that you'll want to configure and manage 2,500 different subscriptions within Report Manager... so I would suggest that you create a data-driven subscription for each of the reports.
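A data-driven subscription is driven by a query that returns one row per delivery, supplying the recipient and the report parameter values for that row. A hypothetical sketch for this scenario (the table and columns are placeholders for wherever you keep customer details):

```sql
-- Sketch: one row per customer; in the subscription wizard, these
-- columns are mapped to delivery settings (e.g. TO) and to the
-- report's parameters. dbo.Customers and its columns are hypothetical.
SELECT c.Email        AS RecipientEmail,
       c.CustomerID   AS CustomerParam,
       c.DatabaseName AS TargetDbParam
FROM   dbo.Customers AS c
WHERE  c.IsActive = 1;
```

With this approach you maintain 10 data-driven subscriptions (one per report), each fanning out overnight to the 250 customers the query returns, rather than 2,500 individual subscriptions.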