SSRS Caching and/or Snapshot - sql

I am fairly new to SSRS reports, so I am looking for guidance. I have SSRS reports with 3 visible parameters: Manager, Director, and VP. The report displays data based on the parameters selected. Initially, the report was taking a very long time to load, and my research led me to create a snapshot of the report.
The initial load of the report is really quick (~5 secs), but the parameters are set to "Select All" in all sections. When the report is later filtered to, say, only 1 VP, the load time can vary anywhere between 20 and 90 seconds. Because this report will be used by all levels of management within the organization, load time is critical.
Is it possible to load the filtered data quicker? Is there anything I can do?
Any help will be much appreciated.
Thank you!

This is a pretty broad efficiency issue. One of the big questions is whether the query takes a long time to run in the database itself or only in SSRS. Ideally you would start by optimizing the query and indexing, but that's not always enough. The work has to be done somewhere; all you can do is shift it so it happens before the report is run. Here are a couple of options:
Caching
Turn on caching for the report.
Schedule a subscription to run with each possible value for the parameter. This way the report still loads quickly once an individual manager, director, or VP is specified.
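If there are too many value combinations to set up by hand, one way to do this (a sketch, not something from the original post) is a data-driven subscription whose driver query returns one row per parameter value; the table and column names below are assumptions.

-- Hypothetical driver query for a data-driven subscription that warms the cache:
-- one cached execution per VP value.
SELECT DISTINCT VPName AS VP
FROM dbo.EmployeeHierarchy
WHERE VPName IS NOT NULL;

Each returned row is mapped to the report's VP parameter in the subscription, so the cache already holds a copy for every VP before anyone opens the report.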
Intermediate Table
Schedule a SQL stored procedure to aggregate and index the data in a new table in your database.
Point the report to run from this data for quick reads.
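A minimal sketch of that kind of nightly job, assuming hypothetical dbo.SalesDetail source and dbo.ManagerSummary reporting tables rather than the asker's real schema:

-- Nightly aggregation procedure (all object names are assumptions).
CREATE PROCEDURE dbo.usp_RefreshManagerSummary
AS
BEGIN
    SET NOCOUNT ON;

    -- Rebuild the pre-aggregated reporting table from the detailed source data.
    TRUNCATE TABLE dbo.ManagerSummary;

    INSERT INTO dbo.ManagerSummary (Manager, Director, VP, TotalSales, SaleCount)
    SELECT Manager, Director, VP, SUM(SalesAmount), COUNT(*)
    FROM dbo.SalesDetail
    GROUP BY Manager, Director, VP;
END;
GO

-- Run once: index the columns the report filters on so parameterized reads stay fast.
CREATE INDEX IX_ManagerSummary_VP ON dbo.ManagerSummary (VP, Director, Manager);

Scheduled overnight from SQL Server Agent, the report then reads only the small summary table, so filtering to a single VP should come back in seconds.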
Each option has its pros and cons because you have to balance where the data preparation work is done. Sometimes you have to try a few options to see what works best for your situation.

Related

Measuring the average elapsed time for SQL code running in Google BigQuery

Because BigQuery is a shared resource, it is possible to get different timings when running the same code. One option that I always use is to turn off caching under Query Settings, Cache preference, so queries will not be served from cache. The problem with this setting is that if you refresh the browser or leave it idle, that Cache preference box will be ticked again.
Anyhow, I had a discussion with some developers who are optimizing the code. In a nutshell, they take the slow-running code, run it 5 times, and take the average; then, after optimizing, they run the code another 5 times to get an average value for the optimized SQL. The details are not clear to me. However, my preference would be (all in the BQ console):
create a user session
turn off SQL caching
in the BQ console, paste the slow-running code
in the same session, paste the optimized code
run both queries (separated by ";")
This will ensure that any systematic effects like BQ being busy/overloaded, a slow connection, etc. affect BOTH SQL pieces equally, so the systematics cancel out. In my opinion, one only needs to run it once, since caching is turned off as well. Running it 5 times to get an average seems excessive and superfluous, doesn't it?
Appreciate any suggestions/feedback
Thanks
Measuring the time is one way; the other way to see whether the query has been optimized is to understand the query plan and how slots are used effectively.
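One way to put numbers on both, without stopwatching runs by hand, is to read the job statistics BigQuery already records. A sketch (the region qualifier and the one-day window are assumptions) that lists elapsed time, slot usage, bytes processed, and whether a result came from cache for recent query jobs:

-- Compare recent runs of the slow and the optimized query (region is an assumption).
SELECT
  job_id,
  query,
  TIMESTAMP_DIFF(end_time, start_time, MILLISECOND) AS elapsed_ms,
  total_slot_ms,
  total_bytes_processed,
  cache_hit
FROM `region-us`.INFORMATION_SCHEMA.JOBS_BY_PROJECT
WHERE creation_time > TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 1 DAY)
  AND job_type = 'QUERY'
ORDER BY creation_time DESC;

Here cache_hit confirms whether the cache really stayed off for a given run, and total_slot_ms gives a timing-independent signal of whether the rewrite reduced the work done.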
I've been working with BigQuery for more than 6 years, and I have never used the approach you describe. In BigQuery, what actually matters is reducing cost, and that can be done by iteratively rewriting the query and by using partitioning/clustering/materialized views and caching/temporary tables.

How to auto refresh Power BI dataset after failure

I have scheduled query for a dataset in Power BI.
In case of a refresh failure, I want Power BI to retry the refresh automatically, up to 5 times.
Is there a way to do it?
For the time being it doesn't seem possible, as confirmed by this post. You can play with the "Command time out in minutes (optional)" setting in your query when creating your data source, as noted in the comments. It is under Advanced options.
If the timeout is left blank, the default is 10 minutes, so if the issue is that your queries are timing out, this may be the fix for you.
Another workaround is to schedule your data source to refresh multiple times at half-hour increments. Note that depending on how big your dataset is, this may place a burden on the server you are pulling from. If that is the case, then looking into incremental refresh would be your next option.
Hope this helps.

Tableau 8.1 taking long time to display report

I have a stored procedure as a source connection in Tableau 8.1. It takes a long time (about 1 min) to fetch and display 40,000 records (there are no bar charts, pie charts, etc.).
The stored proc selects the 40,000 records with some 6-7 table joins.
However, the same stored procedure executes and displays the records in SQL Server Management Studio within 3 seconds.
After using SQL Server Profiler, it shows that some 45,000 inserts into a Tableau temp table occur, which takes a long time. The log file also shows that the inserts take a high percentage of the time, while the execution of the stored proc itself takes only about 4-5 seconds. Is this the problem? Any suggestion how to overcome this issue?
Regards
Gautam S
A few places to start:
First, check out the Tableau log file in your Tableau repository directory after trying to access your data. There will be a lot of information in there, but you should be able to see the actual SQL that Tableau sends to your database, which may give you some clues about what is taking so long. If nothing else, you can cut and paste the SQL into your database tools and try to isolate the problem SQL without Tableau in the mix.
If the log file doesn't give you an idea about how to restructure your system to avoid the long query, then send it along with info about your schema to Tableau support. They may be able to help.
Simplify whatever you can to reduce the problem to its core, get rid of everything in your visualization but a total, and then slowly build it back up to see what causes the behavior. For example, make a test version and remove one table at a time from your query to see what causes the problem.
Avoid (or minimize) quick filters if you see performance problems. They're a nice feature, but they come with a performance cost
Try the Tableau performance monitoring (record and analysis) features
Work with a smaller data set during testing so you can more quickly experiment with different approaches
Try replacing your stored procedure with a view; that's usually better if at all possible (see the sketch after this list)
Add indexes to speed up the joins
If there is no way around the long operation and if updates are infrequent, make a Tableau extract so that you only pay that cost periodically
If none of these things help, cut the problem down to its simplest version and post a schema and the problem SQL. Otherwise, people can only give you generic advice
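For the view suggestion above, a minimal T-SQL sketch with made-up table and column names (not the asker's actual schema):

-- Expose the stored procedure's SELECT as a view so Tableau can push filters down
-- instead of materializing all 40,000 rows into a temp table. Names are assumptions.
CREATE VIEW dbo.vw_ReportData
AS
SELECT o.OrderID, o.OrderDate, c.CustomerName, p.ProductName, o.Quantity, o.Amount
FROM dbo.Orders AS o
JOIN dbo.Customers AS c ON c.CustomerID = o.CustomerID
JOIN dbo.Products AS p ON p.ProductID = o.ProductID;
GO

-- Supporting indexes on the join keys, if they do not already exist.
CREATE INDEX IX_Orders_CustomerID ON dbo.Orders (CustomerID);
CREATE INDEX IX_Orders_ProductID ON dbo.Orders (ProductID);

With a view (or an extract built from it), Tableau can issue its own WHERE clauses and aggregate queries instead of pulling the stored procedure's full result set into a temp table first.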

Gain a Customized report

Goal:
Display the result based on the picture below in Reporting Services 2008 R2.
Problem:
How should I do it?
You also have to remember that in reality the list contains lots of data, maybe a million rows.
In terms of the report itself, this should be a fairly standard implementation.
You'll need to create one Tablix, with one Group for Customer (one row) and one Group for Artist (two rows, one for the headers and one for the Artist name), then a detail row for the Title.
It looks like you need more formatting options for the Customers Textbox - you could merge the cells in the Customer header row, then insert a Rectangle, which will give you more options to move objects around in the row.
For large reports you've got a few options:
Processing large reports: http://msdn.microsoft.com/en-us/library/ms159638(v=sql.105).aspx
Report Snapshots: http://msdn.microsoft.com/en-us/library/ms156325(v=sql.105).aspx
Report Caching: http://msdn.microsoft.com/en-us/library/ms155927(v=sql.105).aspx
I would recommend scheduling a Snapshot overnight to offload the processing to a quiet time, then making sure the report has sensible pagination set up so that not too much data has to be handled at one time when viewed (i.e. not trying to view thousands of rows at once in Report Manager).
Another option would be to set up an overnight Subscription that could save the report to a fileshare or send it as an email.
Basically you're thinking about reducing the amount of processing that needs to be done at peak times and processing the report once for future use to reduce overall resource usage.
I would use a List with text boxes inside for that kind of display.
In addition, you may consider adding a page break after each customer.
Personally, I experienced lots of performance issues when dealing with thousands of rows, not to mention millions.
My advice is to reconsider the report's main purpose: if the report is for exporting purposes, then don't use SSRS for that.
If the report is for viewing, then perhaps it is possible to narrow down the data using parameters based on the user's choice.
Last thing, I wish you good luck :)

Optimize Reporting in Reporting Services

I have 10 reports and 250 customers. All the reports are run by my customers, and each report takes parameters. Depending on the parameters, the same report connects to a different database and gets its results. I know that with different parameters caching is not an option. But I don't want to run these reports on live data during the daytime. Is there anything I can do (snapshot, subscription) that can run overnight and either send these reports or save a snapshot that could be used for the next 24 hours?
Thanks in advance.
As M Fredrickson suggests, subscriptions might work here depending on the number of different reports to be sent.
Another approach is to consolidate your data query into a single shared dataset. Shared datasets can have caching enabled, and there are several options for refreshing that cache, such as on first access or on a timed schedule. See MSDN for more details.
The challenge with a cached dataset is to figure out how to remove all parameters from the actual data query by moving them elsewhere, usually into the dataset filter in the report, or into the filters of the individual data elements, such as your tablixes.
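As a rough sketch of that idea (with hypothetical names), the cached dataset query returns the unfiltered, pre-aggregated result so every execution reuses the same cached copy:

-- Cached shared dataset query: no parameters, so one cached result serves everyone.
-- Table and column names are assumptions.
SELECT CustomerID, ReportDate, Metric1, Metric2
FROM dbo.CustomerReportSummary;

The report then applies a dataset or tablix filter comparing the CustomerID field against the report parameter, so the filtering happens in SSRS rather than in the database.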
I use this approach to refresh a 10-minute query overnight, and then return the report all day long in less than 30 seconds, with many different possible parameters filtering the dataset.
You can also mix this approach with others by using multiple datasets in your report, some cached and some not.
I would suggest going the route of subscriptions. While you could do some fancy hack to get multiple snapshots of a single report, it would be cleaner to use subscriptions.
However, since you've got 250 customers and 10 different reports, I doubt that you'll want to configure and manage 2,500 different subscriptions within Report Manager... so I would suggest that you create a data-driven subscription for each of the reports.
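As a rough sketch (table and column names are assumptions), the query behind such a data-driven subscription returns one row per customer, and its columns are then mapped in Report Manager to delivery settings and report parameters:

-- One row per customer; columns map to delivery settings and report parameters.
-- All object names are assumptions.
SELECT
    c.CustomerID,
    c.EmailAddress AS RecipientEmail,
    'PDF' AS RenderFormat,
    c.DatabaseName AS TargetDatabase
FROM dbo.Customers AS c
WHERE c.IsActive = 1;

Ten such subscriptions (one per report) scheduled overnight would cover all 250 customers without hitting live data during the day.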