SSAS Cube - Excel drillthrough not filtered as required

I have an SSAS DSV similar to the following structure:
Id  Type  Special
1   A     1
2   B     Null
3   A     Null
4   C     1
5   C     Null
I built a dimension for this DSV including one attribute for Type.
Then I have three measures in my cube:
Measure1: Count of rows
Measure2A: Sum of Special
Measure2B: Count of non-empty values for Special
Finally, in Excel, I display the data as follows:
Rows --> Type attribute
Values --> Measure1 / Measure2A / Measure2B
When I look at the results, everything is correct.
For instance, I get a value of 1 for Measure2A and Measure2B on the row where Type = C.
BUT when I attempt to drill through on the related cells, instead of getting 1 row I get 2 (the rows where Type = C, without considering the value of Special).
I guess I am doing something wrong in my design of the cube but cannot understand what.

When determining which rows to show in a drillthrough, SSAS only considers the dimension context, not which detail rows have a non-null measure value.
You could add a new dimension on the Special column and add that dimension as a filter to your PivotTable.
Or you could install ASSP and construct a custom rowset action that fires an MDX query which does a NON EMPTY on your measure.
http://asstoredprocedures.codeplex.com/wikipage?title=Drillthrough&referringTitle=Home
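As a rough illustration, the rowset action would fire a query along these lines (the cube, dimension, and attribute names below are assumptions based on the structure described in the question); NON EMPTY drops the detail members whose Special-based measures are empty:
SELECT
    { [Measures].[Measure2A], [Measures].[Measure2B] } ON COLUMNS,
    -- NON EMPTY keeps only the Id members that have a value for the measures on columns
    NON EMPTY
    { [My Dimension].[Id].[Id].ALLMEMBERS } ON ROWS
FROM [My Cube]
WHERE ( [My Dimension].[Type].&[C] )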

Related

Variance column in QlikView

I need to create a "Variance" column in QlikView:
           2013  2014  Variance
Measure 1  100   110   10%
Measure 2  105   100   -4.8%
...
Can this be done in QlikView with just one "Calculated dimension" column that says something like:
[Value for Column2]/[Value for Column 1] - 1
So that it works for any new measures I add to the table, regardless of what column 1 and column 2 are?
EDIT:
Sample Data:
Year  Measure1  Measure2
2012  9750      197
2013  10000     200
2014  11000     210
2015  11500     215
I need the output to be structured as shown below with the Variance column as a calculation between 2 selected Year dimension values.
You can do this with the Column() function.
Column(2) / Column(1)
The number refers to the Expression column -- first Expression is #1, etc. Dimension columns are not counted.
As an alternative that is insensitive to column position, you can use column labels in the expression. Assuming expressions labelled "Sales" and "Margin", the variance expression can be written as:
[Margin] / [Sales]
I am 99% certain there is no built-in function to do this.
I've played with a few options I thought might work, but only the following has proved of any use.
I've defined the two calculated dimensions as variables, Year1 and Year2, which I change through an input box. Then the simple calculated dimension =[$(Year2)]-[$(Year1)] gives me a new dimension of the variances.
This assumes the measure is something that comes from the data and you just want to display it over varying years; I haven't considered what to do if the measures are all expressions.
Here is how to do it using variables and expressions. This will create 2 new columns that will look like a new dimension but will actually be defined by the use of an if() statement.
The first step, in the script, is to create a dimension that has the measures in it. This should not associate with any of the columns already in the data (UDR stands for User Defined Report). The dual() just allows us to define a sort order that is not alphabetical.
UDR:
load dual(UDR,Sort) as Rows inline [
UDR, Sort
Measure1, 1
Measure2, 2];
The result is a standalone field, Rows, containing the two values Measure1 and Measure2.
The next step is creating the two variables for the years.
I do this in the script, but you could use any method of variable creation.
set vMaxYear="=max(Year)";
set vMinYear="=min(Year)";
Now we use the field Rows as the dimension.
Next, we need to create an expression for the max selected year. Notice that the first if() tests which row of the UDR the expression is on, and then the definition of the expression for that line is provided. The second if() tests the Year field against the variable vMaxYear, which will change as selections are made. The min-year expression is the same; just replace vMaxYear with vMinYear.
if(Rows='Measure1',sum(if(Year=vMaxYear,Measure1)),
if(Rows='Measure2',sum(if(Year=vMaxYear,Measure2))))
Lastly we use the column() function to calculate the variance in the third expression.
column(1)/column(2)
To make the expression labels dynamic, I simply add =vMinYear into the label.
The result is a table that responds to my selections in the Year list box, for example 2013 vs 2014 or 2014 vs 2015.

MDX Query Can't connect Fiscal Month/Quarter to my Measures

I've been building an MDX query using Excel's PowerPivot. I connect to my cube, drag and drop measures/dimensions, and my query has been working just fine, up until I try to pull in different dimensions.
A simple version of my query:
SELECT
NON EMPTY { [Measures].[EP Projected Impressions] } ON COLUMNS,
NON EMPTY { ([EP Hierarchy].[EP Tactic ID].[EP Tactic ID].ALLMEMBERS ) } ON ROWS
FROM [MI_Cube]
This will return:
EP Tactic ID   EP Projected Impressions
1              10
2              20
3              30
4              40
5              50
Now, when I try to pull in date information for each tactic from the Time dimension, it just gives me a copy of the above results for each time dimension member.
Example query:
SELECT
NON EMPTY { [Measures].[EP Projected Impressions] } ON COLUMNS,
NON EMPTY { ([EP Hierarchy].[EP Tactic ID].[EP Tactic ID].ALLMEMBERS * [Time].[Fiscal Year].[Fiscal Year].ALLMEMBERS ) } ON ROWS
FROM [MI_Cube]
Results:
EP Tactic ID   EP Projected Impressions   Fiscal Year
1              10                         FY2015
1              10                         FY2014
1              10                         FY2013
1              10                         FY2012
1              10                         FY2011
2              20                         FY2015
2              20                         FY2014
2              20                         FY2013
2              20                         FY2012
2              20                         FY2011
etc....
Does this mean that I cannot pull the Time.FiscalYear dimension for each TacticID? Or do I need to restructure my query? EP Hierarchy has lots of dimension members I can pull successfully, but when I try to pull anything from EP Hierarchy and Time my results get multiplied instead of combined.
Thanks for any advice; I'm still trying to wrap my head around cubes and MDX queries.
It seems that you are simply missing a relation between the fact table holding the [EP Projected Impressions] measure and the dimension table behind your [Time] dimension.
By adding a relation between a foreign key on the fact table and the primary key on the dimension table, your measures should get correctly filtered by any attributes you slice on the dimension.
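If the cube sits on top of relational tables, one way to make that relationship visible to the DSV is an ordinary foreign key; the table and column names below are purely illustrative guesses:
-- Hypothetical names; adjust to the actual fact and Time dimension tables.
ALTER TABLE FactImpressions
    ADD CONSTRAINT FK_FactImpressions_DimTime
    FOREIGN KEY (TimeKey) REFERENCES DimTime (TimeKey);
The same relationship can also be drawn manually between the two tables in the DSV designer, and the Time dimension then related to the measure group on the cube's Dimension Usage tab.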
Thank you for the responses; it turns out the measure I was using was not connected to the Time dimension. Apparently that was expected behavior. After trying different measures I am getting the results I was expecting.

MDX - Flag - Actual

I have two dimensions, DimFlag and DimPNL, and a fact table FactAmount. I am looking to compute the following:
When the PNL is stat (Is Stat = 1): sum(Actual x FlagId)
That is, for the PNL I multiply the amounts by the FlagId field; basically, if FlagId is 0 then 0 x Actual = 0.
DimFlag
FlagId  FlagLabel
1       NotClosed
0       IsClosed

DimPNL
PNLId  PNLName  Is Stat
1      a        1
2      test     1
3      test2    0

FactAmount
id  PNLId  FlagId  Actual
1   1      1       100
2   2      1       10
3   3      0       120
I tried the following MDX, but it didn't work. Any ideas, please?
Scope (
[Dim PNL].[PNL].members,[Measures].members
);
this = iif([Dim PNL].[PNL].CurrentMember.Properties("Is Stat") =1
,
aggregate([Dim PNL].[PNL].currentmember,[Measures].currentmember)* iif([Dim Flag].[Flag Label].[Flag Label].currentmember = 0, 0, 1),
aggregate([Dim PNL].[PNL].currentmember,[Measures].currentmember)
);
While this type of calculation can be done in MDX, the MDX can get complex and performs badly. I would suggest doing the calculation explicitly, e.g. in the DSV or in a view on the fact table that you then use in the DSV instead of the fact table directly. The result of the calculation would then be another column on which you can base a standard measure.
To do it in the DSV, assuming you use a relational table as the base for the fact table, add a named calculation to it, name the column however you like, and use the expression Actual * FlagID. For the other calculation you may need a subselect, i.e. the expression would be Actual * case when pnlId in (1,2) then 1 else 0 end. You can use any SQL that works as a column expression in a select list as the expression for a named calculation.
Implementing the same in a view on FactAmount, you could express the second calculation better, as you could join the DimPNL table in the view definition and thus use the Is Stat column in the calculation. You would then replace the FactAmount table in the DSV with the view, which has the two additional measure columns.
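As a rough sketch (the column names follow the sample tables in the question; the view name and the computed-column aliases are made up), such a view could look like this:
CREATE VIEW vwFactAmount AS
SELECT f.id,
       f.PNLId,
       f.FlagId,
       f.Actual,
       f.Actual * f.FlagId    AS ActualNotClosed,  -- first calculation: Actual x FlagId
       f.Actual * p.[Is Stat] AS ActualStat        -- second calculation, using Is Stat from DimPNL
FROM FactAmount AS f
JOIN DimPNL AS p ON p.PNLId = f.PNLId;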
In either case, just define two measures on the two new columns in the cube, and you are done.
As a rule, calculations that are done at the record level in the fact table, before any aggregation, should be done at data loading time, i.e. as described above.

SSRS - Is there a way to have a table split by a page break to appear on the same page?

I am creating a report using MS Report Builder 3.0. For this report, I have a stored procedure built that filters down to the specific rows needed, and then I use a row group to group on a particular field (pass_no). The table that is displayed is 2 columns and 3 rows within the row group. The basic description of what I want to accomplish is instead of the rows running onto the next page, I want the rows to continue on the same page in a new set of 2 columns. Think of it like a newspaper where the text continues in a new column rather than running down onto the next page.
For the example I'm going to use here, there are 12 rows of data returned by the SP, and 8 unique values in the pass_no column which is what my row group is grouped on. So in the report I end up with 8 groups of 3 rows. I'm aiming to have the table display 6 pass_no values (so 6 groups of 3 rows) before, for lack of better terminology, starting a new table.
My first approach at this has been to create a column group and set the grouping expression to the following:
=Floor((RowNumber(Nothing) - 1) / 6)
While this works in creating a new set of 2 columns, the split for the new columns is based on the row number from the raw data returned by the SP rather than on the number of row sets created by the row group. So because there are 12 rows returned, and the 6th and 7th rows have the same pass_no value, the second set of columns duplicates that one set of data. Also, the top 6 rows of the second column set are blank, with the second set of values appearing below the first set.
If I add an additional column group where it is also grouping by pass_no, then I don't get the duplicate values, but I do get a pair of columns for each pass_no as well (as would be expected). I've tried modifying the expression above a bit and changed Nothing to the row group name and have tried the table name, but neither of them have yielded the desired result.
I can't alter the SP to do the grouping there because there are other column values that are not identical and I pull that data into a cell value expression within the table using Join(LookupSet()).
I have also considered creating 2 tables and applying a filter to the table so the first table only displays the first 6 results and the second table displays the remaining results, but that also looks at the raw data rather than the groupings and TOP N can't be used on pass_no as it's a text value, not an integer. This would also cause problems if I need to go to 3 tables.
So long story short, is there a way to do a table break rather than a page break or to overflow columns onto the same page rather than onto a new page?
Here's the pertinent portions of the Dataset:
http://sqlfiddle.com/#!2/5082b/1
PASS_NO     MASTERTRAN  TRANS_NO    DESCRIPTION              IS_MOD
7913019000  4931019000  4931019000  General Admission Adult  0
7914019000  4932019000  4932019000  Sea Turtle Hosp Adult    0
7914019000  4932019000  4933019000  2:00 PM SEA TURTLE HOSP  1
7916019000  4934019000  4934019000  Sea Turtle Hosp Child    0
7916019000  4934019000  4935019000  2:00 PM SEA TURTLE HOSP  1
7917019000  4934019000  4934019000  Sea Turtle Hosp Child    0
7917019000  4934019000  4935019000  2:00 PM SEA TURTLE HOSP  1
7918019000  4934019000  4934019000  Sea Turtle Hosp Child    0
7918019000  4934019000  4935019000  2:00 PM SEA TURTLE HOSP  1
7922019000  4936019000  4936019000  General Admission Child  0
7923019000  4936019000  4936019000  General Admission Child  0
7924019000  4936019000  4936019000  General Admission Child  0
I think your data presents a bit of a problem.
As you've already figured out, typically for this sort of setup you'd set up a row group with an expression like:
=(RowNumber(Nothing) - 1) Mod 6
And a column group expression like:
=Ceiling(RowNumber(Nothing) / 6)
This would create a six row tablix that would grow horizontally as required.
See this SO question for a similar example.
However, you currently have the requirement of also grouping by another column - pass_no in your case. Normally you can approximate a group-level row number with an expression like:
=RunningValue(Fields!pass_no.Value, CountDistinct, "DataSet1")
Unfortunately, when you try to add this into one of the grouping expressions like:
=Ceiling(RunningValue(Fields!pass_no.Value, CountDistinct, "DataSet1") / 6)
You get the following error:
A group expression for the tablix 'Tablix1' includes the aggregate function RunningValue. RunningValue cannot be used in group expressions.
Based on all this, my recommendation is to try and get a Dataset that has one row per pass_no value and base the tablix on this, with the above row/column grouping expressions, i.e. no need to group on multiple pass_no rows. So in your example it would have eight rows. You could then have a separate Dataset with all the individual rows and use a lookupset function to concatenate the description, etc.
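For example, the concatenation in the detail cell could look something like this (the dataset name "DetailDataset" and the exact field names are assumptions):
=Join(
    LookupSet(Fields!pass_no.Value,
              Fields!pass_no.Value,
              Fields!description.Value,
              "DetailDataset"),
    vbCrLf)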
Your other option is to try and get everything on one Dataset only, including the aggregates as required. This might not be possible, but for description at least you can leverage any of the various techniques here to get a delimited list. Once you have this list you can replace the delimiter with vbCrLf to split it back over multiple rows.
All this is a very long-winded way of saying that I don't know if your requirement is possible with your data, but if you look at having at least one Dataset with one row per pass_no you should be able to make it work.

Dynamic use of MDX AVG function

Anyone have advice on how to build an average measure that is dynamic -- it doesn't specify a particular slice but instead uses your current view? I'm working within a front-end OLAP viewer (Strategy Companion) and I need a "dynamic" implementation based on the dimensions that are currently filtered in the data view.
My fact table looks something like this:
Key  AmountA  IndicatorA  AmountB  Other Data
1    5        1           null     25
2    6        1           null     52
3    7        1           2        106
4    null     0           4        108
Now I can specify a simple average for "[Measures].[AmountA]" with "[Measures].[AmountA] / [Measures].[IndicatorA]" which works great - "[IndicatorA]" sums up to the number of non-null values of "[AmountA]". And this also works great no matter what dimensions are selected in the view - it always divides by the count of rows that have been filtered in.
But what about [AmountB]? I don't have a null indicator column. I want to get an average value of [AmountB] for whatever rows have been filtered in for my current view. If I try to use the count of rows as a simple formula (pseudo-code "[Measures].[AmountB] / Count([Measures].[Key])") I get the wrong result, because it is counting all the null rows in the average.
So, I need a way to use the AVG function to specify the average of [AmountB] over the set of "whatever rows I'm currently filtering in, based on whatever dimensions I'm currently using". How do I specify this dynamic set?
I've tried several different uses of the AVG function and they have either returned null or summed up to huge numbers, clearly not the average I'm looking for.
Thanks-
Matt
Sorry, my first suggestion was wrong. If you don't have access to the OLAP cube, you can't write an MDX query for this purpose (IMHO), because at this access level you don't have any detailed data (from your fact table) and can only use aggregated data and dimensions from your cube.
Otherwise (if you have access to the OLAP database), you can create this metric (a count of the non-NULL rows) in your measure group and then use it for the AVG calculation (as a calculated member in your cube, or in the WITH section of your MDX query).
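For example, once a physical measure counting the non-null AmountB rows exists in the measure group (called [AmountB Cnt] here purely for illustration, with an assumed cube name), the average could be defined roughly like this:
WITH MEMBER [Measures].[AmountB Avg] AS
    IIF([Measures].[AmountB Cnt] = 0,
        NULL,  -- avoid divide-by-zero on slices with no non-null AmountB rows
        [Measures].[AmountB] / [Measures].[AmountB Cnt])
SELECT { [Measures].[AmountB Avg] } ON COLUMNS
FROM [My Cube]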