I'm working on an OLAP cube (with Analysis Services 2008). In my database I have a column with datatype NUMBERS(40,30). This column holds values with more than 10 digits after the decimal separator, for example 125.256987452122 or 159.2365894123658.
In my OLAP cube, that column is mapped to a measure. When I look at the values in the cube,
they don't match the database. For example: 125.256987452122 shows up in the cube as 125.2569, and 159.2365894123658 as 159.2365.
Even when I set the measure property FormatString = "### ### ### ### ##0.0000000000;-### ### ### ### ##0.0000000000", I get 125.256987452122 ==> in cube 125.2569000000
and 159.2365894123658 ==> in cube 159.2365000000.
The measure datatype is Double. I changed it to Currency but I have the same problem.
Does anyone know how to get the same values in my OLAP cube as in the database, i.e.
159.2365894123658 ==> in cube 159.2365894123658?
Thanks for your answers.
Not an optimal solution perhaps, but consider storing the data in the underlying database as float, or use a view to load your fact table that converts it to float. An SSAS cube will only show 4 decimal digits unless you use float.
You can verify this by exploring the data with your current datatype in the DSV: it shows all decimals, yet SSAS will only show 4 digits.
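For example, a view along these lines could feed the DSV instead of the base table. This is only a sketch; the table and column names (dbo.MyFactTable, FactKey, Amount) are placeholders, not from the question:

CREATE VIEW dbo.vFactForCube
AS
SELECT f.FactKey,                          -- hypothetical key column(s) the cube needs
       CAST(f.Amount AS float) AS Amount   -- the high-precision decimal column converted to float
FROM dbo.MyFactTable AS f;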
I'm running into a problem when trying to aggregate (sum) a calculated column that was created in an Aggregation node of another calculation view.
Calculation View : TEST2
Projection 1 (Plain projection of another query)
[Screenshot: Projection1]
Aggregation 1: sum Amount_LC by HKONT and Unique_document_identifier. In this aggregation, a calculated column Clearing_sum is created with the following formula:
[Screenshot: Aggregation1]
[Question 1] The result of this calculation in the raw data preview makes sense to me, but the result in the Analysis tab seems incorrect. What causes this difference between Analysis and Raw Data?
[Screenshot: Result Raw Data]
[Screenshot: Result Analysis]
I thought it might be that, instead of summing up, the analysis re-applies the formula of Clearing_sum because it lives in the same node.
So I tried creating a new calculation view (TEST3) with a projection on TEST2 (all columns included) and ran it to check the output. I still get the same result (correct raw data but incorrect analysis).
[Screenshot: Test3]
[Screenshot: Result Analysis Test3]
[QUESTION 2] How can I get my desired result? (e.g. the sum of Clearing_sum for the highlighted row should be 2 according to the Raw Data tab). I also tried enabling client-side aggregation on the calculated column, but it did not help.
Without the actual models (and not just screenshots) it is hard to tell what the cause of the problem is.
One possible cause is that removing HKONT changed the grouping level of the underlying view that computes SUM(Amount_LC); that, in turn, changes the calculation of Clearing_sum.
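As a rough SQL illustration of the effect: the real Clearing_sum formula is in your screenshot, so the CASE expression and table name below are only hypothetical stand-ins.

-- Fine grouping level, as in Aggregation 1: the formula is evaluated per
-- HKONT / Unique_document_identifier group.
SELECT HKONT,
       Unique_document_identifier,
       SUM(Amount_LC) AS Amount_LC,
       CASE WHEN SUM(Amount_LC) < 0 THEN 1 ELSE 0 END AS Clearing_sum
FROM SourceTable
GROUP BY HKONT, Unique_document_identifier;

-- Coarser level once HKONT is dropped from the query: the formula is re-evaluated
-- on the re-aggregated Amount_LC instead of summing the Clearing_sum values above,
-- which is why Analysis and Raw Data can differ.
SELECT Unique_document_identifier,
       SUM(Amount_LC) AS Amount_LC,
       CASE WHEN SUM(Amount_LC) < 0 THEN 1 ELSE 0 END AS Clearing_sum
FROM SourceTable
GROUP BY Unique_document_identifier;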
A way to avoid this is to instruct HANA not to strip those unreferenced columns and not to change the grouping level. To do that, the Keep Flag needs to be set for the columns that should stay part of the grouping.
For a more detailed explanation of this flag, check the documentation and/or blog posts like Usage of “Keep Flag”.
I am trying to build the cube and am getting the error below:
What should I do to resolve it?
Internal error: The operation terminated unsuccessfully.
Errors in the OLAP storage engine: The attribute key cannot be found when processing: Table: 'dbo.FACT1', Column: 'LoanAge', Value: '-93'. The attribute is 'LoanAge'.
Errors in the OLAP storage engine: The record was skipped because the attribute key was not found. Attribute: LoanAge of Dimension: LoanAge from Database: Cube_Data, Cube: Bond Analytics OLAP, Measure Group: FACT1, Partition: Fact Combined SUBPRIME 20180401 HPI Median, Record: 185597.
Errors in the OLAP storage engine: The process operation ended because the number of errors encountered during processing reached the defined limit of allowable errors for the operation.
Errors in the OLAP storage engine: An error occurred while processing the 'Fact Combined SUBPRIME 20180401 HPI Median' partition of the 'FACT1' measure group for the 'Bond Analytics OLAP' cube from the Cube_Data database.
Server: The current operation was cancelled because another operation in the transaction failed.
Internal error: The operation terminated unsuccessfully.
Errors in the OLAP storage engine: An error occurred while processing the 'Fact Combined ALTA_20180401 HPI Median' partition of the 'FACT1' measure group for the 'Bond Analytics OLAP' cube from the Cube_Data database.
Greg actually replied in a comment under your question.
Let me expand his explanation a bit.
Table dbo.FACT1 has a row with column LoanAge = -93.
It is record #185597 when the cube runs its T-SQL query to fetch the data for the Fact Combined SUBPRIME 20180401 HPI Median partition.
However, this value (-93) is not present in the LoanAge attribute of the LoanAge dimension.
To fix it you need to (a T-SQL sketch of the lookup and insert follows the list):
add this value to the LoanAge dimension table
run "Process Update" on the LoanAge dimension
process the Fact Combined SUBPRIME 20180401 HPI Median partition again
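A rough sketch of the first step; the dimension table name dbo.DimLoanAge is an assumption, so substitute your actual one:

-- Find fact values that have no matching member in the dimension:
SELECT DISTINCT f.LoanAge
FROM dbo.FACT1 AS f
LEFT JOIN dbo.DimLoanAge AS d ON d.LoanAge = f.LoanAge
WHERE d.LoanAge IS NULL;

-- Add the missing member (-93 in your case), then run Process Update on the
-- LoanAge dimension and reprocess the partition:
INSERT INTO dbo.DimLoanAge (LoanAge)
VALUES (-93);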
And figure out why the dimension has no -93 value.
You probably need to implement a late-arriving dimension scenario if your facts arrive earlier than the dimension values.
E.g. when an unknown value comes in from the facts, add it to the dimension with some default name (e.g. 'Unknown -93') and update it later once the dimension reference table has this code.
That is the common pattern, although it doesn't apply neatly to such a simple attribute as age (a numeric value with no additional description).
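A sketch of that late-arriving pattern, again assuming a hypothetical dbo.DimLoanAge table that has a Description column:

-- Insert any fact value missing from the dimension with a placeholder description;
-- the description can be corrected later when the reference data arrives.
INSERT INTO dbo.DimLoanAge (LoanAge, Description)
SELECT DISTINCT f.LoanAge,
       'Unknown ' + CAST(f.LoanAge AS varchar(10))
FROM dbo.FACT1 AS f
WHERE NOT EXISTS (SELECT 1 FROM dbo.DimLoanAge AS d WHERE d.LoanAge = f.LoanAge);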
I know this is mainly a design problem. I've read there is a workaround for this issue by customising errors at processing time, but I am not keen on having to ignore errors; the cube processing is also scheduled, so ignoring errors is not a choice, or at least not a good one.
This is the part of my cube where the error is thrown.
DimTime
PK (int)
MyMonth (int, example: 201501, 201502, 201503, etc.)
Other columns...
FactBudget
PK (int)
Month (int, example: 201501, 201502, 201503, etc.)
Other columns...
The relation in the DSV is set as follows:
DimTiempo = DimTime, FactPresupuesto = FactBudget, periodo = MyMonth, PeriodoPresupFK = Month
(Just translated for understanding.)
The relationship in the cube is as follows:
The cube was built without problems; when processing, the error "The attribute key cannot be found when processing" was thrown.
It was thrown because FactBudget has some Month values (201510, 201511, 201512, for example) which DimTime doesn't, so the integrity is broken.
As mentioned in the answer here, this can be solved in the ETL process. I think I can do nothing to preserve the relationship if the fact table has foreign keys that have not been inserted in the dimensions.
Note that MyMonth can hold values 201501, 201502, 201503, etc.; it is year and month concatenated. DimTime is loaded incrementally and that column is calculated day by day, so at the moment DimTime has no values for 201507 onwards.
Is there a workaround or pattern to handle this kind of relationship?
Thanks for considering my question.
I believe the process you are following is incorrect: you should set up any time-related dimensions via a degenerate/fact dimension methodology. That is, the time dimension would not really be a true dimension; rather, it is populated from the fact table itself, which contains the time. If you look up degenerate dimensions you'll see what I mean.
Is there a reason why you're incrementally populating DimTime? It certainly isn't the standard way to do it. The values you use in your fact need to already exist in the dimension. I would simply script up a full set of data for DimTime and stop the incremental updates of it.
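A rough script for that, assuming the PK on DimTime is an identity (or is handled elsewhere) and only the YYYYMM key needs to be generated:

-- Generate every month from 201501 through 203012 and add the ones DimTime is missing.
;WITH months AS (
    SELECT CAST('20150101' AS date) AS d
    UNION ALL
    SELECT DATEADD(MONTH, 1, d) FROM months WHERE d < '20301201'
)
INSERT INTO dbo.DimTime (MyMonth)
SELECT YEAR(d) * 100 + MONTH(d)
FROM months
WHERE NOT EXISTS (SELECT 1 FROM dbo.DimTime AS t
                  WHERE t.MyMonth = YEAR(d) * 100 + MONTH(d))
OPTION (MAXRECURSION 0);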
I ran into this issue while trying to process a cube in Halo BI. It seems that some datetime conversion styles are supported by SQL Server but not by Halo BI. This statement does not cause an error:
CAST(CONVERT(char(11),table.[col name],113) AS datetime) AS [col name]
However, this one does not process without an error:
CAST(CONVERT(char(16),table.[col name],120) AS datetime) AS [col name]
Both of these work in SQL Server Management Studio 2012, however.
Another cause of this error is cube measures being improperly mapped to the fact table.
I am using Reporting Services 2012 and have a chart that uses a dataset whose data changes based on parameters.
The data is just a set of periods formatted as YYYYMM (an int), a machine number (int), and numbers (decimal(12,2)). We select by machine number and period, pull back all those decimal(12,2) numbers, and show them in the chart.
It works for most machines, but for a few machines we pick we get the following error:
An error occurred during local report processing. An error occurred during report processing. The processing of Parent for the chart 'chart1' cannot be performed. Cannot compare data of types System.Int32 and System.String. Please check the data type returned by the Parent.
A machine number that works is 516. One that doesn't is 517. Nothing is different in the SQL results returned for 516 and 517 besides different numbers, 5.23 instead of 5.17 for example. There are no nulls in the data, no zeros, and definitely no strings.
Any help as to where to look next would be appreciated.
I don't know if this will be helpful or not, but the fix that eliminated the error was to change the SQL query to use
cast(machno as varchar)
everywhere machno appeared in the query. This doesn't explain why the chart needed a string instead of an int.
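For reference, the query shape ended up roughly like this; the table, column, and parameter names are hypothetical:

SELECT CAST(d.machno AS varchar(10)) AS machno,  -- machno returned as a string instead of an int
       d.period,
       d.amount
FROM dbo.MachineData AS d
WHERE d.machno = @MachNo
  AND d.period = @Period;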
Is there such a thing as a constant in SSAS?
Example (this is really happening where I work): everyone agrees that gigabytes are converted to megabytes by 1000 (not 1024) and terabytes to megabytes by 1000000.
Where would you store a number like that, used across the board?
If it's inside the cube, you could create a calculated member that stores it. Define it in the cube's calculation script; constants are fine in there.
In the cube calculation script:
CREATE MEMBER CURRENTCUBE.Measures.MBtoGigs AS 1000;
Query against the cube:
SELECT Measures.MBtoGigs ON COLUMNS FROM [Cube]
One possible pitfall I would point out is that using constants like this can alter the way you'd expect NON EMPTY behaviour to work in your queries, as a constant is never 'empty'.
Having said that, you can define your own non-empty behaviour for calculated measures, so remember to try that with any calculated measures that involve constants if you run into issues.
Where/how do you need to use it?
You can always create a fact table with a column holding that value (1000), which will become a measure group, and set the aggregation type on that measure to LastNonEmpty.
Since this value is in its own measure group, it can easily be used in the expression property of another measure in a different measure group.
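A minimal sketch of such a table; the names are made up for illustration:

-- One-row table that becomes its own measure group; the measure built on MBtoGB
-- gets the LastNonEmpty aggregation so it always returns 1000.
CREATE TABLE dbo.FactConstants (MBtoGB int NOT NULL);
INSERT INTO dbo.FactConstants (MBtoGB) VALUES (1000);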