Date dimension and cumulative Months - sql

The Date dimension is in the following state:
In SSAS looks like this:
Based on this, how can I build a hierarchy in SSAS with the following structure:
2016
Jan-Feb
Jan-Mar
Jan-Apr
Jan-May
Jan-Jun
...
where Jan-Apr will be the cumulative Amount from January until April in 2016.

The downvotes are probably because this is a bad dimension design. You can get a cumulative aggregation by dropping all the Jan-Feb etc. members, using a simple Date dimension, and using the PeriodsToDate function.

Your hierarchy should go Year -> Month -> Day, and so on. To support cumulative figures, combine your hierarchy with a range dimension that includes month ranges such as Jan-Apr. The hierarchy is not used as ranges directly; you have to combine both together to meet the requirement.
The range dimension will have a minimum value (the starting month) and a maximum value (the ending month) of one particular range. An example dimension follows:
DIM_Range
+---------+-----------+-----------+
| Sur_Key | Min Value | Max Value |
+---------+-----------+-----------+
| 01      | Jan       | Apr       |
| 02      | May       | August    |
+---------+-----------+-----------+
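The arithmetic such a range member produces can be sketched in Python (the month amounts below are made up for illustration):

```python
# Hypothetical monthly amounts for 2016: month number -> amount.
amounts = {1: 100, 2: 80, 3: 120, 4: 50}

def range_total(amounts, min_month, max_month):
    """Sum every month that falls inside the range, giving the
    cumulative amount, e.g. Jan-Apr = January through April."""
    return sum(v for m, v in amounts.items() if min_month <= m <= max_month)

print(range_total(amounts, 1, 4))  # 350
```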

Related

Python Pandas: How can I group different days of a month that are part of my dataset into one month?

I am currently working on my master's thesis and I have a problem with regard to data organization in pandas. I downloaded multiple economic indicators that are published once a month and consolidated them in one dataframe.
However, the economic indicators are released on different days each month. Therefore my dataframe has, for example, five different rows for January 2020 (e.g. January 1st, January 5th, January 13th, January 28th, January 31st) and many "NaN" values in each row.
I want to organize my data so that I have one row for each month, so for example one row for January 2020. However, I cannot figure out how to solve this problem in pandas.
Another challenge is that sometimes data is released on both March 1st and March 31st. Consolidating everything into one month could therefore lead to new problems if the values are summed up.
The table below visualizes my problem. My index column in the dataframe are the dates.
| Dates    | Indicator 1 | Indicator 2 |
|----------|-------------|-------------|
| 01.01.20 | 1           | 1           |
| 08.01.20 | 2           | NaN         |
| 02.02.20 | 5           | 5           |
| 01.03.20 | 8           | 6           |
| 31.03.20 | 7           | 7           |
I already tried pd.to_period or pd.groupby, but I could not solve the problem.
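One possible approach (a sketch, assuming the layout shown in the table above): group by calendar month and take each column's last non-NaN value, so nothing is summed.

```python
import pandas as pd
import numpy as np

# Frame mirroring the question: several release dates per month,
# with NaN where an indicator was not published on that date.
df = pd.DataFrame(
    {"Indicator 1": [1, 2, 5, 8, 7],
     "Indicator 2": [1, np.nan, 5, 6, 7]},
    index=pd.to_datetime(
        ["2020-01-01", "2020-01-08", "2020-02-02", "2020-03-01", "2020-03-31"]),
)

# Collapse to one row per month; GroupBy.last() takes the latest
# non-NaN value per column, so values are never added together.
monthly = df.groupby(df.index.to_period("M")).last()
print(monthly)
```

This keeps the March 31st value for March rather than summing the two March releases.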

How to get MAX date for currency conversion across multiple periods using MDX

Summary
I'm new to MDX but have been asked to enhance currency conversion functionality in a cube. The problem itself seems simple.
For any given selected period, get the last date, and use that as the Reporting date for FX currency conversion.
For example, user selects periods Jan 2018 through to Oct 2018. The last date would be 31-Oct-2018.
The currency conversion is many to one reporting currency so we would take rates from the 31-Oct-2018.
Methods Tried
I have tried various methods to get the last date using:
MAX (with and without existing)
ClosingPeriod
Separate FX Rate Measure Group and Date Dimension
These work fine if the user selects one period and the calculation is only for that period.
However, the problem is that across multiple periods (think YTD or LTD calculations) the end rate is taken from each month.
For example, each line item here in the corresponding month would be converted using the rate at period value for that month (Jan 7.285) rather than using the Oct 2018 value of 7.027.
Period | Rate at Period | LastDate | LastDateExisting
Jan 2018 | 7.285 | 20191231 | 20180131
Feb 2018 | 7.273 | 20191231 | 20180228
Mar 2018 | 7.275 | 20191231 | 20180331
Apr 2018 | 7.201 | 20191231 | 20180430
May 2018 | 7.146 | 20191231 | 20180531
Jun 2018 | 7.118 | 20191231 | 20180630
Jul 2018 | 7.116 | 20191231 | 20180731
Sep 2018 | 7.102 | 20191231 | 20180930
Oct 2018 | 7.027 | 20191231 | 20181031
Example Code
Here is an example of the MAX method I have found in various MDX cookbooks to get the last date. I am assuming that due to CurrentMember I am getting results per period. I realise this is normal MDX behaviour, but due to the requirement I need to break it.
The MAX example without EXISTING is the closest, as it returns the same date, but I am not sure whether it can easily be filtered to the actual closing period.
WITH
MEMBER Measures.[LastDate] AS
    MAX(
        [Accounting Date].[Year - Quarter - Month - Date].[Date].MEMBERS,
        IIF(
            [Measures].[Rate at Period] = 0,
            NULL,
            [Accounting Date].[Year - Quarter - Month - Date].CurrentMember.Member_Key
        )
    )
MEMBER Measures.[LastDateExisting] AS
    MAX(
        EXISTING [Accounting Date].[Year - Quarter - Month - Date].[Date].MEMBERS,
        IIF(
            [Measures].[Rate at Period] = 0,
            NULL,
            [Accounting Date].[Year - Quarter - Month - Date].CurrentMember.Member_Key
        )
    )
Expected Results
I would like to see the LastDate column as all '20181031', as below.
Period | Rate at Period (overridden) | LastDate
Jan 2018 | 7.027 | 20181031
Feb 2018 | 7.027 | 20181031
Mar 2018 | 7.027 | 20181031
Apr 2018 | 7.027 | 20181031
May 2018 | 7.027 | 20181031
Jun 2018 | 7.027 | 20181031
Jul 2018 | 7.027 | 20181031
Sep 2018 | 7.027 | 20181031
Oct 2018 | 7.027 | 20181031
Any help appreciated and happy to provide more information should you need it.
Looks like you already figured this bit:
WITH
MEMBER Measures.[LastDate] AS
    MAX(
        [Accounting Date].[Year - Quarter - Month - Date].[Date].MEMBERS,
        IIF(
            [Measures].[Rate at Period] = 0,
            NULL,
            [Accounting Date].[Year - Quarter - Month - Date].CURRENTMEMBER.MEMBER_KEY
        )
    )
So to get the Rate at Period (overridden) you can just use the date you've found and build a measure using a tuple. The only slight change is that you will need to add the LastDate member to a hierarchy other than Measures. You can take your pick; in our cubes I usually use the Languages hierarchy, as it is hardly ever used:
WITH
MEMBER <OtherHierarchy>.[LastDate] AS
    MAX(
        [Accounting Date].[Year - Quarter - Month - Date].[Date].MEMBERS,
        IIF(
            [Measures].[Rate at Period] = 0,
            NULL,
            [Accounting Date].[Year - Quarter - Month - Date].CURRENTMEMBER.MEMBER_KEY
        )
    )
MEMBER Measures.[Rate at Period (overridden)] AS
    (
        <OtherHierarchy>.[LastDate],
        [Measures].[Rate at Period]
    )
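The effect of that tuple override can be sketched in Python (column names and rates below are taken from the question's table, not from the cube):

```python
import pandas as pd

# Hypothetical monthly rates mirroring the question's table.
rates = pd.DataFrame({
    "period": ["Jan 2018", "Feb 2018", "Oct 2018"],
    "date_key": [20180131, 20180228, 20181031],
    "rate": [7.285, 7.273, 7.027],
})

# LastDate is the maximum date key over the whole selection; the tuple
# (LastDate, Rate at Period) then resolves every row to the rate at
# that single date instead of each row's own period-end rate.
last_key = rates["date_key"].max()
rates["rate_overridden"] = rates.loc[rates["date_key"] == last_key, "rate"].iloc[0]
print(rates)
```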

MDX - Get total from selected period

I want to calculate a percentage value of a selected period. I don't know how to handle it.
|            | Quantity | CalcMember |
|------------|----------|------------|
| January    | 5        |            |
| 2015-01-01 | 1        | 20%        |
| 2015-01-02 | 2        | 40%        |
| 2015-01-03 | 2        | 40%        |
I need only the total of my selected period (from day X to day Y), not the result of the whole month, for my calculation.
The issue is summarizing the filtered members within the calculated member.
edit: I found a solution!
I have to create a dynamic set
CREATE DYNAMIC SET CurrentCube.[SelectedDates] AS [Date].[YearMonth].[Date].Members;
CREATE MEMBER CURRENTCUBE.[Measures].[Percentage] AS
[Measures].[Qty] / SUM([SelectedDates], [Measures].[Qty]),
format_string = "Percent"
but this works only when the dates are in the rows...
Use the EXISTING function:
CREATE HIDDEN DYNAMIC SET CURRENTCUBE.[SelectedDates] AS
    EXISTING [Date].[YearMonth].[----smallest level of your hierarchy---].Members;

CREATE MEMBER CURRENTCUBE.[Measures].[Percentage] AS
    [Measures].[Qty] / // or something like [Original Value] if you want to do this for multiple measures at once
    SUM([SelectedDates], [Measures].[Qty]);
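The arithmetic the dynamic set performs, sketched in Python using the quantities from the question's table:

```python
# Quantity per day in the selected period (from the question's table).
qty = {"2015-01-01": 1, "2015-01-02": 2, "2015-01-03": 2}

# The denominator is the sum over only the selected dates, not the
# whole month, so each day's share is relative to the selection.
total = sum(qty.values())
pct = {d: q / total for d, q in qty.items()}
print(pct)  # 20%, 40%, 40%
```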

DAX SUMMARIZE() with filter - Powerpivot

Rephrasing a previous question after further research. I have a denormalised hierarchy of cases, each with an ID, a reference to their parent (or themselves) and a closure date.
Cases
| ID | Client    | ParentMatterName | MatterName | ClaimAmount | OpenDate  | CloseDate |
|----|-----------|------------------|------------|-------------|-----------|-----------|
| 1  | Mr. Smith | ABC Ltd          | ABC Ltd    | $40,000     | 1 Jan 15  | 4 Aug 15  |
| 2  | Mr. Smith | ABC Ltd          | John       | $0          | 20 Jan 15 | 7 Oct 15  |
| 3  | Mr. Smith | ABC Ltd          | Jenny      | $0          | 1 Jan 15  | 20 Jan 15 |
| 4  | Mrs Bow   | JQ Public        | JQ Public  | $7,000      | 1 Jan 15  | 4 Aug 15  |
After the help of greggyb I also have another column, Cases[LastClosed], which will be true if the current row is closed, and is the last closed of the parent group.
There is also a second table of payments, related to Cases[ID]. These payments could be received in parent or child matters. I sum payments received as follows:
Recovery All Time:=CALCULATE([Recovery This Period], ALL(Date_Table[dateDate]))
I am looking for a new measure that calculates the total recovered for a unique ParentMatterName if the last closed matter in that group was closed in the financial year we are looking at (30 June year end).
I am now looking at the SUMMARIZE() function for the first part of this, but I don't know how to filter it; the layers of CALCULATE are confusing. I've looked at this MSDN blog, but it appears that it will filter to only the total payments for the matter that was last closed (not adding the related children).
My current formula is:
Recovery on Closed This FY :=
CALCULATE (
SUMX (
SUMMARIZE (
MatterListView,
MatterListView[UniqueParentName],
"RecoveryAllTime", [Recovery All Time]
),
[RecoveryAllTime]
)
)
All help appreciated.
Again, your problem is much more easily solved with a model addition. Remember, storage is cheap; your end users are impatient.
Just store in your Cases table a column with the LastClosedDate of every parent matter, which indicates the date associated with the last closed child matter. Then it's a simple filter to return only those payments/matters that have LastClosedDate in the current fiscal year. Alternately, if you know for certain that you are only concerned with the year, you could store just LastClosedFiscalYear, to make your filter predicate a bit simpler.
If you need help with specific measures or how you might implement the additional field, let us know (I'd recommend adding these fields at the source, or deriving them in the source query rather than using calculated columns).
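The suggested LastClosedDate column can be sketched in Python against the question's Cases table (this mimics what a source-query derivation would produce):

```python
import pandas as pd

# The Cases rows from the question (dates converted to ISO format).
cases = pd.DataFrame({
    "ID": [1, 2, 3, 4],
    "ParentMatterName": ["ABC Ltd", "ABC Ltd", "ABC Ltd", "JQ Public"],
    "CloseDate": pd.to_datetime(
        ["2015-08-04", "2015-10-07", "2015-01-20", "2015-08-04"]),
})

# Denormalise the latest CloseDate of each parent-matter group onto
# every row of that group, so a fiscal-year filter on LastClosedDate
# keeps or drops the whole group together.
cases["LastClosedDate"] = (
    cases.groupby("ParentMatterName")["CloseDate"].transform("max")
)
print(cases)
```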

How to calculate the broadcast year and month out of the given date?

Is there a way to calculate the broadcast year and month for a given Gregorian date?
The advertising broadcast calendar differs from the regular calendar, in the way that every month needs to start on a Monday and end on a Sunday and have exactly 4 or 5 weeks. You can read about it here: http://en.wikipedia.org/wiki/Broadcast_calendar
This is a pretty common thing in TV advertising, so I guess there is a standard mathematical formula for it, that uses a combination of date functions (week(), month(), etc...).
Here is an example mapping between Gregorian and broadcast dates:
| gregorian_date | broadcast_month | broadcast_year |
+----------------+-----------------+----------------+
| 2014-12-27 | 12 | 2014 |
| 2014-12-28 | 12 | 2014 |
| 2014-12-29 | 1 | 2015 |
| 2014-12-30 | 1 | 2015 |
| 2014-12-31 | 1 | 2015 |
| 2015-01-01 | 1 | 2015 |
| 2015-01-02 | 1 | 2015 |
Here is example how the broadcast calendar looks for 2015:
http://www.rab.com/public/reports/BroadcastCalendar_2015.pdf
As far as I can see, the pattern is that the first of the Gregorian month always falls within the first week of the Broadcast calendar, and any days from the previous month are pulled forward into that month to create full weeks. In Excel, you can use the following formula in cell B2 (first of your broadcast months above) to calculate the broadcast month:
=MONTH(A2+(7-WEEKDAY(A2,2)))
Similarly, in cell C2:
=IF(AND(MONTH(A2)=12,B2=1),YEAR(A2)+1,YEAR(A2))
This will return the broadcast month and year for any dates you put into your data set.
Hope that helps!
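The same logic as a Python sketch (the function name is my own): map each date to its week-ending Sunday; that Sunday's calendar month and year are the broadcast month and year, matching the Excel formulas above.

```python
from datetime import date, timedelta

def broadcast_month_year(d: date) -> tuple[int, int]:
    """Broadcast month/year of d: every broadcast week ends on a Sunday,
    and the week belongs to the month that Sunday falls in."""
    week_ending_sunday = d + timedelta(days=6 - d.weekday())  # Monday=0 .. Sunday=6
    return week_ending_sunday.month, week_ending_sunday.year

# Matches the mapping table above:
print(broadcast_month_year(date(2014, 12, 28)))  # (12, 2014)
print(broadcast_month_year(date(2014, 12, 29)))  # (1, 2015)
```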
month,first,last
2018_1,2018-01-01,2018-01-28
2018_2,2018-01-29,2018-02-25
2018_3,2018-02-26,2018-03-25
2018_4,2018-03-26,2018-04-29
2018_5,2018-04-30,2018-05-27
2018_6,2018-05-28,2018-06-24
2018_7,2018-06-25,2018-07-29
2018_8,2018-07-30,2018-08-26
2018_9,2018-08-27,2018-09-30
2018_10,2018-10-01,2018-10-28
2018_11,2018-10-29,2018-11-25
2018_12,2018-11-26,2018-12-30
2019_1,2018-12-31,2019-01-27
2019_2,2019-01-28,2019-02-24
2019_3,2019-02-25,2019-03-31
2019_4,2019-04-01,2019-04-28
2019_5,2019-04-29,2019-05-26
2019_6,2019-05-27,2019-06-30
2019_7,2019-07-01,2019-07-28
2019_8,2019-07-29,2019-08-25
2019_9,2019-08-26,2019-09-29
2019_10,2019-09-30,2019-10-27
2019_11,2019-10-28,2019-11-24
2019_12,2019-11-25,2019-12-29
2020_1,2019-12-30,2020-01-26
2020_2,2020-01-27,2020-02-23
2020_3,2020-02-24,2020-03-29
2020_4,2020-03-30,2020-04-26
2020_5,2020-04-27,2020-05-31
2020_6,2020-06-01,2020-06-28
2020_7,2020-06-29,2020-07-26
2020_8,2020-07-27,2020-08-30
2020_9,2020-08-31,2020-09-27
2020_10,2020-09-28,2020-10-25
2020_11,2020-10-26,2020-11-29
2020_12,2020-11-30,2020-12-27
2021_1,2020-12-28,2021-01-31
2021_2,2021-02-01,2021-02-28
2021_3,2021-03-01,2021-03-28
2021_4,2021-03-29,2021-04-25
2021_5,2021-04-26,2021-05-30
2021_6,2021-05-31,2021-06-27
2021_7,2021-06-28,2021-07-25
2021_8,2021-07-26,2021-08-29
2021_9,2021-08-30,2021-09-26
2021_10,2021-09-27,2021-10-31
2021_11,2021-11-01,2021-11-28
2021_12,2021-11-29,2021-12-26
2022_1,2021-12-27,2022-01-30
2022_2,2022-01-31,2022-02-27
2022_3,2022-02-28,2022-03-27
2022_4,2022-03-28,2022-04-24
2022_5,2022-04-25,2022-05-29
2022_6,2022-05-30,2022-06-26
2022_7,2022-06-27,2022-07-31
2022_8,2022-08-01,2022-08-28
2022_9,2022-08-29,2022-09-25
2022_10,2022-09-26,2022-10-30
2022_11,2022-10-31,2022-11-27
2022_12,2022-11-28,2022-12-25
2023_1,2022-12-26,2023-01-29
2023_2,2023-01-30,2023-02-26
2023_3,2023-02-27,2023-03-26
2023_4,2023-03-27,2023-04-30
2023_5,2023-05-01,2023-05-28
2023_6,2023-05-29,2023-06-25
2023_7,2023-06-26,2023-07-30
2023_8,2023-07-31,2023-08-27
2023_9,2023-08-28,2023-09-24
2023_10,2023-09-25,2023-10-29
2023_11,2023-10-30,2023-11-26
2023_12,2023-11-27,2023-12-31
2024_1,2024-01-01,2024-01-28
2024_2,2024-01-29,2024-02-25
2024_3,2024-02-26,2024-03-31
2024_4,2024-04-01,2024-04-28
2024_5,2024-04-29,2024-05-26
2024_6,2024-05-27,2024-06-30
2024_7,2024-07-01,2024-07-28
2024_8,2024-07-29,2024-08-25
2024_9,2024-08-26,2024-09-29
2024_10,2024-09-30,2024-10-27
2024_11,2024-10-28,2024-11-24
2024_12,2024-11-25,2024-12-29
2025_1,2024-12-30,2025-01-26
2025_2,2025-01-27,2025-02-23
2025_3,2025-02-24,2025-03-30
2025_4,2025-03-31,2025-04-27
2025_5,2025-04-28,2025-05-25
2025_6,2025-05-26,2025-06-29
2025_7,2025-06-30,2025-07-27
2025_8,2025-07-28,2025-08-31
2025_9,2025-09-01,2025-09-28
2025_10,2025-09-29,2025-10-26
2025_11,2025-10-27,2025-11-30
2025_12,2025-12-01,2025-12-28