I want to calculate a percentage value over a selected period, but I don't know how to handle it.
Date       | Quantity | CalcMember
-----------+----------+-----------
January    |        5 |
2015-01-01 |        1 | 20%
2015-01-02 |        2 | 40%
2015-01-03 |        2 | 40%
For my calculation I need only the total of the selected period (from day X to day Y), not the total for the whole month.
The issue is summarizing the filtered members within the calculated member.
edit: I found a solution!
I have to create a dynamic set
CREATE DYNAMIC SET CurrentCube.[SelectedDates] AS [Date].[YearMonth].[Date].Members;
CREATE MEMBER CURRENTCUBE.[Measures].[Percentage] AS
[Measures].[Qty] / SUM([SelectedDates], [Measures].[Qty]),
format_string = "Percent"
but this works only when the dates are in the rows...
Use the EXISTING function:
CREATE HIDDEN DYNAMIC SET [SelectedDates] AS
EXISTING [Date].[YearMonth].[----smallest level of your hierarchy---].Members;
CREATE MEMBER CURRENTCUBE.[Measures].[Percentage] AS
[Measures].[Qty] /  // or something like [Original Value] from numeric calculations if you want to do that for multiple measures at once
SUM([SelectedDates], [Measures].[Qty]);
I'm currently running TimescaleDB. I have a table that looks similar to the following:
        one_day         | name | metric_value
------------------------+------+--------------
 2022-05-30 00:00:00+00 | foo  |          400
 2022-05-30 00:00:00+00 | bar  |          200
 2022-06-01 00:00:00+00 | foo  |          800
 2022-06-01 00:00:00+00 | bar  |         1000
I'd like a query that returns the % growth and raw growth of metric, so something like the following.
name | % growth | growth
-----+----------+-------
foo  | 200%     |    400
bar  | 500%     |    800
I'm fairly new to TimescaleDB and not sure what the most efficient way to do this is. I've tried using LAG, but the main problem I'm facing is that OVER (GROUP BY time, url) doesn't restrict the window to rows with the same name, and I can't seem to get around it. The query works fine for a single name.
Thanks!
Use LAG to get the previous value for the same name using the PARTITION option:
lag(metric_value,1,0) over (partition by name order by one_day)
This says: within each 'name', ordered by 'one_day', give me the value of 'metric_value' from the previous row (the second parameter to LAG says 1 row back); if there is no previous row, give me '0' (the third parameter).
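Putting it together, a sketch of the full query might look like the following (the table name metrics is an assumption; swap in your hypertable's name, and note that the percentage here is computed as current/previous to match the sample output):

-- Sketch: raw growth and percent-of-previous per name, via LAG with PARTITION BY.
-- "metrics" is an assumed table name; the columns follow the sample data above.
SELECT name,
       one_day,
       metric_value,
       metric_value - prev_value                            AS growth,
       round(100.0 * metric_value / nullif(prev_value, 0))  AS pct_growth
FROM (
    SELECT name,
           one_day,
           metric_value,
           lag(metric_value, 1, 0) OVER (PARTITION BY name ORDER BY one_day) AS prev_value
    FROM metrics
) t
WHERE prev_value <> 0;  -- skip rows that have no previous value for that name

For the sample data this should return growth 400 / 200% for foo and 800 / 500% for bar on 2022-06-01.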
How to get previous date member only among selected/visible members of date dimension?
I've tried the PREVMEMBER and LAG functions, but they return the previous calendar date (yesterday).
Data in OLAP cube:
DATE | SUM
-----------------
2018-09-01 | 500
2018-09-02 | 150
2018-09-03 | 300
2018-09-04 | 777
2018-09-05 | 900
2018-09-06 | 1200
2018-09-07 | 1500
In my query I'm selecting different dates in a filter, and I need to get the SUM of the previous visible date:
DATE | SUM | PREV_SUM
-------------------------------
2018-09-02 | 150 | NULL
2018-09-04 | 777 | 150 (from 2018-09-02)
2018-09-07 | 1500 | 777 (from 2018-09-04)
My MDX query:
WITH
MEMBER PREV_MEMBER AS
MEMBERTOSTR([dim_date].[Day Id].CURRENTMEMBER.PREVMEMBER)
MEMBER PREV_MEMBER_LAG AS
MEMBERTOSTR([dim_date].[Day Id].CURRENTMEMBER.lag(1))
MEMBER PREV_SUM AS
SUM(
STRTOMEMBER(PREV_MEMBER),
[Measures].[SUM]
)
SELECT
NON EMPTY {
[Measures].[SUM],
PREV_SUM,
PREV_MEMBER,
PREV_MEMBER_LAG
} ON COLUMNS,
NON EMPTY {(
[dim_date].[Day Id].ALLMEMBERS
)} ON ROWS
FROM (
SELECT ({
[dim_date].[Day Id].&[20180902],
[dim_date].[Day Id].&[20180904],
[dim_date].[Day Id].&[20180907]
}) ON COLUMNS
FROM [cub_main]
)
My result (returns yesterday):
DATE | SUM | PREV_SUM | PREV_MEMBER | PREV_MEMBER_LAG
--------------------------------------------------------------------------------------------
20180902 | 150 | 500 | [dim_date].[Day Id].&[20180901] | [dim_date].[Day Id].&[20180901]
20180904 | 777 | 300 | [dim_date].[Day Id].&[20180903] | [dim_date].[Day Id].&[20180903]
20180907 | 1500 | 1200 | [dim_date].[Day Id].&[20180906] | [dim_date].[Day Id].&[20180906]
How can I get PREV_SUM only among selected/displayed members?
Try creating a custom set in your WITH clause, made up of the selected dates.
Then use the RANK and ITEM functions to find the previous member within that set, so the lookup only considers the dates in the set.
(Apologies, I am away from a PC so I'm unable to test.)
WITH
// Create custom set
SET SSS AS
{
[dim_date].[Day Id].&[20180902],
[dim_date].[Day Id].&[20180904],
[dim_date].[Day Id].&[20180907]
}
// Find the current date member's rank (1-based) in the custom set
// Subtract 2 to get the 0-based index of the previous member in the set
// Use the ITEM function to fetch that member
MEMBER PREV_MEMBER AS
SETTOSTR(SSS.ITEM(RANK([dim_date].[Day Id].CURRENTMEMBER, SSS)-2))
MEMBER PREV_SUM AS
SUM(
STRTOSET(PREV_MEMBER),
[Measures].[SUM]
)
SELECT
NON EMPTY {
[Measures].[SUM],
PREV_MEMBER,
PREV_SUM
} ON COLUMNS,
NON EMPTY {(
// Use custom set in rows as a date filter
SSS
)} ON ROWS
FROM [cub_main]
My data is like -
+-----------+------------------+-----------------+-------------+
| Issue Num | Created On | Closed at | Issue Owner |
+-----------+------------------+-----------------+-------------+
| 1 | 12/21/2016 15:26 | 1/13/2017 9:48 | Name 1 |
| 2 | 1/10/2017 7:38 | 1/13/2017 9:08 | Name 2 |
| 3 | 1/13/2017 8:57 | 1/13/2017 8:58 | Name 2 |
| 4 | 12/20/2016 20:30 | 1/13/2017 5:46 | Name 2 |
| 5 | 12/21/2016 19:30 | 1/13/2017 1:14 | Name 1 |
| 6 | 12/20/2016 20:30 | 1/12/2017 9:11 | Name 1 |
| 7 | 1/9/2017 17:44 | 1/12/2017 1:52 | Name 1 |
| 8 | 12/21/2016 19:36 | 1/11/2017 16:59 | Name 1 |
| 9 | 12/20/2016 19:54 | 1/11/2017 15:45 | Name 1 |
+-----------+------------------+-----------------+-------------+
What I am trying to achieve is
Number of issues created per week
Number of issues closed per week
Net number of issues remaining per week
I am able to resolve the first two points but am unable to approach the last.
My attempt so far gives me the number of issues created every week, and I have done the same for issues closed per week.
For the net number of issues (created minus closed), I tried adding the Closed At column alongside Created On, but I can't get a second bar to appear in the chart next to Created On.
I tried doing the same in Excel: I want something of that sort, but with another column showing the difference between the number of issues created in a week and the number closed in that week.
In this case, 8 - 6 = 2.
You could use a calculated field (Analysis -> Create Calculated Field). Something like this:
{FIXED [Create Date]:Count(if DATEPART('year',[Create Date]) = 2016 then [Number of Records] end)} - {FIXED [Closed Date]:Count(if DATEPART('year',[Closed Date]) = 2016 then [Number of Records] end)}
This formula uses LOD expressions to pull back both sets of values. It filters both date fields to 2016 results and then subtracts one count from the other.
For more on LODs, see here:
https://www.tableau.com/about/blog/LOD-expressions
Use this as your measure and pull in one of your date fields as the dimension.
The normal way to solve this problem is to reshape the data so you have one row per status change instead of one row per issue, with a column named [Date] and a column named [Action]. The action can be Submit or Close (or, in a more complex world, Approve, Reject and so on, tracking the full history).
You can do the reshaping without modifying your source data by using a UNION to get two copies of each row, with calculated fields that make the visible columns make sense: for example, a calculated field called Date that returns the submission date or closing date depending on whether the row comes from the first or second half of the union, and a similar field called Action whose value depends on that as well. Filter out Close actions that have a null date. (A sketch of this in SQL appears below.)
Or you can preprocess the data to reshape it.
Or you can use data blending to make two sources that point to the same data, customizing the linking fields to line up the submit and close dates (e.g., duplicate the data connection and rename both date fields to the same name). In this case you probably want to create a scaffolding source that has every date but no other data, to use as the primary data source, so that dates which appear only in the secondary source are not filtered out. The blending approach can be brittle.
Assuming you used the UNION approach instead of Data Blending, then you can count the number of submissions and closures within a certain date range, or compute a running total of the difference to see the backlog size over time.
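As a rough illustration of the UNION reshaping described above, here is a sketch in SQL (the table name Issues and the exact column names are assumptions taken from the sample data; in Tableau itself you would typically do this with a self-union plus calculated Date and Action fields):

-- Sketch: reshape to one row per status change instead of one row per issue.
-- "Issues" and its column names are assumed from the sample data above.
SELECT [Issue Num], [Issue Owner], [Created On] AS [Date], 'Submit' AS [Action]
FROM Issues
UNION ALL
SELECT [Issue Num], [Issue Owner], [Closed at] AS [Date], 'Close' AS [Action]
FROM Issues
WHERE [Closed at] IS NOT NULL;  -- drop Close actions that have a null date

With this shape, counting rows per week split by Action gives issues created and closed per week, and a running total of (Submits minus Closes) gives the open backlog over time.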
I'm going to do my best to explain this so I apologize in advance if my explanation is a little awkward. If I am foggy somewhere, please tell me what would help you out.
I have a table filled with circuits and dates. Each circuit gets trimmed on a time cycle of about 36 or 48 months; I have a column that gives me this info. There is one record for every time a circuit's trim cycle has been completed. I am attempting to link a known circuit outage list to a table with their outage data, and to a table with each circuit's trim history. The twist is the following:
I only want to get back circuits that have exceeded their trim cycles by 6 months. So I would need to take all records for a circuit, look at each individual record, find the most recent previous record relative to the record currently being examined (every record needs to be examined individually), calculate the difference between the two records in months, and then return only the records where that gap exceeded the circuit's cycle by 6 months.
Here is an example of the data:
+----+--------+----------+-------+
| ID | feeder | comp | cycle |
| 1 | 123456 | 1/1/2001 | 36 |
| 2 | 123456 | 1/1/2004 | 36 |
| 3 | 123456 | 7/1/2007 | 36 |
| 4 | 123456 | 3/1/2011 | 36 |
| 5 | 123456 | 1/1/2014 | 36 |
+----+--------+----------+-------+
Here is an example of the result set I would want (please note: cycle can vary by circuit, so the value in the cycle column needs to be in the calculation to determine if I exceeded the cycle by 6 months between trimmings):
+----+--------+----------+-------+
| ID | feeder | comp | cycle |
| 3 | 123456 | 7/1/2007 | 36 |
| 4 | 123456 | 3/1/2011 | 36 |
+----+--------+----------+-------+
This is the query I started, but I'm struggling to work out how to do the date calculations correctly:
SELECT temp_feederList.Feeder, Temp_outagesInfo.causeType, Temp_outagesInfo.StormNameThunder,
       Temp_outagesInfo.deviceGroup, Temp_outagesInfo.beginTime,
       tbl_Trim_History.COMP, tbl_Trim_History.CYCLE
FROM (temp_feederList
LEFT JOIN Temp_outagesInfo ON temp_feederList.Feeder = Temp_outagesInfo.Feeder)
LEFT JOIN tbl_Trim_History ON Temp_outagesInfo.Feeder = tbl_Trim_History.CIRCUIT_ID;
I wasn't really able to figure out where I need to go from here to get that most recent entry and perform the mathematical comparison. I've never been asked to do SQL this complex before, so I want to thank all of you for your patience and any assistance you're willing to lend.
I'm making some assumptions, but this uses a subquery to give you rows in the feeder list where the previous completed date was greater than the number of months ago indicated by the cycle:
SELECT tbl_Trim_History.ID, tbl_Trim_History.feeder,
tbl_Trim_History.comp, tbl_Trim_History.cycle
FROM tbl_Trim_History
WHERE tbl_Trim_History.comp>
(SELECT Max(DateAdd("m", tbl_Trim_History.cycle, comp))
FROM tbl_Trim_History T2
WHERE T2.feeder = tbl_Trim_History.feeder AND
T2.comp < tbl_Trim_History.comp)
If you needed to check for longer than 36 months, you could add an arbitrary number of months to the value calculated by the DateAdd function.
Also, I don't know whether the value of cycle specifies the number of months from the prior cycle or the number of months to the next one. If the latter, I would change tbl_Trim_History.cycle in the DateAdd function to just cycle.
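For example, to require that a trim exceed its cycle by 6 months (as asked in the original question), the DateAdd call in the subquery might become something like this, where the extra 6 is the grace period in months:
(SELECT Max(DateAdd("m", tbl_Trim_History.cycle + 6, comp))
 FROM tbl_Trim_History T2
 WHERE T2.feeder = tbl_Trim_History.feeder AND
       T2.comp < tbl_Trim_History.comp)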
SELECT tbl_trim_history.ID, tbl_trim_history.Feeder,
tbl_trim_history.Comp, tbl_trim_history.Cycle,
(select max(comp) from tbl_trim_history T
where T.feeder=tbl_trim_history.feeder and
t.comp<tbl_trim_history.comp) AS PriorComp,
IIf(DateDiff("m",[priorcomp],[comp])>36,"x",Null) AS [Select]
FROM tbl_trim_history;
This query identifies (with an X in the last column) the records from tbl_trim_history that exceed the cycle time, but as noted in the comments I'm not entirely sure whether this is what you need, or how to incorporate the other two tables. Once you see what it is doing, you can modify it to keep only the records you need.
Rephrasing a previous question after further research. I have a denormalised hierarchy of cases, each with an ID, a reference to its parent (or itself), and a closure date.
Cases
ID | Client    | ParentMatterName | MatterName | ClaimAmount | OpenDate  | CloseDate
---+-----------+------------------+------------+-------------+-----------+----------
 1 | Mr. Smith | ABC Ltd          | ABC Ltd    | $40,000     | 1 Jan 15  | 4 Aug 15
 2 | Mr. Smith | ABC Ltd          | John       | $0          | 20 Jan 15 | 7 Oct 15
 3 | Mr. Smith | ABC Ltd          | Jenny      | $0          | 1 Jan 15  | 20 Jan 15
 4 | Mrs Bow   | JQ Public        | JQ Public  | $7,000      | 1 Jan 15  | 4 Aug 15
With greggyb's help I also have another column, Cases[LastClosed], which is true if the current row is closed and is the last closed matter of its parent group.
There is also a second table of payments, related to Cases[ID]. These payments could be received in parent or child matters. I sum payments received as follows:
Recovery All Time:=CALCULATE([Recovery This Period], ALL(Date_Table[dateDate]))
I am looking for a new measure that will calculate the total recovered for a unique ParentMatterName, if the last closed matter in that group was closed in the financial year we are looking at (30 June year end).
I am now looking at the SUMMARIZE() function to do the first part of this, but I don't know how to filter it; the layers of CALCULATE are confusing. I've looked at this MSDN blog, but it appears that it will filter down to the total payments for only the matter that was last closed (not adding the related children).
My current formula is:
Recovery on Closed This FY :=
CALCULATE (
SUMX (
SUMMARIZE (
MatterListView,
MatterListView[UniqueParentName],
"RecoveryAllTime", [Recovery All Time]
),
[RecoveryAllTime]
)
)
All help appreciated.
Again, your problem is much more easily solved with a model addition. Remember: storage is cheap, and your end users are impatient.
Just store in your Cases table a column with the LastClosedDate of every parent matter, i.e. the date associated with the last closed child matter. Then it's a simple filter to return only those payments/matters whose LastClosedDate falls in the current fiscal year. Alternatively, if you know for certain that you are only concerned with the year, you could store just LastClosedFiscalYear to make your filter predicate a bit simpler.
If you need help with specific measures, or with how you might implement the additional field, let us know (I'd recommend adding these fields at the source, or deriving them in the source query, rather than using calculated columns).
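As a rough illustration of deriving LastClosedDate in the source query (the source table name Cases and its column names follow the sample above and are assumptions about your actual source):

-- Sketch: attach LastClosedDate (latest CloseDate within each parent matter)
-- to every Cases row so the model can filter on it directly.
SELECT c.ID,
       c.Client,
       c.ParentMatterName,
       c.MatterName,
       c.ClaimAmount,
       c.OpenDate,
       c.CloseDate,
       p.LastClosedDate
FROM Cases AS c
INNER JOIN (
    SELECT ParentMatterName, MAX(CloseDate) AS LastClosedDate
    FROM Cases
    GROUP BY ParentMatterName
) AS p
    ON p.ParentMatterName = c.ParentMatterName;

A measure can then filter on LastClosedDate falling within the selected financial year (July to June) before summing the recoveries for the parent matter.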