PowerPivot: return the value from the last date for each unique code

Good day everyone,
So I've been struggling with this calculation in PowerPivot, where I want a formula that returns the last project status for each project, taken from that project's last date. I have used a few formulas to at least get the last date for each project number, but have not managed to get that right either. I've tried variants of LASTNONBLANK, MAX, LASTDATE, etc., but they all miss the crucial bit of doing it per project number.
=CALCULATE(MAX(GetTimeSheetData[Date]),ALL(GetTimeSheetData))
=LASTNONBLANK(GetTimeSheetData[Date],GetTimeSheetData[Code])
...and many others.
I have a file at the link below that shows what I need to do; any help will be most welcome. I'm busy going crazy!
https://drive.google.com/file/d/0B29yA0i2Te9ycFVSclBuZUVzSUU/view?usp=sharing
Thanks!!
Llewellyn

Here is the code for the max date:
=CALCULATE( MAXA( GetTimeSheetData[Date]), ALLEXCEPT(GetTimeSheetData,GetTimeSheetData[Code]))
This code would get you the last status in a calculated column:
=CALCULATE(
VALUES ( GetTimeSheetData[Project Status]),
FILTER( ALLEXCEPT(GetTimeSheetData, GetTimeSheetData[Code], GetTimeSheetData[CalculatedColumn1]),
GetTimeSheetData[Code] = EARLIER( GetTimeSheetData[Code]) &&
GetTimeSheetData[Date] = CALCULATE( MAX( GetTimeSheetData[Date]),
ALLEXCEPT(GetTimeSheetData, GetTimeSheetData[Code], GetTimeSheetData[CalculatedColumn1]),
GetTimeSheetData[Code] = EARLIER( GetTimeSheetData[Code]))))
This code would get you a calculated field rather than utilizing a column:
Current Status:=CALCULATE(
VALUES ( GetTimeSheetData[Project Status]),
FILTER( ALLEXCEPT(GetTimeSheetData, GetTimeSheetData[Code]),
GetTimeSheetData[Date] = CALCULATE( MAX( GetTimeSheetData[Date]),
ALLEXCEPT(GetTimeSheetData, GetTimeSheetData[Code]))))
Either way, for the last example of taking the last status, you still need a way to identify which entry is truly the last one when more than one row shares the same last date. In this case, I would suggest either a) a timestamp rather than a date, or b) another identifying field.
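As a rough sketch of option a), assuming a hypothetical GetTimeSheetData[Timestamp] column (not in the original workbook) that records the exact entry time, the measure could break the tie on that column instead of the date:
Current Status:=CALCULATE(
VALUES ( GetTimeSheetData[Project Status]),
FILTER( ALLEXCEPT(GetTimeSheetData, GetTimeSheetData[Code]),
// GetTimeSheetData[Timestamp] is a hypothetical column, see the note above
GetTimeSheetData[Timestamp] = CALCULATE( MAX( GetTimeSheetData[Timestamp]),
ALLEXCEPT(GetTimeSheetData, GetTimeSheetData[Code]))))
Because a timestamp is unique per row, VALUES() then returns exactly one status per project code.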

Related

Summing the value of the aggregate function Last()

I am trying to design a report in SSRS that returns the last opening stock for each product in the dataset. To achieve this I used
=Last(Fields!CustProdAdj_new_openingstockValue.Value).
This works fine, but I ran into a problem when trying to get the sum of all the opening stock values across the products. I tried using
=sum(Last(Fields!CustProdAdj_new_openingstockValue.Value))
but I got the following error on preview:
[Error on Preview]
Is there another way to go about this?
I have tried using Aggregate() and RunningValue(), to no avail.
(The original question included screenshots of the dataset, the report layout, and the preview output when using Max().)
This is probably easier to do directly in the dataset query, but assuming you cannot change that, this should work.
This assumes your data is ordered by the CustProdAdj_createdon column and that this is a date, datetime or some other ordered value; change this bit if required.
=SUM(
    IIF(Fields!CustProdAdj_createdon.Value = Max(Fields!CustProdAdj_createdon.Value, "MyRowGroupNameHere"),
        Fields!CustProdAdj_new_openingstockValue.Value,
        0)
)
Change MyRowGroupNameHere to the name of the row group, spelled exactly as it appears in the row group panel below the main design panel. It is case sensitive, and the quotes must be included.
What this does is, for each row within the row group "MyRowGroupNameHere", compare the CustProdAdj_createdon date to the maximum CustProdAdj_createdon across the row group. If they are the same, it returns the CustProdAdj_new_openingstockValue; otherwise it returns 0.
This will return the value required on only 1 record within the group.
For example, if you had 1 row per day then only on the last day of the month would a value be returned other than 0 because only the date of the last record would match the maximum date within the group.
The outer SUM then simply adds these results up.
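If you can change the dataset query after all, a windowed query is one way to return only the last row per product up front. This is only a sketch: the table name CustProdAdj and the CustProdAdj_productid grouping column are assumptions, since the real query is not shown.
-- Assumed table name and product-key column; only the value and date
-- columns come from the question.
SELECT CustProdAdj_productid,
       CustProdAdj_new_openingstockValue,
       CustProdAdj_createdon
FROM (
    SELECT CustProdAdj_productid,
           CustProdAdj_new_openingstockValue,
           CustProdAdj_createdon,
           ROW_NUMBER() OVER (PARTITION BY CustProdAdj_productid
                              ORDER BY CustProdAdj_createdon DESC) AS rn
    FROM CustProdAdj
) AS latest
WHERE rn = 1;
Each product then contributes exactly one opening-stock row, so a plain Sum() in the report gives the total you are after.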

Set Analysis - Current year and previous financial periods

I'm trying to get the code below to run correctly: I want to get the sum of the field Movement where the field GlYear is the same as the year selected and the field GlPeriod is less than or equal to the period selected.
sum({$ <GlYear = {"{'1'} =$(=max(GlYear))"}, GlPeriod = {"{'1'} <=$(=max(GlPeriod))"}>}Movement)
I can't convert the two fields to dates as there are 16 financial periods within a year to be reviewed.
sum({$ <GlYear = {"$(=max(GlYear))"}, GlPeriod = {"<=$(=max(GlPeriod))"}>}Movement)
I'm not sure what you are trying to achieve with the {'1'} parts.
My expression gives me the expected result based on some quick test data.
Tip: it usually helps me figure out set analysis to leave the expression without a caption, so that I can see the results of the $() expansion in the chart.
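For illustration only, with hypothetical selections where the latest GlYear is 2015 and the latest GlPeriod is 9, the $(=...) parts would expand so that the expression becomes:
sum({$ <GlYear = {"2015"}, GlPeriod = {"<=9"}>} Movement)
i.e. Movement is summed for year 2015 over periods 1 to 9, with the set modifier overriding any current selection on GlPeriod.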

Using IF THEN in Access 2010 Query

I'm not very knowledgeable in coding of Access queries, so I hope someone can help with this issue.
I have a query (built with the query builder) that has a field named RetrainInterval from table tblProcedures (this returns a number like 1, 3, 6, 12, etc., i.e. the number of months after which the particular document has to be retrained) and another field named Training/Qualification Date from table tblTrainingRecords.
I want the query to look at the RetrainInterval for a given record (the record field is ClassID in tblProcedures), then look at the Training/Qualification Date, and work out whether that record should be included in the results.
In a module I would do this:
If RetrainInterval = 1 Then
    DateAdd("m", 1, [Training/Qualification Date])   ' add to query if <= today() + 30
ElseIf RetrainInterval = 3 Then
    DateAdd("m", 3, [Training/Qualification Date])   ' add to query if <= today() + 30
ElseIf ...
How can I translate this into something that would work in a query? My end goal is to generate a report that shows me which document class numbers are due within a specified time interval (say I enter 30 in a form textbox to mean any required training coming up within 30 days of running the query), but all of the calculations to determine this are based on when the last training date was (stored in the training records table). I also want to make sure that I do not get multiple rows for the same class number, since there will be multiple training entries for each class; I just want to grab the minimum last training date. I hope I explained it well enough. It's hard to put into words what I am trying to do without posting the entire database.
UPDATE
I think I have simplified this a bit after getting some rest. Here are two images, one of the current query and one of what comes up in the report. I have been able to refine this a bit, but now my problem is that I only want each particular Class to show once on the report, not twice, even though I have multiple retrain due dates (because everything is looking at the table that holds the employee training data, which will have multiple trainings for each Class number). I would like to show only one date, the oldest. Hope that makes sense.
Query - http://postimg.org/image/cpcn998zx/
Report - http://postimg.org/image/krl5945l9/
When RetrainInterval = 1, you add 1 month to [Training/Qualification Date].
When RetrainInterval = 3, you add 3 months to [Training/Qualification Date].
And so on.
The pattern appears to be that RetrainInterval is the number of months to add. If that is true, use RetrainInterval directly in your DateAdd() expression and don't bother with IF THEN.
DateAdd("m", RetrainInterval, [Training/Qualification Date])
You cannot do that directly in a query. Been there, cursed that!
You can use IIf(2 > x, 1, 0), which returns 1 if the condition is true and 0 if it is false.
You cannot return a criterion like IIf(2 > x, Cell > 2, Cell > 0); that is not possible. It will just return 0 if you try, I think, and it will not always give an error.
You have to use criteria instead.
I would do something like in the picture I attached (screenshot not reproduced here).
I hope you can follow; otherwise, let me know.

Last Available value MDX

I have a requirement wherein I have to extract data from a cube, within an SSRS dataset using the query builder, with the time dimension in the result set, across a range of dates. The conditions are:
The measures are to be displayed for each day of the date range.
The subtotal row should have the last available measure value for that time range.
There is a time filter (currently a single date filter with a multi-select option).
My MDX is below.
The measure has 'Sum' as its aggregation type.
I have a calculated measure with the scope defined as below.
SCOPE([MEASURES].[Measure1]);
SCOPE([Date].[Date].MEMBERS);
THIS = TAIL(EXISTING ([Date].[Date].MEMBERS),1).ITEM(0) ;
END SCOPE;
END SCOPE;
The scope statement above works perfectly. However, when I select more than one date member, the query slows way down. Performance numbers:
Across 1 date - 4 seconds
Across 2 dates - 22 minutes
Across 3 dates - unknown (hours)
This drastic degradation in performance goes away if I remove the scope statement, which makes me think that there should be a better way to do the same thing. The final report query is below.
SELECT
NON EMPTY
{[Measures].[Measure1]} ON COLUMNS
,NON EMPTY
{ [Dimension1].[Dimension1].[Dimension1].ALLMEMBERS*
[Dimension2].[Dimension2].[Dimension2].ALLMEMBERS*
[Dimension3].[Dimension3].[Dimension3].ALLMEMBERS*
[Date].[Date].[Date].ALLMEMBERS
} ON ROWS
FROM (
SELECT {[Date].[Date].&[2014-06-13T00:00:00]
,[Date].[Date].&[2014-06-16T00:00:00] } ON COLUMNS
FROM [Cube]
)
So the question again is: is there a way to compute the "last available value" part of the scope statement with better performance? And is there another way to write the final MDX that would help performance?
Please let me know if anything is unclear about the question.
Thanks
Srikanth
The first optimization step would be to change your query to
SELECT
NON EMPTY
{[Measures].[Measure1]} ON COLUMNS
,NON EMPTY
{ [Dimension1].[Dimension1].[Dimension1].ALLMEMBERS*
[Dimension2].[Dimension2].[Dimension2].ALLMEMBERS*
[Dimension3].[Dimension3].[Dimension3].ALLMEMBERS*
{[Date].[Date].&[2014-06-13T00:00:00], [Date].[Date].&[2014-06-16T00:00:00] }
} ON ROWS
FROM [Cube]
Furthermore, I am not sure why you added the inner SCOPE([Date].[Date].MEMBERS). Maybe it helps to omit it and the corresponding END SCOPE.

Grouping, totaling in Rails and Active Record

I'm trying to group a series of records in Active Record so I can do some calculations to normalize the quantity attribute of each record. For example:
A user enters a date and a quantity. Dates are not unique, so I may have 10 - 20 quantities for each date. I need to work with only the totals for each day, not every individual record, because after determining the highest and lowest values I convert each one by basically dividing by n, which is usually 10.
This is what I'm doing right now:
def heat_map(project, word_count, n_div)
return "freezing" if word_count == 0
words = project.words
counts = words.map(&:quantity)
max = counts.max
min = counts.min
return "max" if word_count == max
return "min" if word_count == min
break_point = (max - min).to_f/n_div.to_f
heat_index = (((word_count - min).to_f)/break_point).to_i
end
This works great if I display a table of all the word counts, but I'm trying to apply the heat map to a calendar that displays running totals for each day. The method above obviously doesn't total by day, so I end up with numbers that are out of the normal scale.
I can't figure out a way to group the word counts and total them by day before I do the normalization. I tried doing a group_by and then adding the map call, but I got an "undefined method" error. Any ideas? I'm also open to better or cleaner ways of normalizing the word counts.
Hard to answer without knowing a bit more about your models. So I'm going to assume that the date you're interested in is just the created_at date in the words table. I'm assuming that you have a field in your words table called word where you store the actual word.
I'm also assuming that you might have multiple entries for the same word (possibly with different quantities) in the one day.
So, this will give you an ordered hash of counts of words per day:
project.words.group('DATE(created_at)').group('word').sum('quantity')
If those guesses make no sense, then perhaps you can give a bit more detail about the structure of your models.
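If the calendar only needs daily totals rather than per-word counts, a minimal sketch (under the same guesses about created_at and quantity) is to sum per day first and normalize those totals instead of the individual records:
# Total quantity per day (a hash of date => total), using the assumed columns
daily_totals = project.words.group('DATE(created_at)').sum(:quantity)

counts = daily_totals.values
max, min = counts.max, counts.min
break_point = (max - min).to_f / 10

daily_totals.each do |day, total|
  # Same normalization as heat_map, but measured against the daily totals
  heat_index = ((total - min).to_f / break_point).to_i
  # ...use heat_index to colour the calendar cell for `day`
end
The special "freezing", "max" and "min" cases from heat_map can be layered on top in the same way if you still need them.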