SSRS Group Sort Using Group Variable - vba

I'm looking to sort the group "Part" using the variable "Coverage". The variable's expression is:
=( IIF(ISNOTHING( SUM(Fields!OnhandQty.Value) ), 0, SUM(Fields!OnhandQty.Value)) +
IIF(ISNOTHING( Fields!WIP.Value ), 0, Fields!WIP.Value)
) / IIF(ISNOTHING( SUM(Fields!RequiredQuantity.Value) ), 0, SUM(Fields!RequiredQuantity.Value) )
I can save the report (using Report Builder) without errors, but at run time I get:
The processing of SortExpression for the table ‘TablixCustomer’ cannot be performed. The comparison failed. Please check the data type returned by the SortExpression. (rsProcessingError)
Why?

Your expression SUM(...Value) will fail if .Value is ever null: the result becomes the string #Error, and sorting on a string expression causes the comparison failure. Try changing your expression to the following; note that it checks each value for null before aggregating, instead of aggregating before checking for null:
=( SUM(IIF(ISNOTHING(Fields!OnhandQty.Value), 0, Fields!OnhandQty.Value)) +
IIF(ISNOTHING( Fields!WIP.Value ), 0, Fields!WIP.Value)
) / SUM(IIF(ISNOTHING(Fields!RequiredQuantity.Value), 1, Fields!RequiredQuantity.Value))
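The ordering matters because a null inside an aggregate can poison the whole result, while per-value coalescing keeps everything numeric. A minimal Python sketch of the same idea (the list and its values are made up for illustration):

```python
# Hypothetical values; None plays the role of a null field value.
rows = [5, None, 7]

# Aggregating first fails: sum(rows) raises a TypeError here, the
# Python analogue of the SSRS expression collapsing to #Error.

# Coalescing each value *before* aggregating always yields a number,
# which is the safe shape for a sort expression.
total = sum(0 if v is None else v for v in rows)
print(total)  # → 12
```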

I know this error message has been discussed here before, but I still cannot figure out how to make this work. I'm trying to write an np.where statement with more than one condition. Below is my code. I am getting the error message "Keyword can't be an expression", and it is highlighting the space after "aggregate['Counter'] > 1".
aggregate['1'] = np.where(np.logical_and(aggregate['Counter'] > 1, aggregate['2'].shift(1) = aggregate['3']), 0, aggregate['2'])
The comparison operator is ==, not =:
...aggregate['2'].shift(1) == aggregate['3']),...
^^ here
You need a double equals sign:
aggregate['1'] = np.where(np.logical_and(aggregate['Counter'] > 1, aggregate['2'].shift(1) == aggregate['3']), 0, aggregate['2'])
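For completeness, here is a runnable sketch of the corrected call on a made-up frame (the column names follow the question; the data is invented):

```python
import numpy as np
import pandas as pd

# Invented sample data; column names match the question.
aggregate = pd.DataFrame({
    'Counter': [0, 2, 3],
    '2':       [10, 20, 30],
    '3':       [np.nan, 10.0, 99.0],
})

# shift(1) moves column '2' down one row, so each row compares the
# previous row's '2' against its own '3'.
aggregate['1'] = np.where(
    np.logical_and(aggregate['Counter'] > 1,
                   aggregate['2'].shift(1) == aggregate['3']),
    0,
    aggregate['2'],
)

print(aggregate['1'].tolist())  # → [10, 0, 30]
```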

The expression specified in the EVALUATE statement is not a valid table expression

I'm working on a Tabular cube in Visual Studio.
I have a DAX formula in the cube that works fine:
SUMX(FILTER(factFHA, factFHA[EventCd]="D"), [LoanCount])
When I run it in SSMS as:
evaluate(
SUMX(FILTER(factFHA, factFHA[EventCd]="D"), [LoanCount])
)
it fails with following error:
Query (1, 1) The expression specified in the EVALUATE statement is not a valid table expression.
I have 2 other formulas that both work fine:
evaluate(factFHA)
evaluate(filter('factFHA', [EventCd] = "D"))
I can't figure out what is wrong with the SUMX with FILTER.
Please advise. Thank you.
EVALUATE only accepts a table or an expression that returns a table; you are passing SUMX, which returns a scalar value (a decimal).
The syntax for writing queries in DAX is as follows:
[DEFINE { MEASURE <tableName>[<name>] = <expression> }] --> define an optional session-scoped measure
EVALUATE <table> --> generate a table, using your measures or calculated columns
[ORDER BY {<expression> [{ASC | DESC}]} [, …] --> order the returned table by a column or expression
[START AT {<value> | <parameter>} [, …]]] --> an ORDER BY sub-clause defining where the query results start
Define your measure, then use it inside the EVALUATE clause in an expression that evaluates to a table:
DEFINE
MEASURE factFHA[MyMeasure] =
SUMX ( FILTER ( factFHA, factFHA[EventCd] = "D" ), [LoanCount] )
EVALUATE
SUMMARIZE ( FILTER ( factFHA, factFHA[EventCd] = "D" ), factFHA[AnyColumnToGroup]
, "Sum of MyMeasure", [MyMeasure] )
Let me know if this helps.

Using condition in Calculatetable ()

I have a problem filtering a table with CALCULATETABLE().
I tried to pass this condition to CALCULATETABLE():
XeroInvoices[AmountPaid] < XeroInvoices[AmountDue]
EVALUATE
SUMMARIZE(
CALCULATETABLE(XeroInvoices,
XeroInvoices[Status] = "AUTHORISED",
XeroInvoices[DueDate] <= TODAY(),
XeroInvoices[AmountPaid] < XeroInvoices[AmountDue]
),
XeroInvoices[Number],
XeroInvoices[Reference],
XeroInvoices[Status],
XeroInvoices[Date],
XeroInvoices[DueDate],
XeroInvoices[AmountPaid],
XeroInvoices[AmountDue]
)
but the error I get in DAX Studio is the following:
Query (6, 30) The expression contains multiple columns, but only a single column can be used in a True/False expression that is used as a table filter expression.
I only managed to more or less achieve what I wanted like this: creating a new column within the SUMMARIZE() syntax and filtering on it later in Excel:
EVALUATE
SUMMARIZE(
CALCULATETABLE(XeroInvoices,
XeroInvoices[Status] = "AUTHORISED",
XeroInvoices[DueDate] <= TODAY()
),
XeroInvoices[Number],
XeroInvoices[Reference],
XeroInvoices[Status],
XeroInvoices[Date],
XeroInvoices[DueDate],
XeroInvoices[AmountPaid],
XeroInvoices[AmountDue],
"AmPaid<AmDue",XeroInvoices[AmountPaid]< XeroInvoices[AmountDue]
)
Does anyone know what might be the reason for this error in CALCULATETABLE(), and what a solution might be?
Thanks!
A Boolean filter argument to CALCULATETABLE can reference only a single column. To filter on an expression involving multiple columns, you have to wrap the table in an explicit FILTER:
CALCULATETABLE (
Product,
FILTER (
Product,
OR ( Product[Color] = "Red", Product[Weight] > 1000 )
)
)

Convert "YYYYMMDD" format string to date in MDX?

I have an issue applying date-related functions to "YYYYMMDD"-format strings in MDX. For example, take this query:
with
member foo as WEEKDay("2013-03-21")
select
foo on 0
from
[Some Cube]
It will correctly output "5" for foo in SSMS. But if I change the second line to:
member foo as WEEKDay("20130321")
Unfortunately, it will throw "type mismatch" error.
So what I want to do is convert the string to some recognizable date format and then apply the functions to it. Any ideas on the easiest way, e.g. using existing functions?
Please note that the string is actually input by users of whatever cube the MDX runs on, so it could already be in a recognizable format, e.g. "YYYY-MM-DD". A hard-coded string-conversion algorithm may therefore not work.
The topic is old, but this may help someone. The technique is quite brute-force, but scalable.
with
member foo_false as WeekDay("20130321")
member foo_true as WeekDay("2013-03-21")
member foo_brute as
case when IsError(WeekDay("20130321"))=False then WeekDay("20130321") else
case
/* YYYYMMDD */
when
IsError(WeekDay("20130321"))=True AND IsNumeric("20130321")=True
and IsError(WeekDay(Left("20130321",4)+'-'+Right(Left("20130321",6),2)+'-'+Right("20130321",2)))=False
then WeekDay(Left("20130321",4)+'-'+Right(Left("20130321",6),2)+'-'+Right("20130321",2))
/* DDMMYYYY */
when
IsError(WeekDay("20130321"))=True AND IsNumeric("20130321")=True
and IsError(WeekDay(Right("20130321",4)+'-'+Right(Left("20130321",4),2)+'-'+Left("20130321",2)))=False
then WeekDay(Right("20130321",4)+'-'+Right(Left("20130321",4),2)+'-'+Left("20130321",2))
/* MMDDYYYY */
when
IsError(WeekDay("20130321"))=True AND IsNumeric("20130321")=True
and IsError(WeekDay(Right("20130321",4)+'-'+Left("20130321",2)+'-'+Right(Left("20130321",4),2)))=False
then WeekDay(Right("20130321",4)+'-'+Left("20130321",2)+'-'+Right(Left("20130321",4),2))
/* Unsupported Message */
else "Unsupported Format" end
end
select
{foo_false,foo_true,foo_brute} on 0
from
[DATA Cube]
To support more formats, add further WHEN branches before the "Unsupported" ELSE, and use your input string in place of "20130321".
But the easiest way, if possible, is of course to convert the string in another layer (e.g. with SQL's CONVERT) before it gets into the MDX.
The VBA function IsDate can be used to check whether the passed string is a well-formatted date. If it is not, build the date first using DateSerial and Mid, then use that:
with
member foo as "20130321"
member bar as
iif(vba!isdate(foo) = TRUE,
WEEKDay(foo), //--if the date is already well formatted, it can be used directly
WEEKday(vba!dateserial(vba!mid(foo, 1, 4), vba!mid(foo, 5, 2), vba!right(foo, 2)))) //--VBA Mid is 1-based
select
bar on 0
from
[some cube]
EDIT
The above code can be modified to accommodate other layouts such as MMDDYYYY or DDMMYYYY, but in a lot of cases it is impossible for the engine to know whether the passed value is in YYYYMMDD, DDMMYYYY, or MMDDYYYY. Take, for example, the string 11111111:
it is a valid date however you break it up, so it could be in any of these formats.
I would suggest adding another member that stores the date format; by looking at this member, the string can be parsed.
e.g.
with
member foo as
// "20130321" //YYYYMMDD
// "03212013"//MMDDYYYY
"21032013"//DDMMYYYY
MEMBER dateformat as "ddmmyyyy"
member bar as
iif(vba!isdate(foo) = TRUE,
WEEKDay(foo),
IIF(dateformat = "yyyymmdd", //YYYYMMDD
WEEKday(vba!dateserial(vba!mid(foo, 1, 4), vba!mid(foo, 5, 2), vba!right(foo, 2))),
IIF(dateformat = "mmddyyyy", //MMDDYYYY
WEEKday(vba!dateserial(vba!right(foo, 4), vba!mid(foo, 1, 2), vba!mid(foo, 3, 2))),
IIF(dateformat = "ddmmyyyy", //DDMMYYYY
WEEKday(vba!dateserial(vba!right(foo, 4), vba!mid(foo, 3, 2), vba!mid(foo, 1, 2))),
null
)
)
)
)
select
bar on 0
from
[aw cube]
With FooMember being an int representing a yyyyMMdd date, I could convert it to a Date using the following code:
=Format(CDate(Left(CSTR(Fields!FooMember.Value),4)
+ "-" + Mid(CSTR(Fields!FooMember.Value), 5, 2)
+ "-" + Right(CSTR(Fields!FooMember.Value), 2)),
"dd/MM/yyyy")
To use it in a cube, use the following code:
Format(CDate(Left([Measures].[FooMember],4)
+ "-" + Mid([Measures].[FooMember], 5, 2)
+ "-" + Right([Measures].[FooMember], 2)),"yyyy/MM/dd")

AVG is not taking null values into consideration

I have loaded the following test data:
name, age,gender
"John", 33,m
"Sam", 33,m
"Julie",33,f
"Jimbo",, m
with schema: name:STRING,age:INTEGER,gender:STRING and I have confirmed that the Jimbo row shows a null for column "age" in the BigQuery Browser Tool > mydataset > Details > Preview section.
When I run this query :
SELECT AVG(age) FROM [peterprivatedata.testpeople]
I get 24.75 which is incorrect. I expected 33 because the documentation for AVG says "Rows with a NULL value are not included in the calculation."
Am I doing something wrong or is this a known bug? (I don't know if there's a public issues list to check). What's the simplest workaround to this?
This is a known bug where we coerce null numeric values to 0 on import. We're currently working on a fix. These values do, however, show up as not defined (which, for various reasons, is different from null), so you can check IS_EXPLICITLY_DEFINED. For example:
SELECT sum(if(is_explicitly_defined(numeric_field), numeric_field, 0)) /
sum(if(is_explicitly_defined(numeric_field), 1, 0))
AS my_avg FROM your_table
Alternatively, you could use another column to represent is_null. Then the query would look like:
SELECT sum(if(numeric_field_is_null, 0, numeric_field)) /
sum(if(numeric_field_is_null, 0, 1))
AS my_avg FROM your_table
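Both workarounds compute the same thing: a guarded sum divided by a guarded count, so undefined rows contribute to neither. The same logic in a small Python sketch (the ages mirror the question's data, with None standing in for the coerced null):

```python
# Ages from the question; None stands in for the null/undefined value.
ages = [33, 33, 33, None]

# sum(if(defined, value, 0)) / sum(if(defined, 1, 0))
total = sum(a for a in ages if a is not None)
count = sum(1 for a in ages if a is not None)
avg = total / count
print(avg)  # → 33.0
```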