Datatable Compute Method filter on row number - vb.net

I use a query which fetches, say, 50 records and loads them into a DataTable. These records are then displayed in a tabular format. The display is paginated, showing 10 records at a time, with controls to move to the next or previous set of records or to move forward or backward by one record.
I have to find the Min and Max of a column for the set of records currently visible. I am planning to use the Compute method, but I am not sure whether it allows filtering on anything other than the columns in the DataTable.
Do I have to include a row number in my query, or is there a better solution (something along the lines of the pseudo code below)?
CType(dtLineup.Compute("Min(ArrivalDate)", dt.row(2) to dt.row(12)), Date)

There is nothing like your pseudo code in the MSDN documentation for DataColumn.Expression. You could include a row number in your query, as you said, but an alternative is to add a row number column to the DataTable and use that in the filter expression.
// Auto-incrementing row number column, starting at 1.
DataColumn col = new DataColumn("rownumber", typeof(int));
col.AutoIncrement = true;
col.AutoIncrementSeed = 1;
datatable.Columns.Add(col);
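For example, with that column in place, Compute can filter on it directly (a VB.NET sketch; dtLineup and ArrivalDate come from the question, while the page boundaries 11-20 are only assumed for illustration):
' Min and Max of ArrivalDate over the rows currently visible,
' e.g. rows 11 to 20 of the 50 fetched (adjust to your paging logic).
Dim pageStart As Integer = 11
Dim pageEnd As Integer = 20
Dim filter As String = String.Format("rownumber >= {0} AND rownumber <= {1}", pageStart, pageEnd)
Dim minArrival As Date = CType(dtLineup.Compute("Min(ArrivalDate)", filter), Date)
Dim maxArrival As Date = CType(dtLineup.Compute("Max(ArrivalDate)", filter), Date)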
Another alternative would be to do the paging with LINQ (Skip/Take) and compute the aggregates over the returned rows, but that may be a major departure from your current application structure.
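A rough sketch of that LINQ alternative, assuming the current page index and page size are already tracked by your paging code (requires Imports System.Linq and a reference to System.Data.DataSetExtensions):
' Take only the 10 rows that make up the current page and aggregate over those.
Dim pageIndex As Integer = 2   ' zero-based page number, assumed to come from your paging code
Dim pageSize As Integer = 10
Dim pageRows = dtLineup.AsEnumerable().Skip(pageIndex * pageSize).Take(pageSize)
Dim minArrival As Date = pageRows.Min(Function(r) r.Field(Of Date)("ArrivalDate"))
Dim maxArrival As Date = pageRows.Max(Function(r) r.Field(Of Date)("ArrivalDate"))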

Related

Summing the value of the aggregate function Last()

I am trying to design a report in SSRS that returns the last opening stock for each product in the dataset. To achieve this I used
=Last(Fields!CustProdAdj_new_openingstockValue.Value).
This works fine. But where I ran into a problem is in getting the sum of all the opening stocks for each product. I tried using
=sum(Last(Fields!CustProdAdj_new_openingstockValue.Value))
but I got the error message
[Error on Preview]
Is there another way to go about this?
I have tried using Aggregate() and RunningValue(), to no avail.
(Screenshots attached: the dataset, the report layout, and the preview when using Max().)
This is probably easier to do directly in the dataset query, but assuming you cannot change that, then this should work.
This assumes your data is ordered by the CustProdAdj_createdon column and that this is a date, datetime or some other ordered value; change this part of the expression if required.
=SUM(
    IIF(Fields!CustProdAdj_createdon.Value = Max(Fields!CustProdAdj_createdon.Value, "MyRowGroupNameHere"),
        Fields!CustProdAdj_new_openingstockValue.Value,
        0)
)
Change MyRowGroupNameHere to the name of the row group, spelled exactly as it appears in the Row Groups panel below the main design panel. It is case sensitive, and keep the quotes.
What this does is, for each row within the row group "MyRowGroupNameHere", compare the CustProdAdj_createdon date to the maximum CustProdAdj_createdon across the row group. If they are the same, it returns CustProdAdj_new_openingstockValue; otherwise it returns 0.
This returns the required value on only one record within the group.
For example, if you had one row per day, then only on the last day of the month would a value other than 0 be returned, because only the date of the last record matches the maximum date within the group.
The outer SUM then simply adds up these results.

Tableau: Get the ids that contain only the selected values from another column

I have the following question.
I have a table like this:
(Data Source screenshot)
I want to create a field (I suppose it's a calculated field) with which I can get the apl_ids that have only the service_offered values I want.
For example, from the table above: if I want the apl_ids that have ONLY the service_offered values Pending 1, Pending 2 and Pending 7, then I want to get apl_id = "13", since apl_id = "12" has one more service that I don't need.
What is the best way to get that?
Thank you in advance!
Add a calculated field which gives 1 for the desired values and 0 for other values. Add another calculated field with a FIXED LOD on apl_id that sums the first one. Then filter to the ids whose value is 3 only. I think that should work.
If not, tell me and I will post screenshots.
You can create a set based on the field apl_id, defined by the condition
max([service_offering] = "Pending 1") and
max([service_offering] = "Pending 2") and
max([service_offering] = "Pending 7") and
min([service_offering] = "Pending 1" or [service_offering] = "Pending 2" or [service_offering] = "Pending 7")
This set will contain those apl_ids that have at least one record where service_offering is "Pending 1", at least one record with Pending 2, at least one with Pending 7, and where every record has a service_offering of Pending 1, 2 or 7 (i.e. no others).
The key is to realize that Tableau treats True as greater than False, so min() and max() for boolean expressions correspond to every() and any().
Once you have a set of apl_ids, you can use it on shelves and in calculated fields in many different ways.

Pentaho Adding summary rows

Any idea how to summarize data in a Pentaho transformation and then insert the summary row directly under the group being summarized?
I can use a Group By step and get a summarised result stream having one row per key field, but what I want is each sorted group written to the output with its summary row inserted underneath, thus preserving the input rows.
In the Group By step you can tick 'Include all rows', but that just appends the summary fields to the end of each existing row; it does not create new summary rows.
Thanks in advance
To get the summary rows to appear under the group-by blocks you have to use some tricks, such as introducing a numeric "order" field, setting its value to 1 for the original data rows and 2 for the subtotal rows.
Also, in the group-by/subtotals stream I am generating a sum field, say "subtotal". You have to make sure to also include this as a blank field in your regular stream, or else the metadata of the two streams will diverge and the final merge will not work.
Here is the best explanation I have found for this pattern:
https://www.packtpub.com/books/content/pentaho-data-integration-4-working-complex-data-flows
You will need to copy the rows to a different stream, and then merge or join them back together, to make the summary a separate row.

Tag new records with number 1 to 6, evenly allocated

I am bringing over a record set that needs to be divided into 6 lists. I am using the field WrkList to hold the list number, which will range from 1 to 6. I don't want to manually add the numbers to each of the new records in a repeating sequence of (1, 2, 3, 4, 5, 6) as they are brought in. The WrkList field allows the records to be worked by 6 employees using queries that use the field as their criteria. On any given day, over 1200 records may be appended to the table throughout the day and would need to have the WrkList field updated. I want these divided as evenly as possible among the 6 groups as each new set of records is appended. Any help on getting started would be greatly appreciated.
Basically, you will open a DAO recordset that includes all the records for which WrkList is Null. You will sort this by the order they came in, or by some other logical criterion - whatever helps your workers have a coherent work queue (perhaps no order at all).
You will go through the recordset from beginning to end and update the WrkList field with a variable, byteWrkList.
This variable changes with each edit: it increments by one, or, if it was 6 for the last edit, it wraps back to 1.
NOTE: This code does not filter for Null! The OpenRecordset call must be based on a query that does filter for Null (or on a SQL string that does the same thing).
Option Compare Database
Option Explicit

Private Sub AllocateTasks()
    Dim byteMax As Byte, byteWrkList As Byte
    byteMax = 6
    byteWrkList = 1

    ' NOTE: open this from a query (or SQL string) that filters for WrkList Is Null.
    Dim rstTask As DAO.Recordset
    Set rstTask = CurrentDb.OpenRecordset("tableOfTasks")

    Do Until rstTask.EOF
        rstTask.Edit
        ' Make sure you are not over-writing an existing value!
        ' Make sure it is Null, or that your recordset included only Null rows.
        rstTask!WrkList = byteWrkList
        rstTask.Update

        ' Cycle 1, 2, ..., 6 and then back to 1.
        If byteWrkList >= byteMax Then
            byteWrkList = 1
        Else
            byteWrkList = byteWrkList + 1
        End If

        rstTask.MoveNext
    Loop

    rstTask.Close
    Set rstTask = Nothing
End Sub
Then you just need a way to invoke (trigger) the above code ... but your post doesn't really have enough information to suggest what that is.
There are alternate (and elegant) methods to obtain byteWrkList, such as using the Mod operator applied to an autonumber index. (This is not important; I just had to get it off my chest because Mod is fun.) Indeed, there are alternate methods to handle this entirely, but this is what I would start with.
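A rough sketch of that Mod idea, assuming tableOfTasks has an autonumber field named ID (an assumption; adjust the names to your table). Gaps in the autonumber mean the split is only approximately even, but it avoids the loop entirely:
Private Sub AllocateTasksByMod()
    ' Derive the list number straight from the autonumber instead of a counter:
    ' ((ID - 1) Mod 6) + 1 cycles through 1..6 for consecutive IDs.
    CurrentDb.Execute _
        "UPDATE tableOfTasks " & _
        "SET WrkList = ((ID - 1) Mod 6) + 1 " & _
        "WHERE WrkList Is Null", dbFailOnError
End Sub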

SSIS 2008 Row Count Transformation - Row count return 0

This should be rather simple, but I don't know why I get a row count of zero when I use the Row Count transformation in a Data Flow Task. I have created a variable (NoOfRecords) with package scope.
The variable name in the Row Count transformation is set to NoOfRecords.
I used a Derived Column to assign the row count.
The package runs successfully and shows a record count of 265,
but the Derived Column shows the record count as 0 instead of 265.
After the Row Count, add an Aggregate transformation and select the Count option on the Operation tab of the Aggregate properties.
Then you can use the row count variable for further operations, where it holds the total row count of the input file.
The Row Count variable is only populated after all rows have passed through.
You're adding the variable to each row as it passes through the Derived Column step, but at that point the variable has not yet been updated (that only happens after all rows have passed), so the value 0 is correct.
You might be able to achieve this by using an asynchronous (blocking) transformation before your Derived Column (I'm not sure this will work; it just popped into my mind). Add a Sort or Aggregate step before your Derived Column and try again.
I used this in the query as an efficient way of getting the row count:
count(all SnapshotDate) over () as nRowCount
Here's the technique for recording row counts that worked in my situation.
The scenario is that I want to log the rows migrated between tables. The Row Count variable doesn't get populated until you exit the Data Flow.
[Control Flow]
1. Data Flow Task
   a. Read the origin data - source component.
   b. Add a Row Count transformation. Link a to b.
      Right-click the Row Count and map it to the user variable (Int64).
   c. Add a destination component for the loading table.
   d. Link b to c.
2. Add an Execute SQL Task to the Control Flow. Right-click it, Edit.
   INSERT SQL statement: Insert Into LogTable(rowcount) Values(?)
   Parameter Mapping:
   Variable         Direction   DataType   ParameterName   ParameterSize
   User::RowCount   INPUT       LONG       0               -1