I'm trying to factor multiple conditions into a dataset I'm working with. ROW_NUMBER combined with a LAG function in a second query seems like the way to go, but I can't quite get it 100% right.
Here is how my data is structured:
CREATE TABLE emailhell(
mainID INTEGER NOT NULL PRIMARY KEY
,acctID VARCHAR(4) NOT NULL
,emailID VARCHAR(2) NOT NULL
,type INTEGER NOT NULL
,created DATETIME NOT NULL
);
INSERT INTO emailhell(mainID,acctID,emailID,type,created) VALUES (1,'1234','1',6,'1/1/2018');
INSERT INTO emailhell(mainID,acctID,emailID,type,created) VALUES (2,'1234','1',11,'1/1/2018');
INSERT INTO emailhell(mainID,acctID,emailID,type,created) VALUES (3,'1234','2',6,'1/2/2018');
INSERT INTO emailhell(mainID,acctID,emailID,type,created) VALUES (4,'1234','3',6,'1/3/2018');
INSERT INTO emailhell(mainID,acctID,emailID,type,created) VALUES (5,'1234','4',6,'1/4/2018');
INSERT INTO emailhell(mainID,acctID,emailID,type,created) VALUES (6,'ABC','89',6,'1/5/2018');
INSERT INTO emailhell(mainID,acctID,emailID,type,created) VALUES (7,'ABC','90',6,'1/6/2018');
INSERT INTO emailhell(mainID,acctID,emailID,type,created) VALUES (8,'ABC','90',11,'1/7/2018');
INSERT INTO emailhell(mainID,acctID,emailID,type,created) VALUES (9,'258','22',6,'1/7/2018');
INSERT INTO emailhell(mainID,acctID,emailID,type,created) VALUES (10,'258','1',6,'1/10/2018');
INSERT INTO emailhell(mainID,acctID,emailID,type,created) VALUES (11,'258','2',6,'1/30/2018');
INSERT INTO emailhell(mainID,acctID,emailID,type,created) VALUES (12,'258','3',6,'1/31/2018');
INSERT INTO emailhell(mainID,acctID,emailID,type,created) VALUES (13,'258','29',6,'2/15/2018');
INSERT INTO emailhell(mainID,acctID,emailID,type,created) VALUES (14,'258','29',11,'2/16/2018');
INSERT INTO emailhell(mainID,acctID,emailID,type,created) VALUES (15,'258','31',6,'3/1/2018');
and my desired output
+--------+--------+---------+------+-----------+-------+------------+
| mainID | acctID | emailID | type | created | index | touchcount |
+--------+--------+---------+------+-----------+-------+------------+
| 1 | 1234 | 1 | 6 | 1/1/2018 | 1 | |
| 2 | 1234 | 1 | 11 | 1/1/2018 | 2 | 1 |
| 3 | 1234 | 2 | 6 | 1/2/2018 | 1 | |
| 4 | 1234 | 3 | 6 | 1/3/2018 | 2 | |
| 5 | 1234 | 4 | 6 | 1/4/2018 | 3 | |
| 6 | ABC | 89 | 6 | 1/5/2018 | 1 | |
| 7 | ABC | 90 | 6 | 1/6/2018 | 2 | |
| 8 | ABC | 90 | 11 | 1/7/2018 | 3 | 2 |
| 9 | 258 | 22 | 6 | 1/7/2018 | 1 | |
| 10 | 258 | 1 | 6 | 1/10/2018 | 2 | |
| 11 | 258 | 2 | 6 | 1/30/2018 | 3 | |
| 12 | 258 | 3 | 6 | 1/31/2018 | 4 | |
| 13 | 258 | 29 | 6 | 2/15/2018 | 5 | |
| 14 | 258 | 29 | 11 | 2/16/2018 | 6 | 5 |
| 15 | 258 | 31 | 6 | 3/1/2018 | 1 | |
+--------+--------+---------+------+-----------+-------+------------+
Here's what I was working with, but it has issues when the activity looks like a type 6 followed by an 11, followed by another 6, 11, and so on. Here's my start of the query, and I'm sure there's a better way to do this. I then run a similar query with the LAG function to grab the rows where type 11 appeared.
SELECT t.*,
       row_number() over (partition by t.acctId, t.type order by t.acctId, t.created_date) as [index]
into dm.table2
from dm.TABLE t with (NOLOCK)
You are defining the groups by acctId and the type-11 rows. Then, for the 11s, you want one less than the size of the group. So, a cumulative sum and some other window functions:
select t.*,
       row_number() over (partition by acctId, grp order by mainId) as [index],
       (case when type = 11
             then count(*) over (partition by acctId, grp) - 1
        end) as touchcount
from (select t.*,
             sum(case when type = 11 then 1 else 0 end) over (partition by acctId order by mainId desc) as grp
      from emailhell t
     ) t;
I should note that the definition of group requires counting backwards, rather than forwards. That is because 11 is included in the "previous" group rather than the first record in the "next" group.
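To illustrate the difference, here is a quick sketch against the sample emailhell table (assuming SQL Server 2012+ for the ordered window SUM). For acctID '1234', the type-11 row at mainID 2 only shares a group with its preceding type 6 when the running sum is taken in descending mainID order:
-- Sketch: forward vs. backward running count of type-11 rows
select mainID, acctID, emailID, type,
       sum(case when type = 11 then 1 else 0 end)
           over (partition by acctID order by mainID asc)  as grp_forward,   -- the 11 starts the "next" group
       sum(case when type = 11 then 1 else 0 end)
           over (partition by acctID order by mainID desc) as grp_backward   -- the 11 stays with its "previous" group
from emailhell
order by acctID, mainID;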
I want to convert columns to rows in SQL Server:
Id Value Jan1 Jan2
----------------------
1 2 25 35
2 5 45 45
The result should be:
Id Value Month 1 2
----------------------
1 2 Jan 25 35
2 5 Jan 45 45
How can I get this result? Any help would be appreciated.
What you are asking seems a little strange. If I extend your example to include columns for Feb1 and Feb2, then I see two options for transposing your columns from this:
+----+-------+------+------+------+------+
| Id | Value | Jan1 | Jan2 | Feb1 | Feb2 |
+----+-------+------+------+------+------+
| 1 | 2 | 25 | 35 | 15 | 28 |
| 2 | 5 | 45 | 45 | 60 | 60 |
+----+-------+------+------+------+------+
Transpose just the month part:
select Id, Value, MonthName, MonthValue1, MonthValue2
from t
cross apply (values ('Jan',Jan1,Jan2),('Feb',Feb1,Feb2)
) v (MonthName,MonthValue1,MonthValue2)
returns:
+----+-------+-----------+-------------+-------------+
| Id | Value | MonthName | MonthValue1 | MonthValue2 |
+----+-------+-----------+-------------+-------------+
| 1 | 2 | Jan | 25 | 35 |
| 1 | 2 | Feb | 15 | 28 |
| 2 | 5 | Jan | 45 | 45 |
| 2 | 5 | Feb | 60 | 60 |
+----+-------+-----------+-------------+-------------+
Or completely transpose the month columns like so:
select Id, Value, MonthName, MonthValue
from t
cross apply (values ('Jan1',Jan1),('Jan2',Jan2),('Feb1',Feb1),('Feb2',Feb2)
) v (MonthName,MonthValue)
returns:
+----+-------+-----------+------------+
| Id | Value | MonthName | MonthValue |
+----+-------+-----------+------------+
| 1 | 2 | Jan1 | 25 |
| 1 | 2 | Jan2 | 35 |
| 1 | 2 | Feb1 | 15 |
| 1 | 2 | Feb2 | 28 |
| 2 | 5 | Jan1 | 45 |
| 2 | 5 | Jan2 | 45 |
| 2 | 5 | Feb1 | 60 |
| 2 | 5 | Feb2 | 60 |
+----+-------+-----------+------------+
rextester demo: http://rextester.com/KZV45690
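If you prefer the built-in operator over CROSS APPLY, the fully transposed form can also be written with UNPIVOT (a sketch, assuming the same four month columns and that they share one data type; note UNPIVOT drops NULL values, while the CROSS APPLY version keeps them):
select Id, Value, MonthName, MonthValue
from t
unpivot (MonthValue for MonthName in (Jan1, Jan2, Feb1, Feb2)) u;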
This would appear to be:
select Id, Value, 'Jan' as [month], Jan1 as [1], Jan2 as [2]
from t;
You are basically just adding another column to the output.
I don't recommend using numbers as column names, nor SQL Server keywords such as month.
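Following that advice, a sketch of the same query with illustrative, safer aliases:
select Id, Value,
       'Jan' as MonthName,
       Jan1  as MonthValue1,
       Jan2  as MonthValue2
from t;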
Here is an option where you won't have to specify up to 365 fields:
Declare @YourTable table (Id int,Value int,Jan1 int,Jan2 int,Feb1 int, Feb2 int)
Insert Into @YourTable values
(1, 2, 25, 35, 100, 101),
(2, 5, 45, 45, 200, 201)
Select [Id],[Value],[Month],[1],[2],[3],[4],[5],[6],[7],[8],[9],[10],[11],[12],[13],[14],[15],[16],[17],[18],[19],[20],[21],[22],[23],[24],[25],[26],[27],[28],[29],[30],[31]
From (
Select A.Id
,A.Value
,[Month] = Left(C.Item,3)
,[Col] = substring(C.Item,4,5)
,[Measure] = C.Value
From @YourTable A
Cross Apply (Select XMLData = cast((Select A.* for XML Raw) as xml)) B
Cross Apply (
Select Item = attr.value('local-name(.)','varchar(100)')
,Value = attr.value('.','int')
From B.XMLData.nodes('/row') as A(r)
Cross Apply A.r.nodes('./#*') AS B(attr)
Where attr.value('local-name(.)','varchar(100)') not in ('ID','Value')
) C
) A
Pivot (sum(Measure) For [Col] in ([1],[2],[3],[4],[5],[6],[7],[8],[9],[10],[11],[12],[13],[14],[15],[16],[17],[18],[19],[20],[21],[22],[23],[24],[25],[26],[27],[28],[29],[30],[31]) ) p
Returns
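Worked through by hand for the sample rows above, the pivoted output should look roughly like this (day columns [3] through [31] are all NULL and omitted here; row order is not guaranteed):
+----+-------+-------+-----+-----+
| Id | Value | Month |  1  |  2  |
+----+-------+-------+-----+-----+
|  1 |     2 | Feb   | 100 | 101 |
|  1 |     2 | Jan   |  25 |  35 |
|  2 |     5 | Feb   | 200 | 201 |
|  2 |     5 | Jan   |  45 |  45 |
+----+-------+-------+-----+-----+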
I need some help identifying a sequence of events in SQL Server 08 R2.
This is the sample data:
ID | SampleTime | SampleValue | CycleNum
1 | 07:00:00 | 10 |
2 | 07:02:00 | 10 |
3 | 07:05:00 | 10 |
4 | 07:12:00 | 20 |
5 | 07:15:00 | 10 |
6 | 07:22:00 | 10 |
7 | 07:23:00 | 20 |
8 | 07:30:00 | 20 |
9 | 07:31:00 | 10 |
I have used the following as a guide (link), but it doesn't give the required output.
The rules are:
A cycle starts at 10 and finishes at 20
There can be multiple 10s before a 20, and multiple 20s before the next 10
A cycle will always start at the first 10, and finish on the last 20 before the next 10.
Example Output
ID | SampleTime | SampleValue | CycleNum
1 | 07:00:00 | 10 | 1
2 | 07:02:00 | 10 | 1
3 | 07:05:00 | 10 | 1
4 | 07:12:00 | 20 | 1
5 | 07:15:00 | 10 | 2
6 | 07:22:00 | 10 | 2
7 | 07:23:00 | 20 | 2
8 | 07:30:00 | 20 | 2
9 | 07:31:00 | 10 | 3
Test Table
CREATE TABLE myTable (ID INT IDENTITY, SampleTime DATETIME, SampleValue INT, CycleNum INT)
INSERT INTO myTable (SampleTime, SampleValue)
VALUES ('07:00:00',10),
('07:02:00',10),
('07:05:00',10),
('07:12:00',20),
('07:15:00',10),
('07:22:00',10),
('07:23:00',20),
('07:30:00',20),
('07:31:00',10)
Try this... it will give the mapping of ID to CYCLENUM:
WITH EVE_DATA AS (
SELECT ID
, SAMPLETIME
, SAMPLEVALUE
, CASE
WHEN (SAMPLEVALUE - lag(SAMPLEVALUE, 1, 0) over (order by SAMPLETIME ASC)) = -10
THEN 1
ELSE 0
END AS START_IND
FROM myTable
)
SELECT T1.id
, SUM(T2.START_IND) + 1 AS CycleNum
FROM EVE_DATA T1
JOIN EVE_DATA T2
ON T1.ID >= T2.ID
GROUP BY T1.ID
ORDER BY T1.ID;
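Note that LAG (and the windowed SUM below) require SQL Server 2012 or later, so this won't run on 2008 R2 as asked. If you can use 2012+, the triangular self-join can be swapped for a running total, which scales better on larger tables; a sketch against the myTable sample:
WITH EVE_DATA AS (
    SELECT ID
         , SampleTime
         , SampleValue
         , CASE
               WHEN SampleValue - LAG(SampleValue, 1, 0) OVER (ORDER BY SampleTime) = -10
               THEN 1
               ELSE 0
           END AS START_IND
    FROM myTable
)
SELECT ID
     , SampleTime
     , SampleValue
       -- running count of 20 -> 10 transitions seen so far, plus one
     , SUM(START_IND) OVER (ORDER BY SampleTime
                            ROWS BETWEEN UNBOUNDED PRECEDING AND CURRENT ROW) + 1 AS CycleNum
FROM EVE_DATA
ORDER BY ID;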
I have a DB where certain records are tagged with an ID, and I want to create a view that contains the average of all records with the same ID, EXCLUDING the current record. For example, if my data looks like this:
ROW - ID - Value
1 1 20
2 1 30
3 1 40
4 2 60
5 2 80
6 2 40
7 3 50
8 3 20
9 3 40
My view needs to calculate the average of every row with the same ID, EXCLUDING the row it's on, so my output would look something like this:
ROW - ID - Value AVG
1 1 20 35
2 1 30 30
3 1 40 25
4 2 60 60
5 2 80 50
6 2 40 70
7 3 50 30
8 3 20 45
9 3 40 35
So, in the case of row 3, it has taken rows 1 and 2, as they have the same ID, and given me the average of their values: 25.
I've gone round the houses on this for a while now, but can't seem to nail it. Any help would be appreciated.
One option, if you have window functions:
Declare @YourTable table (ROW int,ID int,Value int)
Insert Into @YourTable values
(1, 1, 20),
(2, 1, 30),
(3, 1, 40),
(4, 2, 60),
(5, 2, 80),
(6, 2, 40),
(7, 3, 50),
(8, 3, 20),
(9, 3, 40)
Select *
,Avg = (sum(value) over (Partition By ID)-Value)/NullIf((sum(1) over (Partition By ID)-1),0)
From @YourTable
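For example, for row 3 (ID 1, Value 40) this evaluates to (20 + 30 + 40 - 40) / (3 - 1) = 50 / 2 = 25, matching the expected output. The NULLIF guards against a divide-by-zero when an ID has only one row; note the division is integer division here, which happens to be exact for this sample data.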
Another option is an OUTER APPLY:
Select A.*
,B.*
From @YourTable A
Outer Apply (Select Avg=avg(value)
From @YourTable
where ID=A.ID and Row<>A.Row
) B
Both Return
SELECT t1.gid, AVG(t2.value)
FROM table1 as t1 INNER JOIN
table1 as t2 ON (t1.gid != t2.gid)
GROUP BY t1.gid;
Basically, join the table to itself on your condition and then group the results based on the first table's key.
This solution should work regardless of which database system you are using; there may be minor syntax details to change.
A table like this:
ID | Value
1 | 4
2 | 6
3 | 5
Becomes (when joined):
t1.ID | t2.ID | t1.Value | t2.Value
1 | 2 | 4 | 6
1 | 3 | 4 | 5
2 | 1 | 6 | 4
2 | 3 | 6 | 5
3 | 1 | 5 | 4
3 | 2 | 5 | 6
And then aggregating the grouped rows yields the wanted values.
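Adapted to the table in the question (ROW / ID / Value), the join condition becomes "same ID, different row"; a sketch, with the table name assumed:
select t1.[ROW], t1.ID, t1.Value, avg(t2.Value) as AvgOfOthers
from YourTable as t1
inner join YourTable as t2
        on t1.ID = t2.ID
       and t1.[ROW] <> t2.[ROW]
group by t1.[ROW], t1.ID, t1.Value;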
This query works for me:
select t1.row, t1.id, t1.value,
       (select avg(value)
        from test_table as t2
        where t1.id = t2.id and t1.row != t2.row) as avg
from test_table as t1;
Data in the table I created (I assume it is similar to yours):
mysql> select * from test_table;
+-----+------+-------+
| row | id | value |
+-----+------+-------+
| 1 | 1 | 20 |
| 2 | 1 | 30 |
| 3 | 1 | 40 |
| 4 | 2 | 60 |
| 5 | 2 | 80 |
| 6 | 2 | 40 |
| 7 | 3 | 50 |
| 8 | 3 | 20 |
| 9 | 3 | 40 |
+-----+------+-------+
Result of query:
+-----+------+-------+---------+
| row | id | value | avg |
+-----+------+-------+---------+
| 1 | 1 | 20 | 35.0000 |
| 2 | 1 | 30 | 30.0000 |
| 3 | 1 | 40 | 25.0000 |
| 4 | 2 | 60 | 60.0000 |
| 5 | 2 | 80 | 50.0000 |
| 6 | 2 | 40 | 70.0000 |
| 7 | 3 | 50 | 30.0000 |
| 8 | 3 | 20 | 45.0000 |
| 9 | 3 | 40 | 35.0000 |
+-----+------+-------+---------+
I have a table in a SQL Server database containing:
an int value (column name: Value)
a datetime value (column name: Date)
a bit value (column name: LastLineOfPage)
I would like to make a pagination query over this table. The logic of the pagination is the following:
The query must return the lines corresponding to a given page (parameter @PageNumber), after sorting the lines by the Date column
Also, the query must give the SUM of the values on all the previous pages' lines
The number of lines per page is not fixed: by default it is 14 lines per page, but if the bit LastLineOfPage is true, then the page contains only the lines up to and including the one with the true value
Here is a synthetic view of the process:
Here is the data as text:
ID DATE VALUE LASTLINEOFPAGE
1 07/10/2006 10 0
2 14/10/2006 12 0
3 21/10/2006 4 1
4 28/10/2006 6 0
5 04/11/2006 8 1
6 25/11/2006 125 0
7 02/12/2006 1 0
8 09/12/2006 5 0
9 16/12/2006 45 0
10 30/12/2006 1 1
So, the query receives @PageNumber, and also @DefaultLineNumberPerPage (which will be equal to 14, but maybe one day that will change).
Could you help me with the design of this query or SQL function? Thanks!
Sample data
I added a few rows to illustrate how it works when there are more rows per page than @DefaultLineNumberPerPage. In this example I'll use @DefaultLineNumberPerPage = 5 and you'll see how extra pages are generated.
DECLARE @T TABLE (ID int, dt date, VALUE int, LASTLINEOFPAGE bit);
INSERT INTO @T(ID, dt, VALUE, LASTLINEOFPAGE) VALUES
(1 , '2006-10-07', 10 , 0),
(2 , '2006-10-14', 12 , 0),
(3 , '2006-10-21', 4 , 1),
(4 , '2006-10-28', 6 , 0),
(5 , '2006-11-04', 8 , 1),
(6 , '2006-11-25', 125, 0),
(7 , '2006-12-02', 1 , 0),
(8 , '2006-12-09', 5 , 0),
(9 , '2006-12-16', 45 , 0),
(10, '2006-12-30', 1 , 1),
(16, '2007-01-25', 125, 0),
(17, '2007-02-02', 1 , 0),
(18, '2007-02-09', 5 , 0),
(19, '2007-02-16', 45 , 0),
(20, '2007-02-20', 1 , 0),
(26, '2007-02-25', 125, 0),
(27, '2007-03-02', 1 , 0),
(28, '2007-03-09', 5 , 0),
(29, '2007-03-10', 5 , 0),
(30, '2007-03-11', 5 , 0),
(31, '2007-03-12', 5 , 0),
(32, '2007-03-13', 5 , 1),
(41, '2007-10-07', 10 , 0),
(42, '2007-10-14', 12 , 0),
(43, '2007-10-21', 4 , 1);
Query
Run it step-by-step, CTE-by-CTE and examine intermediate results to understand what it does.
CTE_FirstLines sets the FirstLineOfPage flag to 1 for the first line of the page instead of the last.
CTE_SimplePages uses a cumulative SUM to calculate the simple page numbers based on FirstLineOfPage page breaks.
CTE_ExtraPages uses ROW_NUMBER divided by @DefaultLineNumberPerPage to calculate extra page numbers if there is a page that has more than @DefaultLineNumberPerPage rows.
CTE_CompositePages combines simple page numbers with extra page numbers to make a single composite page number. It assumes that there will be fewer than 1,000 rows between the original LASTLINEOFPAGE flags. If such a long sequence of rows is possible, increase the 1000 constant and consider using the bigint type for the CompositePageNumber column.
CTE_FinalPages uses DENSE_RANK to assign sequential numbers without gaps for each final page.
DECLARE @DefaultLineNumberPerPage int = 5;
DECLARE @PageNumber int = 3;
WITH
CTE_FirstLines
AS
(
SELECT
ID,dt, VALUE, LASTLINEOFPAGE
,CAST(ISNULL(LAG(LASTLINEOFPAGE)
OVER (ORDER BY dt), 1) AS int) AS FirstLineOfPage
FROM @T
)
,CTE_SimplePages
AS
(
SELECT
ID,dt, VALUE, LASTLINEOFPAGE, FirstLineOfPage
,SUM(FirstLineOfPage) OVER(ORDER BY dt
ROWS BETWEEN UNBOUNDED PRECEDING AND CURRENT ROW) AS SimplePageNumber
FROM CTE_FirstLines
)
,CTE_ExtraPages
AS
(
SELECT
ID,dt, VALUE, LASTLINEOFPAGE, FirstLineOfPage, SimplePageNumber
,(ROW_NUMBER() OVER(PARTITION BY SimplePageNumber ORDER BY dt) - 1)
/ @DefaultLineNumberPerPage AS ExtraPageNumber
FROM CTE_SimplePages
)
,CTE_CompositePages
AS
(
SELECT
ID,dt, VALUE, LASTLINEOFPAGE, FirstLineOfPage, SimplePageNumber, ExtraPageNumber
,SimplePageNumber * 1000 + ExtraPageNumber AS CompositePageNumber
FROM CTE_ExtraPages
)
,CTE_FinalPages
AS
(
SELECT
ID,dt, VALUE, LASTLINEOFPAGE, FirstLineOfPage, SimplePageNumber, ExtraPageNumber
,CompositePageNumber
,DENSE_RANK() OVER(ORDER BY CompositePageNumber) AS FinalPageNumber
FROM CTE_CompositePages
)
,CTE_Sum
AS
(
SELECT
ID,dt, VALUE, LASTLINEOFPAGE, FirstLineOfPage, SimplePageNumber, ExtraPageNumber
,CompositePageNumber
,FinalPageNumber
,SUM(Value) OVER(ORDER BY FinalPageNumber, dt
ROWS BETWEEN UNBOUNDED PRECEDING AND CURRENT ROW) AS SumCumulative
FROM CTE_FinalPages
)
SELECT
ID,dt, VALUE, LASTLINEOFPAGE, FirstLineOfPage, SimplePageNumber, ExtraPageNumber
,CompositePageNumber
,FinalPageNumber
,SumCumulative
FROM CTE_Sum
-- WHERE FinalPageNumber = @PageNumber
ORDER BY dt
;
Result with the final WHERE filter commented out
Here is the full result with all intermediate columns to illustrate how the query works.
+----+------------+-------+-----+-----+--------+-------+-----------+-------+------------+
| ID | dt | VALUE | Lst | Fst | Simple | Extra | Composite | Final | TotalValue |
+----+------------+-------+-----+-----+--------+-------+-----------+-------+------------+
| 1 | 2006-10-07 | 10 | 0 | 1 | 1 | 0 | 1000 | 1 | 10 |
| 2 | 2006-10-14 | 12 | 0 | 0 | 1 | 0 | 1000 | 1 | 22 |
| 3 | 2006-10-21 | 4 | 1 | 0 | 1 | 0 | 1000 | 1 | 26 |
| 4 | 2006-10-28 | 6 | 0 | 1 | 2 | 0 | 2000 | 2 | 32 |
| 5 | 2006-11-04 | 8 | 1 | 0 | 2 | 0 | 2000 | 2 | 40 |
| 6 | 2006-11-25 | 125 | 0 | 1 | 3 | 0 | 3000 | 3 | 165 |
| 7 | 2006-12-02 | 1 | 0 | 0 | 3 | 0 | 3000 | 3 | 166 |
| 8 | 2006-12-09 | 5 | 0 | 0 | 3 | 0 | 3000 | 3 | 171 |
| 9 | 2006-12-16 | 45 | 0 | 0 | 3 | 0 | 3000 | 3 | 216 |
| 10 | 2006-12-30 | 1 | 1 | 0 | 3 | 0 | 3000 | 3 | 217 |
| 16 | 2007-01-25 | 125 | 0 | 1 | 4 | 0 | 4000 | 4 | 342 |
| 17 | 2007-02-02 | 1 | 0 | 0 | 4 | 0 | 4000 | 4 | 343 |
| 18 | 2007-02-09 | 5 | 0 | 0 | 4 | 0 | 4000 | 4 | 348 |
| 19 | 2007-02-16 | 45 | 0 | 0 | 4 | 0 | 4000 | 4 | 393 |
| 20 | 2007-02-20 | 1 | 0 | 0 | 4 | 0 | 4000 | 4 | 394 |
| 26 | 2007-02-25 | 125 | 0 | 0 | 4 | 1 | 4001 | 5 | 519 |
| 27 | 2007-03-02 | 1 | 0 | 0 | 4 | 1 | 4001 | 5 | 520 |
| 28 | 2007-03-09 | 5 | 0 | 0 | 4 | 1 | 4001 | 5 | 525 |
| 29 | 2007-03-10 | 5 | 0 | 0 | 4 | 1 | 4001 | 5 | 530 |
| 30 | 2007-03-11 | 5 | 0 | 0 | 4 | 1 | 4001 | 5 | 535 |
| 31 | 2007-03-12 | 5 | 0 | 0 | 4 | 2 | 4002 | 6 | 540 |
| 32 | 2007-03-13 | 5 | 1 | 0 | 4 | 2 | 4002 | 6 | 545 |
| 41 | 2007-10-07 | 10 | 0 | 1 | 5 | 0 | 5000 | 7 | 555 |
| 42 | 2007-10-14 | 12 | 0 | 0 | 5 | 0 | 5000 | 7 | 567 |
| 43 | 2007-10-21 | 4 | 1 | 0 | 5 | 0 | 5000 | 7 | 571 |
+----+------------+-------+-----+-----+--------+-------+-----------+-------+------------+
To get only one given page, uncomment the WHERE filter in the final SELECT.
Result with the final WHERE filter
+----+------------+-------+-----+-----+--------+-------+-----------+-------+------------+
| ID | dt | VALUE | Lst | Fst | Simple | Extra | Composite | Final | TotalValue |
+----+------------+-------+-----+-----+--------+-------+-----------+-------+------------+
| 6 | 2006-11-25 | 125 | 0 | 1 | 3 | 0 | 3000 | 3 | 165 |
| 7 | 2006-12-02 | 1 | 0 | 0 | 3 | 0 | 3000 | 3 | 166 |
| 8 | 2006-12-09 | 5 | 0 | 0 | 3 | 0 | 3000 | 3 | 171 |
| 9 | 2006-12-16 | 45 | 0 | 0 | 3 | 0 | 3000 | 3 | 216 |
| 10 | 2006-12-30 | 1 | 1 | 0 | 3 | 0 | 3000 | 3 | 217 |
+----+------------+-------+-----+-----+--------+-------+-----------+-------+------------+
The TotalValue in the last row gives you the total page value that you want to show at the bottom of the page. If you sum all values on this page (125+1+5+45+1 = 177) and subtract it from the last TotalValue (217-177 = 40) you'll get the total of previous pages that you want to show at the top of the page. You'd better do these calculations on the client.
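If you would rather have the previous-pages total computed in SQL as well, one sketch is to extend CTE_Sum above with a second running sum partitioned by FinalPageNumber and subtract it from the overall running sum:
,CTE_Sum
AS
(
    SELECT
        ID, dt, VALUE, LASTLINEOFPAGE, FinalPageNumber
        ,SUM(VALUE) OVER(ORDER BY FinalPageNumber, dt
            ROWS BETWEEN UNBOUNDED PRECEDING AND CURRENT ROW) AS SumCumulative
        -- overall running sum minus the within-page running sum
        -- = total of everything on the pages before this one
        ,SUM(VALUE) OVER(ORDER BY FinalPageNumber, dt
            ROWS BETWEEN UNBOUNDED PRECEDING AND CURRENT ROW)
         - SUM(VALUE) OVER(PARTITION BY FinalPageNumber ORDER BY dt
            ROWS BETWEEN UNBOUNDED PRECEDING AND CURRENT ROW) AS SumPreviousPages
    FROM CTE_FinalPages
)
For page 3, every row would then carry SumPreviousPages = 40, the same value as the 217 - 177 calculation above.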
I have a partial solution. It still doesn't account for the default page size, but it can give you an idea, so let me know what you think. I hope you are familiar with CTEs. Test each step so you can see the partial results.
SQL Demo
WITH cte as (
SELECT [ID], [DATE], [VALUE], [LASTLINEOFPAGE],
SUM([VALUE]) OVER (ORDER BY [ID]) as Total,
SUM(CAST([LASTLINEOFPAGE] AS int)) OVER (ORDER BY [ID]) as page_group
FROM Table1
),
pages as (
SELECT c1.[ID], c1.[Total],
CASE WHEN c1.[ID] = 1 THEN 0
WHEN c1.[ID] = m.[minID] THEN c1.[page_group] -1
ELSE c1.[page_group]
END as [page_group]
FROM cte as c1
JOIN (SELECT [page_group], MIN([ID]) as minID
FROM cte
GROUP BY [page_group]) m
ON c1.[page_group] = m.[page_group]
)
SELECT c.[ID], c.[DATE], c.[VALUE], c.[LASTLINEOFPAGE],
(SELECT MAX([Total])
FROM pages p2
WHERE p2.[page_group] = p.[page_group]) as [Total],
p.[page_group]
FROM cte c
JOIN pages p
ON c.[ID] = p.[id]
As you can see, the total and the page are in the additional columns, and you shouldn't display those in your app.
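To also account for the default page size, a sketch along the lines of the other answer's CTE_ExtraPages step could be bolted onto the pages CTE above: number the rows within each page_group and split on @DefaultLineNumberPerPage (names here are illustrative, and @DefaultLineNumberPerPage is assumed to be declared):
-- appended after the "pages" CTE in the query above
, split as (
    SELECT p.[ID], p.[page_group],
           (ROW_NUMBER() OVER (PARTITION BY p.[page_group] ORDER BY p.[ID]) - 1)
               / @DefaultLineNumberPerPage AS extra_page
    FROM pages p
)
SELECT s.[ID],
       DENSE_RANK() OVER (ORDER BY s.[page_group], s.extra_page) AS final_page
FROM split s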