How can I get a count of all previous records for a user in a table, for each time this user shows up? (SQL)

OK, so let's say I have a table like this:
client_calls
+------+--------+---------------------+
| id   | userId | last_call_to_client |
+------+--------+---------------------+
| 3004 | 664    | 2013-04-01          |
| 3005 | 664    | 2014-05-09          |
| 3006 | 664    | 2015-12-11          |
| 3007 | 664    | 2021-11-24          |
| 3008 | 664    | 2022-03-05          |
+------+--------+---------------------+
And I need this result: a table that counts how many calls a client received before the date in each row:
client_calls_so_far
+------+--------+---------------------+--------------+
| id   | userId | last_call_to_client | calls_so_far |
+------+--------+---------------------+--------------+
| 3004 | 664    | 2013-04-01          | 0            |
| 3005 | 664    | 2014-05-09          | 1            |
| 3006 | 664    | 2015-12-11          | 2            |
| 3007 | 664    | 2021-11-24          | 3            |
| 3008 | 664    | 2022-03-05          | 4            |
+------+--------+---------------------+--------------+
How can I do that?

Example for you:
select *,
       count(last_call_to_client) over (
           partition by userId
           order by last_call_to_client
           rows between unbounded preceding and current row
       ) - 1 as calls_so_far
from client_calls;
demo: https://dbfiddle.uk/ceEDVUg-
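An equivalent formulation, as a sketch assuming one row per call and no NULL dates: ROW_NUMBER() numbers each user's calls in date order, so subtracting 1 gives the count of earlier calls without needing a frame clause.
SELECT *,
       ROW_NUMBER() OVER (PARTITION BY userId
                          ORDER BY last_call_to_client) - 1 AS calls_so_far
FROM client_calls;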

Related

Solar-Heating: Data analytics for Grafana, advanced query

I need some help with a very specific use case I have for my homelab.
I have some solar panels on my roof, and I extract a lot of data points to my server. I'm using a specific app (iobroker) that makes it easy to consume and automate that data. The data is saved into a Postgres database. (Please, no questions about why not Influx or TimescaleDB; Postgres is what I have to live with...)
Everything runs on Docker right now and works perfectly. While I was able to create numerous dashboards in Grafana and display everything I like there, there is one specific thing I was unable to do, and after months of trying to get it done I'm finally asking for help. I have a device that supports my heating by using generated power to warm up the water, energy that we would normally feed back to the grid. The device updates the power it pushes to the heating device pretty much every second, and I poll the data from the device every second as well. However, I have logging configured so that a data point is only logged when it differs from the previous one.
One example:
+---------------------+------------------+
| Time                | consumption in W |
+---------------------+------------------+
| 2018-02-21 12:00:00 | 3500             |
| 2018-02-21 12:00:01 | 1470             |
| 2018-02-21 12:00:02 | 1470             |
| 2018-02-21 12:00:03 | 1470             |
| 2018-02-21 12:00:04 | 1600             |
+---------------------+------------------+
The second and third entries with the value 1470 would not exist in the log!
So the first issue I have is missing data points. What I would like to achieve is a calculation showing the consumption by individual day, month, year, and all-time.
This does not need to happen inside Grafana, and I don't think Grafana can do it at all. There are options to do similar things in Grafana ($__unixEpochGroupAlias(ts,1s,previous)), but they do not provide an accurate result. I have every option needed to create the data and store it back in the DB, so there should be no obstacle to your ideas.
The data is polled/stored every 1000 ms, so every second. The idea is to use Ws (watt-seconds) to calculate with accurate numbers, and to display them more readably as Wh or kWh.
The DB can only be queried with SQL, but as mentioned, if the calculations need to be done in a different language, that is also fine.
I've tried everything I could think of: SQL queries, searching numerous posts, all available SQL-based Grafana options. I guess I need custom code, but that's above my skill set.
Anything more you'd need to know? Let me know. Thanks in advance!
The data structure looks like the following:
id  = entry for the application to identify the datapoint
ts  = timestamp
val = value in Ws
The other values are not important, but I wanted to show them for completeness.
id | ts | val | ack | _from | q
----+---------------+------+-----+-------+---
23 | 1661439981910 | 1826 | t | 3 | 0
23 | 1661439982967 | 1830 | t | 3 | 0
23 | 1661439984027 | 1830 | t | 3 | 0
23 | 1661439988263 | 1828 | t | 3 | 0
23 | 1661439985088 | 1829 | t | 3 | 0
23 | 1661439987203 | 1829 | t | 3 | 0
23 | 1661439989322 | 1831 | t | 3 | 0
23 | 1661439990380 | 1830 | t | 3 | 0
23 | 1661439991439 | 1827 | t | 3 | 0
23 | 1661439992498 | 1829 | t | 3 | 0
23 | 1661440021097 | 1911 | t | 3 | 0
23 | 1661439993558 | 1830 | t | 3 | 0
23 | 1661440022156 | 1924 | t | 3 | 0
23 | 1661439994624 | 1830 | t | 3 | 0
23 | 1661440023214 | 1925 | t | 3 | 0
23 | 1661439995683 | 1828 | t | 3 | 0
23 | 1661440024273 | 1924 | t | 3 | 0
23 | 1661439996739 | 1830 | t | 3 | 0
23 | 1661440025332 | 1925 | t | 3 | 0
23 | 1661440052900 | 1694 | t | 3 | 0
23 | 1661439997797 | 1831 | t | 3 | 0
23 | 1661440026391 | 1927 | t | 3 | 0
23 | 1661439998855 | 1831 | t | 3 | 0
23 | 1661440027450 | 1925 | t | 3 | 0
23 | 1661439999913 | 1828 | t | 3 | 0
23 | 1661440028509 | 1925 | t | 3 | 0
23 | 1661440029569 | 1927 | t | 3 | 0
23 | 1661440000971 | 1830 | t | 3 | 0
23 | 1661440030634 | 1926 | t | 3 | 0
23 | 1661440002030 | 1838 | t | 3 | 0
23 | 1661440031694 | 1925 | t | 3 | 0
23 | 1661440053955 | 1692 | t | 3 | 0
23 | 1659399542399 | 0 | t | 3 | 0
23 | 1659399543455 | 1 | t | 3 | 0
23 | 1659399544511 | 0 | t | 3 | 0
23 | 1663581880895 | 2813 | t | 3 | 0
23 | 1663581883017 | 2286 | t | 3 | 0
23 | 1663581881952 | 2646 | t | 3 | 0
23 | 1663581884074 | 1905 | t | 3 | 0
23 | 1661440004144 | 1838 | t | 3 | 0
23 | 1661440032752 | 1926 | t | 3 | 0
23 | 1661440005202 | 1839 | t | 3 | 0
23 | 1661440034870 | 1924 | t | 3 | 0
23 | 1661440006260 | 1840 | t | 3 | 0
23 | 1661440035929 | 1922 | t | 3 | 0
23 | 1661440007318 | 1840 | t | 3 | 0
23 | 1661440036987 | 1918 | t | 3 | 0
23 | 1661440008377 | 1838 | t | 3 | 0
23 | 1661440038045 | 1919 | t | 3 | 0
23 | 1661440009437 | 1839 | t | 3 | 0
23 | 1661440039104 | 1900 | t | 3 | 0
23 | 1661440010495 | 1839 | t | 3 | 0
23 | 1661440040162 | 1877 | t | 3 | 0
23 | 1661440011556 | 1838 | t | 3 | 0
23 | 1661440041220 | 1862 | t | 3 | 0
23 | 1661440012629 | 1840 | t | 3 | 0
23 | 1661440042279 | 1847 | t | 3 | 0
23 | 1661440013687 | 1840 | t | 3 | 0
23 | 1661440043340 | 1829 | t | 3 | 0
23 | 1661440014746 | 1833 | t | 3 | 0
23 | 1661440044435 | 1817 | t | 3 | 0
23 | 1661440015804 | 1833 | t | 3 | 0
23 | 1661440045493 | 1789 | t | 3 | 0
23 | 1661440046551 | 1766 | t | 3 | 0
23 | 1661440016862 | 1846 | t | 3 | 0
23 | 1661440047610 | 1736 | t | 3 | 0
23 | 1661440048670 | 1705 | t | 3 | 0
23 | 1661440017920 | 1863 | t | 3 | 0
23 | 1661440049726 | 1694 | t | 3 | 0
23 | 1661440050783 | 1694 | t | 3 | 0
23 | 1661440018981 | 1876 | t | 3 | 0
23 | 1661440051840 | 1696 | t | 3 | 0
23 | 1661440055015 | 1692 | t | 3 | 0
23 | 1661440056071 | 1693 | t | 3 | 0
23 | 1661440322966 | 1916 | t | 3 | 0
23 | 1661440325082 | 1916 | t | 3 | 0
23 | 1661440326142 | 1926 | t | 3 | 0
23 | 1661440057131 | 1693 | t | 3 | 0
23 | 1661440327199 | 1913 | t | 3 | 0
23 | 1661440058189 | 1692 | t | 3 | 0
23 | 1661440328256 | 1915 | t | 3 | 0
23 | 1661440059247 | 1691 | t | 3 | 0
23 | 1661440329315 | 1923 | t | 3 | 0
23 | 1661440060306 | 1692 | t | 3 | 0
23 | 1661440330376 | 1912 | t | 3 | 0
23 | 1661440061363 | 1676 | t | 3 | 0
23 | 1661440331470 | 1913 | t | 3 | 0
23 | 1661440062437 | 1664 | t | 3 | 0
23 | 1663581885133 | 1678 | t | 3 | 0
23 | 1661440332530 | 1923 | t | 3 | 0
23 | 1661440064552 | 1667 | t | 3 | 0
23 | 1661440334647 | 1915 | t | 3 | 0
23 | 1661440335708 | 1913 | t | 3 | 0
23 | 1661440065608 | 1665 | t | 3 | 0
23 | 1661440066665 | 1668 | t | 3 | 0
23 | 1661440336763 | 1912 | t | 3 | 0
23 | 1661440337822 | 1913 | t | 3 | 0
23 | 1661440338879 | 1911 | t | 3 | 0
23 | 1661440068780 | 1664 | t | 3 | 0
23 | 1661440339939 | 1912 | t | 3 | 0
(100 rows)
iobroker=# \d ts_number
Table "public.ts_number"
 Column |  Type   | Collation | Nullable | Default
--------+---------+-----------+----------+---------
 id     | integer |           | not null |
 ts     | bigint  |           | not null |
 val    | real    |           |          |
 ack    | boolean |           |          |
 _from  | integer |           |          |
 q      | integer |           |          |
Indexes:
"ts_number_pkey" PRIMARY KEY, btree (id, ts)
You can do this with a mix of generate_series() and some window functions.
First we use generate_series() to get all the one-second timestamps in the desired range. Then we join to our readings to find which consumption values we have. We group each NULL with its most recent non-NULL reading, and finally set the consumption to the same value for the whole group.
So: if we have readings like this:
richardh=> SELECT * FROM readings;
id | ts | consumption
----+------------------------+-------------
1 | 2023-02-16 20:29:13+00 | 900
2 | 2023-02-16 20:29:16+00 | 1000
3 | 2023-02-16 20:29:20+00 | 925
(3 rows)
We can get all of the seconds we might want like this:
richardh=> SELECT generate_series(timestamptz '2023-02-16 20:29:13+00', timestamptz '2023-02-16 20:29:30+00', interval '1 second');
generate_series
------------------------
2023-02-16 20:29:13+00
2023-02-16 20:29:14+00
...etc...
2023-02-16 20:29:29+00
2023-02-16 20:29:30+00
(18 rows)
Then we join our complete set of timestamps to our readings:
WITH wanted_timestamps (ts) AS (
    SELECT generate_series(
        timestamptz '2023-02-16 20:29:13+00',
        timestamptz '2023-02-16 20:29:30+00',
        interval '1 second')
)
SELECT
    wt.ts
    , r.consumption
    , sum(CASE WHEN r.consumption IS NOT NULL THEN 1 ELSE 0 END)
        OVER (ORDER BY ts) AS group_num
FROM
    wanted_timestamps wt
    LEFT JOIN readings r USING (ts)
ORDER BY wt.ts;
ts | consumption | group_num
------------------------+-------------+-----------
2023-02-16 20:29:13+00 | 900 | 1
2023-02-16 20:29:14+00 | | 1
2023-02-16 20:29:15+00 | | 1
2023-02-16 20:29:16+00 | 1000 | 2
2023-02-16 20:29:17+00 | | 2
2023-02-16 20:29:18+00 | | 2
2023-02-16 20:29:19+00 | | 2
2023-02-16 20:29:20+00 | 925 | 3
2023-02-16 20:29:21+00 | | 3
2023-02-16 20:29:22+00 | | 3
2023-02-16 20:29:23+00 | | 3
2023-02-16 20:29:24+00 | | 3
2023-02-16 20:29:25+00 | | 3
2023-02-16 20:29:26+00 | | 3
2023-02-16 20:29:27+00 | | 3
2023-02-16 20:29:28+00 | | 3
2023-02-16 20:29:29+00 | | 3
2023-02-16 20:29:30+00 | | 3
(18 rows)
Finally, fill in the missing consumption values:
WITH wanted_timestamps (ts) AS (
    SELECT generate_series(
        timestamptz '2023-02-16 20:29:13+00',
        timestamptz '2023-02-16 20:29:30+00',
        interval '1 second')
), grouped_values AS (
    SELECT
        wt.ts
        , r.consumption
        , sum(CASE WHEN r.consumption IS NOT NULL THEN 1 ELSE 0 END)
            OVER (ORDER BY ts) AS group_num
    FROM wanted_timestamps wt
    LEFT JOIN readings r USING (ts)
)
SELECT
    gv.ts
    , first_value(gv.consumption) OVER (PARTITION BY group_num ORDER BY ts)
        AS consumption
FROM
    grouped_values gv
ORDER BY ts;
ts | consumption
------------------------+-------------
2023-02-16 20:29:13+00 | 900
2023-02-16 20:29:14+00 | 900
2023-02-16 20:29:15+00 | 900
2023-02-16 20:29:16+00 | 1000
2023-02-16 20:29:17+00 | 1000
2023-02-16 20:29:18+00 | 1000
2023-02-16 20:29:19+00 | 1000
2023-02-16 20:29:20+00 | 925
2023-02-16 20:29:21+00 | 925
2023-02-16 20:29:22+00 | 925
2023-02-16 20:29:23+00 | 925
2023-02-16 20:29:24+00 | 925
2023-02-16 20:29:25+00 | 925
2023-02-16 20:29:26+00 | 925
2023-02-16 20:29:27+00 | 925
2023-02-16 20:29:28+00 | 925
2023-02-16 20:29:29+00 | 925
2023-02-16 20:29:30+00 | 925
(18 rows)
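To adapt this to the ts_number table above and produce the daily, monthly, yearly, and all-time totals the asker wants, two further steps are needed: convert the bigint millisecond timestamps (to_timestamp(ts / 1000.0) in PostgreSQL) and aggregate the gap-filled rows. A minimal sketch, assuming the per-second output of the query above has been materialized as filled_readings(ts, consumption) with consumption in watts:
-- Each gap-filled row covers exactly one second, so a reading of
-- `consumption` watts contributes that many watt-seconds (Ws) of energy.
SELECT
    date_trunc('day', ts)          AS day
    , sum(consumption)             AS ws    -- watt-seconds
    , sum(consumption) / 3600.0    AS wh    -- watt-hours
    , sum(consumption) / 3600000.0 AS kwh   -- kilowatt-hours
FROM filled_readings
GROUP BY date_trunc('day', ts)
ORDER BY day;
Swapping 'day' for 'month' or 'year' in date_trunc() gives the other rollups; dropping the GROUP BY entirely gives the all-time total.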

How to calculate various sum columns based on the value of another in SQL?

Question: Write a query which will output the user count today (uc), as well as from 7 (uc7), 14 (uc14), and 30 (uc30) days ago.
Table: num_users
+------------+------------+
| dateid | user_count |
+------------+------------+
| 2014-12-31 | 1010 |
| 2014-12-30 | 1000 |
| 2014-12-29 | 990 |
| 2014-12-28 | 980 |
| 2014-12-27 | 970 |
| 2014-12-26 | 960 |
| 2014-12-25 | 950 |
| 2014-12-24 | 940 |
| 2014-12-23 | 930 |
| 2014-12-22 | 920 |
| 2014-12-21 | 910 |
| 2014-12-20 | 900 |
| 2014-12-19 | 890 |
| 2014-12-18 | 880 |
| 2014-12-17 | 870 |
| 2014-12-16 | 860 |
| 2014-12-15 | 850 |
| 2014-12-14 | 840 |
| 2014-12-13 | 830 |
| 2014-12-12 | 820 |
| 2014-12-11 | 810 |
| 2014-12-10 | 800 |
| 2014-12-09 | 790 |
| 2014-12-08 | 780 |
| 2014-12-07 | 770 |
| 2014-12-06 | 760 |
| 2014-12-05 | 750 |
| 2014-12-04 | 740 |
| 2014-12-03 | 730 |
| 2014-12-02 | 720 |
| 2014-12-01 | 710 |
+------------+------------+
Desired Output:
+------------+------+------+------+------+
| dateid | uc | uc7 | uc14 | uc30 |
+------------+------+------+------+------+
| 2014-12-31 | 1010 | 940 | 870 | 710 |
| 2014-12-30 | 1000 | 930 | 860 | 0 |
| 2014-12-29 | 990 | 920 | 850 | 0 |
| 2014-12-28 | 980 | 910 | 840 | 0 |
| 2014-12-27 | 970 | 900 | 830 | 0 |
| 2014-12-26 | 960 | 890 | 820 | 0 |
| 2014-12-25 | 950 | 880 | 810 | 0 |
| 2014-12-24 | 940 | 870 | 800 | 0 |
| 2014-12-23 | 930 | 860 | 790 | 0 |
| 2014-12-22 | 920 | 850 | 780 | 0 |
| 2014-12-21 | 910 | 840 | 770 | 0 |
| 2014-12-20 | 900 | 830 | 760 | 0 |
| 2014-12-19 | 890 | 820 | 750 | 0 |
| 2014-12-18 | 880 | 810 | 740 | 0 |
| 2014-12-17 | 870 | 800 | 730 | 0 |
| 2014-12-16 | 860 | 790 | 720 | 0 |
| 2014-12-15 | 850 | 780 | 710 | 0 |
| 2014-12-14 | 840 | 770 | 0 | 0 |
| 2014-12-13 | 830 | 760 | 0 | 0 |
| 2014-12-12 | 820 | 750 | 0 | 0 |
| 2014-12-11 | 810 | 740 | 0 | 0 |
| 2014-12-10 | 800 | 730 | 0 | 0 |
| 2014-12-09 | 790 | 720 | 0 | 0 |
| 2014-12-08 | 780 | 710 | 0 | 0 |
| 2014-12-07 | 770 | 0 | 0 | 0 |
| 2014-12-06 | 760 | 0 | 0 | 0 |
| 2014-12-05 | 750 | 0 | 0 | 0 |
| 2014-12-04 | 740 | 0 | 0 | 0 |
| 2014-12-03 | 730 | 0 | 0 | 0 |
| 2014-12-02 | 720 | 0 | 0 | 0 |
| 2014-12-01 | 710 | 0 | 0 | 0 |
+------------+------+------+------+------+
How do I properly do this?
I tried the solution below, but it does not produce the right result:
SELECT dateid AS today,
(SELECT SUM(user_count) FROM num_users WHERE dateid = dateid) AS uc,
(SELECT SUM(user_count) FROM num_users WHERE dateid - 7) AS uc7,
(SELECT SUM(user_count) FROM num_users WHERE dateid - 14) AS uc14,
(SELECT SUM(user_count) FROM num_users WHERE dateid - 14) AS uc30
FROM num_users
This produces the presented output:
SELECT num_users.dateid, num_users.user_count AS uc,
(SELECT user_count FROM num_users AS A WHERE A.dateid=num_users.dateid-7) AS uc7,
(SELECT user_count FROM num_users AS A WHERE A.dateid=num_users.dateid-14) AS uc14,
(SELECT user_count FROM num_users AS A WHERE A.dateid=num_users.dateid-30) AS uc30
FROM num_users
ORDER BY num_users.dateid DESC;
But maybe you really want:
SELECT Sum(num_users.user_count) AS uc,
Sum(IIf([dateid]<=#12/31/2014#-7,[user_count],0)) AS uc7,
Sum(IIf([dateid]<=#12/31/2014#-14,[user_count],0)) AS uc14,
Sum(IIf([dateid]<=#12/31/2014#-30,[user_count],0)) AS uc30
FROM num_users;
The above was tested with Access. If the data actually continues through the current date, replace #12/31/2014# with Date(). The date-literal format and function names will most likely differ on other database platforms.
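For a database with standard date arithmetic, a portable sketch (PostgreSQL syntax, assuming dateid is a DATE, where date - integer subtracts days) that reproduces the desired output, including the zeros for missing offsets, uses self-joins plus COALESCE:
SELECT n.dateid,
       n.user_count                AS uc,
       COALESCE(n7.user_count, 0)  AS uc7,
       COALESCE(n14.user_count, 0) AS uc14,
       COALESCE(n30.user_count, 0) AS uc30
FROM num_users n
LEFT JOIN num_users n7  ON n7.dateid  = n.dateid - 7   -- 7 days earlier
LEFT JOIN num_users n14 ON n14.dateid = n.dateid - 14  -- 14 days earlier
LEFT JOIN num_users n30 ON n30.dateid = n.dateid - 30  -- 30 days earlier
ORDER BY n.dateid DESC;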

SQL: Filter rows with a specific pattern

I'm a bit new to SQL.
I have the following table:
+-----+---------+------------------------+
| ID | ID_TEST | FILE_PATH |
+-----+---------+------------------------+
| 575 | 3 | Landscapes_001_h_A.jpg |
| 576 | 3 | Landscapes_001_h_B.jpg |
| 577 | 3 | Landscapes_001_h_C.jpg |
| 578 | 3 | Landscapes_001_h_D.jpg |
| 579 | 3 | Landscapes_001_h_E.jpg |
| 580 | 3 | Landscapes_002_h_A.jpg |
| 581 | 3 | Landscapes_002_h_B.jpg |
| 582 | 3 | Landscapes_002_h_C.jpg |
| 583 | 3 | Landscapes_002_h_D.jpg |
| 584 | 3 | Landscapes_002_h_E.jpg |
+-----+---------+------------------------+
The pattern for a picture is Landscapes_XXX_h_Y.jpg, where XXX is a number from 1 to 185 (zero-padded to three digits) and Y is the quality version, from A to E.
I want to select each image once, with a varying quality version.
The output should be
+-----+---------+------------------------+
| ID | ID_TEST | FILE_PATH |
+-----+---------+------------------------+
| 575 | 3 | Landscapes_001_h_A.jpg |
| 576 | 3 | Landscapes_002_h_E.jpg |
| 577 | 3 | Landscapes_003_h_C.jpg |
| 578 | 3 | Landscapes_004_h_B.jpg |
| 579 | 3 | Landscapes_005_h_D.jpg |
| 580 | 3 | Landscapes_006_h_A.jpg |
| 581 | 3 | Landscapes_007_h_E.jpg |
| 582 | 3 | Landscapes_008_h_C.jpg |
| 583 | 3 | Landscapes_009_h_B.jpg |
| 584 | 3 | Landscapes_010_h_E.jpg |
+-----+---------+------------------------+
but of course for all 185 elements.
I'm using MariaDB 5.5.60.
How do I write this SELECT statement? Using REGEXP?
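A minimal sketch for MariaDB 5.5, which has no window functions, assuming a hypothetical table name pictures and the fixed Landscapes_XXX_h_Y.jpg layout. The sample output above doesn't follow one obvious deterministic cycle, so this simply rotates through the five qualities by image number, keeping exactly one row per image:
-- The 3-digit number starts at character 12; the quality letter is character 18.
SELECT ID, ID_TEST, FILE_PATH
FROM pictures  -- hypothetical table name
WHERE SUBSTRING(FILE_PATH, 18, 1) =
      ELT(1 + MOD(CAST(SUBSTRING(FILE_PATH, 12, 3) AS UNSIGNED), 5),
          'A', 'B', 'C', 'D', 'E');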

SQL Server: I need to create copies of records from 2 tables and ensure FK reflects these copies

In SQL Server 2016, I need to create nearly exact copies of records from two tables. The only differences will be their primary keys, one other column that I'm resetting to zero, and a foreign key in my second table (which is the PK of my first table). I can create copies of both tables fine; however, I don't know how to assign new FK values to the second table so that they correctly reflect the new primary keys from the first table.
Here are the current records in both tables:
Table 1: Batches
+-----------------------+-----------+----------------+------------+
| BatchID (pk identity) | StartDate | ProcessingStep | BatchCount |
+-----------------------+-----------+----------------+------------+
| 1 | 5/10/2019 | 2 | 8203 |
| 2 | 5/11/2019 | 2 | 345 |
| 3 | 5/12/2019 | 2 | 5014 |
+-----------------------+-----------+----------------+------------+
Table 2: ItemList
+--------------------------+---------+--------+-----------+-------------+
| ItemListID (pk identity) | BatchID | ItemID | Processed | ProcessDate |
+--------------------------+---------+--------+-----------+-------------+
| 1000 | 1 | 201 | 1 | 5/10/2019 |
| 1001 | 1 | 689 | 1 | 5/10/2019 |
| 1002 | 2 | 548 | 1 | 5/11/2019 |
| 1003 | 2 | 693 | 1 | 5/11/2019 |
| 1004 | 3 | 123 | 1 | 5/12/2019 |
| 1005 | 3 | 999 | 1 | 5/12/2019 |
+--------------------------+---------+--------+-----------+-------------+
I now want to create copies of these records with the following exceptions:
Batches.ProcessingStep for all records will now be set to zero
ItemList's Processed & ProcessDate are reset to zero & null respectively
Update ItemList.BatchID to reflect the new PK of the copied Batches records (this is where I'm having trouble)
Currently, my script for updating my tables is as follows:
INSERT INTO Batches(StartDate, ProcessingStep, BatchCount)
SELECT StartDate, 0, BatchCount
FROM Batches
WHERE BatchID IN (1,2,3)
INSERT INTO ItemList(BatchID, ItemID, Processed, ProcessDate)
SELECT <<?? not sure ??>>, ItemID, 0, NULL
FROM ItemList
WHERE ItemListID BETWEEN 1000 AND 1005
And here would be my final results:
Table 1: Batches
+---------+-----------+----------------+------------+
| BatchID | StartDate | ProcessingStep | BatchCount |
+---------+-----------+----------------+------------+
| 1 | 5/10/2019 | 2 | 8203 |
| 2 | 5/11/2019 | 2 | 345 |
| 3 | 5/12/2019 | 2 | 5014 |
| 4 | 5/10/2019 | 0 | 8203 |
| 5 | 5/11/2019 | 0 | 345 |
| 6 | 5/12/2019 | 0 | 5014 |
+---------+-----------+----------------+------------+
Table 2: ItemList
+------------+---------+--------+-----------+-------------+
| ItemListID | BatchID | ItemID | Processed | ProcessDate |
+------------+---------+--------+-----------+-------------+
| 1000 | 1 | 201 | 1 | 5/10/2019 |
| 1001 | 1 | 689 | 1 | 5/10/2019 |
| 1002 | 2 | 548 | 1 | 5/11/2019 |
| 1003 | 2 | 693 | 1 | 5/11/2019 |
| 1004 | 3 | 123 | 1 | 5/12/2019 |
| 1005 | 3 | 999 | 1 | 5/12/2019 |
| 1006 | 4 | 201 | 0 | NULL |
| 1007 | 4 | 689 | 0 | NULL |
| 1008 | 5 | 548 | 0 | NULL |
| 1009 | 5 | 693 | 0 | NULL |
| 1010 | 6 | 123 | 0 | NULL |
| 1011 | 6 | 999 | 0 | NULL |
+------------+---------+--------+-----------+-------------+
How would I go about populating that ItemList.BatchID foreign key correctly?
Thanks.
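One common SQL Server pattern for this (a sketch; @BatchMap is a hypothetical name): a MERGE with a never-true match condition, because the OUTPUT clause of a MERGE, unlike that of a plain INSERT, may reference source columns, letting you capture the old-to-new BatchID mapping in one pass:
DECLARE @BatchMap TABLE (OldBatchID int, NewBatchID int);

MERGE Batches AS tgt
USING (SELECT BatchID, StartDate, BatchCount
       FROM Batches
       WHERE BatchID IN (1, 2, 3)) AS src
ON 1 = 0  -- never matches, so every source row is inserted
WHEN NOT MATCHED THEN
    INSERT (StartDate, ProcessingStep, BatchCount)
    VALUES (src.StartDate, 0, src.BatchCount)
OUTPUT src.BatchID, inserted.BatchID INTO @BatchMap;

INSERT INTO ItemList (BatchID, ItemID, Processed, ProcessDate)
SELECT m.NewBatchID, il.ItemID, 0, NULL
FROM ItemList AS il
JOIN @BatchMap AS m ON m.OldBatchID = il.BatchID
WHERE il.ItemListID BETWEEN 1000 AND 1005;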

How to calculate running total in SQL

I have my dataset in the format given below.
It's month-level data with a salary for each month.
I need to calculate the cumulative salary through each month end. How can I do this?
+----------+-------+--------+---------------+
| Account | Month | Salary | Running Total |
+----------+-------+--------+---------------+
| a | 1 | 586 | 586 |
| a | 2 | 928 | 1514 |
| a | 3 | 726 | 2240 |
| b | 1 | 538 | 538 |
| b | 2 | 956 | 1494 |
| b | 3 | 667 | 2161 |
| b | 4 | 841 | 3002 |
| c | 1 | 826 | 826 |
| c | 2 | 558 | 1384 |
| c | 3 | 558 | 1972 |
| c | 4 | 735 | 2707 |
| c | 5 | 691 | 3398 |
| d | 1 | 670 | 670 |
| d | 4 | 838 | 1508 |
| d | 5 | 1000 | 2508 |
+----------+-------+--------+---------------+
I need to calculate this running-total (cumulative) column. How can I do this efficiently in SQL?
You can use SUM with an ORDER BY clause inside the OVER clause:
SELECT Account, Month, Salary,
       SUM(Salary) OVER (PARTITION BY Account ORDER BY Month) AS RunningTotal
FROM mytable;
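One caveat: with ORDER BY and no explicit frame, the window defaults to RANGE BETWEEN UNBOUNDED PRECEDING AND CURRENT ROW, so duplicate Month values within an account would be treated as peers and summed together. If Month can repeat, an explicit ROWS frame keeps the total strictly row by row (same assumed table name as above):
SELECT Account, Month, Salary,
       SUM(Salary) OVER (PARTITION BY Account
                         ORDER BY Month
                         ROWS BETWEEN UNBOUNDED PRECEDING
                                  AND CURRENT ROW) AS RunningTotal
FROM mytable;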