Access/SQL - split multiple columns to multiple rows based on column ID

I've tried a few things but can't get this to work efficiently. Help!
Customer dataset example (basically a dump of over 100 sensor board readings with 42 sensors per board)
| BoardName | ReadingTime | Sensor1 | Sensor2 | Sensor3 | Sensor4 ... Sensor42 |
-------------------------------------------------------------------------------------
| BoardA | 1224201301 | 18 | 24 | 7 | etc etc for each column |
| BoardB | 1224201301 | 18 | 23 | 8 | etc etc for each column |
| BoardC | 1224201301 | 17 | 24 | 7 | etc etc for each column |
| BoardD | 1224201301 | 16 | 23 | 6 | etc etc for each column |
| BoardA | 1224201302 | 18 | 22 | 5 | etc etc for each column |
| BoardB | 1224201302 | 18 | 23 | 5 | etc etc for each column |
-------------------------------------------------------------------------------------
This seems like a pretty inefficient table design. I'd like to get it into SQL more like the following example, which makes the data a little more accessible.
| SensorID | ReadingTime | SensorValue |
----------------------------------------
| BrdASen1 | 1224201301 | 18 |
| BrdASen2 | 1224201301 | 24 |
| BrdASen3 | 1224201301 | 7 |
| BrdBSen1 | 1224201301 | 18 |
| BrdBSen2 | 1224201301 | 23 |
| BrdBSen3 | 1224201301 | 8 |
| etc etc |
----------------------------------------
So basically, I want to iterate through each row of imported data, split off the 42 columns into individual rows with a 3-column SensorID/Date/Value format.

I would use this table structure:
|BoardName|SensorId| ReadingTime | SensorValue |
------------------------------------------------
|BoardA | 1 | 1224201301 | 18 |
|BoardA | 2 | 1224201301 | 24 |
SensorId values would be from 1 - 42.
To populate the table, you'll need to use the UNPIVOT operator.
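For example, here is a rough T-SQL sketch (assuming the data has been imported into a SQL Server table named SensorReadings; Access itself has no UNPIVOT, so there you would emulate it with a UNION ALL of 42 selects):
-- Hypothetical sketch: assumes SQL Server and a source table named SensorReadings
-- with columns BoardName, ReadingTime, Sensor1 ... Sensor42.
SELECT BoardName,
       CAST(REPLACE(SensorId, 'Sensor', '') AS int) AS SensorId,  -- 'Sensor7' -> 7
       ReadingTime,
       SensorValue
FROM SensorReadings
UNPIVOT (
    SensorValue FOR SensorId IN (Sensor1, Sensor2, Sensor3 /* ... list all 42 sensor columns */)
) AS u;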

Related

How to update table 2 from the inserted data in table 1?

Can you help me with the query I need to update one table with data from another?
I have two tables, for example:
tbl_med_take
| id | name   | med  | qty |
-----------------------------
| 1  | jayson | med2 | 3   |
| 2  | may    | med2 | 4   |
| 3  | jenny  | med3 | 6   |
| 4  | joel   | med3 | 4   |
tbl_med
| id | med  | stocks |
-----------------------
| 1  | med1 | 20     |
| 2  | med2 | 17     |
| 3  | med3 | 24     |
The output that I want in tbl_med:
tbl_med
| id | med  | stocks |
-----------------------
| 1  | med1 | 20     |
| 2  | med2 | 10     |
| 3  | med3 | 14     |
First get the total consumed from tbl_med_take using:
select med, sum(qty) as total from tbl_med_take group by med
Then you can left join with tbl_med and subtract:
select m.id, m.med, (m.stocks - ISNULL(n.total, 0)) as stocks
from tbl_med m
left join
    (select med, sum(qty) as total from tbl_med_take group by med) n
    on m.med = n.med
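If you want to actually write the new values back into tbl_med rather than just select them, here is a rough sketch in SQL Server-style syntax (the ISNULL above suggests that dialect; other engines such as MySQL use UPDATE ... JOIN instead):
-- Hedged sketch: assumes SQL Server's UPDATE ... FROM syntax.
UPDATE m
SET m.stocks = m.stocks - n.total
FROM tbl_med m
JOIN (select med, sum(qty) as total from tbl_med_take group by med) n
  ON m.med = n.med;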

How to remove 'Total' from Excel while checking SSAS cube data

The Rank is also using the Grand Total value of the column. How do I remove that?
(The rank is a calculation.)
+---------------------+------------+------+
| Row Labels | Change | Rank |
+---------------------+------------+------+
| ADDERALL XR | 20,236.00 | 7 |
| ATOMOXETINE | 11,448.00 | 9 |
| BIPHENTIN | 87,007.00 | 4 |
| CONCERTA | 151,397.00 | 3 |
| CONCERTA | | 11 |
| DEXEDRINE | 2,065.00 | 10 |
| GENERIC ATOMOXETINE | 17,778.00 | 8 |
| INTUNIV XR | | 12 |
| METHYLPHENIDATE ER | 21,969.00 | 6 |
| METHYPHENIDATE ER | | 13 |
| RITALIN IR | 40,826.00 | 5 |
| RITALIN SR | -19,238.00 | 14 |
| STRATTERA | -19,555.00 | 15 |
| VYVANSE | 220,762.00 | 2 |
| Grand Total | 534,695.00 | 1 |
+---------------------+------------+------+
In the Excel sheet, right-click the area where the cube data is populated and open PivotTable Options. Go to the Totals & Filters tab and uncheck the grand totals for columns option, and your totals will go away. Note this removes totals for all columns, so for any column where you still need a total you will have to sum it explicitly with a formula.

Inefficient SQL Search Query - Oracle DB

My logic in this query is right (well, I'm 80% sure it is), but it has been running for 2h 23min and is still going. I was wondering if someone could help me make this run a bit more efficiently, as I don't think it's that intense of a query.
SELECT b.bridge_no, COUNT(*) AS comment_cnt
FROM iacd_asset b INNER JOIN iacd_note c
ON REGEXP_LIKE(c.comments, '(^|\W)BN' || b.bridge_no || '(\W|$)', 'i')
inner join ncr_note e on c.note_id=e.note_id
inner join ncr f on e.ncr_id=f.ncr_id
inner join ncr_iac g on f.ncr_id=g.ncr_id
WHERE c.create_dt >= date'2015-01-01'
AND c.create_dt < date'2015-03-12'
AND length(b.bridge_no) > 1
AND g.scheme in (1, 3, 5, 6, 7, 8, 9, 9, and about 10 more values)
GROUP BY b.bridge_no
ORDER BY comment_cnt;
In short, the query should be making a bunch of joins, then filtering the joined table by schemes (g.scheme in ...), and then parsing the notes field for anything with 'BN' in it.
PLAN TABLE (OK, I have never used one before, but I believe this is the plan table):
+------------------+----------------+--------------+------------------+--------------+-----------------+----------------+-----------+----+-----------+-------+----------+---------+-------------+-------------+-----------------+---------+------------+-----------------------------+---------------------------------------------------------------------------------+-----------------------------------------------------------------------------------------+------+--------------+
| OPERATION | OPTIONS | OBJECT_OWNER | OBJECT_NAME | OBJECT_ALIAS | OBJECT_INSTANCE | OBJECT_TYPE | OPTIMIZER | ID | PARENT_ID | DEPTH | POSITION | COST | CARDINALITY | BYTES | CPU_COST | IO_COST | TEMP_SPACE | ACCESS_PREDICATES | FILTER_PREDICATES | PROJECTION | TIME | QBLOCK_NAME |
+------------------+----------------+--------------+------------------+--------------+-----------------+----------------+-----------+----+-----------+-------+----------+---------+-------------+-------------+-----------------+---------+------------+-----------------------------+---------------------------------------------------------------------------------+-----------------------------------------------------------------------------------------+------+--------------+
| SELECT STATEMENT | | | | | | | ALL_ROWS | 0 | | 0 | 281,503 | 281,503 | 40 | 4,480 | 148,378,917,975 | 215,677 | | | | | 458 | |
+------------------+----------------+--------------+------------------+--------------+-----------------+----------------+-----------+----+-----------+-------+----------+---------+-------------+-------------+-----------------+---------+------------+-----------------------------+---------------------------------------------------------------------------------+-----------------------------------------------------------------------------------------+------+--------------+
| SORT | ORDER BY | | | | | | | 1 | 0 | 1 | 1 | 281,503 | 40 | 4,480 | 148,378,917,975 | 215,677 | | | | (#keys=1) COUNT(*)[22], "B"."BRIDGE_NO"[NUMBER,22] | 458 | SEL$81719215 |
+------------------+----------------+--------------+------------------+--------------+-----------------+----------------+-----------+----+-----------+-------+----------+---------+-------------+-------------+-----------------+---------+------------+-----------------------------+---------------------------------------------------------------------------------+-----------------------------------------------------------------------------------------+------+--------------+
| HASH | GROUP BY | | | | | | | 2 | 1 | 2 | 1 | 281,503 | 40 | 4,480 | 148,378,917,975 | 215,677 | | | | (#keys=1) "B"."BRIDGE_NO"[NUMBER,22], COUNT(*)[22] | 458 | |
+------------------+----------------+--------------+------------------+--------------+-----------------+----------------+-----------+----+-----------+-------+----------+---------+-------------+-------------+-----------------+---------+------------+-----------------------------+---------------------------------------------------------------------------------+-----------------------------------------------------------------------------------------+------+--------------+
| HASH JOIN | | | | | | | | 3 | 2 | 3 | 1 | 281,497 | 16,084 | 1,801,408 | 148,366,537,976 | 215,677 | 24,126,000 | "G"."NCR_ID"="F"."NCR_ID" | | (#keys=1) "B"."BRIDGE_NO"[NUMBER,22] | 458 | |
+------------------+----------------+--------------+------------------+--------------+-----------------+----------------+-----------+----+-----------+-------+----------+---------+-------------+-------------+-----------------+---------+------------+-----------------------------+---------------------------------------------------------------------------------+-----------------------------------------------------------------------------------------+------+--------------+
| HASH JOIN | | | | | | | | 4 | 3 | 4 | 1 | 96,996 | 209,778 | 21,607,134 | 13,549,630,814 | 90,985 | 22,725,000 | "E"."NCR_ID"="F"."NCR_ID" | | (#keys=1) "F"."NCR_ID"[NUMBER,22], "B"."BRIDGE_NO"[NUMBER,22] | 158 | |
+------------------+----------------+--------------+------------------+--------------+-----------------+----------------+-----------+----+-----------+-------+----------+---------+-------------+-------------+-----------------+---------+------------+-----------------------------+---------------------------------------------------------------------------------+-----------------------------------------------------------------------------------------+------+--------------+
| HASH JOIN | | | | | | | | 5 | 4 | 5 | 1 | 42,595 | 208,419 | 20,216,643 | 5,484,063,163 | 40,162 | 9,839,000 | "C"."NOTE_ID"="E"."NOTE_ID" | REGEXP_LIKE ("C"."COMMENTS",'(^|\W)BN'||TO_CHAR("B"."BRIDGE_NO")||'(\W|$)','i') | (#keys=1) "B"."BRIDGE_NO"[NUMBER,22], "E"."NCR_ID"[NUMBER,22] | 70 | |
+------------------+----------------+--------------+------------------+--------------+-----------------+----------------+-----------+----+-----------+-------+----------+---------+-------------+-------------+-----------------+---------+------------+-----------------------------+---------------------------------------------------------------------------------+-----------------------------------------------------------------------------------------+------+--------------+
| PARTITION RANGE | SINGLE | | | | | | | 6 | 5 | 6 | 1 | 1,039 | 104,603 | 8,577,446 | 62,280,224 | 1,011 | | | | "C"."NOTE_ID"[NUMBER,22], "C"."COMMENTS"[VARCHAR2,4000] | 2 | |
+------------------+----------------+--------------+------------------+--------------+-----------------+----------------+-----------+----+-----------+-------+----------+---------+-------------+-------------+-----------------+---------+------------+-----------------------------+---------------------------------------------------------------------------------+-----------------------------------------------------------------------------------------+------+--------------+
| TABLE ACCESS | FULL | IACDB | IACD_NOTE | C#SEL$1 | 2 | TABLE | ANALYZED | 7 | 6 | 7 | 1 | 1,039 | 104,603 | 8,577,446 | 62,280,224 | 1,011 | | | "C"."CREATE_DATE"<TO_DATE(' 2014-12-31 00:00:00', 'syyyy-mm-dd hh24:mi:ss') | "C"."NOTE_ID"[NUMBER,22], "C"."COMMENTS"[VARCHAR2,4000] | 2 | SEL$81719215 |
+------------------+----------------+--------------+------------------+--------------+-----------------+----------------+-----------+----+-----------+-------+----------+---------+-------------+-------------+-----------------+---------+------------+-----------------------------+---------------------------------------------------------------------------------+-----------------------------------------------------------------------------------------+------+--------------+
| MERGE JOIN | CARTESIAN | | | | | | | 8 | 5 | 6 | 2 | 24,267 | 12,268,270 | 184,024,050 | 2,780,501,758 | 23,033 | | | | (#keys=0) "B"."BRIDGE_NO"[NUMBER,22], "E"."NCR_ID"[NUMBER,22], "E"."NOTE_ID"[NUMBER,22] | 40 | |
+------------------+----------------+--------------+------------------+--------------+-----------------+----------------+-----------+----+-----------+-------+----------+---------+-------------+-------------+-----------------+---------+------------+-----------------------------+---------------------------------------------------------------------------------+-----------------------------------------------------------------------------------------+------+--------------+
| TABLE ACCESS | FULL | IACDB | IACD_ASSET | B#SEL$1 | 1 | TABLE | ANALYZED | 9 | 8 | 7 | 1 | 7 | 40 | 160 | 560,542 | 7 | | | LENGTH(TO_CHAR("B"."BRIDGE_NO"))>1 | "B"."BRIDGE_NO"[NUMBER,22] | 1 | SEL$81719215 |
+------------------+----------------+--------------+------------------+--------------+-----------------+----------------+-----------+----+-----------+-------+----------+---------+-------------+-------------+-----------------+---------+------------+-----------------------------+---------------------------------------------------------------------------------+-----------------------------------------------------------------------------------------+------+--------------+
| BUFFER | SORT | | | | | | | 10 | 8 | 7 | 2 | 24,259 | 308,248 | 3,390,728 | 2,779,941,216 | 23,026 | | | | (#keys=0) "E"."NCR_ID"[NUMBER,22], "E"."NOTE_ID"[NUMBER,22] | 40 | |
+------------------+----------------+--------------+------------------+--------------+-----------------+----------------+-----------+----+-----------+-------+----------+---------+-------------+-------------+-----------------+---------+------------+-----------------------------+---------------------------------------------------------------------------------+-----------------------------------------------------------------------------------------+------+--------------+
| TABLE ACCESS | FULL | IACDB | IACD_NCR_NOTE | E#SEL$2 | 4 | TABLE | ANALYZED | 11 | 10 | 8 | 1 | 606 | 308,248 | 3,390,728 | 69,498,530 | 576 | | | | "E"."NCR_ID"[NUMBER,22], "E"."NOTE_ID"[NUMBER,22] | 1 | SEL$81719215 |
+------------------+----------------+--------------+------------------+--------------+-----------------+----------------+-----------+----+-----------+-------+----------+---------+-------------+-------------+-----------------+---------+------------+-----------------------------+---------------------------------------------------------------------------------+-----------------------------------------------------------------------------------------+------+--------------+
| INDEX | FAST FULL SCAN | IACDB | PK_IACDNCR_NCRID | F#SEL$3 | | INDEX (UNIQUE) | ANALYZED | 12 | 4 | 5 | 2 | 31,763 | 22,838,996 | 137,033,976 | 3,248,120,913 | 30,322 | | | | "F"."NCR_ID"[NUMBER,22] | 52 | SEL$81719215 |
+------------------+----------------+--------------+------------------+--------------+-----------------+----------------+-----------+----+-----------+-------+----------+---------+-------------+-------------+-----------------+---------+------------+-----------------------------+---------------------------------------------------------------------------------+-----------------------------------------------------------------------------------------+------+--------------+
| TABLE ACCESS | FULL | IACDB | IACD_NCR_IAC | G#SEL$4 | 8 | TABLE | ANALYZED | 13 | 3 | 4 | 2 | 181,461 | 1,731,062 | 15,579,558 | 134,407,812,606 | 121,833 | | | ALL THE SCHEMES CHCECKS | "G"."NCR_ID"[NUMBER,22] | 295 | SEL$81719215 |
+------------------+----------------+--------------+------------------+--------------+-----------------+----------------+-----------+----+-----------+-------+----------+---------+-------------+-------------+-----------------+---------+------------+-----------------------------+---------------------------------------------------------------------------------+-----------------------------------------------------------------------------------------+------+--------------+
Hopefully that's legible enough.
In terms of indexes, I assume only the fields that are being sorted are important.
create_dt is not indexed.
scheme id is indexed.
Maybe my ordering in the query is wrong...
The plan shows you're doing a FULL TABLE SCAN of IACD_NOTE and IACD_ASSET, and then doing a CARTESIAN join of them, because you have provided no criteria for linking one record in IACD_ASSET to a set of records in IACD_NOTE.
That's not my definition of a non-intense query, and the eye-popping values for CPU cost bear that out.
You need to replace this ...
FROM iacd_asset b INNER JOIN iacd_note c
ON REGEXP_LIKE(c.comments, '(^|\W)BN' || b.bridge_no || '(\W|$)', 'i')
... with an actual join on indexed columns. It would be helpful if Notes were linked to Assets by a foreign key of BRIDGE_NO or similar. I don't know your data model. Then you can use that regex as an additional filter in the WHERE clause.
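Purely as an illustration of that shape (and assuming, hypothetically, that iacd_note carried a bridge_no column linking it to iacd_asset; your real data model may use a different linking column, or none at all):
SELECT b.bridge_no, COUNT(*) AS comment_cnt
FROM iacd_asset b
INNER JOIN iacd_note c ON c.bridge_no = b.bridge_no   -- hypothetical indexed join column
INNER JOIN ncr_note e ON c.note_id = e.note_id
INNER JOIN ncr f ON e.ncr_id = f.ncr_id
INNER JOIN ncr_iac g ON f.ncr_id = g.ncr_id
WHERE c.create_dt >= DATE '2015-01-01'
  AND c.create_dt < DATE '2015-03-12'
  AND length(b.bridge_no) > 1
  AND g.scheme IN (1, 3, 5, 6, 7, 8, 9 /* ... remaining scheme ids ... */)
  AND REGEXP_LIKE(c.comments, '(^|\W)BN' || b.bridge_no || '(\W|$)', 'i')  -- regex now only filters; it no longer drives the join
GROUP BY b.bridge_no
ORDER BY comment_cnt;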
Also you join to three further tables, to get to something which allows an additional filter on SCHEME. Again, I don't know your data model but this seems pretty inefficient.
Unfortunately this is the sort of tuning which relies on domain knowledge. Fixing this query requires understanding of the data - its volume, distribution and skew, the data model itself and the business logic your query implements. This is way beyond the scope of the advice we can offer in StackOverflow.
One thing to consider, though it is a big decision, would be to index the comments with a free-text index. However, that has lots of ramifications (especially for space and database administration).
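For reference, a free-text index on an Oracle column is created with Oracle Text along these lines (assuming Oracle Text is installed; the index name here is made up):
CREATE INDEX iacd_note_comments_ctx
  ON iacd_note (comments)
  INDEXTYPE IS CTXSYS.CONTEXT;
-- searches can then use CONTAINS(comments, '<term>') > 0 instead of scanning every row with REGEXP_LIKE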

Selecting all rows in a master table and summing columns in multiple detail tables

I have a master table (Project List) along with several sub tables that are joined on one common field (RecNum). I need to get totals for all of the sub tables, by column and am not sure how to do it. This is a sample of the table design. There are more columns in each table (I need to pull * from "Project List") but I'm showing a sampling of the column names and values to get an idea of what to do.
Project List
| RecNum | Project Description |
| 6 | Sample description |
| 7 | Another sample |
WeekA
| RecNum | UserName | Day1Reg | Day1OT | Day2Reg | Day2OT | Day3Reg | Day3OT |
| 6 | JustMe | 1 | 2 | 3 | 4 | 5 | 6 |
| 6 | NotMe | 1 | 2 | 3 | 4 | 5 | 6 |
| 7 | JustMe | | | | | | |
| 7 | NotMe | | | | | | |
WeekB
| RecNum | UserName | Day1Reg | Day1OT | Day2Reg | Day2OT | Day3Reg | Day3OT |
| 6 | JustMe | 7 | 8 | 1 | 2 | 3 | 4 |
| 6 | NotMe | 7 | 8 | 1 | 2 | 3 | 4 |
| 7 | JustMe | | | | | | |
| 7 | NotMe | | | | | | |
So the first query should return the complete totals for both users, like this:
| RecNum | Project Description | sumReg | sumOT |
| 6 | Sample description | 40 | 52 |
| 7 | Another sample | 0 | 0 |
The second query should return the totals for just a specified user, (WHERE UserName = 'JustMe') like this:
| RecNum | Project Description | sumReg | sumOT |
| 6 | Sample description | 20 | 26 |
| 7 | Another sample | 0 | 0 |
Having multiple parallel tables with the same structure is usually a sign of poor database design. The data should really all be in one table, with additional columns specifying the week.
You can, however, use union all to bring the data together. The following is an example of a query:
select pl.recNum, pl.ProjectDescription,
       sum(Day1Reg + Day2Reg + Day3Reg) as reg,
       sum(Day1OT + Day2OT + Day3OT) as ot
from ProjectList pl join
     (select * from weekA union all
      select * from weekB
     ) w
     on pl.recNum = w.recNum
group by pl.recNum, pl.ProjectDescription;
In practice, you should not use select * with union all; you should list the columns out explicitly. You can add appropriate where clauses or conditional aggregation to get the results you want in any particular case.
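For instance, here is a rough sketch of the second query (per-user totals), with the columns listed explicitly and NULL day values treated as zero so that empty weeks come back as 0 rather than NULL. It assumes a SQL Server-style dialect, and the table and column names follow the query above, so they may need adjusting to your actual schema:
select pl.recNum, pl.ProjectDescription,
       sum(coalesce(w.Day1Reg, 0) + coalesce(w.Day2Reg, 0) + coalesce(w.Day3Reg, 0)) as sumReg,
       sum(coalesce(w.Day1OT, 0) + coalesce(w.Day2OT, 0) + coalesce(w.Day3OT, 0)) as sumOT
from ProjectList pl left join
     (select RecNum, UserName, Day1Reg, Day1OT, Day2Reg, Day2OT, Day3Reg, Day3OT from weekA
      union all
      select RecNum, UserName, Day1Reg, Day1OT, Day2Reg, Day2OT, Day3Reg, Day3OT from weekB
     ) w
     on pl.recNum = w.RecNum and w.UserName = 'JustMe'   -- keep the user filter in the join so projects with no rows still appear
group by pl.recNum, pl.ProjectDescription;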

MySQL Sum one field based on a second 'common' field

Not a SQL guy, so I'm in a bind; I'll keep this simple.
+--------------+------------------------+
| RepaymentDay | MonthlyRepaymentAmount |
+--------------+------------------------+
| 3 | 0.214847 |
| 26 | 0.219357 |
| 24 | 0.224337 |
| 5 | 0.224337 |
| 18 | 0.224337 |
| 28 | 0.214847 |
| 1 | 0.224337 |
| 28 | 0.327079 |
+--------------+------------------------+
I am looking for a query that would then give something like;
+--------------+------------------------+
| RepaymentDay | MonthlyRepaymentAmount |
+--------------+------------------------+
| 1 | 0.224337 |
| 3 | 0.214847 |
| 5 | 0.224337 |
| 18 | 0.224337 |
| 24 | 0.224337 |
| 26 | 0.219357 |
| 28 | 0.541926 |
+--------------+------------------------+
I.e., where there are multiple records with the same value of 'RepaymentDay', sum those values of 'MonthlyRepaymentAmount'.
I could do it externally with perl or python, no issue, but I want to try and become more familiar with SQL.
Any ideas?
SELECT RepaymentDay, SUM(MonthlyRepaymentAmount) AS MonthlyRepaymentAmount
FROM YourTable
GROUP BY RepaymentDay
ORDER BY RepaymentDay
select RepaymentDay, sum(MonthlyRepaymentAmount) from table_name group by RepaymentDay;