I am using a database with two tables:
HEADING: (ID, CAPTION, CUSTOMER_ID, DATE, TDOC, ID_PAR) -- ID is PK; ID_PAR is FK related to HEADING.ID
LINES:   (ID, NDOC, QTE)                                -- ID is PK; NDOC is FK related to HEADING.ID
A recursive query on HEADING returns all the children of a given HEADING ID, for example an invoice with many delivery notes. In this system the lines of an invoice are not entered directly; only the delivery notes' lines are. How can I get all the lines of an invoice when the datasets are related as master-detail?
Heading dataset
ID  PARENT_ID  DOC_TYPE
01  NULL       A
02  01         B
03  02         C
04  02         C
Lines dataset
ID_PK  NDOC_FK  CODE_PROD  QTE
01     03       P1         5
02     03       P10        20
03     03       P67        65
04     04       P61        34
Expected LINES result for HEADING.ID = 01:
PRODUCT  QTE
P1       5
P10      20
P67      65
P61      34
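A minimal sketch of one way to do this: a recursive CTE that walks the HEADING tree down from the invoice and then joins LINES. It assumes a dialect that supports WITH RECURSIVE (SQL Server and Oracle spell it plain WITH); CODE_PROD comes from the sample data rather than the declared LINES schema:
WITH RECURSIVE doc_tree (ID) AS (
    -- start from the invoice heading
    SELECT ID
    FROM HEADING
    WHERE ID = '01'
    UNION ALL
    -- add every child document (e.g. the delivery notes)
    SELECT h.ID
    FROM HEADING h
    JOIN doc_tree t ON h.ID_PAR = t.ID
)
SELECT l.CODE_PROD AS PRODUCT, l.QTE
FROM LINES l
JOIN doc_tree t ON l.NDOC = t.ID;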
I am working with a HANA table and I am trying to delete rows from the table if they contain values from a list.
A B
22 01
22 01
22 02
22 06
23 01
23 01
23 06
I would like to drop some values from this table and end up with this:
A B
22 01
22 01
22 06
23 01
23 01
23 06
Basically, I would like to do a count and check whether column B contains both 01 and 02 for a given A; if it does, drop the 02 row, and if it contains only 01, leave it as it is.
This has seemed virtually impossible with every SQL script I have tried:
SELECT BP, COUNT(*) AS SO FROM "EH"."BP_CUST" GROUP BY BP;
This script gets the count of rows for each BP and puts it in the SO column. After that, maybe do an IF statement on the SO column and delete if the B field contains both 01 and 02? I tried an IF statement followed by a SELECT, and I could not get that to work either.
For another example, with an A value that has only a 02 row:
A B
22 01
22 01
22 02
22 06
23 01
23 01
23 06
24 02
Becomes
A B
22 01
22 01
22 06
23 01
23 01
23 06
24 02
If I understand correctly, you want:
select c.*
from "EH"."BP_CUST" c
where c.b <> '02' or
      not exists (select 1
                  from "EH"."BP_CUST" c2
                  where c2.a = c.a and c2.b = '01'
                 );
Your question says "delete", but I think the intention is to select the "02" rows only when there is no "01" row for the same a (along with all the other rows).
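If a DELETE really is wanted, the same condition can be inverted; a sketch (whether your HANA version accepts the unaliased outer-table reference in the correlated subquery should be verified):
-- delete '02' rows that have a '01' row for the same a; leave everything else
DELETE FROM "EH"."BP_CUST"
WHERE B = '02'
  AND EXISTS (SELECT 1
              FROM "EH"."BP_CUST" c2
              WHERE c2.A = "BP_CUST".A
                AND c2.B = '01');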
If I understood correctly, this might be the solution:
DELETE FROM BP_CUST
WHERE A IN
(
    SELECT BP_CUST.A
    FROM
    (
        SELECT A
             , COUNT(CASE WHEN B != '02' THEN 1 ELSE NULL END) AS NOT_02
             , COUNT(CASE WHEN B = '02' THEN 1 ELSE NULL END) AS IS_02
        FROM BP_CUST
        GROUP BY A
    ) AS t_delete
    JOIN BP_CUST ON BP_CUST.A = t_delete.A
    WHERE B = '02' AND NOT_02 > 0 AND IS_02 > 0
)
AND B = '02'
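To preview which rows the DELETE would remove before running it, the selection part can be run on its own against the same table:
-- preview of the rows the DELETE above would remove
SELECT BP_CUST.A, BP_CUST.B
FROM
(
    SELECT A
         , COUNT(CASE WHEN B != '02' THEN 1 ELSE NULL END) AS NOT_02
         , COUNT(CASE WHEN B = '02' THEN 1 ELSE NULL END) AS IS_02
    FROM BP_CUST
    GROUP BY A
) AS t_delete
JOIN BP_CUST ON BP_CUST.A = t_delete.A
WHERE B = '02' AND NOT_02 > 0 AND IS_02 > 0;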
I have a table with the following DDL:
CREATE TABLE "LEDGER"
("FY" NUMBER,
"FP" VARCHAR2(20 BYTE),
"FUND" VARCHAR2(20 BYTE),
"TYPE" VARCHAR2(2 BYTE),
"AMT" NUMBER
)
The table contains the following data.
REM INSERTING into LEDGER
SET DEFINE OFF;
Insert into LEDGER (FY,FP,FUND,TYPE,AMT) values (15,'03','A','03',1);
Insert into LEDGER (FY,FP,FUND,TYPE,AMT) values (15,'04','A','03',2);
Insert into LEDGER (FY,FP,FUND,TYPE,AMT) values (16,'04','A','03',3);
Insert into LEDGER (FY,FP,FUND,TYPE,AMT) values (12,'05','A','04',6);
Based on the partition of FY, FP, FUND and TYPE, I would like to write a query that keeps a running count from the first FP (FP, though it is a varchar, represents the month as a number, i.e. 2 equals February, 3 equals March, etc.) up to a hard limit of 14. Taking a closer look at the data, you will notice that in FY 15 the max period is 04, so I must add another 10 periods to get the full 14 periods in my report. Here is the expected output.
Here is what I tried, but I'm simply stumbling on this:
WITH fy_range AS
(
SELECT MIN (fy) AS min_fy
, MAX (fy) AS max_fy
FROM ledger
),all_fys AS
(
SELECT min_fy + LEVEL - 1 AS fy
FROM fy_range
CONNECT BY LEVEL <= max_fy + 1 - min_fy
)
,all_fps AS
(
SELECT TO_CHAR (LEVEL, 'FM00') AS fp
FROM dual
CONNECT BY LEVEL <= 14
)
SELECT
FUND
,G.TYPE
,G.FY
,G.FP
,LAST_VALUE(G.AMT ignore nulls) OVER (PARTITION BY G.FUND ORDER BY Y.FY, P.FP) AS AMT
FROM all_fys y
CROSS JOIN all_fps p
LEFT OUTER JOIN LEDGER G PARTITION BY(FUND)
ON g.fy = y.fy
AND g.fp = p.fp;
but I end up with a bunch of nulls and some strange results.
This may not be the most efficient solution, but it is easy to understand and maintain. First (in the most deeply nested subquery) we find the min FP for each combination of FY, FUND and TYPE. Then we use a CONNECT BY query to fill all the FP for all FY, FUND, TYPE combinations (up to the hard upper limit of 14). Then we left-outer-join to the original data in the LEDGER table. So far we densified the data. In the final query (the join) we also add the column for the cumulative sum - that part is easy after we densified the data.
TYPE is an Oracle keyword, so it is probably best not to use it as a column name. It is also best not to use double-quoted table and column names (I had to use upper case everywhere because of that). I also made sure to convert from varchar2 to number and back to varchar2 - we shouldn't rely on implicit conversions.
select S.FY, to_char(S.FP, 'FM09') as FP, S.FUND, S.TYPE,
       sum(L.AMT) over (partition by S.FY, S.FUND, S.TYPE order by S.FP) as CUMULATIVE_AMT
from (
       -- generate every FP from each group's minimum FP up to the hard limit of 14
       select FY, MIN_FP + level - 1 as FP, FUND, TYPE
       from (
              -- the min FP for each combination of FY, FUND and TYPE
              select FY, min(to_number(FP)) as MIN_FP, FUND, TYPE
              from LEDGER
              group by FY, FUND, TYPE
            )
       connect by level <= 15 - MIN_FP
              and prior FY = FY
              and prior FUND = FUND
              and prior TYPE = TYPE
              and prior sys_guid() is not null
     ) S left outer join LEDGER L
       on S.FY = L.FY and S.FP = L.FP and S.FUND = L.FUND and S.TYPE = L.TYPE
;
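A note on the "prior sys_guid() is not null" condition: the other PRIOR conditions compare each column to itself, so Oracle would otherwise reject the row-generating CONNECT BY with a "connect by loop" error; referencing a non-deterministic function defeats the cycle detection and lets the hierarchy expand to the required depth.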
Output:
FY FP FUND TYPE CUMULATIVE_AMT
--- --- ---- ---- --------------
12 05 A 04 6
12 06 A 04 6
12 07 A 04 6
12 08 A 04 6
12 09 A 04 6
12 10 A 04 6
12 11 A 04 6
12 12 A 04 6
12 13 A 04 6
12 14 A 04 6
15 03 A 03 1
15 04 A 03 3
15 05 A 03 3
15 06 A 03 3
15 07 A 03 3
15 08 A 03 3
15 09 A 03 3
15 10 A 03 3
15 11 A 03 3
15 12 A 03 3
15 13 A 03 3
15 14 A 03 3
16 04 A 03 3
16 05 A 03 3
16 06 A 03 3
16 07 A 03 3
16 08 A 03 3
16 09 A 03 3
16 10 A 03 3
16 11 A 03 3
16 12 A 03 3
16 13 A 03 3
16 14 A 03 3
Select drl.id, drl.ap, drl.sqn, drl.date
from srs_drl drl
This will output something like this:
14000001 01 01 05/11/2015
14000001 01 01 06/11/2015
14000001 01 01 01/12/2015
14000001 01 01 04/01/2016
15000234 01 02 05/11/2015
15000234 01 03 06/11/2015
15000234 01 03 01/12/2015
15000234 01 04 04/01/2016
For every unique combination of the first 3 columns I need to retrieve the earliest date. So for the above table I wish to return:
14000001 01 01 05/11/2015
15000234 01 02 05/11/2015
15000234 01 03 06/11/2015
15000234 01 04 04/01/2016
Any help with this query would be much appreciated. I've tried using TOP but that only returns the first record for the entire table rather than the first record grouped by the first 3 columns.
Thanks in advance
Do a GROUP BY combined with MIN:
Select drl.id, drl.ap, drl.sqn, MIN(drl.date)
from srs_drl drl
group by drl.id, drl.ap, drl.sqn
Alternatively, use NOT EXISTS to return a row only if no row with the same drl.id, drl.ap and drl.sqn is earlier:
Select drl.id, drl.ap, drl.sqn, drl.date
from srs_drl drl
where not exists (select 1 from srs_drl d2
                  where d2.id = drl.id
                  and d2.ap = drl.ap
                  and d2.sqn = drl.sqn
                  and d2.date < drl.date)
Note that date is a reserved word in ANSI SQL, so you may need to write "date".
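A third option is a ROW_NUMBER() window function, which returns exactly one row per group even if two rows tie on the earliest date; a sketch, assuming your database supports window functions:
Select id, ap, sqn, date
from (Select drl.id, drl.ap, drl.sqn, drl.date,
             row_number() over (partition by drl.id, drl.ap, drl.sqn
                                order by drl.date) as rn
      from srs_drl drl) t
where rn = 1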
I want to list BR, BRANCHNAME and the number of people employed at each branch. There are 5 branches in total, and only 4 of them have people employed; branch 05 has no employees. With the following code, branch 05 is not shown, because its row is filtered out by the WHERE condition. I want the result to include a row "05 Br05 0".
SELECT EMPLOYEE.BR, BRANCHNAME, Count(*) AS Number
FROM EMPLOYEE, BRANCH
WHERE (EMPLOYEE.BR = BRANCH.BR)
GROUP BY EMPLOYEE.BR, BRANCHNAME;
The result is:
BR BRANCHNAME Number
01 Br01 6
02 Br02 4
03 Br03 5
04 Br04 6
I want to have the following result:
BR BRANCHNAME Number
01 Br01 6
02 Br02 4
03 Br03 5
04 Br04 6
05 Br05 0
It would seem you want a LEFT JOIN, which gives a countable row even when there is no matching employee; counting employee.br instead of * makes those unmatched rows count as 0, since COUNT ignores nulls.
Since you've not added your table structure, I assume branchname is a field in the branch table.
SELECT branch.br, branch.branchname, COUNT(employee.br) AS Number
FROM branch
LEFT JOIN employee
ON branch.br = employee.br
GROUP BY branch.br, branch.branchname
An SQLfiddle to test with (based on SQL Server since Access is not available)