How to get table name when using TABLE_DATE_RANGE - google-bigquery

I would like to get daily statistics using TABLE_DATE_RANGE like this:
Select count(*), tableName
FROM
(TABLE_DATE_RANGE(appengine_logs.appengine_googleapis_com_request_log_,
DATE_ADD(CURRENT_TIMESTAMP(), -7, 'DAY'), CURRENT_TIMESTAMP()))
group by tableName
Is there any way to get a table name when using TABLE_DATE_RANGE?

You need to query your dataset with a metadata query.
SELECT * FROM publicdata:samples.__TABLES__
WHERE MSEC_TO_TIMESTAMP(creation_time) < DATE_ADD(CURRENT_TIMESTAMP(), -7, 'DAY')
This returns:
+-----+------------+------------+-----------------+---------------+--------------------+-----------+--------------+------+
| Row | project_id | dataset_id | table_id        | creation_time | last_modified_time | row_count | size_bytes   | type |
+-----+------------+------------+-----------------+---------------+--------------------+-----------+--------------+------+
|   1 | publicdata | samples    | github_nested   | 1348782587310 | 1348782587310      |   2541639 |   1694950811 |    1 |
|   2 | publicdata | samples    | github_timeline | 1335915950690 | 1335915950690      |   6219749 |   3801936185 |    1 |
|   3 | publicdata | samples    | gsod            | 1335916040125 | 1413937987846      | 114420316 |  17290009238 |    1 |
|   4 | publicdata | samples    | natality        | 1335916045005 | 1413925598038      | 137826763 |  23562717384 |    1 |
|   5 | publicdata | samples    | shakespeare     | 1335916045099 | 1413926827257      |    164656 |      6432064 |    1 |
|   6 | publicdata | samples    | trigrams        | 1335916127449 | 1335916127449      |  68051509 | 277168458677 |    1 |
|   7 | publicdata | samples    | wikipedia       | 1335916132870 | 1423520879902      | 313797035 |  38324173849 |    1 |
+-----+------------+------------+-----------------+---------------+--------------------+-----------+--------------+------+
You can add WHERE clauses to restrict the result to particular tables, e.g.
WHERE table_id CONTAINS "wiki"
or a regular expression like WHERE REGEXP_MATCH(table_id, r"^foo[\d]{3,5}")
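Putting the two together for the log tables in the question, a minimal sketch in legacy BigQuery SQL (this assumes the daily tables carry the usual _YYYYMMDD suffix; adjust the pattern to your actual naming):
SELECT table_id, row_count
FROM appengine_logs.__TABLES__
WHERE REGEXP_MATCH(table_id, r"^appengine_googleapis_com_request_log_\d{8}$")
  AND MSEC_TO_TIMESTAMP(creation_time) >= DATE_ADD(CURRENT_TIMESTAMP(), -7, 'DAY')
Since __TABLES__ already exposes row_count per table, this gives the daily row counts without scanning the log data at all.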

Related

SQL Server: select members having 2 or more records on different dates of the same class

I am trying to find all the member ids that have two or more records on different dates in the same class.
+----------+------------+-------+-----------+
| MemberId | Date       | Class | Expected  |
+----------+------------+-------+-----------+
| 118111   | 2/18/2020  | A     | Valid     |
| 118111   | 10/15/2020 | A     | Valid     |
| 118216   | 1/31/2020  | B     | Valid     |
| 118216   | 5/16/1981  | B     | Valid     |
| 118291   | 6/9/2020   | A     | Valid     |
| 118291   | 12/5/2020  | A     | Valid     |
| 118533   | 4/9/2020   | A     | Not valid |
| 118533   | 11/11/2020 | B     | Not valid |
| 118533   | 7/22/2020  | C     | Valid     |
| 118533   | 10/25/2020 | C     | Valid     |
| 118293   | 3/30/2020  | A     | Not valid |
| 118293   | 3/30/2020  | A     | Not valid |
| 118499   | 4/16/2020  | B     | Valid     |
| 118499   | 7/26/2020  | B     | Valid     |
| 118499   | 3/25/2020  | A     | Not valid |
+----------+------------+-------+-----------+
I have made a query which checks for exactly two records, but I am unable to find a solution that also covers more than two records.
select mc.*
FROM table1 AS mc
JOIN table1 AS ma ON ma.memberid = mc.memberid
AND ma.date != mc.date
AND ma.class = mc.class
Assuming you just want the memberid, you can use a HAVING clause. Based on your sample data, a member qualifies when it has records on two or more distinct dates within the same class:
SELECT memberid
FROM dbo.YourTable
GROUP BY memberid, Class
HAVING COUNT(DISTINCT [Date]) >= 2;
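If you want the full rows back rather than just the ids, a sketch along the same lines (dbo.YourTable stands in for your real table name):
SELECT t.*
FROM dbo.YourTable AS t
JOIN (
    -- qualifying (member, class) pairs: 2+ distinct dates
    SELECT memberid, Class
    FROM dbo.YourTable
    GROUP BY memberid, Class
    HAVING COUNT(DISTINCT [Date]) >= 2
) AS q
  ON q.memberid = t.memberid
 AND q.Class = t.Class;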

Counting based on group of 1st column

I am using the following query to count how many bill dates each BAN has:
select replace(c.usertoken, '-', '') as BAN
, to_char(to_date(bi.name,'YYYY-MM-DD'),'dd-mm-yy') as Billdate_dmy
, (replace(c.usertoken, '-', '') ||':'|| to_char(to_date(bi.name,'YYYY-MM-DD'),'dd-mm-yy')) as BAN_Billdate_dmy
, count(c.usertoken) as Number_Of_Bills
from customer c
, service s
, document d
, bill bi
, batch ba
, billrun br
where c.ID = s.CUSTOMER_SERVICE_ID
and s.ID = d.SERVICE_DOCUMENT_ID
and bi.ID = d.BILL_DOCUMENT_ID
and d.BATCH = ba.ID
and ba.BILLRUN = br.ID
and br.STATUS = 'APPROVED'
and c.brand='rogers'
and d.VERSIONEDCONTENTFOLDER='cbu'
group by c.usertoken, bi.name
order by c.usertoken
Output of the above query:
+-----------+-----------+--------------------+-------+
| BAN       | Bill_date | BAN_Billdate       | Count |
+-----------+-----------+--------------------+-------+
| 100001247 | 25-09-19  | 100001247:25-09-19 |     1 |
| 100001247 | 25-10-19  | 100001247:25-10-19 |     1 |
| 100002583 | 15-10-19  | 100002583:15-10-19 |     1 |
| 100004753 | 25-09-19  | 100004753:25-09-19 |     1 |
| 100004753 | 25-10-19  | 100004753:25-10-19 |     1 |
| 100005719 | 25-09-19  | 100005719:25-09-19 |     1 |
| 100005719 | 25-10-19  | 100005719:25-10-19 |     1 |
| 100006311 | 06-09-19  | 100006311:06-09-19 |     1 |
| 100009596 | 25-09-19  | 100009596:25-09-19 |     1 |
| 100009596 | 25-10-19  | 100009596:25-10-19 |     1 |
+-----------+-----------+--------------------+-------+
However, I was expecting the following output:
+-----------+-----------+--------------------+-------+
| BAN       | Bill_date | BAN_Billdate       | Count |
+-----------+-----------+--------------------+-------+
| 100001247 | 25-09-19  | 100001247:25-09-19 |     2 |
| 100001247 | 25-10-19  | 100001247:25-10-19 |     2 |
| 100002583 | 15-10-19  | 100002583:15-10-19 |     3 |
| 100004753 | 25-09-19  | 100004753:25-09-19 |     3 |
| 100004753 | 25-10-19  | 100004753:25-10-19 |     3 |
| 100005719 | 25-09-19  | 100005719:25-09-19 |     2 |
| 100005719 | 25-10-19  | 100005719:25-10-19 |     2 |
| 100006311 | 06-09-19  | 100006311:06-09-19 |     1 |
| 100009596 | 25-09-19  | 100009596:25-09-19 |     2 |
| 100009596 | 25-10-19  | 100009596:25-10-19 |     2 |
+-----------+-----------+--------------------+-------+
Please advise what changes I should make to the query so that the count column reflects the expected values.
I don't want to touch your query and the archaic join syntax. Please learn proper SQL grammar with JOIN and ON clauses for joins.
That said, you seem to want a window function to sum the counts. Note that the partition has to be the BAN alone; partitioning by BAN and bill date would just give you back the per-date counts (and since the alias ban can't be referenced in the same select list, repeat the column):
select sum(count(*)) over (partition by c.usertoken) as number_of_bills
I'm not sure that aggregation is really useful if you are only getting one row per group. In that case, you might want to remove the group by and use:
select count(*) over (partition by c.usertoken) as number_of_bills
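A minimal sketch of how that slots into your query, rewritten with explicit JOIN ... ON syntax (table and column names are taken from your question):
select replace(c.usertoken, '-', '') as ban
     , to_char(to_date(bi.name, 'YYYY-MM-DD'), 'dd-mm-yy') as billdate_dmy
     , replace(c.usertoken, '-', '') || ':' || to_char(to_date(bi.name, 'YYYY-MM-DD'), 'dd-mm-yy') as ban_billdate_dmy
       -- total bills per BAN, spread across every (BAN, bill date) group row
     , sum(count(*)) over (partition by c.usertoken) as number_of_bills
from customer c
join service s  on c.id = s.customer_service_id
join document d on s.id = d.service_document_id
join bill bi    on bi.id = d.bill_document_id
join batch ba   on d.batch = ba.id
join billrun br on ba.billrun = br.id
where br.status = 'APPROVED'
  and c.brand = 'rogers'
  and d.versionedcontentfolder = 'cbu'
group by c.usertoken, bi.name
order by c.usertoken;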

Query to group 5 records

I have a table, e.g. "employee", with just one column, "id". Say it has records from 1 through 1000.
Employee
------------
ID
------------
1
2
3
..
..
999
1000
Now I would like to write a query which gives the following results, i.e. sorts in ascending order and concatenates the first 5 ids into one record, the next 5 into a second, and so on. Any ideas how I can do this?
Here is the output I am looking to have.
1,2,3,4,5
6,7,8,9,10
11,12,13,14,15
...........
...........
996,997,998,999,1000
Use the row_number and listagg functions: row_number numbers the ids in ascending order, and trunc((row_number - 1) / 5) then assigns ids 1-5 to group 0, ids 6-10 to group 1, and so on, so each run of five consecutive ids collapses into one listagg row:
SELECT listagg( id, ',' ) within group( order by group_no, id )
FROM (
    select id,
           trunc( (row_number() over ( order by id ) - 1) / 5 ) as group_no
    from employee
)
GROUP BY group_no
Working demo: http://sqlfiddle.com/#!4/ef526/10
| LISTAGG(ID,',')WITHINGROUP(ORDERBYGROUP_NO,ID) |
|------------------------------------------------|
| 1,2,3,4,5 |
| 6,7,8,9,10 |
| 11,12,13,14,15 |
| 16,17,18,19,20 |
| 21,22,23,24,25 |
| 26,27,28,29,30 |
| 31,32,33,34,35 |
| 36,37,38,39,40 |
| 41,42,43,44,45 |
| 46,47,48,49,50 |
| 51,52,53,54,55 |
| 56,57,58,59,60 |
| 61,62,63,64,65 |
| 66,67,68,69,70 |
| 71,72,73,74,75 |
| 76,77,78,79,80 |
| 81,82,83,84,85 |
| 86,87,88,89,90 |
| 91,92,93,94,95 |
| 96,97,98,99,100 |
| 101,102,103,104,105 |
| 106,107,108,109,110 |
| 111,112,113,114,115 |
| 116,117,118,119,120 |
| 121,122,123,124,125 |
| 126,127,128,129,130 |
| 131,132,133,134,135 |
| 136,137,138,139,140 |
| 141,142,143,144,145 |
| 146,147,148,149,150 |
| 151,152,153,154,155 |
| 156,157,158,159,160 |
| 161,162,163,164,165 |
| 166,167,168,169,170 |
| 171,172,173,174,175 |
| 176,177,178,179,180 |
| 181,182,183,184,185 |
| 186,187,188,189,190 |
| 191,192,193,194,195 |
| 196,197,198,199,200 |

Considering values from one table as column header in another

I have a base table where I need to calculate the difference between two dates based on the type of the entry.
tblA
+----------+------------+---------------+--------------+
| TypeCode | Log_Date | Complete_Date | Pending_Date |
+----------+------------+---------------+--------------+
| 1 | 18/04/2016 | 19/04/2016 | |
| 2 | 10/04/2016 | 18/04/2016 | 15/04/2016 |
| 3 | 12/04/2016 | 19/04/2016 | |
| 4 | 15/04/2016 | 17/04/2016 | 16/04/2016 |
| 5 | 16/04/2016 | 21/04/2016 | |
| 1 | 19/04/2016 | 20/04/2016 | |
| 2 | 20/03/2016 | 31/03/2015 | |
| 3 | 25/03/2016 | 28/03/2016 | |
| 4 | 26/03/2016 | 27/03/2016 | |
| 5 | 27/03/2016 | 30/03/2016 | |
+----------+------------+---------------+--------------+
I have another look up table which has the column names to be considered based on the TypeCode.
tblB
+----------+----------+---------------+
| TypeCode | DateCol1 | DateCol2 |
+----------+----------+---------------+
| 1 | Log_Date | Complete_Date |
| 2 | Log_Date | Pending_Date |
| 3 | Log_Date | Complete_Date |
| 4 | Log_Date | Pending_Date |
| 5 | Log_Date | Complete_Date |
+----------+----------+---------------+
I am doing a simple DATEDIFF between two dates for my calculation. However, I want to look up which columns to consider for this calculation from tblB and apply it to tblA based on the TypeCode.
Resulting table:
For example: When the TypeCode is 2 or 4 then the calculation should be DATEDIFF(d, Log_Date, Pending_Date), otherwise DATEDIFF(d, Log_Date, Complete_Date)
+----------+------------+---------------+--------------+----------+
| TypeCode | Log_Date | Complete_Date | Pending_Date | Cal_Days |
+----------+------------+---------------+--------------+----------+
| 1 | 18/04/2016 | 19/04/2016 | | 1 |
| 2 | 10/04/2016 | 18/04/2016 | 15/04/2016 | 5 |
| 3 | 12/04/2016 | 19/04/2016 | | 7 |
| 4 | 15/04/2016 | 17/04/2016 | 16/04/2016 | 1 |
| 5 | 16/04/2016 | 21/04/2016 | | 5 |
| 1 | 19/04/2016 | 20/04/2016 | | 1 |
| 2 | 20/03/2016 | 31/03/2015 | | |
| 3 | 25/03/2016 | 28/03/2016 | | 3 |
| 4 | 26/03/2016 | 27/03/2016 | | |
| 5 | 27/03/2016 | 30/03/2016 | | 3 |
+----------+------------+---------------+--------------+----------+
Any help would be appreciated. Thanks.
Use a JOIN with CASE expressions:
SELECT a.*,
       Cal_Days = DATEDIFF(
           DAY,
           CASE WHEN b.DateCol1 = 'Log_Date'      THEN a.Log_Date
                WHEN b.DateCol1 = 'Complete_Date' THEN a.Complete_Date
                ELSE a.Pending_Date
           END,
           CASE WHEN b.DateCol2 = 'Log_Date'      THEN a.Log_Date
                WHEN b.DateCol2 = 'Complete_Date' THEN a.Complete_Date
                ELSE a.Pending_Date
           END
       )
FROM TblA a
INNER JOIN TblB b
    ON b.TypeCode = a.TypeCode
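Since DateCol1 is always Log_Date in your tblB sample, an equivalent sketch keys the CASE directly on TypeCode, matching your "2 or 4" rule (this only stays valid while the lookup table is that simple):
SELECT a.*,
       Cal_Days = DATEDIFF(
           DAY,
           a.Log_Date,
           -- TypeCodes 2 and 4 measure to Pending_Date, the rest to Complete_Date
           CASE WHEN a.TypeCode IN (2, 4) THEN a.Pending_Date
                ELSE a.Complete_Date
           END
       )
FROM TblA a;
The JOIN version above is the more general one, though: it keeps working if tblB later maps a TypeCode to a different pair of columns.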

Inefficient SQL Search Query - Oracle DB

My logic in this query is right (well, I'm 80% sure it is), but it has been running for 2h 23min and is still going. I was wondering if someone could help me make it run a bit more efficiently, as I don't think it's that intense of a query:
SELECT b.bridge_no, COUNT(*) AS comment_cnt
FROM iacd_asset b INNER JOIN iacd_note c
ON REGEXP_LIKE(c.comments, '(^|\W)BN' || b.bridge_no || '(\W|$)', 'i')
inner join ncr_note e on c.note_id=e.note_id
inner join ncr f on e.ncr_id=f.ncr_id
inner join ncr_iac g on f.ncr_id=g.ncr_id
WHERE c.create_dt >= date'2015-01-01'
AND c.create_dt < date'2015-03-12'
AND length(b.bridge_no) > 1
AND g.scheme in (1, 3, 5, 6, 7, 8, 9, 9, and about 10 more values)
GROUP BY b.bridge_no
ORDER BY comment_cnt;
In short, the query should make a bunch of joins, filter the joined rows by scheme (g.scheme in ...), and then parse the notes field for anything with BN in it.
PLAN TABLE (OK, I have never used one before, but I believe this is the plan table):
+------------------+----------------+--------------+------------------+--------------+-----------------+----------------+-----------+----+-----------+-------+----------+---------+-------------+-------------+-----------------+---------+------------+-----------------------------+---------------------------------------------------------------------------------+-----------------------------------------------------------------------------------------+------+--------------+
| OPERATION | OPTIONS | OBJECT_OWNER | OBJECT_NAME | OBJECT_ALIAS | OBJECT_INSTANCE | OBJECT_TYPE | OPTIMIZER | ID | PARENT_ID | DEPTH | POSITION | COST | CARDINALITY | BYTES | CPU_COST | IO_COST | TEMP_SPACE | ACCESS_PREDICATES | FILTER_PREDICATES | PROJECTION | TIME | QBLOCK_NAME |
+------------------+----------------+--------------+------------------+--------------+-----------------+----------------+-----------+----+-----------+-------+----------+---------+-------------+-------------+-----------------+---------+------------+-----------------------------+---------------------------------------------------------------------------------+-----------------------------------------------------------------------------------------+------+--------------+
| SELECT STATEMENT | | | | | | | ALL_ROWS | 0 | | 0 | 281,503 | 281,503 | 40 | 4,480 | 148,378,917,975 | 215,677 | | | | | 458 | |
+------------------+----------------+--------------+------------------+--------------+-----------------+----------------+-----------+----+-----------+-------+----------+---------+-------------+-------------+-----------------+---------+------------+-----------------------------+---------------------------------------------------------------------------------+-----------------------------------------------------------------------------------------+------+--------------+
| SORT | ORDER BY | | | | | | | 1 | 0 | 1 | 1 | 281,503 | 40 | 4,480 | 148,378,917,975 | 215,677 | | | | (#keys=1) COUNT(*)[22], "B"."BRIDGE_NO"[NUMBER,22] | 458 | SEL$81719215 |
+------------------+----------------+--------------+------------------+--------------+-----------------+----------------+-----------+----+-----------+-------+----------+---------+-------------+-------------+-----------------+---------+------------+-----------------------------+---------------------------------------------------------------------------------+-----------------------------------------------------------------------------------------+------+--------------+
| HASH | GROUP BY | | | | | | | 2 | 1 | 2 | 1 | 281,503 | 40 | 4,480 | 148,378,917,975 | 215,677 | | | | (#keys=1) "B"."BRIDGE_NO"[NUMBER,22], COUNT(*)[22] | 458 | |
+------------------+----------------+--------------+------------------+--------------+-----------------+----------------+-----------+----+-----------+-------+----------+---------+-------------+-------------+-----------------+---------+------------+-----------------------------+---------------------------------------------------------------------------------+-----------------------------------------------------------------------------------------+------+--------------+
| HASH JOIN | | | | | | | | 3 | 2 | 3 | 1 | 281,497 | 16,084 | 1,801,408 | 148,366,537,976 | 215,677 | 24,126,000 | "G"."NCR_ID"="F"."NCR_ID" | | (#keys=1) "B"."BRIDGE_NO"[NUMBER,22] | 458 | |
+------------------+----------------+--------------+------------------+--------------+-----------------+----------------+-----------+----+-----------+-------+----------+---------+-------------+-------------+-----------------+---------+------------+-----------------------------+---------------------------------------------------------------------------------+-----------------------------------------------------------------------------------------+------+--------------+
| HASH JOIN | | | | | | | | 4 | 3 | 4 | 1 | 96,996 | 209,778 | 21,607,134 | 13,549,630,814 | 90,985 | 22,725,000 | "E"."NCR_ID"="F"."NCR_ID" | | (#keys=1) "F"."NCR_ID"[NUMBER,22], "B"."BRIDGE_NO"[NUMBER,22] | 158 | |
+------------------+----------------+--------------+------------------+--------------+-----------------+----------------+-----------+----+-----------+-------+----------+---------+-------------+-------------+-----------------+---------+------------+-----------------------------+---------------------------------------------------------------------------------+-----------------------------------------------------------------------------------------+------+--------------+
| HASH JOIN | | | | | | | | 5 | 4 | 5 | 1 | 42,595 | 208,419 | 20,216,643 | 5,484,063,163 | 40,162 | 9,839,000 | "C"."NOTE_ID"="E"."NOTE_ID" | REGEXP_LIKE ("C"."COMMENTS",'(^|\W)BN'||TO_CHAR("B"."BRIDGE_NO")||'(\W|$)','i') | (#keys=1) "B"."BRIDGE_NO"[NUMBER,22], "E"."NCR_ID"[NUMBER,22] | 70 | |
+------------------+----------------+--------------+------------------+--------------+-----------------+----------------+-----------+----+-----------+-------+----------+---------+-------------+-------------+-----------------+---------+------------+-----------------------------+---------------------------------------------------------------------------------+-----------------------------------------------------------------------------------------+------+--------------+
| PARTITION RANGE | SINGLE | | | | | | | 6 | 5 | 6 | 1 | 1,039 | 104,603 | 8,577,446 | 62,280,224 | 1,011 | | | | "C"."NOTE_ID"[NUMBER,22], "C"."COMMENTS"[VARCHAR2,4000] | 2 | |
+------------------+----------------+--------------+------------------+--------------+-----------------+----------------+-----------+----+-----------+-------+----------+---------+-------------+-------------+-----------------+---------+------------+-----------------------------+---------------------------------------------------------------------------------+-----------------------------------------------------------------------------------------+------+--------------+
| TABLE ACCESS | FULL | IACDB | IACD_NOTE | C#SEL$1 | 2 | TABLE | ANALYZED | 7 | 6 | 7 | 1 | 1,039 | 104,603 | 8,577,446 | 62,280,224 | 1,011 | | | "C"."CREATE_DATE"<TO_DATE(' 2014-12-31 00:00:00', 'syyyy-mm-dd hh24:mi:ss') | "C"."NOTE_ID"[NUMBER,22], "C"."COMMENTS"[VARCHAR2,4000] | 2 | SEL$81719215 |
+------------------+----------------+--------------+------------------+--------------+-----------------+----------------+-----------+----+-----------+-------+----------+---------+-------------+-------------+-----------------+---------+------------+-----------------------------+---------------------------------------------------------------------------------+-----------------------------------------------------------------------------------------+------+--------------+
| MERGE JOIN | CARTESIAN | | | | | | | 8 | 5 | 6 | 2 | 24,267 | 12,268,270 | 184,024,050 | 2,780,501,758 | 23,033 | | | | (#keys=0) "B"."BRIDGE_NO"[NUMBER,22], "E"."NCR_ID"[NUMBER,22], "E"."NOTE_ID"[NUMBER,22] | 40 | |
+------------------+----------------+--------------+------------------+--------------+-----------------+----------------+-----------+----+-----------+-------+----------+---------+-------------+-------------+-----------------+---------+------------+-----------------------------+---------------------------------------------------------------------------------+-----------------------------------------------------------------------------------------+------+--------------+
| TABLE ACCESS | FULL | IACDB | IACD_ASSET | B#SEL$1 | 1 | TABLE | ANALYZED | 9 | 8 | 7 | 1 | 7 | 40 | 160 | 560,542 | 7 | | | LENGTH(TO_CHAR("B"."BRIDGE_NO"))>1 | "B"."BRIDGE_NO"[NUMBER,22] | 1 | SEL$81719215 |
+------------------+----------------+--------------+------------------+--------------+-----------------+----------------+-----------+----+-----------+-------+----------+---------+-------------+-------------+-----------------+---------+------------+-----------------------------+---------------------------------------------------------------------------------+-----------------------------------------------------------------------------------------+------+--------------+
| BUFFER | SORT | | | | | | | 10 | 8 | 7 | 2 | 24,259 | 308,248 | 3,390,728 | 2,779,941,216 | 23,026 | | | | (#keys=0) "E"."NCR_ID"[NUMBER,22], "E"."NOTE_ID"[NUMBER,22] | 40 | |
+------------------+----------------+--------------+------------------+--------------+-----------------+----------------+-----------+----+-----------+-------+----------+---------+-------------+-------------+-----------------+---------+------------+-----------------------------+---------------------------------------------------------------------------------+-----------------------------------------------------------------------------------------+------+--------------+
| TABLE ACCESS | FULL | IACDB | IACD_NCR_NOTE | E#SEL$2 | 4 | TABLE | ANALYZED | 11 | 10 | 8 | 1 | 606 | 308,248 | 3,390,728 | 69,498,530 | 576 | | | | "E"."NCR_ID"[NUMBER,22], "E"."NOTE_ID"[NUMBER,22] | 1 | SEL$81719215 |
+------------------+----------------+--------------+------------------+--------------+-----------------+----------------+-----------+----+-----------+-------+----------+---------+-------------+-------------+-----------------+---------+------------+-----------------------------+---------------------------------------------------------------------------------+-----------------------------------------------------------------------------------------+------+--------------+
| INDEX | FAST FULL SCAN | IACDB | PK_IACDNCR_NCRID | F#SEL$3 | | INDEX (UNIQUE) | ANALYZED | 12 | 4 | 5 | 2 | 31,763 | 22,838,996 | 137,033,976 | 3,248,120,913 | 30,322 | | | | "F"."NCR_ID"[NUMBER,22] | 52 | SEL$81719215 |
+------------------+----------------+--------------+------------------+--------------+-----------------+----------------+-----------+----+-----------+-------+----------+---------+-------------+-------------+-----------------+---------+------------+-----------------------------+---------------------------------------------------------------------------------+-----------------------------------------------------------------------------------------+------+--------------+
| TABLE ACCESS | FULL | IACDB | IACD_NCR_IAC | G#SEL$4 | 8 | TABLE | ANALYZED | 13 | 3 | 4 | 2 | 181,461 | 1,731,062 | 15,579,558 | 134,407,812,606 | 121,833 | | | ALL THE SCHEMES CHCECKS | "G"."NCR_ID"[NUMBER,22] | 295 | SEL$81719215 |
+------------------+----------------+--------------+------------------+--------------+-----------------+----------------+-----------+----+-----------+-------+----------+---------+-------------+-------------+-----------------+---------+------------+-----------------------------+---------------------------------------------------------------------------------+-----------------------------------------------------------------------------------------+------+--------------+
Hopefully that's legible enough.
In terms of indexes, I assume only the fields that are being sorted are important.
create_dt is not indexed.
scheme id is indexed.
Maybe my ordering in the query is wrong...
The plan shows you're doing FULL TABLE SCANs of IACD_NOTE and IACD_ASSET, and then a CARTESIAN join of them, because you have provided no criteria for linking one record in IACD_ASSET to a set of records in IACD_NOTE.
That's not my definition of a non-intense query, and the eye-popping values for CPU cost bear that out.
You need to replace this ...
FROM iacd_asset b INNER JOIN iacd_note c
ON REGEXP_LIKE(c.comments, '(^|\W)BN' || b.bridge_no || '(\W|$)', 'i')
... with an actual join on indexed columns. It would be helpful if Notes were linked to Assets by a foreign key of BRIDGE_NO or similar. I don't know your data model. Then you can use that regex as an additional filter in the WHERE clause.
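A sketch of that shape, assuming a hypothetical IACD_NOTE.BRIDGE_NO column (substitute whatever foreign key your model actually has) linking notes to assets:
SELECT b.bridge_no, COUNT(*) AS comment_cnt
FROM iacd_asset b
INNER JOIN iacd_note c
        ON c.bridge_no = b.bridge_no   -- hypothetical indexed FK; adapt to your model
INNER JOIN ncr_note e ON e.note_id = c.note_id
INNER JOIN ncr f      ON f.ncr_id = e.ncr_id
INNER JOIN ncr_iac g  ON g.ncr_id = f.ncr_id
WHERE c.create_dt >= DATE '2015-01-01'
  AND c.create_dt <  DATE '2015-03-12'
  AND length(b.bridge_no) > 1
  AND g.scheme IN (1, 3, 5, 6, 7, 8, 9 /* ... your remaining values */)
  -- the regex is now a residual filter, not the join condition:
  AND REGEXP_LIKE(c.comments, '(^|\W)BN' || b.bridge_no || '(\W|$)', 'i')
GROUP BY b.bridge_no
ORDER BY comment_cnt;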
Also, you join three further tables just to get to something which allows an additional filter on SCHEME. Again, I don't know your data model, but this seems pretty inefficient.
Unfortunately this is the sort of tuning which relies on domain knowledge. Fixing this query requires understanding of the data - its volume, distribution and skew, the data model itself and the business logic your query implements. This is way beyond the scope of the advice we can offer on StackOverflow.
One thing to consider, though it is a big decision, would be to index the comments with a free-text index. However, that has lots of ramifications, especially for space and database administration.
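For illustration only, the Oracle Text route would look something like this (index name is made up; the search term is a single literal bridge tag):
CREATE INDEX iacd_note_comments_ctx ON iacd_note (comments)
    INDEXTYPE IS CTXSYS.CONTEXT;

SELECT COUNT(*)
FROM iacd_note c
WHERE CONTAINS(c.comments, 'BN1234') > 0;  -- token lookup instead of a full-scan regex
Whether the per-bridge join in the original query can be expressed this way depends on how the BN tokens are actually written in the comments.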