I have this table:

id | status | outgoing
1  | paid   | {"a945248027_14454878":"processing","a945248027_14454878":"cancelled","a945248027_14454878":"completed"}

I am trying to extract the value after the underscore and the process values, i.e. 14454878, "processing", "cancelled" and "completed".
I tried extracting the keys using this query in Metabase:
SELECT *,
CAST(substring(key from '_([^_]+)$') AS INTEGER) as Volume,
substring(outgoing::varchar from ':"([a-z]*)' ) as Status
FROM table
CROSS JOIN LATERAL json_object_keys(outgoing) AS j(key);
However, upon splitting I got this:

Key        | Volume   | Status
a945248027 | 14454878 | processing
a945248027 | 14454878 | processing
a945248027 | 14454878 | processing
Whereas this is what I am trying to achieve:

Key        | Volume   | Status
a945248027 | 14454878 | processing
a945248027 | 14454878 | cancelled
a945248027 | 14454878 | completed
Please help
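For what it's worth, here is a minimal sketch of a fix (assuming PostgreSQL with a json, not jsonb, column, since json preserves the duplicate keys; the table name my_table is a placeholder). json_each_text() returns each key/value pair as its own row, so the status no longer has to be scraped out of the whole object with a regex:

SELECT t.id,
       substring(j.key from '^([^_]+)') AS "Key",
       CAST(substring(j.key from '_([^_]+)$') AS INTEGER) AS "Volume",
       j.value AS "Status"
FROM my_table t
CROSS JOIN LATERAL json_each_text(t.outgoing) AS j(key, value);

Each of the three pairs then contributes its own value, which should yield processing, cancelled, and completed on separate rows.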
I want to schedule one Google Cloud query to run every day, and I also want to receive email alerts whenever any table's size exceeds 1 TB. Is that possible?
With INFORMATION_SCHEMA.TABLE_STORAGE you can obtain the size of all tables in a project and region. Raising an error makes the scheduled query fail, and scheduled queries can be configured to send an email notification on failure.
You need to set up one scheduled query for each region the project is using.
SELECT
  STRING_AGG(summary),
  IF(COUNT(1) > 0,
     ERROR(CONCAT(COUNT(1), " tables too large, total: ", SUM(total_logical_bytes), " list: ", STRING_AGG(summary))),
     "")
FROM (
  SELECT
    project_id,
    table_name,
    SUM(total_logical_bytes) AS total_logical_bytes,
    CONCAT(project_id, '.', table_name, '=', SUM(total_logical_bytes)) AS summary
  FROM
    `region-eu`.INFORMATION_SCHEMA.TABLE_STORAGE
  GROUP BY
    1, 2
  HAVING
    total_logical_bytes > 1024*1024 # 1 MB limit, for testing
  ORDER BY
    total_logical_bytes DESC
)
The inner query obtains all tables in the EU region and filters those above 1 MB. The outer query checks whether at least one row came back and, if so, raises an alert with ERROR().
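For the 1 TB limit asked about in the question, only the threshold changes. A sketch of the adapted query (same mechanics, larger limit):

SELECT
  IF(COUNT(1) > 0,
     ERROR(CONCAT(COUNT(1), " tables over 1 TB: ", STRING_AGG(summary))),
     "ok")
FROM (
  SELECT
    CONCAT(project_id, '.', table_name, '=', SUM(total_logical_bytes)) AS summary
  FROM
    `region-eu`.INFORMATION_SCHEMA.TABLE_STORAGE
  GROUP BY
    project_id, table_name
  HAVING
    SUM(total_logical_bytes) > 1024 * 1024 * 1024 * 1024 # 1 TB
)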
I am trying to sort the Event Hub input data using Stream Analytics and write the output to ADLS Gen2 in CSV format, but the data is not sorted.
I used an Azure Function (timer trigger) that fetches the data from SQL Server sequentially (ordered by the Id field) and sends the batched event data to Event Hub.
In Stream Analytics, I am using the Event Hub stream as input and ADLS Gen2 as output.
I tried both CollectTop() and TopOne() to sort the data.
Also, I am using "TIMESTAMP BY SeenTime".
Below is the ASA query:
-- eventdata: input
-- adls2: output
WITH
Step1 AS (
    SELECT eventdata.Id AS Id, eventdata.ClientMacAddress, eventdata.SeenEpoch,
           eventdata.SeenTime, System.Timestamp() AS t
    FROM eventdata
    TIMESTAMP BY SeenTime
),
Step2 AS (
    SELECT TopOne() OVER (ORDER BY Id ASC) AS topEvent
    FROM Step1
    GROUP BY TumblingWindow(minute, 10)
)
SELECT udf.convertJsonToString(topEvent)
INTO [adls2]
FROM Step2
Appreciate your help in advance.
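In case it helps anyone reading along: TopOne() keeps only a single record per window, so most events never reach the output. A minimal, untested sketch of the CollectTop() variant, assuming the goal is to emit every event of each 10-minute window in Id order (the 100 is a hypothetical per-window cap, since CollectTop() requires an explicit count):

WITH Step1 AS (
    SELECT eventdata.Id AS Id, eventdata.ClientMacAddress,
           eventdata.SeenEpoch, eventdata.SeenTime
    FROM eventdata
    TIMESTAMP BY SeenTime
)
-- sortedEvents is an array of {rank, value} records, ordered by Id within each window
SELECT CollectTop(100) OVER (ORDER BY Id ASC) AS sortedEvents
INTO [adls2]
FROM Step1
GROUP BY TumblingWindow(minute, 10)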
I have a day-partitioned table with approximately 300k rows in the streaming buffer. When running an interactive, non-cached, standard SQL query using
SELECT .. FROM .. WHERE _PARTITIONTIME IS NULL
The query validator says:
Valid: This query will process 0 B when run.
And after executing, the job information tab says:
Bytes Processed 0 B
Bytes Billed 0 B
The query is certainly returning real-time results each time I run it. Is this actually a free operation?
I have a high-transaction service with an Oracle DB as the backend. Many clients call our service to get data. When we receive a request, we query the DB, get the result set, and send it back paginated. I don't want the query itself, but I want to know what really happens: say the result set has 20,000 rows and we need to send 100 rows per page in the response. How can I indicate in the response that there is more data, so that the client knows to hit our service again for the next pages? Say the response is in JSON format. How should the response format look? I'm new to Oracle. Thanks for your help.
To select the data in a paginated way, try
select order_id, order_descr
from (select order_id, order_descr, row_number() over(order by order_date desc) r
from orders
where customer_id = 123)
where r between 1 and 101
to display orders 1 up to 100 (the first page) for customer_id 123. The query deliberately fetches one extra row: if the client side receives more than 100 rows of data, then more data exists.
The inner select statement with the order by clause is necessary; without a stable ordering, row_number() could assign different numbers on each request, so pages could overlap or skip rows.
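As for the JSON response, there is no fixed standard. One common convention (a sketch with hypothetical field names) is to return the page of rows plus a flag computed from that extra 101st row, which is dropped from the payload itself:

{
  "page": 1,
  "pageSize": 100,
  "hasMore": true,
  "data": [
    { "order_id": 1, "order_descr": "first order on the page" },
    { "order_id": 2, "order_descr": "and so on, up to 100 rows" }
  ]
}

The client keeps requesting the next page until hasMore comes back false.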
I have a set of tables in Access 2007 from which I need to display the total number of items received. We order items against a job, using the job number as a common reference (like an ID).
Each job has multiple items required. Most items have multiple shipments we receive. Each shipment is given a unique receiving ticket number, so they need to be entered individually and totaled.
I have:
tblJobItems: JobNumber, Item, QtyNeeded
tblReceived: JobNumber, Item, QtyRecvd, RTNumber (RT = Receiving Ticket)
tblJobs: JobNumber, JobQty (and more, but not relevant to this issue)
(JobQty is not always the same as the item's QtyNeeded. The job is like a run of a certain model of computer, the job qty is how many of that model. Items are sometimes 1:1, like a case or power supply, but can be 2:1 or 3:1 like having multiple hard drives.)
I have a query that works fine to show the number of items placed on order, but we want to expand it (or combine it with other queries) to show the total number of items received per job number on the same line. Eventually I'll use this number to change the status and for other functions.
SELECT tblJobItems.JobNumber, tblJobItems.Item, tblJobItems.QTYNeeded, tblJobItems.PartStatus, First(tblJobs.BDT) AS FirstOfBDT, First(DateAdd("ww",-2,[BDT])) AS DueBy
FROM tblJobItems INNER JOIN tblJobs ON tblJobItems.JobNumber = tblJobs.JobNumber
GROUP BY tblJobItems.JobNumber, tblJobItems.Item, tblJobItems.QTYNeeded, tblJobItems.PartStatus;
This shows me, in a listbox, the items ordered and how many. The JobNumber is stored as [Tempvars]![JobNum], and the listbox only shows the records that match the JobNumber. (The TempVar is global, so it can be used in a query, if that helps anyone.)
I'm not opposed to having this go through two or three queries to get to the answer.
It turns out that the key piece needed in my query's SQL was:
Sum(tblReceived.ReceivedQTY) AS SumOfReceivedQTY, IIf(IsNull([SumofReceivedQTY]),0,[SumofReceivedQTY]) AS RECQTY
This sums up the quantity and also creates a new column in the query with the totals.
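For context, a sketch of how that fragment might sit in a full query (assuming the received-quantity field is QtyRecvd as listed above; the fragment calls it ReceivedQTY, so adjust to the real column name). The LEFT JOIN is what makes the IsNull check matter, since items with no receipts yet produce a Null sum:

SELECT tblJobItems.JobNumber, tblJobItems.Item, tblJobItems.QTYNeeded,
       Sum(tblReceived.QtyRecvd) AS SumOfReceivedQTY,
       IIf(IsNull(Sum(tblReceived.QtyRecvd)), 0, Sum(tblReceived.QtyRecvd)) AS RECQTY
FROM tblJobItems
LEFT JOIN tblReceived
  ON (tblJobItems.JobNumber = tblReceived.JobNumber)
 AND (tblJobItems.Item = tblReceived.Item)
GROUP BY tblJobItems.JobNumber, tblJobItems.Item, tblJobItems.QTYNeeded;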