How to write a Kusto query to find two consecutive rows that have the same value in a field - azure-log-analytics

I need to write a Kusto query for Azure Log Analytics that finds consecutive events with the same value in a field (the same error code). Basically, we need to find out whether a request fails twice in a row.
A fail, success, fail sequence should not be returned.

Assuming you have a table with Id, Datetime, and ErrorCode columns, you can use the prev() function to achieve this (prev() requires a serialized row set, which the order by operator produces):
https://learn.microsoft.com/en-us/azure/kusto/query/prevfunction
datatable(Id:string, Datetime:datetime, ErrorCode:string)
[
'1', datetime(2018-10-16 00:00), 'Error 1',
'1', datetime(2018-10-16 00:01), 'Error 1',
'2', datetime(2018-10-16 00:02), 'Error 1',
'2', datetime(2018-10-16 00:03), 'Error 2',
]
| order by Id asc, Datetime asc
| extend prevErrorCode = prev(ErrorCode), prevId=prev(Id)
| where prevErrorCode==ErrorCode and prevId == Id
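With the sample data above, only the second row for Id '1' has a previous row carrying both the same Id and the same ErrorCode, so the query returns a single row:

```text
Id  Datetime             ErrorCode  prevErrorCode  prevId
1   2018-10-16 00:01:00  Error 1    Error 1        1
```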

Related

How to store an array of date ranges in Postgres?

I am trying to build a schedule. On the client I generate an array of objects containing date ranges:
[
{start: "2020-07-06 0:0", end: "2020-07-10 23:59"},
{start: "2020-07-13 0:0", end: "2020-07-17 23:59"}
]
I have a column of type daterange[]. What is the proper way to format this data to insert it into my table?
This is what I have so far:
INSERT INTO schedules(owner, name, dates) VALUES (
1,
'work',
'{
{[2020-07-06 0:0,2020-07-10 23:59]},
{[2020-07-13 0:0,2020-07-17 23:59]}
}'
)
I think you want:
insert into schedules(owner, name, dates) values (
1,
'work',
array[
'[2020-07-06, 2020-07-11)'::daterange,
'[2020-07-13, 2020-07-18)'::daterange
]
);
Rationale:
you are using dateranges, so you cannot have time portions (for that, you would need tsrange instead); as your code stands, it seems like you want an inclusive lower bound and an exclusive upper bound (hence [ on the left side, and ) on the right side)
explicit casting is needed so Postgres can recognize that the array elements have the proper datatype (otherwise, they look like text)
then, you can surround the list of ranges with the array[] constructor
Demo on DB Fiddle:
owner | name | dates
----: | :--- | :----------------------------------------------------
1 | work | {"[2020-07-06,2020-07-11)","[2020-07-13,2020-07-18)"}
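If the time portions in your client payload actually matter, daterange will silently lose them; a tsrange[] column keeps them. A sketch, assuming the same schedules table with dates declared as tsrange[] instead:

```sql
-- tsrange allows an upper bound with a time component (inclusive here)
INSERT INTO schedules(owner, name, dates) VALUES (
    1,
    'work',
    ARRAY[
        '[2020-07-06 00:00, 2020-07-10 23:59]'::tsrange,
        '[2020-07-13 00:00, 2020-07-17 23:59]'::tsrange
    ]
);
```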

Possible to query for data in multiple keys in the same query?

I'm trying to look up some data in my database using a string, but checking multiple JSON keys.
I know you can look up using a single key, in this case 'name':
SELECT *
FROM products_product
WHERE data->>'name' IN ('Value 1', 'Value 2')
My issue is that I have two types of JSON values being saved, and therefore I need to check both 'name' and 'title'.
As a simplified example, one value may look like this:
{
"name": "Value 1"
}
while the second may look like this:
{
"title": "Value 2"
}
How can I do a single query that checks both 'name' and 'title' for the values I have?
I was expecting to be able to do something like:
SELECT *
FROM products_product
WHERE data->>('name', 'title') IN ('Value 1', 'Value 2')
But it's not possible!
I want to avoid doing multiple queries, as I know the keys the data can be placed in and have them as a list or tuple (whatever is needed).
So my final question is:
How can I query the database to check multiple fields (if they exist) for the values I have defined?
I'm expecting both of the examples to be returned in a single query, not just one of them.
You can use an OR:
SELECT *
FROM products_product
WHERE data->>'name' IN ('Value 1', 'Value 2')
OR data->>'title' IN ('Value 1', 'Value 2')
You can slightly simplify this, by only specifying the values once:
with input (val) as (
values ('Value 1'), ('Value 2') --<< note the parentheses!
)
SELECT *
FROM products_product
WHERE data->>'name' IN (select val from input)
OR data->>'title' IN (select val from input)
You don't say which version you are using. You can do this with JSON path in Postgres 12. The syntax is a bit obtuse, but it's possible:
SELECT * FROM products_product
WHERE jsonb_path_exists(data,
'strict $.keyvalue() ? ((@.value == "Value 1" || @.value == "Value 2") && (@.key == "name" || @.key == "title"))');
This will search every key/value combo at the root of your object and look for keys equalling name or title and values equalling Value 1 or Value 2.
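Another option that scales to any number of candidate keys is to unnest each object with jsonb_each_text and test the key/value pairs in one place. A sketch, assuming the data column is jsonb and the values live at the top level of the object:

```sql
SELECT p.*
FROM products_product p
WHERE EXISTS (
    SELECT 1
    FROM jsonb_each_text(p.data) AS kv(key, value)  -- one row per top-level key/value pair
    WHERE kv.key   IN ('name', 'title')
      AND kv.value IN ('Value 1', 'Value 2')
);
```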

Get values filtered by jsonb key

I have a table called houses that has a jsonb column called details. The structure of the details column is:
{
owner_id: TEXT,
owner_name: TEXT,
offers: [{ offer_id: TEXT, offer_value_id: TEXT, offer_value_name: TEXT }]
}
Notice that sometimes offers can be completely empty, i.e. offers: []
So, right now I have a query that lets me get all the distinct house owners ordered by owner_name. It looks like this:
SELECT distinct ("details"->>'owner_id') as "identifier", "details"->>'owner_name' as "name"
FROM houses
order by "details"->>'owner_name' asc
I want to do something similar, but now I want to get all the different offer values for a specific offer_id. Here is some sample data, followed by what I would expect:
id, details
1, { owner_id: '1', owner_name: 'john', offers: [] }
2, { owner_id: '2', owner_name: 'charles', offers: [ { offer_id: '1', offer_value_id: '1', offer_value_name: 'offer1' }, { offer_id: '2', offer_value_id: '2', offer_value_name: 'offer2' } ] }
3, { owner_id: '3', owner_name: 'melissa', offers: [ { offer_id: '2', offer_value_id: '5', offer_value_name: 'a offer 3' } ] }
4, { owner_id: '3', owner_name: 'melissa', offers: [ { offer_id: '6', offer_value_id: '8', offer_value_name: 'offer10' } ] }
So, say I want to get all the different offer value ids and value names when the offer_id is '2'. The result would be:
identifier (this would be offer_value_id), name (this would be offer_value_name)
'5', 'a offer 3'
'2', 'offer2'
null, null
Notice that there is a null, null row because there are at least two rows that don't have any offer with offer_id '2', and I want to get those too. Also notice that the values are ordered by offer_value_name with NULLs last.
I've tried the following, but it is not working:
SELECT distinct ("details"->>'offers'->>'offer_value_id') as "identifier", ("details"->>'offers'->>'offer_value_name') as "name"
FROM houses
WHERE "details"->>'offers'->>'offer_id' = '2'
order by "details"->>'offers'->>'offer_value_name' asc
And I don't think this approach would work anyway: if a house's offers don't include that offer_id, I still want a NULL row, but this would just filter it out.
I think this would work:
SELECT DISTINCT "offers"->>'offer_value_id' as "identifier", "offers"->>'offer_value_name' as "name"
FROM houses
LEFT JOIN jsonb_array_elements("details"->'offers') "offers" ON "offers"->>'offer_id' = '2'
ORDER BY "name" NULLS LAST;
You want all the records regardless of whether an offer with id '2' exists, and that's why you do a LEFT JOIN.
The other thing to notice here is jsonb_array_elements, which is helpful because it expands the JSON array into a set of JSON values. That way you can access each offer as if it were a top-level row.
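To see what jsonb_array_elements produces on its own, here is a sketch against the sample houses table:

```sql
-- each array element becomes its own row; houses with empty offers
-- still appear (with NULLs) thanks to LEFT JOIN ... ON true
SELECT id,
       "offer"->>'offer_id' AS offer_id,
       "offer"->>'offer_value_name' AS value_name
FROM houses
LEFT JOIN jsonb_array_elements("details"->'offers') AS "offer" ON true;
```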

SQL server query on json string for stats

I have a SQL Server database that holds contest participations. In the Participation table, I have various fields and a special one called ParticipationDetails. It's a varchar(MAX). This field is used to hold all contest-specific data in JSON format. Example rows:
Id,ParticipationDetails
1,"{'Phone evening': '6546546541', 'Store': 'StoreABC', 'Math': '2', 'Age': '01/01/1951'}"
2,"{'Phone evening': '6546546542', 'Store': 'StoreABC', 'Math': '2', 'Age': '01/01/1952'}"
3,"{'Phone evening': '6546546543', 'Store': 'StoreXYZ', 'Math': '2', 'Age': '01/01/1953'}"
4,"{'Phone evening': '6546546544', 'Store': 'StoreABC', 'Math': '3', 'Age': '01/01/1954'}"
I'm trying to get a query running that will yield this result:
Store, Count
StoreABC, 3
StoreXYZ, 1
I used to run this query:
SELECT TOP (20) ParticipationDetails, COUNT(*) Count FROM Participation GROUP BY ParticipationDetails ORDER BY Count DESC
This works as long as I want unique ParticipationDetails. How can I change this to "sub-query" into my JSON strings? I've gotten to this query, but I'm kind of stuck here:
SELECT 'StoreABC' Store, Count(*) Count FROM Participation WHERE ParticipationDetails LIKE '%StoreABC%'
This query gets me the results I want for a specific store, but I want the store value to be "anything that was put in there".
Thanks for the help!
First of all, I suggest avoiding any JSON handling in T-SQL, since it is not natively supported (native JSON functions only arrived in SQL Server 2016). If you have an application layer, let it manage that kind of formatted data (e.g., the .NET Framework and non-Microsoft frameworks have JSON serializers available).
However, you can convert your json strings using the function described in this link.
You can also write your own query that works on the raw strings. Something like the following (note the escaped single quotes in the search strings, matching the quoting used in the sample data; searching for double-quoted "Store" would never match it):
SELECT
T.Store,
COUNT(*) AS [Count]
FROM
(
SELECT
-- remove everything up to and including the opening quote of the Store value,
-- then remove everything from the value's closing quote onward
STUFF(
STUFF(ParticipationDetails, 1, CHARINDEX('''Store''', ParticipationDetails) + 9, ''),
CHARINDEX('''Math''',
STUFF(ParticipationDetails, 1, CHARINDEX('''Store''', ParticipationDetails) + 9, '')) - 3, LEN(STUFF(ParticipationDetails, 1, CHARINDEX('''Store''', ParticipationDetails) + 9, '')), '')
AS Store
FROM
Participation
) AS T
GROUP BY
T.Store
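Note: SQL Server 2016 and later have built-in JSON functions, which would make this much cleaner, provided the stored strings were valid JSON (double-quoted keys and values rather than the single quotes shown above). A sketch:

```sql
-- requires SQL Server 2016+ and valid JSON in ParticipationDetails
SELECT JSON_VALUE(ParticipationDetails, '$.Store') AS Store,
       COUNT(*) AS [Count]
FROM Participation
GROUP BY JSON_VALUE(ParticipationDetails, '$.Store')
ORDER BY [Count] DESC;
```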

Using oracle decode

I have read an article about remapping data from an Oracle column. I followed one of the queries posted, but when I try it on my own I only get NULL values. Please help me figure out what's wrong in my DECODE query.
The original data in the status column is 'no answer' and 'answer'.
Thanks.
Here's my query
select call_time, decode(status, 'no answer', 'hey', 'answer', 'yes'), channel
FROM APP_ACCOUNT.CC_CALL;
And the output of this is:
call_time decode(status, 'no answer', 'hey', 'answer', 'yes') CHANNEL
10/22/2013 NULL DAHDI/i1/
11/05/2013 NULL DAHDI/i2/
Instead of:
call_time decode(status, 'no answer', 'hey', 'answer', 'yes') CHANNEL
10/22/2013 yes DAHDI/i1/
11/05/2013 hey DAHDI/i2/
When using DECODE, you should always supply a default value:
decode(value_from_db, match1, result1, match2, result2, ..., matchN, resultN, default_value)
so that you won't get NULL returned when all the matches fail.
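For example, a sketch against the query above (the TRIM is an assumption: trailing spaces or case differences in the stored status values are a common reason every match fails and NULL comes back):

```sql
SELECT call_time,
       DECODE(TRIM(status), 'no answer', 'hey',
                            'answer',    'yes',
                            status) AS status_label,  -- default: fall back to the raw value
       channel
FROM APP_ACCOUNT.CC_CALL;
```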