In PostgreSQL I used array types for storing ids, UUIDs, etc., like:
CREATE TABLE IF NOT EXISTS eventing.notifications (event_type integer NOT NULL, transport_type integer, user_id uuid, user_ids uuid[]);
or
CREATE TABLE IF NOT EXISTS public.sources (name character varying(255), "timestamp" timestamp without time zone, read_only_organization_ids integer[], organization_id uuid);
Is there an equivalent using Microsoft SQL Server in Azure?
This blurs the line between SQL Server and NoSQL, but you can do it by encoding your array as a JSON array and storing that in a varchar(max) column.
Then, to create the JSON array from some other table storing user ids, you would use the FOR JSON clause.
To get the original array back out of the varchar column, you can CROSS APPLY with the OPENJSON function:
declare @notifications table (user_ids varchar(max))
;with cte_jsonUser(jsonIds) as
(
    -- FOR JSON can't be the top-level clause of a CTE definition,
    -- so build the JSON array in a scalar subquery
    select (select id
            from (values (1), (2)) as tbluser(id)
            for json auto)
)
insert into @notifications(user_ids)
select replace(replace(jsonIds,'{"id":',''),'}','')
from cte_jsonUser

select user_ids from @notifications
-- user_ids
-- [1,2]
select i.user_ids
from @notifications as n
cross apply openjson(n.user_ids)
with (user_ids int '$') as i
-- user_ids
-- 1
-- 2
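If you also need to filter rows by membership, e.g. find the notifications aimed at a given user, the same OPENJSON call works inside an EXISTS predicate. A minimal sketch against the table variable above (the id 2 is just an example value):
select n.user_ids
from @notifications as n
where exists
(
    select 1
    from openjson(n.user_ids) with (user_id int '$') as u
    where u.user_id = 2
)
-- returns the row, because 2 is an element of [1,2]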
Related
The product type table contains product types. Some ids may be missing:
create table artliik (liiginrlki char(3) primary key);
insert into artliik values('1');
insert into artliik values('3');
insert into artliik values('4');
...
insert into artliik values('999');
The property table contains a comma-separated list of types.
create table strings ( id char(100) primary key, kirjeldLku char(200) );
insert into strings values ('item1', '1,4-5' );
insert into strings values ('item2', '1,2,3,6-9,23-44,45' );
A type can be specified as a single integer, e.g. 1,2,3, or as a range like 6-9 or 23-44.
A list can contain both.
How do I get all properties for a given type?
Query
select id
from artliik
join strings on ','||trim(strings.kirjeldLku)||',' like '%,'||trim(artliik.liiginrlki)||',%'
returns data for single-integer list entries only.
How do I change the join so that type ranges in the list, like 6-9, are also matched?
E.g. if the list contains 6-9, types 6, 7, 8 and 9 should be included in the report.
Postgres 13 is used.
I would suggest a helper function similar to unnest that honors ranges.
Corrected function
create or replace function unnest_ranges(s text)
returns setof text language sql immutable as
$$
-- split the CSV into items, then expand each item with generate_series:
-- a single value 'n' becomes the series n..n,
-- a range 'a-b' becomes the series a..b
with t(x) as (select unnest(string_to_array(s, ',')))
select generate_series
(
    split_part(x, '-', 1)::int,
    case when x ~ '-' then split_part(x, '-', 2)::int else x::int end,
    1
)::text
from t;
$$;
Then you can 'normalize' the strings table and join:
select *
from artliik a
join (select id, unnest_ranges(kirjeldLku) from strings) as t(id, v)
on a.liiginrlki = v;
The use of a function definition is of course optional. I prefer it because the function is generic and reusable.
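For a quick sanity check you can also call the function on its own; a sample invocation (the input string is made up):
select unnest_ranges('1,2,3,6-9');
-- unnest_ranges
-- 1
-- 2
-- 3
-- 6
-- 7
-- 8
-- 9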
The dbfiddle.uk demo will only work on Postgres 14, since only Postgres 14 has the multirange data type. But the customizable ICU collation works in Postgres 13.
Collation doc: https://www.postgresql.org/docs/current/collation.html
Idea: create a multirange over a text range type that sorts numeric values by their numerical value, so that 'A-21' < 'A-123'.
CREATE COLLATION testcoll_numeric (
provider = icu,
locale = '#colNumeric=yes'
);
CREATE TYPE textrange AS RANGE (
    subtype = text,
    multirange_type_name = multirange_of_text,
    COLLATION = testcoll_numeric
);
So
SELECT
    multirange_of_text (textrange ('1'::text, '11'::text)) @> '9'::text AS contain_9;
should return true, because under the numeric collation '1' < '9' < '11'.
The artliik table structure remains the same, but the strings table needs to change a bit.
CREATE temp TABLE strings (
id text PRIMARY KEY,
kirjeldLku multirange_of_text
);
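Rows are then stored as multirange literals instead of CSV strings. A sketch of the inserts (assuming inclusive text ranges, so the original '1,4-5' becomes two ranges):
INSERT INTO strings VALUES ('item1', '{[1,1],[4,5]}');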
then query it:
SELECT DISTINCT
strings.id
FROM
artliik,
strings
WHERE
strings.kirjeldLku @> liiginrlki::text
ORDER BY
1;
I've got a table with strings that look like this:
'9;1;test;A;11002'
How would I count how many semicolons there are before the 'A'?
Cheers!
Using string functions:
select len(left(str, charindex('A', str))) - len(replace(left(str, charindex('A', str)), ';', '')) as n
from tbl
Hint 1: The whole issue has some smell... You should not store your data as a CSV string. But sometimes we have to work with what we have...
Hint 2: The following needs SQL Server 2016 or later. With an older version we'd need to do something similar based on XML.
Try this:
--A declared table to mock up your issue
DECLARE @tbl TABLE(ID INT IDENTITY, YourCSVstring VARCHAR(100));
INSERT INTO @tbl(YourCSVstring)
VALUES('9;1;test;A;11002');
--the query
SELECT t.ID
,A.*
FROM @tbl t
CROSS APPLY OPENJSON(CONCAT(N'["',REPLACE(t.YourCSVstring,';','","'),N'"]')) A;
The idea in short:
We use some replacements to translate your CSV string to a JSON array.
Now we can use OPENJSON() to read it.
The value is the array item, the key its zero-based index.
Proceed with this however you need it.
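Tying this back to the original question: since the key is the zero-based index of each element, the index of the first 'A' is exactly the number of semicolons in front of it. A sketch building on the query above:
SELECT t.ID
      ,MIN(CAST(A.[key] AS INT)) AS SemicolonsBeforeA
FROM @tbl t
CROSS APPLY OPENJSON(CONCAT(N'["',REPLACE(t.YourCSVstring,';','","'),N'"]')) A
WHERE A.[value] = 'A'
GROUP BY t.ID;
-- returns 3 for the sample string '9;1;test;A;11002'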
Just to give you some fun: you can easily read the CSV type-safe into columns by doubling the brackets ([[ and ]]) and using WITH to specify your columns:
SELECT t.ID
,A.*
FROM @tbl t
CROSS APPLY OPENJSON(CONCAT(N'[["',REPLACE(t.YourCSVstring,';','","'),N'"]]'))
WITH(FirstNumber INT '$[0]'
,SecondNumber INT '$[1]'
,SomeText NVARCHAR(100) '$[2]'
,YourLetterA NVARCHAR(100) '$[3]'
,FinalNumber INT '$[4]')A
returns:
ID FirstNumber SecondNumber SomeText YourLetterA FinalNumber
1 9 1 test A 11002
I am creating two tables to store JSON values (one BLOB, one NCLOB), and my aim is to search for the required values in the JSON tables.
Code Snippet 1: Creating a table with one NCLOB column.
create table departments_json_nclob (
department_id integer not null primary key,
department_data NCLOB not null
);
A simple insert with multibyte characters (the value məharaːʂʈrə):
insert into departments_json_nclob
values ( 200,'{"department_list":[{"Deptname":"DEPT-A", "value" : "məharaːʂʈrə"}]}');
Code Snippet 2: Now I have created another table with a BLOB datatype:
create table departments_json (
department_id integer not null primary key,
department_data blob not null
);
Added a constraint to allow only JSON:
alter table departments_json
add constraint dept_data_json
check ( department_data is JSON FORMAT JSON STRICT );
Insert normal JSON
insert into departments_json
values ( 100, utl_raw.cast_to_raw ('{"department_list":[{"Deptname":"DEPT-A", "value" : "məharaːʂʈrə"}]}'));
The insertion is verified by the query below:
SELECT json_value(department_data format json, '$.department_list.value' )
FROM departments_json JS
WHERE DEPARTMENT_ID=100;
output is: məharaːʂʈrə
Now I will make one more insertion into the same table, departments_json, but this time the inserted value comes from the NCLOB table departments_json_nclob:
declare
i nclob;
begin
select department_data into i from departments_json_nclob where department_id =200;
-- inserting the same way as for department_id 100, but the value comes from the NCLOB
insert into departments_json
values ( 101, utl_raw.cast_to_raw (i));
commit;
end;
Again, the insertion is verified with the query below:
SELECT json_value(department_data format json, '$.department_list.value' )
FROM departments_json JS
WHERE DEPARTMENT_ID=101;
output is: məharaːʂʈrə
Now my question is:
When I search for the multibyte characters, the query returns a result for only one of the rows: the direct insertion into the BLOB table, which is DEPARTMENT_ID=100. Why not 101?
The queries:
SELECT *
FROM departments_json
WHERE JSON_value(department_data format json, '$.department_list.value') = ('məharaːʂʈrə');
SELECT *
FROM departments_json
WHERE JSON_TEXTCONTAINS(department_data, '$.department_list.value', 'məharaːʂʈrə')
The query below shows which characters are multibyte:
select c, length(c), lengthb(c)
from ( select substr(s, level, 1) c
from ( select 'məharaːʂʈrə' s
from dual)
connect by level <= length(s));
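To see what was actually stored, the byte lengths of the two rows can be compared; if they differ, the implicit NCLOB-to-VARCHAR2 conversion inside utl_raw.cast_to_raw has already altered the multibyte characters. A diagnostic sketch (not part of the original scripts):
SELECT department_id, dbms_lob.getlength(department_data) AS byte_len
FROM departments_json
WHERE department_id IN (100, 101);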
I have a JSON column in one of my tables, and the JSON has no key or property, only values.
I tried to parse the column with JSON_QUERY and JSON_VALUE, but both of these functions only work if the JSON string has a key; in my situation it has none.
So how can I parse the column from the top table into the bottom table in SQL Server, as in the image below?
Please try this:
DECLARE @Table TABLE (ID INT, [JSONColumn] NVARCHAR(MAX));
INSERT INTO @Table(ID,[JSONColumn])VALUES
(151616,'["B0107C57WO","B066EYU4IY"]')
,(151617,'["B0088MD64S"]')
;
SELECT t.ID,j.[value]
FROM @Table t
CROSS APPLY OPENJSON(t.JSONColumn) j
;
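The expected result is one row per array element:
ID      value
151616  B0107C57WO
151616  B066EYU4IY
151617  B0088MD64S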
I have a table column consisting of XML documents. I want to read the XML data and display it.
I came up with the following code, but it reads only one row from the column.
I want to display the other XML data as well.
declare @xml xml
select @xml = event_data_XML from #temp
SELECT * FROM (
SELECT
CAST(f.x.query('data(#name)') as varchar(150)) as data_name,
CAST(f.x.query('data(value)') as varchar(150)) as data_value
FROM @xml.nodes('/event') as t(n)
CROSS APPLY t.n.nodes('data') as f(x)) X
PIVOT (MAX(data_value) FOR data_name IN (NTDomainName, DatabaseName, ServerName)) as pvt
The output should be like this (NTDomainName, DatabaseName, ServerName are XML data):
There are a bunch of ways you could do this. I'll show you a way I think you'd find easiest.
To start, here's a table with a little test data:
CREATE TABLE dbo.stuff (
id int identity (1,1) primary key
, event_data_xml xml
, create_date datetime default(getdate())
, is_active bit default(1)
);
INSERT INTO dbo.stuff (event_data_xml)
VALUES ('<event name="thing" package="as">something</event>')
INSERT INTO dbo.stuff (event_data_xml)
VALUES ('<event name="otherthing" package="as">something else</event>')
---All records
SELECT * FROM dbo.[stuff];
Make sense so far? Here's the query I'd use if I wanted to mix XML data and column data:
---Parsed up
SELECT event_data_xml.value('/event[1]', 'nvarchar(max)') AS [parsed element #text]
, event_data_xml.value('/event[1]/#name', 'nvarchar(max)') AS [parsed attribute value]
, create_date --column from table
FROM dbo.stuff
WHERE is_active = 1;
I'm using the value() function on the XML column, passing in an XPath to what I want to display and a SQL Server data type for how I want it returned.
Just make sure you're selecting a single value with your XPath expression.
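If your actual XML looks like the event data in the question, with several <data name="..."><value>...</value></data> nodes per row, the same idea works with the nodes() method applied directly to the column, so every row of the table gets read instead of a single variable. A sketch assuming that structure and the #temp table from the question:
SELECT d.x.value('@name', 'varchar(150)') AS data_name
     , d.x.value('(value/text())[1]', 'varchar(150)') AS data_value
FROM #temp t
CROSS APPLY t.event_data_XML.nodes('/event/data') AS d(x);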