How do I get the order numbers (189, 190) from the following TrigParams field in a SQL Server 2014 database?
TrigParams
{"OWLSObjKey":{"key":"OWLSObjKey","value":"189","type":null},"OWLSObjType":{"key":"OWLSObjType","value":"17","type":null},"ObjKey":{"key":"ObjKey","value":"189","type":null},"ObjType":{"key":"ObjType","value":"17","type":null}}
{"OWLSObjKey":{"key":"OWLSObjKey","value":"190","type":null},"OWLSObjType":{"key":"OWLSObjType","value":"17","type":null},"ObjKey":{"key":"ObjKey","value":"190","type":null},"ObjType":{"key":"ObjType","value":"17","type":null}}
As you can see from the other answers, SQL Server 2016+ with its built-in JSON support would be a great help. But without it you are not lost: you can use string methods.
credits to Panagiotis Kanavos for the MCVE
declare @table table (trigparams nvarchar(2000))
insert into @table
values
('{"OWLSObjKey":{"key":"OWLSObjKey","value":"189","type":null},"OWLSObjType":{"key":"OWLSObjType","value":"17","type":null},"ObjKey":{"key":"ObjKey","value":"189","type":null},"ObjType":{"key":"ObjType","value":"17","type":null}}'),
('{"OWLSObjKey":{"key":"OWLSObjKey","value":"190","type":null},"OWLSObjType":{"key":"OWLSObjType","value":"17","type":null},"ObjKey":{"key":"ObjKey","value":"190","type":null},"ObjType":{"key":"ObjType","value":"17","type":null}}')
--This is the query
select LEFT(CutOff,CHARINDEX('"',CutOff)-1)
from @table t
CROSS APPLY(SELECT STUFF(t.trigparams,1,CHARINDEX('"value":"',t.trigparams)+8,'')) A(CutOff);
The idea in short:
Within the APPLY we use STUFF() to overwrite everything up to the number you are looking for (everything through the first occurrence of "value":") with an empty string. This is returned as the column CutOff. We can then use LEFT() to pick out the number only.
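With the two sample rows this returns 189 and 190, so it answers the question on SQL Server 2014 without any JSON functions.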
The question doesn't specify the server version, the table schema or whether the string represents a single value or the contents of two separate rows.
I'll just assume it's SQL Server 2016 which supports JSON and the text comes from two separate rows. I'll also assume the query only needs to return the data as single value.
In this case a simple call to JSON_VALUE(trigparams, '$.OWLSObjKey.value') will return the data. JSON_VALUE returns a single value from a well-formed JSON string:
declare @table table (trigparams nvarchar(2000))
insert into @table
values
('{"OWLSObjKey":{"key":"OWLSObjKey","value":"189","type":null},"OWLSObjType":{"key":"OWLSObjType","value":"17","type":null},"ObjKey":{"key":"ObjKey","value":"189","type":null},"ObjType":{"key":"ObjType","value":"17","type":null}}'),
('{"OWLSObjKey":{"key":"OWLSObjKey","value":"190","type":null},"OWLSObjType":{"key":"OWLSObjType","value":"17","type":null},"ObjKey":{"key":"ObjKey","value":"190","type":null},"ObjType":{"key":"ObjType","value":"17","type":null}}')
select JSON_VALUE(trigparams,'$.OWLSObjKey.value') As SomeKey
from @table
This returns:
SomeKey
189
190
Assuming you're using SQL Server 2016+, use OPENJSON:
SELECT O.OrderNumber
FROM (VALUES('{"OWLSObjKey":{"key":"OWLSObjKey","value":"189","type":null},"OWLSObjType":{"key":"OWLSObjType","value":"17","type":null},"ObjKey":{"key":"ObjKey","value":"189","type":null},"ObjType":{"key":"ObjType","value":"17","type":null}}'),
('{"OWLSObjKey":{"key":"OWLSObjKey","value":"190","type":null},"OWLSObjType":{"key":"OWLSObjType","value":"17","type":null},"ObjKey":{"key":"ObjKey","value":"190","type":null},"ObjType":{"key":"ObjType","value":"17","type":null}}')) V(TrigParams)
CROSS APPLY OPENJSON(V.TrigParams) WITH (OWLSObjKey nvarchar(MAX) AS JSON) OK
CROSS APPLY OPENJSON(OK.OWLSObjKey) WITH (OrderNumber int '$.value') O;
If not, then SQL Server is not your friend for this without using a CLR Function. If you can't use CLR, then I would suggest using a different tool to read the JSON value.
How do I remove duplicates in the following case in T-SQL?
I have a table with a column Code of type varchar(max).
It contains values like truck/rail/truck/rail, and I need the cell value to be truck/rail.
Another possibility is truck/rail/ship/truck, which needs to become truck/rail/ship.
Can this be done using a table-valued function?
Thanks.
You can use STRING_SPLIT along with STRING_AGG to remove the duplicates.
DECLARE @t table(id int, val varchar(max))
insert into @t values(1,'truck/rail/truck/rail'), (2,'truck/rail/ship/truck')
SELECT t.id,STRING_AGG(splitval,'/') as newval FROM @t as t
cross apply (
SELECT distinct value from string_split(t.val,'/')) as ca(splitval)
group by t.id
id  newval
1   rail/truck
2   rail/ship/truck
Note 1: STRING_SPLIT does not guarantee order, so after duplicate removal your concatenated result might be in a different order than the original list. If you want to preserve the order, you have to go for a different solution using XML nodes or a JSON array (see the sketch below).
Note 2: STRING_SPLIT was introduced in SQL Server 2016 and STRING_AGG in SQL Server 2017. If you are using earlier versions, you have to go for a recursive CTE and CHARINDEX based solution.
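For completeness, here is a sketch of an order-preserving variant using a JSON array instead of STRING_SPLIT. This is not part of the original answer; it assumes SQL Server 2017+ (for STRING_AGG ... WITHIN GROUP) and that the items contain no double quotes:
-- Build a JSON array so OPENJSON exposes each item's zero-based position ([key]),
-- keep each distinct value's first position, then re-aggregate in that order.
DECLARE @t table(id int, val varchar(max));
INSERT INTO @t VALUES (1,'truck/rail/truck/rail'), (2,'truck/rail/ship/truck');
SELECT t.id,
       STRING_AGG(d.value, '/') WITHIN GROUP (ORDER BY d.firstPos) AS newval
FROM @t AS t
CROSS APPLY (
    SELECT j.value, MIN(CAST(j.[key] AS int)) AS firstPos
    FROM OPENJSON('["' + REPLACE(t.val, '/', '","') + '"]') AS j
    GROUP BY j.value
) AS d
GROUP BY t.id;
This should return truck/rail for id 1 and truck/rail/ship for id 2, keeping the original order.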
If you know that the error exists, then just do an UPDATE where you replace truck/rail/truck/rail with truck/rail using REPLACE(Code,'truck/rail/truck/rail','truck/rail').
The same goes for your truck/rail/ship/truck issue.
If you need automatic detection and correction, that's a whole other story, but it could still be done using nested REPLACEs. Detection of the issue is the hard part. Personally, I'd be having a talk with the people who are providing the data.
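For illustration, a minimal sketch of the hard-coded approach (the table name MyTable is an assumption, since the question never names it):
-- Nested REPLACEs for the two known bad values; extend the nesting per value.
UPDATE MyTable
SET Code = REPLACE(REPLACE(Code,
               'truck/rail/truck/rail', 'truck/rail'),
               'truck/rail/ship/truck', 'truck/rail/ship')
WHERE Code IN ('truck/rail/truck/rail', 'truck/rail/ship/truck');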
I have a query with an IN clause containing 2000 parameters of varchar2 type. I have applied the required index as per the execution plan.
It is still working very slowly, so I'm looking for an alternative solution. One option I found is creating a temporary table with these values and fetching via a join. Is there any other way? I'm using Spring Data JPA/Criteria for queries in Java. Thanks in advance.
Bulk load the values into a temporary table, either using SQLServerBulkCopy directly or via the useBulkCopyForBatchInsert connection property.
Or use a table-valued parameter (a minimal sketch is shown at the end of this answer).
Or (SQL 2016+) send the values as a JSON array. Just create a long string of the form
["Value1","Value2","Value3"]
and pass it as a parameter to a query like:
select *
from SomeTable
where SomeColumn in ( select value from openjson(@jsonValues) with (value varchar(200) '$') )
Or do the same thing with XML.
You can also use string_split, but that's also only available on SQL 2016+ and JSON is more robust.
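For the table-valued parameter option above, a minimal sketch (the type name, procedure name, and table/column names are assumptions, reusing SomeTable/SomeColumn from the JSON example):
-- dbo.StringList and dbo.GetRowsByValues are assumed names.
CREATE TYPE dbo.StringList AS TABLE (Value varchar(200) NOT NULL PRIMARY KEY);
GO
CREATE PROCEDURE dbo.GetRowsByValues
    @values dbo.StringList READONLY
AS
    -- The 2000 lookup values arrive as a single structured parameter and are joined, not listed in an IN clause.
    SELECT st.*
    FROM SomeTable AS st
    JOIN @values AS v ON v.Value = st.SomeColumn;
GO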
I know strings need to be prefixed with N' in SQL Server (2012) INSERT statements to store them as Unicode, but do they have to be retrieved (SELECT statement) in a certain way as well so they come back as Unicode?
I am able to store international strings correctly with the N notation, but when I run a SELECT query to fetch the records back, they come back as question marks. My query is very simple.
SELECT COLUMN1, COLUMN2 FROM TABLE1
I am looking at other possible reasons that may have caused this issue, but at least I want to rule out the SQL statement above. Should it read the COLUMN1 and COLUMN2 columns correctly when they both store Unicode strings inserted with the N notation? Do I have to do anything to the statement to tell it they are Unicode?
Within Management Studio you should not need to do anything special to display the correct values. Make sure that the columns in your table are defined as Unicode strings (NVARCHAR) instead of ANSI strings (VARCHAR).
The following example demonstrates the concept:
CREATE TABLE UnicodeExample
(
MyUnicodeColumn NVARCHAR(100)
,MYANSIColumn VARCHAR(100)
)
INSERT INTO UnicodeExample
(
MyUnicodeColumn
,MYANSIColumn
)
VALUES
(
N'איש'
,N'איש'
)
SELECT *
FROM UnicodeExample
DROP TABLE UnicodeExample
In the above example the column MyUnicodeColumn is defined as an NVARCHAR(100) and MYANSIColumn is defined as a VARCHAR(100). The query will correctly return the result for MyUnicodeColumn but will return ??? for MYANSIColumn.
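If the column really is NVARCHAR and the data was inserted with the N prefix, the plain SELECT above is enough; question marks at that point usually mean the data was already stored as ? characters, for example because it was inserted without the N prefix or into a VARCHAR column whose collation cannot represent those characters.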
I'm looking for an efficient way to convert rows to columns in SQL Server. I tried this in Toad for Oracle, but now I want it in SQL Server.
This is my example:
CID SENTENCE
1 Hello; Hi;
2 Why; What;
The result should look like this:
CID SENTENCE
1 Hello
1 Hi
2 Why
2 What
Would you please help me with it?
I would advise you to rethink your database design. It's almost never a good idea to store data in a delimited string in any relational database.
If it's impossible to change your database design, you need to use some UDF to split strings.
There are many different approaches to splitting strings in SQL Server; read this article on the differences between the common ways.
You can probably change your chosen string-splitting function to take the cid as well as the sentence as parameters and have it return the data exactly in your desired output shape for each row of your table.
Then all you have to do is select from your table with a join to the UDF on the cid.
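To illustrate that pattern, a sketch assuming a hypothetical splitter dbo.SplitString(@input, @delimiter) that returns one row per item in a column named Item (pick a real implementation from the article), and an assumed table name MyTable; CROSS APPLY plays the role of the join here:
-- dbo.SplitString is a placeholder for whichever splitter UDF you choose.
SELECT t.CID, LTRIM(s.Item) AS SENTENCE
FROM MyTable AS t                                 -- MyTable is an assumed name
CROSS APPLY dbo.SplitString(t.SENTENCE, ';') AS s
WHERE s.Item <> '';                               -- drop the empty item from the trailing ';'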
Try:
declare @var table (CID int, SENTENCE varchar(50))
insert into @var(CID,SENTENCE) values
(1,'Hello; Hi;'),
(2,'Why; What;')
select cid,t.c.value('.','varchar(50)') as val
from (select cid,x=cast('<t>'+ replace(stuff(sentence,len(sentence),1,''),';','</t><t>')+'</t>' as xml)
from @var) a cross apply x.nodes('/t') t(c)
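With the sample data above this returns (1, Hello), (1, Hi), (2, Why), (2, What), although the second value of each row keeps the leading space that followed the semicolon; wrap the value() call in LTRIM() if that matters.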
I have a database column containing a string that might look something like this u/1u/3u/19/g1/g4 for a particular row.
Is there a performant way to get all rows that have at least one of the following elements ['u/3', 'g4'] in that column?
I know I can use AND clauses, but the number of elements to check against varies and could become large.
I am using RoR/ActiveRecord in my project.
In SQL Server, you can use XML to convert your list of search params into a record set, then cross join that with the base table and use CHARINDEX() to see if the column contains the substring.
Since I don't know your table or column names, I used a table (persons) that I already had data in, which has a column phone_home. To search for any phone number that contains '202' or '785' I would use this query:
select person_id,phone_home,Split.data.value('.', 'VARCHAR(10)')
from (select *, cast('<n>202</n><n>785</n>' as XML) as myXML
from persons) as data cross apply myXML.nodes('/n') as Split(data)
where charindex(Split.data.value('.', 'VARCHAR(10)'),data.phone_Home) > 0
You will get duplicate records if a row matches more than one value, so throw a DISTINCT in there and remove the Split column from the select list if that is not desired.
Using XML in SQL is voodoo magic to me... I got the idea from this post: http://www.sqljason.com/2010/05/converting-single-comma-separated-row.html
No idea what performance is like... but at least there aren't any cursors or dynamic SQL.
EDIT: Casting the XML is pretty slow, so I made it a variable so it only gets cast once.
declare @xml XML
set @xml = cast('<n>202</n><n>785</n>' as XML)
select person_id,phone_home,Split.persons.value('.', 'VARCHAR(10)')
from persons cross apply @xml.nodes('/n') as Split(persons)
where charindex(Split.persons.value('.', 'VARCHAR(10)'),phone_Home) > 0