SQL: Checking for a Substring in a String

I have a table with a column mapping which stores records like: "IV=>0J,IV=>0Q,IV=>2,V=>0H,V=>0K,VI=>0R,VI=>1,"
What is the SQL to check whether or not a substring is in the column mapping?
So, I would like this:
if I have "IV=>0J", it would return true, because IV=>0J appears exactly in the mapping string
if I have "IV=>01", it would return false. And so on...
I tried this:
SELECT * FROM table WHERE charindex('IV=>0J',mapping)
But when I have "IV=>0", it returns TRUE, even though it should return FALSE.
Thank You..

You can search with the commas included. Just add one at the beginning and end of mapping as well:
SELECT * FROM table WHERE charindex(',IV=>0J,',',' + mapping + ',') <> 0
or
SELECT * FROM table WHERE ',' + mapping + ',' LIKE '%,IV=>0J,%'
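To see why the delimiters matter, here is a quick illustration (not from the original answer) using the sample value from the question:
-- Without delimiters, 'IV=>0' matches because it is a prefix of 'IV=>0J':
SELECT charindex('IV=>0', 'IV=>0J,IV=>0Q,IV=>2,V=>0H,V=>0K,VI=>0R,VI=>1,');  -- returns 1
-- With commas around both the search term and the column value, only complete entries match:
SELECT charindex(',IV=>0,', ',' + 'IV=>0J,IV=>0Q,IV=>2,V=>0H,V=>0K,VI=>0R,VI=>1,' + ',');  -- returns 0
-- (searching for ',IV=>0J,' the same way would return a nonzero position)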

This should do the trick:
SELECT * FROM table
WHERE
mapping LIKE '%,IV=>0J,%'
OR mapping LIKE '%,IV=>0J'
OR mapping LIKE 'IV=>0J,%'
OR mapping = 'IV=>0J'
But you should really normalize the database - you are currently violating the principle of atomicity, and therefore 1NF. Your current difficulties in querying, and the future difficulties with performance that you are about to encounter, all stem from this root problem...

While you can search by including the commas in the string, this is a bad design for several reasons.
You are unable to take advantage of indexing.
You force a full scan of the table, which will lead to bad performance AND excessive blocking.
You have to make sure that there is always a leading or a trailing comma (depending on what you expect in your LIKE expression).
You are no longer able to edit a single entry; you have to replace the entire string each time you want to change even a single mapping.
You open yourself up to a concurrency nightmare if more than one user tries to update different mappings that just happen to be stored in the same column.
Your table isn't even in first normal form any more, which is why you are having such difficulties.
You should normalize your mapping column by extracting the data to a separate mapping table, with at least the From and To columns you require. You can then add these columns to an index and convert your query to a single index seek.
You can also add the ID values of your source table to the Mappings table and the index. This will allow you to convert the lookup for a source row into a join between the two tables that takes advantage of indexing.
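Purely as an illustration of that advice (the table and column names below are assumptions, not from the original posts), the normalized layout might look like this:
-- One row per mapping entry instead of one delimited string.
CREATE TABLE Mapping (
    SourceId INT         NOT NULL,  -- ID of the row that used to hold the mapping string
    MapFrom  VARCHAR(10) NOT NULL,  -- e.g. 'IV'
    MapTo    VARCHAR(10) NOT NULL,  -- e.g. '0J'
    CONSTRAINT PK_Mapping PRIMARY KEY (SourceId, MapFrom, MapTo)
);
-- Index that supports looking up rows by a mapping value.
CREATE INDEX IX_Mapping_FromTo ON Mapping (MapFrom, MapTo, SourceId);
-- The original substring check then becomes an index seek:
SELECT SourceId
FROM Mapping
WHERE MapFrom = 'IV'
  AND MapTo   = '0J';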

charindex returns the position of the text, not a Boolean.
To check whether the text exists, compare it to 0:
SELECT * FROM table WHERE charindex('IV=>0J',mapping) <> 0

I think you're missing something here: the charindex function does not return TRUE or FALSE.
It returns the starting position of the substring inside the master string, or 0 if the substring is not present.
So your query should read:
SELECT * FROM table WHERE charindex('IV=>0J',mapping) > 0


Oracle REGEXP_LIKE function to use index table scan?

I am using the regexp_like function to search for specific patterns in a column. But I see that this query is not using the index created on the column and is doing a full table scan instead. Is there any option to create a function-based index for regexp_like so that my query will use it? Here, the pattern SV4889 is not a constant expression; it will vary every time.
select * from test where regexp_like(id,'SV4889')
Yup. Regular expressions do not use indexes. What can you do?
Well, if you are just looking for equality, then use equality:
where id = 'SV4889'
This will use an index on (id).
If you are looking for a leading value, then use like:
where id like 'SV4889%'
This will use an index because the wildcard is at the end of the pattern.
If you are storing multiple values in the column, say 'SV4889,SV4890' then fix your data model. It is broken! You should have another table with one row per id.
Finally, if you really need more sophisticated full text capabilities, then look into Oracle's support for full text indexes. However, such capabilities are usually not needed on a column called id.
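For completeness, a hedged sketch of that full-text route (this assumes Oracle Text is available; the index name is made up, and a CONTEXT index has to be synchronized after DML):
-- Create an Oracle Text CONTEXT index on the column...
CREATE INDEX idx_test_id_txt ON test (id)
  INDEXTYPE IS CTXSYS.CONTEXT;
-- ...then query with CONTAINS, which can use that index:
SELECT *
FROM test
WHERE CONTAINS(id, 'SV4889') > 0;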
You can add a virtual column to your table to determine if the substring you're interested in exists in the field, then index the virtual column. For example:
ALTER TABLE TEST
ADD SV4889_FLAG CHAR(1)
GENERATED ALWAYS AS (CASE
WHEN REGEXP_LIKE(ID,'SV4889') THEN 'Y'
ELSE 'N'
END) VIRTUAL;
This adds a field named SV4889_FLAG to your table which will contain Y if the text SV4889 exists in the ID field, and N if it doesn't. Then you can create an index on the new field:
CREATE INDEX IDX_TEST_SV4889_FLAG
ON TEST (SV4889_FLAG);
So to determine if a row has 'SV4889' in it you can use a query such as:
SELECT *
FROM TEST
WHERE SV4889_FLAG = 'Y'

SqlQuery and SqlFieldsQuery

It looks like SqlQuery only supports SQL that starts with select *? Doesn't it support other SQL that selects only some columns, like
select id, name from person, and map the columns to the corresponding POJO?
If I use SqlFieldsQuery to run SQL, the result is a QueryCursor of Lists (each List contains one record of the result). But if the SQL starts with select *, then the List's contents are different from a field query like:
select id,name,age from person
For select *, each List is constructed from 3 parts:
the first element is the key of the cache
the second element is the POJO object that contains the data
the trailing elements are the values for each column.
Why was it designed this way? If I don't know what SQL the SqlFieldsQuery runs, then I need additional effort to figure out what the List contains.
SqlQuery returns key and value objects, while SqlFieldsQuery allows to select specific fields. Which one to use depends on your use case.
Currently select * indeed includes the predefined _key and _val fields, and this will be improved in the future. However, it's generally good practice to list the fields you want to fetch when running SQL queries (this is true for any SQL database, not only Ignite). This way your code is protected from unexpected behavior if, for example, the schema changes.

SQL - just view the description for explanation

I would like to ask if it is possible to do this:
For example, the search string is '009' (consider the digits as a string).
Is it possible to have a query that will return any occurrence of this in the database, not considering the order?
For this example it would return
'009'
'090'
'900'
given these exist in the database. Thanks!
Use the LIKE operator.
For example:
SELECT Marks FROM Report WHERE Marks LIKE '%009%' OR Marks LIKE '%090%' OR Marks LIKE '%900%'
Split the string into individual characters, select all rows containing the first character and put them in a temporary table, then select all rows from the temporary table that contain the second character and put these in a temporary table, then select all rows from that temporary table that contain the third character.
Of course, there are probably many ways to optimize this, but I see no reason why it would not be possible to make a query like that work.
It cannot be achieved in a straightforward way, as there is no sort() function for a value the way there are lower() and upper() functions.
But there are workarounds, for example:
Suppose you are running the query against COL A. Maintain another column SORTED_A where, at the application level, you keep the sorted value of COL A.
Then, when you execute the query, sort the search token and run a select that matches the sorted search token against the SORTED_A column.
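A minimal sketch of that workaround, assuming a table named Report with columns COL_A and SORTED_A (the names are illustrative, not from the original post):
-- SORTED_A holds the characters of COL_A sorted by the application
-- before the row is written ('090' and '900' are both stored as '009').
CREATE TABLE Report (
    COL_A    VARCHAR(10) NOT NULL,
    SORTED_A VARCHAR(10) NOT NULL
);
-- To find every permutation of '009', sort the search token the same way
-- in application code and compare it with SORTED_A:
SELECT COL_A
FROM Report
WHERE SORTED_A = '009';   -- matches '009', '090' and '900'
An index on SORTED_A then turns this into an index seek.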

Easy way to transfer/update a list of numbers to be used in the SQL 'in' command?

I'm always being given a large list of, say, IDs which I need to search for in our database. I have been manually putting them into a SQL statement like the following, which can take a while (putting single quotes around each number, followed by a comma). I was hoping someone has an easy way of doing this for me? Or am I just being a bit lazy...
select * from blah where idblah in ('1234-A', '1235-A', '1236-A' ................)
You can use the world's simplest code generator.
Just paste in the list of values, set up the pattern and voila... you have a set of quoted values.
I have also used Excel in the past, using the CONCAT function with smart paste.
I would set aside a table to hold the values and have my queries JOIN against that table. Set up a simple import script (don't forget to clear out the table at the start) and something like this is a breeze. Run the import, run the query. You never have to touch the query again or regenerate any code.
As an example:
CREATE TABLE Search_ID_List (
id VARCHAR(20) NOT NULL,
CONSTRAINT PK_Search_ID_List PRIMARY KEY CLUSTERED (id)
)
and:
SELECT
<column list>
FROM
Search_ID_List SIL
INNER JOIN Blah B ON
B.id = SIL.id
If you want to be able to save past search criteria or have multiple searches available to you at the same time then you can just add an identifying column which gets filled in by your import. It can be the file from where the ids came, some descriptive code/name, or whatever. Then just add that to the WHERE clause of your query and you're all set.
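A rough sketch of what the import step could look like against the Search_ID_List table above (the values shown are just the IDs from the question):
-- Clear out the previous search list...
TRUNCATE TABLE Search_ID_List;
-- ...then load the new set of IDs to search for.
INSERT INTO Search_ID_List (id)
VALUES ('1234-A'), ('1235-A'), ('1236-A');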
You could do something like this:
select * from blah where ',' + '1234-A,1235-A,1236-A' + ',' LIKE '%,' + idblah + ',%'
This pattern is super useful when you're passed a comma-delimited list of values to filter by, but I think it is applicable here as well.

Use a LIKE statement on SQL Server XML Datatype

If you have a varchar field you can easily do SELECT * FROM TABLE WHERE ColumnA LIKE '%Test%' to see if that column contains a certain string.
How do you do that for XML Type?
I have the following, which returns only rows that have a 'Text' node, but I need to search within that node:
select * from WebPageContent where data.exist('/PageContent/Text') = 1
Yet another option is to cast the XML as nvarchar and then search for the given string as if the XML were an nvarchar field.
SELECT *
FROM Table
WHERE CAST(Column as nvarchar(max)) LIKE '%TEST%'
I love this solution as it is clean, easy to remember, hard to mess up, and can be used as part of a WHERE clause.
This might not be the best-performing solution, so think twice before deploying it to production. It is, however, very useful for a quick debug session, which is where I mostly use it.
EDIT: As Cliff mentions, use nvarchar if there are characters that don't convert to varchar.
You should be able to do this quite easily:
SELECT *
FROM WebPageContent
WHERE data.value('(/PageContent/Text)[1]', 'varchar(100)') LIKE 'XYZ%'
The .value method gives you the actual value, and you can define that to be returned as a VARCHAR(), which you can then check with a LIKE statement.
Mind you, this isn't going to be awfully fast. So if you have certain fields in your XML that you need to inspect a lot, you could:
create a stored function which gets the XML and returns the value you're looking for as a VARCHAR()
define a new computed field on your table which calls this function, and make it a PERSISTED column
With this, you'd basically "extract" a certain portion of the XML into a computed field, make it persisted, and then you can search very efficiently on it (heck: you can even INDEX that field!).
Marc
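A hedged sketch of that approach, reusing the WebPageContent table and XPath from the question (the function and column names are illustrative, and persisting/indexing requires the function to be deterministic and schema-bound):
-- XML data type methods can't appear directly in a computed column
-- definition, so wrap the extraction in a schema-bound scalar function.
CREATE FUNCTION dbo.GetPageText (@data XML)
RETURNS VARCHAR(100)
WITH SCHEMABINDING
AS
BEGIN
    RETURN @data.value('(/PageContent/Text)[1]', 'VARCHAR(100)')
END
GO
-- Promote the extracted value into a persisted computed column...
ALTER TABLE WebPageContent
    ADD PageText AS dbo.GetPageText(data) PERSISTED;
-- ...and index it so searches on it can seek instead of shredding the XML.
CREATE INDEX IX_WebPageContent_PageText ON WebPageContent (PageText);
-- Example search against the promoted column:
SELECT * FROM WebPageContent WHERE PageText LIKE 'XYZ%';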
Another option is to search the XML as a string by converting it to a string and then using LIKE. However, since the converted value's column alias can't be referenced in the WHERE clause of the same query, you need to wrap it in a derived table like this:
SELECT * FROM
(SELECT *, CONVERT(varchar(MAX), [COLUMNA]) as [XMLDataString] FROM TABLE) x
WHERE [XMLDataString] like '%Test%'
This is what I am going to use, based on marc_s's answer:
SELECT
    SUBSTRING(DATA.value('(/PAGECONTENT/TEXT)[1]', 'VARCHAR(100)'),
              PATINDEX('%NORTH%', DATA.value('(/PAGECONTENT/TEXT)[1]', 'VARCHAR(100)')) - 20,
              999)
FROM WEBPAGECONTENT
WHERE COALESCE(PATINDEX('%NORTH%', DATA.value('(/PAGECONTENT/TEXT)[1]', 'VARCHAR(100)')), 0) > 0
This returns a substring around the position where the search term occurs.