What can I use instead of LIKE in my queries?
LIKE can have a poor performance impact when used as a search condition.
My scenario:
SELECT
*
FROM
MetaDataTag
WHERE
Name LIKE 'SUN RCC%'
I can't use CONTAINS because a full-text index is required on the table, which I am not opting for.
Suggestions will be very helpful.
In your particular case, LIKE is not a bad option because you are only using a postfix wildcard. Your query can actually use an index on the Name column.
Have a look at this visual explanation of why it works: http://use-the-index-luke.com/sql/where-clause/searching-for-ranges/like-performance-tuning
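For example, a minimal sketch (the index name is hypothetical and assumes no suitable index already exists): a plain nonclustered index on Name lets the query seek on the 'SUN RCC' prefix instead of scanning the whole table.
CREATE INDEX IX_MetaDataTag_Name   -- hypothetical index name
ON MetaDataTag (Name);

SELECT *
FROM MetaDataTag
WHERE Name LIKE 'SUN RCC%'  -- trailing wildcard only, so the index can be used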
Try this:
SELECT
*
FROM
MetaDataTag
WHERE PATINDEX('SUN RCC%', Name) != 0
I have a table called logs.lognotes, and I want to find a faster way to search for customers who do not have a specific word or phrase in the note. I know I can use NOT LIKE, but my question is: can you use DOES NOT CONTAINS to replace NOT LIKE, in the same way you can use:
SELECT *
FROM table
WHERE CONTAINS(column, 'searchword')
Yes, you should be able to use NOT on any boolean expression, as mentioned in the SQL Server Docs here. And, it would look something like this:
SELECT *
FROM table
WHERE NOT CONTAINS(column, 'searchword')
to search for records that do not contain 'searchword' in the column. And, according to
Performance of like '%Query%' vs full text search CONTAINS query
this method should be faster than using LIKE with wildcards.
You can also simply use this:
select * from tablename where not(columnname like '%value%')
I'm trying to improve the performance on this query. What are some things that I can do to optimize it?
SELECT * FROM bldng
WHERE bldng_type LIKE '%PTR%' OR bldng_type LIKE '%FACILITY-A%'
OR bldng_type LIKE '%FACILITY-B%' AND area_sqf > 500
Since you are using '%search string%', with percent signs on both ends, the query cannot use an index on that column. It would be better to use a full-text index instead. If you want real performance and accuracy, you can use the Lucene or Sphinx full-text search engines.
It looks like you're trying to select out different categories of buildings. I would suggest putting a 'bldng_category' column on the bldng table. This way you can use simple queries like this:
SELECT * FROM bldng
WHERE bldng_category IN ('PTR', 'FACILITY-A', 'FACILITY-B') AND area_sqf > 500
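A rough sketch of how the new column could be added and backfilled, assuming the existing bldng_type values embed the category text (the column size, index name, and CASE mapping are illustrative):
ALTER TABLE bldng ADD bldng_category varchar(20) NULL;

-- One-time backfill derived from the existing bldng_type values
UPDATE bldng
SET bldng_category = CASE
    WHEN bldng_type LIKE '%PTR%'        THEN 'PTR'
    WHEN bldng_type LIKE '%FACILITY-A%' THEN 'FACILITY-A'
    WHEN bldng_type LIKE '%FACILITY-B%' THEN 'FACILITY-B'
END;

-- Index that supports the equality filter plus the area predicate
CREATE INDEX IX_bldng_category_area ON bldng (bldng_category, area_sqf);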
Due to the wildcard prefix, you're going to get a full scan of bldng; LIKE '%text%' is inherently slow. If you can remove the wildcard prefix and add an index on bldng_type, your performance should increase greatly.
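As a sketch, assuming the leading wildcards really can be dropped and that no index on bldng_type exists yet (the index name is made up):
CREATE INDEX IX_bldng_type ON bldng (bldng_type);  -- hypothetical index name

SELECT *
FROM bldng
WHERE (bldng_type LIKE 'PTR%'
    OR bldng_type LIKE 'FACILITY-A%'
    OR bldng_type LIKE 'FACILITY-B%')
  AND area_sqf > 500;  -- parentheses make the AND/OR precedence explicit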
If you are using SQL Server, you could use Table Hints (MSDN link). NOLOCK is common.
Indexing area_sqf and specifying only the columns you need rather than using * may help.
I am using a SQL Server database, and I am storing Malayalam names in my tables. I just want to write a query for filtering names using some Malayalam words.
SELECT * FROM table WHERE mal_Name LIKE '#word%'
#word contains Malayalam words.
How can I achieve this? Please share any ideas.
EDIT
This is my table; rm_malayalam_name contains the Malayalam name. My query is:
SELECT *
FROM Purchase.tblRawMaterials
WHERE rm_malayalam_name LIKE '%കദളിപഴം%'
It doesn't work. The entry is there, but when I execute this query nothing is returned.
Do you mean you want to do something like SELECT * FROM table WHERE mal_Name LIKE '%'+#word+'%'?
This will work if #word is a single word; however, if you want to take in multiple words you are going to need something a bit more complex.
UPDATE:
Having seen your new edits, I would suspect that the reason it is not selecting is the encoding.
Try SELECT * FROM table WHERE mal_Name LIKE N'%'+#word+N'%'
or
select * from Purchase.tblRawMaterials where rm_malayalam_name like N'%കദളിപഴം%'
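If the search word comes in through a variable, the same idea applies: declare it as nvarchar and keep the N prefixes so nothing is silently converted to a non-Unicode type. A minimal sketch (@word is just an illustrative variable name):
DECLARE @word nvarchar(100) = N'കദളിപഴം';  -- nvarchar keeps the Malayalam text intact

SELECT *
FROM Purchase.tblRawMaterials
WHERE rm_malayalam_name LIKE N'%' + @word + N'%';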
I'd suggest reading up on SQL Server's full-text search. It will almost certainly be more efficient than LIKE '%word%'.
How many words will you be searching on? How big is the column? These factors might influence what the best option is.
I need to filter records based on some text matching in an nvarchar(1000) column.
The table has more than 400 thousand records and is growing. For now, I am using a LIKE condition:
SELECT
*
FROM
table_01
WHERE
Text like '%A1%'
OR Text like '%B1%'
OR Text like '%C1%'
OR Text like '%D1%'
Is there any preferred work around?
SELECT
*
FROM
table_01
WHERE
Text like '%[A-Z]1%'
This will check whether the text contains a letter followed by 1 (A1, B1, C1, D1, and so on).
Reference to using the Like Condition in SQL Server
You can try the following if you know the exact position of your substring:
SELECT
*
FROM
table_01
WHERE
SUBSTRING(Text, 1, 2) IN ('A1', 'B1', 'C1', 'D1')
Have a look at LIKE on MSDN.
You could reduce the number of filters by combining more of the detail into a single LIKE pattern.
SELECT
*
FROM
table_01
WHERE
Text like '%[ABCD]1%'
If you can create a FULLTEXT INDEX on that column of your table (which warrants some research into the performance and space trade-offs), then you will probably see a big improvement in text-matching performance. You can go to this link to see what FULLTEXT SEARCH is
and this link to see how to create a FULLTEXT INDEX.
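A minimal sketch of what that setup could look like; the catalog name and key-index name are hypothetical, a unique single-column non-nullable key index must already exist on the table, and the CONTAINS rewrite assumes the target values appear as whole words in the text:
CREATE FULLTEXT CATALOG ftCatalog AS DEFAULT;

CREATE FULLTEXT INDEX ON dbo.table_01 (Text)
    KEY INDEX PK_table_01;  -- existing unique index on the table's key column

-- The OR'ed LIKE filters could then become something like:
SELECT *
FROM table_01
WHERE CONTAINS(Text, '"A1" OR "B1" OR "C1" OR "D1"');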
I needed to do this so that I could allow two different databases in a filter for the DatabaseName column in an SQL Server Profiler Trace Template.
All you can do is fill in the body of a Like clause.
Using the reference in John Hartscock's answer, I found out that the LIKE clause uses a sort of limited regex pattern.
For the OP's scenario, MSMS has the solution.
Assuming I want databases ABCOne, ABCTwo, and ABCThree, I come up with what is essentially independent whitelists for each character:
Like ABC[OTT][NWH][EOR]%
Which is easily extensible to any set of strings. It won't be ironclad: that last pattern would also match ABCOwe, ABCTnr, or ABCOneHippopotamus, but if you're filtering a limited set of possible values there's a good chance you can make it work.
You could alternatively use the [^] operator to present a blacklist of unacceptable characters.
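For example (the excluded characters are purely illustrative), a blacklist pattern in the same filter-body style would match any ABC-prefixed name whose fourth character is not X or Y:
Like ABC[^XY]%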
I was curious since I read it in a doc. Does writing
select * from CONTACTS where id = '098' and name like 'Tom%';
speed up the query as opposed to
select * from CONTACTS where name like 'Tom%' and id = '098';
The first has an indexed column on the left side. Does it actually speed things up, or is it superstition?
Using PHP and MySQL.
Check the query plans with EXPLAIN. They should be exactly the same.
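For example, a quick sanity check in MySQL (a sketch, assuming the CONTACTS table from the question):
EXPLAIN select * from CONTACTS where id = '098' and name like 'Tom%';
EXPLAIN select * from CONTACTS where name like 'Tom%' and id = '098';
-- Both should report the same key and row estimates, showing the optimizer treats them identically.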
This is purely superstition. I see no reason that either query would differ in speed. If it were an OR query rather than an AND query, however, then I could see that having it on the left might speed things up.
Interesting question; I tried this once. The query plans are the same (using EXPLAIN).
But considering short-circuit evaluation, I was wondering too why there is no difference (or does MySQL fully evaluate boolean statements?).
You may be misremembering or misreading something else regarding which side the wildcard is on in the string literal of a LIKE predicate. Putting the wildcard on the right (as in your example) allows the query engine to use any indexes that might exist on the table column you are searching (in this case, name). But if you put the wildcard on the left,
select * from CONTACTS where name like '%Tom' and id = '098';
then the engine cannot use any existing index and must do a complete table scan.