Hello, I want to display results from unrelated tables where a text string exists in a column that is common to all tables in the database.
I can get the desired result with this:
SELECT *
FROM Table1
WHERE Title LIKE '%Text%'
UNION
SELECT *
FROM Table2
WHERE Title LIKE '%Text%'
However, my question is: is there a more efficient way to go about this, as I need to search dozens of tables? Thanks for any help you can give!
PS: the system I am using supports most dialects, but I would prefer to keep it simple with SQL Server as that is what I am used to.
There is a stored procedure script you can find online called SearchAllTables (http://vyaskn.tripod.com/search_all_columns_in_all_tables.htm).
When you call it, pass in the string; it will return the matching tables and columns as well as the full string.
You can modify it to work with other data types quite easily. It's a fantastic resource for tasks exactly like yours.
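If the procedure has been installed under its default name, usage is roughly this (a sketch only, not tested here):
EXEC SearchAllTables 'Text'
It should list every table and column in the current database whose character data contains the search string.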
Thank you for checking my question out!
I'm trying to write a query for a very specific problem we're having at my workplace and I can't seem to get my head around it.
Short version: I need to be able to target columns by their name, and more specifically by a part of their name that will be consistent throughout all the columns I need to combine or compare.
More details:
We have (for example) 5 different surveys. They each have many questions, but SOME of the questions are part of the same metric, and we need to create a generic field that captures it. There's more background to the "why" of that, but it's pretty important for us at this point.
We were able to more or less solve this with either COALESCE() or CASE statements, but the challenge is that, as the number of surveys/survey versions continues to grow, our vendor inevitably generates new columns for each survey and its questions.
Take this example, which is what we do currently and works well enough:
CASE
    WHEN SURVEY_NAME = 'Service1'  THEN SERV1_REC
    WHEN SURVEY_NAME = 'Notice1'   THEN FNOL1_REC
    WHEN SURVEY_NAME = 'Status1'   THEN STAT1_REC
    WHEN SURVEY_NAME = 'Sales1'    THEN SALE1_REC
    WHEN SURVEY_NAME = 'Transfer1' THEN NULL
    ELSE NULL
END AS REC
And also this alternative which works well:
COALESCE(SERV1_REC, FNOL1_REC, STAT1_REC, SALE1_REC) as REC
But as I mentioned, eventually we will have a "SALE2_REC", for example, and we'll need them BOTH in this same statement. I want to create something where we don't have to come into the SQL and make changes. Given that the columns will ALWAYS be named "something#_REC" for this specific metric, is there any way to achieve something like:
COALESCE(all columns named LIKE '%_REC') as REC
Bonus! Related, and possibly another way around the same problem:
Would there also be a way to achieve this?
SELECT (columns named LIKE '%_REC') FROM ...
Thank you very much in advance for all your time and attention.
-Kendall
Table and column information in Db2 is managed in the system catalog. The relevant views are SYSCAT.TABLES and SYSCAT.COLUMNS. You could write:
select colname, tabname
from syscat.columns
where colname like some_expression
  and tabname = 'MYTABLE'
Note that the LIKE predicate supports expressions based on a variable or the result of a scalar function. So you could match it against some dynamic input.
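For the _REC case, a rough sketch (the schema and table names are placeholders) that uses the catalog to generate the COALESCE expression could look like this:
select 'COALESCE(' || listagg(colname, ', ') || ') AS REC'
from syscat.columns
where tabschema = 'MYSCHEMA'
  and tabname   = 'MYTABLE'
  and colname like '%\_REC' escape '\'
You would then execute the generated expression as part of a dynamically prepared statement.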
Have you considered storing the more complicated properties in JSON or XML values? Db2 supports both and you can query those values with regular SQL statements.
I am using a SQL Server database, and I am storing Malayalam names in my tables. I just want to write a query for filtering names using some Malayalam words.
SELECT * FROM table WHERE mal_Name LIKE '#word%'
#word contains Malayalam words.
How can I achieve this? Please share any ideas.
EDIT
This is my table; rm_Malayalam_name contains the Malayalam name. My query is:
SELECT *
FROM Purchase.tblRawMaterials
WHERE rm_malayalam_name LIKE '%കദളിപഴം%'
It doesn't work. The entry is there, but when I execute this query nothing is returned.
Do you mean you want to do something like SELECT * FROM table WHERE mal_Name LIKE '%'+#word+'%' ?
This will work if #word is a single word; however, if you want to take in multiple words you are going to need something a bit more complex.
UPDATE:
Having seen your new edits, I suspect that the reason it is not selecting is the encoding.
Try SELECT * FROM table WHERE mal_Name LIKE N'%'+#word+'%'
or
select * from Purchase.tblRawMaterials where rm_malayalam_name like N'%കദളിപഴം%'
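If the word is coming in as a variable, a minimal sketch (the @word name is just an example) would be:
DECLARE @word NVARCHAR(100) = N'കദളിപഴം';

SELECT *
FROM Purchase.tblRawMaterials
WHERE rm_malayalam_name LIKE N'%' + @word + N'%';
The important part is that both the variable and the literals are Unicode (NVARCHAR / N'...'); otherwise the Malayalam characters are converted before the comparison and nothing matches.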
I'd suggest reading up on SQL Server's full-text search. It will almost certainly be more efficient than LIKE '%word%'.
How many words will you be searching on? How big is the column? These factors might influence what the best option is.
I'm building an abstract gem. I need a SQL query that looks like this:
SELECT * FROM my_table WHERE * LIKE '%my_search%'
Is that possible?
Edit:
I don't care about query performance because it's a feature of an admin panel which is used once a month. I also don't know what columns the table has, because it's so abstract. Sure, I could use some Rails ActiveRecord functions to find all the columns, but I hoped to avoid adding that logic and just use the *. It's going to be a gem, and I can't know what DB is going to be used with it. Maybe there is a sexy Rails function that helps me out here.
As I understand the question, basically you are trying to build a SQL statement which should check for a condition across all columns in that table. It's a dirty hack, but this generates the required SQL.
condition_string = MyTable.column_names.map { |name| "#{name} LIKE ?" }.join(' OR ')
MyTable.all(:conditions => [condition_string, *(['%my_search%'] * MyTable.column_names.size)])
However, this is not tested. This might work.
* LIKE '...' isn't valid according to the SQL standards, and not supported by any RDBMS I'm aware of. You could try using a function like CONCAT to make the left argument of LIKE, though performance won't be good. As for SELECT *, it's generally something to be avoided.
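For illustration only (the column names here are made up), that could look like:
SELECT * FROM my_table WHERE CONCAT(col1, ' ', col2, ' ', col3) LIKE '%my_search%'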
No, SQL does not support that syntax.
To search all columns you need to use procedures or dynamic SQL. Here's another SO question which may help:
SQL: search for a string in every varchar column in a database
EDIT: Sorry, the question I linked to is looking for a field name, not the data, but it might help you write some dynamic SQL to build the query you need.
You didn't say which database you are using, as there might be a vendor specific solution.
It's only an idea, but I think it's worth testing!
It depends on your DB. You can get all columns of a table; in MSSQL, for example, you can use something like:
select name from syscolumns where id=object_id('Tablename')
Under Oracle I guess it's like:
select column_name from USER_TAB_COLUMNS where TABLE_NAME = 'Tablename'
Then you will have to go through these columns using a procedure and maybe a cursor, so that for each column you can check whether the data you're searching for is in there:
if ((select count(*) from Tablename where Colname = 'searchingdata') > 0)
then keep the results in a separate table (ColnameWhereFound, RecNrWhereFound).
Data type may be an issue if you try to compare strings with numbers, but notice that under SQL Server, for instance, the syscolumns table contains a column called "usertype" whose numeric value indicates the data type stored in the column; I guess Oracle has something similar too.
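As a rough sketch only (this assumes SQL Server, and the table name, search string, and object names are all placeholders):
DECLARE @col SYSNAME, @sql NVARCHAR(MAX);
CREATE TABLE #Found (ColnameWhereFound SYSNAME, HitCount INT);

DECLARE col_cursor CURSOR FOR
    SELECT name FROM syscolumns WHERE id = OBJECT_ID('Tablename');

OPEN col_cursor;
FETCH NEXT FROM col_cursor INTO @col;
WHILE @@FETCH_STATUS = 0
BEGIN
    -- cast every column to a string so numeric columns do not break the comparison
    SET @sql = N'INSERT INTO #Found
                 SELECT ''' + @col + N''', COUNT(*)
                 FROM Tablename
                 WHERE CAST(' + QUOTENAME(@col) + N' AS NVARCHAR(MAX)) LIKE N''%searchingdata%''';
    EXEC sp_executesql @sql;
    FETCH NEXT FROM col_cursor INTO @col;
END;
CLOSE col_cursor;
DEALLOCATE col_cursor;

SELECT * FROM #Found WHERE HitCount > 0;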
Hope this helps.
I need to filter out records based on some text matching in an nvarchar(1000) column.
The table has more than 400 thousand records and is growing. For now, I am using a LIKE condition:
SELECT
*
FROM
table_01
WHERE
Text like '%A1%'
OR Text like '%B1%'
OR Text like '%C1%'
OR Text like '%D1%'
Is there any preferred workaround?
SELECT
*
FROM
table_01
WHERE
Text like '%[A-Z]1%'
This will check whether the text contains A1, B1, C1, D1, ...
Reference to using the Like Condition in SQL Server
You can try the following if you know the exact position of your substring:
SELECT
*
FROM
table_01
WHERE
SUBSTRING(Text,1,2) in ('B1','C1','D1')
Have a look at LIKE on MSDN.
You could reduce the number of filters by combining more details into a single LIKE clause.
SELECT
*
FROM
table_01
WHERE
Text like '%[ABCD]1%'
If you can create a FULLTEXT INDEX on that column of your table (that assumes a lot of research on performance and space), then you are probably going to see a big improvement in performance for text matching. You can go to this link to see what FULLTEXT SEARCH is
and this link to see how to create a FULLTEXT INDEX.
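A rough sketch of what that could look like (it assumes a full-text catalog and a unique key index already exist; all names are placeholders):
CREATE FULLTEXT INDEX ON table_01 ([Text])
    KEY INDEX PK_table_01
    ON MyFullTextCatalog;

SELECT *
FROM table_01
WHERE CONTAINS([Text], '"A1" OR "B1" OR "C1" OR "D1"');
Keep in mind that full-text search matches whole words and prefixes rather than arbitrary substrings, so it is not a drop-in replacement for LIKE '%A1%'.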
I needed to do this so that I could allow two different databases in a filter for the DatabaseName column in an SQL Server Profiler Trace Template.
All you can do is fill in the body of a Like clause.
Using the reference in John Hartscock's answer, I found out that the like clause uses a sort of limited regex pattern.
For the OP's scenario, MSMS has the solution.
Assuming I want databases ABCOne, ABCTwo, and ABCThree, I come up with what are essentially independent whitelists for each character:
Like ABC[OTT][NWH][EOR]%
This is easily extensible to any set of strings. It won't be ironclad: that pattern would also match ABCOwe, ABCTnr, or ABCOneHippopotamus, but if you're filtering a limited set of possible values there's a good chance you can make it work.
You could alternatively use the [^] operator to present a blacklist of unacceptable characters.
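For example (purely an illustration), the pattern below would match names whose fourth character is anything except T, keeping ABCOne while filtering out ABCTwo and ABCThree:
Like ABC[^T]%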
I am wondering how others would handle a scenario such as this:
Say I have multiple choices for a user to choose from.
Like Color, Size, Make, Model, etc.
What is the best solution or practice for handling the build of your query for this scenario?
So what if they select 6 of the 8 possible colors, 4 of the 7 possible makes, and 8 of the 12 possible brands?
You could do dynamic OR statements or dynamic IN statements, but I am trying to figure out whether there is a better solution for handling this "WHERE" criteria-type logic.
EDIT:
I am getting some really good feedback (thanks everyone)... one other thing to note is that some of the selections could be quite large (e.g., 40 selections out of a possible 46). Thanks again!
Thanks,
S
What I would suggest is creating a function that takes in a delimited list of makeIds, colorIds, etc. (probably ints, or whatever your key is) and splits them into a table for you.
Your SP will take in a list of makes, colors, etc as you've said above.
YourSP '1,4,7,11', '1,6,7', '6'....
Inside your SP you'll call your splitting function, which will return a table:
SELECT *
FROM Cars C
JOIN YourFunction(@models) YF ON YF.Id = C.ModelId
JOIN YourFunction(@colors) YF2 ON YF2.Id = C.ColorId
Then, if they select nothing they get nothing. If they select everything, they'll get everything.
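In case it's useful, here is a rough sketch of what such a splitting function could look like (the dbo.SplitIds name is just an example; on SQL Server 2016+ the built-in STRING_SPLIT can do this for you):
CREATE FUNCTION dbo.SplitIds (@list VARCHAR(MAX))
RETURNS @ids TABLE (Id INT)
AS
BEGIN
    -- walk the comma-delimited list and insert each value as an INT row
    DECLARE @pos INT = CHARINDEX(',', @list);
    WHILE @pos > 0
    BEGIN
        INSERT INTO @ids (Id) VALUES (CAST(LEFT(@list, @pos - 1) AS INT));
        SET @list = SUBSTRING(@list, @pos + 1, LEN(@list));
        SET @pos = CHARINDEX(',', @list);
    END;
    IF LEN(@list) > 0
        INSERT INTO @ids (Id) VALUES (CAST(@list AS INT));
    RETURN;
END;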
What is the best solution or practice for handling the build of your query for this scenario?
Dynamic SQL.
A single parameter represents two states: NULL/non-existent, or having a value. Each additional parameter doubles the total number of possibilities: 2 parameters yield 4, 3 yield 8, and so on. A single, non-dynamic query can contain all the possibilities but will perform horribly, given the combination of:
ORs
overall non-sargability
and inability to reuse the query plan
...when compared to a dynamic SQL query that constructs the query out of only the absolutely necessary parts.
The query plan is cached in SQL Server 2005+ if you use the sp_executesql command; it is not if you only use EXEC.
I highly recommend reading The Curse and Blessing of Dynamic SQL.
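A minimal sketch of the idea, assuming the procedure receives optional parameters such as @ColorId and @MakeId (the names are illustrative):
DECLARE @sql NVARCHAR(MAX) = N'SELECT * FROM Cars WHERE 1 = 1';

IF @ColorId IS NOT NULL
    SET @sql = @sql + N' AND ColorId = @ColorId';
IF @MakeId IS NOT NULL
    SET @sql = @sql + N' AND MakeId = @MakeId';

-- parameters are passed through so the plan for each variant can be reused
EXEC sp_executesql @sql,
    N'@ColorId INT, @MakeId INT',
    @ColorId = @ColorId, @MakeId = @MakeId;
For multi-select lists you would splice in an IN (...) list or join to a split function in the same way, still only adding the pieces that are actually needed.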
For something this complex, you may want a session table that you update when the user selects their criteria. Then you can join the session table to your items table.
This solution may not scale well to thousands of users, so be careful.
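As a rough sketch of that idea (table and column names are made up):
CREATE TABLE UserSelectedColors
(
    SessionId UNIQUEIDENTIFIER,
    ColorId   INT
);

-- @SessionId identifies the current user's saved selections
SELECT I.*
FROM Items I
JOIN UserSelectedColors S
    ON S.ColorId = I.ColorId
   AND S.SessionId = @SessionId;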
If you want to create dynamic SQL, it won't matter whether you use the OR approach or the IN approach. SQL Server will process the statements the same way (maybe with a little variation in some situations).
You may also consider using temp tables for this scenario. You can insert the selections for each criterion into temp tables (e.g., #tmpColor, #tmpSize, #tmpMake, etc.). Then you can create a non-dynamic SELECT statement. Something like the following may work:
SELECT <column list>
FROM MyTable
WHERE MyTable.ColorID in (SELECT ColorID FROM #tmpColor)
OR MyTable.SizeID in (SELECT SizeID FROM #tmpSize)
OR MyTable.MakeID in (SELECT MakeID FROM #tmpMake)
The dynamic OR/IN and the temp table solutions work fine if each condition is independent of the other conditions. However, if you need to select rows where ((Color is Red and Size is Medium) or (Color is Green and Size is Large)), you'll need to try other solutions.