I am using PostgreSQL and I would like to get a row or rows (depending on the query) by giving it a value and searching all of the available columns of my table.
How would I go about checking every column for a value? I am also looking at checking for different types of values.
If the columns are a and b and the text you are searching for is 'test' then something like this will do the trick:
select *
from table1
where 'test' in (a, b);
Note: this will only work if all your columns have the same datatype, and that datatype matches the value you are searching for.
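If the column types differ, a workaround (sketched here with PostgreSQL's cast syntax; pick a target type that fits your data) is to cast every column to text:
select *
from table1
where 'test' in (a::text, b::text);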
I'd like to insert NULL into one column as part of a "select into". In the example below, I try to copy columns a and b from table_1 to table_2, and in the same query, insert NULL into table_2.c
I've tried this:
INSERT INTO table_2 (a, b, c)
SELECT a, b, NULL
FROM table_1
But I get ERROR 42601 (syntax_error) syntax error at or near "NULL".
Appreciate any guidance on this.
Update: This question has no value and can be deleted. The syntax shown in the question actually works well and is likely the best approach.
Nonsense. Your statement is syntactically correct.
None of the suggestions here are necessary. The error you report should not occur.
Also, SELECT INTO (which you mentioned in error) is an unrelated command, and its use is discouraged in favor of CREATE TABLE AS.
Typically, you can just omit columns that should be NULL from the target list, unless a different default value is set for the column. It is bad style, though, to omit the target column list altogether (exceptions apply).
INSERT INTO table_2 (a, b)
SELECT a, b
FROM table_1;
If column table_2.c has no different DEFAULT value in the table definition, it defaults to NULL. To be precise:
How to use default value of data type as column default?
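For example, such a default could be attached to the column like this (a sketch; the value 'n/a' is made up):
ALTER TABLE table_2 ALTER COLUMN c SET DEFAULT 'n/a';
With that in place, the shortened INSERT above fills c with 'n/a' instead of NULL.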
The shortened form is fine; but there is nothing wrong with your original query either. In fact, it's the best way.
Assume I have a query that returns a result set of columns A and B from table First_Table. I want to limit the result set to those columns if the value of column X in table Second_Table is 0, and I want to add column C from table First_Table if the value of column X is 1.
The problem is easily resolved in Python, for example: I just set a variable to an empty string if the value in column X is 0, or to the string 'First_Table.ColumnC AS [Dynamic Value],' if it is 1, and format the SQL in the script accordingly.
An IF/ELSE solution is not an elegant way, because I have multiple columns to add dynamically depending on multiple values...
I am just looking for some ideas on direction. I have been looking at this for a while and might be bogged down.
Dynamic SQL is the best way to resolve this, as suggested in the comments.
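As a sketch of the idea (assuming PostgreSQL, since the question's dialect is not stated; the table and column names are the question's, and the single-row read of Second_Table is an assumption about how the flag is stored):
DO $$
DECLARE
    cols text := 'A, B';  -- columns that are always returned
    sql  text;
BEGIN
    -- append column C only when the (assumed single-row) flag in Second_Table is 1
    IF (SELECT X FROM Second_Table LIMIT 1) = 1 THEN
        cols := cols || ', C';
    END IF;
    sql := format('SELECT %s FROM First_Table', cols);
    RAISE NOTICE 'generated query: %', sql;  -- EXECUTE sql, or open a cursor with it
END $$;
The same pattern extends to any number of conditionally added columns: build the column list first, then format and run the statement once.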
I have a varchar field in my database table A; let's call it store_name. This field gets its value from entity A. Entity B enters store_name into a different database table B, and now I want to get all records in table A where store_name matches the values in table B.
How would you recommend writing the query, given that I don't control the values of those two fields?
What do you think about PostgreSQL fuzzystrmatch?
The tables contain thousands of records.
Thanks
Assuming that both table A and table B are in the same database: since you don't control the insertion of data, you can't be sure whether the values have the same case, or whether there may be spelling mismatches.
Case 1: If the problem is only of case-mismatch, you can use ilike:
select a.store_name
from a, b
where a.store_name ilike b.store_name;
Case 2: If you also want to catch spelling mismatches where the words sound similar, then after installing the postgresql-contrib package and creating the fuzzystrmatch extension, you can use:
select a.store_name
from a, b
where a.store_name ilike b.store_name
   or soundex(a.store_name) = soundex(b.store_name);
If you are dealing with names, which may not always be in English, it may be more appropriate to use the metaphone or dmetaphone function instead of soundex.
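For instance, with the same tables as above (this assumes the fuzzystrmatch extension has already been created):
select a.store_name
from a, b
where dmetaphone(a.store_name) = dmetaphone(b.store_name);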
Documentation: Fuzzystrmatch
If you want exact matching, you can use a straight-up join.
select a.store_name
from a
join b on a.store_name = b.store_name;
If you want fuzzy matching, just use the various functions available in the join criteria. Documentation here
Note: there are some limitations to fuzzy string matching, so I would advise testing each function on values that you know either match or don't.
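For example, a sketch using levenshtein from the same fuzzystrmatch extension (the distance threshold of 2 is an arbitrary assumption you would tune against your data):
select a.store_name
from a
join b on levenshtein(lower(a.store_name), lower(b.store_name)) <= 2;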
I am comparing two Oracle databases by grabbing random rows from database A and searching for those rows in database B based on their key columns. Then I compare the returned rows in Java.
I am using the following query to find rows in database B using the key columns from database A:
select * from mytable
Where (Key_Column_A,Key_Column_B,Key_Column_C)
in (('1','A', 'cat'),('2','B', 'dog'),('3','C', ''));
This works just fine for the first two sets of keys, but the third key ('3','C','') does not work because there is a NULL value in the third column (Oracle treats the empty string '' as NULL). Changing the statement to ('3','C', NULL), or changing the SQL to
select * from mytable
Where (Key_Column_A,Key_Column_B,Key_Column_C)
in (('1','A', 'cat'),('2','B', 'dog'),('3','C', ''))
OR (Key_Column_A,Key_Column_B,Key_Column_C) IS NULL;
will not work either.
Is there a way to include a NULL column in an IN clause? And if not, is there a way to do the same thing efficiently? (My only current solution is a check to make sure there are no nullable columns in my keys, which would make this process rather inefficient and somewhat messy.)
You can use NVL this way; I think it would work. Note that Oracle treats the empty string '' as NULL, so NVL(column, '') changes nothing; map NULLs to a sentinel value that can never occur in the real data instead:
select * from mytable
Where (NVL(Key_Column_A,'#'),NVL(Key_Column_B,'#'),NVL(Key_Column_C,'#'))
in (('1','A', 'cat'),('2','B', 'dog'),('3','C', '#'));
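An alternative sketch that avoids choosing a sentinel is to pull the NULL-containing key out of the IN list and match it explicitly:
select * from mytable
where (Key_Column_A, Key_Column_B, Key_Column_C) in (('1','A', 'cat'),('2','B', 'dog'))
   or (Key_Column_A = '3' and Key_Column_B = 'C' and Key_Column_C is null);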
I am not sure about (Key_Column_A,Key_Column_B,Key_Column_C) IS NULL, though. Wouldn't this imply that all of the columns (A, B, C) are NULL?
Does SQLite offer a way to search every column of a table for a searchkey?
SELECT * FROM table WHERE id LIKE ...
Selects all rows where ... was found in the column id. But instead of searching only the column id, I want to search every column for the search string. I believe this does not work:
SELECT * FROM table WHERE * LIKE ...
Is that possible? Or what would be the next easiest way?
I use Python 3 to query the SQLite database. Should I go the route of searching through the returned data after the query has been executed?
A simple trick you can do is:
SELECT *
FROM table
WHERE ((col1 || col2 || col3 || col4) LIKE '%something%')
This will select the record if any of these 4 columns contain the word "something".
No; you would have to list or concatenate every column in the query, or reorganize your database so that you have fewer columns.
SQLite has full-text search tables where you can search all columns at once, but such tables do not work efficiently with any other queries.
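For illustration, a minimal sketch of such a full-text table (assuming the FTS5 module is available; table and column names are made up):
CREATE VIRTUAL TABLE docs USING fts5(col1, col2, col3, col4);
SELECT * FROM docs WHERE docs MATCH 'something';
The MATCH operator searches all declared columns of the virtual table at once.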
I could not comment on @raging-bull's answer, so I had to write a new one. My problem was that I have columns with NULL values, and I got no results because the concatenated "search string" was NULL.
Using coalesce I could solve that problem: SQLite takes the column content, or an empty string ('') if it is NULL, so there is always an actual search string available.
SELECT *
FROM table
WHERE (coalesce(col1,'') || coalesce(col2,'') || coalesce(col3,'') || coalesce(col4,'')) LIKE '%something%'
I'm not quite sure if I understood your question.
If you want the whole row returned when id = searchkey, then:
select * from table where id=searchkey;
If you want to have specific columns from the row with the correct searchkey:
select col1, col2, col3 from table where id=searchkey;
If you want to search multiple columns for the searchkey: first narrow down which columns it could be found in (you don't want to search the whole table!), then:
select * from table where col1=searchkey or col2=searchkey or col3=searchkey;