Postgres query cannot find rows based on column value

I want to select rows based on a column value, and I know for a fact the value exists. The first query returns 100 rows from the listing table. The second query, which looks for listing.OriginatingSystemName = 'mfrmls', returns nothing. Why?
(Removing the quotes or using double quotes does not work.)
I am using pgAdmin4 to run these queries.
first query:
select * from listing limit 100;
second query:
select * from listing where 'listing.OriginatingSystemName' = 'mfrmls'
This produces a 'column does not exist' error:
select * from listing where OriginatingSystemName = 'mfrmls'

The correct syntax is to reference the column name itself in your WHERE clause, double-quoted because the identifier is case-sensitive:
SELECT * FROM listing WHERE "OriginatingSystemName" = 'mfrmls';
To elaborate further:
Your original query selects every row in the listing table where the text string 'listing.OriginatingSystemName' is equal to the other text string 'mfrmls'. It never actually reads the value from the column you want. No row in the table satisfies your WHERE clause because the comparison is always false, so no rows are returned even though the query succeeds.
We need the double quotes when dealing with case-sensitive identifiers; see the PostgreSQL documentation on identifiers for details.
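To make the failure mode concrete, here is a minimal sketch using the listing table and column from the question. PostgreSQL folds unquoted identifiers to lowercase, so the unquoted form looks up a column that does not exist:
-- Unquoted identifiers are folded to lowercase, so this looks for originatingsystemname:
SELECT * FROM listing WHERE OriginatingSystemName = 'mfrmls';   -- ERROR: column "originatingsystemname" does not exist
-- Double quotes preserve the mixed-case name the column was created with:
SELECT * FROM listing WHERE "OriginatingSystemName" = 'mfrmls'; -- matches as expected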

Related

How can you filter Snowflake EXPLAIN USING TABULAR output when it's embedded in the TABLE function? Can you filter it with anything?

I have a table named Posts that I would like to count and profile in Snowflake using the current Snowsight UI.
When I return the results via EXPLAIN USING TABULAR, I am able to return the set with a combination of the TABLE, RESULT_SCAN, and LAST_QUERY_ID functions, but any predicate, filter, or column reference seems to fail.
Is there a valid way to do this in Snowflake with the TABLE function, or is there another way to query the output of EXPLAIN USING TABULAR?
-- Works
EXPLAIN using TABULAR SELECT COUNT(*) from Posts;
-- Works
SELECT t.* FROM TABLE(RESULT_SCAN(LAST_QUERY_ID())) as t;
-- Does not work
SELECT t.* FROM TABLE(RESULT_SCAN(LAST_QUERY_ID())) as t where operation = 'GlobalStats';
-- Error: invalid identifier 'OPERATION'; the column does not seem to be recognized.
I tried the third example and expected the predicate to apply to the function output. I don't understand why the filter works on some TABLE() results and not others.
You need to double-quote the column name:
where "operation" = 'GlobalStats'
From the documentation:
Note that because the output column names from the DESC USER command were generated in lowercase, the commands use delimited identifier notation (double quotes) around the column names in the query to ensure that the column names in the query match the column names in the output that was scanned.
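Putting it together, a minimal sketch of the corrected sequence from the question (Snowflake folds unquoted identifiers to uppercase, which is why OPERATION was not found):
EXPLAIN USING TABULAR SELECT COUNT(*) FROM Posts;
-- The EXPLAIN output columns are lowercase, so quote them in the filter:
SELECT t.* FROM TABLE(RESULT_SCAN(LAST_QUERY_ID())) AS t WHERE t."operation" = 'GlobalStats';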

PostgreSQL query is only returning one row when there are two rows that meet the criteria

When I run the query:
select * from table
I get 300 results, 2 of which have the same value in col_name: 10348-0. The column is of type text.
When I run
select * from table where col_name = '10348-0'
I expect two rows to be returned, because two rows meet that criterion.
Instead I only get one row, whose ID is one less than the other row's: the returned row has id 4556 and the unreturned row has id 4557. The ID column is of type serial integer.
Why am I only getting one row returned when I add a WHERE clause to the query?
Presumably there are hidden characters or look-alike characters in the value, so what looks the same is not necessarily the same.
The most common issue is leading or trailing spaces. I would suggest that you try finding these using:
col_name like '%10348-0%'
Another common problem is em-dashes versus en-dashes. You might also try:
col_name like '10348_0'
The _ is the LIKE wildcard that matches exactly one character.
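A small diagnostic sketch along these lines (table stands for the question's table name): comparing length() against what you see, or dumping the raw bytes, usually exposes trailing spaces or look-alike dash characters:
-- length() reveals padding; the hex dump reveals look-alike characters:
SELECT id, col_name, length(col_name),
       encode(convert_to(col_name, 'UTF8'), 'hex') AS raw_hex
FROM table
WHERE col_name LIKE '%10348%';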

Why The Query Against HashKey returns no records

I am working on a new SQL Server table. The table has a varbinary(8000) column where we are storing the hash of a certain text. Now I am trying to retrieve the same record back by using a WHERE clause against the hash key, but that yields zero records.
I have added a similar query here: http://sqlfiddle.com/#!18/be996/11
Try it without the single quotes, like this:
SELECT id, description
FROM ForgeRock
where id = 0x94EE059335E587E501CC4BF90613E0814F00A7B08BC7C648FD865A2AF6A22CC2
and you will get the expected result.
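A minimal sketch of why this happens, using the column and hash from the question: '0x94EE...' in single quotes is a character string, which never equals a varbinary value, whereas the unquoted 0x... form is a binary literal. If you only have the hex as a string, T-SQL's CONVERT with style 1 (which expects the 0x prefix) parses it into varbinary:
-- WHERE id = '0x94EE...' compares varbinary to a string and returns nothing.
-- Converting the hex string at query time matches the stored hash:
SELECT id, description
FROM ForgeRock
WHERE id = CONVERT(varbinary(8000), '0x94EE059335E587E501CC4BF90613E0814F00A7B08BC7C648FD865A2AF6A22CC2', 1);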

PostgreSQL: return row where any column value like variable

I am trying to have the user search for a value in a SQL table, and the user is returned with any row that contains that value. At the moment, I can make it work such that the code is:
SELECT * FROM table WHERE lower('foo') in (lower('col1'),lower('col2'),etc)
However, I would like it to be able to search every column and return any row LIKE 'foo'. For instance,
SELECT * FROM table WHERE (lower('col1'), lower('col2'), etc) like lower('%foo%')
But that doesn't work.
Any suggestions?
I believe you need to combine multiple conditions with OR instead of grouping the columns into one expression. Try this:
SELECT * FROM table
WHERE lower(col1) like lower('%foo%')
OR lower(col2) like lower('%foo%')
OR etc like lower('%foo%')
You can convert the whole row to a string and then use LIKE on the result of that:
select *
from the_table
where lower(the_table::text) like '%foo%';
the_table::text returns all columns of each row as a comma-separated list enclosed in parentheses, e.g. (42,Arthur,Dent). So the above is not 100% identical to a LIKE condition applied to each column, but it probably does what you want.
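As a small variant sketch of the same idea, PostgreSQL's ILIKE gives a case-insensitive match without the explicit lower() calls; note that the row text also contains the commas and parentheses, so a pattern could in principle match across column boundaries:
-- Case-insensitive search over the whole row, same caveats as above:
select *
from the_table
where the_table::text ilike '%foo%';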

SQLite WHERE-Clause for every column?

Does SQLite offer a way to search every column of a table for a searchkey?
SELECT * FROM table WHERE id LIKE ...
Selects all rows where ... was found in the column id. But instead to only search in the column id, I want to search in every column if the searchstring was found. I believe this does not work:
SELECT * FROM table WHERE * LIKE ...
Is that possible? Or what would be the next easy way?
I use Python 3 to query the SQLite database. Should I go the route of searching through the returned data in Python after the query has executed?
A simple trick you can do is:
SELECT *
FROM table
WHERE (col1 || col2 || col3 || col4) LIKE '%something%'
This will select the record if any of these 4 columns contains the word "something". (Note that SQLite's string concatenation operator is ||; using + would coerce the values to numbers.)
No; you would have to list or concatenate every column in the query, or reorganize your database so that you have fewer columns.
SQLite has full-text search tables where you can search all columns at once, but such tables do not work efficiently with any other queries.
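For reference, a minimal FTS5 sketch (the docs table and its columns are hypothetical names); a single MATCH searches every column of the virtual table at once:
-- Create a full-text table over the searchable columns, then query them together:
CREATE VIRTUAL TABLE docs USING fts5(col1, col2, col3);
SELECT * FROM docs WHERE docs MATCH 'something';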
I could not comment on raging-bull's answer, so I had to write a new one. My problem was that I have columns with NULL values and got no results, because the concatenated search string was NULL.
Using coalesce I could solve that problem: SQLite takes the column content, or an empty string ('') if the column is NULL, so there is always an actual search string available.
SELECT *
FROM table
WHERE (coalesce(col1,'') || coalesce(col2,'') || coalesce(col3,'') || coalesce(col4,'')) LIKE '%something%'
I'm not quite sure if I understood your question.
If you want the whole row returned when id = searchkey, then:
select * from table where id=searchkey;
If you want to have specific columns from the row with the correct searchkey:
select col1, col2, col3 from table where id=searchkey;
If you want to search multiple columns for the searchkey: first narrow down which columns it could be found in, because you don't want to search the whole table! Then:
select * from table where col1=searchkey or col2=searchkey or col3=searchkey;