Using NOT, MIN, AND in SQL having clause creating error - sql

I am working on Ingres 9.2. When I execute a query like
select col1 from table1 group by col1 having not((5=min(col1)) and (0 = 1))
it raises an error:
bad select or subselect target list has been found.
But the error does not occur if I change the query in any one of the following ways:
0=0 or 1=1 instead of 0=1.
Removing not.
Using or instead of and.
Removing min.
I am not able to find the reason for this behaviour, and the same error does not occur in another database which is also Ingres. If anyone knows the reason, please explain it.

Ingres is an old database. Maybe it has a problem with using an aggregation function on a group by key. This is speculation, but this should work:
select col1
from table1
where not ((5 = col1) and (0 = 1))
group by col1 ;
Of course, the not (5 = col1) part would more likely be written as col1 <> 5; the two forms are equivalent.
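If the parser really is choking on the aggregate inside HAVING, another workaround (untested on Ingres, so just a sketch) is to compute the aggregate in a derived table and apply the NOT condition in an outer WHERE clause:
select col1
from (
    select col1, min(col1) as min_col1
    from table1
    group by col1
) t
where not ((5 = min_col1) and (0 = 1));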

Related

Why doesn't this to_date work, when the results have been filtered to match my date format (Oracle SQL)

I have a table 'A' with one column (VARCHAR2). The table contains a row containing the text '01/01/2021' and another row with the text 'A'.
When I try to filter out 'A' and then to_date the remaining value, I get 'ORA-01858: a non-numeric character was found where a numeric was expected'. I've tried this in 2 ways.
select *
from tbl
where col <> 'A'
and to_Date(col,'DD/MM/YYYY') = to_date('01/01/2020','DD/MM/YYYY');
select *
from ( select *
from tbl
where col <> 'A')
where to_Date(col,'DD/MM/YYYY') = to_date('01/01/2020','DD/MM/YYYY');
I can understand why the first might not work, but in the second example, the to_date should ONLY ever see filtered data (i.e. '01/01/2020').
When I delete the value of 'A', the statement runs and I get my result back so it seems conclusive that the reason it isn't running is because it's trying to to_date the value of 'A', even though that should have been filtered out by then.
I have been able to replicate this using actual Oracle tables but unfortunately when I try and reproduce the tables using WITH AS, the query works and no error is encountered - another mystery!
Why doesn't this query work? The order of operation seems to be satisfied (and it works if I use WITH AS).
Oracle (and other databases) are under no obligation to evaluate the predicate applied to an inline view before evaluating the outer predicate. Frequently, in fact, from a performance optimization standpoint, you want the optimizer to push a selective predicate from an outer query into a view, inline view, or subquery. In this case, whether the query throws an error will depend on the query plan the optimizer chooses and which predicate it actually evaluates first.
As a quick hack, you can change the inline view to prevent predicates from being pushed. In this case, the presence of a rownum stops the optimizer from pushing the predicate. You could also use hints like no_push_pred to try to force the optimizer to use the plan you want.
select *
from ( select t.*, rownum rn
from tbl t
where col <> 'A')
where to_Date(col,'DD/MM/YYYY') = to_date('01/01/2020','DD/MM/YYYY');
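For reference, a sketch of the hint-based variant mentioned above; the inline-view alias v is added here purely for illustration, and whether the optimizer honours the hint still depends on the plan it chooses:
select /*+ no_push_pred(v) */ *
from ( select *
       from tbl
       where col <> 'A') v
where to_Date(col,'DD/MM/YYYY') = to_date('01/01/2020','DD/MM/YYYY');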
The issue with either of these quick hacks, though, is that some future version of the optimizer might have more options than you are aware of today, so you may have problems in the future.
A better option is to rewrite the query so that you don't care in what order the predicates are evaluated. In this case (Oracle 12.2 and later), that's pretty easy since to_date allows you to specify a value to use when there is a conversion error:
select *
from tbl
where col <> 'A'
and to_Date(col default null on conversion error,'DD/MM/YYYY') =
to_date('01/01/2020','DD/MM/YYYY');
If you're on an earlier version of Oracle or to_date is just an example of the actual problem, you can create a custom function that does the same thing.
create function safe_to_date( p_str in varchar2, p_fmt in varchar2 )
  return date
is
begin
  return to_date( p_str, p_fmt );
exception
  -- catch conversion failures such as ORA-01858 (value_error would not) and return NULL
  when others then
    return null;
end safe_to_date;
select *
from tbl
where col <> 'A'
and safe_to_date(col,'DD/MM/YYYY') = to_date('01/01/2020','DD/MM/YYYY');

Why Does One SQL Query Work and the Other Does Not?

Please disregard the obvious problems with the manipulation of data in the where clause. I know! I'm working on it. While working on it, though, I discovered that this query runs:
SELECT *
FROM PatientDistribution
WHERE InvoiceNumber LIKE 'PEX%'
AND ISNUMERIC(CheckNumber) = 1
AND CONVERT(BIGINT,CheckNumber) <> TransactionId
And this one does not:
SELECT *
FROM PatientDistribution
WHERE InvoiceNumber LIKE 'PEX%'
AND CONVERT(BIGINT,CheckNumber) <> TransactionId
AND ISNUMERIC(CheckNumber) = 1
The only difference between the two queries is the order of items in the WHERE clause. I was under the impression that the SQL Server query optimizer would spare me from having to worry about that.
The error returned is: Error converting data type varchar to bigint.
You are right, the order of the conditions shouldn't matter.
If AND ISNUMERIC(CheckNumber) = 1 is checked first and non-matching rows thus dismissed, then AND CONVERT(BIGINT,CheckNumber) <> TransactionId will work (for exceptions see scsimon's answer).
If AND CONVERT(BIGINT,CheckNumber) <> TransactionId is processed before AND ISNUMERIC(CheckNumber) = 1 then you may get an error.
That your first query worked and the second didn't was a matter of luck. It could just as well have been the other way around.
You can force one condition to be executed before the other:
SELECT *
FROM
(
SELECT *
FROM PatientDistribution
WHERE InvoiceNumber LIKE 'PEX%'
AND ISNUMERIC(CheckNumber) = 1
) num_only
WHERE CONVERT(BIGINT,CheckNumber) <> TransactionId;
You just got lucky that the first one worked, since you are correct that the order of what you list in the where clause does not matter. SQL is a declarative language, meaning that you are telling the engine what should happen, not how. So I would suspect your queries weren't executed with the same query plan. Granted, you can affect what the optimizer does to a certain extent. You'll also notice this type of issue when using a CTE. For example:
declare @table table(columnName varchar(64))
insert into @table
values
('1')
,('1e4')
;with cte as(
select columnName
from @table
where isnumeric(columnName) = 1)
select
cast(columnName as decimal(32,16))
from cte
In the above snippet you would assume that the second statement is run on the results/subset from the CTE. However, you can't ensure this will happen, and you could still get a type/conversion error on the second statement.
More importantly, you should know that ISNUMERIC() is largely misused. People often think that if it returns 1 then the value can be converted to a decimal or int. But this isn't the case. It only checks that the value can be converted to at least one of the numeric types (money, float, etc.). For example:
select
isnumeric('1e4')
,isnumeric('$')
,isnumeric('1,123,456')
As you can see, these evaluate to true, but would fail the conversion you put in your post.
Side note: your indexes are likely the reason why the first one actually didn't error out.
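If you're on SQL Server 2012 or later, a sketch of a rewrite that sidesteps both the evaluation-order problem and the ISNUMERIC() pitfalls is to use try_convert(), which returns NULL when the value can't be converted, so the comparison simply filters those rows out regardless of predicate order:
SELECT *
FROM PatientDistribution
WHERE InvoiceNumber LIKE 'PEX%'
AND TRY_CONVERT(BIGINT, CheckNumber) <> TransactionId -- NULL for non-numeric values, so those rows drop out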

sql server rewrites my query incorrectly?

There is dirty data in the input.
We are trying to clean up the dataset and then do some calculations on the cleaned data.
declare @t table (str varchar(10))
insert into @t select '12345' union all select 'ABCDE' union all select '111aa'
;with prep as
(
select *, cast(substring(str, 1, 3) as int) as str_int
from @t
where isnumeric(substring(str, 1, 3)) = 1
)
select *
from prep
where 1=1
and case when str_int > 0 then 'Y' else 'N' end = 'Y'
--and str_int > 0
The last two lines do the same thing. The first one works, but if you uncomment the second one the query crashes with: Conversion failed when converting the varchar value 'ABC' to data type int.
Obviously, SQL Server is rewriting the query, mixing all the conditions together.
My guess is that it considers CASE a heavy operation and performs it as a last step. That's why the workaround with CASE works.
Is this behavior documented in any way? or is it a bug?
This is a known issue with SQL Server, and Microsoft does not consider it a bug although users do. The difference between the two queries is the execution path. One is doing the conversion before the filtering, the other after.
SQL Server reserves the right to re-order the processing. The documentation does specify the logical processing of clauses as:
FROM
ON
JOIN
WHERE
GROUP BY
WITH CUBE or WITH ROLLUP
HAVING
SELECT
DISTINCT
ORDER BY
TOP
With (presumably but not explicitly documented here) CTEs being logically processed first. What does logically processed mean? Well, it doesn't mean that run-time errors are caught. It really determines the scope of identifiers during the compile phase.
When SQL Server reads from a data source, it can compute new column values as it goes. This is a convenient time to do so, because everything is in memory. However, this computation might occur before the filtering, which is what causes the error when it occurs.
The fix to this problem is to use a case expression. So, the following CTE will usually work:
with prep as (
select *, (case when isnumeric(substring(str, 1, 3)) = 1 and str not like '%.%'
then cast(substring(str, 1, 3) as int)
end) as str_int
from @t
where isnumeric(substring(str, 1, 3)) = 1
)
Looks weird, and I think Redmond thinks so too. SQL Server 2012 introduced try_convert(), which returns NULL if the conversion fails.
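Assuming SQL Server 2012 or later, a sketch of the same CTE rewritten with try_convert instead of the defensive case expression:
with prep as (
    select *, try_convert(int, substring(str, 1, 3)) as str_int
    from @t
)
select *
from prep
where str_int > 0 -- NULL (a failed conversion) is never > 0, so dirty rows drop out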
It would also help if you could instruct SQL Server to materialize CTEs. That would also solve the problem in this case. You can vote on adding such an option to SQL Server.

SQL query help!! I'm trying to select the row that DOESN'T start with a number

I have 10,001 rows in my table, and all of the rows except one start with a number. I need to find this one row that doesn't start with a number, or even that doesn't contain a number.
So this is what I have:
Select col1 from table1 where col1 not like '?%'
Is this even close? I need to find the row that doesn't have a number...
Thanks!!
UPDATE: I am using a sqlite database
Use:
SELECT col1
FROM table1
WHERE SUBSTR(col1, 1, 1) NOT BETWEEN '0' AND '9'
Reference:
core functions (incl SUBSTR)
LIKE
On SQL Server,
Select * From table
Where col1 Like '[^0-9]%'
EDIT: Don't know if there is an equivalent on SQLite, but this will work...
Select * From table
Where col1 Not Like '0%'
And col1 Not Like '1%'
...
And col1 Not Like '9%'
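On SQLite specifically, one possible equivalent worth testing (an assumption on my part, since GLOB uses Unix-style character classes rather than LIKE wildcards) is:
Select * From table1
Where col1 Not GLOB '[0-9]*'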
There is a post on CodeProject that allows you to use regex with MS SQL.
Hope this helps.
The easiest way might be to just note that you're not using numbers; you're using strings that happen to have numbers in them. In this case, you can do
select * from table1 where col1 not between '0' and '9:';
(where the colon is the ASCII character after '9'; this way '9999999' won't be found).
It should be a lot less expensive than some of the other suggestions (e.g., checking the value of the first character).
@Dueber got me thinking, couldn't you just do this?
SELECT * FROM table1 WHERE col1 > '9'
Grab the first character, see if it's numeric.
SELECT *
FROM table1
WHERE ISNUMERIC(SUBSTRING(col1,1,1)) = 0
SUBSTRING
ISNUMERIC

How to select an empty result set?

I'm using a stored procedure in MySQL, with a CASE statement.
In the ELSE clause of the CASE (the equivalent of default:) I want to select and return an empty result set, thus avoiding the SQL error that would be thrown by not handling the ELSE case, and instead returning an empty result set as if a regular query had returned no rows.
So far I've managed to do so using something like:
Select NULL From users Where False
But I have to name an existing table, like 'users' in this example.
It works, but I would prefer a way that doesn't break if the table used is eventually renamed or dropped.
I've tried Select NULL Where False but it doesn't work.
Using Select NULL does not return an empty set, but one row with a column named NULL and with a NULL value.
There's a dummy-table in MySQL called 'dual', which you should be able to use.
select 1
from dual
where false
This will always give you an empty result.
This should work on most DBs, tested on Postgres and Netezza:
SELECT NULL LIMIT 0;
T-SQL (MSSQL):
SELECT Top 0 1;
How about
SELECT * FROM (SELECT 1) AS TBL WHERE 2=3
Checked in phpMyAdmin, and it also works in SQLite and probably in any other DB engine.
This will probably work across all databases.
SELECT * FROM (SELECT NULL AS col0) AS inner0 WHERE col0 IS NOT NULL;
SELECT TOP 0 * FROM [dbo].[TableName]
This is a reasonable approach; it results in a constant scan operator.
SELECT NULL WHERE FALSE;
It works in PostgreSQL and MySQL, including as a subquery in MySQL.
How about this?
SELECT 'MyName' AS EmptyColumn
FROM dual
WHERE 'Me' = 'Funny'
SELECT * FROM (SELECT NULL) WHERE 0
In PostgreSQL a simple
SELECT;
works. You won't even get any columns labeled 'unknown'.
Note, however, that it still says 1 row retrieved.