Check if field is numeric, then execute comparison on only those fields in one statement? - sql

This may be simple, but I am no SQL whiz, so I am getting lost. I understand that SQL evaluates a query's clauses in a certain order, which I believe is why this query does not work:
select * from purchaseorders
where IsNumeric(purchase_order_number) = 1
and cast(purchase_order_number as int) >= 7
MOST of the purchase_order_number fields are numeric, but we introduced alphanumeric ones recently. The data I am trying to get is to see if '7' is greater than the highest numeric purchase_order_number.
The IsNumeric() function filters out the alphanumeric fields fine, but the subsequent cast comparison throws this error:
Conversion failed when converting the nvarchar value '124-4356AB' to data type int.
I am not asking what the error means, that is obvious. I am asking if there is a way to accomplish what I want in a single query, preferably in the where clause due to ORM constraints.

does this work for you?
select * from purchaseorders
where (case when IsNumeric(purchase_order_number) = 1
then cast(purchase_order_number as int)
else 0 end) >= 7

You can do a select with a subselect
select * from (
select * from purchaseorders
where IsNumeric(purchase_order_number) = 1) as correct_orders
where cast(purchase_order_number as int) >= 7

try this:
select * from purchaseorders
where try_cast(purchase_order_number as int) >= 7
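For context, TRY_CAST (SQL Server 2012 and later) returns NULL instead of raising an error when a value cannot be converted, and a NULL comparison is never true, so the alphanumeric rows simply drop out of the result. A quick illustration:
select try_cast('124-4356AB' as int)  -- NULL, no conversion error
select try_cast('42' as int)          -- 42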

I have to check which columns contain only numeric values.
Currently every field in the table is set to nvarchar(max), like tableName (field1 nvarchar(max), field2 nvarchar(max), field3 nvarchar(3)), and tableName has 25 lakh (2.5 million) rows.
But on a manual check, Field2 appears to contain numeric values only... How do I check with T-SQL whether the complete column (Field2) holds only numeric or null values, and what the longest value in the column is?
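If it helps, here is a rough sketch of how you could check that (assuming SQL Server 2012+ for TRY_CAST; table and column names are taken from the comment above):
-- rows in Field2 that are NOT purely numeric (or too long for bigint), longest first
select Field2, LEN(Field2) as ValueLength
from tableName
where Field2 is not null
  and try_cast(Field2 as bigint) is null
order by LEN(Field2) desc
-- if that returns no rows, the column holds only numeric or NULL values;
-- the longest value overall:
select max(LEN(Field2)) as LongestLength from tableName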

Related

Converting nvarchar to bigint fails in the WHERE clause but works in SELECT

I am trying to achieve the following:
Get all Field values where the FieldValue is greater than 100 when the
value is stored as a number.
The indicator of whether or not a value was stored as a number is given by the field type, which is another WHERE clause condition.
The issue I am facing is that when I try to do the data field conversion in my WHERE statement, it fails.
I run:
SELECT FieldValue FROM CARData A
JOIN Fields B ON A.FieldId = B.FieldId
WHERE FieldTypeId = 3 AND FieldValue IS NOT NULL
And this returns the expected result.
But when I add a condition to the WHERE clause to filter by the value:
SELECT FieldValue FROM CARData A
JOIN Fields B ON A.FieldId = B.FieldId
WHERE FieldTypeId = 3 AND FieldValue IS NOT NULL
AND CAST(FieldValue AS BIGINT) > 100
It throws the error:
Error converting data type nvarchar to bigint.
I somewhat understand what the issue is: it is trying to convert ALL values in the table to a bigint and failing when it hits a non-numeric value.
I attempted to solve this by nesting the first query in a second like so:
SELECT RESULT.FieldValue FROM (
SELECT FieldValue FROM CARData A
JOIN Fields B ON A.FieldId = B.FieldId
WHERE
FieldTypeId = 3
AND FieldValue IS NOT NULL
AND ISNUMERIC(A.FieldValue) = 1) RESULT
WHERE CAST(FieldValue AS BIGINT) > 100
But even that does not return anything other than the aforementioned error.
While it's true that changing the structure of your query can cause SQL to pick a different plan, you should limit that technique to trying to help the optimizer pick a performant plan. The reason for this is that if successful execution of the logic depends on a particular choice of plan, then your query might work today but fail tomorrow (or in production) when SQL decides to pick a different plan.
Fortunately you don't have to rely on that here! Use try_cast
SELECT FieldValue FROM CARData A
JOIN Fields B ON A.FieldId = B.FieldId
WHERE FieldTypeId = 3
AND TRY_CAST(FieldValue AS BIGINT) > 100
I'm also curious... is this part of a SQL course? If so, tell your instructor to come visit StackOverflow so we can tell them to stop teaching students to use EAVs. Unless the whole point is to show you how horrible they are! :)

Finding max value for a column containing hierarchical decimals

I have a table where the column values are like '1.2.4.5', '3.11.0.6',
'3.9.3.14', '1.4.5.6.7', N/A, etc. I want to find the max of that particular column. However, when I use this query I am not getting the max value.
(SELECT max (CASE WHEN mycolumn = 'N/A'
THEN '-1000'
ELSE mycolumn
END )
FROM mytable
WHERE column like 'abc')
I am getting 3.9.3.14 as max value instead of 3.11....
Can someone help me?
Those aren't really decimals - they're strings containing multiple dots, so it's unhelpful to think of them as being "decimals".
We can accomplish what you want with a bit of manipulation. There is a type built into SQL Server that more naturally represents this kind of structure - hierarchyid. If we convert your values to this type then we can find the MAX fairly easily:
declare @t table (val varchar(93) not null)
insert into @t(val) values
('1.2.4.5'),
('3.11.0.6'),
('3.9.3.14'),
('1.4.5.6.7')
select MAX(CONVERT(hierarchyid,'/' + REPLACE(val,'.','/') + '/')).ToString()
from @t
Result:
/3/11/0/6/
I leave fully converting this string representation back into the original form as an exercise for the reader. Alternatively, I'd suggest that you may want to start storing your data using this datatype anyway.
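If you do want the dotted form back, one possible approach (just a sketch, assuming the values never contain '/' themselves) is to trim the outer slashes and swap the separators back:
declare @s varchar(20) = '/3/11/0/6/'
select REPLACE(SUBSTRING(@s, 2, LEN(@s) - 2), '/', '.')  -- 3.11.0.6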
MAX() on values stored as text performs an alphabetic sort.
Use FIRST_VALUE and HIERARCHYID:
SELECT DISTINCT FIRST_VALUE(t.mycolumn) OVER(
ORDER BY CONVERT(HIERARCHYID, '/' + REPLACE(NULLIF(t.mycolumn,'N/A'), '.', '/') + '/') DESC) AS [Max]
FROM #mytable t

Getting a varchar value even after removing it from subquery

I encountered a strange behavior this morning and am not sure why it is happening.
My table and values :
create table tblA (acc varchar(10),accname varchar(10))
insert into tblA values('1','A')
insert into tblA values('2','B')
insert into tblA values('3','C')
insert into tblA values('Z','D')
insert into tblA values('4','E')
Query 1 to fetch acc with 1
select * from tblA where acc = 1
error : Conversion failed when converting the varchar value 'Z' to data type int
I understand the error very well.
Query 2 to fetch all numeric records
select * from tblA where ISNUMERIC(acc)<>0
gives me all records with proper numeric accounts. Excludes acc 'Z'
Query 3 to fetch acc 1 from all numeric accs
select * from (select * from tblA where ISNUMERIC(acc)<>0) A
where A.acc=1
error : Conversion failed when converting the varchar value 'Z' to data type int.
This is what I don't understand: how come it is showing an error for the value 'Z', which is not even present in the subquery aliased as A? I know there are workarounds like converting or casting to varchar; I just want to know the reason behind this behavior.
Your conversion in query 3 is a filtering predicate, meaning it is performing the check to return all rows where acc CAN be converted to a numeric. It is not, however, returning a result set with those values converted, as it would be here:
select *
from (select CONVERT(INT, acc) AS acc, accname
from tblA
where ISNUMERIC(acc)<>0) A
where A.acc=1
In other words, your query is still returning varchars, not integers. As a result, it may be that behind the scenes the query engine is simply doing the conversion on every value in the table, because it will not necessarily follow the written order of the query if it estimates another plan will be faster. Therefore, even though you have logically ruled out the need to perform an integer conversion on non-integer values, the query optimizer may still do it because it thinks it will save time. This is just my speculation.
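As an aside (not part of the original answer), on SQL Server 2012 or later you can sidestep the evaluation-order question entirely with TRY_CONVERT, which returns NULL for values that cannot be converted:
select *
from tblA A
where try_convert(int, A.acc) = 1   -- 'Z' yields NULL and simply isn't matched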
The datatype of acc is varchar, but you are checking acc against an integer. Just pass acc as a varchar:
select * from (select * from tblA where ISNUMERIC(acc)<>0) A
where A.acc='1'
This will work fine.
Try the following nested case query
select *
from #tblA
where 1=
case when ISNUMERIC(acc)=1 then case when acc=1 then 1 end end

How do you query an int column for any value?

How can you query a column for any value in that column? (i.e. how do I build a dynamic WHERE clause that can either filter on a value, or not?)
I want to be able to query for either a specific value, or not. For instance, I might want the value to be 1, but I might want it to be any number.
Is there a way to use a wild card (like "*"), to match any value, so that it can be dynamically inserted where I want no filter?
For instance:
select int_col from table where int_col = 1 -- query for a specific value
select int_col from table where int_col = * -- query for any value
The reason I do not want to use two separate SQL statements is that I am using this as a SQL Data Source, which can only have one select statement.
Sometimes I would query for an actual value (like 1, 2, ...), so I can't simply omit the condition either.
I take it you want some dynamic behavior on your WHERE clause, without having to dynamically build your WHERE clause.
With a single parameter, you can use ISNULL (or COALESCE) like this:
SELECT * FROM Table WHERE ID = ISNULL(@id, ID)
which allows a NULL parameter to match all. Some prefer the longer but more explicit:
SELECT * FROM Table WHERE (@id IS NULL) OR (ID = @id)
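A quick usage sketch (table and column names here are illustrative, not from the question):
declare @id int = null
select * from Orders where ID = ISNULL(@id, ID)   -- NULL parameter: matches every row
set @id = 1
select * from Orders where ID = ISNULL(@id, ID)   -- specific parameter: matches only ID = 1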
A simple answer would be to use IS NOT NULL. But if you are asking for, say, 123* to match numbers like 123456, 1234, or 1237, then you could convert the column to a varchar and test against it using standard wildcards.
In your where clause: cast(myIntColumn as varchar(15)) like '123%'.
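Put together, it looks something like this (column and table names are illustrative):
select int_col
from myTable
where cast(int_col as varchar(15)) like '123%'   -- matches 123, 1234, 123456, ...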
Assuming the value you're filtering on is a parameter in a stored procedure, or contained in a variable called @Value, you can do it like this:
select * from table where @Value is null or intCol = @Value
If @Value is null then the first condition is true for every row, so the query won't filter on intCol.
The equivalent of wildcards for numbers are the comparators.
So, if you wanted to find all positive integers:
select int_col from table where int_col > 0
any numbers between a hundred and a thousand:
select int_col from table where int_col BETWEEN 100 AND 1000
and so on.
I don't quite understand what you're asking. I think you should use two different queries for the different situations you have.
When you're not looking for a specific value:
SELECT * FROM table
When you are looking for a specific value:
SELECT * FROM table WHERE intcol = 1
You can use the parameter as a wildcard by assigning special meaning to NULL:
DECLARE @q INT = 1
SELECT * FROM table WHERE IntegerColumn = @q OR @q IS NULL
This way, when you pass in NULL, you get all rows.
If NULL is a valid value to query for, then you need to use two parameters.
If you really want the value of your column for all rows on the table you can simply use
select int_col
from table
If you want to know all the distinct values, but don't care how many times they're repeated you can use
select distinct int_col
from table
And if you want to know all the distinct values and how many times they each appear, use
select int_col, count(*)
from table
group by int_col
To have the values sorted properly you can add
order by int_col
to all the queries above.
Share and enjoy.

How to assign a value to a casted column in Oracle

I am wondering whether it is possible to assign a value to a casted column in SQL depending on real table values.
For Example:
select *, cast(null as number) as value from table1
where if(table1.id > 10 then value = 1) else value = 0
NOTE: I understand the above example is not completely valid Oracle syntax, but it is just a demonstration of what I want to accomplish in Oracle. Also, the above example can be done multiple ways due to its simplicity. My goal here is to verify whether it is possible to accomplish the example using casted columns (columns not part of table1) and some sort of if/else.
Thanks,
Y_Y
select table1.*, (case when table1.id > 10 then 1 else 0 end) as value
from table1
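A quick sanity check against inline sample rows (hypothetical data, Oracle syntax):
with table1 as (
  select 5 as id from dual union all
  select 15 as id from dual
)
select table1.*, (case when table1.id > 10 then 1 else 0 end) as value
from table1
-- id = 5  -> value = 0
-- id = 15 -> value = 1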