Why does appending a NULL into a Date column result in the value 1900-01-01 being displayed? - sql

I'm using an SSIS package to import a basic text file. It has 3 date fields, and sometimes some of the date fields are empty.
The imported table shows empty fields, I suppose because the columns are varchar(50). But then I need to insert those records from that table into another table, where those columns are defined as Date data types.
When I run the insert statement, the resulting values in the destination table all show 1900-01-01 for the date, rather than NULL or blank. I tried forcing the value to be null, but it didn't work:
CASE WHEN refresh_date IS NULL THEN NULL ELSE refresh_date END AS RefreshDate
How can I make a Date column just accept a blank or null value?

The varchar field should not be cast or converted to a date if it is 'empty'. When a blank or empty string is cast to a date, it equals '1900-01-01'. You can test this with the following query:
SELECT CAST('' as date)
Using SSIS, you are better off checking whether the varchar(50) field equals '' and, if so, setting it to NULL. Here is an example SQL query:
SELECT CASE WHEN importedfield = '' THEN NULL
ELSE CAST(importedfield as date)
END AS [NewFieldname]
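For example, if the move into the destination table happens in a T-SQL insert, the same check can be applied inline (the table names below are placeholders, not from the original post):
INSERT INTO dbo.DestinationTable (RefreshDate)   -- hypothetical destination table
SELECT CASE WHEN refresh_date = '' THEN NULL
            ELSE CAST(refresh_date AS date)
       END
FROM dbo.ImportedTable;                          -- hypothetical staging table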

The issue may be caused by an empty string rather than a NULL value. Try adding a Derived Column transformation in the Data Flow Task with one of the following expressions.
If the input is of type Date:
ISNULL([DateColumn]) ? NULL(DT_DATE) : (DT_DATE)[DateColumn]
If the input column is of type string:
(ISNULL([DateColumn]) || [DateColumn] == "") ? NULL(DT_DATE) : (DT_DATE)[DateColumn]
And map this column to the destination column.

Perhaps it's not a null value you're entering, but rather an empty string. You mentioned the columns could be a VARCHAR(50) data type, so you might need to add some more logic. Try this:
NULLIF(LTRIM(RTRIM(refresh_date)),'')
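For example, inside the insert statement (again, the table names below are placeholders):
INSERT INTO dbo.DestinationTable (RefreshDate)               -- hypothetical destination table
SELECT CAST(NULLIF(LTRIM(RTRIM(refresh_date)), '') AS date)  -- blanks and whitespace become NULL
FROM dbo.ImportedTable;                                      -- hypothetical staging table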

You may want to check the data type of this column and see if it accepts null values.

Related

Cast string as "Column Reference" in HIVE SQL

I have a query that is meant to use bind values to retrieve information from a table and test if any field is NULL. The user enters a column name for the bind value and that column is then tested for any NULL values. Here is a simplified version of the query:
SELECT
CASE
WHEN ISNULL(bind.value)
THEN 'PASS'
ELSE 'FAIL'
END AS Solution
This keeps returning 'FAIL', I think because ISNULL() is testing the column name entered as a string. Instead, I need it to test the fields in the column, rather than the string holding the column's name. Is there any way to cast this string as a reference or pointer (I know SQL doesn't have pointers, but a pointer-like object) to a column?
NOTE: When I replace bind.value with the column name it returns 'PASS'. I'm really trying to keep this as dynamic as possible so I can use it with other tables without having to write a new query for each table.
Most probably you are passing empty strings instead of NULL.
An empty string '' is not the same as NULL in Hive, so add a test for the empty string:
SELECT
CASE
WHEN ISNULL(bind.value) OR cast(bind.value as string)='' THEN 'PASS'
ELSE 'FAIL'
END AS Solution
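You can verify the difference directly with a quick sketch (on older Hive versions you may need to add a dummy FROM clause):
SELECT isnull(''),  -- returns false: an empty string is not NULL
       '' IS NULL;  -- also returns false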

Compare SQL Varchar field to SQL

I have two databases for which I want to compare a varchar50AllowNulls column in one to a TextNotNull column in the other.
What I find is that when I have a value in the varchar column and an empty string in the text column, my Access query to these linked tables works.
However when I have an empty string in the varchar and a value in the text column, it does not work.
I cannot find any topics that seem to address this. I could ask the developer to change the text to varchar, but thought it best if I can resolve this using SQL.
My guess is that I need to cast or trim, but my efforts have not been successful.
SELECT
A.EQNUM, A.[Varchar50AllowNulls], B.EQNUM, B.[TextNotAllowNulls]
FROM
A
INNER JOIN
B ON A.EQNUM = B.EQNUM
WHERE
((A.[Varchar50AllowNulls] <> B.[TextNotAllowNulls]));
The logic for the WHERE clause below is that you want to return a record when the nullable VARCHAR column is not equal to the non-nullable TEXT column. If Varchar50AllowNulls is not NULL, then your original condition using <> was already checking for this. However, when Varchar50AllowNulls is NULL, <> won't work, because comparing any value against NULL yields unknown. Instead, since TextNotAllowNulls cannot be NULL, whenever Varchar50AllowNulls is NULL you want to return the record, regardless of the value of the TEXT column.
SELECT A.EQNUM,
A.[Varchar50AllowNulls],
B.EQNUM,
B.[TextNotAllowNulls]
FROM A
INNER JOIN B
ON A.EQNUM = B.EQNUM
WHERE A.[Varchar50AllowNulls] <> B.[TextNotAllowNulls] OR
A.[Varchar50AllowNulls] IS NULL
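An alternative sketch is to normalize the NULL on the varchar side instead. Note that this treats NULL and an empty string as equivalent, which may or may not be what you want, and the CAST is only needed if your SQL Server version refuses to compare the TEXT column directly:
SELECT A.EQNUM,
       A.[Varchar50AllowNulls],
       B.EQNUM,
       B.[TextNotAllowNulls]
FROM A
INNER JOIN B
    ON A.EQNUM = B.EQNUM
WHERE COALESCE(A.[Varchar50AllowNulls], '')           -- treat NULL as an empty string
      <> CAST(B.[TextNotAllowNulls] AS VARCHAR(MAX)); -- cast TEXT so <> is allowed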

Determine if a Varchar field has a Numeric entry

I've got a field in a table that has a DataType of varchar(10). This field contains numeric values that are formatted as a varchar, for the sole purpose of being used to join two tables together. Some sample data would be:
AcctNum AcctNumChar
2223333 2223333
3324444 3324444
For some records, the table sometimes thinks this field (AcctNumChar) is numeric and the join doesn't work properly. I then have to use an Update statement to re-enter the value as a varchar.
Is there any way to determine whether or not the field has a varchar or numeric value in it, using a query? I'm trying to narrow down which records are faulty without having to wait for one of the users to tell me that their query isn't returning any hits.
You can use isnumeric() for a generic comparison, for instance:
select (case when isnumeric(acctnum) = 1 then cast(acctnum as decimal(10, 0))
end)
In your case, though, you only seem to want integers:
(case when acctnum not like '%[^0-9]%' then cast(acctnum as decimal(10, 0))
end)
However, I would strongly suggest that you update the table to change the data type to a number, which appears to be the correct type for the value. You can also add a computed column as:
alter table t add AcctNum_Number as
(case when acctnum not like '%[^0-9]%' then cast(acctnum as decimal(10, 0))
end)
Then you can use the computed column rather than the character column.
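To list only the suspect rows, the same pattern works in a filter (a sketch using the ACCT_LIST table and ACCT_NUM_CHAR column from the question's follow-up):
SELECT *
FROM ACCT_LIST
WHERE ACCT_NUM_CHAR LIKE '%[^0-9]%'  -- contains at least one non-digit character
   OR ACCT_NUM_CHAR = '';            -- or is empty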
There are several ways to check whether a varchar column contains a numeric value, but I recommend using the TRY_CONVERT function.
It will give you NULL if the value cannot be converted to a number. For example, to get all records that have numeric values, you can do this:
SELECT *
FROM [table]
WHERE TRY_CONVERT(INT, [value]) IS NOT NULL
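The inverse check lists the suspect rows instead (a sketch; [table] and [value] are the placeholder names from above):
SELECT *
FROM [table]
WHERE TRY_CONVERT(INT, [value]) IS NULL;  -- the value is NULL or not a valid integer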
You can use the CAST and CONVERT (Transact-SQL) functions here to solve your purpose.
Reference: https://msdn.microsoft.com/en-IN/library/ms187928.aspx
IsNumeric worked, but TRY_CONVERT didn't (SQL Server wouldn't recognize it as a built-in function, likely because TRY_CONVERT requires SQL Server 2012 or later). Anyway, for the record, I ran the following query and got all of my suspect records:
SELECT *
FROM ACCT_LIST
where IsNumeric([ACCT_NUM_CHAR]) = 0
Use the PATINDEX function:
DECLARE @s VARCHAR(20) = '123123'
SELECT PATINDEX('%[^0-9]%', @s)
If the @s variable contains anything outside the range 0-9, this function will return the index of the first occurrence of a non-digit symbol. If all symbols are digits, it will return 0.
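Applied to the ACCT_LIST table mentioned above, this gives the suspect rows (a sketch):
SELECT *
FROM ACCT_LIST
WHERE PATINDEX('%[^0-9]%', ACCT_NUM_CHAR) > 0;  -- at least one non-digit character found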

Convert column value to int in SQL Server

I am working on this query and it is returning the exception "cannot convert varchar to int":
SELECT BG_PRIORITY, count(*) as cnt
FROM Bug
WHERE BG_TARGET_REL= case when ISNUMERIC('XXXX 13.5') = 1 then
cast('XXXX 13.5' as int) else 'XXXX 13.5' end
GROUP BY BG_PRIORITY
This query is generated from my C# code. The WHERE clause filter can be numeric or a string, as the user chooses the type of filter he/she wants and gives its value accordingly.
Is there an approach that lets me add any type of filter to my query?
If your only column types are int and varchar, then you can simply use this:
SELECT BG_PRIORITY, count(*) as cnt
FROM Bug
WHERE <ChosenColumn> = 'InputValue'
GROUP BY BG_PRIORITY;
This works by using SQL Server's implicit data type precedence conversions. If the column is an int, the input value, something like '123', is converted to the number 123 (exactly what we want!). If the column is varchar, the quoted value remains a varchar.
Your original CAST error stems from the fact that a CASE expression results in one data type and one only, determined by the same data type precedence rules mentioned above. Consider your branches:
case when ISNUMERIC('XXXX 13.5') = 1
then cast('XXXX 13.5' as int) --<< this branch returns int
else 'XXXX 13.5' --<< this branch returns varchar
end
>> data type precedence ==> resultant type of expression is "int"
When you give it a value of 'XXXX 13.5', the case statement results in 'XXXX 13.5', which it then needs to cast to the CASE expression's resultant type, i.e. int => fail.
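A minimal reproduction of that rule (a sketch, not from the original post):
-- int outranks varchar, so the whole CASE is typed as int and the
-- varchar branch must be converted, which fails with a conversion error:
SELECT CASE WHEN 1 = 0 THEN 1 ELSE 'XXXX 13.5' END;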

How to return a NULL value for Date type column in SQL in Derby?

In my SQL, I want to group by some columns with certain conditions. The other columns are of date type, and I just want to manually return NULL for them, but I don't know how.
Can you give me some advice on how to return NULL for a date type column? Simply using NULL doesn't work in Apache Derby, which is odd.
I am also interested in the answer for other databases. If you can answer that too, it would be great.
EDIT
Reply to comment: I want to select NULL from TABLE, where NULL stands for an empty value of a DATE type column. I just want to return a NULL value manually, but it throws an exception telling me NULL is invalid. Is that possible?
EDIT
I found a workaround to return a constant value for the date type in Derby: using Date(789), which is a date function in Derby. But I am still interested in how to return an empty constant value for a Date.
EDIT
NULL in a query in Derby must be cast to the correct type, which is a bit different.
It should use the cast(null as DATE) AS START_DATE expression, as in the link.
I just add it here for others to find easily. One interesting thing is that even though the column START_DATE is NOT NULL, I can still fake a NULL return value for this column. Actually, that should not be a surprise at all, right?
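A minimal sketch in Derby (the table and the ID column here are hypothetical):
-- Every group gets a typed NULL in the START_DATE column:
SELECT ID, CAST(NULL AS DATE) AS START_DATE
FROM MY_TABLE
GROUP BY ID;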
Since your column is NULL-able, you should also declare the property as a nullable DateTime: DateTime? MyDate;