How do I convert this Oracle date - SQL

I am trying to convert from one date format to another. I am not sure how to write the functions.
My source date looks like 01/15/2009 01:23:15
The format I need is 01152009.
Thanks

Try this.
TO_CHAR(TO_DATE('01/15/2009 01:23:15','MM/DD/YYYY HH:MI:SS'),'MMDDYYYY')
More info here,
http://psoug.org/reference/date_func.html
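A minimal sketch of the same idea, assuming the source value is stored as a string column called date_str in a hypothetical table t (both names are placeholders):
select to_char(to_date(date_str, 'MM/DD/YYYY HH24:MI:SS'), 'MMDDYYYY') as formatted_date from t;
HH24 is used here in case the hours can exceed 12; the literal example above works fine with HH.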

Does this work for you? It assumes the column is a DATE, but it will also work with a TIMESTAMP.
select to_char(YourDateField,'MMDDYYYY') from your_table;
You can always convert it back to a date using the TO_DATE function if you need that format.
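For example, a sketch that round-trips the string back to a DATE value (column and table names are placeholders):
select to_date(to_char(YourDateField,'MMDDYYYY'), 'MMDDYYYY') as day_only from your_table;
The time portion is dropped, so this returns midnight of that day.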

select TO_CHAR(TO_DATE('01/15/2009 01:23:15','MM/DD/YYYY HH:MI:SS'),'MMDDYYYY') from dual
If your field is already of data type DATE then you only need to do:
select TO_CHAR(<fieldname>,'mmddyyyy') ...

Related

Change date format in oracle query

When running
select processing_date from table;
I got this result: "04-30-2020 20.12.49.978711".
I want to change the format of the result to "30-APR-20".
Is there a way I can do that?
I tried select to_date(processing_date,'mm-dd-yyyy') from table; but it gives me errors.
Any help?
You want to_char():
select to_char(processing_date, 'DD-MON-YY')
Dates are stored in an internal format, which you cannot change. If you want the date displayed in a particular way, then one solution is to convert it to a string with the format you want.
EDIT:
The date appears to be a string. You can convert it to a date using:
select to_date(substr(processing_date, 1, 10), 'MM-DD-YYYY')
You can then either use it as-is or wrap it in to_char() to get the format you really want.
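Putting the two steps together, a sketch that produces the requested DD-MON-YY output (processing_date is the column from the question; your_table stands in for the real table name):
select to_char(to_date(substr(processing_date, 1, 10), 'MM-DD-YYYY'), 'DD-MON-YY') from your_table;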

SQLite compare dates without timestamp

In SQLite, how to compare date without passing timestamp?
The stored value looks like 2018-03-18 08:24:46.101655+00 and I want to compare against only the date part, 2018-03-18.
I have tried as where mydate='2018-03-18' but that didn't return any records.
Similarly, tried Date(mydate)='2018-03-18' but that didn't help either.
How can I compare dates ignoring the timestamp part?
select * from mytable
where strftime('%Y-%m-%d', mydate) = '2018-03-18'
This is not one of the supported date formats.
To extract the date part from the string, use substr():
... WHERE substr(mydate, 1, 10) = '2018-03-18'
It might be a better idea to store dates in a correct format in the database to begin with.
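A sketch of that last point: if the value were stored in a supported format such as '2018-03-18 08:24:46', the built-in date() function would compare cleanly:
select * from mytable where date(mydate) = '2018-03-18';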
It looks like there is a problem with the date format.
SQLite doesn't understand data like '+00' in a date string.
So date() and strftime() will not work here if the data type is 'timestamp with time zone'.
Try using a LIKE clause instead.
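For example, since the stored text starts with the date itself, a simple prefix match works (table and column names taken from the question):
select * from mytable where mydate like '2018-03-18%';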
Try using strftime:
SELECT strftime('%Y-%m-%d', columnName);
You can find more about it here: strftime.php

Date type conversion in hive

I have a date column in String type in 'MM/dd/yyyy' format.
I have to convert this into 'dd/MM/yyyy' format.
How can I achieve this in Hive/Impala?
You can use it like this:
select from_unixtime(unix_timestamp(date ,'MM/dd/yyyy'), 'dd/MM/yyyy') from date_test;
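For instance, running it against a literal first shows the conversion (the literal is used only for illustration):
select from_unixtime(unix_timestamp('01/15/2009','MM/dd/yyyy'), 'dd/MM/yyyy');
This returns 15/01/2009.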
Let me know if this works.
Since you only need to rearrange existing substrings, you can use substr():
concat_ws('/',substr(date,4,2),substr(date,1,2),substr(date,7,4))
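For example, with a literal value (again only for illustration):
select concat_ws('/', substr('01/15/2009',4,2), substr('01/15/2009',1,2), substr('01/15/2009',7,4));
This returns 15/01/2009.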

How to change date format in hive?

My table in Hive has a date field in the format '2016/06/01', but I find that it is not consistent with the format '2016-06-01'.
They cannot be compared, for instance.
Both of them are strings.
So I want to know how to make them consistent so I can compare them, or, on the other hand, how to change '2016/06/01' to '2016-06-01' so that they can be compared.
Many thanks.
To convert a date string from one format to another you have to use two date functions of Hive:
unix_timestamp(string date, string pattern) converts a time string with the given pattern to a Unix timestamp (in seconds), returning 0 on failure.
from_unixtime(bigint unixtime[, string format]) converts the number of seconds from the Unix epoch (1970-01-01 00:00:00 UTC) to a string representing the timestamp of that moment in the current system time zone.
Using the above two functions you can achieve your desired result.
The final query is
select from_unixtime(unix_timestamp('2016/06/01','yyyy/MM/dd'),'yyyy-MM-dd') from table1;
where table1 is the name of a table in my Hive database.
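Applied to a column rather than a literal (date_col is a placeholder column name):
select from_unixtime(unix_timestamp(date_col,'yyyy/MM/dd'),'yyyy-MM-dd') from table1;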
I hope this helps!
Let's say you have a column 'birth_day' in your table which is in that format; you should use the following query to convert birth_day into the required format:
date_format(birth_day, 'yyyy-MM-dd')
You can use it in a query in the following way:
select * from yourtable
where
date_format(birth_day, 'yyyy-MM-dd') = '2019-04-16';
Use:
unix_timestamp(DATE_COLUMN, string pattern)
The above converts the date to a Unix timestamp, which you can then format however you want using the Hive date functions, for example:
cast(to_date(from_unixtime(unix_timestamp(yourdate, 'MM-dd-yyyy'))) as date)
Here is my solution (for converting a string to a real Date type):
select to_date(replace('2000/01/01', '/', '-')) as dt;
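Applied to a column instead of a literal (column and table names are placeholders):
select to_date(replace(date_col, '/', '-')) as dt from your_table;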
PS: to_date() returns a Date type; this feature needs Hive 2.1+; before 2.1 it returns a String.
PS2: Hive's to_date() function, date_format() function, and even cast() cannot recognise the 'yyyy/MM/dd' or 'yyyymmdd' formats, which I think is a pity and drove me a little crazy.

MS Access - Select Char as Date and doing a date diff

I have two columns, ColA and ColB, each containing char(10) data such as "20090520" and "20090521".
I want to select and get the date difference in days. I have tried using Format() and CDate(),
but MS Access always displays #ERROR.
Access prefers its dates in this format:
#2009-12-01#
You can convert your date to something Access understands with:
CDate(Format([ColA], "0000-00-00"))
Or alternatively:
DateSerial(Left([ColA],4),Mid([ColA],5,2),Right([ColA],2))
And to display the result in your preferred format:
Format(<date here>, "dd-mm-yyyy")
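To get the difference in days itself, a sketch using Access's DateDiff() together with the conversion above (YourTable is a placeholder name):
SELECT DateDiff("d", CDate(Format([ColA], "0000-00-00")), CDate(Format([ColB], "0000-00-00"))) AS DaysBetween
FROM YourTable;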
Try using DateSerial() to convert the dates:
DateSerial(Left([FieldName],4),Mid([FieldName],5,2),Right([FieldName],2))
If at all possible, change the datatype to a date datatype. You should not store dates as character data.
I am connecting to another database over which I have no control. That is why this problem occurred. Thanks for the feedback.