Alter Column type in Teradata - sql

I have an export job from Datameer going through to Hive. The issue is that we were told Hive converts date columns to string. I am feeding the data from Hive to Tableau, and the date column being stored as a string is completely throwing off my data.
I am looking to convert/alter my existing column "Posting_Date" from string back to date. Our Hive is based off a Teradata interface, so I am trying to find a command that will let me convert this column back to date format.
I tried the following:
ALTER table Database.Table1
ADD posting_date date(4)
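For reference, if the table really lives in Hive, a minimal sketch of the kind of statement that changes the column's declared type (table and column names taken from the question) might be:

ALTER TABLE Database.Table1 CHANGE COLUMN posting_date posting_date DATE;

Note that in Hive this only rewrites the table metadata; the stored string values still need to be in yyyy-MM-dd form to read back as dates, otherwise they come back as NULL.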

Related

Trouble converting CSV date from non-standard format SQL

I have dates in the format 01jan2020 (without a space or any separator) and need to convert this to a date type in SQL Server 2016 Management Studio.
The data was loaded from a .CSV file into a table (call it TestData, column is Fill_Date).
To join on a separate table to pull back data for another process, I need the TestData column Fill_Date to be in the correct format (MM-DD-YYYY) for my query to run correctly.
Fill_Date is currently in table TestData as datatype varchar(50).
I want to either convert it in place in the TestData table or insert the result directly into a second table that is formatted correctly.
Thanks (NEWB)
I ended up solving this by converting the data while dropping it into a temp table, deleting the old values, and then inserting from that table back into the TestData table.
CONVERT(VARCHAR,CONVERT(date,[fill_date]),101) AS fill_date
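For reference, a minimal sketch of that temp-table round trip, assuming Fill_Date is the only column involved (the temp table name is illustrative):

-- convert while copying into a temp table
SELECT CONVERT(VARCHAR(10), CONVERT(date, [fill_date]), 101) AS fill_date
INTO #tmp_fill
FROM TestData;

-- remove the old values
TRUNCATE TABLE TestData;

-- reload the reformatted values
INSERT INTO TestData (Fill_Date)
SELECT fill_date FROM #tmp_fill;

DROP TABLE #tmp_fill;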

SQL convert varchar into Date

A long time ago I created a database and completely forgot to set the column to Date. Looking at the data I want to extract now, it looks like 2006-05-06.
How would I run a SQL statement to convert it into the correct format (dd/MM/yyyy), i.e. 06/05/2006? I'm using the British format "103".
What I was planning on doing: I've already added a second column (s_batch_convert2) to the table, hoping to convert into that, then delete the original column (s_batch_convert) and rename the new column to the old name.
UPDATE s_service_repairs
SET s_batch_convert2 = TRY_CONVERT(Date, s_batch_convert, 103)
Am I along the right lines?
You should convert your existing column to a bona fide date. It seems to have the right format:
alter table s_service_repairs alter column s_batch_convert date;
Then you can add a computed column for the format you want:
alter table s_service_repairs add s_batch_convert_ddmmyyyy as ( try_convert(varchar(10), s_batch_convert, 103) );
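For example, a quick sanity check of the converted values might look like:

select top (10) s_batch_convert, s_batch_convert_ddmmyyyy
from s_service_repairs;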

SparkSQL gets null values for float fields from a Hive table

I created and imported a Hive table with Sqoop and use PySpark to get the data. The table is composed of one string field, one int field and several float fields. I can get all of the data with a Hue Hive SQL query, but when I query with PySpark SQL the non-float fields display fine and the float fields always show null values.
(Screenshots omitted: the Hue Hive SQL results, the Zeppelin PySpark output, and the details of the Hive table.)
I finally found the cause. Since I imported these tables from MySQL via Sqoop, the original table columns were uppercase, and in Hive they were converted to all lowercase automatically. That is why the values of those fields could not be retrieved by SparkSQL (Hue Hive queries the data normally, so it might be a bug in Spark). I had to convert the uppercase field names to lowercase by specifying the --query option in the Sqoop command, i.e. --query 'select MMM as mmm from table...'
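For reference, a hedged sketch of such a Sqoop import using --query (connection details, table and column names here are made up; only the uppercase-to-lowercase aliasing is the point, and Sqoop requires the WHERE $CONDITIONS placeholder in free-form queries):

sqoop import \
  --connect jdbc:mysql://dbhost/source_db \
  --username etl_user -P \
  --query 'SELECT MMM AS mmm, PRICE AS price FROM source_table WHERE $CONDITIONS' \
  --split-by mmm \
  --target-dir /tmp/source_table_staging \
  --hive-import \
  --hive-table default.source_table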

Kettle ETL: how to convert to a time data type

I have a Table Input step that gets some data from a SQL Server table. One field has values of type time, e.g. 02:22:57.0000000. The destination table (Table Output step) is a PostgreSQL table and has the time data type for that field, but PDI seems to think the time from the source table is of type string and generates an error.
ERROR: column "contact_time" is of type time without time zone but expression is of type character varying
I tried using the Select Values step, but there is no time type, only Date and Timestamp. What should I do?
You can use the Select Values step and, in the Meta-data tab, set Type to Timestamp and Format to HH:mm:ss.
This will convert your string input to a timestamp.
Hope this helps :)

Convert datatype of existing data from Char to DateTime with SQL

I am relatively new to SQL Server and I am trying to update the datatype of about 3000 records from char to datetime so I can use the DATEDIFF function. I created a view that achieves the conversion, but what I think I need to do is alter the data in the origin table.
SELECT
CONVERT(datetime, CONVERT(char(8), TRANS_ACCOUNTINGDATE_ALLCAMPAIGNS_2010_ALLPROCESSINGACCOUNTS_ALL))
FROM Accounting;
What I think I need to do is an alter table and iterate over each row performing the conversion. Trying to change the data type using the GUI is not working for me.
Any help would be appreciated.
Thanks
The datatype is an attribute of the COLUMN, not just of the data inside the column. You can't put datetime data into a char field - that's the purpose of data types!
You need to add a new field and run an UPDATE statement to populate it with your converted data. Then you can drop the original field and rename your new one to the original name.
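A minimal sketch of that add/populate/drop/rename sequence, run against the Accounting table from the question (the new column name TRANS_ACCOUNTINGDATE_DT is made up, and TRY_CONVERT is used here so rows that cannot be parsed become NULL instead of failing the UPDATE):

-- add the new datetime column
ALTER TABLE Accounting ADD TRANS_ACCOUNTINGDATE_DT datetime;
GO

-- populate it with the converted values
UPDATE Accounting
SET TRANS_ACCOUNTINGDATE_DT =
    TRY_CONVERT(datetime, CONVERT(char(8), TRANS_ACCOUNTINGDATE_ALLCAMPAIGNS_2010_ALLPROCESSINGACCOUNTS_ALL));
GO

-- drop the original char column and give the new column its name
ALTER TABLE Accounting DROP COLUMN TRANS_ACCOUNTINGDATE_ALLCAMPAIGNS_2010_ALLPROCESSINGACCOUNTS_ALL;
GO

EXEC sp_rename 'Accounting.TRANS_ACCOUNTINGDATE_DT',
     'TRANS_ACCOUNTINGDATE_ALLCAMPAIGNS_2010_ALLPROCESSINGACCOUNTS_ALL', 'COLUMN';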