Currently, I am trying to union several tables in Hive. After I managed that, I found that some column types were not right: some columns that should be float came out as string. So I ran this alter command: alter table table_name change column_name column_name float; It returned this error message:
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. Unable to alter table. The following columns have types incompatible with the existing columns in their respective positions
I took this to mean that Hive does not support converting a string column to float in a table. But I found that I can CAST the same column and get the result I want. This confuses me: why do ALTER and CAST behave differently? What is the logic behind it? Thanks.
It seems you have missed the CHANGE keyword.
alter table table_name CHANGE column_name column_name_new float;
See here: https://cwiki.apache.org/confluence/display/Hive/LanguageManual+DDL#LanguageManualDDL-ChangeColumnName/Type/Position/Comment
Demo:
hive> create table t(a string);
OK
Time taken: 0.069 seconds
hive> alter table t change a a_new float;
OK
Time taken: 0.158 seconds
hive> describe formatted t;
OK
# col_name data_type comment
a_new float
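If the in-place ALTER is still rejected for your existing table, one workaround (a sketch with placeholder names, not part of the answer above) is to materialize the cast that already works for you, using CREATE TABLE ... AS SELECT:
-- placeholder names: rewrite the column through the cast instead of altering in place
create table table_name_new as
select cast(column_name as float) as column_name
from table_name;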
One of my columns is defined as the ntext datatype, which is deprecated and no longer supported in many places. I'm trying my best to convert that column to numeric or int, and every attempt is getting nowhere.
reg_no #my ntext field | name | country | etc |
I tried to alter the column using the usual command, but it failed:
alter table tabl1
alter column [reg_no] numeric(38,0)
Error:
Error converting data type nvarchar to numeric
Any suggestions on fixing this? Or if anyone has come across this in the past, how did you get past it?
You should be able to do this in two steps:
alter table tabl1 alter column [reg_no] nvarchar(max);
alter table tabl1 alter column [reg_no] numeric(38,0);
ntext is deprecated and conversion to numeric is not supported, but converting to nvarchar() is supported.
This assumes that the values are compatible with numeric. Otherwise you will get a type conversion error. If this happens, you can get the offending values by using:
select *
from tabl1
where try_convert(numeric(38, 0), try_convert(nvarchar(max), reg_no)) is null
Try:
select convert(int, convert(varchar(40), reg_no)) as newfieldname from tabl1
I have a problem inserting values into my SQL Server database on Azure; I am getting the following error:
Failed to execute query. Error: String or binary data would be truncated in table 'dummy_app.dbo.user_info', column 'uid'. Truncated value: 'u'.
The statement has been terminated.
I don't understand where I am going wrong. I just created the server and am trying to experiment, but I can't fix this.
if not exists (select * from sysobjects where name='user_info' and xtype='U')
create table user_info (
uid varchar unique,
name varchar,
email varchar
)
go;
INSERT INTO dbo.user_info(uid, name, email) VALUES('uids', 'name', 'email') go;
Creating the table works fine; the only thing that doesn't work is the second command, the INSERT.
I suspect that the reason is that you haven't defined a length for varchar, so it defaults to a length of 1. Therefore your value gets truncated.
Set a varchar length to something like varchar(200) and you should be good to go.
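For example, a minimal sketch of the corrected definition (the length of 200 here is an arbitrary choice, not a value from the question):
create table user_info (
    uid varchar(200) unique,   -- explicit length instead of the default of 1
    name varchar(200),
    email varchar(200)
)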
This looks like the CREATE portion of your script doesn't include a length for the varchar columns, so you'd have to specify a length such as varchar(50), since the default is 1. Refer to the official MS docs in the link, in the Remarks section.
docs.microsoft.com
Also, here is the syntax for the CREATE TABLE in Azure which might be helpful as well.
Syntax of Azure CREATE TABLE
I have a database table that I use to store data returned from a web spider. One column contains ticket prices for different events, all stored as varchar (since the Scrapy spider has to scrape the data as unicode). I'm trying to return the minimum price in the column, and since the min() function only works for data of type INT, I tried to convert the column to integer using a solution from this SO post:
ALTER TABLE vs_tickets ALTER COLUMN ticketprice TYPE integer USING (ticketprice::integer);
but I got the error: ERROR: invalid input syntax for integer:
I also tried: change_column :vs_tickets, :ticketprice, 'integer USING CAST(ticketprice AS integer)' but that didn't work either.
What is the proper way to convert the column to type INT?
You have decimal places in the string, so a simple cast is not going to work. You can do a double conversion:
cast(cast(ticketprice as decimal(10, 2)) as int)
or:
(ticketprice::decimal(10, 2))::int
(The parens are not strictly necessary.)
EDIT:
Or, as Erwin points out, just use numeric:
(ticketprice::numeric)::int
Postgres is much smarter about numeric than most other databases . . . after all, it supports numbers that are egregiously large ;)
The final query is:
ALTER TABLE vs_tickets
ALTER COLUMN ticketprice TYPE integer USING (ticketprice::numeric::integer);
I'm going to bet that your column has some invalid characters.
Also, you may want to use float or numeric, because you will lose the decimals if you convert to integer.
You need to create a function that checks whether a text value is numeric, like this: isnumeric-with-postgresql
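A minimal sketch of such a helper, assuming plain digits with an optional decimal part are enough for these prices:
CREATE OR REPLACE FUNCTION isnumeric(text) RETURNS boolean AS $$
    -- true when the value is only digits, optionally followed by a decimal part
    SELECT $1 ~ '^[0-9]+(\.[0-9]+)?$';
$$ LANGUAGE sql IMMUTABLE;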
Then check each row like this:
select ticketprice
from vs_tickets
where ISNUMERIC(ticketprice) = false;
As per your comment, you could also try:
SELECT ticketprice::float
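If keeping the decimals matters, the ALTER pattern from the accepted answer can target float instead of integer (an adaptation, not something stated elsewhere in this thread):
ALTER TABLE vs_tickets ALTER COLUMN ticketprice TYPE float USING (ticketprice::float);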
You will be best off adding an INT column, moving your data with a cast and then removing the old varchar column.
ALTER TABLE vs_tickets ADD COLUMN ticketprice_int int;
UPDATE vs_tickets SET ticketprice_int = cast(ticketprice as int);
-- if the direct cast from varchar to int fails, you can use Gordon's method:
-- UPDATE vs_tickets SET ticketprice_int = cast(cast(ticketprice as decimal(10, 2)) as int);
ALTER TABLE vs_tickets DROP COLUMN ticketprice;
ALTER TABLE vs_tickets RENAME COLUMN ticketprice_int TO ticketprice;
With this, at a minimum, you will be able to tell if and where a cast fails, and you can check and recheck each step before you reach the point of no return.
I have a database with a varchar column that I wish to convert to a timestamp. I'd like to do something like the following, but I keep getting a syntax error. Can someone please advise?
ALTER TABLE MY_TABLE
ALTER COLUMN MY_COLUMN TYPE timestamp
USING to_timestamp(MY_COLUMN::double precision);
MY_COLUMN is of type VARCHAR(255) NOT NULL
Error reads:
Syntax error in SQL statement "ALTER TABLE MY_TABLE
ALTER COLUMN MY_COLUMN TYPE TIMESTAMP
USING[*] TO_TIMESTAMP(MY_COLUMN::DOUBLE PRECISION) "; SQL statement:
ALTER TABLE MY_TABLE
ALTER COLUMN MY_COLUMN TYPE timestamp
USING to_timestamp(MY_COLUMN::double precision) [42000-176] 42000/42000
It would appear that the PostgreSQL running within my Grails application doesn't support to_timestamp. As pointed out by @a_horse_with_no_name in the comments on the question, the code itself works. Thanks for pointing out sqlfiddle.com; I hadn't realised such a resource existed.
I am using the Apache Derby database, v10.9.1.0. There is an existing table Country with a column LawID of type bigint, which contains integer data only. For business reasons, I need to change its data type from bigint to varchar. I tried the following two ways to alter the existing table, but neither worked.
a. first way
ALTER TABLE Country ADD COLUMN LawID_NEW VARCHAR(50);
UPDATE Country SET LawID_NEW = LawID;
ALTER TABLE Country DROP COLUMN LawID;
RENAME COLUMN Country.LawID_NEW TO LawID;
It shows a message like: Columns of type 'VARCHAR' cannot hold values of type 'BIGINT'.
b. second way
ALTER TABLE Country ALTER LawID SET DATA TYPE VARCHAR(50);
It shows an error message like: Invalid type specified for column 'LawID'. The type of a column may not be changed.
Any help with the correct ALTER query is highly appreciated. Thanks.
I think the first method would work with this change:
UPDATE Country SET LawID_NEW = TRIM(CHAR(LawID));
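Putting that together with the first approach from the question, the full sequence would look something like this (a sketch based on the steps above, not verified against Derby 10.9):
ALTER TABLE Country ADD COLUMN LawID_NEW VARCHAR(50);
-- CHAR() renders the bigint as text and TRIM removes the blank padding
UPDATE Country SET LawID_NEW = TRIM(CHAR(LawID));
ALTER TABLE Country DROP COLUMN LawID;
RENAME COLUMN Country.LawID_NEW TO LawID;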
ALTER TABLE tablename MODIFY columnname VARCHAR(20);
This works in MySQL. Give it a try.
Or
ALTER TABLE tablename CHANGE columnname columnname VARCHAR(20);