Oracle will corrupt other column's data after DROP COLUMN [duplicate] - sql

This question already has answers here:
Why does this Oracle DROP COLUMN alter the default value of another column?
(1 answer)
Oracle bug when adding not nullable columns with default
(2 answers)
Closed 1 year ago.
I have a table with some other columns and preexisting data in my development database. I need to add four more columns to store data for a new feature.
I've added four new columns to this table with the following commands:
alter table my_table add (pin_validacao_cadastro varchar2(6 char) default '000000' not null);
alter table my_table add (tentativas_validacao_pin number default 0 not null);
alter table my_table add (codigo_bloqueio number default 1 not null check (codigo_bloqueio in (0, 1, 2)));
alter table my_table add data_validacao_cadastro date;
Then I discovered that the first definition needed to change, because the default value should have been different. So I dropped the first column (pin_validacao_cadastro).
alter table my_table drop column pin_validacao_cadastro;
Surprisingly enough, before I even tried to recreate the first column with the correct default value, I noticed that the second column (tentativas_validacao_pin) had been altered: all of its values were now NULL, when they should be 0.
So I dropped the second column (tentativas_validacao_pin) to recreate it and fix the corruption.
alter table my_table drop column tentativas_validacao_pin;
But wait! Before I had the chance to recreate it, I noticed that all values of the third column (codigo_bloqueio) were now equal to 0. Before the DROP command, all values of this column were equal to 1 (the default value for this column).
What am I missing here? Is this supposed to happen? It seems that the default value of the dropped column is being applied to the next existing column.
Since the problem occurs with different database tools (SQL Developer, SQL*Plus, PL/SQL Developer), I think it is something related to the Oracle database itself.
Can anyone explain what is happening?
I'm using Oracle 11g.
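A possible workaround (a minimal sketch, assuming the corruption is tied to 11g's optimization for columns added with DEFAULT ... NOT NULL, as the duplicate links above suggest) is to add each column as nullable with no default, backfill it, and only then attach the default and the NOT NULL constraint, so the column is stored physically rather than as a metadata-only default:
-- re-create the first column without combining DEFAULT and NOT NULL at ADD time
-- ('111111' is a placeholder for whatever the corrected default should be)
alter table my_table add (pin_validacao_cadastro varchar2(6 char));
update my_table set pin_validacao_cadastro = '111111';
alter table my_table modify (pin_validacao_cadastro default '111111' not null);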

Related

Alter column of type timestamp

I want to change a column of type timestamp to varbinary(8) in an empty table.
I used the following command
ALTER TABLE Notification ALTER COLUMN RowRevisionID varbinary(8)
and I get an error
Cannot alter column 'RowRevisionID' because it is 'timestamp'.
How can I change a timestamp column type?
I do not wish to drop the column and add a new one, because that will create the column at the end, and I wish to preserve the column order so I can use this table in an INSERT INTO.
You unfortunately cannot make a change to a timestamp column, as the error implies; you are stuck with what you have. Also, each table can only have one timestamp column, so you cannot duplicate the column in any solution.
Your best bet (depending on the size of the table) might be to copy the data into a staging table (using SELECT * INTO MyTempTable FROM OriginalTable syntax to preserve the timestamp values), then drop and recreate the original table with the desired columns in the desired order and reinsert the data. Alternatively, you could add a new VARBINARY(8) column to the existing table and drop the timestamp column, preserving the original table. There are probably other solutions along the same lines, but all will require a new column rather than an ALTER COLUMN script.
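A minimal sketch of the second option (the interim column name RowRevisionBytes is an assumption, and the copy step is only there for completeness since the question's table is empty):
-- add a varbinary(8) column, copy the current timestamp bytes into it,
-- drop the timestamp column, then rename the new column into its place
ALTER TABLE Notification ADD RowRevisionBytes varbinary(8);
GO
UPDATE Notification SET RowRevisionBytes = CAST(RowRevisionID AS varbinary(8));
ALTER TABLE Notification DROP COLUMN RowRevisionID;
EXEC sp_rename 'Notification.RowRevisionBytes', 'RowRevisionID', 'COLUMN';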
You are looking for:
ALTER TABLE Notification DROP COLUMN RowRevisionID;
and
ALTER TABLE Notification ADD RowRevisionID varbinary(8);
(Note that an AFTER myOtherColumn clause to position the new column, as in MySQL, is not supported in SQL Server; the re-added column will go at the end.)

Alter the data type of a column in MonetDB

How can I alter the type of a column in an existing table in MonetDB? According to the documentation the code should be something like
ALTER TABLE <tablename> ALTER COLUMN <columnname> SET ...
but then I am basically lost: I do not know which SQL standard MonetDB follows here, and I get a syntax error. If this statement is not possible, I would be grateful for a workaround that is not too slow for large tables (on the order of 10^9 records).
Note: I ran into this problem while doing some bulk data imports from csv files into a table in my database. One of the columns is of type INT but the values in the file at some point exceed the INT limit of 2^31-1 (yes, the table is big) and so the transaction aborts. After I found out the reason for this failure, I wanted to change it to BIGINT but all versions of SQL code I tried failed.
This is currently not supported. However, there is a workaround:
Example table for this example; say we want to change the type of column b from integer to double:
create table a(b integer);
insert into a values(42);
1) Create a temporary column: alter table a add column b2 double;
2) Set data in temporary column to original data: update a set b2=b;
3) Remove the original column: alter table a drop column b;
4) Re-create the original column with the new type: alter table a add column b double;
5) Move data from temporary column to new column: update a set b=b2;
6) Drop the temporary column: alter table a drop column b2;
7) Profit.
Note that this will change the ordering of columns if there are more than one. However, this is only a cosmetic issue.
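Applied to the question's INT-to-BIGINT case, the same sequence would look like this (the table and column names big_values and n are hypothetical stand-ins for the real ones):
-- widen an INT column n to BIGINT via a temporary column n2
alter table big_values add column n2 bigint;
update big_values set n2 = n;
alter table big_values drop column n;
alter table big_values add column n bigint;
update big_values set n = n2;
alter table big_values drop column n2;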

How best to sum 2 columns and update 3rd column with sum?

I am looking for the best way to add 2 or more columns in a SQL Server table and update another column with their sum.
Yes, I know this is a dumb thing to do and calculations should be done at the time of the transaction, but I am modifying an existing table where the data in a column now needs to be more detailed, and numerous processes will still use the column value.
For example, a column named TotalDailyMiles is accessed and used by numerous processes. Now more detail is needed: 2 columns, TotalAMMiles and TotalPMMiles, need to be added to the table, and these 2 columns will sum to the existing column. Changing all the processes that access the TotalDailyMiles column to use the 2 new columns instead is not an option. The data for the new columns does not exist in old records, so the column holding the sum of the 2 new columns cannot be based on them in old records, where the new column values will be 0 (or maybe NULL, but I'm leaning toward 0 so I can set the new columns as NOT NULL).
I'm thinking of using a trigger to update the column holding the sum based on the new columns changing but I'm hoping one of you smart people have a better option.
How about treating the existing column as its own value (which will be 0 in future rows), adding the two new columns, and then creating a calculated column with the same name as the old Total? Something like this (I'm assuming a data type of decimal(7, 2) but of course use what you have, though I hope it's not float):
EXEC sp_rename 'dbo.Miles.TotalDailyMiles', 'DailyMiles', 'COLUMN';
ALTER TABLE dbo.Miles ADD AMMiles decimal(7, 2) NOT NULL
CONSTRAINT DF_Miles_AMMiles DEFAULT (0);
ALTER TABLE dbo.Miles ADD PMMiles decimal(7, 2) NOT NULL
CONSTRAINT DF_Miles_PMMiles DEFAULT (0);
ALTER TABLE dbo.Miles ADD TotalDailyMiles
AS (DailyMiles + AMMiles + PMMiles) PERSISTED;
Some possible housekeeping that might be needed on the DailyMiles column, too:
-- if not already NOT NULL
ALTER TABLE dbo.Miles ALTER COLUMN DailyMiles decimal(7, 2) NOT NULL;
-- if not already defaulting to 0
ALTER TABLE dbo.Miles ADD
CONSTRAINT DF_Miles_DailyMiles DEFAULT (0) FOR DailyMiles;
You could additionally add a constraint that either DailyMiles must be 0, or AMMiles and PMMiles must both be 0:
ALTER TABLE dbo.Miles ADD CONSTRAINT CK_Miles_DailyOrAMPM
CHECK (DailyMiles = 0 OR (AMMiles = 0 AND PMMiles = 0));
As long as consumers of the data don't try to update the TotalDailyMiles column, you've solved your problem handily.
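A quick sanity check of the computed column (a minimal sketch that assumes, for illustration, that dbo.Miles has no other required columns):
-- a legacy-style row carries its value in DailyMiles; a new-style row uses the AM/PM split
INSERT INTO dbo.Miles (DailyMiles, AMMiles, PMMiles) VALUES (12.50, 0, 0);
INSERT INTO dbo.Miles (DailyMiles, AMMiles, PMMiles) VALUES (0, 5.25, 7.25);
SELECT DailyMiles, AMMiles, PMMiles, TotalDailyMiles FROM dbo.Miles;
-- TotalDailyMiles reports 12.50 for both rows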

Populate a column with a value based on the value of another column (ALTER TABLE) - sql

I have a table 'Data' with a column 'Date'
I need to add another column called flag and populate it with 0 if the date is less than 2 years before the current date, and with 1 if the date is more than 2 years before the current date.
I did it by adding the column using ALTER TABLE and then using an UPDATE ... SET statement, as below:
alter table data add flag INTEGER constraint flag_value check (flag in(0,1));
Is there a way to do this using just one alter table statement without using update set?
Is there a way to do this using just one alter table statement without using update set?
Not in SQLite, which does not support computed columns, and there is no way to run an UPDATE as part of an ALTER TABLE statement.
You could use the view approach from the linked question, or do as you've done and issue separate ALTER TABLE and UPDATE statements.
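For completeness, a minimal sketch of the two-statement route (assuming the Date column stores ISO-8601 text, so SQLite's date functions and plain string comparison apply):
alter table data add flag INTEGER constraint flag_value check (flag in (0, 1));
-- backfill: 0 for dates within the last 2 years, 1 for older dates
update data
set flag = case when "Date" >= date('now', '-2 years') then 0 else 1 end;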

How to insert columns in between in a table in SQL Server 2008

I want to add or update columns using ALTER TABLE, but when I am adding a new column I get an error. I am using the code below:
alter table Personal_Details alter column DOB datetime
If I uncheck Allow Nulls (changing NULL to NOT NULL), it shows 'column does not allow nulls; update fails'.
I want to insert the fields in between existing columns, not at the end.
Please fix my bug.
Thanks in advance.
The position of the column in the table declaration has nothing to do with its being NULL or NOT NULL.
If you are adding a column (of any type) which you want to be NOT NULL, i.e. you want to prohibit NULL values in that column, and the table already contains some rows, you must also provide some default value. For example:
ALTER TABLE Personal_Details
ADD DOB datetime NOT NULL DEFAULT (GETDATE())
Otherwise the engine will attempt to add that column with NULLs as its values, which will violate the NOT NULL property, and the change, therefore, will be reverted.
Basically, the same applies when you want to set an existing column's NOT NULL property on while the column already contains NULLs. But in this case you must explicitly eliminate the NULLs before the change by either replacing them with values or removing the respective rows.
Source:
ALTER TABLE (Transact-SQL). (The particular section related to your problem is just above this code snippet.)
1) For your 'adding a column with NOT NULL' problem, use:
ALTER TABLE Personal_Details ADD DOB datetime NULL
Update the DOB column with the required dates and make sure there are no NULLs left in the column (a combined sketch follows at the end of this answer), then alter the column using:
ALTER TABLE Personal_Details ALTER COLUMN DOB datetime NOT NULL
2) For your 'column going to the end' problem: you should not be worried. The order in which the columns are arranged doesn't matter, unless you are using a pathetic way of accessing data by column order, in which case, again, you should stop accessing it by column order.
If the column order really matters, you can change it using the Design option in SQL Server Management Studio (right-click on the table > Design, and drag the column to its required place).
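A minimal combined sketch of step 1 above (the backfill value '1900-01-01' is a placeholder assumption; use whatever makes sense for missing dates of birth):
-- add the column as nullable, backfill it, then tighten it to NOT NULL
ALTER TABLE Personal_Details ADD DOB datetime NULL;
GO
UPDATE Personal_Details SET DOB = '1900-01-01' WHERE DOB IS NULL;
ALTER TABLE Personal_Details ALTER COLUMN DOB datetime NOT NULL;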