Alter every double precision field of a table - SQL

I need to change every double precision field of a table to the numeric(15,3) type.
How can I do this quickly with a stored procedure that iterates through the columns of a given table and, if the type is double precision, alters the column to numeric?

The following query generates the ALTER statements for these columns; you can add a table filter if you want. Copy the result and run it.
select 'ALTER TABLE ' || table_name || ' ALTER COLUMN ' || column_name || ' TYPE numeric(15,3);'
from information_schema.columns
where data_type = 'double precision'
and table_name = 'YOUR_TABLE_NAME'
TEST IT BEFORE YOU RUN IT. I HAVE NOT TESTED THIS YET.
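If you want a single step rather than copying the generated statements, a PL/pgSQL DO block can loop over the columns and execute the ALTERs directly. A minimal sketch, assuming PostgreSQL and the same YOUR_TABLE_NAME placeholder (again, untested):
DO $$
DECLARE
    col record;
BEGIN
    -- find every double precision column of the target table
    FOR col IN
        SELECT table_name, column_name
        FROM information_schema.columns
        WHERE data_type = 'double precision'
          AND table_name = 'YOUR_TABLE_NAME'
    LOOP
        -- alter each one to numeric(15,3)
        EXECUTE format('ALTER TABLE %I ALTER COLUMN %I TYPE numeric(15,3)',
                       col.table_name, col.column_name);
    END LOOP;
END $$;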

While the other answer will do it for all columns, a simple alter table [tablename] alter column [columnToAlter] type numeric(15,3) works for a single column. You shouldn't need to run them through a cursor; any value that's not affected by the change will remain unaffected.
If you can't do it by changing the data type itself, a simple update [tablename] set [columnname] = cast(columnname as numeric(15,3)) should also work.
Hope that helps!

Related

BigQuery Drop Table Column - DDL Bug

After removing a column from a table by:
ALTER TABLE MyTable
DROP COLUMN IF EXISTS MyColumn
In the BigQuery UI I can see that the column was deleted successfully and I can't query that specific column, but when I query the DDL I can see that the column still exists in the schema:
SELECT DDL FROM MyDataSet.INFORMATION_SCHEMA.TABLES
WHERE DDL LIKE '%MyTable%'
What am I doing wrong?
This is a nasty, undocumented side effect of BigQuery's time travel. Time travel makes it unsafe to use ALTER TABLE statements in BigQuery.
Demonstration of problem:
create table apu.time_travel_problem
( id int64
, name string
);
select column_name, data_type
FROM apu.INFORMATION_SCHEMA.COLUMNS
WHERE table_name = 'time_travel_problem';
column_name | data_type
id          | INT64
name        | STRING
This is all normal so far, but after an ALTER TABLE everything goes odd:
alter table apu.time_travel_problem drop column name;
select column_name, data_type
FROM apu.INFORMATION_SCHEMA.COLUMNS
WHERE table_name = 'time_travel_problem';
column_name | data_type
id          | INT64
name        | STRING
The column we just dropped is still there!
Now try this:
alter table apu.time_travel_problem add column name string;
Column `name` was recently deleted in the table `time_travel_problem`. Deleted column name is reserved for up to the time travel duration, use a different column name instead.
Solution:
Do not use ALTER TABLE in BigQuery. Instead, DROP and re-CREATE the table using a temporary table.
This is a Jinja template which I use:
/* {{TABLE}} */
CREATE TABLE IF NOT EXISTS {{DATASET}}.{{TABLE}}_migration
OPTIONS (expiration_timestamp = timestamp_add(CURRENT_TIMESTAMP(), INTERVAL 8 HOUR))
AS SELECT * FROM {{DATASET}}.{{TABLE}};
DROP TABLE {{DATASET}}.{{TABLE}};
CREATE TABLE {{DATASET}}.{{TABLE}}
(
{{COLUMN_DDL}}
);
INSERT INTO {{DATASET}}.{{TABLE}}
(
{{COLUMN_LIST}}
)
SELECT
{{COLUMN_LIST}}
FROM {{DATASET}}.{{TABLE}}_migration;
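For the demo table above, the rendered script would look roughly like this (only id is kept, since name is the column being dropped):
/* time_travel_problem */
CREATE TABLE IF NOT EXISTS apu.time_travel_problem_migration
OPTIONS (expiration_timestamp = timestamp_add(CURRENT_TIMESTAMP(), INTERVAL 8 HOUR))
AS SELECT * FROM apu.time_travel_problem;
DROP TABLE apu.time_travel_problem;
CREATE TABLE apu.time_travel_problem
(
  id int64
);
INSERT INTO apu.time_travel_problem
(
  id
)
SELECT
  id
FROM apu.time_travel_problem_migration;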

Set default value for multiple tables - SQL Server

I have a database with a column (empid) in multiple tables.
I want to make a trigger, function, or stored procedure to set a default value for this column in all tables after insert.
I hope this will resolve your question or give you some insight into what to put in a function or stored procedure to resolve the issue.
SELECT
'ALTER TABLE ' + TABLE_SCHEMA + '.' + TABLE_NAME + ' ADD CONSTRAINT constraint_Name
DEFAULT (22) FOR [Column]'
FROM
db_name.INFORMATION_SCHEMA.COLUMNS
WHERE
COLUMN_NAME = 'ID'
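If you'd rather apply everything in one go instead of copying the generated script, here is a rough sketch that builds and executes the statements. It assumes the column is [ID] and the default is 22, as above, should be run in the target database, and generates a distinct constraint name per table, since constraint names must be unique:
DECLARE @sql NVARCHAR(MAX) = N'';
-- concatenate one ALTER TABLE ... ADD CONSTRAINT statement per table that has the column
SELECT @sql = @sql
    + 'ALTER TABLE ' + QUOTENAME(TABLE_SCHEMA) + '.' + QUOTENAME(TABLE_NAME)
    + ' ADD CONSTRAINT DF_' + TABLE_NAME + '_ID DEFAULT (22) FOR [ID];'
FROM INFORMATION_SCHEMA.COLUMNS
WHERE COLUMN_NAME = 'ID';
-- run the whole batch
EXEC sp_executesql @sql;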

How to assign a column to a dynamic variable

One column in one table of my database, say Table A with Column A, can be either NUMERIC or VARCHAR. The data type is decided dynamically, and then the table gets created.
I need to create a dynamic variable (@Dynamic) which should check the data type of this column and assign a different column (column B or column C) to it accordingly, i.e.
If column A is NVARCHAR, assign column B to @Dynamic
If column A is NUMERIC, assign column C to @Dynamic
I have to do this in both SQL Server and Oracle.
Any help to write a function for this would be greatly appreciated.
In SQL Server you can check the column data type:
SELECT DATA_TYPE
FROM INFORMATION_SCHEMA.COLUMNS
WHERE
TABLE_NAME = 'Table A'
AND COLUMN_NAME = 'Column A'
In Oracle:
SELECT data_type
FROM user_tab_columns
WHERE table_name = 'Table A'
AND column_name = 'Column A';
SQL_VARIANT is a dynamic type in SQL Server. The equivalent in Oracle would be the ANYDATA type.
But if you are using dynamic SQL, why not store the data as NVARCHAR? It is big enough for the converted numeric values as well, and you can use it directly in your dynamic SQL statement.
Actually this was a generic question and did not need any sample data to answer.
The approach I am following is:
I wrote a function
CREATE FUNCTION [dbo].[GET_DATA_TYPE] (@input VARCHAR(255))
RETURNS VARCHAR(255) AS
BEGIN
DECLARE @l_data_type VARCHAR(255)
SELECT @l_data_type = DATA_TYPE
FROM INFORMATION_SCHEMA.COLUMNS
WHERE
TABLE_NAME = @input AND
COLUMN_NAME = 'AssetID'
RETURN @l_data_type
END
I would call this function in my stored procedures as:
SELECT @dataType = dbo.GET_DATA_TYPE(@input);
I declared another variable, i.e. @FinalType:
IF @dataType = 'numeric'
    SET @FinalType = 'columnC';
IF @dataType = 'nvarchar'
    SET @FinalType = 'columnD';
Then I'll use this @FinalType variable in all my dynamic SQL.
Is there any other, more efficient way to do this?
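For illustration, a rough sketch of how @FinalType could plug into one of those dynamic statements (the table and column names below are only placeholders):
-- look up the data type of the deciding column
DECLARE @dataType VARCHAR(255) = dbo.GET_DATA_TYPE('TableA');
DECLARE @FinalType SYSNAME;
DECLARE @sql NVARCHAR(MAX);
-- pick the column to use based on the data type
SET @FinalType = CASE @dataType WHEN 'numeric'  THEN 'columnC'
                                WHEN 'nvarchar' THEN 'columnD' END;
-- build and run the dynamic statement
SET @sql = N'SELECT ' + QUOTENAME(@FinalType) + N' FROM dbo.TableA;';
EXEC sp_executesql @sql;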

How to change the datatype of a column in a Derby database?

I am trying to change the datatype of a column from integer(9) to NUMERIC(14,3),
but I am not able to change it.
I tried the query below:
alter table TableName alter ColumnName NUMERIC (14,3);
Hope the one below helps:
ALTER TABLE Table_Name ALTER COLUMN Column_Name SET DATA TYPE NUMERIC(14,3);

How to select all the columns of a table except one column?

How can I select all the columns of a table except one column?
I have nearly 259 columns; I can't list 258 of them in the SELECT statement.
Is there any other way to do it?
You can use this approach to get the data from all the columns except one:
Insert all the data into a temporary table
Then drop the column which you don't want from the temporary table
Fetch the data from the temporary table (this will not contain the data of the removed column)
Drop the temporary table
Something like this:
SELECT * INTO #TemporaryTable FROM YourTableName
ALTER TABLE #TemporaryTable DROP COLUMN Columnwhichyouwanttoremove
SELECT * FROM #TemporaryTable
DROP TABLE #TemporaryTable
Create a view. Yes, in the view creation statement, you will have to list each...and...every...field...by...name.
Once.
Then just select * from viewname after that.
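For example (the view, table, and column names below are just placeholders for your 258 columns):
CREATE VIEW dbo.vw_MyTableWithoutCol AS
SELECT Col1, Col2, Col3 -- ...and so on for every column except the one you want to exclude
FROM dbo.MyTable;
SELECT * FROM dbo.vw_MyTableWithoutCol;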
This is not a generic solution, but some databases allow you to use regular expressions to specify the columns.
For instance, in the case of Hive, the following query selects all columns except ds and hr:
SELECT `(ds|hr)?+.+` FROM sales
You can get the column name details from the sys.columns table.
Try the following query:
SELECT * FROM SYS.COLUMNS
WHERE object_id = OBJECT_ID('dbo.TableName')
AND [Name] <> 'ColumnName'
DECLARE @sql AS VARCHAR(8000)
SET @sql = 'SELECT '
SELECT @sql += [Name] + ', ' FROM SYS.COLUMNS
WHERE object_id = OBJECT_ID('dbo.TableName')
AND [Name] <> 'ColumnName'
SET @sql = LEFT(@sql, LEN(@sql) - 1) -- drop the trailing comma
SELECT @sql += ' FROM Dbo.TableName'
EXEC(@sql)
I just wanted to echo @Luann's comment, as I always use this approach.
Just right click on the table > Script table as > Select to > New Query window.
You will see the select query. Just take out the column you want to exclude and you have your preferred select query.
There are a lot of options available; one of them is:
CREATE TEMPORARY TABLE temp_tb SELECT * FROM orig_tb;
ALTER TABLE temp_tb DROP col_x;
SELECT * FROM temp_tb;
Here col_x is the column which you don't want to include in the select statement.
Take a look at this question: Select all columns except one in MySQL?
You can retrieve the list of column names with a simple query and then filter out the unwanted columns with a WHERE clause, like this:
SELECT * FROM (
SELECT COLUMN_NAME
FROM INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_NAME = N'TableName'
) AS allColumns
WHERE allColumns.COLUMN_NAME NOT IN ('unwantedCol1', 'unwantedCol2')
If you are using DataGrip you can do the following:
Enter your SELECT statement SELECT * FROM <your_table>;
Put your cursor over * and press Alt+Enter
You will get a pop-up menu with an Expand column list option
Click on it and it will replace * with the full list of columns
Now you can remove columns that you don't need
Without creating a new table you can do it simply (e.g. with mysqli):
get all columns
loop through all columns and skip the one you want to remove
make your query
$conn = mysqli_connect('host', 'user', 'pass', 'db');
$res = mysqli_query($conn, "SELECT column_name FROM information_schema.columns WHERE table_name = 'table_to_query'");
$a = [];
while ($row = mysqli_fetch_assoc($res)) if ($row['column_name'] != 'column_to_remove_from_query') $a[] = $row['column_name'];
$r = mysqli_query($conn, 'SELECT ' . implode(',', $a) . ' FROM table_to_query');
Try the following query:
DECLARE @Temp NVARCHAR(MAX);
DECLARE @SQL NVARCHAR(MAX);
SET @Temp = '';
SELECT @Temp = @Temp + COLUMN_NAME + ', ' FROM INFORMATION_SCHEMA.COLUMNS WHERE TABLE_NAME = 'Person' AND COLUMN_NAME NOT IN ('Id')
SET @SQL = 'SELECT ' + SUBSTRING(@Temp, 0, LEN(@Temp)) + ' FROM [Person]';
EXECUTE SP_EXECUTESQL @SQL;
In your case, expand the columns of that table in Object Explorer and drag them into the query area.
Then just delete the one or two columns you don't want and run it. I'm open to any suggestions easier than this.
There is only one way to achieve this: by giving the column names. There is no other method; you have to list all the column names.