Set default value for multiple tables in SQL Server - sql

I have a database with a column (empid) in multiple tables.
I want to write a trigger, function, or stored procedure to set a default value for this column in all tables after insert.

I hope this resolves your question, or at least gives you some insight into what we need to put in a function or stored procedure to resolve the issue.
SELECT
'ALTER TABLE ' + TABLE_SCHEMA + '.' + TABLE_NAME + ' ADD CONSTRAINT constraint_Name
DEFAULT (22) FOR [Column]'
FROM
db_name.INFORMATION_SCHEMA.COLUMNS
WHERE
COLUMN_NAME = 'ID'
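If you prefer not to copy and paste the generated statements, a minimal sketch along the same lines concatenates them and executes the whole batch. The empid column comes from the question, the default of 22 from the query above, and the DF_ constraint naming convention is just an assumption:
DECLARE @sql NVARCHAR(MAX) = N'';

-- Build one ALTER TABLE ... ADD CONSTRAINT statement per table containing the column.
SELECT @sql = @sql
    + 'ALTER TABLE ' + QUOTENAME(TABLE_SCHEMA) + '.' + QUOTENAME(TABLE_NAME)
    + ' ADD CONSTRAINT DF_' + TABLE_NAME + '_empid DEFAULT (22) FOR [empid];' + CHAR(13)
FROM INFORMATION_SCHEMA.COLUMNS
WHERE COLUMN_NAME = 'empid';

PRINT @sql;              -- review the generated statements first
EXEC sp_executesql @sql;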

Related

How to assign a column to a dynamic variable

One column in one table of my database, say Table A and Column A, can be either of NUMERIC type or VARCHAR type. The data type is decided dynamically, and then the table gets created.
I need to create a dynamic variable (@Dynamic) which should check the data type of this column and assign a different column (column B or column C) to it accordingly, i.e.
If column A is NVARCHAR, assign column B to @Dynamic
If column A is NUMERIC, assign column C to @Dynamic
I have to do this in both SQL Server and Oracle.
Any help writing a function for this would be greatly appreciated.
In SQL Server you can check the column data type:
SELECT DATA_TYPE
FROM INFORMATION_SCHEMA.COLUMNS
WHERE
TABLE_NAME = 'Table A'
AND COLUMN_NAME = 'Column A'
In Oracle:
SELECT DATA_TYPE
FROM user_tab_columns
WHERE table_name = 'Table A'
AND column_name = 'Column A';
sql_variant is a dynamic type in SQL Server. The equivalent in Oracle would be the ANYDATA type.
But if you are using dynamic SQL, why don't you store the data as nvarchar? It is big enough for the converted numeric values as well, and you can use it directly in your dynamic SQL statement.
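If you do go the sql_variant route, here is a small illustration (not part of the original answer) of how the base type of a stored value can be inspected with SQL_VARIANT_PROPERTY:
DECLARE @v SQL_VARIANT;

SET @v = CAST(123.45 AS NUMERIC(10, 2));

-- SQL_VARIANT_PROPERTY reports the underlying base type of the stored value.
SELECT SQL_VARIANT_PROPERTY(@v, 'BaseType') AS BaseType;   -- returns 'numeric'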
Actually, this was a generic question and did not need any sample data to answer it.
The approach I am following is:
I wrote a function:
CREATE FUNCTION [dbo].[GET_DATA_TYPE] (@input VARCHAR(255))  -- give the parameter an explicit length; otherwise it defaults to VARCHAR(1)
RETURNS VARCHAR(255) AS
BEGIN
    DECLARE @l_data_type VARCHAR(255)
    SELECT @l_data_type = DATA_TYPE
    FROM INFORMATION_SCHEMA.COLUMNS
    WHERE
        TABLE_NAME = @input AND
        COLUMN_NAME = 'AssetID'
    RETURN @l_data_type
END
I would call this function in my stored procedures as:
SELECT @dataType = dbo.GET_DATA_TYPE(@input);
I declared another variable, @FinalType:
IF @dataType = 'numeric'
    SET @FinalType = 'columnC';
IF @dataType = 'nvarchar'
    SET @FinalType = 'columnD';
Then I'll use this @FinalType variable in all my dynamic SQL.
Is there any other, more efficient way to do this?
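For what it's worth, one way the selected column name might be plugged into a dynamic statement, sketched with sp_executesql. The table name TableA is hypothetical; columnC and columnD are the placeholders from above, and QUOTENAME is used defensively:
DECLARE @dataType  VARCHAR(255),
        @FinalType SYSNAME,
        @sql       NVARCHAR(MAX);

SELECT @dataType = dbo.GET_DATA_TYPE('TableA');

-- Pick the column to use based on the data type found for column A.
SET @FinalType = CASE @dataType
                     WHEN 'numeric'  THEN 'columnC'
                     WHEN 'nvarchar' THEN 'columnD'
                 END;

-- QUOTENAME keeps the column name safe when it is spliced into dynamic SQL.
SET @sql = N'SELECT ' + QUOTENAME(@FinalType) + N' FROM dbo.TableA;';
EXEC sp_executesql @sql;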

I want to create a stored procedure using T-SQL that dynamically creates a table based on outside data that isn't vulnerable to SQL injection

I have a table which has a column that represents the name of a table we'd like to create. There's a foreign key relationship to another table which has a column representing the names of the columns for the desired table (all data types assumed to be nvarchar). I'm using a stored procedure to create this table. Essentially what I'm doing is getting all of the relevant data from my tables, then building up a SQL string to generate the table, and finally using EXEC sp_executesql @CreateTableSQL.
@CreateTableSQL is generated through string concatenation like this:
SET @CreateTableSQL = 'CREATE TABLE ' + @TableName + ' (' + @ColumnString + ')';
This leaves me vulnerable to SQL injection. If someone were to use a @TableName value of:
C (t int); DROP TABLE MyTable;--
then this would drop MyTable (undesirable).
Can someone help me build this SQL and leave it invulnerable to injection? Help is greatly appreciated. Thanks!
You can make use of the QUOTENAME() function, which wraps square brackets [] around the variables (table and column names), so any value passed in these variables will only be treated as an object name.
Something like this:
SET @CreateTableSQL = 'CREATE TABLE ' + QUOTENAME(@TableName)
+ ' (' + QUOTENAME(@ColumnString) + ')';
Now even if someone passes a value of C (t int); DROP TABLE MyTable;-- to one of these variables, the whole value will still be treated as a single object name.
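Since @ColumnString holds a whole comma-separated column list rather than a single identifier, one reasonable variation is to apply QUOTENAME to each column name as the list is built. A minimal sketch, assuming a hypothetical dbo.DesiredColumns table that holds the column names and that every column is nvarchar as the question states:
DECLARE @TableName      SYSNAME       = N'MyDynamicTable',   -- hypothetical input value
        @ColumnString   NVARCHAR(MAX) = N'',
        @CreateTableSQL NVARCHAR(MAX);

-- Build the column list, quoting every column name individually
-- (all data types are assumed to be nvarchar, as in the question).
SELECT @ColumnString = @ColumnString
    + CASE WHEN @ColumnString = N'' THEN N'' ELSE N', ' END
    + QUOTENAME(ColumnName) + N' NVARCHAR(255)'
FROM dbo.DesiredColumns            -- hypothetical table holding the column names
WHERE TableName = @TableName;

SET @CreateTableSQL = N'CREATE TABLE ' + QUOTENAME(@TableName)
                    + N' (' + @ColumnString + N');';

EXEC sp_executesql @CreateTableSQL;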

Update single column found in multiple tables

I have the same column in multiple tables in my database. I need to update every table that contains that column where the value is equal to 'xxxx'. There's a very similar Stack Overflow question which is close to what I'm looking for; I just need to add another condition to my WHERE clause. I'm not sure how to include it in the query, as I keep getting syntax errors.
SELECT 'UPDATE ' + TABLE_NAME + ' SET customer= ''NewCustomerValue'' '
FROM INFORMATION_SCHEMA.COLUMNS
WHERE COLUMN_NAME = 'customer'
The part I'm having problems with is how to include the line below in the WHERE clause:
AND customer='xxxx'
Try it like this:
SELECT 'UPDATE ' + TABLE_NAME + ' SET customer= ''NewCustomerValue'' where customer=''xxxx'''
FROM INFORMATION_SCHEMA.COLUMNS
WHERE COLUMN_NAME = 'customer'
Try this:
' AND customer=''xxxx'' ' -- (two single quotes inside a string literal produce one single quote)
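If you'd rather execute the generated statements directly instead of copying the output, here is a minimal sketch (not from the original answers) using the same INFORMATION_SCHEMA query:
DECLARE @sql NVARCHAR(MAX) = N'';

SELECT @sql = @sql
    + 'UPDATE ' + QUOTENAME(TABLE_SCHEMA) + '.' + QUOTENAME(TABLE_NAME)
    + ' SET customer = ''NewCustomerValue'' WHERE customer = ''xxxx'';' + CHAR(13)
FROM INFORMATION_SCHEMA.COLUMNS
WHERE COLUMN_NAME = 'customer';

PRINT @sql;              -- review the statements before running them
EXEC sp_executesql @sql;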

How to describe table in SQL Server 2008?

I want to describe a table in SQL Server 2008 like what we can do with the DESC command in Oracle.
I have table [EX].[dbo].[EMP_MAST] which I want to describe, but it does not work.
Error shown:
The object 'EMP_MAST' does not exist in database 'master' or is
invalid for this operation.
You can use sp_columns, a system stored procedure for describing a table.
exec sp_columns TableName
You can also use sp_help.
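For the table in the question, that would look something like this (the error message suggests you are connected to master, so switch to the EX database first):
USE EX;
GO
EXEC sp_columns EMP_MAST;
EXEC sp_help 'dbo.EMP_MAST';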
According to this documentation:
DESC MY_TABLE
is equivalent to
SELECT column_name "Name", nullable "Null?",
       concat(concat(concat(data_type, '('), data_length), ')') "Type"
FROM user_tab_columns
WHERE table_name = 'TABLE_NAME_TO_DESCRIBE';
I've roughly translated that to the SQL Server equivalent for you - just make sure you're running it on the EX database.
SELECT column_name AS [name],
IS_NULLABLE AS [null?],
DATA_TYPE + COALESCE('(' + CASE WHEN CHARACTER_MAXIMUM_LENGTH = -1
THEN 'Max'
ELSE CAST(CHARACTER_MAXIMUM_LENGTH AS VARCHAR(5))
END + ')', '') AS [type]
FROM INFORMATION_SCHEMA.Columns
WHERE table_name = 'EMP_MAST'
The sp_help built-in procedure is SQL Server's closest thing to Oracle's DESC command, IMHO:
sp_help MyTable
Use
sp_help "[SchemaName].[TableName]"
or
sp_help "[InstanceName].[SchemaName].[TableName]"
in case you need to qualify the table name further
You can use a keyboard shortcut to get a description/detailed information for a table in SQL Server 2008.
Follow these steps:
Write the table name,
select it, and press Alt + F1.
It will show detailed information/a description of the mentioned table, including:
1) Table created date,
2) Columns Description,
3) Identity,
4) Indexes,
5) Constraints,
6) References, etc.
Maybe this can help:
USE MyTest
GO
SELECT * FROM information_schema.COLUMNS WHERE TABLE_NAME = 'employee'
(Here MyTest is the database name and employee is the table name; the WHERE condition is optional.)
I like the answer that attempts to do the translation; however, the code doesn't like columns that are not of a VARCHAR type, such as BIGINT or DATETIME. I needed something similar today, so I took the time to modify it more to my liking. It is also now encapsulated in a function, which is the closest thing I could find to just typing describe the way Oracle handles it. I may still be missing a few data types in my CASE statement, but this works for everything I tried it on. It also orders by ordinal position. This could easily be expanded to include primary key columns as well.
CREATE FUNCTION dbo.describe (@TABLENAME varchar(50))
RETURNS table
AS
RETURN
(
    SELECT TOP 1000 column_name AS [ColumnName],
           IS_NULLABLE AS [IsNullable],
           DATA_TYPE + '(' + CASE
                                 WHEN DATA_TYPE = 'varchar' OR DATA_TYPE = 'char' THEN
                                     CASE
                                         WHEN Cast(CHARACTER_MAXIMUM_LENGTH AS VARCHAR(5)) = -1 THEN 'Max'
                                         ELSE Cast(CHARACTER_MAXIMUM_LENGTH AS VARCHAR(5))
                                     END
                                 WHEN DATA_TYPE = 'decimal' OR DATA_TYPE = 'numeric' THEN
                                     Cast(NUMERIC_PRECISION AS VARCHAR(5)) + ', ' + Cast(NUMERIC_SCALE AS VARCHAR(5))
                                 WHEN DATA_TYPE = 'bigint' OR DATA_TYPE = 'int' THEN
                                     Cast(NUMERIC_PRECISION AS VARCHAR(5))
                                 ELSE ''
                             END + ')' AS [DataType]
    FROM INFORMATION_SCHEMA.Columns
    WHERE table_name = @TABLENAME
    ORDER BY ordinal_position
);
GO
Once you create the function, here is a sample table that I used:
create table dbo.yourtable
(columna bigint,
columnb int,
columnc datetime,
columnd varchar(100),
columne char(10),
columnf bit,
columng numeric(10,2),
columnh decimal(10,2)
)
Then you can execute it as follows (note that a table-valued function must be called with at least a two-part name):
select * from dbo.describe('yourtable')
It returns the following
ColumnName IsNullable DataType
columna NO bigint(19)
columnb NO int(10)
columnc NO datetime()
columnd NO varchar(100)
columne NO char(10)
columnf NO bit()
columng NO numeric(10, 2)
columnh NO decimal(10, 2)
Hope this helps someone.
As a variation of Bridge's answer (I don't yet have enough rep to comment, and didn't feel right about editing that answer), here is a version that works better for me.
SELECT column_name AS [Name],
IS_NULLABLE AS [Null?],
DATA_TYPE + CASE
WHEN CHARACTER_MAXIMUM_LENGTH IS NULL THEN ''
WHEN CHARACTER_MAXIMUM_LENGTH > 99999 THEN ''
ELSE '(' + Cast(CHARACTER_MAXIMUM_LENGTH AS VARCHAR(5)) + ')'
END AS [Type]
FROM INFORMATION_SCHEMA.Columns
WHERE table_name = 'table_name'
Notable changes:
Works for types without a length. For an int column, I was seeing NULL for the type because the length was NULL and it wiped out the whole Type column. So don't print any length component (or parentheses) in that case.
Changed the check for a CAST length of -1 to a check on the actual length. I was getting an error because the CAST resulted in '*' rather than -1. It seems to make more sense to perform an arithmetic check than to rely on an overflow from the CAST.
Don't print the length when it is very long (arbitrarily > 5 digits).
Just enter the line below:
exec sp_help [table_name]

Alter every double field of a table

I need to change every double precision field of a table to the numeric(15,3) type.
How can I do this quickly with a stored procedure that iterates through the fields of a given table and, if the type is double precision, alters the column to numeric?
The following query should return the statements needed to update these tables; you can add a table filter if you want. Copy the result and run it.
select 'ALTER TABLE ' + table_name + ' ALTER COLUMN ' + column_name + ' TYPE numeric(15,3)'
from information_schema.columns
where data_type = 'double precision'
and table_name = 'YOUR_TABLE_NAME'
TEST IT BEFORE YOU RUN IT. I HAVE NOT TESTED THIS YET.
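If you want the iteration done for you rather than copy-pasting the output, here is a minimal PL/pgSQL sketch (assuming PostgreSQL, which the double precision type and the ALTER COLUMN ... TYPE syntax suggest) that executes each generated statement directly:
DO $$
DECLARE
    rec record;
BEGIN
    FOR rec IN
        SELECT table_name, column_name
        FROM information_schema.columns
        WHERE data_type = 'double precision'
          AND table_name = 'YOUR_TABLE_NAME'
    LOOP
        -- format() with %I quotes the identifiers safely.
        EXECUTE format('ALTER TABLE %I ALTER COLUMN %I TYPE numeric(15,3)',
                       rec.table_name, rec.column_name);
    END LOOP;
END $$;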
While another answer will do it for all columns, a simple alter table [tablename] alter column [columnToAlter] type numeric(15,3) works for a single column. You shouldn't need to run them through a cursor; any value that's not going to be affected by this should remain unchanged.
If you can't do it by changing the data type itself, a simple update [tablename] set [columnname] = cast([columnname] as numeric(15,3)) should also work.
Hope that helps!