My problem is: I need to select, out of my database, all tables which contain a column NrPad, and for exactly those tables I need to update the NrPad column.
I already have a working select statement and a working update statement:
select
t.name as table_name
from sys.tables t
inner join sys.columns c
on t.object_id = c.object_id
where c.name like 'NrPad'
Update Anlage Set NrPad = CASE WHEN Len(Nr) < 10 THEN '0' + Convert(Nvarchar,Len(Nr)) ELSE Convert(Nvarchar,Len(Nr)) END + Nr
My problem is: how can I merge these two statements together?
I'm open to suggestions and your help is greatly appreciated.
Use the INFORMATION_SCHEMA rather than sys.tables, and create a dynamic SQL statement like so:
DECLARE @sql varchar(max) = '';
SELECT
@sql = @sql + '; UPDATE ' + c.TABLE_NAME + ' SET NrPad = CASE WHEN LEN(Nr)<10 THEN ''0'' + CONVERT(NVARCHAR,LEN(Nr)) ELSE CONVERT(NVARCHAR,LEN(Nr)) END + Nr'
FROM INFORMATION_SCHEMA.COLUMNS c
where c.COLUMN_NAME = 'NrPad'
print @sql -- for debugging purposes
exec (@sql)
This assumes that all tables that have the NrPad column also have a Nr column. If you need to check for those, or if you just need to use the Nr column from a particular table, it's a bit different (either join against INFORMATION_SCHEMA.COLUMNS again, or against Anlage to get the value of Nr, or check that Nr is a column on that table).
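For example, a minimal sketch of that extra check (only keeping tables that contain both NrPad and Nr, which is an assumption about your schema) could look like this:
-- sketch: only build UPDATE statements for tables that also have a Nr column
DECLARE @sql2 varchar(max) = '';
SELECT
@sql2 = @sql2 + '; UPDATE ' + c.TABLE_NAME + ' SET NrPad = CASE WHEN LEN(Nr) < 10 THEN ''0'' + CONVERT(NVARCHAR, LEN(Nr)) ELSE CONVERT(NVARCHAR, LEN(Nr)) END + Nr'
FROM INFORMATION_SCHEMA.COLUMNS c
WHERE c.COLUMN_NAME = 'NrPad'
AND EXISTS (SELECT 1
            FROM INFORMATION_SCHEMA.COLUMNS c2
            WHERE c2.TABLE_SCHEMA = c.TABLE_SCHEMA
              AND c2.TABLE_NAME = c.TABLE_NAME
              AND c2.COLUMN_NAME = 'Nr')
PRINT @sql2 -- for debugging purposes
EXEC (@sql2)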
Not tested on your case, but you could do an UPDATE - SET - FROM - WHERE.
Have a look at this question with multiple answers: How do I UPDATE from a SELECT in SQL Server?
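For reference, a minimal sketch of that UPDATE ... FROM pattern against the OP's Anlage table (the table SourceTable and the join column Id are purely hypothetical here):
UPDATE t
SET t.NrPad = s.NrPad
FROM Anlage AS t
INNER JOIN SourceTable AS s  -- hypothetical table holding the new values
    ON s.Id = t.Id           -- hypothetical join key
WHERE s.NrPad IS NOT NULL;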
Maybe someone will judge me, but all I can offer for this case is a cursor:
DECLARE @table_name varchar(100)
DECLARE @sql varchar(1000)
DECLARE table_cursor CURSOR FOR
select
t.name as table_name
from sys.tables t
inner join sys.columns c
on t.object_id = c.object_id
where c.name like 'NrPad'
OPEN table_cursor
Fetch next From table_cursor Into @table_name
While @@fetch_status = 0
Begin
set @sql = 'Update ' + @table_name + ' Set NrPad = CASE WHEN Len(Nr) < 10 THEN ''0'' + Convert(Nvarchar,Len(Nr))
ELSE Convert(Nvarchar,Len(Nr)) END + Nr'
EXEC (@sql)
Fetch Next From table_cursor Into @table_name
End
Close table_cursor
Deallocate table_cursor
This is how you write the cursor in SQL Server; I really don't want to code another one for Oracle, so please tag the DBMS you are using next time.
You can modify the select statement to generate the update statements, then execute them all.
The query below uses Oracle's q-quote string literal syntax.
select 'Update ' || t.name || q'[ Set NrPad = CASE WHEN Len(Nr) < 10 THEN '0' + Convert(Nvarchar,Len(Nr)) ELSE Convert(Nvarchar,Len(Nr)) END + Nr;]'
from sys.tables t
inner join sys.columns c
on t.object_id = c.object_id
where c.name like 'NrPad'
Is there an easy way to count nulls in all fields in a table without writing 40+ very similar, but slightly different, queries? I would think there is some kind of statistics maintained for all tables, and this may be the easiest way to go with it, but I don't know for sure. Thoughts, anyone? Thanks!!
BTW, I am using SQL Server 2008.
Not sure if you consider this simple or not, but this will total the NULLs by column in a table.
DECLARE @table sysname;
SET @table = 'MyTable'; --replace this with your table name
DECLARE @colname sysname;
DECLARE @sql NVARCHAR(MAX);
DECLARE COLS CURSOR FOR
SELECT c.name
FROM sys.tables t
INNER JOIN sys.columns c ON t.object_id = c.object_id
WHERE t.name = @table;
SET @sql = 'SELECT ';
OPEN COLS;
FETCH NEXT FROM COLS INTO @colname;
WHILE @@FETCH_STATUS = 0
BEGIN
SET @sql = @sql + 'COUNT(CASE WHEN ' + @colname + ' IS NULL THEN 1 END) AS ' + @colname + '_NULLS,'
FETCH NEXT FROM COLS INTO @colname;
END;
CLOSE COLS;
DEALLOCATE COLS;
SET @sql = LEFT(@sql,LEN(@sql) - 1) --trim trailing ,
SET @sql = @sql + ' FROM ' + @table;
EXEC sp_executesql @sql;
SELECT COUNT( CASE WHEN field01 IS NULL THEN 1 END) +
COUNT( CASE WHEN field02 IS NULL THEN 1 END) +
...
COUNT( CASE WHEN field40 IS NULL THEN 1 END) as total_nulls
This answer will return a table containing the name of each column of a specified table. (@tab is the name of the table you're trying to count NULLs in.)
You can loop through the column names, count NULLs in each column, and add the result to a total running count.
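A minimal sketch of that kind of column-name lookup (assuming @tab holds your table name) might be:
DECLARE @tab sysname = 'MyTable';  -- the table whose NULLs you want to count

SELECT c.name AS ColumnName
FROM sys.columns c
WHERE c.object_id = OBJECT_ID(@tab);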
Many of the tables in my DB have a boolean column 'IsDeleted'.
I need to alter the column in all tables so that the default value is zero, and then update all old records that have the value NULL to have the value zero.
Is there a way to do this besides writing a script for every table?
Thanks,
This would be a good starting point to generate the Create, Update and Rename scripts required. Advisory: TEST ON A BACKUP OF THE DATABASE FIRST.
select
'ALTER TABLE dbo.' + O.Name + ' ADD IsDeletedNew bit default 0;
UPDATE dbo.' + O.Name + ' SET IsDeletedNew = 1 WHERE IsDeleted = 1;
UPDATE dbo.' + O.Name + ' SET IsDeletedNew = 0 WHERE IsDeleted = 0 OR IsDeleted IS NULL;
ALTER TABLE dbo.' + O.Name + ' DROP COLUMN IsDeleted;
EXECUTE sp_rename N''dbo.' + O.Name + '.IsDeletedNew'', N''Tmp_IsDeleted_1'', ''COLUMN''
EXECUTE sp_rename N''dbo.' + O.Name + '.Tmp_IsDeleted_1'', N''IsDeleted'', ''COLUMN'' '
from syscolumns C
Inner join sysobjects o on C.ID = O.ID
where c.name = 'IsDeleted'
First, I can set a default value for a boolean field. It worked for me.
ALTER TABLE [dbo].<TableName> ADD DEFAULT 0 FOR IsDeleted
This is my script that sets a default value for every 'IsDeleted' field that doesn't have a default value. It worked for me.
DECLARE @NAME VARCHAR(100)
DECLARE @SQL NVARCHAR(300)
DECLARE CUR CURSOR
FOR
SELECT t.name AS 'TableName'
FROM sys.columns c
JOIN sys.tables t ON c.object_id = t.object_id
WHERE c.name = 'IsDeleted'
AND (SELECT object_definition(default_object_id) AS definition
FROM sys.columns
WHERE name ='IsDeleted'
AND object_id = object_id(t.name)) is null
OPEN CUR
FETCH NEXT FROM CUR INTO @NAME
WHILE @@FETCH_STATUS = 0
BEGIN
SET @SQL = 'ALTER TABLE [dbo].'+@NAME+' ADD DEFAULT 0 FOR IsDeleted'
--PRINT @SQL -- will print all the update scripts
EXEC Sp_executesql @SQL
FETCH NEXT FROM CUR INTO @NAME
END
CLOSE CUR
DEALLOCATE CUR
With so many tables, do the alter using dynamic SQL
declare @tab_name varchar(120)
declare @the_sql varchar(1000)
declare MyCursor cursor
for
select distinct table_name
from INFORMATION_SCHEMA.COLUMNS
where column_name = 'IsNumeric'
open MyCursor
fetch next from MyCursor into @tab_name
while @@fetch_status = 0
begin
set @the_sql = 'alter table ' + @tab_name + ' add NewNumeric bit default 0'
execute (@the_sql)
fetch next from MyCursor into @tab_name
end
close MyCursor
deallocate MyCursor
Rinse and repeat to update the values, drop the old column, and then rename the new column.
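A sketch of those later passes, reusing the same cursor loop and variables as above (the NewNumeric/IsNumeric names follow that example), might look like this inside the loop body:
-- pass 2: copy the old values into the new column, treating NULL as 0
set @the_sql = 'update ' + @tab_name + ' set NewNumeric = coalesce(IsNumeric, 0)'
execute (@the_sql)
-- pass 3: drop the old column
set @the_sql = 'alter table ' + @tab_name + ' drop column IsNumeric'
execute (@the_sql)
-- pass 4: rename the new column back to the original name
set @the_sql = 'exec sp_rename ''' + @tab_name + '.NewNumeric'', ''IsNumeric'', ''COLUMN'''
execute (@the_sql)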
Using SQL Server 2008, say I have a table called testing with 80 columns and I want to find a value called foo.
I can do:
SELECT *
FROM testing
WHERE COLNAME = 'foo'
Is it possible to query all 80 columns and return all the rows where foo is contained in any of the 80 columns?
You can use IN:
SELECT *
FROM testing
WHERE 'foo' in (col1, col2, col3, . . . );
First Method (Tested)
First get the list of columns in a string variable, separated by commas, and then you can search for 'foo' using that variable with IN
Check stored procedure below which first gets columns and then searches for string:
DECLARE @TABLE_NAME VARCHAR(128)
DECLARE @SCHEMA_NAME VARCHAR(128)
-----------------------------------------------------------------------
-- Set up the name of the table here :
SET @TABLE_NAME = 'testing'
-- Set up the name of the schema here, or just leave set to 'dbo' :
SET @SCHEMA_NAME = 'dbo'
-----------------------------------------------------------------------
DECLARE @vvc_ColumnName VARCHAR(128)
DECLARE @vvc_ColumnList VARCHAR(MAX)
IF @SCHEMA_NAME =''
BEGIN
PRINT 'Error : No schema defined!'
RETURN
END
IF NOT EXISTS (SELECT * FROM sys.tables T JOIN sys.schemas S
ON T.schema_id=S.schema_id
WHERE T.Name=@TABLE_NAME AND S.name=@SCHEMA_NAME)
BEGIN
PRINT 'Error : The table '''+@TABLE_NAME+''' in schema '''+
@SCHEMA_NAME+''' does not exist in this database!'
RETURN
END
DECLARE TableCursor CURSOR FAST_FORWARD FOR
SELECT CASE WHEN PATINDEX('% %',C.name) > 0
THEN '['+ C.name +']'
ELSE C.name
END
FROM sys.columns C
JOIN sys.tables T
ON C.object_id = T.object_id
JOIN sys.schemas S
ON S.schema_id = T.schema_id
WHERE T.name = @TABLE_NAME
AND S.name = @SCHEMA_NAME
ORDER BY column_id
SET @vvc_ColumnList=''
OPEN TableCursor
FETCH NEXT FROM TableCursor INTO @vvc_ColumnName
WHILE @@FETCH_STATUS=0
BEGIN
SET @vvc_ColumnList = @vvc_ColumnList + @vvc_ColumnName
-- get the details of the next column
FETCH NEXT FROM TableCursor INTO @vvc_ColumnName
-- add a comma if we are not at the end of the row
IF @@FETCH_STATUS=0
SET @vvc_ColumnList = @vvc_ColumnList + ','
END
CLOSE TableCursor
DEALLOCATE TableCursor
-- Now search for `foo` (splice the column list in via dynamic SQL;
-- putting the variable straight into IN would compare against one long string)
DECLARE @vvc_SQL VARCHAR(MAX)
SET @vvc_SQL = 'SELECT * FROM ['+@SCHEMA_NAME+'].['+@TABLE_NAME+'] WHERE ''foo'' IN ('+@vvc_ColumnList+')'
EXEC (@vvc_SQL)
2nd Method
In SQL Server you can get the object id of a table, then use that object id to fetch its columns. In that case it will be as below:
Step 1: First get Object Id of table
select * from sys.tables order by name
Step 2: Now get columns of your table and search in it:
select * from testing where 'foo' in (select name from sys.columns where object_id =1977058079)
Note: the object_id is the value you fetched in the first step for your relevant table
You can use IN, and you can get the column names dynamically and pass them to the IN clause by building a SQL string and executing it with sp_executesql.
declare @sql nvarchar(2100)
declare @cols nvarchar(2000)
declare @toSearch nvarchar(200)
declare @tableName nvarchar(200)
set @tableName = 'tbltemp'
set @toSearch = '5'
set @cols =(
SELECT LEFT(column_name, LEN(column_name) - 1)
FROM (
SELECT column_name + ', '
FROM INFORMATION_SCHEMA.COLUMNS where table_name = @tableName
FOR XML PATH ('')
) c (column_name )
)
set @sql = 'select * from ' + @tableName + ' where '''+ @toSearch + ''' in (' + @cols + ')';
execute sp_executesql @sql
I think this is one of the best ways of doing it
SELECT * FROM sys.columns a
inner join
(
SELECT object_id
FROM sys.tables
where
type='U'--user table
and name like 'testing'
) b on a.object_id=b.object_id
WHERE a.name like '%foo%'
I took the idea from ubaid ashraf's answer, but made it actually work. Just change MyTableName here:
SELECT STUFF((
SELECT ', ' + c.name
FROM sys.columns c
JOIN sys.types AS t ON c.user_type_id=t.user_type_id
WHERE t.name != 'int' AND t.name != 'bit' AND t.name !='date' AND t.name !='datetime'
AND object_id =(SELECT object_id FROM sys.tables WHERE name='MyTableName')
FOR XML PATH('')),1,2,'')
You can tweak it to your needs by adding or removing conditions in the WHERE clause (the t.name != 'int' AND t.name != 'bit' etc. part), e.g. add t.name != 'uniqueidentifier' to avoid getting "Conversion failed when converting the varchar value 'myvalue' to data type int" type of errors.
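For instance, the same column-list query with the type filter rewritten as a NOT IN list that also skips uniqueidentifier columns might look like this:
SELECT STUFF((
SELECT ', ' + c.name
FROM sys.columns c
JOIN sys.types AS t ON c.user_type_id=t.user_type_id
WHERE t.name NOT IN ('int', 'bit', 'date', 'datetime', 'uniqueidentifier')
AND object_id =(SELECT object_id FROM sys.tables WHERE name='MyTableName')
FOR XML PATH('')),1,2,'')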
Then copy and paste the result into this query (otherwise it didn't work for me):
SELECT * from MyTableName where 'foo' in (COPY PASTE PREVIOUS QUERY RESULT INTO HERE)
--Obtain object_id
SELECT object_id FROM sys.tables WHERE name = <your_table>
--look for desired value in specified columns using below syntax
SELECT * FROM <your_table> WHERE <VALUE_YOU_SEARCH_FOR> in
(SELECT name FROM sys.columns WHERE object_id = <your_table_object_id>
and name like '<if_you_have_multiple_columns_with_same_name_pattern>')
I've worked with BornToCode's answer and this script generates the queries to find a value in all columns of type varchar for any view (can be table) of the database:
DECLARE @id INT
declare @name nvarchar(30)
DECLARE @getid CURSOR
declare @value nvarchar(30)
set @value = 'x'
SET @getid = CURSOR FOR
SELECT object_id,name
FROM sys.views
OPEN @getid
FETCH NEXT
FROM @getid INTO @id, @name
WHILE @@FETCH_STATUS = 0
BEGIN
---------
SELECT 'SELECT * from ' + @name + ' where ''' + @value + ''' in (' +
STUFF((
SELECT ', ' + c.name
FROM sys.columns c
JOIN sys.types AS t ON c.user_type_id=t.user_type_id
WHERE t.name = 'varchar'-- AND t.name != 'bit' AND t.name !='date' AND t.name !='datetime'
AND object_id =(SELECT object_id FROM sys.views WHERE name=@name)
FOR XML PATH('')),1,2,'')
+ ')' as 'query'
------
FETCH NEXT
FROM @getid INTO @id, @name
END
CLOSE @getid
DEALLOCATE @getid
So I have a database with many tables that have a column that contains a GL Account value (for financial purposes). The column name varies by table (i.e. in one table the column is called "gldebitaccount" and in another table it's called "glcreditaccount"). I was able to find all combinations of table / column pairs using the following query:
SELECT c.name AS ColName, t.name AS TableName
FROM sys.columns c
JOIN sys.tables t ON c.object_id = t.object_id
WHERE c.name LIKE '%gl%acc%'
This query returns close to 100 pairs of tables/columns. I am trying to find any value in any of those table/column pairs that exceeds 25 chars in length. For an individual table/column, I'd typically use:
SELECT *
FROM tableName
WHERE LEN(columnName)>25
I want to avoid having to run that query 100 times with each pair. Is there any way I can do a "for each" (which I know is frowned upon in SQL, since everything should be set-based)? I've done sub-SELECT statements before, but not any that involved changing the table in the FROM clause. Any ideas or help would be greatly appreciated!
Thanks in advance!
As in the previous answer, the solution will need dynamic SQL. Here is a way that uses both dynamic SQL and cursors; you can expect slow performance, so use at your own risk:
DECLARE @TableName NVARCHAR(128), @ColumnName NVARCHAR(128)
DECLARE @Query NVARCHAR(4000)
DECLARE CC CURSOR LOCAL FAST_FORWARD FOR
SELECT QUOTENAME(t.name), QUOTENAME(c.name)
FROM sys.columns c
INNER JOIN sys.tables t
ON c.object_id = t.object_id
WHERE c.collation_name IS NOT NULL
AND c.max_length > 25 AND c.name LIKE '%gl%acc%';
CREATE TABLE #Results(TableName NVARCHAR(128), ColumnName NVARCHAR(128));
OPEN CC
FETCH NEXT FROM CC INTO @TableName, @ColumnName
WHILE @@FETCH_STATUS = 0
BEGIN
SET @Query = 'IF EXISTS(SELECT 1 FROM '+@TableName+'
WHERE LEN('+@ColumnName+') > 25)
INSERT INTO #Results
VALUES(@TableName,@ColumnName)'
EXEC sp_executesql @Query,
N'@TableName NVARCHAR(128),@ColumnName NVARCHAR(128)',
@TableName,
@ColumnName;
FETCH NEXT FROM CC INTO @TableName, @ColumnName
END
CLOSE CC
DEALLOCATE CC
SELECT *
FROM #Results
Here's an option without cursors that also doesn't add XML overhead. Note that it also protects you from potential type conflicts (e.g. try the others in a database with hierarchyid columns, like AdventureWorks), from table or column names with apostrophes, and from table names that exist in more than one schema.
DECLARE @sql NVARCHAR(MAX) = N'';
CREATE TABLE #Results
(
SchemaName NVARCHAR(128), TableName NVARCHAR(128), ColumnName NVARCHAR(128)
);
SELECT @sql += N'INSERT #Results SELECT '''
+ REPLACE(s.name,'''','''''') + ''','''
+ REPLACE(t.name,'''','''''') + ''','''
+ REPLACE(c.name,'''','''''') + '''
WHERE EXISTS (SELECT 1 FROM ' + QUOTENAME(s.name)
+ '.' + QUOTENAME(t.name) + ' WHERE
LEN(' + QUOTENAME(c.name) + ') > 25);
'
FROM sys.columns AS c
INNER JOIN sys.tables AS t
ON c.[object_id] = t.[object_id]
INNER JOIN sys.schemas AS s
ON t.[schema_id] = s.[schema_id]
WHERE
(
c.system_type_id IN (35,99) -- text,ntext
OR (c.system_type_id IN (167,231) -- varchar,nvarchar, could be max
AND (c.max_length > 25 OR c.max_length = -1))
OR (c.system_type_id IN (175,239) -- char, nchar
AND c.max_length > 25)
)
AND c.name LIKE N'%gl%acc%';
EXEC sp_executesql @sql;
SELECT SchemaName, TableName, ColumnName FROM #Results;
Yet another solution with dynamic SQL, but now without cursors.
It uses a FOR XML statement to build the query and should be much faster.
DECLARE @sqlstatement VARCHAR(MAX);
SET @sqlstatement =
REPLACE (
STUFF ( (
SELECT 'UNION ALL SELECT ''' + t.name + ''' as TableName, '''
+ c.name + ''' AS ColumnName, '
+ c.name + ' AS Value FROM '
+ t.name + ' WHERE LEN (' + c.name + ') ' + CHAR(62) + ' 25'
FROM sys.columns c
INNER JOIN sys.tables t ON c.object_id = t.object_id
WHERE c.name LIKE '%gl%acc%'
FOR XML PATH('')
), 1, 10, '')
, '&gt;', '>')
EXEC (@sqlstatement)
You may want to add an extra filter on the columns by their type and max_length:
INNER JOIN sys.types ty ON c.system_type_id = ty.system_type_id
AND (
ty.name IN ('text', 'ntext')
OR (
ty.name IN ('varchar', 'char', 'nvarchar', 'nchar')
AND (c.max_length > 25 OR c.max_length = -1)
)
)
You will need to create dynamic SQL because you cannot dynamically specify the source table. You could do this using a cursor, or write a select statement that makes a row for each statement you need to run. This shows how to do it with a cursor (a sketch of the select-based alternative follows after the cursor example). Your problem looks like an acceptable usage for a cursor:
DECLARE @ColName VARCHAR(MAX);
DECLARE @TableName VARCHAR(MAX);
DECLARE @SomeSQL VARCHAR(MAX);
DECLARE db_cursor CURSOR FOR
SELECT c.name AS ColName, t.name AS TableName
FROM sys.columns c
JOIN sys.tables t ON c.object_id = t.object_id
WHERE c.name LIKE '%gl%acc%'
OPEN db_cursor;
FETCH NEXT FROM db_cursor INTO @ColName, @TableName;
WHILE @@FETCH_STATUS = 0
BEGIN
-- you need to make dynamic SQL
SELECT @SomeSQL = 'SELECT * FROM ' + @TableName + ' WHERE LEN(' + @ColName + ') > 25;'
PRINT(@SomeSQL + CHAR(10));
-- you could execute it directly if you wish.
--EXEC (@SomeSQL);
FETCH NEXT FROM db_cursor INTO @ColName, @TableName;
END
CLOSE db_cursor;
DEALLOCATE db_cursor;
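A sketch of the select-based alternative mentioned above: it generates one statement per table/column pair, which you can copy out of the results grid and run:
-- sketch: one SELECT statement per matching table/column pair
SELECT 'SELECT * FROM ' + QUOTENAME(t.name) + ' WHERE LEN(' + QUOTENAME(c.name) + ') > 25;'
FROM sys.columns c
JOIN sys.tables t ON c.object_id = t.object_id
WHERE c.name LIKE '%gl%acc%';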
I wasn't sure if you needed to do anything with the results, but this will return the records that meet the criteria you posted in your question
Declare @TableName sysname
Declare @ColName sysname
Declare @dynamic_SQL varchar(MAX)
Declare some_cursor CURSOR FOR
SELECT c.name AS ColName, t.name AS TableName
FROM sys.columns c
JOIN sys.tables t ON c.object_id = t.object_id
WHERE c.name LIKE '%gl%acc%'
OPEN some_cursor
FETCH NEXT FROM some_cursor INTO @ColName, @TableName
WHILE @@FETCH_STATUS = 0
Begin
select @dynamic_SQL = '
Select *
From ' + @TableName + '
Where LEN('+ @ColName +') > 25
'
exec (@dynamic_SQL)
FETCH NEXT FROM some_cursor INTO @ColName, @TableName
End
CLOSE some_cursor
DEALLOCATE some_cursor
I have an SQL Server 2008 database with many tables. I've been using the now lame datetime datatype and want to use the new and better datetime2. In most places where I have a datetime field, the corresponding column name is Timestamp. Is there any way to do a bulk change from datetime to datetime2?
Run this in Management Studio, copy the result and paste it into a new Query Window:
select 'ALTER TABLE ' + OBJECT_NAME(o.object_id) +
' ALTER COLUMN ' + c.name + ' DATETIME2 ' +
CASE WHEN c.is_nullable = 0 THEN 'NOT NULL' ELSE 'NULL' END
from sys.objects o
inner join sys.columns c on o.object_id = c.object_id
inner join sys.types t on c.system_type_id = t.system_type_id
where o.type='U'
and c.name = 'Timestamp'
and t.name = 'datetime'
order by OBJECT_NAME(o.object_id)
Data type alteration generally requires ALTER TABLE statements:
ALTER TABLE myTable ALTER COLUMN timestamp datetime2 [NOT] NULL
To change all the datetime columns into datetime2 in a given database & schema:
DECLARE @SQL AS NVARCHAR(4000)
DECLARE @table_name AS NVARCHAR(255)
DECLARE @column_name AS NVARCHAR(255)
DECLARE @isnullable AS BIT
DECLARE CUR CURSOR FAST_FORWARD FOR
SELECT c.table_name,
c.column_name,
CASE WHEN c.is_nullable = 'YES' THEN 1 ELSE 0 END AS is_nullable
FROM INFORMATION_SCHEMA.COLUMNS c
WHERE c.data_type = 'datetime'
AND c.table_catalog = 'your_database'
AND c.table_schema = 'your_schema'
-- AND c.table_name = 'your_table'
OPEN CUR
FETCH NEXT FROM CUR INTO @table_name, @column_name, @isnullable
WHILE @@FETCH_STATUS = 0
BEGIN
SELECT @SQL = 'ALTER TABLE ' + @table_name + ' ALTER COLUMN ' + @column_name + ' datetime2' + (CASE WHEN @isnullable = 1 THEN '' ELSE ' NOT' END) + ' NULL;'
EXEC sp_executesql @SQL
FETCH NEXT FROM CUR INTO @table_name, @column_name, @isnullable
END
CLOSE CUR;
DEALLOCATE CUR;
This would be a bit of a brute-force method, but you could always look up all columns of datatype datetime using the sys.columns view, grab the table name and column name, iterate over that list with a cursor, and for each entry generate an ALTER TABLE statement like so:
ALTER TABLE @tablename ALTER COLUMN @columnname datetime2
Then run said statement with EXEC. Obviously, you'd need to have permissions both to query sys.columns and to ALTER all of those tables...
Apologies there isn't more code in this answer - don't have a copy of SSMS on this machine, and can't remember the syntax for all of that from memory. :)
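A rough sketch of that lookup (a guess at the exact syntax, generating the ALTER statements rather than executing them):
-- sketch: generate one ALTER TABLE statement per datetime column
SELECT 'ALTER TABLE ' + QUOTENAME(OBJECT_SCHEMA_NAME(tbl.object_id)) + '.'
     + QUOTENAME(tbl.name)
     + ' ALTER COLUMN ' + QUOTENAME(c.name) + ' datetime2;'
FROM sys.columns c
JOIN sys.tables tbl ON c.object_id = tbl.object_id
JOIN sys.types t ON c.system_type_id = t.system_type_id
WHERE t.name = 'datetime';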
I would use a query window, and output all of the ALTER TABLE statements you need to perform this. Once you have them all generated, you can run the result against the database.
If you select from SYSCOLUMNS the names of the tables and fields that you want, you can generate the statements you need to change all of the columns in the database to datetime2.
ALTER TABLE {tablename} ALTER COLUMN {fieldname} datetime2 [NULL | NOT NULL]
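A sketch of that generator, using the legacy SYSCOLUMNS/SYSOBJECTS views the answer mentions (the nullability handling here is a guess):
SELECT 'ALTER TABLE ' + o.name + ' ALTER COLUMN ' + c.name + ' datetime2 '
     + CASE WHEN c.isnullable = 1 THEN 'NULL' ELSE 'NOT NULL' END + ';'
FROM syscolumns c
JOIN sysobjects o ON c.id = o.id
JOIN systypes t ON c.xtype = t.xtype
WHERE o.xtype = 'U'       -- user tables only
  AND t.name = 'datetime';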
You can do something like this:
SELECT
'ALTER TABLE [' + Table_Schema+'].['+Table_Name
+'] Alter Column ['+Column_Name+'] datetime2;'
FROM
INFORMATION_SCHEMA.COLUMNS
WHERE DATA_TYPE='datetime';
Now you have all the scripts necessary to make your bulk type change.
Ref: https://www.sqlservercentral.com/forums/topic/how-to-change-column-type-on-all-tables-of-a-certain-database