How to execute a query for each cell in a database - SQL

I need to replace whitespace with NULL in every cell in my database (SQL Server 2008 R2). I'm looking for something efficient, but it looks like a cursor is the only way?

First find all tables and columns that are Nullable and of type CHAR or VARCHAR, using INFORMATION_SCHEMA:
SELECT TABLE_NAME, COLUMN_NAME
FROM MyDatabase.INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_SCHEMA = 'dbo'
AND IS_NULLABLE = 'YES'
AND DATA_TYPE IN ('char', 'varchar')

Cursors are NEVER the answer :)
ypercube did the hard part. You need to take that info and iterate through it, executing an update statement for each column in each table. You can do that using a WHILE statement. Here is an UNTESTED example of how you could do this:
--Set database to use
USE [MyDatabase];
GO
--Create table variable to hold table/column pairs
DECLARE @table TABLE (
[Key] BIGINT PRIMARY KEY IDENTITY (1, 1),
[TABLE_NAME] VARCHAR(100),
[COLUMN_NAME] VARCHAR(100)
);
--Populate table variable with nullable string columns only
INSERT INTO @table ([TABLE_NAME], [COLUMN_NAME])
SELECT [TABLE_NAME], [COLUMN_NAME]
FROM MyDatabase.INFORMATION_SCHEMA.COLUMNS
WHERE [TABLE_SCHEMA] = 'dbo'
AND [IS_NULLABLE] = 'YES'
AND [DATA_TYPE] IN ('char', 'varchar');
--Initialize counting variables
DECLARE @counter BIGINT = 1;
DECLARE @max BIGINT = (SELECT COUNT(1) FROM @table);
--Iterate through each pair
WHILE @counter <= @max
BEGIN
--Assign the current pair values to variables
DECLARE @TableName VARCHAR(100), @ColumnName VARCHAR(100);
SELECT @TableName = [TABLE_NAME], @ColumnName = [COLUMN_NAME]
FROM @table
WHERE [Key] = @counter;
--Execute dynamic SQL (note the leading spaces before SET and WHERE)
EXEC
(
'UPDATE [' + @TableName + ']' +
' SET [' + @ColumnName + '] = NULL' +
' WHERE RTRIM([' + @ColumnName + ']) = '''';'
);
--Increment the counter
SET @counter = @counter + 1;
END
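Since the example above is untested, a cautious way to dry-run it is to PRINT each generated statement instead of executing it; inside the loop you could temporarily swap the EXEC block for something like this sketch:
--Sketch only: preview the statement that would be run for the current table/column pair
PRINT 'UPDATE [' + @TableName + '] SET [' + @ColumnName + '] = NULL WHERE RTRIM([' + @ColumnName + ']) = '''';';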

I guess it depends on how you define whitespace. Will the following work?
update mytable set mycolumn = null where len(rtrim(ltrim(mycolumn))) = 0
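If "whitespace" should also cover tabs and line breaks, a hedged variant of the same idea (mytable and mycolumn are placeholder names) strips those characters before the length check:
-- sketch only: also treat tabs (CHAR(9)), carriage returns (CHAR(13)) and line feeds (CHAR(10)) as whitespace
update mytable
set mycolumn = null
where len(replace(replace(replace(mycolumn, char(9), ''), char(13), ''), char(10), '')) = 0
LEN ignores trailing spaces, so values made up only of spaces, tabs, and line breaks end up with a length of zero.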

Related

Verify all columns can convert from varchar to float

I have tried a bunch of different ways, like using cursors and dynamic SQL, but is there a fast way to verify that all columns in a given table can convert from varchar to float (without altering the table)?
I want to get a printout of which columns fail and which columns pass.
I am trying the method below, but it is slow and doesn't give me the list of columns that pass or fail.
drop table users_1;
select *
into users_1
from users
declare @cols table (i int identity, colname varchar(100))
insert into @cols
select column_name
from information_schema.COLUMNS
where TABLE_NAME = 'users'
and COLUMN_NAME not in ('ID')
declare @i int, @maxi int
select @i = 1, @maxi = MAX(i) from @cols
declare @sql nvarchar(max)
while(@i <= @maxi)
begin
select @sql = 'alter table users_1 alter column ' + colname + ' float NULL'
from @cols
where i = @i
exec sp_executesql @sql
select @i = @i + 1
end
I found this code on one of the SQL tutorial sites.
Why all the drop/create/alter nonsense? If you just want to know if a column could be altered, why leave your table in a wacky state, where the columns that can be altered are altered, and the ones that can't just raise errors?
Here's one way to accomplish this with dynamic SQL (and with some protections):
DECLARE @tablename nvarchar(513) = N'dbo.YourTableName';
IF OBJECT_ID(@tablename) IS NOT NULL
BEGIN
DECLARE @sql nvarchar(max) = N'SELECT ',
@tmpl nvarchar(max) = N'[Can $colP$ be converted?]
= CASE WHEN EXISTS
(
SELECT 1 FROM ' + @tablename + N'
WHERE TRY_CONVERT(float, COALESCE($colQ$,N''0'')) IS NULL
)
THEN ''No, $colP$ cannot be converted''
ELSE ''Yes, $colP$ CAN be converted'' END';
SELECT @sql += STRING_AGG(
REPLACE(REPLACE(@tmpl, N'$colQ$',
QUOTENAME(name)), N'$colP$', name), N',')
FROM sys.columns
WHERE object_id = OBJECT_ID(@tablename)
AND name <> N'ID';
EXEC sys.sp_executesql @sql;
END
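For a hypothetical dbo.YourTableName with two varchar columns, Price and Notes (names invented for illustration), the @sql string built above comes out roughly as:
SELECT [Can Price be converted?]
= CASE WHEN EXISTS
(
SELECT 1 FROM dbo.YourTableName
WHERE TRY_CONVERT(float, COALESCE([Price],N'0')) IS NULL
)
THEN 'No, Price cannot be converted'
ELSE 'Yes, Price CAN be converted' END,
[Can Notes be converted?]
= CASE WHEN EXISTS
(
SELECT 1 FROM dbo.YourTableName
WHERE TRY_CONVERT(float, COALESCE([Notes],N'0')) IS NULL
)
THEN 'No, Notes cannot be converted'
ELSE 'Yes, Notes CAN be converted' END;
Note that TRY_CONVERT requires SQL Server 2012 or later and STRING_AGG requires SQL Server 2017 or later.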
Working db<>fiddle
This is never going to be "fast" - there is no great shortcut to having to read and validate every value in the table.

How to change the datatype of all columns to a Unicode datatype for all base tables in a database

Because I need to store values with language-specific diacritics (Spanish, French, German), I am trying to change column datatypes to their Unicode equivalents:
varchar to nvarchar
char to nchar
So every column's datatype should become its respective Unicode datatype, for all tables in a specific database.
Is it possible to do this in a single statement? Doing it with an ALTER statement per column is time-consuming:
ALTER TABLE dbo.Employee
ALTER COLUMN FirstName NVARCHAR(255) NOT NULL
Many thanks.
First we create a table variable and declare the variables we need for the changes.
Then we fill the table by fetching the columns that need to change.
The query that fetches those columns is shown below and is self-explanatory.
Then, for each row in the table, we make the desired change in the database.
Note that a column may contain NULL values, so you must replace the NULLs with some value before making the column NOT NULL; I replaced them with 0.
The code below has worked correctly for me.
I used dbo as the schema name when looking up the columns; change it to match your database's schema.
--Container holding the table/column pairs to be iterated
Declare @temp1 Table
(
tablename varchar(100),
columnname varchar(100),
columnlength varchar(100),
columntype varchar(100)
)
--Fill the container with every char/varchar column of every base table in the dbo schema
Insert into @temp1
SELECT t.TABLE_NAME,c.COLUMN_NAME,c.CHARACTER_MAXIMUM_LENGTH,c.DATA_TYPE FROM information_schema.tables t
join INFORMATION_SCHEMA.COLUMNS c on t.TABLE_NAME = c.TABLE_NAME
WHERE t.table_schema='dbo'
and t.TABLE_TYPE='BASE TABLE'
and (c.DATA_TYPE = 'varchar' OR c.DATA_TYPE = 'char')
-- Variables used while processing each @temp1 record
Declare @tablename varchar(100)
Declare @columnname varchar(100)
Declare @columnlength varchar(100)
Declare @columntype varchar(100)
Declare @SQL VarChar(1000)
Declare @vary varchar(100)
Declare @final varchar(1000)
While((Select Count(*) From @temp1)>0)
Begin
--Pick the next pair (one SELECT so all four values come from the same row)
Select Top 1 @tablename = tablename,
@columnname = columnname,
@columnlength = columnlength,
@columntype = columntype
From @temp1
if(@columntype = 'varchar')
Set @columntype='nvarchar'
else
Set @columntype='nchar'
--replace NULL values with 0 so the column can be made NOT NULL
SELECT @SQL = 'UPDATE ' + @tablename + ' SET ' + @columnname + ' = 0 WHERE ' + @columnname + ' IS NULL'
Exec ( @SQL)
--Note: CHARACTER_MAXIMUM_LENGTH is -1 for (max) columns, so this assumes no varchar(max) columns
SELECT @SQL = 'ALTER TABLE '
SELECT @SQL = @SQL + @tablename
select @vary = ' ALTER COLUMN ' + @columnname + ' ' + @columntype + '(' + @columnlength + ') NOT NULL'
select @final = @sql + @vary
--select @final
Exec ( @final)
Delete @temp1 Where tablename=@tablename and columnname = @columnname
End
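After the script finishes, a quick sanity check (my own addition, not part of the original answer) is to re-run the column search; it should return no rows:
--Sketch: list any remaining non-Unicode string columns in dbo base tables
SELECT t.TABLE_NAME, c.COLUMN_NAME, c.DATA_TYPE
FROM information_schema.tables t
join INFORMATION_SCHEMA.COLUMNS c on t.TABLE_NAME = c.TABLE_NAME
WHERE t.table_schema = 'dbo'
and t.TABLE_TYPE = 'BASE TABLE'
and (c.DATA_TYPE = 'varchar' OR c.DATA_TYPE = 'char')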

Get column names in SQL Server that satisfy a WHERE condition on data

I just had a random doubt while working with SQL Server that I thought I could get clarified here.
Say I have a condition where I want to find all the column names in the database whose data satisfies my WHERE condition.
Example:
There are some 20-30 tables in a SQL Server DB. All I need is a query to find the list of column names that contain "Ritesh" as a data value.
I don't know if it is really possible in the first place.
I hope I am clear. Any help will be most appreciated.
Thank you.
Ritesh.
This should work, but be aware that it will take a while to execute on large databases. I have assumed that the search string might be just part of the data contained, so I am using wildcards. I feel this is purely academic, as I am unable to imagine a scenario where this would be required.
--You need to iterate over every column of every table to find the matching records
--You also need dynamic SQL, because the table and column names are not known in advance
DECLARE @Value varchar(50) --value to search the columns for
SET @Value = 'Ritesh'
CREATE TABLE #Table
(
TableName Varchar(500),ColumnName Varchar(500),
Id int Identity(1,1) --used for iteration
)
CREATE TABLE #Results
(
TableName varchar(500),
ColumnName varchar(500)
)
INSERT INTO #Table
SELECT
TABLE_SCHEMA + '.' + TABLE_NAME AS TableName,
Column_name AS ColumnName
FROM INFORMATION_SCHEMA.COLUMNS
WHERE Data_type IN ('char', 'nchar', 'varchar', 'nvarchar')
--change the datatypes based on the datatype of the sample data you provide
--also remember to change the wildcard if the input datatype is not a string
DECLARE @Count Int --total records to iterate
SET @Count = 0;
SELECT
@Count = COUNT(*)
FROM #Table
DECLARE @I int --iteration counter, starting at one
SET @I = 1;
DECLARE @TableName varchar(500)
SET @TableName = ''
DECLARE @ColumnName varchar(500)
SET @ColumnName = ''
DECLARE @Str nvarchar(1000)
SET @Str = ''
DECLARE @param nvarchar(1000)
SET @param = ''
DECLARE @TableNameFound varchar(max)
SET @TableNameFound = ''
DECLARE @Found bit
SET @Found = 0;
WHILE @I <= @Count
BEGIN
SET @Found = 0;
SELECT
@TableName = TableName,
@ColumnName = ColumnName
FROM #Table
WHERE Id = @I;
SET @param = '@TableName varchar(500),@ColumnName varchar(500),@Value varchar(50),@TableNameFound varchar(max),@Found bit output'
SET @Str = 'Select @Found=1 From ' + @TableName + ' where ' + @ColumnName + ' Like ' + '''' + '%' + @Value + '%' + ''''
-- the table name, column name and search value are concatenated into the statement; @Found comes back as an output parameter
EXEC sp_executesql @Str,
@param,
@TableName,
@ColumnName,
@Value,
@TableNameFound,
@Found OUTPUT
IF @Found=1
BEGIN
INSERT INTO #Results (TableName, ColumnName)
SELECT
@TableName,
@ColumnName
END
--increment value of @I
SET @I = @I + 1;
END
--Display Results
SELECT * FROM #Results
--Clean Up
DROP TABLE #Table
DROP TABLE #Results
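To make the dynamic part concrete: for a hypothetical #Table row holding 'dbo.Customers' and 'FirstName' (names invented for illustration), the statement built into @Str and executed would be:
Select @Found=1 From dbo.Customers where FirstName Like '%Ritesh%'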

How can I update all empty string fields in a table to be null?

I have a table with 15 columns (10 of them hold string values) and about 50,000 rows.
It contains a lot of empty string values ... I am looking for a query to which I can pass a table name, and which iterates over all values and updates any that equal the empty string to NULL.
UPDATE mytable
SET col1 = NULLIF(col1, ''),
col2 = NULLIF(col2, ''),
...
This is a simple way to do it for a single table: just pass the proc the table name. You can also make a sister proc that loops through table names and calls this proc inside the loop to work on each table (a sketch of that follows the procedure below).
CREATE PROC setNullFields
(@TableName NVARCHAR(100))
AS
CREATE TABLE #FieldNames
(
pk INT IDENTITY(1, 1) ,
Field NVARCHAR(1000) NULL
);
--only string columns can hold an empty string, so restrict to those
INSERT INTO #FieldNames
SELECT column_name FROM INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_NAME = @TableName
AND DATA_TYPE IN ('char', 'nchar', 'varchar', 'nvarchar')
DECLARE @maxPK INT;
SELECT @maxPK = MAX(PK) FROM #FieldNames
DECLARE @pk INT;
SET @pk = 1
DECLARE @dynSQL NVARCHAR(1000)
WHILE @pk <= @maxPK
BEGIN
DECLARE @CurrFieldName NVARCHAR(100);
SET @CurrFieldName = (SELECT Field FROM #FieldNames WHERE PK = @pk)
-- update the field to null here:
SET @dynSQL = 'UPDATE ' + @TableName + ' SET ' + @CurrFieldName + ' = NULLIF(' + @CurrFieldName + ', '''' )'
EXEC (@dynSQL)
SELECT @pk = @pk + 1
END
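As a minimal sketch of the sister proc idea (the loop and variable names below are my own illustration and assume the setNullFields proc above), you could iterate over the base tables of a schema and call the proc for each one:
--Sketch only: call setNullFields for every base table in the dbo schema
DECLARE @Tables TABLE (pk INT IDENTITY(1, 1), TableName NVARCHAR(100));
INSERT INTO @Tables (TableName)
SELECT TABLE_NAME
FROM INFORMATION_SCHEMA.TABLES
WHERE TABLE_TYPE = 'BASE TABLE'
AND TABLE_SCHEMA = 'dbo';
DECLARE @i INT = 1;
DECLARE @max INT = (SELECT MAX(pk) FROM @Tables);
DECLARE @t NVARCHAR(100);
WHILE @i <= @max
BEGIN
SELECT @t = TableName FROM @Tables WHERE pk = @i;
EXEC setNullFields @TableName = @t;
SET @i = @i + 1;
END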

SQL To search the entire MS SQL 2000 database for a value

I would like to search an entire MS SQL 2000 database for one value. This would be to aid development only. Keep that in mind when considering this question.
This will get all the table names and the column of the data type I'm looking for:
SELECT Columns.COLUMN_NAME, tables.TABLE_NAME
FROM INFORMATION_SCHEMA.Columns as Columns
JOIN INFORMATION_SCHEMA.TABLES as tables
On Columns.TABLE_NAME = tables.TABLE_NAME
WHERE Columns.DATA_TYPE = 'INT'
I was thinking something like this:
-- Vars
DECLARE @COUNTER INT
DECLARE @TOTAL INT
DECLARE @TABLE CHAR(128)
DECLARE @COLUMN CHAR(128)
DECLARE @COLUMNTYPE CHAR(128)
DECLARE @COLUMNVALUE INT
-- What we are looking for
SET @COLUMNTYPE = 'INT'
SET @COLUMNVALUE = 3
SET @COUNTER = 0
-- Find out how many possible columns exist
SELECT @TOTAL = COUNT(*)
FROM INFORMATION_SCHEMA.Columns as Columns
JOIN INFORMATION_SCHEMA.TABLES as tables
On Columns.TABLE_NAME = tables.TABLE_NAME
WHERE Columns.DATA_TYPE = @COLUMNTYPE
PRINT CAST(@TOTAL AS CHAR) + 'possible columns'
WHILE @COUNTER < @TOTAL
BEGIN
SET @COUNTER = @COUNTER +1
-- ADD MAGIC HERE
END
Any ideas?
UPDATE: I recently found this tool that works quite well.
Since it is dev only (and probably doesn't have to be very elegant), how about using TSQL to generate a pile of TSQL that you then copy back into the query window and execute?
SELECT 'SELECT * FROM [' + tables.TABLE_NAME + '] WHERE ['
+ Columns.Column_Name + '] = ' + CONVERT(varchar(50),@COLUMNVALUE)
FROM INFORMATION_SCHEMA.Columns as Columns
INNER JOIN INFORMATION_SCHEMA.TABLES as tables
On Columns.TABLE_NAME = tables.TABLE_NAME
WHERE Columns.DATA_TYPE = @COLUMNTYPE
It won't be pretty, but it should work... an alternative might be to insert something like the above into a table-variable, then loop over the table-variable using EXEC (@Sql). But for dev purposes it probably isn't worth it...
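A minimal sketch of that alternative (the @Statements table variable and loop below are my own illustration, reusing the @COLUMNTYPE and @COLUMNVALUE variables declared in the question):
-- Sketch only: capture the generated statements, then run each one with EXEC()
DECLARE @Statements TABLE (i INT IDENTITY(1,1), Sql VARCHAR(1000))
DECLARE @i INT, @max INT, @Sql VARCHAR(1000)
INSERT INTO @Statements (Sql)
SELECT 'SELECT * FROM [' + tables.TABLE_NAME + '] WHERE ['
+ Columns.COLUMN_NAME + '] = ' + CONVERT(varchar(50), @COLUMNVALUE)
FROM INFORMATION_SCHEMA.Columns as Columns
INNER JOIN INFORMATION_SCHEMA.TABLES as tables
On Columns.TABLE_NAME = tables.TABLE_NAME
WHERE Columns.DATA_TYPE = @COLUMNTYPE
SET @i = 1
SELECT @max = MAX(i) FROM @Statements
WHILE @i <= @max
BEGIN
SELECT @Sql = Sql FROM @Statements WHERE i = @i
EXEC (@Sql)
SET @i = @i + 1
END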
I've found this script to be helpful... but as Marc noted, it wasn't really worth it. I've only used it a handful of times since I wrote it six months ago.
It only really comes in handy because there are a couple of tables in our dev environment which cause binding errors when you query them, and I always forget which ones.
BEGIN TRAN
declare @search nvarchar(100)
set @search = 'string to search for'
-- search whole database for text
SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED
IF nullif(object_id('tempdb..#tmpSearch'), 0) IS NOT NULL DROP TABLE #tmpSearch
CREATE TABLE #tmpSearch (
ListIndex int identity(1,1),
CustomSQL nvarchar(2000)
)
Print 'Getting tables...'
INSERT #tmpSearch (CustomSQL)
select 'IF EXISTS (select * FROM [' + TABLE_NAME + '] WHERE [' + COLUMN_NAME + '] LIKE ''%' + @search + '%'') BEGIN PRINT ''Table ' + TABLE_NAME + ', Column ' + COLUMN_NAME + ''';select * FROM [' + TABLE_NAME + '] WHERE [' + COLUMN_NAME + '] LIKE ''%' + @search + '%'' END' FROM information_schema.columns
where DATA_TYPE IN ('ntext', 'nvarchar', 'uniqueidentifier', 'char', 'varchar', 'text')
and TABLE_NAME NOT IN ('table_you_dont_want_to_look_in', 'and_another_one')
Print 'Searching...
'
declare @index int
declare @customsql nvarchar(2000)
WHILE EXISTS (SELECT * FROM #tmpSearch)
BEGIN
SELECT @index = min(ListIndex) FROM #tmpSearch
SELECT @customsql = CustomSQL FROM #tmpSearch WHERE ListIndex = @index
IF @customsql IS NOT NULL
EXECUTE (@customsql)
SET NOCOUNT ON
DELETE #tmpSearch WHERE ListIndex = @index
SET NOCOUNT OFF
END
print 'the end.'
ROLLBACK
print 'the end.'
ROLLBACK