Verify all columns can convert from varchar to float - sql

I have tried a bunch of different ways, like using cursors and dynamic SQL, but is there a fast way to verify that all columns in a given table can convert from varchar to float (without altering the table)?
I want a printout of which columns pass and which columns fail.
I am trying the method below now, but it is slow and I cannot get the list of columns that pass or fail.
drop table users_1;

select *
into users_1
from users;

declare @cols table (i int identity, colname varchar(100));

insert into @cols
select COLUMN_NAME
from INFORMATION_SCHEMA.COLUMNS
where TABLE_NAME = 'users'
and COLUMN_NAME not in ('ID');

declare @i int, @maxi int;
select @i = 1, @maxi = MAX(i) from @cols;

declare @sql nvarchar(max);
while (@i <= @maxi)
begin
    select @sql = 'alter table users_1 alter column ' + colname + ' float NULL'
    from @cols
    where i = @i;

    exec sp_executesql @sql;

    select @i = @i + 1;
end
I found this code on one of the SQL tutorial sites.

Why all the drop/create/alter nonsense? If you just want to know if a column could be altered, why leave your table in a wacky state, where the columns that can be altered are altered, and the ones that can't just raise errors?
Here's one way to accomplish this with dynamic SQL (and with some protections):
DECLARE @tablename nvarchar(513) = N'dbo.YourTableName';

IF OBJECT_ID(@tablename) IS NOT NULL
BEGIN
    DECLARE @sql  nvarchar(max) = N'SELECT ',
            @tmpl nvarchar(max) = N'[Can $colP$ be converted?]
      = CASE WHEN EXISTS
        (
          SELECT 1 FROM ' + @tablename + N'
          WHERE TRY_CONVERT(float, COALESCE($colQ$, N''0'')) IS NULL
        )
        THEN ''No, $colP$ cannot be converted''
        ELSE ''Yes, $colP$ CAN be converted'' END';

    SELECT @sql += STRING_AGG(
        REPLACE(REPLACE(@tmpl, N'$colQ$',
        QUOTENAME(name)), N'$colP$', name), N',')
    FROM sys.columns
    WHERE object_id = OBJECT_ID(@tablename)
      AND name <> N'ID';

    EXEC sys.sp_executesql @sql;
END
Working db<>fiddle
This is never going to be "fast" - there is no great shortcut to having to read and validate every value in the table.
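If a column comes back as "No" and you also want to see which values are blocking the conversion, a quick follow-up along the same lines works; this is just a sketch, with dbo.YourTableName and SomeColumn as placeholder names:
--Placeholder names: substitute your real table and the flagged column
SELECT DISTINCT SomeColumn
FROM dbo.YourTableName
WHERE TRY_CONVERT(float, COALESCE(SomeColumn, N'0')) IS NULL;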

Related

Select in a loop from a list of tables that have similar names with incrementing numbers added

I would like to use a query to loop through tables that have similar names but with a number appended (i.e. tableJan01, tableJan02, tableJan03, ..., tableJan30).
Is there a way in SQL Server to reuse the same query statement while varying the table name within it (similar to using parameter values)? I need this to add different input to each month's table.
declare @x nvarchar(50) = 'abc'
declare @z int = 1
while (@z < 30)
BEGIN
    SET @z = @z + 1;
    select * from (@x)
END;
This shows the error:
Must declare the scalar variable "@CharVariable".
This script shows a syntax error too:
declare @x nvarchar(50) = 'abc'
declare @z int = 1
while (@z < 30)
BEGIN
    SET @z = @z + 1;
    select * from (@x + @z)
END;
Also, even simple code like this doesn't work:
declare @x nvarchar(50) = 'abc'
select * from @x
I agree with John Cappelletti that this requirement feels like a design flaw; however, to get your list of table names you can do something like this:
declare @x nvarchar(50) = 'abc'
declare @z int = 1
declare @ListOfTableNames TABLE (TableName nvarchar(50));

while (@z < 30)
BEGIN
    SET @z = @z + 1;
    INSERT INTO @ListOfTableNames (TableName) VALUES (@x + CONVERT(NVARCHAR(20), @z))
END

SELECT * FROM @ListOfTableNames
To do dynamic SQL on these tables you could build a query string and then pass that string to the sp_executesql proc. You could put that logic in place of the line where we populate the table variable with the numbered table names. Like this:
declare @x nvarchar(50) = 'abc'
declare @z int = 1
declare @sql NVARCHAR(100)

while (@z < 30)
BEGIN
    SET @z = @z + 1;
    SET @sql = 'SELECT * FROM ' + (@x + CONVERT(NVARCHAR(20), @z))
    EXEC sp_executesql @sql
END
I would completely avoid a WHILE loop and just use some pattern matching:
DECLARE @Prefix sysname = N'abc';

DECLARE @SQL nvarchar(MAX),
        @CRLF nchar(2) = NCHAR(13) + NCHAR(10);

SET @SQL = STUFF((SELECT @CRLF +
                         N'SELECT *' + @CRLF +
                         --N' ,N' + QUOTENAME(t.[name],'''') + N' AS TableName' + @CRLF + --Uncomment if wanted
                         N'FROM ' + QUOTENAME(s.[name]) + N'.' + QUOTENAME(t.[name]) + N';'
                  FROM sys.schemas s
                       JOIN sys.tables t ON s.schema_id = t.schema_id
                  WHERE t.[name] LIKE @Prefix + '%'
                    AND t.[name] NOT LIKE @Prefix + N'%[^0-9]'
                  ORDER BY t.[name]
                  FOR XML PATH(N''),TYPE).value('.','nvarchar(MAX)'),1,2,N'');

--PRINT @SQL;

EXEC sp_executesql @SQL;
DB<>Fiddle
But John is right, you certainly have a design flaw here.
Using dynamic SQL, it would look something like this:
declare
    @Base_table_name nvarchar(50) = 'my_table'
    ,@Step int = 1
    ,@SQL nvarchar(max);

while (@Step < 30)
begin
    set @SQL = 'select * from ' + @Base_table_name + right('00' + cast(@Step as nvarchar(50)), 2);
    print(@SQL); --this displays the SQL that would be run
    --exec(@SQL) --uncomment this to run the dynamic SQL
    set @Step += 1;
end;
Alternatively, you can be more precise by using the sys.schemas and sys.tables tables like so:
declare
    @Base_table_name sysname = 'my_table'
    ,@schema_name sysname = 'my_schema'
    ,@Step int = 1
    ,@StepCount int = 0
    ,@SQL nvarchar(max);

/* This will create a table variable and populate it with all the tables you'll want to query */
declare @tables_to_query table (Step int identity, SchemaName sysname, TableName sysname);

insert into @tables_to_query (SchemaName, TableName)
select
    s.name
    ,t.name
from
    sys.schemas s
    inner join
    sys.tables t on s.schema_id = t.schema_id
where
    s.name = @schema_name --this will limit the tables to this schema
    and t.name like @Base_table_name + '%'; --this will look for any table that starts with the base table name

set @StepCount = @@ROWCOUNT; --how many tables were found

/* this loops through all the tables in the table variable */
while (@Step <= @StepCount)
begin
    select
        @SQL = 'select * from ' + quotename(SchemaName) + '.' + quotename(TableName)
    from
        @tables_to_query
    where
        Step = @Step;

    print(@SQL); --this displays the SQL that would be run
    --exec(@SQL) --uncomment this to run the dynamic SQL
    set @Step += 1;
end;
The dynamic SQL approaches laid out in other answers will certainly get the job done for you, but if you find you're querying all of these tables frequently, it might serve you well to build out a VIEW and query that as needed.
In keeping with Larnu's suggestion of putting the source table name into the result set, I'd probably do something like this:
CREATE VIEW dbo.vwJan
AS
SELECT
'tableJan01' AS SourceTable,
<Column List>
FROM dbo.tableJan01
UNION ALL
...<28 other iterations>
SELECT
'tableJan30' AS SourceTable,
<Column List>
FROM dbo.tableJan30;
From there, you can go ahead and query them all to your heart's content with a single statement.
SELECT
SourceTable,
<Any other columns you're interested in>
FROM
vwJan;

Looping through a column in SQL table that contains names of other tables

I am fairly new to using SQL. Currently I have a table with a column that contains the names of all the tables I want to use for one query. What I want to do is loop through that column, go to every one of those tables, and then search one of their columns for a value (there could be multiple values); whenever a table contains the value, I will list the name of the table. Could someone give me a hint of how this is done? Is a cursor needed for this?
I don't have enough reputation to comment, but: does the table with the table names store them all in one column, meaning the table names are comma separated or marked with some other separator? That would make the query a little more complicated, as you would have to split them apart before you start looping through your table (see the sketch just below).
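A minimal sketch of that splitting step, assuming SQL Server 2016+ for STRING_SPLIT and a hypothetical column tableNames holding the comma-separated list:
--Hypothetical: TablewithTableNames stores e.g. tableNames = 'tableA,tableB,tableC' in one row
SELECT LTRIM(RTRIM(s.value)) AS tableName
INTO #NormalizedTableNames
FROM TablewithTableNames t
CROSS APPLY STRING_SPLIT(t.tableNames, ',') AS s;
--#NormalizedTableNames can then feed the cursor below instead of the raw column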
Either way, looping through the tables themselves will require a cursor, as well as some dynamic SQL.
I will give a basic example of how you can go about this.
declare @value varchar(50)
declare @tableName varchar(50)
declare @sqlstring nvarchar(max)
declare @matches int

set @value = 'whateveryouwant'

create table #temptable (tableName varchar(50))

declare @getTableName cursor
set @getTableName = cursor for
    select tableName from TablewithTableNames

open @getTableName
fetch next from @getTableName into @tableName
while @@FETCH_STATUS = 0
BEGIN
    set @sqlstring = N'select @matches = count(*) from ' + @tableName + N' where ColumnNameYouwant = @value'
    exec sp_executesql @sqlstring, N'@value varchar(50), @matches int output', @value, @matches output

    if @matches > 0
        insert into #temptable values (@tableName)

    fetch next from @getTableName into @tableName
END

select * from #temptable

drop table #temptable
close @getTableName
deallocate @getTableName
I'm currently not able to test this out due to time constraints, but this is how I would go about doing it.
You could try something like this:
--Generate dynamic SQL
DECLARE @TablesToSearch TABLE (
    TableName VARCHAR(50));
INSERT INTO @TablesToSearch VALUES ('invoiceTbl');

DECLARE @SQL TABLE (
    RowNum INT,
    SQLText VARCHAR(500));

INSERT INTO
    @SQL
SELECT
    ROW_NUMBER() OVER (ORDER BY ts.TableName) AS RowNum,
    'SELECT * FROM ' + ts.TableName + ' WHERE ' + c.name + ' = 1;'
FROM
    @TablesToSearch ts
    INNER JOIN sys.tables t ON t.name = ts.TableName
    INNER JOIN sys.columns c ON c.object_id = t.object_id;

--Now run the queries
DECLARE @Count INT;
SELECT @Count = COUNT(*) FROM @SQL;
WHILE @Count > 0
BEGIN
    DECLARE @RowNum INT;
    DECLARE @SQLText VARCHAR(500);
    SELECT TOP 1 @RowNum = RowNum, @SQLText = SQLText FROM @SQL;
    EXEC (@SQLText);
    DELETE FROM @SQL WHERE RowNum = @RowNum;
    SELECT @Count = COUNT(*) FROM @SQL;
END;
You would need to change the "1" I am using as an example to the value you are looking for, and probably add a CONVERT/CAST to make sure the column is the right data type.
You actually said that you wanted the name of the table, so you would need to change the SQL to:
'SELECT ''' + ts.TableName + ''' FROM ' + ts.TableName + ' WHERE ' + c.name + ' = 1;'
Another thought: it would probably be best to insert the results from this into a temporary table, so you can dump out the results in one go at the end.
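For example, a hedged sketch of that: assuming the dynamic SQL has been switched to the single-column SELECT '''+ts.TableName+''' ... form above, the loop can capture each result with INSERT ... EXEC:
--Collect all results in one temp table rather than one result set per query
CREATE TABLE #Results (TableName VARCHAR(50));

--inside the WHILE loop, instead of EXEC (@SQLText); use:
INSERT INTO #Results (TableName)
EXEC (@SQLText);

--after the loop, one combined result:
SELECT DISTINCT TableName FROM #Results;
DROP TABLE #Results;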

Get column names in SQL server that satisfy a where condition on data

I just had a random doubt while working with SQL Server, and I thought I could get it clarified here.
Say I want to find all the column names in the database whose data satisfies a given WHERE condition.
Example: there are some 20-30 tables in a SQL Server DB. All I need is a query to find the list of column names that contain "Ritesh" as a data value.
I don't know if it is really possible in the first place.
I hope I am clear. Any help will be most appreciated.
Thank you,
Ritesh.
This should work, but be aware that it will take a while to execute on large databases. I have assumed that the search string might be just a part of the data contained, and so am using wildcards. I feel this is purely academic, as I am unable to imagine a scenario where this will be required.
--you need to iterate over all the columns of every table to find the matching records
--another thing is that you need dynamic SQL to build and run the query for each table
DECLARE @Value varchar(50) --value to search for
SET @Value = 'Ritesh'

CREATE TABLE #Table
(
    TableName Varchar(500), ColumnName Varchar(500),
    Id int Identity(1,1) --used for iteration
)

CREATE TABLE #Results
(
    TableName varchar(500),
    ColumnName varchar(500)
)

INSERT INTO #Table
SELECT
    TABLE_SCHEMA + '.' + TABLE_NAME AS TableName,
    COLUMN_NAME AS ColumnName
FROM INFORMATION_SCHEMA.COLUMNS
WHERE DATA_TYPE IN ('char', 'nchar', 'varchar', 'nvarchar')
--change the datatypes based on the datatype of the sample data you provide
--also remember to change the wildcard if the input datatype is not a string

DECLARE @Count Int --total records to iterate over
SET @Count = 0;
SELECT
    @Count = COUNT(*)
FROM #Table

DECLARE @I int --iteration counter, starting at one
SET @I = 1;

DECLARE @TableName varchar(500)
SET @TableName = ''
DECLARE @ColumnName varchar(500)
SET @ColumnName = ''
DECLARE @Str nvarchar(1000)
SET @Str = ''
DECLARE @param nvarchar(1000)
SET @param = ''
DECLARE @Found bit
SET @Found = 0;

WHILE @I <= @Count
BEGIN
    SET @Found = 0;

    SELECT
        @TableName = TableName,
        @ColumnName = ColumnName
    FROM #Table
    WHERE Id = @I;

    SET @param = '@Found bit output'
    SET @Str = 'Select @Found = 1 From ' + @TableName + ' where ' + @ColumnName + ' Like ' + '''' + '%' + @Value + '%' + ''''
    --here we use the table name and the actual value to search that table

    EXEC sp_executesql @Str,
        @param,
        @Found OUTPUT

    IF @Found = 1
    BEGIN
        INSERT INTO #Results (TableName, ColumnName)
        SELECT
            @TableName,
            @ColumnName
    END

    --increment value of @I
    SET @I = @I + 1;
END

--Display Results
SELECT * FROM #Results

--Clean Up
DROP TABLE #Table
DROP TABLE #Results

How to execute a query for each cell in a database

I need to replace whitespace with NULL in every cell in my database (SQL Server 2008 R2). I'm looking for something efficient, but it looks like a cursor is the only way?
First find all tables and columns that are Nullable and of type CHAR or VARCHAR, using INFORMATION_SCHEMA:
SELECT TABLE_NAME, COLUMN_NAME
FROM MyDatabase.INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_SCHEMA = 'dbo'
  AND IS_NULLABLE = 'YES'
  AND DATA_TYPE IN ('char', 'varchar')
Cursors are NEVER the answer :)
ypercube did the hard part. You need to take that info and iterate through it, executing an update statement for each column in each table. You can do that using a WHILE statement. Here is an UNTESTED example of how you could do this:
--Set database to use
USE [MyDatabase];
GO

--Create table variable to hold table/column pairs
DECLARE @table TABLE (
    [Key] BIGINT PRIMARY KEY IDENTITY (1, 1),
    [TABLE_NAME] VARCHAR(100),
    [COLUMN_NAME] VARCHAR(100)
);

--Populate table variable
INSERT INTO @table ([TABLE_NAME], [COLUMN_NAME])
SELECT [TABLE_NAME], [COLUMN_NAME]
FROM MyDatabase.INFORMATION_SCHEMA.COLUMNS
WHERE [TABLE_SCHEMA] = 'dbo'
  AND [IS_NULLABLE] = 'YES'
  AND [DATA_TYPE] IN ('char', 'varchar');

--Initialize counting variables
DECLARE @counter BIGINT = 1;
DECLARE @max BIGINT = (SELECT COUNT(1) FROM @table);

--Iterate through each pair
WHILE @counter <= @max
BEGIN
    --Assign the current pair values to variables
    DECLARE @TableName VARCHAR(100), @ColumnName VARCHAR(100);
    SELECT @TableName = [TABLE_NAME], @ColumnName = [COLUMN_NAME]
    FROM @table
    WHERE [Key] = @counter;

    --Execute dynamic SQL
    EXEC
    (
        'UPDATE [' + @TableName + '] ' +
        'SET [' + @ColumnName + '] = NULL ' +
        'WHERE RTRIM([' + @ColumnName + ']) = '''';'
    );

    --Increment the counter
    SET @counter = @counter + 1;
END
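If you'd rather skip the loop entirely, a set-based sketch that just builds every UPDATE statement in one pass is also possible; this assumes SQL Server 2008 R2 (hence FOR XML PATH rather than STRING_AGG) and limits itself to char/varchar columns on base tables:
DECLARE @AllUpdates NVARCHAR(MAX);

SELECT @AllUpdates =
    (SELECT 'UPDATE ' + QUOTENAME(c.TABLE_SCHEMA) + '.' + QUOTENAME(c.TABLE_NAME) +
            ' SET ' + QUOTENAME(c.COLUMN_NAME) + ' = NULL' +
            ' WHERE LTRIM(RTRIM(' + QUOTENAME(c.COLUMN_NAME) + ')) = '''';' + CHAR(13) + CHAR(10)
     FROM INFORMATION_SCHEMA.COLUMNS c
          INNER JOIN INFORMATION_SCHEMA.TABLES t
              ON t.TABLE_SCHEMA = c.TABLE_SCHEMA AND t.TABLE_NAME = c.TABLE_NAME
     WHERE c.TABLE_SCHEMA = 'dbo'
       AND c.IS_NULLABLE = 'YES'
       AND c.DATA_TYPE IN ('char', 'varchar')
       AND t.TABLE_TYPE = 'BASE TABLE'
     FOR XML PATH(''), TYPE).value('.', 'NVARCHAR(MAX)');

--PRINT @AllUpdates;  --inspect the generated statements first
EXEC sp_executesql @AllUpdates;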
I guess it depends on how you define whitespace. Will the following work?
update mytable set mycolumn = null where len(rtrim(ltrim(mycolumn))) = 0

using temp tables in SQL Azure

I am writing a query to pivot table elements where the column names are generated dynamically.
SET @query = N'SELECT STUDENT_ID, ROLL_NO, TITLE, STUDENT_NAME, EXAM_NAME, ' +
    @cols +
    ' INTO ##FINAL
    FROM
    (
        SELECT *
        FROM #AVERAGES
        UNION
        SELECT *
        FROM #MARKS
        UNION
        SELECT *
        FROM #GRACEMARKS
        UNION
        SELECT *
        FROM #TOTAL
    ) p
    PIVOT
    (
        MAX([MARKS])
        FOR SUBJECT_ID IN
        ( ' +
        @cols + ' )
    ) AS FINAL
    ORDER BY STUDENT_ID ASC, DISPLAYORDER ASC, EXAM_NAME ASC;'

EXECUTE(@query)

select * from ##FINAL
This query works properly in my local database, but it doesn't work in SQL Azure, since global temp tables are not allowed there.
Now if I change ##FINAL to #FINAL in my local database, it gives me the error
Invalid object name '#FINAL'.
How can I resolve this issue?
Okay, after saying I didn't think it could be done, I might have a way. It's ugly though. Hopefully, you can play with the below sample and adapt it to your query (without having your schema and data, it's too tricky for me to attempt to write it):
declare @cols varchar(max)
set @cols = 'object_id,schema_id,parent_object_id'

--Create a temp table with the known columns
create table #Boris (
    ID int IDENTITY(1,1) not null
)

--Alter the temp table to add the varying columns. Thankfully, they're all ints.
--for unknown types, varchar(max) may be more appropriate, and will hopefully convert
declare @tempcols varchar(max)
set @tempcols = @cols

while LEN(@tempcols) > 0
begin
    declare @col varchar(max)
    set @col = CASE WHEN CHARINDEX(',', @tempcols) > 0 THEN SUBSTRING(@tempcols, 1, CHARINDEX(',', @tempcols) - 1) ELSE @tempcols END
    set @tempcols = CASE WHEN LEN(@col) = LEN(@tempcols) THEN '' ELSE SUBSTRING(@tempcols, LEN(@col) + 2, 10000000) END

    declare @sql1 varchar(max)
    set @sql1 = 'alter table #Boris add [' + @col + '] int null'
    exec (@sql1)
end

declare @sql varchar(max)
set @sql = 'insert into #Boris (' + @cols + ') select ' + @cols + ' from sys.objects'
exec (@sql)

select * from #Boris

drop table #Boris
The key is to create the temp table in the outer scope; inner scopes (code running within EXEC statements) then have access to the same temp table. The above worked on SQL Server 2008, but I don't have an Azure instance to play with, so it's not tested there.
If you create a temp table, it's visible from dynamic SQL executed on your spid; if you create the table inside the dynamic SQL, it's not visible outside of that.
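A quick, self-contained illustration of that scoping rule (throwaway #outer/#inner names, nothing project-specific):
--Created in the outer scope: visible to dynamic SQL on the same spid
CREATE TABLE #outer (i int);
EXEC (N'INSERT INTO #outer (i) VALUES (1);');  --works
SELECT * FROM #outer;                          --returns the row
DROP TABLE #outer;

--Created inside dynamic SQL: gone when that inner batch ends
EXEC (N'CREATE TABLE #inner (i int); INSERT INTO #inner (i) VALUES (1);');
--SELECT * FROM #inner;  --would fail: Invalid object name '#inner'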
There is a workaround: you can create a stub table and alter it in your dynamic SQL. It requires a bit of string manipulation, but I've used this technique to generate dynamic datasets for tsqlunit.
CREATE TABLE #t1
(
DummyCol int
)
EXEC(N'ALTER TABLE #t1 ADD foo INT')
EXEC ('insert into #t1(DummyCol, foo)
VALUES(1,2)')
EXEC ('ALTER TABLE #t1 DROP COLUMN DummyCol')
select * from #t1