Aliasing columns based on a cross-reference table (SQL)

I have a table with over 400 columns, and these are named with our vendor's archaic naming system. I need to move this data into a new table which uses our company's naming conventions, so I have to change the names of these 400 columns.
Fortunately, I also have a table that cross-references the current column names with what they should become, like so:
Acronym | Name
----------------
A | ColumnNameA
B | ColumnNameB
C | ColumnNameC
etc...
So my question is this:
If it were only a few rows, I could easily do
SELECT
A AS ColumnNameA,
B AS ColumnNameB
FROM
Table
But there are far too many columns to do this by hand. What's the best way to dynamically alias columns in a SELECT statement based on a cross-reference table?
My effort so far:
I was thinking something along the lines of
SET @sqlCommand = 'SELECT ' + @columns + ' FROM Table'
EXEC (@sqlCommand)
but I have no idea how to set @columns to be a dynamically generated list of all the acronyms aliased to the final column names. Is this even a viable approach?

DECLARE @sql NVARCHAR(MAX) = N'';
SELECT @sql += ',' + QUOTENAME(Acronym) + ' AS ' + QUOTENAME(Name)
FROM dbo.AcronymTable;
SET @sql = 'SELECT ' + STUFF(@sql, 1, 1, N'') + ' FROM dbo.Table;';
PRINT @sql;
--EXEC sp_executesql @sql;
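For illustration only, here is the same metadata-driven pattern sketched outside SQL Server, in Python with sqlite3 (the table and column names are invented): read the cross-reference rows, build the alias list, then execute the generated SELECT.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Hypothetical vendor table and cross-reference table
cur.execute("CREATE TABLE VendorTable (A INTEGER, B INTEGER, C INTEGER)")
cur.execute("INSERT INTO VendorTable VALUES (1, 2, 3)")
cur.execute("CREATE TABLE AcronymTable (Acronym TEXT, Name TEXT)")
cur.executemany("INSERT INTO AcronymTable VALUES (?, ?)",
                [("A", "ColumnNameA"), ("B", "ColumnNameB"), ("C", "ColumnNameC")])

# Build the alias list from the cross-reference rows, as the T-SQL above
# does with QUOTENAME and STUFF
aliases = ", ".join(
    f'"{acr}" AS "{name}"'
    for acr, name in cur.execute("SELECT Acronym, Name FROM AcronymTable")
)
sql = f"SELECT {aliases} FROM VendorTable"

cur.execute(sql)
new_names = [d[0] for d in cur.description]
print(new_names)  # column names after aliasing
```

The point is the build-then-execute flow: the SELECT list is assembled as a string from metadata, not typed by hand.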

The easiest way is to add an int field to the conversion table that determines the order of the columns, so that the source order matches the target. Then you can:
DECLARE @sql varchar(max)
SET @sql = 'INSERT INTO dbo.Target SELECT '
SELECT @sql = @sql + Acronym + ','
FROM ConversionTable
ORDER BY OrderColumn
SET @sql = LEFT(@sql, LEN(@sql) - 1)
SET @sql = @sql + ' FROM SourceTable'
EXEC (@sql)

Related

SELECT only values defined in INFORMATION_SCHEMA

Could you please help me with the following issue?
I have a source table, and a list of column names taken from INFORMATION_SCHEMA.COLUMNS.
In the output I'd like to take my source table, but show only the values whose column name matches a column name defined in the information schema.
Is it possible? Many thanks in advance
You need to use dynamic SQL to do that:
declare @sql varchar(1000) = 'select ';
select @sql = @sql + '[' + column_name + '],' from INFORMATION_SCHEMA.COLUMNS where table_name = 'MyTable';
-- remove the last character in the string, which is a comma
select @sql = left(@sql, len(@sql) - 1);
-- you need to change the table name here
select @sql = @sql + ' from MyTable';
-- execute the statement
exec(@sql)

Querying total row size in KBs for SQL Server DB based on date

I've searched for this up and down, but I can't find how to query the size of a set of rows.
When I query the DB for the rows themselves, that's simple enough.
SELECT *
FROM dbo.[tablename]
WHERE CreatedDate < '2012-12-31 00:00:00'
But I'm not sure how to apply something like sp_spaceused to this.
We need to sum the data size of each column, like below:
select SUM(datalength(col1))+SUM(datalength(col2))+.. from tableName
WHERE CreatedDate < '2012-12-31 00:00:00'
Here is a dynamic query that fetches the columns for the table, adds up the size of each column in a row, and sums the total.
declare @table nvarchar(128)
declare @whereClause nvarchar(100)
declare @sql nvarchar(max)
-- initialize these two values
set @table = 'tableName'
set @whereClause = ' where CreatedDate < ''2012-12-31 00:00:00'' '
set @sql = 'select ' + ' sum((0'
select @sql = @sql + ' + isnull(datalength(' + quotename(name) + '), 1)'
from sys.columns where object_id = object_id(@table)
set @sql = @sql + ')) as totalSize from ' + @table + @whereClause
select @sql
exec (@sql)

how do I select records that are like some string for any column in a table?

I know that I can search for a term in one column in a table in t-sql by using like %termToFind%. And I know I can get all columns in a table with this:
SELECT *
FROM MyDataBaseName.INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_NAME = N'MyTableName'
How can I perform a LIKE comparison on each of the columns of a table? I have a very large table, so I can't just spell out LIKE for each column.
As always, I'll suggest xml for this (I'd suggest JSON if SQL Server had native support for it :) ). You can try this query, though it may not perform well on a large number of rows:
;with cte as (
select
*,
(select t.* for xml raw('data'), type) as data
from test as t
)
select *
from cte
where data.exist('data/@*[local-name() != "id" and contains(., sql:variable("@search"))]') = 1
See the sql fiddle demo for a more detailed example.
Important note by Alexander Fedorenko in the comments: be aware that the contains function is case-sensitive and uses the xQuery default Unicode codepoint collation for the string comparison.
More general way would be to use dynamic SQL solution:
declare @search nvarchar(max)
declare @stmt nvarchar(max)
select @stmt = isnull(@stmt + ' or ', '') + quotename(name) + ' like @search'
from sys.columns as c
where c.[object_id] = object_id('dbo.test')
--
-- also possible
--
-- select @stmt = isnull(@stmt + ' or ', '') + quotename(column_name) + ' like @search'
-- from INFORMATION_SCHEMA.COLUMNS
-- where TABLE_NAME = 'test'
select @stmt = 'select * from test where ' + @stmt
exec sp_executesql
@stmt = @stmt,
@params = N'@search nvarchar(max)',
@search = @search
sql fiddle demo
I'd use dynamic SQL here.
Full credit - this answer was initially posted by another user, and deleted. I think it's a good answer so I'm re-adding it.
DECLARE @sql NVARCHAR(MAX);
DECLARE @table NVARCHAR(50);
DECLARE @term NVARCHAR(50);
SET @term = '%term to find%';
SET @table = 'TableName';
SET @sql = 'SELECT * FROM ' + @table + ' WHERE '
SELECT @sql = @sql + COALESCE('CAST(' + column_name
+ ' as NVARCHAR(MAX)) like N''' + @term + ''' OR ', '')
FROM INFORMATION_SCHEMA.COLUMNS WHERE [TABLE_NAME] = @table
SET @sql = @sql + ' 1 = 0'
SELECT @sql
EXEC sp_executesql @sql
The XML answer is cleaner (I prefer dynamic SQL only when necessary) but the benefit of this is that it will utilize any index you have on your table, and there is no overhead in constructing the XML CTE for querying.
In case someone is looking for a PostgreSQL solution:
SELECT * FROM table_name WHERE position('your_value' IN (table_name.*)::text)>0
will select all records that have 'your_value' in any column. Didn't try this with any other database.
Unfortunately this works by combining all columns into one text string and then searching for the value in that string, so I don't know a way to make it match a "whole cell" only. It will always match if any part of any cell matches 'your_value'.
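If whole-cell matching is what you need, one fallback is the dynamic per-column OR approach from the earlier answers, with equality instead of LIKE. A minimal sketch in Python with sqlite3 (the table and data are invented):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE test (id INTEGER, a TEXT, b TEXT)")
cur.executemany("INSERT INTO test VALUES (?, ?, ?)",
                [(1, "foo", "bar"), (2, "foobar", "baz"), (3, "qux", "foo")])

# Read the column list from the catalog, then OR together exact comparisons.
cols = [row[1] for row in cur.execute("PRAGMA table_info(test)") if row[1] != "id"]
where = " OR ".join(f'"{c}" = ?' for c in cols)
sql = f"SELECT id FROM test WHERE {where}"

# Equality matches only whole cells, not substrings: "foobar" is skipped.
ids = [r[0] for r in cur.execute(sql, ["foo"] * len(cols))]
print(ids)
```

The same idea translates directly to the dynamic SQL answers above by replacing `like @search` with `= @search`.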

How to select output of XML Path query in SQL?

The Doc table contains a lot of columns (even unused ones):
Doc_DdfID Doc_RentDate Doc_ReturnDate etc.
--------- ------------ -------------- ----
1 2012-07-28 2012-07-28
But I want to query just the used ones within Doc's table.
DocDefinitionFields lists the columns that are in use by a document:
SELECT Dfl_ColName
FROM DocDefinitionFields
WHERE Dfl_DdfID = 1
DocDefinitionFields:
Dfl_ColName
-----------
Doc_RentDate
Doc_ReturnDate
...........
So I want to select all columns (listed by second query) from Doc table.
Example (if 2 columns are added to the document definition form, I want to select just them):
Doc:
Doc_RentDate Doc_ReturnDate
------------ --------------
2012-07-28 2012-07-28
I tried to do that by subquerying the select with a concatenation of the fields using XML PATH:
SELECT
(SELECT
Dfl_ColName + ', '
FROM DocDefinitionFields
FOR XML PATH('')
)
FROM Doc
It's not that simple, though. What do you suggest?
What you need here is dynamic SQL, something like this:
DECLARE @sql VARCHAR(MAX)
SET @sql = 'SELECT ' + STUFF((SELECT ', ' + Dfl_ColName FROM DocDefinitionFields WHERE Dfl_DdfID = 1 FOR XML PATH('')), 1, 2, '') + ' FROM Doc'
EXEC (@sql)
Also, the FOR XML PATH('') trick leaves an extra comma at the start of the column list, so I have added the STUFF function to remove it.
To get the column names dynamically and use them in a query, you will need dynamic SQL. Below is an example of how to build the string of available columns.
DECLARE @Columns VARCHAR(MAX);
SELECT @Columns =
COALESCE(@Columns + ',
[' + CAST(COLUMN_NAME AS VARCHAR(128)) + ']',
'[' + CAST(COLUMN_NAME AS VARCHAR(128)) + ']')
FROM (SELECT DISTINCT COLUMN_NAME
FROM [SomeDatabase].INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_NAME = N'SomeTableName') AS H
ORDER BY COLUMN_NAME;
-- note: no GO here, or @Columns would go out of scope before the next statement
You can now use the string of available columns in a dynamic SQL query. Below we have adopted the above in an INSERT query that builds the required field list dynamically. The reason we need to do it this way is the inclusion of the set field SomeFieldA along with the others.
DECLARE @SQL NVARCHAR(MAX);
DECLARE @DbName SYSNAME = N'SomeDatabase'; -- set this to your target database name
SET @SQL =
N'INSERT INTO [' + @DbName + N']..[SomeOtherTable] ([SomeFieldA], ' + @Columns + N')
SELECT SomeFieldA, ' + @Columns + N'
FROM [SomeTableName];';
EXEC sp_executesql @SQL;
You should be able to amend the above to provide what you need.
I hope this helps.

SQL sum across columns by name

I have a table with lots of columns, some of them with names beginning with "EQ". For an individual row, I'd like to sum all the values in the columns that start with "EQ", but not the other values. I know I can do it like this:
select EQ_DOMESTIC + EQ_INTL + EQ_OTHER from myTable where id=1
However, I have lots of columns, and I was wondering if I can do it systematically, without typing in the name of each column. Would I have to get the column names from the system tables in another query?
Follow up question: Some of the values are nulls, which makes the sum NULL. Is there any way to avoid writing out ISNULL(column,0) for the sum?
You can do this pretty easily with dynamic SQL:
DECLARE @sql NVARCHAR(MAX) = N'';
SELECT @sql += N'
+ COALESCE(' + QUOTENAME(name) + ', 0)'
FROM sys.columns
WHERE [object_id] = OBJECT_ID('dbo.MyTable')
AND name LIKE 'EQ[_]%';
SELECT @sql += N',' + QUOTENAME(name)
FROM sys.columns
WHERE [object_id] = OBJECT_ID('dbo.MyTable')
AND name NOT LIKE 'EQ[_]%';
SELECT @sql = 'SELECT [EQ_SUM] = 0' + @sql
+ ' FROM dbo.MyTable WHERE id = 1;';
PRINT @sql;
-- EXEC sp_executesql @sql;
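To see the end result of this pattern, here is an equivalent sketch in Python with sqlite3 (the table and column names are invented): collect the EQ-prefixed columns from the catalog, wrap each in COALESCE so NULLs count as 0, and run the generated sum.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE MyTable (id INTEGER, EQ_DOMESTIC REAL, "
            "EQ_INTL REAL, EQ_OTHER REAL, BOND_X REAL)")
cur.execute("INSERT INTO MyTable VALUES (1, 10.0, NULL, 5.0, 99.0)")

# Collect the EQ-prefixed columns from the catalog and wrap each in COALESCE
eq_cols = [row[1] for row in cur.execute("PRAGMA table_info(MyTable)")
           if row[1].startswith("EQ_")]
expr = " + ".join(f'COALESCE("{c}", 0)' for c in eq_cols)
sql = f"SELECT {expr} AS EQ_SUM FROM MyTable WHERE id = 1"

(eq_sum,) = cur.execute(sql).fetchone()
print(eq_sum)  # NULL treated as 0, BOND_X excluded
```

This also answers the follow-up question: wrapping each column in COALESCE (or ISNULL in T-SQL) inside the generated string means you never write it out by hand.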