Sum a range of columns in SQL

I'm using SQL Server 2016. Suppose I have columns sales1, sales2, ..., sales100 in a dataset, along with some other column, say name. I want to write a SQL query like
SELECT name, SUM(sales1), SUM(sales2),..., SUM(sales100)
FROM dataset
GROUP BY name
but I don't believe this is valid SQL syntax. Is there a shorthand to do this?

There is no shorthand, but SSMS provides a simple way to do this.
From Object Explorer, expand your table and drag the Columns folder into a query window.
It will give you a comma-separated list of the columns.
Oh, but that is not what you wanted!
Replace ',' with '), SUM(' and with minor tweaking you can have the desired string.
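For example, a dragged list such as the one below (column names purely illustrative) turns into the aggregate list after that find-and-replace plus wrapping the two ends:
sales1, sales2, sales3
-- becomes:
SUM(sales1), SUM(sales2), SUM(sales3)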
Or you could try this:
DECLARE @SQL VARCHAR(MAX) = ''
SELECT @SQL += 'SUM(' + column_name + '), '
FROM INFORMATION_SCHEMA.COLUMNS
WHERE table_name = 'mytable'
AND column_name LIKE 'sales%'
ORDER BY ORDINAL_POSITION
SELECT @SQL
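To turn that fragment into something you can actually execute, here is a sketch along the same lines (assuming, as in the question, the table is called dataset and the grouping column is name; this part is my own illustration, not from the original answer):
DECLARE @cols NVARCHAR(MAX) = ''
DECLARE @sql NVARCHAR(MAX)
-- build ", SUM([sales1]) AS [sales1], SUM([sales2]) AS [sales2], ..." from the metadata
SELECT @cols += ', SUM(' + QUOTENAME(column_name) + ') AS ' + QUOTENAME(column_name)
FROM INFORMATION_SCHEMA.COLUMNS
WHERE table_name = 'dataset'
AND column_name LIKE 'sales%'
ORDER BY ORDINAL_POSITION
-- splice the list into the final query and run it
SET @sql = 'SELECT name' + @cols + ' FROM dataset GROUP BY name'
EXEC sys.sp_executesql @sql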

Related

Remove quotes when passing a column name via parameter

Hi I am trying to do the following:
SELECT @IDENTITY_COLUMN = (
SELECT COLUMN_NAME
FROM INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_NAME = 'tbltest1'
)
SELECT @LAST_VALUE_USED = (
SELECT ISNULL(MAX(@IDENTITY_COLUMN),0)
FROM tbltest1
)
If there are no rows, it works fine, but when there is a row, the second query returns a string containing the column name, and I believe it is because @IDENTITY_COLUMN has quotes in it. Hence I am getting "Conversion failed when converting the nvarchar value 'columnname' to data type int". How can I solve this problem?
Help appreciated!
I think you're trying to get the maximum value of an IDENTITY column from a table if such a column exists. You'd need dynamic SQL for that. Something like:
DECLARE @identity_column sysname;
DECLARE @query nvarchar(MAX);
SET @identity_column = (SELECT column_name
FROM information_schema.columns
WHERE table_name = 'tbltest1'
AND COLUMNPROPERTY(OBJECT_ID(QUOTENAME(table_schema) + '.' + QUOTENAME(table_name)), column_name, 'IsIdentity') = 1); -- only the identity column, if any
IF @identity_column IS NOT NULL
BEGIN
SET @query = '
SELECT isnull(max(' + quotename(@identity_column) + '), 0)
FROM tbltest1;
';
EXECUTE (@query);
END;
Note:
For object names use sysname; it's a dedicated type for them. Don't use varchar or nvarchar etc.
Always use quotename() when embedding object names in a dynamic query. That'll prevent funny things from happening if the object name contains non-alphanumeric characters or is odd in other ways.
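If the value then needs to land in a variable, like the question's @LAST_VALUE_USED, sp_executesql with an OUTPUT parameter can hand it back from the dynamic batch. A sketch that continues from the block above (the variable names are mine):
DECLARE @last_value_used int;
SET @query = N'SELECT @max_val = isnull(max(' + quotename(@identity_column) + N'), 0) FROM tbltest1;';
EXECUTE sys.sp_executesql @query, N'@max_val int OUTPUT', @max_val = @last_value_used OUTPUT;
SELECT @last_value_used;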

How to search for specific column name within one table

I have searched this question on Stack Overflow but most of the questions go a little deeper than what I want. Many questions relate to finding the table that has a specific column name.
I am connected to the database through SSMS. I have found the table that I want to search through by SELECT * FROM Item. In the Item table I want to search all of the field names (column names) and select the ones that contain the specific string 'Size'. I thought something like this would work
Select * FROM Item WHERE column_name LIKE '%SIZE%'
It doesn't work though. How do I specify it to search through all of the column names to find the names that contain 'Size'?
Thanks.
This should be the generic query to get you to what you want.
USE [database_name]
GO
SELECT t.name AS table_name,
SCHEMA_NAME(schema_id) AS schema_name,
c.name AS column_name
FROM sys.tables AS t
INNER JOIN sys.columns c ON t.OBJECT_ID = c.OBJECT_ID
WHERE c.name LIKE '%SIZE%'
AND t.name = 'Item'
ORDER BY schema_name, table_name;
You will need the correct permissions on whichever SQL login you run this under.
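If the login can't see the table's metadata at all, the usual remedy is to grant it VIEW DEFINITION on the object; a sketch (the principal name is illustrative):
GRANT VIEW DEFINITION ON OBJECT::dbo.Item TO reporting_user;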
SELECT [Name]
FROM sys.columns
WHERE OBJECT_NAME(object_id)='Item'
AND [Name] LIKE '%Size%';
You can use
SELECT * FROM INFORMATION_SCHEMA.COLUMNS where TABLE_NAME ='zzzzz' and COLUMN_NAME like '%size%'
This looks through the table named Item for a column with a name like 'SIZE':
SELECT sch.COLUMN_NAME, sch.*
FROM INFORMATION_SCHEMA.COLUMNS AS sch
WHERE TABLE_NAME = 'Item'
AND COLUMN_NAME LIKE '%SIZE%'
Is this what you wanted?
I think this is what you are looking for, just replace database_Name with your db name:
Declare @myQuery varchar(max) = 'Select ';
Declare @columnName varchar(max) = '';
Declare GetColumnNames Cursor
For
SELECT COLUMN_NAME
FROM database_Name.INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_NAME = N'Item'
AND COLUMN_NAME like '%SIZE%'
OPEN GetColumnNames
FETCH NEXT FROM GetColumnNames
INTO @columnName
WHILE @@FETCH_STATUS = 0
BEGIN
Set @myQuery += @columnName + ', '
FETCH NEXT FROM GetColumnNames
INTO @columnName
END
CLOSE GetColumnNames;
DEALLOCATE GetColumnNames;
-- Chop off the trailing ', ' (LEN ignores the trailing space, so -1 removes the comma)
SET @myQuery = LEFT(@myQuery, LEN(@myQuery) - 1)
Set @myQuery += ' From Item'
exec(@myQuery)
You'll have to take a two step approach to achieve your end query.
First, you'll need to identify the columns you're interested in by using the table metadata, which you can get from either the sys schema or the INFORMATION_SCHEMA tables. Several of the proposed answers will help you get that information.
Next, you'll use the column names you've identified in step one to build the actual query you're interested in. If this is a one-off task you're doing, just copy and paste the results from the meta data query into a new SELECT query as your column list. If you need to do this task programmatically or multiple times using different LIKE strings, then you'll want to invest the time in writing some dynamic SQL.
When you wrap it all up, it'll look something like this:
--Step 1: the metadata part
DECLARE @ColumnList NVARCHAR(MAX)
       ,@SQL NVARCHAR(MAX)
SELECT
    @ColumnList = COALESCE(@ColumnList + ',', '') + COLUMN_NAME
FROM
    INFORMATION_SCHEMA.COLUMNS
WHERE
    TABLE_SCHEMA = 'schema'
    AND TABLE_NAME = 'TableName'
    AND COLUMN_NAME LIKE '%SIZE%'
SELECT @ColumnList;
--Step 2: the dynamic SQL part
SET @SQL = 'SELECT ' + @ColumnList + ' FROM schema.TableName;';
EXECUTE sys.sp_executesql @SQL;
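Following the quotename() advice from an earlier answer, it is safer to bracket each name before the list is spliced into dynamic SQL; only the aggregation line changes:
@ColumnList = COALESCE(@ColumnList + ',', '') + QUOTENAME(COLUMN_NAME)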

TSQL Update stmt for each column name according to data type

For SSIS, I need to create a TSQL update workflow to overwrite the current table records in case of an import error.
I already have a set up for the whole SSIS process but I'm missing the SQL update statement.
So if something goes wrong during the import the current records (all rows) in the table should be updated with a short message - "Error DB Import" for example.
Since I have multiple tables to deal with I also get different column names and data types.
I would use this stmt to get the column names
SELECT COLUMN_NAME , *
FROM INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_NAME = 'myTable'
but how can I set a string message for the char types and NULL for the numeric and date types? Using a CASE statement?
In pseudo-code it's probably just a loop through the columns: if column_name is of data_type "char" then...
I also need to ignore the first 4 columns of each table so that I don't overwrite ID, Date, etc.
If you can help me set up a static test update stmt I'm sure I will be able to transfer this to my SSIS project.
Thank you.
Sounds like you're looking for something like this:
SELECT
CASE DATA_TYPE
WHEN 'int' THEN NULL
WHEN 'varchar' THEN 'STRING MSG GOES HERE'
END,
COLUMN_NAME , *
FROM INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_NAME = 'myTable'
AND ORDINAL_POSITION > 4
SQL Fiddle Demo
You can add as many WHEN clauses to the CASE statement as needed. Also, you want to use the ORDINAL_POSITION column to exclude the first 4 columns.
If you need to use this information to create an UPDATE statement, then you'll need to do that with Dynamic SQL.
EDIT -- Dynamic SQL:
create procedure updateMyTable
as
begin
declare @sql varchar(max)
SELECT @sql = 'UPDATE myTable SET ' +
STUFF(
(SELECT ', ' + COLUMN_NAME + ' = ' +
CASE DATA_TYPE
WHEN 'int' THEN 'NULL'
WHEN 'varchar' THEN '''STRING MSG GOES HERE'''
END
FROM INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_NAME = 'myTable'
AND ORDINAL_POSITION > 4   -- skip the first 4 columns (ID, Date, etc.), as in the SELECT above
for xml path(''))
,1,1,'')
exec(@sql)
end
SQL Fiddle Demo
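Once the procedure exists, the SSIS error path only has to call it, for example from an Execute SQL Task:
EXEC updateMyTable;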

SQL - Concatenate all columns from any table

I'm using triggers to audit table changes. Right now I capture the individual column changes in the following:
DECLARE @statement VARCHAR(MAX)
SELECT #statement =
'Col1: ' + CAST(ISNULL(Col1, '') AS VARCHAR) + ', Col2: ' + CAST(ISNULL(Col2, '') AS VARCHAR) + ', Col3: ' + CAST(ISNULL(Col3, '') AS VARCHAR)
FROM INSERTED;
The problem is, I need to tweak the column names for every table/trigger that I want to audit against. Is there a way I can build #statement, independent of the table using a more generic approach?
cheers
David
What you need to do is build an in-memory table of the column names using the following query and then loop through it to produce the SQL statement you want:
select column_name from information_schema.columns
where table_name like 'tName'
order by ordinal_position
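A sketch of that idea, kept one step away from the trigger: the query below generates the whole concatenation expression for a table (placeholder name tName, VARCHAR casts as in your snippet) so you can paste it into each trigger body. Generating it up front also sidesteps the fact that a dynamically executed batch inside a trigger cannot see the INSERTED pseudo-table:
DECLARE @expr VARCHAR(MAX) = '';
-- emit " + ', ' + 'ColN: ' + CAST(ISNULL([ColN], '') AS VARCHAR)" for every column
SELECT @expr += ' + '', '' + ''' + column_name + ': '' + CAST(ISNULL(' + QUOTENAME(column_name) + ', '''') AS VARCHAR)'
FROM information_schema.columns
WHERE table_name = 'tName'
ORDER BY ordinal_position;
-- drop the leading " + ', ' + " (10 characters) and print the expression for copy/paste
SELECT STUFF(@expr, 1, 10, '');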
However, I am not sure this would be the right thing to do for an audit. How are you going to pull the data back later? Say in one of your releases you happen to drop a column; what will happen then? How will you know which column held which data?

How can I give condition on the column names in SQL Server 2008

I have my database with the columns as follows:
keyword, part1_d1, part1_d2, ..., part1_d25, part2_d26, ..., part2_d34
FYI: d1 through d34 are documents.
How can I write a query to obtain the columns with column_name like '%part1%', as below:
keyword, part1_d1, part1_d2, ..., part1_d25
I tried the query:
select (Select COLUMN_NAME
From INFORMATION_SCHEMA.COLUMNS
where COLumn_NAME like '%part1%')
, keyword
from sample
But it didn't work...
Please let me know what to do?
SQL doesn't support dynamic column names - you either have to explicitly state which ones you want, or use the asterisk "*" to indicate all the columns in the table.
Dynamic SQL would allow you to get a list of the columns and create a SQL statement as a string before executing it - which is what you eventually need for the query you attempted. I recommend reading "The Curse and Blessings of Dynamic SQL" before looking further.
SQL Server 2005+:
DECLARE @sql NVARCHAR(4000)
SET @sql = 'SELECT ' + STUFF((SELECT ', ' + x.column_name
FROM INFORMATION_SCHEMA.COLUMNS x
JOIN INFORMATION_SCHEMA.TABLES y ON y.table_schema = x.table_schema
AND y.table_name = x.table_name
WHERE x.column_name LIKE '%part1%'
AND y.table_name = 'sample'
GROUP BY x.column_name
FOR XML PATH ('')), 1, 2, '') + ' , keyword
FROM sample '
BEGIN
EXEC sp_executesql @sql
END