I have a stored procedure where I need to query a table that contains another query that I then need to execute, get the results, and store those results in another table. I will not know what, or how many, columns are returned from this query, but I must be able to map the unknown columns to columns in my results table. I do know that the query can contain anywhere from 1 to 20 columns which need to map to my results table as RSLT_1 up to RSLT_20.
For example, let's say the query returns 5 columns. I need to iterate over the results and map column1 to RSLT_1, column2 to RSLT_2, etc., and then store those results in my results table.
I already have this logic written in C#, where it was trivial since I can loop over the columns to determine how many exist. I don't know how to do that in a stored procedure. Any ideas?
I can't write you a complete answer because I don't know your queries/tables, but here is the outline of what I have in mind; you will have to adapt it to your needs. Keep in mind that cursors are slow and generally not recommended in production.
-- materialise the unknown-column result set in a temp table
select * into #tmp from <sourceQuery>
declare @columnName varchar(100)
declare @sql nvarchar(max)
-- iterate over the columns of the temp table
declare c cursor for
select name from tempdb.sys.columns where object_id = object_id('tempdb..#tmp')
open c
fetch next from c into @columnName
while @@FETCH_STATUS = 0
begin
select @columnName -- debug: shows which column is being copied
set @sql = N'insert into <targetTable> (valueColumn) select ' + quotename(@columnName) + N' from #tmp'
exec sp_executesql @sql
fetch next from c into @columnName
end
close c
deallocate c
drop table #tmp
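The skeleton above copies every column into a single value column. If the actual goal is to land column 1 in RSLT_1, column 2 in RSLT_2 and so on, the same catalog view can be used to build one set-based INSERT instead of one insert per column. A minimal sketch only, assuming the query text is already in a variable and that the results table is dbo.RESULTS with columns RSLT_1 to RSLT_20 (both names are placeholders, not taken from your schema):
-- Hedged sketch only: @sourceQuery and dbo.RESULTS are placeholder names.
declare @sourceQuery nvarchar(max) = N'<sourceQuery>'
declare @cols nvarchar(max) = N''
declare @sql nvarchar(max)
-- materialise the unknown result set; a global temp table is used because a local
-- #table created inside sp_executesql would be dropped when that inner batch ends
set @sql = N'select * into ##src from (' + @sourceQuery + N') q'
exec sp_executesql @sql
-- build "RSLT_1, RSLT_2, ..." in column order (column_id is the ordinal position here)
select @cols = @cols + case when @cols = N'' then N'' else N', ' end
             + N'RSLT_' + cast(column_id as nvarchar(10))
from tempdb.sys.columns
where object_id = object_id('tempdb..##src')
order by column_id
-- one set-based insert: column 1 -> RSLT_1, column 2 -> RSLT_2, ...
set @sql = N'insert into dbo.RESULTS (' + @cols + N') select * from ##src'
exec sp_executesql @sql
drop table ##src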
Can anyone please help with this query?
I'm using SQL Server 2008. The objective is to select rows from multiple tables based on conditions and values from different tables.
I have table1, table2, ..., tableN, each with the columns ID, ColumnName and ColumnValue. These are the tables I need to select rows from, based on conditions in the tables below:
A Control table with columns Number, Function and Enable
A Repository table with columns Function and tableName
I need to pass Number and ID as parameters, get all Function values from the Control table where Enable = 1, use those Function values to collect the table names from the Repository table, and then, for each table name returned, get all rows matching the ID value.
The way I understand it you have two tables with schema like this:
table Control (Number int, Function nvarchar, Enable bit)
table Repository (Function nvarchar, TableName nvarchar)
Control and Repository are related via the Function column.
You also have a number of other tables whose names are saved in the Repository table. All of those tables have an ID column.
You want to get those table names based on a number and then select from all those tables by their ID column.
If that is indeed what you are trying to do, the code below should be enough to solve your problem.
declare
-- arguments
@id int = 123,
@number int = 123456,
-- helper variables we'll use along the way
@function nvarchar(4000),
@tableName nvarchar(256),
@query nvarchar(4000)
-- create a cursor variable to iterate over every returned row one by one
declare @tables cursor
set @tables = cursor fast_forward for
select
c.[Function],
r.TableName
from [Control] as c
join [Repository] as r on r.[Function] = c.[Function]
where c.Number = @number
and c.Enable = 1
-- initialise cursor
open @tables
-- get first row into variables
fetch next from @tables
into @function, @tableName
-- will be 0 as long as fetch next returns new values
while @@fetch_status = 0
begin
-- build a dynamic query (the int parameter has to be converted to a string)
set @query = 'select * from ' + quotename(@tableName) + ' where ID = ' + cast(@id as nvarchar(20))
-- execute dynamic query. you might get permission problems
-- dynamic queries are best to avoid, but I don't think there's another solution for this
exec(@query)
-- get next row
fetch next from @tables
into @function, @tableName
end
-- destroy cursor
close @tables
deallocate @tables
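As a small refinement (just a sketch, reusing the same placeholder names as above): sp_executesql lets the ID travel as a real parameter, so only the table name has to be concatenated into the string:
-- hedged variant of the loop body: the table name is still dynamic,
-- but the ID is passed as a proper parameter instead of being stringified
set @query = N'select * from ' + quotename(@tableName) + N' where ID = @p_id'
exec sp_executesql @query, N'@p_id int', @p_id = @id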
I'm trying to delete data from a number of tables in my SQL database.
In the database I have a table called company which contains the names of each table that I need to delete data from.
Let's assume that I have 3 companies in my company table.
What I want to do is delete all records in some certain tables in each company.
So, in the company table I have the following 3 records:
1
2
3
There are also the following tables in the database, which hold each company's scanned documents.
dbo.1.documents
dbo.2.documents
dbo.3.documents
What I am trying to do is to create a SQL query that will run through the dbo.company table and clear the document tables based on the company names found there.
This is my code:
DECLARE @MyCursor CURSOR;
DECLARE @MyField varchar;
BEGIN
SET @MyCursor = CURSOR FOR
select top 1000 [Name] from dbo.Company
OPEN @MyCursor
FETCH NEXT FROM @MyCursor
INTO @MyField
WHILE @@FETCH_STATUS = 0
BEGIN
delete * from 'dbo.'+@MyField+'$documents'
FETCH NEXT FROM @MyCursor
INTO @MyField
END;
CLOSE @MyCursor;
DEALLOCATE @MyCursor;
END;
I am not sure how the syntax should go but I imagine it is something like this.
Anybody care to chip in on how I can dynamically delete the data based on the records in the dbo.company.name?
Use dynamic SQL.
Replace the delete statement with the code below (the DECLARE can go at the start). The table name is bracketed with QUOTENAME because names like 1$documents start with a digit:
DECLARE @sql NVARCHAR(1000)
SET @sql = N'delete from dbo.' + QUOTENAME(@MyField + '$documents')
EXEC sp_executesql @sql
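For clarity, this is roughly how it slots into the existing loop (just a sketch reusing the cursor variables from your post):
-- sketch: the dynamic delete inside the existing cursor loop
WHILE @@FETCH_STATUS = 0
BEGIN
SET @sql = N'delete from dbo.' + QUOTENAME(@MyField + '$documents')
EXEC sp_executesql @sql
FETCH NEXT FROM @MyCursor INTO @MyField
END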
You can leverage dynamic SQL a little differently and avoid all the hassle and overhead of creating a cursor for this. I am using the values in your table to generate a number of delete statements and then executing them. This is a lot less effort to code and eliminates that nasty cursor.
declare @SQL nvarchar(max) = ''
select @SQL = @SQL + 'delete dbo.[' + c.Name + '$documents];'
from dbo.Company as c
select @SQL --uncomment the line below when you are satisfied the dynamic SQL is correct
--exec sp_executesql @SQL
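For the three companies in the question, the concatenated batch would come out roughly like this (illustrative only):
-- what @SQL would contain for company names 1, 2 and 3
delete dbo.[1$documents];delete dbo.[2$documents];delete dbo.[3$documents];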
I need some help: I want to convert my SQL result into a single row.
Let's say there is a table Students with ID and Name in it.
If I execute the query
select * from Students
it returns:
Col1 Col2
1 Rizwan
2 Ahmed
I want the result to be like:
1 Rizwan 2 Ahmed
Please note that I want every record in a separate column.
Thanks in advance
I can't think of a plausible scenario where this transform serves any useful purpose, because relational algebra is, by design, about sets of data sharing the same attributes (i.e. tables have rows with columns); by putting everything in a single row with meaningless columns you're just effectively serializing data into a blob.
The only way to achieve this is using Dynamic SQL, as this is the only way to achieve a dynamic number of columns without prior knowledge of what columns are desired.
In MS SQL Server you might think of using PIVOT/UNPIVOT but the columns still need to be manually named, thus requiring Dynamic SQL.
MySQL (and MariaDB) has GROUP_CONCAT, which can be used to combine multiple rows into a single string (text) value, but the server lacks any kind of "split" function. So GROUP_CONCAT doesn't work here because it doesn't return discrete columns.
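For what it's worth, SQL Server has its own row-concatenation idiom (FOR XML PATH, or STRING_AGG on newer versions), and it runs into the same limitation: you get one string value, not discrete columns. A sketch against the Students table from the question:
-- sketch only: concatenates all rows into ONE string column, which is not what
-- the question asks for, but it is the SQL Server analogue of GROUP_CONCAT
SELECT STUFF(
    (SELECT ', ' + CONVERT(varchar(10), [Id]) + ' ' + [Name]
     FROM Students
     ORDER BY [Id]
     FOR XML PATH('')), 1, 2, '') AS SingleStringValue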
In T-SQL (MS SQL Server, Sybase) you need to iterate over every target row, and this is done using a CURSOR; you cannot reliably perform string concatenation inside a SELECT statement:
DECLARE @sql nvarchar(max) = N''
DECLARE c CURSOR FOR
SELECT [Id], [Name] FROM Students ORDER BY [Id] ASC
OPEN c
DECLARE @id int
DECLARE @name nvarchar(100)
FETCH NEXT FROM c INTO @id, @name
WHILE @@FETCH_STATUS = 0
BEGIN
-- append "<id>, '<name>'" for each row, comma-separating the pairs
SET @sql = @sql + CASE WHEN @sql = N'' THEN N'' ELSE N', ' END
    + CONVERT( varchar(10), @id ) + N', ''' + REPLACE( @name, '''', '''''' ) + N''''
FETCH NEXT FROM c INTO @id, @name
END
CLOSE c
DEALLOCATE c
SET @sql = N'SELECT ' + @sql
EXEC sp_executesql @sql -- this executes the generated SELECT, where each discrete value comes back as an anonymous column
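With the two sample rows from the question, the generated statement ends up roughly as:
-- what @sql contains after the loop (illustrative)
SELECT 1, 'Rizwan', 2, 'Ahmed'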
Maybe there's a better way to go about this, but it has to be somewhat dynamic.
From a vb.net form I need to restore or replace data from one table to another. The two tables are identical except for a couple different columns.
First I wrote some SQL to grab the column names of the table passed in. Then, by ordinal position, I keep only the columns I want values from. I store these column names in a temp table.
Now I want to get those values from the backup table using the temp table column names and place them in the master table.
So I suppose I need a cursor to loop through them in some way... I haven't touched a cursor since college, and wow.
I'll embarrass myself and post my current code.
SET @getColCURSOR = CURSOR FOR
SELECT name
FROM #MyTempTable --created previously as a table only holding column names
OPEN @getColCURSOR
FETCH NEXT FROM @getColCURSOR
INTO @columnName
WHILE @@FETCH_STATUS = 0
BEGIN
select @columnName --this variable should act as a column name and change each loop
from AUDIT_TABLE a where a.ID = 7 -- 7 is just for testing; it is a dynamic variable
FETCH NEXT FROM @getColCURSOR
INTO @columnName
END
CLOSE @getColCURSOR
DEALLOCATE @getColCURSOR
I'm not going to comment on whether this could be done without a cursor, since I'm a bit lost on what you're trying to do. But one issue with your cursor is that you can't parameterize a column name in a select statement. So you'll need to replace this:
WHILE @@FETCH_STATUS = 0
BEGIN
select @columnName --this variable should act as a column name and change each loop
from AUDIT_TABLE a where a.ID = 7 -- 7 is just for testing; it is a dynamic variable
FETCH NEXT FROM @getColCURSOR
INTO @columnName
END
with dynamic SQL like this:
DECLARE @SQL nvarchar(max)
WHILE @@FETCH_STATUS = 0
BEGIN
set @SQL =
N'select ' + QUOTENAME(@columnName) + ' from AUDIT_TABLE a where a.ID = 7'
EXEC (@SQL); --without the parentheses, EXEC assumes you're calling a stored proc
FETCH NEXT FROM @getColCURSOR
INTO @columnName
END
That could possibly introduce other issues, since dynamic SQL statements execute in their own scope. I'd definitely encourage you to look into whether there's a set-based solution to this, since using dynamic SQL will make this even messier, and I don't think you'll be able to escape dynamic SQL if you want to use a variable for column names.
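As an illustration of the scoping point: if the goal is to pull the selected value back into a variable rather than just return a result set, sp_executesql can do that with an OUTPUT parameter. A minimal sketch, reusing AUDIT_TABLE and @columnName from above; @result is a hypothetical variable introduced here and sql_variant is an assumption about the column types:
-- hedged sketch: capture the dynamically selected value in the outer scope
-- via an OUTPUT parameter; @result is hypothetical, not from the original post
DECLARE @SQL nvarchar(max), @result sql_variant
SET @SQL = N'select @result = ' + QUOTENAME(@columnName)
         + N' from AUDIT_TABLE a where a.ID = @id'
EXEC sp_executesql @SQL,
     N'@result sql_variant OUTPUT, @id int',
     @result = @result OUTPUT,
     @id = 7
SELECT @result -- the value assigned inside the dynamic scope is now visible here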
I have a number of tables (around 40) containing snapshot data about 40 million plus vehicles. Each snapshot table is at a specific point in time (the end of the quarter) and is identical in terms of structure.
Whilst most of our analysis is against single snapshots, on occasion we need to run some analysis against all the snapshots at once. For instance, we may need to build a new table containing all the Ford Focus cars from every single snapshot.
To achieve this we currently have two options:
a) write a long, long, long batch file repeating the same code over and over again, just changing the FROM clause
[drawbacks - it takes a long time to write, and changing a single line of code in one of the blocks requires fiddly changes in all the other blocks]
b) use a view to union all the tables together and query that instead
[drawbacks - our tables are stored in separate database instances and cannot be indexed, plus the resulting view is something like 600 million records long by 125 columns wide, so is incredibly slow]
So, what I would like to find out is whether I can either use dynamic SQL or put the SQL into a loop to spool through all the tables. This would be something like:
for each *table* in TableList
INSERT INTO output_table
SELECT *table* as OriginTableName, Make, Model
FROM *table*
next *table* in TableList
Is this possible? This would mean that updating the original SQL when our client changes what they need (a very regular occurrence!) would be very simple and we would benefit from all the indexes we already have on the original tables.
Any pointers, suggestions or help will be much appreciated.
If you can identify your tables (e.g. a naming pattern), you could simply say:
DECLARE @sql NVARCHAR(MAX);
SELECT @sql = N'';
SELECT @sql = @sql + 'INSERT output_table SELECT ''' + name + ''', Make, Model
FROM dbo.' + QUOTENAME(name) + ';'
FROM sys.tables
WHERE name LIKE 'pattern%';
-- or WHERE name IN ('t1', 't2', ... , 't40');
EXEC sp_executesql @sql;
This assumes they're all in the dbo schema. If they're not, the adjustment is easy... just replace dbo with ' + QUOTENAME(SCHEMA_NAME([schema_id])) + '...
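Spelled out, the schema-qualified version of the concatenation would look roughly like this (same pattern placeholder as above):
-- illustrative only: schema-qualify each table instead of assuming dbo
SELECT @sql = @sql + 'INSERT output_table SELECT ''' + name + ''', Make, Model
FROM ' + QUOTENAME(SCHEMA_NAME([schema_id])) + '.' + QUOTENAME(name) + ';'
FROM sys.tables
WHERE name LIKE 'pattern%';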
In the end I used two methods:
Someone on another forum suggested making use of sp_msforeachtable and a table which contains all the table names. Their suggestion was:
create table dbo.OutputTable (OriginTableName nvarchar(500), RecordCount INT)
create table dbo.TableList (Name nvarchar (500))
insert dbo.TableList
select '[dbo].[swap]'
union select '[dbo].[products]'
union select '[dbo].[structures]'
union select '[dbo].[stagingdata]'
exec sp_msforeachtable @command1 = 'INSERT INTO dbo.OutputTable SELECT ''?'', COUNT(*) from ?'
,@whereand = 'and syso.object_id in (select object_id(Name) from dbo.TableList)'
select * from dbo.OutputTable
This works perfectly well for some queries, but seems to suffer from the fact that one cannot use a GROUP BY clause within the query (or, at least, I could not find a way to do this).
The final solution I used was to use Dynamic SQL with a lookup table containing the table names. In a very simple form, this looks like:
DECLARE @TableName varchar(500)
DECLARE @curTable CURSOR
DECLARE @sql NVARCHAR(1000)
SET @curTable = CURSOR FOR
SELECT [Name] FROM Vehicles_LookupTables.dbo.AllStockTableList
OPEN @curTable
FETCH NEXT
FROM @curTable INTO @TableName
WHILE @@FETCH_STATUS = 0
BEGIN
SET @sql = 'SELECT ''' + @TableName + ''', Make, sum(1) as Total FROM ' + @TableName + ' GROUP BY Make'
EXEC sp_executesql @sql
FETCH NEXT
FROM @curTable INTO @TableName
END
CLOSE @curTable
DEALLOCATE @curTable