SQL Server variable DB as stored procedure parameter

I have an application database (A-DB) which imports from a set of databases (B-set-DBs, around 50 DBs).
The user selects 4 DBs of the B-set-DBs from which he wants to import into the A-DB for his application.
I wrote a stored procedure for the import (about 4000 lines of SQL code) with 4 hard-coded DBs of the B-set-DBs, and it works perfectly.
So my question is: how do I make the imported database names variable?
I cannot use dynamic SQL for the whole 4000 lines, because I'm calling functions and other stored procedures; there are a lot of parameters and strings, definitions of temporary tables, subqueries of subqueries of sub..., and a lot of other weird things.
(I know it all sounds ugly, but I just have to get it to run; I did not design the whole thing.)
Maybe it is possible to define a synonym in the first lines of the import stored procedure:
ALTER PROCEDURE [dbo].[test_storedProcedure]
    (@DBName_1 nvarchar(512), @table_name_1 nvarchar(512))
AS
BEGIN
    DECLARE @cmd nvarchar(max);
    SET @cmd = N'create synonym DB_SYN' +
               N' for ' +
               @DBName_1 +     --[A70_V70_V280]
               @table_name_1;  --.[dbo].[UNITCHANGEMAP]
    EXEC sp_executesql @cmd;
    ...
    SELECT * INTO A-DB
    FROM DB_SYN
    ...
END
And afterwards work only with the synonyms.
Or does anyone know another way that will do the same?
BTW, I am using SQL Server 2008, so I see all 50 databases in the Object Explorer of SQL Server.
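For what it's worth, here is a minimal sketch of how that synonym idea could be wired up, assuming @DBName_1 holds just the database name and @table_name_1 just the table name (both are placeholders), with QUOTENAME guarding the dynamic part:

-- Minimal sketch: drop the synonym if it already exists, then point it at the selected source DB
IF OBJECT_ID('dbo.DB_SYN', 'SN') IS NOT NULL
    DROP SYNONYM dbo.DB_SYN;

DECLARE @cmd nvarchar(max) =
    N'CREATE SYNONYM dbo.DB_SYN FOR '
    + QUOTENAME(@DBName_1) + N'.dbo.' + QUOTENAME(@table_name_1);
EXEC sp_executesql @cmd;

-- From here on, the static 4000-line code can reference dbo.DB_SYN as if it were a local table
SELECT * FROM dbo.DB_SYN;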

Related

Copying a stored procedure from one database to another

I have a central management database which collates some information and runs some dynamic SQL for various other tasks when a new database is restored into the environment. One of those tasks is going to be a bit complex to achieve through dynamic SQL so I had the idea of creating a master copy stored procedure in the central DB and copying that over to the new databases after they are restored.
I've seen a few examples of people trying to do that on here but I can't get anything to play ball.
Here's what I am trying to achieve conceptually; note that I'm trying to cater for potentially multiple stored procedures being created this way, just for future-proofing.
declare @sql nvarchar(max), @DatabaseName nvarchar(200)
set @DatabaseName = 'TargetDatabase'
set @sql =
(
    SELECT definition + char(13) + 'GO'
    FROM sys.sql_modules s
    INNER JOIN sys.procedures p
        ON [s].[object_id] = [p].[object_id]
    WHERE p.name LIKE '%mastercopy%'
)
exec @sql
Thanks
Instead of creating a dynamic script, you could use one script containing all the procedures that you want to create (you can script out all the procs you want with a couple of clicks in SSMS). You then run this script manually in the context of the database where you want to create these procedures, or pass the file with the script to sqlcmd with -i and the correct database name with -d.
See Use the sqlcmd Utility for examples.
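For instance, a hedged sketch of that sqlcmd call (server, database and file names are placeholders):

sqlcmd -S MyServer\MyInstance -d TargetDatabase -E -i C:\Scripts\create_procs.sql

Here -d sets the database context, -i supplies the script file, and -E uses Windows authentication.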

Using single stored procedure for different data imports to different tables

I am in the design stage of an application that includes a large amount of data-import functionality into a SQL Server database. As there are numerous tables in the database, I want to avoid the conventional approach of creating models and writing stored procedures for each import. Is there a way I can create a single stored procedure for different tables and insert data into them?
Note: columns will vary from table to table.
Thanks in advance
Well, I would stick to the comments discouraging it, but on the other hand, if this procedure is going to be super simple and maintenance is transferred to the JSON creator, you can do it like this:
declare @tablename as nvarchar(max)
declare @json as nvarchar(max)
declare @query as nvarchar(max)

-- only accept table names that appear in your whitelist table
set @tablename = (SELECT TableName FROM YourAllowedTableNamesList WHERE TableName = @tablename)

set @query =
    'INSERT INTO ' + QUOTENAME(@tablename) +
    ' SELECT * FROM OPENJSON(@json)'

exec sp_executesql @query, N'@json nvarchar(max)', @json
Yes, I have done something like this at my current shop. Your question is too broad, so I will give you only a broad overview of what we have done.
We wrote a console app that gets a SQL Command from a meta table and executes it on the source into an in-memory DataTable. It then bulk-inserts that data into a staging table on the destination database.
Then we run a generic merge proc that looks at the system tables to get the Primary Keys and datatypes of the final destination table and constructs INSERT and UPDATE statements using dynamic sql.
Despite the well-meaning warnings of others, it's working well for us, though it does have some limitations, such as an inability to handle BLOB datatypes in a generic way. There may be other limitations that we just haven't encountered yet as well.
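As a rough illustration of the system-table lookup such a generic merge proc relies on, something along these lines (the table name is a placeholder) returns the primary key columns and datatypes of a destination table:

-- Sketch only: @TableName is a placeholder for the final destination table
DECLARE @TableName sysname = N'dbo.MyDestinationTable';

SELECT c.name AS ColumnName,
       t.name AS DataType,
       c.max_length,
       ic.key_ordinal
FROM sys.indexes i
JOIN sys.index_columns ic ON ic.object_id = i.object_id AND ic.index_id = i.index_id
JOIN sys.columns c ON c.object_id = ic.object_id AND c.column_id = ic.column_id
JOIN sys.types t ON t.user_type_id = c.user_type_id
WHERE i.object_id = OBJECT_ID(@TableName)
  AND i.is_primary_key = 1
ORDER BY ic.key_ordinal;

The dynamic INSERT and UPDATE statements are then built by concatenating these column names.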

Update Stored procedure (T-SQL ) modification automatically?

I have two identical stored procedures with dynamic query.
Let's say
Stored procedure A
Stored procedure B
Both are in different databases, but they have the same code (not completely identical; 4-5 lines differ).
Is there a way to propagate any modification done in stored procedure A to stored procedure B automatically?
Otherwise I always need to copy and paste changes manually, which is an error-prone activity. Can anyone help me with this?
You could do something like this:
In database A:
design your stored procedure in such a way that you have a parameter for the database in which you want to do the work.
In database B:
create a synonym for the procedure in database A.
Example:
--create procedure in database A
create procedure dbo.StoredProc
(
    @dbname nvarchar(100) --or dbid if you want
)
as
begin
    --create your sql command here, using dynamic sql maybe
    declare @sqlcmd NVARCHAR(MAX) = N''
    set @sqlcmd = 'SELECT * FROM ' + @dbname + '.dbo.AnyTable'
    exec sp_executesql @sqlcmd
end

--create a synonym for this procedure in database B:
create synonym dbo.StoredProc FOR databaseA.dbo.StoredProc

--then you can call your procedure in database A and B like this:
declare @dbname NVARCHAR(100) = DB_NAME()
exec dbo.StoredProc @dbname

So you have to maintain your code only once, and in database B you only have a kind of "link" to this procedure.
Hope this helps :)
This is basically what SSDT was designed for: the idea is that you write your T-SQL and schema as CREATE statements, build a "dacpac", and then use sqlpackage.exe to deploy the dacpac to whatever database you want.
Doing it this way you have the overhead of the SSDT project, but it fixes exactly your main problem with the existing method: "It is an error-prone activity."
My blog post shows how to get an existing database into SSDT (in this case AdventureWorks, but replace it with your database):
https://the.agilesql.club/Blog/Ed-Elliott/AdventureWorksCI-Step2-MDF-To-Dot-Sql
Ed
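For reference, a hedged sketch of what the deploy step might look like (server, database and dacpac names are placeholders):

sqlpackage.exe /Action:Publish /SourceFile:"MyDatabase.dacpac" /TargetServerName:"MyServer" /TargetDatabaseName:"TargetDatabase"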

Copy trigger from one database to another

Is it possible, in a script executed in MS SQL Server 2005, to copy a trigger from one database to another?
I've been asked to write a test script for a trigger my project is using. Our test structure is to create an empty database containing only the object under test, then execute a script on that database that creates all the other objects needed for the test, fills them, runs whatever tests are needed, compares the results against expected results, and then drops everything except the object under test.
I can't just create a database that is empty except for the trigger, because the trigger depends on several tables. My test script currently runs the CREATE TRIGGER after all the required tables are created, but this won't do because the test script isn't allowed to contain the object under test.
What's been suggested is that, instead of running a CREATE TRIGGER, I somehow copy the trigger at that point in the script from the live database to the test database. I've had a quick Google and haven't found a way to do this. Thus my question - is this even possible, and if so, how can I do it?
You could read the text of the trigger with sp_helptext 'triggername'.
Or you can select the text into a variable and execute that:
declare @sql nvarchar(max)
select @sql = object_definition(object_id)
from sys.triggers
where name = 'testtrigger'
EXEC (@sql)
I have a stored procedure that copies a bunch of tables to a test database. To make it less prone to mistakes that could potentially change the wrong database, I want to avoid using USE and instead explicitly specify per statement which database the trigger is copied from and to.
With the help of this answer, I came up with this solution:
DECLARE @sql NVARCHAR(MAX);

EXEC SourceDB.sys.sp_executesql
    N'SELECT @output = (SELECT OBJECT_DEFINITION(OBJECT_ID(''TriggerName'')))',
    N'@output NVARCHAR(MAX) OUTPUT',
    @output = @sql OUTPUT;

EXEC DestDB.sys.sp_executesql @sql;

How do I run SQL queries on different databases dynamically?

I have a SQL Server stored procedure that I use to back up data from our database before doing an upgrade, and I'd really like to be able to run the stored procedure on multiple databases by passing in the database name as a parameter. Is there an easy way to do this? The best I can figure is to dynamically build the SQL in the stored procedure, but that feels like it's the wrong way to do it.
Build a procedure to back up the current database, whatever it is. Install this procedure on all databases that you want to back up.
Write another procedure that will launch the backups. This will depend on things that you have not mentioned, like whether you have a table containing the names of each database to back up or something like that. Basically all you need to do is loop over the database names and build a string like:
SET @ProcessQueryString =
    'EXEC ' + DatabaseServer + '.' + DatabaseName + '.dbo.BackupProcedureName param1, param2'
and then just:
EXEC (@ProcessQueryString)
to run it remotely.
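A minimal sketch of such a loop, assuming a hypothetical dbo.DatabasesToBackup table holding the server and database names and a BackupProcedureName proc installed in each database:

-- Sketch only: dbo.DatabasesToBackup and BackupProcedureName are hypothetical names
DECLARE @DatabaseServer sysname, @DatabaseName sysname, @ProcessQueryString nvarchar(max);

DECLARE db_cursor CURSOR LOCAL FAST_FORWARD FOR
    SELECT DatabaseServer, DatabaseName FROM dbo.DatabasesToBackup;

OPEN db_cursor;
FETCH NEXT FROM db_cursor INTO @DatabaseServer, @DatabaseName;

WHILE @@FETCH_STATUS = 0
BEGIN
    SET @ProcessQueryString =
        'EXEC ' + QUOTENAME(@DatabaseServer) + '.' + QUOTENAME(@DatabaseName)
        + '.dbo.BackupProcedureName';
    EXEC (@ProcessQueryString);

    FETCH NEXT FROM db_cursor INTO @DatabaseServer, @DatabaseName;
END

CLOSE db_cursor;
DEALLOCATE db_cursor;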
There isn't any other way to do this; dynamic SQL is the only way. If you've got strict controls over DB names and who's running it, then you're okay just concatenating everything together, but if there's any doubt, use QUOTENAME to escape the parameter safely:
CREATE PROCEDURE doStuff
    @dbName NVARCHAR(50)
AS
DECLARE @sql NVARCHAR(1000)
SET @sql = 'SELECT stuff FROM ' + QUOTENAME(@dbName) + '..TableName WHERE stuff = otherstuff'
EXEC sp_executesql @sql
Obviously, if there's anything more being passed through then you'll want to double-check any other input, and potentially use parameterised dynamic SQL, for example:
CREATE PROCEDURE doStuff
    @dbName NVARCHAR(50),
    @someValue NVARCHAR(10)
AS
DECLARE @sql NVARCHAR(1000)
SET @sql = 'SELECT stuff FROM ' + QUOTENAME(@dbName) + '..TableName WHERE stuff = @pOtherStuff'
EXEC sp_executesql @sql, N'@pOtherStuff NVARCHAR(10)', @pOtherStuff = @someValue
This then makes sure that parameters for the dynamic SQL are passed through safely and the chances for injection attacks are reduced. It also improves the chances that the execution plan associated with the query will get reused.
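A possible call, with placeholder values:

EXEC doStuff @dbName = N'SomeDatabase', @someValue = N'otherstuff';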
Personally, I just use a batch file and shell out to sqlcmd for things like this. Otherwise, building the SQL in a stored proc (like you said) would work just fine; not sure why it would be "wrong" to do that.
best regards,
don
MSSQL has an OPENQUERY(linkedserver, statement) function: if the server is linked, you specify it as the first parameter and it fires the statement against that server.
You could generate this OPENQUERY statement in a dynamic proc, and either have it fire the backup proc on each server or execute the statement directly.
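For illustration, a hedged sketch (the linked server name is a placeholder; OPENQUERY expects a query that returns a result set):

-- Sketch only: LinkedServerName must be a configured linked server
SELECT name
FROM OPENQUERY(LinkedServerName, 'SELECT name FROM master.sys.databases');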
Do you use SSIS? If so, you could try creating a couple of SSIS packages and scheduling them or executing them remotely.