I have 2 SQL Servers:
temp1 XX.13.23.2
temp2 XX.23.45.6
The temp1 server has a database called db1 and contains a procedure called p1.
I want that procedure to insert a value into table T1 in database db2 on the Temp2 server.
Is it possible to use a procedure to insert values into another server's database?
If this is possible, can someone provide me with an idea or some examples of how to achieve this?
Yes, please look into linked servers:
http://msdn.microsoft.com/en-us/library/ms188279%28SQL.90%29.aspx
You can call a remote stored procedure from the instance you want to insert to:
exec [RemoteServer].DatabaseName.DatabaseOwner.StoredProcedureName
You need to have the RemoteServer set up as a linked server.
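For example, a minimal sketch (the linked server registration and the column name are assumptions; Temp2, db2, and T1 come from the question):
EXEC master.dbo.sp_addlinkedserver
    @server = N'Temp2',
    @srvproduct = N'',
    @provider = N'SQLNCLI',
    @datasrc = N'XX.23.45.6'   -- the temp2 instance
-- Inside p1 on db1 (temp1), insert straight into the remote table via its four-part name.
-- Col1 is a hypothetical column.
INSERT INTO Temp2.db2.dbo.T1 (Col1)
VALUES ('some value')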
Another option, especially if you're going to have a development version of the procedure where you want to run tests without touching a production environment, would be to use SQL Server synonyms: http://technet.microsoft.com/en-us/library/ms177544.aspx.
I personally like using them because once the procedure is initially set up to use them, you won't have to change the SQL in the procedure.
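For instance, a rough sketch of the synonym approach (the synonym name and column are hypothetical; Temp2, db2, and T1 come from the question):
CREATE SYNONYM dbo.T1_Remote FOR Temp2.db2.dbo.T1;
-- The procedure references only the synonym, so repointing it later
-- (say, at a dev server) needs no change to the procedure's SQL.
INSERT INTO dbo.T1_Remote (Col1)
VALUES ('some value');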
I am attempting to create one single database to store all login errors.
insert into [dbo].[SQL_ErrorLog]
exec sp_readerrorlog 0, 1, 'error'
The above code gets me the information that I need for the current log, and I understand that changing the 0 to a 1, 2, etc. will get me the previous days' logs.
I have 4 different environments and instead of setting this same job up on each environment, I would like to control it all from 1 single job. I intend to add a field to determine which environment the log information is coming from.
I know that I could also set up staging tables on each environment and then run a select statement to pull in data from each staging table to the final table; however, again, I am trying to complete all the work from one environment if possible.
I have linked the other environments using the linked servers and can select data from any of them without a problem.
My question is more about how I can run the sp_readerrorlog stored procedure on the other servers and insert that data into my master table.
An example would be:
Env0 - This is where the master table would be and where I would like to set everything up
Env1
Env2
Env3
I would like to be able to pull sp_readerrorlog 0, 1, 'error' information from Env1, Env2, and Env3 and populate it on Env0 without using staging tables on each individual environment if possible.
Please let me know if this is not 100% clear. It makes sense in my head, however that does not always come out in text form. :)
Thanks in Advance.
If you are using linked servers, it seems like you could chain together multiple calls using GO from the main source server. This will work assuming all of your linked servers are defined on that one server.
INSERT INTO [Linked Server Name].[some database name].[dbo].[SQL_ErrorLog]
EXEC [Linked Server Name].[some database name].[dbo].sp_readerrorlog
GO
INSERT INTO [Linked Server Name2].[some database name].[dbo].[SQL_ErrorLog]
EXEC [Linked Server Name2].[some database name].[dbo].sp_readerrorlog
GO
INSERT INTO [Linked Server Name3].[some database name].[dbo].[SQL_ErrorLog]
EXEC [Linked Server Name3].[some database name].[dbo].sp_readerrorlog
GO
INSERT INTO [Linked Server Name4].[some database name].[dbo].[SQL_ErrorLog]
EXEC [Linked Server Name4].[some database name].[dbo].sp_readerrorlog
I think this will be your best bet. You can put all of these statements into a single SQL Server Agent job and run that job. The names need to be fully qualified in order to run against the correct linked server.
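If the goal is to land everything in one master table on Env0 with an environment column, a hedged variation (the table and column names here are assumptions, and RPC Out must be enabled on each linked server) is to stage each remote result set in a local table variable and then copy it across with the environment name added:
DECLARE @staging TABLE (LogDate datetime, ProcessInfo nvarchar(50), LogText nvarchar(max));
-- Pull the error log entries from Env1 over the linked server.
INSERT INTO @staging (LogDate, ProcessInfo, LogText)
EXEC [Env1].master.dbo.sp_readerrorlog 0, 1, 'error';
-- Copy into the master table on Env0, tagging the source environment.
INSERT INTO dbo.SQL_ErrorLog (EnvironmentName, LogDate, ProcessInfo, LogText)
SELECT 'Env1', LogDate, ProcessInfo, LogText
FROM @staging;
-- Repeat (or loop) for Env2 and Env3.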
We have environments for Dev, UAT, and Live. We have a different version of the same stored procedure for each environment. In each environment, the stored procedure runs from a database catalog of the same name, "CMS". In each case, the SQL that is called differs only in the server name. For example, the code on UAT looks like this (simplified):
INSERT INTO UAT.ABC.[dbo].NOTE (ID, [TEXT])
VALUES (2, 'Just a note')
The code on Live looks like this (simplified):
INSERT INTO Live.ABC.[dbo].NOTE (ID, [TEXT])
VALUES (2, 'Just a note')
We would like to write the stored procedure once and be able to deploy that same stored procedure so that it points to the right server when, say, performing the insert statement in our example. We wish to avoid using dynamic SQL. Is there a way to pass a parameter into the stored procedure to tell it which server to use? Can this be achieved using sqlcmd with scripting variables, and if so, how? Is there an easy way of doing this without dynamic SQL or scripting variables?
EDIT
Six separate instances of SQL Server 2014: one per environment for the calling code and one per environment for the code being called.
I'm assuming there are 3 copies of the DB with "identical" code, or rather the code should be identical. Remove the hardcoded DB reference, and the stored procedure will run in the context of the DB in which it lives.
You are making life hard for yourself by hardcoding the DB reference.
And you answer your own question: if you have to do this, then you must use dynamic SQL.
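For completeness, a rough sketch of the dynamic SQL variant (the procedure and parameter names are hypothetical; NOTE and ABC come from the question):
CREATE PROCEDURE dbo.InsertNote
    @ServerName sysname,
    @Id int,
    @Text nvarchar(max)
AS
BEGIN
    -- Build the fully qualified name at runtime; QUOTENAME guards the server name.
    DECLARE @sql nvarchar(max) =
        N'INSERT INTO ' + QUOTENAME(@ServerName) + N'.ABC.[dbo].NOTE (ID, [TEXT]) VALUES (@Id, @Text);';
    EXEC sp_executesql @sql, N'@Id int, @Text nvarchar(max)', @Id = @Id, @Text = @Text;
END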
Keep the Linked Server name the same in all instances, and just change the definition of where the linked server points so that it goes to the correct instance.
So use something like:
INSERT INTO MyAppLinkedServer.ABC.[dbo].NOTE (ID, [TEXT])
VALUES (2, 'Just a note')
In UAT, MyAppLinkedServer will point to UAT SQL server.
In Dev, MyAppLinkedServer will point to DEV SQL server.
Here is an example of how to setup the linked server:
EXEC master.dbo.sp_addlinkedserver
    @server = N'MyAppLinkedServer',
    @srvproduct = N'MyAppLinkedServer',
    @provider = N'SQLNCLI',
    @datasrc = N'ActualServerNameGoesHere',
    @catalog = N'ABC'
The short answer was not to use any linked servers but to create an entry in the hosts file on each environment, where the entry would map to a different IP address in each environment.
Is there a way to update another database with the newly created stored procedure whenever a stored procedure is created in the main database?
For example, I have two databases, DB1 and DB2.
When I create a stored procedure in DB1, I want that same procedure to be created in DB2 automatically. Is there a trigger that can do this?
USE [DB1]
CREATE PROCEDURE [dbo].[TestSproc]
..something
AS
BEGIN
...something
END
I know a USE [DB2] statement would do the job, but I want this to be done automatically. Any thoughts?
Thanks for the help!
This might be a bit evil, but in SQL Server you are able to create DDL triggers which fire when you create/alter/drop tables/procedures etc. The syntax to fire when a procedure is created is:
CREATE TRIGGER <trigger-name>
ON DATABASE
FOR CREATE_PROCEDURE
AS
--Your code goes here
In a DDL trigger you get access to a function called EVENTDATA(). So to get the text of the procedure you created:
SELECT EVENTDATA().value('(/EVENT_INSTANCE/TSQLCommand/CommandText)[1]','nvarchar(max)')
Now all you need to do is keep that value and execute it in your secondary database. So your query becomes something like this, though I leave the code to update the secondary database up to you, as I don't know if it's on the same server, a linked server, etc.:
CREATE TRIGGER sproc_copy
ON DATABASE
FOR CREATE_PROCEDURE
AS
DECLARE #procedureDDL AS NVARCHAR(MAX) = EVENTDATA().value('(/EVENT_INSTANCE/TSQLCommand/CommandText)[1]','nvarchar(max)')
--Do something with #procedureDDL
As this is not tested, there are probably a few gotchas. For example, what happens if you create a procedure with a fully qualified name (CREATE PROC server.database.schema.proc)?
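To give an idea, one hedged way to fill in the "--Do something with @procedureDDL" placeholder, assuming DB2 lives on the same instance, is to hand the captured text to DB2's copy of sp_executesql so that it runs in that database's context:
CREATE TRIGGER sproc_copy
ON DATABASE
FOR CREATE_PROCEDURE
AS
BEGIN
    DECLARE @procedureDDL AS NVARCHAR(MAX) = EVENTDATA().value('(/EVENT_INSTANCE/TSQLCommand/CommandText)[1]','nvarchar(max)')
    -- Calling sp_executesql through a three-part name runs the captured
    -- CREATE PROCEDURE statement in DB2, creating the same procedure there.
    EXEC DB2.sys.sp_executesql @procedureDDL
END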
There is no simple solution for this unless you want to execute the same statement twice, once on each target database.
One thing that comes to my mind is that you can set up replication and publish only stored procedures to your second database, which would be the subscriber in this case.
In the publication wizard there is a step where you select which objects you want to send over to your secondary databases.
Is there a way to check when and with what parameter values a stored procedure has been executed in SQL Server 2008 R2?
As usr said, there is no way to do this directly, but you can use the following workaround, which I have used in my projects.
Create a log table and add to each procedure an INSERT INTO log_table statement that records the time with GETDATE(), the procedure name, and the logged-in user. You can then query this table for the information you need.
Of course, this only works going forward; it won't tell you anything about executions that have already happened.
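A minimal sketch of that workaround (the table and column names are just an assumption; a Parameters column is one way to also capture the values the question asks about):
CREATE TABLE dbo.ProcExecutionLog
(
    LogId      int IDENTITY(1,1) PRIMARY KEY,
    ProcName   sysname       NOT NULL,
    ExecutedAt datetime      NOT NULL DEFAULT (GETDATE()),
    ExecutedBy sysname       NOT NULL DEFAULT (SUSER_SNAME()),
    Parameters nvarchar(max) NULL
)
-- First statement inside each procedure you want to track
-- (@SomeParam stands in for whatever parameters the procedure actually has):
INSERT INTO dbo.ProcExecutionLog (ProcName, Parameters)
VALUES (OBJECT_NAME(@@PROCID), N'@SomeParam=' + CAST(@SomeParam AS nvarchar(max)))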
No, sorry. There is no way to do this.
You can use SQL Server Profiler for the task.
See other threads:
How to implement logging and error reporting in SQL stored procedures?
"Debug"(get information) on a running stored procedure in MS Sql Server
I'm doing some fairly complex queries against a remote linked server, and it would be useful to be able to store some information in temp tables and then perform joins against it - all with the remote data. Creating the temp tables locally and joining against them over the wire is prohibitively slow.
Is it possible to force the temp table to be created on the remote server? Assume I don't have sufficient privileges to create my own real (permanent) tables.
This works from SQL 2005 SP3 linked to SQL 2005 SP3 in my environment. However, if you inspect tempdb you will find that the table is actually on the local instance and not the remote instance. I have seen this offered as a resolution on other forums and wanted to steer you away from it.
create table SecondServer.#doll
(
name varchar(128)
)
GO
insert SecondServer.#Doll
select name from sys.objects where type = 'u'
select * from SecondServer.#Doll
I am 2 years late to the party, but you can accomplish this using sp_executesql and feeding it a dynamic query to create the table remotely.
EXEC RemoteServer.RemoteDatabase.RemoteSchema.sp_executesql N'Create Table here'
This will execute the temp table creation at the remote location.
It's not possible to directly create temporary tables on a linked remote server. In fact you can't use any DDL against a linked server.
For more info on the guidelines and limitations of using linked servers see:
Guidelines for Using Distributed Queries (SQL 2008 Books Online)
One workaround (off the top of my head, and this would only work if you had permissions on the remote server) would be to:
on the remote server have a stored procedure that would create a persistent table, with a name based on an IN parameter
the remote stored procedure would run a query then insert the results into this table
You then query locally against that table, performing any joins to any local tables required
Call another stored procedure on the remote server to drop the remote table when you're done
Not ideal, but a possible workaround; a rough sketch follows.
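A rough sketch of those steps, with every name (procedures, database, linked server, table) being an assumption:
-- On the remote server: build a persistent work table whose name is passed in.
CREATE PROCEDURE dbo.BuildWorkTable @TableName sysname
AS
BEGIN
    DECLARE @sql nvarchar(max);
    SET @sql = N'SELECT name, object_id INTO dbo.' + QUOTENAME(@TableName) + N' FROM sys.objects;';
    EXEC sp_executesql @sql;
END
GO
-- On the remote server: drop the work table once the caller is finished.
CREATE PROCEDURE dbo.DropWorkTable @TableName sysname
AS
BEGIN
    DECLARE @sql nvarchar(max);
    SET @sql = N'DROP TABLE dbo.' + QUOTENAME(@TableName) + N';';
    EXEC sp_executesql @sql;
END
GO
-- From the local server, via a linked server named RemoteServer:
EXEC RemoteServer.RemoteDb.dbo.BuildWorkTable @TableName = N'WorkTable_1';
SELECT l.SomeColumn, r.object_id
FROM dbo.SomeLocalTable AS l
JOIN RemoteServer.RemoteDb.dbo.WorkTable_1 AS r ON r.name = l.SomeColumn;
EXEC RemoteServer.RemoteDb.dbo.DropWorkTable @TableName = N'WorkTable_1';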
Yes you can, but it only lasts for the duration of the connection.
You need to use the EXECUTE AT syntax:
EXECUTE('SELECT * INTO ##example FROM sys.objects; WAITFOR DELAY ''00:01:00''') AT [SERVER2]
On SERVER2 the following will work (for 1 minute):
SELECT * FROM ##example
but it will not work on the local server.
Incidentally, if you open a transaction on the second server that uses ##example, the object remains until the transaction is closed. It also stops the creating statement on the first server from completing; i.e. run the following on SERVER2 and the statement on SERVER1 will continue indefinitely.
BEGIN TRAN
SELECT * FROM ##example WITH (TABLOCKX)
This is more academic than of practical use!
If memory is not much of an issue, you could also use table variables as an alternative to temporary tables. This worked for me when running a stored procedure that needed temporary data storage against a linked server.
More info: e.g. this comparison of table variables and temporary tables, including drawbacks of using table variables.
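A small sketch of that approach (the linked server, database, and table names are assumptions):
-- Pull the remote rows you need into a table variable once...
DECLARE @RemoteObjects TABLE (name sysname, object_id int);
INSERT INTO @RemoteObjects (name, object_id)
SELECT name, object_id
FROM RemoteServer.SomeDb.sys.objects;
-- ...then join against it locally, avoiding repeated round trips to the linked server.
SELECT l.SomeColumn, r.object_id
FROM dbo.SomeLocalTable AS l
JOIN @RemoteObjects AS r ON r.name = l.SomeColumn;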