Creating a SQL Server error log table using sp_readerrorlog from multiple servers (SQL Server 2012)

I am attempting to create one single database to store all error log entries.
insert into [dbo].[SQL_ErrorLog]
exec sp_readerrorlog 0, 1, 'error'
The above code gets me the information that I need for the current log, and I understand that changing the 0 to 1, 2, etc. will get me the previous days' logs.
I have 4 different environments and instead of setting this same job up on each environment, I would like to control it all from 1 single job. I intend to add a field to determine which environment the log information is coming from.
I know that I could also set up staging tables on each environment and then run a select statement to pull in data from each staging table to the final table, however again I am trying to complete all the work from one environment if possible.
I have linked the other environments using the linked servers and can select data from any of them without a problem.
My question is more about how I can run the sp_readerrorlog stored procedure on the other servers and insert that data into my master table.
An example would be:
Env0 - This is where the master table would be and where I would like to set everything up
Env1
Env2
Env3
I would like to be able to pull the sp_readerrorlog 0, 1, 'error' information from Env1, Env2, and Env3 and populate it on Env0, without using staging tables on each individual environment if possible.
Please let me know if this is not 100% clear. It makes sense in my head, however that does not always come out in text form. :)
Thanks in Advance.

If you are using linked servers, it seems like you could chain multiple calls together with GO from the main source server. This will work assuming all of your linked servers are defined on that one server.
INSERT INTO [Linked Server Name].[some database name].[dbo].[SQL_ErrorLog]
EXEC [Linked Server Name].[some database name].[dbo].sp_readerrorlog
GO
INSERT INTO [Linked Server Name2].[some database name].[dbo].[SQL_ErrorLog]
EXEC [Linked Server Name2].[some database name].[dbo].sp_readerrorlog
GO
INSERT INTO [Linked Server Name3].[some database name].[dbo].[SQL_ErrorLog]
EXEC [Linked Server Name3].[some database name].[dbo].sp_readerrorlog
GO
INSERT INTO [Linked Server Name4].[some database name].[dbo].[SQL_ErrorLog]
EXEC [Linked Server Name4].[some database name].[dbo].sp_readerrorlog
I think this will be your best bet. You can put all of these statements into a SQL Server Agent job on the main server and run that job. The names will need to be fully qualified in order to run against the correct linked server.
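A minimal sketch of pulling a remote error log into a local master table on Env0, assuming a linked server named [Env1], a local dbo.SQL_ErrorLog table with an extra Environment column, and RPC Out enabled on the linked server (the linked server name, column names, and types here are assumptions, not from the question; also note that INSERT ... EXEC against a linked server may require distributed transactions/MSDTC to be allowed):

-- Staging table variable matching sp_readerrorlog's three output columns
DECLARE @log TABLE
(
    LogDate     datetime,
    ProcessInfo nvarchar(64),
    LogText     nvarchar(max)
);

-- Run sp_readerrorlog on the remote instance (needs RPC Out enabled on the linked server)
INSERT INTO @log (LogDate, ProcessInfo, LogText)
EXEC [Env1].master.dbo.sp_readerrorlog 0, 1, N'error';

-- Copy into the master table, tagging which environment the rows came from
INSERT INTO dbo.SQL_ErrorLog (Environment, LogDate, ProcessInfo, LogText)
SELECT 'Env1', LogDate, ProcessInfo, LogText
FROM @log;

Repeating this once per linked server (Env2, Env3) inside a single Agent job step on Env0 would avoid staging tables on the other environments.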

Related

Check database / server before executing query

I am frequently testing certain areas on a development server, so I run a pre-defined SQL statement to truncate the tables in question before testing again. It would take only a slip of a key to run it against the live server instead.
I'm looking for an IF statement or similar to prevent that.
Either to check the server name, database name, or even that a certain record in a different table exists before running the query.
Any help appreciated
For such cases I use stored procedures. I'd call them TestTruncateTables, etc.
Then instead of running TRUNCATE TABLE directly you would run EXEC TestTruncateTables.
Just make sure that the procedures are not created on the live server. If by any chance you happen to run EXEC TestTruncateTables on the live server, you only get an error about a non-existent procedure.
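A minimal sketch of such a procedure, created only on the development server. The procedure, table, and server names are illustrative, and the @@SERVERNAME check is an extra safety net on top of the "don't create it on live" idea above:

-- Only ever create this on the development instance
CREATE PROCEDURE dbo.TestTruncateTables
AS
BEGIN
    -- Extra guard: refuse to run anywhere but the known dev instance
    IF @@SERVERNAME <> N'DEVSERVER'
    BEGIN
        RAISERROR('Not the development server - aborting.', 16, 1);
        RETURN;
    END;

    TRUNCATE TABLE dbo.TestOrders;
    TRUNCATE TABLE dbo.TestOrderLines;
END;
GO

-- Usage on the dev server; on live the procedure simply does not exist
EXEC dbo.TestTruncateTables;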

How do we get the same stored procedure code to call a different database server depending upon the calling environment?

We have environments for Dev, UAT, and Live. We have a different version of the same stored procedure for each environment. In each environment, the stored procedure runs from a database catalog of the same name, "CMS". In each case, the SQL that is called differs only in the server name. For example, the code on UAT looks like this (simplified):
INSERT INTO UAT.ABC.[dbo].NOTE (ID, [TEXT])
VALUES (2, 'Just a note')
The code on Live looks like this (simplified):
INSERT INTO Live.ABC.[dbo].NOTE (ID, [TEXT])
VALUES (2, 'Just a note')
We would like to write the stored procedure only once and be able to deploy that same stored procedure so that it points to the right server when, say, performing the insert statement in our example. We wish to avoid using dynamic SQL. Is there a way to pass a parameter into the stored procedure to tell it which server to use? Can this be achieved using sqlcmd with scripting variables, and if so, how? Is there an easy way of doing this without dynamic SQL or scripting variables?
EDIT
Six separate instances of SQL Server 2014: three (one per environment) for the calling code and three (one per environment) for the code being called.
I'm assuming there are three copies of the DB with "identical" code, or rather the code should be identical. Remove the hardcoded DB reference, and the stored procedure will run in the context of the DB in which it lives.
You are making life hard for yourself by hardcoding the DB reference.
And you answer your own question: if you have to do this, then you must use dynamic SQL.
Keep the Linked Server name the same in all instances, and just change the definition of where the linked server points to go to the correct instance.
So use something like:
INSERT INTO MyAppLinkedServer.ABC.[dbo].NOTE (ID, [TEXT])
VALUES (2, 'Just a note')
In UAT, MyAppLinkedServer will point to UAT SQL server.
In Dev, MyAppLinkedServer will point to DEV SQL server.
Here is an example of how to setup the linked server:
EXEC master.dbo.sp_addlinkedserver
     @server = N'MyAppLinkedServer',
     @srvproduct = N'MyAppLinkedServer',
     @provider = N'SQLNCLI',
     @datasrc = N'ActualServerNameGoesHere',
     @catalog = N'ABC'
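If the environments don't share credentials, a login mapping can be added to the linked server as well; a hedged example, where the remote login name and password are made up:

EXEC master.dbo.sp_addlinkedsrvlogin
     @rmtsrvname = N'MyAppLinkedServer',
     @useself = N'False',
     @locallogin = NULL,
     @rmtuser = N'app_login',
     @rmtpassword = N'StrongPasswordHere';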
The short answer was not to use any linked servers, but to create an entry in the hosts file for each environment, where the entry has a different IP mapping in each environment.
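A hedged illustration of that hosts-file approach (the alias and IP addresses below are made up): each environment's machine maps the same alias to its own database server, so the same target name resolves to the right instance everywhere.

# In C:\Windows\System32\drivers\etc\hosts on the UAT machine
10.1.2.30    cms-db-server

# In the same file on the Live machine
10.9.8.40    cms-db-server

Anything that connects to cms-db-server then needs no per-environment changes.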

Problem with SQL Server client DB upgrade script

SQL Server 2005, Win7, VS2008. I have to upgrade a database from the old version of the product to the newer one. I'd like to have one script that either creates a new database or upgrades an old database to the new state. I am trying to do the following (SQL script below) and get this error when running on a machine with no database:
Database 'MyDatabase' does not exist. Make sure that the name is
entered correctly.
The questions are:
How can I specify the database name in the upgrade part?
Is there a better way to write the create/upgrade script?
SQL code:
USE [master]
-- DB upgrade part
if exists (select name from sysdatabases where name = 'MyDatabase')
BEGIN
IF (<Some checks that DB is new>)
BEGIN
raiserror('MyDatabase database already exists and no upgrade required', 20, -1) with log
END
ELSE
BEGIN
USE [MyDatabase]
-- create some new tables
-- alter existing tables
raiserror('MyDatabase database upgraded successfully', 20, -1) with log
END
END
-- DB creating part
CREATE DATABASE [MyDatabase];
-- create new tables
You don't usually want to explicitly specify the database name in a script. Rather, supply it externally or pre-process the SQL to replace a $$DATABASENAME$$ token with the name of an actual database.
You're not going to be able to include the USE [MyDatabase] in your script since, if the database doesn't exist, the query won't parse.
Instead, what you can do is keep 2 separate scripts, one for an upgrade and one for a new database. Then you can call the scripts within the IF branches through xp_cmdshell and dynamic SQL. The following link has some examples that you can follow:
http://abhijitmore.wordpress.com/2011/06/21/how-to-execute-sql-using-t-sql/
PowerShell may make this task easier as well, but I don't have any direct experience using it.
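A rough sketch of that two-script approach, assuming xp_cmdshell is enabled and the scripts live at the paths shown (the paths, file names, and sqlcmd options are illustrative):

IF EXISTS (SELECT name FROM sys.databases WHERE name = N'MyDatabase')
    -- Upgrade path: run the upgrade script in a separate sqlcmd session,
    -- so its USE [MyDatabase] is only parsed when the database exists
    EXEC master..xp_cmdshell 'sqlcmd -S . -E -i "C:\Scripts\UpgradeMyDatabase.sql"';
ELSE
    -- Create path: build the database and the new tables from scratch
    EXEC master..xp_cmdshell 'sqlcmd -S . -E -i "C:\Scripts\CreateMyDatabase.sql"';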

Is it possible to create a temp table on a linked server?

I'm doing some fairly complex queries against a remote linked server, and it would be useful to be able to store some information in temp tables and then perform joins against it - all with the remote data. Creating the temp tables locally and joining against them over the wire is prohibitively slow.
Is it possible to force the temp table to be created on the remote server? Assume I don't have sufficient privileges to create my own real (permanent) tables.
This works from SQL 2005 SP3 linked to SQL 2005 SP3 in my environment. However, if you inspect tempdb you will find that the table is actually on the local instance and not the remote instance. I have seen this suggested as a resolution on other forums and wanted to steer you away from it.
create table SecondServer.#doll
(
name varchar(128)
)
GO
insert SecondServer.#Doll
select name from sys.objects where type = 'u'
select * from SecondServer.#Doll
I am 2 years late to the party but you can accomplish this using sp_executeSQL and feeding it a dynamic query to create the table remotely.
Exec RemoteServer.RemoteDatabase.RemoteSchema.SP_ExecuteSQL N'Create Table here'
This will execute the temp table creation at the remote location.
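A small hedged example of that idea (the [SERVER2] linked server name and table layout are made up, and the call requires the linked server's RPC Out option to be enabled); as the answer below notes, the object only survives while the remote connection that created it stays open:

-- Create a global temp table on the remote instance by running sp_executesql there
EXEC [SERVER2].master.sys.sp_executesql
     N'SELECT name, object_id INTO ##remote_work FROM sys.objects WHERE type = ''U'';';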
It's not possible to directly create temporary tables on a linked remote server. In fact you can't use any DDL against a linked server.
For more info on the guidelines and limitations of using linked servers see:
Guidelines for Using Distributed Queries (SQL 2008 Books Online)
One workaround (off the top of my head, and this would only work if you had permissions on the remote server):
on the remote server have a stored procedure that would create a persistent table, with a name based on an IN parameter
the remote stored procedure would run a query then insert the results into this table
You then query locally against that table and perform any joins to local tables as required
Call another stored procedure on the remote server to drop the remote table when you're done
Not ideal, but a possible workaround; a rough sketch follows.
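A hedged sketch of that workaround, assuming you can create procedures in a scratch database on the remote server (all server, database, and object names are made up, and the cleanup is done inline here rather than via a second remote procedure):

-- On the remote server, in a scratch database: build a persistent work table per caller
CREATE PROCEDURE dbo.BuildWorkTable
    @Suffix sysname              -- caller-supplied name suffix; validate it in real use
AS
BEGIN
    DECLARE @sql nvarchar(max) =
        N'SELECT name, object_id INTO dbo.' + QUOTENAME(N'Work_' + @Suffix) +
        N' FROM sys.objects WHERE type = ''U'';';
    EXEC sys.sp_executesql @sql;
END;

-- From the local server: build the remote table, join against it, then clean it up
EXEC [RemoteServer].[Scratch].dbo.BuildWorkTable @Suffix = N'Session42';

SELECT w.name
FROM [RemoteServer].[Scratch].dbo.Work_Session42 AS w
JOIN dbo.LocalObjects AS l ON l.ObjectName = w.name;

EXEC [RemoteServer].[Scratch].sys.sp_executesql N'DROP TABLE dbo.Work_Session42;';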
Yes, you can, but it only lasts for the duration of the connection.
You need to use the EXECUTE ... AT syntax:
EXECUTE('SELECT * INTO ##example FROM sys.objects; WAITFOR DELAY ''00:01:00''') AT [SERVER2]
On SERVER2 the following will work (for 1 minute):
SELECT * FROM ##example
but it will not work on the local server.
Incidentally, if you open a transaction on the second server that uses ##example, the object remains until the transaction is closed. It also stops the creating statement on the first server from completing; i.e. run the following on SERVER2 and the statement on SERVER1 will continue waiting indefinitely.
BEGIN TRAN
SELECT * FROM ##example WITH (TABLOCKX)
This is more academic than of practical use!
If memory is not much of an issue, you could also use table variables as an alternative to temporary tables. This worked for me when running a stored procedure that needed temporary data storage against a linked server.
More info: e.g. this comparison of table variables and temporary tables, including the drawbacks of using table variables.
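A minimal sketch of that table-variable alternative, assuming a remote procedure reached through a linked server with RPC Out enabled (the server, database, procedure, and column names are illustrative):

-- Table variable as local scratch storage instead of a temp table
DECLARE @remote_objects TABLE
(
    name      sysname,
    object_id int
);

-- Capture the remote procedure's result set
INSERT INTO @remote_objects (name, object_id)
EXEC [RemoteServer].[SomeDb].dbo.GetObjectList;

-- Join the scratch data against local tables as needed
SELECT ro.name
FROM @remote_objects AS ro
JOIN dbo.LocalObjects AS l ON l.ObjectName = ro.name;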

Insert value to another sqlserver db

I have 2 SQL Servers:
temp1 XX.13.23.2
temp2 XX.23.45.6
The temp1 server has a database called db1 and contains a procedure called p1.
I want that procedure to insert the value into table T1 in database db2 on the temp2 server.
Is it possible for a procedure to insert a value into another server's database?
If this is possible, can someone provide me with an idea or some examples of how to achieve this?
Yes, please look into linked servers:
http://msdn.microsoft.com/en-us/library/ms188279%28SQL.90%29.aspx
You can call a remote stored procedure from the instance you want to insert to:
exec [RemoteServer].DatabaseName.DatabaseOwner.StoredProcedureName
You need to have the RemoteServer set up as a linked server.
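A small hedged sketch of what p1 on temp1 could look like, assuming a linked server named [temp2] has been created and T1 has the columns shown (the column names are made up):

CREATE PROCEDURE dbo.p1
    @Id   int,
    @Note nvarchar(200)
AS
BEGIN
    -- Four-part name: linked server, database, schema, table
    INSERT INTO [temp2].db2.dbo.T1 (ID, Note)
    VALUES (@Id, @Note);
END;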
Another option, especially if you're going to have a development version of the procedure where you want to run tests without touching a production environment, would be to use SQL Server synonyms: http://technet.microsoft.com/en-us/library/ms177544.aspx.
I personally like using them because once the proc is initially set up to use them, you won't have to change the SQL in the procedure.
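A hedged sketch of the synonym approach (the synonym, linked server, and column names are illustrative): the procedure always references the synonym, and only the synonym's definition changes between environments.

-- In each environment, point the synonym at that environment's copy of the table
CREATE SYNONYM dbo.T1_Target FOR [temp2].db2.dbo.T1;

-- The procedure body stays the same everywhere
INSERT INTO dbo.T1_Target (ID, Note)
VALUES (2, N'Just a note');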