If I want to run this sort of query in SQL Server, how can I run it from the server I am connected to against another server?
I tried adding "[ServerName1]." before "[DatabaseName1].[dbo]..." and "[ServerName2]." before "[DatabaseName2].[dbo]...", but that didn't seem to work.
INSERT INTO [DatabaseName1].[dbo].[TableName]
([FieldName])
SELECT [FieldName] FROM [DatabaseName2].[dbo].[TableName]
Is this possible?
Yes, you would put the server name before the rest of the object name, like:
myserver.mydatabase.dbo.mytable
However you first have to set up linked servers. Look up linked servers in BOL.
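For example, once a linked server for the second instance exists, the question's own query becomes a four-part-name query like the sketch below (the name [ServerName2] is taken from the question; substitute your actual linked server name, and note that the server you are connected to needs no prefix):
INSERT INTO [DatabaseName1].[dbo].[TableName] ([FieldName])
SELECT [FieldName]
FROM [ServerName2].[DatabaseName2].[dbo].[TableName]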
If you have ad hoc distributed queries enabled, you can use OPENDATASOURCE. Setting up a linked server is another option; I'm not sure of the pros and cons of each approach.
INSERT INTO [DatabaseName1].[dbo].[TableName]
SELECT FieldName
FROM OPENDATASOURCE('SQLNCLI',
'Data Source=Server\InstanceName;Integrated Security=SSPI')
.DatabaseName2.dbo.TableName
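If you go the OPENDATASOURCE route, ad hoc distributed queries have to be enabled first. A minimal sketch of that step (it changes a server-wide setting and typically requires sysadmin rights):
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'Ad Hoc Distributed Queries', 1;
RECONFIGURE;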
The best way to do this would be to create a "linked server" between the two. You will need appropriate permissions to do this.
Then it's just a matter of accessing the databases using your linked server name.
Ex: [linkedserver].databasename.dbo.tablename
To create a linked server in SSMS, go to Server Objects -> right-click Linked Servers -> click 'New Linked Server...'.
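If you would rather script it than click through the dialog, a rough sketch using the system procedures behind that dialog looks like this (the server name is a placeholder, and the security mapping shown simply reuses the caller's own credentials; adjust it to your environment):
EXEC master.dbo.sp_addlinkedserver
    @server = N'ServerName2',
    @srvproduct = N'SQL Server';

EXEC master.dbo.sp_addlinkedsrvlogin
    @rmtsrvname = N'ServerName2',
    @useself = N'True';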
In SSMS, Go to Query -> 'SQLCMD Mode'
DECLARE @VERSION VARCHAR(1000)
:CONNECT Source_Server_Name
SELECT @@VERSION AS [SQL_VERSION]
INTO
:CONNECT Destination_Server_Name
[MSSQLTips].[dbo].[TEST]
Now, on the Destination Server, execute your SELECT statement to check the output. For example:
SELECT * FROM [MSSQLTips].[dbo].[TEST]
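The same script also runs outside SSMS with the sqlcmd command-line utility, which is always in SQLCMD mode; for example, assuming the batch above has been saved as cross_server_copy.sql:
sqlcmd -S Destination_Server_Name -E -i cross_server_copy.sql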
I am attempting to create one single database to store all login errors.
insert into [dbo].[SQL_ErrorLog]
exec sp_readerrorlog 0, 1, 'error'
The above code gets me the information that I need for the current log, and I understand that changing the 0 to 1, 2, etc. will get me the previous days' logs.
I have 4 different environments and instead of setting this same job up on each environment, I would like to control it all from 1 single job. I intend to add a field to determine which environment the log information is coming from.
I know that I could also set up staging tables on each environment and then run a select statement to pull in data from each staging table to the final table, however again I am trying to complete all the work from one environment if possible.
I have linked the other environments using the linked servers and can select data from any of them without a problem.
My question is more about how I can run the sp_readerrorlog stored procedure on the other servers and insert that data into my master table.
An example would be:
Env0 - This is where the master table would be and where I would like to set everything up
Env1
Env2
Env3
I would like to be able to pull sp_readerrorlog 0, 1, 'error' information from Env1, Env2, and Env3 and populate it on Env0, without using staging tables on each individual environment if possible.
Please let me know if this is not 100% clear. It makes sense in my head, however that does not always come out in text form. :)
Thanks in Advance.
If you are using linked servers, it seems like you could chain together multiple calls using GO from the main source server. This will work assuming all of your linked servers are defined on that one server.
INSERT INTO [Linked Server Name].[some database name].[dbo].[SQL_ErrorLog]
EXEC [Linked Server Name].[some database name].[dbo].sp_readerrorlog
GO
INSERT INTO [Linked Server Name2].[some database name].[dbo].[SQL_ErrorLog]
EXEC [Linked Server Name2].[some database name].[dbo].sp_readerrorlog
GO
INSERT INTO [Linked Server Name3].[some database name].[dbo].[SQL_ErrorLog]
EXEC [Linked Server Name3].[some database name].[dbo].sp_readerrorlog
GO
INSERT INTO [Linked Server Name4].[some database name].[dbo].[SQL_ErrorLog]
EXEC [Linked Server Name4].[some database name].[dbo].sp_readerrorlog
I think this will be your best bet. You can put all of these statements into a single SQL Server Agent job and run the job. They will need to be fully qualified in order to run against the correct linked server.
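If you also want the environment column mentioned in the question, INSERT ... EXEC cannot add a constant column directly, so one hedged sketch is to land the rows in a temp table first and label them on the way into the master table. This assumes linked servers named [Env1] through [Env3] with RPC Out enabled, a master table dbo.SQL_ErrorLog on Env0 with an Environment column, and that sp_readerrorlog returns the usual LogDate, ProcessInfo, and Text columns:
CREATE TABLE #errlog (LogDate DATETIME, ProcessInfo NVARCHAR(100), LogText NVARCHAR(MAX));

INSERT INTO #errlog (LogDate, ProcessInfo, LogText)
EXEC [Env1].master.dbo.sp_readerrorlog 0, 1, 'error';

INSERT INTO dbo.SQL_ErrorLog (Environment, LogDate, ProcessInfo, LogText)
SELECT 'Env1', LogDate, ProcessInfo, LogText
FROM #errlog;

DROP TABLE #errlog;
-- Repeat (or loop) for [Env2] and [Env3].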
Could you please help me with the task below? I need to run the following query on a remote server:
UPDATE prod
SET prod.count = ( SELECT SUM(Inv) FROM cost WHERE pID = prod.ID)
WHERE (( SELECT COUNT(id) FROM Cost WHERE pID = prod.ID ) > 0)
I have tried to use OPENROWSET, but I do not have enough experience working with it, and all the OPENROWSET examples I have seen online use only one table. Can you please give me an idea of how to modify this query to use OPENROWSET, or suggest any other solutions?
You can use a direct linked server reference or OPENQUERY(linked_server_name, 'your query').
The best way to do this is to create a procedure on the target instance and then simply EXEC that procedure over the linked server:
exec LinkedServer.TargetDB.TargetSchema.NewProcedure
OPENROWSET is used to "open a row set", primarily to read remote data; to modify remote data, use OPENQUERY.
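One more option, sketched here under assumptions: the whole UPDATE can be shipped to the remote server in one shot with EXEC ... AT, assuming a linked server named [RemoteServer] with RPC Out enabled and that prod and cost live in the remote login's default database:
EXEC ('
    UPDATE prod
    SET prod.count = (SELECT SUM(Inv) FROM cost WHERE pID = prod.ID)
    WHERE (SELECT COUNT(id) FROM cost WHERE pID = prod.ID) > 0;
') AT [RemoteServer];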
I know you can do something like:
select count(*) as Qty from sys.databases where name like '%mydatabase%'
but how could you do something like:
select count(*) as Qty from linkedServer.sys.databases where name like '%mydatabases%'
I guess I could put a stored procedure on the linked server and execute the first select, but is there a way to query a linked server for what databases it holds?
Assuming your linked server login has read permissions on the master.sys.databases table, you can use the following:
select * from linkedserver.master.sys.databases
In the past, I've used this very query on SQL Server 2008 R2.
I think it's just a matter of syntax that is stopping you; try plain single quotes without the % wildcards around your database name:
SELECT COUNT(*) as Qty FROM LinkedServer.master.sys.databases where name like 'mydatabase'
The correct formatting for selecting a Linked Server has already been answered here:
SQL Server Linked Server Example Query
Listed below is a link to a cursor that works:
http://jasonbrimhall.info/2012/03/05/are-my-linked-servers-being-used/
The query will need some rework to include all functions and triggers though.
I'm not sure if a remote master DB is always available through a linked server.
I'll be using the following TRY CATCH probe
BEGIN TRY
EXEC ('SELECT TOP 1 1 FROM MyLinkedServer.MyTestDb.INFORMATION_SCHEMA.TABLES')
END TRY
BEGIN CATCH
PRINT 'No MyTestDB on MyLinkedServer'
END CATCH
I'm doing some fairly complex queries against a remote linked server, and it would be useful to be able to store some information in temp tables and then perform joins against it - all with the remote data. Creating the temp tables locally and joining against them over the wire is prohibitively slow.
Is it possible to force the temp table to be created on the remote server? Assume I don't have sufficient privileges to create my own real (permanent) tables.
This works from SQL 2005 SP3 linked to SQL 2005 SP3 in my environment. However, if you inspect tempdb, you will find that the table is actually created on the local instance and not on the remote instance. I have seen this suggested as a resolution on other forums and wanted to steer you away from it.
create table SecondServer.#Doll
(
name varchar(128)
)
GO
insert SecondServer.#Doll
select name from sys.objects where type = 'u'
select * from SecondServer.#Doll
I am 2 years late to the party, but you can accomplish this using sp_executesql and feeding it a dynamic query to create the table remotely:
EXEC RemoteServer.RemoteDatabase.sys.sp_executesql N'Create Table here'
This will execute the table creation at the remote location.
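A slightly fuller sketch of that call follows; the linked server, database, and table names are placeholders, RPC Out must be enabled on the linked server, and note that a table created this way is an ordinary table on the remote server rather than a local temp table:
EXEC [RemoteServer].[RemoteDatabase].sys.sp_executesql
    N'CREATE TABLE dbo.WorkTable (id INT, name SYSNAME);';

-- The new table can then be referenced with a four-part name:
SELECT id, name
FROM [RemoteServer].[RemoteDatabase].dbo.WorkTable;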
It's not possible to directly create temporary tables on a linked remote server. In fact you can't use any DDL against a linked server.
For more info on the guidelines and limitations of using linked servers see:
Guidelines for Using Distributed Queries (SQL 2008 Books Online)
One workaround, off the top of my head (and this would only work if you had permissions on the remote server):
On the remote server, have a stored procedure that creates a persistent table, with a name based on an IN parameter.
The remote stored procedure runs a query and inserts the results into that table.
You then query locally against that table and perform any joins to local tables as required.
Call another stored procedure on the remote server to drop the remote table when you're done.
Not ideal, but a possible workaround; a rough sketch follows.
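A rough sketch of that workaround, with hypothetical names throughout (usp_BuildWorkTable, usp_DropWorkTable, the work-table query, and dbo.LocalTable are all stand-ins):
-- On the remote server: build a persistent work table whose name is passed in
CREATE PROCEDURE dbo.usp_BuildWorkTable
    @TableName SYSNAME
AS
BEGIN
    DECLARE @sql NVARCHAR(MAX);
    SET @sql = N'SELECT name, object_id INTO dbo.' + QUOTENAME(@TableName)
             + N' FROM sys.objects WHERE type = ''U'';';  -- stand-in for the real query
    EXEC sys.sp_executesql @sql;
END;
GO

-- On the remote server: matching clean-up procedure
CREATE PROCEDURE dbo.usp_DropWorkTable
    @TableName SYSNAME
AS
BEGIN
    DECLARE @sql NVARCHAR(MAX);
    SET @sql = N'DROP TABLE dbo.' + QUOTENAME(@TableName) + N';';
    EXEC sys.sp_executesql @sql;
END;
GO

-- From the local server: build, join locally, then drop
EXEC [RemoteServer].[RemoteDb].dbo.usp_BuildWorkTable @TableName = N'Work_20190101';
SELECT w.name
FROM [RemoteServer].[RemoteDb].dbo.Work_20190101 AS w
JOIN dbo.LocalTable AS l ON l.name = w.name;
EXEC [RemoteServer].[RemoteDb].dbo.usp_DropWorkTable @TableName = N'Work_20190101';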
Yes, you can, but it only lasts for the duration of the connection.
You need to use the EXECUTE AT syntax:
EXECUTE('SELECT * INTO ##example FROM sys.objects; WAITFOR DELAY ''00:01:00''') AT [SERVER2]
On SERVER2 the following will work (for 1 minute):
SELECT * FROM ##example
but it will not work on the local server.
Incidentally, if you open a transaction on the second server that uses ##example, the object remains until the transaction is closed. It also stops the creating statement on the first server from completing; i.e., run the following on SERVER2 and the creating statement on SERVER1 will keep running indefinitely.
BEGIN TRAN
SELECT * FROM ##example WITH (TABLOCKX)
This is more academic than of practical use!
If memory is not much of an issue, you could also use table variables as an alternative to temporary tables. This worked for me when running a stored procedure that needed temporary data storage against a linked server.
More info: e.g. this comparison of table variables and temporary tables, including the drawbacks of using table variables.
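A minimal sketch of that alternative, assuming a linked server named [RemoteServer] and a hypothetical remote database [RemoteDb]: the remote rows are pulled once into a table variable and then reused locally.
DECLARE @remote_objects TABLE (object_id INT, name SYSNAME);

INSERT INTO @remote_objects (object_id, name)
SELECT object_id, name
FROM [RemoteServer].[RemoteDb].sys.objects
WHERE type = 'U';

-- Reuse the cached rows locally without another round trip to the linked server
SELECT r.name
FROM @remote_objects AS r;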
I need to update a SQL Server stored procedure on three different servers, and I would rather not do this manually. What are my options?
You can use the SQLCMD utility to connect to the three different servers / databases and run the stored procedure script. The control script may look something like this:
:connect server1
use DatabaseName
GO
:r StoredProcedure.sql
GO
:connect server2
use DatabaseName
GO
:r StoredProcedure.sql
GO
:connect server3
use DatabaseName
GO
:r StoredProcedure.sql
GO
SQL Compare is a great tool, especially for large or complex updates. However, you do have to pay for it. Using a utility like SQLCMD is not quite so elegant, but it is quick and free.
Use a tool like Red-Gate SQL Compare to create a script and then use their Multi-Script tool to execute it on multiple servers at one time.
www.red-gate.com
You could use a SQL Server synchronization tool, such as Red Gate SQL Compare. Or you could write a small script / application to connect to each server and execute the update statement, using OSQL.
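For example, a small batch file driving the deprecated osql utility (or its replacement, sqlcmd) could apply the same script to each server; the server, database, and file names here are placeholders:
osql -S server1 -d DatabaseName -E -i StoredProcedure.sql
osql -S server2 -d DatabaseName -E -i StoredProcedure.sql
osql -S server3 -d DatabaseName -E -i StoredProcedure.sql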
You can set up replication between the servers: have one main server that you make the update on, and then publish that update out to each of the other servers. That would be an easy way to do this.
Check out Migrator.NET; combined with a build server like Hudson that runs on check-in, it should do the trick. Plus, you get versioning and rollbacks along with it.
With "Central Management Servers" feature of SQL Server 2008, what you can do is to add those three servers into one group and then run a single alter procedure script against these three servers.