Using 2 different DBs in the same SP in SQL Azure - azure-sql-database

I am using an SP which inserts data into two tables in two different DBs; the SP was designed that way to keep both inserts in one transaction. It works fine in a SQL Server environment. For example:
INSERT INTO AdminDB.dbo.EmpSiteConfig VALUES (,,,)
INSERT INTO MainDB.dbo.EmpDetails VALUES (,,,)
where AdminDB and MainDB are the database names.
But when I migrate it to SQL Azure, I get the following error:
Reference to database and/or server name in 'MainDB.dbo.EmpDetails' is not supported in this version of SQL Server.
Can somebody tell me how to get rid of this error? Or is there any workaround for this?
Thanks in advance.

SQL Azure does not currently support linking to another server. As to workarounds, you could create a queue message requesting a specific action for data insertion. In your worker role, consume the queue message and call a separate stored procedure on each database.
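To make the workaround concrete, each database would get its own small insert procedure that the worker role calls after dequeuing the message. A minimal sketch, assuming hypothetical procedure and column names (none of these are from the question):

-- In AdminDB
CREATE PROCEDURE dbo.InsertEmpSiteConfig
    @EmpId INT,
    @SiteCode NVARCHAR(20)
AS
BEGIN
    INSERT INTO dbo.EmpSiteConfig (EmpId, SiteCode)
    VALUES (@EmpId, @SiteCode);
END

-- In MainDB
CREATE PROCEDURE dbo.InsertEmpDetails
    @EmpId INT,
    @Name NVARCHAR(100)
AS
BEGIN
    INSERT INTO dbo.EmpDetails (EmpId, Name)
    VALUES (@EmpId, @Name);
END

If either call fails, the worker leaves the message on the queue so the whole operation is retried later, which approximates the single transaction you had on-premises.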

Although I think it is better to use David Makogon's solution, you might want to take your chances with SQL Shard: http://enzosqlshard.codeplex.com/

Data Migration Assistant - Cross-database queries

I am running an assessment using the Azure SQL Data Migration Assistant (3.4.3948.1). In my initial assessment, I had a function that was calling fn_varbintohexstr, so I fixed it (read: removed the function). I also deleted all our synonyms.
I have now run the assessment several more times and it continues to give the 'cross-database queries' error, but without listing any further specifics. How can I find out which particular objects it means? Or is it possible that it has somehow cached my result and I need to invalidate it somehow?
This means you have programming objects in that database that reference another database. For example, a query like this:
SELECT * FROM Database1.dbo.Table1
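If the assessment won't name the remaining objects, you can search the stored module definitions for three-part-name references yourself. A rough sketch (substitute the database names you suspect; this is a plain text match, so it can miss dynamic SQL and hit commented-out code):

SELECT OBJECT_SCHEMA_NAME(object_id) AS SchemaName,
       OBJECT_NAME(object_id) AS ObjectName
FROM sys.sql_modules
WHERE definition LIKE '%Database1.dbo.%';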
One of the options you have is to import those external objects into your database and change the three- and four-part name references that SQL Azure does not support.
You can also use CREATE EXTERNAL DATA SOURCE and CREATE EXTERNAL TABLE on SQL Azure to query tables that belong to other databases, which you would also have to migrate to SQL Azure.
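A minimal elastic-query sketch of that second option; the server, database, credential, table, and column names are all placeholders:

-- Run once in the Azure SQL database that will issue the queries
CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<strong password>';

CREATE DATABASE SCOPED CREDENTIAL ElasticCred
WITH IDENTITY = '<remote user>', SECRET = '<remote password>';

CREATE EXTERNAL DATA SOURCE RemoteDb WITH (
    TYPE = RDBMS,
    LOCATION = 'myserver.database.windows.net',
    DATABASE_NAME = 'Database1',
    CREDENTIAL = ElasticCred
);

-- Mirrors the remote table's schema; no data is stored locally
CREATE EXTERNAL TABLE dbo.Table1 (
    Id INT,
    Name NVARCHAR(50)
) WITH (DATA_SOURCE = RemoteDb);

-- The single-part name now works where Database1.dbo.Table1 did not
SELECT * FROM dbo.Table1;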

Updating multiple tables data from different databases having same column name

I have 11 databases, each containing a table with user details, i.e. all employee details. That table has a column "Status" (1 for Active, 0 for Inactive). I have a regular task of updating the "Status" column to 0 or 1 for specified employees, and for that I have to go through every database, open the User table, and update it. Repeating the same task for every database consumes a lot of time.
If there is a short query or procedure that I can run once to do all the updates at the same time, it would be a great help.
I see a couple of possible options.
You could build an SSIS package to connect to each database and do the necessary updates, provided the criteria for which employees to update (and what to update them to) can be found within the database or some external source such as a text file.
Alternatively, you could use SQLCMD mode in SQL Server Management Studio, and within your SQL script use the :CONNECT command to switch to each server and database, something like this...
:CONNECT Server1
USE Database1
-- put your update SQL script here
:CONNECT Server2
USE Database2
-- put your update SQL script here
...
These links provide some further information on using SQLCMD mode...
Connecting to multiple servers in a Query Window using SQLCMD
SQL Server SQLCMD Basics
Noel
As you mentioned, you have 11 databases.
Problem: First, this is a poor approach to database design.
What really happens: When you use multiple databases and have to check every one of them, the server must connect to each database again and again, which takes much more time than switching between tables, because of the connection handling.
Solution: In your case, your only option is to connect to the different databases in a loop and run the query against each DB in turn, as in the sketch below.
Suggestion: You should keep all the data in the same database; you can use an extra column in the tables to map your data to the different entities.
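If all 11 databases live on one server, the loop can be a single dynamic-SQL batch built from sys.databases. A sketch, assuming hypothetical table and column names (UserDetails, Status, EmpId); adjust to your schema:

-- Build the same UPDATE for every listed database, then run the batch
DECLARE @sql NVARCHAR(MAX) = N'';

SELECT @sql += N'UPDATE ' + QUOTENAME(name) + N'.dbo.UserDetails
    SET Status = 0
    WHERE EmpId IN (101, 102);' + CHAR(13)
FROM sys.databases
WHERE name IN (N'HRDB01', N'HRDB02', N'HRDB03');  -- list all 11 databases here

EXEC sys.sp_executesql @sql;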

How to query a table to a view and publish to a different database

I have 13 SQL databases, some 2005, others 2008, on a VPN. I'd like to take all of the data from the "Employees" table in each database and make it a view at each location. I would then like to publish these views to one database on another server, all in one table, marking where each row came from within the original databases. For example, the database where all the information goes would look like this:
User   Name     Location
bik    Bob K    1
JS     John S   2
Etc.
Any help is appreciated.
I assume you want the data on the final server to be viewable, but not modifiable, and to reflect changes made to the source databases?
This would probably not perform all that well, but one do-it-yourself way to do it would be the following (disclaimer: I haven't tried doing this myself):
Set up all the source servers as linked servers on the final server (see the sketch after the view definition below).
Create a view in this form:
SELECT *, 1 as Location
FROM [Linked Server 1].Database1.dbo.Table1
UNION ALL
SELECT *, 2 as Location
FROM [Linked Server 2].Database2.dbo.Table2
... etc ....
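For the linked-server setup in the first step, sp_addlinkedserver and sp_addlinkedsrvlogin do the registration. A minimal sketch; the server name, data source, and credentials are placeholders:

-- Register the remote server and map a login for it
EXEC sp_addlinkedserver
    @server = N'Linked Server 1',
    @srvproduct = N'',
    @provider = N'SQLNCLI',
    @datasrc = N'server1.mydomain.local';

EXEC sp_addlinkedsrvlogin
    @rmtsrvname = N'Linked Server 1',
    @useself = 'FALSE',
    @rmtuser = N'remote_user',
    @rmtpassword = N'remote_password';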
You might want to read this documentation on distributed queries, if you haven't already.
I believe it's also possible to use SSIS as the source of a distributed query, but a quick scan through the documentation didn't find anything about it. I mention that because SSIS would make pulling and transforming data from disparate data sources very easy, and if you could use the final recordset as a data source, you could use an SSIS package as the backend to your view. Again, though, performance would probably require considerable tuning.
If the queries don't have to be real-time, you could look into using SQL Server Integration Services (SSIS) to pull the data into a local DB. You could schedule the job to run hourly/daily/weekly.

SQL: export database using T-SQL

I have a database connection to database DB1. The only thing I can do is execute T-SQL statements, including calling stored procedures. I want to export a specific table (or even specific rows of a specific table) to my local database. As you can read above, the DBs are on different servers, meaning no direct connection is possible. Therefore the question: is it possible to write a query that returns another query to execute on the local server to get the data? Also note that the table contains BLOBs. Thanks.
If you have SQL Server Management Studio, you can use the data import function on your local database to get the data. It works as long as you have Read/Select access on the tables you are trying to copy.
If you have Visual Studio you can use the database tools in there to move data between two servers as long as you can connect to both from your workstation.
Needs Ultimate or Premium though:
http://msdn.microsoft.com/en-us/library/dd193261.aspx
RedGate has some useful tools too:
http://www.red-gate.com/products/sql-development/sql-compare/features
Maybe you should ask at https://dba.stackexchange.com/ instead.
If you can log in to the remote DB (where you can only issue T-SQL), you may create a linked server on your local server pointing to the remote one and then use it directly in queries, like:
select * from [LinkedServerName].[DatabaseName].[SchemaName].[TableName]
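Pulling the specific rows (BLOB columns included) into a local table is then a plain INSERT...SELECT; the names below are placeholders:

-- Copy a filtered slice of the remote table into a local one
INSERT INTO dbo.LocalTable (Id, Name, BlobData)
SELECT Id, Name, BlobData
FROM [LinkedServerName].[DatabaseName].[SchemaName].[TableName]
WHERE Id BETWEEN 1 AND 1000;

For large tables it may be worth wrapping the remote part in OPENQUERY so the WHERE clause is evaluated on the remote server rather than after the rows cross the wire.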

Efficiently moving large sets of data between SQL Server tables?

I have a rather large (many gigabytes) table of data in SQL Server that I wish to move to a table in another database on the same server.
The tables are the same layout.
What would be the most efficient way of going about doing this?
This is a one off operation so no automation is required.
Many thanks.
If it is a one-off operation, why care about top efficiency so much?
SELECT * INTO OtherDatabase..NewTable FROM ThisDatabase..OldTable
or
INSERT INTO OtherDatabase..NewTable
SELECT * FROM ThisDatabase..OldTable
...and let it run overnight. I would dare to say that using SELECT/INSERT INTO on the same server is not far from the best efficiency you can get anyway.
Or you could use the "SQL Import and Export Wizard" found under "Management" in Microsoft SQL Server Management Studio.
I'd go with Tomalak's answer.
You might want to temporarily put your target database into bulk-logged recovery mode before executing a 'select into' to stop the log file exploding...
If it's SQL Server 7 or 2000 look at Data Transformation Services (DTS). For SQL 2005 and 2008 look at SQL Server Integration Services (SSIS)
Definitely put the target DB into bulk-logged mode. This will minimally log the operation and speed it up.
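A sketch of the combined advice, with placeholder database and table names:

-- Minimally log the load, then restore the original recovery model
ALTER DATABASE OtherDatabase SET RECOVERY BULK_LOGGED;

SELECT *
INTO OtherDatabase..NewTable
FROM ThisDatabase..OldTable;

ALTER DATABASE OtherDatabase SET RECOVERY FULL;
-- Take a log backup afterwards: a log backup containing bulk-logged
-- changes cannot be restored to a point in time within it.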