I have a cloud-hosted database, maintained by a different company. Here is my scenario:
I would like to bring down the entire database, schema and data, and keep my local database (SQL Server 2008 R2) updated in real time.
I cannot setup replication inside SQL server, I do not have permissions on the cloud server.
I cannot use triggers.
I cannot use linked servers.
They would provide me a copy of the backup (nightly) and access to the transaction logs every hour.
How can I use these to keep my entire database up to date?
Thanks in advance
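One possible approach, sketched below under the assumption that the nightly copy is a native SQL Server full backup and the hourly files are transaction log backups (file names and paths are placeholders), is a manual log-shipping pattern: restore the full backup without recovering, then apply each hourly log backup in sequence, using STANDBY so the local copy stays readable between restores.

```sql
-- Sketch of a manual log-shipping restore sequence; all file names are hypothetical.
-- Restore the nightly full backup, leaving the database able to accept further log restores.
RESTORE DATABASE CloudCopy
FROM DISK = N'C:\Restores\CloudDb_Full.bak'
WITH NORECOVERY, REPLACE;

-- Apply each hourly transaction log backup in order.
-- STANDBY keeps the database read-only between restores so it can still be queried.
RESTORE LOG CloudCopy
FROM DISK = N'C:\Restores\CloudDb_Log_0100.trn'
WITH STANDBY = N'C:\Restores\CloudCopy_undo.dat';

RESTORE LOG CloudCopy
FROM DISK = N'C:\Restores\CloudDb_Log_0200.trn'
WITH STANDBY = N'C:\Restores\CloudCopy_undo.dat';
-- ...repeat for each subsequent hourly log backup.
```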
Related
I have a scheduled task on my warehouse server that runs a batch file, which triggers a command that runs a sproc on my SQL server. The task is set up to run as ACCOUNTS\sqlservice.
I recently updated my linked server objects to stop warehouse users from querying data through them, by mapping only the user(s) that should have query access in the linked server security settings. While mapping a local SQL Server login to a SQL Server login works, I can't seem to map a domain account between the two servers successfully; specifically, the ACCOUNTS\sqlservice account, which has sa on both servers.
Any ideas on how I can give the sqlservice account access to query the linked server object? Thank you!
One solution: for the remote mapping, use an appropriate alternative user that is a SQL Server login rather than a domain account.
There may be another way to do this between domain accounts, but I haven't had the time to dig into that.
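As a rough sketch of that workaround (the linked server name, remote login, and password below are hypothetical), the local domain account can be mapped to a SQL Server login on the remote side with sp_addlinkedsrvlogin:

```sql
-- Map the local domain login to a SQL Server login on the remote server.
-- 'REMOTESRV', 'linked_query_user', and the password are placeholders.
EXEC master.dbo.sp_addlinkedsrvlogin
    @rmtsrvname  = N'REMOTESRV',             -- name of the linked server
    @useself     = N'False',                 -- do not pass through the caller's own credentials
    @locallogin  = N'ACCOUNTS\sqlservice',   -- local login being mapped
    @rmtuser     = N'linked_query_user',     -- SQL Server login on the remote server
    @rmtpassword = N'StrongPasswordHere';
```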
We have a local database that retains data for 2 months. I'm replicating the local database to a cloud database using SQL Server transactional replication. The idea is that we want to keep a year's worth of data in the cloud. I disabled DELETE statements from being replicated, and this works great. However, if the replication gets reinitialized for any reason, or someone runs the Snapshot Agent again on the publisher, I will lose all the data in the cloud and end up with just the current image of my local database! What can I do from the subscriber side to stop this from happening? Is there a way to make the subscriber (the cloud database) ignore all forms of DELETE or re-initialization and just keep accumulating replicated data from the local database?
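For context, the "don't replicate DELETEs" part mentioned above is typically configured per article, roughly as in this sketch (the publication and article names are placeholders):

```sql
-- Sketch: tell transactional replication not to deliver DELETE statements
-- for one article. 'MyPublication' and 'MyTable' are placeholder names.
EXEC sp_changearticle
    @publication = N'MyPublication',
    @article     = N'MyTable',
    @property    = N'del_cmd',
    @value       = N'NONE',
    @force_invalidate_snapshot = 1;  -- may be required when changing delivery commands
```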
I realize that Azure SQL Database does not support doing an insert/select from one db into another, even if they're on the same server. We receive data files from clients and we process and load them into a "load database". Once the load is complete, based upon various rules, we then move the data into a production database of which there are about 20, all clones of each other (the data only goes into one of the databases).
Looking for a solution that will allow us to move the data. There can be 500,000 records in a load file, so moving them one by one is not really feasible.
Have you tried Elastic Query? Here is the Getting Started guide for it. Currently you cannot perform remote writes, but you can always read data from remote tables.
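As a rough sketch of reading across databases with Elastic Query (all object names, the credential, and the columns below are hypothetical), you would define an external table in the target production database that points at the load database, then pull the rows over with a local INSERT...SELECT:

```sql
-- Run in the target (production) database. All names and the password are placeholders.
CREATE MASTER KEY ENCRYPTION BY PASSWORD = 'StrongPasswordHere';

CREATE DATABASE SCOPED CREDENTIAL LoadDbCred
WITH IDENTITY = 'loaduser', SECRET = 'StrongPasswordHere';

-- Point at the "load database" on the same logical server.
CREATE EXTERNAL DATA SOURCE LoadDbSource WITH (
    TYPE = RDBMS,
    LOCATION = 'myserver.database.windows.net',
    DATABASE_NAME = 'LoadDb',
    CREDENTIAL = LoadDbCred
);

-- Schema must match the remote table; these columns are illustrative only.
CREATE EXTERNAL TABLE dbo.StagedClientData (
    Id INT NOT NULL,
    ClientCode NVARCHAR(20) NOT NULL,
    Amount DECIMAL(18, 2) NULL
)
WITH (DATA_SOURCE = LoadDbSource);

-- Remote writes aren't supported, so read from the external table
-- and insert into the local production table instead.
INSERT INTO dbo.ClientData (Id, ClientCode, Amount)
SELECT Id, ClientCode, Amount
FROM dbo.StagedClientData;
```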
Hope this helps!
Silvia Doomra
I am using Microsoft Sync Framework to synchronize an Azure database with a local SQL Server 2008 database. Everything is working fine, but I have a small problem, as described below.
I am synchronizing one way, i.e., from the Azure DB to the local DB. Inserts/updates/deletes on the Azure DB get synchronized to the local database. But I tried manually updating a record in the local DB using a normal UPDATE statement, and I also updated the same record with a corresponding new value in the Azure DB. Now the record in the local DB is not getting the updated value from the Azure DB. This problem happens only after a record is updated manually in the local database.
Can anyone please help?
That's because you're now encountering a conflict. When both copies of a row are updated, one on each end, you end up with a conflict, and you need to tell Sync Framework how to resolve it (e.g., retain the local copy or overwrite it).
See: How to: Handle Data Conflicts and Errors for Database Synchronization.
I have 2 Azure SQL databases, and I've created an SSIS job to transfer some data from one DB to the other.
The DB has millions of records.
The SSIS package is hosted on premises. If I execute the package on my PC,
will it directly copy the data from one Azure DB to the other on the fly,
OR
fetch the data from one Azure DB to my local machine and then upload it to the other Azure DB?
A round trip from Azure to local and from local back to Azure will be too costly if I have millions of records.
I am aware of Azure SQL Data Sync, but my requirements call for SSIS to transfer particular data.
Also, does Azure Data Sync have an option to sync only particular tables?
Running the SSIS package on your local machine will cause the data to be moved to your machine before being sent out to the destination database.
When you configure a sync group in SQL Azure Data Sync, you should be able to select which tables to synchronize.
I'm pretty sure SQL Azure Data Sync does have the option to select just the tables you need to transfer. However, I don't think there is an option to transform the data being transferred.
As for SSIS, I don't see how a transfer would be possible without the data first coming to your premises. You have two connections established: one connection to the first SQL Azure server, and the other connection to the second SQL Azure server. SSIS will pull the data from the first connection and then push it to the second one.
I would suggest exploring SQL Azure Data Sync, as it might be the best choice for your scenario. Any other option would require the data to first come on premises and then be transferred back to the cloud.
Well, there is a third option. You create a simple worker based on ADO.NET and the SqlBulkCopy class, put it in a worker role in the cloud, and trigger it with a message in an Azure queue or similar. That would seem to be one of the best solutions, as you have total control over what is being copied. All the data will stay in the Microsoft datacenter, which means:
Fast transfer
No bandwidth charges (as long as all three components, the two SQL Azure servers and the one worker role, are deployed in the same affinity group)