I use SQL Azure Data Sync to sync my remote Azure database with my local SQL database. Data Sync creates some additional tables on the client and the server, and also adds delete, insert, and update triggers to the existing tables.
What are these triggers for? Can I delete them? I suspect not.
The problem now is that I can't edit data on the server.
I get this error:
The target table 'dbo.Corporation' of the DML statement cannot have any enabled triggers if the statement contains an OUTPUT clause without INTO clause.
The triggers are added by the Microsoft Sync Framework, which SQL Azure Data Sync is built on. And no, you can't delete them, because SQL Azure Data Sync would stop working. It is not easy to modify tables after they have been provisioned; if you are adding columns, check out this question. If it is something else, try searching for a solution under the Microsoft Sync Framework tag rather than SQL Azure.
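If your own DML hits this error, the standard workaround (which keeps the triggers in place) is to redirect the OUTPUT clause into a table variable, which is allowed even when triggers are enabled. A minimal sketch against the table from the error message; the Id and Name columns are placeholders:

    -- OUTPUT without INTO fails on a table with enabled triggers,
    -- but OUTPUT ... INTO a table variable is permitted.
    DECLARE @updated TABLE (Id int, Name nvarchar(100));

    UPDATE dbo.Corporation
    SET Name = N'New name'
    OUTPUT inserted.Id, inserted.Name INTO @updated
    WHERE Id = 1;

    SELECT Id, Name FROM @updated;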
We want to use Azure servers to run our Power Apps applications; however, we have local SQL Servers that contain our data warehouse. We want only certain tables to be on Azure, and we want to create data feeds between the two, with information flowing from one to the other.
Does anyone have any insight into how I can achieve this?
I have googled but there doesn't appear to be a wealth of information on this topic.
It depends on how quickly after a change in your source (the on-premises SQL Server) you need that change reflected in your sink (Azure SQL).
If a lag of a few minutes is acceptable, or you only need to update once a day, I would suggest a basic Data Factory pipeline (search for 'data factory upsert'). How exactly you achieve this depends on your data.
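As a sketch, once Data Factory has landed the changed rows in a staging table on the Azure side, the upsert step usually comes down to a MERGE like this (all table and column names are placeholders):

    -- Upsert rows from a staging table into the target table.
    MERGE dbo.TargetTable AS t
    USING dbo.StagingTable AS s
        ON t.Id = s.Id
    WHEN MATCHED THEN
        UPDATE SET t.Name = s.Name, t.UpdatedAt = s.UpdatedAt
    WHEN NOT MATCHED BY TARGET THEN
        INSERT (Id, Name, UpdatedAt)
        VALUES (s.Id, s.Name, s.UpdatedAt);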
If you need it faster, or it is impossible to extract an incremental update from your source, you would need to either use triggers to write the changes from one database to the other, or get a change data capture tool that does this for you.
It looks like you just want to sync the data in some tables between a local SQL Server and an Azure SQL database.
You can use Azure SQL Data Sync.
Summary:
SQL Data Sync is a service built on Azure SQL Database that lets you synchronize the data you select bi-directionally across multiple SQL databases and SQL Server instances.
With Data Sync, you can keep data synchronized between your on-premises databases and Azure SQL databases to enable hybrid applications.
A Sync Group has the following properties:
The Sync Schema describes which data is being synchronized.
The Sync Direction can be bi-directional or can flow in only one direction. That is, the Sync Direction can be Hub to Member, Member to Hub, or both.
The Sync Interval describes how often synchronization occurs.
The Conflict Resolution Policy is a group-level policy, which can be Hub wins or Member wins.
Next, you need to learn how to configure Data Sync. Please see this Azure document: Tutorial: Set up SQL Data Sync between Azure SQL Database and SQL Server on-premises.
In this tutorial, you learn how to set up Azure SQL Data Sync by creating a sync group that contains both Azure SQL Database and SQL Server instances. The sync group is custom configured and synchronizes on the schedule you set.
Hope this helps.
The most robust solution here is Transactional Replication. You can also use SSIS or Azure Data Factory for copying tables to/from Azure SQL Database. And Azure SQL Data Sync also exists.
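For the replication route, the publisher-side setup is sketched below, heavily abbreviated: it assumes a distributor is already configured, and every name is a placeholder. Note that Azure SQL Database can only be a push subscriber.

    -- Enable the source database for publishing, create a publication
    -- with one article, and add a push subscription to the Azure server.
    EXEC sp_replicationdboption @dbname = N'SourceDb', @optname = N'publish', @value = N'true';

    EXEC sp_addpublication @publication = N'AzurePub', @status = N'active';

    EXEC sp_addarticle @publication = N'AzurePub', @article = N'MyTable',
        @source_owner = N'dbo', @source_object = N'MyTable';

    EXEC sp_addsubscription @publication = N'AzurePub',
        @subscriber = N'yourserver.database.windows.net',
        @destination_db = N'TargetDb', @subscription_type = N'push';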
Can we sync views in a local database with tables in a SQL Azure database using SQL Azure Data Sync [Preview]? If yes, then how?
In my opinion, the jury is still out on SQL Azure Data Sync. It's been in preview for years and still is.
When you say sync, are you looking to take the results of a local view and persist them as a table in SQL Azure?
If so, you can use a SELECT INTO, taking the data from the local view and letting the SELECT INTO make a table out of it. There are some things you need to do to make the SQL Azure DB available locally:
http://msdn.microsoft.com/en-us/library/azure/ee336282.aspx
https://dba.stackexchange.com/questions/59328/insert-to-sql-azure-through-linked-server-very-slow
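Once the SQL Azure DB is reachable as a linked server (per the links above), the copy itself can be a plain INSERT ... SELECT. One caveat: SELECT ... INTO can only create local tables, so the remote table is assumed to already exist here; the linked server and object names are placeholders:

    -- Push the view's current results into the SQL Azure table
    -- over a linked server named AZUREDB.
    INSERT INTO AZUREDB.MyAzureDb.dbo.MySnapshotTable (Id, Name)
    SELECT Id, Name
    FROM dbo.MyLocalView;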
I am using the Microsoft Sync Framework to synchronize an Azure database with a local SQL Server 2008 database. Everything is working fine, but I have a small problem, as described below.
I am synchronizing one way, i.e., from the Azure DB to the local DB. Inserts, updates, and deletes on the Azure DB get synchronized to the local database. But I tried manually updating a record in the local DB using a normal UPDATE statement, and I also updated the same record with a corresponding new value in the Azure DB. Now the record in the local DB is not getting the updated value from the Azure DB. This problem happens only after updating a record manually in the local database.
Can anyone please help?
That's because you're now encountering a conflict. When both copies of a row are updated on both ends, you end up with a conflict, and you need to tell Sync Framework how to resolve it (e.g., retain the local copy or overwrite it).
See: How to: Handle Data Conflicts and Errors for Database Synchronization
We are using the SQL Data Sync tool to synchronize an on-premises DB with the cloud. But while provisioning, Data Sync throws up an error message that it requires ALTER DATABASE permission for the SQL login on the SQL Server on-premises database. We did a lot of digging to find out the reason, and it looks like it uses the ALTER DATABASE command to change the <change_tracking_option>. Does this mean that if <change_tracking_option> is enabled, it will not create the change-tracking triggers for each table? And if it does create them, why does it require ALTER DATABASE permission?
SQL Azure Data Sync is based on the Sync Framework and I think this is happening because the Sync Framework is trying to enable snapshot isolation on your database.
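Both snapshot isolation and change tracking are database-level options, which would explain why provisioning needs ALTER DATABASE permission rather than just table-level rights. The statements it issues are presumably of this shape (the database name is a placeholder):

    -- Database-level options like these can only be changed via ALTER DATABASE.
    ALTER DATABASE [OnPremDb] SET ALLOW_SNAPSHOT_ISOLATION ON;

    ALTER DATABASE [OnPremDb]
    SET CHANGE_TRACKING = ON (CHANGE_RETENTION = 2 DAYS, AUTO_CLEANUP = ON);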
I am writing code to migrate data from our live Access database to a new SQL Server database, which has a different schema with a reorganized structure. This SQL Server database will be used with a new version of our application that is in development.
I've been writing migration code in C# that calls SQL Server and Access and transforms the data as required. I migrated, for the first time, a table that has entries related to new entries in another table that I had not updated recently, and that caused an error because the corresponding record could not be found in SQL Server.
So my SQL Server production table has data only up to 1/14/09, and I'm continuing to migrate more tables from Access. I want to write an update method that can figure out what is new in Access that hasn't yet been reflected in SQL Server.
My current idea is to write a query on the SQL Server side that does SELECT MAX(RunDate) FROM ProductionRuns, to give me the latest date in that field in the table. On the Access side, I would write a query that does SELECT * FROM ProductionRuns WHERE RunDate > ?, where the parameter is the max date found in SQL Server; I'd then perform my translation step in code and insert the new data into SQL Server.
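Written out, the two queries I have in mind look like this (the Access one is executed from C# via OleDb/ODBC, with the ? parameter bound to the value returned by the first query):

    -- SQL Server side: the latest date already migrated (the high-water mark).
    SELECT MAX(RunDate) AS MaxRunDate FROM ProductionRuns;

    -- Access side, run from C#; the ? parameter is bound to MaxRunDate above:
    -- SELECT * FROM ProductionRuns WHERE RunDate > ?;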
What I'm wondering is: do I have the syntax right for getting the latest date in that SQL Server table? And is there a better way to do this kind of migration of a live database?
Edit: What I've done is make a copy of the current live database, which I can then migrate without worrying about changes and use for testing during development; then I can migrate the latest data whenever the new database and application go live.
I personally would divide the process into two steps.
First, create an exact copy of the Access DB in SQL Server and copy all the data into it.
Then copy the data from this temporary SQL Server DB to your destination database.
That way, you can write a set of SQL code to accomplish the second step; a sketch follows below.
Alternatively, use SSIS.
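The step-2 sketch mentioned above might look something like this; everything other than the copy-from-staging idea (the Plants lookup, the column names) is made up for illustration:

    -- Transform rows from the staging copy of the Access DB into the
    -- reorganized destination schema.
    INSERT INTO DestDb.dbo.ProductionRuns (RunDate, PlantId)
    SELECT s.RunDate, p.PlantId
    FROM StagingDb.dbo.ProductionRuns AS s
    JOIN DestDb.dbo.Plants AS p
        ON p.PlantName = s.PlantName;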
Generally, when you convert data to a new database that will take its place in production, you shut out all users of the database for a period of time, run the migration, and turn on the new database. This ensures no changes to the data are made while doing the conversion. Of course, I never would have done this using C# either; data migration is a database task and should have been done in SSIS (or DTS, if you have an older version of SQL Server).
If the database you are converting to is just in development, I would create a backup of the Access database and load the data from there, to test the data-loading process and to get the data in so you can do the application development. Then, when it is time to do the real load, you just close down the real database to users and load from it. If you are trying to keep both in sync while you develop (I wouldn't do that, but if you must), make a nightly backup of the file and load it first thing in the morning using your process.
You may want to look at investing in a tool like SQL Data Compare.
I believe it has support for Access databases too, and you can download a trial.
If you are happy with your C# code but it fails because of the constraints in your destination database, you can temporarily disable them and then re-enable them after you copy the whole lot.
I am assuming that your destination database is a brand-new DB with no data, and is not used by anyone while the transfer happens.
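For example, something along these lines around the copy (the table name is a placeholder; NOCHECK disables foreign key and check constraints, not primary keys):

    -- Disable constraints on the destination table before the copy...
    ALTER TABLE dbo.ProductionRuns NOCHECK CONSTRAINT ALL;

    -- ...run the C# copy here...

    -- ...then re-enable them, re-validating the copied rows (WITH CHECK).
    ALTER TABLE dbo.ProductionRuns WITH CHECK CHECK CONSTRAINT ALL;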
It sounds like you have two problems:
You're migrating data from one database to another.
You're changing your schema.
Doing either of these things is tricky if you are trying to migrate the data while people are using it.
The simplest approach is to migrate the data based on a static copy of the data, and also to queue updates to that data from the moment you capture the static copy. I don't know how easy this is in Access, but in SQL Server or Oracle you can use the redo logs for this, or a manual solution using triggers. The poor-man's way of doing this is to make triggers for all the relevant tables that log the primary key of the records that have changed. Then, after the old database is shut off, you can iterate over those keys, get those records from the old database, and put them into the new database. Just copy the whole record; if the record was deleted, then delete it from the new database.
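On the SQL Server side, the poor-man's version might look like the following; the table names and the RunId key column are assumptions:

    -- Log the primary keys of changed rows so they can be re-migrated later.
    CREATE TABLE dbo.ChangeLog (
        TableName sysname  NOT NULL,
        KeyValue  int      NOT NULL,
        ChangedAt datetime NOT NULL DEFAULT GETDATE()
    );
    GO

    CREATE TRIGGER dbo.trg_ProductionRuns_LogChange
    ON dbo.ProductionRuns
    AFTER INSERT, UPDATE, DELETE
    AS
    BEGIN
        SET NOCOUNT ON;
        -- inserted covers INSERTs and UPDATEs; deleted covers UPDATEs and DELETEs.
        INSERT INTO dbo.ChangeLog (TableName, KeyValue)
        SELECT N'ProductionRuns', RunId FROM inserted
        UNION
        SELECT N'ProductionRuns', RunId FROM deleted;
    END;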
Your problem is compounded by the fact that you can't simply copy the data; you have to transform it. This means you probably have to shut down both databases and re-migrate the records based on the change list. It will take a lot of planning to ensure you get things right, and I'd recommend writing a testing script that can validate that the resulting data is correct.
Also, I'd ensure that the code for the migration runs inside one of the databases if possible. Otherwise, you are copying the data twice, and that will significantly harm performance.