SQL Azure Sync between two Azure databases doesn't preserve names

I'm trying to sync between two SQL Azure databases as a workaround for the inability to do cross-database queries. Basically I have 5 tables in a small database which is updated very frequently, and I want the contents of those 5 tables in my main application database, so I've created the tables with an identical schema in both databases.
The sync SEEMS to be working, but what I end up with is a load of tables in another schema, and nothing in my own tables.
E.g. my tables are dbo.ad, dbo.adgroup, etc.,
but I get datasync.ad, datasync.adgroup, etc.

And what we learn from this is patience.
SQL Azure Data Sync creates those datasync schema tables as a tracking mechanism for the sync. It can take a little while (approx. 30 minutes for me) before your data starts to appear, but appear it does.
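As a rough way to check progress, you can compare row counts in your own tables against the tracking objects Data Sync provisions. A minimal sketch, assuming the default layout described above (your tables in dbo, the tracking tables in datasync):

    -- List tables and row counts in the dbo and datasync schemas to confirm
    -- that Data Sync has provisioned its tracking objects, and to watch the
    -- synced rows arrive in your own tables.
    SELECT s.name AS schema_name,
           t.name AS table_name,
           SUM(p.rows) AS row_count
    FROM sys.tables AS t
    JOIN sys.schemas AS s ON s.schema_id = t.schema_id
    JOIN sys.partitions AS p ON p.object_id = t.object_id
                            AND p.index_id IN (0, 1)  -- heap or clustered index
    WHERE s.name IN ('dbo', 'datasync')
    GROUP BY s.name, t.name
    ORDER BY s.name, t.name;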

Related

Sql Azure - Cross database queries

I have N databases, for example 10 databases.
Every database has the same schema, but different data.
Now I would like to take the data from the table "Table1" in each database and insert it into a common table named Table1Common in a new database, "DWHDatabase".
So it's an N-to-1 insert.
How can I do that? I'm trying to solve this with elastic queries, but they seem to be a 1-to-1 mechanism.
Use Azure Data Factory with Linked Services to each database. Use the Copy activity to load the data.
You can also parameterize the solution.
Parameterize linked services
Parameters in Azure Data Factory by Cathrine Wilhelmsen
Elastic query is best suited for reporting scenarios in which the majority of the processing (filtering, aggregation) can be done on the external source side. It is unsuitable for ETL processes that involve transferring significant amounts of data from remote databases. Consider Azure Synapse Analytics for large reporting workloads or data warehousing applications with more sophisticated queries.
You can use the Copy activity to copy data between on-premises and cloud-based data stores. After you've copied the data, you can use other activities to transform and analyse it. The Copy activity can also be used to publish transformation and analysis results for business intelligence (BI) and application consumption.
MSFT Copy Activity Overview: Here.
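For reference, here is roughly what the elastic query route from the question would look like, and why it feels like a 1-to-1 mechanism: each source database needs its own external data source and external table, and only remote reads are supported, so for N databases you repeat (or script) the setup N times. Server names, credentials and columns below are placeholders, not taken from the question:

    -- Run in DWHDatabase. A minimal sketch assuming Table1 has just Id and
    -- Payload columns; substitute the real columns and credentials.
    CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<strong password>';

    CREATE DATABASE SCOPED CREDENTIAL DwhCredential
        WITH IDENTITY = '<sql login>', SECRET = '<password>';

    -- One external data source per source database (repeat for Database2..DatabaseN).
    CREATE EXTERNAL DATA SOURCE SrcDatabase1
        WITH (TYPE = RDBMS,
              LOCATION = '<yourserver>.database.windows.net',
              DATABASE_NAME = 'Database1',
              CREDENTIAL = DwhCredential);

    -- Local metadata-only definition pointing at dbo.Table1 in Database1.
    CREATE EXTERNAL TABLE dbo.Table1_Database1
    (
        Id int,
        Payload nvarchar(200)
    )
    WITH (DATA_SOURCE = SrcDatabase1, SCHEMA_NAME = 'dbo', OBJECT_NAME = 'Table1');

    -- The actual N-to-1 load: reads remotely, writes locally.
    INSERT INTO dbo.Table1Common (SourceDb, Id, Payload)
    SELECT 'Database1', Id, Payload
    FROM dbo.Table1_Database1;

For a handful of small tables this is workable, but as noted above, pulling large volumes through external tables is exactly the scenario where the Copy activity is the better fit.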

Azure Growing Sql Database

1. I am new to Azure. I want to know whether we can have the same replication mechanism on Azure SQL DB that on-premises SQL Server provides.
2. The issue we are facing is that a few of the tables are growing fast, with around 10k records inserted daily, so we are planning to keep only a few months of data (say 6) on the main DB and copy all the data to another DB using replication (not sure if that's feasible).
We also need to read data from the copy in the application for some reports.
Please suggest whether replication will work for this, or any other solution.
Geo-replication uses a version of AlwaysOn with async replicas under the hood. It is very similar to a distributed Availability Group in SQL 2016, but you cannot control it; you can only turn it on or off.
Replication will work for that, but it would replicate all the data in the DB, not just the tables you want.
Link to Azure Documentation: https://azure.microsoft.com/en-us/documentation/articles/sql-database-geo-replication-overview/
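If you do go the geo-replication route, it can be enabled from the portal or with T-SQL. A minimal sketch with placeholder database and server names, run in the master database of the primary server:

    -- Creates a secondary copy of MainDb on the partner server; the secondary
    -- is readable, so the application can point its report queries at it.
    ALTER DATABASE MainDb
        ADD SECONDARY ON SERVER [partner-server];

    -- To stop replicating later:
    -- ALTER DATABASE MainDb REMOVE SECONDARY ON SERVER [partner-server];

Note that, as the answer above says, this copies the whole database, not just the fast-growing tables.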

How to insert data from one Azure SQL Database into a different Azure SQL Database?

I realize that Azure SQL Database does not support doing an insert/select from one db into another, even if they're on the same server. We receive data files from clients, and we process and load them into a "load database". Once the load is complete, based upon various rules, we then move the data into a production database, of which there are about 20, all clones of each other (the data only goes into one of them).
I'm looking for a solution that will allow us to move the data. There can be 500,000 records in a load file, so moving them one by one is not really feasible.
Have you tried Elastic Query? Here is the Getting Started guide for it. Currently you cannot perform remote writes, but you can always read data from remote tables.
Hope this helps!
Silvia Doomra
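Building on that suggestion, one pattern is to create an external table in the target production database over the load database (the same elastic query setup sketched in the cross-database section above) and then pull the rows across in batches rather than one by one. A rough sketch, assuming an external table dbo.LoadRows_Remote already exists over the load database and has an integer key column Id (all names here are placeholders):

    -- Copy rows from the load database in 50,000-row batches, keyed on Id.
    -- dbo.LoadRows_Remote is an external table over the load database;
    -- dbo.ProductionRows is the local target table.
    DECLARE @lastId int = 0, @batchSize int = 50000, @rows int = 1;
    DECLARE @inserted TABLE (Id int);

    WHILE @rows > 0
    BEGIN
        DELETE FROM @inserted;

        INSERT INTO dbo.ProductionRows (Id, Payload)
        OUTPUT inserted.Id INTO @inserted (Id)
        SELECT TOP (@batchSize) Id, Payload
        FROM dbo.LoadRows_Remote
        WHERE Id > @lastId
        ORDER BY Id;

        SET @rows = @@ROWCOUNT;

        IF @rows > 0
            SELECT @lastId = MAX(Id) FROM @inserted;
    END

Batching keeps each transaction small, which matters when a single load file can contain 500,000 records.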

Synchronize data between two different databases with different schemas

We have an old CRM running on a SQL 2005 database that I have been tasked with linking to the corporate website. At the moment the data in it is managed internally by staff. The problem is that the database is so poorly designed in terms of how it stores relational data that using it with something like Entity Framework is an unenviable task.
So with this in mind I planned to extract the information from the old 2005 database, transform it into relational tables in SQL 2012 that work with Entity Framework, and use this to power the website. I'd set this up as a SQL Agent job that runs once a day, as there is quite a lot of data to be synchronized.
I have also written some code in the website's business logic to handle inserts, updates and deletes in a transaction, so that if a user updates their details on the website, it updates the SQL 2012 data table and, via an API, updates the 2005 database to keep some sort of data consistency. I have written a test application and this works satisfactorily.
The problem now comes when internal staff update the data in the CRM: with this setup it can take up to 24 hours before the data is updated on the website. So far I have found SQL Data Compare 10 and dbForge Data Compare, which can sync the changes across databases, but this has to be done manually.
Is there an automated way to update the record in the SQL 2012 table when it is changed in the SQL 2005 database?
Is it possible to use database-first with Entity Framework and then alter the structure of the models it creates so that they make more sense, but still work with the legacy database?
Thanks in advance

Share data between 4 websites

I'm planning a web project containing 4 websites built in MVC3. As the database server I'm going to use MS SQL Server.
Each of these websites will have around 40 tables, but some of the tables are shared between the websites:
Contact, Cities, Postalcodes, Countries...
How should I handle this? Should I put all the tables of each website into one common database (so that the databases of websites 1, 2, 3 and 4 are together in a single database)? Or should I create a separate database containing just the shared data?
But then I think I'll run into problems with data consistency, because I don't think there is a way to point from one database to another (linking, for example, the city table in database one to the building table in database 2).
Any ideas?
Thanks a lot!
What I like about splitting it out into separate databases is that if each web site has its own database, and one of those web sites gets extremely popular, it is very easy to just move their database to a different, more powerful database server and not much has to change except (a) you need to reference the central "control" data remotely (or replicate/mirror/etc), and (b) you point that web site at a different database server.
Another benefit is that if two web sites have the same types of tables (e.g. Patients), you don't have to have tables like Patients_WebSite1, Patients_WebSite2, with different stored procedures that are identical except for table names (or ugly dynamic SQL procedures that paste the table name in). Separated out you can have the exact same schema and the exact same codebase without having to combine everyone's data into a single table.
If you mix the data within a single database, data consistency is easier and the whole setup is slightly simpler, but splitting it out when you grow is a lot tougher. If you split it out into different databases, then no, you won't be able to enforce referential integrity using standard DRI (foreign keys). You can accomplish this in other ways if it is important (triggers, validation before insert/update, etc).
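For example, a trigger-based check of the kind mentioned above could look roughly like this. This is a hypothetical sketch: SharedDb, Cities, Buildings and CityId are placeholder names, and it assumes the databases live on the same SQL Server instance so three-part names work.

    -- Hypothetical trigger in one website's database: rejects Buildings rows
    -- whose CityId does not exist in the shared Cities table in the central DB.
    CREATE TRIGGER dbo.trg_Buildings_ValidateCity
    ON dbo.Buildings
    AFTER INSERT, UPDATE
    AS
    BEGIN
        SET NOCOUNT ON;

        IF EXISTS (
            SELECT 1
            FROM inserted AS i
            LEFT JOIN SharedDb.dbo.Cities AS c ON c.CityId = i.CityId
            WHERE c.CityId IS NULL
        )
        BEGIN
            RAISERROR('CityId does not exist in SharedDb.dbo.Cities.', 16, 1);
            ROLLBACK TRANSACTION;
            RETURN;
        END
    END;

Unlike a foreign key, this only fires on the child side, so deletes from the shared Cities table would need a similar check (or be handled by application validation).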