I'm working on a legacy project, written for the most part in Delphi 5 before it was upgraded to Delphi 2007. A lot has changed since this upgrade, except the underlying database: it still uses MS Access for data storage.
Now we want to support SQL Server as an alternate database. Still just for single-user situations, although multi-user support will be a feature for the future. And although there won't be many migration problems (see below) when it needs to use a different database, keeping two database structures synchronized is a bit of a problem.
If I created an SQL script to generate the SQL Server database, I would need a second script to keep the Access database up to date too; they don't speak the same dialect (at least, not for our purposes). So I need a simple way to maintain the database structure, one that can generate both a valid SQL Server database and a valid Access database. I could write my own tool that stores the database structure in an XML file, which, combined with some smart code and ADOX, would generate both database types.
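To illustrate the dialect problem with a made-up table: even this trivial CREATE TABLE has to be written twice, because the type names differ between the two engines:

    -- SQL Server (T-SQL) version
    CREATE TABLE Customer (
        Id      int IDENTITY(1,1) PRIMARY KEY,
        Name    nvarchar(50),
        Created datetime
    );

    -- Access (Jet) version of the same table
    CREATE TABLE Customer (
        Id      COUNTER CONSTRAINT PK_Customer PRIMARY KEY,
        Name    TEXT(50),
        Created DATETIME
    );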
But isn't there already a good tool that can do this?
Note: the application also uses ADO and all queries are just simple select statements. Although it has 50+ tables, there's one root "Document" table and the user selects one of the "documents" in this table. It then collects all records from all tables that are related to this document record and stores them in an in-memory structure. When the user saves the data, it just writes the document record and all changed data back to the database again. Basically, this read/write mechanism of documents is the only database interaction in the whole application. So using a different database is not a big problem.
We will drop the MS-Access database in the future but for now we have 4000 customers using this application. We first need to make sure the whole thing works with SQL Server and we need to continue to maintain the current code. As a result, we will have to support both databases for at least a year.
Take a look at DB Explorer; there is a trial download too.

Or:

1. Use the migration wizard from MS Access to SQL Server.
2. After development in Access (schema changes), use the wizard again.
3. Use a tool to compare the SQL Server schemata.
Basically I need something to generate SQL Server change scripts for data differences only, based on two tables with the same schema.
We will have a table with approx 250,000-330,000 rows and 10-12 columns, and two instances of the table:
The Master table, populated with records from the production system.
The Sandpit table, also populated from the production system, but in which the user can add/remove rows and edit cell contents.
Once the user is happy with their edits they need to generate a change script which makes the necessary changes to an instance of the Master table in a variety of servers (test, pre-prod, prod), so it needs to be reliable. It's safe to assume that all versions of the master data will be the same when the script is eventually run.
They also need to be able to re-run the change script for self-testing (restoring the master back to its original state would be a separate process, out of scope for this question).
Design of the table schema is not yet done, and can be tailored to suit this purpose.
SQL Server 2008 Standard edition, upgrade likely (but still standard edition).
I understand Red Gate is pretty much the industry standard / leading choice for generating SQL change scripts, but their website focuses a lot on managing schema changes, so I'm not sure it's appropriate here. I'm familiar with using SQL Server myself, but it's been a few years, and I'm not sure whether the built-in functionality is up to it (both technically capable and user-friendly enough). The end user will be a competent SQL user but comes from the business side, not IT (not SQL-admin grade).
You should be able to do this from Visual Studio using SSDT (SQL Server Data Tools). You need to do a data comparison between the source and target tables, which will then generate a change script.
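If you'd rather script it yourself, MERGE (available since SQL Server 2008) can express the whole Sandpit-to-Master synchronization in one re-runnable statement. This is only a sketch: the key column Id and the data columns Col1 and Col2 are invented, and the inequality tests would need NULL handling for nullable columns.

    MERGE Master AS tgt
    USING Sandpit AS src
        ON tgt.Id = src.Id
    WHEN MATCHED AND (tgt.Col1 <> src.Col1 OR tgt.Col2 <> src.Col2) THEN
        UPDATE SET Col1 = src.Col1, Col2 = src.Col2      -- edited rows
    WHEN NOT MATCHED BY TARGET THEN
        INSERT (Id, Col1, Col2) VALUES (src.Id, src.Col1, src.Col2)  -- added rows
    WHEN NOT MATCHED BY SOURCE THEN
        DELETE;                                          -- removed rows

Because it converges the target to match the source rather than replaying edits, running it twice is harmless, which covers the self-testing requirement.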
For reasons I'm not about to explain, we keep an Access database that is a copy of a subset of a larger Oracle database. It is not feasible to refer to the data directly in the Oracle database, due to speed issues (don't ask).

Every time a specific application is opened, the local Access database is updated with the newest data found, up to the time of opening the application. First of all, this does not capture changes in existing records. Secondly, it does not take into account changes made in the source database after the application is opened.

For this reason, several checks may be needed when carrying out certain operations in the application. So: is it possible to update the local Access database with only the changes in the Oracle database, in a smarter and faster way than the hard way I am imagining (I'm not a PL/SQL / SQL expert)? Possibly it might be sufficient to look only for changes after a certain date (stored in one of the fields of the recordset retrieved).
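For what it's worth, the date-based idea would boil down to something like this on the Oracle side (the LAST_MODIFIED column and the bind variable are hypothetical; the table would need such a column maintained by a trigger or by the application, and deletes would still go unnoticed):

    SELECT *
    FROM   my_table
    WHERE  last_modified > :last_sync_time;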
Any suggestions?
You might want to look into data replication between Oracle and MS Access databases, for example through an ODBC driver or a SQL Server database. Just google "ms access oracle replication" and see if this solves your problem.
I'm creating the front-end for a project and I made a copy of the back-end database from the company's server and put it on my computer. I needed to make some changes (a few new tables and two new columns in an existing table) for security roles and other things so I duplicated the copied database and made my changes on the new one.
I want to deploy my project to the company's server now but we need to modify the original back-end database. I need to generate a SQL script that finds the changes between the old-database and my newer database, which can be run on the old database to create the new tables and columns. The script should retain the data from the old database and NOT add any junk/testing data I made in my new database.
By the way, I'm using SQL Server 2008 R2 and the old database on the server is on 2005. I've been looking around for utilities to use and found tablediff. However, it looks like it will copy the data and I can't see an argument on the information page to toggle this.
I'm sure it's simple but I'm not really sure how to do this. Any help would be appreciated. Thanks.
By far the solution I trust most to handle schema comparisons is Red Gate's SQL Compare:
http://www.red-gate.com/products/sql-development/sql-compare/
It has a companion called Data Compare which is designed specifically for data. You can grab the free trial to see if it does what you need in this case.
There are other options as well, for example SQL Server Data Tools has this functionality, though I haven't tested it to any degree that I could compare feature sets, performance, etc.
I've also blogged about why you may want to just pay for a tool with this functionality rather than solve it programmatically yourself. The post also mentions a variety of alternatives if budget is a primary blocker:
http://bertrandaaron.wordpress.com/2012/04/20/re-blog-the-cost-of-reinventing-the-wheel/
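If budget is the main blocker, a hand-rolled change script is also viable for small deltas. A minimal sketch (the table and column names are invented) that creates only what is missing and leaves existing data untouched:

    -- Create the new table only if it does not exist yet
    IF OBJECT_ID(N'dbo.SecurityRole', N'U') IS NULL
        CREATE TABLE dbo.SecurityRole (
            RoleId   int IDENTITY(1,1) PRIMARY KEY,
            RoleName nvarchar(50) NOT NULL
        );

    -- Add the new column only if it is missing; existing rows are preserved
    IF COL_LENGTH(N'dbo.Employee', N'RoleId') IS NULL
        ALTER TABLE dbo.Employee ADD RoleId int NULL;

Because every step checks before it acts, the script can be run safely on a database that already has some of the changes, and it works on SQL Server 2005 as well.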
I am writing code to migrate data from our live Access database to a new Sql Server database which has a different schema with a reorganized structure. This Sql Server database will be used with a new version of our application in development.
I've been writing migration code in C# that calls SQL Server and Access and transforms the data as required. The first time I migrated a table, it had entries related to new entries in another table that I had not updated recently, and that caused an error because the corresponding record could not be found in SQL Server.

So, my SQL Server production table has data only up to 1/14/09, and I'm continuing to migrate more tables from Access. I want to write an update method that can figure out what is new in Access and hasn't yet been reflected in SQL Server.
My current idea is to write a query on the SQL side which does SELECT Max(RunDate) FROM ProductionRuns, to give me the latest date in that field in the table. On the Access side, I would write a query that does SELECT * FROM ProductionRuns WHERE RunDate > ?, where the parameter is that max date found in SQL Server, and perform my translation step in code, and then insert the new data in Sql Server.
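Spelled out, the pair of queries I have in mind would be the following (the comments are just annotations; the ? is the positional parameter ADO expects):

    -- SQL Server side: the most recent run already migrated
    SELECT MAX(RunDate) FROM ProductionRuns;

    -- Access side: everything newer than that
    SELECT * FROM ProductionRuns WHERE RunDate > ?;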
What I'm wondering is, do I have the syntax right for getting the latest date in that Sql Server table? And is there a better way to do this kind of migration of a live database?
Edit: What I've done is make a copy of the current live database, which I can migrate without worrying about changes and use for testing during development; then I can migrate the latest data whenever the new database and application go live.
I personally would divide the process into two steps:

1. Create an exact copy of the Access DB in SQL Server and copy all the data into it.
2. Copy the data from this temporary SQL Server DB to your destination database.

That way you can write a set of SQL statements to accomplish the second step (sketched below).

Alternatively, use SSIS.
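A sketch of what that second step could look like, with made-up database, table, and column names; the real transformation logic lives in the SELECT:

    -- Move rows from the temporary copy into the reorganized schema
    INSERT INTO NewDb.dbo.ProductionRuns (RunDate, MachineId, Quantity)
    SELECT r.RunDate,
           m.MachineId,        -- translate the legacy name to the new key
           r.Quantity
    FROM   StagingDb.dbo.ProductionRuns AS r
    JOIN   NewDb.dbo.Machine  AS m ON m.LegacyName = r.MachineName;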
Generally, when you convert data to a new database that will take its place in production, you shut out all users of the database for a period of time, run the migration, and turn on the new database. This ensures no changes are made to the data while doing the conversion. Of course, I never would have done this using C# either; data migration is a database task and should have been done in SSIS (or DTS if you have an older version of SQL Server).

If the database you are converting to is just in development, I would create a backup of the Access database and load the data from there, both to test the data-loading process and to get the data in so you can do the application development. Then, when it is time to do the real load, you just close the real database to users and load from it. If you are trying to keep both in sync while you develop, well, I wouldn't do that, but if you must, make a nightly backup of the file and load it first thing in the morning using your process.
You may want to look at investing in a tool like SQL Data Compare.
I believe it has support for Access databases too, and you can download a trial.
If you are happy with your C# code but it fails because of the constraints in your destination database, you can temporarily disable them and then re-enable them after you copy the whole lot.

I am assuming that your destination database is a brand-new DB with no data, and not used by anyone while the transfer happens.
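Disabling and re-enabling looks like this per table (the table name is hypothetical; NOCHECK covers foreign-key and check constraints, not primary or unique keys):

    -- Turn the constraints off before the bulk copy
    ALTER TABLE dbo.ProductionRuns NOCHECK CONSTRAINT ALL;

    -- ... run the C# copy here ...

    -- Turn them back on; WITH CHECK re-validates the copied rows
    ALTER TABLE dbo.ProductionRuns WITH CHECK CHECK CONSTRAINT ALL;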
It sounds like you have two problems:
You're migrating data from one database to another.
You're changing your schema.
Doing either of these things is tricky if you are trying to migrate the data while people are using the data.
The simplest approach is to migrate the data based on a static copy of the data, and also to queue updates to that data from the moment you captured the static copy. I don't know how easy this is in Access, but in SQL Server or Oracle you can use the redo logs for this, or a manual solution using triggers. The poor man's way of doing this is to make triggers for all the relevant tables that log the primary key of the records that have changed. Then, after the old database is shut off, you can iterate over those keys, get those records from the old database, and put them into the new database. Just copy the whole record; if the record was deleted, then delete it from the new database.
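A sketch of that poor man's version on the SQL Server side, for one hypothetical Orders table keyed by OrderId:

    -- One shared log of changed primary keys
    CREATE TABLE dbo.ChangeLog (
        TableName sysname  NOT NULL,
        RecordId  int      NOT NULL,
        ChangedAt datetime NOT NULL DEFAULT GETDATE()
    );
    GO

    CREATE TRIGGER trg_Orders_Log
    ON dbo.Orders AFTER INSERT, UPDATE, DELETE
    AS
    BEGIN
        -- keys of inserted or updated rows
        INSERT INTO dbo.ChangeLog (TableName, RecordId)
        SELECT 'Orders', OrderId FROM inserted;
        -- keys of deleted (or updated) rows
        INSERT INTO dbo.ChangeLog (TableName, RecordId)
        SELECT 'Orders', OrderId FROM deleted;
    END;

Updates land in the log twice, once from each pseudo-table; that is harmless as long as you select the distinct keys when replaying.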
Your problem is compounded by the fact that you can't simply copy the data, you have to transform it. This means you probably have to shut down both databases and re-migrate the records based on the change list. It will take a lot of planning to ensure you get things right and I'd recommend writing a testing script that can validate that the resulting data is correct.
Also I'd ensure that the code for the migration runs inside one of the databases if possible. Otherwise you are copying the data twice and this will significantly harm the performance.
We have a common problem of moving our development SQL 2005 database onto shared web servers at website hosting companies.
Ideally we would like a system that transfers the database structure and data as an exact replica.
This would be commonly achieved by restoring a backup. But because they are shared SQL servers, we cannot restore backups – we are not given access to the actual machine.
We could generate a script to create the database structure, but then we could not do a data transfer through the menu item Tasks/Import Data, because we might violate foreign-key constraints as tables are imported in an order that conflicts with the database schema. Also, indexes might not be replicated if they are set to auto-generate.
Thus we are left with a messy operation:
1. Create a script in SQL 2005 that generates the database in SQL 2000 format.
2. Run the script to create a SQL 2000 database in SQL 2000.
3. Create a script in SQL 2000 that generates the database structure WITHOUT indexes and foreign keys.
4. Run this script on the production server. You now have a database structure to upload data to.
5. Use SQL 2005 to transfer the data to the production server with Tasks/Import Data.
6. Use SQL 2000 to generate a script that creates the database with indexes and keys.
7. Copy the commands that generate the indexes and foreign keys only; these are located after the table-creation commands (see the example after this list). Note: in SQL 2005, the indexes and foreign keys are generated as one and cannot be easily separated.
8. Run this script on the production database.
Voila! The database is uploaded with all data and keys/constraints in place. What a messy and error-prone system.
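For reference, the part copied out in step 7 is just the CREATE INDEX and ALTER TABLE ... ADD CONSTRAINT statements, something like this (names invented):

    CREATE NONCLUSTERED INDEX IX_OrderLine_OrderId
        ON dbo.OrderLine (OrderId);

    ALTER TABLE dbo.OrderLine
        ADD CONSTRAINT FK_OrderLine_Order
        FOREIGN KEY (OrderId) REFERENCES dbo.[Order] (OrderId);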
Is there something better?
Scott Gu has written a few posts on this topic:
SQL Server Database Publishing Toolkit for Web Hosting
Generation scripts are fine for creating the database objects, but not for transporting database information. For example, client-specific databases where the developer is required to pre-populate some data.
One of the issues I've run into with this is the new MAX types in SQL Server 2005+ (nvarchar(max), varchar(max), etc.). Of course, this is worse when you are actually using SQL Server Express, which doesn't allow for exporting other than by creating your own scripts to create the data.
I would recommend switching to a hosting company that allows you to FTP backup files and does NOT require you to use your own scripts. That's the whole point of SQL Server, right? To provide tools that are friendlier to use. If the hosting company takes that away, you may as well move to MySQL for its ease in dumping information.
WebHost4Life is a lifesaver in this category. They offer FTP to the database server, so you can upload your backup file, or your MDF and LDF files for attachment! I was so upset when I saw GoDaddy had a restriction similar to the one you mentioned. Their tool didn't tell me it was a bad import, and I couldn't figure out why my site was coming back with 500 errors.

One other note: I'm not sure which is considered more secure. I enabled external connections in GoDaddy and connected with Management Studio, and I was able to see every database on that server! I couldn't access them, but I now have that information. A double whammy is that GoDaddy requires the DB user name to be the same as the DB name! Now all you need to do is spam passwords against those hundreds of DBs!

WebHost4Life, on the other hand, shows only your specific database in Management Studio. And they let you pick your own DB name and user name, independent of each other; they only append the same unique ID to the end of the user and DB names to keep them from conflicting with others.
You should not rely on restoring backups for copying / transferring databases. You need to use scripts; trust me, you will get better at it.
I have used the RedGate Compare tools with shared hosting and it works well.
Database-generation scripts are messy, but they also have several advantages that ... well, make the pain more tolerable.
First, if you treat the DB scripts as real programming tasks in and of themselves, you can encapsulate the messiness. If you generate a script once (using a database tool), you can split the table-structure aspects from the constraint aspects (keys, indices, etc.). Similarly, you can export the data once, but split it into "system" data that's not frequently changed but is necessary for correct operation (stuff like tax or shipping rates, etc.), "test" data that's easily identifiable, and "operational" data that needs to be moved from DB version Old to DB version New (last week's orders).

The first 3 minutes after you've accomplished that, things are wonderful: you can regenerate a new database, with or without test data, in a few minutes. Unfortunately, after 3 minutes the databases are out of sync, at least in terms of data, if not quite as frequently in terms of structure.

I personally like to have each table's structure as a separate SQL file (and its constraints as a separate file in a separate directory, its test data in one file, its system data in another, etc.). On the one hand, this means that several different files have to be touched when making a change, but on the other hand, it makes it much easier to see the granularity of what's been changed: it's all right there in the version-control logs. (I could probably be convinced that many-files is a mistaken strategy...)
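Concretely, the split might look like this; the file names and the schema are invented:

    -- tables/Orders.sql: structure only
    CREATE TABLE dbo.Orders (
        OrderId    int      NOT NULL,
        CustomerId int      NOT NULL,
        PlacedAt   datetime NOT NULL
    );

    -- constraints/Orders.sql: keys and indexes, applied after the data load
    ALTER TABLE dbo.Orders ADD CONSTRAINT PK_Orders PRIMARY KEY (OrderId);
    CREATE INDEX IX_Orders_CustomerId ON dbo.Orders (CustomerId);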
All of this is predicated on the assumption that you have some facility for actually running a complex script involving many files and are not just constrained to some Web-based control panel, which may be what you're describing when you say "we are not given access to the actual machine." I feel that you can't do custom software development and not have some kind of shell access on the server; the hosting business is competitive enough that you can certainly find a script-friendly host easily enough.
Check whether the webhosting company provides myLittleBackup.

This is definitely the easiest solution for "installing" a DB from the development server onto the shared SQL server.
Answer for SQL Server 2008 users.
I had the exact same issue as the OP, but I was using SQL Server 2008 and my shared hosting company is GoDaddy. Here's the solution to copy the DB plus the data to a GoDaddy database...

In Visual Studio 2010, go to Server Explorer (in VS Express, I think it's called Database Explorer). Right-click the database and select Publish to Provider... This opens the Database Publishing Wizard; go through the wizard and it'll create an xxx.sql file on your local computer...
Open SQL Server Management Studio and connect to the GoDaddy database (you should have already created this via the GoDaddy control panel within their website) ...
Open Windows Explorer, find the xxx.sql file, and double-click it. The script should open up in SSMS. Execute the script within the proper database... voila, done.