I've had an issue with a SQL Server database after an update from some horrible software. The software "updated" (in actuality, rolled back) a bunch of encrypted stored procedures and user-defined functions in the database, which is now causing errors in other software.
Thankfully I took a backup just before the update; however, the error wasn't noticed until about an hour later, which means records have been updated, inserted, and deleted since the backup was taken.
Originally my idea was to simply copy the stored procedures and functions to a new database created from the backup, then backup and restore this database onto the broken database, but as they are encrypted I can't copy them.
So the next idea was to transfer the tables from the broken database to the restored database and proceed as above. However, I ran into several issues with the existing tables, such as the DBTimeStamp column not allowing the copy (timestamp/rowversion columns can't be inserted into explicitly). Copying the tables to a new, clean database works fine, though.
So here are the questions:
What's the best way to effectively merge the tables from the backup with the tables from the broken db?
Would simply truncating or dropping the existing tables in the restored backup avoid these validation errors? I get error messages like "VS_ISBROKEN" when trying to use the Export Data wizard to push the data across, even with the existing data removed (by truncating or dropping) and IDENTITY_INSERT turned on. (A sketch of this approach follows after these questions.)
I have yet to try dropping all the tables on the backup and going from there. Would there be an adverse effect on metadata if I approached the problem this way?
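For what it's worth, here is a minimal sketch of the truncate-and-copy approach, run in the restored database. The table and column names (Orders, OrderID, etc.) and the BrokenDb name are hypothetical, and the DBTimeStamp column is deliberately left out of the column list, since rowversion values can't be inserted explicitly:

    -- Clear the stale rows without dropping the table.
    -- (TRUNCATE fails if the table is referenced by a foreign key; use DELETE in that case.)
    TRUNCATE TABLE dbo.Orders;

    SET IDENTITY_INSERT dbo.Orders ON;

    -- List the columns explicitly and skip any timestamp/rowversion columns.
    INSERT INTO dbo.Orders (OrderID, CustomerName, UpdatedAt)
    SELECT OrderID, CustomerName, UpdatedAt
    FROM BrokenDb.dbo.Orders;

    SET IDENTITY_INSERT dbo.Orders OFF;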
I feel like this should be quite simple, and had the provider not locked down all the functions and stored procedures I wouldn't need to copy the tables out like this.
Thanks for reading :)
I have a weird request: I support an application that routinely purges historical data, which I would like to keep in a separate datastore. I want the schema of the copy to stay identical to the primary database, but I would like to script a process that takes the changes to the primary database and replicates them to the clone without deleting the data that the application is purging from the tables.
How would I go about doing this?
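One way to script this is a MERGE that inserts and updates but never deletes. A minimal sketch, assuming a hypothetical Orders table keyed on OrderID:

    MERGE CloneDb.dbo.Orders AS target
    USING PrimaryDb.dbo.Orders AS source
        ON target.OrderID = source.OrderID
    WHEN MATCHED THEN
        UPDATE SET target.CustomerName = source.CustomerName,
                   target.Amount       = source.Amount
    WHEN NOT MATCHED BY TARGET THEN
        INSERT (OrderID, CustomerName, Amount)
        VALUES (source.OrderID, source.CustomerName, source.Amount);
    -- Deliberately no WHEN NOT MATCHED BY SOURCE clause,
    -- so rows purged from the primary survive in the clone.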
I have a production and a development database (on different systems, of course). Many months ago, I copied the production database to the development system using exp/imp. Since then there have been quite a few changes in the production database that I would like to copy down to the development database. I'd rather not wipe out the development database and start over, because of data I've had to add to it.
My original thought was to use MERGE INTO to copy the new records. But this apparently requires a separate statement per table, listing all fields of each table. We're talking hundreds of tables and thousands of fields here. Not a pretty solution.
Is there an easier way?
Why not use the TABLE_EXISTS_ACTION=APPEND parameter of impdp to append the new data to the existing tables? Duplicate keys will error off, but the rest of the data should still import. The results will be a bit messy. Prior to running, TRUNCATE any tables in test where you can just bring over the entire production table. Disable FKs, and re-enable them after the import.
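For example (a sketch only; the connection details, schema, directory object, and dump file names are all hypothetical):

    impdp dev_user/password@devdb SCHEMAS=app_schema DIRECTORY=dpump_dir DUMPFILE=prod_full.dmp TABLE_EXISTS_ACTION=APPEND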
Another option: create a database link and generate INSERT/SELECT statements into all tables for the data not already in the existing test tables. You probably also want to disable FKs prior to running and re-enable them when done.
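A sketch of that approach for one table, assuming a hypothetical orders table keyed on order_id and an FK named orders_customer_fk:

    -- One-time setup: a link pointing at production (credentials hypothetical)
    CREATE DATABASE LINK prod_link
      CONNECT TO app_schema IDENTIFIED BY password USING 'PRODDB';

    ALTER TABLE orders DISABLE CONSTRAINT orders_customer_fk;

    -- Pull only the rows that don't already exist in test
    INSERT INTO orders
    SELECT s.*
    FROM   orders@prod_link s
    WHERE  NOT EXISTS (SELECT 1 FROM orders d WHERE d.order_id = s.order_id);

    ALTER TABLE orders ENABLE CONSTRAINT orders_customer_fk;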
I've searched extensively for the answer to this question and could not find a good answer. I've looked into several restore DB articles and a few rollbacks too but still no success.
My situation is: I have a very large database in which I executed a wrong UPDATE query against a single column of a single table, and I have a full backup of this database from yesterday (which is more than enough to correct the problem). But the other tables of this same DB were updated in the meantime, and I need them to keep their current values.
So after all the reading, my plan was: restore the full backup to a new location, then get the values of the column I need and apply them to the current database.
My problem is: I'm not able to restore this full backup without affecting the production DB. When I try to restore it, SQL Server Management Studio says the MDF file can't be overwritten (which is good, because I'll still be using the table). Then I saw some articles telling me to use the MOVE option, but if I use it, won't the MDF files of the original/production database be relocated, thus affecting that database?
I also saw a few articles telling me to roll it back if I have transaction log backups. I wasn't able to tell whether I have those, or even what they are, even after googling.
Any thoughts on how I should proceed?
Sorry if it is a newbie question, but I'm not originally a programmer; I have been doing this for work and I really need it done fast! So any help would be strongly appreciated.
I'm using SQL Server Standard 2005 with SQL Server Management Studio 2008.
1. Restore the backup with a different name, like DB_Temp, to any location (a sketch follows below).
2. Copy the affected table from the running DB using SELECT INTO.
3. Import the records from the newly restored DB_Temp table into the running DB.
4. Drop the DB_Temp database.
5. Check the differences between the recently copied and original tables.
6. Update the records accordingly.
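Note that WITH MOVE only controls where the restored copy's files are created; it does not touch the production MDF/LDF. A sketch with hypothetical logical file names, paths, and table/column names:

    -- Run RESTORE FILELISTONLY FROM DISK = 'D:\Backups\MyDb_Full.bak' first
    -- to discover the logical file names used below.
    RESTORE DATABASE DB_Temp
    FROM DISK = 'D:\Backups\MyDb_Full.bak'
    WITH MOVE 'MyDb'     TO 'D:\Data\DB_Temp.mdf',
         MOVE 'MyDb_log' TO 'D:\Data\DB_Temp_log.ldf';

    -- Pull the pre-update column values back into production,
    -- assuming the table has a stable key:
    UPDATE live
    SET    live.BadColumn = old.BadColumn
    FROM   MyDb.dbo.MyTable    AS live
    JOIN   DB_Temp.dbo.MyTable AS old
           ON old.ID = live.ID;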
Thanks
Is there a way to back up certain tables in a SQL Server database? I know I can move certain tables into different filegroups and perform a backup on those filegroups. The only issue with this is I believe you need a backup of all the filegroups and transaction logs to restore the database on a different server.
The reason I need to restore the backup on a different server is that these are backups of customers' databases. For example, we may have a remote customer and need to get a copy of their 4GB database. 90% of this space is taken up by two tables; we don't need these tables as they only store images. Currently we have to take a copy of the database and upload it to an FTP site. With larger databases this can take a lot of time, so we need to reduce the backup size.
The other way I can think of doing this would be to take a full backup of the DB and restore it on the client's SQL Server, then connect to the new temp DB and drop the two tables. Once this is done we could take a backup of the DB. The only issue with this solution is that it could use a lot of system resources while the queries run, so it's less than ideal.
So my idea was to use two filegroups. The primary filegroup would host all of the tables except the two tables which would be in the second filegroup. Then when we need a copy of the database we just take a backup of the primary filegroup.
I have done some testing but have been unable to get it working. Any suggestions? Thanks
Basically your approach using 2 filegroups seems reasonable.
I suppose you're working with SQL Server on both ends, but you should clarify whether that is truly the case for each end, as well as which editions (Enterprise, Standard, Express, etc.) and which releases (2000, 2005, 2008, 2012?).
Table backup in SQL Server is a dead horse that still gets a good whippin' now and again. Basically, it's not a feature of the built-in backup feature-set. As you rightly point out, the partial backup feature can be used as a workaround. Also, if you just want to transfer a snapshot of a subset of tables to another server using FTP, you might try working with the bcp utility, as suggested by one of the answers in the linked post above, or the export/import data wizards. To round out the list of table-backup solutions and workarounds for SQL Server, there is this (and possibly other?) third-party software that claims to allow individual recovery of table objects, but unfortunately doesn't seem to offer individual object backup: "Object Level Recovery Native" by Red Gate. (I have no affiliation with, or experience using, this particular tool.)
As for your more specific concern about restoring from partial database backups:
I believe you need a backup of all the filegroups and transaction logs
to restore the database on a different server
1) You might have some difficulties the first time you try to get it to work, but you can perform restores from partial backups as far back as SQL Server 2000 (as a reference, see here).
2) From 2005 onward you have the option of partially restoring today and, if you need to, restoring the remainder of your database later. You don't need to include all filegroups: you always include the primary filegroup, and if your database is in simple recovery mode you must also include all read-write filegroups.
3) You need to apply log backups only if your DB is in bulk-logged or full recovery mode and you are restoring changes to a read-only filegroup that has become read-write since the last restore. Since you are expecting changes to these tables, you will likely not be dealing with read-only filegroups, and so won't be concerned with shipping and applying log backups.
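As a rough sketch of the two-filegroup setup and a partial restore on another server (all database, filegroup, and path names are hypothetical):

    -- Put the image tables on their own filegroup
    ALTER DATABASE CustomerDb ADD FILEGROUP ImagesFG;
    ALTER DATABASE CustomerDb
        ADD FILE (NAME = CustomerDb_Images,
                  FILENAME = 'D:\Data\CustomerDb_Images.ndf')
        TO FILEGROUP ImagesFG;
    -- (Re)create the two image tables ON ImagesFG, then back up everything else:
    BACKUP DATABASE CustomerDb
        FILEGROUP = 'PRIMARY'
        TO DISK = 'D:\Backups\CustomerDb_Primary.bak';

    -- On the other server, a piecemeal restore of just the primary filegroup
    -- (this form assumes the simple recovery model):
    RESTORE DATABASE CustomerDb
        FILEGROUP = 'PRIMARY'
        FROM DISK = 'D:\Backups\CustomerDb_Primary.bak'
        WITH PARTIAL, RECOVERY;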
You might also investigate whether any of the other SQL Server features, such as merge replication or those mentioned above (bcp, the import/export wizards), might provide a solution that is simpler or more adequately meets your needs.
I am writing code to migrate data from our live Access database to a new SQL Server database which has a different schema with a reorganized structure. This SQL Server database will be used with a new version of our application in development.
I've been writing migration code in C# that calls SQL Server and Access and transforms the data as required. For the first time, I migrated a table whose entries are related to new entries of another table that I had not updated recently, and that caused an error because the corresponding record could not be found in SQL Server.
So, my SQL Server production tables have data only up to 1/14/09, and I'm continuing to migrate more tables from Access. I want to write an update method that can figure out what the new stuff in Access is that hasn't been reflected in SQL Server yet.
My current idea is to write a query on the SQL Server side which does SELECT MAX(RunDate) FROM ProductionRuns, to give me the latest date in that field in the table. On the Access side, I would write a query that does SELECT * FROM ProductionRuns WHERE RunDate > ?, where the parameter is the max date found in SQL Server; then I'd perform my translation step in code and insert the new data into SQL Server.
What I'm wondering is, do I have the syntax right for getting the latest date in that SQL Server table? And is there a better way to do this kind of migration of a live database?
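Concretely, the two queries would look like this (ProductionRuns and RunDate as named above):

    -- SQL Server side: latest date already migrated
    SELECT MAX(RunDate) FROM ProductionRuns;

    -- Access side, run from C# as a parameterized OleDb query:
    SELECT * FROM ProductionRuns WHERE RunDate > ?;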
Edit: What I've done is make a copy of the current live database, which I can then migrate without worrying about changes and use for testing during development; then I can migrate the latest data whenever the new database and application go live.
I personally would divide the process into two steps.
1. Create an exact copy of the Access DB in SQL Server and copy all the data into it.
2. Copy the data from this temporary SQL Server DB to your destination database.
That way you can write a set of SQL statements to accomplish the second step's task.
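For example, a sketch of one such statement, assuming the Access data was landed in a Staging schema and that the reorganized schema remaps a machine name to a key (all table and column names are hypothetical):

    INSERT INTO dbo.ProductionRuns (RunDate, MachineId, Quantity)
    SELECT s.RunDate,
           m.MachineId,        -- key lookup in the reorganized schema
           s.Quantity
    FROM   Staging.ProductionRuns AS s
    JOIN   dbo.Machines AS m
           ON m.LegacyName = s.MachineName;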
Alternatively, use SSIS.
Generally, when you convert data to a new database that will take its place in production, you shut out all users of the database for a period of time, run the migration, and turn on the new database. This ensures no changes are made to the data while doing the conversion. Of course, I never would have done this using C# either. Data migration is a database task and should be done in SSIS (or DTS if you have an older version of SQL Server).
If the database you are converting to is just in development, I would create a backup of the Access database and load the data from there, both to test the data-loading process and to get the data in so you can do the application development. Then, when it is time to do the real load, you just close down the real database to users and load from it. If you are trying to keep both in sync while you develop, well, I wouldn't do that, but if you must, make a nightly backup of the file and load it first thing in the morning using your process.
You may want to look at investing in a tool like SQL Data Compare.
I believe it has support for Access databases too, and you can download a trial.
If you are happy with your C# code but it fails because of the constraints in your destination database, you can temporarily disable them and then re-enable them after you copy the whole lot.
I am assuming that your destination database is a brand-new DB with no data, and is not used by anyone while the transfer happens.
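A sketch of the disable/re-enable step (sp_MSforeachtable is undocumented but widely used for this):

    -- Disable all foreign key and check constraints before the copy
    EXEC sp_MSforeachtable 'ALTER TABLE ? NOCHECK CONSTRAINT ALL';

    -- ... run the C# copy here ...

    -- Re-enable and re-validate the constraints afterwards
    EXEC sp_MSforeachtable 'ALTER TABLE ? WITH CHECK CHECK CONSTRAINT ALL';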
It sounds like you have two problems:
You're migrating data from one database to another.
You're changing your schema.
Doing either of these things is tricky if you are trying to migrate the data while people are using it.
The simplest approach is to migrate the data based on a static copy of the data, and to queue updates to that data from the moment you capture the static copy. I don't know how easy this is in Access, but in SQL Server or Oracle you can use the redo/transaction logs for this, or build a manual solution using triggers. The poor man's way of doing this is to create triggers on all the relevant tables that log the primary key of each record that changes. Then, after the old database is shut off, you can iterate over those keys, get those records from the old database, and put them into the new database. Just copy the whole record; if the record was deleted, delete it from the new database too.
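A minimal sketch of that poor man's change log in T-SQL, assuming a hypothetical Orders table keyed on OrderID:

    CREATE TABLE dbo.ChangedKeys (
        TableName sysname  NOT NULL,
        KeyValue  int      NOT NULL,
        ChangedAt datetime NOT NULL DEFAULT GETDATE()
    );
    GO
    CREATE TRIGGER trg_Orders_LogKey
    ON dbo.Orders
    AFTER INSERT, UPDATE, DELETE
    AS
    BEGIN
        -- Capture keys from both pseudo-tables so inserts,
        -- updates, and deletes are all logged.
        INSERT INTO dbo.ChangedKeys (TableName, KeyValue)
        SELECT 'Orders', OrderID FROM inserted
        UNION
        SELECT 'Orders', OrderID FROM deleted;
    END;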
Your problem is compounded by the fact that you can't simply copy the data; you have to transform it. This means you probably have to shut down both databases and re-migrate the records based on the change list. It will take a lot of planning to ensure you get things right, and I'd recommend writing a testing script that can validate that the resulting data is correct.
Also, I'd ensure that the code for the migration runs inside one of the databases if possible. Otherwise you are copying the data twice, and this will significantly hurt performance.