By mistake I have updated all the rows of a table with the same data for certain columns (I missed the WHERE clause in the statement). Are there any free tools that can help me restore the old data for these rows?
If the recovery model is full, then it is possible to go back in time. Other than that, I don't think it's possible to recover that data, unless you back up your data regularly.
See How to: Restore to a Point in Time (SQL Server Management Studio)
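For example, if you have a full backup plus log backups covering the moment just before the mistake, a point-in-time restore looks roughly like this (the database name, file paths, and STOPAT time are placeholders; restoring to a copy first is safer than overwriting the live database):

-- Restore the last full backup without recovering, then roll the log forward to just before the bad UPDATE.
RESTORE DATABASE MyDb FROM DISK = N'C:\Backups\MyDb_full.bak' WITH NORECOVERY, REPLACE;
RESTORE LOG MyDb FROM DISK = N'C:\Backups\MyDb_log.trn' WITH STOPAT = '2012-06-01 09:30:00', RECOVERY;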
I've been tasked with migrating data from an instance of SQL Server 2000 to 2019. There are a total of four databases to bring over, three of which I was able to backup/restore into 2008 and then into 2019 without any issues. Please note: I am not a DBA in any sense, though I'm the closest thing to one on hand.
The fourth and final database presented the following error that prevented moving from 2008 to 2019:
System.Data.SqlClient.SqlError: An error occurred while processing the log for database 'DbNameHere'. The log block version 2 is unsupported. This server supports log version 3 to 6. (Microsoft.SqlServer.SmoExtended)
Is there a simple fix for this problem that I'm missing in the various SSMS menus?
Alternatively, is there a way to copy raw data from one server to another via, for instance, a flat file, and preserve the identity columns as identity columns? That is, I don't want to just strip that column and bulk insert, as they are often used as foreign keys in other tables, and with twenty-some-odd years of data, something is bound to break in doing this.
An example of an ideal final result in this solution would be something like: legacy table X has 1000 rows, the last of which has an identity column value of 1000. Once the move is complete, new table X has 1000 rows, the last of which has an identity column value of 1000, and upon insert the next row automatically increments to 1001.
Apart from unsuccessfully messing around with flat files, I've also tried the "Copy Database" option in SSMS, which also failed.
I would attempt to get SQL Server to rebuild the transaction log. Based on the error message, that might sort out the situation.
You first use sp_detach_db to detach the database. The ldf file is then very likely not needed for the subsequent attach, and rebuilding the log this way may sort out the situation.
Then you attach the database without the ldf file, using CREATE DATABASE with either the FOR ATTACH or the FOR ATTACH_REBUILD_LOG option.
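A rough sketch of those two steps (the database name and file path are placeholders):

USE master;
-- Detach the database; the ldf file is then no longer needed.
EXEC sp_detach_db @dbname = N'DbNameHere';
-- Attach using only the data file and let SQL Server build a fresh log.
CREATE DATABASE DbNameHere
ON (FILENAME = N'D:\Data\DbNameHere.mdf')
FOR ATTACH_REBUILD_LOG;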
I would do this on the 2008 instance, since from what I understand you got the database in there successfully. But feel free to experiment with which version (2000 or 2008) you do the detach on and which version (2000, 2008, 2019) you do the attach on.
I was meant to update a single column of a record in the database, but I forgot to specify the ID and every single record has now been updated! Is there a way I can roll back the data? Please help!
I was meant to run the following statement:
UPDATE Table SET Cusname = 'some name' WHERE id = 2;
But I actually ran the following:
UPDATE Table SET Cusname = 'some name'
Now every single Cusname value has the same name. Please help!
There's not much you can do... this is why you have a good backup strategy in place (and, ideally, never run ad hoc T-SQL against a production database without testing it in a test database first and then copying/pasting the tested statements, to help avoid these types of errors in the future).
Pulling info from the comments, you can start off by doing something along these lines: Can I rollback a transaction I've already committed? (data loss)
(this is for PostgreSQL, but the idea is the same... stop the server, backup all relevant files, etc).
If the database uses the full recovery model and you take log backups, you can attempt a point-in-time restore, but this must have been set up prior to your error. See here: Reverse changes from transaction log in SQL Server 2008 R2?
Your best bet in this case may be to spend some time resolving this without restoring. It looks like you updated your customer names. Do you have another source for customer information? Can you compile an external list of customers and, say, addresses, so you can match on those to reset your database's customer names? If so, that might be a much easier route, getting you most of the way to a full recovery of your bonked field. If you don't have a ton of data, you can do this and fill the rest in manually. It's a process, but without a good backup of the database to revert to, it may very well be worth looking at...
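If you can get such a list loaded into the database, a hypothetical sketch might look like this (the staging table CustomerImport and its columns are assumptions; the target uses the Table/id/Cusname layout from your question):

-- Load the external customer list into a staging table first (import wizard, BULK INSERT, etc.).
CREATE TABLE dbo.CustomerImport (id int PRIMARY KEY, Cusname nvarchar(100));
-- Then reset the overwritten names by matching on the key.
UPDATE t
SET t.Cusname = s.Cusname
FROM dbo.[Table] AS t
INNER JOIN dbo.CustomerImport AS s ON t.id = s.id;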
Also note that if the database is hosted on a cloud-based SaaS/PaaS/IaaS offering, you may have deeper-level backups available, or in the case of "SQL Database" (e.g., SQL Azure), point-in-time backups are set up out of the box even for the lowest service plans.
I am currently searching a query that can reform data back to its original form.
For example if I do the following query:
UPDATE Pupil
SET Telephone = '999'
WHERE Telephone = '0161'
I have run this query and realized I do not wish to change the telephone numbers and want them as they were before. I understand that using views and copying the table to test queries is useful.
But I am wondering if there is actually a query to undo an UPDATE or DELETE query I have made.
Note that other data might have already contained 999, which you would want left as it was; otherwise you could just revert all 999 values back to 0161 by inverting your original query's values.
If the database uses full logging and you ran the update inside a transaction that is still open, it might be possible to roll back just that one transaction... Certainly, if you restore an old backup, you can re-run the transactional changes made since that backup.
Otherwise you may have to restore a backup to a copy database, find the matching records, and update back to the old values from that copy... Restoring to the same server as, for example, MyDatabaseName_Old means you can join across the databases to get the old record, e.g.:
UPDATE p
SET p.Telephone = pold.Telephone
FROM MyDatabaseName.dbo.Pupil AS p
INNER JOIN MyDatabaseName_Old.dbo.Pupil AS pold ON p.PupilID = pold.PupilID
WHERE pold.Telephone = '0161'
Sorry I can't be more help. Hope it gives you some hints for what else you might want to search for.
You can do it only by means of your backups. Use a full backup together with transaction log backups. Then you'll be able to restore your database to the needed state. Otherwise there is no way to restore updated rows in SQL Server. I've heard that Oracle has a feature to query the state of the database as it was some time ago. I hope SQL Server will follow them and develop such a feature too.
Since you are using SQL Server 2008 R2, you might be interested in Change Data Capture. It wouldn't be an easy Ctrl-Z, but it could help.
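Note that CDC only captures changes made after it is enabled, so it can't recover this incident, but for the future, enabling it looks roughly like this (the database and table names are placeholders):

USE MyDatabase;
-- Enable CDC for the database, then for each table you want tracked.
EXEC sys.sp_cdc_enable_db;
EXEC sys.sp_cdc_enable_table
    @source_schema = N'dbo',
    @source_name = N'Pupil',
    @role_name = NULL;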
I am trying to export the tables [around 40] and stored procedures [around 120+] in SQL Server 2008 R2 from the dev server to the prod server.
I have created a .sql file [by right-clicking the database in SSMS and choosing Tasks -> Generate Scripts], but when I try to import the tables and stored procedures into the prod server [right-clicking the database in SSMS, opening a New Query window and pasting the content in], it gives me a long list of errors, mostly:
There is already an object named 'tblMyTable' in the database
Violation of PRIMARY KEY constraint 'PK_MyTable'. Cannot insert duplicate key in object 'dbo.tblMyTable'
Any idea what I am doing wrong or what should be done? Thanks in advance.
The problem with your current technique is that it assumes your target is an empty database. The script recreates every object and reinserts every row with no attempt at merging, which is what causes the duplicate objects and duplicate primary keys. If you use Management Studio alone, you have to do all the merging of schema and data yourself.
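One crude way to make a generated script re-runnable (it still doesn't merge data, so the compare tools below are the better route) is to guard each object with an existence check, e.g. for the table from your error messages:

-- Hypothetical guard; the DROP is destructive to any existing prod rows.
IF OBJECT_ID(N'dbo.tblMyTable', N'U') IS NOT NULL
    DROP TABLE dbo.tblMyTable;
-- The generated CREATE TABLE and INSERT statements for dbo.tblMyTable would follow here.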
My recommendation is first to look into Redgate. It's not free, but the time it will save you will make it worth it. You will need to use both SQL Compare and SQL Data Compare (http://www.red-gate.com/products/sql-development/sql-data-compare/).
Another alternative is to use Visual Studio 2010 Premium, if you have it (http://msdn.microsoft.com/en-us/library/aa833435.aspx and http://msdn.microsoft.com/en-us/library/dd193261.aspx). This gives both a data compare and a schema compare option. It is not as good as Redgate, but I found it works most of the time.
If you are looking for free alternatives, check out this Stack Overflow post: https://stackoverflow.com/questions/377388/are-there-any-free-alternatives-to-red-gates-tools-like-sql-compare.
If you are importing the whole database to production, you might as well do a restore WITH REPLACE over the production database.
120 stored procedures and 40 tables sound like the whole database, so a restore WITH REPLACE should do it.
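A minimal sketch of that, assuming a fresh full backup of the dev database is copied to the prod server (database names, paths, and logical file names are placeholders):

-- On dev: take a full backup, then copy the .bak file to the prod server.
BACKUP DATABASE DevDb TO DISK = N'C:\Backups\DevDb.bak';
-- On prod: overwrite the existing production database with the dev copy.
RESTORE DATABASE ProdDb
FROM DISK = N'C:\Backups\DevDb.bak'
WITH REPLACE,
     MOVE N'DevDb' TO N'D:\Data\ProdDb.mdf',
     MOVE N'DevDb_log' TO N'D:\Data\ProdDb_log.ldf';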
I have a rather large (many gigabytes) table of data in SQL Server that I wish to move to a table in another database on the same server.
The tables are the same layout.
What would be the most efficient way of going about doing this?
This is a one off operation so no automation is required.
Many thanks.
If it is a one-off operation, why care about top efficiency so much?
SELECT * INTO OtherDatabase..NewTable FROM ThisDatabase..OldTable
or
INSERT INTO OtherDatabase..NewTable
SELECT * FROM ThisDatabase..OldTable
...and let it run overnight. I would dare say that using SELECT INTO / INSERT ... SELECT on the same server is not far from the best efficiency you can get anyway.
Or you could use the "SQL Import and Export Wizard" found under "Management" in Microsoft SQL Server Management Studio.
I'd go with Tomalak's answer.
You might want to temporarily put your target database into bulk-logged recovery mode before executing a 'select into' to stop the log file exploding...
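Something along these lines, assuming the target database normally runs in the full recovery model (database and table names and paths are placeholders):

-- Switch to bulk-logged so the SELECT INTO is minimally logged.
ALTER DATABASE OtherDatabase SET RECOVERY BULK_LOGGED;

SELECT * INTO OtherDatabase..NewTable FROM ThisDatabase..OldTable;

-- Switch back and take a log backup to restore the point-in-time restore chain.
ALTER DATABASE OtherDatabase SET RECOVERY FULL;
BACKUP LOG OtherDatabase TO DISK = N'C:\Backups\OtherDatabase_log.trn';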
If it's SQL Server 7 or 2000 look at Data Transformation Services (DTS). For SQL 2005 and 2008 look at SQL Server Integration Services (SSIS)
Definitely put the target DB into bulk-logged mode. This will minimally log the operation and speed it up.