MARS backup (system state) failing - Windows Server 2012 - azure-backup-vault

MARS backup - system state backup failing
I am having issues with a few servers where the system state is not getting backed up.
I followed the article below and tried it, changing the scratch folder to a different location, but it did not make any difference.
How do I change the cache location for the MARS agent?
Run this command in an elevated command prompt to stop the Backup engine:
Net stop obengine
If you have configured System State backup, open Disk Management and unmount the disk(s) with names in the format "CBSSBVol_".
By default, the scratch folder is located at \Program Files\Microsoft Azure Recovery Services Agent\Scratch
Copy the entire \Scratch folder to a different drive that has sufficient space. Ensure the contents are copied, not moved.
Update the following registry entries with the path of the newly moved scratch folder.
Registry path: HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows Azure Backup\Config
Registry key: ScratchLocation
Value: New scratch folder location

Registry path: HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows Azure Backup\Config\CloudBackupProvider
Registry key: ScratchLocation
Value: New scratch folder location
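For example, both entries can be set from an elevated command prompt with reg.exe; a minimal sketch, where D:\Scratch is a placeholder for your actual new location:
:: Point both ScratchLocation values at the new scratch folder (D:\Scratch is a placeholder)
reg add "HKLM\SOFTWARE\Microsoft\Windows Azure Backup\Config" /v ScratchLocation /t REG_SZ /d "D:\Scratch" /f
reg add "HKLM\SOFTWARE\Microsoft\Windows Azure Backup\Config\CloudBackupProvider" /v ScratchLocation /t REG_SZ /d "D:\Scratch" /f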
Restart the Backup engine at an elevated command prompt:
Net stop obengine
Net start obengine
Run an on-demand backup. After the backup finishes successfully using the new location, you can remove the original cache folder.
Can I please have some advice? I have a few servers where the system state is not being backed up.

This is actually a failure in the Windows Server Backup operation. See this KB article for steps on how to modify the required registry keys to fix the Windows Server Backup failures.
https://support.microsoft.com/en-us/help/4053355/microsoft-azure-recovery-services-agent-system-state-backup-failure

Related

Azure Storage Emulator fails to init with "The database 'AzureStorageEmulatorDb57' does not exist"

I am having an issue with Azure Storage Emulator. I tried to re-initialise the database and got the error below.
This was after installing Visual Studio 2019 Preview, but that may just be a coincidence. I tried for an hour or so to get it running, then gave up and just reset my machine with the "keep my files" option, re-installed Visual Studio 2017 and the Azure Tools, but I still see the same problem.
I know a reset sounds a bit drastic but VS 2019 broke my Azure Functions in VS2017, they would not launch so I wanted a clean install.
If I manually create the DB with sqllocaldb create (version 13.1.4001.0), the DB gets created fine but the init still fails with the same message.
Any ideas?
C:\Program Files (x86)\Microsoft SDKs\Azure\Storage Emulator>AzureStorageEmulator.exe init
Windows Azure Storage Emulator 5.7.0.0 command line tool
Found SQL Instance (localdb)\MSSQLLocalDB.
Creating database AzureStorageEmulatorDb57 on SQL instance '(localdb)\MSSQLLocalDB'.
Cannot create database 'AzureStorageEmulatorDb57' : The database 'AzureStorageEmulatorDb57' does not exist. Supply a valid database name. To see available databases, use sys.databases..
One or more initialization actions have failed. Resolve these errors before attempting to run the storage emulator again.
Error: Cannot create database 'AzureStorageEmulatorDb57' : The database 'AzureStorageEmulatorDb57' does not exist. Supply a valid database name. To see available databases, use sys.databases..
After resetting my machine (and keeping files), I ran into this issue. For me, I was unable to run an Azure Function in Visual Studio 2019 due to an error about being unable to start the emulator.
It looks like I had the same permissions issue: (I presume) my new account after the reset did not have permission to touch the DB.
I resolved this by:
Deleting the Azure Storage Emulator DB file: %USERPROFILE%\AzureStorageEmulatorDb[number].mdf
Then running AzureStorageEmulator.exe start with admin rights
I was then able to run the Azure Function without issue.
Stop the Azure Storage Emulator if it is running.
Open SSMS and connect to your (localdb) instance.
Manually create the "AzureStorageEmulatorDb57" database.
Open a command prompt as Administrator.
Run AzureStorageEmulator.exe init.
Run your VS project.
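If you prefer to script those steps, here is a minimal sketch from an elevated command prompt in the Storage Emulator folder, assuming the default (localdb)\MSSQLLocalDB instance and the Db57 name from the error above:
AzureStorageEmulator.exe stop
:: Create the missing database manually, then let init pick it up
sqlcmd -S "(localdb)\MSSQLLocalDB" -Q "CREATE DATABASE AzureStorageEmulatorDb57"
AzureStorageEmulator.exe init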
I was running into this same issue after installing LocalDb for SQL Server 2017. These steps helped me to resolve the problem I was facing:
Open a command line in C:\Program Files (x86)\Microsoft SDKs\Azure\Storage Emulator
Run AzureStorageEmulator.exe init /forceCreate
From checking my error logs (located at %USERPROFILE%\AppData\Local\Microsoft\Microsoft SQL Server Local DB\Instances\MSSQLLocalDB), I saw
2018-12-21 15:41:13.47 spid65 CREATE FILE encountered operating system error 5(Access is denied.) while attempting to open or create the physical file 'C:\Users\{username}\AzureStorageEmulatorDb59.mdf'.
This error led me to the following post: https://dba.stackexchange.com/questions/191393/localdb-v14-creates-wrong-path-for-mdf-files
From reading the answers there, I gathered that this is a bug in SQL Server 2017. Without access to the patch, the solution that worked for me was granting Everyone access to modify C:\Users. This was only an issue on my development laptop, so I could afford to make that security change.
Alternatively, as Andrii commented, install the CU13 hotfix for SQL Server 2017. After that, AzureStorageEmulatorDb<xxx>.mdf will be created in your user directory as it should.
I had this problem and I don't know why an AzureStorageEmulatorDb57_log.ldf was still present in my %USERPROFILE% directory when I deleted my MSSQLLocalDB instance, but after dropping that file the problem went away.
I came across this issue after I changed the user login on my machine. I had created the database under my previous user account. I copied the database files to the new user account, but it gave me this error. It seems to be a permission issue.
You need to find the saved location of the .mdf and .ldf files of this database. In my case they were stored in 'C:\Users\yourUserName'.
Simply delete these files and run AzureStorageEmulator.exe init again and it will create the new mdf and ldf files for you.
After manually upgrading my MSSQL 2016 LocalDB to MSSQL 2019 following these instructions, I got the error mentioned, as I was unaware that the Azure Storage Emulator uses LocalDB internally.
To fix it, I simply had to manually re-attach the database located in %UserProfile% with the following SQL command:
CREATE DATABASE [AzureStorageEmulatorDb510]
ON (FILENAME = 'C:\Users\<username>\AzureStorageEmulatorDb510.mdf'),
(FILENAME = 'C:\Users\<username>\AzureStorageEmulatorDb510_log.ldf')
FOR ATTACH;
Worked for me:
Delete any storage/SQL database related to the Azure emulator.
Run this command in the Storage Emulator path:
AzureStorageEmulator.exe init /server .
(or your SQL instance; mine was ".")
Check that you have installed the Azure SDK with Visual Studio; if you didn't, you can add the feature.
You can locate the .mdf and .ldf files in your user profile directory. Just stop the emulator, copy those files to some other place, and delete them from the user profile directory.
Then run the emulator again and it will create new .mdf and .ldf files.
Then stop the emulator, copy the old files back, and restart the emulator. This way you won't lose any data.
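A minimal sketch of that shuffle from a command prompt, assuming Db57 file names (adjust the version number to match your files):
AzureStorageEmulator.exe stop
:: Park the old files somewhere safe and let the emulator create fresh ones
move "%USERPROFILE%\AzureStorageEmulatorDb57.mdf" "%TEMP%"
move "%USERPROFILE%\AzureStorageEmulatorDb57_log.ldf" "%TEMP%"
AzureStorageEmulator.exe start
:: Later, to bring the old data back:
AzureStorageEmulator.exe stop
move /Y "%TEMP%\AzureStorageEmulatorDb57.mdf" "%USERPROFILE%"
move /Y "%TEMP%\AzureStorageEmulatorDb57_log.ldf" "%USERPROFILE%"
AzureStorageEmulator.exe start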
I will help you with this. First of all, create a SQL Server LocalDB instance.
Then go to the Storage Emulator folder.
The Storage Emulator is installed by default to C:\Program Files (x86)\Microsoft SDKs\Azure\Storage Emulator.
Then run AzureStorageEmulator.exe init /server <your instance>.
From the docs: AzureStorageEmulator.exe init /server localhost\SQLEXPRESS01
Open SSMS and connect to your (localdb) instance.
Manually create the "AzureStorageEmulatorDb...".
To add yet another answer, I did not have any MDF or LDF files. Instead, I only had a config file at %USERPROFILE%\AppData\Local\AzureStorageEmulator\AzureStorageEmulator.5.10.config. I also could not connect to my local (localdb) instance with SSMS.
I changed the SQLInstance value in that config file to be localhost rather than (localdb)\MSSQLLocalDB, and it started working.
You should have an app called Microsoft Azure Storage Emulator.
Start this application.
If the application indicates that it is running, run AzureStorageEmulator.exe stop first; otherwise run AzureStorageEmulator.exe start directly. It should create your database automatically; at least it did for me.
https://learn.microsoft.com/en-us/azure/storage/common/storage-use-emulator
This seems to be because the mdf file already exists but LocalDB doesn't have it attached. You can delete and recreate as others have mentioned, but in my case I was able to just re-attach it and it worked fine.
Open SSMS to (localdb)\mssqllocaldb
Right click Databases
Choose Attach
Click Add
Select the existing MDF file (mine was in my user profile and named AzureStorageEmulatorDb510.mdf)
Click Ok
Then try running the emulator again.
This solution is not generally recommended, but you can try it.
I think the AzureStorageEmulator somehow cannot get full access to the LocalDB, which is set up in a directory with limited permissions.
Go to the folder's Properties > Security > Edit and grant full permission (for me the directory was under user > AppData).
Then restart the emulator from the command line.
It worked after that. Bear in mind that this is insecure, so consider reverting the permissions later.
I initialized the DB instance and it succeeded, but my SQL Server is 2017.
I then searched for a solution, and the doc said deleting the troublesome database would solve the problem. Maybe you can try following the steps in the doc.

How to restore a database from a bak file in Azure Data Studio on Mac

Previously on Mac I used SQL Operations Studio: I would click on the database, click Restore, then browse to my .bak file. But now they have changed it to Azure Data Studio, and when I repeat the same steps I get this error:
"You must enable preview features in order to use restore"
but I cannot figure out how to enable that. I have googled and tried a few things, even opened my Azure and Microsoft accounts on the website, but I do not see that option.
Can someone help please!
Go to Azure Data Studio > Settings
Edit settings by clicking on "new settings editor"
Search for "preview"
Scroll to the bottom and check "Enable unreleased preview features"
Adding this as an answer as I don't have enough rep to comment. This is in response to the question in the comment for the answer to the OP (If that's not confusing enough!)
This only applies when your sql database is hosted on a Mac/Linux/Docker Container. We don't have any Windows servers in our estate for me to test this on to see if the location of the .bak files is any different.
When you click on the "..." button, it browses to /var/opt/mssql/data on the machine (or docker container) the database is hosted on. This is not an issue if you are backing up & restoring databases on the same host; however, if you're migrating to a new server or just creating a dev/UAT/staging environment, it becomes a problem because you don't have access to /var/opt/mssql/data.
This is a bit of a sledgehammer to crack a nut type solution but as I'm working with 2 dev boxes, it doesn't make a lot of difference to me.
To make this easier to understand I'll call the server that hosts the database you have backed up ProdServ & the server you are restoring to DevServ.
On DevServ, at a terminal prompt, navigate to /var/opt and make a note of the current permissions on the mssql directory (mine were drwxrwx---).
$ cd /var/opt
$ ls -la
Google the octal value for your permissions (in my instance, it's 770)
Change the permissions of the data directory to rwxrwxrwx.
$ chmod -R 777 /var/opt/mssql/data
(You will also need to do this on ProdServ if that is also a Unix-based o/s)
Copy the .bak files from ProdServ to DevServ via a method suitable to the environment you're working in.
Windows --> Linux I'd use WinSCP
For Mac to Docker, docker cp <fileToCopy> <container>:<destinationPath> works perfectly fine.
Once the files have been copied over, they will magically appear when you click the "..." button in Azure Data Studio again. Make sure you change the directory permissions back to their original value via the same command. So in my instance, simply
$ chmod -R 770 /var/opt/mssql/data
As an extra note, if you're used to working in SSMS, the wizard there allows you to create a database from a .bak file; from what I can see, Azure Data Studio does not. You have to create the database first (CREATE DATABASE <databasename>) in a query window, then restore the .bak file to it.
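A minimal sketch of that two-step flow through sqlcmd (the same statements can be pasted into an Azure Data Studio query window); the database name, sa login, and .bak path are placeholders:
sqlcmd -S localhost -U sa -Q "CREATE DATABASE [thedb]"
sqlcmd -S localhost -U sa -Q "RESTORE DATABASE [thedb] FROM DISK = '/var/opt/mssql/data/thedb.bak' WITH REPLACE"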
Simply add this to your settings file:
"workbench.enablePreviewFeatures": true
You can click on View at the top menu next to Window.
Then select the first option Command Palette
Then in the command textbox type Restore and select Restore
Then the restore window comes up.

Robocopy fails with security error copying "from nas to nas". Why?

tl;dr
robocopy has security problems copying from 'nas to nas'
The system detected a possible attempt to compromise security. Please
ensure that you can contact the server that authenticated you.
Summary
I'm running into "Windows permission problems" when making backups using the following:
powershell
robocopy
Windows 2008R2
Windows task scheduler
Task Scheduler output
Task Scheduler runs under the domain user account "OPS\backupuser"
The script succeeds when it copies "from local drive" "to the backup nas"
However it fails when the script copies "from another nas" "to the backup nas"
In pictures...
Success: local drive --copy-to--> backup NAS
Fails: another NAS --copy-to--> backup NAS
Output
Robocopy fails with exit code 16.
Here is detailed output:
-------------------------------------------------------------------------------
ROBOCOPY :: Robust File Copy for Windows
-------------------------------------------------------------------------------
Started : Thu Jul 07 22:22:11 2016
2016/07/07 22:22:26 ERROR 1265 (0x000004F1) Getting File System Type of Source \\app-data-nas.hosting.acme\bazapp$\production\foo_industries_prod\
The system detected a possible attempt to compromise security. Please ensure that you can contact the server that authenticated you.
Source - \\app-data-nas.hosting.acme\bazapp$\production\foo_industries_prod\
Dest : \\dr-backup-nas\AppDR$\ALL_DR\FOO_INDUSTRIES_DR\foo_industries_prod\
Files : *.*
Options : *.* /NDL /S /E /COPY:DT /PURGE /MIR /B /NP /R:0 /W:1
------------------------------------------------------------------------------
Other points
1) Because the environment is 'locked down', I could not run this from the command line, either as:
my own account
my own account with elevated command prompt
OPS\backupuser
2) I tried adding '/NODCOPY' , but robocopy failed; apparently we don't have the hotfix for this option.
Thanks in advance!
NAS drives are not Windows drives. You have to map to them uniquely and with certain admin privileges to make them "see" you're trying to copy stuff into them.
First map a drive to the NAS system using NET USE
Next copy the file(s) using ROBOCOPY thus...
robocopy <source path> <nas path> <file(s)> /s /j /r:2 /w:5 /log+:robocopy.log
Place it all inside a CMD file and run it from a Task Scheduler on your Windows server.
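Putting that together, a minimal CMD sketch using the paths from the output in the question (the drive letter and credentials are placeholders):
:: Map the backup NAS with explicit credentials, then mirror into it
net use Z: \\dr-backup-nas\AppDR$ /user:OPS\backupuser *
robocopy \\app-data-nas.hosting.acme\bazapp$\production\foo_industries_prod Z:\ALL_DR\FOO_INDUSTRIES_DR\foo_industries_prod *.* /s /j /r:2 /w:5 /log+:robocopy.log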
The answer above forgot:
/FFT :: assume FAT file times (2-second granularity); makes the copy tolerant of file-system timestamp differences in ROBOCOPY.
/Z :: restartable large-file copy; restarts large file copying where the copy left off in the file, instead of starting over again. For a 500 GB file it restarts at the byte the copy stopped at, in case you need to schedule offline copying and don't want large files to prevent the copy progressing (it will ONLY start over if the file date changed!).
/XO :: exclude older files; copies all newer files, which is useful for retrying copies in a scheduled Robocopy job.
All of these are useful when copying to or from a NAS, as NAS devices tend to have issues that these ROBOCOPY switches resolve.
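Combined with the command quoted above, that gives something like (the paths remain placeholders):
robocopy <source path> <nas path> <file(s)> /s /j /r:2 /w:5 /fft /z /xo /log+:robocopy.log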

Backup individual Raven database

In an old version of Raven (r888) I had an individual database backed up with the following command
"C:\RavenDB\Server\Raven.Backup.exe" --url=http://localhost:8089/databases/Production --dest=C:\temp\raven\production
This would place the backup of the Production database into the destination directory.
On the latest unstable version, after upgrading, the command no longer executes and an error is returned
The system cannot find the path specified.
The docs mention being able to backup the entire server but there is no mention of how to isolate this to a single database?
Could it be that your path to the executable is wrong? (Replace Server with Backup)
"C:\RavenDB\Backup\Raven.Backup.exe"
Actually, we always back up a single DB.
To execute against a single database, you use the URL http://localhost:8089/databases/Production (note the /databases/Production there).
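Putting both answers together, the corrected command would presumably look like this (assuming the Backup folder path from the first answer):
"C:\RavenDB\Backup\Raven.Backup.exe" --url=http://localhost:8089/databases/Production --dest=C:\temp\raven\production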

SQL Server 2005 backup restore failing (with folder permissions)

I am trying to restore a database (from file thedb.bak). I am using SQL Server Express edition 2005 on a Windows 7 Ultimate 64-bit machine.
When I try to restore I get the following error:
System.Data.SqlClient.SqlError: The operating system returned the error '5(Access is denied.)' while attempting 'RestoreContainer::ValidateTargetForCreation' on 'C:\Program Files (x86)\Microsoft SQL Server\MSSQL.2\MSSQL\Thedb.MDF'. (Microsoft.SqlServer.Express.Smo)
My username (antoniocs) is an Administrator. I have edited the permissions in the folder (C:\Program Files (x86)\Microsoft SQL Server\MSSQL.2\MSSQL\) so that the user AntonioCS has full control.
I really need to restore this backup. What am I missing?
Note: I am using Windows authentication to log in. Should I try another user (the one I use is an administrator on the machine)?
The account running the SQL Server service requires permissions on that folder.
You may be connected to SQL Server, but actions are done in the service account context: not you.
Run services.msc from the command line, see which account is being used, and grant that account permissions on the folder accordingly.
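For example, a sketch from an elevated prompt, assuming the default SQL Express service name MSSQL$SQLEXPRESS and a NETWORK SERVICE account (check services.msc for your actual instance and account):
:: Show which account the SQL Server service runs as
sc qc "MSSQL$SQLEXPRESS"
:: Grant that account (NETWORK SERVICE here is an assumption) full control of the data folder
icacls "C:\Program Files (x86)\Microsoft SQL Server\MSSQL.2\MSSQL" /grant "NT AUTHORITY\NETWORK SERVICE":(OI)(CI)F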
Do you have a database that already uses Thedb.MDF? I ran into this error when I tried to restore a database over a file that SQL Server was using. From the restore window, choose Options and change the path or the filename under "Restore the database files as".
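If you'd rather script that than use the restore window, a sketch with sqlcmd; the instance name, .bak path, and logical file names ('Thedb' and 'Thedb_log') are assumptions, so check the names first with RESTORE FILELISTONLY:
sqlcmd -S .\SQLEXPRESS -E -Q "RESTORE DATABASE [Thedb] FROM DISK = 'C:\backups\thedb.bak' WITH MOVE 'Thedb' TO 'C:\Program Files (x86)\Microsoft SQL Server\MSSQL.2\MSSQL\Thedb2.mdf', MOVE 'Thedb_log' TO 'C:\Program Files (x86)\Microsoft SQL Server\MSSQL.2\MSSQL\Thedb2_log.ldf'"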