Error when exporting SQL database from Azure to local

I am trying to export a SQL database in Azure to my local machine using SSMS in order to create a local backup. I used 'Export Data-tier Application' to create the backup, but I am getting the error given below.
How can I get rid of this?

Just from the error message, you don't have permission to alter/access the database.
If you want to use Export Data-tier Application to create the backup bacpac file, you must use the Azure SQL Database server admin account; otherwise you won't have enough permission.
Please try the operation again using the server admin account.
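If you are not sure what rights the account you are connected as actually has, here is a quick sanity check you can run in the target database (a minimal sketch; SUSER_SNAME and IS_ROLEMEMBER are standard T-SQL):

SELECT SUSER_SNAME() AS current_login,
       IS_ROLEMEMBER('db_owner') AS is_db_owner; -- 1 means db_owner rights, which is plenty for the export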
Hope this helps.

UPDATE to Azure SQL table fails from on-prem SQL Server using linked server

I have an on-premises SQL Server database that is the backend for our project management software, an Azure SQL table that contains limited data used for reporting with Power BI, and a linked server to connect the two. Both databases have a dedicated user/pass account just for this, which is stored in the linked server. Here's the problem:
When I run a SQL Server Agent job to update the Azure table from the on-prem table using the linked server, everything works fine.
When I manually run the SQL UPDATE statement from an open window in SSMS to do the same, everything works fine.
When I use a workflow in the project management software to trigger a stored procedure that executes the same code (update Azure from the on-prem database), I get the following error:
The OLE DB provider "SQLNCLI11" for linked server "LinkedServerName" reported an error. One or more arguments were reported invalid by the provider.
The operation could not be performed because OLE DB provider "SQLNCLI11" for linked server "LinkedServerName" was unable to begin a distributed transaction.
OLE DB provider "SQLNCLI11" for linked server "LinkedServerName" returned message "The parameter is incorrect.". Error occurred in: STORED_PROCEDURE_NAME
Error occurred on line 23
There's nothing on line 23, and as I mentioned earlier, the same UPDATE statement works if I run it manually and also if a SQL Server Agent job runs it. Why does it fail when the code is executed by the project management software? Does anyone have experience with this?
This is the code that inserts the data from on-prem into Azure:
INSERT INTO [LinkedServerName].DatabaseName.SchemaName.TableName ([ProjectNumber], [CreateDate], [SyncDate])
SELECT [ProjectNumber], [CreateDate], GETDATE() FROM dbo.SourceTable -- SELECT clause truncated in the question; source table name is illustrative
I'm not sure about this with Azure, but I had a similar issue with a remote server and had to disable promotion of distributed transactions. It might not be the best thing to do in a production environment, so read up carefully about the implications of doing this.
I'm only suggesting this to narrow down what the real issue is.
Change this setting and test.
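The setting can be changed per linked server from T-SQL; a minimal sketch, assuming the linked server really is named LinkedServerName as in the error text:

EXEC master.dbo.sp_serveroption
    @server = N'LinkedServerName',
    @optname = N'remote proc transaction promotion',
    @optvalue = N'false'; -- stops remote calls from being promoted to MSDTC distributed transactions

Set it back to 'true' once you have narrowed down the issue.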
I ended up taking a different strategy. We know that using a scheduled SQL Agent job to insert data into Azure works; it just wouldn't work in any script run by our software and the user it uses to access the on-prem database. So I created a stored procedure in the on-prem database that the software executes through a built-in workflow. The SP saves the data to a staging table and then starts the SQL Agent job, which reads from the staging table and inserts the data into the Azure table. A minimal sketch of the pattern follows.
Everything worked in the testing environment, but when I replicated all the scripts into production I got a permissions error. After a lot of research and testing adjustments to the user, I got it to work by assigning the roles TargetServersRole and db_ddladmin to the user in the msdb database.
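Here is the sketch (all object and job names are hypothetical):

CREATE PROCEDURE dbo.QueueAzureSync
AS
BEGIN
    -- stage the rows locally; the Agent job picks them up and pushes them to Azure
    INSERT INTO dbo.AzureSyncStaging ([ProjectNumber], [CreateDate], [SyncDate])
    SELECT [ProjectNumber], [CreateDate], GETDATE()
    FROM dbo.SourceTable;

    -- kick off the Agent job that performs the linked-server insert
    EXEC msdb.dbo.sp_start_job @job_name = N'SyncToAzure';
END;

Starting the job is the part that requires the msdb roles mentioned above.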
[SSMS screenshot of the msdb role assignment]
Below are the two articles that led me to this conclusion:
Article 1
Article 2

Error while creating backup of SQL Azure with Data Tier Application Wizard

I am trying to create a backup of a SQL Azure database with the Data-tier Application wizard of SQL Server Management Studio, but I get a lot of errors like the following:
One or more unsupported elements were found in the schema used as part of a data package. Error SQL71501....
Any hint on how to solve this error?
I found something on the Microsoft Tech Community, posted by the Azure DB support team, which may help you:
https://techcommunity.microsoft.com/t5/azure-database-support-blog/exporting-a-database-that-is-was-used-as-sql-data-sync-metadata/ba-p/369062
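That post covers databases that were used for SQL Data Sync, where leftover sync metadata blocks the export. A hedged sketch for spotting those leftovers (the DataSync schema name is per the article; verify carefully before dropping anything):

SELECT s.name AS schema_name, o.name AS object_name, o.type_desc
FROM sys.objects AS o
JOIN sys.schemas AS s ON s.schema_id = o.schema_id
WHERE s.name = N'DataSync'; -- objects here belong to Data Sync, not your own schema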
You can also consider Export to a BACPAC file - Azure SQL Database and Azure SQL Managed Instance.

How to upload BACPAC file to Blob in Azure

So frustrating... Microsoft does not offer help and you have to figure out every small thing in the portal...
My question is: I am trying to import my DB to Azure. I created a BACPAC file from the .bak, and when I try to import, there's the 'Storage' setting; when I click on 'Configure required settings' it opens the 'Storage Accounts' tab but gives the message 'Not found'.
Here's the thing: I created a storage account, a container, and a blob - it just doesn't find them. So annoying. And the Azure portal doesn't say how to create them or how to upload the file.
Also, when I create the storage account and the blob, there's no option to upload the BACPAC file. How do I do it? What's going on with Microsoft?? So unclear...
To summarize:
How do I create a storage account/container/blob so that the import will see it?
How do I upload the BACPAC to Azure Blob Storage? I couldn't find a way to do that in the portal.
Thanks!!
Not 100% following your example, but if you have a later version of SQL Server Management Studio, you can right-click the database you want to deploy and click Tasks / Export Data-tier Application. In there you can connect to a storage account: when you press Connect, you type the name of the account from the portal and provide a key. You can then export to storage, and if you connect to the Azure instance, you can right-click the Databases level and choose Import Data-tier Application. If you need to create a storage account in the first place, you do that in the portal, but it sounds like you already have one.
If you want to browse your storage and drop a file in directly, I use Azure Storage Explorer. There are various tools out there, some free, some not. You could of course code your own interface, as the APIs are published.
View the portal as the administration side of your subscription. When you want to use the services (not configure them), you'll need to look at the toolsets.
Azure Storage Explorer makes navigating and uploading/downloading files in storage accounts super simple. Visual Studio works well too, and for DB workloads SQL Server Management Studio has Azure integration. If all else fails, PowerShell gives you the finest level of control.
1 - Create an Azure storage account
2 - Create a blob container in the new storage account
3 - Access the container and upload your bacpac file
4 - Access the SQL server, go to Import Database, and use the 'Select the backup' link to point at the bacpac file in the blob container that you want to use to create the database

MSSQL database on external hard drive shows Recovery Pending

I have created a database in SQL Server 2012 with the mdf and ldf pointing to an external hard drive attached to my machine. I created tables, stored procedures, populated tables, etc.
I removed the hard drive at the end of the day.
Today, when I attached the hard drive and tried to access the DB in Management Studio, I see the name of the database with (Recovery Pending).
What does this mean? I see the mdf and ldf files in the D drive.
What worked for me was to take the database offline, then bring it back online - no RESTORE DATABASE was necessary in this case, as far as I can tell.
In SQL Server Management Studio:
right-click on the database
select Tasks / Take Offline ... breathe deeply, cross fingers...
right-click on the database again
select Tasks / Bring Online
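The same toggle in T-SQL, if you prefer a query window ([xxx] is a placeholder for your database name):

ALTER DATABASE [xxx] SET OFFLINE WITH ROLLBACK IMMEDIATE; -- drops open connections
ALTER DATABASE [xxx] SET ONLINE;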
When you removed the drive, you forcefully disconnected the database from the SQL Server service. SQL Server does not like that.
SQL Server is designed by default so that any database created is automatically kept open until either the computer shuts down, or the SQL Server service is stopped. Prior to removing the drive, you should have "Detached" the database, or stopped the SQL Server service.
You "may" be able to get the database running by executing the following command in a query window: RESTORE DATABASE [xxx] WITH RECOVERY;
You could, although I would not normally recommend this, alter the database to automatically close after there are no active connections.
To accomplish this, you would execute the following query:
ALTER DATABASE [xxx] SET AUTO_CLOSE ON WITH NO_WAIT;
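AUTO_CLOSE has a performance cost on frequently used databases, so you may want to revert it afterwards:

ALTER DATABASE [xxx] SET AUTO_CLOSE OFF WITH NO_WAIT;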
Another approach that works is to restart the Database Engine. If feasible and/or practical for this server, it may be faster when you have several DBs on the external drive.
In SQL Server Management Studio:
Attach the external drive
right-click on the database engine: Server Name (SQL Server 12.0.2000 ... etc.)
Select "Restart"
Answer Yes when asked if you want to proceed
Below worked for me:
Run SQL Management Studio as Administrator (right-click on the SQL Management Studio icon and select 'Run As')
Take database offline
Detach the database (using the Drop Connections option)
Attach the database
If you were using this database with a web app running on IIS, then you may need to restart the IIS server
Hope this helps someone
If SQL Server knows that recovery needs to be run for a database but something is preventing it from starting, the server marks the database as being in the 'Recovery Pending' state. This is different from the SUSPECT state, because it cannot be said that recovery is going to fail - it just hasn't started yet.
Check this thread: How to fix Recovery Pending State in SQL Server Database?
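For completeness, the emergency-repair sequence commonly suggested for this state - strictly a last resort, since REPAIR_ALLOW_DATA_LOSS can discard data; take a file-level copy of the mdf/ldf first ([xxx] is a placeholder):

ALTER DATABASE [xxx] SET EMERGENCY;    -- limited read-only access for diagnosis
ALTER DATABASE [xxx] SET SINGLE_USER;
DBCC CHECKDB ([xxx], REPAIR_ALLOW_DATA_LOSS); -- may lose data
ALTER DATABASE [xxx] SET MULTI_USER;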

Download an entire SQL Azure database as single file

Is there a facility in Azure to get a copy of the database? Or rather, to detach the mdf and get it as a file? On occasion I create a database in the cloud, it's up for a while, and then I want to take it down and archive it. My current routine copies the database using SQL Azure Migration Wizard to a local Express instance, which I then detach and put in a safe place.
EDIT
Interestingly my method of choice throws an exception this time around. So it's far from ideal.
There is another way to do it:
"C:\Program Files (x86)\Microsoft SQL Server\110\DAC\bin\SqlPackage.exe" /Action:Export /SourceServerName:"tcp:xxxxx.database.windows.net,1433" /SourceDatabaseName:DbName /SourceUser:user /SourcePassword:password /TargetFile:C:\backups\backup.bacpac
From SQL Server Management Studio 2014, connect to the destination database server, right click its Databases node, and click Import Data-tier Application. In the import wizard, select the option to import from Windows Azure.
Using SQL Server Management Studio (I use 2012):
Create a new local database.
Right-click on the database -> Tasks -> Import Data
Then select the Azure database as the data source and your new local database as the data destination.
At that point, you can create a SQL Server backup or generate a SQL script file to get a local copy as of that moment.
I created the Enzo Backup for SQL Azure utility for that very reason. You can create a full backup and get your hands on a file that you can restore later to either another SQL Azure database or a SQL Server database.
Note that SQL Azure will offer a form of backup, cloud-only, in the future. That's another good option. Finally, Red Gate has a product to copy a SQL Azure database to a local SQL Server database, but I am not sure that it gives you a "backup file" per se.
There is a RedGate tool that will backup your database to a local server http://www.red-gate.com/products/dba/sql-azure-backup/
I have found this useful before I do any database upgrades, in case bad stuff happens.
Since I asked this question, the Azure management console has added an option to export the entire database to blob storage. You can keep your backups there, but if you prefer a hard copy there are many blob explorer tools, such as this one.
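One caveat: a bacpac export is not guaranteed to be transactionally consistent if the database is being written to during the export. A common workaround is to copy the database first and export the copy; run this in the master database of the Azure SQL server (names are placeholders):

CREATE DATABASE [MyDb_archive] AS COPY OF [MyDb]; -- consistent snapshot to export, then drop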
A good free option I've been using for the past few years is SQL Backup and FTP. It's 100% free if you back up to your local laptop.
External storage requires a paid license, though.
From the website:
No more multi-step SSMS configuration, just a single form to automate backups: select databases, backup (full, diff, tran log), encrypt, compress, send to a folder, FTP or cloud service; schedule backups, receive confirmation emails and restore when needed.