Schedule a SQL query to move data from one table to another with Azure SQL DB

I have a simple query that takes old data from a table and inserts the data into another table for archiving.
DELETE FROM Events
OUTPUT DELETED.*
INTO ArchiveEvents
WHERE GETDATE()-90 > Events.event_time
I want this query to run daily.
As I currently understand, there is no SQL Server Agent when using Azure SQL Database, so SQL Server Agent does not seem like the solution here.
What is the easiest/best solution to this using Azure SQL Database?

There are multiple ways to run automated scripts on Azure SQL Database, including:
Using Automation Account runbooks
Using Elastic Database Jobs in Azure
Using Azure Data Factory
As you are running just one script, I would suggest taking a look at Automation Account runbooks. As an example, here is a PowerShell runbook to execute the statement.
$database = @{
    'ServerInstance' = 'servername.database.windows.net'
    'Database' = 'databasename'
    'Username' = 'uname'
    'Password' = 'password'
    'Query' = 'DELETE FROM Events OUTPUT DELETED.* INTO ArchiveEvents WHERE GETDATE()-90 > Events.event_time'
}
# Splat the connection settings and query into Invoke-Sqlcmd
Invoke-Sqlcmd @database
Then it can be linked to a schedule in the Automation account so it runs daily.
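For example, a minimal sketch with the AzureRM Automation cmdlets (the resource group, Automation account, and runbook names are hypothetical placeholders):
# All names below are hypothetical placeholders.
$params = @{
    'ResourceGroupName' = 'rg'
    'AutomationAccountName' = 'myAutomationAccount'
}
# Create a schedule that fires once a day.
New-AzureRmAutomationSchedule @params -Name 'DailyArchive' `
    -StartTime (Get-Date).AddHours(1) -DayInterval 1
# Attach the runbook to the schedule.
Register-AzureRmAutomationScheduledRunbook @params `
    -RunbookName 'ArchiveEventsRunbook' -ScheduleName 'DailyArchive'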

You asked in part for a comparison of Elastic Jobs to runbooks. Elastic Jobs will also run a pre-determined SQL script against a target set of servers/databases.
- Elastic Jobs were built internally for Azure SQL by Azure SQL engineers, so the technology is supported at the same level as Azure SQL.
- Elastic Jobs can be defined and managed entirely through PowerShell scripts. However, they also support setup/configuration through T-SQL (see the sketch below).
- Elastic Jobs are handy if you want to target many databases: you set up the job one time, set the targets, and it will run everywhere at once. If you have many databases on a given server that would be good targets, you only need to specify the target server, and all of the databases on the server are automatically targeted.
- If you are adding/removing databases from a given server and want the job to adjust dynamically to this change, Elastic Jobs is designed to do this seamlessly. You just have to configure the job against the server, and every time it runs it will target all (non-excluded) databases on the server.
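As a rough illustration of the T-SQL route, here is a minimal sketch, run in the Elastic Job agent's job database; the group, credential, job, and server names are all hypothetical, and the two database-scoped credentials are assumed to already exist there:
-- All names below are hypothetical placeholders.
EXEC jobs.sp_add_target_group @target_group_name = 'ServerGroup';

-- Targeting the server makes every (non-excluded) database on it a target.
EXEC jobs.sp_add_target_group_member
    @target_group_name = 'ServerGroup',
    @target_type = 'SqlServer',
    @refresh_credential_name = 'RefreshCred',  -- assumed to exist
    @server_name = 'servername.database.windows.net';

-- Define a job that runs once a day.
EXEC jobs.sp_add_job
    @job_name = 'ArchiveEvents',
    @description = 'Archive events older than 90 days',
    @schedule_interval_type = 'Days',
    @schedule_interval_count = 1;

-- The step carries the actual T-SQL to run against every target.
EXEC jobs.sp_add_jobstep
    @job_name = 'ArchiveEvents',
    @step_name = 'archive',
    @command = N'DELETE FROM Events OUTPUT DELETED.* INTO ArchiveEvents WHERE GETDATE()-90 > Events.event_time',
    @credential_name = 'JobCred',  -- assumed to exist
    @target_group_name = 'ServerGroup';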
For reference, I am a Microsoft Employee who works in this space.
I have written a walkthrough and fuller explanation of Elastic Jobs in a blog series. Here is a link to the entry point of the series: https://techcommunity.microsoft.com/t5/azure-sql/elastic-jobs-in-azure-sql-database-what-and-why/ba-p/1177902

You can use Azure Data Factory: create a pipeline that executes the SQL query and add a trigger to run it every day. Azure Data Factory is designed to move and transform data between Azure SQL and other data stores.

Related

How to insert a table between 2 servers in Azure Data Warehouse?

I want to insert a table from a different factory/server into Azure Data Warehouse. Is it possible to insert by query?
It takes a lot of time if I make a dataset and pipeline for each table in Azure Data Factory.
Judging from your screenshot (the icon of the Azure SQL resource), you're using Azure SQL Database, not Azure Data Warehouse.
You could use Elastic Query to do a cross-database query in Azure.
See the tutorial: Get started with cross-database queries (vertical partitioning) (preview)
Elastic database query (preview) for Azure SQL Database allows you to run T-SQL queries that span multiple databases using a single connection point. This article applies to vertically partitioned databases.
When completed, you will have learned how to configure and use an Azure SQL Database to perform queries that span multiple related databases.
Then you can insert data from the external table into the current database's table, as sketched below.
Note: Azure SQL Database already has the master key, so you don't need to create it again.
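As a minimal, hedged sketch of that flow (the server, database, table, and credential names are all hypothetical):
-- Run in the target Azure SQL database. All names are hypothetical.
CREATE DATABASE SCOPED CREDENTIAL RemoteCred
WITH IDENTITY = 'uname', SECRET = 'password';

CREATE EXTERNAL DATA SOURCE RemoteSrc WITH (
    TYPE = RDBMS,
    LOCATION = 'sourceserver.database.windows.net',
    DATABASE_NAME = 'SourceDb',
    CREDENTIAL = RemoteCred
);

-- External table mirroring the remote dbo.Orders table.
CREATE EXTERNAL TABLE dbo.Orders_Remote (
    OrderId INT,
    OrderDate DATETIME2
) WITH (DATA_SOURCE = RemoteSrc, SCHEMA_NAME = 'dbo', OBJECT_NAME = 'Orders');

-- Pull the remote rows into the local table.
INSERT INTO dbo.Orders
SELECT OrderId, OrderDate FROM dbo.Orders_Remote;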
Hope this helps.

Copy Azure Database Across Subscriptions

I am trying to copy my existing Azure SQL (singleton) database from a PRODUCTION subscription into a NON-PRODUCTION database in a different subscription. I need to repeat this process every night so that our production support environment (non-prod) has the latest copy of production from the night before. Since overwriting a database is not possible in Azure SQL, I would like to copy "DBProd" from the PROD server as "DBProd2" onto my non-prod server (in a different subscription), then delete the existing "DBProd" from the destination server and rename "DBProd2" to "DBProd".
I have searched through this site to find answers and the closest I found was this link below...
Cross Subscription Copying of Databases on Windows Azure SQL Database
In that link, user @paulH submitted the below answer...
"This works for me across subscriptions without having matching SQL Server accounts. I am a member of the server active directory admin group on both source and target servers, and connecting using AD authentication with MFA. – paulH Mar 25 at 11:22"
However, I could not figure out the details of how it was achieved. My preference is to use a PowerShell script to get this done. If any of you have done this before, I would appreciate a snippet of sample code, or any pointers to achieve this.
My other option is to go the BACPAC route (export and import), but I would only want to resort to that if copying the DB across subscriptions is not possible.
Thanks in advance!
Helios
I went through the link:
Cross Subscription Copying of Databases on Windows Azure SQL Database
The Move-AzureRmResource cmdlet may be all you need. Below is how it works.
Let's say you create a new Azure SQL Server in a different resource group.
New-AzureSqlDatabaseServer -Location "East US" -AdministratorLogin "AdminLogin" -AdministratorLoginPassword "AdminPassword"
Copy the source database to the newly created Azure SQL Server.
Start-AzureSqlDatabaseCopy -ServerName "SourceServer" -DatabaseName "Orders" -PartnerServer "NewlyCreatedServer" -PartnerDatabase "OrdersCopy"
Move the resource group of the newly created Azure SQL Server to another subscription.
Move-AzureRmResource -DestinationResourceGroupName <String> [-DestinationSubscriptionId <Guid>] -ResourceId <String[]> [-Force] [-ApiVersion <String>] [-Pre] [-DefaultProfile <IAzureContextContainer>] [-InformationAction <ActionPreference>] [-InformationVariable <String>] [-WhatIf] [-Confirm] [<CommonParameters>]
For more information, please read this DBAStackExchange thread.

Azure SQL DB Refresh From Production

Looking for the best practice on refreshing a QA/Test Azure SQL Database from a Production Azure SQL Database.
The production database is on a different server and resource group, so I'm just wondering about the best method for getting the production data into the QA/testing database. What tools are available for a task like this?
The most common export format for Azure SQL Databases is the bacpac, and believe me when I tell you that it is AWESOME.
Exporting
The easiest way to do this is with the Azure Portal or SSMS.
This will, however, copy the entire database schema and all data. If you need something more specific, like excluding a table, look no further than sqlpackage.exe.
.\sqlpackage.exe /Action:Export /ssn:SERVER /sdn:ADB /tf:"C:\PATH\TO\FILE.bacpac" /of /p:TableData=TABLE /p:TableData=TABLE /p:TableData=TABLE
Importing
To create a database from the .bacpac you created above, all three of the aforementioned methods also support importing.
Recommendations
I would apply the KISS principle here and just use the portal/SSMS on both ends, dropping the specific tables you no longer want/need.
You just need to copy the production database using the portal or PowerShell:
New-AzureRmSqlDatabaseCopy -ResourceGroupName "myResourceGroup" `
-ServerName $sourceserver `
-DatabaseName "MySampleDatabase" `
-CopyResourceGroupName "myResourceGroup" `
-CopyServerName $targetserver `
-CopyDatabaseName "CopyOfMySampleDatabase"
You can also automate refreshing the development database by recreating it using Azure Automation and the following T-SQL statement.
CREATE DATABASE db_copy
AS COPY OF ozabzw7545.db_original ( SERVICE_OBJECTIVE = 'P2' );
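Because CREATE DATABASE ... AS COPY OF cannot overwrite an existing database, a hedged sketch of a nightly refresh is copy-under-a-new-name, drop, rename (the database names are placeholders):
-- Run against the target server's master database; names are placeholders.
CREATE DATABASE db_copy_new AS COPY OF ozabzw7545.db_original ( SERVICE_OBJECTIVE = 'P2' );

-- After the asynchronous copy completes, swap it in:
DROP DATABASE db_copy;
ALTER DATABASE db_copy_new MODIFY NAME = db_copy;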

Move data between two Azure SQL databases without using elastic query

I am in need of suggestions for moving data from a particular table in one Azure SQL database to another Azure SQL database that has the same table structure, without using elastic query.
Using SQL Server Management Studio, connect to the Azure SQL database, right-click the source database, and select Generate Scripts.
During the wizard, after you have selected the tables you want to output to a query window, click Advanced. About halfway down the properties window there is an option called "Types of data to script". Select that and change it to "Data only", then finish the wizard.
Then check the script, rearrange the inserts to satisfy constraints, and change the USE statement at the top to point at the target DB.
Then right-click the target database, select New Query, copy the script into it, and run it.
This will migrate the data.
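For illustration only, the generated "data only" script is essentially a USE statement followed by a list of INSERTs, along these hypothetical lines:
USE [TargetDb]  -- changed from the source database name
GO
INSERT [dbo].[Events] ([event_id], [event_time])
VALUES (1, CAST(N'2020-01-01T00:00:00' AS DATETIME2))
GO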
Please consider using the "Transfer SQL Server Objects task" in SSIS. You can learn about all the advantages it provides in this article.
You can use PowerShell to query each database and move data between them as needed. Here's an example article on how to get this done.
Using PowerShell when working with Azure has a number of other benefits in what you can do and can control as well. It's a good choice to spend time learning.
In the source database I created SPs to select the data from the tables.
In the target database I created table types (which are available under Programmability > Types) for the tables, with the same structure as in the source.
I used an Azure Function to move the data from the source into the table types.
In the target database I created SPs to insert data into the tables from their respective table types.
After ensuring the transfer of data, the records that were moved to the target are deleted from the source database, and for this I created SPs as well.
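As a minimal, hedged sketch of the target-side pieces described above (the table, type, and procedure names are all hypothetical):
-- In the target database: a table type mirroring the source table's structure.
CREATE TYPE dbo.EventsTableType AS TABLE (
    event_id   INT,
    event_time DATETIME2
);
GO
-- SP the Azure Function calls, passing the rows as a table-valued parameter.
CREATE PROCEDURE dbo.usp_InsertEvents
    @rows dbo.EventsTableType READONLY
AS
BEGIN
    INSERT INTO dbo.Events (event_id, event_time)
    SELECT event_id, event_time FROM @rows;
END;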

How to Export data to Excel in SQL Server using SQL Jobs

I need to export the data from a particular table in my database to Excel files (.xls/.xlsx) that will be located in a shared folder on my network. Now the situation is like this:
1. I need to use SQL Server Agent jobs.
2. I need to generate a new Excel file every 2 minutes that will contain the refreshed data.
I am using SQL Server 2008, which doesn't include BI Development Studio. I'm clueless about how to solve this situation. First, I'm not sure how to export the data using jobs, because every possible way I tried had issues with the OLE DB connection. 'sp_makewebtask' is also not available in SQL 2008. And I'm also confused about how to dynamically generate the names of the files.
Any reference or solution will be helpful.
Follow the steps given below:
1) Make a stored procedure that creates a temporary table and inserts records into it.
2) Make a stored procedure that reads records from that temporary table and writes them to a file; see the sketch after these steps. You can use this link: click here
3) Create a SQL job that executes step 1 and step 2 sequentially.
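As a rough, hedged sketch of the write-to-file step using bcp via xp_cmdshell (assuming xp_cmdshell is enabled and the SQL Server service account can write to the share; the share, database, and table names are placeholders, and bcp writes a delimited text file rather than a native Excel workbook):
-- Placeholders throughout; bcp produces a CSV, not a true .xlsx.
DECLARE @file VARCHAR(260) =
    '\\myshare\exports\data_'
    + CONVERT(VARCHAR(8), GETDATE(), 112)                    -- yyyymmdd
    + REPLACE(CONVERT(VARCHAR(8), GETDATE(), 108), ':', '')  -- hhmmss
    + '.csv';

DECLARE @cmd VARCHAR(1000) =
    'bcp "SELECT * FROM MyDb.dbo.MyTable" queryout "' + @file
    + '" -c -t, -T -S ' + @@SERVERNAME;

EXEC master..xp_cmdshell @cmd;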
I found a better way out. I created an SSIS (SQL Server Integration Services) package to automate the whole export-to-Excel task, then deployed that package using SQL Server Agent jobs. This is a neater and cleaner solution, as I found.