SSIS package to run a .sql script file on Azure SQL

I want to create an SSIS package that first gets a .sql script file from Azure Blob storage and then executes it against Azure SQL Server. I am able to do this when I use local storage, that is, when I execute a local file rather than one from Azure storage.
Please help me!!!

First, you're going to want to set up your Azure storage with Azure File Storage. That way the machine running your SSIS package will be able to use the file storage like a mapped network drive.
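If you have not created the file share yet, one possible way to do it is from the Azure CLI; this is only a sketch, and the share, account, and key names below are placeholders:
az storage share create --name <fileShareName> --account-name <storageAccountName> --account-key <storageAccountKey>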
Once you've got that set up, the first step in your package will be an Execute Process Task. This task should be configured so that it runs a simple batch (.bat) file that reconnects to the file share you set up in Azure File Storage. Here is a basic example of what would be in this batch file:
rem Reconnect the Azure file share as drive E: if it is not already mapped
if not exist e:\* net use e: \\<storageAccountName>.file.core.windows.net\<fileShareName> /u:<storageAccountName> <primary access key for storage account>
After this is run, your SSIS package would then be able to access your .sql files stored on this share.
Simply pick up the file, read the contents, and execute the statements contained in the .sql file.
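If you want to stay with the Execute Process Task approach for that last step as well, one possible sketch is to hand the script on the mapped share to sqlcmd; the server, database, credentials, and script path below are all placeholders:
rem Run the script from the mapped Azure file share against the Azure SQL database.
rem -b makes the process return a failure code if the script errors, so the SSIS task fails too.
sqlcmd -S <serverName>.database.windows.net -d <databaseName> -U <userName> -P <password> -i e:\scripts\myscript.sql -b
Alternatively, an Execute SQL Task with SQLSourceType set to File connection can read the .sql file from the mapped drive and run it without any extra scripting.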

Related

Restore failing for a backup downloaded through PowerShell but not for one downloaded with Azure Storage Explorer

We receive weekly FULL and daily DIFF backups from our hosted ERP quoting system provider. We are using PowerShell code and a scheduled task to download the most recent file from a blob container to a local server location we use for our backups.
When I download the files manually with Azure Storage Explorer and run the restore job, it works fine. When I run the restore job against the PowerShell-downloaded file, I get an error:
.bak' is incorrectly formed and can not be read.
I cannot figure out why this is happening. Anyone run into this and fix it?

Azure Data Factory with Integration Runtime - Delete (or move) file after copy

I have an on-premises server with the Microsoft Integration Runtime installed.
In Azure Data Factory V2 I created a pipeline that copies files from the on-premises server to blob storage.
After a successful transfer I need to delete the files on the on-premises server. I am not able to find a solution for this in the documentation. How can this be achieved?
Azure Data Factory recently introduced a Delete Activity that can delete files or folders from on-premises or cloud storage stores.
You also have the option to call Azure Automation through a webhook, using the Web activity. In Azure Automation you can write a PowerShell or Python script that runs on a Hybrid Runbook Worker to delete the file from the on-premises server. You can read more on this here: https://learn.microsoft.com/en-us/azure/automation/automation-hybrid-runbook-worker
Another, easier option is to schedule a script on the server with Windows Task Scheduler that deletes the file. Make sure the script runs after Data Factory has copied the files to the blob, and that's it!
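For example, a one-line batch file scheduled with Task Scheduler could do the cleanup; this is only a sketch, and the folder path, file mask, and one-day age threshold are assumptions you would adjust to your environment:
rem Delete staged files older than one day, assuming Data Factory has already copied them to blob storage.
forfiles /p "D:\Staging\Outbound" /m *.csv /d -1 /c "cmd /c del @path"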
Hope this helped!
If you are simply moving the file, then you can use a Binary dataset in a Copy activity. This combination makes a checkbox setting visible that, when enabled, will automatically delete the file once the copy operation completes. This is a little nicer, as you do not need the extra Delete activity and the file is "moved" only if the copy operation is a success.

How to use a post-deployment script to bulk insert a CSV on a remote server

I am using a post-deployment script in my SQL Server database project to bulk insert data. This works great if I publish to a local database, but if I publish to a remote database, the CSV I am bulk inserting from obviously will not exist on the remote server. I have considered using a command-line copy of the CSV to a shared folder, but this raises security concerns, as anyone with access to this folder could possibly tamper with a deployment.
How should I be using post-deployment scripts to copy data to a remote server? A CSV is easier to maintain than a bunch of INSERT statements, so I would prefer using a CSV. Has anyone run into and solved this issue?
The best way is the one you mentioned. Command line copy it to a secured shared folder and BCP from there.
From that point on the security of the folder depends on network services.
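As a rough sketch (the share, server, database, table, and file names are placeholders, and the bcp switches assume a comma-delimited file with a header row), the deployment step could look like this:
rem Copy the CSV to the secured share, then bulk load it into the remote database with bcp.
rem -T uses Windows authentication; use -U and -P instead for a SQL login.
copy /y .\SeedData\Customers.csv \\fileserver\secure-deploy\Customers.csv
bcp <databaseName>.dbo.Customers in \\fileserver\secure-deploy\Customers.csv -S <remoteServer> -T -c -t, -F 2
Locking down write access on that share to the deployment account is then what keeps the file from being tampered with.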

Distributing an HSQLDB database for use

I have gone through the docs and I haven't been able to fully understand them.
My question:
For testing purposes we created a replica of a database in HSQLDB and want to use it as an in-process DB for unit testing.
Is there a way that I can distribute the replica database so that people can connect to this DB? I have used the BACKUP command and have the tar file. But how do I open a connection to the DB that uses this backed-up database... something along the lines of handing a .mdb file, in the case of Access, to another user and asking him/her to use that.
Regards,
Chetan
You need to expand the backed-up database using standard gzip / unzip tools before you can connect to it.
The HSQLDB Jar can be used to extract the database files from the backup file. For example:
java -cp hsqldb.jar org.hsqldb.lib.tar.DbBackup --extract tardir/backup.tar dbdir
In the example, the first file path is the backup file, and the second one, dbdir, is the directory path where the database files are expanded.
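Once the files are expanded, a connection can be opened with HSQLDB's file-mode URL; assuming the extracted files are named testdb.script, testdb.properties, and so on, the URL would look like:
jdbc:hsqldb:file:dbdir/testdb
That URL, together with hsqldb.jar, is roughly the HSQLDB equivalent of handing someone a .mdb file to open.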

Upload a file using FTP with SSIS

I need to upload a file using an FTP task in SQL Server 2005 SSIS. While uploading I am unable to get the folders created on the FTP server.
Is there any other solution in SSIS for uploading a file to the FTP server?
Quite often when I need to use FTP in a DTS or SSIS package, I make a text file with the commands I want executed and then use an "Execute Process" task to run "ftp.exe /s:mycommands.txt". This can give you a little more control over what happens during the ftp connection.
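For instance, mycommands.txt might look like the sketch below; the host, credentials, folder, and file path are placeholders, and the two lines after open are read as the user name and password:
open ftp.example.com
myUserName
myPassword
binary
mkdir /uploads/archive
cd /uploads/archive
put C:\exports\data.csv
bye
One caveat: the built-in ftp.exe only supports active-mode transfers, so a firewall that blocks those may force you to use a different command-line client.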