How to use a post-deployment script to bulk insert a CSV on a remote server - sql

I am using a post-deployment script in my SQL Server database project to bulk insert data. This works great if I publish to a local database, but if I publish to a remote database, the CSV I am bulk inserting from obviously will not exist on the remote server. I have considered using a command-line copy to put the CSV on a shared folder, but this raises security concerns, as anyone with access to this folder could tamper with a deployment.
How should I be using post-deployment scripts to copy data to a remote server? A CSV is easier to maintain than a long list of INSERT statements, so I would prefer to keep using one. Has anyone run into and solved this issue?

The best way is the one you mentioned: command-line copy the CSV to a secured shared folder and BCP (or BULK INSERT) from there.
From that point on, the security of the folder rests on the permissions of the network share.
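For reference, here is what the post-deployment script itself might look like once the CSV has been copied to the share. This is a minimal sketch: the share path and table name are hypothetical, and it assumes the SQL Server service account on the remote server has been granted read access to the share.
-- Hypothetical share and table names; adjust to your deployment.
-- The path is resolved by the remote SQL Server service, not by the
-- machine running the publish, so it must be a UNC path.
BULK INSERT dbo.MyTable
FROM '\\deployserver\deploy\MyTable.csv'
WITH (
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n',
    FIRSTROW = 2  -- skip the CSV header row
);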

Related

Is there any method to regenerate cms/blocks in Magento 2 without a SQL file?

So far I have developed the CMS blocks directly on the staging server, so I have no local dev environment.
The server HDD suddenly crashed and I failed to restore the data on the server.
All my code has been managed with git, but I can't find any files that relate to the CMS blocks.
The content of CMS blocks is stored in the database table cms_block. There is no backup in the filesystem.
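If the goal is simply to get the block definitions back under version control, a query along these lines will pull them out of the database (a sketch; it assumes the standard Magento 2 cms_block columns, which include identifier, title, content, and is_active):
-- Export CMS block definitions so they can be committed to git
SELECT identifier, title, content, is_active
FROM cms_block;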

Azure Data Factory with Integration Runtime - Delete (or move) file after copy

I have an on-premises server with the Microsoft Integration Runtime installed.
In Azure Data Factory V2 I created a pipeline that copies files from the on-premises server to blob storage.
After a successful transfer I need to delete the files on the on-premises server. I am not able to find a solution for this in the documentation. How can this be achieved?
Azure Data Factory recently introduced a Delete activity that can delete files or folders from on-premises or cloud storage stores.
You have the option to call Azure Automation using webhooks, via the Web activity. In Azure Automation you can write a PowerShell or Python script that runs on a Hybrid Runbook Worker to delete the file from the on-premises server. You can read more on this here: https://learn.microsoft.com/en-us/azure/automation/automation-hybrid-runbook-worker
Another, easier option would be to schedule a script on the server with the Windows Task Scheduler that deletes the file. Make sure you schedule it to run after Data Factory has copied the files to the blob, and that's it!
Hope this helped!
If you are simply moving the file, you can use a Binary dataset in a Copy activity. This combination makes a checkbox setting visible that, when enabled, automatically deletes the source file once the copy operation completes. This is a little nicer, as you do not need the extra Delete activity and the file is "moved" only if the copy operation succeeds.

SSIS package to run script file on Azure

I want to create an SSIS package which will first get a .sql script file from Azure Blob storage and then execute it against an Azure SQL server. I am able to do this when I am using local storage, that is, when I am executing a local file rather than one from Azure storage.
Please help me!
First, you're going to want to set up your Azure storage with Azure File Storage. That way the machine running your SSIS package will be able to use the file storage like a mapped network drive.
Once you've got that set up, the first step in your package will be an Execute Process Task. This task should be configured to run a simple batch file (.bat) that reconnects to the file share you set up in Azure File Storage. Here is a basic example of what would be in this batch file:
rem Map drive E: to the Azure file share if it is not already connected
rem (in a batch file, the command must follow IF on the same line)
if not exist e:\* net use e: \\<storageAccountName>.file.core.windows.net\<fileContainerName> /u:<storageAccountName> <primary access key for storage account>
After this is run, your SSIS package will be able to access the .sql files stored on this share.
Simply pick up the file, read its contents, and execute the statements contained in the .sql file.

SQL database deployment for VB.NET app

I have read many articles here, but haven't quite found the solution. I have a SQL Server Express database that is used by my VB.NET application. I have packaged and deployed the application via an MSI file and everything works great except I cannot figure out how to include my database file with the package. I understand there are three general ways to do this (copy the files over manually, custom actions, and SQL scripts). I didn't need anything fancy here, just a quick way to put the DB on the client machine so my app can access it.
I decided copying over the DB manually was the quickest option. I tried putting it in the working directory and in the \DATA directory of the client's SQL Server Express install, but my app wouldn't connect. I also tried changing my connection in the project to .\SQLEXPRESS instead of [my_computer_name]\SQLEXPRESS followed by a rebuild of the deployment project and reinstall on the client machine, but no soup for me. Same issue. I tried changing the "UserInstance" property in the project to "True" but my project would not let me save that action.
Am I correct that a manual copy is the quickest and easiest way to get this done?
You should attach your database file to the SQL Server instance:
CREATE DATABASE YourDatabaseName
ON (FILENAME = 'C:\your\data\directory\your_file.mdf'),
(FILENAME = 'C:\your\data\directory\your_file_Log.ldf')
FOR ATTACH;
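As a quick sanity check after running the attach, the database should then appear in sys.databases (YourDatabaseName is the placeholder name from the statement above):
-- Confirm the database was attached
SELECT name FROM sys.databases WHERE name = 'YourDatabaseName';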
You need to attach your database file to the running SQL Server on the client machine.
This can easily be done with a variation on the connection string stored in your configuration file (app.config or web.config):
Server=.\SQLExpress;AttachDbFilename=where_you_have_stored_the_mdf_file;
Database=dbname; Trusted_Connection=Yes;
Alternatively, you could use the |DataDirectory| substitution string.
This shortcut eliminates the need to hard-code the full path.
Using DataDirectory, you can have the following connection string:
Server=.\SQLExpress;AttachDbFilename=|DataDirectory|\yourfile.mdf;
Database=dbname; Trusted_Connection=Yes;

Distributing a hsqldb database for use

I have gone through the docs and I haven't been able to fully understand them.
My question:
For testing purposes we created a replica of a database in HSQLDB and want to use it as an in-process db for unit testing.
Is there a way I can distribute the replica database so that other people can connect to it? I have used the BACKUP command and have the tar file, but how do I open a connection to the db restored from that backup... something along the lines of handing a .mdb file to another user in the case of Access and asking him/her to use it.
Regards,
Chetan
You need to expand the backed-up database using standard gzip / tar tools before you can connect to it.
The HSQLDB Jar can be used to extract the database files from the backup file. For example:
java -cp hsqldb.jar org.hsqldb.lib.tar.DbBackup --extract tardir/backup.tar dbdir
In the example, the first file path is the backup file, and the second one, dbdir, is the directory path where the database files are expanded. Once expanded, each user can open an in-process connection with a file-mode JDBC URL such as jdbc:hsqldb:file:dbdir/dbname, where dbname is the common base name of the extracted .script and .properties files.