Octopus Deploy - Execute Script File - SQL

I have defined a process that consists of 2 steps:
Deploy an IIS WebSite
Execute a script file
The first step in the process executes without any issues. I am able to upload a NuGet package to Octopus Server's built-in package repository and deploy it to my IIS Web Site.
For the second step in the process, where I am asked to specify a script file to run, I am unable to upload a SQL file using either the built-in package repository or an external package repository, since Octopus will not allow me to upload the .sql file type.
What is the best way of specifying the SQL Script File to be run?
Should I package it along with my NuGet package and then set the script source to 'Script file inside a package', or is it possible to upload a SQL file to a repository (external or built-in) and specify that file as the SQL script file to execute?
Is there a best practice for this?
Thanks,
Sean

By "Script file" or "Script file inside a package" Octopus means a Powershell script or Bash script. If you want to execute a SQL script as part of your deployment you have two options:
Install an existing step template from the Octopus Community Library (https://library.octopusdeploy.com/listing) that executes SQL. Here are two of them:
SQL - Execute Script - https://library.octopusdeploy.com/step-templates/73f89638-51d1-4fbb-b68f-b71ba9e86720/actiontemplate-sql-execute-script
SQL - Execute Script File - https://library.octopusdeploy.com/step-templates/709b5872-52e2-4cd9-9ec0-b4a135a0444c/actiontemplate-sql-execute-script-file
Create a PowerShell or Bash script to execute your SQL script, as in the sketch below.
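For option 2, here is a minimal sketch of such a PowerShell script, assuming Windows authentication; the server, database, and file names are placeholders for your own values. It uses System.Data.SqlClient directly, so no extra modules are needed:

# Minimal sketch: run a .sql file against SQL Server from PowerShell.
# $server, $database, and $scriptPath are placeholders.
$server     = "localhost"
$database   = "MyAppDb"
$scriptPath = Join-Path $PSScriptRoot "deploy.sql"

$sql = Get-Content -Path $scriptPath -Raw
$connString = "Server=$server;Database=$database;Integrated Security=True"
$connection = New-Object System.Data.SqlClient.SqlConnection($connString)
$connection.Open()
try {
    # SqlClient does not understand the GO batch separator,
    # so split the script into batches first.
    foreach ($batch in ($sql -split '(?im)^\s*GO\s*$')) {
        if ($batch.Trim()) {
            $command = $connection.CreateCommand()
            $command.CommandText = $batch
            [void]$command.ExecuteNonQuery()
        }
    }
}
finally {
    $connection.Close()
}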

Related

Deployed SSIS package fails when running from a SQL Agent Job

Let me know what additional info would help. I have an SSIS package that imports a CSV file into SQL Server and moves that CSV into a subfolder of the folder where the CSV resides. I can run the package from the Integration Services Catalog with no problem, but when it runs from the Agent job it says it can't read the .csv file, and I've tried different versions of the file. I tried recreating the job as well; no luck. Also, the C# script task is not moving the CSV to the archive folder even when I run the package from the ISC. The Agent error tells me to look at the execution of the package, which basically says it can't open the CSV.

How to deploy the SQL Scripts to the local database (One click deployment)?

I am trying to get some syntax advice on using osql in batch-file mode to deploy all .sql files to the mapped database.
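A minimal sketch of such a batch deployment, written in PowerShell around osql; the server, database, and folder names are placeholders, and the same flags also work with the newer sqlcmd:

# Minimal sketch: run every .sql file in a folder against the mapped database via osql.
# -E uses Windows authentication; -b makes osql exit non-zero when a script fails.
$server   = ".\SQLEXPRESS"
$database = "LocalDb"

Get-ChildItem -Path "C:\deploy\scripts" -Filter *.sql | Sort-Object Name | ForEach-Object {
    & osql -S $server -d $database -E -b -i $_.FullName
    if ($LASTEXITCODE -ne 0) { throw "Script failed: $($_.Name)" }
}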

Move files from Azure storage to a local directory

I want to move all the files that are in a file share on Azure. At the moment I can do it the following way:
Use the "net" command to connect to the network drive and assign it a drive letter.
Then a .bat file uses the "move" command, like move g:\files\* c:\files, to move the files; it runs every hour via Windows Task Scheduler to check whether there are files and move them.
But I don't want to use this approach because:
The drive will be disconnected if the machine needs a restart, so the process doesn't remain automated, since someone will have to mount the drive again.
The "move" command doesn't move folders, only files.
Is there a better way of managing this? We don't want to install tools like AzCopy, but using PowerShell is feasible.
Based on your description, I suggest one of the following approaches:
1. Call AzCopy from PowerShell.
You would need to install AzCopy first; you can download the latest version from the link. AzCopy supports uploading a directory to an Azure file share. You could use the following command:
AzCopy /Source:C:\myfolder /Dest:https://myaccount.file.core.windows.net/myfileshare/ /DestKey:key /S
Then you could write a small PowerShell script to call AzCopy; by default the AzCopy install directory is C:\Program Files (x86)\Microsoft SDKs\Azure\AzCopy.
cd "C:\Program Files (x86)\Microsoft SDKs\Azure\AzCopy"
& .\AzCopy.exe /Source:C:\myfolder /Dest:https://myaccount.file.core.windows.net/myfileshare/ /DestKey:key /S
2. Use a PowerShell script; you could start from the script in this link. That script was written for ASM (classic) mode, so you would need to change some commands.
In my experience, using AzCopy is easier and simpler.
The issue was resolved using the following approach (a PowerShell sketch follows the list):
Delete any existing mapping of the drive letter we are using with the net use <drive_letter> /delete command. This is done to make sure the drive was detached since the last time the script ran.
Map the drive again using the net use command.
Copy all the files using robocopy.
After that, delete all the files using the del command.
Disconnect the drive again using the net use <drive_letter> /delete command.
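A minimal sketch of those steps, assuming placeholder values for the share path, account, key, and folders:

# Minimal sketch of the resolution above; share, account, key, and folders are placeholders.
$drive  = "G:"
$share  = "\\myaccount.file.core.windows.net\myfileshare"
$local  = "C:\files"
$source = $drive + "\"

net use $drive /delete 2>$null      # make sure the drive was detached since the last run
net use $drive $share /user:myaccount "storage-account-key"

robocopy $source $local /E                     # copy all files and subfolders
Remove-Item -Path "$drive\*" -Recurse -Force   # then delete the originals
net use $drive /delete                         # disconnect the drive again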

Programmatically access tfs build output

I'm trying to write a PowerShell script to allow a user to specify a TFS build id (or alternately a changeset id) and download the build output to the current directory. I have the build configured to copy the output to the server, which means only the most recent build output will be accessible in that directory. However, from Visual Studio or from the TFS Web Access, I can download the drop as a .zip file.
How can I access this .zip file programmatically (either in PowerShell, or even if I could figure out VB code to do this I can convert it to a PowerShell script)? Am I thinking about build output wrong, and there's an easier, more obvious way to handle this? Is the build output of the older builds being stored somewhere else on the server, or is it stored in the database? Should I be configuring the build differently to store each build in a separate folder rather than overwriting each build in a single folder?
You can access the download zip via a properly constructed URL. For example:
https://{AccountName}.visualstudio.com/DefaultCollection/{TeamProject}/_apis/build/builds/{BuildId}/artifacts/drop?%24format=zip
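A minimal PowerShell sketch that downloads that URL, assuming a personal access token (PAT) for authentication; the account, project, build id, and token are placeholders:

# Minimal sketch: download the drop artifact of a given build as a zip.
$account = "myaccount"
$project = "MyTeamProject"
$buildId = 1234
$pat     = "personal-access-token"

$url  = "https://$account.visualstudio.com/DefaultCollection/$project/_apis/build/builds/$buildId/artifacts/drop?%24format=zip"
$auth = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(":$pat"))

Invoke-WebRequest -Uri $url -Headers @{ Authorization = "Basic $auth" } -OutFile ".\drop.zip"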

How can I specify the relative path of the dtsConfig file for SSIS in SQL jobs?

I am working with SSIS packages, and I developed some packages with a configuration file. In the configuration file I gave all the relative paths of the connections, like FlatFileConnection, OLEDB Connection, etc. It runs fine with the dtexec command using the /config configuration file path.
But now I need to create a SQL job to run this SSIS package automatically every day. In this case I want to give a relative path for the config file (I don't want to hard-code the path of the configuration file; it should be taken from the same path as the .dtsx file). How can I do this?
Thanks in advance.
Eshwer
You can't!
You must have the path to the config file on your job step.
If you had it configured somewhere else, you would need a configuration to show where the config file is. Doesn't make sense, right?
In SQL Server Management Studio, highlight SQL Server Agent -> Start. Highlight Jobs -> New Job..., and name it, e.g. myJob.
Under Steps, click New Step and name it, e.g. Step1.
Type: SQL Server Integration Services Package
Run as: myProxy
Package source: File System
Browse to select your package file xxx.dtsx.
Click OK.
Schedule your job and enable it.
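If you would rather use an Operating system (CmdExec) step than the Integration Services step type, here is a minimal sketch of the equivalent dtexec call (shown wrapped in PowerShell), with the config path spelled out explicitly as described above; both paths are placeholders:

# Minimal sketch: run the package with an explicit configuration file path.
& dtexec /F "D:\Packages\myPackage.dtsx" /Conf "D:\Packages\myPackage.dtsConfig"
if ($LASTEXITCODE -ne 0) { throw "dtexec failed with exit code $LASTEXITCODE" }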