Azure DevOps SQL DB Deploy

I'm trying to deploy a dacpac that has database references to two other databases using Azure DevOps. I can't find the right syntax for passing additional arguments with the sqlcmd variables for those databases; I get an 'Unrecognized command line argument' error every time I trigger a deployment. The syntax I'm currently using is:
/Variables:variable1 = "value1" /Variables:variable2 = "value2"

Follow the documentation at https://learn.microsoft.com/en-us/sql/tools/sqlpackage?view=sql-server-ver15 and use SQLCMD variables.
The following describes the format of the option you can use to override the value of a SQL command (sqlcmd) variable used during a publish action. Variable values specified on the command line override other values assigned to the variable (for example, in a publish profile).
Parameter: /Variables:{PropertyName}={Value} (short form: /v:{PropertyName}={Value})
Description: Specifies a name-value pair for an action-specific variable; {VariableName}={Value}. The DACPAC file contains the list of valid SQLCMD variables. An error results if a value is not provided for every variable.
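Note that the documented format has no spaces around the = sign. The spaces in the arguments from the question split each argument into several tokens, which is what triggers the 'Unrecognized command line argument' error. With the question's variable names kept as-is, the corrected arguments should look like this:

/Variables:variable1="value1" /Variables:variable2="value2"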

How do I insert special characters into Azure table storage with the "az storage entity insert" command?

I have a PowerShell script that builds an "az storage entity insert" command dynamically. Basically, I have a CSV file that I use to create the content of a table by converting it into one long command, which the script then invokes. It worked fine until I added a field that contains a regexp.
I started to get strange "The system cannot find the path specified." errors: not from accessing the CSV, as you would first suspect, but from running the generated command. I found out that some special characters in the field's value break the command, and it tries to execute what comes after them as a separate command.
I made the expression simpler and found that not many characters work. A command as simple as this does not work:
az storage entity insert --table-name table --account-name $StorageAccountName --if-exists replace --connection-string $StorageConnectionString --entity PartitionKey=ABC RowKey=DEF Field="(abc)" Field#odata.type=Edm.String
This causes a different error: "Field#odata.type was unexpected at this time."
The | character also causes problems, for example:
az storage entity insert --table-name table --account-name $StorageAccountName --if-exists replace --connection-string $StorageConnectionString --entity PartitionKey=ABC RowKey=DEF Field="|abc" Field#odata.type=Edm.String
gives "'abc' is not recognized as an internal or external command, operable program or batch file."
This instead works fine:
az storage entity insert --table-name table --account-name $StorageAccountName --if-exists replace --connection-string $StorageConnectionString --entity PartitionKey=ABC RowKey=DEF Field="abc" Field#odata.type=Edm.String
So why do those special characters break the command, and how can I fix it? I need both of those characters for the regexp, and some others that won't work either.
These errors happen both when I run the command directly from PowerShell and when my script runs it via Invoke-Expression.
I initially thought this had to do with the way PowerShell handles single quotation marks vs. double quotation marks, but it turns out I was only halfway there. On Windows, az is a batch file (az.cmd), so the arguments also pass through cmd.exe, where characters like (, ), and | are special; that is why the errors you see are cmd.exe errors. Octopus Deploy lists several solutions, including this one with wrapped single quotes:
'"(abc)"'
Here are the commands with single quotes wrapped around the double quotes (where I now instead error out on failing to provide an account name).
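Reconstructed from the failing commands in the question with that wrapping applied (a sketch; the single quotes make PowerShell pass the inner double quotes through to cmd.exe intact):

az storage entity insert --table-name table --account-name $StorageAccountName --if-exists replace --connection-string $StorageConnectionString --entity PartitionKey=ABC RowKey=DEF Field='"(abc)"' Field#odata.type=Edm.String
az storage entity insert --table-name table --account-name $StorageAccountName --if-exists replace --connection-string $StorageConnectionString --entity PartitionKey=ABC RowKey=DEF Field='"|abc"' Field#odata.type=Edm.String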

Getting error while executing DACPAC file (using sqlpackage.exe)

I am getting the below error while executing a DACPAC file using SqlPackage.
The column [dbo].[Temp].[GMTOffset] on table [dbo].[Temp] must be added, but the column has no default value and does not allow NULL values. If the table contains data, the ALTER script will not work. To avoid this issue you must either: add a default value to the column, mark it as allowing NULL values, or enable the generation of smart-defaults as a deployment option.
PowerShell script:
& $using:SqlPackagePath /Action:Publish /tu:$using:DatabaseUsername /tp:$using:DatabasePassword /tsn:$using:ServerInstance /tdn:"$_" /sf:$using:DacpacLocation /p:BlockOnPossibleDataLoss=False
I have set the 'Generate smart defaults, when applicable' option in the publish profile of the DB project and execute the PowerShell script after compiling the project; however, I am still getting this error. Any pointers or help would be appreciated.
This error was resolved after specifying the option on the command line as below, as @Peter also mentioned:
& $using:SqlPackagePath /Action:Publish /tu:$using:DatabaseUsername /tp:$using:DatabasePassword /tsn:$using:ServerInstance /tdn:"$_" /sf:$using:DacpacLocation /p:GenerateSmartDefaults=True /p:BlockOnPossibleDataLoss=False
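Note that options saved in the database project's publish profile only take effect if the profile is actually passed to SqlPackage. A sketch of that alternative, assuming the profile file is named Database.publish.xml (the file name is an assumption):

& $using:SqlPackagePath /Action:Publish /pr:"Database.publish.xml" /tu:$using:DatabaseUsername /tp:$using:DatabasePassword /tsn:$using:ServerInstance /tdn:"$_" /sf:$using:DacpacLocation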

Passing a variable from a TFS 2018 release task and then using that variable in a gate step

I'm trying to use a variable that I have defined in a release task and then pull that value into a gate step. It's just a simple pipeline to demonstrate some integration with ServiceNow.
After a Build, a release is kicked off. The code is automatically copied to the "test" environment.
The next step is a PowerShell script that creates a Change Request in ServiceNow. I am able to create the CR and then parse the JSON that is returned to capture the Change Request number. I use the following command in the last line of the PowerShell script to save the CR number:
Write-Host "##vso[task.setvariable variable=crnumber]$crnumber"
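For reference, a variable set this way can be read in later tasks of the same job either through macro syntax or through the environment (a sketch; these lines are illustrative, not from the original pipeline):

# In a later PowerShell task in the same job:
Write-Host "CR number via macro: $(crnumber)"
Write-Host "CR number via environment: $env:CRNUMBER"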
In the next step I use an extension to save the variables out to a JSON file.
Then, for debugging, I have a PowerShell step that reads the CR variable.
It produces this output in the logs:
2020-01-16T19:08:05.4137159Z Read variable after
2020-01-16T19:08:05.4138360Z --------------------1--------------------------
2020-01-16T19:08:05.4140620Z CHG0417736
2020-01-16T19:08:05.4141674Z --------------------2-------------------------
2020-01-16T19:08:05.4143096Z CHG0417736
2020-01-16T19:08:05.4144492Z --------------------3-------------------------
2020-01-16T19:08:05.5656992Z ##[section]Finishing: Read CR fron ENV
After that I set up post-deployment conditions: one approval (me), and then another extension that will query ServiceNow for the "Ready to Implement" state.
Note that if I enter the CR number (CHG0417736) directly, it does work as expected. But I would like to use the $crnumber variable.
What am I missing here? Unfortunately, I can't just add a PowerShell task here to try to debug the variable.

TFS2015 PowerShell on Target Machine

I am trying to pass some data to a remote PowerShell script within a TFS 2015 build step.
My step calls a remote PowerShell script on a target machine, and I am passing data as script parameters. The following script parameters are what I have defined.
This parameter list works:
-buildVersion $(Build.BuildNumber) -queuedBy $env:USERNAME (but the name is then the account running the script)
but I really want the Build.QueuedBy username to be passed, so I have tried:
-queuedBy $(Build.QueuedBy)
or
-queuedBy $env:BUILD_QUEUEDBY
This does not work. Am I specifying something incorrectly or is there a better way?
I would also like to get some of the Build definition Variables to the remote script as well.
I have displayed the variables available to me with a Command line step running: cmd /k set
In order to get the correct value, you need something like this:
$a = Get-Item -Path "Env:BUILD_QUEUEDBY"  # returns a DictionaryEntry
$a = $a.Value                             # the actual username string
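For completeness, a minimal sketch of the receiving end, assuming the remote script declares parameters matching the names used above (the script body is illustrative):

param(
    [string]$buildVersion,
    [string]$queuedBy
)
# Runs on the target machine; both values were resolved on the build agent.
Write-Host "Build $buildVersion was queued by $queuedBy"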

How to provide vsdbcmd deploy command line target dbschema sql command variables?

The Visual Studio (2010) GUI provides an option for specifying a second command variable file for the target. However, I can't find this option in the command-line implementation, vsdbcmd.exe.
Running vsdbcmd deploy from dbschema to dbschema with only the source model's command variables given results in objects that use the variables being treated as changed, which produces an incorrect update script.
The command I currently use:
vsdbcmd.exe /a:deploy /dd:- /dsp:sql /model:Source.dbschema /targetmodelfile:Target.dbschema /p:SqlCommandVariablesFile=Database.sqlcmdvars /manifest:Database.deploymanifest /DeploymentScriptFile:UpdateScript.sql /p:TargetDatabase="DatabaseName"
What I'm looking for is a /p:TargetSqlCommandVariablesFile, if such a thing exists...
The resulting script is the same as running a GUI compare without specifying the sqlcmd variables for the target.
I found what looks like full documentation for VSDBCMD.EXE at this link.
I think you may be looking for something like:
/p:SqlCommandVariablesFile=Filepath
In the end I found no info on the possibility of doing what I required; I checked the vsdbcmd libraries with ILSpy for hidden parameters and didn't find any.
I reached my goal by parsing the dbschema files for both target and current and substituting the cmd variable values directly into them, then doing the compare on the modified dbschemas. This approach no longer allows changing sqlcmd variables in the resulting script (as the values are already baked into the code), but that was deemed an acceptable loss.
Not the most beautiful solution, but so far I have had no issues with it.
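A minimal sketch of that workaround, assuming the sqlcmd variables appear in the .dbschema files as $(Name) tokens (the file names and variable values here are illustrative):

# Bake sqlcmd variable values directly into a dbschema file before comparing,
# since vsdbcmd has no option for a target-side variables file.
$vars = @{ DatabaseName = 'TargetDb'; DataPath = 'D:\Data\' }  # assumed values
$content = Get-Content -Path 'Target.dbschema' -Raw
foreach ($name in $vars.Keys) {
    # Replace each $(Name) token with its literal value
    $content = $content.Replace('$(' + $name + ')', [string]$vars[$name])
}
Set-Content -Path 'Target.baked.dbschema' -Value $content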