Set ClickOnce ApplicationVersion and MinimumRequiredVersion to date in VSTS Build - msbuild

I have a Visual Studio Build step in a CI build that creates the ClickOnce files of a desktop application using the MSBuild arguments below:
/target:publish /p:ApplicationVersion=$(Year:yyyy).$(Month).$(DayOfMonth).$(Build.BuildId) /p:MinimumRequiredVersion=$(Year:yyyy).$(Month).$(DayOfMonth).$(Build.BuildId) /p:InstallUrl=$(InstallUrl)
The $(Build.BuildId) and $(InstallUrl) variables get replaced with their correct values, but the $(Year:yyyy), $(Month) and $(DayOfMonth) variables do not. I am using the same variables to set the Build number format on the General tab, and there they get replaced correctly. Is it not possible to use the date-based variables in a build step in VSTS?
Edit: It appears using $(Build.BuildNumber) would work, but I'd like to include the build definition name in the build number format, which obviously won't work for the version.

$(Year:yyyy), $(Month) and $(DayOfMonth) are tokens you can use only in the Build Number Format field, not anywhere else.
I would suggest creating those variables on the fly by running the following script in a PowerShell task (with an Inline Script) just before your Visual Studio Build task:
$date = Get-Date
# Emit VSTS logging commands so later tasks see these as $(Year), $(Month) and $(Day)
Write-Host "##vso[task.setvariable variable=Year]$($date.Year)"
Write-Host "##vso[task.setvariable variable=Month]$($date.Month)"
Write-Host "##vso[task.setvariable variable=Day]$($date.Day)"
Then you could use $(Year), $(Month) and $(Day) in place of the tokens you currently use as additional MSBuild arguments.
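With those variables defined, the MSBuild arguments from the question would become something like:
/target:publish /p:ApplicationVersion=$(Year).$(Month).$(Day).$(Build.BuildId) /p:MinimumRequiredVersion=$(Year).$(Month).$(Day).$(Build.BuildId) /p:InstallUrl=$(InstallUrl)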

Related

Selenium: how to control which test cases run in VSTS (Visual Studio Test task) in a release definition

I need the ability to externally control which test cases run in the nightly build in VSTS/Azure for Selenium UI tests.
I want to be able to skip specific tests from multiple TestFixtures in the release task.
Currently I am using the Test filter criteria option of the Visual Studio Test task in the pipeline, where I can pass individual test names to execute, like "Name = Test1, Test3, Test8". I can also use the Category tag instead of Name. But this limits me to something like:
Name = Test1, Test3, Test8 || Category = Category2 || Priority = P4
I want to control which tests to execute and which to exclude from my nightly run, from a test suite that contains around 200 tests.
Is this possible? Can we add some file where I can have a switch to control the run?
Thanks.
I don't think there is a file we can configure to say which tests to include or exclude.
But it might be achievable with PowerShell. You can try the approach below.
If you are familiar with PowerShell and the dotnet vstest command line, you can write a PowerShell script to run the tests (you may need to check this script file into your repo). You can define different script files with different test filters, so each time you run your pipeline you can simply select the corresponding script file to run.
Then, in the pipeline, add a PowerShell task to run the tests instead of the Visual Studio Test task.
The PowerShell script and task below are only for reference:
param(
    [string]$source,                 # root of the checked-out sources
    [string]$repos = "SelenNunit",
    [string]$config                  # build configuration, e.g. Debug or Release
)
# Path to the test binaries, e.g. <source>\SelenNunit\SelenNunit\bin\<config>
$path = "$source\$repos\$repos\bin\$config"
# Run only the tests matching the filter and log the results as .trx
dotnet vstest "$path\SelenNunit.exe" /TestCaseFilter:"Name=test1" /logger:"trx"
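For example, a pipeline PowerShell task could invoke the script like this (the script file name here is just a placeholder):
.\RunNightlyTests.ps1 -source "$(System.DefaultWorkingDirectory)" -config "Release"
Pointing different runs at script files with different /TestCaseFilter values gives you the per-run control you're after.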

Updating a appsettings.json file from Teamcity

I have an appsettings.json file in my .NET Core project. From TeamCity I need to update the content of the file. The file looks something like this:
{
  "keyofjson": "valuetobeupdated"
}
I need to update the text "valuetobeupdated" based on the key "keyofjson".
In traditional .NET projects we had a .config file that could be updated from an MSBuild file using the XmlPeek and XmlPoke tasks. Do we have something similar in .NET Core?
In a build step, use a PowerShell script:
$filePath = "yourPath"
# Replace the current value with the new one ("newvalue" is a placeholder)
(Get-Content $filePath).Replace("valuetobeupdated", "newvalue") | Set-Content $filePath
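If you would rather update the value by key instead of doing a raw text replacement, here is a minimal sketch using PowerShell's built-in JSON cmdlets (the path and new value are placeholders):
$filePath = "appsettings.json"
$json = Get-Content $filePath -Raw | ConvertFrom-Json
$json.keyofjson = "newvalue"   # placeholder for the value you want to write
$json | ConvertTo-Json -Depth 10 | Set-Content $filePath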
You haven't said whether this value differs from build to build. If the value you want to change is the same every time, you can simply add a build feature, select 'File content replacer', and find and replace 'valuetobeupdated' with the value you want.
If the value is calculated during the build, then you will need a build step, e.g. a PowerShell or command-line step, to do the find and replace.

How can I conditionally include large scripts in my ssdt post deployment script?

In our SSDT project we have a script that is huge and contains a lot of INSERT statements for importing data from an old system. Using sqlcmd variables, I'd like to be able to conditionally include the file into the post deployment script.
We're currently using the :r syntax which includes the script inline:
IF '$(ImportData)' = 'true'
BEGIN
:r .\Import\OldSystem.sql
END
This is a problem because the script is being included inline regardless of whether $(ImportData) is true or false and the file is so big that it's slowing the build down by about 15 minutes.
Is there another way to conditionally include this script file so it doesn't slow down the build?
Rather than muddy up my prior answer with another: there is a special case with a VERY simple option.
Create separate SQLCMD input files for each execution possibility.
The key here is to name the execution input files using the value of your control variable.
So, for example, your publish script defines variable 'Config' which may have one of these values: 'Dev','QA', or 'Prod'.
Create 3 post deployment scripts named 'DevPostDeploy.sql', 'QAPostDeploy.sql' and 'ProdPostDeploy.sql'.
Code your actual post deploy file like this:
:r .\$(Config)PostDeploy.sql
This is very much like the build event mechanism where you overwrite scripts with appropriate ones except you don't need a build event. But you are dependent upon naming your scripts very specifically.
The scripts referenced using :r are always included. You have a couple of options, but I would first verify that taking the script out actually improves the build time to where you want it.
The simplest approach is to keep it outside of the whole build process and change your deploy process so it becomes a two-step process (deploy the dacpac, then run the script). The positive is that you can do things outside of the SSDT process; the negative is that you lose things like automatic disabling of constraints on tables that change in the deployment.
The second way is to not include the script in the deploy when you build, but to create an AfterBuild MSBuild task that adds the script as a post-deploy script in the dacpac. The dacpac is a zip file, so you can use the .NET packaging API to add a part called postdeploy.sql, which will then be included in the deployment process.
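A rough sketch of that packaging step in PowerShell (not the author's exact code; the file paths are assumptions):
Add-Type -AssemblyName WindowsBase            # provides System.IO.Packaging
$dacpacPath = "bin\Release\MyDatabase.dacpac" # assumed build output
$scriptPath = "Import\OldSystem.sql"
$package = [System.IO.Packaging.Package]::Open($dacpacPath, [System.IO.FileMode]::Open)
try {
    # DacFx picks up a package part named postdeploy.sql
    $uri = [System.IO.Packaging.PackUriHelper]::CreatePartUri(
        (New-Object System.Uri("postdeploy.sql", [System.UriKind]::Relative)))
    $part = $package.CreatePart($uri, "text/plain")
    $stream = $part.GetStream()
    $bytes = [System.IO.File]::ReadAllBytes($scriptPath)
    $stream.Write($bytes, 0, $bytes.Length)
    $stream.Close()
}
finally {
    $package.Close()
}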
Both of these approaches mean you lose verification, so you might want to keep the script in a separate SSDT project that has a "same database" reference to your main project; it will slow down the build when it changes but should be quick the rest of the time.
Here is the way I had to do it.
1) Create a dummy post-deploy script.
2) Create build configurations in your project for each deploy scenario.
3) Use a pre-build event to determine which post-deploy configuration to use.
You can either create separate scripts for each configuration or dynamically build the post-deploy script in your pre-build event. Either way, you base what you do on the value of $(Configuration), which always exists in a build event.
If you use separate static scripts, your build event only needs to copy the appropriate static file, overwriting the dummy post-deploy script with whichever script is useful in that deploy scenario.
In my case I had to use dynamic generation, because the decision about which scripts to include required knowing the current state of the database being deployed to. So I used the configuration variable to tell me which environment was being deployed to, and then used a SQLCMD script with :OUT set to my post-deploy script location. My pre-build script would then write the post-deploy script dynamically.
Either way, once the build completed and the normal deploy process started, the post-deploy script contained exactly the :r commands that I wanted.
Here's an example of the SQLCMD script I invoke in pre-build.
:OUT .\Script.DynamicPostDeployment.sql
PRINT ' /*';
PRINT ' DO NOT MANUALLY MODIFY THIS SCRIPT. ';
PRINT ' ';
PRINT ' It is overwritten during build. ';
PRINT ' Content IS based on the Configuration variable (Debug, Dev, Sit, UAT, Release...) ';
PRINT ' ';
PRINT ' Modify Script.PostDeployment.sql to effect changes in executable content. ';
PRINT ' */';
PRINT 'PRINT ''PostDeployment script starting at''+CAST(GETDATE() AS nvarchar)+'' with Configuration = $(Configuration)'';';
PRINT 'GO';
IF '$(Configuration)' IN ('Debug','Dev','Sit')
BEGIN
    IF (SELECT IsNeeded FROM rESxStage.StageRebuildNeeded) = 1
    BEGIN
        -- These get a GO statement after every file because most are really HUGE
        PRINT 'PRINT ''ETL data was needed and started at''+CAST(GETDATE() AS nvarchar);';
        PRINT ' ';
        PRINT 'EXEC iESxETL.DeleteAllSchemaData ''pExternalETL'';';
        PRINT 'GO';
        PRINT ':r .\PopulateExternalData.sql ';
....
I ended up using a mixture of our build tool (Jenkins) and SSDT to accomplish this. This is what I did:
Added a build step to each environment-specific Jenkins job that writes to a text file. Depending on the build parameters the user chooses, I either write a SQLCMD command that includes the import file or leave the file blank (see the sketch below).
Include the new text file in the Post Deployment script via :r.
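The Jenkins step could write that file like this (a PowerShell sketch; the file name and parameter name are assumptions):
# Write either the :r include or an empty file, driven by a build parameter
if ($env:IMPORT_DATA -eq "true") {
    Set-Content -Path "Import\ConditionalInclude.sql" -Value ":r .\Import\OldSystem.sql"
}
else {
    Set-Content -Path "Import\ConditionalInclude.sql" -Value ""
}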
That's it! I also use this same approach to choose which pre- and post-deploy scripts to include in the project based on the application version, except that I grab the version number from the code and write it to the file using a pre-build event in VS instead of in the build tool. (I also added the text file name to .gitignore so it doesn't get committed.)

TFS 2015 Can build variables access other build variables?

When I define a custom variable in the new TFS 2015 team build as follows:
Name: SomeOutput
Value: $(System.DefaultWorkingDirectory)\Some
...it doesn't seem to expand $(System.DefaultWorkingDirectory).
Is there a way around this?
EDIT:
At least it seems it's not expanded everywhere.
For example, in the MSBuild arguments, /p:OUTPUT="$(SomeOutput)" is expanded to /p:OUTPUT="C:\TfsData\BuildAgents\_work\3\s\Some", but when I add a command-line build task with the tool set to cmd and the parameter set to /k set, it prints
SOMEOUTPUT=$(System.DefaultWorkingDirectory)\Some
EDIT 2:
Here are my variables, my workflow step, and what the build prints (screenshots omitted).
You can use the VSTS Variable Tasks extension from the Visual Studio Marketplace.
When you define a variable in the Variables screen and use other variables as the value, they won't be expanded (as you may have expected). Instead, the literal text is passed to the tasks in the workflow. Without this little task, the following configuration won't work:
Variable Value
Build.DropLocation \\share\drops\$(Build.DefinitionName)\$(Build.BuildNumber)
By adding the Expand variable(s) task to the top of your workflow, it will take care of the expansion, so any task below it will receive the value you're after.
https://github.com/jessehouwing/vsts-variable-tasks/wiki/Expand-Variable
PS: The new agent (version 2.x) auto-expands variables now.
It can be achieved.
You may need to use %...% instead of $(...) to reference the variables in cmd and print the result. It is also necessary to add call in front of the command. Here is a simple example:
Note: System.DefaultWorkingDirectory is not available in cmd (not sure why); you need to use System_DefaultWorkingDirectory instead. Details can be viewed in the logs.
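For instance, a command-line task could print the combined value like this (a sketch following the note above; "Some" is the subfolder from the question):
call echo %SYSTEM_DEFAULTWORKINGDIRECTORY%\Some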
I had the same problem - wanted to piece together a path made up of several built-in variables and pass it to a PS script.
Workaround:
I ended up combining the variables in the actual script through the corresponding generated environment variables (for example $env:BUILD_SOURCESDIRECTORY).
Not what I had in mind originally, but at least it works. The drawback: if I need to change the path, I have to change the PS script instead of a build variable.
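For example, inside the PowerShell script (the subfolder name is a placeholder):
# Rebuild the path from the environment variables the agent generates
$outputPath = Join-Path $env:BUILD_SOURCESDIRECTORY "Some"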

How to provide vsdbcmd deploy command line target dbschema sql command variables?

The Visual Studio (2010) GUI provides an option for specifying a second command variable file for the target. However, I can't find this option in the command-line implementation, vsdbcmd.exe.
Running a vsdbcmd deploy from dbschema to dbschema with only the source model's command variables supplied results in the objects that use those variables being treated as changed, which produces an incorrect update script.
The command i use currently:
vsdbcmd.exe /a:deploy /dd:- /dsp:sql /model:Source.dbschema /targetmodelfile:Target.dbschema /p:SqlCommandVariablesFile=Database.sqlcmdvars /manifest:Database.deploymanifest /DeploymentScriptFile:UpdateScript.sql /p:TargetDatabase="DatabaseName"
What I'm looking for is a /p:TargetSqlCommandVariablesFile, if such a thing exists...
The resulting script is the same as running a GUI compare without specifying the SQLCMD vars for the target.
I found what looks like full documentation for VSDBCMD.EXE at this link.
I think you may be looking for something like:
/p:SqlCommandVariablesFile=Filepath
In the end I found no info on a way to do what I required - I checked the vsdbcmd libs with ILSpy for hidden parameters and didn't find any.
I reached my goal by parsing the dbschema files for both target and current and substituting the cmd variable values directly into them, then doing the compare on the modified dbschemas. This approach no longer allows changing SQLCMD vars in the resulting script (the values are already baked into the code), but that was deemed an acceptable loss.
Not the most beautiful solution, but so far I have had no issues with it.
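For illustration, the substitution step might look something like this PowerShell sketch (file names and variable values are assumptions):
# Bake SQLCMD variable values directly into copies of the dbschema files
$vars = @{ DatabaseName = "MyDb"; DataPath = "D:\Data" }
foreach ($file in "Source.dbschema", "Target.dbschema") {
    $content = Get-Content $file -Raw
    foreach ($kv in $vars.GetEnumerator()) {
        $content = $content.Replace('$(' + $kv.Key + ')', $kv.Value)
    }
    Set-Content -Path ($file -replace '\.dbschema$', '.resolved.dbschema') -Value $content
}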