Using msbuild, how can I build once and then publish with multiple transform profiles?

I have a .net solution that I can build with msbuild and successfully generate a deploy package with a PublishProfile.pubxml transform for deploying to a single web server.
I need to take that build and generate deploy packages for different environments, which are set up as transforms using various .pubxml profile files.
I know I could build each profile separately, and the risk of injecting a change between builds would be negligible, but that consumes time and space unnecessarily. I would only end up keeping one of the builds anyway, and just copying the unique web.configs from each transform into their own deploy package's folder (sorry if this isn't clear, happy to clarify).
I'm looking for something like this pseudocode, which I know isn't exactly right but should get the point across:
# step 1: build once
msbuild myapp.csproj /t:Build
# step 2: publish each profile from that single build
foreach ($profile in $profileList) {
    msbuild myapp.csproj /t:Publish /p:PublishProfile=$profile /p:PublishDir="releaseDir/$profile"
}
I am working in a Jenkins context and can't use Visual Studio functions unless they're available as command-line options.
I've also tried desperately to get msdeploy to work, which I think might simplify some of this, but for the life of me I can't get it to work with the tempAgent option, which is really the only way I want to go in my environment. Details on that particular struggle here.
Thanks in advance!
Update 1: This post works well for me, but as I say in a comment, it's still creating multiple deploy packages that take up a ton of space and aren't necessary. What I'd really like is a way to use a single build artifact and create multiple transforms of the Web.config files. Then on deploy, copy that build artifact and the appropriate config file. That would really be build-once/deploy-many!
FWIW, this is the code I came up with using the link above:
function Initialize-Build {
    $ErrorActionPreference = 'Stop'
    $appName = 'myApp'
    $projsToBuild = 'Ui', 'DataService'
    $projPath = "$appName/$appName"
    $buildPath = 'obj/Release/Package/'
    $releasePath = 'Release-Packages'
    $commonArgs = '/p:Configuration=Release',
                  '/nologo',
                  '/verbosity:quiet'
}

function Invoke-Build {
    foreach ($proj in $projsToBuild) {
        $projName = $proj -eq 'Ui' ? 'WebUi' : $proj
        $p = "$projPath.$proj/$appName.$projName.csproj"
        $buildArgs = '/t:Clean,Build',
                     '/p:AutoParameterizationWebConfigConnectionStrings=False'
        Write-Output "Building $p"
        # splat the shared and build-specific msbuild arguments
        msbuild $p @commonArgs @buildArgs
        if ($LASTEXITCODE) { exit 1 }
    }
}

function Invoke-Transform {
    $xformArgs = "/p:deployOnBuild=True", "/p:Targets=Publish"
    foreach ($proj in $projsToBuild) {
        $projName = $proj -eq 'Ui' ? 'WebUi' : $proj
        $p = "$projPath.$proj/$appName.$projName.csproj"
        $pubProfiles = 'Dev', 'QA', 'UAT', 'Prod'
        foreach ($prof in $pubProfiles) {
            Write-Output "Building $p ($prof)"
            $pubProfileArg = "/p:PublishProfile=$prof"
            msbuild $p @commonArgs @xformArgs $pubProfileArg
            if ($LASTEXITCODE) { exit 1 }
        }
    }
}

# dot-source so the variables set in Initialize-Build stay in scope
. Initialize-Build
Invoke-Build
Invoke-Transform

I realized I wasn't really asking the right question. When I started searching for msbuild transform posts, I found a way to do what I need.
I landed on updating the .csproj files of the apps I'm building with an AfterBuild target.
There are 4 transforms required, each with their own .config file as the transform source. I was fortunate that these files had already been created by the application developer.
This is the code I ended up with, placed at the end of the .csproj file, just before the closing </Project> tag. To reduce repetition of paths and filenames, I created configDir and xformFile properties. I like this pattern because it's easily scalable and generic!
<!-- all the rest of the .csproj above this -->
  <UsingTask TaskName="TransformXml"
             AssemblyFile="$(MSBuildExtensionsPath32)/Microsoft/VisualStudio/v16.0/Web/Microsoft.Web.Publishing.Tasks.dll" />
  <Target Name="AfterBuild">
    <PropertyGroup>
      <configDir>../../Release-Packages/configs</configDir>
      <xformFile>Web.config</xformFile>
    </PropertyGroup>
    <MakeDir Directories="$(configDir)" />
    <TransformXml Source="$(xformFile)"
                  Transform="Web.Dev.config"
                  Destination="$(configDir)/$(AssemblyName).$(xformFile).DEV" />
    <TransformXml Source="$(xformFile)"
                  Transform="Web.QA.config"
                  Destination="$(configDir)/$(AssemblyName).$(xformFile).QA" />
    <TransformXml Source="$(xformFile)"
                  Transform="Web.Prod.config"
                  Destination="$(configDir)/$(AssemblyName).$(xformFile).Prod" />
    <TransformXml Source="$(xformFile)"
                  Transform="Web.UAT.config"
                  Destination="$(configDir)/$(AssemblyName).$(xformFile).UAT" />
  </Target>
</Project>
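With the transformed configs collected in Release-Packages/configs, a deploy step only needs the single build artifact plus the one config that matches the target environment. As a rough PowerShell sketch of that copy step (the package and server paths, parameter names, and config file name here are placeholders, not part of the build above):

param(
    [string]$Environment = 'QA',                       # DEV, QA, Prod or UAT, matching the transform suffixes
    [string]$PackageDir  = 'Release-Packages/myApp',   # wherever the single build artifact lives
    [string]$DeployDir   = '\\webserver\site\myApp'    # deploy target
)

# copy the one build artifact...
Copy-Item -Path "$PackageDir/*" -Destination $DeployDir -Recurse -Force

# ...then overwrite Web.config with the environment-specific transform
# (file names follow the $(AssemblyName).Web.config.<ENV> pattern from the AfterBuild target)
Copy-Item -Path "Release-Packages/configs/myApp.WebUi.Web.config.$Environment" `
          -Destination (Join-Path $DeployDir 'Web.config') -Force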
Many thanks to these posts for lighting the way to this solution:
https://dougrathbone.com/blog/2011/09/14/using-custom-webconfig-transformations-in-msbuild
How do I use custom variables in MSBuild scripts?
https://github.com/sayedihashimi/sayed-samples/blob/master/TransformMultipleWebConfigs/transform.proj
https://www.locktar.nl/general/use-config-transforms-when-debugging-your-web-application/
https://johan.driessen.se/posts/Applying-MSBuild-Config-Transformations-to-any-config-file-without-using-any-Visual-Studio-extensions/
Package multiple publish profiles .pubxml after a successful build in Jenkins CI
https://bartlomiejmucha.com/en/blog/msbuild/how-to-extend-msbuild-publish-pipeline-to-apply-transform-files/

Related

How to include proto files from one project in another project

I am not able to include the protos from project A in project B. The idea is to keep the protos with GrpcServices="Server" in project A and, in project B (the test project), include the same protos again but as GrpcServices="Client".
ProjectA/Protos/Profile.proto
syntax = "proto3";
package profile;
option csharp_namespace = "ProjectA.Protos";
import "google/protobuf/empty.proto";
service ProfileService {
rpc Get(google.protobuf.Empty) returns (Profile);
}
message Profile {
string profile_id = 1;
string description = 2;
}
ProjectA/Protos/User.proto
syntax = "proto3";
package user;
option csharp_namespace = "ProjectA.Protos";
import "google/protobuf/wrappers.proto";
import "Protos/Profile.proto";
service UserService {
rpc Get(google.protobuf.StringValue) returns (UserDetail);
}
message UserDetail {
string id = 1;
string name = 2;
repeated profile.Profile profiles = 7;
}
Project B .csproj (The test project)
<ItemGroup>
<Protobuf Include="..\ProjectA\Protos\*.proto" GrpcServices="Client" ProtoRoot="Protos">
<Link>Protos\*.proto</Link>
</Protobuf>
</ItemGroup>
With these settings I always end up having this error return
error : File does not reside within any path specified using --proto_path (or -I). You must specify a --proto_path which encompasses this file. Note that the proto_path must be an exact prefix of the .proto file names -- protoc is too dumb to figure out when two paths (e.g. absolute and relative) are equivalent (it's harder than you think).
Coming back with the solution I applied.
The problem with importing the same protos from two projects in one solution is that each inclusion in a .csproj generates C# classes, and those classes are generated in a "global context". Generating both "server" and "client" classes this way fails because they would end up with the same names.
The easiest solution was to set GrpcServices="Both" on the proto include. The generated C# classes then cover both the client and the server side, so they can be used in project A (server) and in project B (client).
<Protobuf Include="Protos\*.proto" GrpcServices="Both" />
The solution suggested by Jan Tattermusch in the comments, a separate project C containing only the proto files with the same configuration, is also a great alternative.

How to disable default gradle buildType suffix (-release, -debug)

I migrated a 3rd-party tool's build.gradle configs, so it uses the Android Gradle plugin 3.5.3 and Gradle 5.4.1.
The build goes smoothly, but when I try to make an .aab archive, things break because the toolchain expects the output .aab file to be named MyApplicationId.aab, while the new Gradle defaults to outputting MyApplicationId-release.aab, with a buildType suffix that wasn't there before.
I tried to search for a solution, but documentation about product flavors is mostly about adding a suffix. How do I prevent the default "-release" suffix from being added? There weren't any product flavor blocks in the toolchain's Gradle config files.
I realized that I have to create custom tasks after reading other questions and answers:
How to change the generated filename for App Bundles with Gradle?
Renaming applicationVariants.outputs' outputFileName does not work because those are for .apks.
I'm using Gradle 5.4.1 so my Copy task syntax reference is here.
I don't quite understand where the "app.aab" name string came from, so I defined my own aabFile name string to match my toolchain's output.
I don't care about the source file so it's not deleted by another delete task.
Also my toolchain seems to be removing unknown variables surrounded by "${}" so I had to work around ${buildDir} and ${flavor} by omitting the brackets and using concatenation for proper delimiting.
tasks.whenTaskAdded { task ->
    if (task.name.startsWith("bundle")) { // e.g. bundleRelease
        def renameTaskName = "rename${task.name.capitalize()}Aab" // renameBundleReleaseAab
        def flavorSuffix = task.name.substring("bundle".length()).uncapitalize() // "release"
        tasks.create(renameTaskName, Copy) {
            def path = "$buildDir/outputs/bundle/" + "$flavorSuffix/"
            def aabFile = "${android.defaultConfig.applicationId}-" + "$flavorSuffix" + ".aab"
            from(path) {
                include aabFile
                rename aabFile, "${android.defaultConfig.applicationId}.aab"
            }
            into path
        }
        task.finalizedBy(renameTaskName)
    }
}
As the original answer said: This will add more tasks than necessary, but those tasks will be skipped since they don't match any folder.
e.g.
Task :app:renameBundleReleaseResourcesAab NO-SOURCE

How do I pass MSBuild arguments from the build definition to a MSBuild workflow Activity

I've defined an MSBuild activity in a TFS Workflow template, but currently I have to hard code the 'property' command line arguments directly into the template.
I'd like to be able to specify the arguments in the build definition, via the advanced setting, 'MSBuild Arguments'
I can see that I may have to build up the command line with string replace/concat, as mentioned here, but I can't see what I need to put, maybe something like this:
This is what the default MsBuild task uses:
String.Format("/p:SkipInvalidConfigurations=true {0}", MSBuildArguments)
You can change the MSBuildArguments variable in the build process template in multiple steps. For example, I added a Run Architecture Validation property to the process template and then edited the workflow to simply append /p:ValidateArchitecture=true to MSBuildArguments before it is passed to the MsBuild activity.
<If Condition="[PerformArchitectureValidation]" DisplayName="Configure Architecture Validation MSBuild Arguments">
  <If.Then>
    <Assign>
      <Assign.To>
        <OutArgument x:TypeArguments="x:String">[MSBuildArguments]</OutArgument>
      </Assign.To>
      <Assign.Value>
        <InArgument x:TypeArguments="x:String">[MSBuildArguments + " /p:ValidateArchitecture=true"]</InArgument>
      </Assign.Value>
    </Assign>
  </If.Then>
</If>
The PerformArchitectureValidation variable is defined as a Property on the Build Process Template level of type Boolean.
Update: Wrote a blogpost that explains this with steps and screenshots

How can I get the exit code of the covered executable from the DotCover console runner?

My organization is working on integrating DotCover's console runner (described here and here) into our MSBuild-based build process via a custom MSBuild task.
As you might expect, we are covering NUnit runs over our unit test assemblies. While we are quite happy with the coverage results DotCover is generating, we've discovered that our tests can now fail without causing our build to fail. One step forward, two steps back.
DotCover (at least the way we are running it) completely hides the results of the covered process, both the console output and the exit code. I wish it behaved more like NCover in this regard -- echoing all output and the exit code from the covered process.
Does anyone know how to achieve either of these with the DotCover console runner? Getting the exit code of the covered process is most important, as we need our builds to fail in the event of a test failure.
We are using MSBuild and MSBuild Community Tasks to fail the build.
You can analyze the dotCover-generated output XML file using the XmlRead task from MSBuild Community Tasks.
<Target Name="DetermineCoverage">
  <Message Text="==================================================" />
  <Message Text="Determine Unit Test Coverage" />
  <XmlRead
      XPath="/Root/@CoveragePercent"
      XmlFileName="dotCoverOutput.xml">
    <Output TaskParameter="Value" PropertyName="CoveragePercent"/>
  </XmlRead>
  <Message Text="==" />
  <Message Text="== Coverage Percentage $(CoveragePercent)" />
  <Error
      Text="Unit Test coverage did not exceed the desired threshold"
      Condition="$(CoveragePercent) &lt; 90" />
  <Message Text="==================================================" />
</Target>
I ran into the same issue.
An alternative workaround is to run your unit tests with your test runner outside of dotCover first. This will expose the correct return codes.
And then as a second step, run dotCover to get your coverage results.
Hope this helps somebody.
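In a PowerShell build step that two-pass approach might look roughly like this (the runner paths, NUnit switches, and dotCover cover switches are illustrative; check the console runners' help for your versions):

# step 1: run the tests on their own so the build sees the real exit code
& '.\tools\nunit\nunit-console.exe' '.\MyTests\bin\Release\MyTests.dll' '/xml:nunit-results.xml'
if ($LASTEXITCODE -ne 0) { throw "Unit tests failed (exit code $LASTEXITCODE)" }

# step 2: run the same tests again under dotCover purely for the coverage numbers
& '.\tools\dotCover\dotCover.exe' cover `
    '/TargetExecutable=.\tools\nunit\nunit-console.exe' `
    '/TargetArguments=.\MyTests\bin\Release\MyTests.dll' `
    '/Output=dotCoverOutput.xml'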
Since I needed a workaround quickly, I've added a post-processing step to scrape the NUnit xml file generated during the DotCover run for test case failures, and I fail the build if I find any. I did this via a simple custom MSBuild Task:
public class CheckNUnitResults : Task
{
    [Required]
    public string ResultFile { get; set; }

    public override bool Execute()
    {
        Log.LogMessageFromText("Analyzing nunit results : " + ResultFile, MessageImportance.Normal);

        // test-case elements without a success attribute (e.g. ignored tests) are skipped
        var failedCases =
            XDocument
                .Load(ResultFile)
                .Descendants("test-case")
                .Where(xe => (string)xe.Attribute("success") == "False")
                .ToList();

        var fail = failedCases.Any();
        if (fail)
        {
            Log.LogError("Found test case failures : " +
                string.Join(", ", failedCases
                    .Select(xe => (string)xe.Attribute("name"))
                    .ToArray()));
        }
        return !fail;
    }
}
This is called from MSBuild thusly:
<CheckNUnitResults ResultFile="YourNUnitResultFile.xml" />
(I call it in a loop over an ItemGroup containing my result files)

msbuild reference resolution

I've been doing some work recently that analyzes relationships between various projects in source control. So far I've been using PowerShell and XPath through the Select-Xml cmdlet to process our csproj files, however this relies on my tenuous knowledge of how MSBuild uses the ProjectReference and Reference elements in the project files. It dawned on me that it would be much better if I could use MSBuild itself to resolve the references and then somehow inspect the results of the reference resolution process.
MSBuild experts: does this seem possible? Would this entail writing a custom targets file or something? Would I be forced into building the projects as well, since csproj files import Microsoft.CSharp.targets?
Any insight would be nice. Thanks!
It is really quite easy. First reference these assemblies:
Microsoft.Build
Microsoft.Build.Engine
Microsoft.Build.Framework
Microsoft.Build.Utilities.v4.0
...and you can create some tooling around the MSBuild object model. I've got a custom MSBuild task that does this analysis right in the build, snippet below:
private bool CheckReferences(string projectFullPath)
{
    var project = new Project(projectFullPath);

    var items = project.GetItems("Reference");
    if (items == null)
        return true;

    foreach (var item in items)
    {
        if (item == null)
            continue;
        if (string.IsNullOrWhiteSpace(item.UnevaluatedInclude))
            continue;
        if (!item.HasMetadata("HintPath"))
            continue;

        string include = item.UnevaluatedInclude;
        string hintPath = item.GetMetadata("HintPath").UnevaluatedValue;

        if (!string.IsNullOrWhiteSpace(hintPath))
            if (hintPath.Contains(@"C:\") || hintPath.Contains("C:/"))
                LogWarning("Absolute path Reference in project {0}", projectFullPath);
    }
    return true;
}
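Since the question mentions driving this from PowerShell, the same object model can be used there as well. A rough sketch, assuming the MSBuild 4.0+ assemblies are available in the GAC and using a placeholder project path:

# load the MSBuild object model and list evaluated references with their HintPaths
Add-Type -AssemblyName 'Microsoft.Build'

$project = New-Object Microsoft.Build.Evaluation.Project 'C:\src\MyApp\MyApp.csproj'

foreach ($item in $project.GetItems('Reference')) {
    [pscustomobject]@{
        Kind     = 'Reference'
        Include  = $item.EvaluatedInclude
        HintPath = $item.GetMetadataValue('HintPath')
    }
}
foreach ($item in $project.GetItems('ProjectReference')) {
    [pscustomobject]@{
        Kind     = 'ProjectReference'
        Include  = $item.EvaluatedInclude
        HintPath = ''
    }
}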