This is a feature I'm used to from TeamCity - I could specify that a certain build configuration will be triggered by the success of another build configuration.
I could even pass the results of one build to another - but perhaps this is asking too much.
I'm looking for similar functionality in TFS 2008: is there a way to set a trigger on a build configuration so that it starts after another build configuration finishes successfully?
I use the following targets in my TFSBuild.proj.
First, inject the new targets into the build process. We only trigger dependent builds if a "drop" was successfully created:
<PropertyGroup>
<DropBuildDependsOn>
$(DropBuildDependsOn);
CreateDependentBuildItemGroup;
TriggerDependentBuilds;
</DropBuildDependsOn>
</PropertyGroup>
Next, create an ItemGroup that lists the dependent builds we want to trigger (the Include attribute gives the name of the dependent build as it appears in Build Explorer - in the example below, the dependent build is called "Integration"). In our build process we sometimes want to trigger more than one build, and we want to point the next build at the binaries produced by the current build (in this example, I want to run integration tests against the binaries just produced). Notice the hack to get around spaces in configuration names - e.g. "Any CPU" will cause a problem in the MSBuild args. Using this format, we can have custom MSBuild args per dependent build.
<Target Name="CreateDependentBuildItemGroup">
<ItemGroup>
<DependentBuild Include="Integration">
<!--Using 8dot3 format for "Mixed Platforms" as it's tricky (impossible?) to pass a space char within /msbuildarguments of tfsbuild-->
<MsBuildArgs>/p:CallingBuildDropFolder=$(DropLocation)\$(BuildNumber)\Mixedp~1\Ship;CiSmallBuildNumber=$(CiSmallBuildNumber);BuildNumberPostFix=$(BuildNumberPostFix)</MsBuildArgs>
<PriorityArg>/priority:AboveNormal</PriorityArg>
</DependentBuild>
</ItemGroup>
</Target>
Now, trigger the builds. Notice that we use a custom GetOption: we want to make sure that dependent builds use the same changeset as the current build. We can't use Latest, because someone may have checked in in the meantime, and we want all dependent builds in our "chain" to be based off the same changeset. The actual command is inside the Exec task, and the BuildStep elements report the success (or failure) of the Exec.
<Target Name="TriggerDependentBuilds"
Condition=" '$(CompilationStatus)' == 'Succeeded' ">
<BuildStep TeamFoundationServerUrl="$(TeamFoundationServerUrl)"
BuildUri="$(BuildUri)"
Name="TriggerStep"
Message="Triggering Dependent Builds">
<Output TaskParameter="Id"
PropertyName="TriggerStepId" />
</BuildStep>
<PropertyGroup>
<TriggerBuildCommandBase>TfsBuild start $(TeamFoundationServerUrl) $(TeamProject)</TriggerBuildCommandBase>
</PropertyGroup>
<Exec
ContinueOnError="true"
WorkingDirectory="C:\Program Files (x86)\Microsoft Visual Studio 9.0\Common7\IDE"
Command="$(TriggerBuildCommandBase) %(DependentBuild.Identity) /queue /getOption:Custom /customGetVersion:$(GetVersion) %(DependentBuild.PriorityArg) /msbuildarguments:"%(DependentBuild.MsBuildArgs)"">
<Output TaskParameter="ExitCode"
ItemName="TfsBuildResult"/>
</Exec>
<BuildStep Condition="'#(TfsBuildResult)'=='0'"
TeamFoundationServerUrl="$(TeamFoundationServerUrl)"
BuildUri="$(BuildUri)"
Id="$(TriggerStepId)"
Status="Succeeded" />
<BuildStep Condition="'#(TfsBuildResult)'!='0'"
TeamFoundationServerUrl="$(TeamFoundationServerUrl)"
BuildUri="$(BuildUri)"
Id="$(TriggerStepId)"
Status="Failed" />
</Target>
I hope that helps...
I'm using MSBuild SDK-style projects with VS 2019. I'm trying to run a custom file-generation tool that depends on the output of building the current project. The generated files should be treated as if they were regular content for which CopyToOutputDirectory is set, and in dependent projects I expect them to end up in the output directory as well. The solution I have now works, but not for clean builds, which is obviously not acceptable.
I currently have this in the project file:
<Target Name="Generation" AfterTargets="AfterBuild">
<Exec Command="GeneratedFiles" />
<ItemGroup>
<Content Include="$(TargetDir)\GeneratedFiles.*.xml">
<TargetPath>GeneratedFiles\%(Filename)%(Extension)</TargetPath>
<CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory>
</Content>
</ItemGroup>
</Target>
This works, but only for non-clean builds.
The reason the Generation target doesn't work is that the logic that performs the copies, based on the presence and value of the CopyToOutputDirectory metadata, runs as part of 'CoreBuild'. The Generation target has AfterTargets="AfterBuild", so it runs after the build (and the copying) has already completed.
The Generation target doesn't work for the first build and it doesn't work for subsequent incremental builds either.
The question description says that the files are copied for an incremental build. While it may be true that the files are copied, they can't be getting copied because of the Generation target. Without seeing the complete project file, I assume there is another place in the project where the CopyToOutputDirectory metadata is being set for the files.
To have the Generation target run after compilation and before files are copied, the following can be used:
<Target Name="Generation" AfterTargets="AfterCompile" BeforeTargets="GetCopyFilesToOutputDirectoryItems">
The GetCopyFilesToOutputDirectoryItems target is publicly documented, and it runs before another publicly documented target named CopyFilesToOutputDirectory. CopyFilesToOutputDirectory is defined with a DependsOnTargets attribute that runs a set of targets that perform the actual copying. This means the file copies are completed before the CopyFilesToOutputDirectory target itself completes.
To ensure the correct order, BeforeTargets="GetCopyFilesToOutputDirectoryItems" is used and not BeforeTargets="CopyFilesToOutputDirectory".
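Putting this together with the target from the question, the corrected version would look roughly like this (a sketch only - the GeneratedFiles command and the file pattern are taken verbatim from the question):
<Target Name="Generation" AfterTargets="AfterCompile" BeforeTargets="GetCopyFilesToOutputDirectoryItems">
  <!-- run the custom generator (command as given in the question) -->
  <Exec Command="GeneratedFiles" />
  <ItemGroup>
    <!-- treat the generated files as content so the normal copy logic picks them up -->
    <Content Include="$(TargetDir)GeneratedFiles.*.xml">
      <TargetPath>GeneratedFiles\%(Filename)%(Extension)</TargetPath>
      <CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory>
    </Content>
  </ItemGroup>
</Target>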
If the scenario were different and <Exec Command="GeneratedFiles" /> didn't depend on the compilation step, i.e. the GeneratedFiles command didn't use the assembly being created by the project, then the Generation target could occur even earlier, e.g.
<Target Name="Generation" BeforeTargets="BeforeBuild">
Update - Execute a Target if ProjectReference has a specific Project
This update is a response to discussion in the comments.
Let's say we have a FooApp project and a BarLib project and FooApp depends on BarLib and needs to copy arbitrary files from the BarLib project directory.
FooApp has a ProjectReference to BarLib, e.g.
<ItemGroup>
<ProjectReference Include="..\BarLib\BarLib.csproj" />
</ItemGroup>
ProjectReference is an item list, and we can check whether it includes BarLib.
<Target Name="CopyFilesFromBar" BeforeTargets="BeforeBuild">
<PropertyGroup>
<!-- The name of the project to look for -->
<BarProjectName>BarLib</BarProjectName>
<!-- Use batching to check the ItemGroup -->
<!-- Save the project's directory because we will need it later -->
<BarProjectDirectory Condition="'%(ProjectReference.Filename)' == '$(BarProjectName)'">%(ProjectReference.RootDir)%(ProjectReference.Directory)</BarProjectDirectory>
<!-- Set up a boolean that indicates if the project was found or not -->
<HasProjectRefToBar>false</HasProjectRefToBar>
<HasProjectRefToBar Condition="'$(BarProjectDirectory)' != ''">true</HasProjectRefToBar>
</PropertyGroup>
<!-- Copy if the project was found in ProjectReference -->
<Copy SourceFiles="$(BarProjectDirectory)bin\GeneratedFiles\*.*" DestinationFolder="$(OutputPath)GeneratedFiles" Condition="$(HasProjectRefToBar)" />
</Target>
This target could be defined once in a Directory.Build.targets file and shared across a solution.
If the generated files (in BarLib in the example scenario) don't change based on Configuration and Platform, consider using an output path location that doesn't change, as in the example - 'bin\GeneratedFiles'. This makes it much easier for consuming projects. Otherwise, keep all the projects in sync with regard to using the same Configuration and Platform values and the same $(OutputPath).
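If you do put the target in a Directory.Build.targets file, the wrapper is just an ordinary project file placed at the solution root (MSBuild 15+ imports it automatically for every project underneath); a minimal sketch:
<!-- Directory.Build.targets at the repository/solution root -->
<Project>
  <Target Name="CopyFilesFromBar" BeforeTargets="BeforeBuild">
    <!-- same body as the CopyFilesFromBar target shown above -->
  </Target>
</Project>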
I'm invoking an MSBuild script that isn't a csproj from a bat script. I would like that script to be able to use the MSBuild Community Tasks, and I don't want to have to install it on every machine, nor do I want to include its binaries in my repo.
By adding these nodes to the script and calling the restore target, the package restores.
<Import Project="$(MSBuildToolsPath)\Microsoft.CSharp.targets" />
<ItemGroup>
<PackageReference Include="MSBuildTasks">
<Version>1.*</Version>
</PackageReference>
</ItemGroup>
To use the tasks it contains, I just reference them; I don't need to import any other .targets files:
<Target Name="MyTarget" DependsOnTargets="Restore">
<AssemblyInfo CodeLanguage="CS"
OutputFile="$(VersionInfoFile)"
AssemblyVersion="1.2.3.5"
/>
</Target>
However, the first time I run my script, the package restores, but then the script fails because it can't find the AssemblyInfo task. The second time, it succeeds. Is there any way to get this to work without calling the MSBuild script twice (the first time, specifically running the Restore target)?
You can force a re-evaluation of the imports generated by NuGet by calling the msbuild file from itself using the <MSBuild> task with a different set of global properties (!).
<Target Name="MyTarget" DependsOnTargets="Restore">
<MSBuild Projects="$(MSBuildProject)" Targets="MyTargetCore" Properties="Foo=Bar" />
</Target>
<Target Name="MyTargetCore">
<AssemblyInfo CodeLanguage="CS"
OutputFile="$(VersionInfoFile)"
AssemblyVersion="1.2.3.5"
/>
</Target>
Depending on the circumstances (solution build, project references), it may or may not work without the Properties="Foo=Bar" part.
However, note that this is a bit risky since not all msbuild caches can even be cleared using the arguments on the MSBuild task. MSBuild 15.5 is going to add a /restore switch that will execute the Restore target, clear all necessary caches and then do the other requested work. So in 15.5 you should be able to call msbuild /restore /t:MyTarget without any difficulties.
I'm currently writing an msbuild script to build a solution I've been working on, as well as run its tests. On my development machine, this works as expected. However, when I try to run the same build script on our build server, I get several failures. I've tracked the source of the problem down to the fact that my build script appears to be trying to run the .exe file associated with my application. This line during the script execution tipped me off, since it doesn't run that command on my dev box:
MSIAuthoring:
Building MSI
"C:\Program Files (x86)\Jenkins\workspace\Test Build\BuildArtifacts\MsiBuildTool.exe" "/MBSBUILD:MsiBuildTool"
I'm fairly new to build scripting, but my understanding is that the build script shouldn't be trying to run my program unless I explicitly tell it to do so. Does anyone know what might be causing this?
For reference, here is my build script:
<?xml version="1.0" encoding="utf-8"?>
<Project DefaultTargets="RunTests"
xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
<ItemGroup>
<BuildArtifactsDir Include="BuildArtifacts\"/>
<SolutionFile Include="MsiBuildTool.sln"/>
<NUnitConsole Include="C:\Program Files (x86)\NUnit 2.6.4\bin\nunit-console.exe"/>
<UnitTestsDll Include="BuildArtifacts\MsiBuildToolUnitTests.dll"/>
<TestResultsPath Include="BuildArtifacts\TestResults.xml"/>
</ItemGroup>
<PropertyGroup>
<Configuration Condition="'$(Configuration)' == ''">Release</Configuration>
<Platform Condition="'$(Platform)' == ''">Any CPU</Platform>
</PropertyGroup>
<Target Name="Init" DependsOnTargets="Clean">
<MakeDir Directories="@(BuildArtifactsDir)"/>
</Target>
<Target Name="Clean">
<RemoveDir Directories="@(BuildArtifactsDir)"/>
</Target>
<Target Name ="Compile" DependsOnTargets="Init">
<MSBuild Projects="#(SolutionFile)"
Targets ="Build"
Properties ="OutDir=%(BuildArtifactsDir.FullPath);Configuration=$(Configuration);Platform=$(Platform)"/>
</Target>
<Target Name="RunTests" DependsOnTargets="Compile">
<Exec Command='"#(NUnitConsole)" #(UnitTestsDll) /xml=#(TestResultsPath)'/>
</Target>
</Project>
Update:
After some digging through the output, I found that the "MSIAuthoring" step was the result of the Wix# library that I'm using, as described in this thread: https://wixsharp.codeplex.com/discussions/644609#
I disabled the MSIAuthoring step by removing this line from my .csproj files:
<Import Project="..\packages\WixSharp.1.0.22.3\build\WixSharp.targets" Condition="Exists('..\packages\WixSharp.1.0.22.3\build\WixSharp.targets')" />
You're building a solution file, so MSBuild will first generate an MSBuild XML script from the solution and then build that. To find out why the step runs on your build machine but not on your dev machine, follow this advice and obtain the generated MSBuild scripts from your dev environment and your build server, then compare them.
Also enable diagnostic logging (/verbosity:diag on the command line), as Lex Li advised, and you'll see detailed reasons why each target does or doesn't run - grep the logs for something like "Conditions A, B, C on target BuildMSI evaluated to False" and this will show you the difference between the environments.
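For example, something along these lines (the project file name and target are placeholders for whatever your Jenkins job actually invokes):
msbuild YourBuild.proj /t:RunTests /verbosity:diag /fileLogger /fileLoggerParameters:LogFile=msbuild-diag.log;Verbosity=diagnostic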
It might be some kind of post-build script on one of the projects that builds the MSI only when it's not run in the dev environment - check the actual build scripts to find where it comes from. Also check that it's really coming from your build script and not from an extra build step in your Jenkins build configuration.
I want to run an MSBuild Task (which signs an executable/dll) but only when the output exe/dll has changed. If none of the source files have changed causing a recompile of the exe/dll then I don't want the task to run.
Despite spending several hours trying different things, I cannot work out how to get my target to run only if the project has actually been compiled and the output files have changed (in other words, only when the CoreCompile target was not skipped, I think).
You can just do this:
<PropertyGroup>
<TargetsTriggeredByCompilation>DoStuffWithNewlyCompiledAssembly</TargetsTriggeredByCompilation>
</PropertyGroup>
This works because someone smart at Microsoft added the following line at the end of the CoreCompile target in Microsoft.[CSharp|VisualBasic][.Core].targets (the file name depends on the language and MSBuild/Visual Studio version).
<CallTarget Targets="$(TargetsTriggeredByCompilation)" Condition="'$(TargetsTriggeredByCompilation)' != ''"/>
So if you specify a target name in the TargetsTriggeredByCompilation property, your target will run if CoreCompile runs, and it will not run if CoreCompile is skipped (e.g. because the output assembly is already up to date with respect to the code).
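For the signing scenario in the question, the triggered target could look roughly like this (a sketch only: the signtool invocation is an assumption, and @(IntermediateAssembly) is the freshly compiled assembly in obj, before it gets copied to the output folder):
<Target Name="DoStuffWithNewlyCompiledAssembly">
  <!-- hypothetical Authenticode signing step; replace the signtool options with your certificate details -->
  <Exec Command="signtool sign /a &quot;%(IntermediateAssembly.FullPath)&quot;" />
</Target>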
Should be the same as this answer, using the TargetOutputs parameter:
<MSBuild Projects="File.sln" >
<Output TaskParameter="TargetOutputs" ItemName="AssembliesBuiltByChildProjects" />
</MSBuild>
<Message Text="Assemblies built: #(AssembliesBuiltByChildProjects)" /> <!-- just for debug -->
<CallTarget Targets="SignExe" Condition="'#(AssembliesBuiltByChildProjects)'!=''" />
Here's what I'm trying to do:
A single build script
That script builds two executables from the same Visual Studio project.
The first compiled .exe has a small amount of code disabled.
The other compiled .exe has everything enabled.
I've been reading up on conditional compilation and that takes care of my needs as far as enabling/disabling blocks of code.
I just can't figure out how to control conditional compilation from a build script using msbuild.
Is there a way to manipulate conditional compilation variables from a build script or some other way to accomplish what I'm trying to do?
Use build configurations in your project file. Set the parameters in a PropertyGroup that is optionally included based on the configuration. The configuration can then also define the output path for the two different versions of the assembly.
For the version that needs some code removed, use a configuration that includes this PropertyGroup:
<PropertyGroup Condition=" '$(Configuration)|$(Platform)' == 'CompiledOutDebug|AnyCPU' ">
<DefineConstants>$(DefineConstants);MY_CONDITIONAL_COMPILATION_CONSTANT</DefineConstants>
</PropertyGroup>
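If the two variants should also land in different folders, the same conditional PropertyGroup (or a second one) can set the output path; the folder name below is just an illustration:
<PropertyGroup Condition=" '$(Configuration)|$(Platform)' == 'CompiledOutDebug|AnyCPU' ">
  <OutputPath>bin\CompiledOutDebug\</OutputPath>
</PropertyGroup>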
Then use an MSBuild script that calls the project MSBuild script twice and uses the Properties attribute of the MSBuild task to specify the configuration to build:
<Target Name="Build">
<MSBuild Projects="MyProject.csproj;"
Targets="Build"
Properties="Configuration=Release" />
<MSBuild Projects="MyProject.csproj"
Targets="Build"
Properties="Configuration=CompiledOutDebug" />
</Target>
Hamish beat me to it.
Here's an alternate solution using the same concepts:
At the command line:
msbuild -t:Clean
msbuild
CopyOutputDirForWithoutDefine.cmd
msbuild -t:Clean
msbuild -property:DefineConstants=MY_CONDITIONAL_COMPILE_CONSTANT
CopyOutputDirForWithDefine.cmd
The 1st and 3rd 'msbuild -t:Clean' calls ensure that you don't have leftover artifacts from previous builds. The 2nd 'msbuild' builds without the conditional define, while the 4th builds with the conditional define.
If the above are just a couple of one-shot items, then a batch file may be enough. I recommend learning a bit of MSBuild and actually scripting everything in an MSBuild file as Hamish has done.
If you don't want to create a separate target for the two compilations, you can do it by specifying the conditional define in the DefineConstants property when you call the build the second time:
<Target Name="Build">
<MSBuild Projects="MyProject.csproj;"
Targets="Build"
Properties="Configuration=Debug" />
<MSBuild Projects="MyProject.csproj"
Targets="Build"
Properties="Configuration=Debug;
AssemblyName=$(AssemblyName)_Conditional;
DefineConstants=$(DefineConstants);CONDITIONAL_DEFINE" />
</Target>
Note that if you do it this way, you also need to override the AssemblyName; otherwise your second build might pick up intermediate files from the first build.
You should also look at the MSBuild task docs on MSDN; there are some interesting tidbits there.