MSBUILD Executing only Changed SQL Scripts

I need to construct an MSBuild script that executes only the .sql scripts which have changed since the last build.
I initially thought that I could copy all scripts from one folder to another using the <Copy> task and use its CopiedFiles <Output> parameter. However, the Copy task returns all the files that it attempted to copy, not the files it actually copied.
I am able to get MSBuild to execute SQL scripts via MSBuild.ExtensionPack, but I'm scratching my head on this one.

You can do this with a concept known as incremental building. The idea is that you create a target and specify its inputs and outputs, which are files. MSBuild compares the timestamps of the input files against those of the output files. If all outputs were created after all inputs, the target is skipped. If all inputs are newer than the outputs, the target is executed for all files. If only a portion of the inputs are out of date, then only those are passed to the target. For more info on this, see the section Using Incremental Builds in my article Best Practices For Creating Reliable Builds, Part 2.
Also for more resources on MSBuild I have compiled a list at http://sedotech.com/Resources#MSBuild

<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003" DefaultTargets="RunScripts">
  <Import Project="$(MSBuildExtensionsPath)\ExtensionPack\MSBuild.ExtensionPack.tasks"/>

  <PropertyGroup>
    <ConnStr>Server=Example;Database=Example;Trusted_Connection=True</ConnStr>
    <BuildFolder>Build\</BuildFolder>
  </PropertyGroup>

  <ItemGroup>
    <Scripts Include="*.sql"/>
  </ItemGroup>

  <Target Name="RunScripts"
          Inputs="@(Scripts)"
          Outputs="@(Scripts->'$(BuildFolder)%(Filename)%(Extension)')">

    <SqlExecute TaskAction="Execute"
                Files="@(Scripts)"
                ConnectionString="$(ConnStr)"/>

    <Copy SourceFiles="@(Scripts)"
          DestinationFiles="@(Scripts->'$(BuildFolder)%(Filename)%(Extension)')"/>
  </Target>
</Project>
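Assuming the project above is saved as, say, RunScripts.proj (the file name is arbitrary), it can be run with msbuild RunScripts.proj /t:RunScripts. On the first run every script is executed and copied; on later runs only the scripts that are newer than their copies under $(BuildFolder) are passed to SqlExecute.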

Could it be that you are copying into an empty destination?
SkipUnchangedFiles
If true, skips the copying of files that are unchanged between the source and destination. The Copy task considers files to be unchanged if they have the same size and the same last modified time.
In your case I suspect that all files are considered changed since they don't exist at the destination yet.
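A quick way to test that theory is to run the copy twice in a row and watch the log; this sketch just reuses the @(Scripts) and $(BuildFolder) names from the project above:

<Target Name="CopyOnce">
  <!-- First run: the destination is empty, so every file is physically copied.
       Second run: sizes and timestamps match, so SkipUnchangedFiles suppresses the copies. -->
  <Copy SourceFiles="@(Scripts)"
        DestinationFolder="$(BuildFolder)"
        SkipUnchangedFiles="true"/>
</Target>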

Related

How to add an item only when an incrementally executed target is run?

Say we have the following MSBuild project that defines a target which can be partially run:
<Project DefaultTargets="Foo">
  <ItemGroup>
    <MyInputs Include="**/*.json"/>
  </ItemGroup>

  <Target Name="Foo"
          Condition="'@(MyInputs)' != ''"
          Inputs="@(MyInputs)"
          Outputs="@(MyInputs->'%(FileName).cs')">
    <MyCustomTask FileToProcess="%(MyInputs.Identity)"/>
    <ItemGroup>
      <Compile Include="%(MyInputs.FileName).cs"/>
    </ItemGroup>
  </Target>
</Project>
The problem is that all items are added to Compile, even those whose respective MyCustomTask invocations are not run due to incremental building. Apparently, MSBuild always processes ItemGroups inside targets.
Is there a way to add an item inside a target only when the respective target batch is run? I tried using CreateItem, reasoning that since it is a task it might be skipped just like MyCustomTask, but it didn't work.
My specific problem was that when the source files already existed, they were included twice in the Compile item, which raised a warning. That was when I learned about the KeepDuplicates attribute, which saved me.
But the question still stands.
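For reference, a minimal sketch of the KeepDuplicates fix mentioned above, applied to the example target (everything else unchanged):

<Target Name="Foo"
        Condition="'@(MyInputs)' != ''"
        Inputs="@(MyInputs)"
        Outputs="@(MyInputs->'%(FileName).cs')">
  <MyCustomTask FileToProcess="%(MyInputs.Identity)"/>
  <ItemGroup>
    <!-- KeepDuplicates="false" skips the add when an identical Compile item already exists,
         avoiding the duplicate-item warning described above -->
    <Compile Include="%(MyInputs.FileName).cs" KeepDuplicates="false"/>
  </ItemGroup>
</Target>

This does not stop the ItemGroup from being evaluated for skipped batches; it only prevents the duplicates from accumulating, so the original question remains open.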

How to use MSBuild transform when ItemGroup files all have identical names?

I have a bunch of files x.txt in various directories throughout my project. As part of my build step, I would like to collect these and place them in a single folder (without needing to declare each one). I can detect them by using:
<ItemGroup>
  <MyFiles Include="$(SRCROOT)\**\x.txt"/>
</ItemGroup>
However, if I copy these to a single folder, they all overwrite each other. I have tried using a transform that appends a GUID to each file name, but the GUID is only created once and then reused for each transform (so they still overwrite each other). Is there a way of generating unique names in MSBuild when copying an ItemGroup of identically named files? The end naming scheme is not important, as long as all the files end up in that folder.
The transform works, but you have to 'force' it to generate new data on each iteration. It took me a while to figure that out; it makes sense now, but I couldn't find any documentation explaining it. It works simply by referencing other existing metadata: MSBuild sees that the metadata reference has to be evaluated on every iteration, so it will happily evaluate anything else that is part of the new metadata as well. For example, simply using %(FileName):
<Target Name="CreateUniqueNames">
<ItemGroup>
<MyFiles Include="$(SRCROOT)\**\x.txt"/>
<MyFiles>
<Dest>%(Filename)$([System.Guid]::NewGuid())%(FileName)</Dest>
</MyFiles>
</ItemGroup>
<Message Text="%(MyFiles.Identity) -> %(MyFiles.Dest)"/>
</Target>
Alternatively, you can make use of the unique metadata you already have, namely RecursiveDir:
<Target Name="CreateUniqueNames">
<ItemGroup>
<MyFiles Include="$(SRCROOT)\**\x.txt"/>
<MyFiles>
<Dest>x_$([System.String]::Copy('%(RecursiveDir)').Replace('\', '_')).txt</Dest>
</MyFiles>
</ItemGroup>
<Message Text="%(MyFiles.Identity) -> %(MyFiles.Dest)"/>
</Target>
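Not part of the original answer, but once each item carries a unique Dest value, the actual collection step could look like this; $(CollectedFolder) is a made-up property name for the single destination folder:

<Target Name="CollectFiles" DependsOnTargets="CreateUniqueNames">
  <!-- Copy every x.txt to the collection folder under its unique Dest name -->
  <Copy SourceFiles="@(MyFiles)"
        DestinationFiles="@(MyFiles->'$(CollectedFolder)\%(Dest)')"/>
</Target>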

Can MSBuild be explicitly instructed that a partial incremental rebuild is possible for a target?

I'm a competent user of GNU make on Unix, tasked with writing a build system using MSBuild on Windows 7. (Cygwin is not an option)
The MSBuild documentation on partial incremental builds states that:
MSBuild attempts to find a 1-to-1 mapping between the values of [the Inputs and Outputs attributes].
[...]
1-to-1 mappings are typically produced by item transformations.
If 1-to-1 mappings are typically produced by item transformations, are there any other ways in which they may be produced?
The page contains the following example target:
<Target Name="Backup" Inputs="#(Compile)"
Outputs="#(Compile->'$(BackupFolder)%(Identity).bak')">
<Copy SourceFiles="#(Compile)" DestinationFiles=
"#(Compile->'$(BackupFolder)%(Identity).bak')" />
</Target>
It bothers me that the transformation from the @(Compile) Item to the set of .bak files is duplicated. It appears once in the Target's Outputs attribute, and once in the Copy Task's DestinationFiles attribute.
I'd like to be able to specify the transformation just once, preferably declaring an Item for the backup files. I might want to use the same Item in other Tasks and Targets. For example:
<ItemGroup>
  <BackupFiles Include="@(Compile->'$(BackupFolder)%(Identity).bak')" />
</ItemGroup>

<Target Name="Backup" Inputs="@(Compile)" Outputs="@(BackupFiles)">
  <Copy SourceFiles="@(Compile)" DestinationFiles="@(BackupFiles)" />
</Target>
Then I could use @(BackupFiles) as the Outputs of the Target, the DestinationFiles of the Copy, and potentially elsewhere too. However, if I do this then partial incremental building stops happening. If all outputs are up to date then the target is correctly skipped, but if some outputs are out of date then the target is run for all inputs. It appears that the 1-to-1 mapping between @(Compile) and @(BackupFiles) is not identified when the target is run.
My questions are:
Can MSBuild be instructed that a 1-to-1 mapping exists between a Target's Inputs and Outputs attributes when it has failed to spot one itself, such as @(Compile) and @(BackupFiles) above?
Is there an equivalent to make's $^, $< and $@ values, that is, the inputs and outputs of the current target? In particular, I'd like to use something like $@ as the DestinationFiles of the Copy task. That is: "Whatever you identified as the output, use that".
PS: I have the "Using MSBuild and Team Foundation Build" book (2nd edition), if anybody wishes to refer to it.
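No answer is recorded here, but one pattern that may get part of the way (the Dest metadata name is made up for illustration) is to declare the source-to-backup mapping once, as metadata on an item defined outside the target, and keep the transform expression in the Outputs attribute, which is the form the documentation quoted above says MSBuild recognizes:

<ItemGroup>
  <!-- Identity stays the source file; Dest records its backup path once -->
  <BackupFiles Include="@(Compile)">
    <Dest>$(BackupFolder)%(Identity).bak</Dest>
  </BackupFiles>
</ItemGroup>

<Target Name="Backup"
        Inputs="@(BackupFiles)"
        Outputs="@(BackupFiles->'%(Dest)')">
  <Copy SourceFiles="@(BackupFiles)"
        DestinationFiles="@(BackupFiles->'%(Dest)')" />
</Target>

The backup paths can then be reused elsewhere as @(BackupFiles->'%(Dest)'). This is a sketch rather than a confirmed answer: the .bak paths still appear as a transform wherever they are needed, and whether that satisfies the "declare it once" goal is a judgment call.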

Can MSBuild ItemGroups be chunked?

I've got an ItemGroup that includes source files from my project:
<ItemGroup>
  <SourceFiles Include=".\**\*.h;.\**\*.cpp"/>
</ItemGroup>
There are a few hundred source files. I want to pass them to a command line tool in an Exec task.
If I call the command line tool individually for each file:
<Exec Command="tool.exe %(SourceFiles.FullPath)" WorkingDirectory="."/>
Then, it runs very slowly.
If I call the command line tool and pass all of the files in one go:
<Exec Command="tool.exe #(SourceFiles -> '"%(FullPath)"', ' ')" WorkingDirectory="."/>
Then, I get an error if there are too many files (I'm guessing the command line length exceeds some maximum).
Is there a way I can chunk the items so that the tool can be called a number of times, each time passing up to a maximum number of source file names to the tool?
I'm not aware of any mechanism to do that with well-known item metadata. What you could do is load all those paths into their own item group and write a custom task that calls the Exec task. Writing a custom task is pretty simple; it can even be done inline:
http://msdn.microsoft.com/en-us/library/vstudio/dd722601(v=vs.100).aspx
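A rough sketch of that suggestion using the inline-task support from the linked page; the task name ExecInChunks, its parameters, and the chunk size of 50 are invented for illustration, and it shells out with System.Diagnostics.Process instead of reusing the Exec task:

<UsingTask TaskName="ExecInChunks"
           TaskFactory="CodeTaskFactory"
           AssemblyFile="$(MSBuildToolsPath)\Microsoft.Build.Tasks.v4.0.dll">
  <ParameterGroup>
    <Files ParameterType="Microsoft.Build.Framework.ITaskItem[]" Required="true"/>
    <Tool ParameterType="System.String" Required="true"/>
    <ChunkSize ParameterType="System.Int32"/>
  </ParameterGroup>
  <Task>
    <Code Type="Fragment" Language="cs"><![CDATA[
      // Invoke the tool once per chunk of files instead of once per file.
      int size = ChunkSize > 0 ? ChunkSize : 50;
      for (int i = 0; i < Files.Length; i += size)
      {
          var args = new System.Text.StringBuilder();
          for (int j = i; j < Files.Length && j < i + size; j++)
          {
              args.Append('"').Append(Files[j].GetMetadata("FullPath")).Append("\" ");
          }
          Log.LogMessage("Running: " + Tool + " " + args);
          var psi = new System.Diagnostics.ProcessStartInfo(Tool, args.ToString());
          psi.UseShellExecute = false;
          using (var p = System.Diagnostics.Process.Start(psi))
          {
              p.WaitForExit();
              if (p.ExitCode != 0)
              {
                  Log.LogError(Tool + " exited with code " + p.ExitCode);
              }
          }
      }
    ]]></Code>
  </Task>
</UsingTask>

<Target Name="RunTool">
  <ExecInChunks Files="@(SourceFiles)" Tool="tool.exe" ChunkSize="50"/>
</Target>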

The output parameter 'CopiedFiles' of the Copy task returns all the files specified to copy, even when it copies nothing given SkipUnchangedFiles="true"

The CopiedFiles parameter is returning all the files that were intended to be copied, even though SkipUnchangedFiles is set to true and the task itself is not copying anything, as can be seen on the command line (there are no copying messages). Why, then, is CopiedFiles not empty?
I need the CopiedFiles parameter to be populated only with the files that were actually copied (because they were changed), so that I can then copy those files on to some other folder. This is to maintain an up-to-date release folder as well as to extract only those files which actually need to be propagated to the UAT/production server.
For reference, the Copy task invocation I'm using is given below:
<Copy SkipUnchangedFiles="true"
      SourceFiles="@(cfile)"
      DestinationFiles="@(cfile->'$(PublishDir)\%(Identity)')">
  <Output TaskParameter="CopiedFiles"
          ItemName="Changed" />
</Copy>
<Message Text="changed: @(Changed)" Importance="high" />
Is this a bug in the Copy task, or is it the intended behavior?
The behavior you are seeing is by design. MSBuild keeps track of file dependencies using task outputs. If it did otherwise, anything that relied on the @(Changed) item array as an input would, in most cases, not fully process all of the files it needed. For the same reason, MSBuild even keeps track of properties and items created within targets that do not execute at all because their Inputs and Outputs are up to date. Consider writing a Copy task of your own with an additional output parameter, say CopiedFilesCopiedByTask (this naming mirrors the naming and behavior of ValueSetByTask on the otherwise defunct CreateProperty task).
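Short of writing that custom task, a lighter-weight sketch of the same idea is to compute the "would actually be copied" set yourself and let that item drive both the copy and the later propagation step. It reuses @(cfile) and $(PublishDir) from the question; the timestamp test approximates, but is not identical to, the Copy task's size-plus-timestamp check:

<Target Name="PublishChanged">
  <ItemGroup>
    <!-- Keep a file if its destination is missing or older than the source -->
    <ActuallyChanged Include="@(cfile)"
                     Condition="!Exists('$(PublishDir)\%(Identity)') Or $([System.IO.File]::GetLastWriteTime('%(FullPath)').Ticks) &gt; $([System.IO.File]::GetLastWriteTime('$(PublishDir)\%(Identity)').Ticks)"/>
  </ItemGroup>

  <Copy SourceFiles="@(ActuallyChanged)"
        DestinationFiles="@(ActuallyChanged->'$(PublishDir)\%(Identity)')"/>

  <!-- @(ActuallyChanged) now contains only the files that really needed copying -->
  <Message Text="changed: @(ActuallyChanged)" Importance="high"/>
</Target>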