Integrating Google Closure Compiler with MS Build on a build server - msbuild

I'm looking into ways of minifying javascript files as part of our CI process, so that we can use the un-minified files in development and have them automatically compressed when deployed to staging and live servers.
This is for an ASP.NET site; we use Hudson as a build server.
I'm intrigued by the Google Closure compiler, and I've come across this .Net MSBuild Google Closure Compiler Task, but it doesn't seem to be very widely used. Are there better options for use with MSBuild, using either Closure or alternative minification tools?

We've been using Closure Compiler for some time now in a .NET-based project.
Initially, we used a simple MSBuild .proj file which directly invoked the Python scripts. For example, we would make deps.js with something like the following:
<PropertyGroup>
<ScriptDirectory>yourprojectname</ScriptDirectory>
<ClosureLibrary>closure</ClosureLibrary>
<CalcDeps>$(ClosureLibrary)\bin\calcdeps.py</CalcDeps>
</PropertyGroup>
<Target Name="Deps">
<Exec Command="$(CalcDeps) -o deps -p $(ScriptDirectory) -d $(ClosureLibrary) --output_file=$(ScriptDirectory)\deps.js" />
</Target>
The actual build was more complex, but still relatively straightforward (assuming you're MSBuild savvy). We simply used different types of item groups for each relevant part of the script invocation.
<Target Name="Build" DependsOnTargets="Init;FindCompiler">
<PropertyGroup Condition="'#(Extern)' != ''">
<Externs>-f --externs=#(Extern, ' -f --externs=')</Externs>
</PropertyGroup>
<PropertyGroup Condition="'#(Define)' != ''">
<Defines>-f --define=#(Define, ' -f --define=')</Defines>
</PropertyGroup>
<PropertyGroup Condition="'#(Compile)' != ''">
<Compile>-i #(Compile, ' -i ')</Compile>
</PropertyGroup>
<Exec Command="$(CalcDeps) $(Compile) -o compiled -c $(ClosureCompiler) -p $(ClosureLibrary) -p $(ScriptDirectory) $(Externs) $(Defines) -f #(CompilerOption, ' -f ') --output_file $(OutputFile)" />
</Target>
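Invoking these targets from Hudson (or any CI job) is then just a plain MSBuild call; the project file name here is hypothetical:
msbuild closure.build.proj /t:Deps;Build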
This was simple enough that we didn't bother looking for a task, or trying to invest in building our own. Closure is quite a fast moving project, so it's good to be in a situation where you're not overly dependent upon any third party build systems, especially one that looks to be unmaintained (the task you linked).
Now, I've been speaking in past tense because our build system has migrated a bit. Specifically, as our project kept growing it became increasingly important to partition different parts of our script code into modules. Doing this with the out-of-the-box Closure scripts would be quite a nightmare. Thus, we decided to move to plovr (http://plovr.com/), which makes partitioning code into modules very simple. plovr is very actively maintained and was created by Michael Bolin, who literally wrote the book on Closure (also highly recommended).
We still wrap this using the same MSBuild file. Basically, the stuff that we were defining in the item groups moves to a plovr-config.js file, and the invocation becomes much simpler as well:
<Target Name="Build" DependsOnTargets="Init;FindPlovr">
<Exec Command="$(Plovr) build plovr-config.js" />
</Target>
There are some other cool features supported by plovr, like size reports and module graphs, but even without those we're very, very pleased with our current setup.

The most obvious choice is YUI Compressor; it's stable and reliable.
It has a .NET port: Yahoo! UI Library: YUI Compressor for .Net
The port ships with an MSBuild task.
Also, using the original Java version is just as easy via the Exec task; the only drawback is the Java dependency.
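If you go the Exec route, the invocation is just the standard YUI Compressor command line wrapped in a target; a rough sketch (the jar file name, version, and folder layout here are assumptions):
<Target Name="MinifyJs">
  <ItemGroup>
    <JsFile Include="Scripts\**\*.js" Exclude="Scripts\**\*.min.js" />
  </ItemGroup>
  <!-- Task batching over %(JsFile...) runs the Java compressor once per input file; requires java on the PATH. -->
  <Exec Command="java -jar tools\yuicompressor-2.4.8.jar --type js -o &quot;%(JsFile.RelativeDir)%(JsFile.Filename).min.js&quot; &quot;%(JsFile.FullPath)&quot;" />
</Target>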

There's also SquishIt, which is a runtime compressor, but done quite well.

Have a look at a very similar question that I answered: Using Microsoft AJAX Minifier with Visual Studio 2010 1-click publish. You should be able to use the details there to solve your problem here.

How do you disable Roslyn Analyzers when using msbuild through the command line?

The Roslyn Analyzers are installed as nuget packages, which are dependencies of the FxCop Analyzers (also installed as nuget packages).
I have enabled full solution analysis as instructed here: How to Enable and disable full solution analysis for managed code.
I have a fairly large solution with most of the projects using the FxCop/Roslyn Analyzers and Visual Studio builds fine, usually in under a minute.
However, when running msbuild through the command line using:
"C:/Program Files (x86)/Microsoft Visual Studio/2017/Community/MSBuild/15.0/Bin/MSBuild.exe" "C:\Source\MySolution\MySmartClient.sln" /p:Configuration=Develop;Platform="Any CPU" /
t:Build
Building the solution takes anywhere from 4-15 minutes. The same is true on the build server which uses the same command.
I've tried /p:RunCodeAnalysis=False and that has no effect. I've also used Process Monitor to capture the exact command that Visual Studio passes to msbuild and replayed it, with no change.
And, according to this doc: How to: Enable and disable automatic code analysis for managed code
The Enable Code Analysis on Build check box only affects static code analysis. It doesn't affect Roslyn code analyzers, which always execute at build if you installed them as a NuGet package.
These excessive build times are not practical. Is there any way to disable the analyzers when running msbuild from the command line?
It's not really supported, but there is a workaround:
Create a Directory.Build.targets file (MSBuild >= v15.0) or an After.{SolutionName}.sln.targets file (MSBuild < 15.0) in your solution root folder and add:
<Project>
<Target Name="DisableAnalyzers"
BeforeTargets="CoreCompile"
Condition="'$(UseRoslynAnalyzers)' == 'false'">
<!--
Disable analyzers via an MSBuild property settable on the command line.
-->
<ItemGroup>
<Analyzer Remove="#(Analyzer)" />
</ItemGroup>
</Target>
</Project>
You can pass in /p:UseRoslynAnalyzers=false now to remove all analyzers configured in the project.
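For example, combined with the command line from the question:
"C:/Program Files (x86)/Microsoft Visual Studio/2017/Community/MSBuild/15.0/Bin/MSBuild.exe" "C:\Source\MySolution\MySmartClient.sln" /p:Configuration=Develop;Platform="Any CPU" /p:UseRoslynAnalyzers=false /t:Build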
See also:
https://github.com/dotnet/roslyn/issues/23591#issuecomment-507802134
https://learn.microsoft.com/en-us/visualstudio/msbuild/customize-your-build?view=vs-2019#directorybuildprops-and-directorybuildtargets
You can edit the condition to also trigger on RunCodeAnalysis=False or Never.
<Target Name="DisableAnalyzers"
BeforeTargets="CoreCompile"
Condition="
'$(UseRoslynAnalyzers)' == 'false'
or '$(RunCodeAnalysis)' == 'false'
or '$(RunCodeAnalysis)' == 'never'" >
To disable a specific analyzer, use this trick:
We just spent 2 hours figuring out how to disable an analyzer based on an MSBuild property, AMA.
https://twitter.com/Nick_Craver/status/1173996405276467202?s=09
The documentation has changed since the original answers. There is now this page documenting how to disable code analysis from analyzers:
There are 3 MSBuild properties you can use to control analyzer behavior (all default to true):
RunAnalyzersDuringBuild Controls whether analyzers run at build time.
RunAnalyzersDuringLiveAnalysis Controls whether analyzers analyze code live at design time.
RunAnalyzers Disables analyzers at both build and design time. This property takes precedence over RunAnalyzersDuringBuild and RunAnalyzersDuringLiveAnalysis.
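For example, a minimal sketch that keeps live analysis in the IDE but skips analyzers during builds would be to set this in the project file (or a Directory.Build.props):
<PropertyGroup>
  <!-- Analyzers still run at design time, but are skipped at build time. -->
  <RunAnalyzersDuringBuild>false</RunAnalyzersDuringBuild>
</PropertyGroup>
The same property can also be passed on the command line, e.g. /p:RunAnalyzersDuringBuild=false.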
Edit: it looks like there is an issue being tracked where these props don't work unless your project has Microsoft.CodeAnalysis.targets included. So your mileage may vary until this is fixed.
In case anyone else happens to find themselves here, I came across this issue on the dotnet/roslyn project on Github:
Feature: MSBuild switch for turning on/off analysis #23591
The preceding issue describes a work-around:
Substitute for old MSBuild properties? #1431
<PropertyGroup>
<RunCodeAnalysis Condition="'$(RunCodeAnalysis)' == ''">true</RunCodeAnalysis>
</PropertyGroup>
<ItemGroup>
<PackageReference Include="<whatever analyzers package you are depending on>" Condition="'$(RunCodeAnalysis)' == 'true'" />
</ItemGroup>
# You'll need to run a restore when changing this value
msbuild /p:RunCodeAnalysis=false
I had a couple of differences, though, since I'm not using package references. This worked for me:
<ItemGroup>
<Analyzer Include="<whatever analyzers package you are depending on>" Condition="'$(RunCodeAnalysis)' == 'true'" />
</ItemGroup>
<!-- I added the condition to the EnsureNugetPackageBuildImports too. -->
<Target Name="EnsureNuGetPackageBuildImports" BeforeTargets="PrepareForBuild">
<PropertyGroup>
<ErrorText>This project references NuGet package(s) that are missing on this computer. Use NuGet Package Restore to download them. For more information, see http://go.microsoft.com/fwlink/?LinkID=322105. The missing file is {0}.</ErrorText>
</PropertyGroup>
<Error Condition="'$(RunCodeAnalysis)' == 'true' AND !Exists('<relative path to the prop of whatever analyzers you are depending on>')" Text="$([System.String]::Format('$(ErrorText)', '<relative path to the prop of whatever analyzers you are depending on>'))" />
</Target>

How does ad-hoc aliasing of NuGet-referenced assemblies work?

We had issues in our build with some of our dependencies forcing a really old version of a library we were using ourselves at a newer version. This got slightly worse with the transition towards the new style of project files, which no longer explicitly mention DLL resources brought in by NuGet packages, because we lost the ability to mark such assemblies with aliases.
Now, I found a solution to basically the same problem on NuGet's GitHub. I adapted it for our needs in the following way:
<Target Name="AliasLog4Net" BeforeTargets="FindReferenceAssembliesForReferences;ResolveReferences">
<ItemGroup>
<ReferencePath Condition="'%(FileName)' == 'log4net'">
<Aliases>l4n</Aliases>
</ReferencePath>
</ItemGroup>
</Target>
It works magically. I want to know why, though.
I can't find documentation for ReferencePath. Specifically, I would love to know what I can test for in the Condition attribute, apart from %(FileName)?
How can I log this logic? Is there a way to write something out for every ad-hoc application of the alias in this way?
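Not an answer to the "why", but for the logging part, one option might be a second target that simply prints the items right after the aliasing target runs; this is just a sketch using a standard Message task and metadata batching:
<Target Name="LogAliasedReferences" AfterTargets="AliasLog4Net">
  <!-- Writes one line for each ReferencePath item that ended up with an alias. -->
  <Message Importance="high"
           Condition="'%(ReferencePath.Aliases)' != ''"
           Text="Aliased reference: %(ReferencePath.FileName) -> %(ReferencePath.Aliases)" />
</Target>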

Unnecessary dlls in dotnet core console app?

I'm just trying to build an example .NET Core 2.0 console app which should be published as an executable. This requires me to add a RuntimeIdentifier in the csproj file. After publishing my sample application for win-x64, I get an output directory which contains around 200 DLLs plus my executable. I have the feeling that's too much just to print a simple Hello World to the console.
Is there a way to reduce the number of DLLs? In this old (and now surely outdated) document named Reducing Package Dependencies, a manual approach is proposed for libraries.
Is there a way to reduce the dependencies in dotnet-core 2.0? Or isn't this an issue after all and I shouldn't care?
Just for completeness, here is my example project definition:
<Project Sdk="Microsoft.NET.Sdk">
<PropertyGroup>
<OutputType>Exe</OutputType>
<TargetFramework>netcoreapp2.0</TargetFramework>
<RuntimeIdentifiers>Portable;win-x64</RuntimeIdentifiers>
</PropertyGroup>
</Project>
All the dependencies are useful in a certain way (some classes of each are used to make your app work), so when you say "unnecessary" you are wrong.
So far, there is no better tool than the newly announced IL Linker to shrink the size of the deployment:
https://github.com/dotnet/announcements/issues/30

Parallel MSBUILD - critical sections?

My organization has some huge builds that run on build servers, building lots and lots of MSBuild projects that are linked with ProjectReferences. We need to be able to build projects and configurations in parallel with msbuild /m.
My problem is that I have a project that is referenced from a large number of other projects, but the project itself is not reentrant. If two or more nodes try to build that project in parallel, it will fail.
How can I wrap this one project, or its target, inside a critical section?
What I really need to do is something like this:
<Target>
<EnterCriticalSection ID=$(ProjectGuid) />
<Exec something />
<LeaveCriticalSection ID=$(ProjectGuid) />
</Target>
The idea being that if multiple MSBUILD nodes tried to build this project in parallel, only one of the nodes could do the execution, and the rest of the nodes would have to wait (or go do something else).
I suppose I could write custom MSBUILD tasks to do this, but isn't there some way of doing this built into the MSBUILD system?
=== EDIT 4/5/13. To clarify, the project is building a 3rd-party library with the build script provided for it. Completely rewriting their build script to make it reentrant -- by ensuring that each build uses a different set of folders for intermediate files, etc. -- is possible in theory, but not a practical solution. For one thing, all that work would have to be redone on each new release of that library.
=== EDIT 4/6/13. On further reflection, I don't think it's even theoretically possible to ensure a project is reentrant. Let me explain:
Suppose project XYZ is set up to use different temporary directories depending on platform and configuration in the usual way:
XYZ.proj:
<PropertyGroup>
<MyWorkingDir>tmp.$(Platform).$(Configuration)</MyWorkingDir>
</PropertyGroup>
Now suppose some other project GraphicsWindow references project XYZ, either through ProjectReferences or MSBuild tasks. And suppose the GraphicsWindow project can be built to use various graphics APIs. I.e., there's an OpenGL version, a DirectX 9 version, a DirectX 10 version, and so on...
So somewhere there's a .proj or .targets file containing a target to build all four versions:
<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
<ItemGroup>
<ProjectToBuild Include="GraphicsWindow.proj">
<Properties>GraphicsApi=OpenGL</Properties>
</ProjectToBuild>
<ProjectToBuild Include="GraphicsWindow.proj">
<Properties>GraphicsApi=D3D9</Properties>
</ProjectToBuild>
<ProjectToBuild Include="GraphicsWindow.proj">
<Properties>GraphicsApi=D3D10</Properties>
</ProjectToBuild>
<ProjectToBuild Include="GraphicsWindow.proj">
<Properties>GraphicsApi=D3D11</Properties>
</ProjectToBuild>
</ItemGroup>
<Target Name="All">
<MSBuild Projects="#(ProjectToBuild)" BuildInParallel="true" />
</Target>
</Project>
or the equivalent using batching.
Now MSBuild will build the XYZ project 4 times with the same Platform|Configuration combination and the same working directory.
This will work fine as long as you build without the /m option and MSBuild runs a single thread. Depending on how the XYZ project is written, the 2nd, 3rd, and 4th builds might do nothing because the outputs are up-to-date, or they might do some redundant work, but the final result will be correct and the build will succeed.
But as soon as you start using parallel MSBuild, this build is broken! There is now a race condition where multiple threads can enter the XYZ project's target(s) concurrently, and start building using the same working directories, which will fail.
Irrespective of how you execute MSBuild, with multi-proc option /m or without, it is guaranteed to execute a project once for every configuration requested by the build. Here is a quote from MSDN:
When the Microsoft Build Engine encounters a project-to-project (P2P) reference while it is using parallel builds to build a project, it builds the reference only one time. If two projects have the same P2P reference, the reference is not rebuilt for each project. Instead, the build engine returns the same P2P reference to both projects that depend on it. Future requests in the session for the same target are provided the same P2P reference.
If you see the same project built more than once, it means it was referenced in two (or more) different configurations. By configuration here I mean the set of parameters passed to the project, e.g. project platform (x86, x64, AnyCPU, etc.), flavor (debug/retail), localization language, or any other parameters you might use.
More often than not, this is a problem with a mixture of project platforms. For example, you have project A built for x64, project B built for AnyCPU, and both A and B reference C. Now C has to be built twice -- for x64 and AnyCPU. If C correctly handles both platforms by cleanly separating outputs into separate directories, there is no problem. However, if C treats x64 and AnyCPU as the same, it will fail randomly in a multi-proc build.
Start by inspecting your solution configuration dialog. Make sure all projects have a consistent set of platform/configuration parameters. If you need to build the same project in different configurations, make sure it puts its output into separate locations.
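For example, in the GraphicsWindow/XYZ scenario above, that means keying every per-build directory on all of the properties that vary between invocations, not just Platform and Configuration; a minimal sketch building on the question's own snippet:
<PropertyGroup>
  <!-- Each GraphicsApi flavor now gets its own working and output directories,
       so parallel builds never collide. -->
  <MyWorkingDir>tmp.$(Platform).$(Configuration).$(GraphicsApi)</MyWorkingDir>
  <OutputPath>bin\$(Platform).$(Configuration).$(GraphicsApi)\</OutputPath>
</PropertyGroup>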

Hidden features of msbuild [closed]

I have an interest in msbuild this week. I'm cleaning up a lot of extremely complex build scripts. Digging in, I'm surprised by how much it can do; msbuild is sort of a hidden feature of .NET programming in itself.
In the SO convention that questions must have answers, in a few days or a week, I'll mark the most useful or coolest hidden feature(s) as accepted.
let bestAnswer surprise slick useful = (surprise + slick + 2*useful)
Definition of useful: I'm updating existing msbuild scripts that: package (zip files) websites and utilities, CC.NET integration, launch tests (UT + selenium), build databases. I'm adding (new targets, even more useful): deploy to VMWare virtual servers, chained builds (fast build immediately, queue slow tests). If you refer to an external library (like MSBuild community tasks), it would be nice to know how to get it.
Some msbuild surprises I've already found.
Hello world using the Message task and Properties.
Using msbuild as an installer for an extremely complex server product. MSBuild Community Tasks managed the IIS server setup. The WriteLinesToFile and XmlUpdate tasks wrote server-specific configuration files. If you've worked with MSI, you'll know that anything is better than MSI for installation.
For newbies: csproj and vbproj files are the same as msbuild "proj" files. To edit one directly, unload your csproj or vbproj, then right-click the project and select Edit. This is nicer and more powerful than working with clunky pre-build / post-build events.
MSBuild comes with the generic .NET installation. Unlike other fancy tools, you can use it on a totally clean server / desktop.
Here is msbuild Hello World
After I wrote it, I found the MSDN hello world.
<?xml version="1.0" encoding="utf-8"?>
<Project DefaultTargets="Build;Test" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
<PropertyGroup>
<Who>World</Who>
</PropertyGroup>
<Target Name="Hello">
<Message Text="Hello, $(Who)" Importance="high" ></Message>
</Target>
<Target Name="Build" DependsOnTargets="Hello"/>
<Target Name="Test"/>
</Project>
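Saved as, say, hello.proj (the file name is just an example), running plain msbuild hello.proj prints the greeting, since the default targets depend on Hello; you can also invoke the target explicitly:
msbuild hello.proj /t:Hello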
MSBuild has a number of nice features. I like
recursive file specs
<Files Include="$(src)\**\*.cs" Exclude="$(src)\**\*test.cs" />
Batching and Item Metadata
<ItemGroup>
<F Include="SampleApplication.t">
<Version>1</Version>
</F>
<F Include="SampleApplication2.t">
<Version>1</Version>
</F>
<F Include="SampleApplication3.t">
<Version>2</Version>
</F>
</ItemGroup>
<Target Name="Build">
<Touch Files="%(F.FullPath)" AlwaysCreate="True"
Condition=" '%(F.Version)' > '1' ">
<Output TaskParameter="TouchedFiles" ItemName="CreatedFiles"/>
</Touch>
<Message Text="Created files = #(CreatedFiles)"/>
<Message Text="%(F.Identity) %(F.Version)"/>
</Target>
Target level dependency analysis
<Target Name="Build"
Inputs="#(MyItems)"
Outputs="#(MyItems -> '$(MyItems)\%(filename).dll'">
This is not really a hidden feature but I think that batching is very powerful when understood.
For more information you can read my related blog entries at:
MSBuild batching Part 1
MSBuild Batching Part 2
MSBuild Batching Part 3
MSBuild RE: Enforcing the Build Agent in a Team Build
Sayed Ibrahim Hashimi
My Book: Inside the Microsoft Build Engine : Using MSBuild and Team Foundation Build
Use the /M command-line parameter to enable usage of all available CPU cores.
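For example (the solution name is hypothetical; a bare /m uses all available cores, /m:2 caps the node count):
msbuild MySolution.sln /m
msbuild MySolution.sln /m:2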
I have found the MSBuild Extension Pack to be incredibly useful. The documentation is very well organized, and it's easy to find the info you need.
They have a section on configuring IntelliSense for build files, which can be found here.
Attrice has an incredible tool that I use often when I need to work on build scripts: Microsoft Build Sidekick v2.3. What makes it worth your while to try it out is that it has a debugger that shows you the dependent tasks as it executes your build script, with autos and watch variables while the build is running.
Setting SVN to quiet mode feels to me to have sped up the build process a lot. Adding the following to your MSBuild.Community.Tasks.Subversion.SvnExport task will run the build without logging each and every file that it gets out of SVN:
Arguments="--force -q"
You can reference one msbuild file from within another. All of our targets, such as those for running NCover, SourceMonitor, Duplo, etc., are within a common targets file. For each project, we create an msbuild file with a PropertyGroup and ItemGroup section, followed by an include of the common targets. This guarantees that all of our builds run the same set of analysis tasks and saves us time writing the scripts.
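A minimal sketch of that layout (file names are just examples): each per-project file defines its own properties and items, then imports the shared targets.
<!-- MyWebsite.build.proj -->
<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <PropertyGroup>
    <ProjectName>MyWebsite</ProjectName>
    <PackageZip>$(ProjectName).zip</PackageZip>
  </PropertyGroup>
  <ItemGroup>
    <SourceFiles Include="src\**\*.cs" />
  </ItemGroup>
  <!-- NCover, SourceMonitor, Duplo, packaging targets, etc. all live here. -->
  <Import Project="Common.targets" />
</Project>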