.Net Compact Framework - Cab Builder on build server re-includes excluded libraries - compact-framework

We are using a Cab Builder project in VS2008 to generate our CF cab files. This works well until we make changes that affect the dependencies: the Cab Builder project then decides we need all sorts of additional libraries (mscorlib, etc.) that we don't actually need to deploy in our cab. We select those libraries, explicitly exclude them, and commit the changes to SVN, but when the build server gets the latest version of the project and builds it, the rogue libraries are back. We have to open the cab project on the build server and (again) manually exclude the libraries we don't want to include.
Is this normal behavior for the cab builder?
Is there a work around?
Will we have less pain if we give up and generate our .inf files with a different method and run the exe on the build server to generate the cabs instead of using the project?
Thank you

Ah, the CAB deployment project. I swear that thing was an afterthought and tasked out to a couple interns to develop. It's absolute garbage for anything but the most simplistic packaging scenarios.
What we do is hand-roll the INF and then call cabwiz manually. I've done this with the aid of MSBuild to make automation pretty simple.
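For illustration only (a minimal sketch, not the poster's actual script): an MSBuild file that shells out to cabwiz.exe against a hand-rolled INF. The cabwiz location, INF name and output folder are assumptions you would adjust for your own environment.

```xml
<!-- Sketch: build the CAB from a hand-written INF by calling cabwiz.exe directly -->
<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003" DefaultTargets="BuildCab">
  <PropertyGroup>
    <!-- Guessing the VS2008 SDK tools path; point this at wherever cabwiz.exe lives on the build server -->
    <CabWizPath>C:\Program Files\Microsoft Visual Studio 9.0\SmartDevices\SDK\SDKTools\cabwiz.exe</CabWizPath>
    <InfFile>$(MSBuildProjectDirectory)\Deploy\MyApp.inf</InfFile>
    <CabOutputDir>$(MSBuildProjectDirectory)\Deploy\Output</CabOutputDir>
  </PropertyGroup>

  <Target Name="BuildCab">
    <MakeDir Directories="$(CabOutputDir)" />
    <!-- cabwiz reads the INF and writes the resulting .cab into the /dest folder -->
    <Exec Command="&quot;$(CabWizPath)&quot; &quot;$(InfFile)&quot; /dest &quot;$(CabOutputDir)&quot; /err cabwiz.log" />
  </Target>
</Project>
```

Because the INF is hand-written and under version control, nothing can silently re-add libraries you have excluded; the build server just runs msbuild against this file.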

Related

Updating a binary file (.dll) in a solution to use the latest version of the .dll after a gated check-in build in TFS?

I have a solution file in TFS, located at $/Library/Library.sln, containing the binary files (.dlls) of other built solutions in TFS. Within this Library solution, there is a .dll taken from $/MySQL/bin/Debug/MySQL.dll which has been generated by building my MySQL solution (located at $/MySQL/MySQL.sln).
Normally, after making a change to my MySQL solution, I build the solution, check it in to TFS, and then copy the resulting .dll into my Library solution at $/Library/MySQL/MySQL.dll. However, I'd like this process to be automatic, i.e. after checking in a change to the MySQL solution, a build is triggered, and the .dll file generated from this build is automatically placed into the Library solution and then checked into TFS. What would be the best way to achieve this? I imagine gated check-ins have something to do with it, but I can't seem to get it to work.
As Daniel said in the comment, NuGet is the way to go for this. Instead of checking the binaries in to TFS, have your MySQL build publish the binaries to a NuGet feed. Then have your Library solution "subscribe" to that NuGet feed.
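As a rough sketch of what that might look like (not a definitive setup: the nuspec name, feed URL, API key property and version scheme are all placeholders, and AfterTargets needs MSBuild 4.0 or later), the MySQL project could pack and push its output in an after-build step:

```xml
<!-- Sketch: publish the MySQL build output to an internal NuGet feed -->
<Target Name="PublishNuGetPackage" AfterTargets="Build" Condition="'$(Configuration)' == 'Release'">
  <!-- Pack the output described by a hand-written nuspec; nuget.exe is assumed to be on the PATH -->
  <Exec Command="nuget.exe pack &quot;$(MSBuildProjectDirectory)\MySQL.nuspec&quot; -OutputDirectory &quot;$(MSBuildProjectDirectory)\nupkg&quot; -Version 1.0.$(BuildNumber)" />
  <!-- Push the package to the feed; $(BuildNumber) and $(NuGetApiKey) would come from the build definition -->
  <Exec Command="nuget.exe push &quot;$(MSBuildProjectDirectory)\nupkg\MySQL.1.0.$(BuildNumber).nupkg&quot; -Source http://your-nuget-server/nuget -ApiKey $(NuGetApiKey)" />
</Target>
```

The Library solution then references the MySQL package from that feed instead of a checked-in .dll, and picks up new versions through an ordinary package update.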

Recommendations for turning multiple solutions/projects into a single msdeploy package?

Our main website is a collection of 10 separate ASP.NET projects and applications. At the moment, to do a complete deployment onto a fresh server involves running ten separate msdeploy jobs; each application is built, configured (using config transforms) and packaged, but we don't have any solution for deploying all the packages as a single operation.
I can see several possibilities that might work in this scenario, but would love to hear from anybody who has succeeded - or failed - in setting up something similar:
A folder full of packages and deploy.cmd scripts, with a "master script" that calls each individual app script in turn and deploys that app to the target server (roughly sketched below).
Using a staging server where we deploy the latest build of each package from TeamCity using the production configuration, but then use msdeploy to capture that server into a single enormous msdeploy ZIP package, which is then deployed onto each production server as a single msdeploy step.
Creating a single, enormous Visual Studio solution that references EVERY project in our codebase (perhaps via svn:externals?), compiles and cross-references them ALL, and hence supports using a single msbuild job to create a huge monolithic package containing our entire codebase, built from the latest revision in source control and configured for the target environment.
I've studied Troy Hunt's excellent "You're Deploying it Wrong" series, and Scott Hanselman's "Web Deployment Made Awesome" article, but I think I'm looking for something a step beyond either of these approaches that incorporates multiple projects and applications without necessarily building them from source in a single step - any ideas?
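To make the first option concrete, here is a minimal sketch of such a "master script" written as an MSBuild file: it walks a drop folder and runs each package's generated deploy.cmd in turn. The folder layout, property values, and the /Y and /M: switches on the generated scripts are assumptions to adapt.

```xml
<!-- Sketch: a master deployment script that runs every package's deploy.cmd against one server -->
<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003" DefaultTargets="DeployAll">
  <PropertyGroup>
    <PackageDropDir>C:\Drops\WebsitePackages</PackageDropDir>
    <TargetServer>PRODWEB01</TargetServer>
  </PropertyGroup>
  <ItemGroup>
    <!-- One *.deploy.cmd per application, as produced by the msdeploy packaging step -->
    <DeployScript Include="$(PackageDropDir)\**\*.deploy.cmd" />
  </ItemGroup>
  <Target Name="DeployAll">
    <!-- %(Identity) batches the Exec so it runs once per script found -->
    <Exec Command="&quot;%(DeployScript.Identity)&quot; /Y /M:$(TargetServer)" />
  </Target>
</Project>
```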
We had a very similar scenario in our company, and we created an installation package using WIX. Our config transform happens at installation time, so now we create a single build, then deploy that to each server via an MSI install package. WIX is very flexible, but also has a steep learning curve. We modify our configs using our own custom action, but it could be done other ways.
We use Team Foundation Server and MSBuild to do our builds. This is pretty straight forward, but did take some work to set up correctly with as many projects and solutions as we had.
Other options we looked into, and even tried were:
InstallShield - Not flexible enough.
Writing our own C# installer - WIX already thought of everything we were trying to accomplish, so why reinvent the wheel?
Just saying to heck with it all and installing things manually - 2 or 3 months of development time in WIX and MSBuild have easily paid for the hours we would have spent over the last year doing things manually.
I think the deployment tools built into Visual Studio were designed for a single application with just a few deployments. It sounds like you need external tools, and development effort, to get your deployments quicker, and eliminate the need for doing things manually. That's why we invested in the above solution, and it has really paid off.
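This isn't the poster's actual setup, just a sketch of the "WIX plus MSBuild" shape being described: one driver script builds the application solutions, then the WiX setup project that packages their output into the single MSI (project names and paths are invented).

```xml
<!-- Sketch: build the web app solutions, then the WiX installer that packages them -->
<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003" DefaultTargets="BuildAll">
  <Target Name="BuildAll">
    <!-- Build each application solution first -->
    <MSBuild Projects="Src\WebApp1\WebApp1.sln;Src\WebApp2\WebApp2.sln"
             Properties="Configuration=Release" />
    <!-- Then build the WiX setup project, which gathers the outputs into a single MSI -->
    <MSBuild Projects="Src\Setup\Setup.wixproj"
             Properties="Configuration=Release" />
  </Target>
</Project>
```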
I'll pick Installshield.
The latest versions of InstallShield support creating Web Deploy packages.
You can define the IIS configuration for all of the apps in a single project, and create releases so that you can build the packages separately or as one single release covering all of the web apps.
An InstallShield project has an object model that lets you automate basically every task from build scripts, and the projects themselves are plain XML files that you can also modify from automation scripts if required.
Developers can maintain the WiX projects separately, and you can add those project builds as merge modules to your InstallShield project through your build scripts with some small tweaks to the InstallShield project XML (at least in the 2011 version; this part is not officially supported by InstallShield, but it can be done).
You don't even need to modify the Visual Studio projects for groups of web apps that follow the same pattern, nor manually edit your InstallShield project to add new web apps. Once you have set up the InstallShield automation task in your build scripts to work from the root VS build output, you can create packages for new web apps without any intervention.

BizTalk 2010 project compilation using MSBuild

I am trying to use MSBuild to compile a solution with a few BizTalk 2010 projects (maps, schemas, pipelines) and a few non-BizTalk projects (console app, web app).
MSBuild gets triggered by NAnt. The problem is that every time I run the compilation, the BizTalk projects get recompiled (and the assembly version number changes). This happens even if there are absolutely no changes to any part of the entire solution.
In other words, if I build the solution once, the assemblies get created fine. If I immediately build again, the non-BizTalk assemblies do not get re-created (MSBuild reports "Skipping target "CoreCompile" because all output files are up-to-date with respect to the input files"). But the BizTalk assemblies happily get re-created. This is annoying.
Please can someone help/advise?
BizTalk Server 2009 and 2010 .btproj project files are, indeed, MSBuild projects. As you have noticed, the way standard BizTalk targets are authored prevents incremental build of BizTalk projects.
Fortunately, MSBuild is extensible and can be customized in many ways. Please follow the instructions in this post to alter the standard build logic of .btproj files and add incremental build support to your build system.
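The .btproj specifics are in the linked post; the general MSBuild mechanism it relies on is simply a target with Inputs and Outputs declared, which MSBuild skips when every output is newer than every input. Purely as an illustration of that mechanism (the target and item names below are made up, not the real BizTalk targets):

```xml
<!-- Sketch of MSBuild incremental build: the target is skipped when the output DLL
     is already up to date with respect to the listed inputs -->
<Target Name="CompileBizTalkArtifacts"
        Inputs="@(Compile);@(Schema);@(Map)"
        Outputs="$(OutDir)$(AssemblyName).dll">
  <!-- ...the actual compilation tasks would go here; without Inputs/Outputs,
       MSBuild has no way to know the target can be skipped... -->
</Target>
```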

MSBuild: What is it, and when do I need it?

I seem to have missed Day 1 of MsBuild 101. I find myself asking "What does it do, what does it replace, and when do I need it?" since I can just hit F5 and compile my application.
What is the bigger picture I'm missing?
MSBuild is the build platform that enables all build activity in the Visual Studio world.
A better, more practical way to put it would be to state that:
Every .csproj file (i.e. every C# project) is an MSBuild file.
When you hit F5, you basically (oversimplifying) call msbuild.exe, passing in your .csproj file.
MSBuild empowers all the things that make hitting F5 work. From creating the "debug" or "release" folder, to dropping references into the bin\ directory, to invoking CSC ... and everything in between ... MSBuild "powers" all that.
If all you will ever need from a build is the output that F5 gives you, then you know about all you probably need to know about MSBuild.
In most commercial/practical development scenarios, however, there will come a time where there is a need to customize the build process. The most common approach is automating the build process (using either TeamBuild or some homegrown system). You may also need to
create a "packaged" deployment
link to another library outside of your project that is also actively being developed
publish your build to an FTP and send an email to a customer notifying them of its availability.
The use of a unified and extensible build platform (i.e. MSBuild) is what makes all of this possible, while still being part of the build process ... keeping the "build" part of the development pipeline simple and contained.
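As a small example of that kind of customization (the target body and paths are invented, not a recommended layout), a Target appended to an ordinary .csproj can stage the build output for packaging; F5 keeps working exactly as before, and the same project can also be built from the command line with msbuild MyApp.csproj /p:Configuration=Release:

```xml
<!-- Sketch: override the standard AfterBuild hook to copy the output to a drop folder -->
<Target Name="AfterBuild">
  <ItemGroup>
    <OutputFiles Include="$(OutputPath)**\*.*" />
  </ItemGroup>
  <Copy SourceFiles="@(OutputFiles)"
        DestinationFolder="$(MSBuildProjectDirectory)\..\Drop\%(RecursiveDir)" />
</Target>
```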
It's useful when you want to do automated builds and have to implement a build process.
The F5 Key Is Not a Build Process and the links therein (e.g. this) are a good read in that regard.
Also, your Visual Studio project files are MSBuild files. If you want to do more advanced stuff when you build (e.g. run a JavaScript minifier, have more control over auto-generated version identifiers, do post-processing of files, etc.), you'll have to dig into MSBuild.
msbuild is used when you want to build your project from the command line. Whenever you see a continuous integration product that will automatically build your project, it will call msbuild to perform the actual build step.
I think that build servers should have the option to press the F5 key in a simpler way than via the Windows API.
I know this is pretty stale, but here's my take on MSBuild.
It's a scriptable build tool, really similar to ANT. They both use XML for configuration, so you'll be able to figure it out fairly quickly. Both have the concept of "targets", for instance, and there are lots more similarities in approach; if you know ANT, the switch shouldn't be tough.
The MSBuild files generated by Visual Studio are really like the generated ANT scripts you get from Eclipse: they build your projects, remember your includes, and define your dependencies. You can modify them directly for fun and profit.
I like MSBuild, it fixes some of the stuff I find annoying about ANT.

Advantages of using MSBuild or NAnt versus running DevEnv.exe from command-line

Can anyone explain what advantages there are to using a tool like MSBuild (or NAnt) to build a collection of projects versus running DevEnv.exe from the command-line?
A colleague I worked with in the past explained that (at least with older versions of Visual Studio) using DevEnv.exe was much slower than the other techniques, but I haven't seen any evidence of that, or whether it's now a moot point given that, starting with 2005, Visual Studio uses MSBuild under the hood.
I know one advantage of MSBuild is that it allows you to build your projects without requiring Visual Studio to be installed on the build machines, but I wasn't sure if there were others.
One reason is because there's much more to building a product than just compiling it. Tasks such as creating installs, updating version numbers, creating escrows, distributing the final packages, etc. can be much easier because of what these tools (and their extensions) provide.
While you could do all this with regular scripts, using NAnt or MSBuild gives you a solid framework for doing it. There's a lot of community support for both, including additional tasks that can be downloaded (such as the MSBuild Community Tasks Project). Plus, there's support for them in numerous third-party and open source products.
If you're just interested in compiling (and not the entire build process), you may find that one time-saving benefit of MSBuild is its support for building with multiple processors.
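That multi-processor support comes down to the /m (or /maxcpucount) switch on msbuild.exe, plus project-to-project builds that are marked as parallelizable. A minimal sketch (the solution layout is made up); run it with msbuild Build.proj /m:

```xml
<!-- Sketch: build several solutions, letting MSBuild schedule them across build nodes -->
<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003" DefaultTargets="BuildEverything">
  <ItemGroup>
    <SolutionsToBuild Include="Src\**\*.sln" />
  </ItemGroup>
  <Target Name="BuildEverything">
    <!-- BuildInParallel only pays off when msbuild.exe is started with /m, which creates multiple build nodes -->
    <MSBuild Projects="@(SolutionsToBuild)" BuildInParallel="true" Properties="Configuration=Release" />
  </Target>
</Project>
```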
The obvious answer from my team is that not everybody has Visual Studio installed; in particular, we do not install Visual Studio on our build/CI servers.
The prime reason for using an external build tool like NAnt or MSBuild is the ability to automate your build process and thus provide continuous feedback on the status of your system. They can also be used for loads of things besides a "pure" build, and that's where you really start to get value from them; it's an extremely valuable thing to be able to build and test your application with a single command.
You can also start adding stuff like collecting metrics, packaging release binaries, and all sorts of nifty things like that.
As far as C# goes, devenv.exe 2005 runs the compiler in-proc, which may cause out-of-memory exceptions for sizable solutions. MSBuild resorts to launching a csc.exe process for each project. Projects that don't compile with devenv /build work fine with MSBuild. Hope you like this reason.
We are experimenting with switching from DevEnv to a tool (Visual Build Pro) that uses MSBuild under the hood, and we got a "Reference required to assembly 'System.Drawing'..." error for a project that doesn't need it and that builds fine in Visual Studio.
We have a large system consisting of C#, managed C++, and plain old unmanaged C++ assemblies/dlls. There is C++ code that depends on managed C++ code that depends on C# code that depends on managed C++ code that depends on plain old C++ code (whew!). When we were setting up our automated build environment a few years ago, we discovered that MSBuild.exe didn't properly handle all of the dependencies that we have.
Working with Microsoft we were able to solve some of the issues but not all of them. If my memory serves me, we never could get the C# assemblies that depended on managed C++ dlls to build. So we ended up making a custom build script that called devenv.exe from the command line and it worked just fine.
Of course, that was with VS2005, it might be fixed now, but the script is still working so we haven't revisited the issue.