MSBuild - Assemblies differ slightly after each clean+build

I'm trying to work with an existing home-grown implementation of ClickOnce. Currently we manually update the manifest for the assemblies we actually changed. I'm attempting to make that automatic, based on a binary comparison of the existing assemblies against the newly built ones. Unfortunately, each time I run clean + build (from an automated build script) there are small differences in the assemblies, which essentially invalidates our ClickOnce solution altogether. I'm guessing these differences are caused by some sort of GUID generation or something along those lines. Is there any way to prevent the differences in the assemblies?
And unfortunately, due to our branching/CI strategy I don't have the option of skipping the clean, because each release is built from a new branch.
Otherwise, any suggestions on how I can compare two assemblies to see whether any code has changed, without having access to the source code?
Thanks,
David

Typically, autobuild systems check the filesystem timestamps of the binary vs. the source files (or object files vs. source files, depending on the language). If the source is newer than the binary/object, a rebuild is triggered. This strategy may work better for you than actually diffing binaries.
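MSBuild itself performs this kind of up-to-date check whenever a target declares Inputs and Outputs; a minimal sketch, assuming a hypothetical @(BuiltAssemblies) item list and a $(PublishDir) property that ends in a backslash:

    <Target Name="CopyChangedAssemblies"
            Inputs="@(BuiltAssemblies)"
            Outputs="@(BuiltAssemblies->'$(PublishDir)%(Filename)%(Extension)')">
      <!-- Runs only for assemblies whose timestamp is newer than the published copy -->
      <Copy SourceFiles="@(BuiltAssemblies)" DestinationFolder="$(PublishDir)" />
    </Target>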

I found BitDiffer, a tool from www.BitWidgets.com, that compares what has actually changed in an assembly. While this runs slower than a binary comparison, it removes the need to have MSBuild produce an identical assembly.
Thanks,
David

Related

SSDT - Build Deployment Script without dacpac

I've got a question about building a deployment script using SSDT.
Could anyone tell me if it's possible to build a deployment script using SQLPackage.exe where the source file is NOT a dacpac file, but uses the .sql files instead?
To give some background, I've created a project in Visual Studio 2012 for my database schema. This works great, and SSDT builds the folder structure without a problem (functions, stored procedures etc which contain all the .sql files).
Here's the problem: the database in question is from a legacy system and is riddled with errors. Most of these errors we don't care about any more, and it's not practical or safe to fix them all, so for years we've basically ignored them. However, it means we can't build the project and therefore can't generate the dacpac file. This doesn't prevent us from doing the schema compare and syncing the database with the file system (a local Mercurial repository), but it does seemingly prevent us from building a deployment script.
What I'm looking for is a way of building the deployment script using SQLPackage.exe without having to generate the dacpac file; I need it to use the .sql files in the file system instead. Visual Studio will produce a script of the differences without building the dacpac, so this makes me think it must be possible to do with SQLPackage.exe using one of its parameters.
Here's an example of SQLPackage.exe which I'd like to adapt to use the .sql files instead of the dacpac:
sqlpackage.exe /Action:Script
    /SourceFile:"E:\SourceControl\Project\Database\test_SSDTProject\bin\Debug\test_SSDTProject.dacpac"
    /TargetConnectionString:"Data Source=local;Initial Catalog=TestDB;User ID=abc;Password=abc"
    /OutputPath:"C:\temp\hbupdate.sql"
    /OverwriteFiles:true
    /p:IgnoreExtendedProperties=True
    /p:IgnorePermissions=True
    /p:IgnoreRoleMembership=True
    /p:DropObjectsNotInSource=True
This works fine because it uses the dacpac file. However I need to point it at the folder structure where the .sql files are instead.
Any help would be great.
As has been suggested in comments, I think that biting the bullet and fixing the errors is the way ahead. You say
it's not practical or safe to fix them all,
but I think you should give this a bit more thought. I have recently been in a similar situation to you, and the key to emerging from it is to realise that the operational risk associated with dropping procedures and functions that will throw an exception as soon as they are called is zero.
Note that this does not apply if the reason these objects won't build is that they contain cross-database or cross-server references that are present in production but not in your project; this is a separate problem altogether, but also a solvable one.
Nor am I in favour of "exclude from build" as an alternative to "delete"; a while ago I saw a project where this technique had been deployed extensively; it makes it harder to see what does what from the source files and I am now of the opinion that "Build Action=None" is simply "commenting out the bits that don't work" for the Snapchat generation.
The key to all of this, of course, is source control. This addresses the residual risk that one day you might indeed want to implement a working version of one of your currently non-working procedures, using the non-working code as a starting point. It also obviates the need to keep stuff hanging around in the solution using Build Action=None, as one can simply summon an earlier revision of the code that contained the offending objects.
If my experience is any guide, 60 build errors is nothing; these could easily be caused by references to three or four objects that no longer exist, and can be consigned to the dustbin of source control with some enthusiastic use of the "Delete" key.
Do you have a copy of SQL Compare at your disposal? If not, it might be worth downloading the trial to see if it will work in your scenario.
Here are the available switches:
http://documentation.red-gate.com/display/SC10/Switches+used+in+the+command+line
At the very least you'll need to specify the following switches; a sample invocation follows the list:
/scripts1:
/server2:
/database2:
/ScriptFile:
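Assuming the folder layout from the question, an invocation might look something like this (the paths, server and database names are placeholders; see the switches page above for the exact syntax):

    sqlcompare /scripts1:"E:\SourceControl\Project\Database\test_SSDTProject" /server2:local /database2:TestDB /ScriptFile:"C:\temp\hbupdate.sql"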

What are the limitations/benefits of using MSM instead of MSI?

I'm currently building a product distributed as an MSI (Windows Installer). That product is being integrated by our customers in different forms, such as inside their own MSI, with a bootstrapper/chainer like WiX Burn, or with authoring tools like InstallShield.
With this scenario in mind, I've always wanted to know what the limitations and/or benefits are of using Merge Modules (MSM) instead of keeping an MSI, and what the currently recommended approach is for choosing one over the other.
On paper merge modules are fine, but in the real world I find them clunky to update and hence error prone since they may be merged into many setups before being discovered to be defective. As a result I do not recommend merge modules at all. I prefer a single MSI that can be run as a batch process via a bootstrapper or batch file and that can also be updated easily. This avoids all kinds of problems that are not generally intuitive.
I want to add that merge modules work well for truly shared files installed in locations in the file system that are meant for shared files and that change infrequently. These are generally OS-runtimes. These merge modules are generally heavily tested and work ok. However, often I see people use merge modules for files that end up changing frequently and that they then end up installing in different locations in different flavors in an ad-hoc fashion. This kind of use is a total mess and a hugely wasted effort.
Having said all that, I have indeed used merge modules successfully when I needed advanced release management with repetitive, non-changing inclusion of a set of files via a merge module into several setups. Even then I ran into a version issue after a while, with a couple of files needing updates, and subsequent minor errors with the wrong merge module being used after I left the project to someone else. I also experienced having to rebuild all setups due to a minor merge module bug fix. All setups then had to go through QA again. Very frustrating with such tight coupling.
If your requirements are simple and you are not taking on a huge multi-product release project sharing a bunch of files, use MSI instead of MSM. Easier to comprehend, generally less work to deal with, more atomic updates and less risk of introducing the same error in many setups due to merge module update or design problems.
There's nothing that wrong with merge modules. Their primary use (which hasn't been mentioned) is sharing. If you want the same set of shared files in multiple MSI files, they need the same set of component GUIDs to preserve the sharing rules. Or if you are giving files to clients for them to use in their MSI builds (like Microsoft does), then give them merge modules. That's one of the reasons MS and other vendors redistribute merge modules: so that everyone can build their MSIs and install them on the same system without file-sharing disasters. I've also seen merge modules used as a common UI for MSI files. But mainly they are essential to make sure that shared files are used correctly. I'll tell you from experience that the disaster resulting from incorrect use of shared files is much worse than any perceived difficulty with using merge modules. Note also that they are universal and can be included by all tools that build MSI files.
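For example, consuming a merge module from a WiX-authored MSI takes just a couple of elements; a minimal sketch, where the module file name and the IDs are hypothetical:

    <DirectoryRef Id="TARGETDIR">
      <Merge Id="SharedFiles" SourceFile="SharedFiles.msm" Language="1033" DiskId="1" />
    </DirectoryRef>
    <Feature Id="MainFeature" Title="Main" Level="1">
      <MergeRef Id="SharedFiles" />
    </Feature>

Every MSI that merges the same module this way ends up with identical component GUIDs for the shared files, which is what keeps the sharing rules intact.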
I've never found merge modules difficult to patch, version, or fix, and major upgrades aren't a problem. The only potential issue I've seen is with build processes that rebuild all the binaries in a merge module while creating a patch (.msp) build. If only one binary needs a fix but you recompile them all, their versions and internals may change enough that the patch process (the delta between two MSI files and their content) decides they all need to be included in the patch because they've changed. Even so, this can be avoided if it turns out to be an issue in practice.

MSBUILD fails with "The process cannot access the file xxxxx because it is being used by another process." when maxcpucount is greater than 1

I'm trying to improve build times using CruiseControl.NET and MSBUILD, and one of the command-line switches, maxcpucount, can be used to let the build run in parallel. Our solution has 60+ projects, so any improvement would be helpful. However, whenever I set maxcpucount above one, we get frequent build failures due to:
"The process cannot access the file xxxx because it is being used by
another process. msbuild"
It appears that the additional parallel build threads/processes are locking each other.
I think I found a solution. It appears that if I add the /nodeReuse:false switch I don't get the file locks. It seems like the nodeReuse functionality is keeping msbuild processes around and those are hanging on to file locks for subsequent builds.
http://msdn.microsoft.com/en-us/library/ms164311.aspx
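For reference, the switch goes on the same command line as the parallel-build option (the solution name and CPU count here are just placeholders):

    msbuild MySolution.sln /maxcpucount:4 /nodeReuse:false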
Are you building from a solution file? If so, make sure that you are using direct project-to-project references and not using the Solution's project-dependency feature. If you happen to be using a bit of both, there can be issues. See this article.
Better yet, if at all possible, ditch the solution file and create your own MSBuild file to drive your build.
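For what it's worth, a direct project-to-project reference is declared in the .csproj/.vbproj itself rather than in the .sln; a sketch with a hypothetical path:

    <ItemGroup>
      <!-- MSBuild derives the build order from this reference, even with /maxcpucount -->
      <ProjectReference Include="..\Common\Common.csproj" />
    </ItemGroup>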
Your assembly is probably being used by another assembly that's being built. Make sure each assembly gets built before it's needed by the other assemblies.

WiX MSBuild automation help - solution best practices

I know there are many questions out there regarding this same information. I have read them all, but my brain is all turned around and I don't know which way to go. Plus the lack of documentation really hurts.
Here is my scenario. We are trying to use WiX to create an installer for the application that goes out to our dealers with our product information. The app includes about 2,000 images and documents for our products, plus a SQL CE database, all updated via Microsoft Sync Framework. The data changes so often that keeping these 2,000 files as content files in the app's project is very undesirable. The app relies on .NET Framework 3.5 SP1, SQL Server CE 3.5, Microsoft Sync Framework 1.0 and ADO.NET Sync Services 2.0.
Here are the requirements for the app:
The dealers will be given the app on a CD every year for any updates (app or data updates).
The app must update itself from the internet to get any new images, documents or data.
The prerequisites must be installed if they do not exist on the client machine.
The complete installer should be generated from an MSBuild script with as little human interaction as possible (we don't want to be manually updating the 2000+ file list).
What we have accomplished so far: we have a Votive project in our solution, and we have manually specified the binaries in a .wxs file. We have modified the .wixproj file to use the HeatDirectory task to gather our data (images, documents and the database) from a specified location (this is currently broken and giving an ICE38 error). This seems all right, but it is still a lot of work: we have to manually update our data by running the program in release mode and copying the output to the specified directory.
I am looking to see what other people would do in this situation.
How would you arrange your solution with regards to the 2000+ data files? Would you create a custom build script that gets the current data from the server or would you include them as content files in the main project?
How would you get WIX to include all of the project output (including the referenced assemblies) and all of the data files? If you have any complete samples, that would be great. All I have found are little clips here and there and not an entire example from start to finish.
How would you deal with the version numbers? Would you put them as a constant in the build script and reference them through the $(var.VersionNumberName)? Would you have the version number automatically picked up from the project being deployed? If so, How?
If there is any better information than what I am finding, please include it. I have read numerous articles, blogs, Stack Overflow questions, the tutorial, the wiki, etc. Everything seems to be in bits and pieces. The tutorial is nice, but doesn't explain anything about MSBuild and Votive. I would like to see a start-to-finish tutorial on using MSBuild, Votive and all the WiX MSBuild targets. If no one knows of a tutorial like this, I may put one together. I have already spent the entire week gathering info and reading. I'm new to MSBuild as well, so if anyone has any great articles on MSBuild, please include them.
The key is to isolate the different types of complexities into separate merge modules and put them altogether into an MSI as part of the build. That way things that change often can change without impacting things that hardly change at all.
1) For the data files:
We use Paraffin to generate the WiX source, and hence the merge modules, for an HTML + Flash based help system consisting of thousands of files (I can't convince the customer to go to CHM).
Compile these into a merge module all by themselves.
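If you stay with the stock WiX toolset instead of Paraffin, heat.exe does the same harvesting; a sketch, where the directory, component group name and variable are assumptions:

    heat dir "..\Data" -cg DataFiles -dr INSTALLFOLDER -gg -srd -sreg -scom -var var.DataDir -out DataFiles.wxs

Because of -var, the generated Source paths reference $(var.DataDir), so the actual folder is supplied at compile time, e.g. candle -dDataDir=..\Data.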
2) Assemblies: assuming this is a set that changes less often, just make a merge module by hand or with WixEdit, with the correct files and dependencies.
3) For the version number, there are a lot of ways to manage this depending on your build system. The AssemblyInfoTask is a pretty straightforward way to make sure all your assemblies are versioned appropriately. The MSBuild Extension Pack has some versioning support if you are using TFS.
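One simple way to get that version into the .wxs without editing it by hand is a preprocessor variable passed in from the build; the variable name below is just an assumption:

    candle.exe Product.wxs -dProductVersion=1.2.3.0

and in the .wxs refer to it as Version="$(var.ProductVersion)". The Votive .wixproj exposes the same mechanism through its DefineConstants property.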
I had a similar scenario and was unable to find a drop-in solution, so I ended up with the following:
I wrote a custom command line program called wixgen.exe for generating wxs manifest files. It is pretty specific to our implementation in that it only knows how to create 2 types of wxs files. One for IIS Website/Virtual Directory deployments and another for Windows Service deployments.
Each time a build is triggered by our continuous integration server a post-build task runs wixgen with the right args to generate a new manifest.wxs for the project being changed. It automatically includes all the files needed for the deployment. These builds also version the dlls using a variation of the technique at: http://richardsbraindump.blogspot.com/2007/07/versioning-builds-with-tfs-and-msbuild.html
A separate, manually triggered build is then used to build the .wixproj projects containing the generated .wxs files and produce the MSIs.
I would ditch the CD delivery (so 90's) and go with ClickOnce. This solution seems to fit well since you already use the .NET Framework. With ClickOnce you should be able to just keep updating the content of your solution and make updates available to your heart's content. Let me know if you need sample ClickOnce deployment code.
You can find more ClickOnce information here.
Similar to dkackman's answer, you should separate your build into several components that are isolated and built separately.
I come from a mainly Java background; however, for building MSIs and .NET executables we use Maven, with the maven-wix-plugin for building the installers and the NMaven plugin for compiling any .NET code. As we're only doing very basic development in .NET, with most development in Java, we don't need much from the NMaven plugin (which is probably a 'good thing' (TM), as it's only at version 0.17).
If you're a purely .NET house, you could also look into Byldan (http://www.codeplex.com/byldan), which seems to be the focus of development there at the moment (it's the same team behind NMaven and Byldan).
If you do want more information on NMaven or Byldan, raise another question and I'll give as much info as I can (which is not a huge amount; as stated, I only do very limited .NET development).

Compiling Massive VB.NET Project

Compiling my current project, one EXE with ~90,000 LOC plus ~100 DLLs, takes about half an hour or more depending on the speed of the workstation.
The build process is one of running devenv from Powershell scripts. This works very well with no problems.
The problem is that it is slow. I want to speed up this build process.
MSBuild (using VS 2005) is one option, but there's a bug in specifying icons to the VB compiler/linker on the command line such that it won't link successfully.
What other options are there to "make" VB.NET programs?
(Faster workstation is not an option.)
Do you absolutely have to compile the whole solution every time? With that many assemblies it seems unlikely that they all need to be rebuilt unless they actually change. If your solution is made up of multiple projects, you might consider creating multiple solutions in your build environment: one master solution containing all the projects, and another that includes only the ones that change most often. You can then configure your build process to focus on the projects that have changed. Depending on the source control system you use, you may be able to query it to determine which projects have changed since the last build, and only build those.
There's NAnt, and CruiseControl.NET for continuous builds.
You mentioned that getting a faster PC is not an option, but how much memory do you have? 2GB should be the minimum for a developer machine. Also, using a fast 10K RPM hard disk makes a big difference.
Have you tried disabling any virus scanner during your build?
If you can, upgrade to the 3.5 version of MSBuild. It can build solution files and supports multiprocessor builds (or see here if you need to host it yourself), enabling it to build projects in parallel.
The caveat is that you need to be using project references so it knows what to build.
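From the 3.5 tools, the parallel build looks something like the following; the solution name and CPU count are placeholders:

    msbuild.exe MySolution.sln /maxcpucount:4 /p:Configuration=Release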
Also, how long is it taking now? Have you looked at the CPU/Memory Usage (using something like PerfMon) to see if it is a bottleneck?
There's not much you can do to make the build process any faster short of adding more cores, CPU power, and memory to your machine, but that isn't an option in your case.
Most large projects are not self-contained in a single EXE. More often, logical units are moved into separate assemblies, which can be either DLLs or EXEs. The end result is a whole bunch of little assemblies instead of one enormous one.
To cite one example, one project I worked on was enormous, consisting of 700+ forms and tens of thousands of classes. Functionally related forms, such as those related to printing, report generation, user interrogation, etc., were self-contained in their own EXEs. If I was working on the reports, I'd exclude all projects not related to reports from the build, which helped bring the compilation time down from half an hour to a few seconds.
This programming style can be tricky, but when it's done right, it simply works, and works flawlessly.
If you have a large number of projects then you should try to reduce them; you can always split them up into DLLs later. The fewer projects, the faster the build, especially if it has to build them in a certain order.
Breaking things into smaller solutions is also an option.