We have our source code stored in Kiln/Mercurial repositories; we use MSBuild to build our product and we have Unit Tests that utilize MSTest (Visual Studio Unit Tests).
What solutions exist for implementing a continuous integration (i.e. build) machine?
The requirements for this are:
A build should be kicked off when necessary (i.e. code has changed in the repositories we care about)
Before the actual build, the latest version of the source code must be acquired from the repository we are building from
The build must build the entire product
The build must build all Unit Tests
The build must execute all unit tests
A summary of success/failure must be sent out after the build has finished; this must include information about the build itself as well as which unit tests failed and which ones succeeded.
The summary must list the changesets that were in this build but not yet in the previous successful (!) build
The system must be configurable so that it can build from multiple branches (or repositories).
Ideally, this system would run on a single box (our product isn't that big) without any server components.
What solutions are currently available? What are their pros/cons? From the list above, what can be done and what cannot be done?
Thanks
TeamCity, from JetBrains, the makers of ReSharper, will do all of that. You will have to define what it specifically means to "build your product", but you can configure everything you specified with it.
The software can alert you to failed builds, even down to alerting only the person responsible for checking in code that broke the build. It even comes with handy web pages you can view to see only your own changes, which builds they've been through successfully, which ones are pending, and which ones are currently being executed.
Since it is a distributed product, you can make it grow with your organization and product. If at some point you find you are waiting too long for builds to complete because a lot of builds are queued up, you can add more build agents. The build agents are basically separate client programs you install on additional machines; they execute the actual build configurations.
It comes in two flavors, the professional version and the enterprise version. The professional version is free and can contain up to 20 build configurations, 20 users, and 3 build agents. The enterprise version has unlimited users and build configurations, and you can also use LDAP-based security (think domain-verified users). There are also some other bonuses in the enterprise version. You can also buy licenses for more build agents if you need more than the initial 3.
Now, if "no server components" means you don't want it to act like a web server, you're going to be hard pressed to find something that will react to your commits.
However, if you mean that you don't want to have to install a server OS, then TeamCity can work on workstation versions of Windows as well. That isn't to say that you shouldn't consider setting up a proper server for it, but it will run on a workstation if that is what you require.
Our product BuildMaster does all of the things you listed by design, and there is a free, somewhat limited edition (e.g. you can only have a limited number of issue tracking providers integrate with it, the database change script packaging tool isn't included, etc.) for 5 users or fewer.
What you've described is the basics of a CI Tool, so every CI Tool should be OK.
I use CruiseControl.NET, but it has bugs in its Mercurial support and is not very straightforward at first glance. I am nevertheless happy with it. Other tools that come to mind are Hudson, Team Build (from TFS), and TeamCity.
I have not tried other tools, but you can see pros/cons here:
TeamCity vs CC.net
Hudson vs CC.net, Link 1 and Link 2
CC.net vs TFS
EDIT: I forgot to mention that Hudson and CruiseControl.NET are open-source projects; you can easily write plugins and patches for your install.
EDIT²: The Mercurial bugs seem to be fixed in the upcoming 1.6 version of CCNet (changes committed to the trunk this week).
There's always BuildBot, which I like (and have contributed some code to). It's fairly easy to set up and run on any OS, it handles simple tasks like those you describe, and it is remarkably flexible if you need it to be.
What you might find missing are the batteries-included log scrapers and/or report generators that other, more commercial CI servers come with, especially for Enterprise-y frameworks.
It scales pretty well too, Mozilla and Chromium use it, amongst others.
We are big users of NuGet, we've got 25-30 packages which we make available on a network share.
We'd like to be able to test new packages before they're built and released to the consuming applications. Ideally, this would work something like Maven's snapshot functionality: a specific development (snapshot) version of a package that consumers can test before release.
Has anyone else come up with an (ideally reasonably non-hacky) way of doing this?
Our favoured method is to generate the package assemblies and then manually overwrite the assemblies in the packages/ directory, i.e. to replace the actual project references, but that doesn't seem particularly clean.
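As a rough PowerShell sketch of that overwrite step (all paths and package names here are hypothetical):

# Hypothetical paths; adjust to your solution layout.
$buildOutput = ".\MyLib\bin\Release"
$packageLib = ".\packages\MyLib.1.2.3\lib\net40"
# Overwrite the installed package's assemblies with the freshly built ones.
Copy-Item "$buildOutput\MyLib.dll" $packageLib -Force
Copy-Item "$buildOutput\MyLib.pdb" $packageLib -Force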
Update:
We use a CI build server which creates builds on every commit and has a specific manually triggered NuGet build which works off specifically tagged versions of the codebase. We don't want to create a NuGet build off every commit, but we would like to be able to test a likely candidate in the wild before we trigger the manual NuGet package build.
I ended up writing a unit / integration testing framework to solve a similar problem. Basically, I needed to verify the content of the package, the versions and info, what would happen when I installed and uninstalled the package, which versions of the assemblies were in lib, what bitness the assemblies were built for (x86 or x64), and so on - and I needed it all to run without Visual Studio installed and on my (headless) build machine as a quality gate.
Standing on the shoulders of giants like Pester, PETools, and SharpDevelop's package management module, I put together nuget-test.
Clone the project into your package directory (where your .nuspec file and package files are). If for whatever reason you want to keep the nuget-test project as a git repo, then simply remove "remove-item nuget-test/.git -Recurse -Force" from the command below.
git clone https://github.com/nickfloyd/nuget-test.git; remove-item nuget-test/.git -Recurse -Force
Run Setup.ps1 in the root of the nuget-test directory in an x86 instance of PowerShell.
PS> .\setup.ps1
Write tests and place them in the nuget-test/test directory using the Pester syntax.
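For example, a minimal test file might look like this (the file name and assertions are hypothetical; the exact Should syntax depends on your Pester version):

# nuget-test\test\MyPackage.Tests.ps1 - hypothetical example
Describe "MyPackage nupkg" {
    It "has the nuspec next to the package files" {
        Test-Path ".\MyPackage.nuspec" | Should Be $true
    }
}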
Run the tests.
PS> Invoke-Pester
Project page: nuget-test
On github: https://github.com/nickfloyd/nuget-test
I hope this helps you get closer to what you're trying to get done.
If you're using NuGet packages to distribute your libraries, you should not limit yourself to testing only the libraries. You should test the packages themselves as well (if your binaries are OK but incorrectly installed, consumers still have issues). The whole point is to improve this experience.
One way could be to have an additional CI or QA repository. The one you currently have is actually your "production" repository containing consumable releases, considered finished high-quality products.
Going further, you could have a logical package promotion flow (based on Continuous Integration or even using a Continuous Delivery approach), where:
- each check-in produces a package on your CI repository
- testers pick up a CI package for QA and if found OK promote it to either a QA feed, or to the Production feed (whatever you prefer, depends on the quality of your testing and how well it is automated)
There are various ways of implementing this scenario, using simple network shares, internal NuGet.Server or Gallery implementations, or simply use http://myget.org to give it a try with minimal cost and zero effort.
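As a minimal sketch of the promotion step (the feed URL, package name, and API key below are hypothetical), promoting a package is just pushing it to the next feed:

# Hypothetical feed URL and API key; substitute your own.
nuget push MyLib.1.2.3.nupkg -Source https://www.myget.org/F/myteam-qa/api/v2/package -ApiKey $env:MYGET_API_KEY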
Hope that helps!
Cheers,
Xavier
I ask this question because I find that the community contributions to the various build engines (like MSBuild and NAnt) already include all the tasks that CI servers are promoted for, like getting versions from source control, cleaning folders, changing build numbers, sending emails, etc...
Is it only because they "listen" for changes in the source control repository? What else am I missing?
Grzegorz Oledzki linked a good resource for finding the differences between multiple CI solutions, but it should be noted that the intent of MSBuild is specifically to turn code into binaries; it is used by CI software to build the source. It's true that it can do other things, but most of its tasks lie closely within that realm.
In addition to what you mentioned about listening to the repo, some CI servers can do all kinds of things like¹:
multi-agent building (not just multi-core, msbuild can do that, but multi-machine)
monitoring build status
notifications (e-mail/sms/rss/whatnot)
assigning blame for broken builds
administrative features
supporting XFDs (extreme feedback devices)
automated deployment
And generally all from a handy UI.
¹ Not all CI software will have all of these features; this list is by no means exhaustive, and there is some overlap.
I believe CI (Continuous Integration) feature matrix will answer all your questions about particular CI providers and their capabilities.
Wow, there are just so many answers to this. As for what a CI system can do that a build script can't, other than listen to your version control system... well, for starters, systems like TeamCity can let you test your code on the build server first and only check it in if it passes all the tests.
I highly recommend using a CI server, but I prefer to keep all of the build logic in an MSBuild file and all of the "who to notify when it fails" logic in the CI server. Keeping the logic in the build file helps you reproduce the build on your own machine and makes it simple to set up new projects in the CI server or to change how the CI server builds the project.
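As a minimal sketch of that split (the solution and test assembly names are hypothetical), the same build-and-test entry point the CI server calls can be run locally:

# Hypothetical solution and test assembly names.
msbuild .\Product.sln /t:Rebuild /p:Configuration=Release
mstest /testcontainer:.\Tests\bin\Release\Tests.dll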
Our MSBuild process creates a variety of zip packages for deployment (mostly web sites, but other things as well). We have a variety of recurring problems that keep sneaking back: files included that shouldn't be, missing resources. This screams for automated validation. The criteria to test for are simple:
Validation of foosite package:
Resource files are present.
No test result files, obj files, or other build artifacts
And so forth.
Ideally, I could use NUnit or MSTest, which everyone is familiar with. MSBuild knows where the packages are. We have a lot of packages and possible concurrent builds on different branches. Ergo, the locations and names of the packages are not deterministic - so the tests don't know where the packages are.
What is the simplest way to feed MSBuild information to MSTest or NUnit? The answer to this question would be one possible answer; however, that question got architectural advice instead of an answer. I know this isn't a unit test, but the test framework is handy anyway. I could create an exe to validate the build - but why add a couple of hours to the project?
Or, do you have a better suggestion for automatically validating build packages? (MSIs, zips, whatever)?
What I've ended up doing is having a bunch of custom MSBuild tasks which spin up a virtual machine on Virtual Server, copy the MSI onto the machine, silently deploy it, and then validate against it. I used PsExec to start the MSI. You could then use the MSTest command-line runner to run your test bits.
This is probably overkill for you, but using a VM allows you to start clean and not be affected by any previous installs on your dev box.
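A rough sketch of the remote install-and-test step (the machine name, credentials, and paths are hypothetical):

# Hypothetical machine name, credentials, and paths.
# Silently install the MSI on the VM, then run the tests there.
psexec \\build-vm -u vmuser -p vmpass msiexec /i C:\drop\foosite.msi /qn
psexec \\build-vm -u vmuser -p vmpass mstest /testcontainer:C:\tests\PackageTests.dll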
If you want a fast fail, like a unit test, then I suggest you create unit tests against your packages. Such a test would unzip the .zip packages and run some asserts against the contents.
You could even use some TDD techniques against the packages. For instance, if you have a deployment fail because a particular file is missing, then write a unit test that fails because the file is missing; change the build so that the file is present; then make sure the unit test succeeds.
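A minimal PowerShell sketch of such a check (the package path and expected contents are hypothetical; Expand-Archive needs PowerShell 5 or later):

# Hypothetical package path and expected contents.
$package = "C:\drop\foosite.zip"
$extracted = Join-Path $env:TEMP "foosite-validation"
Expand-Archive -Path $package -DestinationPath $extracted -Force
# Assert: required resources are present.
if (-not (Test-Path (Join-Path $extracted "web.config"))) { throw "web.config is missing" }
# Assert: no build artifacts leaked into the package.
$junk = Get-ChildItem $extracted -Recurse -Include *.obj, *.trx
if ($junk) { throw "Unexpected build artifacts: $($junk.Name -join ', ')" }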
But in general, deployment issues are broader than that, and I echo the suggestion from blowdart. Deploy into one or more virtual machines, then run automated tests over the deployed environments. These tests would not only check simple things like whether an error was returned during the installation itself; they would also check things like whether the IIS virtual directories were set up correctly, with the correct properties and contents, and whether the web site basically runs.
I'd use several different virtual machines to test different deployment scenarios: one for a clean deploy; one for an upgrade from version N-1; etc. It's possible that the same, or similar, IVT tests could be run for each environment.
Even if you can't do this all at once, the thought process involved in this exercise should lead to a more formal definition of what your deployment environment really is. This will be helpful when you get a chance to embody that formal definition in actual tests.
Currently I'm tasked with doing the daily build. We have an ASP.NET 2005 website with a SQL Server 2005 backend. Our current source control is Visual SourceSafe 2005.
At this point, I use the brute-force method of daily builds.
Get Latest version of source code
Get Latest version of Database release script
Backup old website files to a directory
Publish new code to my local machine
Run on my server to keep the test/stage site working
Push newly created files to the website
Run SQL Script on test database (assuming updates, otherwise I don't bother)
Test website on the Test Server.
Looking at the idea of automated builds intrigues me since it means that I do less each morning. How would you recommend I proceed? I want to have a fully fleshed out idea before I present it to my boss.
Ditch VSS, move to Subversion, and check out CruiseControl.NET. Alternatively, if you have an MSDN developer license, you can run TFS Workgroup Edition and set up a build server on any old XP box. It's what we do at our shop.
As Assaf noted, you can use CC.NET with VSS directly. Nice.
TeamCity has worked well for me. It has a very simple setup. Combine it with an MSBuild script for your operations and you're automatic.
For build management I wholeheartedly recommend TeamCity. It doesn't require IIS6 (like CC.NET does) since it runs on its own copy of Tomcat, and the setup is all done through various forms. This is a big deal to me since the build server is just an XP Pro box. It integrates well with SVN, and there is no crazy XML file manipulation like I had to do with CruiseControl.NET. Big win for me.
For a build runner we use NAnt to send emails to various people, copy the packaged builds where they're supposed to go, run NUnit and NCover, and deploy the software to our web farm.
For automated testing we use Watin.
http://www.nunit.org/index.php
http://www.jetbrains.com/teamcity
http://ncover.sourceforge.net/
http://subversion.tigris.org/
http://nant.sourceforge.net/
http://watin.sourceforge.net/
Try CruiseControl.Net. It's free, and whatever customized daily/continuous routine you want it to perform you can always add with scripts.
Remember, it's not just about daily (nightly) builds, but also about letting you catch build errors in time (since it continuously builds after every source commit/check-in). You don't necessarily test every code change on every possible platform and build configuration, but CC can do exactly that for you (in the background).
http://confluence.public.thoughtworks.org/display/CCNET/Visual+Source+Safe+Source+Control+Block
All of what you are doing can be performed by a set of batch files, depending on how automated your test environment is. The main batch file can be started as a scheduled task at midnight or whenever. That's how we "do it cheap" here and at other places I've worked. If you need help with a particular batch, I can provide a sample.
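For instance, a minimal PowerShell sketch of such a nightly script (all paths, server names, and project names are hypothetical):

# Hypothetical paths and names; adjust to your environment.
$src = "C:\builds\src"
$site = "\\testserver\wwwroot\foosite"
# 1. Get the latest source (ss.exe is the VSS command-line client).
ss Get '$/FooSite' -R "-GL$src" -I-
# 2. Back up the current site.
Copy-Item $site "C:\builds\backup\$(Get-Date -Format yyyyMMdd)" -Recurse
# 3. Build the site.
msbuild "$src\FooSite.sln" /p:Configuration=Release
# 4. Apply the database release script, if there is one.
sqlcmd -S testdbserver -d FooSiteDb -i "$src\db\release.sql"
# 5. Push the new files to the test site.
Copy-Item "$src\FooSite\*" $site -Recurse -Force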
I second (or third) the recommendation for Subversion/CruiseControl.NET. Also, if it is appropriate, check out hosted services for SVN like CVSDude. You'll probably become well versed in MSBuild in the process, too. Once you get it set up, it is great.
The cost doesn't come from licensing of the tools or even hardware necessarily, but from your time building and maintaining the system - and depending on what you are doing, that could become significant.
Start with the basics and incrementally improve it over time. Like anything else, if you try to come out of the gate with lots of automation and functionality you could find yourself mired in it fulltime for weeks.
Whatever tools you use, house them in a virtual machine (i.e., VMware).
When the equipment inevitably goes south, you can copy the image onto any machine and not miss a beat because your build server decided to take the day off, assuming, of course, that you keep backups.
Working in a team environment, we have a Team Foundation Server that also contains a Team Build component. It is configured to automatically build all projects and solutions at specific times or on request.
We develop a product that is built from several solutions that depend on each other. When something has been changed in one solution, it has to be rebuilt locally, manually, in both debug and release mode, so that the changes take effect in another solution that depends on it.
Also, when a developer retrieves all sources for the first time, he has to build all solutions manually in the correct order to get a working environment.
What is the best way to automate things like this? Create .cmd files that trigger the correct msbuild files? Using a program such as CruiseControl.NET?
What do you people do to maintain a clean local development environment?
What I did for our team was to provide a Visual Studio solution which contains all projects. Then I created a simple .cmd file which uses the command-line tools of Visual Studio to build this solution with its respective debug/release/profile configurations. This is a one-step build solution that can be used from every engineering machine.
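A minimal PowerShell equivalent of that one-step build script (the solution path and configuration names are hypothetical; devenv.exe must be on the PATH, e.g. in a Visual Studio command prompt):

# Hypothetical solution path and configurations.
$solution = "C:\src\Product\Everything.sln"
foreach ($config in "Debug", "Release", "Profile") {
    devenv $solution /Build $config
    if ($LASTEXITCODE -ne 0) { throw "Build failed for configuration $config" }
}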
The next level is a continuous integration system that is set up to check for changes every 15 minutes and start a build if there are changes in the VCS. I'm using Hudson as our CI system. The CI system is used to build the native projects, the Java projects, and the Flex stuff. Since everything can be built from the command line, it's pretty easy to use it with Hudson or CruiseControl.NET.