I currently have a nightly build system running as a Windows scheduled task, calling a batch file, that works sort of like this:
1. Check out the latest revision from Subversion
2. Modify the AssemblyInfo.vb files of the main executable and the libraries to set the version number to 0.0.0.revision
3. Invoke MSBuild to build everything (including the installer)
4. Upload the installer and a log of the build to an FTP server
This works OK, but step 2 is dirty and fragile, and I can't imagine that this is the only way to do what I want. Any ideas?
There are a couple of ways to deal with this. You may want to check this post or others tagged with svn (and containing "AssemblyInfo").
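One approach (a sketch only, with hypothetical paths and file names, assuming svn.exe and msbuild.exe are on the PATH) is to stop rewriting the checked-in AssemblyInfo.vb files and instead have the nightly script regenerate a single VersionInfo.vb that every project links to:

# Sketch: derive the version from the working copy revision and generate
# one shared VersionInfo.vb instead of patching each AssemblyInfo.vb.
# $workingCopy, the solution name and the file name are placeholders.
$workingCopy = "C:\Builds\MyApp"
$revision = (svn info $workingCopy | Select-String '^Revision: (\d+)').Matches[0].Groups[1].Value

$versionFile = Join-Path $workingCopy "VersionInfo.vb"
@"
' Generated by the nightly build - do not edit by hand.
Imports System.Reflection
<Assembly: AssemblyVersion("0.0.0.$revision")>
<Assembly: AssemblyFileVersion("0.0.0.$revision")>
"@ | Set-Content $versionFile -Encoding UTF8

msbuild "$workingCopy\MyApp.sln" /p:Configuration=Release

Keep a default copy of VersionInfo.vb checked in (with the version attributes removed from the individual AssemblyInfo.vb files) so developer builds still compile; the nightly script only overwrites it locally and never commits the change.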
We build our solution using MSBuild 4.0 from TeamCity. Our continuous build uses quite a lot of resources, both on the build machine and on our central signing and obfuscation servers. One thing I've noticed is that even when a project fails to compile, the build continues, and other projects that do compile still get signed and obfuscated.
Is there some way to make the build halt as soon as something fails?
The MSBuild task actually supports a StopOnFirstFailure parameter; however, I can't seem to get it to have any effect.
Can I use this feature to do what I need? I'm prepared to edit, e.g., the Microsoft.Common.targets file on the build servers.
I found this answer on Stack Overflow that involves emitting a .proj file from the solution and then editing it. I guess I could automate that process on our build servers, but it seems like a lot of work to achieve a fairly basic requirement.
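To illustrate the sort of thing I'm hoping to avoid hand-rolling (project paths and order below are just placeholders), the fallback would be a wrapper script that builds the projects one at a time and bails out on the first failed compile, so nothing broken ever reaches the signing and obfuscation servers:

# Fallback sketch: build projects individually, in dependency order, and
# stop at the first failure. Assumes msbuild.exe is on the PATH; the
# project list is hypothetical.
$projects = @(
    "src\Core\Core.csproj",
    "src\Services\Services.csproj",
    "src\Web\Web.csproj"
)

foreach ($proj in $projects) {
    msbuild $proj /p:Configuration=Release /m
    if ($LASTEXITCODE -ne 0) {
        Write-Error "Build failed for $proj - stopping before signing and obfuscation."
        exit 1    # a non-zero exit fails the TeamCity build step immediately
    }
}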
I know a lot of questions have been asked about this, but I still can't figure out the answer to mine.
I need to deploy an ASP.NET 4.0 site, and I want to do something like this:
1. Get the latest version of the entire solution - a website and a couple of class projects used by the web app (I am already doing this using CCNet - not a problem)
2. Build and deploy in a debug configuration to a test site
3. Build and deploy in a release configuration to a staging site
4. If everything looks fine on the staging site, run a script that deploys the release build from the staging site to the 7-8 similar sites used by different customers on the same server. In the future this will be on another server.
There is MSDeploy (Web Deploy 2.0), aspnet_compiler, MSBuild, PowerShell (my weapon of choice...) and probably more... I am not 100% sure what to use where.
I would love to mimic the "only deploy files needed to run the site" behaviour of the VS2010 GUI-assisted deployment, and I'd love to be able to leave certain existing folders in the target websites untouched.
I feel like I should be using MSDeploy a lot... but I find it pretty hard to GET. I'm reading away on IIS.NET and I've listened to the Scott Hanselman/Jon Arild Tørresdal podcast. I am not 100% sure where to start... and I'm not an MSBuild expert, so PowerShell is looking pretty good to me. But I feel I'm missing out on the right tools by going that way...
Which tool would you use in each step?
In my opinion, MSDeploy is built to meet all of your needs, with much more control. MSDeploy is a product of the IIS team and is far more feature-rich.
Answers to your questions
Assumptions: SVN as source control, MSDeploy for deployment.
You can use the SVN command-line tool to get the latest version of your source code into a temp directory and build it with the MSBuild command-line tool. Then you can create a deployment package (or publish content) which can be deployed to a specific environment. I am currently working on a project that creates a deployment package independent of the environment, meaning the package can be deployed to any stage/test/live environment. A sample command for creating a deployment package:
msbuild C:\Projects\NimBuildDeployTestApp\NimBuildDeployWebApp/NimBuildDeployWebApp.csproj /t:package /p:Configuration=Release /p:PackageLocation=C:\Temp\DeployPackage\NimTestWebApp.zip /p:EnablePackageProcessLoggingAndAssert=true
Use the MSDeploy command-line tool to deploy the package, as below:
msdeploy -verb:sync -source:package=packagelocation -dest:CONFIGURABLE_ACCORDING_TO_YOUR_NEEDS
Repeat step 2, but build the solution in Release mode using MSBuild and configure the dest parameter in msdeploy for your staging site.
If everything seems fine on your staging server, run a deploymentToLive batch file which deploys that package to your live server. Execute the 7-8 msdeploy commands in the same batch file, or else use DFS (which is a far more complex solution).
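A rough PowerShell sketch of that live-deployment step (the server name, site names and package path are placeholders to adapt; msdeploy.exe is assumed to be on the PATH):

# Sketch: push the already-tested Release package to each customer site.
$package = "C:\Temp\DeployPackage\NimTestWebApp.zip"
$server  = "LIVESERVER"
$sites   = "CustomerA", "CustomerB", "CustomerC"    # one entry per customer site

foreach ($site in $sites) {
    $msdeployArgs = @(
        "-verb:sync",
        "-source:package=$package",
        "-dest:auto,computerName=$server",
        "-setParam:name=IIS Web Application Name,value=$site"
    )
    & msdeploy.exe $msdeployArgs
    if ($LASTEXITCODE -ne 0) { throw "Deployment to $site failed - stopping." }
}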
MSDeploy is definitely the way to go. One thing you may consider is having a deployment package ready that can be deployed to any of your environments. What I mean by that is that your web.config file won't be the same across environments, so create a deployment package that contains a web.config file for each environment. Refer to my solution here.
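An alternative to bundling one web.config per environment (a sketch, assuming you keep one edited copy of the generated SetParameters.xml per environment; file and server names below are made up) is to keep the package environment-neutral and vary only the Web Deploy parameters file at deploy time:

# Sketch: same package, different parameter values per environment.
$package   = "C:\Temp\DeployPackage\NimTestWebApp.zip"
$paramFile = "C:\Temp\DeployPackage\Staging.SetParameters.xml"    # or Test/Live variants
& msdeploy.exe "-verb:sync" "-source:package=$package" "-dest:auto,computerName=STAGINGSERVER" "-setParamFile:$paramFile"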
Hope this helps and gives you confidence in moving forward with automating your deployment process.
I'm developing an installer-like Cocoa application which needs to handle some HTTP requests, some file-system reading, copying files to /usr/share, setting up cron (not launchd), and asking the user for some information.
I discarded PackageMaker since I need more flexibility.
Currently everything is going well, but for my last installation step I need to:
1. Delete my previously installed application folder (if it exists). It's always at the same path: /usr/share/MY_APP
2. Create the application folder again at /usr/share/MY_APP
3. Copy the application files to /usr/share/MY_APP
4. Update a cron job
It's very important that /usr/share/MY_APP stays protected with administrative privileges, so a regular user shouldn't be able to delete it.
What would be the best approach to implement those steps?
BTW, I'm using Xcode 3.2.
Thanks a lot!
Carlos.
Between the preflight script, the postflight script, and perhaps an Installer plug-in for the custom UI, I see no reason why you can't do all of this in PackageMaker.
Note: “Installer plug-in” is a little misleading. The user does not have to install the plug-in somewhere as a separate step; you include the plug-in inside your package, and Installer will use it from there.
The relevant document is a ReadMe file in a sample code project. There has also been an Installer plug-in project template in Xcode since version 2.0.
Also, an Installer plug-in won't get used if the user does a command-line installation. Of course, they can't install from the command line at all (which includes remote installation onto an office or lab full of machines) if you write your own custom installer.
By the way: Why /usr/share? What are you putting there? There may be a better way to do what you're really trying to accomplish.
I've never worked on tremendously huge projects, and the workflow we use at work is check out / code / compile locally to test / commit. I was wondering how a build server would change this process. How do developers test their code when the application is too huge to compile locally? Do they just code, commit, and pray?
Absolutely not.
The developer usually has a build file which can build the project for them, with some "targets" defined which do the testing. If you have a really big project, you may have certain portions of it precompiled for you, so you don't have to build the whole thing in one big chunk. You usually do your testing locally before you commit to your repository. Breaking the build in big projects can mark you as an object of ridicule and scorn. Breaking the build in really important, really big projects can be career-limiting... ;-)
The build SERVER itself doesn't change this. The build server only runs your build file and the targets you tell it to.
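To make that concrete (a sketch only - real projects use MSBuild, NAnt, make and so on, and every name below is made up), the developer-side build file can be as small as a script with a couple of targets:

# Sketch of a developer-side build script with separate "targets".
# Solution name, test runner and paths are illustrative assumptions.
param([string]$Target = "Test")

function Invoke-Build {
    msbuild BigProject.sln /p:Configuration=Debug /m
    if ($LASTEXITCODE -ne 0) { throw "Compile failed - do not commit." }
}

function Invoke-Tests {
    Invoke-Build
    & nunit-console.exe Tests\UnitTests.dll    # fast unit tests only; the server runs the full suite
    if ($LASTEXITCODE -ne 0) { throw "Tests failed - do not commit." }
}

switch ($Target) {
    "Build" { Invoke-Build }
    "Test"  { Invoke-Tests }
    default { throw "Unknown target '$Target'" }
}

You run the Test target locally before committing; the build server runs the same script with whatever target you point it at, so "it builds for me" and "it builds on the server" mean the same thing.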
There are also build components (I've just started using TeamCity - no affiliation) that allow "personal builds".
I haven't used it yet as we haven't got it set up properly, but my understanding is that TeamCity allows running a build (and tests, if any are run on the server) with your changes before committing (and optionally the server will commit your changes if the build is successful). In TeamCity this is called a Pre-Tested Commit.
Currently I'm tasked with doing the daily build. We have an ASP.NET 2005 website with a SQL Server 2005 backend. Our current source control is Visual SourceSafe 2005.
At this point, I use the brute-force method of daily builds.
1. Get the latest version of the source code
2. Get the latest version of the database release script
3. Back up the old website files to a directory
4. Publish the new code to my local machine
5. Run on my server to keep the test/stage site working
6. Push the newly created files to the website
7. Run the SQL script on the test database (assuming there are updates, otherwise I don't bother)
8. Test the website on the test server
The idea of automated builds intrigues me, since it means I have less to do each morning. How would you recommend I proceed? I want to have a fully fleshed-out idea before I present it to my boss.
Ditch VSS, move to Subversion, and check out CruiseControl.NET. Alternatively, if you have an MSDN developer license, you can run TFS Workgroup Edition and set up a build server on any old XP box. It's what we do at our shop.
As Assaf noted, you can use CC.NET with VSS directly. Nice.
TeamCity has worked well for me. It has a very simple setup. Combine it with an MSBuild script for your operations and you're automatic.
For build management I wholeheartedly recommend TeamCity. It doesn't require IIS6 (like CC.NET does) since it runs on its own copy of Tomcat, and the setup is all done through various forms. This is a big deal to me since the build server is just an XP Pro box. It integrates well with SVN and there is no crazy XML file manipulation like I had to do with CruiseControl.NET. Big win for me.
For a build runner we use NAnt to send emails to various people, copy the packaged builds where they're supposed to go, run NUnit and NCover, and deploy the software to our web farm.
For automated testing we use WatiN.
http://www.nunit.org/index.php
http://www.jetbrains.com/teamcity
http://ncover.sourceforge.net/
http://subversion.tigris.org/
http://nant.sourceforge.net/
http://watin.sourceforge.net/
Try CruiseControl.NET. It's free, and whatever customized daily/continuous routine you want it to perform, you can always add it with scripts.
Remember, it's not just about daily (nightly) builds, but also about catching build errors in time (since it continuously builds after every source commit/check-in). You don't necessarily test every code change on every possible platform and build configuration, but CC can do exactly that for you (in the background).
http://confluence.public.thoughtworks.org/display/CCNET/Visual+Source+Safe+Source+Control+Block
All of what you are doing can be performed by a set of batch files, depending on how automated your test environment is. The main batch file can be started as a 'scheduled task' at midnight or whatever. That's how we 'do it cheap' here and at other places I've worked. If you need help with a particular batch, I can provide a sample.
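For what it's worth, here is the rough shape of such a script (sketched in PowerShell rather than batch; every path, server name and flag is a placeholder to adapt, and the VSS working folder is assumed to be configured already):

# Nightly build sketch - all paths, servers and tool locations are placeholders.
$work  = "C:\NightlyBuild"
$stamp = Get-Date -Format "yyyy-MM-dd"

# Get the latest source and database script from VSS (command-line client)
& "C:\Program Files\Microsoft Visual SourceSafe\ss.exe" Get '$/WebSite' -R -I-

# Back up the current site files
Copy-Item "\\TESTSERVER\wwwroot\Site" "$work\Backup\$stamp" -Recurse

# Precompile/publish the website
& "$env:windir\Microsoft.NET\Framework\v2.0.50727\aspnet_compiler.exe" -p "$work\Source\WebSite" -v / "$work\Publish"
if ($LASTEXITCODE -ne 0) { throw "Publish failed - leaving the test site untouched." }

# Push the newly created files to the test site
Copy-Item "$work\Publish\*" "\\TESTSERVER\wwwroot\Site" -Recurse -Force

# Apply the database release script, if one was checked in
if (Test-Path "$work\Source\Database\release.sql") {
    sqlcmd -S TESTSQL -d SiteDb -i "$work\Source\Database\release.sql"
}

Registered as a scheduled task, that is essentially the "do it cheap" version; tools like CruiseControl.NET or TeamCity mostly take over the scheduling, triggering and reporting around it, and the final step, testing the site, stays manual until you automate it with something like WatiN.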
I second (or third) the recommendation for Subversion/CruiseControl.NET. Also, if it is appropriate, check out hosted services for SVN like CVSDude. You'll probably become well versed with MSBuild in the process too. Once you get it set up, it is great.
The cost doesn't come from licensing of the tools or even hardware necessarily, but from your time building and maintaining the system - and depending on what you are doing, that could become significant.
Start with the basics and incrementally improve it over time. Like anything else, if you try to come out of the gate with lots of automation and functionality you could find yourself mired in it fulltime for weeks.
Whatever tools you use, house them in a virtual machine (i.e., VMware).
When the hardware inevitably goes south, you can copy the image onto any other machine and not miss a beat just because your build server decided to take the day off - assuming, of course, that you keep backups.