I've noticed projects such as MSBuild Extension Pack and MSBuild Community Tasks give MSBuild the power to install assemblies, run SQL, and set up IIS. These features seem to be oriented toward doing installs rather than builds.
So I was wondering: how many people out there are using MSBuild, perhaps in conjunction with CruiseControl.NET, to do installs on staging environments?
I use MSBuild to build, and part of the build process runs WiX to create an installer (MSI), which is used to deploy to production.
I wrote up a little sample of templating configurations for different target environments with MSBuild: http://blog.privosoft.com/2010/10/msbuild-environment-sandboxing.html
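The gist of the approach (not the blog's exact code; the TargetEnv property, the config folder, and PublishDir are placeholders) is a target that copies a per-environment template over the real config:

```xml
<!-- Copy the config template that matches the chosen environment; names are illustrative -->
<Target Name="ApplyEnvironmentConfig">
  <Error Condition="!Exists('config\Web.$(TargetEnv).config')"
         Text="No config template found for environment '$(TargetEnv)'." />
  <Copy SourceFiles="config\Web.$(TargetEnv).config"
        DestinationFiles="$(PublishDir)\Web.config" />
</Target>
```

Invoked with something like `msbuild deploy.proj /t:ApplyEnvironmentConfig /p:TargetEnv=Staging /p:PublishDir=C:\staging\site`.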
We use CC.NET and MSBuild to build and then also to publish to our dev and stage environments; however, we do not have the push to live on CruiseControl.NET. We run that MSBuild by hand. We just thought it would be way too tempting to have a button that publishes to live ;) It took probably 2 or 3 revisions to get our MSBuild set up right, but now everything is in one file, and everything is based on Targets and Properties to do all the work. About 6 months ago we made what should be the last update, which added multi-server push, so we are ready for scaling up. We can now push any combination of parts to any combination of servers. So if we want 5 database servers, 3 content servers, and 2 web servers, we have that ability. No need to use anything else; MSBuild can do it.
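For anyone curious, here is a heavily simplified sketch of the shape of that file (server names, output paths, and the xcopy step are placeholders, not our real script):

```xml
<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003" DefaultTargets="DeployAll">

  <PropertyGroup>
    <!-- Overridable from the command line, e.g. /p:WebServers="web01;web02" -->
    <WebServers Condition="'$(WebServers)' == ''">web01;web02</WebServers>
    <ContentServers Condition="'$(ContentServers)' == ''">content01</ContentServers>
  </PropertyGroup>

  <ItemGroup>
    <WebServer Include="$(WebServers)" />
    <ContentServer Include="$(ContentServers)" />
  </ItemGroup>

  <Target Name="DeployWeb">
    <!-- Task batching: the Exec runs once per server in @(WebServer) -->
    <Exec Command="xcopy /E /I /Y &quot;$(WebOutputDir)&quot; \\%(WebServer.Identity)\wwwroot\" />
  </Target>

  <Target Name="DeployContent">
    <Exec Command="xcopy /E /I /Y &quot;$(ContentOutputDir)&quot; \\%(ContentServer.Identity)\content\" />
  </Target>

  <Target Name="DeployAll" DependsOnTargets="DeployWeb;DeployContent" />
</Project>
```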
I created a deployment system where a central coordinator can:
- identify the right target server for a given component (e.g. a Windows service goes to a given server, web services go to another, etc.)
- perform a PsExec of a deployment MSBuild script on the target server
- the deployment MSBuild script is responsible for:
a) downloading the right component package (in my case a .zip)
b) backing up previous versions of the component
c) extracting the package to the right place
d) tailoring the installation steps to the type of component being deployed (e.g. it needs to perform an Exec task of installutil.exe for a Windows service)
e) logging the result of the deployment
This system is built using a mix of:
- core MSBuild tasks
- [Tigris MSBuild community tasks][1]
- [MS SDC tasks][2]
- and custom tasks
The system allows us to perform consistent deployment of complex apps across partitioned environments (e.g. DEV, QA, UAT, etc) made of virtual servers.
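A stripped-down sketch of what the per-server deployment script looks like, covering steps (a) through (e). The paths, property names, and the PowerShell-based unzip are illustrative stand-ins for the community/SDC/custom tasks we actually use:

```xml
<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003" DefaultTargets="Deploy">

  <PropertyGroup>
    <ComponentName Condition="'$(ComponentName)' == ''">MyService</ComponentName>
    <ComponentType Condition="'$(ComponentType)' == ''">WindowsService</ComponentType>
    <PackageSource>\\buildshare\drops\$(ComponentName).zip</PackageSource>
    <InstallDir>C:\Apps\$(ComponentName)</InstallDir>
    <BackupDir>C:\Backups\$(ComponentName)\$([System.DateTime]::Now.ToString('yyyyMMdd-HHmmss'))</BackupDir>
  </PropertyGroup>

  <!-- a) download/copy the component package locally -->
  <Target Name="Download">
    <Copy SourceFiles="$(PackageSource)" DestinationFolder="$(TEMP)" />
  </Target>

  <!-- b) back up whatever is currently deployed -->
  <Target Name="Backup" Condition="Exists('$(InstallDir)')">
    <ItemGroup>
      <ExistingFiles Include="$(InstallDir)\**\*.*" />
    </ItemGroup>
    <Copy SourceFiles="@(ExistingFiles)"
          DestinationFiles="@(ExistingFiles->'$(BackupDir)\%(RecursiveDir)%(Filename)%(Extension)')" />
  </Target>

  <!-- c) extract the package to the right place (a PowerShell stand-in for an Unzip task) -->
  <Target Name="Extract">
    <Exec Command="powershell -NoProfile -Command &quot;Expand-Archive -Force '$(TEMP)\$(ComponentName).zip' '$(InstallDir)'&quot;" />
  </Target>

  <!-- d) component-specific install step, e.g. register a Windows service -->
  <Target Name="Install" Condition="'$(ComponentType)' == 'WindowsService'">
    <Exec Command="&quot;C:\Windows\Microsoft.NET\Framework64\v4.0.30319\InstallUtil.exe&quot; &quot;$(InstallDir)\$(ComponentName).exe&quot;" />
  </Target>

  <!-- e) log the result -->
  <Target Name="Deploy" DependsOnTargets="Download;Backup;Extract;Install">
    <Message Importance="high" Text="Deployed $(ComponentName) ($(ComponentType)) to $(InstallDir)" />
  </Target>
</Project>
```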
I use MSBuild to build a fairly large client/server application. I use InstallShield 2008 to create a separate client and server install set.
By adding a custom target into the build process you can combine the creation of the installers into the build.
I would recommend that you create and test the build and the installer separately, before attempting to integrate the two.
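Once both pieces work on their own, the integration can be as small as an extra target. This is a sketch only: the IsCmdBld.exe path and the .ism file names are placeholders for whatever your InstallShield installation uses.

```xml
<!-- Hooked into the classic AfterBuild extension point of the main project -->
<Target Name="AfterBuild" Condition="'$(Configuration)' == 'Release'">
  <PropertyGroup>
    <IsCmdBld>C:\Program Files (x86)\InstallShield\2008 StandaloneBuild\System\IsCmdBld.exe</IsCmdBld>
  </PropertyGroup>
  <Exec Command="&quot;$(IsCmdBld)&quot; -p &quot;$(SolutionDir)Setup\ClientSetup.ism&quot;" />
  <Exec Command="&quot;$(IsCmdBld)&quot; -p &quot;$(SolutionDir)Setup\ServerSetup.ism&quot;" />
</Target>
```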
I know this is an old question... but...
I am currently using MSBuild with the MSBuild Extension Pack (http://msbuildextensionpack.codeplex.com) to do my entire deployment. The database portion is handled with the VS database command-line tool (vsdbcmd.exe - http://msdn.microsoft.com/en-us/library/dd193283.aspx). That Extension Pack is pretty amazing; it lets me create web sites, app pools, and Windows services, update config, and much more.
I've also put TeamCity agents on the test servers, so I can deploy as part of a build chain (introduced in version 7 of TeamCity). And running my deploy MSBuild script from TeamCity is super easy.
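As a flavor of what the IIS part looks like (the task and attribute names here are from memory of the Extension Pack docs, so double-check them against the version you install):

```xml
<!-- Import the Extension Pack's .tasks file, then drive IIS from a target -->
<Import Project="$(MSBuildExtensionsPath32)\ExtensionPack\4.0\MSBuild.ExtensionPack.tasks" />

<Target Name="ConfigureIis">
  <MSBuild.ExtensionPack.Web.Iis7AppPool TaskAction="Create"
                                         Name="MyAppPool"
                                         ManagedRuntimeVersion="v4.0" />
  <MSBuild.ExtensionPack.Web.Iis7Website TaskAction="Create"
                                         Name="MySite"
                                         Path="C:\inetpub\MySite"
                                         AppPool="MyAppPool"
                                         Port="80" />
</Target>
```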
I used to use MSBuild; now I'm using PowerShell. MSBuild is a build language, and it is painful to script in. There was a lot I wanted to do in it that was difficult and sometimes impossible.
Over the past year, I've created a PowerShell module called Carbon that is somewhat equivalent to the MSBuild Extension Pack.
I strongly, strongly encourage everyone out there to learn and use PowerShell.
Note: some of the original info was changed to keep this post focused on the real issue once it was identified.
These are some details of the current environment. I list them only because questions were raised in other posts about what was and was not working in the current environment:
Upon check-in TFS 2017 successfully builds a web project on the build agent.
A VS 2017 publish profile can manually transform the project properly
The build machine artifact location includes both the transform and profile files
The artifact location is shown below:
I have researched this in depth on Microsoft's VS site, SO, and other forums, but there are so many different answers, many of them for older versions, that I have been unable to piece this together. As a result, I have several sub-questions.
1) Can transforms be engaged in both Builds and Releases? I read that transforms are applied during the publish process, not the build process, and that made me wonder if it is even possible to do this during a Build. But then, when I was exploring releases, I saw that all the same tasks are usable in a Build, which suggests I can publish with a transform in either a Build or a Release. Is that correct?
2) Does TFS 2017 require a lot of special handling to engage a transform file? Some of the posts instructed editing the .proj file. I wanted to get confirmation before doing that kind of detailed manipulation, especially given the improvements in TFS 2017.
The following is the state of the current build definition, named "confPanner-CI". The shaded PS script was successfully used to upload to the hosting location to test the whole process, but it is not adequate for the task at hand, which requires transforms to be applied:
The full MSBuild arguments, which also create a temp location for the PowerShell script, are:
/p:DeployOnBuild=True /p:DeployDefaultTarget=WebPublish /p:WebPublishMethod=FileSystem /p:DeleteExistingFiles=True /p:publishUrl=c:\ConfPlnrWeb
Looking at adding a task for publishing, I saw the Publish Build Artifacts task:
But none of the settings as shown below seem to relate to transforms:
The bottom line question is: How do I configure the build so the web project upload has the proper web transform applied?
Update: the following was added after the answer below led to at least one place where VS transforms can be applied during a build, and presumably also during a release.
Inside the MSBuild "Build solution" task, set the Configuration as shown below:
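As a text stand-in for that screenshot (the MSBuild Arguments shown are the defaults from the ASP.NET build template, so treat the exact values as assumptions):

```
MSBuild Arguments : /p:DeployOnBuild=true /p:WebPublishMethod=Package
                    /p:PackageAsSingleFile=true /p:SkipInvalidConfigurations=true
                    /p:PackageLocation="$(build.artifactstagingdirectory)\\"
Platform          : $(BuildPlatform)
Configuration     : $(BuildConfiguration)   <- this value selects which Web.<Configuration>.config is applied
```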
The Publish Build Artifacts task is used to publish the related artifacts to Visual Studio Team Services/TFS or a file share (the "a" working directory contains the artifacts, also known as the "drop", that are uploaded at the end of the build).
Usually this should be a package, which is then consumed by a deploy task such as Deploy: WinRM - IIS Web App Deployment or Azure App Service Deployment to achieve the deployment.
1) Can transforms be engaged in both Builds and Releases?
Yes, you could also do this in a build pipeline by using a deploy task. You need to add the task after the Publish Build Artifacts task.
2) Does TFS 2017 require a lot of special handling to engage a transform file?
Update: the BuildConfiguration variable is different in TFS 2017; it's set inside the MSBuild task. Transforms are now applied according to the MSBuild task's Configuration setting.
Editing the .proj file is one method of doing the transform. If you don't need to customize the transform, it will be done automatically during the build. You could also use a third-party task/extension for extra transforms, such as XDT Transform.
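If you do want to run a transform explicitly from the project file, one common pattern is to call the TransformXml task that ships with the web publishing targets. This is a sketch; the $(VSToolsPath)-based assembly path is what web application projects normally define, so verify it for your VS version:

```xml
<UsingTask TaskName="TransformXml"
           AssemblyFile="$(VSToolsPath)\Web\Microsoft.Web.Publishing.Tasks.dll" />

<Target Name="ApplyConfigTransform" AfterTargets="Build">
  <!-- Writes a transformed Web.config to the output folder without touching the source file -->
  <TransformXml Source="Web.config"
                Transform="Web.$(Configuration).config"
                Destination="$(OutDir)Web.config" />
</Target>
```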
Usually we separate the build and the release for deployment, because it's easier to configure multiple environments and to debug issues. You definitely could do this in the build alone, but the process gets bloated. You could refer to this tutorial: Build and Deploy Azure Web Apps using Team Foundation Server/Services vNext Builds.
For a separate build and release solution, you could take a look at this blog: Using web.config transforms and Release Manager – TFS 2017/Team Services edition
I come from a Continuous Integration and Continuous Delivery (CI/CD) implementation background for Java web applications. Now I am working on a .NET-based project, and Microsoft technologies are completely new to me. The project uses MSBuild for the build process via Jenkins, and I am learning MSBuild at this time. The more I read, the more confused I am by the Microsoft way of doing this.
I noticed that MSBuild is executed for every environment where the app is going to be deployed, using various configurations and publish profiles based on the target environment. Below are some MSBuild commands for different environments (PIE, TEST, and STAGE):
```
"C:\Program Files (x86)\MSBuild\12.0\Bin\MSBuild.exe" /p:Configuration=PIE /m:4 /nr:false src/myapp.sln
"C:\Program Files (x86)\MSBuild\12.0\Bin\MSBuild.exe" /p:Configuration=TEST /t:Rebuild /m:2 /p:DeployOnBuild=true /p:PublishProfile=TEST src/myapp.sln
"C:\Program Files (x86)\MSBuild\12.0\Bin\MSBuild.exe" /p:Configuration=STAGE /t:Rebuild /p:DeployOnBuild=true /p:PublishProfile=STAGE /m:2 SprintA/src/myapp.sln
```
I may be wrong, but I feel that the code being deployed to these environments (as it progresses from PIE to TEST) is being rebuilt for each environment, which defeats the idea of real code progression. IMHO, the build should be done once and then promoted to subsequent environments for testing/validation as long as there are no bugs in the code. The various environment-specific settings are handled via config files inside the package, and the containers (Tomcat for a Java app) are started with parameters that read/parse those config files.
Is there a way to handle this in .NET? The app is deployed in IIS.
UPDATE:
The more research I do, reading various docs and blogs, the more I come across the web publishing method of running MSBuild per configuration with deploy/publish profiles. Is this just the standard way that the masses follow for a .NET project's CI/CD?
Yes, this is something Microsoft realized, and it is addressed by the new Release system.
Basically you have one "process" (Build) that builds your code and produces artifacts (e.g. a website file structure, a NuGet package, installers, etc.). In this process you typically take care of things like applying the version to your assemblies, minifying JS and CSS files, or anything else not related to any specific environment.
Then you have another "process" (Release) that configures your artifacts based on the environment where they will be deployed (e.g. modifying your website's web.config files) and deploys those artifacts to the desired environment without having to build them again (e.g. push the NuGet package to some pre-production NuGet feed, copy your website structure to the server, etc.).
How do you implement these two "processes"? Well, that depends on your preferences and tools. If you use Visual Studio Team Services, these processes are clearly defined out of the box by the infrastructure, with a lot of built-in tasks to support them.
I have not worked with Jenkins, but as long as you can use MSBuild you could have two MSBuild projects: one to build your artifacts from the source code on different branches, and another to deploy to different environments based on some configuration (e.g. your PIE and TEST). Of course, you could use tools like PowerShell or custom MSBuild tasks to support more advanced scenarios within your processes.
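For the concrete "build once, promote many" question with IIS, one common approach is to produce a single Web Deploy package and feed it a different parameter file per environment. This is only a sketch: the file names, and the habit of passing -setParamFile through the generated deploy.cmd, should be checked against the .deploy-readme.txt that packaging drops next to the package.

```
rem Build and package once, with no environment baked in:
"C:\Program Files (x86)\MSBuild\12.0\Bin\MSBuild.exe" src\myapp.sln /t:Rebuild ^
    /p:Configuration=Release /p:DeployOnBuild=true /p:WebPublishMethod=Package ^
    /p:PackageAsSingleFile=true /p:PackageLocation=C:\drop\myapp.zip

rem Promote the same package, swapping only the parameter file per environment:
C:\drop\myapp.deploy.cmd /Y -setParamFile:C:\drop\PIE.SetParameters.xml
C:\drop\myapp.deploy.cmd /Y -setParamFile:C:\drop\TEST.SetParameters.xml
```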
I have created a sample ASP.NET 5 application (pretty much the example one from New Solution) and pushed it to a Git repository hosted on Visual Studio Team Services (formerly Visual Studio Online). I want to set up continuous integration to an Azure Web App (formerly Azure Web Site). I tried to set it up from the Azure portal itself; it did create a new build definition, but it fails to build ASP.NET 5. I found a guide on how to do this, but it never really worked for me; I get errors like these, e.g.:
Error parsing solution file at C:\a\1\s\Frontend\src\Frontend\Frontend.xproj: Exception has been thrown by the target of an invocation.
Predefined type 'System.Void' is not defined or imported
Another problem is that it seems it really takes a lot of time to install dnvm, get packages, etc. So all in all it's a pain to make it work.
So are there real alternatives for this, or, more importantly, is Microsoft planning to implement something like Build ASP.NET 5, Deploy ASP.NET to Azure, and such, to make it as easy as I suppose it is with current ASP.NET 4 apps? I really hope that will be an option soon, since it's nearly impossible to work with the current build system.
For "System.Void" issue, please check the runtime version in "global.json" file and make sure it is consistent with the dependencies in "project.json" file.
For the dnvm install issue: since the ASP.NET 5 runtime environment isn't installed on the VSTS hosted build agent for now, and different users may use different runtime versions, you need to add a "PreBuild" PowerShell step that reads the runtime version from the "global.json" file and then installs it. If you can make sure you will always use only one version (for example, 1.0.0-rc1-update1), you can deploy your own build agent and install "1.0.0-rc1-update1" on it; then you can skip the dnvm installation during the build process.
Take a look at http://riffer.eu/wordpress/?p=112. There I have a solution for ASP.NET Core RC1.
Amazingly, you need only two PowerShell scripts; no compiling or Visual Studio is necessary.
I have started using VisualStudio.com build services for my continuous integration. However, some of my test projects use JustMock, and I can't find an easy way to get this working in the cloud.
Has anyone got any easy methods of doing this?
Do I need to create my own hosted build agent or is there another way?
If JustMock needs to be "installed" then you will need to either create your own agent or change your tool. To be honest, I don't recommend using a framework that needs to be installed. I would ask Telerik about options.
JustMock should provide a NuGet package that you can reference so that you don't need to install anything on the build server.
JustMock provides installation-free elevated mocking specifically for shared build servers. Depending on what build system you use, there are integration points for MSBuild, TFS Build, and the environment.
Our main website is a collection of 10 separate ASP.NET projects and applications. At the moment, a complete deployment onto a fresh server involves running ten separate msdeploy jobs; each application is built, configured (using config transforms) and packaged, but we don't have any solution for deploying all the packages as a single operation.
I can see several possibilities that might work in this scenario, but would love to hear from anybody who has succeeded - or failed - in setting up something similar:
A folder full of packages and deploy.cmd scripts, with a "master script" that will call each individual app script in turn and deploy that app to the target server.
Using a staging server where we deploy the latest build of each package from TeamCity using the production configuration, but then use msdeploy to capture that server into a single enormous msdeploy ZIP package, which is then deployed onto each production server as a single msdeploy step.
Creating a single, enormous Visual Studio solution that references EVERY project in our codebase (perhaps via svn:externals?), compiles and cross-references them ALL, and hence supports using a single msbuild job to create a huge monolithic package containing our entire codebase, built from the latest revision in source control and configured for the target environment.
I've studied Troy Hunt's excellent "You're Deploying it Wrong" series, and Scott Hanselman's "Web Deployment Made Awesome" article, but I think I'm looking for something a step beyond either of these approaches that incorporates multiple projects and applications without necessarily building them from source in a single step - any ideas?
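For reference, here is roughly how I picture option 1's "master script" as a single MSBuild file; the folder layout, the TargetServer property, and the assumption that each app was packaged with its generated .deploy.cmd are all illustrative:

```xml
<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003" DefaultTargets="DeployAll">

  <PropertyGroup>
    <TargetServer Condition="'$(TargetServer)' == ''">staging-web01</TargetServer>
  </PropertyGroup>

  <ItemGroup>
    <!-- One packaged app per folder under .\packages, each with its generated deploy.cmd -->
    <DeployCmd Include="packages\**\*.deploy.cmd" />
  </ItemGroup>

  <Target Name="DeployAll">
    <!-- Batches once per package; /Y performs the deployment, /M points it at the target server -->
    <Exec Command="&quot;%(DeployCmd.FullPath)&quot; /Y /M:$(TargetServer)" />
  </Target>
</Project>
```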
We had a very similar scenario in our company, and we created an installation package using WiX. Our config transform happens at installation time, so now we create a single build, then deploy that to each server via an MSI install package. WiX is very flexible, but also has a steep learning curve. We modify our configs using our own custom action, but it could be done other ways.
We use Team Foundation Server and MSBuild to do our builds. This is pretty straightforward, but it did take some work to set up correctly with as many projects and solutions as we have.
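Roughly, that setup is one orchestrating MSBuild file that builds the application solutions first and then the WiX setup project that packages their output (solution and project names here are made up):

```xml
<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003" DefaultTargets="BuildAll">

  <ItemGroup>
    <AppSolution Include="src\WebApps.sln" />
    <AppSolution Include="src\Services.sln" />
  </ItemGroup>

  <Target Name="BuildApps">
    <MSBuild Projects="@(AppSolution)" Properties="Configuration=Release" />
  </Target>

  <Target Name="BuildInstaller" DependsOnTargets="BuildApps">
    <!-- The .wixproj builds through the WiX toolset's own MSBuild targets (candle/light under the hood) -->
    <MSBuild Projects="setup\ProductSetup.wixproj" Properties="Configuration=Release" />
  </Target>

  <Target Name="BuildAll" DependsOnTargets="BuildInstaller" />
</Project>
```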
Other options we looked into, and even tried, were:
InstallShield - not flexible enough.
Writing our own C# installer - WiX had already thought of everything we were trying to accomplish, so why reinvent the wheel?
Just saying to heck with it all and installing things manually - 2 or 3 months of development time in WiX and MSBuild have easily paid for the hours we would have spent over the last year doing things manually.
I think the deployment tools built into Visual Studio were designed for a single application with just a few deployments. It sounds like you need external tools, and development effort, to get your deployments quicker, and eliminate the need for doing things manually. That's why we invested in the above solution, and it has really paid off.
I'd pick InstallShield.
The latest versions of InstallShield support creating Web Deploy packages.
You can define the IIS configuration for all apps in a single project, and create separate releases if you want to package each web app individually, or one single release for all web apps.
An InstallShield project has an object model through which you can automate basically every task from build scripts. The projects are also simple XML files that you can modify in automation scripts if required.
Developers can update WiX XML projects separately, and you can add those projects' builds as merge modules to your InstallShield projects through your build scripts, with some small tweaks to the InstallShield project XML (at least in the 2011 version; this part is not supported by InstallShield but can be done).
You don't even need to modify the Visual Studio projects for groups of web apps that follow the same pattern, nor manually modify your InstallShield project to add new web apps in these cases. You can create packages for new web apps without intervention by setting up your build scripts once for the InstallShield project automation task, based on the root VS build output.