I have a bunch of C# VS2010 projects compiling automatically on a TeamCity build server.
The build server compiles the projects and then runs automatic unit tests on the output.
The problem is that some of the tests try to communicate with WCF services on the local server.
The tests fail because the build server only builds the projects and does not publish the resulting services to IIS7 (which runs alongside TeamCity).
Is there a simple way to automatically tell TeamCity (maybe through MSBuild.exe) to publish my *.svc files every time the code finishes compiling?
Thank you.
The simplest thing to do would be to point IIS7 at the TeamCity checkout directories -- the build output lands there, so you can run the tests against the services without simulating a deployment. You might also want to create two stages of tests: a first set of more traditional unit tests that runs before "deployment", and a second set that runs after the first set succeeds and "deployment" happens.
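If you go that route, a one-time step along the following lines maps an IIS application onto the checkout directory so the services are reachable as soon as the build finishes. This is only a sketch: the site name, application path and checkout path are placeholders, not values from your setup.

    # Point an IIS7 application at the TeamCity checkout/build output directory (placeholder paths).
    & "$env:WINDIR\System32\inetsrv\appcmd.exe" add app `
        /site.name:"Default Web Site" `
        /path:/BuildServices `
        /physicalPath:"C:\TeamCity\buildAgent\work\MyProject\MyWcfServices"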
Deploying out of TeamCity can definitely work, though exactly how depends on your network and application topology.
To deploy the web services you can use Web Deploy to package and install your services in IIS. However, it seems that the real issue is the dependency your tests have on your services. You should abstract your service interfaces and use a mocking framework and your favourite DI container in your tests so the services don't need to be up.
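As a rough sketch of the Web Deploy route: a build step like the one below publishes a web/WCF project straight into the local IIS after compilation. The project path and IIS application path are placeholders, and the property set mirrors the MSDeploy publish properties shown in the TFS answer further down.

    # Hedged sketch: publish a WCF service project to local IIS via Web Deploy (placeholder names).
    & "C:\Windows\Microsoft.NET\Framework\v4.0.30319\MSBuild.exe" src\MyWcfServices.csproj `
        /p:Configuration=Release `
        /p:DeployOnBuild=True /p:DeployTarget=MSDeployPublish /p:CreatePackageOnPublish=True `
        /p:MSDeployPublishMethod=InProc /p:MsDeployServiceUrl=localhost `
        /p:DeployIisAppPath="Default Web Site/MyWcfServices"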
HTH.
I come from a Continuous Integration and Continuous Delivery (CI/CD) implementation background for Java web applications. Now I am working on a .NET based project, and Microsoft technologies are completely new to me. The project uses MSBuild for the build process via Jenkins, and I am learning MSBuild at the moment. The more I read, the more confused I am by the Microsoft way of doing this.
I noticed that msbuild is executed once per environment the app is going to be deployed to, using different configurations and publish profiles for each environment. Below are some msbuild commands for the different environments (PIE, TEST and STAGE):
"C:\Program Files (x86)\MSBuild\12.0\Bin\MSBuild.exe" /p:Configuration=PIE /m:4 /nr:false src/myapp.sln
"C:\Program Files (x86)\MSBuild\12.0\Bin\MSBuild.exe" /p:Configuration=TEST /t:Rebuild /m:2 /p:DeployOnBuild=true /p:PublishProfile=TEST src/myapp.sln
"C:\Program Files (x86)\MSBuild\12.0\Bin\MSBuild.exe" /p:Configuration=STAGE /t:Rebuild /p:DeployOnBuild=true /p:PublishProfile=STAGE /m:2 SprintA/src/myapp.sln
I may be wrong, but I feel that the code deployed to these environments (as it progresses from PIE to TEST) is being rebuilt for each environment, which is not the usual code-promotion concept. IMHO, the build should be done once and the resulting artifact promoted to subsequent environments for testing/validation as long as no bugs are found. The environment-specific settings are handled via config files inside the package, and the containers (Tomcat for a Java app) are started with parameters that read/parse those config files.
Is there a way to handle this in .NET? The app is deployed to IIS.
UPDATE:
The more research I do, reading various docs and blogs, the more I come across the web-publishing method of running msbuild per configuration with deploy/publish profiles. Is this just the standard approach that most people follow for a .NET project's CI/CD?
Yes, this is something Microsoft realized, and it is addressed by the new Release system.
Basically you have one "process" (Build) that builds your code and produces artifacts (e.g. a website file structure, NuGet packages, installers, etc.). In this process you typically take care of things like applying the version number to your assemblies, minifying JS and CSS files, and anything else not related to any specific environment.
Then you have another "process" (Release) that configures your artifacts for the environment where they will be deployed (e.g. modifying the web.config files of your website) and deploys those artifacts to the desired environment without having to build them again (e.g. pushing a NuGet package to a pre-production feed, copying your website structure to the server, etc.).
How do you implement these two "processes"? Well, that depends on your preference and tools. If you use Visual Studio Team Services, these processes are clearly defined out of the box by the infrastructure, along with a lot of built-in tasks to support them.
I have not worked with Jenkins, but as long as you can use msbuild you could have two msbuild projects: one to build your artifacts from the source code on different branches, and another one to deploy them to different environments based on some configuration (e.g. your PIE and TEST). You could of course also use tools like PowerShell or custom MSBuild tasks to support more advanced scenarios within your processes.
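To make the build-once/deploy-many idea concrete, here is a rough sketch: the Build process produces a single Web Deploy package with no environment-specific settings, and the Release process pushes that same package to each environment with msdeploy.exe and an environment-specific SetParameters file. The package name, server URL, credentials and file paths below are all placeholders.

    # Build once: produce an environment-neutral Web Deploy package.
    & "C:\Program Files (x86)\MSBuild\12.0\Bin\MSBuild.exe" src\myapp.sln `
        /p:Configuration=Release /p:DeployOnBuild=true /p:WebPublishMethod=Package `
        /p:PackageLocation=artifacts\myapp.zip

    # Deploy many: push the same package to an environment with its own parameters (placeholder credentials).
    $deployUser = "deploy"
    $deployPassword = "secret"
    & "C:\Program Files\IIS\Microsoft Web Deploy V3\msdeploy.exe" `
        "-verb:sync" `
        "-source:package=artifacts\myapp.zip" `
        "-dest:auto,computerName=https://pie-server:8172/msdeploy.axd,authType=Basic,userName=$deployUser,password=$deployPassword" `
        "-setParamFile:config\PIE.SetParameters.xml"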
We're trying to work through the new tool chain for building and deploying an ASP.NET 5 (vNext) CoreCLR web site to a server cluster. Between the new compilation changes and the changes to TFS, I'm not sure how everything now gets built and deployed. The scenario is as follows:
On-premise TFS for source control and build agent
Targeting ASP.NET 5 under CoreCLR, hosted via IIS
Questions are:
Using TFS for continuous integration builds (and hopefully deployment to an on-premise IIS server), how does one build and deploy this new application type?
It seems like MSBuild might still be usable to point at a .sln file so as to indirectly invoke dnu.exe; is that correct, and is that the appropriate way to do it now?
Or should we instead be running a scripted build task that runs dnu.exe?
How are these new CoreCLR builds deployed? Just a straight copy to a directory on a remote machine?
This is a new application and we're using a multi-layered application architecture, where the DAL and Business logic are in their own CoreCLR projects, if that makes a difference.
Thanks in advance for shedding some light on the situation.
Here is what we ended up doing:
1). PowerShell script "prebuild.ps1", as per the previous answer and the Microsoft deployment guidelines: https://msdn.microsoft.com/en-us/Library/vs/alm/Build/azure/deploy-aspnet5
2). Vanilla MSBuild build. No switches or special settings.
3). PowerShell script to execute the xUnit test runner. We used the guidance from this post: http://fluentbytes.com/running-xunit-test-with-asp-net-dnx-and-tfs-2015-build/
4). PowerShell script to run "dnu publish". This creates a directory containing the entire web application's structure (see the sketch after this list).
5). "Windows File Copy" task to deploy the directory structure created in step 4) to all of the target machines in the test environment.
To build and deploy ASP.NET 5 via the TFS 2015 vNext build system, you need to:
1). Create a PowerShell script (named Prebuild.ps1, for example) to install DNX. Details of the PowerShell script can be found at https://msdn.microsoft.com/en-us/Library/vs/alm/Build/azure/deploy-aspnet5. Add the script file to TFS version control.
2). Add a PowerShell script build step to the build definition and run the Prebuild.ps1 script in this step.
3). In the MSBuild step, specify the project that needs to be built, and add the following arguments to publish the project to IIS:
/p:DeployOnBuild=True /p:DeployTarget=MSDeployPublish /p:CreatePackageOnPublish=True /p:MSDeployPublishMethod=InProc /p:MsDeployServiceUrl=localhost /p:DeployIisAppPath="Default Web Site/TFSTest1" /p:VisualStudioVersion=14.0
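Put together, the invocation behind that MSBuild step ends up looking roughly like the line below; the MSBuild path and solution path are assumptions for illustration, while the properties are the ones listed above.

    & "C:\Program Files (x86)\MSBuild\14.0\Bin\MSBuild.exe" src\MyApp.sln `
        /p:DeployOnBuild=True /p:DeployTarget=MSDeployPublish /p:CreatePackageOnPublish=True `
        /p:MSDeployPublishMethod=InProc /p:MsDeployServiceUrl=localhost `
        /p:DeployIisAppPath="Default Web Site/TFSTest1" /p:VisualStudioVersion=14.0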
I am trying to convert a build system set up with TeamCity and NAnt scripts to use TFS2010 (we bought the license and might as well make use of it). After some work I got the web project to build and deploy to the web server. We have a domain, API, test and web project in our solution.
How do I configure TFS to run the unit tests that we have written so far? I did configure the build to look for *.UnitTest.dll in (VS2010) Edit build definition > Process > Automated Tests.
Now the build fails with a message that says: "Could not load file or assembly 'nunit.framework, Version=2.5.3.9345'". Am I correct in saying that TFS is trying to run NUnit on the build server? I did install NUnit 2.5.3.9345 on that TFS2010 build server, and still nothing.
Thank you
Jack
The build facility in TFS uses MSTest as its test runner, with which it is tightly integrated.
If you want to run your unit tests with NUnit as part of your build, take a look at the NUnit for Team Build project on CodePlex.
The project started out for TFS 2008, however support for TFS 2010 has been added in version 2.0. Note that this feature is still in early stages of development, so your mileage may vary.
I'm late to the game, because I've had to deal with this issue recently, and I found the article linked below helpful. It didn't work right off the bat, but once I added it into my build script via the controls, following a similar pattern, it worked.
My only remaining problem has been getting failing tests to actually error the build (right now they only produce warnings), even when they are flagged to cause the build to fail.
Link: http://blog.gfader.com/2011/06/running-nunit-tests-in-tfs-2010.html
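If all you need is for failing tests to break the build, another option (not covered by the linked article) is to shell out to NUnit's console runner from a build step: it returns a non-zero exit code when any test fails, which the build can turn into a hard error. The NUnit install path and test assembly path below are placeholders.

    # Run the NUnit 2.x console runner; a non-zero exit code means failed tests, so fail the build.
    & "C:\Program Files (x86)\NUnit 2.5.3\bin\net-2.0\nunit-console.exe" `
        ".\bin\Release\MyApp.UnitTest.dll" /xml:TestResult.xml
    if ($LASTEXITCODE -ne 0) { throw "Unit tests failed (nunit-console exited with $LASTEXITCODE)" }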
I'm using the MSTest system for unit testing my Compact Framework (3.5) application and DLLs. When I test some DLLs the tests just run, but for others the emulator is loaded first. Can anyone tell me what determines whether the emulator is launched?
The testrunconfig file tells mstest which platform to deploy the tests to. However, if you have the configuration set to both build and deploy all of the DLLs, then the DLLs will attempt to deploy to their default target, not the target from the testrunconfig (yes, it's stupid and confusing).
The general rules I follow are:
Go through each project and set the target device to the same thing.
Use the Configuration Manager and set it to deploy only those items that won't already get deployed as dependencies.
Set the testrunconfig to match the target device from above
I've noticed that projects such as the MSBuild Extension Pack and MSBuild Community Tasks give msbuild the power to install assemblies, run SQL, and set up IIS. These features seem to be oriented towards doing installs rather than builds.
So I was wondering: how many people out there are using msbuild, perhaps in conjunction with CruiseControl.NET, to do installs on staging environments?
I use MSBuild to build, and part of the build process runs WiX to create an installer (MSI), which is used to deploy to production.
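For illustration only (the file names and the WiX install path are placeholders, not from my actual build), the WiX step at the end of the build boils down to compiling and linking the setup sources into an MSI:

    # Compile and link a WiX setup into an MSI as the final build step (placeholder paths).
    & "C:\Program Files (x86)\WiX Toolset v3.10\bin\candle.exe" installer\Product.wxs -out obj\Product.wixobj
    & "C:\Program Files (x86)\WiX Toolset v3.10\bin\light.exe" obj\Product.wixobj -out artifacts\MyApp.msi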
I wrote up a little sample of templating configurations for different target environments with msbuild: http://blog.privosoft.com/2010/10/msbuild-environment-sandboxing.html
We use CC.NET & MSBuild to build and then also to publish to our dev and stage environments; however, we do not have the push to live on CruiseControl.NET, we run that MSBuild by hand. We just thought a button to publish to live would be way too tempting ;) It took probably 2 or 3 revisions to get our MSBuild set up right, but now everything is in one file, and everything is based on Targets and Properties to do all the work. About 6 months ago we made what should be the last update, a multi-server push, so we are ready for scaling up. We can now push any combination of parts to any combination of servers. So if we want 5 database servers, 3 content servers, and 2 web servers, we have that ability. No need to use anything else; MSBuild can do it.
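To give a feel for what "based on Targets and Properties" means in practice, the hand-run publish boils down to a single msbuild call where targets select the parts and properties select the servers. Everything below (project file name, target names, property names, server lists) is a made-up illustration, not our actual script.

    # Illustrative only: one deploy project, targets pick the parts, properties pick the servers.
    & "C:\Windows\Microsoft.NET\Framework\v4.0.30319\MSBuild.exe" Deploy.proj `
        "/t:DeployDatabases;DeployContent;DeployWeb" `
        /p:DatabaseServers="db1;db2;db3;db4;db5" `
        /p:ContentServers="content1;content2;content3" `
        /p:WebServers="web1;web2"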
I created a deployment system where a central coordinator can:
- identify the right target server for a given component (e.g. a Windows service goes to one server, web services go to another, etc.)
- perform a PsExec of a deployment MSBuild script on the target server
- the deployment MSBuild script is then responsible for:
  a) downloading the right component package (in my case a .zip)
  b) backing up previous versions of the component
  c) extracting the package to the right place
  d) tailoring the installation steps to the type of component being deployed (e.g. performing an Exec task of installutil.exe for a Windows service)
  e) logging the result of the deployment
This system is built using a mix of:
- core MSBuild tasks
- the Tigris MSBuild community tasks
- the MS SDC tasks
- and custom tasks
The system allows us to perform consistent deployment of complex apps across partitioned environments (e.g. DEV, QA, UAT, etc) made of virtual servers.
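Purely to illustrate steps a) through e) above (the real implementation is an MSBuild script built from the task libraries listed; every URL, path and service name here is a made-up placeholder, and it assumes PowerShell 5+ for Expand-Archive), the per-server work looks roughly like this:

    # Hypothetical placeholders; the real script resolves these per component.
    $packageUrl = "http://build-server/packages/MyService.zip"
    $installDir = "C:\Services\MyService"
    $backupDir  = "C:\Backups\MyService\$(Get-Date -Format yyyyMMddHHmmss)"

    # a) download the right component package
    Invoke-WebRequest -Uri $packageUrl -OutFile "$env:TEMP\MyService.zip"

    # b) back up the previous version of the component
    if (Test-Path $installDir) { Copy-Item $installDir $backupDir -Recurse }

    # c) extract the package to the right place
    Expand-Archive "$env:TEMP\MyService.zip" -DestinationPath $installDir -Force

    # d) component-specific install step, e.g. register a Windows service
    & "$env:WINDIR\Microsoft.NET\Framework\v4.0.30319\installutil.exe" "$installDir\MyService.exe"

    # e) log the result of the deployment
    Add-Content "C:\Deployments\deploy.log" "$(Get-Date) deployed MyService, installutil exit code $LASTEXITCODE"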
I use MSBuild to build a fairly large client/server application. I use InstallShield 2008 to create a separate client and server install set.
By adding a custom target into the build process you can combine the creation of the installers into the build.
I would recommend that you create and test the build and the installer separately, before attempting to integrate the two.
I know this is an old question... but...
I am currently using MSBuild with the MSBuild Extension Pack (http://msbuildextensionpack.codeplex.com) to do my entire deployment. The database portion is handled with the VS database command-line tool (vsdbcmd.exe - http://msdn.microsoft.com/en-us/library/dd193283.aspx). That Extension Pack is pretty amazing, and lets me build web sites, app pools, Windows services, update config, and much more.
I've also put TeamCity agents on the test servers, so I can deploy as part of a build chain (introduced in version 7 of TeamCity). And running my deploy MSBuild script is super easy from TeamCity.
I used to use MSBuild; now I'm using PowerShell. MSBuild is a build language, and it is painful to script in. There was a lot I wanted to do in it that was difficult and sometimes impossible.
Over the past year, I've created a PowerShell module somewhat equivalent to the MSBuild Extension Pack, called Carbon.
I strongly, strongly encourage everyone out there to learn and use PowerShell.