Bamboo configuration as code for a .NET project

Configuration as code looks like it's purely for Maven-type projects. Would it be unwise to use this for a .NET project? We're building microservices in .NET/C# and want to take advantage of the reusability rather than living with a clone-driven process.

This is a pretty new feature, and to be honest I haven't looked at it yet. But as I understand it from the documentation, Atlassian provides tools for you to author and publish build plans or deployment projects that are Java and Maven based, while the project you are building in that plan can be any technology.
So if you were building a .NET app with MSBuild, you would write Java code to define a build plan that at some point uses an MSBuild task:
https://docs.atlassian.com/bamboo-specs/latest/com/atlassian/bamboo/specs/builders/task/MsBuildTask.html
And you would publish that plan from the Java code using their Maven goal:
mvn -Ppublish-specs
But once the plan is in Bamboo and runs, the system it's building is your .NET service.
Again, I have not tried this yet, but that is my understanding.
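For illustration, here is a rough sketch of what such a Specs class could look like. The plan/stage/job structure and the publish call follow the Bamboo Specs tutorial, but the keys, the server URL and the exact MsBuildTask builder methods are assumptions on my part - verify them against the Javadoc linked above before relying on this.

import com.atlassian.bamboo.specs.api.BambooSpec;
import com.atlassian.bamboo.specs.api.builders.plan.Job;
import com.atlassian.bamboo.specs.api.builders.plan.Plan;
import com.atlassian.bamboo.specs.api.builders.plan.Stage;
import com.atlassian.bamboo.specs.api.builders.project.Project;
import com.atlassian.bamboo.specs.builders.task.MsBuildTask;
import com.atlassian.bamboo.specs.util.BambooServer;

@BambooSpec
public class PlanSpec {
    public static void main(String[] args) {
        // One stage, one job, one MSBuild task; a source checkout task would normally come first.
        Plan plan = new Plan(
                new Project().key("SVC").name("Microservices"),   // placeholder project key/name
                "My Service", "MYSVC")                            // placeholder plan name/key
            .stages(new Stage("Build stage")
                .jobs(new Job("Build", "BUILD")
                    .tasks(new MsBuildTask()                      // builder methods below: verify against the Javadoc
                        .executable("MSBuild")                    // MSBuild executable label defined on the Bamboo server
                        .projectFile("src/MyService.sln")         // the .NET solution to build
                        .options("/p:Configuration=Release"))));  // ordinary MSBuild switches

        // `mvn -Ppublish-specs` runs this main method, which pushes the plan to your Bamboo server.
        new BambooServer("https://bamboo.example.com").publish(plan);
    }
}

The point being: the Java here only describes the plan; what the plan runs on the agent is plain MSBuild against your C# solution.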


Unable to understand a few statements from an ASP.NET Core ebook

I have recently started learning ASP.NET Core with the help of an ebook. There are a few statements in the initial chapters which I am unable to understand clearly.
For example, the following statements are made in the "Foundational improvements in ASP.NET Core" section:
Lightweight and modular HTTP request pipeline
Ships entirely as NuGet packages, including the runtime
Runtime can be installed side by side - allows you to version the application along with the runtime
These statements are not clear to me, probably because of the term "modular HTTP request pipeline" in point 1 and the terms "runtime" and "version the application" in point 2.
Any short explanation or reference to a suitable doc will be appreciated.
Thanks
HTTP request pipeline
They completely rebuilt the HTTP listener, i.e. the HTTP server. Normally you would host your application in IIS, which gives you tons of functionality but is a very old, heavyweight, sluggish application.
Now, by default, you run the application as a console app that starts up an HTTP server called Kestrel in .NET Core.
Kestrel is built from the ground up (on modular principles, with barely any technical debt), on top of a very fast C library called libuv.
The modularity means it is built from loosely coupled parts, so you can replace or extend those parts if you want to - for example, swapping in a test server for automated integration tests, as sketched below.
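A minimal sketch of what that looks like in code, assuming the 1.x-era Microsoft.AspNetCore packages (WebHostBuilder, Kestrel, and one inline middleware); the X-Handled-By header is just a placeholder:

// Program.cs - self-hosted console app; each Use() call plugs one loosely coupled part into the pipeline
using Microsoft.AspNetCore.Builder;
using Microsoft.AspNetCore.Hosting;
using Microsoft.AspNetCore.Http;

public class Program
{
    public static void Main(string[] args)
    {
        var host = new WebHostBuilder()
            .UseKestrel()                        // the HTTP server itself is a pluggable component
            .Configure(app =>
            {
                app.Use(async (context, next) =>
                {
                    context.Response.Headers["X-Handled-By"] = "Kestrel";  // placeholder header
                    await next();                // hand off to the next piece of middleware
                });
                app.Run(async context => await context.Response.WriteAsync("Hello from Kestrel"));
            })
            .Build();

        host.Run();
    }
}

For integration tests you can host the same Configure() pipeline with TestServer from Microsoft.AspNetCore.TestHost instead of Kestrel, which is exactly the kind of swap the modular design allows.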
Ships as NuGet packages
Traditionally you would install a .NET Framework version (e.g. 4.5) and get all the System.* DLLs with it (e.g. System.Web.dll).
Now all of those libraries are NuGet packages, bundled into one meta-package called NETStandard.Library: https://www.nuget.org/packages/NETStandard.Library/.
Multiple runtimes
If you build a .NET Core project, it produces DLLs. Those DLLs can run on any OS that has the .NET Core runtime installed (basically the DLLs contain intermediate language, which the runtime executes).
You can also build your project to include the runtime inside the application (a self-contained deployment), so you can run multiple .NET Core applications with different runtime versions on the same OS.
The downside of that last option is that you have to build the project specifically for every target OS. So normally people choose to build the OS-independent DLLs and make sure the right runtime is installed on the OS.
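For what it's worth, with the .NET Core CLI the difference looks roughly like this (the runtime identifiers are just examples):

# framework-dependent: portable DLLs, needs the .NET Core runtime on the target machine
dotnet publish -c Release

# self-contained: bundles the runtime, but must be published per target OS/architecture
dotnet publish -c Release -r win10-x64
dotnet publish -c Release -r ubuntu.16.04-x64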

Is it possible to use MSTest and xUnit in the same Visual Studio solution in separate projects with .NET Core?

The reason I ask is that we started a .NET Core project using MSTest and we have lots of those tests. Now we want to transition to xUnit without having to change the existing unit tests.
If it is possible, how is it done? We're currently using a project.json file. Can I just add the runner to project.json? I don't know how to have two runners in project.json.
With .NET Core 1.1 I don't think my test projects even have a test runner specified in project.json any more - I just include the NuGet packages for the runners and I'm done (I'm using the latest Visual Studio 2017 RC).
Should this somehow not be possible for you, why not use a separate project for your new tests?
And finally, are your tests really that hard to update to xUnit? Usually the basic setup is similar enough that some clever find-and-replace does the trick (see the sketch below). If not, you should probably consider using the builder pattern for your tests. If you're using ReSharper you can probably also create some rewrite macros, etc.
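To illustrate how mechanical the mapping usually is, here are two equivalent tests, one per framework, kept in separate projects so the runners don't clash (each block below is its own file). PriceCalculator is a made-up class purely for this example.

// Shared production code (hypothetical)
public class PriceCalculator
{
    public decimal WithVat(decimal net) => net * 1.2m;
}

// Old project: MSTest
using Microsoft.VisualStudio.TestTools.UnitTesting;

[TestClass]
public class PriceCalculatorMsTests
{
    [TestMethod]                                                     // becomes [Fact]
    public void Adds_vat()
    {
        Assert.AreEqual(120m, new PriceCalculator().WithVat(100m));  // becomes Assert.Equal
    }
}

// New project: xUnit
using Xunit;

public class PriceCalculatorXunitTests
{
    [Fact]
    public void Adds_vat()
    {
        Assert.Equal(120m, new PriceCalculator().WithVat(100m));
    }
}

xUnit also drops the [TestClass] attribute and uses the constructor and IDisposable for setup and teardown, which is where most of the manual migration work tends to be.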

MSBuild and code progression across various environments

I come from a Continuous Integration and Continuous Delivery (CI/CD) implementation background for Java web applications. Now I am working on a .NET-based project, and Microsoft technologies are completely new to me. The project uses MSBuild for the build process via Jenkins, and I am learning MSBuild at the moment. The more I read, the more confused I am by the Microsoft way of doing this.
I noticed that MSBuild is executed for every environment where the app is going to be deployed, using different configurations and profiles per environment. Below are some MSBuild commands for different environments (PIE, TEST and STAGE):
"C:\Program Files (x86)\MSBuild\12.0\Bin\MSBuild.exe" /p:Configuration=PIE /m:4 /nr:false src/myapp.sln
"C:\Program Files (x86)\MSBuild\12.0\Bin\MSBuild.exe" /p:Configuration=TEST /t:Rebuild /m:2 /p:DeployOnBuild=true /p:PublishProfile=TEST src/myapp.sln
"C:\Program Files (x86)\MSBuild\12.0\Bin\MSBuild.exe" /p:Configuration=STAGE /t:Rebuild /p:DeployOnBuild=true /p:PublishProfile=STAGE /m:2 SprintA/src/myapp.sln
I may be wrong, but I feel that the code deployed to these environments (as the code progresses from PIE to TEST) is being built separately for each environment, which is not the real code-progression concept. IMHO, the build should be done once and then promoted to subsequent environments for testing/validation as long as no bugs are found. Environment-specific settings are handled via config files inside the package, and the containers (Tomcat for a Java app) are started with parameters that read/parse those config files.
Is there a way to handle this in .NET? The app is deployed to IIS.
UPDATE:
The more research I do across various docs and blogs, the more I come across this web publishing method that uses MSBuild with a configuration and a deploy/publish profile per environment. Is this just the standard way most people follow for a .NET project's CI/CD?
Yes, this is something Microsoft recognized, and it is addressed with the new Release system.
Basically you have one "process" (Build) that builds your code and produces artifacts (e.g. a website file structure, NuGet package, installers, etc.). In this process you typically take care of things like applying the version number to your assemblies, minifying JS and CSS files, or anything else not tied to any specific environment.
Then you have another "process" (Release) to configure those artifacts for the environment where they will be deployed (e.g. modifying your website's web.config files) and to deploy them to that environment without having to build them again (e.g. push a NuGet package to a pre-production feed, copy your website structure to the server, etc.).
How do you implement these two "processes"? Well, that depends on your preferences and tools. If you use Visual Studio Team Services, these processes are clearly defined out of the box by the infrastructure, with a lot of built-in tasks to support them.
I have not worked with Jenkins, but as long as you can use MSBuild you could have two MSBuild projects: one to build your artifacts from the source code on different branches, and another to deploy to different environments based on some configuration (e.g. your PIE and TEST). And of course you can use tools like PowerShell or custom MSBuild tasks to support more advanced scenarios within your processes.
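As one concrete sketch of "build once, deploy many" using the tools already shown in the question (MSBuild plus Web Deploy, which works against IIS), you could package the site once and then deploy the same package per environment; the server names and parameter files below are placeholders:

REM Build step - runs once, produces an environment-neutral Web Deploy package
MSBuild.exe src\myapp.sln /p:Configuration=Release /p:DeployOnBuild=true /p:WebPublishMethod=Package /p:PackageAsSingleFile=true /p:PackageLocation=artifacts\myapp.zip

REM Release step - deploys the same package, only the parameter file changes per environment
msdeploy.exe -verb:sync -source:package=artifacts\myapp.zip -dest:auto,computerName=PIE-WEB01 -setParamFile:params.PIE.xml
msdeploy.exe -verb:sync -source:package=artifacts\myapp.zip -dest:auto,computerName=TEST-WEB01 -setParamFile:params.TEST.xml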

TFS2010 build with unit test

I am trying to convert a build system set up with TeamCity and NAnt scripts to use TFS 2010 (we bought the license and might as well make use of it). After some work I got the web project to build and deploy to the web server. We have domain, API, test and web projects in our solution.
How do I configure TFS to run the unit tests that we have written so far? I did configure the build to look for ***.UnitTest.dll in (VS2010) Edit build definition > Process > Automated Tests.
Now the build fails with a message that says: "Could not load file or assembly 'nunit.framework, Version=2.5.3.9345'". Am I correct in saying that TFS is trying to run NUnit on the build server? I did install NUnit 2.5.3.9345 on that TFS 2010 build server, and still nothing.
Thank you
Jack
The build facility in TFS uses MSTest as its test runner, with which it's tightly integrated.
If you want to run your unit tests with NUnit as part of your build, take a look at the NUnit for Team Build project on CodePlex.
The project started out for TFS 2008, however support for TFS 2010 has been added in version 2.0. Note that this feature is still in early stages of development, so your mileage may vary.
I'm late to the game, because I've had to deal with this issue recently. I found the article below helpful. It didn't work right off the bat, but once I added it into my build script via the controls, following a similar pattern, it worked.
My only problem now is getting it to actually raise errors (right now it only warns), even when flagging the tests so that they should fail the build.
Link: http://blog.gfader.com/2011/06/running-nunit-tests-in-tfs-2010.html

NAnt or TFS Build: which is better?

There was a question about MSBuild and NAnt advantages and disadvantages. Now let's see which is better: TFS Build (with MSBuild) or NAnt. In my opinion NAnt, because you can move the build environment to another machine in a few seconds (it only depends on copying files), it's easier to manage, much faster to debug, and it's not tied to Team Foundation Server. What do you think?
It's worth noting that "TFS Build" (actually named "Team Build") uses the MSBuild engine, which is now part of the Windows SDK. It's a free engine into which you can plug custom tasks, and there's a community of users who have done just that. See
http://www.codeplex.com/sdctasks
http://www.codeplex.com/MSBuildExtensionPack
among many others.
MSBuild is very mature, having been part of .NET since version 2.0. It could hardly be more widely used, as it is the build engine that builds all C# and VB.NET projects from the Visual Studio IDE.
It also comes with an API with which you can programmatically control builds, do custom logging, and so on.
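For example, here is a minimal sketch of driving a build through the MSBuild 4.0 object model (the Microsoft.Build assemblies); the solution path and property values are placeholders:

// Builds a solution programmatically and logs the result to the console.
using System.Collections.Generic;
using Microsoft.Build.Evaluation;
using Microsoft.Build.Execution;
using Microsoft.Build.Framework;
using Microsoft.Build.Logging;

class ProgrammaticBuild
{
    static void Main()
    {
        var properties = new Dictionary<string, string> { { "Configuration", "Release" } };

        var parameters = new BuildParameters(new ProjectCollection())
        {
            Loggers = new ILogger[] { new ConsoleLogger(LoggerVerbosity.Minimal) }  // custom loggers plug in here
        };

        // Build the "Build" target of the solution (placeholder path)
        var request = new BuildRequestData(@"C:\src\MyApp.sln", properties, null, new[] { "Build" }, null);

        BuildResult result = BuildManager.DefaultBuildManager.Build(parameters, request);
        System.Console.WriteLine(result.OverallResult);  // Success or Failure
    }
}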
I have used both. We have moved from CruiseControl / NAnt to TFS.
The big benefit, and the reason why we moved, is the integration and reporting possibilities.
NAnt is easier to work with for simple projects. But if you have a large environment and you look at it as your whole software development process, not just a build server, then I find TFS much better.
Just use whatever makes you the most productive.
I haven't used TFS, but have been using NAnt for some time, so take the following points for what you will:
NAnt is free and open source
NAnt is pluggable with custom tasks
NAnt can be used together with various other tools, usually free and open source
NAnt is mature and widely used
NAnt is portable, doesn't require installation
The question should really be NAnt vs MSBuild. You could use Team Build to run NAnt if you wanted.
Sometimes a solution is easier in MSBuild, sometimes in NAnt...
I like both, but MSBuild is always on the box...