Does installing MsBuild tools change my old environment

I have been tasked with upgrading our build environment, which, as I understand it, currently uses a mix of VS2010 and VS2015. I want to install the latest build tools and start updating the scripts, but I don't have a lot of experience, and since this build machine is used for production builds I don't want to break the current environment in case it takes me a long time to update everything.
So my question is whether I can install the latest MSBuild tools without breaking anything, such as environment variables and the like.
Also, how does ToolsVersion="Current" work? Is that set up in some environment variable through vsvars or similar?
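For what it's worth, a quick way to check what is already on the machine before and after installing is to list the MSBuild toolsets that are registered. A rough sketch in PowerShell (the vswhere.exe path below is just its default install location; older toolsets such as 4.0/12.0/14.0 are registered under the MSBuild ToolsVersions registry key):
# List MSBuild instances installed by the Visual Studio installer (2017+),
# then the classic toolsets registered in the registry.
$vswhere = "${env:ProgramFiles(x86)}\Microsoft Visual Studio\Installer\vswhere.exe"
if (Test-Path $vswhere) {
    & $vswhere -all -products '*' -requires Microsoft.Component.MSBuild -find "MSBuild\**\Bin\MSBuild.exe"
}
Get-ChildItem "HKLM:\SOFTWARE\Microsoft\MSBuild\ToolsVersions" | Select-Object -ExpandProperty PSChildName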

Related

What is the upgrade path for C# MSBuild Tools?

We are using Bamboo to compile our C# projects and recently enhanced our build AMIs to include the MSBuild 15 Build Tools so devs are able to use C# 7. With the advent of C# 7.1, Microsoft state they are "increasing the cadence" of language releases, and I've been trying to find out what the upgrade path for MSBuild Tools is and how to keep it updated with the latest version.
At the moment it appears that the Bamboo admins would have to always be one step ahead of the devs (who update their IDE to use the new language release) to enable clean compilations.
I can't find a decent way of automating this (other than through something like Chocolatey) - I'd be interested to see how other people get round this or ensure their Build Tools are always up to date.
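For the Chocolatey route mentioned above, the upkeep can be reduced to a scheduled upgrade step on the build agents. A minimal sketch (the package id below is only an example of a community Build Tools package and may not match the toolset your devs need):
# Scheduled/agent-startup step: upgrade the Build Tools package if a newer one exists.
# The package id is an assumption; substitute whichever package/version your AMIs use.
choco upgrade visualstudio2017buildtools -y --no-progress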

Dynamically change Upgrade Code in Install Shield

As said here and here, doing an upgrade with Install Shield is easy.
I did that, but I need something more - I need to change the Upgrade Code of an upgrade dynamically.
I have TeamCity, which builds my product with different Upgrade Codes - this way I have different products that I can install on the same machine. I am doing it using InstallShieldPropertyOverrides in the *.isproj file. It works, but I want every single one of those products to be upgradeable, so I need to change the Upgrade Code in the Upgrades section (Organize Your Setup > Upgrade Paths). I need it to be done dynamically, during the build on TeamCity, but I found no InstallShield property in InstallShieldPropertyOverrides that can be configured with MSBuild. Does anyone have an idea or solution?
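One rough workaround, not an officially documented InstallShield property, just a sketch: if the upgrade-path GUID is not reachable through InstallShieldPropertyOverrides, a pre-build TeamCity step can patch it directly in the project file before MSBuild runs, assuming the .ism is saved in XML format. The file name and GUIDs below are placeholders and would normally come from TeamCity build parameters:
# Pre-build step sketch: swap the upgrade-path Upgrade Code in an XML-format .ism.
$ism = "Product.ism"
$OldUpgradeCode = "{11111111-1111-1111-1111-111111111111}"
$NewUpgradeCode = "{22222222-2222-2222-2222-222222222222}"
(Get-Content $ism -Raw) -replace [regex]::Escape($OldUpgradeCode), $NewUpgradeCode | Set-Content $ism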

How to obtain all versions of KRE?

Problem
I want to use both stable versions of the KRE and bleeding-edge nightly builds of the KRE. One ASP.NET 5 application may be on beta2, but another I may want on beta4. So what I did was install both in PowerShell as found here.
What happened is that the stable kvm installed into C:/Users/derp/.kre and the nightly-build kvm installed into C:/Users/derp/.k
Worse yet, I can only see this now
Attempts
I tried kvm install KRE-CLR-x86.1.0.0-beta2 and it failed
Shall I try moving the packages from the .kre folder to the .k folder? This seems hacky and like a really bad idea.
RTFM - I tried to use the install feature, including the -a switch, but that failed too.
I'm doing something the hard way and can't see the obvious.
I searched on here.
I feel if there is an answer to what I am trying to do above, it is worth being on here for others to find as well. Thank you all for your patience.
ASP.NET 5 is under development and there is no guarantee that changes between different pre-release versions are backward compatible (sorry!).
The .kre -> .k rename is not backward compatible and you cannot have both the old and the new kvm on the PATH simultaneously. However, you can have two versions of kvm on your machine, but you will have to use the full path for at least one of them.
I think the key is the PATH environment variable of your system. You have to use two sets of kvm, one for the nightly builds and one for the public beta, to download the runtimes and set the correct PATH entries.
For instance, I got one kvm from the Entity Framework 7 repository, which can download and use beta 4 builds. I also have another kvm from the Home repository, which can download and use public beta builds.
You can use either kvm with the "upgrade" or "use" command to set the correct PATH entry, then run your application on the runtime you need. I think even Visual Studio 2015 CTP runs your projects based on the runtime specified in your PATH. For the time being, only beta 3 runtimes show up in the project properties dialog of VS 2015 CTP, but when I hit Ctrl+F5 my website loads the beta 4 runtime and assemblies - I can see the loading in the output window. I think this is because the .k folder comes before the .kre folder in my PATH environment variable.
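If it helps to confirm that ordering, you can list which runtime folders are on PATH and in what order; the first matching entry is the one that wins. A small sketch, assuming the default per-user install locations under the profile folder:
# Show the .k / .kre entries on PATH in the order they will be searched.
$env:Path -split ';' | Where-Object { $_ -like "$env:USERPROFILE\.k*" }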
Can you try the following?
$cmd-prompt>kpm Install KRE-CLR-x86
It worked for me.

Testing a NuGet package

We are big users of NuGet; we've got 25-30 packages which we make available on a network share.
We'd like to be able to test new packages in the consuming applications before they're formally built and released. Ideally, this could be done with something similar to Maven's snapshots, i.e. a specific development package (snapshot functionality).
Has anyone else come up with an (ideally reasonably non-hacky) way of doing this?
Our favoured method is to generate the package assemblies and then manually overwrite the assemblies in the packages/ directory, i.e. to replace the actual project references, but that doesn't seem particularly clean.
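Concretely, that overwrite step boils down to something like the following (assembly, project, and version names are placeholders):
# Copy the freshly built assembly over the copy that NuGet restored into the consuming solution.
Copy-Item .\MyLib\bin\Release\MyLib.dll ..\ConsumerApp\packages\MyLib.1.2.2\lib\net45\ -Force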
Update:
We use a CI build server which creates builds on every commit and has a specific manually triggered NuGet build which works off specifically tagged versions of the codebase. We don't want to create a NuGet build off every commit, but we would like to be able to test a likely candidate in the wild before we trigger the manual NuGet package build.
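One relatively non-hacky option in the spirit of Maven snapshots is to let the CI server pack candidates with a prerelease suffix and drop them on a separate development share, so consumers opt in explicitly without touching the release feed. A sketch with hypothetical names and paths:
# Pack a candidate with a prerelease version and publish it to a dev-only share.
nuget pack MyLib\MyLib.nuspec -Version 1.2.3-ci0042 -OutputDirectory .\artifacts
Copy-Item .\artifacts\MyLib.1.2.3-ci0042.nupkg \\server\nuget-dev\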
I ended up writing a unit / integration testing framework to solve a similar problem. Basically, I needed to verify the content of the package, the versions and info, what would happen when I installed and uninstalled the package, what versions the assemblies in lib were, what bitness the assemblies were built for (x86 or x64), and so on - and I needed it all to run without Visual Studio installed, on my build machine (headless), as a quality gate.
Standing on the shoulders of giants like Pester, PETools, and SharpDevelop's package management module, I put together nuget-test.
Clone the project into your package directory (where your .nuspec file and package files are). If for whatever reason you want to keep the nuget-test project as a git repo, simply remove "remove-item nuget-test/.git -Recurse -Force" from the command below.
git clone https://github.com/nickfloyd/nuget-test.git; remove-item nuget-test/.git -Recurse -Force
Run Setup.ps1 in the root of the nuget-test directory in an x86 instance of PowerShell.
PS> .\setup.ps1
Write tests and place them in the nuget-test/test directory using the Pester syntax.
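For reference, a minimal test in that style might look like this (the package name and path are made up; adjust to whatever your package actually ships):
# Minimal Pester-style check that the package contains the expected assembly.
Describe "MyLibrary package" {
    It "ships the net45 assembly" {
        Test-Path ".\MyLibrary\lib\net45\MyLibrary.dll" | Should Be $true
    }
}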
Run the tests.
PS> Invoke-Pester
Project page: nuget-test
On github: https://github.com/nickfloyd/nuget-test
I hope this helps you get closer to what you're trying to get done.
If you're using NuGet packages to distribute your libraries, you should not limit yourself to testing only the libraries. You should test the packages themselves as well (if your binaries are OK but incorrectly installed, consumers still have issues). The whole point is to improve this experience.
One way could be to have an additional CI or QA repository. The one you currently have is actually your "production" repository containing consumable releases, considered finished high-quality products.
Going further, you could have a logical package promotion flow (based on Continuous Integration or even using a Continuous Delivery approach), where:
- each check-in produces a package on your CI repository
- testers pick up a CI package for QA and if found OK promote it to either a QA feed, or to the Production feed (whatever you prefer, depends on the quality of your testing and how well it is automated)
There are various ways of implementing this scenario, using simple network shares, internal NuGet.Server or Gallery implementations, or simply use http://myget.org to give it a try with minimal cost and zero effort.
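With plain network shares, the promotion itself can be as simple as moving a vetted package from one feed folder to the next; a sketch with placeholder paths and package name:
# Promote a package that passed QA from the CI share to the production share.
Move-Item \\server\nuget-ci\MyLib.1.2.3.nupkg \\server\nuget-prod\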
Hope that helps!
Cheers,
Xavier

Local Build Automation?

Working in a team environment, we have a Team Foundation Server that also contains a Team Build component. It is configured to automatically build all projects and solutions at specific times or on request.
We develop a product that is built from several solutions that depend on each other. When something has been changed in one solution, it has to be rebuilt locally, manually, in both debug and release mode so that the changes take effect in another solution that depends on it.
Also when a developer retrieves all sources the first time, he has to build all solutions manually in the correct order to get a working environment.
What is the best way to automate things like this? Create .cmd files that trigger the correct msbuild files? Using a program such as CruiseControl.NET?
What do you people do to maintain a clean local development environment?
What I did for our team was to provide a Visual Studio solution which contains all projects. Then I created a simple .cmd file which uses the command-line tools of Visual Studio to build this solution in its respective debug/release/profile configurations. This is a one-step build solution that can be used from every engineering machine.
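For anyone wanting the same thing in script form, a rough PowerShell equivalent of that one-step .cmd build (the solution name and MSBuild path are placeholders; point it at whichever toolset your machines have):
# Build the aggregate solution in every configuration the team uses; stop on the first failure.
$msbuild = "${env:ProgramFiles(x86)}\MSBuild\14.0\Bin\MSBuild.exe"
foreach ($config in "Debug", "Release", "Profile") {
    & $msbuild Everything.sln /t:Build "/p:Configuration=$config" /m
    if ($LASTEXITCODE -ne 0) { throw "Build failed for configuration $config" }
}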
The next level is the continuous integration system, which is set up to check for changes every 15 minutes and start a build if there are changes in the VCS. I'm using Hudson as our CI system. The CI system is used to build the native projects and the Java projects, as well as the Flex stuff. Since everything can be built from the command line, it's pretty easy to use with Hudson or CruiseControl.NET.