VSTS Service Fabric proper way to build - MSBuild

I am using VSTS build templates and having trouble placing the necessary publish profile files into my Service Fabric build. So I have disabled the top step and added two more steps: one build and one copy. Is this the way to go? What is the difference between the two templates, and where can we see that?

Yes, you can copy the necessary files (e.g. publish profile files) to another location by using the Copy Files or Windows Machine File Copy task.
Regarding the Build Service Fabric App task (the top task): based on the icon, I think it is a task group. If so, you can check its detailed tasks by selecting the Build & Release tab > Task Groups > select a task group > Tasks.
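If you would rather keep this inside the project than add a separate build step, a custom target in the .sfproj can copy the profiles into the package output. This is only a sketch: the Package target is the standard Service Fabric packaging target, but the folder and property names below are assumptions about your layout rather than anything from the question.

<!-- Sketch only. Assumes the publish profiles live in a PublishProfiles\ folder
     and that the package output goes to pkg\$(Configuration) unless
     PackageLocation is passed on the command line. -->
<Target Name="CopyPublishProfilesToPackage" AfterTargets="Package">
  <PropertyGroup>
    <PackageOutDir Condition="'$(PackageLocation)' != ''">$(PackageLocation)</PackageOutDir>
    <PackageOutDir Condition="'$(PackageOutDir)' == ''">pkg\$(Configuration)</PackageOutDir>
  </PropertyGroup>
  <ItemGroup>
    <PublishProfileFiles Include="PublishProfiles\*.xml" />
  </ItemGroup>
  <Copy SourceFiles="@(PublishProfileFiles)"
        DestinationFolder="$(PackageOutDir)\PublishProfiles" />
</Target>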

Related

How to remove Post Job Cleanup step in Team Services?

We've recently downloaded and are hosting an on-premises Visual Studio Team Services build agent for our source code and have noticed that it's doing an extra step in the build process compared to our hosted agent. This extra step is the 'Post Job Cleanup'.
When setting up this agent locally there were no options for setting this, and looking at our build steps this extra job isn't listed there.
I've checked online guides but there's been no hint as to where this extra step is coming from. Does anyone know where the option to include/exclude this for builds is?
Setting process.clean to false in the variables of the release pipeline stops the "finalize job" step from killing all processes.
Setting variables in the release pipeline
https://developercommunity.visualstudio.com/solutions/498153/view.html
Gradle Daemon being killed in "Finalize Job" step
They are built-in steps (Get Sources, Post Job Cleanup) and there is no way to remove them in VSTS. You don't need to worry about it; it won't affect your project or build.

How to avoid a build and deployment of dependencies which have no code changes

I'm doing a proof of concept on continuous integration and whether our development team will benefit from automated builds and automated deployments to reduce human error.
I've already come quite far in the process but have some questions on how to configure our incremental builds to avoid rebuilding of dependencies that had no code changes.
In addition I’d like our deployment tool to identify and deploy only assemblies rebuilt as a result of a code change.
We already use Microsoft products like TFS for source control, Visual Studio for development and Team Foundation Build for continuous integration builds. We're currently leaning toward InRelease for deployment as it seems to integrate well with Team Foundation Build.
But first, here is our current setup...
There are 200+ C# solution files, each containing one or more projects. It is not practical in this environment to combine these projects into fewer solutions; that is by design. Projects within a solution use project references to resolve dependencies, and file references to projects in other solutions. As far as I know, this is the approach Microsoft recommends when dealing with a large number of projects.
We use a "branch by feature" strategy, i.e. isolated development on concurrent feature branches that are merged up to a stable Main branch when complete. When it's time for a release, a release branch is created from Main and isolated for hotfixes and deployment. The feature branches and the Main branch have a CI build triggered by code check-ins. Releases will most likely be executed manually from InRelease against a selected release branch. A release will be deployed through various environments including INTEGRATION/TEST, UAT and ultimately to all our clients. We're still fleshing out the details of the branching strategy, but that's a question for another time.
The current problems to solve:
1. Avoid rebuilding of dependencies that have no code changes...
When we deploy new functionality or a patch to a client, we want to push the absolute minimum in files. Our company has a very large customer base (thousands of customers) with sometimes slow internet connections, so doing a full deployment of all assemblies (200+) to every customer is not an option. I've partially solved the problem by setting up incremental builds which correctly rebuild only changed projects as expected, but also rebuild all the dependent projects even though NO CODE CHANGES were made to them. This results in both the changed assemblies and their dependents getting new timestamps. If we use the change of timestamp to identify which assemblies to deploy, then this would result in deployment of functionally unchanged assemblies. The goal here is to deploy only assemblies where the code has changed and assemblies where breaking changes occur.
For example:
Solution B, has a project called Project B
Solution A, has a project called Project A
Project B -> Project A (where Project B has a file dependency on Project A's output)
When a non-breaking change is made in Project A, say to the interior of a method, the expected result is: only A is built and is therefore a candidate for deployment.
When a breaking change is made in Project A that breaks Project B, the expected result is: both A and B are built and are therefore candidates for deployment.
Currently MSBuild rebuilds all dependents regardless, which is not what we want (the sketch below illustrates why this happens).
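The underlying mechanism is MSBuild's up-to-date check: a target runs whenever any of its inputs has a newer timestamp than its outputs. Because Project B lists Project A's output assembly among its inputs, a rebuilt ProjectA.dll with a new timestamp makes B look out of date even when its logic is unchanged. A minimal illustration, with hypothetical file paths:

<!-- Illustration only: why a dependent project rebuilds when a referenced
     assembly gets a newer timestamp, even with no code change in B. -->
<Project DefaultTargets="BuildProjectB"
         xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <ItemGroup>
    <ProjectBInputs Include="ProjectB\**\*.cs" />
    <!-- The file reference to Project A's output is also an input. -->
    <ProjectBInputs Include="Drops\ProjectA.dll" />
  </ItemGroup>
  <Target Name="BuildProjectB" Inputs="@(ProjectBInputs)" Outputs="Drops\ProjectB.dll">
    <Message Text="Some input (possibly just ProjectA.dll's timestamp) is newer than ProjectB.dll, so B rebuilds." />
  </Target>
</Project>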
2. Automatically identify which assemblies should be deployed...
I have a partial solution to the problem.
When a build is performed, my build process template is configured to run an MSBuild script containing a list of solutions to build in a particular order.
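For reference, that driver script is essentially an ordered item list of solutions handed to the MSBuild task. A simplified sketch (the solution names are placeholders, not our real ones):

<!-- BuildAll.proj - simplified sketch of the driver script. BuildInParallel
     is off so the declared order is respected. -->
<Project DefaultTargets="BuildSolutions"
         xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <PropertyGroup>
    <Configuration Condition="'$(Configuration)' == ''">Release</Configuration>
  </PropertyGroup>
  <ItemGroup>
    <!-- Order matters: upstream solutions first, so their file-referenced
         outputs exist before downstream solutions build. -->
    <Solutions Include="SolutionA\SolutionA.sln" />
    <Solutions Include="SolutionB\SolutionB.sln" />
  </ItemGroup>
  <Target Name="BuildSolutions">
    <MSBuild Projects="@(Solutions)" Targets="Build"
             Properties="Configuration=$(Configuration)"
             BuildInParallel="false" />
  </Target>
</Project>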
The build runs in the build agent's workspace. Every time a new build is performed, the build process template creates a uniquely named drop folder and copies the binaries from the build agent workspace to the drop folder. This is out-of-the-box functionality taken care of by the standard build process template. The build has been configured not to clear the build agent workspace, so the first time it runs it will build all projects within a solution, but subsequent builds will only build projects that have code changes or depend on changed projects (an incremental build?). Therefore unchanged assemblies keep their original timestamps and changed assemblies get new timestamps.
We have a tool that can do folder comparisons between drop folders and output the results to a txt file. This allows us to identify which binaries have been added/changed/removed since the last deployment. It also gives us the added benefit of comparing the list of actual artefacts to a manifest of expected artefacts as defined by the developer. This will ensure that no assemblies get deployed that have not been specified and proven to be unit tested.
The question is: how can we leverage InRelease to deploy only the required files, as per the example above, and not all files in the drop folder?
Install a TFS Proxy in front of your build machine; this reduces the network traffic.
You could start with a branching strategy like Service Pack; you can read documentation about it in the ALM Rangers guidance... and adapt your process template to build just the part of the code that changed. I think you will find more information in BRD Lite, another piece of ALM Rangers guidance.

Customizing the TFS 2008 build sequence to avoid compilation and deploy SSRS

I'm trying to create a CI process for SQL Server Reporting Services.
I am fairly new to TFS but quite experienced with MSBuild. In the past I've used a combination of MSBuild with Team City so the whole build process is more or less custom.
Here lies the start of my problems: as the solution I am deploying only contains Report Server projects (rds), no compilation is required. I thought that I would override the first default task that TFS runs (EndToEndIteration) to replace the default TFS build sequence and inject my own.
The first snag that I have come across is that the build always fails, how can I set the status of the build to success? Currently the EndToEndIteration task is very light and only has a message.
Is this the best method to create a custom build process in TFS where compilation is not required? Or should I use the default sequence and override one of the hook tasks mentioned in
http://msdn.microsoft.com/en-us/library/aa337604%28VS.80%29.aspx
(i.e. AfterCompile)?
The core steps that I'd like to achieve are listed below (a rough sketch of wiring them together follows the list):
Bundle the RDL and data source files
Connect to the host server to register/deploy the reports
Re-apply any subscriptions that previously existed
Run tests to verify the deployment succeeded and is returning results as expected
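For concreteness, the kind of override I have in mind looks roughly like the sketch below. Defining a target named EndToEndIteration after the Team Build import replaces the default sequence; the four step targets are hypothetical placeholders for the list above, not working code.

<!-- Sketch of a TFSBuild.proj customisation for TFS 2008. The import is the
     standard Team Build targets file; the DependsOnTargets names are
     placeholders for the steps listed above. -->
<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <Import Project="$(MSBuildExtensionsPath)\Microsoft\VisualStudio\TeamBuild\Microsoft.TeamFoundation.Build.targets" />

  <Target Name="EndToEndIteration"
          DependsOnTargets="BundleReports;DeployReports;ReapplySubscriptions;VerifyReports" />

  <Target Name="BundleReports">
    <Message Text="Collect the RDL and data source files here." />
  </Target>
  <Target Name="DeployReports">
    <Message Text="Publish the reports to the Report Server here (e.g. via rs.exe or the web service)." />
  </Target>
  <Target Name="ReapplySubscriptions">
    <Message Text="Re-create the previously existing subscriptions here." />
  </Target>
  <Target Name="VerifyReports">
    <Message Text="Run verification tests against the deployed reports here." />
  </Target>
</Project>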
I have found another article on Reporting Services deployment:
Reporting Services Deployment
But it doesn't mention the best practice for customizing the standard build process.
Any help would be appreciated.
For anyone interested, I've just stumbled upon an answer to the first question I asked:
The first snag that I have come across is that the build always fails, how can I set the status of the build to success?
You can find a solution to this at
Link
The options available for this property are:
Unknown
Failed
Succeeded
Don't forget to also set the TestStatus else the build will only partially succeed
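If memory serves, the property being discussed is CompilationStatus, and both it and TestStatus can be set with the SetBuildProperties task that ships with Team Build. Treat this fragment as a hedged sketch to drop inside the custom EndToEndIteration target, not verified code:

<!-- Sketch: $(TeamFoundationServerUrl) and $(BuildUri) are supplied by Team Build
     when TFSBuild.proj runs; setting both statuses marks the build as succeeded. -->
<SetBuildProperties TeamFoundationServerUrl="$(TeamFoundationServerUrl)"
                    BuildUri="$(BuildUri)"
                    CompilationStatus="Succeeded"
                    TestStatus="Succeeded" />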
Still looking for the best practice for creating a custom build sequence.

Getting the Build Uri or build number of the last build from MSBuild

I am trying to create a custom task for MSBuild so that it will send an email to the users saying that a new version is up on the test server.
I got the email part done; what I would like to do is add the work items that are included in this build.
I tried the MSBuild extension (used to send the email), but the feature to get that info is not supported on TFS 2005, which is what I am using and cannot upgrade.
I was trying to use BuildStore.GetWorkItemsForBuild, but I need a BuildUri, which I cannot find a way to get.
The setup is like this: the steps to compile, build and deploy are called from a batch file as different options. At the end of the deploy option, the email is sent.
Can anyone help me with this?
It's feasible, but it seems you're trying to stretch MSBuild functionality into tasks that relate to build management.
Consider a solution like TeamCity, which wraps all the build processes for you and manages notifications (e.g. when a build has succeeded or failed). It has out-of-the-box support for MSBuild.

Local Build Automation?

Working in a team environment, we have a Team Foundation Server that also contains a Team Build component. It is configured to automatically build all projects and solutions at specific times or on request.
We develop a product that is built from several solutions that depend on each other. When things have been changed in one solution, it has to be rebuilt locally, manually, in both debug and release mode so that the changes take effect in another solution that depends on it.
Also when a developer retrieves all sources the first time, he has to build all solutions manually in the correct order to get a working environment.
What is the best way to automate things like this? Create .cmd files that trigger the correct msbuild files? Using a program such as CruiseControl.NET?
What do you people do to maintain a clean local development environment?
What I did for our team was to provide a Visual Studio solution which contains all projects. Then I created a simple .cmd file which uses the command-line tools of Visual Studio to build this solution in its respective debug/release/profile configurations. This is a one-step build solution that can be used from every engineering machine.
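An MSBuild-based equivalent of that .cmd file could look roughly like the sketch below; the solution name is a placeholder, and the configurations should be adjusted to whatever the real solution defines:

<!-- BuildEverything.proj - sketch of a one-step local build producing both
     Debug and Release output. Run with: msbuild BuildEverything.proj -->
<Project DefaultTargets="BuildAllConfigurations"
         xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <ItemGroup>
    <!-- Placeholder for the all-projects solution mentioned above. -->
    <AllProjects Include="Everything.sln" />
  </ItemGroup>
  <Target Name="BuildAllConfigurations">
    <MSBuild Projects="@(AllProjects)" Targets="Build" Properties="Configuration=Debug" />
    <MSBuild Projects="@(AllProjects)" Targets="Build" Properties="Configuration=Release" />
  </Target>
</Project>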
The next level is the continuous integration system, which is set up to check for changes every 15 minutes and start a build if there are changes in the VCS. I'm using Hudson as our CI system. The CI system is used to build the native projects, the Java projects and the Flex stuff. Since everything can be built from the command line, it's pretty easy to use it with Hudson or CruiseControl.NET.