Using inherited process model for existing collection on Azure DevOps Server 2019

With Azure DevOps Server 2019 RC it is possible to enable the inherited process model on new collections (see the release notes). Is there any way to use the inherited process model for existing collections as well, where no customization of the process was made?

The inherited process model is currently only supported for new collections created with Azure DevOps Server 2019, not for existing collections.
See this Developer Community entry, which requests exactly that.

I added a set of comments describing how I hacked my way from an existing XML collection with a set of projects over to the inherited type:
https://developercommunity.visualstudio.com/content/idea/614232/bring-inherited-process-to-existing-projects-for-a.html
This works as long as a vanilla (uncustomized) process is applied to the existing XML collection before attempting the voodoo.

Not exactly an answer to your question, but we recently had the same task and I want to share how we handled it. We also wanted to move to the inherited model, and we did not want to do any hacking. So we decided to create a new collection on our Azure DevOps Server 2020 with the inherited model and also migrate our TFVC repository to Git.
Create the new collection. Documentation
Use git-tfs to create a local Git repository from our TFVC repository and push it to the new collection
Use azure-devops-migration-tools to copy all work items from the old collection to the new collection
In the old collection, add the ReflectedWorkItemId field for every work item type (look here)
In the new collection, add the ReflectedWorkItemId field for every work item type by using the process editor
Pro tip: create a full backup of the new collection so you can revert to this state easily. I went through multiple try-error-restore cycles.
You can't migrate shared steps or shared parameters this way, because you can't edit these work item types in the new collection. There is a workaround
We used the WorkItemTrackingProcessor to migrate all Epics/Features/Product Backlog Items/Bugs/Tasks/Test Cases, then ran the same processor with the mentioned workaround for Shared Steps and Shared Parameters.
This processor also migrates the iterations and area paths
Finally, we used the TestPlansAndSuitesMigration to migrate the test plans and suites
To speed up the migration, you can chunk the work items (for example by date or ID) and start the migration multiple times
Our build and release pipelines plus task groups were migrated manually by export and import
We migrated the variable groups by using the REST API (a sketch follows after this list)
The teams were created manually, and we also added the default area paths by hand
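For the variable groups, the REST round-trip we mean looks roughly like this; the server URLs, the api-version, and the example payload are assumptions, so check the REST reference for your server version:

# Read the variable groups from the old project (placeholder URLs, PAT auth)
curl -u :PAT "https://old-server/OldCollection/OldProject/_apis/distributedtask/variablegroups?api-version=5.1-preview.1"

# Recreate one group in the new project from the JSON captured above (hypothetical group/values)
curl -u :PAT -H "Content-Type: application/json" -d '{"type":"Vsts","name":"MyGroup","variables":{"Key1":{"value":"Value1"}}}' "https://new-server/NewCollection/NewProject/_apis/distributedtask/variablegroups?api-version=5.1-preview.1"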

Related

Azure DevOps migration tools can't find required field

I am trying to migrate an Azure DevOps project from one organization to another organization. I get the following message in the console output:
[08:22:35 INF] Found target project as myTestProject
[08:22:35 WRN] ValidatingRequiredField: Epic does not contain Custom.ReflectedWorkItemId
Does this mean that the custom process used in the source has to be used in the target project as well?
If so, is there a method to export processes in Azure DevOps?
The message actually only means that the Epic work item type is missing the ReflectedWorkItemId field (see documentation and documentation 2).
The field is used to store the state of the migration. Each affected work item type (source and target) must have this field. Depending on the process model type, you use either the old tooling (witadmin) or the new tooling (web access).
"Custom" actually only means a derived process template. With the inheritance model you cannot change the template directly; you derive from the Microsoft original instead.

Is it possible to bulk import existing infrastructure in Terraform?

I am creating infrastructure with Terraform and have decided to modularize it. But after modularizing, terraform plan shows: Plan: 28 to add, 0 to change, 28 to destroy.
If I change the existing structure to modules, will Terraform destroy everything? Is there any way to keep the existing infrastructure from being deleted?
Since you decided to split your infrastructure code into several modules, Terraform will treat your resources as new ones, because their addresses changed.
Moving resource blocks from one module into several child modules causes Terraform to see the new location as an entirely different resource.
Documentation: https://www.terraform.io/language/modules/syntax#transferring-resource-state-into-modules
There are several ways you can proceed now:
a. You can use the refactoring feature of Terraform (available from version 1.1): https://www.terraform.io/language/modules/develop/refactoring and utilize a moved block to map the old resource addresses to the new ones (see the sketch below).
b. You can start with a clean Terraform state and manually import the resources from your actual infrastructure into the state (https://www.terraform.io/cli/import); you need to do this for all 28 resources.
But if it's a new project, the easiest way would be to just recreate the resources from scratch (provided, of course, it's not a production environment containing important data).
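For option a, a minimal sketch of a moved block; the resource type and names here are made up, and you add one block per relocated resource anywhere in your root module:

# Tell Terraform the resource kept its identity, only its address changed
moved {
  from = aws_instance.app
  to   = module.app.aws_instance.app
}

terraform plan should then report these as moves instead of destroy/create pairs. For option b, the manual equivalent for a single resource would be along the lines of terraform import module.app.aws_instance.app i-0abc1234, where both the address and the ID are placeholders.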

How does importing a work item type definition for a TFS project impact other projects in the collection

I am using TFS 2015 on premises and I am trying to understand the scoping of process templates and the work item type definitions within them. I have been reading a number of the reference documents provided by Microsoft, yet I still find myself confused.
https://learn.microsoft.com/en-us/vsts/work/work-items/guidance/manage-process-templates?view=vsts
https://learn.microsoft.com/en-us/vsts/work/customize/reference/process-templates/customize-process?view=vsts
https://learn.microsoft.com/en-us/vsts/work/work-items/guidance/work-item-field?view=vsts#what-is-a-field-how-are-field-names-used
The above articles clearly suggest that the work item fields are at the project collection level (emphasis added by me):
Most process template components that you customize will affect only the team project that you create by using the process template. The exceptions to this rule are global lists, link types, and work item fields. These objects are defined for a team project collection.
Why then when I import a work item type definition, do I specify a project within a collection to import it to? The importwitd documentation here states I am importing my changes to a particular project:
https://learn.microsoft.com/en-us/vsts/work/customize/reference/witadmin/witadmin-import-export-manage-wits?view=tfs-2018&viewFallbackFrom=vsts
importwitd: Imports work item types from an XML definition file into a team project on a server that runs Team Foundation Server.
I must be failing to understand some of the intricacies here, but I can't seem to wrap my head around the impact radius of work item type definition changes.
Your team project contains 14 or more work item types (WITs), based on the process (Agile, Scrum, or CMMI) used to create the team project. A WIT is the object you use to track different types of work. When you modify a WIT, you need to know which WIT under which team project you want to modify, and export it first:
witadmin exportwitd /collection:CollectionURL /p:ProjectName /n:TypeName /f:"DirectoryPath/FileName.xml"
Work item fields, on the other hand, are defined for a team project collection. Changes that you make to work item field attributes apply to all team projects within the collection. If you have installed the TFS Power Tools, you can check the Work Item Field Explorer there.
Alternatively, you can use this command to list the fields:
witadmin listfields /collection:CollectionURL /n:RefName [/unused]
Work item type definition changes, however, are scoped to a single team project. If you have multiple team projects and you change a work item type definition, you have to import it into every team project where you want the change to be visible.
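The import counterpart uses the same placeholders as the export above, and the /p: argument is what scopes the change to a single project:

witadmin importwitd /collection:CollectionURL /p:ProjectName /f:"DirectoryPath/FileName.xml"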

How to revert to the TFS 2015 stock Agile process template

After (or before) we convert from TFS 2012.2 to TFS 2015.3 (which we have done just fine in a test run) we would like to revert our team project to the standard TFS 2015 Agile process template, and no longer use the customized agile process that we had modified from TFS 2012. We are quite willing to delete all of our work items and start over, but need to keep the team project history and change sets. Anyone know how to do this? Answers to prior questions on this did not address this situation. Thanks.
There is no easy way to do it. Basically, the steps require a lot of witadmin commands (sketched after this list). Start by deleting any work item types that were added and don't exist in the default template.
Then push the standard work item definition for each work item type.
Then push the categories.
Then push the process configuration.
Then delete any fields that are no longer used.
That should bring you back to the standard template.
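Sketched as commands, the sequence looks roughly like this; CollectionURL, the project, and the type/field names are placeholders, and the XML files come from an export of the default TFS 2015 Agile template:

witadmin destroywitd /collection:CollectionURL /p:ProjectName /n:MyCustomType
witadmin importwitd /collection:CollectionURL /p:ProjectName /f:Task.xml
witadmin importcategories /collection:CollectionURL /p:ProjectName /f:categories.xml
witadmin importprocessconfig /collection:CollectionURL /p:ProjectName /f:ProcessConfiguration.xml
witadmin listfields /collection:CollectionURL /unused
witadmin deletefield /collection:CollectionURL /n:Custom.MyField

Note that deletefield acts on the whole collection, so make sure no other team project still uses a field before deleting it.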
An alternative you could try is the WitMorph project. You can write a set of rules to migrate your data back into working order.

RavenDb Config and DocumentStore abstraction?

I am using RavenDB across multiple projects and solutions to access three different databases that are all part of the same product. For instance, I have multiple MVC projects that fetch user info and some data out of the 'web'-centric database and the 'backend' database, using '-' for the id override (but I need this only for a subset of classes in the 'web' db). Then there is the 'backend' database that is used by services (as well as the MVC projects). And finally, a third temp/scratch database is used by another set of services to build the backend db. All of these are being accessed from different class libraries and even console test, seed, and integration test apps.
Managing all of these is becoming quite a nuisance. Every time I create a new console app or class library that accesses the db, I have to set up config and Raven packages for that project, make sure indexes are built, and so on. Not to mention running updates for all NuGet packages, or in my case, installing a new unstable version of the server/client binaries.
Is there an easier way to manage this?
I tried to abstract the DocumentStore creation and initialization, as well as index creation, into its own project and reference that. But the other projects then had to manually add Newtonsoft.Json (and NLog) from the package directory.
I also get the following when I try to abstract the DocumentStore into a class with a static property:
StackTrace of un-disposed document store recorded. Please make sure to dispose any document store in the tests in order to avoid race conditions in tests.
Anyone have any thoughts on handling these issues?
Thanks
I don't think that the manual addition of the references is a big issue, but you can add the actual NuGet references as well.
Note that the "DocumentStore not disposed" error is something that only happens in the unstable (debug) builds, and won't happen in release builds.
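As for abstracting the store itself, a minimal sketch of one common approach: a lazily-initialized singleton in the shared project that also deploys the indexes defined there. The class name, the connection string name, and the namespaces are assumptions based on the pre-4.0 client API:

using System;
using Raven.Client;
using Raven.Client.Document;
using Raven.Client.Indexes;

public static class SharedDocumentStore
{
    // Lazy<T> gives thread-safe, on-first-use initialization.
    private static readonly Lazy<IDocumentStore> LazyStore =
        new Lazy<IDocumentStore>(CreateStore);

    public static IDocumentStore Store
    {
        get { return LazyStore.Value; }
    }

    private static IDocumentStore CreateStore()
    {
        var store = new DocumentStore
        {
            // Assumption: a "RavenDB" connection string in each app's config.
            ConnectionStringName = "RavenDB"
        };
        store.Initialize();

        // Deploy every index definition that lives in this shared assembly,
        // so consuming apps don't have to create indexes themselves.
        IndexCreation.CreateIndexes(typeof(SharedDocumentStore).Assembly, store);
        return store;
    }
}

Each app then uses SharedDocumentStore.Store and disposes it once on shutdown (for example in Application_End, or explicitly in test teardown), which also avoids the un-disposed warning in tests.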