Recommendations for migrating Areas and Iterations - azure-devops-migration-tools

What is the recommended configuration for migrating the source Areas\Iterations to the target Areas\Iterations? I have been able to successfully migrate work items, but I can't seem to figure out the configuration to migrate Areas and Iterations. This will be a like-for-like migration of the source Areas\Iterations to the target.

If you are migrating everything, then you only need to migrate work items; Areas & Iterations are migrated automatically.
Otherwise you can run the Area & Iteration migration on its own using the TfsAreaAndIterationProcessor: https://nkdagility.github.io/azure-devops-migration-tools/Reference/Processors/TfsAreaAndIterationProcessor.html
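For reference, the Processors section of the tool's configuration.json might look roughly like the sketch below. The property names (PrefixProjectToNodes, NodeBasePaths) and the overall shape vary between tool versions, so treat this as an illustration and confirm the exact options against the reference page linked above:

    {
      "Processors": [
        {
          "$type": "TfsAreaAndIterationProcessorOptions",
          "Enabled": true,
          "PrefixProjectToNodes": false,
          "NodeBasePaths": []
        }
      ]
    }

An empty NodeBasePaths is typically read as "migrate the whole Area/Iteration tree", while PrefixProjectToNodes controls whether the migrated nodes are nested under a node named after the source project.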

Related

ASP .NET Core builds project on every item add, how to disable

I'm having a hard time migrating one of our enterprise MVC projects to Core 2.1.
I want to move the project from Controllers + Views/Partials to the new Razor Pages + View Components structure. We have a ton of models, components and actions there.
When I convert projects I usually move things around, copy items to new paths, run automated refactorings, and create new classes or change existing ones to fit the new requirements and design, and that BREAKS the project. A build is the last thing I do, when everything is already set up, just to see if I missed something.
Now, after a few refactorings and breaking changes, I can't add new items (Razor Pages, View Components and so on) just because the project is not buildable.
"There was an error running the selected code generator: Failed to build project..."
Basically it forces me to do everything manually and check every copied/migrated piece of code just to add a new item!
I'm in a nightmare, please someone wake me up. How do I disable this thing? Or can you suggest a migration strategy for large projects?
First, and most importantly, adding an item via scaffolding will always kick off a project build. The scaffolding needs the project to be in a consistent state in order to function correctly. There is no way around this.
Aside from that, Visual Studio will only rebuild on changes if you're actually running the site. So if you've got it running in IIS Express, kill it to avoid that.
For what it's worth, it's better to correct errors as you go anyway. It's much easier to work through a few errors at a time than hundreds all at once, and you'll also be able to take advantage of Visual Studio's refactoring features, which only work when the project can build, usually making the total amount of work far less.
In MVC:
Add -> View adds a scaffold without a build.
In Core:
Add -> View adds a scaffold + a build.
Add -> New Item -> View adds without a build.
So if you don't want to run a build on every single add in Core, use
Add -> New Item -> ... (a minimal hand-written example follows below)
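For what it's worth, the item you create via Add -> New Item is just a plain file that the code generator never touches, so no build is required. A hand-written view, for example, is nothing more than a .cshtml file along these lines (the model type and property are made-up placeholders):

    @model MyApp.Models.Customer

    <h2>Customer details</h2>
    <p>@Model.Name</p>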

How to revert to the TFS 2015 stock Agile process template

After (or before) we convert from TFS 2012.2 to TFS 2015.3 (which we have done just fine in a test run) we would like to revert our team project to the standard TFS 2015 Agile process template, and no longer use the customized agile process that we had modified from TFS 2012. We are quite willing to delete all of our work items and start over, but need to keep the team project history and change sets. Anyone know how to do this? Answers to prior questions on this did not address this situation. Thanks.
There is no easy way to do it. Basically the steps require you to use a lot of witadmin commands. Start by deleting any work item types that were added and don't exist in the default template.
Then push the standard work item definition for each work item type.
Then push the categories.
Then push the process configuration.
Then delete any fields that are no longer used.
That should bring you back to the standard template (the corresponding witadmin commands are sketched below).
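As a rough illustration of those steps (the collection URL, project name, type names and file names are placeholders, and the XML files are the ones exported from the stock TFS 2015 Agile process template):

    rem Delete custom work item types that are not part of the stock Agile template
    witadmin destroywitd /collection:http://tfs:8080/tfs/DefaultCollection /p:MyProject /n:"My Custom Type"

    rem Push the stock definition for each remaining work item type
    witadmin importwitd /collection:http://tfs:8080/tfs/DefaultCollection /p:MyProject /f:Bug.xml
    witadmin importwitd /collection:http://tfs:8080/tfs/DefaultCollection /p:MyProject /f:UserStory.xml

    rem Push the stock categories and process configuration
    witadmin importcategories /collection:http://tfs:8080/tfs/DefaultCollection /p:MyProject /f:categories.xml
    witadmin importprocessconfig /collection:http://tfs:8080/tfs/DefaultCollection /p:MyProject /f:ProcessConfiguration.xml

    rem Finally, list unused fields and delete the leftover custom ones
    witadmin listfields /collection:http://tfs:8080/tfs/DefaultCollection /unused
    witadmin deletefield /collection:http://tfs:8080/tfs/DefaultCollection /n:My.Company.CustomField

Run these against a test restore of the collection first, since destroywitd and deletefield permanently remove data.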
An alternative you could try is to use the WitMorph project. You can write a set of rules to migrate your data back into working order.

Composite C1 - develop locally, sync to live site

I have a couple of Composite C1 CMS websites.
To edit them I currently use the web-based CMS on the live site.
However, I would like to update the code & content locally in Visual Studio and then sync to the web. The problem is that if my local copy is older than what's online (e.g. a non-techy client has edited something on the live site) and I Web Deploy, it will go over the top of the newer file on the server.
I need a solution that works out which change is newest. I can't find anything on Google or in the C1 docs.
How can I sync, preferably using Web Deploy? Do I need some kind of version control?
Is there a best practice for this? Editing the live site through the web interface seems a bit dicey and is slow.
The general answer to this type of scenario seems to be to use the Package Creator. With that you can develop locally, add the files you've changed to a package, and install that package on the live site. This solution does not cover all the parts of your question, though, and has certain limitations:
You cannot selectively add content to a package. It's all pages or no pages.
Adding datatypes is easy, but updating them later requires you to delete the datatype (and its data) and recreate it.
In my experience packages work well for incremental site updates if you limit the package's content to front-end stuff, like CSS, images and such.
You say you need a solution that works out the newest changes - I believe the only solution to this is yourself, with the aid of some tooling. I don't think there's a silver bullet solution here.
Should you use a version control system? Yes! By all means. Even if you are not sharing your code with anyone, a VCS is a great way to get to know Composite C1 from a file system perspective, as you can carefully track which files change on disk as you develop. This knowledge is crucial when you want to continuously add features to a website that is already alive and kicking - you need to know what to deploy, and what not to touch.
Make sure you read the docs on how Composite fits in VCS: http://docs.composite.net/Configuration/C1-and-Version-Control
I assume that your sites are using the XML data storage (if you were using the SQL Data Store, your content would not be overwritten upon sync).
This means that your entire web application lives in one folder on disk on the web server, which can be an advantage here.
I'll try to outline a solution that could work for you, although I must stress that I've never tried this - I'm making it up as I type.
Let's say you're using git: download the site in its entirety from the production web server, and commit the whole damned thing* to your master branch.
Then you create a new feature branch from that commit and start making the changes you want to deploy later, carefully committing your work as you go and making sure you only commit to the feature branch the changes that are needed for your feature to work.
Now, when you are ready to deploy, you switch back to the master branch, and again download the entire site and commit it to master.
You then merge your feature branch into the master branch and have git do all the hard work of stitching your changes in with the changes from the live site. There are bound to be merge conflicts, and that is where you will have to jump in and decide for yourself what content needs to go live.
After this is done and tested, you can Web Deploy the site up to the production environment.
Changes to the live site might have occurred while you were merging, so consider closing the site, or parts of it, during this process.
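For what it's worth, a command-level sketch of that flow might look like this (branch names are just examples, and copying the freshly downloaded live site into the working tree is left as a manual step):

    # One-time: put the downloaded production site under version control
    git init
    git add .
    git commit -m "Snapshot of live site"

    # Do your local work on a feature branch
    git checkout -b my-feature
    # ...edit, test, and commit in small, focused pieces...
    git commit -am "Add new feature"

    # When ready to deploy: refresh master with the current live content
    git checkout master
    # (copy the freshly downloaded live site over the working tree here)
    git add -A
    git commit -m "Snapshot of live site before deploy"

    # Merge your work in and resolve any conflicts by hand
    git merge my-feature

Whatever the merged working tree looks like after conflicts are resolved and tested is what you then Web Deploy to production.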
If you are using the SQL Data Store, I suggest paying for a tool like Red Gate's SQL Compare and SQL Data Compare, or SQL Delta, to compare your dev database to the production database and hand-pick SQL scripts that can be applied to the production database along with your feature deployment.
* Do consider using a .gitignore file to avoid committing certain files - refer to the docs for more info.
I suppose you should use the Package Creator.
Also have a look here: http://docs.composite.net/Configuration/C1-and-Version-Control

TFS2010 database size

We've been using TFS since around 2009 when we installed TFS2008. We upgraded to TFS2010 at some point and we've been using it for source control, work item management, builds etc.
Our TFSVersionControl.mdf file is 287,120,000 KB (273 GB). We ran some queries and found that our tbl_BuildInformationField table is massive. It has 1,358,430,452 rows, which take up 150,988,624 KB (143 GB). We have multiple active products across multiple active builds, with more than one solution per build, and the solutions aren't free of warning messages.
My questions:
Is it possible to stop MSBuild from spamming the tbl_BuildInformationField table so much, i.e. only write errors and general build information, and not all the warnings for every project?
Is there a way to purge or clean up old data from this table?
Is 273 GB for 4 years of TFS use an average size?
Is 143 GB for tbl_BuildInformationField a "normal" size?
The table holds the values and output of the build process. Take note that the build retention policy doesn't actually delete the build objects; like everything else in TFS, the objects are marked as deleted and only the public visibility and drop location are cleared.
If you have retained the same build definitions for a very long time (when a build definition is deleted, the related objects get removed as well), I would suggest querying for the build information, including deleted builds, using the TFS API; the same API will also allow you to remove them for good. Deleting the build definitions themselves probably will not work and will fail with a timeout error.
You can consult the following:
http://blogs.msdn.com/b/adamroot/archive/2009/06/12/working-with-deleted-build-data-in-team-foundation-server-2010-beta-1.aspx
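In that spirit, a minimal sketch using the TFS 2010 client object model might look like the following (the collection URL, project name and age cut-off are placeholders; run it against a restored test copy first, since Destroy permanently removes the build records):

    using System;
    using Microsoft.TeamFoundation.Build.Client;
    using Microsoft.TeamFoundation.Client;

    class PurgeOldBuildData
    {
        static void Main()
        {
            // Placeholder collection URL and team project name
            var collection = TfsTeamProjectCollectionFactory.GetTeamProjectCollection(
                new Uri("http://tfs:8080/tfs/DefaultCollection"));
            var buildServer = collection.GetService<IBuildServer>();

            // Query builds for the project, including those already marked as deleted
            // by the retention policy, limited to builds that finished over two years ago.
            var spec = buildServer.CreateBuildDetailSpec("MyProject");
            spec.QueryDeletedOption = QueryDeletedOption.IncludeDeleted;
            spec.MaxFinishTime = DateTime.Now.AddYears(-2);
            var builds = buildServer.QueryBuilds(spec).Builds;

            // Delete the build output and details, then destroy the build records themselves
            // so the rows behind them (including build information) are removed for good.
            buildServer.DeleteBuilds(builds, DeleteOptions.All);
            buildServer.Destroy(builds);

            Console.WriteLine("Removed {0} builds.", builds.Length);
        }
    }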

TFS Builds, Project Files: Orphaned references to files not being pushed are causing endless build errors

We are using TFS 2010 (Visual Studio) for our deployments and have client code projects (.csproj files) and database projects (.dbproj files). We understand that when our developers add files to our application there is a corresponding reference to these files in the project file. If I push a changeset from Dev to QA that includes the project file, and the project file contains a reference to a file that was added but is not in the changeset, I will receive a build error.
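For illustration, the kind of project-file entry in question is just an MSBuild item like the one below (the file path is a made-up example); if the referenced file is not included in the pushed changeset, the project no longer compiles on the build server:

    <ItemGroup>
      <!-- Added automatically when a developer adds a new file to the project -->
      <Compile Include="Models\NewCustomerModel.cs" />
    </ItemGroup>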
Once we started pushing just changesets (as opposed to performing full builds) this quickly became our number one bottleneck in doing TFS builds. I would deploy the database project and there would be 20 errors. The only way I could correct them was to navigate down the entire solution explorer tree and exclude each orphaned reference individually. This has proved far too time consuming and on the advice of our lead programmer we have returned to doing full builds of QA and UAT.
We are in the early stages of this product, and therefore we will be adding many files for some time. We need a better solution for this problem. Neither the manual exclusions nor asking developers not to check in code until it is ready for QA will suffice for us. Has anybody out there had any experience with this problem, and if so, how did you deal with it? Thanks!
Jon
Pushing changesets to QA selectively is known as cherry picking, and it causes the sorts of issues that you are experiencing. This is not the recommended practice; instead, set up the QA build so that a successful build is part of the check-in (a gated check-in). That way, if part of a fix is missed (as it may be when a fix spans multiple changesets), the build will fail and the check-in cannot be completed.
Second, have the developers do the second check-in to QA, or merge the dev changesets to QA, and have the team lead coordinate changes to the project files by watching for changes, either by turning on "notify changes made by others" or by setting a policy for the dev team. Full builds should always be done, as partial builds do not always pick up the complete dependency graph.