Work Item discussion is not being added when rerunning tool - azure-devops-migration-tools

After migrating work items from one Azure DevOps environment to another, rerunning the tool updates the related links, but the discussions are not updated. How can we move new discussions over from the old environment?

Discussions are not migrated because they are only exposed through the REST API, and the Azure DevOps Migration Tools use the ObjectModel...
https://nkdagility.github.io/azure-devops-migration-tools/
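If you need to pull discussions across yourself, the REST API mentioned above does expose them via the work item Comments endpoint. A minimal read-only sketch, assuming placeholder organization/project names and a personal access token (the preview API version shown is the one current at time of writing and may change):

```python
import base64
import json
import urllib.request

def comments_url(organization, project, work_item_id, api_version="7.0-preview.3"):
    """Build the REST URL for a work item's discussion comments."""
    return (f"https://dev.azure.com/{organization}/{project}"
            f"/_apis/wit/workItems/{work_item_id}/comments"
            f"?api-version={api_version}")

def get_comments(organization, project, work_item_id, pat):
    """Fetch discussion comments for one work item using a personal access token."""
    token = base64.b64encode(f":{pat}".encode()).decode()
    request = urllib.request.Request(
        comments_url(organization, project, work_item_id),
        headers={"Authorization": f"Basic {token}"},
    )
    with urllib.request.urlopen(request) as response:
        return json.load(response)["comments"]
```

You would run this against the source environment per migrated work item and post the results to the target with the corresponding add-comment call.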

Related

Publishing to Azure Data Factory from GitHub

Can someone remind me why, when I publish from GitHub, my Synapse Live workspace isn't updated with the pipelines created in GitHub?
A pipeline will appear in Synapse Live mode only when it was created in the Azure Synapse Analytics workspace's Git mode and then published. See the best-practices section of the official Microsoft documentation, which indicates the same.
But when the pipeline already exists in your Git repository, then after integrating Git, the pipeline will only be visible in the Git branch, not in Synapse Live mode.
Let's say I have the following pipeline in my repo.
When I integrate this repo with my Synapse workspace, the pipeline only appears in the Git branch.
You can see that it is not visible in Synapse Live mode.
If you want your Synapse Live mode pipelines brought into Git, you can check this option while configuring Git and then publish to get them into your repository.
This kind of thing can happen when objects in your Git repository get out of sync with the publish branch (changes to linked services, for example, can be applied immediately and create this kind of bad situation).
Meanwhile, there is at the moment a strange situation in Synapse that could also explain "weird behaviours". There seems to be a general bug causing two things:
problems when publishing to Synapse Live;
problems with the Notebook activity in pipelines: the reference to the notebook is lost and we are unable to select a notebook in the pipeline activity.
In my case I am waiting for feedback from Microsoft on this.

what is gitpod: what does it actually do?

The gitpod GitHub page says
Gitpod is an open-source Kubernetes application providing prebuilt,
collaborative development environments in your browser - powered by VS
Code.
However, I cannot comprehend what it actually does. Can anyone please explain?
Gitpod co-founder here.
Gitpod = server-side-dev-envs + dev-env-as-code + prebuilds + IDE + collaboration.
From a Git repository on GitHub, GitLab or Bitbucket, Gitpod can spin up a server-side dev environment for you in seconds. That's a Docker container that you can fully customize and that includes your source code, Git, a terminal, VS Code extensions, your IDE (Theia IDE), etc. The dev environment is powerful enough to run your app and even side services like databases.
Step (1) is easily repeatable and reproducible because it's automated and version-controlled and shared across the team. We call this dev-environment-as-code. Think of infrastructure-as-code for your dev environment.
After (1), you're immediately ready to code, because your workspace is already compiled and all of your code's dependencies have been downloaded. Gitpod does that by running your build tools on git push (like CI/CD would) to create "prebuilds", and stores your workspace until you need it. This really shines when reviewing PRs in Gitpod.
Collaboration becomes much easier once your dev environments live server-side and your IDE runs in the browser. Sending a snapshot of your dev environment to a colleague is as easy as sending a URL. The same goes for live shared coding in the same IDE and dev-environments.
At the end of the day, you start treating your dev environments as something ephemeral: you start them, you code, you push your code, and you forget the dev environment. For your next task, you'll use a fresh dev environment.
The ease of mind that you get from not messing, massaging, and maintaining dev environments on your local machine is incredibly liberating.
Gitpod can be used on gitpod.io, or self-hosted on Kubernetes, GCP, or AWS.
To illustrate Gitpods, note that GitLab 13.5 (October 2020) adds a new feature
Launch Gitpod Workspaces directly from GitLab
Engineers have complicated development environments that can take time to set up and make testing changes or exploring new projects challenging. Often getting started with a project involves following documentation, installing dependencies, and hoping there are no conflicts with other services running. This process can be time consuming, error prone, and may not replicate the configuration accurately to test and contribute to a project.
With Gitpod integrated into GitLab, you can easily launch your Gitpod Workspace directly from the GitLab interface. When editing a project on GitLab, a new dropdown option exists to open that project in GitPod:
Gitpod allows you to define your project’s configuration in code so you can launch a prebuilt development environment with one click.
These environments are configured through a .gitpod.yml file inside of the project and include options for Docker configuration, start tasks, editor extensions and more. This flexible configuration, which is part of the project’s code, allows developers to get started working on a project quickly.
Try this today with the GitLab project which is already setup to work with Gitpod.
Thanks to Cornelius Ludmann from Gitpod for contributing this!
https://about.gitlab.com/images/13_5/phikai-launch-gitpod-editor.gif -- Launch Gitpod from the GitLab UI
See Documentation and Issue.
And with GitLab 14.2 (August 2021)
Launch a preconfigured Gitpod workspace from a merge request
The Gitpod integration, introduced in GitLab 13.5, helps you manage your complicated development environments.
Once you define your project’s configuration in code, you can launch a prebuilt, cloud-based development environment with a single click.
This convenient workflow has made it faster than ever to generate new changes, but launching a Gitpod environment to review an existing merge request meant building an environment against the main branch before switching to the target branch and building again.
Now, in GitLab 14.2, you can launch Gitpod directly from the merge request page, preconfigured to use the target branch, to speed up your reviews and reduce the need for context switching.
Enable the Gitpod integration, and your merge requests display a grouped Open in button, so you can open the merge request in either the Web IDE or Gitpod.
Thanks to Cornelius Ludmann from Gitpod for this contribution!
https://about.gitlab.com/images/14_2/create-gitpod-in-mr-view.png -- Launch a preconfigured Gitpod workspace from a merge request
See Documentation and Issue.
Gitpod is essentially an ephemeral/ad hoc environment that instantiates a Docker container via a .gitpod.yml file (optionally referencing a .gitpod.Dockerfile). At the core there is the VS Code integration, and the SSH Remote extension is the key piece that ties together a lot of the "what Gitpod does" question. The UI is another key piece: workspaces can be cached via prebuilds (which are available almost instantly) or built manually as one-offs (which take much longer to run, because it's a full build), and they can be re-instantiated via the UI, which auto-purges stale workspaces after 14 days.
The workspace is the environment, based on the gitpod/workspace-full Docker image, which contains the following at the time of this post:
gitpod/workspace-c ✅
gitpod/workspace-clojure ✅
gitpod/workspace-go ✅
gitpod/workspace-java-11 ✅
gitpod/workspace-java-17 ✅
gitpod/workspace-node ✅
gitpod/workspace-node-lts ✅
gitpod/workspace-python ✅
gitpod/workspace-ruby-2 ✅
gitpod/workspace-ruby-3 ✅
gitpod/workspace-ruby-3.0 ✅
gitpod/workspace-ruby-3.1 ✅
gitpod/workspace-rust ✅
gitpod/workspace-elixir ✅
So all in all, as long as the open-source community is active, you're getting a pretty fresh, well-provisioned, "full" environment, and it's available on demand via a web UI that can take a query string in the form gitpod.io/#{your GitHub URL}.
For free, a workspace runs for 1 hour, with a total of 50 hours per month available. Increased time and team configuration are available; for example, a two-pizza team on a team plan is around $200-$300 per month, which, if you put pen and paper to it, has a decent ROI considering the time savings and improved DevX.
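To make the dev-environment-as-code idea above concrete, here is what a minimal .gitpod.yml could look like; the image, tasks, extension, and port values are illustrative placeholders, not taken from any particular project:

```yaml
# Base workspace image (gitpod/workspace-full is the default if omitted)
image: gitpod/workspace-full

# init runs during prebuilds; command runs on every workspace start
tasks:
  - init: npm install
    command: npm run dev

# VS Code extensions to preinstall in the browser IDE
vscode:
  extensions:
    - dbaeumer.vscode-eslint

# Ports to expose; open-preview shows the app alongside the editor
ports:
  - port: 3000
    onOpen: open-preview
```

Because this file is version-controlled with the source, every workspace spun up from the repo gets the same tools and running services.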

How to migrate team project with history within same VSTS account?

We need to migrate our VSTS team project. I already saw that this is an eagerly awaited feature on the Visual Studio User Voice.
However, in our case the new team project is to be in the same VSTS account. Is there a way to do this while also keeping version control change history? Keeping the change history available as part of the old team project is unfortunately not an option as we will lose access to the old team project soon after migration.
If somebody has done this before with the help of any of the below tools, then it would be great if they can share their experience:
VSTS copy project
VSTS sync migrator
OpsHub
It's a bit unclear what you're about to migrate, from where, and why you'd lose access to the existing project. You also have different options based on the current source control type.
One option which you could try is to create 2 new accounts and leave the whole old account in read-only state. That should leave the history available to everyone. You can then spin up as many new accounts as you want, using just the latest version of the sources.
Git
If it's a Git repository it's as simple as making a local clone of the whole repo, creating a new team project in VSTS and pushing the clone into its second home.
TFVC
If it's TFVC, it's much harder. I've used OpsHub in the past which works reasonably well, but in our case completely got stuck in a couple of strange merge situations. Those were probably created as part of work done back when that team project was hosted in TFS 2008, so you may be luckier than we were.
You could decide to move to Git as part of your migration. Use git-tfs to create a local git repository with all your TFVC history and then push that into a bare Git repository in your new team project. Or use the TFVC import tool. There's quite a bit of documentation on this subject.
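The git-tfs route described above looks roughly like this; the collection URL, TFVC path, and target repository are placeholders:

```shell
# Convert the TFVC project, with full changeset history, into a local Git repo
git tfs clone https://account.visualstudio.com/DefaultCollection "$/OldProject" old-project-git
cd old-project-git

# Push the converted history into the new team project's Git repository
git remote add origin https://account.visualstudio.com/NewProject/_git/NewRepo
git push --all origin
```

Expect the clone step to take a long time on large histories, since it replays every changeset.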
The VSTS Sync Migrator supports a snapshot without history as far as I can tell, which would not suit you.
VSTS Copy Project doesn't support TFVC, so it is not an option in this case.
An option that's missing from your list is Timely Migration, it supports TFVC to TFVC migrations among other options. I've used them a long time ago to copy data between TFS servers. Back then they were working exactly as advertised.

what is the API for VSTS

I have a scenario in which I need to migrate artifacts from HP ALM to VSTS for test management. I know the ALM API and have worked with it before for exporting defects and test cases, but I am not aware of anything similar for VSTS (I am very new to it). I want to create a task and then create a bug with the help of adapter code, moving the details from ALM to VSTS.
Can anyone please help me find the VSTS API, or some sample adapters I can use as a starting point for coding against it?
N.B.: I tried finding it on Google but no luck so far.
Thanks in advance.
You can try to use OpsHub tool.
Another tool: Microsoft Visual Studio Online Integration
Regarding the API, you can try the VSTS/TFS SDK: Microsoft Team Foundation Server Extended Client. You can also check the source code of the VSTS Sync Migrator mentioned above.
Another option is the VSTS Sync Migrator created by mrhinsh:
https://github.com/nkdAgility/vsts-sync-migration
There is the code to get you started with your adapter.
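Besides the SDK, VSTS also has a REST API for work item tracking, which is often the simplest target for an adapter. A minimal sketch of creating a Bug, where the organization, project, personal access token, and field values are all placeholders:

```python
import base64
import json
import urllib.request

def bug_patch_document(title, repro_steps):
    """Build the JSON Patch body the Work Item Tracking API expects."""
    return [
        {"op": "add", "path": "/fields/System.Title", "value": title},
        {"op": "add", "path": "/fields/Microsoft.VSTS.TCM.ReproSteps",
         "value": repro_steps},
    ]

def create_bug(organization, project, pat, title, repro_steps):
    """POST a new Bug work item; returns the created work item as a dict."""
    url = (f"https://dev.azure.com/{organization}/{project}"
           f"/_apis/wit/workitems/$Bug?api-version=7.0")
    token = base64.b64encode(f":{pat}".encode()).decode()
    request = urllib.request.Request(
        url,
        data=json.dumps(bug_patch_document(title, repro_steps)).encode(),
        headers={
            "Authorization": f"Basic {token}",
            "Content-Type": "application/json-patch+json",
        },
        method="POST",
    )
    with urllib.request.urlopen(request) as response:
        return json.load(response)
```

An ALM-to-VSTS adapter would map each exported ALM defect onto a patch document like this, one REST call per work item.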

Adding SQL Scripts to TFS Build Process

Our team currently updates our production databases via DACPACs/BACPACs, since we do not have any client data. This will not be the case in the near future, and I'm looking to change the process so the database is only modified via SQL scripts through build automation.
Is managing these SQL scripts in Team Foundation Server and executing them in the build doable? If so, how have people approached this?
Thanks!
You should probably not be deploying DACPACs during the build process; it's more appropriate to push changes as part of a release/deployment workflow that occurs after a build.
That said, you can publish a DACPAC without dropping the database and recreating it; there are numerous options for controlling the database publishing behavior that can be set at the project level and overridden at the command line if necessary.
There are two main ways of managing SQL database changes when deploying to production:
DACPAC
Sequential SQL scripts
Both have their own issues and benefits when deploying. If you control your entire DevOps pipeline, then DACPAC is a good option. If you have to deal with corporate DBAs, then SQL scripts can also work.
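For the DACPAC route, a release step typically shells out to SqlPackage; the server and database names below are placeholders, and the publish properties shown are examples of the per-environment overrides mentioned above:

```shell
# Publish a DACPAC without dropping and recreating the database.
# Safety-related publish properties can be overridden per environment.
SqlPackage /Action:Publish \
  /SourceFile:MyDatabase.dacpac \
  /TargetServerName:prod-sql.example.com \
  /TargetDatabaseName:MyDatabase \
  /p:BlockOnPossibleDataLoss=True \
  /p:DropObjectsNotInSource=False
```

The same properties can be set at the database project level and only overridden on the command line for environments that need stricter or looser behavior.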