Migrating Azure DevOps Work Items using the DevOps Query ID in the VSTS Sync Migrator Tool's Configuration File - azure-devops-migration-tools

TL;DR;
I need to migrate Parent/Child related Work items listed in my "Tree of Work Items" Query
Is there a Parameter in the Configuration.json file that takes in the Query ID?
Currently I'm using the WiQL query bit to select the work items I need to migrate, but I wanted to know if there's a way to tell the tool to fetch a specific set of work items from a saved query of type "Tree of Work Items", in order to migrate the child work items under an epic.

I already thought about making a separate area path (called "for migration"), moving the work items listed by my tree query to that area path, and then having the query bit in the config.json migrate the work items under the area path "for migration", but I just wanted to know if it's possible to migrate work items from a query using the Query ID from DevOps.

It is not possible to use a Query ID. You need to add the text of the query directly into the configuration file. You can use the WiQL Editor to create a flat query and export the text.
Also "Tree of Work Items" queries are not supported. You need to use a flat query!
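For reference, the flat query text goes straight into the processor's query option in the configuration file. A minimal sketch, assuming a recent configuration schema (the property names vary between versions of the tool, so check them against your own generated configuration.json):

```json
{
  "ObjectType": "WorkItemMigrationConfig",
  "Enabled": true,
  "WIQLQueryBit": "AND [System.AreaPath] UNDER 'MyProject\\for migration' AND [System.WorkItemType] NOT IN ('Test Suite', 'Test Plan')"
}
```

Note that this is a flat WHERE-clause fragment: any parent/child relationships from a tree query have to be expressed indirectly (for example via the area path trick described in the question).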

Related

How to compare table structure in two environments

I've just inherited a fairly large DB, and been just informed that the table structure does not match between Dev and Prod. This is causing us problems since the code developed for Dev ends up crashing in Prod which is causing some pretty catastrophic releases.
So, I'd like to find a way to compare the tables, keys, and indexes programmatically. Adding a tool requires client approval, which is quite the challenge. I can't just copy the data over because they have different data sets.
Does anyone have any scripts that could help me figure out which ones to update?
There are quite a few tools you could use. I prefer the Red Gate tool, though Visual Studio will work just as well.
The differing data is a non-issue, and so is client approval.
Simply script out the two tables, add them to your local instance or a test server, and use the tool to compare the two.
To expand on Tab Alleman's response, it is typically standard practice (SOP) to roll Prod down to Dev. You can find plenty of information on the SDLC via Bing / Google.
A secondary option would be to rename the Dev version of the table, then script out and "move" the table from Prod to Dev. Insert the data from the Dev version of the table back into the new table, which will have the proper object name.
Perhaps the simplest way, without any third-party tools, is the one provided in SQL Server Management Studio itself.
Below are brief steps to give an overall idea of the actions involved:
STEP 1: Generate full database script for both the Dev and Prod databases using SQL Server Management Studio's inbuilt Generate a Script option.
STEP 2: Compare both scripts using any basic text-comparison utility (such as Beyond Compare, Meld, or WinMerge).
STEP 3: Note down differences and make plan to fix them.
And if you are still reading :)
Here is a more detailed explanation of generating the database script:
In Object Explorer, expand Databases, right-click a database, point to Tasks, and then click Generate Scripts. Follow the steps in the wizard to script the database objects.
On the Choose Objects page, select Script entire database and all database objects.
On the Set Scripting Options page, select Save scripts to a specific location.
To specify advanced scripting options, select the Advanced button in the Save scripts to a specific location section.
Tricky step - In Advanced Scripting Options popup Select False for Include Descriptive Headers.
This will remove unwanted time stamps, which is a great help while comparing scripts.
On the Summary page, review your selections. Click Next to generate a script of the objects you selected.
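Step 2 can itself be scripted rather than done in a GUI diff tool. A minimal sketch using Python's difflib, with short inline samples standing in for the two generated .sql files (in practice you would read them from disk; the file names here are hypothetical):

```python
import difflib

# Stand-ins for the scripts generated in STEP 1; in practice:
#   dev_script = open("dev.sql").read().splitlines()
dev_script = [
    "CREATE TABLE dbo.Orders (",
    "    OrderID int NOT NULL,",
    "    Total decimal(18, 2) NULL",
    ")",
]
prod_script = [
    "CREATE TABLE dbo.Orders (",
    "    OrderID int NOT NULL,",
    "    Total money NULL",
    ")",
]

# unified_diff emits only the lines that differ, with a little context,
# which is exactly the "note down differences" step.
diff = list(difflib.unified_diff(dev_script, prod_script,
                                 fromfile="dev.sql", tofile="prod.sql",
                                 lineterm=""))
print("\n".join(diff))
```

Turning off descriptive headers in STEP 1 matters here: otherwise every object would show a spurious timestamp difference.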

How to import selective tables using SSIS based on custom (e.g.XML) file

I have about 1200 tables in my Oracle database and need to import them into a SQL Server database. But I would like to configure the import so that for any given run I can select which tables are imported.
So, I have a custom XML file listing all the tables, with a flag for each table indicating whether it is to be imported or not. I have also created a package that imports all the tables, and I would like to modify it to check against the XML file at runtime whether each table should be imported.
I was thinking of implementing something like the approach given here, but I don't want to do that for this many tables, and I also don't know whether it would do the job.
How can I get around this? Can I use an SSIS configuration file for this (not sure, though)? Is there any way I can read the XML at runtime and import tables based on the XML file (or any other file with key-value pairs)?
Any help in any form would be greatly appreciated.
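For illustration, a flag file like the one described might look like this and be filtered in a few lines; the element and attribute names below are made up for the sketch, shown here in Python with the standard library's ElementTree:

```python
import xml.etree.ElementTree as ET

# Hypothetical layout for the table-selection file; the real file's
# element and attribute names may differ.
config_xml = """
<Tables>
  <Table Name="EMPLOYEES" Import="1" />
  <Table Name="DEPARTMENTS" Import="0" />
  <Table Name="SALARIES" Import="1" />
</Tables>
"""

root = ET.fromstring(config_xml)
# Keep only the tables flagged for import.
tables_to_import = [t.get("Name") for t in root.iter("Table")
                    if t.get("Import") == "1"]
print(tables_to_import)
```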
It might seem like a lot of work, but this is how I'd approach it:
Create one package for each table that needs to be imported - so 1200 packages.
Store the package names in a metadata table along with a flag column indicating whether each package needs to be executed or not.
Create a parent package.
Add an Execute SQL Task to the parent package with a command like select PackageName from metadataTable where Flag = 1 to retrieve the list of packages that need to be executed.
Map the result set to an object variable.
Add a Foreach Loop container.
Add an Execute Package Task inside the Foreach Loop container, and parameterize the package name property.
This whole setup reads the packages that need to be executed and runs them one after the other.
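Outside of SSIS, the same loop amounts to something like the following sketch, which only builds a dtexec command line per flagged package rather than running anything (the package names and paths are hypothetical):

```python
# Sketch only: in the real setup the list would come from the Execute
# SQL Task (select PackageName from metadataTable where Flag = 1).
flagged_packages = ["LoadEmployees.dtsx", "LoadSalaries.dtsx"]

commands = []
for package in flagged_packages:
    # /F runs a package stored on the file system; use /SQL or
    # /ISServer instead for MSDB- or catalog-stored packages.
    commands.append(["dtexec", "/F", rf"C:\ETL\{package}"])

for cmd in commands:
    print(" ".join(cmd))
    # subprocess.run(cmd, check=True)  # only on a machine with SSIS installed
```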
If you like this approach, check out Andy Leonard's SSIS framework.
Samuel Vanga has a solid approach. The only thing I would look at doing is using something to programmatically generate those 1200 packages.
Depending on your familiarity with the SSIS object model and general .NET development, I'd investigate an EzAPI if you enjoy coding.
Otherwise, look at BIML and the package-generation feature of BIDSHelper. You do not need to buy a license for Mist to create your BIML script; you can browse the existing scripts on BIMLScript, which will probably cover most of your needs. Copy, paste, generate.

How do I filter objects to be compared using VSDBCMD

I'm trying to automate my deployment and I've been trying to use the VSDBCMD command-line tool to compare the schemas of my development and staging databases. I can get it to work comparing everything, but what I can't figure out is how to filter which objects get compared. At the moment it compares everything, which means it wants to add or remove users, full-text catalogs, filegroups, etc.
Basically I just want to compare tables, stored procedures, views, functions and a few other things. From within visual studio you can set what objects to compare but I can't see from the documentation how to do this using the command line tool.
Anyone have any ideas?
Unfortunately you can't. The best explanation I have seen is here: http://social.msdn.microsoft.com/Forums/en-US/vstsdb/thread/75656877-95e1-4c13-8540-8a445f47ca57
I'm not at my workstation now, but I believe it is possible to filter out user scripts by checking the "ignore permissions" option in the db settings file. You might try experimenting with the other ignore settings to see whether you can get closer to your goal that way.

Sql Query to XML document inside Drupal

I need to be able to dump my user table inside Drupal to an XML document that I can hit from a path in a browser.
e.g. mysite.com/users.php
Looking for an easy way to do this, an existing module would be ideal. Not sure if QueryPath does this.
Thanks.
Using Views Bonus Pack you can first build a view of all your users, and then create a 'feed' display. In the feed settings, you'll be able to select XML as one of the formats. You can then set the path to anything you like.
QueryPath can do this.
There is a database extension that comes with QueryPath's Drupal module that basically lets you take any SQL statement and inject the results into an XML document. The basic idea is explained here:
http://technosophos.com/content/using-querypath-interact-sql-database-part-1
The Drupal module has Drupal-specific DB bindings though. (So stuff like {table} is correctly translated.)
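The underlying idea is language-agnostic: run the query, wrap each row of the result set in XML elements. A sketch of that pattern in Python, with an in-memory SQLite table standing in for Drupal's users table (column names are illustrative):

```python
import sqlite3
import xml.etree.ElementTree as ET

# In-memory stand-in for the Drupal users table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (uid INTEGER, name TEXT, mail TEXT)")
conn.execute("INSERT INTO users VALUES (1, 'alice', 'alice@example.com')")
conn.execute("INSERT INTO users VALUES (2, 'bob', 'bob@example.com')")

# Wrap each row of the result set in an XML element.
root = ET.Element("users")
for uid, name, mail in conn.execute("SELECT uid, name, mail FROM users"):
    user = ET.SubElement(root, "user", uid=str(uid))
    ET.SubElement(user, "name").text = name
    ET.SubElement(user, "mail").text = mail

xml_out = ET.tostring(root, encoding="unicode")
print(xml_out)
```

In Drupal itself you would serve this from a menu callback (or a view) at a path like the one in the question, with an XML Content-Type header.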
I've also used the Views Datasource module to do this, but it's buggy. The last two times I did it, I had to edit the source code for the module first.

Creating a CHANGE script in Management Studio?

I was wondering if there is a way to automatically append to a script file all the changes I am making to my columns, tables, relationships etc...
The thing is I am doing a lot of different changes on a TEST db and the idea will be to apply this change script when I move the test db to production... hence keeping production data but applying all schema and object changes.
Is there an easy way to do this? Can it also migrate database diagram changes?
I have seen how you can create a change script each time I make a change, but this means I have to copy and paste it into a master file. Actually, that's pretty easy!
I was just wondering if I was missing something?
Do not make changes to the test server using the UI. Write scripts and keep them under source control. You can test your scripts starting from backups of the live data, and you can tune them until they achieve the desired result. Then you can check in the scripts for reference and later apply them to the live server. See the article Version Control and Your Database.
BTW, check out the SSMS Tools Pack; I think it may do what you want (I'm not sure). My advice stands nonetheless: version your schema, use explicitly created/saved scripts, and use source control.
There's no way to directly generate a "delta" script in SSMS.
However, if every time you publish changes, you script out the entire database, including data, to SQL using the SQL Server Database Publishing Wizard you should be able to extract diffs between the versions and get your deltas that way.
If money is no object, you can purchase Visual Studio Team System Database Architect edition and use its fantastic database comparison tools to generate and version control exactly the diffs you want.
Try using the tablediff utility, which came with SQL Server 2005.
SQL Server 2005 TableDiff Utility
tablediff Utility
We have the process where when a developer gets done with a change, they then script it out and check it into Subversion. In Subversion we have a folder for Tables, Stored Procs, Data, etc. They script it out so it is repeatable (i.e. don’t insert the new data if it is already there.) This is important to do anyway so you keep the history of changes for a given object in the database.
In the past, we would just enter each of the files that we wanted scripted out into a text file (i.e. FileListV102.txt). When we were ready to make a release we would do “get latest” on all of the files (from VSS back then.) We then had a simple utility that would read the “file list” file and open each of those files in turn concatenating them into an output file. That is pretty easy to code.
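The concatenation utility described above really is only a few lines. A sketch in Python, with the file list and scripts generated on the fly so it is self-contained (the file names are hypothetical):

```python
import os
import tempfile

# Stand-ins for the scripted-out object files named in the "file list".
scripts = {
    "Tables/Orders.sql": "CREATE TABLE dbo.Orders (OrderID int NOT NULL);",
    "Procs/GetOrders.sql": "CREATE PROCEDURE dbo.GetOrders AS SELECT 1;",
}

workdir = tempfile.mkdtemp()
for relpath, sql in scripts.items():
    path = os.path.join(workdir, relpath)
    os.makedirs(os.path.dirname(path), exist_ok=True)
    with open(path, "w") as f:
        f.write(sql)

# The "file list" file names the scripts in the order they should run
# (tables before procs, procs before data, etc.).
file_list = ["Tables/Orders.sql", "Procs/GetOrders.sql"]

# Concatenate each listed file into one release script, separated by GO
# batch terminators.
parts = []
for relpath in file_list:
    with open(os.path.join(workdir, relpath)) as f:
        parts.append(f.read())
release_script = "\nGO\n".join(parts)
print(release_script)
```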
We outgrew that, and now we have a release management tool (which can be found here and will be on sale mid-September) that takes all of the files and creates one big SQL script file out of them. It does so in the order you would expect based on the folder names - so files found in the "Tables" folder are run before those in the "Data" folder, etc.
Either way, once you are done you have a big SQL script file that you can then apply to a fresh copy of production and that is what you test against.
I know I'm way late to the party, but I just wanted to add that there are dozens of third-party products out there. Some are very good, some are very cheap or free, and some are a mixture. I listed 22 here:
http://bertrandaaron.wordpress.com/2012/04/20/re-blog-the-cost-of-reinventing-the-wheel/
We have been using a relatively new tool called Kal Admin.
It has a change management feature and makes it very easy to distribute selected changes to other databases. We used to do this by comparing two databases, but that did not satisfy our need for change tracking.
BTW, Kal Admin has metadata and data compare capabilities as well.