Azure ADO Shared query migration - TfsSharedQueryProcessorOptions SourceName TargetName - azure-devops-migration-tools

The documentation shows a sample configuration for migrating queries (TfsSharedQueryProcessorOptions) that includes SourceName and TargetName parameters. My original problem was that when I tried to use the sample config to migrate queries, I got an error message saying: "There is no endpoint named [sourceName]".
Things I tried: I changed the endpoint names in the existing Endpoints node, added a new child Endpoints node inside TfsSharedQueryProcessorOptions, renamed TfsEndpoints to TfSharedQueryEndpoints, and so on, but none of that worked. I eventually found a way to make it work. Please see my own answer below.

Eventually, I found a way to make it work. So I thought I'd share my config, in case someone else is stuck on this. In the config below, change the parameters Version, Organisation and Project to your own values. Also change the AccessToken parameter if you set AuthenticationMode to "AccessToken". This is my entire config to migrate queries:
{
  "Version": "**0.0**",
  "LogLevel": "Verbose",
  "Endpoints": {
    "TfsEndpoints": [
      {
        "Name": "Source",
        "AccessToken": "**Your source access token**",
        "Organisation": "https://dev.azure.com/**your_source_organization_name**/",
        "Project": "**Your Source Project Name**",
        "ReflectedWorkItemIdField": "Custom.ReflectedWorkItemId",
        "AuthenticationMode": "AccessToken",
        "AllowCrossProjectLinking": false,
        "LanguageMaps": {
          "AreaPath": "Area",
          "IterationPath": "Iteration"
        }
      },
      {
        "Name": "Target",
        "AccessToken": "**Your target access token**",
        "Organisation": "https://dev.azure.com/**your_target_organization_name**/",
        "Project": "**Your Target Project Name**",
        "ReflectedWorkItemIdField": "Custom.ReflectedWorkItemId",
        "AuthenticationMode": "AccessToken",
        "AllowCrossProjectLinking": false,
        "LanguageMaps": {
          "AreaPath": "Area",
          "IterationPath": "Iteration"
        }
      }
    ]
  },
  "Processors": [
    {
      "$type": "TfsSharedQueryProcessorOptions",
      "Enabled": true,
      "PrefixProjectToNodes": false,
      "SharedFolderName": "Shared Queries",
      "SourceToTargetFieldMappings": null,
      "ProcessorEnrichers": null,
      "SourceName": "Source",
      "TargetName": "Target"
    }
  ]
}
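For completeness, this is how I run the tool once the config above is saved next to it (assuming the classic migration.exe console app from azure-devops-migration-tools; newer releases ship the executable under a different name, so adjust accordingly):
:: assumes the config above was saved as configuration.json next to the executable
migration.exe execute --config configuration.json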

Related

Forge - The category 'rfaFile' in '$(rfaFile)' is unrecognized

We are getting an error while executing a WorkItem in Forge's Design Automation API.
The error is this:
Error: The category 'rfaFile' in '$(rfaFile)' is unrecognized. Valid values are args, settings, appbundles, engine, engines.
And it happens right after 'Start preparing script and command line parameters.' in the report.txt. We are not really sure why this is happening. It looks like the error is thrown in the activity. The activity looks like this:
function publishActivity() {
  return $.ajax({
    url: "/api/forge/design_automation/activities",
    headers: {
      "X-CSRF-Token": csrfToken,
      "Forge-Token": forgeToken
    },
    method: "POST",
    contentType: "application/json",
    data: JSON.stringify({
      activity: {
        "id": "DeleteWallsActivity",
        "commandLine": [ "$(engine.path)\\\\revitcoreconsole.exe /i \"$(args[rfaFile].path)\" /al \"$(appbundles[TestAppId].path)\"" ],
        "parameters": {
          "rfaFile": {
            "zip": false,
            "ondemand": false,
            "verb": "get",
            "description": "Input Revit model",
            "required": true,
            "localName": "$(rfaFile)"
          },
          "result": {
            "zip": false,
            "ondemand": false,
            "verb": "put",
            "description": "Results",
            "required": true,
            "localName": "result.rfa"
          },
          "inputJson": {
            "verb": "get",
            "description": "input json",
            "localName": "params.json",
            "ondemand": false,
            "required": false,
            "zip": false
          }
        },
        "engine": "Autodesk.Revit+2021",
        "appbundles": [ "petar3db.TestAppId+test" ],
        "description": "Deletes walls from Revit file."
      }
    })
  }).done(function(data) {
    console.log("Activity created");
    bundleUploadData = data["uploadParameters"];
  }).fail(function(jqXHR, textStatus) {
    console.log("Failed to create activity", jqXHR.responseJSON);
    console.log(jqXHR, textStatus);
  });
}
and it looks like the "localName": "$(rfaFile)" is causing the trouble.
Let's take a look at our WorkItem code which we execute via websockets:
{
  "headers": {
    "Authorization": "Bearer <token here>"
  },
  "action": "post-workitem",
  "data": {
    "activityId": "petar3db.DeleteWallsActivity+test",
    "arguments": {
      "rfaFile": { "url": "https://developer.api.autodesk.com/oss/v2/signedresources/da992c60-a3d7-469d-8c3e-d0f089e2e509?region=US", "pathInZip": "emptyfam.rfa" },
      "result": { "verb": "put", "url": "https://developer.api.autodesk.com/oss/v2/signedresources/b78151c1-93aa-495f-96c8-183bca26e071?region=US" },
      "inputJson": { "localName": "params.json", "url": "the url to the file" }
    }
  }
}
The really strange part is that this process worked just fine until we added "inputJson" to the activity and WorkItem. (We want to send some JSON data to the AppBundle with the WorkItem.)
What could be the issue? Are we missing something?
As for "localName": "$(rfaFile)", to be noted that if the local name is defined like this, Design Automation will come up a valid name for this argument by its own logic. If you want to fully control the input file, such as accessing it in the addin(Appbundles)'s code, it is recommended to define a "real" localName instead, e.g. "localName": "input.rfa"
In your case above, you may need to:
Remove /i \"$(args[rfaFile].path)\" from commandLine in the Activity.
Define "localName": "inputRFA", so the input will be downloaded and unzipped into a folder named inputRFA; emptyfam.rfa should be under this folder.
Call OpenDocumentFile in the add-in to open a Revit file and get a document.
Call document.LoadFamily(".\inputRFA\emptyfam.rfa", out family); in the add-in to open/load the rfa file. See the Revit API docs and the sketch below.
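To illustrate those last two steps, here is a minimal add-in sketch (assumptions: the DesignAutomationFramework bridge is used, the host model is opened from a hypothetical host.rvt, and the rfa sits in the inputRFA folder as described above); treat it as a sketch rather than a drop-in implementation:
using Autodesk.Revit.ApplicationServices;
using Autodesk.Revit.DB;
using DesignAutomationFramework;

public class DeleteWallsApp : IExternalDBApplication
{
    public ExternalDBApplicationResult OnStartup(ControlledApplication app)
    {
        DesignAutomationBridge.DesignAutomationReadyEvent += HandleDesignAutomationReadyEvent;
        return ExternalDBApplicationResult.Succeeded;
    }

    public ExternalDBApplicationResult OnShutdown(ControlledApplication app)
    {
        return ExternalDBApplicationResult.Succeeded;
    }

    private void HandleDesignAutomationReadyEvent(object sender, DesignAutomationReadyEventArgs e)
    {
        Application app = e.DesignAutomationData.RevitApp;

        // Open the host model (hypothetical file name) and load the rfa that
        // Design Automation downloaded into the "inputRFA" folder.
        Document doc = app.OpenDocumentFile("host.rvt");
        using (Transaction tx = new Transaction(doc, "Load family"))
        {
            tx.Start();
            Family family;
            doc.LoadFamily(@".\inputRFA\emptyfam.rfa", out family);
            tx.Commit();
        }

        e.Succeeded = true;
    }
}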
There is a mismatch between the parameter name in the activity and the argument name in the WorkItem. The correct way to post the WorkItem should be:
{
  "headers": {
    "Authorization": "Bearer <token here>"
  },
  "action": "post-workitem",
  "data": {
    "activityId": "petar3db.DeleteWallsActivity+test",
    "arguments": {
      "rfaFile": { "url": "https://developer.api.autodesk.com/oss/v2/signedresources/da992c60-a3d7-469d-8c3e-d0f089e2e509?region=US", "pathInZip": "emptyfam.rfa" },
      "result": { "verb": "put", "url": "https://developer.api.autodesk.com/oss/v2/signedresources/b78151c1-93aa-495f-96c8-183bca26e071?region=US" },
      "inputJson": { "localName": "params.json", "url": "the url to the file" }
    }
  }
}
Change the argument field rvtFile to rfaFile.

REST dataset for Copy Activity Source give me error Invalid PaginationRule

My Copy Activity is set up to use a REST GET API call as its source. I keep getting Error Code 2200: Invalid PaginationRule RuleKey=supportRFC5988.
I can call the GET Rest URL using the Web Activity, but this isn't optimal as I then have to pass the output to a stored procedure to load the data to the table. I would much rather use the Copy Activity.
Any ideas why I would get an Invalid PaginationRule error on a call?
I'm using a REST Linked Service with the following properties:
Name: Workday
Connect via integration runtime: link-unknown-self-hosted-ir
Base URL: https://wd2-impl-services1.workday.com/ccx/service
Authentication type: Basic
User name: Not telling
Azure Key Vault for password
Server Certificate Validation is enabled
Parameters: Name:format Type:String Default value:json
Datasource:
"name": "Workday_Test_REST_Report",
"properties": {
"linkedServiceName": {
"referenceName": "Workday",
"type": "LinkedServiceReference",
"parameters": {
"format": "json"
}
},
"folder": {
"name": "Workday"
},
"annotations": [],
"type": "RestResource",
"typeProperties": {
"relativeUrl": "/customreport2/company1/person%40company.com/HIDDEN_BI_RaaS_Test_Outbound"
},
"schema": []
}
}
Copy Activity
{
  "name": "Copy Test Workday REST API output to a table",
  "properties": {
    "activities": [
      {
        "name": "Copy data1",
        "type": "Copy",
        "dependsOn": [],
        "policy": {
          "timeout": "7.00:00:00",
          "retry": 0,
          "retryIntervalInSeconds": 30,
          "secureOutput": false,
          "secureInput": false
        },
        "userProperties": [],
        "typeProperties": {
          "source": {
            "type": "RestSource",
            "httpRequestTimeout": "00:01:40",
            "requestInterval": "00.00:00:00.010",
            "requestMethod": "GET",
            "paginationRules": {
              "supportRFC5988": "true"
            }
          },
          "sink": {
            "type": "SqlMISink",
            "tableOption": "autoCreate"
          },
          "enableStaging": false
        },
        "inputs": [
          {
            "referenceName": "Workday_Test_REST_Report",
            "type": "DatasetReference"
          }
        ],
        "outputs": [
          {
            "referenceName": "Destination_db",
            "type": "DatasetReference",
            "parameters": {
              "schema": "ELT",
              "tableName": "WorkdayTestReportData"
            }
          }
        ]
      }
    ],
    "folder": {
      "name": "Workday"
    },
    "annotations": []
  }
}
Well, after posting this, I noticed that the Copy Activity code contains a nugget about "supportRFC5988": "true". I switched the true to false, and everything just worked for me. I don't see a way to change this in the Copy Activity GUI.
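For reference, this is the fragment of the Copy Activity source after the edit (the only change from the pipeline JSON above is the supportRFC5988 value):
"paginationRules": {
  "supportRFC5988": "false"
}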
Editing source code and setting this option to false helped!

Cannot merge Ocelot config files

As per the documentation, I tried to merge my config files so they are a bit more readable. The generated ocelot.json file, however, is not as expected. My folder structure is as follows (text representation):
.
└── Ocelot route configs
├── ocelot.pokemon.json
├── ocelot.tweet.json
└── ocelot.weather.json
The ocelot.pokemon.json file looks like the following (the others are similar):
{
  "Routes": [
    {
      "DownstreamPathTemplate": "/api/v2/pokemon",
      "DownstreamScheme": "https",
      "DownstreamHostAndPorts": [
        {
          "Host": "pokeapi.co",
          "Port": 443
        }
      ],
      "UpstreamPathTemplate": "/api/pokemon",
      "UpstreamHttpMethod": [ "GET" ],
      "AuthenticationOptions": {
        "AuthenticationProviderKey": "MyTestKey",
        "AllowedScopes": []
      }
    },
    {
      "DownstreamPathTemplate": "/api/v2/pokemon/ditto",
      "DownstreamScheme": "https",
      "DownstreamHostAndPorts": [
        {
          "Host": "pokeapi.co",
          "Port": 443
        }
      ],
      "UpstreamPathTemplate": "/api/pokemon/ditto",
      "UpstreamHttpMethod": [ "GET" ]
    }
  ]
}
The generated ocelot.json file looks like this:
{
  "Routes": [],
  "DynamicRoutes": [],
  "Aggregates": [],
  "GlobalConfiguration": {
    "RequestIdKey": null,
    "ServiceDiscoveryProvider": {
      "Scheme": null,
      "Host": null,
      "Port": 0,
      "Type": null,
      "Token": null,
      "ConfigurationKey": null,
      "PollingInterval": 0,
      "Namespace": null
    },
    "RateLimitOptions": {
      "ClientIdHeader": "ClientId",
      "QuotaExceededMessage": null,
      "RateLimitCounterPrefix": "ocelot",
      "DisableRateLimitHeaders": false,
      "HttpStatusCode": 429
    },
    "QoSOptions": {
      "ExceptionsAllowedBeforeBreaking": 0,
      "DurationOfBreak": 0,
      "TimeoutValue": 0
    },
    "BaseUrl": null,
    "LoadBalancerOptions": {
      "Type": null,
      "Key": null,
      "Expiry": 0
    },
    "DownstreamScheme": null,
    "HttpHandlerOptions": {
      "AllowAutoRedirect": false,
      "UseCookieContainer": false,
      "UseTracing": false,
      "UseProxy": true,
      "MaxConnectionsPerServer": 2147483647
    },
    "DownstreamHttpVersion": null
  }
}
As you can see, the routes I defined were not added. I tried looking on the internet for this specific issue but couldn't find anything. I don't know what I'm doing wrong; any help will be appreciated.
Since your route configuration files are located in a folder, you should make sure the correct overload of the AddOcelot method is called. In this case the method should be called with the name of the folder containing the route files.
For example:
config.AddOcelot("Ocelot route configs", hostingContext.HostingEnvironment)
UPDATE: .NET Core 3+ with Ocelot 17.0.0
Since the AddOcelot method needs an IWebHostEnvironment, and this is not available in HostBuilderContext, you need to get it via WebHostBuilderContext.
I created two different directories, Development and Production, and with the code below I'm able to read the Ocelot configuration for the current environment and generate the final ocelot.json that is used by the Ocelot middleware.
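The snippet itself didn't make it into this post, so here is a minimal sketch of the idea (assumptions: a generic host, Ocelot 17's AddOcelot(folder, env) overload, Development/Production subfolders under "Ocelot route configs", and your own Startup class), not necessarily the exact original code:
// Program.cs (sketch)
// requires: using Microsoft.Extensions.Hosting; using Ocelot.DependencyInjection;
public static IHostBuilder CreateHostBuilder(string[] args) =>
    Host.CreateDefaultBuilder(args)
        .ConfigureWebHostDefaults(webBuilder =>
        {
            webBuilder
                .ConfigureAppConfiguration((webHostBuilderContext, config) =>
                {
                    // WebHostBuilderContext exposes IWebHostEnvironment,
                    // which the AddOcelot(folder, env) overload requires.
                    var env = webHostBuilderContext.HostingEnvironment;
                    config.AddOcelot($"Ocelot route configs/{env.EnvironmentName}", env);
                })
                .UseStartup<Startup>();
        });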

Swagger UI and Docker Container Communication

I have a docker container running Swagger UI on port 80 and I have another API running in another container on port 32788
http://127.0.0.1:80/ >>> returns swagger UI
http://127.0.0.1:32788/swagger.json >>> returns swagger API def
But when I put the json file into the Swagger UI field and hit explore, it says
NetworkError when attempting to fetch resource. http://127.0.0.1:32788/swagger.json
Any ideas on how to solve this? The docs say that they should automatically be connected to the bridge network.
Below is the result of the network inspection
docker network inspect bridge
[
  {
    "Name": "bridge",
    "Id": "4b5cc1526055297df70dc9adc4959fcee93384c412fbf90500c041b5b83ed43a",
    "Created": "2018-01-17T03:48:39.2325461Z",
    "Scope": "local",
    "Driver": "bridge",
    "EnableIPv6": false,
    "IPAM": {
      "Driver": "default",
      "Options": null,
      "Config": [
        {
          "Subnet": "172.17.0.0/16",
          "Gateway": "172.17.0.1"
        }
      ]
    },
    "Internal": false,
    "Attachable": false,
    "Ingress": false,
    "ConfigFrom": {
      "Network": ""
    },
    "ConfigOnly": false,
    "Containers": {
      "257a15af9ab9b25c6c5622fb0ebe599e5703b2ca5f2e4eaa97a8745a21e7f9a9": {
        "Name": "pensive_neumann",
        "EndpointID": "22be4b781f75e071bcb0098b917b81b16ca493e9080848188dd7a811c27070ec",
        "MacAddress": "02:42:ac:11:00:02",
        "IPv4Address": "172.17.0.2/16",
        "IPv6Address": ""
      },
      "30de904a599a19075d5e20ef5d974a11be9d7e58a68d984a24f4af9e22c4d92b": {
        "Name": "naughty_mirzakhani",
        "EndpointID": "f704b3e103a82ca5c56d5955ac27845d8951cfe13f0bc3e1ccc8717ea9c28d39",
        "MacAddress": "02:42:ac:11:00:03",
        "IPv4Address": "172.17.0.3/16",
        "IPv6Address": ""
      }
    },
    "Options": {
      "com.docker.network.bridge.default_bridge": "true",
      "com.docker.network.bridge.enable_icc": "true",
      "com.docker.network.bridge.enable_ip_masquerade": "true",
      "com.docker.network.bridge.host_binding_ipv4": "0.0.0.0",
      "com.docker.network.bridge.name": "docker0",
      "com.docker.network.driver.mtu": "1500"
    },
    "Labels": {}
  }
]
Edit to explain how each was started:
The API is part of Azure Machine Learning, so it's hard to say exactly how it gets started (unless there is some command I can run in Docker):
az ml service create realtime
Swagger UI was started as follows:
docker run -p 80:8080 swaggerapi/swagger-ui

Express static mount path in krakenjs

I am trying to find the equivalent config.json entry in krakenjs for the code below.
app.use("/app/static", express.static(path.join(__dirname, 'public'), {maxage: '2h'}));
I tried something like the following, but it didn't pick up the mounted path:
"static": {
"enabled": true,
"priority": 40,
"name": "server-static",
"module": {
"arguments": [
"path:./public",
{"maxAge" : "3h"},
"mountpath:/app/static"
]
}
}
I am unable to access it at the following URL: https://app.com/app/static/style.css. But it is accessible via https://app.com/app/style.css.
Note: /app is my requestURI.
I figured it out. Here is how it is configured:
"static": {
"enabled": true,
"priority": 40,
"name": "server-static",
"module": {
"arguments": [
"path:./public",
{"maxAge" : "3h"}
]
},
"route": "/static"
},
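For what it's worth, my understanding is that this ends up behaving roughly like the original Express line, with the /app prefix coming from the app's mount path (the request URI) rather than from the middleware arguments (express.static and path.join as in the question, maxAge taken from the config above):
// roughly equivalent Express mounting; the /app prefix comes from the app's mount path
app.use("/static", express.static(path.join(__dirname, "public"), { maxAge: "3h" }));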