Migrate from Azure DevOps 2020 on prem german to the english version [closed] - azure-devops-migration-tools

A long time ago we made the bad decision to install the German version of TFS Server. We have upgraded through every new version of TFS, and are now on Azure DevOps Server 2020, but never had a chance to change the language. Now we want to migrate to VSTS, but for that we need an English version. As far as I know, Microsoft only supports migrating an English DevOps Server to VSTS.
Finding this tool and reading the line "v9.0 - Added support for migration between other language versions of Azure DevOps. Developed for German -> English", we hope to have a solution for our problem.
Searching the docs, I could not find any hints on how to do this, or even how to get started.
Is there any documentation on how to do it?
Regards Bernhard

This is not documented, but you need to look at the LanguageMaps and change the Source LanguageMaps to the words that your TFS uses for 'Area' and 'Iteration'.
"Source": {
  "$type": "TfsTeamProjectConfig",
  "Collection": "https://dev.azure.com/nkdagility-preview/",
  "Project": "myProjectName",
  "ReflectedWorkItemIDFieldName": "Custom.ReflectedWorkItemId",
  "AllowCrossProjectLinking": false,
  "PersonalAccessToken": "",
  "LanguageMaps": {
    "AreaPath": "Area",
    "IterationPath": "Iteration"
  }
},
In Spanish it would be:
"Source": {
  "$type": "TfsTeamProjectConfig",
  "Collection": "https://dev.azure.com/nkdagility-preview/",
  "Project": "myProjectName",
  "ReflectedWorkItemIDFieldName": "Custom.ReflectedWorkItemId",
  "AllowCrossProjectLinking": false,
  "PersonalAccessToken": "",
  "LanguageMaps": {
    "AreaPath": "Área",
    "IterationPath": "Iteración"
  }
},
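The docs don't list the terms for other languages, so you have to supply the ones your server actually uses. Assuming the German Azure DevOps UI uses "Bereich" for Area and keeps "Iteration" for Iteration (worth verifying against your own server's classification nodes), a German on-prem source would look like this; the collection URL is a placeholder:

```json
"Source": {
  "$type": "TfsTeamProjectConfig",
  "Collection": "https://my-devops-server/tfs/DefaultCollection/",
  "Project": "myProjectName",
  "ReflectedWorkItemIDFieldName": "Custom.ReflectedWorkItemId",
  "AllowCrossProjectLinking": false,
  "PersonalAccessToken": "",
  "LanguageMaps": {
    "AreaPath": "Bereich",
    "IterationPath": "Iteration"
  }
},
```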

Related

Is there an Azure API that will give me VM characteristics from RateCard description [closed]

Is there an Azure API that spits out details/attributes/characteristics of a specific meter provided through the RateCard API?
An example meter object from the API looks like this.
{
  "EffectiveDate": "2016-09-01T00:00:00Z",
  "IncludedQuantity": 0.0,
  "MeterCategory": "Virtual Machines",
  "MeterId": "40f2bbb5-1ca8-4ac5-afd6-b1e47f16314b",
  "MeterName": "Compute Hours",
  "MeterRates": {
    "0": 0.010
  },
  "MeterRegion": "US East 2",
  "MeterSubCategory": "Standard_D2_v2 VM (Windows)",
  "MeterTags": [],
  "Unit": "Hours"
}
I want to know more information about the MeterSubCategory, in this case the VM meter Standard_D2_v2 VM (Windows): how many cores, how much memory, the max number of disks, and hopefully the VM throttling limits for IOPS and MB/sec.
I want to know more information about the MeterSubCategory, in this case the VM meter of Standard_D2_v2 VM (Windows). How many cores, how much memory, max number of disks
You could use the Azure Virtual Machines REST API, specifically the vmSizes operation:
GET https://management.azure.com/subscriptions/{subscriptionId}/providers/Microsoft.Compute/locations/{location}/vmSizes?api-version={apiVersion}
For example:
https://management.azure.com/subscriptions/************/providers/Microsoft.Compute/locations/eastus/vmSizes?api-version=2017-12-01
The API returns the VM name, numberOfCores, osDiskSizeInMB, resourceDiskSizeInMB, memoryInMB, and maxDataDiskCount.
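As a minimal illustration, here is a Python sketch that picks one size out of a vmSizes response; the sample data below is illustrative (it mirrors the documented {"value": [...]} response shape and the published Standard_D2_v2 specs), not live API output:

```python
# Sketch: look up core/memory/disk details for one VM size in a
# vmSizes response, which has the shape {"value": [{...}, ...]}.

def find_vm_size(vm_sizes_response, name):
    """Return the entry for the given VM size name, or None if absent."""
    for size in vm_sizes_response.get("value", []):
        if size.get("name") == name:
            return size
    return None

# Illustrative sample in the documented response shape:
sample_response = {
    "value": [
        {
            "name": "Standard_D2_v2",
            "numberOfCores": 2,
            "memoryInMB": 7168,
            "osDiskSizeInMB": 1047552,
            "resourceDiskSizeInMB": 102400,
            "maxDataDiskCount": 8,
        }
    ]
}

d2v2 = find_vm_size(sample_response, "Standard_D2_v2")
print(d2v2["numberOfCores"], d2v2["memoryInMB"])  # 2 7168
```

In practice you would feed this the decoded JSON from the GET request above, authenticated with a bearer token.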
and hopefully VM throttling limits for IOPS and MB/sec.
To my knowledge, no API lists this; you could check the official Azure documentation.

Download My files with REST CALLS CKAN [closed]

I am new to CKAN. Please tell me if it is possible to download my CKAN datasets (my files) using the CKAN API.
Regards.
You can make a call to CKAN's API action 'package_show' to see the details of a dataset, which gives you the 'resources' containing the URLs of the data files. Then you can use these URLs to download the data directly from wherever they are stored.
e.g. doing a package_show from the CKAN API by GETting this URL: https://data.gov.uk/api/3/action/package_show?id=index-of-multiple-deprivation
{
  "help": "http://data.gov.uk/api/3/action/help_show?name=package_show",
  "success": true,
  "result": {
    ...
    "resources": [
      ...
      {
        "description": "2010: Sub-domains living environment",
        ...
        "url": "https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/6882/1871676.xls",
      }
      ...
    ]
  }
}
and clearly you can download this spreadsheet from https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/6882/1871676.xls
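A short Python sketch of that flow, assuming the standard action-API response shape shown above (the helper name is mine):

```python
# Sketch: pull downloadable resource URLs out of a package_show response,
# which has the {"success": ..., "result": {"resources": [...]}} shape
# shown above.

def resource_urls(package_show_response):
    """Return the non-empty resource URLs of a dataset."""
    return [res["url"]
            for res in package_show_response["result"]["resources"]
            if res.get("url")]

# With an abbreviated copy of the response above:
sample = {
    "success": True,
    "result": {
        "resources": [
            {"description": "2010: Sub-domains living environment",
             "url": "https://www.gov.uk/government/uploads/system/uploads/"
                    "attachment_data/file/6882/1871676.xls"}
        ]
    }
}
print(resource_urls(sample)[0])
```

Each URL can then be fetched directly (for example with urllib or curl) from wherever the file is stored.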

Google sitelink search box [closed]

We are implementing the sitelinks search box within Google's search results for our site.
We have our own search feature on the website and don't want to use Google Custom Search.
We are following the instructions on the Google developer site, but are finding it difficult to set up.
I added the following JSON-LD in the head, but it is not working:
<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "WebSite",
  "url": "http://www.oursite.com.au/",
  "potentialAction": {
    "@type": "SearchAction",
    "target": "http://www.oursite.com.au/search.aspx?keyword={search_term}",
    "query-input": "required name=search_term"
  }
</script
I was wondering whether I set up the code correctly.
There are two problems with your snippet: you are missing the final closing brace, and the script tag is not closed properly. It is possible both of these came from copying into SO.
For good measure, it should look like:
<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "WebSite",
  "url": "http://www.oursite.com.au/",
  "potentialAction": {
    "@type": "SearchAction",
    "target": "http://www.oursite.com.au/search.aspx?keyword={search_term}",
    "query-input": "required name=search_term"
  }
}
</script>
After that, can you confirm the following:
1. http://www.oursite.com.au/search.aspx?keyword=test successfully performs a search on your site for the term "test", and, per the spec, that search sits on the same domain the markup is on?
2. Have you allowed your site enough time to be re-indexed after making these changes?

What is the best way to access Graphite data programmatically? [closed]

What is the best way to access data from Graphite render API?
https://graphite.readthedocs.org/en/latest/render_api.html#data-display-formats
Is there a JVM compatible client implementation?
Or there is a possibility to retrieve this data using some other API?
I do realise that the format is self-descriptive and it is not rocket science, but it would be great to reuse and contribute rather than write from scratch.
The render API, as you mentioned, supports the following format parameters in the API call:
&format=png
&format=raw
&format=csv
&format=json
&format=svg
For simple integrations, you can make straightforward curl calls like:
curl "http://graphite.com/render/?target=carbon.agents.host.creates&format=json"
The call would return:
[{
  "target": "carbon.agents.ip-10-0-0-111-a.creates",
  "datapoints": [
    [4.0, 1384870140],
    [1.0, 1384870200],
    [18.0, 1384870260],
    [0.0, 1384870320],
    [4.0, 1384870380],
    [12.0, 1384870440],
    [3.0, 1384870500],
    [7.0, 1384870560],
    [8.0, 1384870620],
    [null, 1384870680]
  ]
}]
Since it is this straightforward, it would be overkill to implement a dedicated client just for making curl calls. What the community has done instead is use these calls as fundamental building blocks for custom frontends, alerting and query scripts, Nagios plugins, and so on.
Is there something more specific that you are looking for?
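If you do want a tiny reusable helper, a Python sketch against the JSON format might look like this; the parsing assumes the [{"target": ..., "datapoints": [[value, timestamp], ...]}] shape shown above, and fetch_series with its base URL is illustrative:

```python
# Sketch: fetch a render target as JSON and work with its datapoints.
# Parsing assumes the [{"target": ..., "datapoints": [[value, ts], ...]}]
# shape shown in the sample response above.
import json
from urllib.request import urlopen

def fetch_series(base_url, target):
    """GET /render/?target=...&format=json and decode it."""
    with urlopen(f"{base_url}/render/?target={target}&format=json") as resp:
        return json.load(resp)

def non_null_values(series):
    """The metric values of one series, skipping null datapoints."""
    return [v for v, _ts in series["datapoints"] if v is not None]

# With a trimmed copy of the sample response above:
sample = [{"target": "carbon.agents.ip-10-0-0-111-a.creates",
           "datapoints": [[4.0, 1384870140], [1.0, 1384870200],
                          [None, 1384870680]]}]
print(sum(non_null_values(sample[0])))  # 5.0
```

Anything JVM-based can do the same with any HTTP client and JSON parser; there is nothing Graphite-specific beyond the URL and this response shape.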

Is there any API for drawing wikipedia pageview data [closed]

How can I get the daily pageviews of a Wikipedia page?
For example, I want the history of daily page views of this page: http://en.wikipedia.org/wiki/Programming
Is it possible?
There doesn't seem to be an API for it, but a website exists at stats.grok.se that processes the (very large) files from http://dammit.lt/wikistats/.
There is a new (December 2015) API here: https://wikimedia.org/api/rest_v1/?doc
For example, to get the number of views of http://en.wiktionary.org/beauty on 2 December 2015:
https://wikimedia.org/api/rest_v1/metrics/pageviews/per-article/en.wiktionary/all-access/all-agents/beauty/daily/20151202/20151203
Response:
{
  "items": [
    {
      "project": "en.wiktionary",
      "article": "beauty",
      "granularity": "daily",
      "timestamp": "2015120200",
      "access": "all-access",
      "agent": "all-agents",
      "views": 34
    }
  ]
}
You can also see top 1000 pages on a wiki (/metrics/pageviews/top/) and aggregate pageviews on a wiki (/metrics/pageviews/aggregate/).
There are (at least) two new initiatives to build an API around Wikipedia pageviews: https://www.mediawiki.org/wiki/Analytics/Hypercube and http://www.mediawiki.org/wiki/User:GWicke/Notes/Storage. Both are in the planning stage but feel free to chime in with specific use cases.
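Building the per-article URL is simple enough to script; here is a Python sketch following the path pattern above (the helper name is mine):

```python
# Sketch: assemble a per-article pageviews URL following the path pattern
# /metrics/pageviews/per-article/{project}/{access}/{agent}/{article}/{granularity}/{start}/{end}
from urllib.parse import quote

BASE = "https://wikimedia.org/api/rest_v1/metrics/pageviews/per-article"

def per_article_url(project, article, start, end,
                    access="all-access", agent="all-agents",
                    granularity="daily"):
    """Build the REST URL; the article title is percent-encoded."""
    return "/".join([BASE, project, access, agent,
                     quote(article, safe=""), granularity, start, end])

# Reproduces the example URL from the answer above:
print(per_article_url("en.wiktionary", "beauty", "20151202", "20151203"))
```

The response is plain JSON in the shape shown above, so summing the "views" fields over a date range gives you the daily history.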