While migrating from Jira to Azure DevOps using Solidify, we get a "ChangedBy value cannot be empty when BypassRules is specified" error

While migrating work items from Jira to Azure DevOps (ADO), we get the error below during import into Azure DevOps:
"ChangedBy value cannot be empty when BypassRules is specified"
Is there any project setting I should change to avoid this error?
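For context, this error comes from the Azure DevOps work item REST API: when a request is sent with bypassRules=true (which migration tools use to preserve original dates and authors), the JSON patch document must also carry a non-empty System.ChangedBy. A minimal sketch of such a request in Python, with a hypothetical organization, project, and personal access token:

import base64
import requests

# Hypothetical values - replace with your own organization, project and PAT.
ORG = "myorg"
PROJECT = "MyProject"
PAT = "<personal-access-token>"

url = (f"https://dev.azure.com/{ORG}/{PROJECT}/_apis/wit/workitems/$Bug"
       "?bypassRules=true&api-version=7.0")
auth = base64.b64encode(f":{PAT}".encode()).decode()

# JSON patch document: with bypassRules=true, System.ChangedBy must not be empty.
patch = [
    {"op": "add", "path": "/fields/System.Title", "value": "Imported from Jira"},
    {"op": "add", "path": "/fields/System.ChangedBy", "value": "jira.user@example.com"},
    {"op": "add", "path": "/fields/System.CreatedBy", "value": "jira.user@example.com"},
]

resp = requests.post(url, json=patch, headers={
    "Content-Type": "application/json-patch+json",
    "Authorization": f"Basic {auth}",
})
resp.raise_for_status()
print(resp.json()["id"])

In practice the error usually means the migrator could not map a Jira author to an Azure DevOps identity, so the changed-by value it sends ends up empty; reviewing the user mapping configuration is a more likely fix than any project setting.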

Related

Error running Synapse pipeline

How can I resolve the error below? I don't understand, because it worked before.
The Azure Synapse Workspace DB connector is currently in public preview.
I also tried to reproduce this in my environment and got the same error: with different source types, any pipeline using the Workspace DB sink type fails in the same way.
When I tried another sink type it worked successfully.
The issue may be with the Workspace DB connector sink type itself.
You can use a different sink type, or you can raise a support ticket with Microsoft.

Getting error "Login failed for user <token-identified principal>" for a Data Factory linked service to Synapse Analytics

I am trying to create an Azure Data Factory linked service to Synapse Analytics with a system-assigned managed identity, but I am getting this error:
Error 22300: Cannot connect to SQL Database: 'xxxxsql.azuresynapse.net', Database: xxxx, User: ''. Check the linked service configuration is correct, and make sure the SQL Database firewall allows the integration runtime to access. Login failed for user '<token-identified principal>'.
How do I solve this error?
I tried to reproduce the same thing in my environment and got the same error.
To resolve the above error, please follow these steps:
Go to the Azure Synapse workspace -> Azure Active Directory -> Set admin -> search for your Azure Data Factory, set it as admin, and save.
You can refer to this similar SO thread.
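Making the factory's managed identity the Active Directory admin works, but a narrower alternative is to create a contained database user for the managed identity in the dedicated SQL pool and grant it only the roles it needs. A rough sketch, assuming a hypothetical workspace "xxxx", pool "mypool", and a Data Factory called "my-adf", run with pyodbc while signed in as the Azure AD admin:

import pyodbc

# Hypothetical names: workspace "xxxx", dedicated SQL pool "mypool",
# and a Data Factory named "my-adf" (its managed identity has the same name).
conn = pyodbc.connect(
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=xxxx.sql.azuresynapse.net;"
    "Database=mypool;"
    "Authentication=ActiveDirectoryInteractive;",  # sign in as the Azure AD admin
    autocommit=True,
)
cursor = conn.cursor()

# Contained user for the factory's system-assigned managed identity
cursor.execute("CREATE USER [my-adf] FROM EXTERNAL PROVIDER;")

# Grant only what the linked service needs instead of full admin rights
cursor.execute("EXEC sp_addrolemember 'db_datareader', 'my-adf';")
cursor.execute("EXEC sp_addrolemember 'db_datawriter', 'my-adf';")

Also make sure the Synapse firewall allows Azure services or the integration runtime's IP addresses, since the same error message is raised when the connection is blocked at the firewall.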

Azure Synapse Analytics - exception running a Data Flow

Using the preview of Synapse Analytics Workspace and the related Synapse Studio, I have created a Data Flow that simply loads a Parquet file from a Data Lake Gen2 store into a table inside a SQL pool. Running a pipeline that contains only this Data Flow, I got the error
Livy Id=[0] Job failed during run time with state=[dead].
In Synapse Studio, looking into Monitor -> Apache Spark applications, I found the driver stderr log for the failed Spark application. It contained a row stating
ERROR Dataflow AppManager: name=AppManager.main, opId=AppManager fail, unexpected:java.lang.NoSuchMethodError: com.microsoft.azure.kusto.ingest.IngestionProperties.setJsonMappingName(Ljava/lang/String;)V, message=adfadf
Has anyone ever seen such an error?

Azure DevOps work item migration from one account to another

We have been trying to migrate work items from one Azure DevOps account to our enterprise Azure DevOps account using the vsts-work-item-migrator described below, but the "Discussion" field information is not getting migrated. Is this expected behavior, or are we missing something here?
https://mohamedradwan.com/2018/04/12/migrating-tfs-and-vsts-work-items-using-vsts-work-item-migrator/
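For what it's worth, the Discussion section is backed by work item comments (surfaced through System.History), which some migrators skip or handle separately. If they are missing after migration, they can be copied with the work item Comments REST API; below is a rough sketch with hypothetical organizations, projects, and PATs:

import base64
import requests

# Hypothetical values - substitute your own organizations, projects and PATs.
SRC = ("source-org", "SourceProject", "<source-pat>")
DST = ("target-org", "TargetProject", "<target-pat>")

def auth_header(pat):
    token = base64.b64encode(f":{pat}".encode()).decode()
    return {"Authorization": f"Basic {token}"}

def comments_url(org, project, work_item_id):
    return (f"https://dev.azure.com/{org}/{project}/_apis/wit/workItems/"
            f"{work_item_id}/comments?api-version=7.0-preview.3")

def get_comments(org, project, pat, work_item_id):
    # Work item comments are the data behind the Discussion section
    resp = requests.get(comments_url(org, project, work_item_id),
                        headers=auth_header(pat))
    resp.raise_for_status()
    return [c["text"] for c in resp.json().get("comments", [])]

def add_comment(org, project, pat, work_item_id, text):
    resp = requests.post(comments_url(org, project, work_item_id),
                         json={"text": text},
                         headers={**auth_header(pat), "Content-Type": "application/json"})
    resp.raise_for_status()

# Copy the discussion of source work item 123 onto the migrated work item 456
for comment in get_comments(*SRC, 123):
    add_comment(*DST, 456, comment)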

Azure U-SQL continuous deployment using the VSTS PowerShell task

I'm building CI/CD for my Azure Data Lake Analytics U-SQL code and am facing the below error while deploying my release using the VSTS PowerShell task.
"Access from 'example-app1' is denied. Please grant the user with necessary roles on Azure portal. Trace: 03e7229d-e7ca-43d5-a7be-6e0a3a3b9317"
I created an Azure AD application following this link - https://learn.microsoft.com/en-us/azure/azure-resource-manager/resource-group-create-service-principal-portal - and created a service endpoint for it. I also gave this application (example-app1) access to the path in my Azure Data Lake Analytics store. Below is my ADLA U-SQL code:
#searchlog =
EXTRACT UserId int,
Start DateTime,
Region string,
Query string,
Duration int?,
Urls string,
ClickedUrls string
FROM "adl://adlacicd.azuredatalakestore.net/Samples/data/SearchLog.tsv"
USING Extractors.Tsv();
OUTPUT #searchlog
TO "adl://adlacicd.azuredatalakestore.net/Samples/data/output/SearchLog-first-u-sql.csv"
USING Outputters.Csv();
Any help in resolving this issue would be great.
I also tried using the relative path instead: /Samples/data/SearchLog.tsv.
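One thing the error message points at: besides ACLs on the store path, the service principal needs a role on the Data Lake Analytics account itself before it can submit U-SQL jobs. A hedged sketch of granting that via the Azure CLI from Python, with hypothetical subscription, resource group, and object-id placeholders (the account name adlacicd comes from the script above):

import subprocess

# Hypothetical placeholders for the service principal and the ADLA account scope.
SP_OBJECT_ID = "<service-principal-object-id>"
ADLA_SCOPE = ("/subscriptions/<sub-id>/resourceGroups/<rg>/providers/"
              "Microsoft.DataLakeAnalytics/accounts/adlacicd")

# Let the service principal submit and manage U-SQL jobs on the ADLA account
subprocess.run(
    ["az", "role", "assignment", "create",
     "--assignee", SP_OBJECT_ID,
     "--role", "Data Lake Analytics Developer",
     "--scope", ADLA_SCOPE],
    check=True,
)

# Read/execute ACLs on the input folder in the Data Lake Store (Gen1)
subprocess.run(
    ["az", "dls", "fs", "access", "set-entry",
     "--account", "adlacicd",
     "--path", "/Samples/data",
     "--acl-spec", f"user:{SP_OBJECT_ID}:r-x"],
    check=True,
)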