How to migrate an existing V1 JSON Data Factory template to V2 and use the MS migration tool - azure-data-factory-2

I have a requirement to migrate Azure Data Factory V1 jobs to V2. I tried the existing migration tool from MS, but it isn't working and gives an error. I need help and advice on how to achieve the migration and what the best option or approach would be. Also, is there any way to validate the JSON template before deployment?
I tried the MS V1-to-V2 tool; it gives an error saying it can't convert. I also tried connecting the tool directly to the portal, with the same error. Is there a PowerShell script or another method by which I can split the JSON, convert it to the V2 format, and validate it quickly?
Please suggest the best practice or method to follow. Any existing case study or reference describing this activity would be really helpful.

Based on my research and this thread, it seems that only the V1-to-V2 migration tool is available so far.
See the documentation for more on the differences between V1 and V2; ADF V2 offers richer control flow, authoring, and monitoring capabilities.
If you run into an error while using the tool, it allows you to submit feedback directly within the tool itself.
For more details, please refer to this blog.
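As a quick, tool-independent pre-check before deployment, you can at least verify that each converted pipeline file is well-formed JSON and carries the keys a V2 pipeline definition is expected to have. The sketch below checks only a minimal subset of the schema (the key names shown are illustrative of the V2 pipeline shape, not the full schema); it is not a substitute for a real deployment validation:

```python
import json

# Minimal structural expectations for an ADF V2 pipeline definition.
# The real schema is much richer; this is only a quick pre-check.
REQUIRED_PIPELINE_KEYS = {"name", "properties"}
REQUIRED_PROPERTY_KEYS = {"activities"}

def validate_pipeline_json(text):
    """Return a list of problems found in a pipeline JSON document;
    an empty list means the basic structure looks OK."""
    try:
        doc = json.loads(text)
    except json.JSONDecodeError as exc:
        return [f"invalid JSON: {exc}"]
    problems = []
    missing = REQUIRED_PIPELINE_KEYS - doc.keys()
    if missing:
        problems.append(f"missing top-level keys: {sorted(missing)}")
    props = doc.get("properties", {})
    if isinstance(props, dict):
        missing_props = REQUIRED_PROPERTY_KEYS - props.keys()
        if missing_props:
            problems.append(f"missing properties keys: {sorted(missing_props)}")
    return problems

sample = '{"name": "CopyPipeline", "properties": {"activities": []}}'
print(validate_pipeline_json(sample))
```

Running this over every converted file catches truncated or mangled output from the conversion step before you attempt a deployment.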

Related

How to rename a Linked Service in ADF

I am new to ADF and need some help, please.
I created a Linked Service for a SQL database. I now need to make this dynamic (I know how to do this). I also want to rename the Linked Service to reflect its dynamic nature, but I cannot find a way to do this.
Can someone help, please? There are not many hits in a Google search.
Thank you
I'm pretty sure you can't rename a Linked Service after it is published. If you want to create a Linked Service dynamically, I would suggest building a small script using the available SDKs, like the ArtifactsClient from azure-synapse-artifacts for Python if you're running in Synapse Analytics. You might then create a Linked Service for each run and tear it down after your pipelines have run. There should be an SDK for this in "regular" Data Factory as well.
EDIT: I just noticed that there is a function for renaming a Linked Service through the mentioned API. See the documentation here.
If your ADF is linked to Git, you can follow these simple steps to rename your Linked Service:
Clone your ADF repository locally
Create a new branch (optional)
Open the ADF code in any editor and search-and-replace the Linked Service name in all files (I use VS Code)
Rename the Linked Service file to the new name (if this is not done, ADF will report an issue)
Commit and push your changes
Verify in ADF (don't forget to switch branches)
Create a pull request to merge your changes into your main branch
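The search-and-replace and file-rename steps above can be sketched as a small Python helper. This is only a sketch: it assumes the Linked Service name is distinctive enough not to collide with unrelated strings in the repo, and that the definition file is named after the service, so review the git diff before committing:

```python
import os

def rename_linked_service(repo_dir, old_name, new_name):
    """Replace every occurrence of old_name in the JSON files of a locally
    cloned ADF repo, and rename the definition file itself so the file name
    matches the new Linked Service name."""
    for root, _dirs, files in os.walk(repo_dir):
        for fname in files:
            if not fname.endswith(".json"):
                continue
            path = os.path.join(root, fname)
            with open(path, encoding="utf-8") as fh:
                content = fh.read()
            if old_name in content:
                with open(path, "w", encoding="utf-8") as fh:
                    fh.write(content.replace(old_name, new_name))
            # The definition file must carry the new name as well,
            # otherwise ADF flags a mismatch after publishing.
            if fname == f"{old_name}.json":
                os.rename(path, os.path.join(root, f"{new_name}.json"))
```

After running it, commit and push as usual and verify the rename in the ADF authoring UI.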

How to migrate ArcGis Online data to external database?

I am using ArcGIS Online. I want to migrate my feature layers (data stored in Esri's internal database) to an external database.
Can anyone help me figure out how to do this? Also, how can I use an external database in my application (PostgreSQL/Neo4j)? What I need is to host my own database server, such as Neo4j, and use it as a replacement for the feature tables provided by ArcGIS.
Thanks,
Tarni
You have many options for downloading data from ArcGIS Online.
If you do not have many feature services, the easiest way would be to go to "My Content", then find your feature service. You should see an "Export Data" option towards the top-right.
If you have multiple feature services, you could repeat the directions above, but choose "Open in ArcGIS Desktop". This will download a file that will set up the connection in ArcMap for you. You can also hit the services directly from ArcGIS Desktop by going to "My Hosted Services" in ArcCatalog and logging in.
Another option is to use a Python script. This may be best if you have a lot of data, or if the data is updated frequently. Check out https://github.com/tedrick/SyncSurvey for an example of repeatedly retrieving data from a feature service.
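If you go the scripted route, the core of the job is flattening the JSON that a feature service's `query` endpoint returns into rows you can load into your own database. The sketch below assumes the standard Esri response shape (a `features` array with `attributes` and, for point layers, an `x`/`y` `geometry`); the actual HTTP request and the PostgreSQL/Neo4j insert step are left out:

```python
def features_to_rows(esri_json):
    """Flatten an Esri feature-service query response into
    (attributes, x, y) tuples that can be bulk-inserted into
    PostgreSQL or mapped to Neo4j nodes."""
    rows = []
    for feature in esri_json.get("features", []):
        attrs = dict(feature.get("attributes", {}))
        geom = feature.get("geometry") or {}
        rows.append((attrs, geom.get("x"), geom.get("y")))
    return rows

# Shape of a typical point-layer query response (trimmed down).
sample = {
    "features": [
        {"attributes": {"OBJECTID": 1, "name": "Well A"},
         "geometry": {"x": -122.4, "y": 37.8}},
    ]
}
print(features_to_rows(sample))
```

From there, each tuple can become an `INSERT` with a PostGIS point, or a `CREATE (:Feature {...})` statement in Neo4j, depending on which store you choose.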

Mule ESB non production version

Do we have a non-production version of MuleSoft? Basically, we are performing a PoC on AWS and are using MuleSoft as middleware. Is there a developer version of MuleSoft which we can deploy?
ssanrao helped. There is a Community Edition of Mule which can be used for development purposes. Thanks, ssanrao, for the quick help.
In the Community Edition you won't get access to two important features: DataWeave and batch jobs.
DataWeave is the most powerful transformer, with which you can easily transform very complex data types into your desired type without any hassle. Without it, you won't get the full flavor of Mule.

How to perform data validation on data from source files into CRM Dynamics 2013

Is it possible to perform data validation on fields from external source files (.csv, .xml, .txt) using the out-of-the-box CRM Import Wizard? Or do I need to perform validation using the CRM Web Services SDK, say in a plugin? Please advise on the best approach; I would really appreciate it.
The best place to do data validation is in plugins, i.e., in the pre-validation and pre-operation stages of the create and update messages.

How to manipulate SQL Azure database through pure REST calls

I am an iPhone developer. For one of my clients I am supposed to access their database stored in SQL Azure.
I know that there is an Objective-C SDK. I have downloaded it and ran the Netflix example successfully. But of course my client's account is password protected. Also, as of now the Objective-C SDK seems to provide only read support, but I will need to write to the database too. So I guess I will have to use REST-based calls to update the database.
My problem is that I cannot figure out what the URL of the REST services for the SQL Azure database would be and how the authentication would work. I tried searching the net, but all the examples seem to show how to connect through .NET, Java, or PHP (and other supported languages). Nobody seems to talk about pure REST calls.
I can successfully connect to the database using following command:
sqlcmd -UUsername#Server -PPassword -Stcp:server.database.windows.net -dDBName
If that is the connection command, can any of the gurus out there help me figure out what the URLs should be to access this DB through pure REST calls, and how the authentication would take place?
Any help is greatly appreciated. Thanks in advance.
Pritam.
What you need is the OData interface to SQL Azure. Currently, SQL Azure only supports the TDS protocol, which requires a client library. However, if you put an OData interface in front of SQL Azure, you can call it via REST.
More information
http://www.odata.org/blog/got-sql-azure-then-youve-got-odata/
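To give a feel for what a pure-REST call against such an OData front end looks like, here is a sketch that just builds the query URL. The base URL and entity set name are hypothetical (they depend on how the OData layer is deployed in front of your database), and authentication is whatever that layer is configured with, so it is omitted here:

```python
from urllib.parse import urlencode

# Hypothetical OData endpoint sitting in front of SQL Azure; the real
# base URL comes from however the OData service was deployed.
BASE_URL = "https://example.odata-service.net/v1"

def build_odata_query(entity_set, select=None, filter_expr=None, top=None):
    """Build an OData GET URL. Issuing it with any HTTP client
    (with an Accept: application/json header) returns the matching
    rows as JSON entries."""
    params = {}
    if select:
        params["$select"] = ",".join(select)
    if filter_expr:
        params["$filter"] = filter_expr
    if top is not None:
        params["$top"] = str(top)
    query = f"?{urlencode(params)}" if params else ""
    return f"{BASE_URL}/{entity_set}{query}"

print(build_odata_query("Customers", select=["Id", "Name"], top=10))
```

The same URL scheme works from Objective-C with an ordinary NSURLRequest, which is the point of putting OData in front: no TDS library is needed on the client.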