Workday Incremental Extract - API

I am using Informatica Intelligent Cloud Services (IICS) to retrieve contracts information (Get_Customer_Contracts). Is it possible to incrementally extract data from Workday using the API? For example: contracts which have been created or updated in the last day.
Any help is appreciated. Thank you.

As per what I see in the API documentation, there is only a way to get contracts between two effective dates and no way to extract based on creation date.
API Documentation : https://community.workday.com/sites/default/files/file-hosting/productionapi/Revenue_Management/v39.0/samples/Get_Customer_Contracts_Request.xml
Unfortunately, if you must extract based on creation date, you need to use a RaaS (Report as a Service) report.
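As a rough illustration of the RaaS route: a custom report is exposed at a `customreport2` URL and takes whatever prompts you defined on it. The host, tenant, report name, and the `Created_On_Date` prompt below are all hypothetical — substitute the values from your own tenant and report:

```python
from urllib.parse import urlencode

def build_raas_url(host, tenant, owner, report, params):
    """Build a Workday RaaS (Report as a Service) URL.

    host/tenant/owner/report and the prompt names in `params` are
    placeholders -- substitute the values from your own tenant/report.
    """
    base = f"https://{host}/ccx/service/customreport2/{tenant}/{owner}/{report}"
    return f"{base}?{urlencode(params)}"

# Hypothetical report with a 'Created_On_Date' prompt, returning JSON
url = build_raas_url(
    "wd2-impl-services1.workday.com",
    "acme_tenant",
    "integration_user",
    "Customer_Contracts_Created",
    {"Created_On_Date": "2023-06-01", "format": "json"},
)
# In real use, fetch it with e.g.
# requests.get(url, auth=(isu_user, isu_password))
```

IICS can then consume that URL as a REST source on a daily schedule.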

Related

How to insert a new record with an updated value from a staging table in Azure Data Explorer

I have a requirement where data is ingested from the Azure IoT Hub. Sample incoming data:
{
"message":{
"deviceId": "abc-123",
"timestamp": "2022-05-08T00:00:00+00:00",
"kWh": 234.2
}
}
I have the same column mapping in the Azure Data Explorer table. kWh always comes as a cumulative value, not the delta between two timestamps. Now I need another table containing the difference between the last inserted kWh value and the current kWh.
It would be a great help if anyone has a suggestion or solution here.
I'm able to calculate the difference on the fly using prev(), but I need to update the table while inserting the data.
As far as I know, there is no way to perform data manipulation on the fly when injecting Azure IoT data into Azure Data Explorer through JSON mapping. However, I found a couple of approaches you can take to get the calculations you need. Both approaches involve creating a secondary table to store the calculated data.
Approach 1
This is the closest approach I found to on-the-fly data manipulation. For this to work, you need to create a function that calculates the difference of the kWh field for the latest entry. Once you have the function created, you can bind it to the secondary (target) table using an update policy and make it trigger for every new entry on your source table.
Refer to the following resource, Ingest JSON records, which explains with an example how to create a function and bind it to the target table.
Note that you would have to create your own custom function that calculates the difference in kWh.
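A minimal sketch of what that could look like, using the jsondata/jsondiffdata table names from the scenario below (the column names and function name are assumptions). One caveat: an update policy query only sees the rows of the batch currently being ingested, so prev() cannot reach back to rows stored in earlier batches — for strict correctness you may need to join against the target table's last stored value instead:

```kusto
// Hypothetical source table: jsondata(deviceId: string, timestamp: datetime, kWh: real)
.create-or-alter function KwhDelta() {
    jsondata
    | serialize
    | extend kWhDelta = kWh - prev(kWh, 1)
    | project deviceId, timestamp, kWhDelta
}

// Bind the function to the target table so it runs on every ingestion batch
.alter table jsondiffdata policy update
@'[{"IsEnabled": true, "Source": "jsondata", "Query": "KwhDelta()", "IsTransactional": false}]'
```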
Approach 2
If you do not need real-time data manipulation and your business can tolerate a 1-minute delay, you can create a query similar to the one below, which calculates the temperature difference from the source table (jsondata in my scenario) and writes it to the target table (jsondiffdata):
.set-or-append jsondiffdata <| jsondata | serialize
| extend temperature = temperature - prev(temperature,1), humidity, timesent
Refer to the following resource for more information on how to Ingest from query. You can use Microsoft Power Automate to schedule this query to trigger every minute.
Please be cautious if you decide to go with the second approach, as it uses serialization, which might prevent query parallelism in many scenarios. Please review this resource on window functions and identify a query approach that is better optimized for your business needs.

Incident SLA through REST API

Is there a way to extract SLAs via a ServiceNow REST API call?
I need to extract from ServiceNow all the SLAs related to my incident. Is there a way to do that?
SLA data is stored in the task_sla table. You can use the Table API: execute a GET on the task_sla table with sysparm_query=task=xxx, where xxx is the sys_id of your incident.
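Sketched out, that call looks like the following. The instance name and sys_id are placeholders; the table name and sysparm_query parameter are standard Table API usage:

```python
def task_sla_url(instance, incident_sys_id):
    """URL for all task_sla records attached to a given incident.

    `instance` is your ServiceNow instance name (placeholder here);
    task_sla and sysparm_query are standard Table API elements.
    """
    return (f"https://{instance}.service-now.com/api/now/table/task_sla"
            f"?sysparm_query=task={incident_sys_id}")

url = task_sla_url("dev12345", "9d385017c611228701d22104cc95c371")
# In real use (e.g. with the requests library):
# resp = requests.get(url, auth=("user", "pass"),
#                     headers={"Accept": "application/json"})
# slas = resp.json()["result"]
```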

How can I set a date range for requestReport using Amazon Advertising API?

Is it possible to set a date range for requestReport?
POST /v2/sp/{recordType}/report
{
"segment": {segment},
"reportDate": {reportDate}, <-- here
"metrics": {metrics}
}
Or do I need to make 30 requests in order to get results for a month? Maybe snapshots can help?
Yes, you have to do multiple requests (one for each day) and merge them. The Advertising API currently does not offer an option to request data for a date range.
Snapshots may help you get data that has no click data for the requested date(s).
Yes, you do. What I do is run it daily via a cron job, load the data into a database, and query the database. Keep in mind certain metrics, like sales, don't finalize until after the attribution window, e.g. 14 days.
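Since the v2 report endpoint takes a single reportDate (YYYYMMDD), a month's worth of data means one request per day. A small helper to enumerate those dates — the endpoint path and field names come from the question above; everything else is a sketch:

```python
import calendar
from datetime import date

def report_dates_for_month(year, month):
    """Return every reportDate value (YYYYMMDD) in the given month."""
    days_in_month = calendar.monthrange(year, month)[1]
    return [date(year, month, d).strftime("%Y%m%d")
            for d in range(1, days_in_month + 1)]

dates = report_dates_for_month(2023, 6)
# One POST /v2/sp/{recordType}/report per entry, then merge the results:
# for rd in dates:
#     body = {"segment": segment, "reportDate": rd, "metrics": metrics}
#     ... request the report, poll until it's ready, download, append ...
```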

Data Lake Analytics: Custom Outputter to write to different files?

I am trying to write a custom outputter for U-SQL that writes rows to individual files based on the data in one column.
For example, if the column has the date "2016-01-01", it writes that row to a file with that name, and the next row to a file named after that row's value in the same column.
I am aiming to do this by using the Data Lake Store SDK within the outputter, which creates a client and uses the SDK functions to write to individual files.
Is this a viable and possible solution?
I have seen that the function to override for outputters is
public override void Output (IRow row, IUnstructuredWriter output)
in which the IUnstructuredWriter is cast to a StreamWriter (I saw one such example). So I assume this IUnstructuredWriter is passed to the function by the U-SQL script, which leaves me no control over what is passed here; it also remains constant for all rows and can't change.
This is currently not possible but we are working on this functionality in reply to this frequent customer request. For now, please add your vote to the request here: https://feedback.azure.com/forums/327234-data-lake/suggestions/10550388-support-dynamic-output-file-names-in-adla
UPDATE (Spring 2018): This feature is now in private preview. Please contact us via email (usql at microsoft dot com) if you want to try it out.

In QuickBooks API, retrieving an aggregate view of TransactionQueryRq

I am using the QuickBooks API to request transaction data from a QB database. However, due to the number of transactions, it takes a long time for the request to come back. Is there a way of requesting a summarized view in the XML, i.e. instead of getting data at the TxnID level, can I get it to just aggregate the 'amount' by account?
Thanks in advance
Is there a way of requesting a summarized view in the XML, i.e. instead of getting data at the TxnID level, can I get it to just aggregate the 'amount' by account?
Using a TransactionQueryRq? No.
If you're trying to get summary data, you should look into the reporting features of the SDK/qbXML instead -- they are likely closer to what you need.
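For instance, a qbXML GeneralSummaryReportQueryRq returns amounts already rolled up by account rather than per-transaction detail. The report type and dates below are placeholders — this is a sketch, not a drop-in request:

```xml
<?xml version="1.0"?>
<?qbxml version="13.0"?>
<QBXML>
  <QBXMLMsgsRq onError="stopOnError">
    <GeneralSummaryReportQueryRq>
      <!-- Returns totals by account instead of individual TxnIDs -->
      <GeneralSummaryReportType>ProfitAndLossStandard</GeneralSummaryReportType>
      <ReportPeriod>
        <FromReportDate>2023-01-01</FromReportDate>
        <ToReportDate>2023-01-31</ToReportDate>
      </ReportPeriod>
    </GeneralSummaryReportQueryRq>
  </QBXMLMsgsRq>
</QBXML>
```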