PowerApps Data Source Changed to Stored Procedure

The data source in PowerApps gallery was a SQL View.
Search('[dbo].[vwCandidate]', textSearchCandidate.Text, "NameLast", "NameFirst", "MiscellaneousTags", "EmailAddress", "PhoneNumber")
The selected record populated a global variable for the form item.
Set(varCandidate, gallerySearchResults.Selected)
Everything worked as expected. Then I changed the data source to use a stored procedure, to move the search from PowerApps to SQL Server. After doing so I received the error message:
"Incompatible Type. We can't evaluate your formula because the context
variable types are incompatible with the types of the values in other
places in your app"
I cannot revert to the view that was working without getting the same error. I'm hoping my only option isn't to create a new variable and change every occurrence in the form/app; I'd like to avoid that if possible.
I cannot view the form, so I'm not sure how to debug properly. My hunch is that the date fields returned via Flow are causing the problem: they are 'smalldatetime' types in SQL, and Flow returns a string 'yyyy-mm-ddThh:mm:ss.000' even though 'date' is requested.
"PhoneNumber": {
"type": "string"
},
"CandidateStatus": {
"type": "string"
},
"DateApplied": {
"type": "string",
"format": "date"
},
The Flow JSON schema here does not seem to accept any of the other 'date' format types.
Are there any workarounds from Flow? Should I reformat the date values when I am setting the global variable? Advice?

It turns out I was on the right track in thinking that the DATE data type was coming from Flow as a string. Here's why:
A new record was created using a Patch function while setting the global variable:
Set(varCandidate, Patch('[dbo].[candidate]', Defaults('[dbo].[candidate]'), {DateApplied: DateTimeValue(Text(Now())), CreatedDate:DateTimeValue(Text(Now())), CreatedBy:varUser.Email}))
The "DateApplied" field was a "DATE" type in the SQL table and it was coming from Flow as a string "2019-03-13T17:40:52.000". The recordset from Flow was being set to the same global variable when I wanted to edit the record
Set(varCandidate, gallerySearchResults.Selected)
The error "Incompatible Type" (see question for full error message) was due to this field being a "Date Value" in a new record and a "string" in an edit record.
My fix is to remove this "Date" type fields from the patch and modify the Flow to retrieve the newly created record record by ID.

Reset everything back, including the data source, then save and close the app completely and re-test.
Remove any Flow connections, then save and close the app completely, re-test, then re-add the Flow connections.
I don't know why, but PowerApps sometimes persists data connection errors until you have closed the app down completely.
And just to confirm: PowerApps doesn't support stored procedures as data sources, only for writes, e.g. using the Patch function.

How do I insert into a user column in a SharePoint list using Graph API?

I am trying to create an item in a SharePoint list using the Microsoft Graph API. All the fields insert correctly, except that when I add a user column I get the following error:
"code": "generalException",
"message": "General exception while processing".
Based on my research, the user's LookupId is required to insert into a user column. My request body for the user column is as follows:
{
"fields": {
"[ColumnName]LookupId": "12"
}
}
If anybody could advise what I'm doing wrong, or whether I can insert using the user's email instead (which would be even better), I'd appreciate it.
Cheers.
Everything is good with your request, but this body will only work for lookup/user columns where the "Allow multiple selections" setting is false. I guess in your case it's true.
You can check it with the endpoint
GET https://graph.microsoft.com/v1.0/sites/{{SiteId}}/lists/{{ListName}}/contentTypes?expand=columns(select=name,type,personOrGroup)
where personOrGroup.allowMultipleSelection will show the flag.
For a user or lookup column where multiple selection is allowed, use the following body (and you may, of course, pass multiple values in the array):
{
"fields": {
"[columnName]LookupId#odata.type":"Collection(Edm.String)",
"[columnName]LookupId":["12"]
}
}
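For reference, a minimal TypeScript sketch of the full create-item call with such a multi-value body (the site and list IDs, the column name "AssignedTo", and the token handling are placeholders, not part of the original question):

// Sketch: create a list item whose multi-select person column is set via LookupId.
// siteId, listId, "AssignedTo", and the access token are placeholders/assumptions.
async function createItemWithMultiUserColumn(
  siteId: string,
  listId: string,
  accessToken: string
): Promise<void> {
  const response = await fetch(
    `https://graph.microsoft.com/v1.0/sites/${siteId}/lists/${listId}/items`,
    {
      method: "POST",
      headers: {
        Authorization: `Bearer ${accessToken}`,
        "Content-Type": "application/json",
      },
      body: JSON.stringify({
        fields: {
          Title: "Example item",
          // The @odata.type annotation marks the LookupId field as a collection,
          // which is what multi-select user/lookup columns expect.
          "AssignedToLookupId@odata.type": "Collection(Edm.String)",
          AssignedToLookupId: ["12"],
        },
      }),
    }
  );
  if (!response.ok) {
    throw new Error(`Graph request failed: ${response.status}`);
  }
}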
As for referring to user fields by email, I don't think it's possible with the Graph API, but you could check whether the SharePoint REST API v1 supports that.

Connectwise API missing property value gives no response

This question may very well be a general API question, but the API I am using is the Connectwise Tickets API.
I'm writing in VB. I'm getting my list of tickets and then, for each index i, setting the following variable:
currentTicket = tickets(i)
so that I can reference values like currentTicket.Source.Name and save the info to a DB.
As long as the Connectwise user put something into the "Source" field, everything works fine. I can reference that property and log it to the database or do whatever else I want with it. If they left it blank, though, even trying to look at currentTicket.Source.Name stops my program in its tracks. It doesn't crash or error out; it just doesn't get past the line of code referencing the empty field.
Since it won't even let me reference currentTicket.Source.Name, I am unable to check whether currentTicket.Source.Name = "" or Is Nothing.
What am I missing? Is there a way to check and see if the property even exists for a given API response?
Any help would be appreciated.
EDIT:
OK, so I decided to take the Ticket object, grab the raw JSON, and send it to the command-line output.
When a ticket has the source field filled in, this is what that section of the JSON looks like:
...
},
"servicelocation": {
"id":4,
"name":"remote",
"_info":{
"location_href":"https://API_url"
}
},
"source":{
"id":4,
"name":"Email Connector",
"_info": {
"source_href":"https://API_url"
}
},
"severity": "Medium",
"impact":"Medium",
...
When the source field in the application has been left blank, the JSON looks like this instead:
...
},
"servicelocation": {
"id":4,
"name":"remote",
"_info":{
"location_href":"https://API_url"
}
},
"severity": "Medium",
"impact":"Medium",
...
That behavior is probably normal; I'm obviously just missing the knowledge of how to deal with it. I feel like I should be able to test whether the property is non-existent the same way I test whether it's null or "", using Tickets(i).Source.Name, but I get no exceptions, no errors, no crashes; the program just sits there waiting for a response to "What's Source Name's value?"
I suppose I could parse the entire JSON response, create my own private object, assign values where they exist, and set my own property values so that mySourceName = "" when the field doesn't exist in the JSON response, but that seems like a lot of work when I only care about around 10 fields and it's a pretty large JSON response.
Is that the normal way of doing things with APIs?
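For illustration only, here is a TypeScript (not VB) sketch of the "parse the JSON myself and default anything missing" idea described above; the TicketSummary type and the chosen fields are assumptions, not part of the ConnectWise API:

// Illustrative sketch: map only the fields we care about, defaulting missing ones.
interface TicketSummary {
  sourceName: string;
  severity: string;
  impact: string;
}

function toTicketSummary(rawJson: string): TicketSummary {
  const ticket = JSON.parse(rawJson);
  return {
    // Optional chaining yields undefined instead of failing when "source" is absent,
    // and ?? supplies an empty-string default.
    sourceName: ticket.source?.name ?? "",
    severity: ticket.severity ?? "",
    impact: ticket.impact ?? "",
  };
}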

How to transform record value before update? - Update API expects a different format than fetch API

I have set up react-admin with Hasura (GraphQL API on Postgres) using the ra-data-hasura provider, and I ran into an error when trying to update a record in an existing table.
I have this field tags with type varchar[] which is delivered by the fetch API in the format: tags: ["A", "B"], but the problem is that the UPDATE API expects the format tags: "{A,B}".
Therefore all UPDATE requests will fail.
I already tried the parse() and format() functions on the InputField, but they are not changing the initial value of the record. The update will still fail if the tags field is left untouched.
This is the API's (Hasura) error message in response to the UPDATE request (I'm just putting it here so other people might find this post):
{
"path": "$.args.$set",
"error": "A string is expected for type : _varchar",
"code": "parse-failed"
}
Is there a way to transform the value for tags in react-admin without having to modify the API?
You can decorate the Data Provider, similar to this example:
https://marmelab.com/react-admin/DataProviders.html#decorating-your-data-provider-example-of-file-upload
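A minimal sketch of that idea in TypeScript, assuming an object-style data provider with an update method (the helper and function names here are made up for illustration):

// Sketch: wrap the base data provider so update() converts the tags array
// (["A", "B"]) into the "{A,B}" literal the Hasura UPDATE expects.
const toPgArrayLiteral = (tags: string[]): string => `{${tags.join(",")}}`;

export const withTagsTransform = (baseDataProvider: any) => ({
  ...baseDataProvider,
  update: (resource: string, params: any) => {
    if (Array.isArray(params?.data?.tags)) {
      return baseDataProvider.update(resource, {
        ...params,
        data: { ...params.data, tags: toPgArrayLiteral(params.data.tags) },
      });
    }
    return baseDataProvider.update(resource, params);
  },
});

Wrapping the provider keeps the transformation out of the form components, so parse()/format() are no longer needed for this field.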

JMeter: Creating Proper UPDATE Calls for API CRUD Testing

I'm performing API testing of basic CRUD functionality. For a record creation, I need to take the response, modify a field, and save the full thing off as a file so it can be recalled for an Update.
Here is what occurs for the creation.
CREATE POST Body
{
"id": 0,
"name": "apiTest: Code Rate ${__Random(1,10000000)}",
"deletable": false,
"codePeriods": null
}
CREATE RESPONSE Body
{
"name": "apiTest: Code Rate 869531",
"id": 1257745140,
"deletable": true,
"codePeriods": null,
"lastChangedDateTime": "03/01/2016 10:13:09",
"lastChangedTime": 36789410,
"createdUser": {
"id": 1003941890,
"userName": "N9SFBulkUser"
},
"lastChangedDate": 736024,
"lastChangedUser": {
"id": 1003941890,
"userName": "N9SFBulkUser"
},
"createdDateTime": "03/01/2016 10:13:09"
}
I need to change the "name" field in order to perform an UPDATE on the record.
As of now, I have:
a RegEx to extract the name field value and save it (newCodeRate)
a "Save Responses to a file" listener to save off the entire response (newCodeRateFile)
another HTTP Request to update the record where:
Body Data = ${__fileToString(${__eval(${newCodeRateFile})},,)}
As you can see, right now it's just taking the previous response, saving it to a file, and re-sending it. This is not a proper UPDATE, as the database sees nothing has changed and just ignores it. Sure, I get a 200 OK response, but it's misleading because nothing was actually updated. You can tell because the Creation and Update date/times still match.
I was thinking maybe I need a BSF PostProcessor where (using Javascript):
var data = prev.getResponseDataAsString();
var object = JSON.parse(data);
vars.put("name", object.name);
But not being a developer by trade, I'm not sure what to do with this or how to save the new name value into the saved, recallable file.
I don't think you have the JSON object in BSF JavaScript; it is not part of Rhino.
I don't think you need to store the response in a file and read it back; you can do it all in memory.
So:
Change your __Random function to store the generated value in a JMeter variable, like:
${__Random(1,10000000,randomNumber)}
Add a Regular Expression Extractor as a child of the CREATE request and configure it as follows:
Reference Name: anything meaningful, i.e. body
Regular Expression: (?s)(^.*)
Template: $1$
Add a __BeanShell function as the UPDATE request body; it should look like:
${__BeanShell(return vars.get("body").replaceAll(vars.get("randomNumber")\,"${__Random(1,10000000)}");,)}
See the How to Use JMeter Functions series of posts for more comprehensive information on JMeter functions.

Creating Collection in Azure Search Service using Indexer

I am using an indexer to sync data from my SQL database to Azure Search. I have a field in my SQL view which contains XML data; the column contains a list of strings. The corresponding field in my Azure Search index is a Collection(Edm.String).
On checking some documentation, I found that the indexer does not convert XML (SQL) to a Collection (Azure Search).
Is there any workaround for creating the Collection from the XML data?
P.S. I am extracting the data from a view, so I can change the XML to JSON if needed.
UPDATE on October 17, 2016: Azure Search now automatically converts a string coming from a database to a Collection(Edm.String) field if the data represents a JSON string array: for example, ["blue", "white", "red"]
Old response: great timing, we just added a new "field mappings" feature that allows you to do this. This feature will be deployed sometime early next week. I will post a comment on this thread when this is rolled out in all datacenters.
To use it, you indeed need to use JSON. Make sure your source column contains a JSON array, for example ["hello", "world"]. Then, update your indexer definition to contain the new fieldMappings property:
"fieldMappings" : [ { "sourceFieldName" : "YOUR_SOURCE_FIELD", "targetFieldName" : "YOUR_TARGET_FIELD", "mappingFunction" : { "name" : "jsonArrayToStringCollection" } } ]
NOTE: You'll need to use API version 2015-02-28-Preview to add fieldMappings.
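As a rough TypeScript sketch (the service name, indexer/data source/index names, field names, and admin key below are all placeholders), updating an existing indexer definition with the mapping via the REST API could look like:

// Sketch: PUT the indexer definition including the new fieldMappings property.
async function addJsonArrayFieldMapping(adminKey: string): Promise<void> {
  const serviceUrl = "https://YOUR_SERVICE.search.windows.net";
  const indexerName = "YOUR_INDEXER";

  const indexerDefinition = {
    name: indexerName,
    dataSourceName: "YOUR_DATASOURCE",
    targetIndexName: "YOUR_INDEX",
    fieldMappings: [
      {
        sourceFieldName: "YOUR_SOURCE_FIELD", // column holding a JSON array, e.g. ["blue", "white", "red"]
        targetFieldName: "YOUR_TARGET_FIELD", // Collection(Edm.String) field in the index
        mappingFunction: { name: "jsonArrayToStringCollection" },
      },
    ],
  };

  const response = await fetch(
    `${serviceUrl}/indexers/${indexerName}?api-version=2015-02-28-Preview`,
    {
      method: "PUT",
      headers: { "Content-Type": "application/json", "api-key": adminKey },
      body: JSON.stringify(indexerDefinition),
    }
  );
  if (!response.ok) {
    throw new Error(`Indexer update failed: ${response.status}`);
  }
}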
HTH,
Eugene