SubmitForm then Patch results in "The data returned by the service was invalid" - azure-sql-database

I'm building a PowerApps app on Azure SQL.
The requirement
I have a form which has "Save" and "Commit" buttons.
Both buttons should save the form data. The Commit button should also set the database column "Confirmed" to 1.
I've read at length about how I can programmatically override the update value of a hidden control for this, but I'm not satisfied with the level of complexity (and maintenance) required to get it working, i.e.:
Populate a variable with the current db value
In the button code set the variable value
In the form field, set the update property to the variable
What I'm Trying
So I'm trying a different approach: SubmitForm then Patch. Even though this requires an extra database call, I'd like to understand whether it will work. This is the code for OnSelect on the Commit button:
// Save the record
SubmitForm(frmEdit);
// Update confirmed to 1
Patch('[dbo].[Comments]',cRecord,{Confirmed:1});
Some Complexities
Note that my record is a variable, cRecord. In short, I want this app to be able to upsert based on URL parameters.
This is my App.OnStart, which captures the URL values and inserts a record if required. Either way, the result is that cRecord is set to the record to be edited.
// Cache employees and store lookups (as they are in a different db)
Concurrent(Collect(cEmployees, Filter('[dbo].[SalesPerson]', Status = "A")),Collect(cStores, '[dbo].[Store]'));
// Check for parameters in the URL. If found, set to Edit/Add mode
Set(bURLRecord,If((!IsBlank(Param("PersonId")) && !IsBlank(Param("Date"))),true,false));
// If URL Parameters were passed, create the record if it doesn't exist
If(bURLRecord,
    Set(pPersonId, Value(Param("PersonId")));
    Set(pDate, DateValue(Param("Date")));
    // Try and find the record
    Set(cRecord, LookUp('[dbo].[Comments]', SalesPersonId = pPersonId && TransactionDate = pDate));
    If(IsBlank(cRecord),
        // If the record doesn't exist, create it with Patch and capture the resulting record
        Set(cRecord, Patch('[dbo].[Comments]', Defaults('[dbo].[Comments]'), {SalesPersonId: pPersonId, TransactionDate: pDate}))
    );
    // Navigate to the data entry screen. This screen uses cRecord as its item
    Navigate(scrEdit);
)
frmEdit.Item is set to cRecord. As an aside, I also have a gallery that sets this variable when an item is clicked, so we can also navigate here from a gallery.
Navigating using new and existing URL parameters works, and navigating from the gallery works.
The problem
When I press the Commit button against a record which has Confirmed=0 I get this popup error:
The data returned by the service is invalid
When I run this code against a record which already has Confirmed=1, I don't get an error.
If I run the PowerApps Monitor it doesn't show any errors, but it does show some counts being run after the update. I can paste the trace here if required.
I also tried wrapping the Patch in a Set, in case its result was confusing the button event, but it didn't make a difference.
What I want
So can anyone offer me any of the following info:
How can I get more info about "The data returned by the service is invalid"?
How can I get this to run without erroring?
Is there a simpler way to do the initial upsert? I was hoping a function called Patch could upsert in one call, but it seems it can't.
With regard to the set-the-field-beforehand approach, I'm happy to try this again, but I had some issues implementing it, namely understanding which control to edit and where.
Any assistance appreciated.
Edit
As per the recommendation in the answer, I moved the Patch code into OnSuccess:
If(Confirmed=1,Patch('[dbo].[CoachingComments]',cRecord,{Confirmed:1}));
But now I get the same error there. Worse, I cleared out OnSuccess and put just SubmitForm(frmEdit); into the OnSelect event, and it is saving data but still saying:
The data returned by the service was invalid

First things first: refactoring has multiple steps, and I can't type it all out at once.
The SubmitForm and Patch issue:
Never use SubmitForm with extra complexity tacked on. SubmitForm is only the trigger that sends the form out; the form handler then works with your data. If you haven't filled out the form correctly, you don't want to trigger your Patch action.
So: on your form you have an OnSuccess property. Place your Patch code there.
Change cRecord in your Patch statement to:
YourForm.LastSubmit
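A minimal sketch of that pattern, using the names from the question (frmEdit, '[dbo].[Comments]', Confirmed); the bCommit flag used to tell the Commit button apart from Save is an assumption, not something from the original post:
// Commit button OnSelect (bCommit is an assumed flag set only by this button)
Set(bCommit, true);
SubmitForm(frmEdit);
// Save button OnSelect
Set(bCommit, false);
SubmitForm(frmEdit);
// frmEdit.OnSuccess - runs only after a successful save
If(bCommit,
    Patch('[dbo].[Comments]', frmEdit.LastSubmit, {Confirmed: 1})
);
Set(bCommit, false)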

Related

yadcf externally triggered filters 'shut off' the actual filtering

I am trying to set up my yadcf filters so they can be triggered from a call (link) from another page. I have an Angular single-page application that has three tabs on it. If a user clicks a link on, let's say, the first tab, they will go to another tab (separate table) that contains detailed information relevant to the link they clicked. (e.g. they are on a row in a table that deals with Apple Mac Pro computers; they see that there are 20 SKUs currently in the system; they click the number 20 and they go to a lower tab (different table) that contains all the information for those SKUs.) There is no server call in the middle. All the data is loaded into all the tables when the application loads up. So they are simply clicking a link that applies a filter to the detail table.
yadcf can do this through externally_triggered filters. However, when I set 'externally_triggered': true, it stops the actual filters from working on the details table. (In other words, I can no longer go to that table and manually adjust the filters.)
Does anyone know a way around this issue?
It appears the externally_triggered: true switch does not need to be turned on to use the yadcf.exFilterColumn() method. I do not understand when it does need to be turned on, but I am able to call the exFilterColumn method and pass it the options needed to 'prefilter' the table while still retaining the ability to filter the table manually.
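A minimal sketch of that call, assuming a DataTable that already has yadcf filters initialised; the table variable, column index and filter value below are illustrative placeholders:
// Prefilter column 0 of an existing table; manual filtering keeps working
var table = $('#detailTable').DataTable();           // placeholder selector
yadcf.exFilterColumn(table, [[0, 'Apple Mac Pro']]); // [column index, value] pairs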
externally_triggered and yadcf.exFilterColumn are not related in any way. Indeed, when externally_triggered is used, filters behave a bit differently: they do not filter on change/keyup/etc., but only when the yadcf.exFilterExternallyTriggered function is called (this is on purpose and is all explained in the docs).
Here is the relevant text from the docs for externally_triggered:
* externally_triggered
Required: false
Type: boolean
Default value: false
Description: Filters will filter only when yadcf.exFilterExternallyTriggered(table_arg) is called
Special notes: Useful when you want to build some form with filters and you want to trigger the filter when that form
"submit" button is clicked (instead of filtering per filter input change)
Here is the showcase page
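For completeness, a hedged sketch of the pattern externally_triggered is meant for, based on the docs text above; the selectors and column setup are placeholders, and exactly where the option is passed may vary by yadcf version:
// Initialise yadcf so filters only apply when explicitly triggered
yadcf.init(table, [{ column_number: 0, filter_type: 'text' }], { externally_triggered: true });
// Later, e.g. on a form's submit button, apply whatever the filters currently hold
$('#applyFilters').on('click', function () {
    yadcf.exFilterExternallyTriggered(table);
});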

RSA Archer user cannot specify a date in the future / past

Has anyone any good patterns for RSA Archer validation which prevents a user from saving the record when a given date specified is in the future (or past)?
Currently I am catching this using calculated fields after the data has been saved, in a data exceptions report. But ideally I would like to catch this early prior to the user saving the record.
I would suggest that you use a custom object in this case.
Remove the basic onclick attribute of the SAVE and APPLY buttons.
In your custom object, check whether the entered date matches the system date (or the time zone you need) and set a flag. Based on the flag value, you can then call the actual function of the SAVE or APPLY button.
Hope that helps!
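A rough sketch of that flag-based pattern in jQuery; the button selector, the date field ID and the original save handler are placeholders, since the real Archer element IDs and functions vary by version:
// Hypothetical sketch: intercept Save, validate the date, then forward to the original handler
$('#btnSave').off('click').on('click', function () {     // '#btnSave' is a placeholder selector
    var entered = new Date($('#dateFieldId').val());      // '#dateFieldId' is a placeholder
    var today = new Date();
    today.setHours(0, 0, 0, 0);
    var isValid = entered <= today;                       // flag: date must not be in the future
    if (isValid) {
        originalSaveHandler();                            // placeholder for the button's original function
    } else {
        WarningAlert('The date cannot be in the future.', 'Validation'); // Archer pop-up helper used in the answer below
    }
});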
Alex, Tanveer is correct. You have to use a Custom Object with embedded JavaScript code to implement the described functionality.
You will need to create a function that validates the value entered by the end user and either accepts it or makes the user correct it.
Now, you have two options for how to do it:
1. You can attach your validation function to the Save and Apply buttons as Tanveer described. I have shared similar code in the following question before. You can review it here: LINK
2. You can attach your validation function directly to the element you plan to validate. So when the user is done with a given input element and it loses focus, your function will be called. Here is some sample code with jQuery:
$('#elementid').blur(function() {
    // validate entered value here
    // if required show a pop-up message
    WarningAlert(msg, title);
});
Good luck!

Data not updated in dhtmlx .net scheduler

I have problems and questions about this scheduler. I have already tried to build it and have almost finished, but I receive some errors...
What I did
I created a custom lightbox
I mapped all the data into the table dbo.bEvent
I used a custom event box like this: Scheduler.Templates.event_text = "({position_desc})" + " " + "{newrate}"
position_desc actually comes from another table, 'dbo.zone'
I created a view in SQL Server to retrieve 'newrate'. 'newrate' is a new attribute produced by a query that converts the rate (for example "1000" to "1k"), so the new rate is saved as '1k'.
The view is created by joining the tables dbo.zone and dbo.bEvent
The problem
When I save new data (insert or update), my event box just shows '(Undefine)undefine'.
All the data that I put in the lightbox is saved in dbo.bEvent.
After I refresh the page using F5, or navigate to the next or previous page, the data is updated.
Here I attach some screenshots. Thanks in advance.
http://s1319.photobucket.com/user/matpyam/library/?sort=3&page=1
If you save the changes as described here http://scheduler-net.com/docs/lightbox.html#define_crud_logic,
note that the method uses the SchedulerFormResponseScript class to render the response.
The constructor takes an instance of the event class to be returned to the client; the values of that event will be applied to the related event on the client side:
return (new SchedulerFormResponseScript(action, changedEvent));
Make sure the object that you send (changedEvent) has all its data properties initialized. Basically, the event should have the same data as when it is loaded from the Data action.
An alternative solution, which may be more straightforward, would be to reload the calendar data with the client-side API after saving the event:
scheduler.clearAll();
scheduler.load("dataUrl", "json");
http://docs.dhtmlx.com/scheduler/api__scheduler_clearall.html
http://docs.dhtmlx.com/scheduler/api__scheduler_load.html
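A hedged sketch of hooking that reload to the save round-trip; the dp variable below (a client-side dhtmlx dataProcessor assumed to be exposed on the page) and "dataUrl" are assumptions that depend on your setup:
// Assumed: dp is the client-side dataProcessor handling the scheduler's saves
dp.attachEvent("onAfterUpdate", function (id, action, tid, response) {
    // once the server has confirmed the insert/update, reload fresh data
    scheduler.clearAll();
    scheduler.load("dataUrl", "json");   // "dataUrl" is a placeholder for your Data action URL
});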

SharePoint Workflow Error: "Unable to transform the input lookup data into the requested type" BUT only on New Item Creation

FYI to start, I am aware of how to properly set up an update to a lookup, and am 99% positive I've done this correctly.
I know this because when I set the workflow to automatically start when an item is changed, it works perfectly. But when I simply change this setting so it automatically starts on new item creation, it cancels the workflow and I get a "Coercion Failed: Unable to transform the input lookup data into the requested type." error. If both options are checked then it fails on creation, but simply clicking Edit on the item properties and then Save makes it work.
The workflow is on a Document Library and works as follows:
The user selects the Work Task lookup from a dropdown in the edit properties form after uploading, and then saves the item (adding it to the document library). The workflow is supposed to then look at the selected Work Task lookup, pull the Account and Effective Date-Type lookup IDs that the Work Task item has, and set the document's identical fields to the same values.
Here is the workflow logic, if it helps:
If Current Item: Parent Task is not empty
If Current Item: Sub Task is not empty
Log Both are empty to workflow history list
Then Set Account to Work Tasks:Account
Then Log Set Account to the workflow history list
Then Set Effective Date and Type to WorkTasks: Effective Date and Type
Then Log Set EffDateType to the workflow history list
This is all done in one step. I also added additional steps to test whether the Account and Effective Date-Type fields have been set properly, and if not, to set them again. But every time I run the workflow on change and it works, it always correctly sets these fields based upon the first step (posted above), and the additional checks log to the history that they are not needed.
As an example, the lookup for the integer for Work Tasks:Account is set up as follows:
Data Source: Work Tasks (a list)
Field from Source: Account (a lookup)
Return Field as: Lookup ID (as Integer)
Find the List Item
Field: Title (from the Work Tasks list)
Value: Current Item: Parent Task (which is a lookup of the "Title" field from the Work Tasks list, and is set to return the value as a Lookup Value (as Text))
The Effective Date and Type setting is pretty much identical.
So does anyone have any insight? I've tried running it as an impersonated step, setting a workflow pause (for 1 minute), and changing the lookup types in case I messed them up to start with, but ultimately the above workflow DOES work, but only when I set it to "Automatically start on the change (edit) of an item", NOT "Automatically start on new item creation" like I need it to do.
Oh yes, FYI, I am using SPServices CascadingDropDown on the Work Task and Sub Task fields of the document library form, but I honestly do not believe this has anything to do with my issue.
UPDATE:
I've talked with another developer, and he believes it is because the workflow is occurring too quickly, before the item creates an ID for itself, which it needs to conduct the lookups. He had me add another "Pause Workflow" to the very top of my workflow code (above the If conditions) and set it for 1 minute.
It then worked properly.
The downside is that we want this labeling to occur as close to item creation as possible, because a view of the library relies on grouping based upon Account and Effective Date and Type. To add to this downer, Microsoft's Pause Workflow only allows 1 minute or more, and the timer used for it is often off, resulting in a pause longer than that. So far, every test is showing a 2 minute minimum on the pause.
A possible alternative solution for instantaneously populating the fields is to use JavaScript and SPServices to do the lookup to the Task list, pull the Account and Effective Date-Type fields, and then populate them, but my JavaScript is not very strong and I would need help doing this. If anyone has any suggestions, I would appreciate them.
(Answered in a question edit. Converted to a community wiki answer. See Question with no answers, but issue solved in the comments (or extended in chat) )
The OP wrote:
After further testing, I don't know if it is the ID for the item. I changed the start of the workflow to wait until a field in the item changes. I set it to wait until the ID field is not 0 (since you cannot set it to null), and it still does not work.
6/14/2012 4:13 PM Comment System Account Waiting on ID
6/14/2012 4:13 PM Comment System Account Waiting complete on ID
6/14/2012 4:13 PM Error System Account Coercion Failed: Unable to transform the input lookup data into the requested type.
I have tried other fields as well, like waiting until the Document ID value is not empty, and it will wait, log that the wait completed, and then fail.
UPDATE: This issue has something to do with the Parent Task field. I have solved the issue, without having to wait for a period of time, by changing the wait above to wait until the Parent Task field is not empty. It then completes the workflow fine.
Does anyone know why there is a delay, though? I've solved it, but still don't fully understand what takes it so long.
The main fault has been solved (hence the answer), and the remaining point about the reasons for the delay would probably be a discussion point or not specific enough for SO. Any further clarification can be edited in here.
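For the JavaScript/SPServices route mentioned in the question's update, a rough sketch; the list name, the internal field names (Account, EffectiveDateType) and the selectedParentTask variable are assumptions that would need to match the real lists and form:
// Hypothetical sketch: pull Account and Effective Date-Type from the selected Work Task
$().SPServices({
    operation: "GetListItems",
    listName: "Work Tasks",                                  // assumed list title
    CAMLQuery: "<Query><Where><Eq><FieldRef Name='Title'/>" +
               "<Value Type='Text'>" + selectedParentTask + "</Value></Eq></Where></Query>",
    CAMLViewFields: "<ViewFields><FieldRef Name='Account'/><FieldRef Name='EffectiveDateType'/></ViewFields>",
    async: true,
    completefunc: function (xData, Status) {
        $(xData.responseXML).SPFilterNode("z:row").each(function () {
            var account = $(this).attr("ows_Account");               // lookup values come back as "ID;#Value"
            var effDateType = $(this).attr("ows_EffectiveDateType");
            // populate the document's form fields here before saving
        });
    }
});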

How to use a hidden page item within an SQL report query without submitting

Using: Oracle ApEx 3.0.1
I have a SQL report region that contains a hidden page item as part of the where clause. My problem is that, based on a value entered by the user, I need to assign the entered value to my hidden item so that it can be used within the where condition of my SQL, but this needs to happen without actually submitting the page.
At the moment I can set the value via an on-demand process, but my SQL is still not returning any rows because the hidden page item within the query is not set (as the page has not been submitted).
I am not sure how to do this, or whether it is in fact possible without submitting the page.
Since you are on ApEx 3 you don't have dynamic actions, but that isn't much of a hindrance.
I've set up an example on apex.oracle.com. To get into the workspace, use workspace 'tompetrusbe' + apex_demo / demo.
There is a dynamic action there which could do the work too, but I've disabled it.
What you need to make it work:
An Ajax callback (on-demand application) process with the following line:
apex_util.set_session_state('P2_PAR_ENAME', apex_application.g_x01);
Give your report region a static ID; I called mine 'report_emp' so I can easily retrieve it.
In the JavaScript region of the page you then need to call the application process and then refresh the region. Also bind the event which needs to trigger this action; I've done it here through the onchange event of the parameter text field.
function refresh_report(oTrgEl){
    //alert('refresh: ' + $v(oTrgEl));
    // call the application process which sets the session state of P2_PAR_ENAME
    var oGet = new htmldb_Get(null, &APP_ID., 'APPLICATION_PROCESS=set_session_state', &APP_PAGE_ID.);
    oGet.addParam("x01", $v(oTrgEl));
    oGet.get(); // the app process just sets session state, it returns nothing
    // refresh the report region
    $("#report_emp").trigger("apexrefresh");
};

function bind_events(){
    // call this onload
    $("#P2_PAR_ENAME").change(function(){ refresh_report(this); });
};
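(One assumed bit of wiring not shown in the original answer: bind_events() still has to be called once when the page loads, for example:)
// run once on page load so the onchange handler is attached
$(document).ready(bind_events);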
In the query of the report I use: where ename like '%'||UPPER(:P2_PAR_ENAME)||'%'.
When you type (for example) 'bl' in and tab out (to trigger the onchange), the region refreshes and is filtered.
You'll just need to adapt this to your own solution :)