I have a RadListView where the data switches out based on the user's query. I'm using loadOnDemandMode="Auto", and when the current query is exhausted I call notifyLoadOnDemandFinished(true). However, when a new query is made I cannot re-enable load on demand, and new items are not loaded.
Is there a way to reactivate loadOnDemand, perhaps with a method on the radListView object? I couldn't find anything in the docs.
Found the mistake; posting here for anyone else who may have this problem.
I was trying to set
listView.loadOnDemandMode = "Auto"
Adding _nativeView fixed it:
listView._nativeView.loadOnDemandMode = "Auto"
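In case it helps anyone wiring this into a search flow, here is a minimal sketch of where I call it (onQueryChanged and the listView variable are my own placeholder names, not part of the RadListView API):
// hypothetical handler that runs whenever the user submits a new query
function onQueryChanged(newQuery) {
    // re-enable load on demand on the underlying native view
    listView._nativeView.loadOnDemandMode = "Auto";
    // clear out the previous query's items so loadMoreDataRequested starts fresh
    listView.items.splice(0, listView.items.length);
}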
I'm building a PowerApps app on Azure SQL.
The requirement
I have a form which has "Save" and "Confirm" buttons.
Both buttons should save the form data. The Commit button should also set the database column Confirmed to 1.
I've read at length about how I can programmatically override the update value of a hidden control for this, but I'm not satisfied with the level of complexity (and maintenance) required to get this working, i.e.
Populate a variable with the current db value
In the button code set the variable value
In the form field, set the update property to the variable
What I'm Trying
So I'm trying a different approach: SubmitForm then Patch. Even though this requires an extra database call, I'd like to understand if this will work. This is the code for OnSelect in the commit button:
// Save the record
SubmitForm(frmEdit);
// Update confirmed to 1
Patch('[dbo].[Comments]',cRecord,{Confirmed:1});
Some Complexities
Note that my record is a variable, cRecord. In short I want this app to be able to upsert based on URL parameters.
This is my App.OnStart, which captures the URL values and inserts a record if required. Either way, the result of this event is that cRecord is set to the record to be edited.
// Cache employees and store lookups (as they are in a different db)
Concurrent(Collect(cEmployees, Filter('[dbo].[SalesPerson]', Status = "A")),Collect(cStores, '[dbo].[Store]'));
// Check for parameters in the URL. If found, set to Edit/Add mode
Set(bURLRecord,If((!IsBlank(Param("PersonId")) && !IsBlank(Param("Date"))),true,false));
// If URL Parameters were passed, create the record if it doesn't exist
If(bURLRecord,
Set(pPersonId,Value(Param("PersonId")));
Set(pDate,DateValue(Param("Date")));
// Try and find the record
Set(cRecord,LookUp('[dbo].[Comments]',SalesPersonId=pPersonId && TransactionDate = pDate));
If(IsBlank(cRecord),
// If the record doesn't exist, create it with Patch and capture the resulting record
Set(cRecord,Patch('[dbo].[Comments]',Defaults('[dbo].[Comments]'),{SalesPersonId:pPersonId,TransactionDate:pDate}))
);
// Navigate to the data entry screen. This screen uses cRecord as its item
Navigate(scrEdit);
)
frmEdit.Item is set to cRecord. As an aside, I also have a gallery that sets this variable value when clicked, so we can also navigate here from a gallery.
Navigating using new and existing URL parameters works. Navigating from the gallery works.
The problem
When I press the Commit button against a record which has Confirmed=0 I get this popup error:
The data returned by the service is invalid
When I run this code against a record which already has Confirmed=1, I don't get an error.
If I run the PowerApps monitor it doesn't show any errors but it does show some counts being run after the update. I can paste it here if required.
I also tried wrapping the Patch in a Set in case its result was confusing the button event, but it didn't make a difference.
What I want
So can anyone offer me any of the following info:
How can I get more info about "The data returned by the service is invalid"?
How can I get this to run without erroring?
Is there a simpler way to do the initial upsert? I was hoping Patch could upsert in one call, but it seems it can't.
With regards to the set-the-field-beforehand approach, I'm happy to try this again, but I had some issues implementing it, i.e. understanding which control to edit and where.
Any assistance appreciated.
Edit
As per the recommendations in the answer, I moved the Patch code into OnSuccess:
If(Confirmed=1,Patch('[dbo].[CoachingComments]',cRecord,{Confirmed:1}));
But now I get the same error there. Worse, I cleared out OnSuccess and just put SubmitForm(frmEdit); into the OnSelect event, and it is saving the data but still saying
The data returned by the service was invalid
First things first: refactoring has multiple steps, and I can't type it all out at once...
The SubmitForm and Patch issue:
Never use SubmitForm with extra complexity. SubmitForm is only the trigger that sends the form out; the form handler will work with your data. If you haven't filled out the form correctly, you don't want to trigger your Patch action.
So: on your form you have an OnSuccess property. Place your Patch code there.
Change the cRecord in your Patch statement to:
YourForm.LastSubmit
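For example, a minimal sketch using the names from the question (since OnSuccess fires for both buttons, I'm assuming a variable, bConfirm, set by the Commit button; that variable name is my own placeholder):
// Commit button OnSelect
Set(bConfirm, true); SubmitForm(frmEdit)
// frmEdit.OnSuccess
If(bConfirm, Patch('[dbo].[Comments]', frmEdit.LastSubmit, {Confirmed: 1}))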
OK, so I'm sorry to be asking, but I'm trying to make it so that when I press Z, a portal appears at my Spr_player's coordinates; however, if one already exists, I want it to be erased first, and I'm simply wondering what I've done wrong. Again, sorry for bothering. (Please note that I am a bad programmer, and I apologize if I broke any rules.)
if object_exists(portal)
{
instance_destroy()
action_create_object(portal,Spr_player.x,Spr_player.y)
}
else
{
action_create_object(portal,Spr_player.x,Spr_player.y)
}
The instance_destroy() statement destroys the current self instance which is what is executing the code. You must use the with (<objectID>) {instance_destroy()} syntax to destroy another instance.
As long as there is only one instance of portal in the room this code should work:
if object_exists(portal)
{
with (portal) instance_destroy(); //you also need a semicolon here to separate
                                  //this statement from the next; it is good practice
                                  //to do this after all statements, as I have done.
action_create_object(portal,Spr_player.x,Spr_player.y);
}
else
{
action_create_object(portal,Spr_player.x,Spr_player.y);
}
If there are multiple instances of portal this will only destroy the first one. To destroy all of them you would have to use a loop to iterate through them. Off the top of my head I can not remember the function to get the ids of all the instances of an object, but it looks like this is not a problem here, since each time one is created the existing one is destroyed, so you will only ever have one at a time.
Another way of doing this is just to move the existing portal to the new position. The only difference here is that the create event of the portal will not be executed and any alarms will not be reset.
portal.x=Spr_player.x
portal.y=Spr_player.y
Again this will only move the first portal if there are more than one.
Back in the unversioned Ember Data days (e.g. "rev 12" maybe) I'm pretty sure you could do this:
var comment = App.Comment.find(42); // Already exists, but not yet loaded...
post.get('comments').addObject(comment);
Because App.Comment.find(42) would return an App.Comment object, albeit one with no fields populated except its ID. (I don't remember the details of how you'd then save the App.Post--i.e. whether or not you could save it until the comment object was completely loaded… I never got that far.)
Why this was neat is that if your template rendered post.comments, a new row/div would appear immediately that could check isLoaded to display a loading indicator and show instantly that a new record was attached while waiting for the record's data to load. This is/was a selling point of Ember/Ember Data, and one I really like.
But this doesn't work now in 1.0.0-beta.2/beta.4/beta.5:
var comment = controller.get('store').find('comment', 42);
post.get('comments').addObject(comment); // Fails
Because controller.get('store').find('comment', 42) returns a promise, and if I try to add it to the hasMany it complains that I can only add App.Comment objects to the relationship.
Is it still possible to do something like this, so that my template which renders the comments immediately updates with a new record, but asynchronously populates its data?
(Please ignore that it doesn't make sense to add an already existing comment to a post--using the ubiquitous example scenario is easier than posting all my model code. Thanks!)
Okay, I came up with at least one way that does it:
var comment = controller.get('store').find('comment', 42);
var inFlightRecord = controller.get('store').getById('comment', 42);
controller.get('comments').addObject(inFlightRecord);
To be safer, maybe:
var comment = controller.get('store').find('comment', 42);
var inFlightRecord = controller.get('store').getById('comment', 42);
if(inFlightRecord){ // should be null if it isn't in the store
  controller.get('comments').addObject(inFlightRecord);
} else {
  // add a then block to the promise to make sure it gets added later
  comment.then(function(loadedComment) {
    controller.get('comments').addObject(loadedComment);
  });
}
It seems that getById returns the "unloaded" object like we used to get from App.Comment.find(42), and the object still has an isLoaded property you can check to show loading status in your template.
I'm not sure if this is supposed to be supported behavior that I can rely on going forward (I suppose arguably nothing is, until 1.0 release), but it seems to work. I even checked that the object returned by getById === the object fulfilled in the promise. So this seems to be a good solution.
Anyone see a problem with this, or have a better way?
I am currently developing an application in which I get the exception "A cycle was detected in the set of changes" when calling DataContext.SubmitChanges(). I know why this exception is thrown, but I have not been able to find a fix for my situation. Let me explain the situation. I have a database with a table as shown below, which I access with LINQ to SQL so it gets mapped to classes in VB.NET.
Device
-------
ID
DefaultGatewayID
The DefaultGatewayID refers to a Device and can even be the same object or another Device. The user uses a GUI with a DataGrid to alter and add new records. Updating records is no problem: the ID already exists and the DefaultGateway object is attached to the record (the ID is stored in the database).
However, when I try to add a new record and set the DefaultGateway object in the same transaction, I get the 'Cycle detected in set of changes' exception. I suspect this is caused by LINQ to SQL because it does not know which record to add first, although it is the same item in this case.
I do not have the option to break the insertion into two parts, one for the Device and then adding the DefaultGateway, because my submit button is bound to a XAML command which executes the SubmitChanges.
Ideally I would have some option to specify which object is to be created first, or something like that. I think it's an option to remove the self-reference and just set the ID in this field, but I'd rather find a fix within LINQ to SQL.
I hope SO has an answer to this. I could only find this related post "Cycle detected while adding Circularly linked list"
You can break the insertion into two parts and still have one transaction if you wrap your code in a TransactionScope.
'Requires Imports System.Transactions (and a reference to the System.Transactions assembly)
Using trans As New TransactionScope()
'Code that generates a new ID in the database
dc.SubmitChanges()
'Code that uses the new ID value.
dc.SubmitChanges()
trans.Complete()
End Using
This is the only way to avoid the exception. If this is impossible because of architectural decisions ("my submit button is bound to a XAML Command"), you need to change the architecture. I think a UI command should never be so close to the data access layer anyway; you would be better off calling a service method from the XAML command.
FYI to start, I am aware of how to properly set up an update to a lookup, and am 99% positive I've done this correctly.
I know this because when I set the workflow to automatically start when an item is changed, it works perfectly. But when I simply change this setting so it will automatically start on new item creation, it cancels the workflow and I get a "Coercion Failed: Unable to transform the input lookup data into the requested type." error. If both options are checked, it fails on creation, but simply clicking edit on the item properties and then "Save" makes it work.
The workflow is on a Document Library and works as follows;
User selects the Work Task lookup from a dropdown in the edit properties form after uploading, and then saves the item (adding it to the document library). The workflow is supposed to then look at the selected Work Task lookup, pull the Account and Effective Date-Type lookup IDs that Work Task item has, and set the document's identical fields to the same values.
Here is the code for the workflow, if it helps:
If Current Item: Parent Task is not empty
If Current Item: Sub Task is not empty
Log Both are empty to workflow history list
Then Set Account to Work Tasks:Account
Then Log Set Account to workflow history list
Then Set Effective Date and Type to WorkTasks: Effective Date and Type
Then Log Set EffDateType to the workflow history list
This is all done in one step. I also added additional steps to test whether the Account and Effective Date-Type fields have been set properly, and if not, to set them again. But every time I run the workflow on change and it works, it always correctly sets these fields based upon the first step (posted above), and the additional checks log to the history that they are not needed.
As an example, the lookup for the integer for Tasks:Account is set to work as follows:
Data Source: Work Tasks (a list)
Field from Source: Account (a lookup)
Return Field as: Lookup ID (as Integer)
Find the List Item
Field: Title (from the Work Tasks list)
Value: Current Item: Parent Task (which is a lookup of the "Title" field from the Work Tasks list, and is set to return the value as a Lookup Value (As Text))
The Effective Date and Type setting is pretty much identical.
So does anyone have any insight? I've tried running it as an impersonated step, setting a workflow pause (for 1 minute), and changing the lookup types in case I messed them up to start with, but ultimately the above workflow DOES work, only when I set it to "Automatically start on the change (edit) of an item", NOT "Automatically start on new item creation" like I need it to do.
Oh yes, FYI, I am using SPServices CascadingDropDown on the Work Task and Sub Task fields of the doc library form, but I honestly do not believe this has anything to do with my issue.
UPDATE:
I've talked with another developer, and he believes it is because the workflow is occurring too quickly, before the item creates an ID for itself, which it needs to conduct the lookups. He had me add another "Pause Workflow" to the very top of my workflow code (above the If conditions) and set it to 1 minute.
It then worked properly.
The downside is that we want this labeling to occur as close to item creation as possible, because a view of the library relies on grouping based upon Account and Effective Date and Type. To add to this downer, Microsoft's Pause Workflow only allows for 1 minute or more, and the timer used for this is often off, resulting in a pause longer than that. So far, every test is showing a 2-minute minimum on the pause.
A possible alternative solution for instantaneously populating the fields is to use JavaScript and SPServices to do the lookup against the Task list, pull the Account and Effective Date-Type fields, and then populate them (see the rough sketch below), but my JavaScript is not very strong and I would need help doing this. If anyone has any suggestions, I would appreciate them.
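Something along these lines is roughly what I have in mind (untested; it assumes jQuery and SPServices are already loaded on the form page, and the list name, field internal names, and the selectedParentTask variable are placeholders that may differ on your site):
// Look up the selected Work Task and read its Account and Effective Date-Type values
$().SPServices({
  operation: "GetListItems",
  async: false,
  listName: "Work Tasks",
  CAMLViewFields: "<ViewFields><FieldRef Name='Account' /><FieldRef Name='EffectiveDateType' /></ViewFields>",
  CAMLQuery: "<Query><Where><Eq><FieldRef Name='Title' />" +
             "<Value Type='Text'>" + selectedParentTask + "</Value></Eq></Where></Query>",
  completefunc: function (xData, Status) {
    $(xData.responseXML).SPFilterNode("z:row").each(function () {
      var account = $(this).attr("ows_Account");             // lookup value, e.g. "3;#Some Account"
      var effDateType = $(this).attr("ows_EffectiveDateType");
      // set the corresponding fields on the edit form here, before the item is saved
    });
  }
});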
(Answered in a question edit. Converted to a community wiki answer. See Question with no answers, but issue solved in the comments (or extended in chat) )
The OP wrote:
After further testing, I don't know if it is the ID for the item. I changed the start of the workflow to wait until a field in the item changes. I set it to wait until the ID field is not 0 (since you cannot set it to null), and it still does not work.
6/14/2012 4:13 PM Comment System Account Waiting on ID
6/14/2012 4:13 PM Comment System Account Waiting complete on ID
6/14/2012 4:13 PM Error System Account Coercion Failed: Unable to transform the input lookup data into the requested type.
I have tried other fields as well, like document ID value is not empty, and it will wait, log it finishing the wait, and then fail.
UPDATE: This issue has something to do with the Parent Task field. I have solved the issue, without having to wait for a period of time, by setting the change condition from above to wait until the Parent Task field is not empty. It then completes the workflow fine.
Anyone know why there is a delay though? I've solved it, but still don't fully understand what takes it so long.
The main fault has been solved (hence the answer), and the remaining point about the reasons for the delay would probably be a discussion point or not specific enough for SO. Any further clarification can be edited in here.