ExtJS4 - Cross-store model deletion: Removing a model from a store by creating a model and setting its ID - extjs4

I am building a web application and I have a Grid Panel A that has a Store A using Model A. When the user selects a certain entry E (which has an ID) and clicks the delete button, I want to get Store B and remove the entry with the same ID as the selected entry E.
Basically, what I'm trying to do is some sort of "cross-store" model deletion. The model from Store A gets selected, but the entry from Store B gets deleted.
Here's what I've done so far:
var userStore = Ext.getStore('borrowerListStore'); // this is Store B
var model = Ext.ModelManager.create({
}, 'myAppLicationName.model.borrowerList'); // this is Model B
model.set("ID", personID); // personID here is the ID of Entry E selected earlier

Ext.getBody().mask('Starting Client Delete...');
userStore.remove(model); // I remove the model from the store

// then I sync the store
userStore.sync({
    success: function(batch) {
        Ext.getBody().unmask();
        console.log('delete user details success');
    },
    failure: function(batch) {
        Ext.getBody().unmask();
        console.log('delete user details failure');
    }
});
However, I am stuck on the masking screen.
I also tried loading the store first, like this, before removing and syncing:
userStore.load({
    callback: function() {
        userStore.remove(model);
    }
});
However, I still got stuck on the loading screen.
Is there any way to do a cross-store model deletion based on a model property? I know that I can get Store B, iterate through its models and remove the one whose ID matches the ID the user selected. My issue with that is that if I have a lot of records in the store, searching through them would take a lot of time.
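For reference, the iteration-style fallback I am describing would look roughly like this (just a sketch; it assumes ID is a regular field on Model B):

// sketch of the fallback: locate the matching record in Store B by its
// ID field and remove that, instead of building a detached model
var userStore = Ext.getStore('borrowerListStore');
var index = userStore.findExact('ID', personID); // linear scan of the loaded records
if (index !== -1) {
    userStore.removeAt(index);
    userStore.sync();
}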

Okay I'm an idiot.
I did something like:
userStore.getProxy().extraParams = {
    selectedUserID: personID
};
and I added an if-else in the PHP handler: if selectedUserID was passed, I'd use it in my WHERE clause when querying the database, so I'd end up with just 1 entry.
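Roughly, the full client-side flow then ends up like this (a sketch; the exact callback wiring depends on how your proxy is configured):

// sketch: narrow the load through extraParams, then remove and sync
var userStore = Ext.getStore('borrowerListStore');
userStore.getProxy().extraParams = {
    selectedUserID: personID // the PHP side adds this to its WHERE clause
};
Ext.getBody().mask('Starting Client Delete...');
userStore.load({
    callback: function(records) {
        // the filtered load returns only the matching entry
        userStore.remove(records);
        userStore.sync({
            callback: function() {
                Ext.getBody().unmask();
            }
        });
    }
});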

Related

How to add one item from observable to an array of items in a different observable?

I am struggling with observables and pipes: I want to add one item emitted by one observable to another observable that contains a list of items of the same type.
I have a type X, and for that type I have an observable array:
readonly arrayOfx$: Observable<X[]>;
I also have an observable that emits single updates of type X:
private readonly _x$: Observable<UpdateOfX>;

interface UpdateOfX {
    x: X,
    updateState: "Add" | "Modified" | "Removed"
}
All this code is in a service class, where the service should only expose the array of X. I want to show the data in the array in my HTML with the async pipe, and that part works. The host and the client are connected via SignalR, and on connection an array of items of type X is retrieved. But as the application runs, new items of type X can be created in the backend, and existing items can be changed or removed; when this occurs, only that single item and its modification state are sent over the SignalR connection.
On the front end, this item must be added to the already retrieved array of items of type X. The service uses pipes, and my question is: how do I add the single item that arrives later to the list of items I retrieved earlier?
constructor() {
    this.arrayOfx$ = this._someSignalRHelperService.retrieveMultipleItems$.pipe(
        tap((xArray: X[]) => console.log(xArray)),
        // can I somehow get a later-created x from the server here...
    );

    this._x$ = this._someSignalRHelperService.retrieveOneItem$.pipe(
        tap((updateOfX: UpdateOfX) => console.log(updateOfX)),
        map((updateOfX: UpdateOfX) => {
            // process the updateState
            // ... or must I do something here to get x into x[]?
        })
    );
}
Since SignalR is used, the backend controls when the client receives a new item of type X once one is created.
You can use combineLatest(), and do whatever manipulation you want as soon as you receive the two emissions:
constructor() {
    this.combinedOfX$ = combineLatest(
        this._someSignalRHelperService.retrieveMultipleItems$,
        this._someSignalRHelperService.retrieveOneItem$
    ).pipe(
        map(([multipleOfX, singleOfX]) => {
            // do your adding or mapping and whatever here.
        })
    );
}
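One possible shape for that map body, following the UpdateOfX interface from the question, might be (a sketch; it assumes each X carries an id field to match on, which the question does not show):

map(([xArray, updateOfX]) => {
    // merge the single update into the latest array emission,
    // matching on an assumed id field of X
    switch (updateOfX.updateState) {
        case "Add":
            return [...xArray, updateOfX.x];
        case "Modified":
            return xArray.map(x => x.id === updateOfX.x.id ? updateOfX.x : x);
        case "Removed":
            return xArray.filter(x => x.id !== updateOfX.x.id);
        default:
            return xArray;
    }
})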
Not sure if that is by design, but having the frontend (client) manage the data is not ideal. Your single source of truth now lives on the client, and different machines have different processing speeds, which can lead to inconsistent displays. The code also gets messy: you need to scan the entire arrayOfX every time a singleOfX update arrives, checking whether the item already exists in the list (if yes, edit or delete it; otherwise, append it). And if the user accidentally refreshes the browser, all of that processing is lost.
Since you are already using SignalR, it would be more advisable to let the server handle all the data and be the single source of truth. Then you only need to subscribe to one hub and listen for changes to arrayOfX, and you can largely ignore the single-item updates.

firestore nested object field needs updating, not replacing

I am creating an app where announcements are shown. The announcements are stored in Firestore, and each announcement has a hasRead object.
It works, in that when a user reads the announcement it is shown as read in that user's app. But when another user reads the same announcement, their userId is stored, overwriting any other userId stored before.
Here is how I store it:
setAnnounceToRead(userId) {
    firebase.firestore().collection('announcements').doc(this.state.id).set({
        hasread: {
            userId
        }
    },
    { merge: true });
}
I already found out that it is because of the merge: it doesn't add the userId but overrides it instead.
How can I add every userId that reads the announcement, while keeping the already existing userIds?
Cheers
Right now you're storing each user's UID as a field named userId. Since you're using the same field name for each user, you end up storing only the last user's UID.
To store the UID for all users, you'd usually have a structure like this:
hasread: {
    udartsUid: true,
    pufsUid: true
}
In your code that would translate to something like:
let update = {};
update[userId] = true;

firebase.firestore().collection('announcements').doc(this.state.id).set({
    hasread: update
},
{ merge: true });
But this type of operation got a lot easier recently, since Firestore now has operations that allow you to use an array for this type of information.
let doc = firebase.firestore().collection('announcements').doc(this.state.id);
doc.update({ "hasRead": FieldValue.arrayUnion(userId) });
This snippet adds the userId value to the array if it isn't already in there; if the value is already in the array, it does nothing.
For more on the latter, see the blog post Better arrays in Cloud Firestore.
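With the array form in place, checking which announcements a given user has already read could then look roughly like this (a sketch; the collection and field names follow the question):

// sketch: query the announcements the current user has already read,
// using the hasread array populated by arrayUnion above
firebase.firestore().collection('announcements')
    .where('hasread', 'array-contains', userId)
    .get()
    .then(snapshot => snapshot.forEach(doc => console.log(doc.id, 'already read')));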

google app maker link fields of related records

I am attempting to set a field in one data model equal to a field in a related data model. I've considered setting up an event to set one field equal to the other, but I don't know what the best trigger for this event would be, nor the code that would be required.
Additionally, perhaps an event is not needed and there is some more fundamental/basic way to establish this field connection between related models.
Example: People Model has Companies Model as a related model. When adding a new People record, selecting the related Companies record would mean that the "Industry" field in the People record would be equal to the "Industry" field in the related Companies record.
Thank you!
You can execute a callback function after the People record is created. The callback function changes the Industry field value of the related Companies record to match the value on the People record. Something like this (GIF). Notice that I am updating the Companies Industry value while creating a People record.
Here's the code on a Client Script:
var pgPeople = app.pages.PeopleCompanies;
var pgPeopleDesc = pgPeople.descendants;

function updateRelatedRecordField() {
    var peopleDatasource = app.datasources.People;
    peopleDatasource.createItem(function(record) {
        var industry = record.Industry;
        record.Companies.Industry = industry;
    });
}
You need to replace the default onClick handler on the Form widget's button with your function, updateRelatedRecordField().
Read more here.

Ember-data creating extraneous record in memory

I have a many-to-many relationship table with a payload (an additional field) coming from a .NET Web API, which I have modelled in ember-data. When I add a record to this table/relationship, Ember creates an additional record that is held in memory until the user performs a browser page refresh. My models are:
// student.js
export default DS.Model.extend({
    name: DS.attr('string'),
    studentsClasses: DS.hasMany('student-class')
});

// class.js
export default DS.Model.extend({
    desc: DS.attr('string'),
    studentsClasses: DS.hasMany('student-class')
});

// student-class.js
export default DS.Model.extend({
    studentId: DS.attr('string'),
    student: DS.belongsTo('student'),
    class: DS.belongsTo('class'),
    grade: DS.attr('number') // payload
});
Here is the code I use to create and add the many to many record.
let newRecord = this.get('store').createRecord('student-class');
newRecord.set('studentId', 1);
newRecord.set('grade', 3);
// classRecord is the selected class model ('class' itself is a reserved word)
classRecord.get('studentsClasses').pushObject(newRecord);
The new record gets created and added and everything looks good on screen, until I come back to the same page and find an extra record in the class's studentsClasses array.
Any idea why ember-data is creating an extra record in memory and how I can stop it from doing so?
Thanks
As you said, ember-data keeps records in memory, and you must keep in mind that it will not remove those records on its own. A record is only removed from memory when you remove it yourself, when the page is refreshed, or when it is replaced by a new payload with the same id property. You can observe that behaviour with the Ember Inspector debug plugin for browsers like Chrome and Firefox.
In your case, you created a new record with store.createRecord(). At that moment the record was already added to memory and pushed onto your class record. If these models are not saved successfully, the record stays in a 'dirty' state, and unless you clean up your store memory (using something like store.unloadRecord(), which has some side effects, or by removing this unsaved record from the related model), the next time you use store.findRecord() it will serve the existing in-memory data first, unless you force a reload with something like store.findRecord('class', 1, { reload: true }).
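For the cleanup alternative mentioned above, a sketch could be (method availability depends on your ember-data version):

// sketch: discard the unsaved student-class record instead of leaving it dirty
classRecord.get('studentsClasses').removeObject(newRecord);
newRecord.rollbackAttributes(); // or newRecord.unloadRecord() to drop it from the store entirely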
So my suggestion is to force a reload of the class model when entering the class page.
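A minimal sketch of that reload suggestion, assuming a classic route with a dynamic segment (the route and parameter names are illustrative):

// route for the class page: force a reload so the hasMany is refreshed
// from the server instead of served from the in-memory (dirty) data
import Ember from 'ember';

export default Ember.Route.extend({
    model(params) {
        return this.store.findRecord('class', params.class_id, { reload: true });
    }
});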

Load Document and create object inside script patch

I have an Event object with a Payments collection. When an event is cancelled, I need to add those payments to the Refunds collection of the appropriate User object. Based on the documentation I came up with the following schematic script:
_(this.Payments).forEach(function(payment) {
    var user = LoadDocument(payment.UserId);
    user.Refunds.push(new { EventId = this.Id, Payment = payment });
});
There are two things in this schematic script that I couldn't find how to do correctly in the documentation:
1. Load another document by Id (line 2)
2. Create a new JSON object (line 3)
The LoadDocument() call is correct; however, the loaded document isn't automatically tracked by any unit of work when it is loaded within a patch.
You have to tell Raven to update/store that document as well:
var user = LoadDocument(payment.UserId);
user.Refunds.push({ EventId: this.Id, Payment: payment });
PutDocument(user.UserId, user);
If you really want to do this from a patch, the above might work. However, this seems like a domain-specific operation, and it might be better to model the behaviour in your application code (i.e. raise an event and add the refunds to the user objects from code). I'm not 100% sure how Raven handles transactions within patches and so on...
Edit: For your second question: you don't need to use the 'new' keyword; a plain object literal is enough.
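Putting those corrections together, the whole patch body would look roughly like this (a sketch; whether LoadDocument/PutDocument are available inside patch scripts depends on your RavenDB version):

// sketch of the corrected patch script: plain object literals, an explicit
// PutDocument per touched user, and this.Id captured outside the callback
// (inside the forEach callback, 'this' no longer refers to the event document)
var eventId = this.Id;
_(this.Payments).forEach(function(payment) {
    var user = LoadDocument(payment.UserId);
    user.Refunds.push({ EventId: eventId, Payment: payment });
    PutDocument(payment.UserId, user);
});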