I am working on an application and was doing something like this:
dojo.ready(function() {
    require(['dojo/parser', 'dijit/registry', 'dojo/on'], function(parser, registry, on) {
        // find a dijit and wrap it in event-handling code.
    });
});
I was getting an error indicating that dojo was trying to register a widget with an id that was already in use. To solve the problem I entered this line of code:
//before finding the dijit destroy the existing registry.
However, logically this prevents the next line from working because now no widget exists to which I can connect an event. How can I recover the dijit ids?
The best solution is to find out why your code is trying to register a widget with an id that is already in use and change it so that it does not.
@mschr's solution should work, but I would advise against using it, as it can break your code in many other places, and you are likely to spend hours investigating strange behavior in your application.
Anyway, if you are willing to do it that way and automatically destroy widgets with the same ID, do not override the registry.add() method. You could do it, but that does not mean you should (especially in programming). Employ dojo/aspect instead to call a function that will destroy the widget with the same ID before registry.add() is called:
require([
    "dojo/aspect",
    "dijit/registry"
], function(
    aspect,
    registry
) {
    aspect.before(registry, "add", function(widget) {
        if (registry.byId(widget.id)) {
            registry.byId(widget.id).destroy();
            // this warning can save you hours of debugging:
            console.warn("Widget with id==" + widget.id + " was destroyed to register a widget with the same id.");
        }
        return [widget];
    });
});
I was curious myself how to accomplish @mschr's solution without that override, so I created a jsFiddle to experiment: http://jsfiddle.net/phusick/feXVT/
What happens once you register a dijit is the following: it gets referenced in dijit.registry._hash:
function (widget) {
    if (hash[widget.id]) {
        throw new Error("Tried to register widget with id==" + widget.id + " but that id is already registered");
    }
    hash[widget.id] = widget;
    this.length++;
}
Now, every now and then you will have a ContentPane into which you put a widget programmatically (programmatically, as opposed to letting dojo.parser handle the cpane's unload and dereference/destroy the parser-instantiated widgets).
When this happens, you need to hook onto some form of 'unload', e.g. when you call cpane.set('content', foo) or cpane.set('href', bar). The hook is needed to destroy and unregister the widget instances you keep, otherwise you would have a memory leak in your program.
Normally, once an object has no references anywhere, it will get cleaned out of memory. However, with complex objects such as widgets, 'class variables' often hold a reference to something outside the widget's scope, which flags the widget as unsafe to delete for the garbage collector... Once you get to this point, you will know to perform proper lifecycles, but not before the concept is fully understood.
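As a minimal sketch of that hook (assuming an existing ContentPane cpane as in the set('content', ...) call above, and a programmatically created widget whose id you know; "myGrid" here is just an example id), you would destroy and unregister your own widget before replacing the pane's content, since the parser only cleans up widgets it instantiated itself:
require(["dijit/registry"], function(registry) {
    // destroy the programmatically created widget before the pane's
    // content is replaced, so its id is free to be registered again
    var old = registry.byId("myGrid"); // example id, not from the question
    if (old) {
        old.destroyRecursive();        // also removes it from dijit.registry
    }
    cpane.set("content", "<p>new content</p>");
});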
What you could do is override dijit.registry with your own handler and have any widgets that are duplicates destroyed automatically, like so:
// pull in registry in-sync and with globally scoped
// access (aka dijit.registry instead of dj_reg)
require({
    async: false,
    publishRequireResult: true
}, [
    "dijit.registry"
], function(dj_reg) {
    dijit.registry.add = function(widget) {
        // let's change this bit
        if (this._hash[widget.id]) {
            this._hash[widget.id].destroy(); // optionally destroyRecursive
            this.remove(widget.id);
        }
        this._hash[widget.id] = widget;
        this.length++;
    };
});
This is the location controller file that is going to be accessed by the HTML code.
import { Controller } from "@hotwired/stimulus"

export default class extends Controller {
  static targets = [ "visible", "map" ]

  mapTargetConnected(element) {
    this.name = "aaa"
  }

  add(event) {
    console.log(this.name) // this line logs that the variable is undefined
  }
}
Here is the HTML code:
<%= form_with(model: @location, local: false, url: location_path(), data: {controller: 'location', action: 'ajax:beforeSend->location#add'}) do |form| %>
....
<% end %>
This is the code for submitting the form via an Ajax request. If I access the this.name variable inside the add method or a click event, it says the variable is undefined… but if I assign the same name variable in the connect() method, then it works.
However, I want to assign the variable in the targetConnected method and use it in the add action method. Please suggest a solution or let me know if I'm doing something wrong.
Most likely the add action is being triggered before mapTargetConnected has run.
Stimulus will go through the DOM and match elements and their targets and then trigger the relevant someTargetConnected and connect lifecycle methods once the controller is set up.
However, this is not instant and there may be some nuance to how the timing works when you are working with other events.
You will need to work out when the actual map target is being added to the DOM and possibly do some logging to check that timing compared to when the ajax:beforeSend event triggers.
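For instance, a rough sketch of that logging (based on the controller shown above; the performance.now() timestamps are only there to compare when each callback fires):
import { Controller } from "@hotwired/stimulus"

export default class extends Controller {
  static targets = [ "visible", "map" ]

  mapTargetConnected(element) {
    // log when the target actually connects
    console.log("mapTargetConnected at", performance.now())
    this.name = "aaa"
  }

  add(event) {
    // compare with when the ajax:beforeSend action fires
    console.log("add at", performance.now(), "name:", this.name)
  }
}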
Sometimes, adding a setTimeout can help, as it ensures that the code provided to it runs 'last' (there is some nuance to this; technically it runs on the next turn of the event loop).
For example
add(event) {
  // here mapTargetConnected may not have run yet
  setTimeout(() => {
    // by this point, mapTargetConnected has hopefully now run
    console.log(this.name);
  });
}
It is hard to offer more help without a bit more specifics on what ajax:beforeSend is and when it triggers, along with what actually adds the map target to the DOM. It may be more helpful to write this question with the initially rendered HTML output (with the minimum parts to help guide the question).
In general, it is good to remember that in the browser, things do not happen instantly, while they may be fast there can be timing issues to be aware of.
I have an edit page (in a DurandalJS single page app), where I use the .canDeactivate lifecycle method to check if there are any changes to the record, and optionally prompt them for confirmation before leaving the page.
I also have a 'Save' and a 'View History' button. Is the correct thing to do to override the .canDeactivate method before calling router.navigate, to stop the modal popup from appearing?
For example:
self.onSave = function() {
    self.repository.updateItem(self.model).done(function() {
        self.canDeactivate = null; // Is this the correct way to do this?
        router.navigate("#/home");
    });
};
As this .canDeactivate will otherwise get called:
self.canDeactivate = function() {
    if (!self.model.hasChanges()) {
        return true;
    }
    return app.ShowMessage("Unsaved data will be lost", "Are you sure you wish to exit?", ["Yes", "No"]).done(function(result) {
        return result !== "No";
    });
};
Why don't you just set
self.model.hasChanges(false)
in your updateItem callback?
Then when your canDeactivate is called, it will return true.
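A minimal sketch of that (assuming hasChanges is a writable observable, as the call syntax in your canDeactivate suggests):
self.onSave = function() {
    self.repository.updateItem(self.model).done(function() {
        self.model.hasChanges(false); // canDeactivate will now return true
        router.navigate("#/home");
    });
};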
Also, you seem to have an error in your ShowMessage callback. I think you meant to do:
return result != "No";
I don't think the way Durandal decides whether to attempt to call a canDeactivate function is fully specified, other than the fact that if it's not on the view model, it won't try. Hence, even if it works as is, a future version of the framework could change its check to something like if ("canDeactivate" in viewModel) viewModel.canDeactivate(...); without further tests, and your code would break.
This is unlikely, but if you want to guard against it, you should delete self.canDeactivate instead of assigning null to it.
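For example (a sketch of your onSave with that change, your hypothetical names kept as they are):
self.onSave = function() {
    self.repository.updateItem(self.model).done(function() {
        // remove the property entirely rather than nulling it, so even a
        // hypothetical `"canDeactivate" in viewModel` check would pass
        delete self.canDeactivate;
        router.navigate("#/home");
    });
};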
Quote from the documentation:
To participate in the lifecycle, implement any (or none) of the
functions below on the object that you set the activator to (...)
Current implementation (activator.js, L126, 1eecbc2d3f84dc42eb7304bde761d88f300d8951):
if (item && item.canDeactivate) {
So it only checks if it's truthy (which would indicate using null works fine currently, too).
If you want to discuss the pattern, I don't see anything wrong with it, as long as it makes sense to you and everyone who should read the code.
You're not supposed to be activating and deactivating views programmatically in any critical path, so performance should be irrelevant either way (flag on view model or deletion of canDeactivate).
This was originally posted on discuss.emberjs.com. See:
http://discuss.emberjs.com/t/what-is-the-proper-use-of-store-filter-store-find-for-infinite-scrolling/3798/2
but that site seems to be getting worse and worse in terms of content quality these days, so I'm hoping Stack Overflow can rescue me.
Intent: Build a page in Ember with Ember Data implementing infinite scrolling.
Background Knowledge: Based on the emberjs.com API docs on Ember Data, specifically the store.filter and store.find methods (see: http://emberjs.com/api/data/classes/DS.Store.html#method_filter), I should be able to set the model hook of a route to the promise of a store.filter operation. The promise should resolve to a filtered record array, which is an array of items from the store filtered by a filter function that is supposed to be updated whenever new items are pushed into the store. By combining this with the store.find method, which pushes items into the store, the filtered record array should automatically update with the new items, thus updating the model and resulting in new items showing on the page.
For instance, assume we have a Questions Route, Controller and a model of type Question.
App.QuestionsRoute = Ember.Route.extend({
    model: function (urlParams) {
        return this.get('store').filter('question', function (q) {
            return true;
        });
    }
});
Then we have a controller with some method that calls store.find. This could be triggered by some event/action, whether it be detecting scroll events or the user explicitly clicking to load more; regardless, this method would be called to load more questions.
Example:
App.QuestionsController = Ember.ArrayController.extend({
    ...
    loadMore: function (offset) {
        return this.get('store').find('question', { skip: offset });
    }
    ...
});
And the template to render the items:
...
{{#each question in controller}}
{{question.title}}
{{/each}}
...
Notice that with this method we do NOT have to add a function to the store.find promise which explicitly calls this.get('model').pushObjects(questions); in fact, trying to do that once you have already returned a filtered record array as the model does not work. Either we manage the content of the model manually, or we let Ember Data do the work, and I would very much like to let Ember Data do the work.
This is a very clean API; however, it does not seem to work the way I've written it. Based on the documentation I cannot see anything wrong.
Using the Ember Inspector tool from Chrome I can see that the new questions from the second find call are loaded into the store under the 'question' type, but the page does not refresh until I change routes and come back. It seems like this is simply a problem with observers, which made me think this would be a bug in Ember Data, but I didn't want to jump to conclusions like that until I asked whether I'm using Ember Data as intended.
If someone doesn't know exactly what is wrong but knows how to use store.push/pushMany to recreate this scenario in a jsbin, that would also help; I'm just not familiar with how to use the lower-level methods on the store.
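For illustration, roughly what I have in mind is something like the following (a sketch only; I'm assuming the store.push(type, hash) signature from the beta DS.Store docs, and the ids/titles are made up):
App.QuestionsRoute = Ember.Route.extend({
    model: function () {
        // same live-updating filter as above
        return this.store.filter('question', function () { return true; });
    },
    actions: {
        simulatePush: function () {
            // push raw records straight into the store (no server round-trip)
            // to see whether the filtered record array picks them up
            this.store.push('question', { id: 101, title: 'Pushed question 101' });
            this.store.push('question', { id: 102, title: 'Pushed question 102' });
        }
    }
});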
Help is much appreciated.
I just made this pattern work for myself, but in the "traditional" way, i.e. without using store.filter().
I managed the "loadMore" part in the route itself:
actions: {
    loadMore: function () {
        var model = this.controller.get('model'), route = this;
        if (!this.get('loading')) {
            this.set('loading', true);
            this.store.find('question', {offset: model.get('length')}).then(function (records) {
                model.addObjects(records);
                route.set('loading', false);
            });
        }
    }
}
Since you already tried the traditional way (from what I see in your post on discuss), it seems that the key part is to use addObjects() instead of pushObjects() as you did.
For the record, here is the relevant part of my view that triggers the loadMore action:
didInsertElement: function() {
    var controller = this.get('controller');
    $(window).on('scroll', function() {
        if ($(window).scrollTop() > $(document).height() - ($(window).height() * 2)) {
            controller.send('loadMore');
        }
    });
},
willDestroyElement: function() {
    $(window).off('scroll');
}
I am now looking to move the loading property to the controller so that I get a nice loader for the user.
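Something along these lines is what I have in mind for that (a sketch only, not tested; the controller just holds a loading flag that the template can bind a spinner to):
App.QuestionsController = Ember.ArrayController.extend({
    // bound in the template, e.g. {{#if loading}}<div class="spinner"></div>{{/if}}
    loading: false
});

App.QuestionsRoute = Ember.Route.extend({
    actions: {
        loadMore: function () {
            var model = this.controller.get('model'),
                controller = this.controller;
            if (!controller.get('loading')) {
                controller.set('loading', true);
                this.store.find('question', { offset: model.get('length') }).then(function (records) {
                    model.addObjects(records);
                    controller.set('loading', false);
                });
            }
        }
    }
});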
Let's say we have a simple Backbone View, like this:
class MyView extends Backbone.View
  events:
    'click .save': 'onSave'

  onSave: (event) ->
    event.preventDefault()
    # do something interesting
I want to test that event.preventDefault() gets called when I click on my element with the .save class.
I could test the implementation of my callback function, pretty much like this (Mocha + Sinon.js):
it 'prevents default submission', ->
  event = { preventDefault: sinon.spy() }
  myView.onSave(event)
  event.preventDefault.called.should.be.true
The code above is only a rough sketch to get the idea across; written properly, it works. My problem is that this way I'm testing the implementation and not the functionality.
So, my question really is: how can I verify that preventDefault() gets called, supposing I just trigger a click event on my .save element?
it 'prevents default submission', ->
  myView.$('.save').click()
  # assertion here ??
Thanks as always :)
Try adding a listener on the view's $el, then triggering a click on .save, and verifying that the event hasn't bubbled up to the view's element.
var view = new MyView();
var called = false;
function callback() { called = true; }
view.render();
// Attach a listener on the view's element
view.$el.on('click', callback);
// Test
view.$('.save').trigger('click');
// Verify
expect(called).toBeFalsy();
So you want to test that preventDefault is called when a click event is generated, correct?
Couldn't you do something like this (in JavaScript; I'll leave the CoffeeScript as an exercise ;))?
var preventDefaultSpy;

before(function() {
    preventDefaultSpy = sinon.spy(Event.prototype, 'preventDefault');
});

after(function() {
    preventDefaultSpy.restore();
});

it('should call "preventDefault"', function() {
    myView.$('.save').click();
    expect(preventDefaultSpy.callCount).to.equal(1);
});
You might want to call preventDefaultSpy.reset() just before creating the click event so the call count is not affected by other things going on.
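For instance (a sketch assuming Mocha's beforeEach; the spy is the one created in the before block above):
beforeEach(function() {
    // clear any calls recorded during setup/rendering so the
    // assertion only counts the click we trigger in the test
    preventDefaultSpy.reset();
});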
I haven't tested it, but I believe it would work.
Edit: in other words, since my answer is not that different from part of your question: I think your first approach is OK. By spying on Event.prototype you don't call into myView directly, so it acts more as a black box, which might alleviate some of your concerns.
I'm working on a pseudo-plugin (really just a namespaced initializer on the jQuery object) and I'm having a bit of trouble with .proxy() and .queue() (seemingly two of the most misunderstood methods around).
Anyways, I thought I had the logic sorted out; the function $.cb() takes a map of functions as such:
$.cb({
    'show': function(){ },
    'hide': function(){ },
    'open': function(){ },
    'close': function(){ },
    'beforeUpdate': function(){ },
    'afterUpdate': function(){ }
});
These functions (should) contain animation sequences applied to $(this), whose context has internally been changed, via .proxy(), to the respective element(s). They are stored in a settings variable available to all methods of the "plugin".
Internally, some namespaced event handlers are attached via .live({ }):
// ...
'cb.hide': function(event){
    if (event.isPropagationStopped()) {
        return false;
    }
    event.stopPropagation();
    $.proxy(settings.hide, this)();
    $(this).hide();
},
'cb.update': function(event, html){
    if (event.isPropagationStopped()) {
        return false;
    }
    event.stopPropagation();
    $.proxy(settings.beforeUpdate, this)();
    $(this).html(html);
    $.proxy(settings.afterUpdate, this)();
},
// ...
Anyways, the purpose is that there is inherent functionality brought to the table by this "plugin", but the implementer can pass the function map to opt for different transitional animations.
The problem is that I can't seem to get these functions to queue properly; different ones take precedence, etc. I've tried mucking around with .queue(), but I can't seem to get anything right with it:
// in cb.update
var $this = $(this);
$(this).queue(function(next){
    $.proxy(settings.beforeUpdate, $this)();
    next();
}).queue(function(next){
    $this.html(html);
    next();
}).queue(function(next){
    $.proxy(settings.afterUpdate, $this)();
}).dequeue();
The problem is especially prevalent with the 'cb.update' event, as the order should be:
beforeUpdate is called (animation sequence occurs and completes)
The element's contents are updated via .html()
afterUpdate is called (animation sequence occurs and completes)
What's actually happening is:
The element's contents are updated via .html()
beforeUpdate is called (animation sequence occurs and completes)
afterUpdate is called (animation sequence occurs and completes)
So, given that the supplied animations are simply .fadeOut() and .fadeIn() for beforeUpdate and afterUpdate respectively, it's updating the contents, then fading out and in.
So, any suggestions on this sort of implementation? How can I ensure the proper ordering of the events/animations? Have I gone a wildly stupid route in terms of trying to implement such a feature?
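For illustration, here is one direction I've sketched out (not working code in the plugin yet, and it assumes a change to the callback contract that isn't in the current API: each function in the map would accept a done argument and invoke it when its animation finishes, e.g. 'beforeUpdate': function(done){ $(this).fadeOut(400, done); }). Using a separately named queue keeps it from colliding with the fx queue that fadeOut/fadeIn themselves use:
// in cb.update -- sketch only, assumes the callbacks accept a "done" argument
var $this = $(this);
$this.queue('cb', function(next) {
    // next() is only called once beforeUpdate's animation completes
    $.proxy(settings.beforeUpdate, $this)(next);
}).queue('cb', function(next) {
    $this.html(html);
    next();
}).queue('cb', function(next) {
    $.proxy(settings.afterUpdate, $this)(next);
}).dequeue('cb');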