I want to save the results of a query made with ItemFileReadStore into an array called boxes, but the returned array is empty (presumably because fetch runs asynchronously).
The gotItems function builds the array the way I want, but I can't get it back out for general use! I could build the rest of my functionality into gotItems, but that would make my code unpretty.
How do I return an array from the gotItems function for general use in my JavaScript?
function getContentFile() {
    contentStore = new dojo.data.ItemFileReadStore({
        url: '../config/content.json',
        preventCache: true
    });
    var boxes = new Array();
    contentStore.fetch({ query: { partner: 'enabled' }, onItem: gotItems });
    return boxes;
}

function gotItems(item) {
    boxes.push(contentStore.getValue(item, 'title'));
    console.log(boxes);
    return boxes;
}

dojo.addOnLoad(function() {
    boxes = getContentFile();
    console.log(boxes);
    fadeIn('header', 500, 0);
});
Welcome to the world of asynchronous operations.
You'll need to do it with "continuation-style" programming. ItemFileReadStore's fetch operation is asynchronous -- as you already know, since you pass it the gotItems continuation.
contentStore.fetch({query: {partner: 'enabled'}, onItem: gotItems }) returns immediately, so boxes is still empty at that point (JavaScript is single-threaded). gotItems runs only after the data has arrived, which is after the function passed to dojo.addOnLoad has already returned.
You have to put your handling code:
console.log(boxes);
fadeIn('header', 500, 0);
inside the continuation gotItems itself. For example, something like:
function gotItems(items) {
    var boxes = [];
    dojo.forEach(items, function(box) {
        boxes.push(contentStore.getValue(box, 'title'));
    });
    console.log(boxes); // You probably need to store "boxes" somewhere instead of just logging it
    fadeIn('header', 500, 0);
}
Also, note that the array of matched items is passed to the onComplete callback (onItem is called once per item), so hook gotItems up as onComplete and iterate over the array.
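For example, the fetch call from the question would become something like this (a small sketch based on the question's own query):

contentStore.fetch({
    query: { partner: 'enabled' },
    onComplete: gotItems   // gotItems receives the full array of matched items
});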
You don't have access to the results when the function returns because, as you guessed, the fetch operation executes asynchronously.
You can either put the code that uses the results in your gotItems() function (as answered by Stephen), or you can use Deferreds and Promises. IMHO, that's a better alternative since it lets you organize your code better (once you get used to the idioms of dealing with promises, the code reads more naturally) and it allows you to transparently execute both synchronous and asynchronous operations.
See these two Dojo tutorials on the subject.
In your case, a possible solution involving deferreds would read like:
function getContentFile() {
    contentStore = new dojo.data.ItemFileReadStore({
        url: '../config/content.json',
        preventCache: true
    });
    var dfd = new dojo.Deferred();
    var boxes = new Array();
    contentStore.fetch({
        query: { partner: 'enabled' },
        onItem: function(item) {
            boxes.push(contentStore.getValue(item, 'title'));
        },
        onComplete: function() {
            // resolve the promise and execute the function in then()
            dfd.callback(boxes);
        }
    });
    return dfd;
}

dojo.addOnLoad(function() {
    getContentFile().then(function(boxes) {
        console.log(boxes);
        fadeIn('header', 500, 0);
    });
});
I'm trying to create an object in one part of the Vuex store and then pass its id to another object, and I'm not sure how to do that properly since mutations can't return anything (in this case, the id).
The two store objects look like this:
// store/report.js
const state = {
name: 'Untitled Report',
subReportIds: []
};
// store/subReport.js
const state = { ... }
And I'd like this action to create a blank report, then a blank subreport, and then assign the subreport id to the newly created report. (Subreports are independent entities and can be used by multiple reports, hence the separate area in the store.)
const actions = {
createNewReport({ state, commit }) {
commit(mutationTypes.CREATE_NEW_REPORT)
// below doesn't work - i can't get return from mutation
let newSubreportId = commit(mutationTypes.ADD_NEW_SUBREPORT)
// if this worked, i'd then do something like
commit(mutationTypes.ADD_SUBREPORT_TO_REPORT, newSubreportId)
}
};
How can I achieve the above?
The best way to accomplish this, in my opinion, is to dispatch actions instead of committing the mutations directly. If you look at the methods in the Vuex source, commit executes with no return value (it's effectively void), while dispatch returns whatever you return from the action (which is a function).
For my actions, I always return a promise so that I can compose them like you mention above. Here is an example.
fetchSomething({ commit }) {
    return mockApiGetIds()
        .then(response => {
            commit({
                type: SOME_MUTATION,
                ids: response
            });
            return response;
        });
},
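Applied to the question, a rough sketch could look like this (the action and mutation names, the id generator, and the module layout/namespacing are assumptions; adjust them to your store):

// store/report.js -- sketch only
const actions = {
    createNewReport({ commit, dispatch }) {
        commit(mutationTypes.CREATE_NEW_REPORT);
        // dispatch returns whatever the action returns -- here, a promise of the new id
        return dispatch('addNewSubreport').then(newSubreportId => {
            commit(mutationTypes.ADD_SUBREPORT_TO_REPORT, newSubreportId);
            return newSubreportId;
        });
    },
    addNewSubreport({ commit }) {
        const newSubreportId = generateId(); // hypothetical id generator
        commit(mutationTypes.ADD_NEW_SUBREPORT, { id: newSubreportId });
        return Promise.resolve(newSubreportId);
    }
};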
Disclaimer: I don't know if it is truly a good idea, but at least it seems to work, and to me it feels prettier than having to use actions and promises, or generating the id in the action...
You can pass an argument to your mutation. To return a value from a mutation (like a newly created id), I write it to a placeholder on that argument:
someMutation(state, arg) {
    //...
    arg.out = {
        status: "succeed"
    };
}
//...
var arg = { /* ... input for the mutation ... */ };
this.$store.commit('someMutation', arg);
if (arg.out.status !== "succeed") console.log("ERROR");
Is there a nice way to prevent duplicate routes from being registered in Express? I have a pretty large application with hundreds of routes across different files, and it gets difficult to know whether I've already registered a certain route when I go to add a new one. For example, I'd like to throw an error when Express gets to routes487.js:
File: routes1.js

var ctrl = require('../controllers/testctrl');
var auth = require('../libs/authentication');

module.exports = function (app) {
    app.get('/hi', auth.getToken, ctrl.hi);
    app.get('/there', auth.getToken, ctrl.there);
};

File: routes487.js

var ctrl = require('../controllers/testctrl487');
var auth = require('../libs/authentication');

module.exports = function (app) {
    app.get('/hi', auth.getToken, ctrl.hi487);
};
You could try a custom solution by wrapping express methods with the validation. Consider the following modification to your express app:
// route-validation.js
module.exports = function (app) {
    var existingRoutes = {},
        originalMethods = {};

    // Returns true if the route is already registered.
    function routeExists(verb, path) {
        return existingRoutes[verb] &&
               existingRoutes[verb].indexOf(path) > -1;
    }

    function registerRoute(verb, path) {
        if (!existingRoutes[verb]) existingRoutes[verb] = [];
        existingRoutes[verb].push(path);
    }

    // Return a new app method that will check for repeated routes.
    function validatedMethod(verb) {
        return function () {
            // If the route exists, app.VERB will throw.
            if (routeExists(verb, arguments[0])) {
                throw new Error("Can't register duplicate handler for path " + arguments[0]);
            }
            // Otherwise, the route is saved and the original express method is called.
            registerRoute(verb, arguments[0]);
            return originalMethods[verb].apply(app, arguments);
        };
    }

    ['get', 'post', 'put', 'delete', 'all'].forEach(function (verb) {
        // Save original methods for internal use.
        originalMethods[verb] = app[verb];
        // Replace them with our own route-validating methods.
        app[verb] = validatedMethod(verb);
    });
};
You just need to pass your app to this function after creating it, and duplicate route checking will be in place. Note that you might need other "verbs" (OPTIONS, HEAD).
If you don't want to mess with Express's methods (we don't know whether or how Express itself or middleware modules use them), you can use an intermediate layer (i.e., actually wrap your app object instead of modifying its methods). I feel that would be a better solution, but I'm too lazy to type it right now :)
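For reference, wiring the wrapper above into an app might look like this (a sketch; the file layout and route file paths are assumptions based on the question):

// app.js -- usage sketch
var express = require('express');
var validateRoutes = require('./route-validation');

var app = express();
validateRoutes(app); // wraps app.get/post/... with duplicate checks

require('./routes/routes1')(app);
require('./routes/routes487')(app); // throws, because GET '/hi' is already registered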
I am trying to insert multiple images into CouchDB via nano, using Express 4 and formidable. I can access and insert individual files without difficulty using formidable and nano; however, when I try to insert one file after another, I get conflict errors. I am very new to JS and Node, and I know my understanding of callbacks and asynchronous functions is limited. This is what I have at the moment. Any help would be greatly appreciated.
function uploadImage(req, res) {
    var form = new formidable.IncomingForm(),
        files = [],
        fields = [],
        uploadcount = 1;

    form.on('field', function(field, value) {
        fields.push([field, value]);
    });

    form.on('file', function(field, file) {
        files.push([field, file]);
        var docid = fields[0][1];
        getRevision();

        function getRevision() {
            dbn.get(docid, { revs_info: true }, function(err, body) {
                if (!err) {
                    exrev = body._rev;
                    insertImage(exrev);
                } else {
                    console.log(err);
                }
            });
        }

        function insertImage(exrevision) {
            var exrev = exrevision;
            fs.readFile(file.path, function(err, data) {
                if (err) {
                    console.log(err);
                } else {
                    var imagename = docid + "_" + uploadcount + ".png";
                    dbn.attachment.insert(docid, imagename, data, 'image/png',
                        { rev: exrev }, function(err, body) {
                            if (!err) {
                                uploadcount++;
                            } else {
                                console.log(err);
                            }
                        });
                }
            });
        }
    });

    form.on('end', function() {
        console.log('done');
        res.redirect('/public/customise.html');
    });

    form.parse(req);
}
I found a solution by first dumping the files into a temporary directory, then inserting them into CouchDB via nano, all within a single function. I couldn't find a way to pause the file stream to wait for the CouchDB response, so this sequential method seems adequate.
This is a problem of handling asynchronous calls. Because each attachment insert requires the doc's current rev number, you can't do the inserts in parallel. You must insert a new attachment only after you get a response for the previous one.
You may use the promise/deferred mechanism to do this, but I personally have solved a similar problem using a package called "async". With async, you can use async.eachSeries() to make these asynchronous calls in series.
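For illustration, a rough sketch of that approach (dbn, fs, and an array of formidable file objects are assumptions carried over from the question's code):

var async = require('async');

function insertImages(docid, files, done) {
    var uploadcount = 1;
    // Insert one attachment at a time; each insert starts only after CouchDB
    // has answered the previous one, so the rev passed in is always current.
    async.eachSeries(files, function (file, next) {
        dbn.get(docid, function (err, body) {
            if (err) return next(err);
            fs.readFile(file.path, function (err, data) {
                if (err) return next(err);
                var imagename = docid + '_' + (uploadcount++) + '.png';
                dbn.attachment.insert(docid, imagename, data, 'image/png',
                    { rev: body._rev }, next);
            });
        });
    }, done); // called once all inserts have finished, or with the first error
}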
Another point is about the revision number: you can use the lighter-weight db.head() function instead of db.get(). The rev number is presented in the "etag" header. You can get the rev like this:
// get Rev number
db.head(bookId, function(err, _, headers) {
    if (!err) {
        var rev = eval(headers.etag);
        // do whatever you need to do with the rev number
        // ...
    }
});
In addition, after each attachment insert, the response from CouchDB will look something like this:
{"ok":true,"id":"b2aba1ed809a4395d850e65c3ff2130c","rev":"4-59d043853f084c18530c2a94b9a2caed"}
The rev property gives the new rev number, which you can use when inserting the next attachment.
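For example, a small sketch of threading the rev from each response into the next insert (dbn is reused from the question; the remaining array of {name, data} pairs is an assumption):

function insertNext(docid, rev, remaining) {
    if (remaining.length === 0) return;
    var att = remaining[0];
    dbn.attachment.insert(docid, att.name, att.data, 'image/png', { rev: rev },
        function (err, body) {
            if (err) return console.log(err);
            // body.rev is the new revision; use it for the next attachment
            insertNext(docid, body.rev, remaining.slice(1));
        });
}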
This is Worklight 6.1 code with Dojo, testing with Chrome and the standard dev server (Liberty). What I want this code to do is query a collection, which should have 0 or 1 entries, and either retrieve the one entry if it exists or create an entry with a supplied set of values. What I'm trying to do is store a URL, id, and password for a service. If this is the first time the app has run after installation, I want to prompt the user for this info and store it (the code to prompt the user will be added later). If it is not the first run of the app, then the values should already be stored in the collection and be retrieved. I'll add code later to allow the user to change and update the values.
What is happening now is that the .add never seems to execute, and the execution sequence I'm seeing through the breakpoints I've set also seems weird.
Here is the setup code
// set up the jsonStore
var collectionName = 'servers';
var collections = {};
collections[collectionName] = {};
// initialize the default jsonStore Monitor credentials
var jsonURL = 'http://myserver.com:9082';
var jsonUser = 'keyser';
var jsonPassword = 'soze';
And here is the problem code
// Initialize the JSONStore
WL.JSONStore.init(collections)
    .then(function() {
        console.log("store initialized");
        // query the store
        var query = {_id: 0};
        WL.JSONStore.get(collectionName)
            .find(query)
            .then(function(arrayResults) {
                console.log("credentials retrieved " + arrayResults.length);
                if (arrayResults.length > 0) {
                    // retrieve the credentials from the json object
                    console.log("password retrieved " + arrayResults[0].json.password);
                    jsonURL = arrayResults[0].json.url;
                    jsonUser = arrayResults[0].json.user;
                    jsonPassword = arrayResults[0].json.password;
                } else {
                    // load the default credentials into jsonStore
                    var credentials = {url: jsonURL, user: jsonUser, password: jsonPassword};
                    WL.JSONStore.get(collectionName)
                        .add(credentials)
                        .then(function() {
                            console.log("credentials loaded " + credentials.url);
                        })
                        .fail(function(errorObject) {
                            console.log("credential load failed");
                        });
                } // end of else
                // Query the model list
                queryModels();
            }) // end of get(collectionName) then
            .fail(function(errorObject) {
                console.log("credentials not retrived");
            }); // end of get(collectionName) fail
    }) // end of init(collections) then
    .fail(function(errorObject) {
        console.log("store init failed" + errorObject);
    }); // end of init(collections) fail
}); // end of ready
When I step thru it flows in this sequence.
init(collections)
Then it jumps immediately to the "end of ready". Seems weird but I'm a rookie so maybe it's OK?
Back to the get(collectionName)
to the .then and logs "credentials retrieved" with an array length of 0
To the else clause of the statement
And it breaks on the get(collectionName) in the else clause. So far so good
From here it jumps to queryModels(), skipping over the .add (far as I can tell)
Then it returns to the .then under the 2nd get and logs "credentials loaded"
At this point execution ends "normally", except:
The item never gets added to the collection, and
The queryModels call runs before I expect it to; I want it to run after the item is added.
By now it's probably obvious that I'm a rookie, so I'm probably making a rookie mistake. I know I'm dealing with deferreds here with the .then and .fail calls, and I'm nesting them, which seems to be an accepted technique, but I'm not getting the execution sequence I want.
I've tried this code with the 2nd get(collectionName) commented out, in a couple of formats, and it barfs both ways:
// WL.JSONStore.get(collectionName)
.add(credentials)
and
// WL.JSONStore.get(collectionName)
servers.add(credentials)
Any help greatly appreciated. Thanks!
Here's my "answer", based on what I learned from the other answers below.
Bluewings' and cnandreu's answers were both very helpful, and I got it working. The main issues I had turned out to be:
I had failed to grasp that slot 0 in a collection equates to a document _id key of 1. I was trying to query _id = 0 and never getting a hit. The add to the collection was working all along; I was just not reading it correctly.
Moving queryModels into the if/else clauses (Bluewings' suggestion) worked, and reading the material cnandreu referenced (very worthwhile) explained why it worked. Thanks!
The tip about the "weird" execution sequence being an artifact of the breakpoints was also very useful; I quit chasing that red herring.
Here is a working draft of the code after fixing these issues. I did not implement all of the suggestions yet, but probably will as I polish this up. Thanks again.
// Initialize the JSONStore - you have to .init to start the collection before you can read it.
WL.JSONStore.init(collections)
    .then(function() {
        console.log("store initialized");
        // query the store
        var query = {_id: 1};
        WL.JSONStore.get(collectionName) // get 1
            .find(query)
            .then(function(arrayResults) {
                console.log("credentials retrieved " + arrayResults.length);
                if (arrayResults.length > 0) {
                    // retrieve the credentials from the json object
                    console.log("password retrieved " + arrayResults[0].json.password);
                    jsonURL = arrayResults[0].json.url;
                    jsonUser = arrayResults[0].json.user;
                    jsonPassword = arrayResults[0].json.password;
                    queryModels();
                } else {
                    // load the default credentials into jsonStore
                    var credentials = {url: jsonURL, user: jsonUser, password: jsonPassword};
                    WL.JSONStore.get(collectionName) // get 2
                        .add(credentials)
                        .then(function(numberOfDocumentsAdded) {
                            console.log("Number of Docs Added" + numberOfDocumentsAdded);
                            queryModels();
                        }); // end of .add then
                } // end of else
            }); // end of get(collectionName) 1 then
    }) // end of init(collections) then
    .fail(function(errorObject) {
        console.log("something failed" + errorObject);
    }); // end of init(collections) fail
All the JSONStore calls (like add, init, etc.) are asynchronous. That is why you are seeing those weird flows when you step through with breakpoints.
To get the execution sequence you want, move the queryModels() call so it runs once the credentials are loaded:
WL.JSONStore.get(collectionName)
    .add(credentials)
    .then(function() {
        console.log("credentials loaded " + credentials.url);
        queryModels();
    })
My suggestion is the same as Bluewings', but I wanted to share some pseudocode:
function handleCredentials(arrayResults, callback) {
    if (arrayResults.length > 0) {
        //.... synchronous code here.
        setTimeout(function () {
            callback();
        }, 0);
    } else {
        WL.JSONStore.get(collectionName)
            .add({url: jsonURL, user: jsonUser, password: jsonPassword})
            .then(function() {
                callback();
            });
    }
}

WL.JSONStore.init(collections)
    .then(function() {
        WL.JSONStore.get(collectionName)
            .find({_id: 1})
            .then(function (arrayResults) {
                handleCredentials(arrayResults, function () {
                    queryModels();
                });
            });
    });
Notice I created a handleCredentials function; it will either do a synchronous operation (setting some variables with the result from the find call) or an asynchronous operation (calling add to store the credentials). A setTimeout of 0 is used to preserve the async behavior; this is explained in detail here. After handleCredentials has finished, you call the queryModels function via the callback pattern.
As an aside, I recommend reading this blog post: What's so great about JavaScript Promises?, especially the "Error Handling" section. You don't need to add a .fail to every promise; you can get away with fewer failure functions, and the error object should provide enough detail about what went wrong. JSONStore error objects are documented here; notice they contain the source of the failure (e.g. src: 'find').
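For instance, a sketch of that idea applied to the pseudocode above (a single .fail on the find chain, relying on the error object instead of separate failure handlers):

WL.JSONStore.get(collectionName)
    .find({_id: 1})
    .then(function (arrayResults) {
        handleCredentials(arrayResults, queryModels);
    })
    .fail(function (errorObject) {
        // errorObject.src identifies which call failed (e.g. 'find')
        console.log("JSONStore error: " + errorObject);
    });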
I'd like to use custom headers to provide some more information about the response data. Is it possible to get the headers of a response from a Dojo DataGrid hooked up to a JsonRest store via an object store (Dojo 1.7)? I see this is possible when you are making the XHR request yourself, but in this case it is being made by the grid.
The API provides an event for a response error that returns the response object:
on(this.grid, 'FetchError', function (response, req) {
    var header = response.xhr.getAllResponseHeaders();
});
Using this I am able to access my custom response headers. However, there doesn't appear to be a way to get the response object when the request is successful. I have been using the undocumented private event _onFetchComplete with aspect.after; however, this does not give access to the response object, just the returned values:
aspect.after(this.grid, '_onFetchComplete', function (response, request) {
    // unable to get headers; response here is the returned values
}, true);
Edit:
I managed to get something working, but I suspect it is very over-engineered and someone with a better understanding could come up with a simpler solution. I ended up using aspect.around so I could get hold of the deferred object in the REST store that is returned to the object store. There I added a new function to the deferred to return the headers. I then hooked into the onFetch of the object store using dojo.hitch (because I needed the results in the current scope). It seems messy to me:
aspect.around(restStore, "query", function (original) {
    return function (method, args) {
        var def = original.call(this, method, args);
        def.headers = def.then(function () {
            var hd = def.ioArgs.xhr.getResponseHeader("myHeader");
            return hd;
        });
        return def;
    };
});

aspect.after(objectStore, 'onFetch', lang.hitch(this, function (response) {
    response.headers.then(lang.hitch(this, function (evt) {
        var headerResult = evt;
    }));
}), true);
Is there a better way?
I solved this today after reading this post, thought I'd feed back.
dojo/store/JsonRest solves it also but my code ended up slightly different.
var MyStore = declare(JsonRest, {
    query: function () {
        var results = this.inherited(arguments);
        console.log('Results: ', results);
        results.response.then(function (res) {
            var myheader = res.xhr.getResponseHeader('My-Header');
            doSomethingWith(myheader);
        });
        return results;
    }
});
So you override the normal query() function, let it execute and return its promise, and attach your own listener to the resolution of its 'response' member, in which you can access the xhr object that carries the headers. This lets you inspect the JsonRest result while still fitting nicely into the chain of all the query() invokers.
One word of warning: this code was modified for posting here, and it actually inherited from another intermediary class that also overrode query(), but the basics here are pretty sound.
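Usage might then look something like this (a sketch; the target URL and the grid/ObjectStore wiring are assumptions, adjust to your setup):

// assuming "dojo/data/ObjectStore" is loaded as ObjectStore and grid is your DataGrid
var restStore = new MyStore({ target: '/services/content/' }); // hypothetical endpoint
grid.setStore(new ObjectStore({ objectStore: restStore }));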
If what you want is to get info from the server, a custom key-value pair in a cookie can also be a solution. That was my case: at first I was looking for a custom response header, but I couldn't make it work, so I went the cookie way, getting the info after the grid data is fetched:
dojo.connect(grid, "_onFetchComplete", function () {
    doSomethingWith(dojo.cookie("My-Key"));
});
This is useful, for example, to present a SUM(field) over all rows of a paginated datagrid, not only those on the current page. On the server you can compute the COUNT and the SUM: the COUNT is sent in the Content-Range header and the SUM can be sent in the cookie.