Storing results of a SQL SELECT query in a var in Node.js

I'm looking for a way to store the results of this select query like a "rank" chart for a game but I'm not sure if what I'm encountering is an async issue or a data-type issue or something else entirely. Any thoughts?
var ranksVar = [];
db.all("select * from user", function(err, rows) {
  if (err) {
    throw err;
  } else {
    setValue(rows);
  }
});

function setValue(value) {
  ranksVar = value;
  console.log(ranksVar);
}
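For reference, db.all returns immediately and ranksVar is only assigned later, inside the callback. A minimal sketch of that ordering (same db object and table assumed):
var ranksVar = [];

db.all("select * from user", function (err, rows) {
  if (err) throw err;
  ranksVar = rows;
  console.log("(2) inside the callback:", ranksVar.length); // runs second: rows are available here
});

console.log("(1) right after db.all:", ranksVar.length); // runs first: still an empty array, the query has not finished yet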

I found a useful post about using SQLite with Node.js; it gives the basic examples needed to understand how to use it. Here it is: SQLite NodeJS
In your case, look under the data query section.
Here is the example code:
const sqlite3 = require('sqlite3').verbose();

// open the database
let db = new sqlite3.Database('./db/chinook.db');

let sql = `SELECT DISTINCT Name name FROM playlists
           ORDER BY name`;

db.all(sql, [], (err, rows) => {
  if (err) {
    throw err;
  }
  rows.forEach((row) => {
    console.log(row.name);
  });
});

// close the database connection
db.close();
As you can see, the rows variable is simply an array of row objects returned by the module. To get the data into your own variable, iterate over it and push each row into your array.
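A minimal sketch of that idea applied to the original question (sqlite3 and the same user table assumed):
var ranksVar = [];

db.all("select * from user", function (err, rows) {
  if (err) throw err;
  rows.forEach(function (row) {
    ranksVar.push(row); // copy each row object into your own array
  });
  console.log(ranksVar); // ranksVar is only populated once this callback has run
});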
Cheers 😁 !

I figured out the issue. It was a data type/async issue: I was trying to print, as a string, an array that was still undefined at that point.
I went back into the loop and used the JSON.stringify method to display the array objects correctly.
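For completeness, an alternative sketch that sidesteps the timing problem by wrapping db.all in a Promise (plain Node, nothing beyond sqlite3 assumed):
function getRanks(db) {
  return new Promise(function (resolve, reject) {
    db.all("select * from user", function (err, rows) {
      if (err) reject(err);
      else resolve(rows);
    });
  });
}

getRanks(db)
  .then(function (rows) {
    console.log(JSON.stringify(rows)); // the full array, ready to use
  })
  .catch(function (err) {
    console.error(err);
  });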

Related

Inserting Data to Wix via JSON/REST API: WD_PERMISSION_DENIED

I'm trying to insert data to my Wix collection using the API. I'm using a POST function and am posting a JSON document. It's supposed to simply add a new row to a database containing 1 value.
Here is the http-functions.js which I can trigger without issues (it's more or less a copy of the example from the documentation):
import {created, serverError} from 'wix-http-functions';
import wixData from 'wix-data';

export function post_peopleCount(request) {
  let options = {
    "headers": {
      "Content-Type": "application/json"
    }
  };
  // get the request body
  return request.body.text()
    .then((body) => {
      // insert the item in a collection
      return wixData.insert("NumberOfPeopleDB", JSON.parse(body));
    })
    .then((results) => {
      options.body = {
        "inserted": results
      };
      return created(options);
    })
    // something went wrong
    .catch((error) => {
      options.body = {
        "error": error
      };
      return serverError(options);
    });
}
The database, the JSON I am posting, the error I am getting, and the permissions I have set for the collection are shown in the screenshots.
Do you know why I might be getting that "WD_PERMISSION_DENIED" and 500 Server Error? (The data does not get entered.)
Thanks!
My friend, it's not related to creating a collection from scratch; it's because of the permissions set on the collection once it is created. You fixed that without noticing :).
Permissions need to be granted in order to perform such queries.
It turns out that if I create a new collection (= table) from scratch, it works. I also changed the field "value" in the collection to "people"; maybe "value" is a reserved term. Nevertheless, it now seems to work.
So if you run into the same problem: try recreating the collection from scratch.
The critical thing for me, which has not been mentioned yet, is that you need to set the collection to have form-like permissions so that anyone has permission to submit data to it.
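For anyone reproducing this, a test call against the function above would look roughly like the sketch below; the site domain is a placeholder, and the "people" field name is taken from the fix described above (published http-functions are exposed under /_functions/<functionName>):
// Hypothetical test request against the published http-function; replace the
// domain with your own site. The "people" field matches the collection field
// mentioned in the fix above.
fetch("https://www.example-site.com/_functions/peopleCount", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({ people: 42 })
})
  .then(function (res) { return res.json(); })
  .then(function (json) { console.log(json); })   // expect { "inserted": { ... } } on success
  .catch(function (err) { console.error(err); });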

HapiJS reply with readable stream

For one call, I am replying with a huge JSON object which sometimes causes the Node event loop to become blocked. As such, I'm using the Big Friendly JSON package to stream the JSON instead. My issue is that I cannot figure out how to actually reply with the stream.
My original code was simply:
let searchResults = s3Access.getSavedSearch(guid).Body;
searchResults = JSON.parse(searchResults.toString());
return reply(searchResults);
It works great but bogs down on huge payloads.
I've tried using the Big Friendly JSON package (https://gitlab.com/philbooth/bfj):
const stream = bfj.streamify(searchResults);
return reply(stream); // according to docs it's a readable stream
But then my browser complained about an empty response. I then tried adding the below to the reply, with the same result.
.header('content-encoding', 'json')
.header('Content-Length', stream.length);
I also tried return reply(null, stream); but that produced a ton of Node errors.
Is there some other way I need to organize this? My understanding was I could just reply a readable stream and Hapi would take care of it, but the response keeps showing up as empty.
Did you try using h.response? (Here h is the response toolkit that replaced reply in hapi 17.)
Example:
handler: async (request, h) => {
  const { limit, sortBy, order } = request.query;
  const queryString = {
    where: { status: 1 },
    limit,
    order: [[sortBy, order]],
  };
  let userList = {};
  try {
    userList = await _getList(User, queryString);
  } catch (e) {
    // throw a Boom error so hapi returns an error response instead of falling through
    throw Boom.badRequest(i18n.__('controllers.user.fetchUser'), e);
  }
  return h.response(userList);
}
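A hedged sketch of the streaming reply itself, using the response toolkit (assumes hapi 17+ and the bfj package from the question; s3Access and guid are stand-ins for the original code):
// Return the bfj stream via the response toolkit and let hapi pipe it out.
// Content-Length is deliberately not set, since the streamed length is not
// known up front.
const bfj = require('bfj');

const handler = async (request, h) => {
  const guid = request.params.guid;                 // stand-in for however guid is obtained
  const raw = s3Access.getSavedSearch(guid).Body;   // as in the original code
  const searchResults = JSON.parse(raw.toString());
  const stream = bfj.streamify(searchResults);      // readable stream of JSON text
  return h.response(stream).type('application/json');
};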

Using promises in Mongoose

I am new to using promises to retrieve multiple database records at the same time, and I want to rewrite my existing code to use them.
I have this piece of code in Express:
getController.getData = function(req, res, collection, pagerender) {
  var id = req.params.id;
  collection.find({}, function(err, docs) {
    if (err) res.json(err);
    else res.render(pagerender, {data: docs, ADusername: req.session.user_id, id: req.params.id});
    console.log(docs);
  });
};
Now I want to use promises here, so I can do more queries to the database. Anyone know how I can get this done?
First, check if collection.find({}) returns a promise. If it does, then you can write your code like this:
collection.find({})
  .then(function(docs) {
    res.render(pagerender, {data: docs, ADusername: req.session.user_id, id: req.params.id});
  })
  .catch(function(err) {
    res.json(err);
  });
If you want more calls here, just create a new DB call and add another .then block.
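For example, a sketch of the "more queries" case using Promise.all, a slight variation on chaining .then blocks (OtherCollection is illustrative; Mongoose queries return promises when .exec() is called):
getController.getData = function (req, res, collection, pagerender) {
  Promise.all([
    collection.find({}).exec(),        // first query, as in the original code
    OtherCollection.find({}).exec()    // a second, illustrative query
  ])
    .then(function (results) {
      var docs = results[0];
      var others = results[1];
      res.render(pagerender, {
        data: docs,
        otherData: others,
        ADusername: req.session.user_id,
        id: req.params.id
      });
    })
    .catch(function (err) {
      res.json(err);
    });
};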
I suggest you read the documentation on promises, just to get a general feeling about them (https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Promise/then). You will also see how you can handle both success and rejection in the same function if you want.

Multiple image attachments to couchdb (nano) with Node.js Express 4.0 & formidable

I am trying to insert multiple images into couchdb via nano, using express 4 and formidable. I can access and insert individual files without difficulty using formidable and nano; however, when I try to insert one file after another, I get conflict errors. I am very new to js and node, and I know my understanding of callbacks and asynchronous functions is limited. This is what I have at the moment. Any help would be greatly appreciated.
function uploadImage(req, res) {
  var form = new formidable.IncomingForm(),
      files = [],
      fields = [],
      uploadcount = 1;

  form.on('field', function(field, value) {
    fields.push([field, value]);
  });

  form.on('file', function(field, file) {
    files.push([field, file]);
    var docid = fields[0][1];

    getRevision();

    function getRevision() {
      dbn.get(docid, { revs_info: true }, function(err, body) {
        if (!err) {
          exrev = body._rev;
          insertImage(exrev);
        } else {
          console.log(err);
        }
      });
    }

    function insertImage(exrevision) {
      var exrev = exrevision;
      fs.readFile(file.path, function(err, data) {
        if (err) {
          console.log(err);
        } else {
          var imagename = docid + "_" + uploadcount + ".png";
          dbn.attachment.insert(docid, imagename, data, 'image/png',
            { rev: exrev }, function(err, body) {
              if (!err) {
                uploadcount++;
              } else {
                console.log(err);
              }
            });
        }
      });
    }
  });

  form.on('end', function() {
    console.log('done');
    res.redirect('/public/customise.html');
  });

  form.parse(req);
}
I found a solution by dumping the files first into a temporary directory, then proceeding to insert the files into couchdb via nano all within a single function. I couldn't find a way to pause the filestream to wait for the couchdb response, so this sequential method seems adequate.
This is a problem with handling asynchronous calls. Because each attachment insert requires the doc's current rev number, you can't do the inserts in parallel. You must insert a new attachment only after you get a response from the previous one.
You may use the promise/deferred mechanism to do this, but I have personally solved a similar problem using a package called "async". In async, you can use async.eachSeries() to make these async calls in series.
Another point is about the revision number: you can just use the lighter-weight db.head() function instead of db.get(). The rev number is returned in the "etag" header. You can get the rev like this:
// get the rev number
db.head(bookId, function(err, _, headers) {
  if (!err) {
    var rev = eval(headers.etag);
    // do whatever you need to do with the rev number
    // ...
  }
});
In addition, after each attachment insert, the response from couchdb will look something like this:
{"ok":true,"id":"b2aba1ed809a4395d850e65c3ff2130c","rev":"4-59d043853f084c18530c2a94b9a2caed"}
The rev property will give the new rev number which you may use for inserting to the next attachment.
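Putting those pieces together, a rough sketch of the eachSeries approach (dbn, docid and the collected files array are taken from the question; it uses JSON.parse on the etag instead of eval, and error handling is kept minimal):
// Insert attachments one after another with async.eachSeries, threading the
// new rev from each couchdb response into the next insert. Each entry in
// `files` is assumed to be a formidable file object (push the file itself,
// not a [field, file] pair).
const async = require('async');
const fs = require('fs');

function insertAttachments(docid, files, done) {
  // fetch the current rev once via a lightweight HEAD request
  dbn.head(docid, function (err, _, headers) {
    if (err) return done(err);

    let rev = JSON.parse(headers.etag); // the etag is the quoted rev string
    let count = 1;

    async.eachSeries(files, function (file, next) {
      fs.readFile(file.path, function (err, data) {
        if (err) return next(err);

        const name = docid + '_' + count + '.png';
        dbn.attachment.insert(docid, name, data, 'image/png', { rev: rev },
          function (err, body) {
            if (err) return next(err);
            rev = body.rev; // use the rev returned by couchdb for the next insert
            count++;
            next();
          });
      });
    }, done);
  });
}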

Nested WLJSONStore Calls Not Executing in Expected Sequence and Not Adding Items to the Collection

This is Worklight 6.1 code with dojo, testing with Chrome and the std dev server Liberty. What I want this code to do is to query a collection, which should have 0 or 1 entries, and either retrieve the one entry if it exists or create an entry with a supplied set of values. What I'm trying to do is store a url, id, and password for a service. If this is the first time the app has run after installation I want to prompt the user for this info and store it. The code to prompt the user will be added later. If it is not the first run of the app then the values should be stored in the collection and be retrieved. I'll add code later to allow the user to change and update the values.
What is happening now is that the .add never seems to be executed, and also the execution sequence I'm seeing thru the breakpoints I've set seems weird.
Here is the setup code:
// set up the jsonStore
var collectionName = 'servers';
var collections = {};
collections[collectionName] = {};
// initialize the default jsonStore Monitor credentials
var jsonURL = 'http://myserver.com:9082';
var jsonUser = 'keyser';
var jsonPassword = 'soze';
And here is the problem code:
// Initialize the JSONStore
WL.JSONStore.init(collections)
  .then(function() {
    console.log("store initialized");

    // query the store
    var query = {_id: 0};
    WL.JSONStore.get(collectionName)
      .find(query)
      .then(function(arrayResults) {
        console.log("credentials retrieved " + arrayResults.length);
        if (arrayResults.length > 0) {
          // retrieve the credentials from the json object
          console.log("password retrieved " + arrayResults[0].json.password);
          jsonURL = arrayResults[0].json.url;
          jsonUser = arrayResults[0].json.user;
          jsonPassword = arrayResults[0].json.password;
        } else {
          // load the default credentials into jsonStore
          var credentials = {url: jsonURL, user: jsonUser, password: jsonPassword};
          WL.JSONStore.get(collectionName)
            .add(credentials)
            .then(function() {
              console.log("credentials loaded " + credentials.url);
            })
            .fail(function(errorObject) {
              console.log("credential load failed");
            });
        } // end of else

        // Query the model list
        queryModels();
      }) // end of get(collectionName) then
      .fail(function(errorObject) {
        console.log("credentials not retrieved");
      }); // end of get(collectionName) fail
  }) // end of init(collections) then
  .fail(function(errorObject) {
    console.log("store init failed " + errorObject);
  }); // end of init(collections) fail
}); // end of ready
When I step through it, it flows in this sequence:
init(collections)
Then it jumps immediately to the "end of ready". Seems weird, but I'm a rookie so maybe it's OK?
Back to the get(collectionName)
To the .then, which logs "credentials retrieved" with an array length of 0
To the else clause of the statement
And it breaks on the get(collectionName) in the else clause. So far so good.
From here it jumps to queryModels(), skipping over the .add (as far as I can tell)
Then it returns to the .then under the 2nd get and logs "credentials loaded"
At this point execution ends "normally", except:
The item never gets added to the collection, and
queryModels runs before I expect it to; I want it to run after the item is added.
By now it's probably obvious that I'm a rookie, so I'm probably making the rookie mistake. I know I'm dealing with deferreds here with the .then and .fails, and I'm nesting them, which seems to be an accepted technique, but I'm not getting the execution sequence I want.
I've tried this code commenting out the 2nd get(collections) in a couple of formats and it barfs both ways.
// WL.JSONStore.get(collectionName)
.add(credentials)
and
// WL.JSONStore.get(collectionName)
servers.add(credentials)
Any help greatly appreciated. Thanks!
Here's my "answer", based on what I learned from the other answers below.
Bluewing and cnandreu's answers were both very helpful, and I got it working. The main issues I had turned out to be:
I had failed to grasp that slot 0 in a collection equates to a document _id key of 1. I was trying to query _id = 0 and never getting a hit. The add to the collection was working all along; I was just not reading it correctly.
Moving the queryModels into the if/else clauses (bluewing's suggestion) worked, and reading the material cnandreu referenced (very worthwhile to read) explained why it worked. Thanks!
The tip about the "weird" execution sequence being an artifact of the breakpoints was also very useful; I quit chasing that red herring.
Here is a working draft of the code after fixing these issues. I did not implement all of the suggestions yet, but probably will as I polish this up. Thanks again.
// Initialize the JSONStore - you have to .init to start the collection before you can read it.
WL.JSONStore.init(collections)
  .then(function() {
    console.log("store initialized");

    // query the store
    var query = {_id: 1};
    WL.JSONStore.get(collectionName) // get 1
      .find(query)
      .then(function(arrayResults) {
        console.log("credentials retrieved " + arrayResults.length);
        if (arrayResults.length > 0) {
          // retrieve the credentials from the json object
          console.log("password retrieved " + arrayResults[0].json.password);
          jsonURL = arrayResults[0].json.url;
          jsonUser = arrayResults[0].json.user;
          jsonPassword = arrayResults[0].json.password;
          queryModels();
        } else {
          // load the default credentials into jsonStore
          var credentials = {url: jsonURL, user: jsonUser, password: jsonPassword};
          WL.JSONStore.get(collectionName) // get 2
            .add(credentials)
            .then(function(numberOfDocumentsAdded) {
              console.log("Number of Docs Added " + numberOfDocumentsAdded);
              queryModels();
            }); // end of .add then
        } // end of else
      }); // end of get(collectionName) 1 then
  }) // end of init(collections) then
  .fail(function(errorObject) {
    console.log("something failed " + errorObject);
  }); // end of init(collections) fail
All the JSONStore calls (like add, init, etc.) are asynchronous, which is why you are seeing those weird flows when you step through with breakpoints.
To get the execution sequence you want, move queryModels() so it runs once the credentials are loaded:
WL.JSONStore.get(collectionName)
  .add(credentials)
  .then(function() {
    console.log("credentials loaded " + credentials.url);
    queryModels();
  })
My suggestion is the same as Bluewings', but I wanted to share some pseudocode:
function handleCredentials(arrayResults, callback) {
  if (arrayResults.length > 0) {
    //.... synchronous code here.
    setTimeout(function () {
      callback();
    }, 0);
  } else {
    WL.JSONStore.get(collectionName)
      .add({url: jsonURL, user: jsonUser, password: jsonPassword})
      .then(function() {
        callback();
      });
  }
}
WL.JSONStore.init(collections)
  .then(function() {
    WL.JSONStore.get(collectionName)
      .find({_id: 1})
      .then(function (arrayResults) {
        handleCredentials(arrayResults, function () {
          queryModels();
        });
      });
  });
Notice I created a handleCredentials function; it will either do a synchronous operation (setting some variables with the result from the find call) or an asynchronous operation (calling add to add the credentials). A setTimeout with 0 is used to preserve async behavior; this is explained in detail here. After the handleCredentials function has finished, you call the queryModels function via the callback pattern.
As an aside, I recommend reading this blog post: What’s so great about JavaScript Promises?. Especially the "Error Handling" section. You don't need to add a .fail to every promise; you can get away with fewer failure functions, and the error object should provide enough detail about what went wrong. JSONStore error objects are documented here; notice they contain the source of the failure (e.g. src: 'find').
As an aside, I recommended reading this blog post: What’s so great about JavaScript Promises?. Especially the "Error Handling" section. You don't need to add a .fail to every promise, you can get away with less failure functions and the error object should provide enough details into what went wrong. JSONStore error objects are documented here, notice they contain the source of the failure (e.g. src: 'find').