Worklight JsonStore advanced find - ibm-mobilefirst

How do I use advanced find in Worklight JSONStore using QueryPart?
I have tried the following code, but it is not working properly; I suspect I am not calling advancedFind correctly.
var query = WL.JSONStore.QueryPart().equal('age', 35);
var collectionName = "people";
WL.JSONStore.get(collectionName).find(query).then(function(arrayResults) {
    // if data not present, get the data from DB
    if (arrayResults.length == 0) {
    } else {
    }
}).fail(function(errorObject) {
    alert("fail" + errorObject);
    // handle failure
});

You are calling the find() method. The one you want to call is advancedFind(). Also, advancedFind receives an array of query parts, not just one query part. Your code should look like this:
var queryPart = WL.JSONStore.QueryPart().equal('age', 35);
var collectionName = "people";
WL.JSONStore.get(collectionName).advancedFind([queryPart]).then(function(arrayResults) {
    // if data not present, get the data from DB
    if (arrayResults.length == 0) {
    } else {
    }
}).fail(function(errorObject) {
    alert("fail" + errorObject);
    // handle failure
});
For future reference, here are the API reference and some examples of how to use the JavaScript JSONStore API.
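If you later need more than one condition, advancedFind accepts several query parts in the array. The snippet below is only a sketch: it assumes your Worklight version also provides the other documented QueryPart methods such as like() and lessThan(), and the usual JSONStore semantics where conditions inside one query part are ANDed while separate query parts are ORed.
// Sketch: match (age == 35 AND name like 'ca') OR (age < 10).
// Assumes QueryPart also exposes like() and lessThan().
var part1 = WL.JSONStore.QueryPart().equal('age', 35).like('name', 'ca');
var part2 = WL.JSONStore.QueryPart().lessThan('age', 10);
WL.JSONStore.get('people').advancedFind([part1, part2]).then(function(arrayResults) {
    // arrayResults holds the documents matching either query part
}).fail(function(errorObject) {
    // handle failure
});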

Related

Write rows to BigQuery via nodejs BigQuery Storage Write API

It seems quite new, but I am hoping someone here has been able to use Node.js to write directly to BigQuery storage using @google-cloud/bigquery-storage.
There is an explanation of how the overall backend API works and how to write a collection of rows atomically using the BigQuery Write API, but no such documentation for Node.js yet. A recent release, 2.7.0, mentions the addition of said feature, but there is no documentation and the code is not easily understood.
There is an open issue requesting an example, but I thought I'd try my luck and see if anyone has been able to use this API yet.
Suppose you have a BigQuery table called student with three columns: id, name, and age. The following steps will load data into the table with the Node.js Storage Write API.
Define a student.proto file as follows:
syntax = "proto2";
message Student {
  required int64 id = 1;
  optional string name = 2;
  optional int64 age = 3;
}
Run the following at the command prompt:
protoc --js_out=import_style=commonjs,binary:. student.proto
It should generate a student_pb.js file in the current directory.
Write the following JS code in the current directory and run it:
const {BigQueryWriteClient} = require('@google-cloud/bigquery-storage').v1;
const st = require('./student_pb.js');
const type = require('@google-cloud/bigquery-storage').protos.google.protobuf.FieldDescriptorProto.Type;
const mode = require('@google-cloud/bigquery-storage').protos.google.cloud.bigquery.storage.v1.WriteStream.Type;

const storageClient = new BigQueryWriteClient();
// Change project and dataset to your own values.
const parent = `projects/${project}/datasets/${dataset}/tables/student`;
var writeStream = {type: mode.PENDING};
var student = new st.Student();

// Proto descriptor describing the schema of the serialized rows.
var protoDescriptor = {};
protoDescriptor.name = 'student';
protoDescriptor.field = [
    {'name': 'id', 'number': 1, 'type': type.TYPE_INT64},
    {'name': 'name', 'number': 2, 'type': type.TYPE_STRING},
    {'name': 'age', 'number': 3, 'type': type.TYPE_INT64}
];

async function run() {
    try {
        // Create a pending write stream on the target table.
        var request = {
            parent,
            writeStream
        };
        var response = await storageClient.createWriteStream(request);
        writeStream = response[0].name;

        // Serialize the rows using the generated protobuf class.
        var serializedRows = [];
        // Row 1
        student.setId(1);
        student.setName('st1');
        student.setAge(15);
        serializedRows.push(student.serializeBinary());
        // Row 2
        student.setId(2);
        student.setName('st2');
        student.setAge(15);
        serializedRows.push(student.serializeBinary());

        var protoRows = {
            serializedRows
        };
        var proto_data = {
            writerSchema: {protoDescriptor},
            rows: protoRows
        };

        // Construct the AppendRows request.
        request = {
            writeStream,
            protoRows: proto_data
        };

        // Insert rows over the streaming RPC.
        const stream = await storageClient.appendRows();
        stream.on('data', response => {
            console.log(response);
        });
        stream.on('error', err => {
            throw err;
        });
        stream.on('end', async () => {
            /* API call completed: finalize the stream and commit it atomically. */
            try {
                var response = await storageClient.finalizeWriteStream({name: writeStream});
                response = await storageClient.batchCommitWriteStreams({parent, writeStreams: [writeStream]});
            } catch (err) {
                console.log(err);
            }
        });
        stream.write(request);
        stream.end();
    } catch (err) {
        console.log(err);
    }
}

run();
Make sure your environment variables are set correctly to point to the file containing your Google Cloud credentials, and change the project and dataset values accordingly.
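For example, the credentials are typically supplied through the GOOGLE_APPLICATION_CREDENTIALS environment variable (the path and script name below are placeholders for your own):
export GOOGLE_APPLICATION_CREDENTIALS=/path/to/service-account.json
node student_write.js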

how to load new data from api using RefreshIndicator

Future<void> _fetchPage(int pageKey) async {
  try {
    final newItems = await ApiServices.fetchArticleList(pageKey, _pageSize);
    final isLastPage = newItems.length < _pageSize;
    if (isLastPage) {
      _pagingController.appendLastPage(newItems);
    } else {
      final nextPageKey = pageKey + newItems.length;
      _pagingController.appendPage(newItems, nextPageKey);
    }
  } catch (error) {
    print(error);
    _pagingController.error = error;
  }
}
So, I have a method _fetchPage() that loads data from the API initially, but on refresh the new data is not shown on the screen. Any help will be appreciated.
Change your onRefresh callback to:
onRefresh: () => _fetchPage(0)
The _fetchPage method in your code expects a parameter, but you are not passing a value, since the onRefresh callback has no value of its own to pass as a parameter to the function.

Get JavaScript Array of Objects to bind to .Net Core List of ViewModel

I have a JS array of objects which, at the time of the POST, contains three variables per object:
ParticipantId,
Answer,
ScenarioId
During the POST the array has a size of 8 (currently, anyway) and all entries correctly contain data. When I make the POST request, the controller does get hit, as the breakpoint triggers; the issue is that when I inspect the List<SurveyResponse> participantScenarios, it is shown as having 0 values.
The thing I always struggle to understand is the communication and transformation between JS and .NET, so I am struggling to see where it is going wrong.
My JS Call:
postResponse: function () {
    var data = JSON.stringify({ participantScenarios: this.scenarioResponses });
    // POST /someUrl
    this.$http.post('ScenariosVue/PostScenarioChoices', data).then(response => {
        // success callback
    }, response => {
        // error callback
    });
}
My .NET Core controller:
[HttpPost("PostScenarioChoices")]
public async Task<ActionResult> PostScenarioChoices(List<SurveyResponse> participantScenarios)
{
    List<ParticipantScenarios> addParticipantScenarios = new List<ParticipantScenarios>();
    foreach (var result in participantScenarios)
    {
        bool temp = false;
        if (result.Answer == 1)
        {
            temp = true;
        }
        else if (result.Answer == 0)
        {
            temp = false;
        }
        else
        {
            return StatusCode(400);
        }
        addParticipantScenarios.Add(new ParticipantScenarios
        {
            ParticipantId = result.ParticipantId,
            Answer = temp,
            ScenarioId = result.ScenarioId
        });
    }
    try
    {
        await _context.ParticipantScenarios.AddRangeAsync(addParticipantScenarios);
        await _context.SaveChangesAsync();
        return StatusCode(201);
    }
    catch
    {
        return StatusCode(400);
    }
}
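This question has no answer here, but a common cause of an empty list in this situation (an assumption about this setup, not something stated in the post) is that the JSON body does not match the action's parameter: the action expects a bare array bound from the request body (e.g. with [FromBody] on the parameter), while the JS code posts an object wrapping the array under a participantScenarios key. A minimal sketch of the client-side adjustment:
// Sketch: post the array itself (not an object wrapping it) as JSON,
// so it can bind to List<SurveyResponse> participantScenarios.
// Assumes the controller parameter is decorated with [FromBody].
postResponse: function () {
    this.$http.post('ScenariosVue/PostScenarioChoices', this.scenarioResponses, {
        headers: { 'Content-Type': 'application/json' }
    }).then(response => {
        // success callback
    }, response => {
        // error callback
    });
}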

How to manage depending functions in nodejs

I am trying to teach myself Node.js and Express.js; however, coming from Java and C++, this is proving difficult to get used to.
I made a simple and messy module that is supposed to return a weather forecast for a given zip code.
The way this works is by taking the user's zip code and using a Google API to generate the geo coordinates for that zip code. I get the coordinates from the JSON response and then provide them to the next API call, which goes to the forecast.io API; this time the weather data for the location is also taken from a JSON response.
Coming from Java and without a solid background in JavaScript, I am having a hard time making these two functions wait for one another; in this case I need the Google API call to finish first because the coordinates it provides are needed for the second API call. Can someone take a look at this code and tell me if the strategy I used is correct, or provide a suggestion so that I can learn what is done in JavaScript in situations like this?
Here is the code:
// The required modules.
var http = require("http");
var https = require("https");

// result object
var resultSet = {
    latitude: "",
    longitude: "",
    localInfo: "",
    weather: "",
    humidity: "",
    pressure: "",
    time: ""
};

// print out error messages
function printError(error) {
    console.error(error.message);
}

// Forecast API required information:
// key for the forecast IO app
var forecast_IO_Key = "this is my key, not publishing for security reasons";
var forecast_IO_Web_Adress = "https://api.forecast.io/forecast/";

// Create Forecast request string function
function createForecastRequest(latitude, longitude) {
    var request = forecast_IO_Web_Adress + forecast_IO_Key + "/"
        + latitude + "," + longitude;
    return request;
}

// Google GEO API required information:
// Create Google Geo Request
var google_GEO_Web_Adress = "https://maps.googleapis.com/maps/api/geocode/json?address=";
function createGoogleGeoMapRequest(zipCode) {
    var request = google_GEO_Web_Adress + zipCode + "&sensor=false";
    return request;
}

function get(zipCode) {
    // 1- Need to request google for geo locations using a given zip
    var googleRequest = https.get(createGoogleGeoMapRequest(zipCode), function(response) {
        // console.log(createGoogleGeoMapRequest(zipCode));
        var body = "";
        var status = response.statusCode;
        // a- Read the data.
        response.on("data", function(chunk) {
            body += chunk;
        });
        // b- Parse the data.
        response.on("end", function() {
            if (status === 200) {
                try {
                    var coordinates = JSON.parse(body);
                    resultSet.latitude = coordinates.results[0].geometry.location.lat;
                    resultSet.longitude = coordinates.results[0].geometry.location.lng;
                    resultSet.localInfo = coordinates.results[0].address_components[0].long_name + ", " +
                        coordinates.results[0].address_components[1].long_name + ", " +
                        coordinates.results[0].address_components[2].long_name + ", " +
                        coordinates.results[0].address_components[3].long_name + ". ";
                } catch (error) {
                    printError(error.message);
                } finally {
                    connectToForecastIO(resultSet.latitude, resultSet.longitude);
                }
            } else {
                printError({message: "Error with GEO API" + http.STATUS_CODES[response.statusCode]});
            }
        });
    });

    function connectToForecastIO(latitude, longitude) {
        var forecastRequest = https.get(createForecastRequest(latitude, longitude), function(response) {
            // console.log(createForecastRequest(latitude, longitude));
            var body = "";
            var status = response.statusCode;
            // read the data
            response.on("data", function(chunk) {
                body += chunk;
            });
            // parse the data
            response.on("end", function() {
                try {
                    var weatherReport = JSON.parse(body);
                    resultSet.weather = weatherReport.currently.summary;
                    resultSet.humidity = weatherReport.currently.humidity;
                    resultSet.temperature = weatherReport.currently.temperature;
                    resultSet.pressure = weatherReport.currently.pressure;
                    resultSet.time = weatherReport.currently.time;
                } catch (error) {
                    printError(error.message);
                } finally {
                    return resultSet;
                }
            });
        });
    }
}

// define the name of the outer module.
module.exports.get = get;
Is the return statement properly placed? Is my use of finally proper here? Please note that I come from a Java background, where it is perfectly fine to use try {} catch () {} and finally {} blocks to execute closing code, and it was the only way I managed to get this module to work. But now that I have incorporated some Express and try to execute this module's method from another module, all I am getting is an undefined return.
You could use the Promise API, which is somewhat like Futures in Java. Basically, you wrap both functions in promises and then wait for the first to resolve before executing the next function:
var googleRequest = function(zipCode) {
    return new Promise(function(resolve, reject) {
        var request = https.get(createGoogleGeoMapRequest(zipCode), function(response) {
            if (response.statusCode !== 200) {
                reject(new Error('Failed to get request, status: ' + response.statusCode));
            }
            var body = "";
            // a- Read the data.
            response.on("data", function(chunk) {
                body += chunk;
            });
            // b- Parse the data.
            response.on("end", function() {
                var coordinates = JSON.parse(body);
                resultSet.latitude = coordinates.results[0].geometry.location.lat;
                resultSet.longitude = coordinates.results[0].geometry.location.lng;
                resultSet.localInfo = coordinates.results[0].address_components[0].long_name + ", " +
                    coordinates.results[0].address_components[1].long_name + ", " +
                    coordinates.results[0].address_components[2].long_name + ", " +
                    coordinates.results[0].address_components[3].long_name + ". ";
                resolve(resultSet);
            });
        });
        request.on('error', function(err) {
            reject(err);
        });
    });
};
After that you could just do:
googleRequest(90210).then(function(result) {
    connectToForecastIO(result.latitude, result.longitude);
});
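To make the module's get() actually return something to its caller, connectToForecastIO can be wrapped the same way and the two promises chained. This is only a sketch along the lines of the answer above, reusing the question's resultSet and createForecastRequest:
function connectToForecastIO(latitude, longitude) {
    return new Promise(function(resolve, reject) {
        https.get(createForecastRequest(latitude, longitude), function(response) {
            var body = "";
            response.on("data", function(chunk) {
                body += chunk;
            });
            response.on("end", function() {
                try {
                    var weatherReport = JSON.parse(body);
                    resultSet.weather = weatherReport.currently.summary;
                    resultSet.humidity = weatherReport.currently.humidity;
                    resultSet.temperature = weatherReport.currently.temperature;
                    resultSet.pressure = weatherReport.currently.pressure;
                    resultSet.time = weatherReport.currently.time;
                    resolve(resultSet);
                } catch (error) {
                    reject(error);
                }
            });
        }).on("error", reject);
    });
}

// get() now returns a promise, so the calling module can do
// weather.get(zipCode).then(function(resultSet) { ... });
function get(zipCode) {
    return googleRequest(zipCode).then(function(result) {
        return connectToForecastIO(result.latitude, result.longitude);
    });
}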
You can find out more about Promise usage in the Promise API docs.
You should also note that there are several libraries available that allow for promise-based HTTP requests, such as fetch.

jQuery Data table reload

I am using the jQuery DataTables plugin. I am trying to fetch the record count and, based on the count, do some HTML manipulation using jQuery.
So far I have used this code:
$('#tb').on('init.dt', function () {
    var totalRecords = table.page.info().recordsTotal;
    if (totalRecords != 0) {
        $('#tb_div').show();
        table.columns.adjust().draw();
    } else {
        $('#tb_div').hide();
        $('#no_rec_msg').show();
    }
});
But init.dt is executed just once, so this does not work after table.ajax.reload(). Is there an API method that would fix this?
Use the xhr event instead; it is fired whenever an Ajax request is completed.
$('#tb').on('xhr.dt', function () {
    var totalRecords = table.page.info().recordsTotal;
    if (totalRecords != 0) {
        $('#tb_div').show();
        table.columns.adjust().draw();
    } else {
        $('#tb_div').hide();
        $('#no_rec_msg').show();
    }
});
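One caveat worth noting (an observation about DataTables in general, not something from the original answer): xhr fires when the Ajax request completes but before the table is redrawn, so page.info() may still reflect the previous data. The handler also receives the JSON payload returned by the server, which can be read directly; the sketch below assumes a server-side-processing response that includes recordsTotal:
$('#tb').on('xhr.dt', function (e, settings, json) {
    // json is the raw response returned by the server for this request
    var totalRecords = json ? json.recordsTotal : 0;
    if (totalRecords != 0) {
        $('#tb_div').show();
        table.columns.adjust().draw();
    } else {
        $('#tb_div').hide();
        $('#no_rec_msg').show();
    }
});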