Error when hitting the size limit of JSONStore

I found that Worklight JSONStore has no size limit imposed by the Worklight runtime.
Does the WL JSONStore API return an error/error code if you add to a collection and hit the storage limit of your mobile device?

Yes, you should get an error, but it will be a generic one like PERSISTENT_STORE_FAILURE (-1). I recommend testing this as part of your regular unit, functional, etc. tests and QA process for your application. Report back if you see something unexpected.
Recently I answered a similar question "Can the JSON offline device store be size restricted?". I'll add my answer here because I believe it could be helpful.
While this functionality is not baked into the core API, it should be fairly straightforward to implement.
JSONStore has an enhance method you can use to add functions to the JSONStoreInstance prototype. There's an example inside that should help.
Cordova has a File API. Note that file metadata includes "size: The size of the file in bytes. (long)".
JSONStore stores its data here:
iOS: [app]/Documents/wljsonstore/jsonstore.sqlite
Android: /data/data/com.[app-name]/databases/wljsonstore/jsonstore.sqlite
I talked a bit about that file in these StackOverflow answers:
What are the recommended ways to debug Worklight applications?
JSONStore difference between 'number' and 'integer' in searchFields
Check the file size of jsonstore.sqlite using Cordova's File API before you add more data to your JSONStore collection.
Basically you would do something like this:
if (checkFileSize(collection.name + '.sqlite') < LIMIT) {
    collection.add(...);
}
Using enhance you can wrap that logic into its own method (e.g. collection.addWithSizeCheck(....)) that checks the file size and calls collection.add(...).
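A rough sketch of that idea follows, purely as illustration: the getFileSize helper, the LIMIT_BYTES value and jsonstorePath are placeholders I made up, and both the File API entry point (resolveLocalFileSystemURL vs. the older resolveLocalFileSystemURI) and the enhance signature should be verified against the docs for your Worklight and Cordova versions:
// Hypothetical helper: resolve the jsonstore.sqlite file and read its size
// through Cordova's File API (file.size is the size in bytes).
function getFileSize(filePath, onSize, onError) {
    window.resolveLocalFileSystemURL(filePath, function (fileEntry) {
        fileEntry.file(function (file) {
            onSize(file.size);
        }, onError);
    }, onError);
}

var LIMIT_BYTES = 10 * 1024 * 1024;  // example limit, pick your own
var jsonstorePath = '...';           // the platform-specific path listed above

// Add a method to the JSONStoreInstance prototype that only adds the
// document while the backing .sqlite file is below the limit.
WL.JSONStore.enhance('addWithSizeCheck', function (doc, options) {
    var collection = this; // the JSONStoreInstance
    options = options || {};
    getFileSize(jsonstorePath, function (size) {
        if (size < LIMIT_BYTES) {
            collection.add(doc, options);
        } else if (options.onFailure) {
            options.onFailure('SIZE_LIMIT_REACHED');
        }
    }, options.onFailure);
});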
Note that the default username is jsonstore, hence jsonstore.sqlite. If you pass a username to init it will create a new .sqlite file with that username.


How to add double tap detection to sample app gatt_sensordata_app?

I am developing an Android app that gets data from Movesense using the GATT profile from the sample app GATT Sensor Data App here.
I followed the tutorial available here. Building the app and getting the DFU worked fine. I can get IMU, HR and temperature data with no issues.
Now I'd like to add a tap detection feature to my app. I understand that I have to subscribe to /System/States, but first I need to be able to receive the system state data.
I understand that I need a modified DFU for that, but I don't understand what changes I should make in which files of the gatt_sensordata_app before rebuilding and generating the new DFU.
What changes should I make in order to broadcast /System/States data?
(I usually just deal with Android so apologies for the very basic question.)
I tried adding #include "system_states/resources.h" to GATTSensorDataClient.cpp but I don't know how to continue.
The normal data streaming in the gatt_sensordata_app uses the sbem-encoding code that the build process generates when building the firmware. However, /System/States is not among the paths that the code can serialize. Therefore the only possibility is to implement the States support in the firmware.
The easiest way is to do as follows:
In your python app, call data subscription with "/System/States/3" (3 == DOUBLE_TAP).
Add a special case to the switch in onNotify which matches the localResourceId to WB_RES::LOCAL::SYSTEM_STATES_STATEID::LID.
In that handler, return the data the way you want. The easiest way is to copy-paste the "default" handler but replace the code between the getSbemLength() and writeToSbemBuffer(...) calls with your own serialization code.
Full disclosure: I work for the Movesense team

How to access SAP OData messages in Kapsel offline app?

We are developing an SAP Fiori app to be used on the Launchpad and also as an offline-enabled hybrid app, using the SAP SDK and its Kapsel plug-ins. One issue we are facing at the moment is the OData message handling.
On the Gateway, we are using the Message Manager to add additional information to the response:
" ABAP snippet, random Gateway entity method
[...]
DATA(lo_message_container) = me->mo_context->get_message_container( ).
lo_message_container->add_message(
iv_msg_type = /iwbep/cl_cos_logger=>warning
iv_msg_number = '123'
iv_msg_id = 'ZFOO'
).
" optional, only used for 'true' errors
RAISE EXCEPTION TYPE /iwbep/cx_mgw_busi_exception
EXPORTING
message_container = lo_message_container.
In the Fiori app, we can directly access that data from the message manager. The data can be applied to a MessageView control.
// Fiori part (Desktop, online)
var aMessageData = sap.ui.getCore().getMessageManager().getMessageModel().getData();
However, in our offline app the message model is always empty after a sync or flush, even after triggering message-generating methods in the backend.
The only way to get some kind of messages is to raise a /iwbep/cx_mgw_busi_exception and pass the message container. The messages can be found, in an unparsed state, in the /ErrorArchive entity and be read for further use.
// Hybrid App part, offline, after sync and flush
this.getModel().read("/ErrorArchive", { success: .... })
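For illustration, a slightly fuller version of that read call (the success/error callback shape follows the standard UI5 ODataModel read API; what is done with each entry here is just a placeholder):
// Sketch: read /ErrorArchive after sync/flush and inspect the raw entries.
this.getModel().read("/ErrorArchive", {
    success: function (oData) {
        (oData.results || []).forEach(function (oEntry) {
            // Each entry describes one failed request; the backend messages
            // still have to be parsed out of it manually.
            console.log(oEntry);
        });
    },
    error: function (oError) {
        console.log("Reading /ErrorArchive failed", oError);
    }
});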
This approach limits us to negative, "exception-worthy" messages only. We also have to code some parts of our app twice (desktop vs. offline app).
So: is there a "proper" way to access those messages after an offline sync and flush?
For analyzing the issue, you might use the tool ILOData as seen in this blog:
Step by Step with the SAP Cloud Platform SDK for Android — Part 6c — Using ILOData
Note, ILOData is part of the Kapsel SDK, so while the blog above was part of a series on the SAP Cloud Platform SDK for Android, it also applies to Kapsel apps.
ILOData is a command line based tool that lets you execute OData requests and queries against an offline store.
It functions as an offline OData client, without the need for an application.
Therefore, it’s a good tool to use to test data from the backend system, as well as verify app behavior.
If a client has a problem with some entries on their device, the offline store from the device can be retrieved using the sendStore method and then ILOData can be used to query the database.
This blog about Kapsel Offline OData plugin might also be helpful.

IBM Worklight - JSONStore logic to refresh data from the server and be able to work offline

Currently the JSONStore API provides a load() method whose documentation says:
"This function always stores whatever it gets back from the adapter. If the data exists, it is duplicated in the collection."
This means that if you want to avoid duplicates by calling load() on an already populated collection, you need to empty or drop the collection first. But if you want to keep the elements you already have in the collection, in case there is no more connectivity and your application goes into offline mode, you also need to keep track of these existing elements.
Since the API doesn't provide an "overwrite" option that would replace the existing elements when the adapter call succeeds, I'm wondering what kind of logic should be put in place to manage both offline availability of data and the ability to refresh at any time. Managing all the failure cases by nesting the JS code is not obvious, due to the promises...
Thanks for your advice!
One approach to achieve this:
Use enhance to create your own load method (e.g. loadAndOverwrite). You should have access to all the variables kept inside a JSONStore instance (collection name, adapter name, adapter load procedure name, etc. -- you will probably use those variables in the invokeProcedure step below).
Call push to make sure there are no local changes.
Call invokeProcedure to get data, all the variables you need should be provided in the context of enhance.
Find if the document already exists and then remove it. Use {push: false} so JSONStore won't track that change.
Use add to add the new/updated document. Use {push: false} so JSONStore won't track that change.
Alternatively, if the document exists you can use replace to update it.
Alternatively, you can use removeCollection and call load again to refresh the data.
There's an example that shows how to use all those API calls here.
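Putting those steps together, a rough sketch of such a loadAndOverwrite method might look like this. Treat the control flow as pseudocode: the property paths used to reach the adapter name and procedure, the response shape, and the mixture of promise and callback styles are all assumptions to adapt to the JSONStore API of your Worklight version:
// Sketch only: refresh a collection from its adapter without duplicating
// documents, keeping existing data if the adapter call fails.
WL.JSONStore.enhance('loadAndOverwrite', function (options) {
    var collection = this; // the JSONStoreInstance (name, adapter info, ...)
    options = options || {};

    // 1. Push any pending local changes first.
    collection.push().then(function () {

        // 2. Pull fresh data through the adapter tied to this collection.
        WL.Client.invokeProcedure({
            adapter: collection.adapter.name,             // property path assumed
            procedure: collection.adapter.load.procedure, // property path assumed
            parameters: []
        }, {
            onSuccess: function (response) {
                var docs = response.invocationResult.results || [];

                // 3.-5. Overwrite local copies; {push: false} keeps JSONStore
                // from tracking these as local changes.
                docs.forEach(function (data) {
                    // In real code, chain these asynchronous calls properly.
                    collection.remove({id: data.id}, {push: false});
                    collection.add(data, {push: false});
                });
                if (options.onSuccess) { options.onSuccess(docs.length); }
            },
            onFailure: options.onFailure
        });
    });
});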
Regarding promises, read this from InfoCenter and this from HTML5Rocks. Google can provide more information.

Worklight Direct update

Does anybody know whether Direct Update updates everything that lives in the common directory structure? I use the same code base for multiple apps, the only change being certain settings within a js file that tells the application how to act. Is there a directory I can put that js file in that would be safe from the Direct Update feature?
I can't seem to find any specific information on IBM's website.
I think you guys need to be careful which terms you are using in order to not confuse people who may be looking for similar help.
Environments are specific to the OS you are using: iOS, BlackBerry, Android, etc.
Skins are based on the environment and aren't generic across all platforms. When you create a skin you must choose which environment it targets.
So, to correct some of the statements above: Direct Update will update all skin resources in the targeted environments.
For example: You have an app with Android and iOS versions
When you create skins, you are essentially creating a responsive type of design according to your parameters. For instance, if you have Android OS 2.3 vs. 4.2, you can set a look and feel for both. However, these utilize a single web resource base. The APK would be the same for both versions of the app (by default) and have 2 available skins. At runtime, IBM Worklight's 'Runtime Skinning' (hence the name) goes through the parameter check for the OS and loads that skin's overriding web code.
You could technically override all of the web code to be completely different for both skins, but that would be bulky and inefficient.
When you direct update you are updating all the resources of that particular environment (to include both skins), not the common folder/environment.
So an updated Android (both skins) would have updated web resources (if you deployed the android wlapp) and an iOS version would stay the same.
If you look at the Android project after a build (native -> assets -> www -> default or skin) you can find the shared web resources generated by the common environment. However, those are only put there when you do a new build.
In the picture, I have an older version of the Android app built for both skins on the left. On the right is a preview of the newer common resources after deploying only the common.wlapp. So you can see that they are separate.
Sorry if it was long-winded, but I thought I would be thorough.
To answer the original question: have you thought of having all the parameters of the store loaded from user input or a setup step? If you are trying to connect to 3 different stores, create some form of settings control that will access different back ends or specific adapters. You could also create 3 different config.js files that load depending on the parameters that you set. The other option is to set different versions of your apps, specific to each store.
Example: versions 1.11, 1.12, and 1.13 can be 3 versions of the same app for stores 1, 2, and 3. They can be modified and changed and have 3 sets of web resources. When you need to update, jump up to versions 1.21, 1.22, and 1.23. It seems a bit of a workaround, but it may be your best bet at getting 3 versions of the same app to fall within the single-application category (keep 3 config.js variants to modify for the 3 stores).
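For the config.js idea, a minimal sketch of picking one of several configuration objects at startup could look like this (all names and the selection mechanism are made up purely for illustration):
// Sketch: pick one of several store configurations at startup.
var CONFIGS = {
    store1: { adapter: 'Store1Adapter', theme: 'red'   },
    store2: { adapter: 'Store2Adapter', theme: 'green' },
    store3: { adapter: 'Store3Adapter', theme: 'blue'  }
};

function getActiveConfig(storeId) {
    return CONFIGS[storeId] || CONFIGS.store1; // fall back to a default store
}

// e.g. after reading the chosen store from user input or a settings collection:
var config = getActiveConfig('store2');
WL.Logger.debug('Using adapter ' + config.adapter);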
To the best of my knowledge Direct Update will update every web resource of the skin you're using (html, css, js). However, I'm no expert with it.
If you're supporting only Android and iOS applications and need a way to store settings I recommend JSONStore. Otherwise look into Cordova Storage, Local Storage or IndexedDB.
Using a JSONStore collection called settings will allow you to store data on disk inside the app's directory. It will persist until you call one of the removal methods like destroy or until the application is uninstalled. There are also ways of linking collections to Worklight Adapters to pull/push data from/to a server. The links below will provide further details.
"the only change being certain settings within a js"
Create a collection for your settings:
var options = {};
options.onSuccess = function () {
    //... what to do after init finished
};
options.onFailure = function () {
    //... what to do if init fails
};
var settings = WL.JSONStore.initCollection('settings',
    {background: 'string', itemsPerPage: 'number'}, options);
You can add new settings after initCollection onSuccess has been called:
settings.add({background: 'red', itemsPerPage: 20}, options);
You can find settings stored after initCollection onSuccess has been called:
settings.findAll({onSuccess: function (results) {
    console.log(JSON.stringify(results));
}});
You can read more about JSONStore in the Getting Started Modules. See Modules: 7.9, 7.10, 7.11, 7.12. There is further information inside the API Documentation in IBM InfoCenter. The methods described above are: initCollection, add and findAll.
Since version 5.0.3 (I think), Direct Update will not update all the web resources, only those of the skin you are using.
Say you have skin def and skin skin2.
If you are on def:
Make a change to def on the server -> you will get a direct update for def only.
Make a change to skin2 on the server -> no direct update for you.
If you are on skin2:
Make a change to skin2 on the server -> direct update for skin2 only.
Make a change to def JavaScript that also ends up in skin2 (the end result being the def+skin2 concatenation) -> update for skin2 only.
Make a change to def that only touches a picture (with an extension listed in the application-descriptor) -> no direct update.
That's how direct update works.
Please also share some more details about the problem. I see you use a js file: where do you change it? What do you mean exactly? Give a better (simplified) real-life example, because it is unclear what you are trying to do.

How do I use the LookbackAPI for burnup charts?

I need a good example of using the LookbackAPI to get the data for a burn up chart. I see some limited questions and responses about the API, but no examples of how I would use it to do so. I need to get the current scope of story points and the story points completed.
Sorry for the scarcity of available examples. More and better examples will be coming as the LBAPI beta matures. I'd definitely recommend that you become familiar with the Lookback API (LBAPI) Documentation, as there are good examples there for formulating queries.
For a burnup, let's say you want to get the state Snapshots for an Iteration going from 15-Jan-2013 through 30-Jan-2013, and that the Iteration applies to a Project hierarchy that is four-deep. The following LBAPI query would obtain the PlanEstimate, ToDo, and ScheduleState for Stories scheduled into that Iteration:
{
    find: {
        _TypeHierarchy: "HierarchicalRequirement",
        Children: null,
        _ValidFrom: {
            $gte: "2013-01-15TZ",
            $lt: "2013-01-30TZ"
        },
        Iteration: {
            $in: [
                12345678910,
                12345678911,
                12345678912,
                12345678913
            ]
        }
    },
    fields: [
        "PlanEstimate",
        "ToDo",
        "ScheduleState"
    ]
}
Where:
$in: [
    12345678910,
    12345678911,
    12345678912,
    12345678913
]
are the ObjectIDs of the Iterations named "Iteration 1". It's probably easiest to get these ObjectIDs from a standard WSAPI query on Iterations: (Name = "Iteration 1"). For Iterations copied into a four-deep project hierarchy, we would see four Iteration OIDs, similar to the above.
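For illustration, such a WSAPI query could be issued as a plain GET (the host, the WSAPI version segment and the fetch list here are assumptions; adjust them to your subscription):
GET https://rally1.rallydev.com/slm/webservice/v2.0/iteration?query=(Name = "Iteration 1")&fetch=ObjectID,Name,Project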
For charting, the toughest part right now is an easy way to deal with the Time-Series data. The most robust way to query and process LBAPI data currently is by working directly against the REST endpoint and processing the returned JSON results in your own code.
With Javascript apps, for processing the data and turning it into a Chart, the preferred toolkit is AppSDK2, specifically the SnapshotStore.
For Javascript apps, the Lumenize javascript library is separate from LBAPI, but was developed by Rally's director of analytics and is bundled in the SDK. You can find some examples of using LBAPI and Lumenize to produce charting as part of some Rally-internal and Rally-customer Hackathon projects here:
https://github.com/RallyHackathon
Please be cautious with these examples for a couple of reasons:
Several aspects of the Lumenize namespace will be changing/renamed for clarity
There's a bug in the current version of Lumenize where its timeSeriesCalculator does not correctly account for stories deleted or reparented.
Hopefully there will be an updated version of AppSDK2 bundled and released soon to consolidate the Lumenize namespace and resolve the bug, so that there's better glue between AppSDK2 and LBAPI for Javascript App development.
Unfortunately, the .NET, Java and Python toolkits have not yet been updated to support the Lookback API. As a result, you'll have to do an HTTP POST to the Lookback API's REST endpoint directly, with a body similar to the one Mark W listed above and Content-Type 'application/json'.
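For example, from JavaScript such a POST could look roughly like this (the endpoint shape, the workspace OID, the response field name and the authentication handling are assumptions from memory; verify them against the LBAPI documentation):
// Sketch: POST the query from the answer above to the LBAPI REST endpoint.
var workspaceOid = 12345; // your workspace ObjectID
var url = 'https://rally1.rallydev.com/analytics/v2.0/service/rally/' +
          'workspace/' + workspaceOid + '/artifact/snapshot/query.js';

var query = {
    find: {
        _TypeHierarchy: 'HierarchicalRequirement',
        Children: null,
        _ValidFrom: { $gte: '2013-01-15TZ', $lt: '2013-01-30TZ' },
        Iteration: { $in: [12345678910, 12345678911] }
    },
    fields: ['PlanEstimate', 'ToDo', 'ScheduleState']
};

var xhr = new XMLHttpRequest();
xhr.open('POST', url, true);
xhr.setRequestHeader('Content-Type', 'application/json');
// Add authentication here (e.g. Basic auth or an API key header).
xhr.onload = function () {
    var snapshots = JSON.parse(xhr.responseText).Results; // field name assumed
    console.log(snapshots.length + ' snapshots returned');
};
xhr.send(JSON.stringify(query));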
I'd recommend using the Chrome extension 'XHR Poster' to experiment with what you're sending from a browser:
https://chrome.google.com/webstore/detail/xhr-poster/akdbimilobjkfhgamdhneckaifceicen