I have a table which gets its data server-side, using custom server-side initialization params which vary depending upon which report is produced. Once the table is generated, the user may open a popup in which they can add multiple additional filters on which to search. I need to be able to use the same initialization params as the original table, and add the new ones using fnServerParams.
I can't figure out how to get the original initialization params using the datatables API. I had thought I could get a reference to the object, get the settings using fnSettings, and pass those settings into a new datatables instance like so:
var oSettings = $('#myTable').dataTable().fnSettings();
// add additional params to the oSettings object
$('#myTable').dataTable(oSettings);
but the variable returned through fnSettings isn't what I need and doesn't work.
At this point, it seems like I'm going to re-architect things so that I can pass the initialization params around as a variable and add params as needed, unless somebody can steer me in the right direction.
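A rough sketch of that fallback, assuming DataTables 1.9-style options; the ajax source, the currentReport variable, and the "report" parameter below are placeholders, not my real code:

var currentReport = "sales";                     // placeholder
// keep the report-specific initialisation options in one reusable object
var baseOptions = {
    "bServerSide": true,
    "sAjaxSource": "/reports/data",              // assumed endpoint
    "bDestroy": true,                            // allow re-initialising the same table
    "fnServerParams": function (aoData) {
        aoData.push({ "name": "report", "value": currentReport });  // assumed custom param
    }
};
$('#myTable').dataTable(baseOptions);

// later, when the popup filters are applied, extend a copy of the same options
var withFilters = $.extend(true, {}, baseOptions, {
    "fnServerParams": function (aoData) {
        baseOptions.fnServerParams(aoData);
        aoData.push({ "name": "my-new-filter", "value": "my-new-filter-value" });
    }
});
$('#myTable').dataTable(withFilters);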
EDIT:
Following tduchateau's answer below, I was able to get partway there by using
var oTable = $('#myTable').dataTable(),
    oSettings = oTable.fnSettings(),
    oParams = oTable.oApi._fnAjaxParameters(oSettings);

oParams.push({ 'name': 'my-new-filter', 'value': 'my-new-filter-value' });
and can confirm that my new serverside params are added on to the existing params.
However, I'm still not quite there.
$('#myTable').dataTable(oSettings);
gives the error:
DataTables warning(table id = 'myTable'): Cannot reinitialise DataTable.
To retrieve the DataTables object for this table, please pass either no arguments
to the dataTable() function, or set bRetrieve to true.
Alternatively, to destroy the old table and create a new one, set bDestroy to true.
Setting
oTable.bRetrieve = true;
doesn't get rid of the error, and setting
oSettings.bRetrieve = true;
causes the table to not execute the ajax call. Setting
oSettings.bDestroy = true;
loses all the custom params, while setting
oTable.bDestroy = true;
returns the above error. And simply calling
oTable.fnDraw();
causes the table to be redrawn with its original settings.
Finally got it to work using fnServerParams. Note that I'm both deleting unnecessary params and adding new ones, using an object of URL variables (oUrlvars below):
"fnServerParams": function ( aoData ) {
var l = aoData.length;
// remove unneeded server params
for (var i = 0; i < l; ++i) {
// if param name starts with bRegex_, sSearch_, mDataProp_, bSearchable_, or bSortable_, remove it from the array
if (aoData[i].name.search(/bRegex_|sSearch_|mDataProp_|bSearchable_|bSortable_/) !== -1 ){
aoData.splice(i, 1);
// since we've removed an element from the array, we need to decrement both the index and the length vars
--i;
--l;
}
}
// add the url variables to the server array
for (i in oUrlvars) {
aoData.push( { "name": i, "value": oUrlvars[i]} );
}
}
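A minimal sketch of how the callback above is wired up, assuming oUrlvars is a plain object that the filter popup fills in; the ajax source is a placeholder:

var oUrlvars = {};                                // filled in by the filter popup

var oTable = $('#myTable').dataTable({
    "bServerSide": true,
    "sAjaxSource": "/reports/data",               // placeholder endpoint
    "fnServerParams": function (aoData) {
        // ... the callback shown above ...
    }
});

// when the user applies filters in the popup:
oUrlvars["my-new-filter"] = "my-new-filter-value";
oTable.fnDraw();   // triggers a new server-side request, which re-runs fnServerParams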
This is normally the right way to retrieve the initialization settings:
var oSettings = oTable.fnSettings();
Why is it not what you need? What's wrong with these params?
If you need to filter data depending on your additional filters, you can complete the array of "AJAX data" sent to the server using this:
var oTable = $('#myTable').dataTable();
var oParams = oTable.oApi._fnAjaxParameters( oTable.fnSettings() );
oParams.push({ name: "your-additional-param-name", value: "your-additional-param-value" });
You can see some example usages in the TableTools plugin.
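Roughly, that pattern looks like this (a hedged sketch only; sAjaxSource is whatever the table was initialised with, and the success handler is left to you):

var oTable = $('#myTable').dataTable(),
    oSettings = oTable.fnSettings(),
    aoParams = oTable.oApi._fnAjaxParameters(oSettings);

// add your extra filter to the collected server-side parameters
aoParams.push({ "name": "my-new-filter", "value": "my-new-filter-value" });

// send them to the same source the table already uses
$.ajax({
    url: oSettings.sAjaxSource,
    data: aoParams,
    dataType: "json",
    success: function (json) {
        // do something with the filtered result
    }
});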
But I'm not sure this is what you need... :-)
Related
I am trying to add a new row in DataTables, using the API method .any() to check whether the id already exists in the rows; if it exists I will not add a new row to my DataTable. Here is the result of my request from the database: http://pastie.org/10196001 . But I am having trouble with the check.
socket.on('displayupdate', function (data) {
    var dataarray = JSON.parse(data);
    dataarray.forEach(function (d) {
        if (table.row.DT_RowId(d.DT_RowId).any()) { // TypeError: table.row.DT_RowId is not a function
            console.log('already exist cannot be added');
        } else {
            table.row.add(d).draw();
        }
    });
});
Thank you in advance.
You get the error, of course, because DT_RowId is not a function in the API. But DT_RowId is in fact the one and only property that gets some special treatment from DataTables:
By assigning the ID you want to apply to each row using the property
DT_RowId of the data source object for each row, DataTables will
automatically add it for you.
So why not check rows() for that automatically injected id along with any()?
socket.on('displayupdate', function (data) {
    var DT_RowId,
        dataarray = JSON.parse(data);
    dataarray.forEach(function (d) {
        DT_RowId = d.DT_RowId;
        if (table.rows('[id=' + DT_RowId + ']').any()) {
            console.log('already exist cannot be added');
        } else {
            table.row.add(d).draw();
        }
    });
});
simplified demo -> http://jsfiddle.net/f1yyuz1c/
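A slight variation on the same idea, assuming a DataTables 1.10 build recent enough to provide any(): since DT_RowId ends up as the tr element's id, a plain id selector works too (same socket handler as above):

dataarray.forEach(function (d) {
    // '#' + id is an ordinary jQuery selector matching the <tr> whose id is d.DT_RowId
    if (table.rows('#' + d.DT_RowId).any()) {
        console.log('already exists, not adding');
    } else {
        table.row.add(d).draw();
    }
});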
I am trying to duplicate an object every time I click it in Verold. I have attached the Object Picker to the scene and successfully triggered a function which prints to the console.
I've tried this code but I get a Type Error - can't read property of undefined.
var xxx = this.getEntity().clone();
var threeDataxxx = xxx.getThreeData();
threeDataxxx.position.x += Math.random() * 5;
The clone() method is asynchronous (because the same method would be used if you were creating persistent copies of your objects on the server). This function, like many functions in the Verold API, takes an 'options' object as a parameter. In here, you need to specify the 'success' callback method like in the following example. Once you have the clone, you then need to add it to the scene hierarchy using the addChild() method. This will automatically trigger the cloned object to load.
var parent = this.getEntity().getParentObject();
this.getEntity().clone({
    success: function (newEntity) {
        parent.addChild(newEntity);
        var position = newEntity.getPosition();
        position.x += Math.random() * 10;
        newEntity.setPosition(position.x, position.y, position.z);
    }
});
The multiple steps are useful because you may want to clone several objects and have them ready to add to the scene at a later time.
And, of course, if you don't require the cloned object to have components or any of the other functionality of a VeroldObject, you can always just get the threeData and then use Three.JS's clone() method.
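A minimal sketch of that simpler route, assuming the picked entity has finished loading; note that only the Three.js object is copied, so no Verold components come along:

// grab the underlying Three.js object and clone it with Three.js' own clone()
var source = this.getEntity().getThreeData();
var copy = source.clone();

// offset the copy a little and attach it next to the original
copy.position.x += Math.random() * 5;
source.parent.add(copy);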
Hope that helps.
I am trying to develop a search filter and am making use of the HTML5 History API to reduce the number of requests sent to the server. If the user checks a checkbox to apply a certain filter, I save that data in the history state, so that when the user unchecks it I can load the data back from the history rather than fetching it again from the server.
When the user checks or unchecks a filter I change the window URL to match the filter that was set; for instance, if the user filters car brands by a certain category, I change the URL to something like 'cars?filter-brand[]=1'.
But when multiple filters are applied I have no way of figuring out whether to load the data from the server or from the history.
At the moment I am using the following code. The pushString variable is the new query string that will be created.
var back = [], forward = [];

if (back[back.length - 1] === decodeURI(pushString)) { // check last back val against the next URL to be created
    back.pop();
    forward.push(currentLocation);
    history.back();
    return true;
} else if (forward[forward.length - 1] === decodeURI(pushString)) {
    forward.pop();
    back.push(currentLocation);
    history.forward();
    return true;
} else {
    back.push(currentLocation); // add current win location
}
You can check if your filters are equivalent.
Comparing Objects
This is a simple function that takes two objects and lets you know if they're equivalent (note: not prototype safe, for simplicity).
function objEqual(a, b) {
    function toStr(o) {
        var keys = [], values = [];
        for (var k in o) {
            keys.push(k);
            values.push(o[k]);
        }
        keys.sort();
        values.sort();
        return JSON.stringify(keys)
            + JSON.stringify(values);
    }
    return toStr(a) === toStr(b);
}
Using the URL
Pass the query part of the URL (window.location.search) to this function. It'll give you an object you can compare to another object using the above function.
function parseURL(url) {
    // strip the leading "?" that window.location.search includes
    var obj = {}, parts = url.replace(/^\?/, "").split("&");
    for (var i = 0, part; (part = parts[i]); i++) {
        var x = part.split("="), k = x[0], v = x[1];
        obj[decodeURIComponent(k)] = decodeURIComponent(v || "");
    }
    return obj;
}
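A hypothetical usage of the two helpers together; previousSearch stands in for whatever query string you saved on the last request:

var currentFilters  = parseURL(window.location.search);
var previousFilters = parseURL(previousSearch);   // previousSearch saved earlier

if (objEqual(currentFilters, previousFilters)) {
    // same filters: reuse the data already in the history/cache
} else {
    // different filters: fetch fresh data from the server
}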
History API Objects
You can store the objects with the History API.
window.history.pushState(someObject, "", "someURL")
You can get this object back later from history.state or in a popstate handler.
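For example (the filter object here is made up), you can push the filters along with the URL and read them back when the user navigates:

var filters = { "filter-brand[]": "1" };
window.history.pushState(filters, "", "cars?filter-brand[]=1");

window.addEventListener("popstate", function (e) {
    var restored = e.state;   // the object passed to pushState, or null
    if (restored) {
        // re-render from cached data instead of hitting the server
    }
});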
Keeping Track of Things
If you pull the toStr function out of the first section, you can serialize the current filters. You can then store all of the states in an object, along with the data associated with each one.
When you're pushing a state, you can update your global cache object. This code should be in the handler for the AJAX response.
var key = toStr(parseURL(location.search));
cache[key] = dataFromTheServer;
Then abstract your AJAX function to check the cache first.
function getFilterResults(filters, callback) {
    var cached = cache[toStr(filters)];
    if (cached != null) callback(cached);
    else doSomeAJAXStuff().then(callback);
}
You can also use localStorage for more persistent caching; however, this requires more advanced code and a way to expire stale data.
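A rough sketch of that localStorage variant, with an arbitrary one-hour expiry; adapt it to your own needs:

function cacheSet(key, data) {
    localStorage.setItem(key, JSON.stringify({ t: Date.now(), data: data }));
}

function cacheGet(key, maxAgeMs) {
    var raw = localStorage.getItem(key);
    if (!raw) return null;
    var entry = JSON.parse(raw);
    if (Date.now() - entry.t > maxAgeMs) {   // stale: drop it
        localStorage.removeItem(key);
        return null;
    }
    return entry.data;
}

// usage, mirroring the in-memory cache above
cacheSet(key, dataFromTheServer);
var hit = cacheGet(key, 60 * 60 * 1000);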
I am new to Dojo. I am using QueryReadStore as the store for loading my TreeGrid, which works fine. But QueryReadStore appends some parameters to the URL, such as parentId, count, sort, etc. I have looked at this link, http://dojotoolkit.org/reference-guide/1.7/dojox/data/QueryReadStore.html, but I am not able to understand it.
Parameters are getting passed like this: servlet/DataHandler?start=0&count=25
How do I manipulate the parameters? For example, I want to set the value of the parentId parameter so that I only get the details of that particular row.
In theory you would have to create a new class by extending dojox.data.QueryReadStore; the link you posted has an example of doing exactly what you want. See if this makes it clearer (changed a bit):
dojo.require("dojox.data.QueryReadStore");
dojo.declare("custom.MyReadStore", dojox.data.QueryReadStore, {
fetch:function(request){
//append here your custom parameters:
var qs = {p1:"This is parameter 1",
q:request.query.name
}
request.serverQuery = qs;
// Call superclasses' fetch
return this.inherited("fetch", arguments);
}
});
So when it comes time to create the QueryReadStore, you actually create an object of the class you defined, something like this:
var queryReadStore = new custom.MyReadStore({args...})
Explore the request parameter passed to the function to see what else you can do.
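To tie it back to the parentId case, a hedged sketch (the url comes from the question; the id value "42" is made up):

dojo.require("dojox.data.QueryReadStore");

dojo.declare("custom.ParentIdStore", dojox.data.QueryReadStore, {
    fetch: function (request) {
        // merge the grid's generated query with the extra server-side filter
        request.serverQuery = dojo.mixin({}, request.query, { parentId: "42" });
        return this.inherited("fetch", arguments);
    }
});

var store = new custom.ParentIdStore({ url: "servlet/DataHandler" });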
I create a JsonStore with a JSON-formatted array of objects.
I have verified it is properly formatted.
I then try to use a dojo forEach loop on it, but the JsonStore doesn't seem to have any data in it. If I put the target in my browser's URL it shows the right data, but console.log(myJsonStore) just shows an object and I don't see the data in Firebug. I also don't see any GET for the service providing the data. It's as if specifying the target path in the browser's URL fires the GET, but triggering it in postCreate, where my forEach is located, does not.
The answer from Ricardo is, I believe, a little incorrect, seeing as the JsonRest.query function returns a dojo.Deferred.
You have a REST call being made asynchronously through the store's read API, and once it returns values it will run whatever was set as the callback.
Try this for your loop iterator instead
storeObj.query({}).then(function (results) {
    dojo.forEach(results, function (obj) {
        console.log(obj);
    });
});
You can do this:
var storeObj = new JsonRest({
    target: "/some/resource"
});

storeObj.query({}).forEach(function (obj) { console.log(obj); });
That should do the trick.
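For reference, a minimal AMD version of the same thing, assuming Dojo 1.7+ module ids:

require(["dojo/store/JsonRest", "dojo/_base/array"], function (JsonRest, arrayUtil) {
    var storeObj = new JsonRest({ target: "/some/resource" });

    // query() returns QueryResults; then() fires once the GET completes
    storeObj.query({}).then(function (results) {
        arrayUtil.forEach(results, function (obj) {
            console.log(obj);
        });
    });
});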