Is it possible to point a dojo dgrid at the same REST store & query as a FilteringSelect?

Surely it should be possible to issue the same query against the same store to populate a dgrid (or any other form of grid) and the dropdown in a FilteringSelect with the same rows.
However it looks like the FilteringSelect needs a response of the form
{
    "identifier": "abbreviation",
    "label": "name",
    "items": [
        { "abbreviation": "AL", "name": "Alabama" },
        { "abbreviation": "AK", "name": "Alaska" },
        { "abbreviation": "WY", "name": "Wyoming" }
    ]
}
and the dgrid needs
[
    { "abbreviation": "AL", "name": "Alabama" },
    { "abbreviation": "AK", "name": "Alaska" },
    { "abbreviation": "WY", "name": "Wyoming" }
]
It seems that the identifier and label attributes coming from the store are completely superfluous, because you are getting the identifier via the identity API and in any case you can specify everything when you instantiate the FilteringSelect.
Yes, I know there are work-arounds - I could use two different stores or queries and get the server side to generate both of these based on some parameter in the query. But if I do this, will the changes propagate properly when I make changes via the dgrid? Or I could wrap the store API with something that puts the extra fields on the front of the response and pass the wrapped store into the FilteringSelect, but is there a simpler way?

Use your array to create a data store like:
var store = Observable(new Memory({
    idProperty: "abbreviation",
    data: [
        { "abbreviation": "AL", "name": "Alabama" },
        { "abbreviation": "AK", "name": "Alaska" },
        { "abbreviation": "WY", "name": "Wyoming" }
    ]
}));
Then, you can set this store on the dgrid like so:
dgrid.set('store', store);
For your FilteringSelect, you can pass the same store:
new FilteringSelect({
    searchAttr: "name",
    store: store
});
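For completeness, a minimal end-to-end sketch of sharing one store between both widgets might look like the following. It assumes the older dojo/store-based dgrid API (where the grid takes a store property; dgrid 0.4+ uses dstore and a collection property instead), and the node ids "grid" and "stateSelect" are placeholders. A JsonRest store wrapped in Observable can be dropped in where Memory is used here.
require([
    "dojo/store/Memory",
    "dojo/store/Observable",
    "dgrid/OnDemandGrid",
    "dijit/form/FilteringSelect",
    "dojo/domReady!"
], function (Memory, Observable, OnDemandGrid, FilteringSelect) {
    // One shared, observable store for both widgets
    var store = Observable(new Memory({
        idProperty: "abbreviation",
        data: [
            { abbreviation: "AL", name: "Alabama" },
            { abbreviation: "AK", name: "Alaska" },
            { abbreviation: "WY", name: "Wyoming" }
        ]
    }));

    // The grid reads its rows from the store
    var grid = new OnDemandGrid({
        store: store,
        columns: {
            abbreviation: "Abbreviation",
            name: "Name"
        }
    }, "grid");
    grid.startup();

    // The FilteringSelect reads its options from the same store;
    // the value comes from idProperty, the label from searchAttr
    var select = new FilteringSelect({
        store: store,
        searchAttr: "name"
    }, "stateSelect");
    select.startup();
});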

Related

Mimic the (Show All) link in datatables.net

I have a situation where I want to get the full data set from the backend as a CSV file. I have already prepared the backend for that, but normally the front-end state (the filters) is not in contact with the backend unless I send a request, so I managed to solve the problem by mimicking the process of showing all data, but via a custom button and a plain GET request (not an AJAX request). Note that I am using serverSide: true in DataTables.
I prepared the backend to receive a request like (Show All), but I want that request to be sent by a custom button (Export All), not by the show process itself (as in the screenshot below), because actually showing all the data in the table is not practical at all.
This is the code for the custom button:
{
    text: "Export All",
    action: function (e, dt, node, config) {
        // get the backend file here
    },
},
So, how can I send the same request that (Show All) sends, but from a custom button? I have prepared the server to respond with the CSV file; I just need a way to build the same URL that Show All uses and send it as a plain GET request (not AJAX).
If you are using serverSide: true, that should mean you have too much data to use the default (serverSide: false) - because the browser/DataTables cannot handle the volume. For this reason I would say you should also not try to use the browser to generate a full export - it's going to be too much data (otherwise, why did you choose to use serverSide: true?).
Instead, use a server-side export utility - not DataTables.
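If the server already knows how to build the CSV, the custom button from the question only needs to navigate to that endpoint with a plain GET; a minimal sketch (the /export-all URL is a placeholder for whatever route you exposed) could look like:
{
    text: "Export All",
    action: function (e, dt, node, config) {
        // Optionally forward the current filters/ordering so the server can apply them;
        // ajax.params() returns the payload DataTables sent with its last request.
        var params = $.param(dt.ajax.params() || {});
        // Plain GET navigation (not AJAX) - the browser downloads the CSV response.
        window.location = '/export-all?' + params;
    }
}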
But if you still want to pursue this approach, you can build a custom button which downloads the entire data set into DataTables (in your browser) and then exports that complete data to Excel.
Full Disclosure:
This approach is inspired by the following DataTables forum post:
Customizing the data from export buttons
The following approach requires you to have a separate REST endpoint which delivers the entire data set as a JSON response (by contrast, the standard response should only be one page of data for the actual table data display and pagination.)
How you set up this endpoint is up to you (in Laravel, in your case).
Step 1: Create a custom button:
I tested with Excel, but you can do CSV, if you prefer.
buttons: [
    {
        extend: 'excelHtml5', // or 'csvHtml5'
        text: 'All Data to Excel', // or CSV if you prefer
        exportOptions: {
            customizeData: function (d) {
                var exportBody = GetDataToExport();
                d.body.length = 0;
                d.body.push.apply(d.body, exportBody);
            }
        }
    }
],
Step 2: The export function, used by the above button:
function GetDataToExport() {
    // Synchronous request, so the data is available before the export file is written
    var jsonResult = $.ajax({
        url: '[your_GET_EVERYTHING_url_goes_here]',
        success: function (result) {},
        async: false
    });
    var exportBody = jsonResult.responseJSON.data;
    // Convert each row object into an array of cell values
    return exportBody.map(function (el) {
        return Object.keys(el).map(function (key) {
            return el[key];
        });
    });
}
In the above code, my assumption is that the JSON response has the standard DataTables object structure - so, something like:
{
    "data": [
        {
            "id": "1",
            "name": "Tiger Nixon",
            "position": "System Architect",
            "salary": "$320,800",
            "start_date": "2011/04/25",
            "office": "Edinburgh",
            "extn": "5421"
        },
        {
            "id": "2",
            "name": "Garrett Winters",
            "position": "Accountant",
            "salary": "$170,750",
            "start_date": "2011/07/25",
            "office": "Tokyo",
            "extn": "8422"
        },
        {
            "id": "3",
            "name": "Ashton Cox",
            "position": "Junior Technical Author",
            "salary": "$86,000",
            "start_date": "2009/01/12",
            "office": "San Francisco",
            "extn": "1562"
        }
    ]
}
So, it's an object, containing a data array.
The DataTables customizeData function is what controls writing this complete JSON to the Excel file.
Overall, your DataTables code will look something like this:
$(document).ready(function() {
    $('#example').DataTable({
        serverSide: true,
        dom: 'Brftip',
        buttons: [
            {
                extend: 'excelHtml5',
                text: 'All Data to Excel',
                exportOptions: {
                    customizeData: function (d) {
                        var exportBody = GetDataToExport();
                        d.body.length = 0;
                        d.body.push.apply(d.body, exportBody);
                    }
                }
            }
        ],
        ajax: {
            url: "[your_SINGLE_PAGE_url_goes_here]"
        },
        columns: [
            { title: "ID", data: "id" },
            { title: "Name", data: "name" },
            { title: "Position", data: "position" },
            { title: "Salary", data: "salary" },
            { title: "Start Date", data: "start_date" },
            { title: "Office", data: "office" },
            { title: "Extn.", data: "extn" }
        ]
    });
});
function GetDataToExport() {
    var jsonResult = $.ajax({
        url: '[your_GET_EVERYTHING_url_goes_here]',
        success: function (result) {},
        async: false
    });
    var exportBody = jsonResult.responseJSON.data;
    return exportBody.map(function (el) {
        return Object.keys(el).map(function (key) {
            return el[key];
        });
    });
}
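One caveat worth noting: the function above relies on Object.keys() returning each row's properties in the same order as the table columns. If you prefer to make that explicit, a variant of GetDataToExport (using the same keys as the columns option above) could look like this:
function GetDataToExport() {
    // Explicit column order, matching the "columns" option of the table above
    var columnKeys = ['id', 'name', 'position', 'salary', 'start_date', 'office', 'extn'];
    var jsonResult = $.ajax({
        url: '[your_GET_EVERYTHING_url_goes_here]',
        success: function (result) {},
        async: false
    });
    return jsonResult.responseJSON.data.map(function (row) {
        return columnKeys.map(function (key) {
            return row[key];
        });
    });
}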
Just to repeat my initial warning: This is probably a bad idea, if you really needed to use serverSide: true because of the volume of data you have.
Use a server-side export tool instead - I'm sure Laravel/PHP has good support for generating Excel files.

Microsoft adaptive card choiceset iteration

I have an Adaptive Card choice set like the one below. As you can see, I am trying to get the value under title from a variable which is an array. Is there a way I can iterate the choice set automatically? I don't know how many values the array has, and I want to show all the values inside the array as choice set titles.
{
    "type": "Input.ChoiceSet",
    "isMultiSelect": true,
    "id": "myColor",
    "style": "compact",
    "value": "1",
    "choices": [
        {
            "title": vars.responsedata.items[0].topic,
            "value": "1"
        },
        {
            "title": vars.responsedata.items[1].topic,
            "value": "2"
        },
        {
            "title": "Recording 3 sample",
            "value": "3"
        }
    ]
}
You can use the map() function.
Example in DataWeave:
{
    choices: vars.responsedata.items map {
        title: $.topic,
        // $$ is the zero-based index of each item; use ($$ + 1) as String to keep "1", "2", ... values
        value: $$
    }
}
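For instance, if vars.responsedata.items held two items whose topic values were "Recording 1 sample" and "Recording 2 sample" (placeholders), the expression above would produce:
{
    "choices": [
        {
            "title": "Recording 1 sample",
            "value": 0
        },
        {
            "title": "Recording 2 sample",
            "value": 1
        }
    ]
}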

Importing Data to Contentful programmatically from a JSON file

I am trying to import some data programmatically into Contentful:
I am following the docs here
And running the command inside my integrated terminal
contentful space import --config config.json
Where the config file is
{
    "spaceId": "abc123",
    "managementToken": "112323132321adfWWExample",
    "contentFile": "./dataToImport.json"
}
And the dataToImport.json file is
{
    "data": [
        {
            "address": "11234 New York City"
        },
        {
            "address": "1212 New York City"
        }
    ]
}
The thing is, I don't understand what format my dataToImport.json should be in, and what is missing in this file or in my config file, so that the array of addresses from the .json file gets added as new entries to an already created content model inside the Contentful UI (shown in the screenshot below).
I am not specifying the content model for the data to go into, so I believe that is one issue, but I don't know how to do that. An example or repo would help me out greatly.
The types of data you can import are listed in their documentation.
Your JSON top level should say "entries" and not "data", if new content of a content type is what you would like to import.
This is an example of a blog post as per the content model of the tutorial they provide.
The only thing I didn't work out yet is where the user id goes :D so I substituted it with a link to the content type 'person', also provided in their tutorial (I think it's called the Gatsby Starter).
{"entries": [
{
"sys": {
"space": {
"sys": {
"type": "Link",
"linkType": "Space",
"id": "theSpaceIdToReceiveYourImport"
}
},
"type": "Entry",
"createdAt": "2019-04-17T00:56:24.722Z",
"updatedAt": "2019-04-27T09:11:56.769Z",
"environment": {
"sys": {
"id": "master",
"type": "Link",
"linkType": "Environment"
}
},
"publishedVersion": 149, -- these are not compulsory, you can skip
"publishedAt": "2019-04-27T09:11:56.769Z", -- you can skip
"firstPublishedAt": "2019-04-17T00:56:28.525Z", -- you can skip
"publishedCounter": 3, -- you can skip
"version": 150,
"publishedBy": { -- this is an example of a linked content
"sys": {
"type": "Link",
"linkType": "person",
"id": "personId"
}
},
"contentType": {
"sys": {
"type": "Link",
"linkType": "ContentType",
"id": "blogPost" -- here should be your content type 'RealtorProperties'
}
}
},
"fields": { -- here should go your content type fields, i can't see it in your post
"title": {
"en-US": "Test 1"
},
"slug": {
"en-US": "Test-1"
},
"description": {
"en-US": "some description"
},
"body": {
"en-US": "some body..."
},
"publishDate": {
"en-US": "2016-12-19"
},
"heroImage": { -- another example of a linked content
"en-US": {
"sys": {
"type": "Link",
"linkType": "Asset",
"id": "idOfTHisImage"
}
}
}
}
},
--another entry, ...]}
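Applied to the address data in the question, a minimal (and hypothetical) dataToImport.json could therefore look roughly like the following, assuming the content type id is RealtorProperties, the field id is address, and the default locale is en-US. The exact set of required sys fields may differ, so running contentful space export against a space that already contains such an entry is the easiest way to see the precise shape the importer expects.
{
  "entries": [
    {
      "sys": {
        "contentType": {
          "sys": {
            "type": "Link",
            "linkType": "ContentType",
            "id": "RealtorProperties"
          }
        }
      },
      "fields": {
        "address": {
          "en-US": "11234 New York City"
        }
      }
    }
  ]
}
The second address would just be another object of the same shape in the entries array.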
Have a look at this repo. I am also trying to figure this out. It looks like there are quite a lot of fields that need to be included in the JSON file. I was hoping there'd be a simple solution, but it seems you (me too, actually) will need to create scripts to "convert" your JSON file into data Contentful can read and import.
I'll let you know if I find anything better.
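For illustration only, a small Node script along those lines could wrap each address from dataToImport.json in the entry shape sketched above; the content type id, field id, and locale are assumptions:
// convert.js - hypothetical helper: wraps plain address records in the
// "entries" format that `contentful space import` expects.
const fs = require('fs');

const raw = JSON.parse(fs.readFileSync('./dataToImport.json', 'utf8'));

const entries = raw.data.map(function (item) {
    return {
        sys: {
            contentType: {
                sys: { type: 'Link', linkType: 'ContentType', id: 'RealtorProperties' } // assumed content type id
            }
        },
        fields: {
            address: { 'en-US': item.address } // assumed field id and locale
        }
    };
});

fs.writeFileSync('./contentfulImport.json', JSON.stringify({ entries: entries }, null, 2));
console.log('Wrote ' + entries.length + ' entries to contentfulImport.json');
You would then point contentFile in config.json at the generated contentfulImport.json.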

Update amCharts 4 chart dynamically using Vue.js

I'm using amCharts 4 with Vue.js. For the moment I've added the default chart when the page loads, but when I try to add dynamic values after the page loads (using a button click), it doesn't reflect on the view.
Code: gist
data: () => ({
    dataForChart: {
        data: [{
            "name": "Anne",
            "steps": 32
        }, {
            "name": "Rose",
            "steps": 30
        }, {
            "name": "Jane",
            "steps": 25
        }]
    }
}),
Chart creation in mounted():
let chart = am4core.create("chart-div", am4charts.XYChart);
chart.data = this.dataForChart.data
When I dynamically change the values using a button click, the data doesn't reflect on the chart.
This is the method I use to change the data set:
this.dataForChart.data = {
    data: [{
        "name": "Anne",
        "steps": 54
    }, {
        "name": "Rose",
        "steps": 44
    }, {
        "name": "Jane",
        "steps": 33
    }]
}
The reason for this is that although this.dataForChart.data is reactive (it's a Vue instance property), chart.data (the amCharts chart property) is not.
You have to manually set the chart.data array whenever your dataForChart.data property changes. I've done this by setting a deep watcher on dataForChart, like so:
watch: {
    dataForChart: {
        handler(newValue) {
            // assumes the chart instance was saved as this.chart in mounted()
            this.chart.data = newValue.data;
        },
        deep: true
    }
}
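Putting it together, a minimal sketch of the component (assuming the chart is kept on this.chart rather than in a local variable, so the watcher can reach it; the axes/series setup from the gist is omitted) might look like:
import * as am4core from "@amcharts/amcharts4/core";
import * as am4charts from "@amcharts/amcharts4/charts";

export default {
    data: () => ({
        dataForChart: {
            data: [
                { name: "Anne", steps: 32 },
                { name: "Rose", steps: 30 },
                { name: "Jane", steps: 25 }
            ]
        }
    }),
    mounted() {
        // Keep the chart instance on the component (non-reactive) so the watcher can use it
        this.chart = am4core.create("chart-div", am4charts.XYChart);
        this.chart.data = this.dataForChart.data;
        // ...axes and series configuration as in the gist...
    },
    beforeDestroy() {
        // Dispose the chart when the component is destroyed
        if (this.chart) {
            this.chart.dispose();
        }
    },
    methods: {
        updateData() {
            // Assign a new array (not a new { data: [...] } wrapper) so the watcher
            // can pass it straight to the chart
            this.dataForChart.data = [
                { name: "Anne", steps: 54 },
                { name: "Rose", steps: 44 },
                { name: "Jane", steps: 33 }
            ];
        }
    },
    watch: {
        dataForChart: {
            handler(newValue) {
                // Re-assigning chart.data makes amCharts re-parse and redraw
                this.chart.data = newValue.data;
            },
            deep: true
        }
    }
};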
Cheers!

express-graphql: How to remove external "data" object layer.

I am replacing an existing REST endpoint with GraphQL.
In our existing REST endpoint, we return a JSON array.
[
    { "id": "ABC" },
    { "id": "123" },
    { "id": "xyz" },
    { "id": "789" }
]
GraphQL seems to be wrapping the array in two additional object layers. Is there any way to remove the "data" and "Client" layers?
Response data:
{
    "data": {
        "Client": [
            { "id": "ABC" },
            { "id": "123" },
            { "id": "xyz" },
            { "id": "789" }
        ]
    }
}
My query:
{
    Client(accountId: "5417727750494381532d735a") {
        id
    }
}
No. That is the whole point of GraphQL: to have a single endpoint and let clients fetch different types and granularities of data by specifying the shape they want in a query (as opposed to REST APIs), with the result mapped onto the returned JSON output.
'data' acts as a parent/root-level container for the different entities you have queried. Without these keys in the returned JSON, there would be no way to segregate the corresponding data. For example, your query above can be modified to include another entity like Owner:
{
    Client(accountId: "5417727750494381532d735a") {
        id
    }
    Owner {
        id
    }
}
In which case, the output will be something like
{
    "data": {
        "Client": [
            ...
        ],
        "Owner": [
            ...
        ]
    }
}
Without the 'Client' and 'Owner' keys in the JSON output, there is no way to separate the corresponding array values.
In your case, you can get only the array by doing data.Client on the returned output.
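For illustration, assuming the express-graphql endpoint is mounted at /graphql, the client-side unwrapping is a one-liner once the response arrives:
// Hypothetical client-side call: unwrap the plain array from the GraphQL envelope
fetch('/graphql', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
        query: '{ Client(accountId: "5417727750494381532d735a") { id } }'
    })
})
    .then(function (res) { return res.json(); })
    .then(function (result) {
        var clients = result.data.Client; // same shape as the old REST response
        console.log(clients);
    });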