I'm looking at Keen IO Dashboards, but don't see drag & drop functionality in examples. Is that possible and I'm just missing it, or is it planned? Thanks.
Keen released a new point-and-click dashboard creator that can be accessed when you're logged into your account. A few clicks can build metrics into dashboards you can share with your teammates. https://keen.io/blog/146028881101/announcing-keen-dashboards
Fully customizable external or customer-facing dashboards can be created using keen-dataviz.js. If you're into templates, some Bootstrap dashboard templates can be downloaded from GitHub too.
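If you go the keen-dataviz.js route, a minimal sketch looks roughly like this (the project credentials and the 'pageviews' collection are placeholders, and the exact API surface may differ between library versions):

    // Minimal sketch, assuming keen-analysis.js and keen-dataviz.js are loaded.
    // PROJECT_ID, READ_KEY and the 'pageviews' collection are placeholders.
    const client = new KeenAnalysis({
      projectId: 'PROJECT_ID',
      readKey: 'READ_KEY'
    });

    const chart = new KeenDataviz({
      container: '#pageviews-chart',   // a div on your dashboard page
      title: 'Pageviews (last 14 days)'
    });

    client
      .query({
        analysis_type: 'count',
        event_collection: 'pageviews',
        timeframe: 'this_14_days',
        interval: 'daily'
      })
      .then(res => chart.render(res))
      .catch(err => chart.message(err.message));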
Drag and drop isn't currently supported, but the team is very open to feature requests. I'd suggest filing an issue on the GitHub repository – keen/dashboards.
If you want to discuss it further, Keen has a developer group.
I'm trying to automate a process that requires downloading a custom report created in Shopify Admin daily. I tried resorting to the Shopify API to pull orders via the Order API; however, that requires paging through results, which took several minutes for a few thousand orders, while manually exporting the CSV file from the Shopify Admin page takes only a few seconds.
So I'm just wondering if it's possible to implement a service that exports those custom reports created in Shopify Admin into a CSV file without human interaction?
Cheers!
Sure. Just write some JavaScript to run in the admin that will do the custom report for you, without pushing all the buttons you'd push manually. You can fake being human that way and automate things. Once you have that straightened away, you could set up a service that would run that script automatically whenever you wanted, say on a schedule.
Of course, the old pokey way you pointed out is probably way easier, but as you say, it takes minutes and hey, who has minutes to spare!
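That said, if you do stick with the API route, a rough Node sketch of paging through the Orders endpoint looks like this (the shop domain and access token are placeholders; the Link-header cursor pagination applies to recent REST Admin API versions):

    // Rough sketch, not production code. SHOP and ACCESS_TOKEN are placeholders.
    const SHOP = 'your-shop.myshopify.com';
    const ACCESS_TOKEN = 'YOUR_ADMIN_API_TOKEN';

    async function fetchAllOrders() {
      let url = `https://${SHOP}/admin/api/2023-01/orders.json?limit=250&status=any`;
      const orders = [];
      while (url) {
        const res = await fetch(url, {
          headers: { 'X-Shopify-Access-Token': ACCESS_TOKEN }
        });
        const body = await res.json();
        orders.push(...body.orders);
        // Cursor pagination: the next page URL arrives in the Link header.
        const link = res.headers.get('link');
        const next = link && link.match(/<([^>]+)>;\s*rel="next"/);
        url = next ? next[1] : null;
      }
      return orders;
    }

    // Dump a couple of fields as CSV for illustration.
    fetchAllOrders().then(orders => {
      const csv = ['id,created_at,total_price']
        .concat(orders.map(o => `${o.id},${o.created_at},${o.total_price}`))
        .join('\n');
      console.log(csv);
    });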
I am not sure if it can solve your problem. I use a cloud data platform (not a Shopify app) called Acho to retrieve data from Shopify and then export the report to Google Sheets automatically. Sometimes I also create my own reports on the platform before exporting them to Google Sheets. I think it can reduce a lot of the manual work. You can try its free trial if you are interested.
I have been given an ETL project as a task, which requires me to ingest some data gleaned from GA into GoodData via an API and perform some ETL operations. The creation of reports and dashboards is also an integral part of this assignment.
It is my first time using this platform. If there is any way, method, or procedure you can recommend for doing this, that would be great.
Thanks
As you are new to GoodData, the developer tutorial may help you understand the basic concepts - https://help.gooddata.com/display/doc/GoodData+Developer+Tutorial
Specifically for data loads, there are several ways to load data into the GoodData platform.
You can look at https://help.gooddata.com/display/doc/Data+Loading as a starting point.
As you are specifically mentioning the API, I recommend this part of the documentation - https://help.gooddata.com/display/doc/Loading+Data+via+REST+API
In case you are interested in a component ready for direct communication with GA, you can use the "Google Analytics Reader" component (https://help.gooddata.com/cloudconnect/manual/gareader.html) within CloudConnect Designer (https://help.gooddata.com/display/doc/CloudConnect+Designer).
If you intend to transform data prior to loading it into the GoodData platform, CloudConnect can be utilised, or (depending on your GoodData environment) the Agile Datawarehousing Service (https://help.gooddata.com/pages/viewpage.action?pageId=34341138) tied to Automated Data Distribution (https://help.gooddata.com/display/doc/Automated+Data+Distribution) may be an option for you.
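To give a feel for the REST API route, the classic flow is to upload a CSV plus its upload_info.json manifest to the project's staging area over WebDAV and then trigger an ETL pull. A very rough sketch (all names are placeholders; hostnames, authentication, and endpoints depend on your GoodData environment, so treat this as an outline rather than a working client):

    // Very rough outline. GoodData authentication (SST/TT tokens) is omitted;
    // authHeaders is assumed to carry whatever credentials your setup needs.
    const PROJECT_ID = 'YOUR_PROJECT_ID';
    const STAGING = 'https://secure-di.gooddata.com/uploads/my-load-dir';

    async function loadData(csvContent, uploadInfoJson, authHeaders) {
      // 1. Upload the CSV and its manifest to the staging (WebDAV) area.
      await fetch(`${STAGING}/dataset.csv`, {
        method: 'PUT', headers: authHeaders, body: csvContent
      });
      await fetch(`${STAGING}/upload_info.json`, {
        method: 'PUT', headers: authHeaders, body: uploadInfoJson
      });
      // 2. Ask the platform to pull the staged files into the project.
      await fetch(`https://secure.gooddata.com/gdc/md/${PROJECT_ID}/etl/pull2`, {
        method: 'POST',
        headers: Object.assign({ 'Content-Type': 'application/json' }, authHeaders),
        body: JSON.stringify({ pullIntegration: 'my-load-dir' })
      });
    }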
Your question is quite general, but regarding GA and CloudConnect I recommend checking the example in the GoodData documentation:
https://help.gooddata.com/display/doc/CloudConnect+Example+Projects#CloudConnectExampleProjects-GoogleAnalyticsExampleProject
I have some scripts running from GSheet getting data from BigQuery. However, in order to make the files run, I need to manually enable the API every time for a given sheet.
So the question is: How to enable API within the code, so that if I share the GSheet or make a copy I don't have to go to the script editor and enable the API from there?
Thanks
I am a huge fan of this particular use of the Google ecosystem, so I'm happy to help get others up and running using GSheets with BigQuery! Hopefully it is working well for you!
When sharing the sheet with others, there is no need to alter anything in the script editor at all. The scripts should run and query BigQuery without issue; this has been my experience at least. The obvious caveat to this is that the users you share it with must have access to the Google Developer Project that the BigQuery instance is associated with.
However, when copying the sheet, I do not believe it is possible to have it replicate the connection. This is because when the file is copied, it becomes associated with a new Google Developer Project. Thus, you have to go into the script editor, then go to Resources > Developers Console Project and change the project listed to the one in which you have BigQuery enabled.
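For reference, the script side is fairly small once the right project is linked and the BigQuery advanced service is enabled for it. A minimal sketch (the project ID and query are placeholders):

    // Requires the BigQuery advanced service to be enabled for the script's
    // Developer Console project. PROJECT_ID is a placeholder.
    function runQuery() {
      var projectId = 'PROJECT_ID';
      var request = {
        query: 'SELECT word, word_count ' +
               'FROM `bigquery-public-data.samples.shakespeare` LIMIT 10',
        useLegacySql: false
      };
      var response = BigQuery.Jobs.query(request, projectId);

      // Flatten the response rows and write them to the active sheet.
      var rows = (response.rows || []).map(function (row) {
        return row.f.map(function (cell) { return cell.v; });
      });
      if (rows.length) {
        SpreadsheetApp.getActiveSheet()
          .getRange(1, 1, rows.length, rows[0].length)
          .setValues(rows);
      }
    }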
Hopefully this helps! Sorry I don't have better news for you!
I have installed and configured SSRS using SharePoint integrated deployment mode and have been able to successfully run a report from SharePoint. I created a custom deployment application that will upload all reports and datasets as well as create all data sources and make the proper connections between them when necessary.
I have one report that failed, and I need to manually mess with the report's connection to a data source, but I found that the drop-down does not contain the options that let me manage its shared data sources (see example below).
In this image you can see the option that I am missing. Please excuse the colors, this is the best image I could find online in a pinch.
This is only happening in one environment so there must be a configuration change I am not thinking of to show these options. Here are the things I have already checked:
The account I am using is in the site's Owners group and has full control of everything, including the report file.
The item is being uploaded as a Document content type for some reason, but I edited properties and changed that to Report Builder Report content type.
The Report Server Integration site collection feature has been activated.
All of the Reporting Service content types have been added to the list.
I would revert to deployment from BIDS to debug this issue. It will perform some validation during that process and possibly return meaningful errors.
So this turned out to be caused by one of our customizations. We had an old custom JavaScript function that was named the same as a SharePoint JavaScript function that has something to do with those drop-down actions. Hope this helps someone else.
I develop -- from time to time -- Yahoo open tables to access different resources on the web. Currently I am using a JavaScript editor, and -- when I want to test whether my open table works -- I upload the XML table description to a server to test it with a YQL client application. However, this approach is quite slow, and -- sometimes -- I get blocked by Yahoo because of a mistake in my open table description. Therefore I would like to learn about best practices for testing and developing Yahoo open tables locally. What does your setup for open table development look like?
To clarify my question, I am looking for any convenient way (best practice) to develop and test YQL tables, e.g., running part of the JavaScript inside Rhino.
First of all: I agree, I don't see a really convenient way to test YQL datatable definitions locally either. Nevertheless, here is how I approach this issue.
Hosting on GitHub
YQL datatable definitions are often used in very open scenarios, e.g., when there is an existing API that you want to wrap via YQL. Therefore I normally work on a fork of the YQL community tables and just add my own definitions there. The hosting of the .xml files takes place on GitHub in this case: https://github.com/yql/yql-tables
The other advantage of this approach is as well that it is easy for me to share my datatables with the community if I feel that they might be valuable for others as well.
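For reference, the files in that repository are plain XML table definitions, so there is not much tooling needed. A minimal one wrapping a made-up JSON endpoint looks roughly like this (the endpoint, the key, and the execute block are illustrative only; the JavaScript inside execute is what runs in Rhino on Yahoo's side):

    <?xml version="1.0" encoding="UTF-8"?>
    <table xmlns="http://query.yahooapis.com/v1/schema/table.xsd">
      <meta>
        <author>you</author>
        <sampleQuery>select * from {table} where id='42'</sampleQuery>
      </meta>
      <bindings>
        <select itemPath="" produces="JSON">
          <urls>
            <!-- example.com is a made-up endpoint for illustration -->
            <url>http://example.com/api/items/{id}</url>
          </urls>
          <inputs>
            <key id="id" type="xs:string" paramType="path" required="true"/>
          </inputs>
          <execute><![CDATA[
            // Runs server-side in Rhino; y.rest is YQL's built-in HTTP helper.
            var res = y.rest(request.url).accept('application/json').get();
            response.object = res.response;
          ]]></execute>
        </select>
      </bindings>
    </table>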
Hosting privately
The free GitHub account only comes with public repositories though, so everybody would be able to see and use your datatables. If that is not acceptable for you, then you could either pay for a GitHub plan with private repositories, or host the datatable definitions yourself.
To do that, you can upload them to your own server - as you are already doing - or you should also be able to set up a web server like Apache locally on your machine and then get a dynamic hostname from dyndns.com or similar, so that you can point to these definitions from YQL. I have not tried this because GitHub was working sufficiently well for me, but I am sure that it is possible.
Why don't you just put the file you are editing in a public Dropbox folder? This is what I do and it works pretty well.