API Key use with Tableau

We would like to use the Podio API Key to directly connect to Tableau and have the data refreshed at a cadence set in Tableau. Is this possible?

Yes, connecting to data via an API is possible, and there are a few ways to do it:
Option #1: Web Data Connector
A WDC is a hosted web application, built with JavaScript, that connects to an API, converts the data to JSON, and passes it to Tableau. You'll need a web server to host your WDC and JavaScript skills to write it. Once set up, anyone in your org can just grab the link and use it in Desktop. Because the data connection is made when the end user requests the WDC, you can build in customizations for your users (e.g. users can add filter parameters or authenticate with their own username/password to only get what they have access to). WDC connections are extracts and can be refreshed on Tableau Server and Tableau Online; if you're using Tableau Online, you'll need Bridge to auto-refresh.
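For a sense of scale, here is a minimal sketch of the JavaScript a WDC page might contain, using the WDC 2.x library. The API endpoint, field names, and submit button are illustrative assumptions, not Podio-specific code:

```javascript
// Minimal WDC sketch (assumes Tableau's WDC 2.x JavaScript library is loaded on the page,
// and that https://example.com/api/items is a hypothetical JSON endpoint).
(function () {
  var myConnector = tableau.makeConnector();

  // Tell Tableau what table and columns to expect.
  myConnector.getSchema = function (schemaCallback) {
    var cols = [
      { id: "id", dataType: tableau.dataTypeEnum.string },
      { id: "title", dataType: tableau.dataTypeEnum.string },
      { id: "created_on", dataType: tableau.dataTypeEnum.datetime }
    ];
    schemaCallback([{ id: "items", alias: "Items from the API", columns: cols }]);
  };

  // Fetch the API data and hand it to Tableau as rows.
  myConnector.getData = function (table, doneCallback) {
    fetch("https://example.com/api/items")                    // hypothetical endpoint
      .then(function (resp) { return resp.json(); })
      .then(function (json) {
        var rows = json.items.map(function (item) {
          return { id: item.id, title: item.title, created_on: item.created_on };
        });
        table.appendRows(rows);
        doneCallback();
      });
  };

  tableau.registerConnector(myConnector);

  // Start the connection when the user clicks a (hypothetical) button on the page.
  document.addEventListener("DOMContentLoaded", function () {
    document.getElementById("submitButton").addEventListener("click", function () {
      tableau.connectionName = "Items WDC";
      tableau.submit();
    });
  });
})();
```

The hosting page loads Tableau's WDC library plus this script, and end users simply point Tableau Desktop's Web Data Connector option at the page's URL.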
Option #2: Hyper API
The Hyper API allows you to create, modify, and update extract (.hyper) files that you can then publish to Tableau Server/Online on a regular cadence. It's available for Python, Java, .NET, and C++, so you'll need skills in one of those languages; I suggest Python, as we have the most samples for it. You'll also need a server where you can run the extract refresh and publish scripts on a schedule. With the Hyper API you create a single extract for everyone to connect to. Once it's published to Tableau Server/Online, end users can connect to this data source directly without any input, but this also means the connection can't be customized per user or use case.
Option #3: Use a 3rd-party connector
If building your own connector doesn't appeal to you, there are also plenty of paid services that can bring your data into Tableau; tray.io, Dataddo, and Skyvia are a few I found after a quick Google search.

Related

Cosmos DB REST API - HTTP Request

Is it possible to retrieve my data in Azure Cosmos DB in JSON format and share it with someone else without them accessing the actual environment? Something like an HTTP GET from SharePoint. I am new to Cosmos and APIs, so sorry if I am using the wrong terms here.
Update: attempting an Azure Function
I attempted to create an HTTPTrigger. Can I copy and paste the JSON into function.json and the JavaScript into index.js? I changed the databaseName and collectionName, but it doesn't return the Cosmos documents.
General
I think the easiest way to offer someone access to a specified collection would be to create an Azure Function. From the docs:
Azure Functions allows you to run small pieces of code (called "functions") without worrying about application infrastructure. With Azure Functions, the cloud infrastructure provides all the up-to-date servers you need to keep your application running at scale.
A function is "triggered" by a specific type of event. Supported triggers include responding to changes in data, responding to messages, running on a schedule, or as the result of an HTTP request.
C#
Here's an example of how this might look if you want to query documents by id:
https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-cosmosdb-v2-input?tabs=csharp#http-trigger-look-up-id-from-query-string
If you need more complex queries to be executed, take a look at this section of the above-mentioned documentation:
https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-cosmosdb-v2-input?tabs=csharp#http-trigger-look-up-id-from-route-data-using-sqlquery
So basically this lets you expose an HTTP endpoint that's configured to run a specific query against your Cosmos DB instance.
JavaScript
An example of how to set up a CosmosDB instance and create functions for CRUD operations in JS can be found here:
https://dev.to/vidamrr/cosmos-db-crud-operations-using-azure-functions-4d27
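On the update in the question: yes, roughly - the binding configuration goes in function.json and the handler code in index.js. Below is a hedged sketch of that pairing for a lookup by id; the database/collection names, the CosmosDBConnection app setting, and the query-string parameter names are assumptions you'd match to your own instance (and note that newer versions of the Cosmos DB extension rename some binding properties, e.g. collectionName/connectionStringSetting become containerName/connection).

```json
{
  "bindings": [
    {
      "authLevel": "function",
      "type": "httpTrigger",
      "direction": "in",
      "name": "req",
      "methods": ["get"]
    },
    {
      "type": "cosmosDB",
      "direction": "in",
      "name": "inputDocument",
      "databaseName": "MyDatabase",
      "collectionName": "MyCollection",
      "connectionStringSetting": "CosmosDBConnection",
      "id": "{Query.id}",
      "partitionKey": "{Query.partitionKeyValue}"
    },
    {
      "type": "http",
      "direction": "out",
      "name": "res"
    }
  ]
}
```

```javascript
// index.js - returns the document the Cosmos DB input binding looked up, or a 404 if none matched.
module.exports = async function (context, req) {
  const doc = context.bindings.inputDocument;
  if (!doc) {
    context.res = { status: 404, body: "No document found for that id" };
    return;
  }
  // Only the JSON document is returned; the caller never touches the Cosmos account itself.
  context.res = {
    status: 200,
    headers: { "Content-Type": "application/json" },
    body: doc
  };
};
```

You would then call it with something like GET https://<yourapp>.azurewebsites.net/api/<functionname>?id=<docId>&partitionKeyValue=<pk>&code=<function key>, and the caller only ever sees the returned JSON. For the more complex-query case from the C# section, the same input binding accepts a sqlQuery property in place of id/partitionKey.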

Can we access data from ADLA tables by creating an OData source using REST APIs?

Is there a way to establish OData on top of Azure Data Lake Analytics tables via REST APIs?
It seems there are REST APIs to get ADLA job information, account information, etc.
Are there any such existing APIs to get data, or is it possible to create an API to access data via the OData concept?
If you want to access data in ADLS files, there are REST APIs to get data from the lake; ADLS supports WebHDFS APIs with OAuth.
If you want to send queries and see their results, or get U-SQL table data, you would have to write your own shim that translates the query you submit via your API into a U-SQL script that outputs a file, then transparently downloads the file and returns it as the result.
Note that so-called interactive support is on the roadmap and being worked on. Once that is available, you will be able to access the data using standard query APIs (such as ODBC, JDBC, etc.).
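As a rough illustration of the first point, reading a file from an ADLS (Gen1) account over its WebHDFS-compatible REST API with an OAuth bearer token could look like the sketch below; the account name, file path, and api-version are placeholders, and the token would come from Azure AD (check the ADLS REST docs for the exact query parameters):

```javascript
// Hedged sketch: read a file from an ADLS Gen1 account via the WebHDFS-style REST API.
// Assumes Node 18+ (global fetch) and that an Azure AD access token is available.
const account = "myadlsaccount";                    // placeholder account name
const filePath = "/clickstream/2024/output.csv";    // placeholder file path
const accessToken = process.env.ADLS_ACCESS_TOKEN;  // OAuth bearer token from Azure AD

async function readAdlsFile() {
  const url = `https://${account}.azuredatalakestore.net/webhdfs/v1${filePath}` +
              `?op=OPEN&read=true&api-version=2018-09-01`;
  const resp = await fetch(url, {
    headers: { Authorization: `Bearer ${accessToken}` }
  });
  if (!resp.ok) {
    throw new Error(`ADLS read failed: ${resp.status} ${resp.statusText}`);
  }
  return resp.text(); // the raw file contents
}

readAdlsFile().then(console.log).catch(console.error);
```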

Is Tableau able to access data dynamically?

Usually a Tableau dashboard operates on "static" data that is "attached" to the published dashboard. I wonder if it is possible to make Tableau read data on the fly (when a user interacts with it). By that I mean that the data to be visualized is taken from a database that can be "dynamic": for example, the data shown by Tableau today and yesterday may not be the same because the content of the database might change. Alternatively, we might try to retrieve data from an API: for example, Tableau sends a request to an HTTP server, gets a data table in the form of JSON, and then visualizes it. Is Tableau able to do that?
Yes, Tableau can connect to live data sources such as any number of database technologies. No, it cannot send HTTP requests for JSON directly, but it does have a Web Data Connector feature if you or someone else has built that web service. Here are some tips on when to use live connections versus taking an extract: http://mindmajix.com/use-direct-connection-data-extract-tableau/

What's the easiest/cheapest way to create a cloud-based SQL database?

I have a website that I've built (hosted on Amazon S3) and it works great. The only problem is that all of the data is static. I'd like to create a SQL database in the cloud that would allow me to store basic text data from users after they submit forms. I'm still a novice web developer, but I've used sqlite3 for several of my Java desktop apps and I'd like to use that SQL knowledge to create this online database. I guess what I'm asking (in my ignorance) is: how can I create an sqlite-type database that is stored in the cloud and that I can query against using JavaScript?
Where do I get started? Is there a service like Amazon AWS or Azure or something where I can create this database and then use some sort of jQuery/Javascript API to query data from it? I don't need a ton of storage space and my queries would be very basic SQL type stuff.
Any help would be greatly appreciated. Thanks in advance.
For more flexibility, less service lock-in, and cheaper scalability, I would suggest CouchDB (though you would likely still use a hosting service like Cloudant). CouchDB can host your website and provides an HTTP API for storing data, to which your client-side JavaScript can make REST calls.
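A hedged sketch of what those REST calls could look like from the browser (the database URL, credentials, and document shape are assumptions; a hosted CouchDB/Cloudant instance would also need CORS enabled for your site's origin):

```javascript
// Save a form submission as a document in a CouchDB database over its HTTP API.
const dbUrl = "https://example.cloudant.com/form_submissions";  // hypothetical database URL
const auth = "Basic " + btoa("apikey:apipassword");             // placeholder credentials

async function saveSubmission(formData) {
  const resp = await fetch(dbUrl, {
    method: "POST",                       // POST to the database creates a doc with a generated _id
    headers: {
      "Content-Type": "application/json",
      "Authorization": auth
    },
    body: JSON.stringify({
      type: "form_submission",
      submittedAt: new Date().toISOString(),
      fields: formData
    })
  });
  if (!resp.ok) throw new Error("Save failed: " + resp.status);
  return resp.json();                     // { ok: true, id: "...", rev: "..." }
}

// Example usage after a form submit:
// saveSubmission({ name: "Ada", message: "Hello" }).then(console.log);
```

Keep in mind that credentials embedded in client-side code are visible to anyone; in practice you'd use per-user API keys with limited rights or a small proxy in front of the database.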
StackMob has a free package that you can use. You can use the JS SDK to write your HTML5 app and save stuff to the StackMob DB. You can host your HTML5 on StackMob for free and point your own domain to it as well. There is also S3 integration.
Some references:
JS SDK
JS SDK Tutorial
Hosting your HTML5
Custom Domains
Create a Postgres database on Heroku for free.
https://devcenter.heroku.com/articles/heroku-postgres-plans#hobby-tier
As you mentioned your website is hosted on Amazon S3, I am sure it is a static website with lots of JavaScript-embedded HTML files. Because the site is static, I understand your requirement for a database that can be reached from it, and to be honest there are not a lot of options. Static websites are assumed to have no database dependency, so your choice is very limited, because what you are looking for is "a database accessible over HTTP which you can call from a scripting language within HTML".
If you have the ability to write back to S3 directly from your JavaScript code, you could build a JavaScript-based data store within your static site; it is complex, but it is an option to consider.
In my view you are better off running an extra-small instance in Windows Azure (or your cloud service of choice) and connecting it to a cloud-based database, which will be comparatively cheaper and a better fit for your requirement.
Unless Amazon comes up with a database accessible from static content such as S3, you really have no great choices here.
Since you are already familiar with some of AWS's offerings, you should check out:
Amazon RDS - Managed Relational Database Service for MySQL or Oracle
Amazon DynamoDB - Fast, Predictable, Highly-scalable NoSQL data store
But to do what you are asking (access data via JavaScript), check out www.stackmob.com. You can host an HTML5 application with data access via backbone (javascript based framework) on StackMob.
Create a Virtual Private Server on Vultr.com. It's not the easiest way, but it's the best way for you to learn about Database Security, and it will be significantly cheaper than the other solutions, should your server begin to require more storage.
vitrobridgedb is free for hobby applications and pretty straightforward to use.
SQLite isn't really a good choice for web-facing applications due to its scaling issues.
Both AWS and Azure support SQL databases. They also each support alternatives like MongoDB and Redis. For something as basic as you describe the only real difference is cost.

How to sync online & offline databases

I have a web application which provides some information to my customers. I have another (Windows) version that works exactly the same as the web application.
This is because the web connection may be lost for some hours, and during that time users still need to use the application.
I'm wondering how to sync these SQL Server databases.
Note that the web application is used from 3 different cities, and all of them have a Windows-based application too. What should I do?
Note: the Windows version is exactly the web application, installed on a local web server in the 3 different cities; users access it via their LAN.
All data updates flowing between the web and Windows databases would originate from the Windows application. But the problem is that the Windows app will run when there is no internet connection.
So you will have to use a Windows service which calls a web service for local and remote database updates. The Windows service can wake up every x minutes and update the remote and local databases.
The webservice will have two methods:
GetData(DateTime getRecordsFromThisDate) - the Windows service should call this at regular intervals and update the local database.
UploadData(dataRows/collection) - the Windows service should call this at regular intervals and update the remote database.
Each record in the database will have a timestamp. For the local update, get the largest timestamp and send it as the parameter to GetData(); the web service will return the records created after this time.
For uploading data, you will have to store the last time a successful upload operation was run. Get the records (inserted and updated) after this time and send them to UploadData().
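If it helps, here is a hedged sketch of that poll-and-sync loop as a Node.js script standing in for the Windows service; the endpoint URLs, the dbo.Orders table, its ModifiedAt column, and the mssql connection settings are all assumptions, not part of the answer above.

```javascript
// Hedged sketch of the GetData()/UploadData() sync loop described above.
// Assumes Node 18+ (global fetch) and the `mssql` package (npm install mssql).
const sql = require("mssql");

const localDb = {
  server: "localhost",
  database: "AppDb",
  user: "sync_user",
  password: "change-me",
  options: { trustServerCertificate: true }
};
const serviceUrl = "https://example.com/syncservice"; // hypothetical web service

let lastUpload = new Date(0); // last successful upload time (persist this in production)

async function pullFromRemote(pool) {
  // 1. Largest timestamp we already have locally.
  const { recordset } = await pool.request()
    .query("SELECT MAX(ModifiedAt) AS lastSeen FROM dbo.Orders");
  const lastSeen = recordset[0].lastSeen || new Date(0);

  // 2. GetData(): ask the web service for rows created/updated after that time.
  const resp = await fetch(`${serviceUrl}/GetData?since=${lastSeen.toISOString()}`);
  const rows = await resp.json();

  // 3. Upsert each row locally.
  for (const row of rows) {
    await pool.request()
      .input("id", sql.Int, row.id)
      .input("payload", sql.NVarChar, row.payload)
      .input("modifiedAt", sql.DateTime2, new Date(row.modifiedAt))
      .query(`MERGE dbo.Orders AS t
              USING (SELECT @id AS Id) AS s ON t.Id = s.Id
              WHEN MATCHED THEN UPDATE SET Payload = @payload, ModifiedAt = @modifiedAt
              WHEN NOT MATCHED THEN INSERT (Id, Payload, ModifiedAt)
                   VALUES (@id, @payload, @modifiedAt);`);
  }
}

async function pushToRemote(pool) {
  // UploadData(): send locally changed rows since the last successful upload.
  const { recordset } = await pool.request()
    .input("since", sql.DateTime2, lastUpload)
    .query("SELECT Id AS id, Payload AS payload, ModifiedAt AS modifiedAt FROM dbo.Orders WHERE ModifiedAt > @since");
  const resp = await fetch(`${serviceUrl}/UploadData`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(recordset)
  });
  if (resp.ok) lastUpload = new Date();
}

async function syncOnce() {
  const pool = await sql.connect(localDb);
  try {
    await pullFromRemote(pool);
    await pushToRemote(pool);
  } finally {
    await pool.close();
  }
}

// Wake up every 5 minutes; swallow errors so an offline period just skips a cycle.
setInterval(() => syncOnce().catch(err => console.error("sync skipped:", err.message)), 5 * 60 * 1000);
```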
One choice would be to use a database backup to synchronize (probably pretty slow and impractical). Beyond that you would use ETL; pick your favorite tool. You could use either SQL Server CDC or, my recommendation, Change Tracking to identify your changes and load just those, then use MERGE to synchronize the changes. Granted, these solutions will require you to set up linked servers or use a third party to temporarily hold the DML changes.
http://technet.microsoft.com/en-us/library/bb510625.aspx
http://msdn.microsoft.com/en-us/library/cc305322.aspx
I thought I would add one non-Microsoft solution: http://www.red-gate.com/products/sql-development/sql-data-compare/ It's not free, but it does exactly what you need.