I'm working on a MicroStrategy project. I need to retrieve the metadata and underlying data used by existing reports and dossiers (which could have been created outside the organization and imported into my Intelligence Server, meaning I don't have direct access to the data source) through REST APIs.
I found the REST API families on this page https://lw.microstrategy.com/msdz/MSDL/GARelease_Current/docs/projects/RESTSDK/Content/topics/REST_API/REST_API_API_families.htm, and tried the Libraries, Reports, and Dossiers APIs, but the results are not what I expected. Could someone suggest which API I should try?
I also noticed that GET api/search/results could be a solution, but I'm having difficulty finding the correct ints for the types, subtypes, and extTypes from the TypeTable (https://lw.microstrategy.com/msdz/msdl/GARelease_Current/docs/ReferenceFiles/reference/com/microstrategy/webapi/EnumDSSXMLObjectTypes.html#DssXmlTypeTable). Any suggestion for the type ints I should try would be appreciated!
Thank you!
I think you will have a lot more luck querying both the metadata and the data within the objects using the newer Library REST API. Here is a link to the documentation, which is much clearer and gives you the opportunity to test out some of the calls against MicroStrategy's demo server:
MicroStrategy REST
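To give you an idea of the shape of it, here is a rough Python sketch of pulling a report's metadata and data through that API. The endpoint paths and the loginMode value are taken from the standard REST API docs, but please verify them against your Intelligence Server version; the base URL, project ID, and report ID are placeholders, and the object-type ints in the comment are from memory, so double-check them against the TypeTable you linked.

```python
import requests

BASE = "https://your-env.example.com/MicroStrategyLibrary/api"  # placeholder base URL

session = requests.Session()

# Log in (loginMode 1 = standard authentication) and keep the auth token.
resp = session.post(f"{BASE}/auth/login", json={
    "username": "user",        # placeholder credentials
    "password": "password",
    "loginMode": 1,
})
resp.raise_for_status()
session.headers["X-MSTR-AuthToken"] = resp.headers["X-MSTR-AuthToken"]
session.headers["X-MSTR-ProjectID"] = "YOUR_PROJECT_ID"  # placeholder

# Search for reports by name. If I remember the TypeTable right, object
# type 3 is a report definition and 55 is a document/dossier.
found = session.get(f"{BASE}/searches/results",
                    params={"name": "My Report", "type": 3}).json()

# Create a report instance; the response carries definition + data together.
report_id = "YOUR_REPORT_ID"  # placeholder
inst = session.post(f"{BASE}/v2/reports/{report_id}/instances",
                    params={"limit": 1000}).json()
print(inst["definition"])   # metadata: attributes and metrics
print(inst["data"])         # underlying row data
```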
Sorry, I am not sure this question belongs here. We are building a solution that requires a "mapping engine", so to speak. We need to map our own DB fields against multiple external APIs and vice versa. For us, this is a very time-consuming and repetitive process, and I have been unable to find any solution out there other than Make, Zapier, etc., which only work with pre-existing integrations. What we are dealing with is a bunch of legacy custom APIs that are never part of any of these solutions.
I was hoping there's a solution out there that would let me paste the JSON and XML bodies of successful requests and responses from the legacy APIs we're working with, then allow us to match our internal fields against those fields in a table, save them as API templates we define ourselves, and then run it as automation middleware that executes those templates.
Does this exist?
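To make concrete what I mean by a "template", here is a toy Python sketch of the kind of mapping table I have in mind. Everything here is invented for illustration (the field names, the template format, the sample payload); the point is the declarative table of source paths to internal fields:

```python
# Hypothetical mapping-template engine: each template is a table mapping
# (path in the legacy API payload) -> (internal DB field), plus constants.

def get_path(obj, path):
    """Walk a dotted path like 'customer.address.city' through nested dicts."""
    for key in path.split("."):
        obj = obj[key]
    return obj

def apply_template(template, payload):
    """Produce a dict of internal fields from a legacy API response."""
    row = dict(template.get("constants", {}))
    for src_path, internal_field in template["mappings"].items():
        row[internal_field] = get_path(payload, src_path)
    return row

# One "API template", as it might be stored after pasting a sample response.
legacy_crm_template = {
    "mappings": {
        "customer.id": "external_customer_id",
        "customer.address.city": "city",
    },
    "constants": {"source_system": "legacy_crm"},
}

sample_response = {"customer": {"id": 42, "address": {"city": "Oslo"}}}
print(apply_template(legacy_crm_template, sample_response))
# {'source_system': 'legacy_crm', 'external_customer_id': 42, 'city': 'Oslo'}
```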
I'm trying to automate a workflow using Google Data Studio. The requirements are simple - I need to be able to programmatically copy a templated report (from a Python/Java application) and import/set a data source (a Google Sheets doc) for that report. Nothing fancier (no visualisation creation, formatting, or anything graphical, etc.).
Sources here, here and here (the last two require a Google Cloud Console account) suggest an API does exist (and detail a setup process for accessing it). However, after going through this setup process, I can find no details or documentation of any functionality, and consequently I have been unable to progress.
Can anyone authoritatively state whether:
1. Does there exist any API functionality for GDS? and
2. If not, are there plans to develop such? (Since the Google links above suggest there is, I'm wondering if this means it's in the pipeline for the near future.)
The only directly related SO posts I can find are here and here. The first suggests there isn't, but doesn't account for the Google links I've provided above, which suggest there is; the second doesn't really cover the same use case, so it doesn't provide answers applicable here.
FYI - I've posted a Google Community forum post here asking essentially the same question.
If anyone is able to help out, that would be greatly appreciated :) Many thanks in advance for your time and help! :)
Fresh as of 2022-05-23
Does there exist any API functionality for GDS?
Not in the way you are expecting. The three links you posted all refer to the current Data Studio API. The only things you can do with that API are view your Data Studio assets and update permissions. That's it. This API won't let you create/copy/modify reports or data sources.
If not, are there plans to develop such?
Not in the near future. You can make/vote for this feature request in the official tracker. More popular feature requests are usually prioritized in roadmaps.
That being said, a lot of the API use cases can be covered using combinations of Community Connectors, config parameters, direct linking, viewer's credentials, the Linking/Integration API, etc.
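For the specific copy-a-template-and-point-it-at-a-Sheet workflow in the question, direct linking is usually the closest fit. The Linking API is just a URL you construct, so it is easy to generate from Python; here's a sketch assuming the documented c.reportId / r.reportName / ds.* parameters (the IDs are placeholders, and the ds0 alias must match the data source alias in your template report):

```python
from urllib.parse import urlencode

# Build a Data Studio Linking API URL that copies a template report and
# points its data source at a given Google Sheet. Parameter names follow
# the Linking API docs; all IDs below are placeholders.
params = {
    "c.reportId": "TEMPLATE_REPORT_ID",        # report to copy
    "r.reportName": "Client report",           # name for the copy
    "ds.ds0.connector": "googleSheets",        # 'ds0' = alias in the template
    "ds.ds0.spreadsheetId": "SPREADSHEET_ID",  # the Sheet to attach
    "ds.ds0.worksheetId": "WORKSHEET_ID",
}
url = "https://datastudio.google.com/reporting/create?" + urlencode(params)
print(url)
```

The catch is that this URL has to be opened by the end user in a browser (the copy is created under their account); there is still no server-to-server call that performs the copy.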
We have a medium-sized app (100+ SQL tables), and we often need to integrate it with partner APIs (with our system as the client/consumer). The process of designing such an integration is non-trivial:
We often need to map columns in our database to fields in requests to the partner API.
Some fields in requests to the partner API must be constant or conditional.
On rare occasions, the output from one API response becomes an input to another API request.
There are many resources on the web for documenting REST APIs - there are specific formats for that (Swagger, RAML, etc.). These formats allow efficient generation of client code and human-readable documentation. However, these formats are not very helpful for describing how your app integrates with an API. We create lengthy Microsoft Word documents which contain more or less a copy of the partner API methods, with comments on how every individual field should be used. Such a solution seems sub-optimal.
Googling for better options did not yield many results; SwaggerHub seems to have a "comments" feature which seems to target the problem above, and that's pretty much all.
Question: are there some tools, formats, workflows, ideas, etc. which facilitate designing and documenting API integrations described above?
I don't know which language you use, but I work with apiDoc:
https://apidocjs.com/
It is great for generating REST API documentation from comments in Node.js, and it can be used with many languages.
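For the integration-mapping part of your question, the doc comments themselves can carry the mapping notes. A hypothetical Python example (apiDoc can also parse Python docstrings); the endpoint, fields, and column names are invented, and the "maps to column X" remarks are our own convention, not an apiDoc feature:

```python
def get_vacancy(vacancy_id):
    """
    @api {get} /partner/vacancy/:id Fetch a vacancy from the partner API
    @apiName GetVacancy
    @apiGroup PartnerIntegration

    @apiParam {Number} id Partner-side vacancy id
                          (maps to our vacancies.external_id column).

    @apiSuccess {String} title   Maps to vacancies.title.
    @apiSuccess {String} region  Constant "EU" on our side; partner value ignored.
    """
    ...
```

Running the apidoc CLI over the source then gives you browsable HTML where the field-by-field mapping notes live next to the request/response definitions, instead of in a separate Word document.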
I've been assigned a project where we already have an up-and-running website, and one of our clients wants to be able to track statistics from the website.
We want to make this available to all our clients as soon as we finish development. Note that each 'client' has their own 'subdomain', so to speak, e.g. www.website.com/client1, www.website.com/client2, etc. We want to track usage separately for each of these clients.
We will need to create statistics based on the usage of our own platform, pull in data registered by Google Analytics, and also pull in data from a 3rd party, which they will offer via an API of their own (they have a 3rd-party solution that uses the data accessible via our API).
All this data needs to be shown on a webpage with graphs and tables.
I wanted to make sure we choose the right architecture from the start, in order to avoid scalability issues later on.
I've started reading about private and public APIs lately.
For now, we do not have another (internal) application that would use our own statistics; it would just be the website using them. But to be able to scale up later if needed, when another application might want to use the statistics, I think a private API would benefit us greatly.
In order to allow 3rd parties to use the statistical data we choose to expose, I was thinking of creating a public API.
Is a private & public API the correct way to go about this?
One of the questions I am stuck on is what the architecture for these APIs looks like. Right now we already have a public API for vacancy data. This 'API' is basically just a PHP class (controller) inside our CodeIgniter solution. It gets called via its URL and returns a JSON object with the results (e.g. www.website.com/api/vacancy/xxx).
In order to create a (proper) private & public API solution/architecture, should the API be separated from the website (CodeIgniter)? What are the common go-to solutions for this?
Or is it fine to keep it in our current platform the way it is now (with people calling the stats API via www.website.com/api/stats/xxx, for example)?
It's almost always right to go with a microservices-like architecture, so your initial thoughts sound reasonable. Doing so will give you the possibility to scale and deploy your API independently, and will also help you avoid performance side effects on your site (and vice versa). Pay attention to how you access your main site's data from within the new API if you don't want to end up with a monolithic application.
Regarding the API, I would suggest you implement a protocol like OAuth2 in order to achieve the flexibility you (might) need. You can also use Swagger to document and test your API.
All of this might help you a lot, but first you have to ask yourself whether you really need to go that deep, or whether you just need a simple solution.
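To make the OAuth2 suggestion concrete: for server-to-server consumers of a public stats API, the client-credentials grant is the usual fit. A minimal sketch of the consumer side in Python, assuming a hypothetical token endpoint on your domain (all URLs and credentials are placeholders; your stack is PHP, so read this as an illustration of the flow, not an implementation):

```python
import requests

# Client-credentials flow: the 3rd party exchanges its client id/secret
# for a short-lived bearer token, then calls the public API with it.
# All URLs and credentials below are placeholders.
token_resp = requests.post(
    "https://www.website.com/oauth/token",
    data={"grant_type": "client_credentials"},
    auth=("CLIENT_ID", "CLIENT_SECRET"),
)
token_resp.raise_for_status()
access_token = token_resp.json()["access_token"]

stats = requests.get(
    "https://www.website.com/api/stats/client1",
    headers={"Authorization": f"Bearer {access_token}"},
)
print(stats.json())
```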
I think multitenancy is the best choice. Generally speaking, multitenancy means every customer has their own database: the data is separate, while the codebase is shared and already exists. As I understand it, the project is already in progress, so you would not need to redesign or rewrite anything.
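For illustration, here is a minimal sketch of the tenant lookup this implies, given your www.website.com/client1, /client2 paths. This is hypothetical Python (your app is CodeIgniter/PHP, where the same idea lives in the routing/config layer), and all connection details are invented:

```python
# Database-per-tenant lookup keyed on the /clientN path segment.
# All names here are hypothetical.
TENANT_DATABASES = {
    "client1": {"host": "db1.internal", "name": "stats_client1"},
    "client2": {"host": "db2.internal", "name": "stats_client2"},
}

def database_for_request(path):
    """Map a request path like '/client1/dashboard' to tenant DB settings."""
    tenant = path.strip("/").split("/", 1)[0]
    try:
        return TENANT_DATABASES[tenant]
    except KeyError:
        raise LookupError(f"unknown tenant: {tenant!r}")

print(database_for_request("/client1/dashboard"))
# {'host': 'db1.internal', 'name': 'stats_client1'}
```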
Has anyone ever tried to integrate the QlikView BI application with the simPRO business management software? If so, I would be interested to know whether you were successful, and if so, how!
Dorey.
Jonathan from simPRO here.
I'm not aware of any clients that have an integration with QlikView, but it is a well-respected reporting platform with a myriad of ways to extract data from other systems.
A quick scout around their documentation tells me that the easiest option for integration would be using their QVSource API connector.
Details are here: http://www.qvsource.com/Connectors-For-QlikView/General-JSON-XML-SOAP-Web-API-Connector-For-QlikView
For simPRO's API you would use this with the "POST" methodology - i.e., supplying the POST data as per our API docs at http://api.simpro.co . Our API uses OAuth authentication, and there is some detail, along with examples, on using that with QVSource at: http://wiki.qvsource.com/General-Web-Connector-For-QlikView.ashx
Whilst we may not be able to assist technically with setting up the QlikView system, our technical support department can assist with any queries you have regarding our API calls, etc.
Hope that is of some help!
I am going to try to respond to your dilemma. While I have been a QlikView expert for the last 8 years, I am not too familiar with simPRO; however, I did quickly browse through their API docs here -> http://api.simpro.co/
I noticed that they support 3 formats, namely SOAP, JSON and XML. For QlikView you would choose XML. I also noticed that they support 2 authentication methods, namely basic auth and OAuth. In this scenario you would use basic auth (so you can pass a username and password via the URL), as this will also work perfectly with QlikView.
In your Edit Script, you will notice "Data from Files" -> "Web Files"; in the popup you can then enter the URL from which you would retrieve your information.
Note that you need to pass format=xml in the URL, along with your basic auth username and password, to the relevant endpoint (which you will find in simPRO's docs); for any other format you will need QVSource, which has a fee attached.
I hope this has pointed you in the right direction.
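One last practical tip: before pasting the URL into QlikView's Web Files wizard, it can save time to confirm the format=xml and basic-auth parts outside QlikView. A quick sketch in Python; the build URL and resource path are placeholders - the real paths are in simPRO's API docs:

```python
import requests

# Sanity-check the simPRO URL and credentials before wiring the URL into
# QlikView's "Web Files" wizard. The host and endpoint path below are
# placeholders; see http://api.simpro.co for the real resource paths.
resp = requests.get(
    "https://yourbuild.example.com/api/placeholder_resource",
    params={"format": "xml"},
    auth=("API_USERNAME", "API_PASSWORD"),  # basic auth, as described above
)
resp.raise_for_status()
print(resp.text[:500])  # should be XML if the request is set up correctly
```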