Using kettle as Saiku data source - pentaho

I want to use Saiku in the Pentaho User Console, but I want to create all the dimensions and measures in the ETL (PDI, a.k.a. Kettle) and show them in Saiku. I am able to use a Kettle step as the data source of a chart from the CDE C-Tools. Is it possible to use the same approach with Saiku, or is there a way to fill a Mondrian schema with PDI?
Thanks

Related

Change BigQuery Datasource in Google Data Studio

I need to change the data source for the BigQuery connector in a Data Studio dashboard as I am moving from the QA to the production environment; the schema is the same for both data sources.
Is there any way to achieve this so that I don't have to create the custom metrics again for the new data source?
If the schema is exactly the same, I believe you can just edit the data source, choose Reconnect, and then select the new data source.
Be warned though, this will change that data source for all reports that use it.

How to restrict cube access on pentaho bi server 6

I have been desperately trying to find out how to restrict Mondrian cube access on Pentaho BI Server 6, knowing that my schema file was developed manually (without using the Workbench tool). I have been looking for answers everywhere.
To the best of my knowledge, you have to use the Schema Workbench tool or edit the Mondrian .xml file manually to restrict cube access to certain pre-existing work groups.
I used this resource from Pentaho to figure out how to create roles to restrict cube access: Pentaho Mondrian Access Control
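Since the schema file is maintained by hand, the roles can be written directly into the Mondrian XML, after the cube definitions. A minimal sketch of the access-control elements (the cube, hierarchy, and member names here are hypothetical):

```xml
<!-- Placed at the end of the Mondrian schema, after the <Cube> elements -->
<Role name="ReportViewer">
  <SchemaGrant access="none">
    <!-- Grant this role access to one cube only -->
    <CubeGrant cube="Sales" access="all">
      <!-- Optionally restrict to specific members of a hierarchy -->
      <HierarchyGrant hierarchy="[Product]" access="custom" rollupPolicy="partial">
        <MemberGrant member="[Product].[Electronics]" access="all"/>
      </HierarchyGrant>
    </CubeGrant>
  </SchemaGrant>
</Role>
```

The role name then has to be mapped to the BI server's users/groups so that logged-in users pick up the right Mondrian role.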

Pentaho BI data source for custom OLAP (XML/A) provider

I need to be able to create simple ad-hoc reports on Pentaho BI using plugins like Pivot4j or Saiku. My data provider is SAP with an XML/A interface. So the question is: how can I create an OLAP data source based on the XMLA protocol? Or is that possible only through Mondrian?
Any help will be greatly appreciated.

How can I design a star schema in Pentaho Data Integration

I have just started learning PDI Community Edition.
My main aim is to create a star schema from normalized tables.
I was reading this tutorial:
http://diethardsteiner.blogspot.com.au/2011/11/star-schema-modeling-with-pentaho-data.html
But in my Pentaho CE I don't see any star schema option.
My main aim is to get data from normalized tables and then insert it into flattened tables with Spoon (Kettle).
How can I proceed?
The star schema plugin is not available anymore, as Pentaho has discontinued it.
You will have to develop your transformations manually.
Try to create the transformation using the steps shown in the tutorial you shared.
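Building the star schema manually usually comes down to creating the dimension and fact tables yourself and loading them with ordinary PDI steps (e.g. Table input, Dimension lookup/update, Table output). A minimal sketch of the target DDL, with hypothetical table and column names:

```sql
-- Dimension table: denormalized (flattened) from the normalized customer tables
CREATE TABLE dim_customer (
  customer_key  INT AUTO_INCREMENT PRIMARY KEY,  -- surrogate key
  customer_id   INT,                             -- natural key from the source system
  customer_name VARCHAR(100),
  city          VARCHAR(50),
  country       VARCHAR(50)
);

-- Fact table: references the dimension via its surrogate key
CREATE TABLE fact_sales (
  customer_key INT,
  date_key     INT,
  amount       DECIMAL(12,2),
  quantity     INT
);
```

In the transformation, the Dimension lookup/update step can populate dim_customer and hand back the surrogate key, which the fact-loading stream then writes into fact_sales.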

The Pentaho BI Platform Workflow Issue

I have been working with Pentaho for the last few days. I have been able to set up Pentaho Report Designer to generate a sample report by following their documentation. Then I followed this article http://www.robertomarchetto.com/www/how_to_use_pentaho_report_designer_tutorial and managed to export the report to the Pentaho BI server.
What I don't understand is the Pentaho workflow. What process should I follow? In other words, what is the purpose of exporting the report to the Pentaho BI server? Why is there a Data Integration tool? Why is there a BI server when I can export the report from the Designer tool?
Requirement
All I want to do is retrieve the data from the MySQL DB, put it into a data mart, and then generate a report from the data mart. (According to what I have read, creating a data mart is the efficient way.)
How can I get it done?
Pentaho Data Integration can be used to automate this report generation.
In Report Designer you pass a parameter or set of parameters to generate a single report output.
With Data Integration you can generate the reports for different sets of parameters. For example, if reports are generated on a daily basis, you can automate them for the whole month, so that there is no need to generate the reports daily and manually.
And using the Pentaho Business Intelligence server, all these operations can be scheduled.
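As a sketch of that batch idea: the loop over date parameters can be generated outside PDI and handed to Kettle's command-line runner pan.sh, one invocation per day. The transformation file name (daily_report.ktr) and the REPORT_DATE parameter below are illustrative assumptions, not names from the question:

```python
from datetime import date, timedelta

def monthly_report_commands(year, month, ktr_path="daily_report.ktr"):
    """Build one pan.sh command line per day of the given month,
    passing the date as a named parameter to the transformation.
    The file name and parameter name are hypothetical."""
    start = date(year, month, 1)
    # First day of the following month, handling the December rollover
    following = date(year + (month == 12), month % 12 + 1, 1)
    commands = []
    day = start
    while day < following:
        commands.append(
            f"pan.sh -file={ktr_path} -param:REPORT_DATE={day.isoformat()}"
        )
        day += timedelta(days=1)
    return commands

for cmd in monthly_report_commands(2024, 1)[:3]:
    print(cmd)
```

Each generated line could then be run by a shell script or a scheduler; inside the transformation, the REPORT_DATE parameter would drive the query's date filter.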
To generate data/tables (fact tables, dimension tables) in the MySQL DB from different sources such as files or other databases - Data Integration tool
To create a schema on top of the fact tables - Mondrian
To handle users/roles on top of the created cubes - Metadata Editor
To create simple reports on top of small tables - Report Designer
For sequential execution (in one go) of DI jobs/transformations, reports, and JavaScript - Design Studio
thanks to user surya.thanuri # forums.pentaho.com
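The "schema on top of the fact tables" step above boils down to a Mondrian XML file that maps relational tables to a cube. A minimal sketch, with hypothetical table and column names:

```xml
<Schema name="SalesMart">
  <Cube name="Sales">
    <!-- The fact table produced by the Data Integration step -->
    <Table name="fact_sales"/>
    <Dimension name="Product" foreignKey="product_id">
      <Hierarchy hasAll="true" primaryKey="product_id">
        <Table name="dim_product"/>
        <Level name="Category" column="category"/>
        <Level name="Product" column="product_name"/>
      </Hierarchy>
    </Dimension>
    <Measure name="Sales Amount" column="amount" aggregator="sum"
             formatString="#,###.00"/>
  </Cube>
</Schema>
```

Once published to the BI server, this schema is what analysis front ends (JPivot, Saiku, etc.) query via MDX.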
The Data Integration tool is mostly for ETL; it's a separate tool, and you can ignore it unless you are doing complex analysis of data from multiple dissimilar data sources. You don't need to 'export' reports to the Pentaho server: you can write them directly to a directory and then refresh the repository from inside the Pentaho web application. Exporting them is just one workflow technique.
You're going to find that there are about a dozen ways to do any one thing with Pentaho. For instance, I use CDA data sources with my reports instead of placing the SQL code inside the report. Alternatively, you can link up to a Data Integration server to execute Data Integration scripts and view a result set.
Just to answer your data mart question: in general, a data mart should probably be supported by either the Data Integration tool (depending on your situation, I don't exactly recommend this) or database functions/replication streams (recommended).
Just to hazard a guess, it sounds like someone tossed you a project saying: We need a BI system, here's the database where the data is stored, here are the reports we're already getting. X looked at Pentaho and liked it. You should use that.
First thing you need to do is understand the shape of the data: volume, tables, interrelations. Figure out what the real questions they want to answer are. Determine whether they need real-time reporting, etc. Just getting the data mart together itself, if you even need one, can take quite a while. I think you may have jumped the gun on Pentaho itself.
thanks to user flamierd # forums.pentaho.com