Processing MDX cube in Pentaho

I'm using Pentaho BI Server (biserver-ce-5.0.1-stable).
Once I create the data source for reporting and analysis (an OLAP cube), it works fine with the data available at that time, but I need to know how to reprocess it on a schedule: the cube data needs to be refreshed after midnight.
Please share your ideas.

Download Pentaho Data Integration (PDI).
Write down, step by step, how you created the data source.
Then build a Job (or a Transformation, if it's simple) that reloads that data, and put it on a scheduler.
The PDI documentation covers Jobs, Transformations, and scheduling in detail.
Good luck!
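
As a concrete sketch: assuming the refresh logic is saved as a Job file (the name refresh_cube.kjb and all paths below are placeholders), a cron entry can run it with PDI's Kitchen tool shortly after midnight, and a second entry can ask the BI server to flush the Mondrian schema cache so analysis views see the new data. The mondrianSchemaCache REST endpoint and the admin credentials are assumptions based on the Pentaho 5.x API; verify them against your install.

    # crontab -e: run the refresh Job at 00:05 every night
    5 0 * * * /opt/pentaho/data-integration/kitchen.sh -file=/opt/etl/refresh_cube.kjb -level=Basic >> /var/log/refresh_cube.log 2>&1
    # Then clear the Mondrian schema cache so the cube picks up the fresh data
    # (endpoint and credentials assumed; adjust host, port, user, password)
    30 0 * * * curl -s -u admin:password "http://localhost:8080/pentaho/api/system/refresh/mondrianSchemaCache"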

Use Firebird database as data source in Power BI via ODBC

I hope someone can provide some insight or experience in this area.
I am currently building some reports in Power BI; the data source is connected via ODBC to a Firebird database (a testing environment at the moment). I copied the script into a SQL statement and it runs fine.
However, something goes wrong every time I refresh the data: it stalls at the 'loading data to model' stage, and I have to end Power BI in Task Manager and restart it.
I don't know why this happens.
Thanks in advance.

Can a dimension be added to an SSAS cube dynamically?

We built an SSAS ROLAP cube whose data source is memSQL. The cube is built using Visual Studio 2019, and the driver used to connect to the memSQL data source is "MySQL .NET Provider 8.0.19". The cube builds and processes successfully. Because it is a ROLAP cube, one requirement we have is to add a new dimension or measure dynamically, without developer intervention. I am now looking for expert advice on how to add a dimension or measure dynamically (perhaps through an Autosys job scheduled to run every hour and check for new dimensions or measures).
Is it possible to do this through back-end C# code that updates the XMLA whenever a new dimension or measure needs to be added?
I found a good example at Analysis Services Cube Programmatically/Automatically Generated via C# and AMO.
It helped me build code that generates the cube dynamically with memSQL as the data source.

Pentaho CE report executes KTR transformation twice

I'm using Pentaho CE for general reporting.
We run reports using the built-in Report Viewer.
We've noticed that each report created with Report Designer that uses a KTR as its data source executes the transformation twice (as seen in the logs).
For large, complicated KTRs with heavy SQL queries, this is excessive.
Does anyone know the purpose of this? Can it be changed? Or can someone shed some light on the subject?
Thanks, help much appreciated.

Fetch Data From Remote Database Every Hour

Yesterday I downloaded:
Pentaho BI Server
Data Integration
Report Designer
Then I connected Report Designer to the remote database, fetched a table, and successfully drew a chart of that data.
My question is: I want to run that file (which I created in Report Designer) every hour, fetching the new data from the remote database.
Can you please guide me step by step? I am new to all of this.
I will answer my own question. To schedule a job in Data Integration, follow these steps:
Step 1: Create a Transformation
Create the Transformation you want to run; in my case it fetches the data from the database and then writes it to a file.
Step 2: Create a Job
Create a Job that runs the above Transformation at your desired interval, as in the sketch below.
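
To actually run the Job on an hourly interval outside of Spoon, one common option is the Kitchen command-line tool plus the operating system's scheduler. A minimal sketch, assuming the files are saved as fetch_data.ktr and hourly_fetch.kjb under /opt/etl (placeholder paths, as is the PDI install directory):

    # Test the Transformation once by hand with Pan
    /opt/pentaho/data-integration/pan.sh -file=/opt/etl/fetch_data.ktr -level=Basic
    # crontab -e: run the Job at the top of every hour with Kitchen
    0 * * * * /opt/pentaho/data-integration/kitchen.sh -file=/opt/etl/hourly_fetch.kjb >> /var/log/hourly_fetch.log 2>&1

Alternatively, the START entry inside a Job has its own "Repeat" interval setting, but that only works while the Job stays running, so an external scheduler is usually more robust.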
In Pentaho you would create a Transformation or a Job in Data Integration to fetch data from the remote database, and then use the report created with Report Designer to visualize the data that you have downloaded with Data Integration.

How to use Pentaho Data Integration for reporting

In my project we are using Pentaho Data Integration and a Saiku server for reporting.
I am new to business intelligence, and I am confused about which function is performed by which piece of software.
The senior coders here won't tell me, so that's why I am asking.
I am confused about what these tools do:
Pentaho
Pan
Kitchen
Spoon
Saiku
There is a script that generates these four files:
cube.json sims.json schema.json path.json
Now I don't know which software uses those JSON files: Pentaho, Saiku, Spoon, or something else.
Can anyone give me some idea?
Pentaho Data Integration is one of the open-source tools in the Pentaho suite.
Spoon is used to create transformations through a GUI.
If you simply want to run transformations and jobs, use Pan or Kitchen; they are mainly used for running these things from the command line (Pan runs transformations, Kitchen runs jobs), as sketched below.
Saiku is a server; Pentaho itself also has a server (Pentaho BI Server), to which you can add the Saiku plugin to display cubes designed in Pentaho Schema Workbench.
For more understanding, google the terms I mention in this answer.
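
To make the division of labor concrete, here is roughly how Pan and Kitchen are invoked from the command line (the file names and paths are placeholders; the .ktr and .kjb files themselves are what you design in Spoon):

    # Pan runs a single transformation (.ktr)
    ./pan.sh -file=/opt/etl/load_sales.ktr -level=Basic
    # Kitchen runs a job (.kjb), which can chain several transformations
    ./kitchen.sh -file=/opt/etl/nightly_build.kjb -level=Basic

The cube.json, sims.json, schema.json, and path.json files are produced by your project's own scripts, not by Pan, Kitchen, or Spoon; check the scripts to see which service they feed (possibly the Saiku side).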