Define a new OLAP datasource with Pentaho 5.0 CE

With Pentaho v4.x we could define a new OLAP datasource in the file pentaho-solutions/system/olap/datasources.xml, but this file is missing in Pentaho 5.0 CE.
How can I define a new OLAP datasource with Pentaho 5.0 CE?
The Pentaho documentation is not very helpful; it mentions a migration tool, but I can't find a link to download it.
Some users on the Pentaho forum have similar questions, but no answers are given:
http://forums.pentaho.com/showthread.php?153855-In-version-5-0-1-pentaho-solutions-system-olap-datasources-xml-missing
http://forums.pentaho.com/showthread.php?156929-DSP-In-pentaho-5-0-2
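For reference, the kind of entry we have in the 4.x datasources.xml looks roughly like this (the catalog name, JNDI datasource and schema path are placeholders on my side):

<DataSources>
  <DataSource>
    <DataSourceName>Provider=Mondrian;DataSource=Pentaho</DataSourceName>
    <DataSourceDescription>Pentaho BI Platform Datasources</DataSourceDescription>
    <URL>http://localhost:8080/pentaho/Xmla</URL>
    <DataSourceInfo>Provider=Mondrian</DataSourceInfo>
    <ProviderName>PentahoXMLA</ProviderName>
    <ProviderType>MDP</ProviderType>
    <AuthenticationMode>Unauthenticated</AuthenticationMode>
    <Catalogs>
      <!-- One Catalog entry per cube schema; names and paths below are placeholders -->
      <Catalog name="MyCatalog">
        <DataSourceInfo>Provider=mondrian;DataSource=MyJndiDataSource</DataSourceInfo>
        <Definition>solution:my-solution/analysis/MySchema.mondrian.xml</Definition>
      </Catalog>
    </Catalogs>
  </DataSource>
</DataSources>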
Thanks.

You can define OLAP datasources in 5.0 using the Manage Data Sources functionality.
If you create a new Analysis data source, you'll be prompted for a schema file (the Mondrian cube definition) that you can upload.
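For example, a minimal schema file of the kind you would upload looks like this (the table and column names below are just placeholders; yours come from your own star schema):

<Schema name="SalesSchema">
  <Cube name="Sales">
    <!-- Fact table of the cube; placeholder name -->
    <Table name="fact_sales"/>
    <Dimension name="Product" foreignKey="product_id">
      <Hierarchy hasAll="true" primaryKey="product_id">
        <Table name="dim_product"/>
        <Level name="Product Name" column="product_name" uniqueMembers="true"/>
      </Hierarchy>
    </Dimension>
    <Measure name="Units Sold" column="units_sold" aggregator="sum" formatString="#,###"/>
  </Cube>
</Schema>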

You can also upload a schema with the help of
Manage Data Sources --> settings button (next to "New Data Source") --> Import Analysis --> select your file.
You can pass parameters manually if you are using a DynamicSchemaProcessor.
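For example, in the old datasources.xml a DynamicSchemaProcessor was just another key in the catalog's DataSourceInfo; in 5.0 you enter the same keys as name/value parameters in the Import Analysis dialog. The class name below is a hypothetical example:

<!-- Illustrative 4.x-style catalog entry; in 5.0 add DynamicSchemaProcessor=... as a parameter instead -->
<Catalog name="MyCatalog">
  <DataSourceInfo>Provider=mondrian;DataSource=MyJndiDataSource;DynamicSchemaProcessor=com.example.MySchemaProcessor;UseContentChecksum=true</DataSourceInfo>
  <Definition>solution:my-solution/analysis/MySchema.mondrian.xml</Definition>
</Catalog>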

Related

Schema published but not seen in BI Server with jpivot

I created a schema in Schema Workbench and published it with no errors, but when I go into the BI Server as the standard admin user and choose New -> JPivot, it displays the name of the schema I created but it does not display the cube. For reference, the error I get in catalina.out is:
17:11:45,174 ERROR [PentahoDataSourceResolver] PentahoXmlaServlet.ERROR_0002 - IDatasourceService.UNABLE_TO_INSTANTIATE_OBJECT
org.pentaho.platform.api.data.DBDatasourceServiceException: javax.naming.NameNotFoundException: Name [Esquema Salario] is not bound in this Context. Unable to find [Esquema Salario].
Name [Esquema Salario] is not bound in this Context errors usually appear if you use a JNDI name which is not defined on your system. So I assume this is the name of the datasource which you reference while publishing Mondrian schema files to the BI server.
The XML file with the Mondrian schema definition generated by Schema Workbench does not contain any information about how to connect to the database, so you need to specify these details when you upload your schema file to the BI server (this is done in step 4 below).
But first you have to create the connection itself (steps 1-2):
1. Create a new JDBC data source.
2. Define the connection parameters.
If cubes still don't appear after these steps, republish your cube:
3. Follow the same steps as in step 1, but select "Analysis" instead of "JDBC" at the end.
4. Upload the XML file generated by Schema Workbench and select the data source which you created in step 2.
If the cube still does not appear, check your log again. If you see the same "Name is not bound" error, try restarting your BI server application (new connections usually get recognized immediately, but if you had a connection with the same name before, you might need to restart Tomcat).
If that does not work, then once again check the log files; I guess you'll see some different error in that case.
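If you do want to go the JNDI route instead (which is what the error message points at), the name has to be bound in the application server. With the Tomcat that ships with the BI server, that is typically done with a Resource entry like the sketch below in the pentaho webapp's META-INF/context.xml (driver, URL, credentials and the name itself are placeholders; the name must match exactly what your catalog references):

<!-- Placeholder values; adjust driver, URL, credentials and name to your own database -->
<Resource name="jdbc/EsquemaSalario" auth="Container" type="javax.sql.DataSource"
          driverClassName="org.postgresql.Driver"
          url="jdbc:postgresql://localhost:5432/salarios"
          username="pentaho" password="password"
          maxActive="20" maxIdle="5" maxWait="10000"/>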
I had the same problem as the OP (a blank screen when clicking New View) with the latest version of the Pentaho BI Server, 7.1 (latest at the moment), and even with the 6.0 version, the Pivot4J SNAPSHOT 1.0 plugin (latest as of today), and Schema Workbench 3.14 (latest as of today).
And, in line with the OP, my catalina.out log was also spitting out Name [DatasourceName] is not bound in this Context. Unable to find [DatasourceName].
After several trials and errors I noticed the problem showed up when I checked "Register the XMLA Data Source" while publishing the schema from Schema Workbench. So to fix the problem I just unchecked it before publishing.
Another way to fix this is to go to the Manage Data Sources option on the BI server, choose Import Analysis, select the schema created by Schema Workbench, AND manually set the data source parameter EnableXmla to false before saving the changes. Now the schema should show up when clicking Create New > Pivot4J view.
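If you ever need to set that flag outside the UI, it is just another key/value pair on the catalog's Mondrian datasource info, roughly like this (the connection name is a placeholder):

<DataSourceInfo>Provider=mondrian;DataSource=MyJdbcConnection;EnableXmla=false</DataSourceInfo>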

How to use web service as data source on Pentaho

I am trying to use a web service as a data source in Pentaho Report Designer.
Can you please guide me on this?
If you don't have Pentaho Data Integration, you can do it the hard way with Groovy scripting; I have an example here (although it uses a Java client, not a web client):
http://funpdi.blogspot.com/2014/09/groovy-datasources-with-pentaho-report.html
With Pentaho Data Integration, you can create a transformation that uses a REST step to get data from a web service. Then in Pentaho Report Designer you can create a Pentaho Data Integration datasource, choose the step you want to get fields from, then use those fields in your report. There's a great blog post explaining this process:
http://infochick76.blogspot.com/2013/10/pentaho-report-integration-with-web.html

Deploy a mondrian schema in pentaho 5.1 without schema workbench

I have a question: in Pentaho 5.1, how can I deploy a cube without using Schema Workbench? I'm kind of a newbie in Pentaho.
Is there a command line? Java code? Or something like that?
Thanks a lot!
You can do that in the User Console.
There is a menu, Manage Data Sources... There you can upload your XML and point it to a database connection.
First, I suppose you have installed the BA Server and have made at least a fact table.
In case you don't know what a fact table is, or someone else is reading this answer, you can find a brief explanation here.
Of course, it's better to have a full star schema. You cannot create a snowflake schema inside the Pentaho User Console; you can create it with Pentaho Schema Workbench or by manually editing the mondrian.xml.
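For example, a snowflaked dimension is declared in the Mondrian XML with a Join between the dimension tables, which is exactly what the data source wizard cannot generate for you (table and column names below are placeholders):

<Dimension name="Product" foreignKey="product_id">
  <Hierarchy hasAll="true" primaryKey="product_id" primaryKeyTable="dim_product">
    <!-- Snowflake: the product table joins out to a separate category table -->
    <Join leftKey="category_id" rightKey="category_id">
      <Table name="dim_product"/>
      <Table name="dim_category"/>
    </Join>
    <Level name="Category" table="dim_category" column="category_name" uniqueMembers="true"/>
    <Level name="Product" table="dim_product" column="product_name" uniqueMembers="false"/>
  </Hierarchy>
</Dimension>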
Make sure that your JDBC driver is inside the BA Server driver directory. Then open the Pentaho User Console (by default at localhost:8080/pentaho or yourdomain.name:8080/pentaho) and log in as an administrator:
File -> New -> Data Source
Choose the data source type
Choose the fact table and define connections to the dimensions (if they exist)
Choose to modify the cube at the end of the data source wizard.

Directly accessing data in Essbase

Is there any provision to access the physical data (in table format or something) from Essbase directly?
We want to migrate data from Hyperion Essbase 9.3.1 to SAP SSM.
Please help if you have any idea about this.
Regards,
Sreenath
You have some alternatives:
Use a Data Integration (ETL) tool that supports Essbase (Oracle Data Integrator, Informatica PowerCenter, IBM DataStage)
You can export Essbase data to a file (using a Calc Script or a Report Script)
You can use the Java API to access the data from another language/application
You can even try an MDX connector to extract that data. However, I would just write a text Report Script to extract the data according to your specifications.

How to use Pentaho Data Integration for reporting

In my project we are using Pentaho Data Integration and a Saiku server for reporting.
Now I am new to the business intelligence thing and I am confused about which function is performed by which piece of software.
The senior coders here don't tell me, so that's why I am asking here.
I am confused about what functionality is provided by these tools:
Pentaho
PAN
Kitchen
Spoon
Saiku
There is scripting which generates these four files:
cube.json sims.json schema.json path.json
Now I don't know which software will be using those JSON files: Pentaho, Saiku, Spoon, or what.
Can anyone give me some idea?
Pentaho Data Integration is one of the open source tools provided by the Pentaho suite.
Spoon is used to create transformations using a GUI.
If you simply want to run the transformations and jobs from the command line, use Pan (for transformations) or Kitchen (for jobs).
Saiku is a server; Pentaho itself has a server (the Pentaho BI Server), and into it you can add the Saiku plugin for displaying cubes which are designed in Pentaho Schema Workbench.
For more understanding, google the terms which I mention in this answer.