Schema loads in Pentaho BI Server 5 but cube is not displayed

I'm running Schema Workbench 3.6.1 and Pentaho BI Server 5.0.1, with Saiku Analytics installed from the Marketplace, and I can publish successfully from Schema Workbench. (The data source is also present and working in both Schema Workbench and the BI Server.)
When I go to the BI Server, the analysis file created by the Workbench is present, but the cube does not appear in Saiku Analytics; when I refresh, only the two sample cubes are listed. What am I doing wrong?
I appreciate your help.

Check your pentaho.log and catalina.out files; there is most likely an error in the schema (a bad column or table name, a wrong data type on a level, or something similar).
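For context, each level in a Mondrian schema maps directly to a physical table and column, so a single mistyped name is enough to keep the cube from loading even though the rest of the file publishes fine. A minimal sketch of the relevant part of a schema (cube, table, and column names here are hypothetical):
<Cube name="Sales">
  <Table name="fact_sales"/>  <!-- must match the physical fact table exactly -->
  <Dimension name="Time" foreignKey="time_id">
    <Hierarchy hasAll="true" primaryKey="time_id">
      <Table name="dim_time"/>
      <!-- a wrong column name or type here typically shows up as an error in pentaho.log -->
      <Level name="Year" column="the_year" type="Numeric" uniqueMembers="true"/>
    </Hierarchy>
  </Dimension>
  <Measure name="Sales Amount" column="amount" aggregator="sum" formatString="#,###.00"/>
</Cube>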

Related

Is there any file where I can edit the database connection between Pentaho and PostgreSQL?

I have installed Alflytics 5.0 on the Pentaho Server.
When I make changes in the Configuration tab and try to save, it reports "configuration not saved".
My issue is that I am unable to change the configuration and extract the data in Alflytics.
So I want to know which file holds the database connection settings between the Pentaho Server and PostgreSQL.
I have installed Pentaho Server on the Windows 10 platform.
Thanks and regards.

simple-jndi/jdbc.properties is ignored in pentaho-server 8.2

If we need to perform queries in Pentaho Data Integration (the IDE), we have to add the data source manually in simple-jndi/jdbc.properties:
MyDatabase/type=javax.sql.DataSource
MyDatabase/driver=org.hsqldb.jdbcDriver
MyDatabase/url=jdbc:hsqldb:hsql://localhost/sampledata
MyDatabase/user=pentaho_admin
MyDatabase/password=password
This works as expected in the IDE, known as Pentaho Data Integration, Spoon, or Kettle.
However, the same setup does not work in Pentaho Server 8.2.
Steps to reproduce the error:
1. Deploy or upload the transformation (.ktr) to pentaho-server 8.2.
2. Manually add the data source on the server in /../pentaho-server/pentaho-solutions/system/simple-jndi/jdbc.properties.
3. Execute the transformation from the Pentaho Server web console, using the "run in background" or "schedule" option.
Error: datasource not found.
Alternative
Create the data source manually using the web console of the Pentaho Server instead of manually modifying the file /../pentaho-server/pentaho-solutions/system/simple-jndi/jdbc.properties.
Question
Does simple-jndi/jdbc.properties work on the server, or is it just for development purposes (PDI)?
Are the settings available in the jdbc.properties file on the server? You can verify this by comparing it with the corresponding file in the pentaho data integrator > simple-jndi folder.
Also check whether you can connect to the database from the server itself, perhaps using a database client, to confirm connectivity.
NOTE: whenever you work with databases, make sure you have the relevant driver libraries in place to connect to the respective database.
From my personal experience, I was not able to make the server pick up JNDI connection definitions from the simple-jndi/jdbc.properties file when the Pentaho Server was running on Tomcat.
I was only able to use JNDI on the Pentaho Server by defining JNDI Datasources in the Tomcat configuration files.
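For illustration, a JNDI data source at the Tomcat level is declared with a <Resource> element, for example in tomcat/webapps/pentaho/META-INF/context.xml. This is only a sketch reusing the sampledata values from the jdbc.properties in the question; adjust driver, URL, and credentials to your own database:
<!-- hypothetical JNDI definition; a connection of type JNDI would then reference the name MyDatabase -->
<Resource name="jdbc/MyDatabase" auth="Container" type="javax.sql.DataSource"
          driverClassName="org.hsqldb.jdbcDriver"
          url="jdbc:hsqldb:hsql://localhost/sampledata"
          username="pentaho_admin" password="password"/>
A server restart is usually needed before the new resource is picked up.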

Access PME Metadata in Pivot4J/Saiku

I am using Pentaho BI Server 5.0.1 with the Pivot4J plug-in for Pentaho and Pentaho Metadata Editor (PME) 5.1.0. I created a domain in PME with tables, relationships, security, etc. and published it to the Pentaho server. I can see this domain as a data source of type 'Metadata' when I go to 'Manage Datasources' in the Admin console. However, my domain does not appear as a source when I try to create a new Pivot4J view.
I do see the Pentaho demos (SampleData and Steel Wheels) in Pivot4J, but I do not see my domain/metadata. I also edited the PentahoObject.spring.xml file to remove security on metadata objects as suggested in the Pentaho documentation. The same goes for Saiku. What step am I missing here?

Issue with data sources that are created through the Pentaho Admin Console

My MySQL server's IP address has changed, and I have updated it in my BI Server configuration.
But the Pentaho Admin Console is not coming up.
In the logs I found the following:
01:24:40,370 ERROR [Logger] misc-org.pentaho.platform.engine.services.connection.datasource.dbcp.PooledDatasourceSystemListener: PooledDatasourceSystemListener.ERROR_0003 - Unable to pool datasource object: MyLocalDatabase caused by com.mysql.jdbc.exceptions.jdbc4.CommunicationsException: Communications link failure
The last packet sent successfully to the server was 0 milliseconds ago. The driver has not received any packets from the server.
01:24:40,857 WARN [PersistenceEngine] Falling back to built-in config
The MyLocalDatabase data source created in the Pentaho Admin Console has to be updated with the new IP. Can anyone tell me which file contains the data sources that are created through the Pentaho Admin Console?
My PAC is down, and there is no error in the server.log file.
Datasource information is kept in the hibernate database.
By default, this is kept in a Hypersonic (HSQLDB) database that is launched when you start the BI server. Check context.xml in webapps/pentaho/META-INF to make sure.
There's a DATASOURCE table in there that stores the data source definitions.
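For orientation, that context.xml contains <Resource> entries for the repository databases. The Hibernate one, which holds the DATASOURCE table mentioned above, looks roughly like the following sketch; the HSQLDB defaults shown here are placeholders and may differ in your install:
<!-- sketch of the default Hibernate repository connection; adjust url and credentials to your setup -->
<Resource name="jdbc/Hibernate" auth="Container" type="javax.sql.DataSource"
          driverClassName="org.hsqldb.jdbcDriver"
          url="jdbc:hsqldb:hsql://localhost/hibernate"
          username="hibuser" password="password"/>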

How do I manage the sample datasource in Saiku

I installed Saiku as a plugin in Pentaho 4.8. Saiku comes with sample data, which is SteelWheels. The user is able to choose "Select a cube" in the dropdown box. I just need to know how I can modify the data in that sample. Where is the file that contains the datastore? And how does the database communicate with the XML file?
Thank you
You have to create a cube in Pentaho Schema Workbench or in the Pentaho BI Server.
If you create it in Schema Workbench, you have to publish it; once your schema is published to the BI Server, you will be able to see the selected cube in the Saiku server directly.