I installed Saiku as a plugin in Pentaho 4.8. Saiku comes with the SteelWheels sample data, and the user can pick a cube from the "Select a cube" dropdown. How can I modify that sample data? Where is the file that contains the datastore, and how does the database communicate with the XML schema file?
Thank you
You have to create a cube in Pentaho Schema Workbench or in the Pentaho BI Server.
If you create it in Schema Workbench, you then have to publish it; once the schema is published to the BI Server, you will see the selected cube directly in Saiku.
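For context, what Schema Workbench produces and publishes is a Mondrian schema XML file. A minimal sketch is below; every cube, table, and column name here is made up for illustration and would be replaced by your own star-schema tables:

```xml
<!-- Sketch of a minimal Mondrian schema, as produced by Schema Workbench.
     All names (MySchema, fact_sales, dim_product, ...) are placeholders. -->
<Schema name="MySchema">
  <Cube name="MyCube">
    <Table name="fact_sales"/>
    <Dimension name="Product" foreignKey="product_id">
      <Hierarchy hasAll="true" primaryKey="product_id">
        <Table name="dim_product"/>
        <Level name="Product Name" column="product_name"/>
      </Hierarchy>
    </Dimension>
    <Measure name="Sales" column="amount" aggregator="sum" formatString="#,###.00"/>
  </Cube>
</Schema>
```

After publishing, Saiku reads this schema against the datasource it is mapped to, which is why the cube only appears once both the schema and the datasource are in place on the server.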
I have installed Alflytics v5.0 on the Pentaho Server.
When I make changes in the Configuration tab and try to save, it shows "configuration not saved".
My issue is that I am unable to change the configuration and extract the data in Alflytics.
So I want to know which file holds the database connection settings for the Pentaho Server and PostgreSQL.
I have installed the Pentaho Server on Windows 10.
Thanks and regards.
If we need to perform queries in Pentaho Data Integration (the IDE), we have to add the datasource manually in simple-jndi/jdbc.properties:
MyDatabase/type=javax.sql.DataSource
MyDatabase/driver=org.hsqldb.jdbcDriver
MyDatabase/url=jdbc:hsqldb:hsql://localhost/sampledata
MyDatabase/user=pentaho_admin
MyDatabase/password=password
This works as expected in the IDE, known as Pentaho Data Integration, Spoon, or Kettle.
But the same thing does not work in Pentaho Server 8.2.
Steps to reproduce the error
1. Deploy or upload the transformation (.ktr) to Pentaho Server 8.2.
2. Manually add the datasource in the server's /../pentaho-server/pentaho-solutions/system/simple-jndi/jdbc.properties.
3. Execute the transformation from the Pentaho Server web console ("run in background" or the schedule options).
4. Error: datasource not found.
Alternative
Create the datasource manually in the Pentaho Server web console instead of manually modifying the file /../pentaho-server/pentaho-solutions/system/simple-jndi/jdbc.properties.
Question
Does simple-jndi/jdbc.properties work on the server, or is it just for development purposes (PDI)?
Are the settings available in the jdbc.properties file on the server? You can verify this by comparing it against the Pentaho Data Integration > simple-jndi folder.
Also, are you able to connect to the database from the server, perhaps using a database client, to confirm?
NOTE: whenever you work with databases, make sure you have the relevant library (JDBC driver) files needed to connect to the respective database.
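One quick way to sanity-check the first point above, that the server's jdbc.properties actually defines every key a connection needs, is a small script. This is only a sketch; the connection name and sample content are placeholders, and you would read the real file from your server's simple-jndi folder:

```python
# Sketch: check that a simple-jndi jdbc.properties file defines all keys
# a JNDI connection needs. Connection name and sample text are placeholders.
REQUIRED_SUFFIXES = ("type", "driver", "url", "user", "password")

def missing_keys(properties_text, connection_name):
    """Return the required key suffixes absent for `connection_name`."""
    defined = set()
    prefix = connection_name + "/"
    for line in properties_text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blanks and comments
        key = line.split("=", 1)[0].strip()
        if key.startswith(prefix):
            defined.add(key[len(prefix):])
    return [s for s in REQUIRED_SUFFIXES if s not in defined]

sample = """\
MyDatabase/type=javax.sql.DataSource
MyDatabase/driver=org.hsqldb.jdbcDriver
MyDatabase/url=jdbc:hsqldb:hsql://localhost/sampledata
MyDatabase/user=pentaho_admin
"""
print(missing_keys(sample, "MyDatabase"))  # → ['password']
```

A connection silently missing one key (here, `password`) can fail at runtime with an unhelpful "datasource not found"-style error, so checking the file mechanically is cheaper than re-running the transformation.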
From my personal experience, I was not able to make the server pick JNDI connection definitions from the simple-jndi/jdbc.properties file when the Pentaho Server was running with Tomcat.
I was only able to use JNDI on the Pentaho Server by defining JNDI Datasources in the Tomcat configuration files.
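For reference, defining the datasource at the Tomcat level looks roughly like the fragment below. This is a sketch only: the resource name, credentials, HSQLDB URL, and file location (typically something like pentaho-server/tomcat/webapps/pentaho/META-INF/context.xml) are assumptions you would adapt to your own install:

```xml
<!-- Sketch: JNDI datasource declared in Tomcat's context.xml.
     All names, credentials, and the URL are placeholders. -->
<Context>
  <Resource name="jdbc/MyDatabase"
            auth="Container"
            type="javax.sql.DataSource"
            driverClassName="org.hsqldb.jdbcDriver"
            url="jdbc:hsqldb:hsql://localhost/sampledata"
            username="pentaho_admin"
            password="password"
            maxTotal="20"
            maxIdle="5"/>
</Context>
```

The JDBC driver jar must also be on Tomcat's classpath (e.g. in tomcat/lib), since with container-managed JNDI it is Tomcat, not the Pentaho web application, that instantiates the driver.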
I am using Pentaho BI Server 5.0.1 with the Pivot4J plug-in for Pentaho and Pentaho Metadata Editor (PME) 5.1.0. I created a domain in PME with tables, relationships, security, etc. and published it to the Pentaho server. I can see this Domain as a datasource of type 'Metadata' when I go to 'Manage Datasources' in the Admin console. However my Domain does not appear as a source when I try to create a new Pivot4J view.
I do see the Pentaho demos (Sampledata and Steel Wheels) in Pivot4J but I do not see my domain/metadata. I also edited the PentahoObject.spring.xml file to remove security on metadata objects as suggested in the Pentaho doc. Same thing for Saiku. What is the step that I am missing here?
I have a database, currently hosted out on Microsoft Azure. I exported the database to my storage account and I attempted to import the database to a local instance of SQL 2012.
When importing, I am able to copy the BACPAC file from Azure, but I get an error on "Creating database on target," with the error reading:
Could not load schema model from package. (Microsoft.SqlServer.Dac)
ADDITIONAL INFORMATION:
Internal Error. The internal target platform type
SqlAzureDatabaseSchemaProvider does not support schema file version
'2.5'.
I have installed SQL Server Data Tools for 2012 from this download, but that still did not fix the problem.
This was an old bug reported a couple of years back that was fixed by an update to SSDT in 2012. Have you got this update: https://msdn.microsoft.com/en-us/jj650015?f=255&MSPPError=-2147217396
More discussion on Technet about this issue: https://social.technet.microsoft.com/Forums/en-US/66a4dfeb-c626-45eb-af3c-00e7e5996203/bacpac-file-import-from-windows-azure-fails?forum=ssdt
I'm running Schema Workbench 3.6.1, Pentaho BI Server 5.0.1, and Saiku Analytics installed from the marketplace, and I can successfully publish from Schema Workbench. (I also have the datasource present and working in both Schema Workbench and the BI Server.)
When I go to the BI Server, the analysis file created by the workbench is present, but the cube is not in Saiku Analytics. When I refresh, only the two test cubes are present. What am I doing wrong?
I appreciate your help.
Check your pentaho.log and catalina.out files; there's most likely an error in the schema (a bad column or table name, a wrong data type on a level, or something similar).