I'm studying Pentaho using this guide, but I'm having a problem getting the FoodMart schema to show up in the Schema drop-down menu, in both Pentaho Analysis and Saiku Analysis.
What do I have to do to make it work?
Further details can be provided upon asking.
I managed to make it work. It turns out my version of Pentaho didn't come with foodmart.mondrian.xml.
I downloaded it here and put it inside pentaho/biserver-ce/pentaho-solutions/foodmart.
Check the logs when Pentaho/Saiku starts and see if there's an error validating the schema.
Also check that system/olap/datasources.xml is set up correctly.
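For reference, a catalog entry in that file looks roughly like this. This is only a sketch: the FoodMart catalog name, data-source name, and solution path are assumptions for illustration, not values taken from any particular install.

```xml
<DataSources>
  <DataSource>
    <DataSourceName>Provider=Mondrian;DataSource=Pentaho</DataSourceName>
    <URL>http://localhost:8080/pentaho/Xmla</URL>
    <DataSourceInfo>Provider=mondrian</DataSourceInfo>
    <ProviderName>PentahoXMLA</ProviderName>
    <ProviderType>MDP</ProviderType>
    <AuthenticationMode>Unauthenticated</AuthenticationMode>
    <Catalogs>
      <!-- Each schema that should appear in the Schema drop-down needs a Catalog entry -->
      <Catalog name="FoodMart">
        <DataSourceInfo>Provider=mondrian;DataSource=foodmart</DataSourceInfo>
        <Definition>solution:foodmart/foodmart.mondrian.xml</Definition>
      </Catalog>
    </Catalogs>
  </DataSource>
</DataSources>
```

If the Definition path doesn't point at an existing schema file, the schema simply won't appear in the drop-down, which matches the symptom described above.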
What is your build? FoodMart is included in the downloadable demo, so you must have something different.
I'm trying to download a Pentaho Server version to deploy on a Linux production environment. Among the Pentaho open-source files on SourceForge, I found what appear to be two versions, as shown in the picture below: Pentaho-Server-ce and Pentaho-Server-manual-ce. I want to know whether there are any differences between them in terms of requirements, functionality, and so on.
While I'm at it, I would also like to ask how the Pentaho Server works in general. For instance, it took me a while to figure out that I need a separate application, Pentaho Report Designer, to build my dashboards and then use them in the Pentaho Server. The documentation is good but, in my opinion, lacks a little clarity on how to use it properly. Moreover, there are almost no tutorials for the Pentaho Server (I found some on Pentaho Data Integration, but almost none on the server itself).
So any details and leads on how to master it would be highly appreciated.
Could someone tell me how to share a Saiku query by URL? When I try it on another computer using "open in new window", the URL sends me to the initial screen of Pentaho Server. I tried using:
http://localhost:8080/pentaho/api/repos/path:mysaikuquery.saiku/run
but that disables the edit functions, and I need them.
Thank you very much in advance!!
Regards
I think it's a Saiku issue, not Pentaho's.
Why don't you ask the author or the community? Unlike Pentaho, they don't charge for an answer.
There isn't a very active community around Pentaho and its plugins like Saiku. I have posted it in the Saiku community forums as well. But saying that Saiku has nothing to do with Pentaho... I really don't understand your comment, when people who work with Saiku usually use the Pentaho suite too.
I accidentally found a temporary solution, so I'll share it: in the Pentaho console, right-click the tab with the Saiku project open (after having run it from Browse Files, of course), and from the context menu choose "Create a deep link". Copy and paste it wherever you need it. It works with users and roles, so viewers will need credentials to view and modify the query against the cube. That makes it a good fit for sharing projects with clients or non-admin users.
Regards
Can you try edit instead of run at the end of the URL?
I'm not sure whether it will work; it's been a long time since I worked with this.
I created an MS SQL database in SSMS 2012, connected successfully to Azure, and am trying to deploy the DB to the cloud.
I'm encountering the following errors (please see the screenshot):
Numerous "Unsupported property" errors: not supported when used as part of a data package.
You're likely using a feature that isn't supported in Azure SQL Database. Please refer to this list of unsupported features to help you pinpoint the problem:
http://msdn.microsoft.com/en-us/library/azure/ff394115.aspx
This happened to me too. In my case, I had changed the schema of a table after it was first created. After deleting that table, the database deployed correctly. This error usually occurs when schema validation fails.
Regards
Manoj Bojja
I have a question: in Pentaho 5.1, how can I deploy a cube without using Schema Workbench? I'm kind of a newbie in Pentaho.
Is there a command line? Java code? Or something like that...
Thanks a lot!
You can do that in the User Console.
There is a Manage Data Sources menu where you can upload your XML and point it to a database connection.
First, I assume you have installed the BA Server and have at least a fact table.
In case you don't know what a fact table is (or someone else is reading this answer), you can find a brief explanation here.
Of course, it's better to have a full star schema. You cannot create a snowflake schema inside the Pentaho User Console; for that, use Pentaho Schema Workbench or manually edit the mondrian.xml file.
Make sure your JDBC driver is inside the BA Server driver directory. Then open the Pentaho User Console (by default at localhost:8080/pentaho or yourdomain.name:8080/pentaho) and log in as administrator:
1. File -> New -> Data Source
2. Choose the data source type
3. Choose the fact table and define connections to the dimensions (if any)
4. Choose to modify the cube at the end of the data source wizard
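If you instead go the manual route and edit the schema by hand, a minimal mondrian.xml looks roughly like this. This is only a sketch of the Mondrian 3 schema format; the schema, cube, table, and column names here are made up for illustration and must match your own star schema.

```xml
<Schema name="SalesSchema">
  <Cube name="Sales">
    <!-- The fact table the cube is built on -->
    <Table name="sales_fact"/>
    <!-- A dimension joined to the fact table via a foreign key -->
    <Dimension name="Time" foreignKey="time_id">
      <Hierarchy hasAll="true" primaryKey="time_id">
        <Table name="time_by_day"/>
        <Level name="Year" column="the_year" type="Numeric"/>
        <Level name="Month" column="month_of_year" type="Numeric"/>
      </Hierarchy>
    </Dimension>
    <!-- A measure aggregated from a fact-table column -->
    <Measure name="Unit Sales" column="unit_sales" aggregator="sum" formatString="#,###"/>
  </Cube>
</Schema>
```

A file like this is what you would upload through the Manage Data Sources import dialog, pointing it at an existing database connection.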
I have an AppFuse Struts 2 app that I'm trying to deploy on CloudBees. I have created a database and bound the app to it, but I'm not sure how to add tables to the database on CloudBees. Is there a way to run a .sql script against a database created on CloudBees?
Also, when I try to run the app via its link, it gives the error "Requested resource not available". I'm guessing this is because the DB is empty. Can anyone help me add tables and data to the DB, and get the app running smoothly on CloudBees?
Thanks a ton for your help.
You can add the tables and populate the data in different ways. There is an article here that explains how to do it.
From your application, if you use Spring, you can use something like this (you'll need to figure out the equivalent for other frameworks):
<!-- Database initializer. If any of the script fails, the initialization stops. -->
<jdbc:initialize-database data-source="dataSource">
    <jdbc:script location="${jdbc.initLocation}"/>
    <jdbc:script location="${jdbc.dataLocation}"/>
</jdbc:initialize-database>
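For the jdbc: elements above to resolve, the Spring context file also needs the spring-jdbc namespace declared on its root element, roughly like this (the ${jdbc.initLocation} and ${jdbc.dataLocation} placeholders are assumed to be resolved by a property placeholder pointing at your .sql scripts):

```xml
<beans xmlns="http://www.springframework.org/schema/beans"
       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
       xmlns:jdbc="http://www.springframework.org/schema/jdbc"
       xsi:schemaLocation="http://www.springframework.org/schema/beans
                           http://www.springframework.org/schema/beans/spring-beans.xsd
                           http://www.springframework.org/schema/jdbc
                           http://www.springframework.org/schema/jdbc/spring-jdbc.xsd">
    <!-- the <jdbc:initialize-database> element shown above goes here -->
</beans>
```

With this in place, the scripts run once at context startup, which is enough to create the tables and seed the data on the bound database.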
If you plan to use a MySQL client, then you should take a look at this, which explains it step by step.
Regarding how to deploy and to bind a Tomcat 7 app with your database, you can take a look at this blog post.
You can find info about all the containers we support here.