Connecting Pentaho to Apache Druid

Can anybody let me know the steps to connect to Apache Druid using Pentaho, please?
I tried the following steps:
Copy the Avatica jar file to tomcat/lib
Create a new Generic Database (JDBC) connection using the following parameters:
MSTR_JDBC_JAR_FOLDER=/opt/mstr/MicroStrategy/install/JDBC;
DRIVER=org.apache.calcite.avatica.remote.Driver;
URL={jdbc:avatica:remote:url=http://localhost:8082/druid/v2/sql/avatica/};
Note:
1. userid/password is not enabled, i.e. Druid can be accessed without login credentials.
2. I am able to access the Druid console at "http://localhost:8888/unified-console.html".
3. I am able to connect to Druid using the Apache Superset and Tableau applications.

jdbc:avatica:remote:url=http://localhost:8082/druid/v2/sql/avatica/
Instead of http, replace the scheme with druid, as I did below; maybe it works:
jdbc:avatica:remote:url=druid://localhost:8082/druid/v2/sql/avatica/
Also, let me know whether you were able to connect Druid to Pentaho.
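Before wiring this into Pentaho, it can help to confirm the Avatica endpoint with a minimal standalone JDBC program. The following is a sketch under the question's assumptions (broker on localhost:8082, no credentials); the Avatica client jar must be on the classpath, and note that the stock Avatica driver documents an http(s) URL for the remote endpoint.

```java
// Standalone probe for the Druid Avatica endpoint, outside Pentaho.
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class DruidAvaticaCheck {

    // Build the JDBC URL exactly as entered in the Pentaho generic connection.
    static String avaticaUrl(String host, int port) {
        return "jdbc:avatica:remote:url=http://" + host + ":" + port
                + "/druid/v2/sql/avatica/";
    }

    public static void main(String[] args) throws Exception {
        String url = avaticaUrl("localhost", 8082);
        System.out.println(url);
        if (args.length > 0) { // pass any argument to actually connect
            // Driver class from the question; registers itself with DriverManager.
            Class.forName("org.apache.calcite.avatica.remote.Driver");
            try (Connection c = DriverManager.getConnection(url);
                 Statement s = c.createStatement();
                 ResultSet rs = s.executeQuery("SELECT 1")) {
                while (rs.next()) {
                    System.out.println("probe result: " + rs.getInt(1));
                }
            }
        }
    }
}
```

If this probe connects, the remaining issue is on the Pentaho side (jar placement in tomcat/lib, driver class name) rather than on the Druid side.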

Related

How to connect to Odoo Data with JDBC api via Talend?

I wanted to extract data from an Odoo database in a cloud environment. I tried this JDBC URL: jdbc:odoo:User=MyUser;Password=MyPassword;URL=http://MyOdooSite/;Database=MyDatabase;
But I didn't get any results. Do you have any ideas, please?
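Since Talend adds a layer on top, a minimal standalone check of the URL can narrow things down. Odoo does not ship a JDBC driver itself, so the driver class name below is a hypothetical placeholder for whichever third-party driver you installed; check that driver's documentation for the exact class name.

```java
// Standalone check of the Odoo JDBC URL, outside Talend.
import java.sql.Connection;
import java.sql.DriverManager;

public class OdooJdbcCheck {

    // Assemble the semicolon-delimited URL from the question.
    static String odooUrl(String user, String password, String site, String db) {
        return "jdbc:odoo:User=" + user + ";Password=" + password
                + ";URL=" + site + ";Database=" + db + ";";
    }

    public static void main(String[] args) throws Exception {
        String url = odooUrl("MyUser", "MyPassword", "http://MyOdooSite/", "MyDatabase");
        System.out.println(url);
        if (args.length > 0) { // pass any argument to actually connect
            // Hypothetical class name -- replace with the one your driver ships.
            Class.forName("cdata.jdbc.odoo.OdooDriver");
            try (Connection c = DriverManager.getConnection(url)) {
                System.out.println("connected: " + !c.isClosed());
            }
        }
    }
}
```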

simple-jndi/jdbc.properties is ignored in pentaho-server 8.2

If we need to perform queries in Pentaho Data Integration (the IDE), we need to add the datasource manually in simple-jndi/jdbc.properties:
MyDatabase/type=javax.sql.DataSource
MyDatabase/driver=org.hsqldb.jdbcDriver
MyDatabase/url=jdbc:hsqldb:hsql://localhost/sampledata
MyDatabase/user=pentaho_admin
MyDatabase/password=password
This works as expected in the IDE, known as Pentaho Data Integration, Spoon, or Kettle.
But the same does not work in Pentaho Server 8.2.
Steps to reproduce the error:
1. Deploy or upload the transformation (.ktr) to Pentaho Server 8.2.
2. Add the datasource manually on the server in /../pentaho-server/pentaho-solutions/system/simple-jndi/jdbc.properties.
3. Execute the transformation using the Pentaho Server web console ("run in background" or schedule options).
Error: datasource not found.
Alternative
Create the datasource manually using the web console of Pentaho Server instead of manually modifying the file /../pentaho-server/pentaho-solutions/system/simple-jndi/jdbc.properties.
Question
Does simple-jndi/jdbc.properties work on the server, or is it just for development purposes (PDI)?
Are the settings available in the jdbc.properties file on the server? You can verify this by referring to the Pentaho Data Integration > simple-jndi folder.
Also, are you able to connect to the database from the server, perhaps using a database client, to confirm?
NOTE: whenever you work with databases, make sure you have the relevant library files to connect to the respective database.
From my personal experience, I was not able to make the server pick up JNDI connection definitions from the simple-jndi/jdbc.properties file when the Pentaho Server was running with Tomcat.
I was only able to use JNDI on the Pentaho Server by defining JNDI datasources in the Tomcat configuration files.
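For reference, a Tomcat-side JNDI definition mirroring the MyDatabase entry from jdbc.properties can be sketched like this. The file location and resource name are assumptions that depend on your deployment; the attributes follow standard Tomcat DBCP syntax.

```xml
<!-- context.xml (e.g. under the webapp's META-INF -- location depends on
     your deployment): JNDI datasource mirroring the MyDatabase entry. -->
<Context>
  <Resource name="jdbc/MyDatabase" auth="Container"
            type="javax.sql.DataSource"
            driverClassName="org.hsqldb.jdbcDriver"
            url="jdbc:hsqldb:hsql://localhost/sampledata"
            username="pentaho_admin" password="password"
            maxTotal="20" maxIdle="5" maxWaitMillis="10000"/>
</Context>
```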

How to connect snappy-data server with Apache zeppelin for %snappy.sql query support

I am currently working on SnappyData SQL query functionality and SnappyData's support for Apache Zeppelin. We can do all of this by connecting Apache Zeppelin to SnappyData. I configured the whole setup as per the SnappyData documentation (SnappyData with Zeppelin link). After finishing this, I can run jobs using the Spark interpreter with SnappyData Scala programming, but I am not able to run jobs using the SnappyData interpreter; it throws "org.apache.thrift.transport.TTransportException". I checked my logs as well, and the same error appears in the log file. Could anyone tell me what the issue is? If you know anything about it, please share. Thanks in advance.

Adding Informix JDBC to Jasper Server

I have installed Jasper Server locally. I need to generate a report that extracts data from an Informix DB, so a JDBC driver for it is needed. I found the JDBC file for Informix, called ifxjdbc.jar, copied it to "/apache-tomcat/lib", and restarted the server. But when I want to create a new data source, the dropdown list doesn't show the JDBC driver I just added.
I even tried copying the JDBC file to "/apache-tomcat/webapps/jasperserver-pro/lib" and restarting the server, but still no luck.
You should install the JDBC driver available on the IBM pages. It contains more jar files than just ifxjdbc.jar. In my environment I copied: ifxjdbc.jar, ifxjdbcx.jar, ifxlang.jar, ifxlsupp.jar, ifxsqlj.jar and ifxtools.jar.
I think you should run a simple Java/Jython program that tries to connect to your database. If it works, you can be sure you have a complete environment.
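The suggested connectivity check could look like the sketch below. The URL shape and driver class are the standard Informix JDBC ones; host, port, database, INFORMIXSERVER name, and credentials are placeholders to adjust for your instance.

```java
// Minimal Informix JDBC connectivity check, run with ifxjdbc.jar (and the
// other driver jars) on the classpath.
import java.sql.Connection;
import java.sql.DriverManager;

public class InformixCheck {

    // Standard Informix JDBC URL shape.
    static String informixUrl(String host, int port, String db, String server) {
        return "jdbc:informix-sqli://" + host + ":" + port + "/" + db
                + ":INFORMIXSERVER=" + server;
    }

    public static void main(String[] args) throws Exception {
        // All values below are placeholders for your instance.
        String url = informixUrl("localhost", 9088, "stores_demo", "informix");
        System.out.println(url);
        if (args.length > 0) { // pass any argument to actually connect
            Class.forName("com.informix.jdbc.IfxDriver");
            try (Connection c = DriverManager.getConnection(url, "informix", "password")) {
                System.out.println("product: " + c.getMetaData().getDatabaseProductName());
            }
        }
    }
}
```

If this connects but Jasper Server still does not list the driver, the problem is in how the server discovers drivers, not in the driver jars themselves.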

I can't create an Apache Derby DB in Worklight 6.0

I upgraded to Worklight 6.0. I added both derby_core_plugin_10_8_2 and derby_iu_doc_plugin_1.1.3 into eclipse --> plugins, restarted Eclipse, and right-clicked my project to add the Apache Derby nature... nothing seems to happen. Then I tried to go directly to server/java --> Apache Derby --> run SQL Script using "ij"... I get "the selected project does not have the Apache Derby nature; please add it and try again".
Your question needs more clarity... it is unclear what you are actually trying to accomplish.
If you're trying to get Worklight to run using a Derby database, then you have to modify your worklight.properties file with the following:
wl.db.url=jdbc:derby:${worklight.home}/derby/WorklightDB;create=true
wl.db.username=dbusername
wl.db.password=dbpassword
If you're trying to call an SQL adapter to connect to a Derby database, then you'll need to post more information so that we can see what isn't actually working.
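To rule out environment problems, the resolved wl.db.url value can also be checked outside Worklight with the embedded Derby driver (shipped in derby.jar). The path below is a placeholder standing in for your expanded ${worklight.home}.

```java
// Quick check that a Derby URL like the wl.db.url above works on its own,
// using the embedded driver from derby.jar.
import java.sql.Connection;
import java.sql.DriverManager;

public class DerbyCheck {

    // Same shape as wl.db.url, with ${worklight.home} already expanded.
    static String derbyUrl(String worklightHome) {
        return "jdbc:derby:" + worklightHome + "/derby/WorklightDB;create=true";
    }

    public static void main(String[] args) throws Exception {
        // Placeholder path -- substitute your actual worklight.home.
        String url = derbyUrl("/tmp/worklight");
        System.out.println(url);
        if (args.length > 0) { // pass any argument to actually open/create the DB
            Class.forName("org.apache.derby.jdbc.EmbeddedDriver");
            try (Connection c = DriverManager.getConnection(url)) {
                System.out.println("created/opened: " + !c.isClosed());
            }
        }
    }
}
```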