How to connect a SnappyData server with Apache Zeppelin for %snappy.sql query support - apache-spark-sql

I am currently working with SnappyData's SQL query functionality and its support for Apache Zeppelin. I configured everything as described in the SnappyData documentation (the SnappyData-with-Zeppelin guide). After finishing the setup I can run Scala jobs against SnappyData through the Spark interpreter, but I cannot run jobs through the SnappyData interpreter: it throws "org.apache.thrift.transport.TTransportException", and the same error appears in the log files. Does anyone know what the issue is? If you know anything about it, please share. Thanks in advance.
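One thing worth checking (a sketch based on the SnappyData-with-Zeppelin setup; the hostname, directory, and jar path below are assumptions for your install): the %snappy interpreter runs inside the lead node, so the lead has to be started with the embedded interpreter enabled, e.g. in conf/leads:

```
# conf/leads -- illustrative entry; hostname, -dir and jar path are assumptions
localhost -dir=/opt/snappydata/work/lead1 \
  -zeppelin.interpreter.enable=true \
  -classpath=/opt/snappydata/jars/snappydata-zeppelin_2.11-<version>.jar
```

If the lead was started without this flag, Zeppelin's SnappyData interpreter cannot reach the interpreter server on the lead, which could surface as exactly this kind of thrift transport error.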

Related

Connecting Pentaho to Apache Druid

Can anybody let me know the steps to connect to Apache Druid using Tableau, please?
I tried the following steps:
Copy the Avatica jar file to tomcat/lib
Create a new Generic Database (JDBC connection) using the following parameter options:
MSTR_JDBC_JAR_FOLDER=/opt/mstr/MicroStrategy/install/JDBC;
DRIVER=org.apache.calcite.avatica.remote.Driver;
URL={jdbc:avatica:remote:url=http://localhost:8082/druid/v2/sql/avatica/};
Note:
1. userid/password is not enabled, i.e. Druid can be accessed without login credentials.
2. I am able to access the Druid console at "http://localhost:8888/unified-console.html".
3. I am able to connect to Druid using the Apache Superset and Tableau applications.
jdbc:avatica:remote:url=http://localhost:8082/druid/v2/sql/avatica/
Instead of http, try replacing it with druid, like I did below; maybe it works.
jdbc:avatica:remote:url=druid://localhost:8082/druid/v2/sql/avatica/
Also, please let me know whether you were able to connect Druid to Pentaho.
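For reference, connecting from plain Java over the Avatica JDBC driver looks roughly like this (a sketch; it assumes the Avatica client jar is on the classpath and that Druid's SQL endpoint is reachable at localhost:8082, as in the URL above):

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class DruidJdbcExample {
    // Builds the Avatica remote JDBC URL for a Druid broker.
    static String druidUrl(String host, int port) {
        return "jdbc:avatica:remote:url=http://" + host + ":" + port
                + "/druid/v2/sql/avatica/";
    }

    public static void main(String[] args) throws Exception {
        // The Avatica driver registers itself via the JDBC service loader,
        // so an explicit Class.forName is usually unnecessary.
        try (Connection conn = DriverManager.getConnection(druidUrl("localhost", 8082));
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("SELECT 1")) {
            while (rs.next()) {
                System.out.println(rs.getInt(1));
            }
        }
    }
}
```

If this works from Java but not from the BI tool, the problem is likely in the tool's JDBC configuration rather than in Druid itself.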

Google App Script - MySql 8 - JDBC Connection Failed

I'm trying to connect to a self-hosted MySql 8 instance from a Google AdWords Script (effectively an App Script) using the Jdbc utility over SSL. My code follows the specifications in these answers here and here.
No matter what I do, I still get the following error:
ERROR: Failed to establish a database connection. Check connection string, username and password.
It turns out that the Jdbc utility only works with versions of MySql up to 5.7.
I thought I would post this here for posterity in case I can save someone else some time.
The issue has been raised here and marked as fixed recently.
Other related issues can be searched here
It seems like Google has updated the JDBC connector. It is working for me now.

ColdFusion SQL Server "Too many open files" error?

I have an API developed in ColdFusion 9 that continuously searches for items and inserts a record with the outcome of each search into a SQL Server 2008 table, but I'm noticing a lot of instances of the following error in my Application log:
Error Executing Database Query.[Macromedia][SQLServer JDBC Driver]Error establishing socket to host and port: X.X.X.X:X. Reason: Too many open files. The specific sequence of files included or processed is: foo.cfm, line: 203
I realise there's not much to go on here but that's all the info I have from the logs.
Anyone have the faintest idea what might be going on?!
I got a similar error from using an old version of Lucene. Lucene used an old version of Apache Commons IO that would sometimes fail to close the files read by the Lucene index, so every time someone ran a search a file would be opened and never closed. Eventually we hit the open-file limit, which caused various problems on the server, one of which is that you can't connect to a datasource.
We had to bounce the server a couple of times to release the open files, and then we updated our Lucene software to the latest version.
I believe Lucene is what Solr (the CF index) runs on.
This happened on a Linux machine and we were running Java, not ColdFusion (but CF runs on Java).

I can't create an Apache Derby DB in Worklight 6.0

I upgraded to Worklight 6.0. I added both derby_core_plugin_10_8_2 and derby_ui_doc_plugin_1.1.3 to eclipse --> plugins, restarted Eclipse, and right-clicked my project to add the Apache Derby nature... nothing seems to happen. Then I tried to go directly to server/java --> Apache Derby --> run SQL Script using "ij"... I get "the selected project does not have the Apache Derby nature; please add it and try again".
Your question needs more clarity... It is unclear what you are actually trying to accomplish.
If you're trying to get worklight to run using a Derby database, then you have to modify your worklight.properties file with the following
wl.db.url=jdbc:derby:${worklight.home}/derby/WorklightDB;create=true
wl.db.username=dbusername
wl.db.password=dbpassword
If you're trying to call an SQL adapter to connect to a Derby database, then you'll need to post more information in order for us to see what it is that isn't actually working.
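If it helps to rule out the database itself, a plain embedded-Derby JDBC connection can be sketched like this (it assumes derby.jar is on the classpath; the database path is a hypothetical example, not Worklight's actual layout):

```java
import java.sql.Connection;
import java.sql.DriverManager;

public class DerbySmokeTest {
    // Builds an embedded Derby JDBC URL; create=true creates the DB if missing.
    static String derbyUrl(String dbPath) {
        return "jdbc:derby:" + dbPath + ";create=true";
    }

    public static void main(String[] args) throws Exception {
        // Path is illustrative; point it at your own Derby database directory.
        try (Connection conn = DriverManager.getConnection(derbyUrl("derby/WorklightDB"))) {
            System.out.println("Connected: " + !conn.isClosed());
        }
    }
}
```

If this connects, the Derby side is fine and the problem is in the Eclipse plugin / project nature rather than the database.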

Could not open "Pentaho User Console" page in browser

I'm working on a Java POS system and I'm a newbie. I need (Kettle) Pentaho Data Integration in order to integrate the Java POS database with the database in the ERP. I followed this manual:
"http://www.scribd.com/doc/19583351/Install-Guide-for-Pentaho-Business-Intelligence-BI-Suite-CE"
and I'm stuck at Part 3, Step 1. When I type the localhost address into the browser, instead of getting the Pentaho login page I'm getting an "HTTP Status 404" error.
Do I have to change the Tomcat server port or anything else? Please help me find the glitch.
Check your server.xml to see what port Tomcat is listening on. I assume that when you started Tomcat it started successfully? (Check the log for errors.)
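For reference, the HTTP port is set on the Connector element in tomcat/conf/server.xml; a typical default looks like this (the values are illustrative, and your install may use a different port):

```xml
<!-- tomcat/conf/server.xml -- illustrative default; your port may differ -->
<Connector port="8080" protocol="HTTP/1.1"
           connectionTimeout="20000"
           redirectPort="8443" />
```

Whatever port is configured there is the one to use in the browser URL for the Pentaho console.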
Use google.
Finally, if you want to use ETL/Kettle then you need to start off looking at the Spoon tool - this is the UI for building ETL. So look at that first perhaps.
(You don't even need the BI server if all you're doing is ETL.)