DBeaver Connecting to Hive - SQLException: Method not supported

I'm getting this error when trying to run a select after connecting to Hive.
Is this a bad jar file?
org.jkiss.dbeaver.model.impl.jdbc.JDBCException: SQL Error: Method not supported
at org.jkiss.dbeaver.model.impl.jdbc.exec.JDBCConnectionImpl.prepareStatement(JDBCConnectionImpl.java:170)
at org.jkiss.dbeaver.model.impl.jdbc.exec.JDBCConnectionImpl.prepareStatement(JDBCConnectionImpl.java:1)
at org.jkiss.dbeaver.model.DBUtils.createStatement(DBUtils.java:985)
at org.jkiss.dbeaver.model.DBUtils.prepareStatement(DBUtils.java:963)
at org.jkiss.dbeaver.runtime.sql.SQLQueryJob.executeSingleQuery(SQLQueryJob.java:313)
at org.jkiss.dbeaver.runtime.sql.SQLQueryJob.extractData(SQLQueryJob.java:633)
at org.jkiss.dbeaver.ui.editors.sql.SQLEditor$QueryResultsProvider.readData(SQLEditor.java:1169)
at org.jkiss.dbeaver.ui.controls.resultset.ResultSetDataPumpJob.run(ResultSetDataPumpJob.java:132)
at org.jkiss.dbeaver.runtime.AbstractJob.run(AbstractJob.java:91)
at org.eclipse.core.internal.jobs.Worker.run(Worker.java:54)
Caused by: java.sql.SQLException: Method not supported
at org.apache.hadoop.hive.jdbc.HiveConnection.createStatement(HiveConnection.java:229)
at org.jkiss.dbeaver.model.impl.jdbc.exec.JDBCConnectionImpl.createStatement(JDBCConnectionImpl.java:350)
at org.jkiss.dbeaver.model.impl.jdbc.exec.JDBCConnectionImpl.prepareStatement(JDBCConnectionImpl.java:138)
... 9 more

There is a class in the Hive JDBC jar called org.apache.hive.jdbc.HiveResultSetMetaData. This class contains a method, isWritable, which is not supported by Hive yet. This is the reason you get the error "Method not supported".
Take the source code of this class and update that method, then recompile the class and replace it in the jar. This worked for me.
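For illustration, a minimal sketch of such a patch, assuming it is acceptable to report every column as read-only (Hive result sets cannot be updated through JDBC anyway); this is not the official Hive fix:

// Patched method inside org.apache.hive.jdbc.HiveResultSetMetaData
@Override
public boolean isWritable(int column) throws SQLException {
    // Report read-only instead of throwing SQLException("Method not supported").
    return false;
}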

Related

create sequence via ucanaccess

I am having trouble creating a sequence in an MS Access database via the DBeaver / UCanAccess connector.
Does anybody know if this should be possible, or is it simply not implemented?
The error I get is the following:
org.jkiss.dbeaver.model.sql.DBSQLException: SQL-Fehler: UCAExc:::5.0.1 net.ucanaccess.jdbc.FeatureNotSupportedException: Feature not supported yet.
at org.jkiss.dbeaver.model.impl.jdbc.exec.JDBCStatementImpl.executeStatement(JDBCStatementImpl.java:133)
at org.jkiss.dbeaver.ui.editors.sql.execute.SQLQueryJob.executeStatement(SQLQueryJob.java:577)
at org.jkiss.dbeaver.ui.editors.sql.execute.SQLQueryJob.lambda$1(SQLQueryJob.java:486)
at org.jkiss.dbeaver.model.exec.DBExecUtils.tryExecuteRecover(DBExecUtils.java:172)
at org.jkiss.dbeaver.ui.editors.sql.execute.SQLQueryJob.executeSingleQuery(SQLQueryJob.java:493)
at org.jkiss.dbeaver.ui.editors.sql.execute.SQLQueryJob.extractData(SQLQueryJob.java:894)
at org.jkiss.dbeaver.ui.editors.sql.SQLEditor$QueryResultsContainer.readData(SQLEditor.java:3645)
at org.jkiss.dbeaver.ui.controls.resultset.ResultSetJobDataRead.lambda$0(ResultSetJobDataRead.java:123)
at org.jkiss.dbeaver.model.exec.DBExecUtils.tryExecuteRecover(DBExecUtils.java:172)
at org.jkiss.dbeaver.ui.controls.resultset.ResultSetJobDataRead.run(ResultSetJobDataRead.java:121)
at org.jkiss.dbeaver.ui.controls.resultset.ResultSetViewer$ResultSetDataPumpJob.run(ResultSetViewer.java:4949)
at org.jkiss.dbeaver.model.runtime.AbstractJob.run(AbstractJob.java:105)
at org.eclipse.core.internal.jobs.Worker.run(Worker.java:63)
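For context, the failing statement can be reproduced with plain JDBC outside DBeaver. A minimal sketch, assuming a hypothetical Access file path and sequence name:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class CreateSequenceRepro {
    public static void main(String[] args) throws Exception {
        // Hypothetical path to the Access file opened through UCanAccess.
        String url = "jdbc:ucanaccess:///path/to/database.accdb";
        try (Connection conn = DriverManager.getConnection(url);
             Statement stmt = conn.createStatement()) {
            // The DDL that UCanAccess 5.0.1 rejects with FeatureNotSupportedException.
            stmt.execute("CREATE SEQUENCE my_seq START WITH 1 INCREMENT BY 1");
        }
    }
}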

Flink 1.9 SQL Client throws ClassNotFoundException: org.apache.kafka.clients.consumer.ConsumerRecord

Trying to use the Flink 1.9 SQL Client with Kafka, without success.
After figuring out the required jar files and copying them into the lib directory, I am getting the following runtime exception when doing SELECT * FROM table-name:
Flink SQL> select * from default_catalog.default_database.member_customer_newsletters ;
[ERROR] Could not execute SQL statement. Reason:
java.lang.ClassNotFoundException: org.apache.kafka.clients.consumer.ConsumerRecord
Running jar -tf on the jar file, I can see that the class ConsumerRecord is there:
jar -tf ./lib/flink-sql-connector-kafka-0.11_2.12-1.9.0.jar|grep 'ConsumerRecord'
org/apache/flink/kafka011/shaded/org/apache/kafka/clients/consumer/ConsumerRecord.class
So I am not sure why it is throwing a ClassNotFoundException when the class is already in the jar file.
I should add that the runtime is looking for "org.apache.kafka.clients.consumer.ConsumerRecord", but because this jar file is shaded, the fully qualified name of the class is "org.apache.flink.kafka011.shaded.org.apache.kafka.clients.consumer.ConsumerRecord".
But this should be true of any other class in this shaded jar as well!
There are two different classes:
org.apache.kafka.clients.consumer.ConsumerRecord
org.apache.flink.kafka011.shaded.org.apache.kafka.clients.consumer.ConsumerRecord
In flink-sql-connector-kafka-0.11_2.12-1.9.0.jar, you found the class
org.apache.flink.kafka011.shaded.org.apache.kafka.clients.consumer.ConsumerRecord
while Flink is complaining about:
org.apache.kafka.clients.consumer.ConsumerRecord
The first (the shaded one, which you found in the jar) is a class used internally by Flink, created by a kind of copy-paste from Kafka during shading.
The second (the one Flink complains about) is a class in kafka-clients-0.11.0.2.jar.
So Flink is right to complain about a missing library.
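To make the distinction concrete, here is a small diagnostic sketch (the class names are taken from the question) that reports which variant is actually loadable:

public class ShadedClassCheck {
    public static void main(String[] args) {
        String[] names = {
            "org.apache.kafka.clients.consumer.ConsumerRecord",
            "org.apache.flink.kafka011.shaded.org.apache.kafka.clients.consumer.ConsumerRecord"
        };
        for (String name : names) {
            try {
                Class.forName(name);
                System.out.println("FOUND:   " + name);
            } catch (ClassNotFoundException e) {
                System.out.println("MISSING: " + name);
            }
        }
    }
}

With only the shaded connector jar on the classpath, the first name prints MISSING, which matches the error; the unshaded class ships in kafka-clients-0.11.0.2.jar.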

Failed to connect to a database in B1if

Sometimes I get this error when I try to connect to my database in B1if:
vBIU.errhdlg='' exceptionmsg='com.sap.b1i.xcellerator.XcelleratorException: XCE001 Nested exception:
com.sap.b1i.bizprocessor.BizProcException: BPE001 Nested exception:
com.sap.b1i.xcellerator.XcelleratorException: XCE001 Nested exception:
com.sap.b1i.xcellerator.XcelleratorException: XCE001 Nested exception: java.lang.RuntimeException: Connect to
Business One failed. (-119) Database server type not supported -b1Server=SRV-B1-HYPPROD,
company=SBO_HYPFR_PROD, licenseServer=HYP-B1-LIC:30000, dbType=7, dbUser=sa, userName=B1i-' msglogexcl='false'
handover2CentralSrv='' MessageLog='true' msglogdbop='insert'>
The question is: how do I fix that?
The error happens when B1i doesn't get a connection to the DB. That should not be a problem, because B1i retries the process after a minute, and by then the connection should be there.
We fixed this by installing the 64-bit version of B1DIAPI.x64. I am not sure why that worked.
Note that you need to update the SLD links for your database and SBO-COMMON - the jcoPath will likely be different - replace the "Program Files (x86)" with just "Program Files", as illustrated below.
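For illustration, the jcoPath change would look like this (the exact install path is an assumption and depends on your B1DIAPI setup):

old: C:\Program Files (x86)\SAP\SAP Business One DI API\JCO\LIB
new: C:\Program Files\SAP\SAP Business One DI API\JCO\LIB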

Error when connecting Hive with kibi?

I am using kibi-community-demo-full-4.6.4-linux-x64 version.
In datasource:
"connection_string": "jdbc:hive://localhost:10000/root",
"libpath": "/home/pare/Downloads/jar/",
"drivername": "org.apache.hadoop.hive.jdbc.HiveDriver",
"libs": "hive-jdbc-0.11.0.jar,hive-metastore-0.11.0.jar,libthrift-0.9.1.jar,hive-service-0.13.1.jar,hive-jdbc-1.2.1.2.3.2.0-2950-standalone.jar,hadoop-common-2.7.1.2.3.2.0-2950.jar",
After that, when I write a query in the Queries editor, it shows an error like:
Queries Editor: Error 400 Bad Request: Error running static method java.lang.IllegalArgumentException: Bad URL format at org.apache.hive.jdbc.Utils.parseURL(Utils.java:185) at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:84)
What is this error? Can anyone explain how to solve it?
I was able to connect after changing the jar versions.
I also changed the driver name to "org.apache.hive.jdbc.HiveDriver".
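For reference, the relevant datasource fields would then look something like this (host, port, and database name are assumptions; note that the HiveServer2 driver expects a jdbc:hive2:// URL):
"connection_string": "jdbc:hive2://localhost:10000/default",
"drivername": "org.apache.hive.jdbc.HiveDriver",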

Exporting an HSQLDB to XML using DBUnit results in null pointer errors

I'm trying to export the entire contents of my database, an HSQLDB, to XML using DBUnit, and I'm getting null pointer errors that I can't understand. I'm following the example in the FAQ:
// Wrap the JDBC connection, snapshot every table, and write the set as XML.
IDatabaseConnection xmlConnection = new DatabaseConnection(conn);
IDataSet allTables = xmlConnection.createDataSet();
XmlDataSet.write(allTables, new FileOutputStream(DATABASE_PATH + ".xml"));
The null pointer error occurs on the last line. conn and DATABASE_PATH aren't null: both are checked for that and used later in the program without a problem (exporting the database to CSV using OpenCSV, which works exactly as expected).
The stacktrace is as follows:
org.dbunit.dataset.DataSetException: java.sql.SQLException: java.lang.NullPointerException java.lang.NullPointerException
at org.dbunit.database.DatabaseDataSet.initialize(DatabaseDataSet.java:243)
at org.dbunit.database.DatabaseDataSet.getTableNames(DatabaseDataSet.java:272)
at org.dbunit.database.DatabaseDataSet.createIterator(DatabaseDataSet.java:258)
at org.dbunit.dataset.AbstractDataSet.iterator(AbstractDataSet.java:189)
at org.dbunit.dataset.stream.DataSetProducerAdapter.<init>(DataSetProducerAdapter.java:63)
at org.dbunit.dataset.xml.XmlDataSetWriter.write(XmlDataSetWriter.java:128)
at org.dbunit.dataset.xml.XmlDataSet.write(XmlDataSet.java:104)
at org.dbunit.dataset.xml.XmlDataSet.write(XmlDataSet.java:91)
at pms.DatabaseExporter.exportToXML(DatabaseExporter.java:181)
at pms.DatabaseExporter.main(DatabaseExporter.java:301)
Caused by: java.sql.SQLException: java.lang.NullPointerException java.lang.NullPointerException
at org.hsqldb.jdbc.Util.sqlException(Util.java:224)
at org.hsqldb.jdbc.JDBCStatement.fetchResult(JDBCStatement.java:1830)
at org.hsqldb.jdbc.JDBCStatement.executeQuery(JDBCStatement.java:181)
at org.hsqldb.jdbc.JDBCDatabaseMetaData.execute(JDBCDatabaseMetaData.java:6150)
at org.hsqldb.jdbc.JDBCDatabaseMetaData.getTables(JDBCDatabaseMetaData.java:3170)
at org.dbunit.database.DefaultMetadataHandler.getTables(DefaultMetadataHandler.java:137)
at org.dbunit.database.DatabaseDataSet.initialize(DatabaseDataSet.java:199)
... 9 more
Caused by: org.hsqldb.HsqlException: java.lang.NullPointerException
at org.hsqldb.error.Error.error(Error.java:108)
at org.hsqldb.result.Result.newErrorResult(Result.java:1069)
at org.hsqldb.StatementDMQL.execute(StatementDMQL.java:192)
at org.hsqldb.Session.executeCompiledStatement(Session.java:1315)
at org.hsqldb.Session.executeDirectStatement(Session.java:1206)
at org.hsqldb.Session.execute(Session.java:990)
at org.hsqldb.jdbc.JDBCStatement.fetchResult(JDBCStatement.java:1822)
... 14 more
Caused by: java.lang.NullPointerException
at org.hsqldb.types.CharacterType.compare(CharacterType.java:418)
at org.hsqldb.index.IndexAVL.compareRowForInsertOrDelete(IndexAVL.java:617)
at org.hsqldb.index.IndexAVLMemory.insert(IndexAVLMemory.java:214)
at org.hsqldb.persist.RowStoreAVL.indexRow(RowStoreAVL.java:171)
at org.hsqldb.persist.RowStoreAVLHybridExtended.indexRow(RowStoreAVLHybridExtended.java:99)
at org.hsqldb.Table.insertSys(Table.java:2625)
at org.hsqldb.dbinfo.DatabaseInformationMain.SYSTEM_TABLES(DatabaseInformationMain.java:2353)
at org.hsqldb.dbinfo.DatabaseInformationMain.generateTable(DatabaseInformationMain.java:348)
at org.hsqldb.dbinfo.DatabaseInformationFull.generateTable(DatabaseInformationFull.java:379)
at org.hsqldb.dbinfo.DatabaseInformationMain.setStore(DatabaseInformationMain.java:507)
at org.hsqldb.persist.PersistentStoreCollectionSession.getStore(PersistentStoreCollectionSession.java:138)
at org.hsqldb.Table.getRowStore(Table.java:2817)
at org.hsqldb.RangeVariable$RangeIteratorMain.<init>(RangeVariable.java:939)
at org.hsqldb.RangeVariable$RangeIteratorMain.<init>(RangeVariable.java:917)
at org.hsqldb.RangeVariable.getIterator(RangeVariable.java:770)
at org.hsqldb.QuerySpecification.buildResult(QuerySpecification.java:1293)
at org.hsqldb.QuerySpecification.getSingleResult(QuerySpecification.java:1245)
at org.hsqldb.QuerySpecification.getResult(QuerySpecification.java:1235)
at org.hsqldb.StatementQuery.getResult(StatementQuery.java:66)
at org.hsqldb.StatementDMQL.execute(StatementDMQL.java:190)
... 18 more
I've googled and couldn't find anything relating to this kind of error during export. I'm not that experienced with SQL or JDBC, so I'm hoping there's enough info in the stack trace for someone more knowledgeable to tell me what's going wrong. If there's some other library that would fit my needs better, I have no problem switching; export/import with XML is the only thing I need right now, so I'm not using DBUnit for anything else. Anyway, if anybody can tell me what's going wrong, or whether I ought to be using something else, I'd really appreciate it.
This is a bug in this particular version of HSQLDB's system table creation, which was spotted and corrected recently. You can try the updated HSQLDB jar from http://hsqldb.org/support/
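If upgrading the jar is not an option right away, one possible workaround is to avoid the full metadata scan by naming the tables to export explicitly with DBUnit's QueryDataSet. This is only a sketch, and it assumes you can list your table names yourself:

import java.io.FileOutputStream;
import org.dbunit.database.DatabaseConnection;
import org.dbunit.database.IDatabaseConnection;
import org.dbunit.database.QueryDataSet;
import org.dbunit.dataset.xml.XmlDataSet;

// Naming tables explicitly skips the DatabaseMetaData.getTables() call
// where the NullPointerException above originates, so it may sidestep the bug.
IDatabaseConnection xmlConnection = new DatabaseConnection(conn);
QueryDataSet dataSet = new QueryDataSet(xmlConnection);
dataSet.addTable("MY_TABLE"); // hypothetical table name; add one call per table
XmlDataSet.write(dataSet, new FileOutputStream(DATABASE_PATH + ".xml"));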