SAP HANA Studio's Remote Source has stopped working - Hive

My Hive remote source in SAP HANA Studio has stopped working. The error it shows is the following:
SAP DBTech JDBC: [403]: internal error: Cannot get remote source objects: [unixODBC][Driver Manager]Data source name not found, and no default driver specified
On the SAP HANA machine, I have the following configuration in /etc/odbc.ini:
[HIVE]
Description=Hortonworks Hive ODBC Driver (64-bit) DSN
Driver=/usr/lib/hive/lib/native/Linux-amd64-64/libhortonworkshiveodbc64.so
HOST="My AWS IP instance"
PORT=10000
Schema=default
ServiceDiscoveryMode=0
ZKNamespace=
HiveServerType=2
AuthMech=3
ThriftTransport=1
UseNativeQuery=0
UID=hive
PWD=hive
KrbHostFQDN=_HOST
KrbServiceName=hive
KrbRealm=
SSL=0
TwoWaySSL=0
ClientCert=
ClientPrivateKey=
ClientPrivateKeyPassword=
When I run isql, it connects successfully and I can run queries successfully too.
What could be causing the error?
Thanks for the support!

You may want to check that the ODBC.ini for the <sid>adm OS user is still configured correctly. "Data source name not found" sounds a lot like "there is no ODBC.ini entry for that".

Related

Database can't be opened

I get this error message whenever I try to connect to my SQL Server database using Visual Studio:
The database dbPath cannot be opened because it is version 869. This server supports version 852 and earlier. A downgrade path is not supported.
Could not open new database dbPath. CREATE DATABASE is aborted.
An attempt to attach an auto-named database for file dbPath failed. A database with the same name exists, or specified file cannot be opened, or it is located on UNC share.
Can you please tell me what I should do to fix this error?

Why can't "R CMD INSTALL" find the ODBC driver manager when installing the R package "RODBC"?

I am trying to connect to a Vertica DB from R using the "RODBC" package. The machine I am using is a remote server with no direct internet access, so I basically "transfer" all source files from my local machine to the remote server to build the system. To give you a clear context, I am listing all my steps in attempting to install the "RODBC" package below:
Step 1 - I downloaded the RODBC_1.3-13.tar.gz source file for RODBC and tried to install it directly with "R CMD INSTALL". However, I encountered the error "ODBC headers sql.h and sqlext.h not found".
Step 2 - After some research, I found that installing "unixodbc-dev" would potentially solve this issue. Therefore, I downloaded all the needed dependencies for "unixodbc-dev" and transferred them to the server.
With those in place, I successfully installed "unixodbc-dev".
However, another error appeared when I tried to re-install "RODBC" using "sudo R CMD INSTALL /home/mli/RODBC_1.3-13.tar.gz"; this time it returns "no ODBC driver manager found".
As the message indicates, the installation program can't locate my ODBC driver manager. So I downloaded "vertica-client-7.2.3-0.x86_64.tar.gz" and unzipped it on the server.
So now my question is: how can I customize the "R CMD INSTALL" command, say with some configure parameters, to point the installation program at the driver manager? Or am I even going in the right direction? Please let me know. Any help would be really appreciated! :)
ADDITION:
I have also tried JDBC: I successfully loaded the "RJDBC" package in R and used the JDBC driver from vertica-client-7.2.3-0.x86_64.tar.gz. I also already have "rJava" installed. However, I still got an error when I tried to make the connection. I am listing my results below:
I successfully installed "RJDBC" with "$ R CMD INSTALL RJDBC_0.2-5.tar.gz --library=/usr/local/lib/R/site-library/" and then tried the following script in R. All the lines executed successfully except line 16, where I make the connection.
Based on the error message, I assumed the version of the JDBC driver I was using is too new for the Vertica server. So I tried an older JDBC driver instead, "vertica-jdk5-6.1.0-0.jar", which I downloaded from this link: http://www.java2s.com/Code/Jar/v/Downloadverticajdk56100jar.htm
I moved the file "vertica-jdk5-6.1.0-0.jar" to my home directory on the server and changed the JDBC driver path in the R script.
As you can see, it still returns the error "FATAL: Unsupported frontend protocol 3.6: server supports 3.0 to 3.5". Am I doing this right? Or is there an issue with the new driver I downloaded? How can I make it work? Any help will be really appreciated! Thanks!
A few things:
First, just do sudo apt-get install r-cran-rodbc. The package was created (by yours truly) in no small part because dealing with unixODBC or iODBC is not fun. But even once you have that, you still need the ODBC driver for Linux from Vertica. And that part is fiddly.
Second, I just did something similar the other day but just used JDBC, which worked. You do of course need sudo apt-get install r-cran-rjava which has its own can of worms (but I already mentioned Java...) Still, maybe try that instead?
Third, you can cheat and just use psql pointed to the Vertica port (usually one above the PostgreSQL port).
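For the JDBC route mentioned in the second point, what RJDBC's dbConnect ends up doing is an ordinary java.sql connection, so the "Unsupported frontend protocol" error is simply the driver and server versions disagreeing; matching the JDBC jar to the Vertica server release is the fix either way. A minimal sketch of that connection, assuming the Vertica JDBC jar is on the classpath and using placeholder host, database and credentials (com.vertica.jdbc.Driver and the jdbc:vertica:// URL format are what the Vertica JDBC jar uses; 5433 is the usual Vertica port, one above PostgreSQL's):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class VerticaJdbcSketch {
    public static void main(String[] args) throws Exception {
        // Register the Vertica JDBC driver (the jar must be on the classpath).
        Class.forName("com.vertica.jdbc.Driver");
        // Host, database name and credentials below are placeholders (assumptions).
        String url = "jdbc:vertica://vertica-host:5433/mydb";
        try (Connection conn = DriverManager.getConnection(url, "dbadmin", "password");
             Statement st = conn.createStatement();
             ResultSet rs = st.executeQuery("SELECT version()")) {
            while (rs.next()) {
                // Prints the server version, which also tells you whether the
                // driver jar you picked is a sensible match for the server.
                System.out.println(rs.getString(1));
            }
        }
    }
}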

Pentaho connecting to Cloudera Impala error

I'm trying to set up a connection in Pentaho Data Integrator to Cloudera Impala, but I keep getting the following error:
Driver class 'org.apache.hive.jdbc.ImpalaSimbaDriver' could not be found, make sure the 'Cloudera Impala' driver (jar file) is installed.
Could not initialize class org.apache.hive.jdbc.ImpalaSimbaDriver
org.pentaho.di.core.exception.KettleDatabaseException:
Error occurred while trying to connect to the database
Driver class 'org.apache.hive.jdbc.ImpalaSimbaDriver' could not be found, make sure the 'Cloudera Impala' driver (jar file) is installed.
Could not initialize class org.apache.hive.jdbc.ImpalaSimbaDriver
I downloaded the driver from Cloudera and put all the files in the lib folder and the CDH54 folder. I also added the CLASSPATH environment variable, pointing it to the folder containing my .jar files.
The connection from Tableau to Impala works fine, and I can also connect from SQL Workbench to the distribution.
Am I missing something?
Thank you for any response.
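One quick way to narrow this down is a tiny standalone check run with the same jars on the classpath: "could not be found" means the class isn't on the classpath at all, while "Could not initialize class" usually means the class was found but one of its dependency jars is missing. This is only a sketch; org.apache.hive.jdbc.ImpalaSimbaDriver is the name from the error message, and com.cloudera.impala.jdbc41.Driver is an assumption about what the Cloudera jar itself exposes:

public class ImpalaDriverCheck {
    public static void main(String[] args) {
        String[] candidates = {
            "org.apache.hive.jdbc.ImpalaSimbaDriver",  // class name from the Pentaho error
            "com.cloudera.impala.jdbc41.Driver"        // assumed class in the Cloudera Impala JDBC 4.1 jar
        };
        for (String name : candidates) {
            try {
                Class.forName(name);
                System.out.println("Found and initialized: " + name);
            } catch (ClassNotFoundException e) {
                System.out.println("Not on the classpath: " + name);
            } catch (NoClassDefFoundError e) {
                // The class itself is present, but a jar it depends on is missing.
                System.out.println("Found but failed to initialize: " + name + " (" + e.getMessage() + ")");
            }
        }
    }
}

If the first name fails here as well, the jar is most likely not in the folder Pentaho actually scans, since it loads drivers from its own lib directory rather than from the CLASSPATH environment variable.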

DBVisualizer and HIVE

I am using DBVisualizer 9.2 and Cloudera 5.4.1
I want to set up DBVisualizer so that I can query the Hive database from the tool.
I downloaded the jdbc driver for HIVE from here
http://www.cloudera.com/downloads/connectors/hive/jdbc/2-5-16.html
I extracted all the jar files into /Users/User1/.dbvis/jdbc
But now, when I start DBVisualizer, I get an error:
Ignored as there is no matching Default Driver for "com.cloudera.hive.jdbc41.HS1Driver", "com.cloudera.hive.jdbc41.HS2Driver"
/Users/User1/.dbvis/jdbc
HiveJDBC41.jar
TCLIServiceClient.jar
hive_metastore.jar
hive_service.jar
libfb303-0.9.0.jar
libthrift-0.9.0.jar
log4j-1.2.14.jar
ql.jar
slf4j-api-1.5.11.jar
slf4j-log4j12-1.5.11.jar
zookeeper-3.4.6.jar
So my question is, has anyone successfully configured the DBVisualizer tool to connect to cloudera hive server?
After several hours of troubleshooting, I was able to resolve the error and successfully connect to Hive from DBVisualizer using the Hive JDBC driver from Cloudera.
These are the steps I took:
First, go to Tools -> Tool Properties -> Driver finder paths.
Here, register a new empty directory. This will be the place where you put all your jars.
In this directory, extract all the JAR files which come with the Cloudera Hive JDBC driver:
http://www.cloudera.com/downloads/connectors/hive/jdbc/2-5-4.html
Now go to Tools -> Driver Manager and select Hive. In the "User Specified" tab, click the folder icon on the right-hand side and select all the jar files you just unzipped (not just the folder... select all the jars).
Make sure you select com.cloudera.hive.jdbc41.HS2Driver
Now define connection to Hive using these parameters
url: jdbc:hive2://foo:10000/default
user: admin
password: admin
Now when I tried to connect, I still got errors.
"Type: java.lang.reflect.UndeclaredThrowableException"
In order to resolve the above, you need to look at the error log (this was the most important step).
Tools -> Debug Window -> Error log
Here I saw that the mysterious "UndeclaredThrowableException" was occurring because a bunch of classes from jars like httpclient, httpcore, hadoop-core, hive-exec and hive-service were missing. I downloaded these jars from Maven Central:
hadoop-core-0.20.2.jar
hive-exec-2.0.0.jar
hive-service-1.1.1.jar
httpclient-4.5.2.jar
httpcore-4.4.4.jar
Again, I went to Tools -> Driver Manager -> Hive -> User Specified, clicked the folder icon on the right-hand side and selected each of these jars as well.
Now, after restarting DBVisualizer, I can connect to Hive just fine and query it from the tool.
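For reference, the connection DBVisualizer makes with this setup is a plain HiveServer2 JDBC call. Here is a minimal standalone sketch using the same driver class, URL and credentials as in the steps above, assuming all of the jars listed there are on the classpath (depending on the cluster's authentication setup, the Cloudera driver may also need an AuthMech property appended to the URL):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class HiveJdbcSketch {
    public static void main(String[] args) throws Exception {
        // Same driver class and URL as in the DBVisualizer configuration above;
        // "foo" and the admin/admin credentials come from those steps.
        Class.forName("com.cloudera.hive.jdbc41.HS2Driver");
        String url = "jdbc:hive2://foo:10000/default";
        try (Connection conn = DriverManager.getConnection(url, "admin", "admin");
             Statement st = conn.createStatement();
             ResultSet rs = st.executeQuery("SHOW TABLES")) {
            while (rs.next()) {
                System.out.println(rs.getString(1));
            }
        }
    }
}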

Cannot open database because of Version error

I am getting the following error when I attempt these actions in sequence:
Start Debugging.
Log in as a User.
Stop Debugging.
Clean Build.
Without closing the browser or logging out, start Debugging again.
Attempt to Edit or Create or Update some database record.
If the user is not logged in or the browser is closed, the error does not occur. I have even upgraded the SQL instance, following "SQL Server: attach incorrect version 661".
The database 'C:\USERS\ME\DOCUMENTS\VISUAL STUDIO 2012\PROJECTS\PROJ1\PROJ1\APP_DATA\ASPNETDB.MDF' cannot be opened because it is version 661. This server supports version 662 and earlier. A downgrade path is not supported.
Could not open new database 'C:\USERS\ME\DOCUMENTS\VISUAL STUDIO 2012\PROJECTS\PROJ1\PROJ1\APP_DATA\ASPNETDB.MDF'. CREATE DATABASE is aborted.
An attempt to attach an auto-named database for file C:\USERS\ME\DOCUMENTS\VISUAL STUDIO 2012\PROJECTS\PROJ1\PROJ1\APP_DATA\ASPNETDB.MDF failed. A database with the same name exists, or specified file cannot be opened, or it is located on UNC share.
I am testing this on Chrome.
Any ideas on what area of the MVC code, SQL, or Web.config I should look at to find the cause of this error? I haven't changed the default Membership that is created with VS2012 when you start a new project.