Are SSL and Kerberos compatible with each other on Hive Server?

My Hive server is both SSL and Kerberos enabled. But when I try to connect to HiveServer2 via beeline using the following command:
!connect jdbc:hive2://<hostnameOfServer>:10000/hive;ssl=true;sslTrustStore=<keystorePath>;trustStorePassword=<password for keystore>;principal=<Kerberos hive principal> <database username> <database password> org.apache.hive.jdbc.HiveDriver
I get the following error:
Error: Could not open client transport with JDBC Uri: jdbc:hive2://<hostnameOfServer>:10000/hive;ssl=true;sslTrustStore=<keystorePath>;trustStorePassword=<password for keystore>;principal=<Kerberos hive principal> <database username> <database password> org.apache.hive.jdbc.HiveDriver: Invalid status 21 (state=08S01,code=0)
I also tried the following command in beeline:
jdbc:hive2://<hostnameOfServer>:10000/hive;principal=<Kerberos hive principal>?transportMode=https;httpPath=cliservice;auth=kerberos;sasl.qop=auth
But I got the same error.
Are SSL and Kerberos compatible with each other?

Yes, they are compatible as of Hive 2.0.0. See the JIRA task below for more information:
https://issues.apache.org/jira/browse/HIVE-14019
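For example, a beeline connect string of roughly this shape should work on a server with both SSL and Kerberos enabled (all bracketed values are placeholders, and you would normally run kinit first, so no database username/password arguments are needed):
!connect jdbc:hive2://<hostnameOfServer>:10000/hive;ssl=true;sslTrustStore=<truststorePath>;trustStorePassword=<truststore password>;principal=<Kerberos hive principal>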

Related

ODBC Hive, Credential cache is empty

I got the error "Credential cache is empty" during ODBC Hive tests. See the full error detail:
ODBC Hive - Test Results
[Cloudera][Hardy] (34) Error from server: SASL(-1): generic failure: GSSAPI Error: Unspecified GSS failure. Minor code may provide more information (Credential cache is empty).
Do you have any experience with this?
I tested different settings of MIT Kerberos on Windows, e.g.:
generated a Kerberos ticket: kinit.exe -k -t app_store.keytab app_store@HW.PROD.BDP
checked the Kerberos tickets in the cache: klist.exe
set KRB5CCNAME=C:\cache\krb5cache and KRB5_CONFIG=c:\ProgramData\MIT\Kerberos5\krb5.ini
I see a few possible issues:
You have to check that krb5cache is a file (not a directory); this is an important point.
The path to the cache has to be different for each user, so use this setting for the variable: KRB5CCNAME=%USERPROFILE%\krb5cache
You have to generate a Kerberos ticket before running the ODBC test, as in the sequence shown below:
kinit.exe -k -t "c:\Apps\MIT\Kerberos\store.keytab" store@HW.PROD.BDP
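Putting these points together, a typical sequence in a Windows command prompt looks roughly like this (paths, keytab, and principal are taken from the question; adjust them to your installation):
set KRB5CCNAME=%USERPROFILE%\krb5cache
set KRB5_CONFIG=C:\ProgramData\MIT\Kerberos5\krb5.ini
kinit.exe -k -t "c:\Apps\MIT\Kerberos\store.keytab" store@HW.PROD.BDP
klist.exe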

Configuring ODBC 32-bit on Windows 11 is failing

Trying to configure the ODBC driver on Windows returns the following message:
[Simba][DriverSupport] (1120) SSL verification failed because the server host name specified for the connection does not match the 'CN' entry in the 'Subject' field or any of the 'DNS Name' entries of the 'Subject Alternative Name' field in the server certificate.
I am using the default cacerts.pem file.
What step am I missing?

Could not start SASL when connecting to Hive with LDAP

When connecting to a Hive server without authentication, it works fine, like this:
from impala.dbapi import connect  # assuming impyla, per the issue linked below
conn = connect(host='host.without.authenticate.', port=xxx, database=xxx, auth_mechanism='PLAIN')
When connecting to a Hive server with LDAP authentication as follows, the SASL error occurs:
Could not start SASL: Error in sasl_client_start (-4) SASL(-4)
I have installed sasl and thrift-sasl, and I can log in to Hive from the shell:
conn = connect(host='host.with.ldap.authenticate.', port=xxx, database=xxx, auth_mechanism='LDAP', user=xxx, password=xxx)
Configuration: Ubuntu 14, Python 2.7
I have looked at the issue https://github.com/cloudera/impyla/issues/149 but found no applicable fix there.
I don't know what's wrong, and I would appreciate your answers.

How to connect spark application to secure HBase with Kerberos

I'm trying to connect a Spark application to HBase with Kerberos enabled. The Spark version is 1.5.0, CDH 5.5.2, and it is executed in YARN cluster mode.
When the HBaseContext is initialized, it throws this error:
ERROR ipc.AbstractRpcClient: SASL authentication failed. The most likely cause is missing or invalid credentials. Consider 'kinit'.
javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
I have tried to do the authentication in the code, adding:
UserGroupInformation.setConfiguration(config)
UserGroupInformation.loginUserFromKeytab(principalName, keytabFilename)
I distribute the keytab file with the --files option of spark-submit.
Now, the error is:
java.io.IOException: Login failure for usercomp@COMPANY.CORP from keytab krb5.usercomp.keytab: javax.security.auth.login.LoginException: Unable to obtain password from user
...
Caused by: javax.security.auth.login.LoginException: Unable to obtain password from user
at com.sun.security.auth.module.Krb5LoginModule.promptForPass(Krb5LoginModule.java:856)
Is this the way to connect to Kerberized HBase from a Spark app?
Please see the example configuration below in case you are missing anything, like hadoop.security.authentication:
val conf = HBaseConfiguration.create()
conf.set("hbase.zookeeper.quorum", "list of ip's")
conf.set("hbase.zookeeper.property.clientPort", "2181")
conf.set("hbase.master", "masterIP:60000")
conf.set("hadoop.security.authentication", "kerberos")
Try putting your hbase-site.xml directly in the SPARK_CONF directory of your edge node (it should be something like /etc/spark/conf or /etc/spark2/conf).
You can use loginUserFromKeytabAndReturnUGI and ugi.doAs, as sketched below.
Or you could add your HBase classpath to SPARK_DIST_CLASSPATH.
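Here is a minimal Scala sketch of that doAs approach, assuming the principal and keytab name from the question (the body of run() is a placeholder for your HBase calls):
import java.security.PrivilegedExceptionAction
import org.apache.hadoop.hbase.HBaseConfiguration
import org.apache.hadoop.security.UserGroupInformation

val conf = HBaseConfiguration.create() // also reads hbase-site.xml if it is on the classpath
UserGroupInformation.setConfiguration(conf)

// Log in from the keytab shipped via --files, then run the HBase calls as that user
val ugi = UserGroupInformation.loginUserFromKeytabAndReturnUGI(
  "usercomp@COMPANY.CORP", "krb5.usercomp.keytab")
ugi.doAs(new PrivilegedExceptionAction[Unit] {
  override def run(): Unit = {
    // initialize the HBaseContext / open HBase connections here,
    // inside the logged-in security context
  }
})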

HiveServer2 not picking up the right Kerberos Principal while starting up

We have put these entries in hive-site.xml:
hive.server2.authentication : KERBEROS
hive.server2.authentication.kerberos.keytab : /tmp/hive.keytab
hive.server2.authentication.kerberos.principal : hive/<FQDN of the hive VM>@xxxxxxxx.COM
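In hive-site.xml itself these map to property blocks like the following (a sketch with the values from the question; note that Hive also accepts hive/_HOST@REALM, which it expands to the local FQDN at startup):
<property>
  <name>hive.server2.authentication</name>
  <value>KERBEROS</value>
</property>
<property>
  <name>hive.server2.authentication.kerberos.keytab</name>
  <value>/tmp/hive.keytab</value>
</property>
<property>
  <name>hive.server2.authentication.kerberos.principal</name>
  <value>hive/<FQDN of the hive VM>@xxxxxxxx.COM</value>
</property>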
Using the kinit command on the Hive VM, we have verified that the Kerberos principal and the keytab file are valid:
kinit -t FILE:/tmp/hive.keytab -k hive/<FQDN of the hive VM>@xxxxxxxx.COM
Then if we do,
klist
it shows the same principal as the default principal in the ticket cache.
But, when we try to start the HiveServer2 using :
sudo service hive-server2 start
it throws the exception :
Starting HiveServer2
javax.security.auth.login.LoginException: Kerberos principal should have 3 parts: hive
at org.apache.hive.service.auth.HiveAuthFactory.getAuthTransFactory(HiveAuthFactory.java:127)
at org.apache.hive.service.cli.thrift.ThriftCLIService.run(ThriftCLIService.java:505)
at java.lang.Thread.run(Thread.java:679)
When we try to start the service (using ./hiveserver2) as any other logged-in user, say User123, it throws the same exception with:
Starting HiveServer2
javax.security.auth.login.LoginException: Kerberos principal should have 3 parts: User123
at org.apache.hive.service.auth.HiveAuthFactory.getAuthTransFactory(HiveAuthFactory.java:127)
at org.apache.hive.service.cli.thrift.ThriftCLIService.run(ThriftCLIService.java:505)
at java.lang.Thread.run(Thread.java:679)
Shouldn't the Kerberos principal be picked up from hive-site.xml rather than from the login user? Are we missing something? (A full principal has three parts, primary/instance@REALM, e.g. hive/<FQDN of the hive VM>@xxxxxxxx.COM; the single-part 'hive' or 'User123' fails that check.)
--
I have created a principal hive/<FQDN of the hive VM>@xxxxxxxx.COM in advance and created a keytab file for it.
We are on CDH 4.7 (not installed using CM), OEL 6, and Kerberos 5.
Kerberos security should be configured for HDFS and MapReduce too, and not just Hive.
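A minimal sketch of the corresponding core-site.xml entries (these are the standard Hadoop property names; the values must match your cluster's setup):
<property>
  <name>hadoop.security.authentication</name>
  <value>kerberos</value>
</property>
<property>
  <name>hadoop.security.authorization</name>
  <value>true</value>
</property>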