Could not start SASL when connecting to Hive with LDAP - ldap

When connecting to a HiveServer2 instance without authentication, it works fine, like this:
conn = connect(host='host.without.authenticate.', port=xxx, database=xxx, auth_mechanism='PLAIN')
When connecting to a HiveServer2 instance with LDAP authentication, as follows, the SASL error occurs:
conn = connect(host='host.with.ldap.authenticate.', port=xxx, database=xxx, auth_mechanism='LDAP', user=xxx, password=xxx)
Could not start SASL: Error in sasl_client_start (-4) SASL(-4)
I have installed sasl and thrift-sasl, and I can log in to Hive from the shell.
Configuration: Ubuntu 14, Python 2.7
I have looked at the issue https://github.com/cloudera/impyla/issues/149 but found no applicable fix.
I don't know what's wrong, and I would appreciate your answers.

Related

Are SSL and Kerberos compatible with each other on Hive Server?

My Hive server is both SSL- and Kerberos-enabled. But when I try to connect to HiveServer2 via beeline using the following command:
!connect jdbc:hive2://<hostnameOfServer>:10000/hive;ssl=true;sslTrustStore=<keystorePath>;trustStorePassword=<password for keystore>;principal=<Kerberos hive principal> <database username> <database password> org.apache.hive.jdbc.HiveDriver
I get the following error:
Error: Could not open client transport with JDBC Uri: jdbc:hive2://hostnameOfServer:10000/hive;ssl=true;sslTrustStore=keystorePath;trustStorePassword=passwordfor keystore;principal=Kerberos hive principal database username database password org.apache.hive.jdbc.HiveDriver: Invalid status 21 (state=08S01,code=0)
I also tried the following command in beeline:
jdbc:hive2://<hostnameOfServer>:10000/hive;principal=<Kerberos hive principal>?transportMode=https;httpPath=cliservice;auth=kerberos;sasl.qop=auth
But got same error.
Are SSL and Kerberos compatible with each other?
Yes, they are compatible as of Hive 2.0.0. Check the JIRA task below for more information:
https://issues.apache.org/jira/browse/HIVE-14019
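For reference, here is a minimal Java sketch (an illustration, not a verified configuration) of what a combined SSL + Kerberos connection can look like through the plain JDBC driver; the hostname, truststore path, truststore password, and principal below are placeholders:
import java.sql.Connection;
import java.sql.DriverManager;

public class HiveSslKerberosSketch {
    public static void main(String[] args) throws Exception {
        // Placeholder values: substitute your own host, truststore and server principal.
        String url = "jdbc:hive2://hostnameOfServer:10000/hive"
                + ";ssl=true"
                + ";sslTrustStore=/path/to/truststore.jks"
                + ";trustStorePassword=changeit"
                + ";principal=hive/_HOST@EXAMPLE.COM";
        Class.forName("org.apache.hive.jdbc.HiveDriver");
        // With Kerberos, the ticket obtained via kinit is used; no user/password is passed here.
        try (Connection conn = DriverManager.getConnection(url)) {
            System.out.println("Connected: " + !conn.isClosed());
        }
    }
}
The same URL (without the !connect prefix) is what beeline expects, so it can be handy to verify the parameters with plain JDBC first.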

How to connect spark application to secure HBase with Kerberos

I'm trying to connect a Spark application to HBase with Kerberos enabled. The Spark version is 1.5.0 on CDH 5.5.2, and it is executed in yarn-cluster mode.
When the HBaseContext is initialized, it throws this error:
ERROR ipc.AbstractRpcClient: SASL authentication failed. The most likely cause is missing or invalid credentials. Consider 'kinit'.
javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
I have tried to do the authentication in the code, adding:
UserGroupInformation.setConfiguration(config)
UserGroupInformation.loginUserFromKeytab(principalName, keytabFilename)
I distribute the keytab file with the --files option of spark-submit.
Now, the error is:
java.io.IOException: Login failure for usercomp@COMPANY.CORP from keytab krb5.usercomp.keytab: javax.security.auth.login.LoginException: Unable to obtain password from user
...
Caused by: javax.security.auth.login.LoginException: Unable to obtain password from user
at com.sun.security.auth.module.Krb5LoginModule.promptForPass(Krb5LoginModule.java:856)
Is this the way to connect to Kerberized HBase from a Spark app?
Please see the example configuration below in case you are missing anything such as hadoop.security.authentication:
val conf = HBaseConfiguration.create()
conf.set("hbase.zookeeper.quorum", "list of ip's")
conf.set("hbase.zookeeper.property.clientPort", "2181")
conf.set("hbase.master", "masterIP:60000")
conf.set("hadoop.security.authentication", "kerberos")
Actually try to put your hbase-site.xml directly in the SPARK_CONF directory of your edge node (should be something like /etc/spark/conf or /etc/spark2/conf).
You can use loginUserFromKeytabAndReturnUGI and then run the HBase calls inside ugi.doAs(...),
or you could add your HBase classpath to SPARK_DIST_CLASSPATH.
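For illustration, a rough Java sketch of the loginUserFromKeytabAndReturnUGI / doAs approach mentioned above (the principal, keytab path and ZooKeeper quorum are placeholders, and error handling is omitted):
import java.security.PrivilegedExceptionAction;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.security.UserGroupInformation;

public class KerberizedHBaseSketch {
    public static void main(String[] args) throws Exception {
        final Configuration conf = HBaseConfiguration.create();
        conf.set("hbase.zookeeper.quorum", "zk-host1,zk-host2");   // placeholder quorum
        conf.set("hadoop.security.authentication", "kerberos");
        conf.set("hbase.security.authentication", "kerberos");
        UserGroupInformation.setConfiguration(conf);

        // Placeholder principal and keytab path; a keytab shipped with --files lands in the container's working directory.
        UserGroupInformation ugi = UserGroupInformation.loginUserFromKeytabAndReturnUGI(
                "usercomp@COMPANY.CORP", "krb5.usercomp.keytab");

        // Run the HBase calls as the freshly logged-in user.
        ugi.doAs(new PrivilegedExceptionAction<Void>() {
            @Override
            public Void run() throws Exception {
                try (Connection connection = ConnectionFactory.createConnection(conf)) {
                    // ... connection.getTable(...), scans, puts, etc.
                }
                return null;
            }
        });
    }
}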

OpenDJ multi-master replication fails (hangs at the "Initializing registration information" step): javax.naming.AuthenticationException

I am using OpenDJ 2.4.6 along with Oracle JDK 7.80, and I want to run multi-master replication on 2 of my servers; the OS for these servers is Amazon Linux.
The OpenDJ setup runs perfectly fine; I can start the server too without any errors.
The problem occurs when I run the "dsreplication" script as follows:
./dsreplication enable --host1 server1.example.com --port1 4444 --bindDN1 "cn=Directory Manager" --bindPassword1 "Passw0rd" --replicationPort1 1388 --host2 server2.example.com --port2 4444 --bindDN2 "cn=Directory Manager" --bindPassword2 "Passw0rd" --replicationPort2 1388 --adminUID admin --adminPassword "Passw0rd" --baseDN "dc=example,dc=com"
the script hangs on the following step:
Initializing registration information on server server2.example.com:4444 with the contents of server server1.example.com:4444 .....
And on checking the logs, no error is reported there.
But, when I run the following command:
./dsreplication status -h localhost -p 4444 --adminUID admin --adminPassword "Passw0rd" -X
it throws the following error:
The displayed information might not be complete because the following
errors were encountered reading the configuration of the existing
servers: Error on server2.example.com:4444: An error occurred
connecting to the server. Details:
javax.naming.AuthenticationException: [LDAP: error code 49 - Invalid
Credentials] Error on server:4444: An error occurred connecting to the
server. Details: javax.naming.AuthenticationException: [LDAP: error
code 49 - Invalid Credentials]
Please help me.
Thanks in advance.
The error could not be more explicit: "Invalid Credentials" on server 2.
Check the bindDN and bindPassword are valid against server 2.
When doing replication with OpenDJ, the hostnames must be resolvable and addressable from either machine. Have you checked that this is the case with your Amazon Linux servers?

Spring AMQP + RabbitMQ 3.3.5 ACCESS_REFUSED - Login was refused using authentication mechanism PLAIN

I am getting the exception below:
org.springframework.amqp.AmqpAuthenticationException: com.rabbitmq.client.AuthenticationFailureException: ACCESS_REFUSED - Login was refused using authentication mechanism PLAIN. For details see the broker logfile.
Configuration: RabbitMQ 3.3.5 on Windows
In the config file at %APPDATA%\RabbitMQ\rabbit.config,
I have made the change below as per https://www.rabbitmq.com/access-control.html:
[{rabbit, [{loopback_users, []}]}].
I also tried creating a user/password (test/test), but that doesn't seem to make it work.
I tried the steps from this post as well.
Other configuration details are as below:
Tomcat-hosted Spring application context:
<!-- Rabbit MQ configuration Start -->
<!-- Connection Factory -->
<rabbit:connection-factory id="rabbitConnFactory" virtual-host="/" username="guest" password="guest" port="5672"/>
<!-- Spring AMQP Template -->
<rabbit:template id="rabbitTemplate" connection-factory="rabbitConnFactory" routing-key="ecl.down.queue" queue="ecl.down.queue" />
<!-- Spring AMQP Admin -->
<rabbit:admin id="admin" connection-factory="rabbitConnFactory"/>
<rabbit:queue id="ecl.down.queue" name="ecl.down.queue" />
<rabbit:direct-exchange name="ecl.down.exchange">
<rabbit:bindings>
<rabbit:binding key="ecl.down.key" queue="ecl.down.queue"/>
</rabbit:bindings>
</rabbit:direct-exchange>
In my controller class:
@Autowired
RmqMessageSender rmqMessageSender;
//Inside a method
rmqMessageSender.submitToECLDown(orderInSession.getOrderNo());
In my message sender:
import org.springframework.amqp.core.AmqpTemplate;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Component;
@Component("messageSender")
public class RmqMessageSender {
@Autowired
AmqpTemplate rabbitTemplate;
public void submitToRMQ(String orderId){
try{
rabbitTemplate.convertAndSend("Hello World");
} catch (Exception e){
LOGGER.error(e.getMessage());
}
}
}
The catch block above logs the following exception:
org.springframework.amqp.AmqpAuthenticationException: com.rabbitmq.client.AuthenticationFailureException: ACCESS_REFUSED - Login was refused using authentication mechanism PLAIN. For details see the broker logfile.
Error Log
=ERROR REPORT==== 7-Nov-2014::18:04:37 ===
closing AMQP connection <0.489.0> (10.1.XX.2XX:52298 -> 10.1.XX.2XX:5672):
{handshake_error,starting,0,
{amqp_error,access_refused,
"PLAIN login refused: user 'guest' can only connect via localhost",
'connection.start_ok'}}
Please find below the pom.xml entries:
<dependency>
<groupId>org.springframework.amqp</groupId>
<artifactId>spring-rabbit</artifactId>
<version>1.3.6.RELEASE</version>
</dependency>
<dependency>
<groupId>org.springframework.integration</groupId>
<artifactId>spring-integration-amqp</artifactId>
<version>4.0.4.RELEASE</version>
</dependency>
Please let me know if you have any thoughts/suggestions
What Artem Bilan has explained here might well be one of the reasons for this error:
Caused by: com.rabbitmq.client.AuthenticationFailureException:
ACCESS_REFUSED - Login was refused using authentication mechanism PLAIN.
For details see the broker logfile.
but the solution for me was to log in to the RabbitMQ admin page (http://localhost:15672/#/users) with the default username and password (guest/guest), add a new user, grant that new user permission to access the virtual host, and then use the new username and password instead of the default guest. That cleared the error.
To complete @cpu-100's answer,
in case you don't want to enable/use the web interface, you can create new credentials on the command line as below and use them in your code to connect to RabbitMQ.
$ rabbitmqctl add_user YOUR_USERNAME YOUR_PASSWORD
$ rabbitmqctl set_user_tags YOUR_USERNAME administrator
$ rabbitmqctl set_permissions -p / YOUR_USERNAME ".*" ".*" ".*"
user 'guest' can only connect via localhost
That's true since RabbitMQ 3.3.x. Hence you should upgrade the client library to the same version, or just upgrade Spring AMQP to the latest version (if you use a dependency management system).
Previous versions of the client used 127.0.0.1 as the default value for the host option of ConnectionFactory.
The error
ACCESS_REFUSED - Login was refused using authentication mechanism PLAIN. For details see the broker logfile.
can occur if the credentials that your application is trying to use to connect to RabbitMQ are incorrect or missing.
I had this happen when the RabbitMQ credentials stored in my ASP.NET application's web.config file had a value of "" for the password instead of the actual password string value.
To allow guest access remotely, write this
[{rabbit, [{loopback_users, []}]}].
to here
c:\Users\[your user name]\AppData\Roaming\RabbitMQ\rabbitmq.config
then restart the RabbitMQ Windows service (source: https://www.rabbitmq.com/access-control.html).
On localhost, by default, use 'amqp://guest:guest@localhost:5672'.
On a remote or hosted RabbitMQ, let's say you have the following credentials:
username: niceboy
password: notnice
host: goxha.com
port : 1597
then the URI you should pass will be
amqp://niceboy:notnice@goxha.com:1597
following the template amqp://user:pass@host:port
If you have a vhost, you can use amqp://user:pass@host:port/vhost, where the trailing segment is the name of your vhost.
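If you are using the RabbitMQ Java client, the whole URI can also be passed in a single call; here is a minimal sketch reusing the example credentials above (the vhost name is a placeholder):
import com.rabbitmq.client.Connection;
import com.rabbitmq.client.ConnectionFactory;

public class AmqpUriSketch {
    public static void main(String[] args) throws Exception {
        ConnectionFactory factory = new ConnectionFactory();
        // URI built from the example above; replace user, password, host, port and vhost with your own.
        factory.setUri("amqp://niceboy:notnice@goxha.com:1597/somevhost");
        Connection connection = factory.newConnection();
        // ... create channels, declare queues, publish messages, etc.
        connection.close();
    }
}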
New solution:
The Node module can't handle : in a password properly. Even URL-encoded, which would normally work, it does not work.
Don't use special characters that are typical in a URL in the password!
For example, any of the following: : . ? + %
Original, wrong answer:
The error message clearly complains about using PLAIN; it does not mean the credentials are wrong, it means you must use encrypted data delivery (TLS) instead of plaintext.
Changing amqp:// in the connection string to amqps:// (note the s) solves this.
Just add the login and password to connect to RabbitMQ:
CachingConnectionFactory connectionFactory =
new CachingConnectionFactory("rabbit_host");
connectionFactory.setUsername("login");
connectionFactory.setPassword("password");
For me the solution was simple: the user name is case sensitive. Failing to use the correct caps will also lead to the error.
If you use a number as your password, maybe you should try changing your password to a string.
I could log in with deltaqin:000000 on the website, but got this error while running the program; after changing the password to deltaiqn (a string), it works.
I did exactly what @grepit did.
But I had to make some changes in my Java code.
In the Producer and Receiver projects I altered:
ConnectionFactory factory = new ConnectionFactory();
factory.setHost("your-host-ip");
factory.setUsername("username-you-created");
factory.setPassword("username-password");
Doing that, you are connecting to a specific host as the user you have created.
It worked for me!
In my case I had this error because of a wrongly set port (I tried to use 5672, when the actual one on my system was 5676).
Maybe this will help someone to double-check ports...
I was facing this issue due to an empty space at the end of the password (spring.rabbitmq.password=rabbit ) in the Spring Boot application.properties; it was resolved by removing the empty space. Hope this checklist helps someone facing this issue.
For C# coders: I tried the code below and it worked; maybe it can help someone, so I am posting it here.
Scenario: the RabbitMQ queue is running on another system in the local area network, but I was having the same error.
By default a "guest" user exists, but you cannot access a remote server's queue (RabbitMQ) using the "guest" user, so you need to create a new user. Here I created the "tester001" user to access data from the remote server's queue.
ConnectionFactory factory = new ConnectionFactory();
factory.UserName = "tester001";
factory.Password = "testing";
factory.VirtualHost = "/";
factory.HostName = "192.168.1.101";
factory.Port = AmqpTcpEndpoint.UseDefaultPort;
If you tried all of these answers for your issue but still get "ACCESS_REFUSED - Login was refused using authentication mechanism PLAIN", maybe you should remove RabbitMQ and install a newer version.
A newer version worked for me.
Add a user and password and connect with them. You can add a user via environment variables (useful, for example, when RabbitMQ is initialized in Docker): RABBITMQ_DEFAULT_USER and RABBITMQ_DEFAULT_PASS. See more details here:
https://stackoverflow.com/a/70676040/1200914
Set the ConnectionFactory or Connection hostname to localhost.

HiveServer2 not picking up the right Kerberos Principal while starting up

We have put these entries in hive-site.xml:
hive.server2.authentication : KERBEROS
hive.server2.authentication.kerberos.keytab : /tmp/hive.keytab
hive.server2.authentication.kerberos.principal : hive/<FQDN of the hive VM>@xxxxxxxx.COM
Using the kinit command on the Hive VM, we have verified that the Kerberos principal and the keytab file are valid:
kinit -t FILE:/tmp/hive.keytab -k hive/<FQDN of the hive VM>@xxxxxxxx.COM
Then if we run
klist
it shows the same principal in the ticket cache as the default principal.
But when we try to start HiveServer2 using:
sudo service hive-server2 start
it throws this exception:
Starting HiveServer2
javax.security.auth.login.LoginException: Kerberos principal should have 3 parts: hive
at org.apache.hive.service.auth.HiveAuthFactory.getAuthTransFactory(HiveAuthFactory.java:127)
at org.apache.hive.service.cli.thrift.ThriftCLIService.run(ThriftCLIService.java:505)
at java.lang.Thread.run(Thread.java:679)
When we try to start the service (using ./hiveserver2) as any other logged-in user, say User123, it throws the same exception with:
Starting HiveServer2
javax.security.auth.login.LoginException: Kerberos principal should have 3 parts: User123
at org.apache.hive.service.auth.HiveAuthFactory.getAuthTransFactory(HiveAuthFactory.java:127)
at org.apache.hive.service.cli.thrift.ThriftCLIService.run(ThriftCLIService.java:505)
at java.lang.Thread.run(Thread.java:679)
Shouldn't the Kerberos principal be picked up from hive-site.xml and not from the login user? Are we missing something?
--
I have created a principal hive/<FQDN of the hive VM>@xxxxxxxx.COM in advance and created a keytab file for it.
We are on CDH 4.7 (not installed using CM), OEL6 and Kerberos5
Kerberos security should be configured for HDFS and MapReduce too, and not just Hive.
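For example (an illustrative subset only, not a complete secure-cluster configuration), core-site.xml on the cluster nodes would typically carry at least:
hadoop.security.authentication : kerberos
hadoop.security.authorization : true
along with the corresponding principal and keytab settings for the HDFS and MapReduce daemons.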