Using MobileFirst to authenticate users with an LDAP server in a mobile app

I followed this tutorial:
https://developer.ibm.com/mobilefirstplatform/documentation/getting-started-7-1/foundation/authentication-security/using-ldap-login-module-to-authenticate-users-with-ldap-server-in-hybrid-applications/
I only edited authenticationConfig.xml and ran the application:
<loginModule expirationInSeconds="-1" name="LDAPLoginModule">
  <className>com.worklight.core.auth.ext.LdapLoginModule</className>
  <parameter name="ldapProviderUrl" value="**************"/>
  <parameter name="ldapTimeoutMs" value="2000"/>
  <parameter name="ldapSecurityAuthentication" value="simple"/>
  <parameter name="validationType" value="searchPattern"/>
  <parameter name="ldapSecurityPrincipalPattern" value="{username}"/>
  <parameter name="ldapSearchFilterPattern" value="(&(objectClass=user)(cn={username})(memberof=CN=******,OU=Clients,O=******))"/>
  <parameter name="ldapSearchBase" value="OU=Clients,O=******"/>
</loginModule>
When I press "Call protected adapter", the LDAP Login Module challenge appears, but when I enter the user credentials nothing happens; the password field just clears.
Here's the log:
[WARNING ] FWLSE4014W: LdapLoginModule authentication failed. Reason 'javax.naming.AuthenticationException: [LDAP: error code 49 - 8009030C: LdapErr: DSID-0C0903A9, comment: AcceptSecurityContext error, data 2030, v1db1
at com.sun.jndi.ldap.LdapCtx.mapErrorCode(Unknown Source)
at com.sun.jndi.ldap.LdapCtx.processReturnCode(Unknown Source)
at com.sun.jndi.ldap.LdapCtx.processReturnCode(Unknown Source)
at com.sun.jndi.ldap.LdapCtx.connect(Unknown Source)
at com.sun.jndi.ldap.LdapCtx.<init>(Unknown Source)
at com.sun.jndi.ldap.LdapCtxFactory.getUsingURL(Unknown Source)
at com.sun.jndi.ldap.LdapCtxFactory.getUsingURLs(Unknown Source)
at com.sun.jndi.ldap.LdapCtxFactory.getLdapCtxInstance(Unknown Source)
at com.sun.jndi.ldap.LdapCtxFactory.getInitialContext(Unknown Source)
at org.apache.aries.jndi.ContextHelper.getInitialContextUsingBuilder(ContextHelper.java:244)
at org.apache.aries.jndi.ContextHelper.getContextProvider(ContextHelper.java:208)
at org.apache.aries.jndi.ContextHelper.getInitialContext(ContextHelper.java:141)
at org.apache.aries.jndi.OSGiInitialContextFactoryBuilder.getInitialContext(OSGiInitialContextFactoryBuilder.java:51)
at javax.naming.spi.NamingManager.getInitialContext(Unknown Source)
at javax.naming.InitialContext.getDefaultInitCtx(Unknown Source)
at javax.naming.InitialContext.init(Unknown Source)
at javax.naming.ldap.InitialLdapContext.<init>(Unknown Source)
at com.worklight.core.auth.ext.LdapLoginModule.login(LdapLoginModule.java:158)
at com.worklight.core.auth.impl.LoginContext.invokeLoginModule(LoginContext.java:252)
at com.worklight.core.auth.impl.LoginContext.processRequest(LoginContext.java:217)
at com.worklight.core.auth.impl.AuthenticationContext.processRequest(AuthenticationContext.java:510)
at com.worklight.core.auth.impl.AuthenticationFilter.doFilter(AuthenticationFilter.java:182)
at com.ibm.ws.webcontainer.filter.FilterInstanceWrapper.doFilter(FilterInstanceWrapper.java:206)
at com.ibm.ws.webcontainer.filter.WebAppFilterChain.doFilter(WebAppFilterChain.java:86)
at com.worklight.analytics.AnalyticsFilter.doFilter(AnalyticsFilter.java:124)
at com.ibm.ws.webcontainer.filter.FilterInstanceWrapper.doFilter(FilterInstanceWrapper.java:206)
at com.ibm.ws.webcontainer.filter.WebAppFilterChain.doFilter(WebAppFilterChain.java:86)
at com.ibm.ws.webcontainer.filter.WebAppFilterManager.doFilter(WebAppFilterManager.java:978)
at com.ibm.ws.webcontainer.filter.WebAppFilterManager.invokeFilters(WebAppFilterManager.java:1100)
at com.ibm.ws.webcontainer.webapp.WebApp.handleRequest(WebApp.java:4730)
at com.ibm.ws.webcontainer.osgi.DynamicVirtualHost$2.handleRequest(DynamicVirtualHost.java:297)
at com.ibm.ws.webcontainer.WebContainer.handleRequest(WebContainer.java:981)
at com.ibm.ws.webcontainer.osgi.DynamicVirtualHost$2.run(DynamicVirtualHost.java:262)
at com.ibm.ws.http.dispatcher.internal.channel.HttpDispatcherLink$TaskWrapper.run(HttpDispatcherLink.java:955)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(Unknown Source)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
at java.lang.Thread.run(Unknown Source)
' [project LDAPLoginModule]
[LDAP: error code 49 - 8009030C: LdapErr: DSID-0C0903A9, comment: AcceptSecurityContext error, data 2030, v1db1
Any idea what seems to be the problem?

According to the exception in the error log (LdapLoginModule authentication failed. Reason 'javax.naming.AuthenticationException: [LDAP: error code 49 - 8009030C: LdapErr: DSID-0C0903A9'), this may be related to the following: https://social.technet.microsoft.com/Forums/windowsserver/en-US/c98f3569-072a-4677-9b89-635ed2b8dffc/ldap-error-code-49-8009030c-ldaperr-dsid0c0903a9-comment-acceptsecuritycontext-error-data?forum=winserverDS
LDAP error code 49 means invalid credentials. The most likely causes are the following:
1. The DN path or password that you specified for the administrator is invalid. Any of the following will result in this error:
1) Pointed to a non-user DN
2) Pointed to a non-existent user in an existing DN
3) Pointed to a non-existent DN
4) Pointed to an existing user, but a non-existent DN
5) Pointed to an incorrect admin DN (uid instead of cn)
6) Pointed to a non-administrator user
7) Pointed to a valid admin, but the password is incorrect
2. The user trying to log in could not be authenticated. This can be the result of an incorrect username or password, or an incorrect prefix and/or suffix specified in the Settings tab, depending on the type of LDAP/AD system. It could also mean the authentication type is incorrect.
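As a sanity check, it can help to reproduce the bind outside MobileFirst with a few lines of plain JNDI, since the LdapLoginModule goes through the same com.sun.jndi.ldap provider that appears in the stack trace. The snippet below is only a minimal sketch: the provider URL, DN, and password are placeholders that you would replace with the values from your loginModule parameters.

import java.util.Hashtable;
import javax.naming.Context;
import javax.naming.NamingException;
import javax.naming.ldap.InitialLdapContext;

public class LdapBindTest {
    public static void main(String[] args) {
        Hashtable<String, String> env = new Hashtable<>();
        env.put(Context.INITIAL_CONTEXT_FACTORY, "com.sun.jndi.ldap.LdapCtxFactory");
        // Placeholder: use the same ldapProviderUrl as in authenticationConfig.xml.
        env.put(Context.PROVIDER_URL, "ldap://your-ldap-host:389");
        env.put(Context.SECURITY_AUTHENTICATION, "simple");
        // Placeholder principal: with Active Directory, a simple bind usually
        // needs the full DN or user@domain rather than the bare username,
        // which is what ldapSecurityPrincipalPattern controls.
        env.put(Context.SECURITY_PRINCIPAL, "CN=testuser,OU=Clients,O=example");
        env.put(Context.SECURITY_CREDENTIALS, "testpassword");
        try {
            // If the constructor returns, the simple bind (and the credentials) are fine.
            new InitialLdapContext(env, null).close();
            System.out.println("Bind succeeded");
        } catch (NamingException e) {
            // Error code 49 here confirms the credentials/DN are the problem,
            // not the MobileFirst configuration.
            System.out.println("Bind failed: " + e);
        }
    }
}

If this standalone test fails with the same error code 49, adjust the DN or the principal pattern until it succeeds, then carry the working values back into the login module.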

Related

javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided

I am getting this error when I try to connect to the Hive metastore using Spark SQL's HiveContext.
I am running this on a standalone cluster using the spark-submit command from my desktop, not from the Hadoop cluster.
Is it a security-related issue? Do I have to add something to hive-site.xml? Is there anything we need to update in the entry below?
<property>
  <name>hive.metastore.sasl.enabled</name>
  <value>true</value>
</property>
<property>
  <name>hive.server2.authentication</name>
  <value>kerberos</value>
</property>
The Spark version is 1.4.0 and hive-site.xml is placed under the conf folder.
Below is the error log.
15/08/25 18:27:15 INFO HiveContext: Initializing execution hive, version 0.13.1
15/08/25 18:27:16 INFO metastore: Trying to connect to metastore with URI thrift://metastore.com:9083
15/08/25 18:27:16 ERROR TSaslTransport: SASL negotiation failure
javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:211)
at org.apache.thrift.transport.TSaslClientTransport.handleSaslStartMessage(TSaslClientTransport.java:94)
at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:253)
at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37)
at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52)
at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:49)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:336)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:214)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1410)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:62)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:72)
at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:2453)
at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:2465)
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:340)
at org.apache.spark.sql.hive.client.ClientWrapper.<init>(ClientWrapper.scala:105)
at org.apache.spark.sql.hive.HiveContext.executionHive$lzycompute(HiveContext.scala:163)
at org.apache.spark.sql.hive.HiveContext.executionHive(HiveContext.scala:161)
at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:167)
at com.cap1.ct.SparkSQLHive.main(SparkSQLHive.java:17)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:664)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:169)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:192)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:111)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)
at sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147)
at sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:122)
at sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187)
at sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:224)
at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212)
at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)
at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:192)
... 35 more
15/08/25 18:27:16 WARN metastore: Failed to connect to the MetaStore Server...
15/08/25 18:27:16 INFO metastore: Waiting 1 seconds before next connection attempt.
15/08/25 18:27:17 INFO metastore: Trying to connect to metastore with URI thrift://metastore.com:9083
15/08/25 18:27:17 ERROR TSaslTransport: SASL negotiation failure
javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:211)
at org.apache.thrift.transport.TSaslClientTransport.handleSaslStartMessage(TSaslClientTransport.java:94)
at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:253)
at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37)
at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52)
at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:49)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:336)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:214)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1410)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:62)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:72)
at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:2453)
at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:2465)
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:340)
at org.apache.spark.sql.hive.client.ClientWrapper.<init>(ClientWrapper.scala:105)
at org.apache.spark.sql.hive.HiveContext.executionHive$lzycompute(HiveContext.scala:163)
at org.apache.spark.sql.hive.HiveContext.executionHive(HiveContext.scala:161)
at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:167)
at com.cap1.ct.SparkSQLHive.main(SparkSQLHive.java:17)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:664)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:169)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:192)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:111)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)
at sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147)
at sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:122)
at sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187)
at sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:224)
at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212)
at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)
at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:192)
... 35 more
Prerequisite: your hive-site.xml works for the Hive CLI with Kerberos enabled.
Spark with Hive needs one additional JVM property:
-Djavax.security.auth.useSubjectCredsOnly=false
Quote from the official JDK GSS-API troubleshooting guide:
GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos Ticket)
Cause: This may occur if no valid Kerberos credentials are obtained. In particular, this occurs if you want the underlying mechanism to obtain credentials but you forgot to indicate this by setting the javax.security.auth.useSubjectCredsOnly system property value to false (for example via -Djavax.security.auth.useSubjectCredsOnly=false in your execution command).
This issue is also inherently tied to the krb5.conf file, which defines the permitted KDC servers and realm mappings. If that file is not found, or it has no entry for your server's domain, you may run into this error.
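The property can be passed to spark-submit with --driver-java-options "-Djavax.security.auth.useSubjectCredsOnly=false". As an alternative, below is a minimal, hypothetical sketch of setting it programmatically in the driver before the HiveContext is constructed; the class name and query are made up for illustration, and it assumes a valid Kerberos ticket (kinit) and a correct krb5.conf on the machine running the driver.

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.hive.HiveContext;

public class SparkSQLHiveKerberos {
    public static void main(String[] args) {
        // Set the GSS flag before anything touches the Kerberos machinery,
        // i.e. before the HiveContext (and its metastore client) is created.
        System.setProperty("javax.security.auth.useSubjectCredsOnly", "false");

        SparkConf conf = new SparkConf().setAppName("SparkSQLHiveKerberos");
        JavaSparkContext sc = new JavaSparkContext(conf);
        HiveContext hiveContext = new HiveContext(sc.sc());

        // With a valid TGT and a reachable metastore, this should no longer
        // fail in the SASL/GSS negotiation shown in the log above.
        hiveContext.sql("SHOW TABLES").show();

        sc.stop();
    }
}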

Jenkins - SSH connection exception

This question may have been asked multiple times, but I need the latest update on this issue.
I am trying to configure a slave for my Jenkins master, but I am getting the following exception:
ERROR: Unexpected error in launching a slave. This is probably a bug in Jenkins.
java.lang.IllegalStateException: Connection is not established!
at com.trilead.ssh2.Connection.getRemainingAuthMethods(Connection.java:1030)
at com.cloudbees.jenkins.plugins.sshcredentials.impl.TrileadSSHPasswordAuthenticator.canAuthenticate(TrileadSSHPasswordAuthenticator.java:82)
at com.cloudbees.jenkins.plugins.sshcredentials.SSHAuthenticator.newInstance(SSHAuthenticator.java:207)
at com.cloudbees.jenkins.plugins.sshcredentials.SSHAuthenticator.newInstance(SSHAuthenticator.java:169)
at hudson.plugins.sshslaves.SSHLauncher.openConnection(SSHLauncher.java:1173)
at hudson.plugins.sshslaves.SSHLauncher$2.call(SSHLauncher.java:701)
at hudson.plugins.sshslaves.SSHLauncher$2.call(SSHLauncher.java:696)
at java.util.concurrent.FutureTask$Sync.innerRun(Unknown Source)
at java.util.concurrent.FutureTask.run(Unknown Source)
at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
at java.lang.Thread.run(Unknown Source)
[01/07/15 16:51:10] Launch failed - cleaning up connection
[01/07/15 16:51:10] [SSH] Connection closed.
Any idea what the issue is here?

jenkins API token with LDAP authentication

We have Jenkins running with Active Directory authentication and all was good. Today I tried to use curl to post a file; it didn't work with the API token, but it does work with the password.
I got the following LDAP error. Does anybody have any experience with this? It works with the plain-text password, but for security reasons I don't want to use the password.
curl -X POST http://spatel:d606d1857409cc8ef1f2dff8aaab1cf1@server1.example.com/job/Test/config.xml --data-binary "@config.xml"
Jun 3, 2013 2:33:01 PM hudson.plugins.active_directory.ActiveDirectoryUnixAuthenticationProvider retrieveUser
WARNING: Failed to retrieve user information for spatel
javax.naming.NamingException: [LDAP: error code 1 - 00000000: LdapErr: DSID-0C090627, comment: In order to perform this operation a successful bind must be completed on the connection., data 0, vece]; remaining name 'DC=example,DC=com'
at com.sun.jndi.ldap.LdapCtx.mapErrorCode(Unknown Source)
at com.sun.jndi.ldap.LdapCtx.processReturnCode(Unknown Source)
at com.sun.jndi.ldap.LdapCtx.processReturnCode(Unknown Source)
at com.sun.jndi.ldap.LdapCtx.searchAux(Unknown Source)
at com.sun.jndi.ldap.LdapCtx.c_search(Unknown Source)
at com.sun.jndi.ldap.LdapCtx.c_search(Unknown Source)
at com.sun.jndi.toolkit.ctx.ComponentDirContext.p_search(Unknown Source)
at com.sun.jndi.toolkit.ctx.PartialCompositeDirContext.search(Unknown Source)
at com.sun.jndi.toolkit.ctx.PartialCompositeDirContext.search(Unknown Source)
at hudson.plugins.active_directory.LDAPSearchBuilder.search(LDAPSearchBuilder.java:52)
at hudson.plugins.active_directory.LDAPSearchBuilder.searchOne(LDAPSearchBuilder.java:42)
at hudson.plugins.active_directory.ActiveDirectoryUnixAuthenticationProvider.retrieveUser(ActiveDirectoryUnixAuthenticationProvider.java:191)
at hudson.plugins.active_directory.ActiveDirectoryUnixAuthenticationProvider.retrieveUser(ActiveDirectoryUnixAuthenticationProvider.java:130)
at hudson.plugins.active_directory.ActiveDirectoryUnixAuthenticationProvider.retrieveUser(ActiveDirectoryUnixAuthenticationProvider.java:95)
at hudson.plugins.active_directory.AbstractActiveDirectoryAuthenticationProvider.loadUserByUsername(AbstractActiveDirectoryAuthenticationProvider.java:27)
at hudson.plugins.active_directory.ActiveDirectorySecurityRealm.loadUserByUsername(ActiveDirectorySecurityRealm.java:551)
at hudson.model.User.impersonate(User.java:255)
at jenkins.security.ApiTokenFilter.doFilter(ApiTokenFilter.java:52)

Anyone having experience with DominoTomcatSSO project?

Does anyone have experience with the DominoTomcatSSO project? (See http://www.openntf.org/internal/home.nsf/project.xsp?action=openDocument&name=DominoTomcatSSO)
I'm running Tomcat 6.0.37 and a Domino 9 server. I have configured everything and can't see any error on the Domino console. When I try to log in to my Tomcat application and mistype the password, Domino and Tomcat correctly report that the username/password is incorrect, so things are working fine. But when I type the correct password, I get the error below in my Tomcat application. When I look at my browser cookies, the LtpaToken is there, but somehow it cannot be used when creating a Notes session, as the error message shows. Any hint as to what can be wrong here?
NotesException - 4616 Cookie is invalid
NotesException: Cookie is invalid
at lotus.domino.NotesExceptionHelper.read(Unknown Source)
at lotus.domino.NotesExceptionHolder._read(Unknown Source)
at lotus.priv.CORBA.iiop.RepImpl.invoke(Unknown Source)
at lotus.priv.CORBA.portable.ObjectImpl._invoke(Unknown Source)
at lotus.domino.corba._IObjectServerStub.createSessionWithCookie(Unknown Source)
at lotus.domino.cso.Session.initSession(Unknown Source)
at lotus.domino.cso.Session.<init>(Unknown Source)
at lotus.domino.cso.Session.createSession(Unknown Source)
at lotus.domino.NotesFactory.createSessionC(Unknown Source)
at lotus.domino.NotesFactory.createSession(Unknown Source)
at com.automatedlogic.domino.sso.DominoBridge.openDominoSession(DominoBridge.java:84)
at com.automatedlogic.domino.sso.DominoBridge.openDominoSession(DominoBridge.java:68)
at com.automatedlogic.domino.sso.DominoUserProfile.refresh(DominoUserProfile.java:180)
at com.automatedlogic.domino.sso.DominoLoginFilter.doFilter(DominoLoginFilter.java:141)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:233)
at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:191)
at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:127)
at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:103)
at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:109)
at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:293)
at org.apache.coyote.http11.Http11AprProcessor.process(Http11AprProcessor.java:879)
at org.apache.coyote.http11.Http11AprProtocol$Http11ConnectionHandler.process(Http11AprProtocol.java:617)
at org.apache.tomcat.util.net.AprEndpoint$Worker.run(AprEndpoint.java:1760)
at java.lang.Thread.run(Thread.java:722)
I just found the problem; see the URL below. I updated the Internet Site configuration (descriptive name) based on point 1 of the answer at the URL below.
http://www-01.ibm.com/support/docview.wss?rs=899&uid=swg21202709

Using Apache Directory Studio

I have followed http://confluence.atlassian.com/display/CROWD/Creating+a+Connection+to+your+LDAP+Directory
for a basic understanding and am trying to set up LDAP for my office use.
Scenario:
I have started Apache 2.2 (Apache Directory Server) at localhost:389 and it's running fine.
Next, I installed Apache Directory Studio and tried to create a new connection, but I am getting the following error while opening the connection:
Error while opening connection - localhost:389; socket closed
javax.naming.ServiceUnavailableException: localhost:389; socket closed
at com.sun.jndi.ldap.Connection.readReply(Unknown Source)
at com.sun.jndi.ldap.LdapClient.ldapBind(Unknown Source)
at com.sun.jndi.ldap.LdapClient.authenticate(Unknown Source)
at com.sun.jndi.ldap.LdapCtx.connect(Unknown Source)
at com.sun.jndi.ldap.LdapCtx.ensureOpen(Unknown Source)
at com.sun.jndi.ldap.LdapCtx.ensureOpen(Unknown Source)
at com.sun.jndi.ldap.LdapCtx.reconnect(Unknown Source)
at javax.naming.ldap.InitialLdapContext.reconnect(Unknown Source)
at org.apache.directory.studio.connection.core.io.jndi.JNDIConnectionWrapper$7.run(JNDIConnectionWrapper.java:1055)
at org.apache.directory.studio.connection.core.io.jndi.JNDIConnectionWrapper.runAndMonitor(JNDIConnectionWrapper.java:1272)
at org.apache.directory.studio.connection.core.io.jndi.JNDIConnectionWrapper.doBind(JNDIConnectionWrapper.java:1065)
at org.apache.directory.studio.connection.core.io.jndi.JNDIConnectionWrapper.bind(JNDIConnectionWrapper.java:254)
at org.apache.directory.studio.connection.core.jobs.OpenConnectionsRunnable.run(OpenConnectionsRunnable.java:114)
at org.apache.directory.studio.connection.core.jobs.StudioConnectionJob.run(StudioConnectionJob.java:114)
at org.eclipse.core.internal.jobs.Worker.run(Worker.java:55)
Please guide me!
I resolved this by changing the port numbers.
Now I am getting the following error:
Error while opening connection - [LDAP: error code 49 - cannot bind the principalDn.]
javax.naming.AuthenticationException: [LDAP: error code 49 - cannot bind the principalDn.]
at com.sun.jndi.ldap.LdapCtx.mapErrorCode(Unknown Source)
at com.sun.jndi.ldap.LdapCtx.processReturnCode(Unknown Source)
at com.sun.jndi.ldap.LdapCtx.processReturnCode(Unknown Source)
at com.sun.jndi.ldap.LdapCtx.connect(Unknown Source)
at com.sun.jndi.ldap.LdapCtx.ensureOpen(Unknown Source)
at com.sun.jndi.ldap.LdapCtx.ensureOpen(Unknown Source)
at com.sun.jndi.ldap.LdapCtx.reconnect(Unknown Source)
at javax.naming.ldap.InitialLdapContext.reconnect(Unknown Source)
at org.apache.directory.studio.connection.core.io.jndi.JNDIConnectionWrapper$7.run(JNDIConnectionWrapper.java:1055)
at org.apache.directory.studio.connection.core.io.jndi.JNDIConnectionWrapper.runAndMonitor(JNDIConnectionWrapper.java:1272)
at org.apache.directory.studio.connection.core.io.jndi.JNDIConnectionWrapper.doBind(JNDIConnectionWrapper.java:1065)
at org.apache.directory.studio.connection.core.io.jndi.JNDIConnectionWrapper.bind(JNDIConnectionWrapper.java:254)
at org.apache.directory.studio.connection.core.jobs.OpenConnectionsRunnable.run(OpenConnectionsRunnable.java:114)
at org.apache.directory.studio.connection.core.jobs.StudioConnectionJob.run(StudioConnectionJob.java:114)
at org.eclipse.core.internal.jobs.Worker.run(Worker.java:55)
This means that your bind to the LDAP server is wrong. The default username and password for Apache DS are admin and secret, respectively.
Try the following as the bind DN or user string:
uid=admin,ou=system
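As a quick check outside Directory Studio, a few lines of JNDI can confirm that the default Apache DS admin account binds on your port. This is only a sketch: 10389 is the usual Apache DS default LDAP port, so replace it with whatever port your server actually listens on.

import java.util.Hashtable;
import javax.naming.Context;
import javax.naming.NamingException;
import javax.naming.directory.InitialDirContext;

public class ApacheDsBindCheck {
    public static void main(String[] args) {
        Hashtable<String, String> env = new Hashtable<>();
        env.put(Context.INITIAL_CONTEXT_FACTORY, "com.sun.jndi.ldap.LdapCtxFactory");
        // Assumption: Apache DS is listening on its usual default port 10389.
        env.put(Context.PROVIDER_URL, "ldap://localhost:10389");
        env.put(Context.SECURITY_AUTHENTICATION, "simple");
        // Default Apache DS admin account mentioned above.
        env.put(Context.SECURITY_PRINCIPAL, "uid=admin,ou=system");
        env.put(Context.SECURITY_CREDENTIALS, "secret");
        try {
            new InitialDirContext(env).close();
            System.out.println("Bind as uid=admin,ou=system succeeded");
        } catch (NamingException e) {
            System.out.println("Bind failed: " + e);
        }
    }
}

If this bind works but Directory Studio still fails, the problem is in the connection settings (host, port, or bind DN) entered in the Studio wizard rather than in the server itself.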