Spring Boot application deployment to EKS - amazon-eks

We are seeing issues while creating an Ingress for a Spring Boot application in EKS.
We are following the available AWS documentation while trying to create the Ingress.
https://aws.amazon.com/premiumsupport/knowledge-center/eks-alb-ingress-controller-setup/
While checking the logs, we see permission errors for the serviceAccount not being able to access the STS service.
Please see the logs below. Any help in this regard is highly appreciated.
{"level":"error","ts":1614308496.6815305,"logger":"controller","msg":"Reconciler error","controller":"ingress","name":"ingress-2048","namespace":"game-2048","error":"couldn't auto-discover subnets: WebIdentityErr: failed to retrieve credentials\ncaused by: AccessDenied: Not authorized to perform sts:AssumeRoleWithWebIdentity\n\tstatus code: 403, request id: fda38613-cded-49ba-9542-624962ffdb80"}
{"level":"error","ts":1614308506.9275057,"logger":"controller","msg":"Reconciler error","controller":"ingress","name":"ingress-2048","namespace":"game-2048","error":"couldn't auto-discover subnets: WebIdentityErr: failed to retrieve credentials\ncaused by: AccessDenied: Not authorized to perform sts:AssumeRoleWithWebIdentity\n\tstatus code: 403, request id: 73c54038-3259-4e04-8e44-930bccdc6ac0"}
{"level":"error","ts":1614308527.413746,"logger":"controller","msg":"Reconciler error","controller":"ingress","name":"ingress-2048","namespace":"game-2048","error":"couldn't auto-discover subnets: WebIdentityErr: failed to retrieve credentials\ncaused by: AccessDenied: Not authorized to perform sts:AssumeRoleWithWebIdentity\n\tstatus code: 403, request id: 4281835c-e491-4a64-9926-df5f22d2b630"}
{"level":"error","ts":1614308568.3853583,"logger":"controller","msg":"Reconciler error","controller":"ingress","name":"ingress-2048","namespace":"game-2048","error":"couldn't auto-discover subnets: WebIdentityErr: failed to retrieve credentials\ncaused by: AccessDenied: Not authorized to perform sts:AssumeRoleWithWebIdentity\n\tstatus code: 403, request id: f5ceec2c-a163-49a1-ba24-810c556278b8"}

Related

Importing an S3 storage via amplify CLI for Android project throws 403 unauthenticated error while trying to upload file to S3

When I create S3 storage from the Amplify CLI using amplify add storage, everything works fine and I can upload files.
But when I import an existing S3 bucket via amplify import storage and upload a file to S3, it throws the error below:
2021-02-06 11:39:26.571 16838-16838/com.example.seki E/MyAmplifyApp: Upload failed
StorageException{message=Something went wrong with your AWS S3 Storage upload file operation, cause=com.amazonaws.services.s3.model.AmazonS3Exception: Access Denied (Service: Amazon S3; Status Code: 403; Error Code: AccessDenied; Request ID: AE9D3FCEA04F4A6B), S3 Extended Request ID: el6xSv6LQmsoV+JxPAiKy92Qmh5opEARO6RIhQLISz+vHBxp83+uY6jmLyQP8GXlrdDdhJoCTrY=, recoverySuggestion=See attached exception for more information and suggestions}
at com.amplifyframework.storage.s3.operation.AWSS3StorageUploadFileOperation$UploadTransferListener.onError(AWSS3StorageUploadFileOperation.java:207)
at com.amazonaws.mobileconnectors.s3.transferutility.TransferStatusUpdater$4.run(TransferStatusUpdater.java:306)
at android.os.Handler.handleCallback(Handler.java:873)
at android.os.Handler.dispatchMessage(Handler.java:99)
at android.os.Looper.loop(Looper.java:214)
at android.app.ActivityThread.main(ActivityThread.java:7050)
at java.lang.reflect.Method.invoke(Native Method)
at com.android.internal.os.RuntimeInit$MethodAndArgsCaller.run(RuntimeInit.java:494)
at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:965)
Caused by: com.amazonaws.services.s3.model.AmazonS3Exception: Access Denied (Service: Amazon S3; Status Code: 403; Error Code: AccessDenied; Request ID: AE9D3FCEA04F4A6B), S3 Extended Request ID: el6xSv6LQmsoV+JxPAiKy92Qmh5opEARO6RIhQLISz+vHBxp83+uY6jmLyQP8GXlrdDdhJoCTrY=
at com.amazonaws.http.AmazonHttpClient.handleErrorResponse(AmazonHttpClient.java:742)
at com.amazonaws.http.AmazonHttpClient.executeHelper(AmazonHttpClient.java:420)
at com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:229)
at com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:4913)
at com.amazonaws.services.s3.AmazonS3Client.putObject(AmazonS3Client.java:1960)
at com.amazonaws.mobileconnectors.s3.transferutility.UploadTask.uploadSinglePartAndWaitForCompletion(UploadTask.java:282)
at com.amazonaws.mobileconnectors.s3.transferutility.UploadTask.call(UploadTask.java:114)
at com.amazonaws.mobileconnectors.s3.transferutility.UploadTask.call(UploadTask.java:59)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1167)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:641)
at java.lang.Thread.run(Thread.java:764)
2021-02-06 11:39:26.574 16838-16838/com.example.seki E/MyAmplifyApp: Upload failed
StorageException{message=Storage upload operation was interrupted., cause=null, recoverySuggestion=Please verify that you have a stable internet connection.}
at com.amplifyframework.storage.s3.operation.AWSS3StorageUploadFileOperation$UploadTransferListener.onStateChanged(AWSS3StorageUploadFileOperation.java:188)
at com.amazonaws.mobileconnectors.s3.transferutility.TransferStatusUpdater$2.run(TransferStatusUpdater.java:224)
at android.os.Handler.handleCallback(Handler.java:873)
at android.os.Handler.dispatchMessage(Handler.java:99)
at android.os.Looper.loop(Looper.java:214)
at android.app.ActivityThread.main(ActivityThread.java:7050)
at java.lang.reflect.Method.invoke(Native Method)
at com.android.internal.os.RuntimeInit$MethodAndArgsCaller.run(RuntimeInit.java:494)
at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:965)
Note: I enabled unauthenticated access for the Cognito identity pool created when importing S3.
I have raised the same issue on the Amplify GitHub as well: https://github.com/aws-amplify/amplify-cli/issues/6515
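Since the bucket was imported rather than created by Amplify, one thing worth checking is whether the Cognito identity pool's unauthenticated role actually has S3 permissions on that bucket. A sketch of the kind of policy the unauthenticated role would need, assuming Amplify's default public/ prefix (bucket name and prefix are placeholders):

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:PutObject", "s3:GetObject", "s3:DeleteObject"],
      "Resource": "arn:aws:s3:::my-imported-bucket/public/*"
    },
    {
      "Effect": "Allow",
      "Action": "s3:ListBucket",
      "Resource": "arn:aws:s3:::my-imported-bucket"
    }
  ]
}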

Trouble to access S3 from Flink job on EMR

I have trouble accessing S3 from a Flink job.
When I submit the assembled jar for my job, I get an access denied error:
Caused by: com.amazon.ws.emr.hadoop.fs.shaded.com.amazonaws.services.s3.model.AmazonS3Exception: Access Denied (Service: Amazon S3; Status Code: 403; Error Code: AccessDenied; Request ID: ...; S3 Extended Request ID: ...), S3 Extended Request ID: ...
This is my setup:
EMR cluster is created with 'advanced configuration',
Flink 1.4.0 and Hadoop 2.8.3 as applications.
1x master, 2x nodes
Instances have EMR_EC2_DefaultRole, which has the policy AmazonElasticMapReduceforEC2Role, which grants full access to S3.
Indeed, I can issue these commands successfully on both the master and the worker nodes:
aws s3api list-buckets
hdfs dfs -ls s3://bucketA
I connect to the master and start a cluster:
/usr/lib/flink/bin/yarn-session.sh -n 2 -d
The Flink job reads from a bucket as a source:
object TestS3 {
  def main(args: Array[String]): Unit = {
    val env: ExecutionEnvironment = ExecutionEnvironment.getExecutionEnvironment
    val input: DataSet[String] = env.readTextFile("s3://bucketA/source/file")
    input.writeAsText("s3://bucketB/delete/me/later")
    env.execute()
  }
}
This is my simple build.sbt:
name := "TestS3"
scalaVersion := "2.11.11"
version := "0.1"
val flinkVersion = "1.4.0"
libraryDependencies ++= Seq(
  "org.apache.flink" % "flink-core" % flinkVersion % "provided",
  "org.apache.flink" %% "flink-scala" % flinkVersion % "provided"
)
The bucket has no policy that denies read access. It has a policy that denies deletes, but this should not affect the Flink job.
The EMR_EC2_DefaultRole grants full access to S3.
As always, any hint of what I'm doing wrong is very much appreciated. Or maybe my expectation that this should work is wrong?
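As a side note for anyone debugging something similar: listing a bucket and reading an object are authorized separately, so a more direct test than hdfs dfs -ls is to read the actual source object from a node using the instance profile credentials, for example:

# run on the master or a worker node; discard the output
aws s3 cp s3://bucketA/source/file - > /dev/null

If this fails with AccessDenied while listing works, the problem is at the object level (a bucket policy, object ownership, or encryption key permissions).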
This is the full stacktrace:
java.io.IOException: Error opening the Input Split s3://bucketA/source/file [0,-1]: com.amazon.ws.emr.hadoop.fs.shaded.com.amazonaws.services.s3.model.AmazonS3Exception: Access Denied (Service: Amazon S3; Status Code: 403; Error Code: AccessDenied; Request ID: xyz_request_id; S3 Extended Request ID: xyz_ext_request_id), S3 Extended Request ID: xyz_ext_request_id
at org.apache.flink.api.common.io.FileInputFormat.open(FileInputFormat.java:705)
at org.apache.flink.api.common.io.DelimitedInputFormat.open(DelimitedInputFormat.java:477)
at org.apache.flink.api.common.io.DelimitedInputFormat.open(DelimitedInputFormat.java:48)
at org.apache.flink.runtime.operators.DataSourceTask.invoke(DataSourceTask.java:145)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:718)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.io.IOException: com.amazon.ws.emr.hadoop.fs.shaded.com.amazonaws.services.s3.model.AmazonS3Exception: Access Denied (Service: Amazon S3; Status Code: 403; Error Code: AccessDenied; Request ID: xyz_request_id; S3 Extended Request ID: xyz_ext_request_id), S3 Extended Request ID: xyz_ext_request_id
at com.amazon.ws.emr.hadoop.fs.s3n.Jets3tNativeFileSystemStore.handleAmazonServiceException(Jets3tNativeFileSystemStore.java:434)
at com.amazon.ws.emr.hadoop.fs.s3n.Jets3tNativeFileSystemStore.retrievePair(Jets3tNativeFileSystemStore.java:461)
at com.amazon.ws.emr.hadoop.fs.s3n.Jets3tNativeFileSystemStore.retrievePair(Jets3tNativeFileSystemStore.java:439)
at com.amazon.ws.emr.hadoop.fs.s3n.S3NativeFileSystem.open(S3NativeFileSystem.java:1097)
at org.apache.hadoop.fs.FileSystem.open(FileSystem.java:790)
at com.amazon.ws.emr.hadoop.fs.EmrFileSystem.open(EmrFileSystem.java:166)
at org.apache.flink.runtime.fs.hdfs.HadoopFileSystem.open(HadoopFileSystem.java:119)
at org.apache.flink.runtime.fs.hdfs.HadoopFileSystem.open(HadoopFileSystem.java:36)
at org.apache.flink.api.common.io.FileInputFormat$InputSplitOpenThread.run(FileInputFormat.java:865)
Caused by: com.amazon.ws.emr.hadoop.fs.shaded.com.amazonaws.services.s3.model.AmazonS3Exception: Access Denied (Service: Amazon S3; Status Code: 403; Error Code: AccessDenied; Request ID: xyz_request_id; S3 Extended Request ID: xyz_ext_request_id), S3 Extended Request ID: xyz_ext_request_id
at com.amazon.ws.emr.hadoop.fs.shaded.com.amazonaws.http.AmazonHttpClient$RequestExecutor.handleErrorResponse(AmazonHttpClient.java:1639)
at com.amazon.ws.emr.hadoop.fs.shaded.com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeOneRequest(AmazonHttpClient.java:1304)
at com.amazon.ws.emr.hadoop.fs.shaded.com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeHelper(AmazonHttpClient.java:1056)
at com.amazon.ws.emr.hadoop.fs.shaded.com.amazonaws.http.AmazonHttpClient$RequestExecutor.doExecute(AmazonHttpClient.java:743)
at com.amazon.ws.emr.hadoop.fs.shaded.com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeWithTimer(AmazonHttpClient.java:717)
at com.amazon.ws.emr.hadoop.fs.shaded.com.amazonaws.http.AmazonHttpClient$RequestExecutor.execute(AmazonHttpClient.java:699)
at com.amazon.ws.emr.hadoop.fs.shaded.com.amazonaws.http.AmazonHttpClient$RequestExecutor.access$500(AmazonHttpClient.java:667)
at com.amazon.ws.emr.hadoop.fs.shaded.com.amazonaws.http.AmazonHttpClient$RequestExecutionBuilderImpl.execute(AmazonHttpClient.java:649)
at com.amazon.ws.emr.hadoop.fs.shaded.com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:513)
at com.amazon.ws.emr.hadoop.fs.shaded.com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:4325)
at com.amazon.ws.emr.hadoop.fs.shaded.com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:4272)
at com.amazon.ws.emr.hadoop.fs.shaded.com.amazonaws.services.s3.AmazonS3Client.getObject(AmazonS3Client.java:1409)
at com.amazon.ws.emr.hadoop.fs.s3.lite.call.GetObjectCall.perform(GetObjectCall.java:22)
at com.amazon.ws.emr.hadoop.fs.s3.lite.call.GetObjectCall.perform(GetObjectCall.java:9)
at com.amazon.ws.emr.hadoop.fs.s3.lite.executor.GlobalS3Executor.execute(GlobalS3Executor.java:91)
at com.amazon.ws.emr.hadoop.fs.s3.lite.AmazonS3LiteClient.invoke(AmazonS3LiteClient.java:176)
at com.amazon.ws.emr.hadoop.fs.s3.lite.AmazonS3LiteClient.getObject(AmazonS3LiteClient.java:99)
at com.amazon.ws.emr.hadoop.fs.s3n.Jets3tNativeFileSystemStore.retrievePair(Jets3tNativeFileSystemStore.java:452)
... 7 more
We figured it out: the source bucket is encrypted with a KMS key. Therefore it was not sufficient for EMR_EC2_DefaultRole to have full access to S3; it additionally needed access to the KMS key. We extended EMR_EC2_DefaultRole accordingly, and the Flink job can access the file now.
Maybe this post helps somebody save some time (and not forget about KMS encryption).
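The post does not show the exact policy change; a sketch of the kind of statement one would attach to EMR_EC2_DefaultRole for this (the key ARN is a placeholder, and kms:GenerateDataKey only matters when also writing to a KMS-encrypted bucket):

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["kms:Decrypt", "kms:GenerateDataKey", "kms:DescribeKey"],
      "Resource": "arn:aws:kms:eu-west-1:123456789012:key/<key-id>"
    }
  ]
}

Alternatively, the role can be granted use of the key from the KMS key policy instead of from the IAM side.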

Error while integrating SonarQube with LDAP

sonar.security.realm=LDAP
ldap.url=ldap://ldap-company.com
ldap.bindDn=CN=xxxxx,OU=Restricted,OU=xxxx,DC=company,DC=com
ldap.bindPassword=none
# User Configuration
ldap.user.baseDn=ou=Users,dc=mycompany,dc=com
ldap.user.request=(&(objectClass=inetOrgPerson)(uid={login}))
ldap.user.realNameAttribute=cn
ldap.user.emailAttribute=mail
# Group Configuration
ldap.group.baseDn=OU=Groups,OU=companyname,DC=comapany,DC=com
ldap.group.request=(&(objectClass=posixGroup)(memberUid={uid}))
These are my configurations (SonarQube version 6.2, embedded database).
Do you have any idea how to integrate LDAP with SonarQube? I tried different ways but couldn't succeed. The above is my configuration for sonar.properties.
I got this error:
2017.03.15 15:57:25 ERROR web[AVrTij8L9uoXNT8qAAAK][o.s.s.a.RealmAuthenticator] Error during authentication
org.sonar.plugins.ldap.LdapException: Unable to retrieve details for user xxx in <default>
and also:
Caused by: javax.naming.NamingException: [LDAP: error code 1 - 000004DC: LdapErr: DSID-0C090752, comment: In order to perform this operation a successful bind must be completed on the connection., dat
2017.03.15 15:55:05 INFO web[][o.s.s.a.EmbeddedTomcat] HTTP connector enabled on port 9000
2017.03.15 15:55:49 ERROR web[AVrTij8L9uoXNT8qAAAJ][o.s.s.a.RealmAuthenticator] Error during authentication
org.sonar.plugins.ldap.LdapException: Unable to retrieve details for user xxxxx in <default>
at org.sonar.plugins.ldap.LdapUsersProvider.getUserDetails(LdapUsersProvider.java:84)
at org.sonar.plugins.ldap.LdapUsersProvider.doGetUserDetails(LdapUsersProvider.java:58)
at org.sonar.server.authentication.RealmAuthenticator.doAuthenticate(RealmAuthenticator.java:89)
at org.sonar.server.authentication.RealmAuthenticator.authenticate(RealmAuthenticator.java:83)
at org.sonar.server.authentication.CredentialsAuthenticator.authenticate(CredentialsAuthenticator.java:56)
at org.sonar.server.authentication.CredentialsAuthenticator.authenticate(CredentialsAuthenticator.java:45)
at org.sonar.server.authentication.ws.LoginAction.authenticate(LoginAction.java:91)
This is my web.log
2017.03.16 13:10:09 INFO web[][o.s.s.p.UpdateCenterClient] Update center: https://update.sonarsource.org/update-center.properties (no proxy)
2017.03.16 13:10:09 INFO web[][org.sonar.INFO] Security realm: LDAP
2017.03.16 13:10:09 INFO web[][o.s.p.l.LdapSettingsManager] User mapping: LdapUserMapping{baseDn=DC=company,DC=com, request=(&(objectClass=inetOrgPerson)(uid={0})), realNameAttribute=cn, emailAttribute=mail}
2017.03.16 13:10:09 INFO web[][o.s.p.l.LdapSettingsManager] Group mapping: LdapGroupMapping{baseDn=OU=Groups,OU=comapny,Dc=company,DC=com, idAttribute=cn, requiredUserAttributes=[uid], request=(&(objectClass=posixGroup)(memberUid={0}))}
2017.03.16 13:10:09 INFO web[][o.s.p.l.LdapContextFactory] Test LDAP connection: FAIL
2017.03.16 13:10:09 INFO web[][o.s.s.p.d.EmbeddedDatabase] Embedded database stopped
2017.03.16 13:10:09 ERROR web[][o.a.c.c.C.[.[.[/]] Exception sending context initialized event to listener instance of class org.sonar.server.platform.web.PlatformServletContextListener
org.sonar.plugins.ldap.LdapException: Unable to open LDAP connection
at org.sonar.plugins.ldap.LdapContextFactory.testConnection(LdapContextFactory.java:206)
at org.sonar.plugins.ldap.LdapRealm.init(LdapRealm.java:63)
at org.sonar.server.user.SecurityRealmFactory.start(SecurityRealmFactory.java:84)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.picocontainer.lifecycle.ReflectionLifecycleStrategy.invokeMethod(ReflectionLifecycleStrategy.java:110)
Your bind is failing. You need to test with an external LDAP tool such as Apache Directory Studio or Softerra's LDAP Browser.
It could be a firewall issue between your server and the LDAP server, or the password could be incorrect. It does look like your Sonar server is able to talk to the LDAP server (which looks like Active Directory), since you get an AD-style error message about needing to bind before searching.
If you can capture the error from the failing bind, it will return error code 49 with a subcode of interest: 525, 52e, 777 or the like, which refer to the different reasons Active Directory will not let you connect.
Note: Your password is 'none', and it is hard to tell whether that is you hiding the password or an actual literal password.
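If a GUI tool is not at hand, the same bind can be checked quickly from the command line with OpenLDAP's ldapsearch (DNs copied from the configuration above; -W prompts for the bind password, and 'someuser' is a placeholder login):

ldapsearch -H ldap://ldap-company.com -x \
  -D "CN=xxxxx,OU=Restricted,OU=xxxx,DC=company,DC=com" -W \
  -b "ou=Users,dc=mycompany,dc=com" "(uid=someuser)"

If the bind itself is rejected, this returns the LDAP error code 49 plus the AD subcode mentioned above directly.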

OAuth 2.0: WSO2 Identity Server as a key manager in WSO2 API Manager

I am using WSO2 Identity Server as a key manager in WSO2 API Manager. I am creating the sample playground app with the OAuth 2.0 flow against WSO2 API Manager. I have added a new app in the API Store and generated a consumer key and consumer secret.
Steps:
Run the sample playground app: localhost:8080/playground2
Playground app home page
Clicking on the image takes us to the oauth2.jsp page,
where we have to fill in:
response_type = code
client_id = VALUE_OF_CONSUMER_KEY
redirect_uri = REDIRECT_URL_OF_THE_APPLICATION
scope = SCOPE_OF_THE_ACCESS_REQUEST
Authorize
On clicking the Authorize button, the application (client) requests an authorization code from the authorization server (WSO2 Identity Server) by sending an HTTP GET request with the following query parameters:
response_type = code
client_id = VALUE_OF_CONSUMER_KEY
redirect_uri = REDIRECT_URL_OF_THE_APPLICATION
scope = SCOPE_OF_THE_ACCESS_REQUEST
and it displays the consent page.
Consent page
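For reference, the authorization request described above would look roughly like the following against a default WSO2 Identity Server install; the endpoint host and port are assumptions to adjust to your environment, and -k skips verification of the self-signed certificate:

curl -k "https://localhost:9443/oauth2/authorize?response_type=code&client_id=VALUE_OF_CONSUMER_KEY&redirect_uri=REDIRECT_URL_OF_THE_APPLICATION&scope=SCOPE_OF_THE_ACCESS_REQUEST"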
When we click Approve, it redirects to the WSO2 login page.
Login page
After entering credentials, it gives the following error (AfterLogin Error):
HTTP Status 500 - org.apache.cxf.interceptor.Fault
type Exception report
message org.apache.cxf.interceptor.Fault
description The server encountered an internal error that prevented it from fulfilling this request.
exception
java.lang.RuntimeException: org.apache.cxf.interceptor.Fault
org.apache.cxf.interceptor.AbstractFaultChainInitiatorObserver.onMessage(AbstractFaultChainInitiatorObserver.java:116)
org.apache.cxf.phase.PhaseInterceptorChain.doIntercept(PhaseInterceptorChain.java:336)
org.apache.cxf.transport.ChainInitiationObserver.onMessage(ChainInitiationObserver.java:121)
org.apache.cxf.transport.http.AbstractHTTPDestination.invoke(AbstractHTTPDestination.java:249)
org.apache.cxf.transport.servlet.ServletController.invokeDestination(ServletController.java:248)
org.apache.cxf.transport.servlet.ServletController.invoke(ServletController.java:222)
org.apache.cxf.transport.servlet.ServletController.invoke(ServletController.java:153)
org.apache.cxf.transport.servlet.CXFNonSpringServlet.invoke(CXFNonSpringServlet.java:171)
org.apache.cxf.transport.servlet.AbstractHTTPServlet.handleRequest(AbstractHTTPServlet.java:289)
org.apache.cxf.transport.servlet.AbstractHTTPServlet.doPost(AbstractHTTPServlet.java:209)
javax.servlet.http.HttpServlet.service(HttpServlet.java:650)
org.apache.cxf.transport.servlet.AbstractHTTPServlet.service(AbstractHTTPServlet.java:265)
org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
org.apache.catalina.filters.HttpHeaderSecurityFilter.doFilter(HttpHeaderSecurityFilter.java:120)
root cause
org.apache.cxf.interceptor.Fault
org.apache.cxf.service.invoker.AbstractInvoker.createFault(AbstractInvoker.java:170)
org.apache.cxf.service.invoker.AbstractInvoker.invoke(AbstractInvoker.java:136)
org.apache.cxf.jaxrs.JAXRSInvoker.invoke(JAXRSInvoker.java:204)
org.apache.cxf.jaxrs.JAXRSInvoker.invoke(JAXRSInvoker.java:101)
org.apache.cxf.interceptor.ServiceInvokerInterceptor$1.run(ServiceInvokerInterceptor.java:58)
org.apache.cxf.interceptor.ServiceInvokerInterceptor.handleMessage(ServiceInvokerInterceptor.java:94)
org.apache.cxf.phase.PhaseInterceptorChain.doIntercept(PhaseInterceptorChain.java:272)
org.apache.cxf.transport.ChainInitiationObserver.onMessage(ChainInitiationObserver.java:121)
org.apache.cxf.transport.http.AbstractHTTPDestination.invoke(AbstractHTTPDestination.java:249)
org.apache.cxf.transport.servlet.ServletController.invokeDestination(ServletController.java:248)
org.apache.cxf.transport.servlet.ServletController.invoke(ServletController.java:222)
org.apache.cxf.transport.servlet.ServletController.invoke(ServletController.java:153)
org.apache.cxf.transport.servlet.CXFNonSpringServlet.invoke(CXFNonSpringServlet.java:171)
org.apache.cxf.transport.servlet.AbstractHTTPServlet.handleRequest(AbstractHTTPServlet.java:289)
org.apache.cxf.transport.servlet.AbstractHTTPServlet.doPost(AbstractHTTPServlet.java:209)
javax.servlet.http.HttpServlet.service(HttpServlet.java:650)
org.apache.cxf.transport.servlet.AbstractHTTPServlet.service(AbstractHTTPServlet.java:265)
org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
org.apache.catalina.filters.HttpHeaderSecurityFilter.doFilter(HttpHeaderSecurityFilter.java:120)
root cause
java.lang.NullPointerException
org.wso2.carbon.identity.oauth.endpoint.authz.OAuth2AuthzEndpoint.authorize(OAuth2AuthzEndpoint.java:251)
org.wso2.carbon.identity.oauth.endpoint.authz.OAuth2AuthzEndpoint.sendRequestToFramework(OAuth2AuthzEndpoint.java:1163)
org.wso2.carbon.identity.oauth.endpoint.authz.OAuth2AuthzEndpoint.authorize(OAuth2AuthzEndpoint.java:135)
org.wso2.carbon.identity.oauth.endpoint.authz.OAuth2AuthzEndpoint.authorizePost(OAuth2AuthzEndpoint.java:574)
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
java.lang.reflect.Method.invoke(Method.java:498)
org.apache.cxf.service.invoker.AbstractInvoker.performInvocation(AbstractInvoker.java:188)
org.apache.cxf.service.invoker.AbstractInvoker.invoke(AbstractInvoker.java:104)
org.apache.cxf.jaxrs.JAXRSInvoker.invoke(JAXRSInvoker.java:204)
org.apache.cxf.jaxrs.JAXRSInvoker.invoke(JAXRSInvoker.java:101)
org.apache.cxf.interceptor.ServiceInvokerInterceptor$1.run(ServiceInvokerInterceptor.java:58)
org.apache.cxf.interceptor.ServiceInvokerInterceptor.handleMessage(ServiceInvokerInterceptor.java:94)
org.apache.cxf.phase.PhaseInterceptorChain.doIntercept(PhaseInterceptorChain.java:272)
org.apache.cxf.transport.ChainInitiationObserver.onMessage(ChainInitiationObserver.java:121)
org.apache.cxf.transport.http.AbstractHTTPDestination.invoke(AbstractHTTPDestination.java:249)
org.apache.cxf.transport.servlet.ServletController.invokeDestination(ServletController.java:248)
org.apache.cxf.transport.servlet.ServletController.invoke(ServletController.java:222)
org.apache.cxf.transport.servlet.ServletController.invoke(ServletController.java:153)
org.apache.cxf.transport.servlet.CXFNonSpringServlet.invoke(CXFNonSpringServlet.java:171)
org.apache.cxf.transport.servlet.AbstractHTTPServlet.handleRequest(AbstractHTTPServlet.java:289)
org.apache.cxf.transport.servlet.AbstractHTTPServlet.doPost(AbstractHTTPServlet.java:209)
javax.servlet.http.HttpServlet.service(HttpServlet.java:650)
org.apache.cxf.transport.servlet.AbstractHTTPServlet.service(AbstractHTTPServlet.java:265)
org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
org.apache.catalina.filters.HttpHeaderSecurityFilter.doFilter(HttpHeaderSecurityFilter.java:120)
note The full stack trace of the root cause is available in the Apache Tomcat/7.0.73 logs.
Apache Tomcat/7.0.73
In the API Manager console it gives the following errors:
I.
ERROR - AMDefaultKeyManagerImpl Can not retrieve OAuth application for the given consumer key : PBMTE1piS4sKNdn2HdsJAbMeCQga
org.apache.axis2.AxisFault: Access Denied. Authentication failed - System error occurred. Please check server logs for more details.
at org.apache.axis2.util.Utils.getInboundFaultFromMessageContext(Utils.java:531)
II.
[2017-03-10 11:48:21,226] ERROR - item-info:jag org.wso2.carbon.apimgt.api.APIManagementException: Can not retrieve OAuth application for the given consumer key : PBMTE1piS4sKNdn2HdsJAbMeCQga
[2017-03-10 11:48:49,063] WARN - APIAuthenticationHandler API authentication failure due to Unclassified Authentication Failure
[2017-03-10 11:49:02,990] INFO - TimeoutHandler This engine will expire all callbacks after GLOBAL_TIMEOUT: 120 seconds, irrespective of the timeout action, after the specified or optional timeout
[2017-03-10 11:49:30,889] INFO - DependencyTracker Local entry : gov:/apimgt/statistics/ga-config.xml was added to the Synapse configuration successfully
[2017-03-10 11:49:31,028] INFO - JMSConnectionFactory JMS ConnectionFactory : jmsEventPublisher initialized
[2017-03-10 11:49:32,336] INFO - AMQConnection Unable to connect to broker at tcp://localhost:5672
org.wso2.andes.transport.TransportException: Could not open connection
at org.wso2.andes.transport.network.mina.MinaNetworkTransport$IoConnectorCreator.connect(MinaNetworkTransport.java:216)
at org.wso2.andes.transport.network.mina.MinaNetworkTransport.connect(MinaNetworkTransport.java:74)
at org.wso2.andes.client.AMQConnectionDelegate_8_0.makeBrokerConnection(AMQConnectionDelegate_8_0.java:130)
at org.wso2.andes.client.AMQConnection$2.run(AMQConnection.java:631)
at org.wso2.andes.client.AMQConnection$2.run(AMQConnection.java:628)
at java.security.AccessController.doPrivileged(Native Method)
at org.wso2.andes.client.AMQConnection.makeBrokerConnection(AMQConnection.java:628)
at org.wso2.andes.client.AMQConnection.<init>(AMQConnection.java:409)
at org.wso2.andes.client.AMQConnectionFactory.createConnection(AMQConnectionFactory.java:351)
III.
org.wso2.andes.AMQConnectionFailureException: Could not open connection
at org.wso2.andes.client.AMQConnection.<init>(AMQConnection.java:486)
at org.wso2.andes.client.AMQConnectionFactory.createConnection(AMQConnectionFactory.java:351)
IV.
Caused by: org.wso2.andes.transport.TransportException: Could not open connection
at org.wso2.andes.transport.network.mina.MinaNetworkTransport$IoConnectorCreator.connect(MinaNetworkTransport.java:216)
at org.wso2.andes.transport.network.mina.MinaNetworkTransport.connect(MinaNetworkTransport.java:74)
V.
[2017-03-10 11:49:32,345] ERROR - JMSConnectionFactory Error acquiring a Connection from the JMS CF : jmsEventPublisher using properties : {transport.jms.ConcurrentPublishers=allow, java.naming.provider.url=repository/conf/jndi.properties, java.naming.factory.initial=org.wso2.andes.jndi.PropertiesFileInitialContextFactory, transport.jms.DestinationType=topic, transport.jms.ConnectionFactoryJNDIName=TopicConnectionFactory, transport.jms.Destination=throttleData}
javax.jms.JMSException: Error creating connection: Could not open connection
at org.wso2.andes.client.AMQConnectionFactory.createConnection(AMQConnectionFactory.java:361)
at org.wso2.andes.client.AMQConnectionFactory.createConnection(AMQConnectionFactory.java:40)
VI.
[2017-03-10 11:49:32,350] ERROR - JMSConnectionFactory Error acquiring a Connection from the JMS CF : jmsEventPublisher using properties : {transport.jms.ConcurrentPublishers=allow, java.naming.provider.url=repository/conf/jndi.properties, java.naming.factory.initial=org.wso2.andes.jndi.PropertiesFileInitialContextFactory, transport.jms.DestinationType=topic, transport.jms.ConnectionFactoryJNDIName=TopicConnectionFactory, transport.jms.Destination=throttleData}
org.wso2.carbon.event.output.adapter.core.exception.OutputEventAdapterRuntimeException: Error acquiring a Connection from the JMS CF : jmsEventPublisher using properties : {transport.jms.ConcurrentPublishers=allow, java.naming.provider.url=repository/conf/jndi.properties, java.naming.factory.initial=org.wso2.andes.jndi.PropertiesFileInitialContextFactory, transport.jms.DestinationType=topic, transport.jms.ConnectionFactoryJNDIName=TopicConnectionFactory, transport.jms.Destination=throttleData}
at org.wso2.carbon.event.output.adapter.jms.internal.util.JMSConnectionFactory.handleException(JMSConnectionFactory.java:197)
Please help me to solve this issue.

Apache Drill Impersonation

I'm trying to build security into our Drill (1.6.0) system. I managed to get the user authentication to work (JPam, as explained in the documentation), but the impersonation does not seem to work. It seems to execute and fetch via the admin user regardless of who has logged in via ODBC.
My drill-override.conf file is configured as follows:
drill.exec: {
  cluster-id: "drillbits1",
  zk.connect: "localhost:2181",
  impersonation: {
    enabled: true,
    max_chained_user_hops: 3
  },
  security.user.auth {
    enabled: true,
    packages += "org.apache.drill.exec.rpc.user.security",
    impl: "pam",
    pam_profiles: [ "sudo", "login" ]
  }
}
We are also only using Drill on one server, so I'm running drill-embedded to start things up. Troubleshooting:
root#srv001:/opt/apache-drill-1.6.0# bin/sqlline -u "jdbc:drill:schema=dfs;zk=localhost:2181;impersonation_target=dUser001" -n entryUser -p entryUserPassword
Error: Failure in connecting to Drill: org.apache.drill.exec.rpc.RpcException: Failure setting up ZK for client. (state=,code=0)
java.sql.SQLException: Failure in connecting to Drill: org.apache.drill.exec.rpc.RpcException: Failure setting up ZK for client.
at org.apache.drill.jdbc.impl.DrillConnectionImpl.<init> (DrillConnectionImpl.java:159)
at org.apache.drill.jdbc.impl.DrillJdbc41Factory.newDrillConnection(DrillJdbc41Factory.java:64)
at org.apache.drill.jdbc.impl.DrillFactory.newConnection(DrillFactory.java:69)
at net.hydromatic.avatica.UnregisteredDriver.connect(UnregisteredDriver.java:126)
at org.apache.drill.jdbc.Driver.connect(Driver.java:72)
at sqlline.DatabaseConnection.connect(DatabaseConnection.java:167)
at sqlline.DatabaseConnection.getConnection(DatabaseConnection.java:213)
at sqlline.Commands.connect(Commands.java:1083)
at sqlline.Commands.connect(Commands.java:1015)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at sqlline.ReflectiveCommandHandler.execute(ReflectiveCommandHandler.java:36)
at sqlline.SqlLine.dispatch(SqlLine.java:742)
at sqlline.SqlLine.initArgs(SqlLine.java:528)
at sqlline.SqlLine.begin(SqlLine.java:596)
at sqlline.SqlLine.start(SqlLine.java:375)
at sqlline.SqlLine.main(SqlLine.java:268)
Caused by: org.apache.drill.exec.rpc.RpcException: Failure setting up ZK for client.
at org.apache.drill.exec.client.DrillClient.connect(DrillClient.java:200)
at org.apache.drill.jdbc.impl.DrillConnectionImpl.<init>(DrillConnectionImpl.java:151)
... 18 more
Caused by: java.io.IOException: Failure to connect to the zookeeper cluster service within the allotted time of 10000 milliseconds.
at org.apache.drill.exec.coord.zk.ZKClusterCoordinator.start(ZKClusterCoordinator.java:123)
at org.apache.drill.exec.client.DrillClient.connect(DrillClient.java:198)
... 19 more
Any ideas on this?
I have also looked at building my own security on top, but I'm not able to retrieve the username from a SQL query. I have tried the following without any luck:
CURRENT_USER()
USER()
SESSION_USER()
Any ideas on this approach?
I suggest creating a different PAM profile (say, drill) rather than using login and sudo.
Then create a drill file under the /etc/pam.d/ directory with the content:
#%PAM-1.0
auth include password-auth
account include password-auth
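Drill would then need to be pointed at that profile; a sketch of the corresponding change in the drill-override.conf shown above (only pam_profiles changes):

security.user.auth {
  enabled: true,
  packages += "org.apache.drill.exec.rpc.user.security",
  impl: "pam",
  pam_profiles: [ "drill" ]
}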
To get connections run:
select * from sys.connections;