Unable to connect to host with Apache Camel SFTP with public/private ssh keys - ssh

I'm facing a problem when trying to use Apache Camel to connect to an SFTP host controlled by a business partner. I have created an SSH public/private key pair, they have installed the public key on their server, and through both FileZilla and command-line sftp I am able to connect without any problems.
But when I try to connect with Apache Camel I receive an error: Auth fail for methods 'publickey,password'
I'm aware that there is a known issue with the JSch library in Camel, but I have upgraded to Camel 3.19 and, according to the dependency tree shown by './gradlew dependencies', I am using the 'mwiede' fork of JSch, version 0.2.1.
The SFTP server I am trying to connect to is apparently rather old, but I have no influence on that. When using the command-line sftp client it was necessary to pass the option '-oHostKeyAlgorithms=+ssh-dss', but after that it works without a problem.
I'm running it locally on macOS in IntelliJ, with Spring Boot 2.6.7 and Java 17.
The Camel route looks like this:
public void configure() throws Exception {
    // read the private key and bind it in the registry so the endpoint can reference it as #myPrivateKey
    String privateKeyString = Files.readString(Path.of("/Users/jaan/.ssh/id_rsa_cloud-integration_test"), StandardCharsets.UTF_8);
    getCamelContext().getRegistry().bind("myPrivateKey", privateKeyString.getBytes(StandardCharsets.UTF_8));

    from(aws2S3(bucketId + "?amazonS3Client=#s3Client" + awsGetObjectUriParams))
        .choice()
            .when(body().isNull())
                .log("Looking for files in S3 bucket - but found none")
            .otherwise()
                .log("Found file in S3 [${headers.CamelAwsS3Key}]")
                .process(exchange -> {
                    exchange.getIn().setHeader("CamelAwsS3BucketDestinationName", bucketId);
                    exchange.getIn().setHeader("CamelAwsS3DestinationKey", generateFileName(exchange));
                    log.info("Uploading file to S3 bucket [{}] and prefix [{}]", bucketId,
                            exchange.getIn().getHeader("CamelAwsS3DestinationKey"));
                })
                // copy the object within the bucket under the generated key, then upload it to the partner's SFTP server
                .to(aws2S3(bucketId + "?amazonS3Client=#s3Client&operation=copyObject"))
                .to(sftp(host + ":22/test?maximumReconnectAttempts=1")
                        .binary(true)
                        .privateKey("#myPrivateKey")
                        .username(sshUserName)
                        .jschLoggingLevel("TRACE")
                        .serverHostKeys("ssh-dss")
                        .knownHostsFile("/Users/jka/.ssh/known_hosts"))
        .end();
}
I have also tried simply pasting the SSH private key into the route as a string.
The stack trace I'm receiving is below:
:: Spring Boot :: (v2.6.7)
dk.ds.cargo.Application : Starting Application using Java 17.0.5 on COM1865 with PID 47585 (/Users/jka/workspace_git/bis-cargo-programblade/build/classes/java/main started by jka in /Users/jka/workspace_git/bis-cargo-programblade)
dk.ds.cargo.Application : Running with Spring Boot v2.6.7, Spring v5.3.19
dk.ds.cargo.Application : The following 1 profile is active: "local"
o.s.b.devtools.restart.ChangeableUrls : The Class-Path manifest attribute in /Users/jka/.m2/repository/com/sun/xml/bind/jaxb-core/2.3.0/jaxb-core-2.3.0.jar referenced one or more files that do not exist: file:/Users/jka/.m2/repository/com/sun/xml/bind/jaxb-core/2.3.0/jaxb-api.jar
.e.DevToolsPropertyDefaultsPostProcessor : Devtools property defaults active! Set 'spring.devtools.add-properties' to 'false' to disable
.e.DevToolsPropertyDefaultsPostProcessor : For additional web related logging consider setting the 'logging.level.web' property to 'DEBUG'
o.s.cloud.context.scope.GenericScope : BeanFactory id=5934d1b4-b141-3085-8f00-cedb8da5fbc5
o.s.b.w.embedded.tomcat.TomcatWebServer : Tomcat initialized with port(s): 8080 (http)
o.apache.catalina.core.StandardService : Starting service [Tomcat]
org.apache.catalina.core.StandardEngine : Starting Servlet engine: [Apache Tomcat/9.0.62]
o.a.c.c.C.[Tomcat].[localhost].[/] : Initializing Spring embedded WebApplicationContext
w.s.c.ServletWebServerApplicationContext : Root WebApplicationContext: initialization completed in 3519 ms
o.s.s.web.DefaultSecurityFilterChain : Will secure any request with [org.springframework.security.web.context.request.async.WebAsyncManagerIntegrationFilter#6587be01, org.springframework.security.web.context.SecurityContextPersistenceFilter#5943fb8e, org.springframework.security.web.header.HeaderWriterFilter#1182b1fe, org.springframework.security.web.csrf.CsrfFilter#47903918, org.springframework.security.web.authentication.logout.LogoutFilter#268e02b2, org.springframework.security.web.authentication.UsernamePasswordAuthenticationFilter#66a704a1, org.springframework.security.web.authentication.ui.DefaultLoginPageGeneratingFilter#4c442cf0, org.springframework.security.web.authentication.ui.DefaultLogoutPageGeneratingFilter#3a072250, org.springframework.security.web.savedrequest.RequestCacheAwareFilter#1bbe8c42, org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter#491c5377, org.springframework.security.web.authentication.AnonymousAuthenticationFilter#2100053f, org.springframework.security.web.session.SessionManagementFilter#7cca7c8d, org.springframework.security.web.access.ExceptionTranslationFilter#1a79bb88, org.springframework.security.web.access.intercept.FilterSecurityInterceptor#2297c946]
o.s.b.d.a.OptionalLiveReloadServer : LiveReload server is running on port 35729
o.s.b.a.e.web.EndpointLinksResolver : Exposing 2 endpoint(s) beneath base path '/monitor'
o.s.b.w.embedded.tomcat.TomcatWebServer : Tomcat started on port(s): 8080 (http) with context path ''
d.d.cargo.programblade.ProgrambladRoute : host <host ip adress>
d.d.cargo.programblade.ProgrambladRoute : userName <username>
.c.i.e.DefaultAutowiredLifecycleStrategy : Autowired property: amazonS3Client on component: aws2-s3 as exactly one instance of type: software.amazon.awssdk.services.s3.S3Client (software.amazon.awssdk.services.s3.DefaultS3Client) found in the registry
o.a.c.impl.engine.AbstractCamelContext : Apache Camel 3.19.0 (camel-1) is starting
o.a.c.impl.engine.AbstractCamelContext : Routes startup (started:1)
o.a.c.impl.engine.AbstractCamelContext : Started route1 (aws2-s3://<bucket ID>)
o.a.c.impl.engine.AbstractCamelContext : Apache Camel 3.19.0 (camel-1) started in 1s687ms (build:85ms init:777ms start:825ms)
dk.ds.cargo.Application : Started Application in 11.607 seconds (JVM running for 12.253)
dk.ds.cargo.Application : Spring application is ready to serve!
route1 : Found file in S3 [s3 bucket prefix]
d.d.cargo.programblade.ProgrambladRoute : Uploading file to S3 bucket [bucketID] and prefix [prefix]
o.a.c.c.file.remote.SftpOperations : JSCH -> Connecting to <host IP adress> port 22
o.a.c.c.file.remote.SftpOperations : JSCH -> Connection established
o.a.c.c.file.remote.SftpOperations : JSCH -> Remote version string: SSH-2.0-9.99 sshlib
o.a.c.c.file.remote.SftpOperations : JSCH -> Local version string: SSH-2.0-JSCH_0.2.1
o.a.c.c.file.remote.SftpOperations : JSCH -> CheckCiphers: chacha20-poly1305#openssh.com
o.a.c.c.file.remote.SftpOperations : JSCH -> CheckKexes: curve25519-sha256,curve25519-sha256#libssh.org,curve448-sha512
o.a.c.c.file.remote.SftpOperations : JSCH -> CheckSignatures: ssh-ed25519,ssh-ed448
o.a.c.c.file.remote.SftpOperations : JSCH -> SSH_MSG_KEXINIT sent
o.a.c.c.file.remote.SftpOperations : JSCH -> SSH_MSG_KEXINIT received
o.a.c.c.file.remote.SftpOperations : JSCH -> kex: server: diffie-hellman-group-exchange-sha256,diffie-hellman-group14-sha1,diffie-hellman-group-exchange-sha1,diffie-hellman-group1-sha1
o.a.c.c.file.remote.SftpOperations : JSCH -> kex: server: ssh-dss
o.a.c.c.file.remote.SftpOperations : JSCH -> kex: server: aes256-ctr,twofish256-ctr,twofish-ctr,aes128-ctr,twofish128-ctr,3des-ctr,cast128-ctr,aes256-cbc,twofish256-cbc,twofish-cbc,aes128-cbc,twofish128-cbc,blowfish-cbc,3des-cbc,arcfour,cast128-cbc
o.a.c.c.file.remote.SftpOperations : JSCH -> kex: server: aes256-ctr,twofish256-ctr,twofish-ctr,aes128-ctr,twofish128-ctr,3des-ctr,cast128-ctr,aes256-cbc,twofish256-cbc,twofish-cbc,aes128-cbc,twofish128-cbc,blowfish-cbc,3des-cbc,arcfour,cast128-cbc
o.a.c.c.file.remote.SftpOperations : JSCH -> kex: server: hmac-sha2-512,hmac-sha2-256,hmac-sha1,hmac-md5,hmac-sha1-96,hmac-md5-96
o.a.c.c.file.remote.SftpOperations : JSCH -> kex: server: hmac-sha2-512,hmac-sha2-256,hmac-sha1,hmac-md5,hmac-sha1-96,hmac-md5-96
o.a.c.c.file.remote.SftpOperations : JSCH -> kex: server: zlib,none
o.a.c.c.file.remote.SftpOperations : JSCH -> kex: server: zlib,none
o.a.c.c.file.remote.SftpOperations : JSCH -> kex: server:
o.a.c.c.file.remote.SftpOperations : JSCH -> kex: server:
o.a.c.c.file.remote.SftpOperations : JSCH -> kex: client: curve25519-sha256,curve25519-sha256#libssh.org,ecdh-sha2-nistp256,ecdh-sha2-nistp384,ecdh-sha2-nistp521,diffie-hellman-group-exchange-sha256,diffie-hellman-group16-sha512,diffie-hellman-group18-sha512,diffie-hellman-group14-sha256,ext-info-c
o.a.c.c.file.remote.SftpOperations : JSCH -> kex: client: ssh-dss
o.a.c.c.file.remote.SftpOperations : JSCH -> kex: client: aes128-ctr,aes192-ctr,aes256-ctr,aes128-gcm#openssh.com,aes256-gcm#openssh.com
o.a.c.c.file.remote.SftpOperations : JSCH -> kex: client: aes128-ctr,aes192-ctr,aes256-ctr,aes128-gcm#openssh.com,aes256-gcm#openssh.com
o.a.c.c.file.remote.SftpOperations : JSCH -> kex: client: hmac-sha2-256-etm#openssh.com,hmac-sha2-512-etm#openssh.com,hmac-sha1-etm#openssh.com,hmac-sha2-256,hmac-sha2-512,hmac-sha1
o.a.c.c.file.remote.SftpOperations : JSCH -> kex: client: hmac-sha2-256-etm#openssh.com,hmac-sha2-512-etm#openssh.com,hmac-sha1-etm#openssh.com,hmac-sha2-256,hmac-sha2-512,hmac-sha1
o.a.c.c.file.remote.SftpOperations : JSCH -> kex: client: none
o.a.c.c.file.remote.SftpOperations : JSCH -> kex: client: none
o.a.c.c.file.remote.SftpOperations : JSCH -> kex: client:
o.a.c.c.file.remote.SftpOperations : JSCH -> kex: client:
o.a.c.c.file.remote.SftpOperations : JSCH -> kex: algorithm: diffie-hellman-group-exchange-sha256
o.a.c.c.file.remote.SftpOperations : JSCH -> kex: host key algorithm: ssh-dss
o.a.c.c.file.remote.SftpOperations : JSCH -> kex: server->client cipher: aes128-ctr MAC: hmac-sha2-256 compression: none
o.a.c.c.file.remote.SftpOperations : JSCH -> kex: client->server cipher: aes128-ctr MAC: hmac-sha2-256 compression: none
o.a.c.c.file.remote.SftpOperations : JSCH -> SSH_MSG_KEX_DH_GEX_REQUEST(2048<3072<8192) sent
o.a.c.c.file.remote.SftpOperations : JSCH -> expecting SSH_MSG_KEX_DH_GEX_GROUP
o.a.c.c.file.remote.SftpOperations : JSCH -> SSH_MSG_KEX_DH_GEX_INIT sent
o.a.c.c.file.remote.SftpOperations : JSCH -> expecting SSH_MSG_KEX_DH_GEX_REPLY
o.a.c.c.file.remote.SftpOperations : JSCH -> ssh_dss_verify: signature true
o.a.c.c.file.remote.SftpOperations : JSCH -> Host '<IP adress>' is known and matches the DSA host key
o.a.c.c.file.remote.SftpOperations : JSCH -> SSH_MSG_NEWKEYS sent
o.a.c.c.file.remote.SftpOperations : JSCH -> SSH_MSG_NEWKEYS received
o.a.c.c.file.remote.SftpOperations : JSCH -> SSH_MSG_SERVICE_REQUEST sent
o.a.c.c.file.remote.SftpOperations : JSCH -> SSH_MSG_SERVICE_ACCEPT received
o.a.c.c.file.remote.SftpOperations : JSCH -> Authentications that can continue: publickey
o.a.c.c.file.remote.SftpOperations : JSCH -> Next authentication method: publickey
o.a.c.c.file.remote.SftpOperations : JSCH -> Disconnecting from <IP adress> port 22
o.a.c.c.file.remote.RemoteFileProducer : Writing file failed with: Cannot connect to sftp://<username>#<IP adress>:22
o.a.c.p.e.DefaultErrorHandler : Failed delivery for (MessageId: 1EFB2ABB1EFFD39-0000000000000000 on ExchangeId: 1EFB2ABB1EFFD39-0000000000000000). Exhausted after delivery attempt: 1 caught: org.apache.camel.component.file.GenericFileOperationFailedException: Cannot connect to sftp://<username>#<IP adress>:22
Message History (source location and message history is disabled)
---------------------------------------------------------------------------------------------------------------------------------------
Source ID Processor Elapsed (ms)
route1/route1 from[aws2-s3://ds-cloud-integration-test?amazonS3C 12845806
...
route1/to2 sftp://<IP adress>:22/test-folder?maximumReconnec 0
Stacktrace
---------------------------------------------------------------------------------------------------------------------------------------
org.apache.camel.component.file.GenericFileOperationFailedException: Cannot connect to sftp://<username>#<IP adress>:22
at org.apache.camel.component.file.remote.SftpOperations.connect(SftpOperations.java:137)
at org.apache.camel.component.file.remote.RemoteFileProducer.connectIfNecessary(RemoteFileProducer.java:184)
at org.apache.camel.component.file.remote.RemoteFileProducer.preWriteCheck(RemoteFileProducer.java:133)
at org.apache.camel.component.file.GenericFileProducer.processExchange(GenericFileProducer.java:113)
at org.apache.camel.component.file.remote.RemoteFileProducer.process(RemoteFileProducer.java:61)
at org.apache.camel.support.AsyncProcessorConverterHelper$ProcessorToAsyncProcessorBridge.process(AsyncProcessorConverterHelper.java:66)
at org.apache.camel.processor.SendProcessor.lambda$process$2(SendProcessor.java:191)
at org.apache.camel.support.cache.DefaultProducerCache.doInAsyncProducer(DefaultProducerCache.java:327)
at org.apache.camel.processor.SendProcessor.process(SendProcessor.java:190)
at org.apache.camel.processor.errorhandler.RedeliveryErrorHandler$SimpleTask.run(RedeliveryErrorHandler.java:477)
at org.apache.camel.impl.engine.DefaultReactiveExecutor$Worker.schedule(DefaultReactiveExecutor.java:181)
at org.apache.camel.impl.engine.DefaultReactiveExecutor.scheduleMain(DefaultReactiveExecutor.java:59)
at org.apache.camel.processor.Pipeline.process(Pipeline.java:175)
at org.apache.camel.impl.engine.CamelInternalProcessor.process(CamelInternalProcessor.java:392)
at org.apache.camel.component.aws2.s3.AWS2S3Consumer.processBatch(AWS2S3Consumer.java:300)
at org.apache.camel.component.aws2.s3.AWS2S3Consumer.poll(AWS2S3Consumer.java:175)
at org.apache.camel.support.ScheduledPollConsumer.doRun(ScheduledPollConsumer.java:202)
at org.apache.camel.support.ScheduledPollConsumer.run(ScheduledPollConsumer.java:116)
at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:539)
at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305)
at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
at java.base/java.lang.Thread.run(Thread.java:833)
Caused by: com.jcraft.jsch.JSchException: Auth fail for methods 'publickey,password'
at com.jcraft.jsch.Session.connect(Session.java:532)
at org.apache.camel.component.file.remote.SftpOperations.tryConnect(SftpOperations.java:160)
at org.apache.camel.support.task.ForegroundTask.run(ForegroundTask.java:92)
at org.apache.camel.component.file.remote.SftpOperations.connect(SftpOperations.java:135)
... 23 common frames omitted
2022-12-16 13:27:03.248 WARN o.a.c.component.aws2.s3.AWS2S3Consumer : Exchange failed, so rolling back message status: Exchange[1EFB2ABB1EFFD39-0000000000000000]
org.apache.camel.component.file.GenericFileOperationFailedException: Cannot connect to sftp://<username>#<IP adress>:22
at org.apache.camel.component.file.remote.SftpOperations.connect(SftpOperations.java:137)
at org.apache.camel.component.file.remote.RemoteFileProducer.connectIfNecessary(RemoteFileProducer.java:184)
at org.apache.camel.component.file.remote.RemoteFileProducer.preWriteCheck(RemoteFileProducer.java:133)
at org.apache.camel.component.file.GenericFileProducer.processExchange(GenericFileProducer.java:113)
at org.apache.camel.component.file.remote.RemoteFileProducer.process(RemoteFileProducer.java:61)
at org.apache.camel.support.AsyncProcessorConverterHelper$ProcessorToAsyncProcessorBridge.process(AsyncProcessorConverterHelper.java:66)
at org.apache.camel.processor.SendProcessor.lambda$process$2(SendProcessor.java:191)
at org.apache.camel.support.cache.DefaultProducerCache.doInAsyncProducer(DefaultProducerCache.java:327)
at org.apache.camel.processor.SendProcessor.process(SendProcessor.java:190)
at org.apache.camel.processor.errorhandler.RedeliveryErrorHandler$SimpleTask.run(RedeliveryErrorHandler.java:477)
at org.apache.camel.impl.engine.DefaultReactiveExecutor$Worker.schedule(DefaultReactiveExecutor.java:181)
at org.apache.camel.impl.engine.DefaultReactiveExecutor.scheduleMain(DefaultReactiveExecutor.java:59)
at org.apache.camel.processor.Pipeline.process(Pipeline.java:175)
at org.apache.camel.impl.engine.CamelInternalProcessor.process(CamelInternalProcessor.java:392)
at org.apache.camel.component.aws2.s3.AWS2S3Consumer.processBatch(AWS2S3Consumer.java:300)
at org.apache.camel.component.aws2.s3.AWS2S3Consumer.poll(AWS2S3Consumer.java:175)
at org.apache.camel.support.ScheduledPollConsumer.doRun(ScheduledPollConsumer.java:202)
at org.apache.camel.support.ScheduledPollConsumer.run(ScheduledPollConsumer.java:116)
at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:539)
at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305)
at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
at java.base/java.lang.Thread.run(Thread.java:833)
Caused by: com.jcraft.jsch.JSchException: Auth fail for methods 'publickey,password'
at com.jcraft.jsch.Session.connect(Session.java:532)
at org.apache.camel.component.file.remote.SftpOperations.tryConnect(SftpOperations.java:160)
at org.apache.camel.support.task.ForegroundTask.run(ForegroundTask.java:92)
at org.apache.camel.component.file.remote.SftpOperations.connect(SftpOperations.java:135)
... 23 common frames omitted
2022-12-16 13:27:03.249 WARN o.a.c.component.aws2.s3.AWS2S3Consumer : Error processing exchange. Exchange[1EFB2ABB1EFFD39-0000000000000000]. Caused by: [org.apache.camel.component.file.GenericFileOperationFailedException - Cannot connect to sftp://<username>#<IP adress>:22]
org.apache.camel.component.file.GenericFileOperationFailedException: Cannot connect to sftp://<username>#<IP adress>:22
at org.apache.camel.component.file.remote.SftpOperations.connect(SftpOperations.java:137)
at org.apache.camel.component.file.remote.RemoteFileProducer.connectIfNecessary(RemoteFileProducer.java:184)
at org.apache.camel.component.file.remote.RemoteFileProducer.preWriteCheck(RemoteFileProducer.java:133)
at org.apache.camel.component.file.GenericFileProducer.processExchange(GenericFileProducer.java:113)
at org.apache.camel.component.file.remote.RemoteFileProducer.process(RemoteFileProducer.java:61)
at org.apache.camel.support.AsyncProcessorConverterHelper$ProcessorToAsyncProcessorBridge.process(AsyncProcessorConverterHelper.java:66)
at org.apache.camel.processor.SendProcessor.lambda$process$2(SendProcessor.java:191)
at org.apache.camel.support.cache.DefaultProducerCache.doInAsyncProducer(DefaultProducerCache.java:327)
at org.apache.camel.processor.SendProcessor.process(SendProcessor.java:190)
at org.apache.camel.processor.errorhandler.RedeliveryErrorHandler$SimpleTask.run(RedeliveryErrorHandler.java:477)
at org.apache.camel.impl.engine.DefaultReactiveExecutor$Worker.schedule(DefaultReactiveExecutor.java:181)
at org.apache.camel.impl.engine.DefaultReactiveExecutor.scheduleMain(DefaultReactiveExecutor.java:59)
at org.apache.camel.processor.Pipeline.process(Pipeline.java:175)
at org.apache.camel.impl.engine.CamelInternalProcessor.process(CamelInternalProcessor.java:392)
at org.apache.camel.component.aws2.s3.AWS2S3Consumer.processBatch(AWS2S3Consumer.java:300)
at org.apache.camel.component.aws2.s3.AWS2S3Consumer.poll(AWS2S3Consumer.java:175)
at org.apache.camel.support.ScheduledPollConsumer.doRun(ScheduledPollConsumer.java:202)
at org.apache.camel.support.ScheduledPollConsumer.run(ScheduledPollConsumer.java:116)
at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:539)
at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305)
at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
at java.base/java.lang.Thread.run(Thread.java:833)
Caused by: com.jcraft.jsch.JSchException: Auth fail for methods 'publickey,password'
at com.jcraft.jsch.Session.connect(Session.java:532)
at org.apache.camel.component.file.remote.SftpOperations.tryConnect(SftpOperations.java:160)
at org.apache.camel.support.task.ForegroundTask.run(ForegroundTask.java:92)
at org.apache.camel.component.file.remote.SftpOperations.connect(SftpOperations.java:135)
... 23 common frames omitted
I hope I can get some help to make this work and avoid being forced to implement it in plain Java with an SFTP library.

In relation to this issue: Auth fail with JSch against libssh server with "rsa-sha2-512"
the solution was to set these two properties on the SFTP endpoint:
.serverHostKeys("ssh-dss")
.publicKeyAcceptedAlgorithms("ssh-rsa")
and then it worked.
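For reference, here is a minimal sketch of a producer endpoint with both options applied. It assumes the same endpoint DSL (EndpointRouteBuilder), the #myPrivateKey registry binding, and the known_hosts file from the question; the host and username values are placeholders, and the route is triggered from a hypothetical direct: endpoint instead of the S3 consumer.
import org.apache.camel.builder.endpoint.EndpointRouteBuilder;

public class SftpDssRoute extends EndpointRouteBuilder {

    // placeholders - use the real host/username from your configuration
    private final String host = "partner.example.com";
    private final String sshUserName = "partneruser";

    @Override
    public void configure() {
        from(direct("uploadToPartner"))
            .to(sftp(host + ":22/test?maximumReconnectAttempts=1")
                    .binary(true)
                    .privateKey("#myPrivateKey")
                    .username(sshUserName)
                    .knownHostsFile("/Users/jka/.ssh/known_hosts")
                    // accept the old server's DSA host key
                    .serverHostKeys("ssh-dss")
                    // sign publickey authentication with an algorithm the old server understands
                    .publicKeyAcceptedAlgorithms("ssh-rsa"));
    }
}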

Related

Zookeeper TLS error: Unsuccessful handshake with session 0x0 (org.apache.zookeeper.server.NettyServerCnxnFactory)

Can't start Zookeeper with TLS, help me please!
Zookeeper version: 3.5.8-f439ca583e70862c3068a1f2a7d4d068eec33315, built on 05/04/2020 15:53 GMT
zookeeper.properties:
###################[ MAIN ]###################
dataDir=~/zookeeper_ssl/data/zookeeper-data
clientPort=2185
secureClientPort=2186
maxClientCnxns=0
##############[ AUTHENTICATION ]##############
authProvider.sasl=org.apache.zookeeper.server.auth.SASLAuthenticationProvider # (tried changing to authProvider.1, but no success)
jaasLoginRenew=3600000
requireClientAuthScheme=sasl
#############[ SSL ]############
authProvider.x509=org.apache.zookeeper.server.auth.X509AuthenticationProvider # (tried to remove - but no success)
serverCnxnFactory=org.apache.zookeeper.server.NettyServerCnxnFactory
ssl.keyStore.location=~/zookeeper_ssl/ssl/broker1.jks
ssl.keyStore.password=xxx
ssl.trustStore.location=~/zookeeper_ssl/ssl/broker1.jks
ssl.trustStore.password=xxx
clientAuth=none
tickTime=3000
initLimit=10
syncLimit=5
##############[ OTHER CONFIGS ]#############
4lw.commands.whitelist=*
admin.enableServer=true
admin.serverPort=8181
It starts fine. Then I try to connect:
./bin/kafka-run-class \
> -Dzookeeper.clientCnxnSocket=org.apache.zookeeper.ClientCnxnSocketNetty \
> -Dzookeeper.ssl.client.enable=true \
> -Dzookeeper.ssl.keyStore.location=~/zookeeper_ssl/ssl/dev1.jks \
> -Dzookeeper.ssl.keyStore.password=xxx \
> -Dzookeeper.ssl.trustStore.location=~/zookeeper_ssl/ssl/dev1.jks \
> -Dzookeeper.ssl.trustStore.password=xxx \
> org.apache.zookeeper.ZooKeeperMain -server localhost:2186
I got:
Connecting to localhost:2186
Welcome to ZooKeeper!
JLine support is disabled
ACTUALLY NOTHING HAPPENS HERE - SO PRESSED CTRL+C
^C
zookeeper.log:
[2020-08-17 18:02:07,667] DEBUG Using Java8 optimized cipher suites for Java version 1.8 (org.apache.zookeeper.common.X509Util)
[2020-08-17 18:02:07,981] DEBUG Default protocols (JDK): [TLSv1.2, TLSv1.1, TLSv1] (io.netty.handler.ssl.JdkSslContext)
[2020-08-17 18:02:07,981] DEBUG Default cipher suites (JDK): [TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384, TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256, TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256, TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384, TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA, TLS_ECDHE_RSA_WITH_AES_256_CBC_SHA, TLS_RSA_WITH_AES_128_GCM_SHA256, TLS_RSA_WITH_AES_128_CBC_SHA, TLS_RSA_WITH_AES_256_CBC_SHA] (io.netty.handler.ssl.JdkSslContext)
[2020-08-17 18:02:08,104] DEBUG SSL handler added for channel: [id: 0x6bcbf86b, L:/x.x.x.x:2186 - R:/x.x.x.x:56620] (org.apache.zookeeper.server.NettyServerCnxnFactory)
[2020-08-17 18:02:08,123] DEBUG -Dio.netty.recycler.maxCapacityPerThread: 4096 (io.netty.util.Recycler)
[2020-08-17 18:02:08,123] DEBUG -Dio.netty.recycler.maxSharedCapacityFactor: 2 (io.netty.util.Recycler)
[2020-08-17 18:02:08,123] DEBUG -Dio.netty.recycler.linkCapacity: 16 (io.netty.util.Recycler)
[2020-08-17 18:02:08,123] DEBUG -Dio.netty.recycler.ratio: 8 (io.netty.util.Recycler)
[2020-08-17 18:02:08,133] DEBUG -Dio.netty.buffer.checkAccessible: true (io.netty.buffer.AbstractByteBuf)
[2020-08-17 18:02:08,133] DEBUG -Dio.netty.buffer.checkBounds: true (io.netty.buffer.AbstractByteBuf)
[2020-08-17 18:02:08,134] DEBUG Loaded default ResourceLeakDetector: io.netty.util.ResourceLeakDetector#3021f880 (io.netty.util.ResourceLeakDetectorFactory)
[2020-08-17 18:02:08,149] ERROR Unsuccessful handshake with session 0x0 (org.apache.zookeeper.server.NettyServerCnxnFactory)
[2020-08-17 18:02:08,149] DEBUG close called for sessionid:0x0 (org.apache.zookeeper.server.NettyServerCnxn)
[2020-08-17 18:02:08,149] DEBUG cnxns size:0 (org.apache.zookeeper.server.NettyServerCnxn)
[2020-08-17 18:02:08,153] WARN Exception caught (org.apache.zookeeper.server.NettyServerCnxnFactory)
io.netty.handler.codec.DecoderException: io.netty.handler.ssl.NotSslRecordException: not an SSL/TLS record: 0000002d000000000000000000000000000075300000000000000000000000100000000000000000000000000000000000
at io.netty.handler.codec.ByteToMessageDecoder.callDecode(ByteToMessageDecoder.java:468)
at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:276)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1410)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:919)
at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:163)
at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:714)
at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:650)
at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:576)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:493)
at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:989)
at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
at java.lang.Thread.run(Thread.java:745)
Caused by: io.netty.handler.ssl.NotSslRecordException: not an SSL/TLS record: 0000002d000000000000000000000000000075300000000000000000000000100000000000000000000000000000000000
at io.netty.handler.ssl.SslHandler.decodeJdkCompatible(SslHandler.java:1214)
at io.netty.handler.ssl.SslHandler.decode(SslHandler.java:1282)
at io.netty.handler.codec.ByteToMessageDecoder.decodeRemovalReentryProtection(ByteToMessageDecoder.java:498)
at io.netty.handler.codec.ByteToMessageDecoder.callDecode(ByteToMessageDecoder.java:437)
... 17 more
[2020-08-17 18:02:08,153] DEBUG Closing /x.x.x.x:56620[0](queued=0,recved=0,sent=0) (org.apache.zookeeper.server.NettyServerCnxnFactory)
[2020-08-17 18:02:08,153] DEBUG close called for sessionid:0x0 (org.apache.zookeeper.server.NettyServerCnxn)
[2020-08-17 18:02:08,153] DEBUG cnxns size:0 (org.apache.zookeeper.server.NettyServerCnxn)
Inside the JKS keystores:
keystore broker1.jks
Alias name: zserver
Entry type: PrivateKeyEntry
Owner: CN=zserver, C=RU
Alias name: dev1
Entry type: trustedCertEntry
Owner: CN=dev1, C=RU
keystore dev1.jks
Alias name: zserver
Entry type: trustedCertEntry
Owner: CN=zserver, C=RU
Alias name: dev1
Entry type: PrivateKeyEntry
Owner: CN=dev1, C=RU
I have found the problem; the correct parameter is:
-Dzookeeper.client.secure=true
(-Dzookeeper.ssl.client.enable=true is wrong)
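If you need the same thing from Java code instead of -D flags on the command line, the client TLS settings can also be set as system properties before the ZooKeeper handle is created. This is only a sketch under the assumption that the keystore paths, passwords, and the 2186 secure port from the question apply; adjust them to your environment (note that ~ is not expanded in Java, so use absolute paths).
import org.apache.zookeeper.ZooKeeper;

public class SecureZkClient {
    public static void main(String[] args) throws Exception {
        // the correct switch - not zookeeper.ssl.client.enable
        System.setProperty("zookeeper.client.secure", "true");
        // TLS on the client side requires the Netty connection socket
        System.setProperty("zookeeper.clientCnxnSocket", "org.apache.zookeeper.ClientCnxnSocketNetty");
        System.setProperty("zookeeper.ssl.keyStore.location", "/home/user/zookeeper_ssl/ssl/dev1.jks");
        System.setProperty("zookeeper.ssl.keyStore.password", "xxx");
        System.setProperty("zookeeper.ssl.trustStore.location", "/home/user/zookeeper_ssl/ssl/dev1.jks");
        System.setProperty("zookeeper.ssl.trustStore.password", "xxx");

        // connect to the secureClientPort from zookeeper.properties
        ZooKeeper zk = new ZooKeeper("localhost:2186", 3000, event -> { });
        System.out.println("connection state: " + zk.getState());
        zk.close();
    }
}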

Camel sftp - jsch auth fail

My Camel SFTP publickey connection fails with the auth fail error below.
I created id_rsa, id_rsa.pub, and a known_hosts file on the source system (tried with both Windows and Linux).
I copy-pasted the contents of id_rsa.pub into the target system's authorized_keys file.
Note: This is working fine with WinSCP, PuTTY, and local JCraft sample code.
This code was working fine a couple of months back. Suspecting JAR changes or a version conflict in the application, I checked but couldn't find anything.
I ran out of options to try. Please help.
Camel route address:
sftp://user@192.168.1.1:22/messages/out?preferredAuthentications=publicKey&privateKeyFile=C:/Users/user/.ssh/id_rsa&privateKeyPassphrase=&jschLoggingLevel=INFO
Error and JSCH log:
org.apache.camel.component.file.remote.SftpOperations - Using private keyfile: C:/Users/user/.ssh/id_rsa
org.apache.camel.component.file.remote.SftpOperations - Known host file not configured, using user known host file: C:\Users\user/.ssh/known_hosts
org.apache.camel.component.file.remote.SftpOperations - Using known hosts information from file: C:\Users\user/.ssh/known_hosts
org.apache.camel.component.file.remote.SftpOperations - Using StrickHostKeyChecking: no
org.apache.camel.component.file.remote.SftpOperations - Using PreferredAuthentications: publicKey
org.apache.camel.component.file.remote.SftpOperations - JSCH -> Connecting to 192.168.84.243 port 22
org.apache.camel.component.file.remote.SftpOperations - JSCH -> Connection established
org.apache.camel.component.file.remote.SftpOperations - JSCH -> Remote version string: SSH-2.0-OpenSSH_5.3
org.apache.camel.component.file.remote.SftpOperations - JSCH -> Local version string: SSH-2.0-JSCH-0.1.54
org.apache.camel.component.file.remote.SftpOperations - JSCH -> CheckCiphers: aes256-ctr,aes192-ctr,aes128-ctr,aes256-cbc,aes192-cbc,aes128-cbc,3des-ctr,arcfour,arcfour128,arcfour256
org.apache.camel.component.file.remote.SftpOperations - JSCH -> CheckKexes: diffie-hellman-group14-sha1,ecdh-sha2-nistp256,ecdh-sha2-nistp384,ecdh-sha2-nistp521
org.apache.camel.component.file.remote.SftpOperations - JSCH -> CheckSignatures: ecdsa-sha2-nistp256,ecdsa-sha2-nistp384,ecdsa-sha2-nistp521
org.apache.camel.component.file.remote.SftpOperations - JSCH -> SSH_MSG_KEXINIT sent
org.apache.camel.component.file.remote.SftpOperations - JSCH -> SSH_MSG_KEXINIT received
org.apache.camel.component.file.remote.SftpOperations - JSCH -> kex: server: diffie-hellman-group-exchange-sha256,diffie-hellman-group-exchange-sha1,diffie-hellman-group14-sha1,diffie-hellman-group1-sha1
org.apache.camel.component.file.remote.SftpOperations - JSCH -> kex: server: ssh-rsa,ssh-dss
org.apache.camel.component.file.remote.SftpOperations - JSCH -> kex: server: aes128-ctr,aes192-ctr,aes256-ctr,arcfour256,arcfour128,aes128-cbc,3des-cbc,blowfish-cbc,cast128-cbc,aes192-cbc,aes256-cbc,arcfour,rijndael-cbc#lysator.liu.se
org.apache.camel.component.file.remote.SftpOperations - JSCH -> kex: server: aes128-ctr,aes192-ctr,aes256-ctr,arcfour256,arcfour128,aes128-cbc,3des-cbc,blowfish-cbc,cast128-cbc,aes192-cbc,aes256-cbc,arcfour,rijndael-cbc#lysator.liu.se
org.apache.camel.component.file.remote.SftpOperations - JSCH -> kex: server: hmac-md5,hmac-sha1,umac-64#openssh.com,hmac-sha2-256,hmac-sha2-512,hmac-ripemd160,hmac-ripemd160#openssh.com,hmac-sha1-96,hmac-md5-96
org.apache.camel.component.file.remote.SftpOperations - JSCH -> kex: server: hmac-md5,hmac-sha1,umac-64#openssh.com,hmac-sha2-256,hmac-sha2-512,hmac-ripemd160,hmac-ripemd160#openssh.com,hmac-sha1-96,hmac-md5-96
org.apache.camel.component.file.remote.SftpOperations - JSCH -> kex: server: none,zlib#openssh.com
org.apache.camel.component.file.remote.SftpOperations - JSCH -> kex: server: none,zlib#openssh.com
org.apache.camel.component.file.remote.SftpOperations - JSCH -> kex: server:
org.apache.camel.component.file.remote.SftpOperations - JSCH -> kex: server:
org.apache.camel.component.file.remote.SftpOperations - JSCH -> kex: client: ecdh-sha2-nistp256,ecdh-sha2-nistp384,ecdh-sha2-nistp521,diffie-hellman-group14-sha1,diffie-hellman-group-exchange-sha256,diffie-hellman-group-exchange-sha1,diffie-hellman-group1-sha1
org.apache.camel.component.file.remote.SftpOperations - JSCH -> kex: client: ssh-rsa,ssh-dss,ecdsa-sha2-nistp256,ecdsa-sha2-nistp384,ecdsa-sha2-nistp521
org.apache.camel.component.file.remote.SftpOperations - JSCH -> kex: client: aes128-ctr,aes128-cbc,3des-ctr,3des-cbc,blowfish-cbc,aes192-ctr,aes192-cbc,aes256-ctr,aes256-cbc
org.apache.camel.component.file.remote.SftpOperations - JSCH -> kex: client: aes128-ctr,aes128-cbc,3des-ctr,3des-cbc,blowfish-cbc,aes192-ctr,aes192-cbc,aes256-ctr,aes256-cbc
org.apache.camel.component.file.remote.SftpOperations - JSCH -> kex: client: hmac-md5,hmac-sha1,hmac-sha2-256,hmac-sha1-96,hmac-md5-96
org.apache.camel.component.file.remote.SftpOperations - JSCH -> kex: client: hmac-md5,hmac-sha1,hmac-sha2-256,hmac-sha1-96,hmac-md5-96
org.apache.camel.component.file.remote.SftpOperations - JSCH -> kex: client: none
org.apache.camel.component.file.remote.SftpOperations - JSCH -> kex: client: none
org.apache.camel.component.file.remote.SftpOperations - JSCH -> kex: client:
org.apache.camel.component.file.remote.SftpOperations - JSCH -> kex: client:
org.apache.camel.component.file.remote.SftpOperations - JSCH -> kex: server->client aes128-ctr hmac-md5 none
org.apache.camel.component.file.remote.SftpOperations - JSCH -> kex: client->server aes128-ctr hmac-md5 none
org.apache.camel.component.file.remote.SftpOperations - JSCH -> SSH_MSG_KEXDH_INIT sent
org.apache.camel.component.file.remote.SftpOperations - JSCH -> expecting SSH_MSG_KEXDH_REPLY
org.apache.camel.component.file.remote.SftpOperations - JSCH -> ssh_rsa_verify: signature true
org.apache.camel.component.file.remote.SftpOperations - JSCH -> Host '192.168.1.1' is known and matches the RSA host key
org.apache.camel.component.file.remote.SftpOperations - JSCH -> SSH_MSG_NEWKEYS sent
org.apache.camel.component.file.remote.SftpOperations - JSCH -> SSH_MSG_NEWKEYS received
org.apache.camel.component.file.remote.SftpOperations - JSCH -> SSH_MSG_SERVICE_REQUEST sent
org.apache.camel.component.file.remote.SftpOperations - JSCH -> SSH_MSG_SERVICE_ACCEPT received
org.apache.camel.component.file.remote.SftpOperations - JSCH -> Disconnecting from 192.168.1.1 port 22
org.apache.camel.component.file.remote.RemoteFileProducer - Could not connect to: sftp://user#192.168.1.1:22/messages/out?jschLoggingLevel=INFO&preferredAuthentications=publicKey&privateKeyFile=C%3A%2FUsers%2Fuser%2F.ssh%2Fid_rsa&privateKeyPassphrase=xxxxxx. Will try to recover.
org.apache.camel.component.file.GenericFileOperationFailedException: Cannot connect to sftp://user#192.168.1.1:22
at org.apache.camel.component.file.remote.SftpOperations.connect(SftpOperations.java:146)
at org.apache.camel.component.file.remote.RemoteFileProducer.connectIfNecessary(RemoteFileProducer.java:214)
at org.apache.camel.component.file.remote.RemoteFileProducer.recoverableConnectIfNecessary(RemoteFileProducer.java:184)
at org.apache.camel.component.file.remote.RemoteFileProducer.preWriteCheck(RemoteFileProducer.java:133)
at org.apache.camel.component.file.GenericFileProducer.processExchange(GenericFileProducer.java:113)
at org.apache.camel.component.file.remote.RemoteFileProducer.process(RemoteFileProducer.java:58)
at org.apache.camel.util.AsyncProcessorConverterHelper$ProcessorToAsyncProcessorBridge.process(AsyncProcessorConverterHelper.java:61)
at org.apache.camel.processor.SendProcessor$2.doInAsyncProducer(SendProcessor.java:173)
at org.apache.camel.impl.ProducerCache.doInAsyncProducer(ProducerCache.java:436)
at org.apache.camel.processor.SendProcessor.process(SendProcessor.java:168)
at org.apache.camel.management.InstrumentationProcessor.process(InstrumentationProcessor.java:77)
at org.apache.camel.processor.RedeliveryErrorHandler.process(RedeliveryErrorHandler.java:542)
at org.apache.camel.processor.CamelInternalProcessor.process(CamelInternalProcessor.java:197)
at org.apache.camel.processor.Pipeline.process(Pipeline.java:120)
at org.apache.camel.processor.Pipeline.process(Pipeline.java:83)
at org.apache.camel.processor.CamelInternalProcessor.process(CamelInternalProcessor.java:197)
at org.apache.camel.component.file.GenericFileConsumer.processExchange(GenericFileConsumer.java:460)
at org.apache.camel.component.file.GenericFileConsumer.processBatch(GenericFileConsumer.java:227)
at org.apache.camel.component.file.GenericFileConsumer.poll(GenericFileConsumer.java:191)
at org.apache.camel.impl.ScheduledPollConsumer.doRun(ScheduledPollConsumer.java:175)
at org.apache.camel.impl.ScheduledPollConsumer.run(ScheduledPollConsumer.java:102)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: com.jcraft.jsch.JSchException: Auth fail
at com.jcraft.jsch.Session.connect(Session.java:519)
at org.apache.camel.component.file.remote.SftpOperations.connect(SftpOperations.java:118)
... 27 more
It was just that I had set preferredAuthentications to publicKey instead of the correct publickey. Just a capitalization issue, and it wasted 2 days. JSch didn't log a proper error...
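For completeness, a minimal sketch of the corrected URI in a route. The host, directory, and key file are the values from the question; the only functional change is the lower-case publickey value (the empty privateKeyPassphrase is omitted), and the file: target is a hypothetical local destination just to make the example complete.
import org.apache.camel.builder.RouteBuilder;

public class PartnerSftpRoute extends RouteBuilder {
    @Override
    public void configure() {
        from("sftp://user@192.168.1.1:22/messages/out"
                + "?preferredAuthentications=publickey"   // lower case - this was the fix
                + "&privateKeyFile=C:/Users/user/.ssh/id_rsa"
                + "&jschLoggingLevel=INFO")
            .log("Downloaded ${header.CamelFileName}")
            .to("file:target/inbox");                     // hypothetical local target
    }
}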

Not able to start RabbitMQ source in Flink 1.3.2

I want to start a RabbitMQ source and then a sink, but I am not able to perform the first step, i.e. starting the RabbitMQ source. The RabbitMQ server is running and I can see the dashboard as well.
My code is as below:
public class rabbitmq_source {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment envrionment = StreamExecutionEnvironment.getExecutionEnvironment();
        RMQConnectionConfig connectionConfig = new RMQConnectionConfig.Builder()
                .setHost("localhost")
                .setPort(50000)
                .setUserName("root")
                .setPassword("root")
                .setVirtualHost("/")
                .build();
        DataStream<String> stream = envrionment
                .addSource(new RMQSource<String>(
                        connectionConfig,        // config for the RabbitMQ connection
                        "queue",                 // name of the RabbitMQ queue to consume
                        new SimpleStringSchema()));
        stream.print();
        envrionment.execute();
    }
}
I am not sure what username and password I should set; should they be guest and guest? However, I am getting the following error:
java.lang.RuntimeException: Cannot create RMQ connection with queue at localhost
at org.apache.flink.streaming.connectors.rabbitmq.RMQSource.open(RMQSource.java:172)
at org.apache.flink.api.common.functions.util.FunctionUtils.openFunction(FunctionUtils.java:36)
at org.apache.flink.streaming.api.operators.AbstractUdfStreamOperator.open(AbstractUdfStreamOperator.java:111)
at org.apache.flink.streaming.runtime.tasks.StreamTask.openAllOperators(StreamTask.java:376)
at org.apache.flink.streaming.runtime.tasks.StreamTask.invoke(StreamTask.java:253)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:702)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.net.ConnectException: Connection refused (Connection refused)
at java.net.PlainSocketImpl.socketConnect(Native Method)
at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)
at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
at java.net.Socket.connect(Socket.java:589)
at com.rabbitmq.client.impl.FrameHandlerFactory.create(FrameHandlerFactory.java:32)
at com.rabbitmq.client.ConnectionFactory.newConnection(ConnectionFactory.java:588)
at com.rabbitmq.client.ConnectionFactory.newConnection(ConnectionFactory.java:612)
Use LocalStreamEnvironment.createLocalEnvironment() with guest username and password.
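A sketch of what that suggestion looks like in code, not a drop-in fix: it uses a local execution environment and the default guest/guest account, and it also assumes RabbitMQ is listening on its default AMQP port 5672 (the "Connection refused" in the stack trace suggests nothing is listening on port 50000).
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.rabbitmq.RMQSource;
import org.apache.flink.streaming.connectors.rabbitmq.common.RMQConnectionConfig;
import org.apache.flink.streaming.util.serialization.SimpleStringSchema;

public class RabbitmqSourceLocal {
    public static void main(String[] args) throws Exception {
        // local environment, as suggested above
        StreamExecutionEnvironment env = StreamExecutionEnvironment.createLocalEnvironment();

        RMQConnectionConfig connectionConfig = new RMQConnectionConfig.Builder()
                .setHost("localhost")
                .setPort(5672)             // AMQP port, not the management dashboard port
                .setUserName("guest")      // default RabbitMQ account (localhost only)
                .setPassword("guest")
                .setVirtualHost("/")
                .build();

        DataStream<String> stream = env.addSource(
                new RMQSource<>(connectionConfig, "queue", new SimpleStringSchema()));
        stream.print();
        env.execute("rabbitmq-source");
    }
}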

Run Ansible single command via ssh

I have a server with Ansible. When I execute a single command from the shell on the Ansible machine itself, everything works fine.
Example:
omazilov@ansible:~$ ansible all -m ping
192.168.1.10 | success >> {
"changed": false,
"ping": "pong"
}
192.168.1.11 | success >> {
"changed": false,
"ping": "pong"
}
But when I try to run the command over SSH from my local machine, I get an error.
omazilov@local:~$ ssh omazilov@ansible.local "ansible all -m ping -vvvv"
omazilov@ansible.local's password:
192.168.1.10 | FAILED => SSH encountered an unknown error. The output was:
OpenSSH_5.9p1 Debian-5ubuntu1.4, OpenSSL 1.0.1 14 Mar 2012
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: /etc/ssh/ssh_config line 19: Applying options for *
debug1: auto-mux: Trying existing master
debug1: Control socket "/home/omazilov/.ansible/cp/ansible-ssh-192.168.1.10-22-omazilov" does not exist
debug2: ssh_connect: needpriv 0
debug1: Connecting to 192.168.1.10 [192.168.1.10] port 22.
debug2: fd 3 setting O_NONBLOCK
debug1: fd 3 clearing O_NONBLOCK
debug1: Connection established.
debug3: timeout: 9932 ms remain after connect
debug3: Incorrect RSA1 identifier
debug3: Could not load "/home/omazilov/.ssh/id_rsa" as a RSA1 public key
debug1: identity file /home/omazilov/.ssh/id_rsa type 1
debug1: Checking blacklist file /usr/share/ssh/blacklist.RSA-2048
debug1: Checking blacklist file /etc/ssh/blacklist.RSA-2048
debug1: identity file /home/omazilov/.ssh/id_rsa-cert type -1
debug1: identity file /home/omazilov/.ssh/id_dsa type -1
debug1: identity file /home/omazilov/.ssh/id_dsa-cert type -1
debug1: identity file /home/omazilov/.ssh/id_ecdsa type -1
debug1: identity file /home/omazilov/.ssh/id_ecdsa-cert type -1
debug1: Remote protocol version 2.0, remote software version OpenSSH_5.9p1 Debian-5ubuntu1.1
debug1: match: OpenSSH_5.9p1 Debian-5ubuntu1.1 pat OpenSSH*
debug1: Enabling compatibility mode for protocol 2.0
debug1: Local version string SSH-2.0-OpenSSH_5.9p1 Debian-5ubuntu1.4
debug2: fd 3 setting O_NONBLOCK
debug3: load_hostkeys: loading entries for host "192.168.1.10" from file "/home/omazilov/.ssh/known_hosts"
debug3: load_hostkeys: found key type ECDSA in file /home/omazilov/.ssh/known_hosts:752
debug3: load_hostkeys: loaded 1 keys
debug3: order_hostkeyalgs: prefer hostkeyalgs: ecdsa-sha2-nistp256-cert-v01#openssh.com,ecdsa-sha2-nistp384-cert-v01#openssh.com,ecdsa-sha2-nistp521-cert-v01#openssh.com,ecdsa-sha2-nistp256,ecdsa-sha2-nistp384,ecdsa-sha2-nistp521
debug1: SSH2_MSG_KEXINIT sent
debug1: SSH2_MSG_KEXINIT received
debug2: kex_parse_kexinit: ecdh-sha2-nistp256,ecdh-sha2-nistp384,ecdh-sha2-nistp521,diffie-hellman-group-exchange-sha256,diffie-hellman-group-exchange-sha1,diffie-hellman-group14-sha1,diffie-hellman-group1-sha1
debug2: kex_parse_kexinit: ecdsa-sha2-nistp256-cert-v01#openssh.com,ecdsa-sha2-nistp384-cert-v01#openssh.com,ecdsa-sha2-nistp521-cert-v01#openssh.com,ecdsa-sha2-nistp256,ecdsa-sha2-nistp384,ecdsa-sha2-nistp521,ssh-rsa-cert-v01#openssh.com,ssh-dss-cert-v01#openssh.com,ssh-rsa-cert-v00#openssh.com,ssh-dss-cert-v00#openssh.com,ssh-rsa,ssh-dss
debug2: kex_parse_kexinit: aes128-ctr,aes192-ctr,aes256-ctr,arcfour256,arcfour128,aes128-cbc,3des-cbc,blowfish-cbc,cast128-cbc,aes192-cbc,aes256-cbc,arcfour,rijndael-cbc#lysator.liu.se
debug2: kex_parse_kexinit: aes128-ctr,aes192-ctr,aes256-ctr,arcfour256,arcfour128,aes128-cbc,3des-cbc,blowfish-cbc,cast128-cbc,aes192-cbc,aes256-cbc,arcfour,rijndael-cbc#lysator.liu.se
debug2: kex_parse_kexinit: hmac-md5,hmac-sha1,umac-64#openssh.com,hmac-sha2-256,hmac-sha2-256-96,hmac-sha2-512,hmac-sha2-512-96,hmac-ripemd160,hmac-ripemd160#openssh.com,hmac-sha1-96,hmac-md5-96
debug2: kex_parse_kexinit: hmac-md5,hmac-sha1,umac-64#openssh.com,hmac-sha2-256,hmac-sha2-256-96,hmac-sha2-512,hmac-sha2-512-96,hmac-ripemd160,hmac-ripemd160#openssh.com,hmac-sha1-96,hmac-md5-96
debug2: kex_parse_kexinit: zlib#openssh.com,zlib,none
debug2: kex_parse_kexinit: zlib#openssh.com,zlib,none
debug2: kex_parse_kexinit:
debug2: kex_parse_kexinit:
debug2: kex_parse_kexinit: first_kex_follows 0
debug2: kex_parse_kexinit: reserved 0
debug2: kex_parse_kexinit: ecdh-sha2-nistp256,ecdh-sha2-nistp384,ecdh-sha2-nistp521,diffie-hellman-group-exchange-sha256,diffie-hellman-group-exchange-sha1,diffie-hellman-group14-sha1,diffie-hellman-group1-sha1
debug2: kex_parse_kexinit: ssh-rsa,ssh-dss,ecdsa-sha2-nistp256
debug2: kex_parse_kexinit: aes128-ctr,aes192-ctr,aes256-ctr,arcfour256,arcfour128,aes128-cbc,3des-cbc,blowfish-cbc,cast128-cbc,aes192-cbc,aes256-cbc,arcfour,rijndael-cbc#lysator.liu.se
debug2: kex_parse_kexinit: aes128-ctr,aes192-ctr,aes256-ctr,arcfour256,arcfour128,aes128-cbc,3des-cbc,blowfish-cbc,cast128-cbc,aes192-cbc,aes256-cbc,arcfour,rijndael-cbc#lysator.liu.se
debug2: kex_parse_kexinit: hmac-md5,hmac-sha1,umac-64#openssh.com,hmac-sha2-256,hmac-sha2-256-96,hmac-sha2-512,hmac-sha2-512-96,hmac-ripemd160,hmac-ripemd160#openssh.com,hmac-sha1-96,hmac-md5-96
debug2: kex_parse_kexinit: hmac-md5,hmac-sha1,umac-64#openssh.com,hmac-sha2-256,hmac-sha2-256-96,hmac-sha2-512,hmac-sha2-512-96,hmac-ripemd160,hmac-ripemd160#openssh.com,hmac-sha1-96,hmac-md5-96
debug2: kex_parse_kexinit: none,zlib#openssh.com
debug2: kex_parse_kexinit: none,zlib#openssh.com
debug2: kex_parse_kexinit:
debug2: kex_parse_kexinit:
debug2: kex_parse_kexinit: first_kex_follows 0
debug2: kex_parse_kexinit: reserved 0
debug2: mac_setup: found hmac-md5
debug1: kex: server->client aes128-ctr hmac-md5 zlib#openssh.com
debug2: mac_setup: found hmac-md5
debug1: kex: client->server aes128-ctr hmac-md5 zlib#openssh.com
debug1: sending SSH2_MSG_KEX_ECDH_INIT
debug1: expecting SSH2_MSG_KEX_ECDH_REPLY
debug1: Server host key: ECDSA 4f:ae:27:26:3d:8e:22:0d:e0:95:ca:0c:19:17:47:37
debug3: load_hostkeys: loading entries for host "192.168.1.10" from file "/home/omazilov/.ssh/known_hosts"
debug3: load_hostkeys: found key type ECDSA in file /home/omazilov/.ssh/known_hosts:752
debug3: load_hostkeys: loaded 1 keys
debug1: Host '192.168.1.10' is known and matches the ECDSA host key.
debug1: Found key in /home/omazilov/.ssh/known_hosts:752
debug1: ssh_ecdsa_verify: signature correct
debug2: kex_derive_keys
debug2: set_newkeys: mode 1
debug1: SSH2_MSG_NEWKEYS sent
debug1: expecting SSH2_MSG_NEWKEYS
debug2: set_newkeys: mode 0
debug1: SSH2_MSG_NEWKEYS received
debug1: Roaming not allowed by server
debug1: SSH2_MSG_SERVICE_REQUEST sent
debug2: service_accept: ssh-userauth
debug1: SSH2_MSG_SERVICE_ACCEPT received
debug2: key: /home/omazilov/.ssh/id_rsa (0x7f3d62750d70)
debug2: key: /home/omazilov/.ssh/id_dsa ((nil))
debug2: key: /home/omazilov/.ssh/id_ecdsa ((nil))
debug1: Authentications that can continue: publickey,gssapi-keyex,gssapi-with-mic,password
debug3: start over, passed a different list publickey,gssapi-keyex,gssapi-with-mic,password
debug3: preferred gssapi-with-mic,gssapi-keyex,hostbased,publickey
debug3: authmethod_lookup gssapi-with-mic
debug3: remaining preferred: gssapi-keyex,hostbased,publickey
debug3: authmethod_is_enabled gssapi-with-mic
debug1: Next authentication method: gssapi-with-mic
debug1: Unspecified GSS failure. Minor code may provide more information
Cannot determine realm for numeric host address
debug1: Unspecified GSS failure. Minor code may provide more information
Cannot determine realm for numeric host address
debug1: Unspecified GSS failure. Minor code may provide more information
debug1: Unspecified GSS failure. Minor code may provide more information
Cannot determine realm for numeric host address
debug2: we did not send a packet, disable method
debug3: authmethod_lookup gssapi-keyex
debug3: remaining preferred: hostbased,publickey
debug3: authmethod_is_enabled gssapi-keyex
debug1: Next authentication method: gssapi-keyex
debug1: No valid Key exchange context
debug2: we did not send a packet, disable method
debug3: authmethod_lookup publickey
debug3: remaining preferred: ,publickey
debug3: authmethod_is_enabled publickey
debug1: Next authentication method: publickey
debug1: Offering RSA public key: /home/omazilov/.ssh/id_rsa
debug3: send_pubkey_test
debug2: we sent a publickey packet, wait for reply
debug1: Server accepts key: pkalg ssh-rsa blen 279
debug2: input_userauth_pk_ok: fp c8:7b:ed:81:6d:83:d8:9b:55:7b:a7:3d:5c:53:a8:a5
debug3: sign_and_send_pubkey: RSA c8:7b:ed:81:6d:83:d8:9b:55:7b:a7:3d:5c:53:a8:a5
debug1: key_parse_private_pem: PEM_read_PrivateKey failed
debug1: read PEM private key done: type <unknown>
debug1: read_passphrase: can't open /dev/tty: No such device or address
debug2: no passphrase given, try next key
debug1: Trying private key: /home/omazilov/.ssh/id_dsa
debug3: no such identity: /home/omazilov/.ssh/id_dsa
debug1: Trying private key: /home/omazilov/.ssh/id_ecdsa
debug3: no such identity: /home/omazilov/.ssh/id_ecdsa
debug2: we did not send a packet, disable method
debug1: No more authentication methods to try.
Permission denied (publickey,gssapi-keyex,gssapi-with-mic,password).
192.168.1.11 | FAILED => SSH encountered an unknown error. The output was:
OpenSSH_5.9p1 Debian-5ubuntu1.4, OpenSSL 1.0.1 14 Mar 2012
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: /etc/ssh/ssh_config line 19: Applying options for *
debug1: auto-mux: Trying existing master
debug1: Control socket "/home/omazilov/.ansible/cp/ansible-ssh-192.168.1.11-22-omazilov" does not exist
debug2: ssh_connect: needpriv 0
debug1: Connecting to 192.168.1.11 [192.168.1.11] port 22.
debug2: fd 3 setting O_NONBLOCK
debug1: fd 3 clearing O_NONBLOCK
debug1: Connection established.
debug3: timeout: 9932 ms remain after connect
debug3: Incorrect RSA1 identifier
debug3: Could not load "/home/omazilov/.ssh/id_rsa" as a RSA1 public key
debug1: identity file /home/omazilov/.ssh/id_rsa type 1
debug1: Checking blacklist file /usr/share/ssh/blacklist.RSA-2048
debug1: Checking blacklist file /etc/ssh/blacklist.RSA-2048
debug1: identity file /home/omazilov/.ssh/id_rsa-cert type -1
debug1: identity file /home/omazilov/.ssh/id_dsa type -1
debug1: identity file /home/omazilov/.ssh/id_dsa-cert type -1
debug1: identity file /home/omazilov/.ssh/id_ecdsa type -1
debug1: identity file /home/omazilov/.ssh/id_ecdsa-cert type -1
debug1: Remote protocol version 2.0, remote software version OpenSSH_5.9p1 Debian-5ubuntu1.1
debug1: match: OpenSSH_5.9p1 Debian-5ubuntu1.1 pat OpenSSH*
debug1: Enabling compatibility mode for protocol 2.0
debug1: Local version string SSH-2.0-OpenSSH_5.9p1 Debian-5ubuntu1.4
debug2: fd 3 setting O_NONBLOCK
debug3: load_hostkeys: loading entries for host "192.168.1.11" from file "/home/omazilov/.ssh/known_hosts"
debug3: load_hostkeys: found key type ECDSA in file /home/omazilov/.ssh/known_hosts:988
debug3: load_hostkeys: loaded 1 keys
debug3: order_hostkeyalgs: prefer hostkeyalgs: ecdsa-sha2-nistp256-cert-v01#openssh.com,ecdsa-sha2-nistp384-cert-v01#openssh.com,ecdsa-sha2-nistp521-cert-v01#openssh.com,ecdsa-sha2-nistp256,ecdsa-sha2-nistp384,ecdsa-sha2-nistp521
debug1: SSH2_MSG_KEXINIT sent
debug1: SSH2_MSG_KEXINIT received
debug2: kex_parse_kexinit: ecdh-sha2-nistp256,ecdh-sha2-nistp384,ecdh-sha2-nistp521,diffie-hellman-group-exchange-sha256,diffie-hellman-group-exchange-sha1,diffie-hellman-group14-sha1,diffie-hellman-group1-sha1
debug2: kex_parse_kexinit: ecdsa-sha2-nistp256-cert-v01#openssh.com,ecdsa-sha2-nistp384-cert-v01#openssh.com,ecdsa-sha2-nistp521-cert-v01#openssh.com,ecdsa-sha2-nistp256,ecdsa-sha2-nistp384,ecdsa-sha2-nistp521,ssh-rsa-cert-v01#openssh.com,ssh-dss-cert-v01#openssh.com,ssh-rsa-cert-v00#openssh.com,ssh-dss-cert-v00#openssh.com,ssh-rsa,ssh-dss
debug2: kex_parse_kexinit: aes128-ctr,aes192-ctr,aes256-ctr,arcfour256,arcfour128,aes128-cbc,3des-cbc,blowfish-cbc,cast128-cbc,aes192-cbc,aes256-cbc,arcfour,rijndael-cbc#lysator.liu.se
debug2: kex_parse_kexinit: aes128-ctr,aes192-ctr,aes256-ctr,arcfour256,arcfour128,aes128-cbc,3des-cbc,blowfish-cbc,cast128-cbc,aes192-cbc,aes256-cbc,arcfour,rijndael-cbc#lysator.liu.se
debug2: kex_parse_kexinit: hmac-md5,hmac-sha1,umac-64#openssh.com,hmac-sha2-256,hmac-sha2-256-96,hmac-sha2-512,hmac-sha2-512-96,hmac-ripemd160,hmac-ripemd160#openssh.com,hmac-sha1-96,hmac-md5-96
debug2: kex_parse_kexinit: hmac-md5,hmac-sha1,umac-64#openssh.com,hmac-sha2-256,hmac-sha2-256-96,hmac-sha2-512,hmac-sha2-512-96,hmac-ripemd160,hmac-ripemd160#openssh.com,hmac-sha1-96,hmac-md5-96
debug2: kex_parse_kexinit: zlib#openssh.com,zlib,none
debug2: kex_parse_kexinit: zlib#openssh.com,zlib,none
debug2: kex_parse_kexinit:
debug2: kex_parse_kexinit:
debug2: kex_parse_kexinit: first_kex_follows 0
debug2: kex_parse_kexinit: reserved 0
debug2: kex_parse_kexinit: ecdh-sha2-nistp256,ecdh-sha2-nistp384,ecdh-sha2-nistp521,diffie-hellman-group-exchange-sha256,diffie-hellman-group-exchange-sha1,diffie-hellman-group14-sha1,diffie-hellman-group1-sha1
debug2: kex_parse_kexinit: ssh-rsa,ssh-dss,ecdsa-sha2-nistp256
debug2: kex_parse_kexinit: aes128-ctr,aes192-ctr,aes256-ctr,arcfour256,arcfour128,aes128-cbc,3des-cbc,blowfish-cbc,cast128-cbc,aes192-cbc,aes256-cbc,arcfour,rijndael-cbc#lysator.liu.se
debug2: kex_parse_kexinit: aes128-ctr,aes192-ctr,aes256-ctr,arcfour256,arcfour128,aes128-cbc,3des-cbc,blowfish-cbc,cast128-cbc,aes192-cbc,aes256-cbc,arcfour,rijndael-cbc#lysator.liu.se
debug2: kex_parse_kexinit: hmac-md5,hmac-sha1,umac-64#openssh.com,hmac-sha2-256,hmac-sha2-256-96,hmac-sha2-512,hmac-sha2-512-96,hmac-ripemd160,hmac-ripemd160#openssh.com,hmac-sha1-96,hmac-md5-96
debug2: kex_parse_kexinit: hmac-md5,hmac-sha1,umac-64#openssh.com,hmac-sha2-256,hmac-sha2-256-96,hmac-sha2-512,hmac-sha2-512-96,hmac-ripemd160,hmac-ripemd160#openssh.com,hmac-sha1-96,hmac-md5-96
debug2: kex_parse_kexinit: none,zlib#openssh.com
debug2: kex_parse_kexinit: none,zlib#openssh.com
debug2: kex_parse_kexinit:
debug2: kex_parse_kexinit:
debug2: kex_parse_kexinit: first_kex_follows 0
debug2: kex_parse_kexinit: reserved 0
debug2: mac_setup: found hmac-md5
debug1: kex: server->client aes128-ctr hmac-md5 zlib#openssh.com
debug2: mac_setup: found hmac-md5
debug1: kex: client->server aes128-ctr hmac-md5 zlib#openssh.com
debug1: sending SSH2_MSG_KEX_ECDH_INIT
debug1: expecting SSH2_MSG_KEX_ECDH_REPLY
debug1: Server host key: ECDSA 4f:ae:27:26:3d:8e:22:0d:e0:95:ca:0c:19:17:47:37
debug3: load_hostkeys: loading entries for host "192.168.1.10" from file "/home/omazilov/.ssh/known_hosts"
debug3: load_hostkeys: found key type ECDSA in file /home/omazilov/.ssh/known_hosts:988
debug3: load_hostkeys: loaded 1 keys
debug1: Host '192.168.1.10"' is known and matches the ECDSA host key.
debug1: Found key in /home/omazilov/.ssh/known_hosts:988
debug1: ssh_ecdsa_verify: signature correct
debug2: kex_derive_keys
debug2: set_newkeys: mode 1
debug1: SSH2_MSG_NEWKEYS sent
debug1: expecting SSH2_MSG_NEWKEYS
debug2: set_newkeys: mode 0
debug1: SSH2_MSG_NEWKEYS received
debug1: Roaming not allowed by server
debug1: SSH2_MSG_SERVICE_REQUEST sent
debug2: service_accept: ssh-userauth
debug1: SSH2_MSG_SERVICE_ACCEPT received
debug2: key: /home/omazilov/.ssh/id_rsa (0x7f6409faad70)
debug2: key: /home/omazilov/.ssh/id_dsa ((nil))
debug2: key: /home/omazilov/.ssh/id_ecdsa ((nil))
debug1: Authentications that can continue: publickey,gssapi-keyex,gssapi-with-mic,password
debug3: start over, passed a different list publickey,gssapi-keyex,gssapi-with-mic,password
debug3: preferred gssapi-with-mic,gssapi-keyex,hostbased,publickey
debug3: authmethod_lookup gssapi-with-mic
debug3: remaining preferred: gssapi-keyex,hostbased,publickey
debug3: authmethod_is_enabled gssapi-with-mic
debug1: Next authentication method: gssapi-with-mic
debug1: Unspecified GSS failure. Minor code may provide more information
Cannot determine realm for numeric host address
debug1: Unspecified GSS failure. Minor code may provide more information
Cannot determine realm for numeric host address
debug1: Unspecified GSS failure. Minor code may provide more information
debug1: Unspecified GSS failure. Minor code may provide more information
Cannot determine realm for numeric host address
debug2: we did not send a packet, disable method
debug3: authmethod_lookup gssapi-keyex
debug3: remaining preferred: hostbased,publickey
debug3: authmethod_is_enabled gssapi-keyex
debug1: Next authentication method: gssapi-keyex
debug1: No valid Key exchange context
debug2: we did not send a packet, disable method
debug3: authmethod_lookup publickey
debug3: remaining preferred: ,publickey
debug3: authmethod_is_enabled publickey
debug1: Next authentication method: publickey
debug1: Offering RSA public key: /home/omazilov/.ssh/id_rsa
debug3: send_pubkey_test
debug2: we sent a publickey packet, wait for reply
debug1: Server accepts key: pkalg ssh-rsa blen 279
debug2: input_userauth_pk_ok: fp c8:7b:ed:81:6d:83:d8:9b:55:7b:a7:3d:5c:53:a8:a5
debug3: sign_and_send_pubkey: RSA c8:7b:ed:81:6d:83:d8:9b:55:7b:a7:3d:5c:53:a8:a5
debug1: key_parse_private_pem: PEM_read_PrivateKey failed
debug1: read PEM private key done: type <unknown>
debug1: read_passphrase: can't open /dev/tty: No such device or address
debug2: no passphrase given, try next key
debug1: Trying private key: /home/omazilov/.ssh/id_dsa
debug3: no such identity: /home/omazilov/.ssh/id_dsa
debug1: Trying private key: /home/omazilov/.ssh/id_ecdsa
debug3: no such identity: /home/omazilov/.ssh/id_ecdsa
debug2: we did not send a packet, disable method
debug1: No more authentication methods to try.
Permission denied (publickey,gssapi-keyex,gssapi-with-mic,password).
<192.168.1.10> ESTABLISH CONNECTION FOR USER: omazilov
<192.168.1.10> REMOTE_MODULE ping
<192.168.1.10> EXEC ['ssh', '-C', '-tt', '-vvv', '-o', 'ControlMaster=auto', '-o', 'ControlPersist=60s', '-o', 'ControlPath=/home/omazilov/.ansible/cp/ansible-ssh-%h-%p-%r', '-o', 'Port=22', '-o', 'KbdInteractiveAuthentication=no', '-o', 'PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey', '-o', 'PasswordAuthentication=no', '-o', 'ConnectTimeout=10', '192.168.1.10', "/bin/sh -c 'mkdir -p $HOME/.ansible/tmp/ansible-tmp-1412166408.61-193292311991754 && chmod a+rx $HOME/.ansible/tmp/ansible-tmp-1412166408.61-193292311991754 && echo $HOME/.ansible/tmp/ansible-tmp-1412166408.61-193292311991754'"]
<192.168.1.11> ESTABLISH CONNECTION FOR USER: omazilov
<192.168.1.11> REMOTE_MODULE ping
<192.168.1.11> EXEC ['ssh', '-C', '-tt', '-vvv', '-o', 'ControlMaster=auto', '-o', 'ControlPersist=60s', '-o', 'ControlPath=/home/omazilov/.ansible/cp/ansible-ssh-%h-%p-%r', '-o', 'Port=22', '-o', 'KbdInteractiveAuthentication=no', '-o', 'PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey', '-o', 'PasswordAuthentication=no', '-o', 'ConnectTimeout=10', '192.168.1.11', "/bin/sh -c 'mkdir -p $HOME/.ansible/tmp/ansible-tmp-1412166408.61-7233830878884 && chmod a+rx $HOME/.ansible/tmp/ansible-tmp-1412166408.61-7233830878884 && echo $HOME/.ansible/tmp/ansible-tmp-1412166408.61-7233830878884'"]
Ansible config:
[defaults]
hostfile = /etc/ansible/hosts
library = /usr/share/ansible
remote_tmp = $HOME/.ansible/tmp
pattern = *
forks = 5
poll_interval = 15
sudo_user = root
#ask_sudo_pass = True
#ask_pass = True
transport = smart
remote_port = 22
module_lang = C
gathering = implicit
sudo_exe = sudo
timeout = 10
action_plugins = /usr/share/ansible_plugins/action_plugins
callback_plugins = /usr/share/ansible_plugins/callback_plugins
connection_plugins = /usr/share/ansible_plugins/connection_plugins
lookup_plugins = /usr/share/ansible_plugins/lookup_plugins
vars_plugins = /usr/share/ansible_plugins/vars_plugins
filter_plugins = /usr/share/ansible_plugins/filter_plugins
[accelerate]
accelerate_port = 5099
accelerate_timeout = 30
accelerate_connect_timeout = 5.0
accelerate_daemon_timeout = 30
First, I don't see the use case for ssh'ing locally (I assume ansible.local is localhost) to run an ansible command. This is a mystery to me!
About the problem: since you're using a password for your ssh connection, there is no authentication forwarding.
Thus the ansible command cannot work unless either:
you use ssh keys for your local connections (add your ~/.ssh/id_rsa.pub to your ~/.ssh/authorized_keys) and use ssh -A to forward authentication, or
you use the -k switch in the ansible command.

SSLHandshakeException: Received fatal alert: handshake_failure when setting ciphers on tomcat 7 server

I have a Tomcat 7 web server that I tried to configure to accept secure connections by adding this connector to the server.xml file:
<Connector SSLEnabled="true"
acceptCount="100"
connectionTimeout="20000"
executor="tomcatThreadPool"
keyAlias="server"
keystoreFile="c:\opt\engine\conf\tc.keystore"
keystorePass="o39UI12z"
maxKeepAliveRequests="15"
port="8443"
protocol="HTTP/1.1"
redirectPort="8443"
scheme="https"
secure="true"
sslProtocol="TLS"/>
I'm using a self-signed certificate generated using this command:
%JAVA_HOME%/bin/keytool -genkeypair -keystore c:\opt\engine\conf\tc.keystore -storepass o39UI12z -keypass o39UI12z -dname "cn=Company, ou=Company, o=Com, c=US" -alias server -validity 36500
On the client side I have a Spring application that connects to the server using RestTemplate. On application context startup I initialize the restTemplate instance this way:
final ClientHttpRequestFactory clientHttpRequestFactory =
new MyCustomClientHttpRequestFactory(new NullHostNameVerifier(), serverInfo);
restTemplate.setRequestFactory(clientHttpRequestFactory);
The class MyCustomClientHttpRequestFactory looks like this:
public class MyCustomClientHttpRequestFactory extends SimpleClientHttpRequestFactory {
private static final Logger LOGGER = LoggerFactory
.getLogger(MyCustomClientHttpRequestFactory.class);
private final HostnameVerifier hostNameVerifier;
private final ServerInfo serverInfo;
public MyCustomClientHttpRequestFactory (final HostnameVerifier hostNameVerifier,
final ServerInfo serverInfo) {
this.hostNameVerifier = hostNameVerifier;
this.serverInfo = serverInfo;
}
@Override
protected void prepareConnection(final HttpURLConnection connection, final String httpMethod)
throws IOException {
if (connection instanceof HttpsURLConnection) {
((HttpsURLConnection) connection).setHostnameVerifier(hostNameVerifier);
((HttpsURLConnection) connection).setSSLSocketFactory(initSSLContext()
.getSocketFactory());
}
super.prepareConnection(connection, httpMethod);
}
private SSLContext initSSLContext() {
try {
System.setProperty("https.protocols", "TLSv1");
// Set ssl trust manager. Verify against our server thumbprint
final SSLContext ctx = SSLContext.getInstance("TLSv1");
final SslThumbprintVerifier verifier = new SslThumbprintVerifier(serverInfo);
final ThumbprintTrustManager thumbPrintTrustManager =
new ThumbprintTrustManager(null, verifier);
ctx.init(null, new TrustManager[] { thumbPrintTrustManager }, null);
return ctx;
} catch (final Exception ex) {
LOGGER.error(
"An exception was thrown while trying to initialize HTTP security manager.", ex);
return null;
}
}
}
Up until here everything worked fine. When I put a breakpoint in the SslThumbprintVerifier class, the code reached this point and also the NullHostNameVerifier class. This was tested in production and worked great.
Now I wanted to tighten security by limiting the cipher suites, so I added this attribute to the Connector shown at the beginning:
ciphers="TLS_KRB5_WITH_RC4_128_SHA,SSL_RSA_WITH_RC4_128_SHA,SSL_DHE_RSA_WITH_AES_256_CBC_SHA,SSL_RSA_WITH_AES_256_CBC_SHA,SSL_DHE_RSA_WITH_3DES_EDE_CBC_SHA,SSL_RSA_WITH_3DES_EDE_CBC_SHA,SSL_DHE_RSA_WITH_AES_128_CBC_SHA,SSL_RSA_WITH_AES_128_CBC_SHA,TLS_RSA_WITH_RC4_128_SHA,TLS_RSA_WITH_3DES_EDE_CBC_SHA,TLS_DHE_RSA_WITH_3DES_EDE_CBC_SHA,TLS_RSA_WITH_AES_128_CBC_SHA,TLS_RSA_WITH_AES_256_CBC_SHA,TLS_DHE_RSA_WITH_AES_128_CBC_SHA,TLS_DHE_RSA_WITH_AES_256_CBC_SHA,TLS_KRB5_WITH_3DES_EDE_CBC_SHA"
Now when I'm running the client code I get this exception:
javax.net.ssl.SSLHandshakeException: Received fatal alert: handshake_failure
This happens in the restTemplate.doExecute() method. The code doesn't reach the thumbprint verifier class or the hostname verifier class as it did before adding the ciphers.
While debugging I checked ctx.getSocketFactory().getSupportedCipherSuites(), which showed:
[SSL_RSA_WITH_RC4_128_MD5, SSL_RSA_WITH_RC4_128_SHA, TLS_RSA_WITH_AES_128_CBC_SHA, TLS_RSA_WITH_AES_256_CBC_SHA, TLS_DHE_RSA_WITH_AES_128_CBC_SHA, TLS_DHE_RSA_WITH_AES_256_CBC_SHA, TLS_DHE_DSS_WITH_AES_128_CBC_SHA, TLS_DHE_DSS_WITH_AES_256_CBC_SHA, SSL_RSA_WITH_3DES_EDE_CBC_SHA, SSL_DHE_RSA_WITH_3DES_EDE_CBC_SHA, SSL_DHE_DSS_WITH_3DES_EDE_CBC_SHA, SSL_RSA_WITH_DES_CBC_SHA, SSL_DHE_RSA_WITH_DES_CBC_SHA, SSL_DHE_DSS_WITH_DES_CBC_SHA, SSL_RSA_EXPORT_WITH_RC4_40_MD5, SSL_RSA_EXPORT_WITH_DES40_CBC_SHA, SSL_DHE_RSA_EXPORT_WITH_DES40_CBC_SHA, SSL_DHE_DSS_EXPORT_WITH_DES40_CBC_SHA, TLS_EMPTY_RENEGOTIATION_INFO_SCSV, SSL_RSA_WITH_NULL_MD5, SSL_RSA_WITH_NULL_SHA, SSL_DH_anon_WITH_RC4_128_MD5, TLS_DH_anon_WITH_AES_128_CBC_SHA, TLS_DH_anon_WITH_AES_256_CBC_SHA, SSL_DH_anon_WITH_3DES_EDE_CBC_SHA, SSL_DH_anon_WITH_DES_CBC_SHA, SSL_DH_anon_EXPORT_WITH_RC4_40_MD5, SSL_DH_anon_EXPORT_WITH_DES40_CBC_SHA, TLS_KRB5_WITH_RC4_128_SHA, TLS_KRB5_WITH_RC4_128_MD5, TLS_KRB5_WITH_3DES_EDE_CBC_SHA, TLS_KRB5_WITH_3DES_EDE_CBC_MD5, TLS_KRB5_WITH_DES_CBC_SHA, TLS_KRB5_WITH_DES_CBC_MD5, TLS_KRB5_EXPORT_WITH_RC4_40_SHA, TLS_KRB5_EXPORT_WITH_RC4_40_MD5, TLS_KRB5_EXPORT_WITH_DES_CBC_40_SHA, TLS_KRB5_EXPORT_WITH_DES_CBC_40_MD5]
and also checked ctx.getSocketFactory().getDefaultCipherSuites():
[SSL_RSA_WITH_RC4_128_MD5, SSL_RSA_WITH_RC4_128_SHA, TLS_RSA_WITH_AES_128_CBC_SHA, TLS_RSA_WITH_AES_256_CBC_SHA, TLS_DHE_RSA_WITH_AES_128_CBC_SHA, TLS_DHE_RSA_WITH_AES_256_CBC_SHA, TLS_DHE_DSS_WITH_AES_128_CBC_SHA, TLS_DHE_DSS_WITH_AES_256_CBC_SHA, SSL_RSA_WITH_3DES_EDE_CBC_SHA, SSL_DHE_RSA_WITH_3DES_EDE_CBC_SHA, SSL_DHE_DSS_WITH_3DES_EDE_CBC_SHA, SSL_RSA_WITH_DES_CBC_SHA, SSL_DHE_RSA_WITH_DES_CBC_SHA, SSL_DHE_DSS_WITH_DES_CBC_SHA, SSL_RSA_EXPORT_WITH_RC4_40_MD5, SSL_RSA_EXPORT_WITH_DES40_CBC_SHA, SSL_DHE_RSA_EXPORT_WITH_DES40_CBC_SHA, SSL_DHE_DSS_EXPORT_WITH_DES40_CBC_SHA, TLS_EMPTY_RENEGOTIATION_INFO_SCSV]
and lastly the ctx.getSupportedSSLParameters().getCipherSuites():
[SSL_RSA_WITH_RC4_128_MD5, SSL_RSA_WITH_RC4_128_SHA, TLS_RSA_WITH_AES_128_CBC_SHA, TLS_RSA_WITH_AES_256_CBC_SHA, TLS_DHE_RSA_WITH_AES_128_CBC_SHA, TLS_DHE_RSA_WITH_AES_256_CBC_SHA, TLS_DHE_DSS_WITH_AES_128_CBC_SHA, TLS_DHE_DSS_WITH_AES_256_CBC_SHA, SSL_RSA_WITH_3DES_EDE_CBC_SHA, SSL_DHE_RSA_WITH_3DES_EDE_CBC_SHA, SSL_DHE_DSS_WITH_3DES_EDE_CBC_SHA, SSL_RSA_WITH_DES_CBC_SHA, SSL_DHE_RSA_WITH_DES_CBC_SHA, SSL_DHE_DSS_WITH_DES_CBC_SHA, SSL_RSA_EXPORT_WITH_RC4_40_MD5, SSL_RSA_EXPORT_WITH_DES40_CBC_SHA, SSL_DHE_RSA_EXPORT_WITH_DES40_CBC_SHA, SSL_DHE_DSS_EXPORT_WITH_DES40_CBC_SHA, TLS_EMPTY_RENEGOTIATION_INFO_SCSV, SSL_RSA_WITH_NULL_MD5, SSL_RSA_WITH_NULL_SHA, SSL_DH_anon_WITH_RC4_128_MD5, TLS_DH_anon_WITH_AES_128_CBC_SHA, TLS_DH_anon_WITH_AES_256_CBC_SHA, SSL_DH_anon_WITH_3DES_EDE_CBC_SHA, SSL_DH_anon_WITH_DES_CBC_SHA, SSL_DH_anon_EXPORT_WITH_RC4_40_MD5, SSL_DH_anon_EXPORT_WITH_DES40_CBC_SHA, TLS_KRB5_WITH_RC4_128_SHA, TLS_KRB5_WITH_RC4_128_MD5, TLS_KRB5_WITH_3DES_EDE_CBC_SHA, TLS_KRB5_WITH_3DES_EDE_CBC_MD5, TLS_KRB5_WITH_DES_CBC_SHA, TLS_KRB5_WITH_DES_CBC_MD5, TLS_KRB5_EXPORT_WITH_RC4_40_SHA, TLS_KRB5_EXPORT_WITH_RC4_40_MD5, TLS_KRB5_EXPORT_WITH_DES_CBC_40_SHA, TLS_KRB5_EXPORT_WITH_DES_CBC_40_MD5]
As far as I understand, if there's an intersection between the client's supported/enabled cipher suites and the server's supported cipher suites it should work (we have such an intersection, with SSL_RSA_WITH_RC4_128_SHA for example). And yet I'm getting this error.
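One way to test that intersection directly, outside of RestTemplate, is to force a single protocol and a single suite on a plain JSSE socket and see whether the server completes the handshake with it; a rough sketch, assuming the server is reachable on localhost:8443 as configured above:
import javax.net.ssl.SSLSocket;
import javax.net.ssl.SSLSocketFactory;

public class CipherProbe {
    public static void main(String[] args) throws Exception {
        SSLSocketFactory factory = (SSLSocketFactory) SSLSocketFactory.getDefault();
        try (SSLSocket socket = (SSLSocket) factory.createSocket("localhost", 8443)) {
            // restrict the client to exactly one protocol and one suite
            socket.setEnabledProtocols(new String[] { "TLSv1" });
            socket.setEnabledCipherSuites(new String[] { "SSL_RSA_WITH_RC4_128_SHA" });
            socket.startHandshake(); // throws SSLHandshakeException if the handshake fails
            System.out.println("Negotiated: " + socket.getSession().getCipherSuite());
        }
    }
}
If the server refuses the suite, this should fail with the same handshake_failure alert; if the suite is accepted but the default trust manager rejects the self-signed certificate, the handshake should instead fail later with a certificate/PKIX error, which already tells you the suite itself was negotiable.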
As the next step I added this JVM parameter:
-Djavax.net.debug=ssl,handshake,failure
The log showed only the client-hello, with no server-hello response:
[2013-03-20 15:29:51.315] [INFO ] data-service-pool-37 System.out trigger seeding of SecureRandom
[2013-03-20 15:29:51.315] [INFO ] data-service-pool-37 System.out done seeding SecureRandom
[2013-03-20 15:30:38.894] [INFO ] data-service-pool-37 System.out Allow unsafe renegotiation: false
[2013-03-20 15:30:38.894] [INFO ] data-service-pool-37 System.out Allow legacy hello messages: true
[2013-03-20 15:30:38.894] [INFO ] data-service-pool-37 System.out Is initial handshake: true
[2013-03-20 15:30:38.894] [INFO ] data-service-pool-37 System.out Is secure renegotiation: false
[2013-03-20 15:30:38.894] [INFO ] data-service-pool-37 System.out %% No cached client session
[2013-03-20 15:30:38.894] [INFO ] data-service-pool-37 System.out *** ClientHello, TLSv1
[2013-03-20 15:30:38.941] [INFO ] data-service-pool-37 System.out RandomCookie: GMT: 1363720446 bytes = { 99, 249, 173, 214, 110, 82, 58, 52, 189, 92, 74, 169, 133, 128, 250, 109, 160, 64, 112, 253, 50, 160, 255, 196, 85, 93, 33, 172 }
[2013-03-20 15:30:38.941] [INFO ] data-service-pool-37 System.out Session ID: {}
[2013-03-20 15:30:38.941] [INFO ] data-service-pool-37 System.out Cipher Suites: [SSL_RSA_WITH_RC4_128_MD5, SSL_RSA_WITH_RC4_128_SHA, TLS_RSA_WITH_AES_128_CBC_SHA, TLS_RSA_WITH_AES_256_CBC_SHA, TLS_DHE_RSA_WITH_AES_128_CBC_SHA, TLS_DHE_RSA_WITH_AES_256_CBC_SHA, TLS_DHE_DSS_WITH_AES_128_CBC_SHA, TLS_DHE_DSS_WITH_AES_256_CBC_SHA, SSL_RSA_WITH_3DES_EDE_CBC_SHA, SSL_DHE_RSA_WITH_3DES_EDE_CBC_SHA, SSL_DHE_DSS_WITH_3DES_EDE_CBC_SHA, SSL_RSA_WITH_DES_CBC_SHA, SSL_DHE_RSA_WITH_DES_CBC_SHA, SSL_DHE_DSS_WITH_DES_CBC_SHA, SSL_RSA_EXPORT_WITH_RC4_40_MD5, SSL_RSA_EXPORT_WITH_DES40_CBC_SHA, SSL_DHE_RSA_EXPORT_WITH_DES40_CBC_SHA, SSL_DHE_DSS_EXPORT_WITH_DES40_CBC_SHA, TLS_EMPTY_RENEGOTIATION_INFO_SCSV]
[2013-03-20 15:30:38.956] [INFO ] data-service-pool-37 System.out Compression Methods: { 0 }
[2013-03-20 15:30:38.956] [INFO ] data-service-pool-37 System.out ***
[2013-03-20 15:30:38.956] [INFO ] data-service-pool-37 System.out data-service-pool-37, WRITE: TLSv1 Handshake, length = 81
[2013-03-20 15:30:38.956] [INFO ] data-service-pool-37 System.out data-service-pool-37, READ: TLSv1 Alert, length = 2
[2013-03-20 15:30:38.956] [INFO ] data-service-pool-37 System.out data-service-pool-37, RECV TLSv1 ALERT: fatal, handshake_failure
[2013-03-20 15:30:38.972] [INFO ] data-service-pool-37 System.out data-service-pool-37, called closeSocket()
[2013-03-20 15:30:38.972] [INFO ] data-service-pool-37 System.out data-service-pool-37, handling exception: javax.net.ssl.SSLHandshakeException: Received fatal alert: handshake_failure
When I remove the ciphers from server.xml it works again.
I will also mention that I'm testing both the server and the client on the same machine. I tried setting the server's IP in the client request to 'localhost' and to the actual machine IP, and it didn't work in either case. I also had this server running on a different Linux machine (the keystore was generated on that machine, with a Linux path of course) and still: it works without the ciphers and stops working with them.
Well, I got this issue solved. It appears that creating a self-signed certificate using keytool without providing the -keyalg parameter makes the key-pair algorithm default to DSA. None of my cipher suites used the DSA algorithm, so although the client and the server had an intersection between their cipher suites, none of the common suites was usable with the key's algorithm.
Adding -keyalg RSA when generating the keystore solved the problem.
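A minimal way to double-check this diagnosis (keystore path, password, and alias are the ones from the keytool command above) is to load the keystore and print the algorithm of the stored key pair; it should print DSA for the original keystore and RSA after regenerating it with -keyalg RSA:
import java.io.FileInputStream;
import java.security.KeyStore;

public class KeystoreAlgCheck {
    public static void main(String[] args) throws Exception {
        KeyStore ks = KeyStore.getInstance("JKS");
        try (FileInputStream in = new FileInputStream("c:\\opt\\engine\\conf\\tc.keystore")) {
            // -storepass value from the keytool command in the question
            ks.load(in, "o39UI12z".toCharArray());
        }
        // "server" is the -alias used when the keystore was generated
        System.out.println(ks.getCertificate("server").getPublicKey().getAlgorithm());
    }
}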