Red5 decode error on Linux

I am building a 2-way video chat app with Red5. When I run the app in my Windows environment, it works fine, but when I try to use it on Linux, I get an error that the live stream cannot be decoded. Here is the error:
2012-02-18 01:32:20,955 [NioProcessor-1] INFO o.r.s.stream.ClientBroadcastStream - Stream start
2012-02-18 01:32:20,959 [NioProcessor-1] INFO o.r.s.a.MultiThreadedApplicationAdapter - W3C x-category:stream x-event:publish c-ip:223.205.177.236 x-sname:aaeec362-7a79-41d2-b572-1c2962fa1a77 x-name:doctorb
2012-02-18 01:32:21,591 [NioProcessor-1] INFO o.r.server.stream.VideoCodecFactory - Trying codec org.red5.server.stream.codec.ScreenVideo#565539d8
2012-02-18 01:32:21,591 [NioProcessor-1] INFO o.r.server.stream.VideoCodecFactory - Trying codec org.red5.server.stream.codec.SorensonVideo#7548c02f
2012-02-18 01:32:21,592 [NioProcessor-1] INFO o.r.s.stream.ClientBroadcastStream - dispatchEvent: adjust archaic videoTime, from: 5246 to 5305
2012-02-18 01:32:21,629 [NioProcessor-1] INFO o.r.s.stream.ClientBroadcastStream - dispatchEvent: adjust archaic videoTime, from: 5339 to 5369
2012-02-18 01:32:21,896 [NioProcessor-1] INFO o.r.s.stream.ClientBroadcastStream - dispatchEvent: adjust archaic videoTime, from: 5625 to 5689
2012-02-18 01:32:22,547 [NioProcessor-1] INFO o.r.s.stream.ClientBroadcastStream - dispatchEvent: adjust archaic videoTime, from: 6439 to 6491
2012-02-18 01:32:22,847 [NioProcessor-1] INFO o.r.s.stream.ClientBroadcastStream - dispatchEvent: adjust archaic videoTime, from: 6848 to 6907
2012-02-18 01:32:23,597 [NioProcessor-1] INFO o.r.s.stream.ClientBroadcastStream - dispatchEvent: adjust archaic videoTime, from: 7594 to 7643
2012-02-18 01:32:26,249 [NioProcessor-1] INFO o.r.s.stream.ClientBroadcastStream - dispatchEvent: adjust archaic videoTime, from: 10272 to 10331
2012-02-18 01:32:27,450 [NioProcessor-1] INFO o.r.s.stream.ClientBroadcastStream - dispatchEvent: adjust archaic videoTime, from: 11366 to 11387
2012-02-18 01:32:28,150 [NioProcessor-1] INFO o.r.s.stream.ClientBroadcastStream - dispatchEvent: adjust archaic videoTime, from: 12059 to 12091
2012-02-18 01:32:29,250 [NioProcessor-1] INFO o.r.s.stream.ClientBroadcastStream - dispatchEvent: adjust archaic videoTime, from: 13149 to 13179
2012-02-18 01:32:29,501 [NioProcessor-1] INFO o.r.s.stream.ClientBroadcastStream - dispatchEvent: adjust archaic videoTime, from: 13523 to 13563
2012-02-18 01:32:30,479 [NioProcessor-1] INFO o.r.s.stream.ClientBroadcastStream - dispatchEvent: adjust archaic videoTime, from: 14320 to 14363
2012-02-18 01:32:30,646 [NioProcessor-2] WARN org.red5.server.Context - Bean lookup failed for everyone_37b0bf5186e9a223d514a0641f4cbef0..soservice in the application context
2012-02-18 01:32:30,688 [NioProcessor-1] INFO o.r.s.stream.ClientBroadcastStream - dispatchEvent: adjust archaic videoTime, from: 14725 to 14779
2012-02-18 01:32:31,251 [NioProcessor-1] WARN org.red5.server.Context - Bean lookup failed for everyone_37b0bf5186e9a223d514a0641f4cbef0..soservice in the application context
2012-02-18 01:32:31,634 [NioProcessor-1] INFO o.r.s.stream.ClientBroadcastStream - dispatchEvent: adjust archaic videoTime, from: 15518 to 15547
2012-02-18 01:32:32,703 [NioProcessor-1] INFO o.r.s.stream.ClientBroadcastStream - dispatchEvent: adjust archaic videoTime, from: 16605 to 16635
2012-02-18 01:32:33,703 [NioProcessor-1] INFO o.r.s.stream.ClientBroadcastStream - dispatchEvent: adjust archaic videoTime, from: 17699 to 17755
2012-02-18 01:32:33,994 [NioProcessor-1] WARN org.red5.server.Context - Bean lookup failed for everyone_37b0bf5186e9a223d514a0641f4cbef0..soservice in the application context
2012-02-18 01:32:34,904 [NioProcessor-1] INFO o.r.s.stream.ClientBroadcastStream - dispatchEvent: adjust archaic videoTime, from: 18755 to 18811
2012-02-18 01:32:35,500 [NioProcessor-2] ERROR o.r.s.n.r.codec.RTMPProtocolDecoder - Error decoding buffer
org.red5.server.net.protocol.ProtocolException: Error during decoding
at org.red5.server.net.rtmp.codec.RTMPProtocolDecoder.decode(RTMPProtocolDecoder.java:196) [red5.jar:na]
at org.red5.server.net.rtmp.codec.RTMPProtocolDecoder.decodeBuffer(RTMPProtocolDecoder.java:119) [red5.jar:na]
at org.red5.server.net.rtmp.codec.RTMPMinaProtocolDecoder.decode(RTMPMinaProtocolDecoder.java:61) [red5.jar:na]
at org.apache.mina.filter.codec.ProtocolCodecFilter.messageReceived(ProtocolCodecFilter.java:225) [mina-core-2.0.0-M6.jar:na]
at org.apache.mina.core.filterchain.DefaultIoFilterChain.callNextMessageReceived(DefaultIoFilterChain.java:433) [mina-core-2.0.0-M6.jar:na]
at org.apache.mina.core.filterchain.DefaultIoFilterChain.access$1200(DefaultIoFilterChain.java:47) [mina-core-2.0.0-M6.jar:na]
at org.apache.mina.core.filterchain.DefaultIoFilterChain$EntryImpl$1.messageReceived(DefaultIoFilterChain.java:801) [mina-core-2.0.0-M6.jar:na]
at org.apache.mina.core.filterchain.IoFilterAdapter.messageReceived(IoFilterAdapter.java:119) [mina-core-2.0.0-M6.jar:na]
at org.apache.mina.core.filterchain.DefaultIoFilterChain.callNextMessageReceived(DefaultIoFilterChain.java:433) [mina-core-2.0.0-M6.jar:na]
at org.apache.mina.core.filterchain.DefaultIoFilterChain.fireMessageReceived(DefaultIoFilterChain.java:425) [mina-core-2.0.0-M6.jar:na]
at org.apache.mina.core.polling.AbstractPollingIoProcessor.read(AbstractPollingIoProcessor.java:603) [mina-core-2.0.0-M6.jar:na]
at org.apache.mina.core.polling.AbstractPollingIoProcessor.process(AbstractPollingIoProcessor.java:563) [mina-core-2.0.0-M6.jar:na]
at org.apache.mina.core.polling.AbstractPollingIoProcessor.process(AbstractPollingIoProcessor.java:552) [mina-core-2.0.0-M6.jar:na]
at org.apache.mina.core.polling.AbstractPollingIoProcessor.access$400(AbstractPollingIoProcessor.java:56) [mina-core-2.0.0-M6.jar:na]
at org.apache.mina.core.polling.AbstractPollingIoProcessor$Processor.run(AbstractPollingIoProcessor.java:891) [mina-core-2.0.0-M6.jar:na]
at org.apache.mina.util.NamePreservingRunnable.run(NamePreservingRunnable.java:64) [mina-core-2.0.0-M6.jar:na]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1110) [na:1.6.0_20]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:603) [na:1.6.0_20]
at java.lang.Thread.run(Thread.java:636) [na:1.6.0_20]
Caused by: java.lang.IndexOutOfBoundsException: Index: 0, Size: 0
at java.util.ArrayList.rangeCheck(ArrayList.java:571) [na:1.6.0_20]
at java.util.ArrayList.get(ArrayList.java:349) [na:1.6.0_20]
at org.red5.io.amf3.Input.readString(Input.java:349) [red5.jar:na]
at org.red5.io.object.Deserializer.deserialize(Deserializer.java:73) [red5.jar:na]
at org.red5.server.net.rtmp.codec.RTMPProtocolDecoder.decodeNotifyOrInvoke(RTMPProtocolDecoder.java:828) [red5.jar:na]
at org.red5.server.net.rtmp.codec.RTMPProtocolDecoder.decodeInvoke(RTMPProtocolDecoder.java:734) [red5.jar:na]
at org.red5.server.net.rtmp.codec.RTMPProtocolDecoder.decodeMessage(RTMPProtocolDecoder.java:506) [red5.jar:na]
at org.red5.server.net.rtmp.codec.RTMPProtocolDecoder.decodePacket(RTMPProtocolDecoder.java:391) [red5.jar:na]
at org.red5.server.net.rtmp.codec.RTMPProtocolDecoder.decode(RTMPProtocolDecoder.java:182) [red5.jar:na]
... 18 common frames omitted
2012-02-18 01:32:35,508 [NioProcessor-2] WARN o.r.s.n.r.codec.RTMPProtocolDecoder - Closing connection because decoding failed: RTMPMinaConnection from 223.205.177.236 : 5673 to 184.107.183.106 (in: 3906 out 4006 )

Lee, try a newer build of Red5, revision 4329 or later; from your log it would seem that you're using 0.7 or older.
http://red5.googlecode.com/

Related

Unable to run client command for Apache Karaf 4.3.3 through remote server

I have downloaded Apache Karaf 4.3.3 on several Ubuntu 18 machines. When I try to install a feature or run any command through the client, I get the log trace below. I am able to run the client when I am on the server itself, but it fails when run through the script.
The commands I'm using are:
$KARAF_HOME/bin/start
$KARAF_HOME/bin/client -u karaf -p karaf "features:install myfeature"
1155 [sshd-SshClient[4f209819]-nio2-thread-2] DEBUG org.apache.sshd.common.io.nio2.Nio2Session - handleReadCycleCompletion(Nio2Session[local=/127.0.0.1:45906, remote=localhost/127.0.0.1:8101]) read 376 bytes
1155 [sshd-SshClient[4f209819]-nio2-thread-2] DEBUG org.apache.sshd.client.session.ClientSessionImpl - handleKexInit(ClientSessionImpl[karaf#localhost/127.0.0.1:8101]) SSH_MSG_KEXINIT
1181 [sshd-SshClient[4f209819]-nio2-thread-2] DEBUG org.apache.sshd.client.session.ClientSessionImpl - setNegotiationResult([ClientSessionImpl[karaf#localhost/127.0.0.1:8101], aes128-ctr, hmac-sha2-256, none]) Kex: server->client null {} {}
1181 [sshd-SshClient[4f209819]-nio2-thread-2] DEBUG org.apache.sshd.client.session.ClientSessionImpl - setNegotiationResult([ClientSessionImpl[karaf#localhost/127.0.0.1:8101], aes128-ctr, hmac-sha2-256, none]) Kex: client->server null {} {}
1268 [sshd-SshClient[4f209819]-nio2-thread-2] DEBUG org.apache.sshd.client.kex.DHGClient - init(DHGClient[ecdh-sha2-nistp521])[ClientSessionImpl[karaf#localhost/127.0.0.1:8101]] Send SSH_MSG_KEXDH_INIT
1268 [sshd-SshClient[4f209819]-nio2-thread-2] DEBUG org.apache.sshd.client.session.ClientSessionImpl - encode([ClientSessionImpl[karaf#localhost/127.0.0.1:8101], 1, 30, 30, 138]) packet #null sending command={}[{}] len={}
1269 [sshd-SshClient[4f209819]-nio2-thread-2] DEBUG org.apache.sshd.common.io.nio2.Nio2Session - writePacket(Nio2Session[local=/127.0.0.1:45906, remote=localhost/127.0.0.1:8101]) Writing 152 bytes
1285 [sshd-SshClient[4f209819]-nio2-thread-2] DEBUG org.apache.sshd.common.io.nio2.Nio2Session - handleCompletedWriteCycle(Nio2Session[local=/127.0.0.1:45906, remote=localhost/127.0.0.1:8101]) finished writing len=152
1322 [sshd-SshClient[4f209819]-nio2-thread-3] DEBUG org.apache.sshd.common.io.nio2.Nio2Session - handleReadCycleCompletion(Nio2Session[local=/127.0.0.1:45906, remote=localhost/127.0.0.1:8101]) read 712 bytes
1325 [sshd-SshClient[4f209819]-nio2-thread-3] DEBUG org.apache.sshd.client.kex.DHGClient - next([DHGClient[ecdh-sha2-nistp521], ClientSessionImpl[karaf#localhost/127.0.0.1:8101], SSH_MSG_KEXDH_REPLY])[null] process command={}
1384 [sshd-SshClient[4f209819]-nio2-thread-3] DEBUG org.apache.sshd.client.session.ClientSessionImpl - handleKexMessage([ClientSessionImpl[karaf#localhost/127.0.0.1:8101], ecdh-sha2-nistp521, 31])[null] KEX processing complete after cmd={}
1384 [sshd-SshClient[4f209819]-nio2-thread-3] WARN org.apache.sshd.client.keyverifier.AcceptAllServerKeyVerifier - Server at localhost/127.0.0.1:8101 presented unverified RSA key: SHA256:L7f8vYz8AGuXY9JOATcykbVn2IeyR2AVDLYudN7r9bw
1385 [sshd-SshClient[4f209819]-nio2-thread-3] DEBUG org.apache.sshd.client.session.ClientSessionImpl - checkKeys([ClientSessionImpl[karaf#localhost/127.0.0.1:8101], ssh-rsa, SHA256:L7f8vYz8AGuXY9JOATcykbVn2IeyR2AVDLYudN7r9bw, true]) key=null-{}, verified={}
1385 [sshd-SshClient[4f209819]-nio2-thread-3] DEBUG org.apache.sshd.client.session.ClientSessionImpl - sendNewKeys(ClientSessionImpl[karaf#localhost/127.0.0.1:8101]) Send SSH_MSG_NEWKEYS
1385 [sshd-SshClient[4f209819]-nio2-thread-3] DEBUG org.apache.sshd.client.session.ClientSessionImpl - encode([ClientSessionImpl[karaf#localhost/127.0.0.1:8101], 2, 21, SSH_MSG_NEWKEYS, 1]) packet #null sending command={}[{}] len={}
1385 [sshd-SshClient[4f209819]-nio2-thread-3] DEBUG org.apache.sshd.common.io.nio2.Nio2Session - writePacket(Nio2Session[local=/127.0.0.1:45906, remote=localhost/127.0.0.1:8101]) Writing 16 bytes
1396 [sshd-SshClient[4f209819]-nio2-thread-3] DEBUG org.apache.sshd.common.io.nio2.Nio2Session - handleCompletedWriteCycle(Nio2Session[local=/127.0.0.1:45906, remote=localhost/127.0.0.1:8101]) finished writing len=16
1397 [sshd-SshClient[4f209819]-nio2-thread-3] DEBUG org.apache.sshd.common.io.nio2.Nio2Session - handleReadCycleCompletion(Nio2Session[local=/127.0.0.1:45906, remote=localhost/127.0.0.1:8101]) read 16 bytes
1397 [sshd-SshClient[4f209819]-nio2-thread-3] DEBUG org.apache.sshd.client.session.ClientSessionImpl - handleNewKeys(ClientSessionImpl[karaf#localhost/127.0.0.1:8101]) SSH_MSG_NEWKEYS command=SSH_MSG_NEWKEYS
1397 [sshd-SshClient[4f209819]-nio2-thread-3] DEBUG org.apache.sshd.client.session.ClientSessionImpl - receiveNewKeys(ClientSessionImpl[karaf#localhost/127.0.0.1:8101]) session ID=8c:46:43:0f:d1:11:8f:f7:ed:eb:40:b1:29:64:86:52:0f:34:f0:60:00:7f:69:df:d9:a8:ec:dc:36:27:95:d2:38:d2:e9:39:e7:8b:fd:8b:b4:0c:81:4a:7d:58:57:2b:cf:76:43:50:b5:de:e2:45:72:61:b7:b6:5f:1e:a2:5c
1422 [sshd-SshClient[4f209819]-nio2-thread-3] DEBUG org.apache.sshd.client.session.ClientSessionImpl - receiveNewKeys([ClientSessionImpl[karaf#localhost/127.0.0.1:8101], BaseCipher[AES, ivSize=16, kdfSize=16,AES/CTR/NoPadding, blkSize=16], BaseCipher[AES, ivSize=16, kdfSize=16,AES/CTR/NoPadding, blkSize=16], 4294967296, 4294967296]) inCipher=null, outCipher={}, recommended blocks limit={}, actual={}
1422 [sshd-SshClient[4f209819]-nio2-thread-3] DEBUG org.apache.sshd.client.session.ClientSessionImpl - sendInitialServiceRequest(ClientSessionImpl[karaf#localhost/127.0.0.1:8101]) Send SSH_MSG_SERVICE_REQUEST for ssh-userauth
1423 [sshd-SshClient[4f209819]-nio2-thread-3] DEBUG org.apache.sshd.client.session.ClientSessionImpl - encode([ClientSessionImpl[karaf#localhost/127.0.0.1:8101], 3, 5, SSH_MSG_SERVICE_REQUEST, 17]) packet #null sending command={}[{}] len={}
1423 [sshd-SshClient[4f209819]-nio2-thread-3] DEBUG org.apache.sshd.common.io.nio2.Nio2Session - writePacket(Nio2Session[local=/127.0.0.1:45906, remote=localhost/127.0.0.1:8101]) Writing 80 bytes
1423 [sshd-SshClient[4f209819]-nio2-thread-3] DEBUG org.apache.sshd.common.io.nio2.Nio2Session - handleCompletedWriteCycle(Nio2Session[local=/127.0.0.1:45906, remote=localhost/127.0.0.1:8101]) finished writing len=80
1424 [sshd-SshClient[4f209819]-nio2-thread-3] DEBUG org.apache.sshd.client.session.ClientSessionImpl - encode([ClientSessionImpl[karaf#localhost/127.0.0.1:8101], 4, 50, SSH_MSG_USERAUTH_REQUEST, 36]) packet #null sending command={}[{}] len={}
1424 [sshd-SshClient[4f209819]-nio2-thread-3] DEBUG org.apache.sshd.common.io.nio2.Nio2Session - writePacket(Nio2Session[local=/127.0.0.1:45906, remote=localhost/127.0.0.1:8101]) Writing 96 bytes
1424 [sshd-SshClient[4f209819]-nio2-thread-3] DEBUG org.apache.sshd.common.io.nio2.Nio2Session - handleCompletedWriteCycle(Nio2Session[local=/127.0.0.1:45906, remote=localhost/127.0.0.1:8101]) finished writing len=96
1424 [sshd-SshClient[4f209819]-nio2-thread-3] DEBUG org.apache.sshd.client.session.ClientSessionImpl - handleNewKeys(ClientSessionImpl[karaf#localhost/127.0.0.1:8101]) sent 1 pending packets
1426 [sshd-SshClient[4f209819]-nio2-thread-1] DEBUG org.apache.sshd.common.io.nio2.Nio2Session - handleReadCycleCompletion(Nio2Session[local=/127.0.0.1:45906, remote=localhost/127.0.0.1:8101]) read 80 bytes
1427 [sshd-SshClient[4f209819]-nio2-thread-1] DEBUG org.apache.sshd.client.session.ClientSessionImpl - handleServiceAccept(ClientSessionImpl[karaf#localhost/127.0.0.1:8101]) SSH_MSG_SERVICE_ACCEPT service=ssh-userauth
1428 [sshd-SshClient[4f209819]-nio2-thread-2] DEBUG org.apache.sshd.common.io.nio2.Nio2Session - handleReadCycleCompletion(Nio2Session[local=/127.0.0.1:45906, remote=localhost/127.0.0.1:8101]) read 112 bytes
1428 [sshd-SshClient[4f209819]-nio2-thread-2] DEBUG org.apache.sshd.client.session.ClientUserAuthService - processUserAuth([ClientSessionImpl[karaf#localhost/127.0.0.1:8101], false, keyboard-interactive,password,publickey]) Received SSH_MSG_USERAUTH_FAILURE - partial=null, methods={}
1428 [sshd-SshClient[4f209819]-nio2-thread-2] DEBUG org.apache.sshd.client.session.ClientUserAuthService - tryNext([ClientSessionImpl[karaf#localhost/127.0.0.1:8101], [publickey, keyboard-interactive, password], [keyboard-interactive, password, publickey]]) starting authentication mechanisms: client=null, server={}
1430 [sshd-SshClient[4f209819]-nio2-thread-2] DEBUG org.apache.sshd.client.session.ClientUserAuthService - tryNext(ClientSessionImpl[karaf#localhost/127.0.0.1:8101]) attempting method=publickey
1446 [sshd-SshClient[4f209819]-nio2-thread-2] DEBUG org.apache.sshd.client.config.keys.DefaultClientIdentitiesWatcher - loadKeys(/root/.ssh/id_rsa) no key loaded
1447 [sshd-SshClient[4f209819]-nio2-thread-2] DEBUG org.apache.sshd.client.config.keys.DefaultClientIdentitiesWatcher - loadKeys(/root/.ssh/id_dsa) no key loaded
1447 [sshd-SshClient[4f209819]-nio2-thread-2] DEBUG org.apache.sshd.client.config.keys.DefaultClientIdentitiesWatcher - loadKeys(/root/.ssh/id_ecdsa) no key loaded
1447 [sshd-SshClient[4f209819]-nio2-thread-2] DEBUG org.apache.sshd.client.config.keys.DefaultClientIdentitiesWatcher - loadKeys(/root/.ssh/id_ed25519) no key loaded
1447 [sshd-SshClient[4f209819]-nio2-thread-2] DEBUG org.apache.sshd.client.auth.pubkey.UserAuthPublicKey - sendAuthDataRequest(ClientSessionImpl[karaf#localhost/127.0.0.1:8101])[ssh-connection] no more keys to send
1447 [sshd-SshClient[4f209819]-nio2-thread-2] DEBUG org.apache.sshd.client.session.ClientUserAuthService - tryNext(ClientSessionImpl[karaf#localhost/127.0.0.1:8101]) no initial request sent by method=publickey
1447 [sshd-SshClient[4f209819]-nio2-thread-2] DEBUG org.apache.sshd.client.auth.pubkey.UserAuthPublicKey - destroy(ClientSessionImpl[karaf#localhost/127.0.0.1:8101])[ssh-connection]
1448 [sshd-SshClient[4f209819]-nio2-thread-2] DEBUG org.apache.sshd.client.session.ClientUserAuthService - tryNext(ClientSessionImpl[karaf#localhost/127.0.0.1:8101]) attempting method=keyboard-interactive
1449 [sshd-SshClient[4f209819]-nio2-thread-2] DEBUG org.apache.sshd.client.auth.keyboard.UserAuthKeyboardInteractive - verifyTrialsCount([ClientSessionImpl[karaf#localhost/127.0.0.1:8101], ssh-connection, SSH_MSG_USERAUTH_REQUEST, 0, 3])[null] cmd={} - {} out of {}
1449 [sshd-SshClient[4f209819]-nio2-thread-2] DEBUG org.apache.sshd.client.auth.keyboard.UserAuthKeyboardInteractive - sendAuthDataRequest([ClientSessionImpl[karaf#localhost/127.0.0.1:8101], ssh-connection, keyboard-interactive, , ])[null] send SSH_MSG_USERAUTH_REQUEST for {}: lang={}, methods={}
1449 [sshd-SshClient[4f209819]-nio2-thread-2] DEBUG org.apache.sshd.client.session.ClientSessionImpl - encode([ClientSessionImpl[karaf#localhost/127.0.0.1:8101], 5, 50, SSH_MSG_USERAUTH_REQUEST, 60]) packet #null sending command={}[{}] len={}
1449 [sshd-SshClient[4f209819]-nio2-thread-2] DEBUG org.apache.sshd.common.io.nio2.Nio2Session - writePacket(Nio2Session[local=/127.0.0.1:45906, remote=localhost/127.0.0.1:8101]) Writing 128 bytes
1454 [sshd-SshClient[4f209819]-nio2-thread-2] DEBUG org.apache.sshd.common.io.nio2.Nio2Session - handleCompletedWriteCycle(Nio2Session[local=/127.0.0.1:45906, remote=localhost/127.0.0.1:8101]) finished writing len=128
1455 [sshd-SshClient[4f209819]-nio2-thread-2] DEBUG org.apache.sshd.client.session.ClientUserAuthService - tryNext(ClientSessionImpl[karaf#localhost/127.0.0.1:8101]) successfully processed initial buffer by method=keyboard-interactive
1455 [sshd-SshClient[4f209819]-nio2-thread-2] DEBUG org.apache.sshd.common.io.nio2.Nio2Session - handleReadCycleCompletion(Nio2Session[local=/127.0.0.1:45906, remote=localhost/127.0.0.1:8101]) read 128 bytes
1455 [sshd-SshClient[4f209819]-nio2-thread-2] DEBUG org.apache.sshd.client.session.ClientUserAuthService - processUserAuth([ClientSessionImpl[karaf#localhost/127.0.0.1:8101], 60, keyboard-interactive]) delegate processing of null to {}
1455 [sshd-SshClient[4f209819]-nio2-thread-2] DEBUG org.apache.sshd.client.auth.keyboard.UserAuthKeyboardInteractive - processAuthDataRequest([ClientSessionImpl[karaf#localhost/127.0.0.1:8101], ssh-connection, Password authentication, , en-US, 1])[null] SSH_MSG_USERAUTH_INFO_REQUEST name={}, instruction={}, language={}, num-prompts={}
1455 [sshd-SshClient[4f209819]-nio2-thread-2] DEBUG org.apache.sshd.client.auth.keyboard.UserAuthKeyboardInteractive - verifyTrialsCount([ClientSessionImpl[karaf#localhost/127.0.0.1:8101], ssh-connection, SSH_MSG_USERAUTH_INFO_REQUEST, 1, 3])[null] cmd={} - {} out of {}
1456 [sshd-SshClient[4f209819]-nio2-thread-2] DEBUG org.apache.sshd.client.auth.keyboard.UserAuthKeyboardInteractive - getUserResponses(ClientSessionImpl[karaf#localhost/127.0.0.1:8101]) use password candidate for interaction=Password authentication
1456 [sshd-SshClient[4f209819]-nio2-thread-2] DEBUG org.apache.sshd.client.session.ClientSessionImpl - encode([ClientSessionImpl[karaf#localhost/127.0.0.1:8101], 6, 61, SSH_MSG_USERAUTH_INFO_RESPONSE, 14]) packet #null sending command={}[{}] len={}
1456 [sshd-SshClient[4f209819]-nio2-thread-2] DEBUG org.apache.sshd.common.io.nio2.Nio2Session - writePacket(Nio2Session[local=/127.0.0.1:45906, remote=localhost/127.0.0.1:8101]) Writing 80 bytes
1458 [sshd-SshClient[4f209819]-nio2-thread-2] DEBUG org.apache.sshd.common.io.nio2.Nio2Session - handleCompletedWriteCycle(Nio2Session[local=/127.0.0.1:45906, remote=localhost/127.0.0.1:8101]) finished writing len=80
1529 [sshd-SshClient[4f209819]-nio2-thread-3] DEBUG org.apache.sshd.common.io.nio2.Nio2Session - handleReadCycleCompletion(Nio2Session[local=/127.0.0.1:45906, remote=localhost/127.0.0.1:8101]) read 64 bytes
1529 [sshd-SshClient[4f209819]-nio2-thread-3] DEBUG org.apache.sshd.client.session.ClientUserAuthService - processUserAuth(ClientSessionImpl[karaf#localhost/127.0.0.1:8101]) SSH_MSG_USERAUTH_SUCCESS Succeeded with keyboard-interactive
1530 [sshd-SshClient[4f209819]-nio2-thread-3] DEBUG org.apache.sshd.client.auth.keyboard.UserAuthKeyboardInteractive - destroy(ClientSessionImpl[karaf#localhost/127.0.0.1:8101])[ssh-connection]
1530 [sshd-SshClient[4f209819]-nio2-thread-3] DEBUG org.apache.sshd.client.session.ClientConnectionService - stopHeartBeat(ClientSessionImpl[karaf#localhost/127.0.0.1:8101]) no heartbeat to stop
1532 [sshd-SshClient[4f209819]-nio2-thread-3] DEBUG org.apache.sshd.client.session.ClientConnectionService - startHeartbeat([ClientSessionImpl[karaf#localhost/127.0.0.1:8101], 60000, keepalive#sshd.apache.org]) - started at interval=null with request={}
1799 [main] DEBUG org.apache.sshd.client.channel.ChannelExec - init() service=[ClientConnectionService[ClientSessionImpl[karaf#localhost/127.0.0.1:8101]], ClientSessionImpl[karaf#localhost/127.0.0.1:8101], 0] session=null id={}
1800 [main] DEBUG org.apache.sshd.common.channel.Window - init([Window[client/local](ChannelExec[id=0, recipient=-1]-ClientSessionImpl[karaf#localhost/127.0.0.1:8101]), 2097152, 2097152, 32768]) size=null, max={}, packet={}
1800 [main] DEBUG org.apache.sshd.client.session.ClientConnectionService - registerChannel([ClientConnectionService[ClientSessionImpl[karaf#localhost/127.0.0.1:8101]], 0, ChannelExec[id=0, recipient=-1]-ClientSessionImpl[karaf#localhost/127.0.0.1:8101]])[id=null] {}
1800 [main] DEBUG org.apache.sshd.client.session.ClientSessionImpl - createExecChannel([ClientSessionImpl[karaf#localhost/127.0.0.1:8101], bundle:list
, 0, null])[null] created id={} - PTY={}
1811 [Thread-2] DEBUG org.apache.sshd.client.channel.ChannelExec - close(ChannelExec[id=0, recipient=-1]-ClientSessionImpl[karaf#localhost/127.0.0.1:8101]) Closing immediately
1811 [Thread-2] DEBUG org.apache.sshd.client.channel.ChannelExec - close(ChannelExec[id=0, recipient=-1]-ClientSessionImpl[karaf#localhost/127.0.0.1:8101]) prevent sending EOF
1811 [main] DEBUG org.apache.sshd.client.channel.ChannelExec - open(ChannelExec[id=0, recipient=-1]-ClientSessionImpl[karaf#localhost/127.0.0.1:8101]) Send SSH_MSG_CHANNEL_OPEN - type=session
1811 [Thread-2] DEBUG org.apache.sshd.common.channel.Window - Closing Window[client/local](ChannelExec[id=0, recipient=-1]-ClientSessionImpl[karaf#localhost/127.0.0.1:8101])
1811 [main] DEBUG org.apache.sshd.client.channel.ChannelExec - writePacket(ChannelExec[id=0, recipient=-1]-ClientSessionImpl[karaf#localhost/127.0.0.1:8101]) Discarding output packet because channel state=Immediate
1811 [Thread-2] DEBUG org.apache.sshd.common.channel.Window - Closing Window[client/remote](ChannelExec[id=0, recipient=-1]-ClientSessionImpl[karaf#localhost/127.0.0.1:8101])
1816 [main] DEBUG org.apache.sshd.client.SshClient - close(SshClient[4f209819]) Closing immediately
1817 [main] DEBUG org.apache.sshd.common.io.nio2.Nio2Connector - close(org.apache.sshd.common.io.nio2.Nio2Connector#1b1426f4) Closing immediately
1818 [main] DEBUG org.apache.sshd.common.io.nio2.Nio2Session - close(Nio2Session[local=/127.0.0.1:45906, remote=localhost/127.0.0.1:8101]) Closing immediately
1818 [main] DEBUG org.apache.sshd.common.io.nio2.Nio2Session - doCloseImmediately(Nio2Session[local=/127.0.0.1:45906, remote=localhost/127.0.0.1:8101]) closing socket=sun.nio.ch.UnixAsynchronousSocketChannelImpl[connected local=/127.0.0.1:45906 remote=localhost/127.0.0.1:8101]
1819 [sshd-SshClient[4f209819]-nio2-thread-1] DEBUG org.apache.sshd.common.io.nio2.Nio2Session - close(Nio2Session[local=/127.0.0.1:45906, remote=localhost/127.0.0.1:8101])[Immediately] state already Immediate
1820 [main] DEBUG org.apache.sshd.common.io.nio2.Nio2Session - doCloseImmediately(Nio2Session[local=/127.0.0.1:45906, remote=localhost/127.0.0.1:8101]) socket=sun.nio.ch.UnixAsynchronousSocketChannelImpl[closed] closed
1820 [main] DEBUG org.apache.sshd.common.io.nio2.Nio2Connector - unmapSession(id=101): Nio2Session[local=/127.0.0.1:45906, remote=localhost/127.0.0.1:8101]
1820 [main] DEBUG org.apache.sshd.client.session.ClientSessionImpl - close(ClientSessionImpl[karaf#localhost/127.0.0.1:8101]) Closing immediately
1820 [main] DEBUG org.apache.sshd.client.session.ClientSessionImpl - signalAuthFailure([ClientSessionImpl[karaf#localhost/127.0.0.1:8101], SshException, false, Session is being closed]) type=null, signalled={}: {}
1821 [main] DEBUG org.apache.sshd.client.session.ClientConnectionService - close(ClientConnectionService[ClientSessionImpl[karaf#localhost/127.0.0.1:8101]]) Closing immediately
1821 [main] DEBUG org.apache.sshd.client.session.ClientConnectionService - stopHeartBeat(ClientSessionImpl[karaf#localhost/127.0.0.1:8101]) stopping
1821 [main] DEBUG org.apache.sshd.client.session.ClientConnectionService - stopHeartBeat(ClientSessionImpl[karaf#localhost/127.0.0.1:8101]) stopped
1821 [main] DEBUG org.apache.sshd.client.channel.ChannelExec - close(ChannelExec[id=0, recipient=-1]-ClientSessionImpl[karaf#localhost/127.0.0.1:8101])[Immediately] state already Immediate
1822 [main] DEBUG org.apache.sshd.client.session.ClientConnectionService - close(ClientConnectionService[ClientSessionImpl[karaf#localhost/127.0.0.1:8101]])[Immediately] closed
1822 [main] DEBUG org.apache.sshd.client.session.ClientSessionImpl - close(ClientSessionImpl[karaf#localhost/127.0.0.1:8101])[Immediately] closed
1822 [main] DEBUG org.apache.sshd.common.io.nio2.Nio2Session - close(Nio2Session[local=/127.0.0.1:45906, remote=localhost/127.0.0.1:8101])[Immediately] closed
1822 [main] DEBUG org.apache.sshd.common.io.nio2.Nio2Connector - close(org.apache.sshd.common.io.nio2.Nio2Connector#1b1426f4)[Immediately] closed
1822 [main] DEBUG org.apache.sshd.common.io.nio2.Nio2ServiceFactory - close(org.apache.sshd.common.io.nio2.Nio2ServiceFactory#71238fc2) Closing immediately
1822 [main] DEBUG org.apache.sshd.common.io.nio2.Nio2ServiceFactory - Shutdown group
1831 [Thread-2] DEBUG org.apache.sshd.common.channel.AbstractChannel$GracefulChannelCloseable - close(ChannelExec[id=0, recipient=-1]-ClientSessionImpl[karaf#localhost/127.0.0.1:8101])[immediately=true] processing
1831 [main] DEBUG org.apache.sshd.common.io.nio2.Nio2ServiceFactory - Group successfully shut down
1831 [main] DEBUG org.apache.sshd.common.io.nio2.Nio2ServiceFactory - Shutdown executor
1835 [main] DEBUG org.apache.sshd.common.io.nio2.Nio2ServiceFactory - Shutdown complete
1835 [Thread-2] DEBUG org.apache.sshd.client.session.ClientConnectionService - unregisterChannel(ChannelExec[id=0, recipient=-1]-ClientSessionImpl[karaf#localhost/127.0.0.1:8101]) result=ChannelExec[id=0, recipient=-1]-ClientSessionImpl[karaf#localhost/127.0.0.1:8101]
1835 [main] DEBUG org.apache.sshd.common.io.nio2.Nio2ServiceFactory - close(org.apache.sshd.common.io.nio2.Nio2ServiceFactory#71238fc2)[Immediately] closed
1835 [Thread-2] DEBUG org.apache.sshd.common.util.closeable.SequentialCloseable - doClose(org.apache.sshd.common.util.closeable.SequentialCloseable$1#51fc526d) signal close complete immediately=true
1835 [Thread-2] DEBUG org.apache.sshd.common.util.closeable.SequentialCloseable - doClose(org.apache.sshd.common.util.closeable.SequentialCloseable$1#5b38138a) signal close complete immediately=true
1835 [Thread-2] DEBUG org.apache.sshd.common.util.closeable.SequentialCloseable - doClose(org.apache.sshd.common.util.closeable.SequentialCloseable$1#a890633) signal close complete immediately=true
1835 [main] DEBUG org.apache.sshd.common.util.closeable.SequentialCloseable - doClose(org.apache.sshd.common.util.closeable.SequentialCloseable$1#16a0ee18) signal close complete immediately=true
1835 [Thread-2] DEBUG org.apache.sshd.common.io.nio2.Nio2Session - close(Nio2Session[local=/127.0.0.1:45906, remote=localhost/127.0.0.1:8101])[Immediately] state already Closed
1835 [main] DEBUG org.apache.sshd.client.SshClient - close(SshClient[4f209819])[Immediately] closed
1835 [Thread-2] DEBUG org.apache.sshd.common.util.closeable.SequentialCloseable - doClose(org.apache.sshd.common.util.closeable.SequentialCloseable$1#3b9b6f68) signal close complete immediately=true
1836 [Thread-2] DEBUG org.apache.sshd.client.channel.ChannelExec - close(ChannelExec[id=0, recipient=-1]-ClientSessionImpl[karaf#localhost/127.0.0.1:8101])[Immediately] closed
org.apache.sshd.common.SshException: Closed
at org.apache.sshd.common.util.closeable.FuturesCloseable.doClose(FuturesCloseable.java:47)
at org.apache.sshd.common.util.closeable.SimpleCloseable.close(SimpleCloseable.java:63)
at org.apache.sshd.common.util.closeable.SequentialCloseable$1.operationComplete(SequentialCloseable.java:56)
at org.apache.sshd.common.util.closeable.SequentialCloseable$1.operationComplete(SequentialCloseable.java:45)
at org.apache.sshd.common.util.closeable.SequentialCloseable.doClose(SequentialCloseable.java:69)
at org.apache.sshd.common.util.closeable.SimpleCloseable.close(SimpleCloseable.java:63)
at org.apache.sshd.common.util.closeable.SequentialCloseable$1.operationComplete(SequentialCloseable.java:56)
at org.apache.sshd.common.util.closeable.SequentialCloseable$1.operationComplete(SequentialCloseable.java:45)
at org.apache.sshd.common.util.closeable.SequentialCloseable.doClose(SequentialCloseable.java:69)
at org.apache.sshd.common.util.closeable.SimpleCloseable.close(SimpleCloseable.java:63)
at org.apache.sshd.common.util.closeable.AbstractInnerCloseable.doCloseImmediately(AbstractInnerCloseable.java:48)
at org.apache.sshd.common.util.closeable.AbstractCloseable.close(AbstractCloseable.java:95)
at org.apache.karaf.client.Main.lambda$main$1(Main.java:196)
at java.base/java.lang.Thread.run(Thread.java:829)
Does anyone know why this is happening? Is it to do with the keys? If so, what should I change?
I used the above answer to construct a solution that worked for me, using the following pattern:
`ssh -o RequestTTY=force -o StrictHostKeyChecking=no karaf_user@localhost -p KARAF_SSH_PORT -i "/ssh/key" "karaf_command"`
This uses the SSH server that Karaf is packaged with, the same one the client command wraps internally to run Karaf console commands over SSH. This approach requires you to put a public key for your Karaf user in the keys.properties file and to SSH with the matching private key.
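For reference, a minimal sketch of what that can look like. The user name and key material below are placeholders, not values from this question; if I recall the format correctly, the value is the base64 body of the public key (no "ssh-rsa" prefix or trailing comment), followed by the user's group:
# etc/keys.properties -- hypothetical entry, replace with your own user and key
karaf_user=AAAAB3NzaC1yc2EAAAADAQABAAABAQ...,_g_:admingroup
With that entry in place, the pattern above connects to Karaf's embedded SSH server (port 8101 by default, as seen in the log) using the matching private key.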
For more details see:
the karaf security docs: https://karaf.apache.org/manual/latest/security
the karaf sshd docs: https://karaf.apache.org/manual/latest/#_sshd_server
I came across a similar problem with the script returning a "Closed" status.
After some tests, I found the script fails when there is no TTY (as is the case if you use tools like Ansible).
You should use the 'batch mode' included in the karaf client script, so your command becomes:
$KARAF_HOME/bin/client -u karaf -p karaf -b <<< "features:install myfeature"
This way, the script won't ask for a TTY but will run the commands from standard input. In my example I use a 'here string' to redirect to standard input, but you could of course use any redirection you like to pass commands in.
There is also a '-f' option if you prefer to read commands from a file.
See bin/client --help for details
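For instance, a minimal sketch of the -f variant (the file path and feature name here are made up):
# write the console commands to a file, then run them non-interactively
echo "features:install myfeature" > /tmp/karaf.cmds
$KARAF_HOME/bin/client -u karaf -p karaf -f /tmp/karaf.cmds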
Alternatively, if you have control of the ssh commands, you could use the option on ssh to force tty allocation when connecting to your remote machine:
ssh -o RequestTTY=force my_server $KARAF_HOME/bin/client -u karaf -p karaf "features:install myfeature"
(see https://man7.org/linux/man-pages/man5/ssh_config.5.html)

Unable to open iterator for alias <alias_name>

I know this is one of the most repeated questions. I have looked almost everywhere, and none of the resources could resolve the issue I am facing.
Below is a simplified version of my problem statement. The actual data is a little more complex, so I have to use a UDF.
My input File: (input.txt)
NotNeeded1,NotNeeded11;Needed1
NotNeeded2,NotNeeded22;Needed2
I want the output to be
Needed1
Needed2
So, I am writing the below UDF
(Java code):
package com.company.pig;

import java.io.IOException;
import org.apache.pig.EvalFunc;
import org.apache.pig.data.Tuple;

public class myudf extends EvalFunc<String> {
    public String exec(Tuple input) throws IOException {
        if (input == null || input.size() == 0)
            return null;
        String s = (String) input.get(0);
        String str = s.split("\\,")[1];
        String str1 = str.split("\\;")[1];
        return str1;
    }
}
And packaging it into
rollupreg_extract-jar-with-dependencies.jar
Below is my pig shell code
grunt> REGISTER /pig/rollupreg_extract-jar-with-dependencies.jar;
grunt> DEFINE myudf com.company.pig.myudf;
grunt> data = LOAD 'hdfs://sandbox.hortonworks.com:8020/pig_hdfs/input.txt' USING PigStorage(',');
grunt> extract = FOREACH data GENERATE myudf($1);
grunt> DUMP extract;
And I get the below error:
2017-05-15 15:58:15,493 [main] INFO org.apache.pig.tools.pigstats.ScriptState - Pig features used in the script: UNKNOWN
2017-05-15 15:58:15,577 [main] INFO org.apache.pig.data.SchemaTupleBackend - Key [pig.schematuple] was not set... will not generate code.
2017-05-15 15:58:15,659 [main] INFO org.apache.pig.newplan.logical.optimizer.LogicalPlanOptimizer - {RULES_ENABLED=[AddForEach, ColumnMapKeyPrune, ConstantCalculator, GroupByConstParallelSetter, LimitOptimizer, LoadTypeCastInserter, MergeFilter, MergeForEach, PartitionFilterOptimizer, PredicatePushdownOptimizer, PushDownForEachFlatten, PushUpFilter, SplitFilter, StreamTypeCastInserter]}
2017-05-15 15:58:15,774 [main] INFO org.apache.pig.impl.util.SpillableMemoryManager - Selected heap (PS Old Gen) of size 699400192 to monitor. collectionUsageThreshold = 489580128, usageThreshold = 489580128
2017-05-15 15:58:15,865 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MRCompiler - File concatenation threshold: 100 optimistic? false
2017-05-15 15:58:15,923 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MultiQueryOptimizer - MR plan size before optimization: 1
2017-05-15 15:58:15,923 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MultiQueryOptimizer - MR plan size after optimization: 1
2017-05-15 15:58:16,184 [main] INFO org.apache.hadoop.yarn.client.api.impl.TimelineClientImpl - Timeline service address: http://sandbox.hortonworks.com:8188/ws/v1/timeline/
2017-05-15 15:58:16,196 [main] INFO org.apache.hadoop.yarn.client.RMProxy - Connecting to ResourceManager at sandbox.hortonworks.com/172.17.0.2:8050
2017-05-15 15:58:16,396 [main] INFO org.apache.hadoop.yarn.client.AHSProxy - Connecting to Application History server at sandbox.hortonworks.com/172.17.0.2:10200
2017-05-15 15:58:16,576 [main] INFO org.apache.pig.tools.pigstats.mapreduce.MRScriptState - Pig script settings are added to the job
2017-05-15 15:58:16,580 [main] WARN org.apache.pig.tools.pigstats.ScriptState - unable to read pigs manifest file
2017-05-15 15:58:16,584 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler - mapred.job.reduce.markreset.buffer.percent is not set, set to default 0.3
2017-05-15 15:58:16,588 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler - This job cannot be converted run in-process
2017-05-15 15:58:17,258 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler - Added jar file:/pig/rollupreg_extract-jar-with-dependencies.jar to DistributedCache through /tmp/temp-1119775568/tmp-858482998/rollupreg_extract-jar-with-dependencies.jar
2017-05-15 15:58:17,276 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler - Setting up single store job
2017-05-15 15:58:17,294 [main] INFO org.apache.pig.data.SchemaTupleFrontend - Key [pig.schematuple] is false, will not generate code.
2017-05-15 15:58:17,295 [main] INFO org.apache.pig.data.SchemaTupleFrontend - Starting process to move generated code to distributed cacche
2017-05-15 15:58:17,295 [main] INFO org.apache.pig.data.SchemaTupleFrontend - Setting key [pig.schematuple.classes] with classes to deserialize []
2017-05-15 15:58:17,354 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - 1 map-reduce job(s) waiting for submission.
2017-05-15 15:58:17,510 [JobControl] INFO org.apache.hadoop.yarn.client.api.impl.TimelineClientImpl - Timeline service address: http://sandbox.hortonworks.com:8188/ws/v1/timeline/
2017-05-15 15:58:17,511 [JobControl] INFO org.apache.hadoop.yarn.client.RMProxy - Connecting to ResourceManager at sandbox.hortonworks.com/172.17.0.2:8050
2017-05-15 15:58:17,511 [JobControl] INFO org.apache.hadoop.yarn.client.AHSProxy - Connecting to Application History server at sandbox.hortonworks.com/172.17.0.2:10200
2017-05-15 15:58:17,753 [JobControl] WARN org.apache.hadoop.mapreduce.JobResourceUploader - No job jar file set. User classes may not be found. See Job or Job#setJar(String).
2017-05-15 15:58:17,820 [JobControl] INFO org.apache.pig.builtin.PigStorage - Using PigTextInputFormat
2017-05-15 15:58:17,830 [JobControl] INFO org.apache.hadoop.mapreduce.lib.input.FileInputFormat - Total input paths to process : 1
2017-05-15 15:58:17,830 [JobControl] INFO org.apache.pig.backend.hadoop.executionengine.util.MapRedUtil - Total input paths to process : 1
2017-05-15 15:58:17,884 [JobControl] INFO com.hadoop.compression.lzo.GPLNativeCodeLoader - Loaded native gpl library
2017-05-15 15:58:17,889 [JobControl] INFO com.hadoop.compression.lzo.LzoCodec - Successfully loaded & initialized native-lzo library [hadoop-lzo rev 7a4b57bedce694048432dd5bf5b90a6c8ccdba80]
2017-05-15 15:58:17,922 [JobControl] INFO org.apache.pig.backend.hadoop.executionengine.util.MapRedUtil - Total input paths (combined) to process : 1
2017-05-15 15:58:18,525 [JobControl] INFO org.apache.hadoop.mapreduce.JobSubmitter - number of splits:1
2017-05-15 15:58:18,692 [JobControl] INFO org.apache.hadoop.mapreduce.JobSubmitter - Submitting tokens for job: job_1494853652295_0023
2017-05-15 15:58:18,879 [JobControl] INFO org.apache.hadoop.mapred.YARNRunner - Job jar is not present. Not adding any jar to the list of resources.
2017-05-15 15:58:18,973 [JobControl] INFO org.apache.hadoop.yarn.client.api.impl.YarnClientImpl - Submitted application application_1494853652295_0023
2017-05-15 15:58:19,029 [JobControl] INFO org.apache.hadoop.mapreduce.Job - The url to track the job: http://sandbox.hortonworks.com:8088/proxy/application_1494853652295_0023/
2017-05-15 15:58:19,030 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - HadoopJobId: job_1494853652295_0023
2017-05-15 15:58:19,030 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - Processing aliases data,extract
2017-05-15 15:58:19,030 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - detailed locations: M: data[2,7],extract[3,10] C: R:
2017-05-15 15:58:19,044 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - 0% complete
2017-05-15 15:58:19,044 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - Running jobs are [job_1494853652295_0023]
2017-05-15 15:58:29,156 [main] WARN org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - Ooops! Some job has failed! Specify -stop_on_failure if you want Pig to stop immediately on failure.
2017-05-15 15:58:29,156 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - job job_1494853652295_0023 has failed! Stop running all dependent jobs
2017-05-15 15:58:29,157 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - 100% complete
2017-05-15 15:58:29,790 [main] INFO org.apache.hadoop.yarn.client.api.impl.TimelineClientImpl - Timeline service address: http://sandbox.hortonworks.com:8188/ws/v1/timeline/
2017-05-15 15:58:29,791 [main] INFO org.apache.hadoop.yarn.client.RMProxy - Connecting to ResourceManager at sandbox.hortonworks.com/172.17.0.2:8050
2017-05-15 15:58:29,793 [main] INFO org.apache.hadoop.yarn.client.AHSProxy - Connecting to Application History server at sandbox.hortonworks.com/172.17.0.2:10200
2017-05-15 15:58:30,311 [main] INFO org.apache.hadoop.yarn.client.api.impl.TimelineClientImpl - Timeline service address: http://sandbox.hortonworks.com:8188/ws/v1/timeline/
2017-05-15 15:58:30,312 [main] INFO org.apache.hadoop.yarn.client.RMProxy - Connecting to ResourceManager at sandbox.hortonworks.com/172.17.0.2:8050
2017-05-15 15:58:30,313 [main] INFO org.apache.hadoop.yarn.client.AHSProxy - Connecting to Application History server at sandbox.hortonworks.com/172.17.0.2:10200
2017-05-15 15:58:30,465 [main] ERROR org.apache.pig.tools.pigstats.mapreduce.MRPigStatsUtil - 1 map reduce job(s) failed!
2017-05-15 15:58:30,467 [main] WARN org.apache.pig.tools.pigstats.ScriptState - unable to read pigs manifest file
2017-05-15 15:58:30,472 [main] INFO org.apache.pig.tools.pigstats.mapreduce.SimplePigStats - Script Statistics:
HadoopVersion PigVersion UserId StartedAt FinishedAt Features
2.7.3.2.5.0.0-1245 root 2017-05-15 15:58:16 2017-05-15 15:58:30 UNKNOWN
Failed!
Failed Jobs:
JobId Alias Feature Message Outputs
job_1494853652295_0023 data,extract MAP_ONLY Message: Job failed! hdfs://sandbox.hortonworks.com:8020/tmp/temp-1119775568/tmp-1619300225,
Input(s):
Failed to read data from "/pig_hdfs/input.txt"
Output(s):
Failed to produce result in "hdfs://sandbox.hortonworks.com:8020/tmp/temp-1119775568/tmp-1619300225"
Counters:
Total records written : 0
Total bytes written : 0
Spillable Memory Manager spill count : 0
Total bags proactively spilled: 0
Total records proactively spilled: 0
Job DAG:
job_1494853652295_0023
2017-05-15 15:58:30,472 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - Failed!
2017-05-15 15:58:30,499 [main] ERROR org.apache.pig.tools.grunt.Grunt - ERROR 1066: Unable to open iterator for alias extract
Details at logfile: /pig/pig_1494863836458.log
I know it complains that
Failed to read data from "/pig_hdfs/input.txt"
But I am sure this is not the actual issue. If I don't use the UDF and directly dump the data, I get the output. So, this is not the issue.
First, you do not need a UDF to get the desired output. You can use the semicolon as the delimiter in the load statement and get the needed column.
data = LOAD 'hdfs://sandbox.hortonworks.com:8020/pig_hdfs/input.txt' USING PigStorage(';');
extract = FOREACH data GENERATE $1;
DUMP extract;
If you insist on using a UDF, then you will have to load the record into a single field and then apply the UDF. Also, your UDF is incorrect: you should split the string s (the whole record passed in from the Pig script) with ';' as the delimiter.
String s = (String)input.get(0);
String str1 = s.split("\\;")[1];
And in your Pig script, you need to load the entire record into one field and apply the UDF to field $0.
REGISTER /pig/rollupreg_extract-jar-with-dependencies.jar;
DEFINE myudf com.company.pig.myudf;
data = LOAD 'hdfs://sandbox.hortonworks.com:8020/pig_hdfs/input.txt' AS (f1:chararray);
extract = FOREACH data GENERATE myudf($0);
DUMP extract;
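For completeness, here is roughly what the whole corrected UDF would look like under the above fix (the guard on the split result is a defensive addition of mine, not part of the original answer):
package com.company.pig;

import java.io.IOException;
import org.apache.pig.EvalFunc;
import org.apache.pig.data.Tuple;

public class myudf extends EvalFunc<String> {
    public String exec(Tuple input) throws IOException {
        if (input == null || input.size() == 0)
            return null;
        // the whole record arrives as a single chararray field ($0)
        String s = (String) input.get(0);
        // keep what follows the ';', e.g. "Needed1" from "NotNeeded1,NotNeeded11;Needed1"
        String[] parts = s.split("\\;");
        return parts.length > 1 ? parts[1] : null;
    }
}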

Pentaho DI 6.1, error using mbox in Email Messages Input step

This time I need help with this software. I'm trying to create a transformation that, given an mbox, returns certain parts of the emails. But when I use the Email Messages Input step's preview function, Pentaho returns this:
2016/09/09 14:52:53 - cfgbuilder - Warning: The configuration parameter [org] is not supported by the default configuration builder for scheme: sftp
2016/09/09 14:54:58 - DBCache - Loading database cache from file: [C:\Users\fangonzalez\.kettle\db.cache-6.1.0.1-196]
2016/09/09 14:54:58 - DBCache - We read 0 cached rows from the database cache!
2016/09/09 14:54:59 - Spoon - Trying to open the last file used.
2016/09/09 15:03:37 - C:\Users\fangonzalez\Desktop\Pentaho\trans.ktr : trans - Dispatching started for transformation [C:\Users\fangonzalez\Desktop\Pentaho\trans.ktr : trans]
2016/09/09 15:03:37 - Email messages input.0 - ERROR (version 6.1.0.1-196, build 1 from 2016-04-07 12.08.49 by buildguy) : Error opening folder 1 : java.lang.NullPointerException
2016/09/09 15:03:37 - Email messages input.0 - ERROR (version 6.1.0.1-196, build 1 from 2016-04-07 12.08.49 by buildguy) : java.lang.NullPointerException
2016/09/09 15:03:37 - Email messages input.0 - at org.pentaho.di.trans.steps.mailinput.MailInput.openNextFolder(MailInput.java:347)
2016/09/09 15:03:37 - Email messages input.0 - at org.pentaho.di.trans.steps.mailinput.MailInput.getOneRow(MailInput.java:214)
2016/09/09 15:03:37 - Email messages input.0 - at org.pentaho.di.trans.steps.mailinput.MailInput.processRow(MailInput.java:75)
2016/09/09 15:03:37 - Email messages input.0 - at org.pentaho.di.trans.step.RunThread.run(RunThread.java:62)
2016/09/09 15:03:37 - Email messages input.0 - at java.lang.Thread.run(Unknown Source)
2016/09/09 15:03:37 - Email messages input.0 - Finished processing (I=0, O=0, R=0, W=0, U=0, E=1)
2016/09/09 15:03:37 - C:\Users\fangonzalez\Desktop\Pentaho\trans.ktr : trans - Transformation detected one or more steps with errors.
2016/09/09 15:03:37 - C:\Users\fangonzalez\Desktop\Pentaho\trans.ktr : trans - Transformation is killing the other steps!
Here is the pic of the step config screen.
The option "Fetch in batches" cannot be enable if you use only one mbox.

Storm test case backtype.storm.multilang fails with backtype.storm.multilang.NoOutputException

I cloned the latest version (master) of the Storm source code from https://github.com/apache/storm.git. I am using Ubuntu 14.04.
But when I run the "mvn test" command, the test process fails and terminates at backtype.storm.multilang-test.
Here is the content of the backtype.storm.multilang-test.xml file:
171293 [Thread-1212-1] ERROR b.s.util - Async loop died!
java.lang.RuntimeException: backtype.storm.multilang.NoOutputException: Pipe to subprocess seems to be broken! No output read.
Serializer Exception:
at backtype.storm.utils.ShellProcess.launch(ShellProcess.java:91) ~[classes/:?]
at backtype.storm.spout.ShellSpout.open(ShellSpout.java:84) ~[classes/:?]
at backtype.storm.daemon.executor$fn__4594$fn__4609.invoke(executor.clj:559) ~[classes/:?]
at backtype.storm.util$async_loop$fn__643.invoke(util.clj:473) [classes/:?]
at clojure.lang.AFn.run(AFn.java:22) [clojure-1.6.0.jar:?]
at java.lang.Thread.run(Thread.java:745) [?:1.7.0_79]
171294 [Thread-1212-1] ERROR b.s.d.executor -
java.lang.RuntimeException: backtype.storm.multilang.NoOutputException: Pipe to subprocess seems to be broken! No output read.
Serializer Exception:
at backtype.storm.utils.ShellProcess.launch(ShellProcess.java:91) ~[classes/:?]
at backtype.storm.spout.ShellSpout.open(ShellSpout.java:84) ~[classes/:?]
at backtype.storm.daemon.executor$fn__4594$fn__4609.invoke(executor.clj:559) ~[classes/:?]
at backtype.storm.util$async_loop$fn__643.invoke(util.clj:473) [classes/:?]
at clojure.lang.AFn.run(AFn.java:22) [clojure-1.6.0.jar:?]
at java.lang.Thread.run(Thread.java:745) [?:1.7.0_79]
171297 [Thread-1221-2] INFO b.s.d.executor - Preparing bolt 2:(2)
171297 [Thread-1221-2] INFO b.s.u.ShellProcess - Storm multilang serializer: backtype.storm.multilang.JsonSerializer
171298 [Thread-1203-EventThread] INFO o.a.c.f.s.ConnectionStateManager - State change: CONNECTED
171300 [Thread-1217-__acker] INFO b.s.d.executor - Preparing bolt __acker:(13)
171301 [Thread-1217-__acker] INFO b.s.d.executor - Prepared bolt __acker:(13)
171301 [Thread-1214-__system] INFO b.s.d.executor - Preparing bolt __system:(-1)
171300 [Thread-1226-__acker] INFO b.s.d.executor - Preparing bolt __acker:(3)
171302 [Thread-1214-__system] INFO b.s.d.executor - Prepared bolt __system:(-1)
171302 [Thread-1226-__acker] INFO b.s.d.executor - Prepared bolt __acker:(3)
171302 [Thread-1228-__acker] INFO b.s.d.executor - Preparing bolt __acker:(14)
171302 [Thread-1228-__acker] INFO b.s.d.executor - Prepared bolt __acker:(14)
171300 [Thread-1223-__system] INFO b.s.d.executor - Preparing bolt __system:(-1)
171304 [Thread-1223-__system] INFO b.s.d.executor - Prepared bolt __system:(-1)
171304 [Thread-1221-2] ERROR b.s.util - Async loop died!
java.lang.RuntimeException: Error when launching multilang subprocess
at backtype.storm.utils.ShellProcess.launch(ShellProcess.java:87) ~[classes/:?]
at backtype.storm.task.ShellBolt.prepare(ShellBolt.java:126) ~[classes/:?]
at backtype.storm.daemon.executor$fn__4664$fn__4677.invoke(executor.clj:755) ~[classes/:?]
at backtype.storm.util$async_loop$fn__643.invoke(util.clj:473) [classes/:?]
at clojure.lang.AFn.run(AFn.java:22) [clojure-1.6.0.jar:?]
at java.lang.Thread.run(Thread.java:745) [?:1.7.0_79]
Caused by: java.io.IOException: Broken pipe
at java.io.FileOutputStream.writeBytes(Native Method) ~[?:1.7.0_79]
at java.io.FileOutputStream.write(FileOutputStream.java:345) ~[?:1.7.0_79]
at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82) ~[?:1.7.0_79]
at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140) ~[?:1.7.0_79]
at java.io.DataOutputStream.flush(DataOutputStream.java:123) ~[?:1.7.0_79]
at backtype.storm.multilang.JsonSerializer.writeString(JsonSerializer.java:96) ~[classes/:?]
at backtype.storm.multilang.JsonSerializer.writeMessage(JsonSerializer.java:89) ~[classes/:?]
at backtype.storm.multilang.JsonSerializer.connect(JsonSerializer.java:61) ~[classes/:?]
at backtype.storm.utils.ShellProcess.launch(ShellProcess.java:85) ~[classes/:?]
... 5 more
171306 [Thread-1221-2] ERROR b.s.d.executor -
java.lang.RuntimeException: Error when launching multilang subprocess
at backtype.storm.utils.ShellProcess.launch(ShellProcess.java:87) ~[classes/:?]
at backtype.storm.task.ShellBolt.prepare(ShellBolt.java:126) ~[classes/:?]
at backtype.storm.daemon.executor$fn__4664$fn__4677.invoke(executor.clj:755) ~[classes/:?]
at backtype.storm.util$async_loop$fn__643.invoke(util.clj:473) [classes/:?]
at clojure.lang.AFn.run(AFn.java:22) [clojure-1.6.0.jar:?]
at java.lang.Thread.run(Thread.java:745) [?:1.7.0_79]
Caused by: java.io.IOException: Broken pipe
at java.io.FileOutputStream.writeBytes(Native Method) ~[?:1.7.0_79]
at java.io.FileOutputStream.write(FileOutputStream.java:345) ~[?:1.7.0_79]
at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82) ~[?:1.7.0_79]
at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140) ~[?:1.7.0_79]
at java.io.DataOutputStream.flush(DataOutputStream.java:123) ~[?:1.7.0_79]
at backtype.storm.multilang.JsonSerializer.writeString(JsonSerializer.java:96) ~[classes/:?]
at backtype.storm.multilang.JsonSerializer.writeMessage(JsonSerializer.java:89) ~[classes/:?]
at backtype.storm.multilang.JsonSerializer.connect(JsonSerializer.java:61) ~[classes/:?]
at backtype.storm.utils.ShellProcess.launch(ShellProcess.java:85) ~[classes/:?]
... 5 more
171308 [Thread-1231-__system] INFO b.s.d.executor - Preparing bolt __system:(-1)
171308 [Thread-1231-__system] INFO b.s.d.executor - Prepared bolt __system:(-1)
171309 [Thread-1233-__system] INFO b.s.d.executor - Preparing bolt __system:(-1)
171310 [Thread-1233-__system] INFO b.s.d.executor - Prepared bolt __system:(-1)
171315 [Thread-1235-__acker] INFO b.s.d.executor - Preparing bolt __acker:(15)
171315 [Thread-1203] INFO b.s.s.a.AuthUtils - Got AutoCreds []
171315 [Thread-1235-__acker] INFO b.s.d.executor - Prepared bolt __acker:(15)
171315 [Thread-1203] INFO b.s.d.worker - Reading Assignments.
171316 [Thread-1238-__acker] INFO b.s.d.executor - Preparing bolt __acker:(16)
171316 [Thread-1238-__acker] INFO b.s.d.executor - Prepared bolt __acker:(16)
171319 [refresh-active-timer] INFO b.s.d.worker - All connections are ready for worker 7dea90e5-ca16-4136-a112-88d756cd9014:1028 with id a74f15a4-ce44-4959-89a5-483b4aff164f
171324 [Thread-1203] INFO b.s.d.worker - Launching receive-thread for 6737d54e-6cc5-4de7-af27-171ba758fb5b:1025
171325 [Thread-1286-worker-receiver-thread-0] INFO b.s.m.loader - Starting receive-thread: [stormId: test-1-1438588543, port: 1025, thread-id: 0 ]
171325 [Thread-1240-__acker] INFO b.s.d.executor - Preparing bolt __acker:(4)
171325 [Thread-1240-__acker] INFO b.s.d.executor - Prepared bolt __acker:(4)
171325 [Thread-1212-1] ERROR b.s.util - Halting process: ("Worker died")
java.lang.RuntimeException: ("Worker died")
at backtype.storm.util$exit_process_BANG_.doInvoke(util.clj:332) [classes/:?]
at clojure.lang.RestFn.invoke(RestFn.java:423) [clojure-1.6.0.jar:?]
at backtype.storm.daemon.worker$fn__5151$fn__5152.invoke(worker.clj:532) [classes/:?]
at backtype.storm.daemon.executor$mk_executor_data$fn__4493$fn__4494.invoke(executor.clj:261) [classes/:?]
at backtype.storm.util$async_loop$fn__643.invoke(util.clj:485) [classes/:?]
at clojure.lang.AFn.run(AFn.java:22) [clojure-1.6.0.jar:?]
at java.lang.Thread.run(Thread.java:745) [?:1.7.0_79]
171325 [Thread-1221-2] ERROR b.s.util - Halting process: ("Worker died")
java.lang.RuntimeException: ("Worker died")
at backtype.storm.util$exit_process_BANG_.doInvoke(util.clj:332) [classes/:?]
at clojure.lang.RestFn.invoke(RestFn.java:423) [clojure-1.6.0.jar:?]
at backtype.storm.daemon.worker$fn__5151$fn__5152.invoke(worker.clj:532) [classes/:?]
at backtype.storm.daemon.executor$mk_executor_data$fn__4493$fn__4494.invoke(executor.clj:261) [classes/:?]
at backtype.storm.util$async_loop$fn__643.invoke(util.clj:485) [classes/:?]
at clojure.lang.AFn.run(AFn.java:22) [clojure-1.6.0.jar:?]
at java.lang.Thread.run(Thread.java:745) [?:1.7.0_79]
Any help is highly appreciated.
This problem was solved after I installed nodejs manually. The error is due to the improper default nodejs installation on Ubuntu.
Here is a list of software required to pass all the tests in Apache Storm:
nodejs
python
zmq
ruby (requires the json module to be installed)
Hope this list is helpful for anyone failing to pass the tests.
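In case it saves someone a search: on Ubuntu of that era the distribution package installs the interpreter as nodejs while the multilang tests invoke node, so a common workaround (an assumption about the setup, not something stated in the answer above) is to install the package and make node resolve to it:
sudo apt-get install -y nodejs
# make "node" point at Ubuntu's "nodejs" binary
sudo ln -s /usr/bin/nodejs /usr/local/bin/node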

PIG CqlStorage not working with Integers

I am running the Pig example from DataStax: http://www.datastax.com/docs/datastax_enterprise3.1/solutions/about_pig#pig-read-write. I am using DataStax Enterprise 3.1.2. But when I try to save the data back into Cassandra with:
grunt> STORE insertformat INTO
'cql://cql3ks/test?output_query=UPDATE+cql3ks.test+set+b+%3D+%3F'
USING CqlStorage;
I get the following output:
2014-03-11 10:14:38,383 [main] INFO org.apache.pig.tools.pigstats.ScriptState - Pig features used in the script: UNKNOWN
2014-03-11 10:14:38,440 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MRCompiler - File concatenation threshold: 100 optimistic? false
2014-03-11 10:14:38,442 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MultiQueryOptimizer - MR plan size before optimization: 1
2014-03-11 10:14:38,442 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MultiQueryOptimizer - MR plan size after optimization: 1
2014-03-11 10:14:38,451 [main] INFO org.apache.pig.tools.pigstats.ScriptState - Pig script settings are added to the job
2014-03-11 10:14:38,452 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler - mapred.job.reduce.markreset.buffer.percent is not set, set to default 0.3
2014-03-11 10:14:38,452 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler - creating jar file Job1332293282461754849.jar
2014-03-11 10:14:40,560 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler - jar file Job1332293282461754849.jar created
2014-03-11 10:14:40,569 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler - Setting up single store job
2014-03-11 10:14:40,597 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - 1 map-reduce job(s) waiting for submission.
2014-03-11 10:14:41,111 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - 0% complete
2014-03-11 10:14:43,934 [Thread-10] INFO org.apache.pig.backend.hadoop.executionengine.util.MapRedUtil - Total input paths (combined) to process : 1
2014-03-11 10:14:45,547 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - HadoopJobId: job_201403091619_0036
2014-03-11 10:14:45,547 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - More information at: http://127.0.0.1:50030/jobdetails.jsp?jobid=job_201403091619_0036
2014-03-11 10:17:52,330 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - job job_201403091619_0036 has failed! Stop running all dependent jobs
2014-03-11 10:17:52,330 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - 100% complete
2014-03-11 10:17:52,334 [main] ERROR org.apache.pig.tools.pigstats.SimplePigStats - ERROR 2997: Unable to recreate exception from backed error: java.io.IOException: InvalidRequestException(why:Expected 4 or 0 byte int (11))
2014-03-11 10:17:52,335 [main] ERROR org.apache.pig.tools.pigstats.PigStatsUtil - 1 map reduce job(s) failed!
2014-03-11 10:17:52,335 [main] INFO org.apache.pig.tools.pigstats.SimplePigStats - Script Statistics:
HadoopVersion PigVersion UserId StartedAt FinishedAt Features
1.0.4.8 0.9.2 root 2014-03-11 10:14:38 2014-03-11 10:17:52 UNKNOWN
Failed!
Failed Jobs:
JobId Alias Feature Message Outputs
job_201403091619_0036 insertformat,moretestvalues MAP_ONLY Message: Job failed! Error - # of failed Map Tasks exceeded allowed limit. FailedCount: 1. LastFailedTask: task_201403091619_0036_m_000000 cql://cql3ks/test?output_query=UPDATE+cql3ks.test+set+b+%3D+%3F,
Input(s):
Failed to read data from "cql://cql3ks/moredata/"
Output(s):
Failed to produce result in "cql://cql3ks/test?output_query=UPDATE+cql3ks.test+set+b+%3D+%3F"
Counters:
Total records written : 0
Total bytes written : 0
Spillable Memory Manager spill count : 0
Total bags proactively spilled: 0
Total records proactively spilled: 0
Job DAG:
job_201403091619_0036
2014-03-11 10:17:52,335 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - Failed!
The Log-File is:
Backend error message
---------------------
java.io.IOException: InvalidRequestException(why:Expected 4 or 0 byte int (11))
at org.apache.cassandra.hadoop.cql3.CqlRecordWriter$RangeClient.run(CqlRecordWriter.java:248)
Caused by: InvalidRequestException(why:Expected 4 or 0 byte int (11))
at org.apache.cassandra.thrift.Cassandra$execute_prepared_cql3_query_result.read(Cassandra.java:41868)
at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:78)
at org.apache.cassandra.thrift.Cassandra$Client.recv_execute_prepared_cql3_query(Cassandra.java:1689)
at org.apache.cassandra.thrift.Cassandra$Client.execute_prepared_cql3_query(Cassandra.java:1674)
at org.apache.cassandra.hadoop.cql3.CqlRecordWriter$RangeClient.run(CqlRecordWriter.java:232)
What am I doing wrong? To me, it looks like a bug, because when I use strings instead of integers in CQL when creating the table, the example works fine.
Thank you
I just tested it with a fresh install of DSE 3.1.2; it works for me. You may need to re-install DSE and re-create the tables to test it again.