java.lang.NoSuchFieldError: toStringWriter when using Karate

Facing this error:
10:28:30.552 [main][] INFO com.intuit.karate - >> lock acquired, begin callSingle: classpath:preview-srs-session.feature
Exception in thread "main" java.lang.NoSuchFieldError: toStringWriter
at com.intuit.karate.JsonUtils$NashornObjectJsonWriter.writeJSONString(JsonUtils.java:76)
at com.intuit.karate.JsonUtils$NashornObjectJsonWriter.writeJSONString(JsonUtils.java:68)
at net.minidev.json.JSONValue.writeJSONString(JSONValue.java:596)
at net.minidev.json.reader.JsonWriter.writeJSONKV(JsonWriter.java:354)
at net.minidev.json.reader.JsonWriter$7.writeJSONString(JsonWriter.java:141)
at net.minidev.json.reader.JsonWriter$7.writeJSONString(JsonWriter.java:123)
at net.minidev.json.JSONValue.writeJSONString(JSONValue.java:596)
at net.minidev.json.reader.JsonWriter.writeJSONKV(JsonWriter.java:354)
at net.minidev.json.reader.JsonWriter$7.writeJSONString(JsonWriter.java:141)
at net.minidev.json.reader.JsonWriter$7.writeJSONString(JsonWriter.java:123)
at com.intuit.karate.JsonUtils$NashornObjectJsonWriter.writeJSONString(JsonUtils.java:78)
at com.intuit.karate.JsonUtils$NashornObjectJsonWriter.writeJSONString(JsonUtils.java:68)
at net.minidev.json.JSONValue.writeJSONString(JSONValue.java:596)
at net.minidev.json.JSONValue.toJSONString(JSONValue.java:632)
at net.minidev.json.JSONValue.toJSONString(JSONValue.java:610)
at com.intuit.karate.JsonUtils.toJson(JsonUtils.java:130)
at com.intuit.karate.JsonUtils.toJsonDoc(JsonUtils.java:172)
at com.intuit.karate.ScriptValue.<init>(ScriptValue.java:425)
at com.intuit.karate.ScriptValue.<init>(ScriptValue.java:417)
at com.intuit.karate.ScriptValueMap.put(ScriptValueMap.java:30)
at com.intuit.karate.core.ScenarioContext.<init>(ScenarioContext.java:292)
at com.intuit.karate.StepActions.<init>(StepActions.java:53)
at com.intuit.karate.core.ScenarioExecutionUnit.init(ScenarioExecutionUnit.java:141)
at com.intuit.karate.core.ScenarioExecutionUnit.run(ScenarioExecutionUnit.java:236)
at com.intuit.karate.core.FeatureExecutionUnit.run(FeatureExecutionUnit.java:164)
at com.intuit.karate.core.FeatureExecutionUnit.run(FeatureExecutionUnit.java:73)
at com.intuit.karate.core.Engine.executeFeatureSync(Engine.java:109)
at com.intuit.karate.Script.evalFeatureCall(Script.java:1769)
at com.intuit.karate.Script.evalFeatureCall(Script.java:1745)
at com.intuit.karate.core.ScriptBridge.call(ScriptBridge.java:421)
at com.intuit.karate.core.ScriptBridge.callSingle(ScriptBridge.java:450)
at jdk.nashorn.internal.scripts.Script$Recompilation$7$11$\^eval\_.L:1(<eval>:92)
at jdk.nashorn.internal.runtime.ScriptFunctionData.invoke(ScriptFunctionData.java:637)
at jdk.nashorn.internal.runtime.ScriptFunction.invoke(ScriptFunction.java:494)
at jdk.nashorn.internal.runtime.ScriptRuntime.apply(ScriptRuntime.java:393)
at jdk.nashorn.api.scripting.ScriptObjectMirror.call(ScriptObjectMirror.java:117)
at com.intuit.karate.Script.evalJsFunctionCall(Script.java:1693)
at com.intuit.karate.Script.call(Script.java:1650)
at com.intuit.karate.Script.callAndUpdateConfigAndAlsoVarsIfMapReturned(Script.java:1786)
at com.intuit.karate.core.ScenarioContext.<init>(ScenarioContext.java:267)
at com.intuit.karate.StepActions.<init>(StepActions.java:53)
at com.intuit.karate.core.ScenarioExecutionUnit.init(ScenarioExecutionUnit.java:141)
at com.intuit.karate.core.ScenarioExecutionUnit.run(ScenarioExecutionUnit.java:236)
at com.intuit.karate.core.FeatureExecutionUnit.run(FeatureExecutionUnit.java:164)
at com.intuit.karate.core.FeatureExecutionUnit.run(FeatureExecutionUnit.java:73)
at com.intuit.karate.core.Engine.executeFeatureSync(Engine.java:109)
at com.intuit.karate.IdeUtils.exec(IdeUtils.java:64)
at cucumber.api.cli.Main.main(Main.java:36)
Disconnected from the target VM, address: '127.0.0.1:61606', transport: 'socket'
Process finished with exit code 1

This happens when Karate is mixed into a project whose other dependencies pull a conflicting (typically older) version of net.minidev:json-smart onto the classpath, one that lacks the JsonWriter.toStringWriter field that Karate's JsonUtils expects, hence the NoSuchFieldError.
Thanks to this article, the solution is to declare the dependency explicitly so the expected version is resolved:
testCompile "net.minidev:json-smart:2.3"
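On newer Gradle versions, where testImplementation has replaced testCompile, the equivalent declaration would look roughly like the sketch below; the Karate artifact and version shown are only illustrative, and you can confirm which json-smart version actually wins with ./gradlew dependencies:
dependencies {
    testImplementation 'com.intuit.karate:karate-junit5:0.9.6' // illustrative Karate coordinates
    testImplementation 'net.minidev:json-smart:2.3'            // pin the version Karate expects
}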

Related

Unexpected "Internal error" exception when using spring-cloud-gcp-pubsub 3.2.1

I'm using PubSubReactiveFactory from spring-cloud-gcp-pubsub on OpenJDK 11 (Debian Linux), and I've observed the following exception in our application:
com.google.api.gax.rpc.InternalException: io.grpc.StatusRuntimeException: INTERNAL: http2 exception
at com.google.api.gax.rpc.ApiExceptionFactory.createException(ApiExceptionFactory.java:110)
at com.google.api.gax.rpc.ApiExceptionFactory.createException(ApiExceptionFactory.java:41)
at com.google.api.gax.grpc.GrpcApiExceptionFactory.create(GrpcApiExceptionFactory.java:86)
at com.google.api.gax.grpc.GrpcApiExceptionFactory.create(GrpcApiExceptionFactory.java:66)
at com.google.api.gax.grpc.GrpcExceptionCallable$ExceptionTransformingFuture.onFailure(GrpcExceptionCallable.java:97)
at com.google.api.core.ApiFutures$1.onFailure(ApiFutures.java:67)
at com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1132)
at com.google.common.util.concurrent.DirectExecutor.execute(DirectExecutor.java:31)
at com.google.common.util.concurrent.AbstractFuture.executeListener(AbstractFuture.java:1270)
at com.google.common.util.concurrent.AbstractFuture.complete(AbstractFuture.java:1038)
at com.google.common.util.concurrent.AbstractFuture.setException(AbstractFuture.java:808)
at io.grpc.stub.ClientCalls$GrpcFuture.setException(ClientCalls.java:572)
at io.grpc.stub.ClientCalls$UnaryStreamToFuture.onClose(ClientCalls.java:542)
at io.grpc.PartialForwardingClientCallListener.onClose(PartialForwardingClientCallListener.java:39)
at io.grpc.ForwardingClientCallListener.onClose(ForwardingClientCallListener.java:23)
at io.grpc.ForwardingClientCallListener$SimpleForwardingClientCallListener.onClose(ForwardingClientCallListener.java:40)
at com.google.api.gax.grpc.ChannelPool$ReleasingClientCall$1.onClose(ChannelPool.java:535)
at io.grpc.internal.ClientCallImpl.closeObserver(ClientCallImpl.java:562)
at io.grpc.internal.ClientCallImpl.access$300(ClientCallImpl.java:70)
at io.grpc.internal.ClientCallImpl$ClientStreamListenerImpl$1StreamClosed.runInternal(ClientCallImpl.java:743)
at io.grpc.internal.ClientCallImpl$ClientStreamListenerImpl$1StreamClosed.runInContext(ClientCallImpl.java:722)
at io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
at io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:133)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
at java.base/java.lang.Thread.run(Thread.java:834)
Caused by: io.grpc.StatusRuntimeException: INTERNAL: http2 exception
at io.grpc.Status.asRuntimeException(Status.java:535)
... 14 common frames omitted
Caused by: io.grpc.netty.shaded.io.netty.handler.codec.http2.Http2Exception$StreamException: Stream closed before write could take place
at io.grpc.netty.shaded.io.netty.handler.codec.http2.Http2Exception.streamError(Http2Exception.java:172)
at io.grpc.netty.shaded.io.netty.handler.codec.http2.DefaultHttp2RemoteFlowController$FlowState.cancel(DefaultHttp2RemoteFlowController.java:481)
at io.grpc.netty.shaded.io.netty.handler.codec.http2.DefaultHttp2RemoteFlowController$1.onStreamClosed(DefaultHttp2RemoteFlowController.java:105)
at io.grpc.netty.shaded.io.netty.handler.codec.http2.DefaultHttp2Connection.notifyClosed(DefaultHttp2Connection.java:357)
at io.grpc.netty.shaded.io.netty.handler.codec.http2.DefaultHttp2Connection$ActiveStreams.removeFromActiveStreams(DefaultHttp2Connection.java:1007)
at io.grpc.netty.shaded.io.netty.handler.codec.http2.DefaultHttp2Connection$ActiveStreams$2.process(DefaultHttp2Connection.java:968)
at io.grpc.netty.shaded.io.netty.handler.codec.http2.DefaultHttp2Connection$ActiveStreams.decrementPendingIterations(DefaultHttp2Connection.java:1029)
at io.grpc.netty.shaded.io.netty.handler.codec.http2.DefaultHttp2Connection$ActiveStreams.forEachActiveStream(DefaultHttp2Connection.java:984)
at io.grpc.netty.shaded.io.netty.handler.codec.http2.DefaultHttp2Connection.forEachActiveStream(DefaultHttp2Connection.java:209)
at io.grpc.netty.shaded.io.grpc.netty.NettyClientHandler.goingAway(NettyClientHandler.java:839)
at io.grpc.netty.shaded.io.grpc.netty.NettyClientHandler.access$200(NettyClientHandler.java:91)
at io.grpc.netty.shaded.io.grpc.netty.NettyClientHandler$2.onGoAwayReceived(NettyClientHandler.java:278)
at io.grpc.netty.shaded.io.netty.handler.codec.http2.DefaultHttp2Connection.goAwayReceived(DefaultHttp2Connection.java:237)
at io.grpc.netty.shaded.io.netty.handler.codec.http2.DefaultHttp2ConnectionDecoder.onGoAwayRead0(DefaultHttp2ConnectionDecoder.java:217)
at io.grpc.netty.shaded.io.netty.handler.codec.http2.DefaultHttp2ConnectionDecoder$FrameReadListener.onGoAwayRead(DefaultHttp2ConnectionDecoder.java:583)
at io.grpc.netty.shaded.io.netty.handler.codec.http2.Http2InboundFrameLogger$1.onGoAwayRead(Http2InboundFrameLogger.java:119)
at io.grpc.netty.shaded.io.netty.handler.codec.http2.DefaultHttp2FrameReader.readGoAwayFrame(DefaultHttp2FrameReader.java:580)
at io.grpc.netty.shaded.io.netty.handler.codec.http2.DefaultHttp2FrameReader.processPayloadState(DefaultHttp2FrameReader.java:271)
at io.grpc.netty.shaded.io.netty.handler.codec.http2.DefaultHttp2FrameReader.readFrame(DefaultHttp2FrameReader.java:159)
at io.grpc.netty.shaded.io.netty.handler.codec.http2.Http2InboundFrameLogger.readFrame(Http2InboundFrameLogger.java:41)
at io.grpc.netty.shaded.io.netty.handler.codec.http2.DefaultHttp2ConnectionDecoder.decodeFrame(DefaultHttp2ConnectionDecoder.java:173)
at io.grpc.netty.shaded.io.netty.handler.codec.http2.Http2ConnectionHandler$FrameDecoder.decode(Http2ConnectionHandler.java:378)
at io.grpc.netty.shaded.io.netty.handler.codec.http2.Http2ConnectionHandler.decode(Http2ConnectionHandler.java:438)
at io.grpc.netty.shaded.io.netty.handler.codec.ByteToMessageDecoder.decodeRemovalReentryProtection(ByteToMessageDecoder.java:507)
at io.grpc.netty.shaded.io.netty.handler.codec.ByteToMessageDecoder.callDecode(ByteToMessageDecoder.java:446)
at io.grpc.netty.shaded.io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:276)
at io.grpc.netty.shaded.io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
at io.grpc.netty.shaded.io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
at io.grpc.netty.shaded.io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
at io.grpc.netty.shaded.io.netty.handler.ssl.SslHandler.unwrap(SslHandler.java:1371)
at io.grpc.netty.shaded.io.netty.handler.ssl.SslHandler.decodeJdkCompatible(SslHandler.java:1234)
at io.grpc.netty.shaded.io.netty.handler.ssl.SslHandler.decode(SslHandler.java:1283)
at io.grpc.netty.shaded.io.netty.handler.codec.ByteToMessageDecoder.decodeRemovalReentryProtection(ByteToMessageDecoder.java:507)
at io.grpc.netty.shaded.io.netty.handler.codec.ByteToMessageDecoder.callDecode(ByteToMessageDecoder.java:446)
at io.grpc.netty.shaded.io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:276)
at io.grpc.netty.shaded.io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
at io.grpc.netty.shaded.io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
at io.grpc.netty.shaded.io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
at io.grpc.netty.shaded.io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1410)
at io.grpc.netty.shaded.io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
at io.grpc.netty.shaded.io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
at io.grpc.netty.shaded.io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:919)
at io.grpc.netty.shaded.io.netty.channel.epoll.AbstractEpollStreamChannel$EpollStreamUnsafe.epollInReady(AbstractEpollStreamChannel.java:795)
at io.grpc.netty.shaded.io.netty.channel.epoll.EpollEventLoop.processReady(EpollEventLoop.java:480)
at io.grpc.netty.shaded.io.netty.channel.epoll.EpollEventLoop.run(EpollEventLoop.java:378)
at io.grpc.netty.shaded.io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:986)
at io.grpc.netty.shaded.io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
at io.grpc.netty.shaded.io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
... 1 common frames omitted
The "Internal" gRPC status is propagated to application code and force us to retry pull operation polluting logs with errors/warnings along the way.
To add more context this is happening when Pub/Sub PullRequest API takes long time (I'm observing p99 20-30 seconds latency when this happens). Netty closes the connection with "Stream closed before write could take place" in DefaultHttp2RemoteFlowController.java:481 and status io.netty.handler.codec.http2.Http2Error.STREAM_CLOSED(0x5) Then this status code is translated to io.grpc.internal.Http2Error.INTERNAL and propagated up the stack.
Has anybody experience this error and come up with a way to gracefully handle it?
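One way to soften the impact while the underlying latency is investigated might be a filtered retry with backoff on the reactive pipeline, since PubSubReactiveFactory.poll returns a Flux. This is only a sketch; the subscription name, attempt count, and backoff are placeholders:
import com.google.api.gax.rpc.InternalException;
import com.google.cloud.spring.pubsub.reactive.PubSubReactiveFactory;
import com.google.cloud.spring.pubsub.support.AcknowledgeablePubsubMessage;
import reactor.core.publisher.Flux;
import reactor.util.retry.Retry;
import java.time.Duration;

public class ResilientPoller {
    // Resubscribe with backoff only for transient INTERNAL transport errors,
    // so they stop surfacing in application code on every slow pull.
    public Flux<AcknowledgeablePubsubMessage> poll(PubSubReactiveFactory factory) {
        return factory.poll("my-subscription", 250) // placeholder subscription and polling period (ms)
                .retryWhen(Retry.backoff(5, Duration.ofSeconds(1))
                        .filter(t -> t instanceof InternalException));
    }
}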

Redis exception MOVED 8429 10.21.21.117:6379

Received error while fetching Platform And Activate entities:Error in execution; nested exception is io.lettuce.core.RedisCommandExecutionException: MOVED 8429 10.21.21.117:6379
Can anyone help me figure out how to fix this?
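A MOVED reply means the client contacted a Redis Cluster node that does not own the key's hash slot; a cluster-aware client follows the redirect automatically. Assuming the deployment really is a Redis Cluster, a minimal Lettuce sketch would look like the following (the host and port are taken from the error message, the key and value are placeholders); with Spring Data Redis, the equivalent is to supply a RedisClusterConfiguration instead of a standalone one:
import io.lettuce.core.RedisURI;
import io.lettuce.core.cluster.RedisClusterClient;
import io.lettuce.core.cluster.api.StatefulRedisClusterConnection;

public class ClusterExample {
    public static void main(String[] args) {
        // The cluster client discovers the topology and follows MOVED redirects.
        RedisClusterClient client = RedisClusterClient.create(RedisURI.create("10.21.21.117", 6379));
        StatefulRedisClusterConnection<String, String> connection = client.connect();
        connection.sync().set("example-key", "example-value"); // placeholder key/value
        connection.close();
        client.shutdown();
    }
}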

Debug exception in tRootTask in VxWorks

Using VxWorks A653 v2.5.0.2 I am getting an exception being generated in tRootTask during startup:
data storage
Exception current instruction address: 0x50000120
Machine Status Register: 0x0000fb30
Data Exception Address Register: 0x00000008
Integer Exception Register XER: 0x00000000
Condition Register: 0x40000088
Exception Syndrome Register: 0x01000000
Partition Domain ID: 0x007539a0
Task: 0x521f5920 "tRootTask"
The tRootTask does spawn a number of other kernel tasks and also creates a task for the user partition, but the entry point of the user task in the user partition never seems to run (its printf() statement is not hit). After the exception occurs it is possible to attach a debugger, but the tRootTask itself is deleted by the exception. If I access the shell after the exception, attempting to display the contents at 0x50000000 fails, as if it were unmapped memory. This may be the root cause of the exception, but why it's inaccessible is unclear.
So I am searching for a way to debug why the exception is happening. I'm new to this OS.
Look in your linker map for the
"Exception current instruction address: 0x50000120"
Or if the shell has "lkup"
-> lkup 0x50000120
should give the nearest global function that's throwing the exception.
Data Exception Address Register: 0x00000008
looks like a zero-page access, but you need to decipher the "Exception Syndrome Register" in the PowerPC manual to confirm the condition code.
"tRootTask" is the first thread in the context, so it's some sort of startup code that's failing. But it's so early that you probably need a JTAG debugger to set a breakpoint on it.

RabbitMQ Crash Explanation

Our RabbitMQ service crashed twice with the following report in the $RABBITMQ_NODENAME-sasl.log:
=CRASH REPORT==== 7-Jun-2016::14:37:25 ===
crasher:
initial call: gen:init_it/6
pid: <0.223.0>
registered_name: []
exception exit: {{badmatch,
{[{msg_location,
<<162,171,39,113,226,229,228,92,227,253,48,186,
45,48,29,98>>,
1,357,0,583},
******************
16000 similar msg_location lines snipped
******************
1795219}},
[{rabbit_msg_store,combine_files,3,[]},
{rabbit_msg_store_gc,attempt_action,3,[]},
{rabbit_msg_store_gc,handle_cast,2,[]},
{gen_server2,handle_msg,2,[]},
{proc_lib,wake_up,3,
[{file,"proc_lib.erl"},{line,250}]}]}
in function gen_server2:terminate/3
ancestors: [msg_store_persistent,rabbit_sup,<0.159.0>]
messages: [{'$gen_cast',{combine,394,380}}]
links: [#Port<0.86370>,<0.218.0>,#Port<0.86369>]
dictionary: [{{"/var/lib/rabbitmq/mnesia/$RABBITMQ_NODENAME/msg_store_persistent/357.rdq",
fhc_file},
{file,1,false}},
{{"/var/lib/rabbitmq/mnesia/$RABBITMQ_NODENAME/msg_store_persistent/340.rdq",
fhc_file},
{file,1,true}},
{fhc_age_tree,{2,
{{1465,346244,764691},
#Ref<0.0.3145729.257998>,nil,
{{1465,346244,891543},
#Ref<0.0.3145729.258001>,nil,nil}}}},
{{#Ref<0.0.3145729.257998>,fhc_handle},
{handle,{file_descriptor,prim_file,{#Port<0.86369>,59}},
0,false,0,1048576,[],false,
"/var/lib/rabbitmq/mnesia/$RABBITMQ_NODENAME/msg_store_persistent/357.rdq",
[raw,binary,read_ahead,read],
[{write_buffer,1048576}],
false,true,
{1465,346244,764691}}},
{{#Ref<0.0.3145729.258001>,fhc_handle},
{handle,{file_descriptor,prim_file,{#Port<0.86370>,64}},
14212552,false,0,1048576,[],false,
"/var/lib/rabbitmq/mnesia/$RABBITMQ_NODENAME/msg_store_persistent/340.rdq",
[raw,binary,read_ahead,read,write],
[{write_buffer,1048576}],
true,true,
{1465,346244,891543}}}]
trap_exit: false
status: running
heap_size: 121536
stack_size: 27
reductions: 835024
neighbours:
We'd like to understand what this crash report means. Does it signify a bad message, RMQ can't find a message, or something completely different? We're using RabbitMQ 3.1.5 with Erlang 18, and while we know we're using an old version, we want to first know what's causing the crash before dedicating resources to an upgrade.
This message means that RabbitMQ's message store process failed to combine files during garbage collection of the message store. This can, in theory, cause message loss.
Note that 3.1.5 is not supported and has not been tested with OTP 18. This issue may already be fixed in newer versions, though.

Exception thrown when registering custom serializers in Storm's storm.yaml

I registered a custom serializer for a class in Storm's conf/storm.yaml in the following way:
topology.kryo.register:
- custom.Car: custom.MyCarSerializer
When I start Storm by typing "bin/storm nimbus", the following exceptions are thrown:
Exception in thread "main" expected '<document start>', but found BlockMappingStart
in 'reader', line 27, column 1:
topology.kryo.register:
^
at org.yaml.snakeyaml.parser.ParserImpl$ParseDocumentStart.produce(ParserImpl.java:225)
at org.yaml.snakeyaml.parser.ParserImpl.peekEvent(ParserImpl.java:158)
at org.yaml.snakeyaml.parser.ParserImpl.checkEvent(ParserImpl.java:143)
at org.yaml.snakeyaml.composer.Composer.getSingleNode(Composer.java:108)
at org.yaml.snakeyaml.constructor.BaseConstructor.getSingleData(BaseConstructor.java:120)
at org.yaml.snakeyaml.Yaml.loadFromReader(Yaml.java:481)
at org.yaml.snakeyaml.Yaml.load(Yaml.java:424)
at backtype.storm.utils.Utils.findAndReadConfigFile(Utils.java:121)
at backtype.storm.utils.Utils.readStormConfig(Utils.java:161)
at backtype.storm.config$read_storm_config.invoke(config.clj:101)
at backtype.storm.command.config_value$_main.invoke(config_value.clj:7)
at clojure.lang.AFn.applyToHelper(AFn.java:161)
at clojure.lang.AFn.applyTo(AFn.java:151)
at backtype.storm.command.config_value.main(Unknown Source)
Exception in thread "main" expected '<document start>', but found BlockMappingStart
in 'reader', line 27, column 1:
topology.kryo.register:
^
at org.yaml.snakeyaml.parser.ParserImpl$ParseDocumentStart.produce(ParserImpl.java:225)
at org.yaml.snakeyaml.parser.ParserImpl.peekEvent(ParserImpl.java:158)
at org.yaml.snakeyaml.parser.ParserImpl.checkEvent(ParserImpl.java:143)
at org.yaml.snakeyaml.composer.Composer.getSingleNode(Composer.java:108)
at org.yaml.snakeyaml.constructor.BaseConstructor.getSingleData(BaseConstructor.java:120)
at org.yaml.snakeyaml.Yaml.loadFromReader(Yaml.java:481)
at org.yaml.snakeyaml.Yaml.load(Yaml.java:424)
at backtype.storm.utils.Utils.findAndReadConfigFile(Utils.java:121)
at backtype.storm.utils.Utils.readStormConfig(Utils.java:161)
at backtype.storm.config$read_storm_config.invoke(config.clj:101)
at backtype.storm.command.config_value$_main.invoke(config_value.clj:7)
at clojure.lang.AFn.applyToHelper(AFn.java:161)
at clojure.lang.AFn.applyTo(AFn.java:151)
at backtype.storm.command.config_value.main(Unknown Source)
Without these registrations Storm runs normally, so I'm sure the exceptions are caused by the registrations I added to conf/storm.yaml, and not by any other configuration in that file. I guess there may be a format error, so Storm could not parse conf/storm.yaml correctly. But I have examined the lines I added: 1) there is no tab before "- custom.Car: custom.MyCarSerializer", only blank spaces; 2) there is a blank space following "- custom.Car:". Still, the problem persists. Has anyone had a similar problem, and how did you solve it?
My Storm version is 0.9.0.1, and I run it in remote mode.
Definitely, your storm.yaml isn't a valid YAML file. As #user2720864 said, post your storm.yaml file.
Meanwhile, validate your YAML file using a YAML lint service.
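For reference, a common cause of "expected '<document start>'" in storm.yaml is a formatting slip earlier in the file, such as a stray tab or a missing space after a colon on another key, which makes SnakeYAML read the preceding content as a plain scalar document. The registration itself should look like the sketch below (class and serializer names taken from the question; nimbus.host is only an example of the colon-spacing pitfall):
# wrong: no space after the colon, so the whole line becomes a plain scalar
# nimbus.host:"192.168.0.1"
# right:
nimbus.host: "192.168.0.1"
topology.kryo.register:
  - custom.Car: custom.MyCarSerializer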