When running my Selenium BDD scripts, the following error was shown:
java.net.UnknownHostException: messages.cucumber.io
Error:
java.lang.RuntimeException: java.net.UnknownHostException: messages.cucumber.io
at io.cucumber.core.plugin.MessageFormatter.writeMessage(MessageFormatter.java:36)
at io.cucumber.core.eventbus.AbstractEventPublisher.send(AbstractEventPublisher.java:51)
at io.cucumber.core.eventbus.AbstractEventBus.send(AbstractEventBus.java:12)
This has been resolved. After some heavy debugging I identified the root cause: Cucumber was trying to publish the report to the cloud, which is blocked by our infrastructure. I resolved it by putting "cucumber.publish.enabled=false" in the properties file. Thank you all for the support.
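For anyone hitting the same thing, a minimal cucumber.properties that turns publishing off might look like the sketch below; the usual location is src/test/resources in a Maven/Gradle project, but that depends on your layout.
# Stop Cucumber from publishing test results to the cucumber.io cloud service
cucumber.publish.enabled=false
# Optional: also suppress the banner that advertises report publishing
cucumber.publish.quiet=true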
I have a situation where all of the event data is stored in an S3 bucket, and I need to move it from S3 into a Kafka topic running on EC2. I am using the CamelAWSS3Connector, but the connector is not working.
Following is the error I am facing:
[2023-01-06 10:11:21,048] ERROR Failed to create job for config/s3_connect.properties (org.apache.kafka.connect.cli.ConnectStandalone:107)
[2023-01-06 10:11:21,053] ERROR Stopping after connector error (org.apache.kafka.connect.cli.ConnectStandalone:117)
java.util.concurrent.ExecutionException: java.lang.NoClassDefFoundError: org/jctools/queues/MessagePassingQueue$Supplier
at org.apache.kafka.connect.util.ConvertingFutureCallback.result(ConvertingFutureCallback.java:115)
at org.apache.kafka.connect.util.ConvertingFutureCallback.get(ConvertingFutureCallback.java:99)
at org.apache.kafka.connect.cli.ConnectStandalone.main(ConnectStandalone.java:114)
Caused by: java.lang.NoClassDefFoundError: org/jctools/queues/MessagePassingQueue$Supplier
I was expecting the connector to push messages from S3 into the Kafka topic.
Following is my properties file:
name=CamelAwss3SourceConnector
connector.class=org.apache.camel.kafkaconnector.aws2s3.CamelAws2s3SourceConnector
key.converter=org.apache.kafka.connect.storage.StringConverter
value.converter=org.apache.kafka.connect.converters.ByteArrayConverter
camel.source.maxPollDuration=10000
topics=mytopic
camel.component.aws-s3.access-key=XXXXXXXX
camel.component.aws-s3.region=ap-south-1
camel.source.path.bucketNameOrArn=poc-s3-kafkatopic
camel.source.endpoint.autocloseBody=true
camel.source.endpoint.deleteAfterRead=true
After using the export command to add the jar locations and running the connector again, the following is the error:
[2023-01-11 06:43:05,528] ERROR Stopping after connector error (org.apache.kafka.connect.cli.ConnectStandalone:117)
java.util.concurrent.ExecutionException: java.lang.NoClassDefFoundError: org/apache/camel/kafkaconnector/CamelSourceConnectorConfig
at org.apache.kafka.connect.util.ConvertingFutureCallback.result(ConvertingFutureCallback.java:115)
at org.apache.kafka.connect.util.ConvertingFutureCallback.get(ConvertingFutureCallback.java:99)
at org.apache.kafka.connect.cli.ConnectStandalone.main(ConnectStandalone.java:114)
Caused by: java.lang.NoClassDefFoundError: org/apache/camel/kafkaconnector/CamelSourceConnectorConfig
Make sure you have added plugin.path=/path/to/extracted-camel-connector to the connect-standalone.properties file.
And if that doesn't work, you'll need to export the CLASSPATH environment variable so that it includes the jar files in that path.
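As an illustration, with /opt/camel-kafka-connector standing in for wherever you actually extracted the connector archive, the two options might look like this:
# in config/connect-standalone.properties
plugin.path=/opt/camel-kafka-connector
# fallback: put the connector jars on the classpath before starting Connect standalone
export CLASSPATH="/opt/camel-kafka-connector/*"
bin/connect-standalone.sh config/connect-standalone.properties config/s3_connect.properties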
I am facing this issue while connecting Talend Open Studio to Hive. Below is the error:
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hive/service/cli/thrift/TCLIService$Iface
at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:105)
at java.sql.DriverManager.getConnection(DriverManager.java:664)
at java.sql.DriverManager.getConnection(DriverManager.java:247)
at mtn_project.hive_test_0_1.hive_test.tHiveConnection_1Process(hive_test.java:353)
at mtn_project.hive_test_0_1.hive_test.runJobInTOS(hive_test.java:674)
at mtn_project.hive_test_0_1.hive_test.main(hive_test.java:523)
Caused by: java.lang.ClassNotFoundException: org.apache.hive.service.cli.thrift.TCLIService$Iface
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 6 more
[statistics] disconnected
NoClassDefFoundError usually indicates that certain libraries are missing from your environment.
See for example Connect Hive through Java JDBC.
In your case it may also be that you need the Big Data edition.
I had the same error message, and using the following jars helped me.
They are located in the $SPARK_HOME/jars folder:
commons-logging-1.1.3.jar
hadoop-common-3.0.0.jar
hive-jdbc-1.2.1.spark2.jar
hive-metastore-1.2.1.spark2.jar
httpclient-4.5.2.jar
libthrift-0.9.3.jar
guava-14.0.1.jar
hive-exec-1.2.1.spark2.jar
hive-service-1.2.2.jar
httpcore-4.4.4.jar
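Once those jars are on the classpath, a minimal standalone JDBC test run outside Talend is a quick way to confirm that the driver actually loads; the host, port, database and credentials below are placeholders, so adjust them to your HiveServer2 instance.
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class HiveJdbcSmokeTest {
    public static void main(String[] args) throws Exception {
        // Fails here with ClassNotFoundException if the Hive JDBC jars are not on the classpath
        Class.forName("org.apache.hive.jdbc.HiveDriver");

        // Placeholder connection details -- replace with your HiveServer2 host, port and database
        String url = "jdbc:hive2://localhost:10000/default";
        try (Connection conn = DriverManager.getConnection(url, "hive", "");
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("SHOW TABLES")) {
            while (rs.next()) {
                System.out.println(rs.getString(1));
            }
        }
    }
}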
Has anyone met the exceptions below when starting up JDeveloper 12.1.3 with SOA Suite? Because of this I cannot save the workspace, which is very bad. Actually, I wonder whether anyone really likes JDeveloper, but I have to use it. :(
JDeveloper version: Build JDEVADF_12.1.3.0.0_GENERIC_140521.1008.S
Jdk version: java version "1.7.0_80"
stack trace as below:
SEVERE:
javax.naming.NamingException [Root exception is java.lang.NullPointerException]
at oracle.adf.share.jndi.ContextImpl.throwNamingException(ContextImpl.java:671)
at oracle.adf.share.jndi.ContextImpl.saveDocument(ContextImpl.java:968)
at oracle.adf.share.jndi.ContextImpl.save(ContextImpl.java:986)
at oracle.adf.share.dt.ConnectionNsChangeListener.refreshInternal(ConnectionNsChangeListener.java:242)
at oracle.adf.share.dt.ConnectionNsChangeListener.refresh(ConnectionNsChangeListener.java:180)
at oracle.adf.share.dt.ConnectionNsChangeListener.objectAdded(ConnectionNsChangeListener.java:43)
...
at java.awt.EventDispatchThread.pumpEvents(EventDispatchThread.java:138)
at java.awt.EventDispatchThread.run(EventDispatchThread.java:91)
Caused by: java.lang.NullPointerException
at oracle.jdevimpl.jps.JpsConfigUtilsImpl.getDefaultJpsContext(JpsConfigUtilsImpl.java:722)
at oracle.jdevimpl.jps.JpsConfigUtilsImpl.getCredentialStoreLocation(JpsConfigUtilsImpl.java:1407)
at oracle.adf.share.dt.security.providers.jps.CSFDTCredentialStore.checkInitCSFStore(CSFDTCredentialStore.java:333)
at oracle.adf.share.dt.security.providers.jps.CSFDTCredentialStore.fetchCredential(CSFDTCredentialStore.java:588)
at oracle.adf.share.dt.security.providers.jps.CSFDTCredentialStore.fetchCredential(CSFDTCredentialStore.java:578)
at oracle.adf.share.security.credentialstore.CredentialStore.fetchCredential(CredentialStore.java:187)
at oracle.adf.share.jndi.CredentialStoreHelper.fetchCredential(CredentialStoreHelper.java:104)
at oracle.adf.share.jndi.ReferenceStoreHelper.saveCredentialsInternal(ReferenceStoreHelper.java:520)
at oracle.adf.share.jndi.ReferenceStoreHelper.saveCredentials(ReferenceStoreHelper.java:476)
at oracle.adf.share.jndi.ContextImpl.saveDocument(ContextImpl.java:957)
... 123 more
This is caused by the 3 files below being missing, which usually does not happen:
src/META-INF/cwallet.sso
src/META-INF/cwallet.sso.lck
src/META-INF/jps-config.xml
Please also refer to https://community.oracle.com/thread/3870268?sr=stream
Hello, I am using JDeveloper 11.1.2.3.0.
I have configured GlassFish on my computer and I have followed the instructions that Shay explains here: https://blogs.oracle.com/shay/entry/deploying_oracle_adf_applications_to
The problem is that when I try to deploy my ADF application via "Deploy to application server" to GlassFish, I get an error saying:
[#|2013-08-21T11:45:47.516+0200|SEVERE|glassfish3.1.2|org.apache.catalina.core.ContainerBase|_ThreadID=62;_ThreadName=Thread-2;|ContainerBase.addChild: start:
org.apache.catalina.LifecycleException: java.lang.IllegalArgumentException: java.lang.ClassNotFoundException: oracle.adf.share.glassfish.listener.ADFGlassFishAppLifeCycleListener
If I deploy the ADF application as an EAR file and then try to deploy this EAR file to GlassFish through the admin interface, I get this other error:
[#|2013-08-21T15:40:16.452+0200|SEVERE|glassfish3.1.2|javax.enterprise.system.tools.deployment.org.glassfish.deployment.common|_ThreadID=65;_ThreadName=Thread-2;|Exception while invoking class com.sun.enterprise.web.WebApplication start method
java.lang.Exception: java.lang.IllegalStateException: ContainerBase.addChild: start: org.apache.catalina.LifecycleException: java.lang.RuntimeException: com.sun.faces.config.ConfigurationException: CONFIGURATION FAILED! javax.el.ELContext
Can anyone help on this?
A few things to check:
Did you extract the adf-essentials zip file with the -j option? (See the sketch after this list.)
Did you mark both your model and view projects to have a deployment platform for GlassFish?
Please make sure that the application is deleted from the applications folder. If not, stop the admin server and delete it manually.
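As an illustration of the -j extraction step (the paths here are assumptions; substitute your own GlassFish install and download locations), flattening the zip so the jars land directly in the domain's lib folder might look like this:
# run from the target domain's lib directory
cd /path/to/glassfish3/glassfish/domains/domain1/lib
unzip -j /path/to/adf-essentials.zip
The -j flag junks the internal directory structure of the zip, so every jar ends up directly in the lib directory where the domain's classloader can pick it up.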
I am using Ivy for dependency management. I am suddenly getting the following exception; it was working fine before.
java.lang.NoSuchMethodError: org.apache.http.conn.scheme.Scheme.<init>(Ljava/lang/String;ILorg/apache/http/conn/scheme/SchemeSocketFactory;)V
at org.openqa.selenium.remote.internal.HttpClientFactory.getClientConnectionManager(HttpClientFactory.java:64)
at org.openqa.selenium.remote.internal.HttpClientFactory.<init>(HttpClientFactory.java:50)
at org.openqa.selenium.remote.HttpCommandExecutor.<init>(HttpCommandExecutor.java:111)
at org.openqa.selenium.firefox.internal.NewProfileExtensionConnection.start(NewProfileExtensionConnection.java:78)
at org.openqa.selenium.firefox.FirefoxDriver.startClient(FirefoxDriver.java:187)
at org.openqa.selenium.remote.RemoteWebDriver.<init>(RemoteWebDriver.java:93)
at org.openqa.selenium.firefox.FirefoxDriver.<init>(FirefoxDriver.java:142)
at org.openqa.selenium.firefox.FirefoxDriver.<init>(FirefoxDriver.java:88)
at com.ensarm.crawler.web.browser.FirefoxBrowser.initialize(FirefoxBrowser.java:296)
at com.ensarm.crawler.navigator.IpProxyNavigator.initialize(IpProxyNavigator.java:46)
at com.ensarm.crawler.Crawler.run(Crawler.java:23)
at java.lang.Thread.run(Thread.java:619)
The Firefox driver for Selenium doesn't have the ephemeral socket issue fixed, which often causes the stack trace above. There may also be some test cases being run by Ivy that, if you turn them off, would resolve this particular issue.