Using WSO2 ESB for file processing

I'm trying to use WSO2 ESB to pick up a file from a local directory, insert the records from the file into a database, and send an email.
I followed the steps given in https://docs.wso2.com/display/ESB490/Sample+271%3A+File+Processing
Below is my smooks-config file:
<smooks-resource-list xmlns="http://www.milyn.org/xsd/smooks-1.0.xsd">
    <!-- Configure the CSVParser to parse the message into a stream of SAX events. -->
    <resource-config selector="org.xml.sax.driver">
        <resource>org.milyn.csv.CSVParser</resource>
        <param name="fields" type="string-list">name,surname,phone</param>
    </resource-config>
</smooks-resource-list>
This is the error shown in the ESB server log:
org.milyn.SmooksException: Failed to filter source.
at org.milyn.delivery.sax.SmooksSAXFilter.doFilter(SmooksSAXFilter.java:97)
at org.milyn.delivery.sax.SmooksSAXFilter.doFilter(SmooksSAXFilter.java:64)
at org.milyn.Smooks._filter(Smooks.java:526)
at org.milyn.Smooks.filterSource(Smooks.java:482)
at org.wso2.carbon.mediator.transform.SmooksMediator.mediate(SmooksMediator.java:140)
at org.apache.synapse.mediators.AbstractListMediator.mediate(AbstractListMediator.java:97)
at org.apache.synapse.mediators.AbstractListMediator.mediate(AbstractListMediator.java:59)
at org.apache.synapse.mediators.base.SequenceMediator.mediate(SequenceMediator.java:158)
at org.apache.synapse.mediators.MediatorWorker.run(MediatorWorker.java:80)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.ClassNotFoundException: org/milyn/csv/CSVParser
at org.xml.sax.helpers.XMLReaderFactory.loadClass(Unknown Source)
at org.xml.sax.helpers.XMLReaderFactory.createXMLReader(Unknown Source)
at org.milyn.delivery.AbstractParser.createXMLReader(AbstractParser.java:291)
at org.milyn.delivery.sax.SAXParser.parse(SAXParser.java:62)
at org.milyn.delivery.sax.SmooksSAXFilter.doFilter(SmooksSAXFilter.java:86)
... 11 more
[2016-10-30 15:01:50,111] ERROR - SequenceMediator Failed to filter source. Caused by Failed to filter source.
org.wso2.carbon.mediator.service.MediatorException: Failed to filter source. Caused by Failed to filter source.
at org.wso2.carbon.mediator.transform.SmooksMediator.handleException(SmooksMediator.java:265)
at org.wso2.carbon.mediator.transform.SmooksMediator.mediate(SmooksMediator.java:160)
at org.apache.synapse.mediators.AbstractListMediator.mediate(AbstractListMediator.java:97)
at org.apache.synapse.mediators.AbstractListMediator.mediate(AbstractListMediator.java:59)
at org.apache.synapse.mediators.base.SequenceMediator.mediate(SequenceMediator.java:158)
at org.apache.synapse.mediators.MediatorWorker.run(MediatorWorker.java:80)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
[2016-10-30 15:01:50,153] INFO - LogMediator To: , WSAction: urn:mediate, SOAPAction: urn:mediate, MessageID: urn:uuid:f8385dfd-f7c7-42fa-a567-b234028d1c59, Direction: request, MESSAGE = Executing default 'fault' sequence, ERROR_CODE = 0, ERROR_MESSAGE = Failed to filter source. Caused by Failed to filter source., Envelope: <?xml version='1.0' encoding='utf-8'?><soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/"><soapenv:Body><text xmlns="http://ws.apache.org/commons/ns/payload">vidya,mj,123456789</text></soapenv:Body></soapenv:Envelope>
Please help me solve this.

The newer milyn-smooks-all-1.5.1 jar doesn't have CSVParser, so changing CSVParser to CSVReader in the smooks-config file solved the issue.
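With that change, the configuration from the question becomes the following (same file, only the reader class swapped):
<smooks-resource-list xmlns="http://www.milyn.org/xsd/smooks-1.0.xsd">
    <!-- CSVReader is the class shipped in the newer Smooks CSV jars -->
    <resource-config selector="org.xml.sax.driver">
        <resource>org.milyn.csv.CSVReader</resource>
        <param name="fields" type="string-list">name,surname,phone</param>
    </resource-config>
</smooks-resource-list>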

Somehow the ESB cannot find the class. Can you please verify that milyn-smooks-csv-1.2.4.jar is located in ESB_HOME/repository/components/lib?

The ESB is unable to load the class org.milyn.csv.CSVParser from the corresponding jar.
Search for the jar file on Google and place it inside the extensions folder under
ESB_HOME/repository/components/..
Restart the ESB and it should work.

Related

Invalid sync error while reading an Avro file using Spark or Hive

I have an Avro file that was created using the Java API. While the writer was writing data to the file, the program shut down ungracefully due to a machine reboot.
Now, when I try to read this file using Spark/Hive, it reads some data and then throws the following error (org.apache.avro.AvroRuntimeException: java.io.IOException: Invalid sync!):
INFO DAGScheduler: ShuffleMapStage 1 (count at DataReaderSpark.java:41) failed in 7.420 s due to Job aborted due to stage failure: Task 1 in stage 1.0 failed 1 times, most recent failure: Lost task 1.0 in stage 1.0 (TID 2, localhost, executor driver): org.apache.avro.AvroRuntimeException: java.io.IOException: Invalid sync!
at org.apache.avro.file.DataFileStream.hasNext(DataFileStream.java:210)
at com.databricks.spark.avro.DefaultSource$$anonfun$buildReader$1$$anon$1.hasNext(DefaultSource.scala:215)
at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:408)
at org.apache.spark.sql.execution.datasources.FileScanRDD$$anon$1.hasNext(FileScanRDD.scala:106)
at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIteratorForCodegenStage1.agg_doAggregateWithoutKey$(Unknown Source)
at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIteratorForCodegenStage1.processNext(Unknown Source)
at org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:43)
at org.apache.spark.sql.execution.WholeStageCodegenExec$$anonfun$10$$anon$1.hasNext(WholeStageCodegenExec.scala:614)
at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:408)
at org.apache.spark.shuffle.sort.BypassMergeSortShuffleWriter.write(BypassMergeSortShuffleWriter.java:125)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:96)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:53)
at org.apache.spark.scheduler.Task.run(Task.scala:109)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:345)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748) Caused by: java.io.IOException: Invalid sync!
at org.apache.avro.file.DataFileStream.nextRawBlock(DataFileStream.java:293)
at org.apache.avro.file.DataFileStream.hasNext(DataFileStream.java:198)
... 16 more
I believe that the last record is broken. I just wanted to know if there's a way I can read this file without getting the exception, by skipping the corrupt record.
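In other words, I would even be happy with something like the sketch below, which simply stops at the first bad block and keeps whatever was read before it (plain Avro file API rather than Spark; the file path is a placeholder):
import java.io.File;

import org.apache.avro.file.DataFileReader;
import org.apache.avro.generic.GenericDatumReader;
import org.apache.avro.generic.GenericRecord;

public class ReadUntilCorrupt {
    public static void main(String[] args) throws Exception {
        File file = new File("/path/to/partial.avro"); // placeholder path
        long count = 0;
        try (DataFileReader<GenericRecord> reader =
                 new DataFileReader<>(file, new GenericDatumReader<GenericRecord>())) {
            try {
                // hasNext() is where "Invalid sync!" surfaces for the truncated
                // last block, so stop there and keep the records read so far
                while (reader.hasNext()) {
                    GenericRecord record = reader.next();
                    count++;
                }
            } catch (Exception e) {
                System.err.println("Stopping at corrupt tail after " + count + " records: " + e);
            }
        }
        System.out.println("Read " + count + " records");
    }
}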

Flink 1.10 not connecting to minio (s3)

I'm running a Docker Compose setup locally with Flink and MinIO.
When I try to connect to MinIO, I always get the following error:
caused by: org.apache.flink.core.fs.UnsupportedFileSystemSchemeException: Could not find a file system implementation for scheme 's3'. The scheme is not directly supported by Flink and no Hadoop file system to support this scheme could be loaded.
It seems that the plugin isn't loaded correctly.
My Flink config (flink-conf.yaml):
state.backend: filesystem
s3.endpoint: http://minio:9000
s3.path.style.access: true
s3.access-key: minio
s3.secret-key: minio123
presto.s3.access-key: minio
presto.s3.secret-key: minio123
presto.s3.endpoint: http://minio:9000
presto.s3.path-style-access: true
I've copied the required plugin as follows:
mkdir -p plugins/s3-fs-presto
cp opt/flink-s3-fs-presto-*.jar plugins/s3-fs-presto
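For reference, the Flink services in my compose file look roughly like this (a trimmed sketch; I'm assuming the official flink:1.10 image here, whose home directory inside the container is /opt/flink):
version: "3"
services:
  jobmanager:
    image: flink:1.10
    command: jobmanager
    volumes:
      # the plugin jar has to be visible inside the container under the
      # container's Flink home, e.g. /opt/flink/plugins/s3-fs-presto/
      - ./plugins/s3-fs-presto:/opt/flink/plugins/s3-fs-presto
  taskmanager:
    image: flink:1.10
    command: taskmanager
    volumes:
      - ./plugins/s3-fs-presto:/opt/flink/plugins/s3-fs-presto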
Any suggestions?
Stack trace:
The program finished with the following exception:
org.apache.flink.client.program.ProgramInvocationException: Job failed. (JobID: 7ae6657256719d8c32d76ba113fb35f0)
at org.apache.flink.client.program.rest.RestClusterClient.submitJob(RestClusterClient.java:262)
at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:338)
at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:326)
at org.apache.flink.client.program.ContextEnvironment.execute(ContextEnvironment.java:62)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:820)
at org.apache.flink.api.java.DataSet.collect(DataSet.java:413)
at org.apache.flink.api.java.DataSet.print(DataSet.java:1652)
at org.apache.flink.examples.java.wordcount.WordCount.main(WordCount.java:96)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:604)
at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:466)
at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:274)
at org.apache.flink.client.cli.CliFrontend.executeProgram(CliFrontend.java:746)
at org.apache.flink.client.cli.CliFrontend.runProgram(CliFrontend.java:273)
at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:205)
at org.apache.flink.client.cli.CliFrontend.parseParameters(CliFrontend.java:1008)
at org.apache.flink.client.cli.CliFrontend.lambda$main$10(CliFrontend.java:1081)
at org.apache.flink.runtime.security.NoOpSecurityContext.runSecured(NoOpSecurityContext.java:30)
at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1081)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:147)
at org.apache.flink.client.program.rest.RestClusterClient.submitJob(RestClusterClient.java:259)
... 21 more
Caused by: java.io.IOException: Error opening the Input Split s3://test/test.txt [0,3243]: Could not find a file system implementation for scheme 's3'. The scheme is not directly supported by Flink and no Hadoop file system to support this scheme could be loaded.
at org.apache.flink.api.common.io.FileInputFormat.open(FileInputFormat.java:824)
at org.apache.flink.api.common.io.DelimitedInputFormat.open(DelimitedInputFormat.java:470)
at org.apache.flink.api.common.io.DelimitedInputFormat.open(DelimitedInputFormat.java:47)
at org.apache.flink.runtime.operators.DataSourceTask.invoke(DataSourceTask.java:173)
at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:705)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:530)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.flink.core.fs.UnsupportedFileSystemSchemeException: Could not find a file system implementation for scheme 's3'. The scheme is not directly supported by Flink and no Hadoop file system to support this scheme could be loaded.
at org.apache.flink.core.fs.FileSystem.getUnguardedFileSystem(FileSystem.java:450)
at org.apache.flink.core.fs.FileSystem.get(FileSystem.java:362)
at org.apache.flink.api.common.io.FileInputFormat$InputSplitOpenThread.run(FileInputFormat.java:995)
Caused by: org.apache.flink.core.fs.UnsupportedFileSystemSchemeException: Hadoop is not in the classpath/dependencies.
at org.apache.flink.core.fs.UnsupportedSchemeFactory.create(UnsupportedSchemeFactory.java:58)
at org.apache.flink.core.fs.FileSystem.getUnguardedFileSystem(FileSystem.java:446)
... 2 more

activiti-app.war is not getting exploded in Tomcat 8

05-Dec-2016 20:55:23.842 WARNING [localhost-startStop-1] org.apache.catalina.startup.HostConfig.deployDescriptor A docBase /usr/share/tomcat8/webapps/activiti-app inside the host appBase has been specified, and will be ignored
05-Dec-2016 20:55:44.749 INFO [localhost-startStop-1] org.apache.jasper.servlet.TldScanner.scanJars At least one JAR was scanned for TLDs yet contained no TLDs. Enable debug logging for this logger for a complete list of JARs that were scanned but no TLDs were found in them. Skipping unneeded JARs during scanning can improve startup time and JSP compilation time.
08:55:50,685 [localhost-startStop-1] WARN org.springframework.core.io.support.PathMatchingResourcePatternResolver - Cannot search for matching files underneath URL [war:file:/var/lib/tomcat8/webapps/activiti-app.war*/WEB-INF/classes/com/activiti/conf/] because it does not correspond to a directory in the file system
java.io.FileNotFoundException: URL [war:file:/var/lib/tomcat8/webapps/activiti-app.war*/WEB-INF/classes/com/activiti/conf/] cannot be resolved to absolute file path because it does not reside in the file system: war:file:/var/lib/tomcat8/webapps/activiti-app.war*/WEB-INF/classes/com/activiti/conf/
at org.springframework.util.ResourceUtils.getFile(ResourceUtils.java:212)
at org.springframework.core.io.AbstractFileResolvingResource.getFile(AbstractFileResolvingResource.java:52)
at org.springframework.core.io.UrlResource.getFile(UrlResource.java:212)
at org.springframework.core.io.support.PathMatchingResourcePatternResolver.doFindPathMatchingFileResources(PathMatchingResourcePatternResolver.java:598)
at org.springframework.web.context.support.ServletContextResourcePatternResolver.doFindPathMatchingFileResources(ServletContextResourcePatternResolver.java:92
at org.springframework.core.io.support.PathMatchingResourcePatternResolver.findPathMatchingResources(PathMatchingResourcePatternResolver.java:419)
at org.springframework.core.io.support.PathMatchingResourcePatternResolver.getResources(PathMatchingResourcePatternResolver.java:273)
at org.springframework.context.support.AbstractApplicationContext.getResources(AbstractApplicationContext.java:1159)
at org.springframework.context.annotation.ClassPathScanningCandidateComponentProvider.findCandidateComponents(ClassPathScanningCandidateComponentProvider.java
at org.springframework.context.annotation.ClassPathBeanDefinitionScanner.doScan(ClassPathBeanDefinitionScanner.java:248)
at org.springframework.context.annotation.ComponentScanAnnotationParser.parse(ComponentScanAnnotationParser.java:140)
at org.springframework.context.annotation.ConfigurationClassParser.doProcessConfigurationClass(ConfigurationClassParser.java:265)
at org.springframework.context.annotation.ConfigurationClassParser.processConfigurationClass(ConfigurationClassParser.java:229)
at org.springframework.context.annotation.ConfigurationClassParser.parse(ConfigurationClassParser.java:196)
at org.springframework.context.annotation.ConfigurationClassParser.parse(ConfigurationClassParser.java:165)
at org.springframework.context.annotation.ConfigurationClassPostProcessor.processConfigBeanDefinitions(ConfigurationClassPostProcessor.java:306)
at org.springframework.context.annotation.ConfigurationClassPostProcessor.postProcessBeanDefinitionRegistry(ConfigurationClassPostProcessor.java:239)
at org.springframework.context.support.PostProcessorRegistrationDelegate.invokeBeanDefinitionRegistryPostProcessors(PostProcessorRegistrationDelegate.java:254
at org.springframework.context.support.PostProcessorRegistrationDelegate.invokeBeanFactoryPostProcessors(PostProcessorRegistrationDelegate.java:94)
at org.springframework.context.support.AbstractApplicationContext.invokeBeanFactoryPostProcessors(AbstractApplicationContext.java:606)
at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:462)
at com.activiti.servlet.WebConfigurer.contextInitialized(WebConfigurer.java:85)
at org.apache.catalina.core.StandardContext.listenerStart(StandardContext.java:4853)
at org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5314)
at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:145)
at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:753)
at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:729)
at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:717)
at org.apache.catalina.startup.HostConfig.deployDescriptor(HostConfig.java:587)
at org.apache.catalina.startup.HostConfig$DeployDescriptor.run(HostConfig.java:1798)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
08:55:50,719 [localhost-startStop-1] WARN org.springframework.web.context.support.AnnotationConfigWebApplicationContext - Exception encountered during context initialization - cancelling refresh attempt
org.springframework.beans.factory.BeanDefinitionStoreException: I/O failure during classpath scanning; nested exception is java.io.FileNotFoundException: JAR entry !/com/activiti/repository/ not found in /var/cache/tomcat8/temp/jar_cache5590291403178717066.tmp
at org.springframework.context.annotation.ClassPathScanningCandidateComponentProvider.findCandidateComponents(ClassPathScanningCandidateComponentProvider.java
at org.springframework.context.annotation.ClassPathBeanDefinitionScanner.doScan(ClassPathBeanDefinitionScanner.java:248)
at org.springframework.context.annotation.ComponentScanAnnotationParser.parse(ComponentScanAnnotationParser.java:140)
at org.springframework.context.annotation.ConfigurationClassParser.doProcessConfigurationClass(ConfigurationClassParser.java:265)
at org.springframework.context.annotation.ConfigurationClassParser.processConfigurationClass(ConfigurationClassParser.java:229)
at org.springframework.context.annotation.ConfigurationClassParser.parse(ConfigurationClassParser.java:196)
at org.springframework.context.annotation.ConfigurationClassParser.parse(ConfigurationClassParser.java:165)
at org.springframework.context.annotation.ConfigurationClassPostProcessor.processConfigBeanDefinitions(ConfigurationClassPostProcessor.java:306)
at org.springframework.context.annotation.ConfigurationClassPostProcessor.postProcessBeanDefinitionRegistry(ConfigurationClassPostProcessor.java:239)
at org.springframework.context.support.PostProcessorRegistrationDelegate.invokeBeanDefinitionRegistryPostProcessors(PostProcessorRegistrationDelegate.java:254
at org.springframework.context.support.PostProcessorRegistrationDelegate.invokeBeanFactoryPostProcessors(PostProcessorRegistrationDelegate.java:94)
at org.springframework.context.support.AbstractApplicationContext.invokeBeanFactoryPostProcessors(AbstractApplicationContext.java:606)
at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:462)
at com.activiti.servlet.WebConfigurer.contextInitialized(WebConfigurer.java:85)
at org.apache.catalina.core.StandardContext.listenerStart(StandardContext.java:4853)
at org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5314)
at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:145)
at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:753)
at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:729)
at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:717)
at org.apache.catalina.startup.HostConfig.deployDescriptor(HostConfig.java:587)
at org.apache.catalina.startup.HostConfig$DeployDescriptor.run(HostConfig.java:1798)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.io.FileNotFoundException: JAR entry !/com/activiti/repository/ not found in /var/cache/tomcat8/temp/jar_cache5590291403178717066.tmp
at sun.net.www.protocol.jar.JarURLConnection.connect(JarURLConnection.java:142)
at sun.net.www.protocol.jar.JarURLConnection.getJarFile(JarURLConnection.java:89)
at org.springframework.core.io.support.PathMatchingResourcePatternResolver.doFindPathMatchingJarResources(PathMatchingResourcePatternResolver.java:509)
at org.springframework.core.io.support.PathMatchingResourcePatternResolver.findPathMatchingResources(PathMatchingResourcePatternResolver.java:416)
at org.springframework.core.io.support.PathMatchingResourcePatternResolver.getResources(PathMatchingResourcePatternResolver.java:273)
at org.springframework.context.support.AbstractApplicationContext.getResources(AbstractApplicationContext.java:1159)
at org.springframework.context.annotation.ClassPathScanningCandidateComponentProvider.findCandidateComponents(ClassPathScanningCandidateComponentProvider.java
... 26 more
05-Dec-2016 20:55:50.720 SEVERE [localhost-startStop-1] org.apache.catalina.core.StandardContext.startInternal One or more listeners failed to start. Full details will be found in the appropriate container log file
Simply copy activiti-explorer.war to the webapps directory of Tomcat and start the Apache Tomcat server. Once it has started, go to http://localhost:8080/activiti-explorer (the port may vary depending on your Tomcat installation).
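On a Debian/Ubuntu-style Tomcat 8 package install like the one in the log above, that amounts to roughly the following (paths and service name may differ on your system):
# copy the war into Tomcat's webapps directory and restart Tomcat
sudo cp activiti-explorer.war /var/lib/tomcat8/webapps/
sudo systemctl restart tomcat8
# then open http://localhost:8080/activiti-explorer in a browser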

MobileFirst 8.0 - Unable to delete Adapters and Applications from the Application Console

I am unable to delete adapters/applications from the console; it continuously shows:
RuntimeException: Synchronization failure
Screenshot attached.
Messages.log:
[10/26/16 17:59:34:790 IST] 00000021 com.mfp.adapter.SecurityAdapterApplication I Adapter initialized!
[10/26/16 17:59:34:797 IST] 00000021 rnal.connectivity.synchronization.RuntimeSynchronizationBean E FWLSE0324: Runtime synchronization failed. Caused by: java.lang.RuntimeException: No resource classes found for adapter 'SecurityAdapter'
com.ibm.mfp.server.core.shared.deployment.DeploymentException: java.lang.RuntimeException: No resource classes found for adapter 'SecurityAdapter'
at com.ibm.mfp.server.java.adapter.internal.deploy.JaxRsSandboxDeploymentHandler.deploy(JaxRsSandboxDeploymentHandler.java:131)
at com.ibm.mfp.server.java.adapter.internal.deploy.JaxRsSandboxDeploymentHandler.deploy(JaxRsSandboxDeploymentHandler.java:47)
at com.ibm.mfp.server.core.internal.deployment.DeploymentManagerImpl.deploy(DeploymentManagerImpl.java:187)
at com.ibm.mfp.server.core.internal.deployment.DeploymentManagerImpl.deploy(DeploymentManagerImpl.java:484)
at com.ibm.mfp.server.core.internal.deployment.DeploymentManagerImpl.changeDeploymentState(DeploymentManagerImpl.java:360)
at com.ibm.mfp.server.mgmt.internal.connectivity.synchronization.RuntimeSynchronizationBean.sync(RuntimeSynchronizationBean.java:207)
at com.ibm.mfp.server.mgmt.internal.connectivity.synchronization.RuntimeMBeanExporterListener.mbeanRegistered(RuntimeMBeanExporterListener.java:71)
at org.springframework.jmx.export.MBeanExporter.notifyListenersOfRegistration(MBeanExporter.java:1055)
at org.springframework.jmx.export.MBeanExporter.onRegister(MBeanExporter.java:1029)
at org.springframework.jmx.support.MBeanRegistrationSupport.onRegister(MBeanRegistrationSupport.java:301)
at org.springframework.jmx.support.MBeanRegistrationSupport.doRegister(MBeanRegistrationSupport.java:229)
at org.springframework.jmx.export.MBeanExporter.registerBeanInstance(MBeanExporter.java:670)
at org.springframework.jmx.export.MBeanExporter.registerBeanNameOrInstance(MBeanExporter.java:615)
at org.springframework.jmx.export.MBeanExporter.registerBeans(MBeanExporter.java:550)
at org.springframework.jmx.export.MBeanExporter.afterSingletonsInstantiated(MBeanExporter.java:432)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.preInstantiateSingletons(DefaultListableBeanFactory.java:775)
at org.springframework.context.support.AbstractApplicationContext.finishBeanFactoryInitialization(AbstractApplicationContext.java:757)
at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:480)
at org.springframework.web.servlet.FrameworkServlet.configureAndRefreshWebApplicationContext(FrameworkServlet.java:663)
at org.springframework.web.servlet.FrameworkServlet.createWebApplicationContext(FrameworkServlet.java:629)
at org.springframework.web.servlet.FrameworkServlet.createWebApplicationContext(FrameworkServlet.java:677)
at org.springframework.web.servlet.FrameworkServlet.initWebApplicationContext(FrameworkServlet.java:548)
at org.springframework.web.servlet.FrameworkServlet.initServletBean(FrameworkServlet.java:489)
at org.springframework.web.servlet.HttpServletBean.init(HttpServletBean.java:136)
at javax.servlet.GenericServlet.init(GenericServlet.java:244)
at com.ibm.ws.webcontainer.servlet.ServletWrapper.init(ServletWrapper.java:332)
at com.ibm.ws.webcontainer.servlet.ServletWrapper.loadOnStartupCheck(ServletWrapper.java:1423)
at com.ibm.ws.webcontainer.webapp.WebApp.doLoadOnStartupActions(WebApp.java:1180)
at com.ibm.ws.webcontainer.webapp.WebApp.commonInitializationFinally(WebApp.java:1148)
at com.ibm.ws.webcontainer.webapp.WebApp.initialize(WebApp.java:1054)
at com.ibm.ws.webcontainer.webapp.WebApp.initialize(WebApp.java:6448)
at com.ibm.ws.webcontainer.osgi.DynamicVirtualHost.startWebApp(DynamicVirtualHost.java:446)
at com.ibm.ws.webcontainer.osgi.DynamicVirtualHost.startWebApplication(DynamicVirtualHost.java:441)
at com.ibm.ws.webcontainer.osgi.WebContainer.startWebApplication(WebContainer.java:981)
at com.ibm.ws.webcontainer.osgi.WebContainer.startModule(WebContainer.java:805)
at com.ibm.ws.app.manager.web.internal.WebModuleHandlerImpl.deployModule(WebModuleHandlerImpl.java:102)
at com.ibm.ws.app.manager.module.internal.DeployedAppInfoBase.deployModule(DeployedAppInfoBase.java:870)
at com.ibm.ws.app.manager.module.internal.DeployedAppInfoBase.deployModules(DeployedAppInfoBase.java:830)
at com.ibm.ws.app.manager.module.internal.DeployedAppInfoBase.deployApp(DeployedAppInfoBase.java:817)
at com.ibm.ws.app.manager.war.internal.WARApplicationHandlerImpl.install(WARApplicationHandlerImpl.java:66)
at com.ibm.ws.app.manager.internal.statemachine.StartAction.execute(StartAction.java:139)
at com.ibm.ws.app.manager.internal.statemachine.ApplicationStateMachineImpl.enterState(ApplicationStateMachineImpl.java:1168)
at com.ibm.ws.app.manager.internal.statemachine.ApplicationStateMachineImpl.run(ApplicationStateMachineImpl.java:781)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.RuntimeException: No resource classes found for adapter 'SecurityAdapter'
at com.ibm.mfp.server.java.adapter.shared.JAXRSSandbox.produceSwaggerDoc(JAXRSSandbox.java:241)
at com.ibm.mfp.server.java.adapter.shared.JAXRSSandbox.init(JAXRSSandbox.java:165)
at com.ibm.mfp.server.java.adapter.internal.deploy.JaxRsSandboxDeploymentHandler.deploy(JaxRsSandboxDeploymentHandler.java:119)
... 45 more
[10/26/16 17:59:34:802 IST] 00000021 connectivity.synchronization.AdminSynchronizationInterceptor I Runtime service is blocked, current state: synchronization required
[10/26/16 17:59:34:804 IST] 00000067 com.mfp.datamatics.IntegrationAdapterStubApplication I Adapter destroyed!
It's unclear how you managed to get to this error:
FWLSE0324: Runtime synchronization failed. Caused by: java.lang.RuntimeException: No resource classes found for adapter 'SecurityAdapter'
On the surface, this error likely means that you deleted the resource .java file from your adapter, but kept the JAXRSApplicationClass definition of the resource in the adapter.xml file.
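For context, a Java adapter needs at least one JAX-RS resource class alongside the application class referenced by JAXRSApplicationClass in adapter.xml; a minimal sketch with hypothetical names would be:
package com.mfp.adapter;

import javax.ws.rs.GET;
import javax.ws.rs.Path;

// Hypothetical minimal resource class. The application class named in adapter.xml
// (com.mfp.adapter.SecurityAdapterApplication in the log above) must be able to find
// at least one such @Path-annotated class, otherwise deployment fails with
// "No resource classes found for adapter ...".
@Path("/")
public class SecurityResource {

    @GET
    public String ping() {
        return "ok";
    }
}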
To better understand the problem, please upload your .adapter file somewhere so it can be debugged.
As for what you can do in the meantime to clear the database: I tried various ways, but it looks like only a re-install of the devkit currently works...

ClassNotFoundException when using the Mule Amazon SQS connector

I'm using the Amazon SQS connector in my Mule project. When I updated it from version 2.5.5 to 3.0.0 according to the user guide and set the DEBUG logging level for the com.amazonaws package, I noticed the following error right after the project starts:
DEBUG 2015-07-20 15:15:56,927 [Receiving Thread] com.amazonaws.1.9.39.shade.jmx.spi.SdkMBeanRegistry: Failed to load the JMX implementation module - JMX is disabled
java.lang.ClassNotFoundException: com.amazonaws.1.9.39.shade.jmx.SdkMBeanRegistrySupport
at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
at org.mule.module.launcher.FineGrainedControlClassLoader.findClass(FineGrainedControlClassLoader.java:175)
at org.mule.module.launcher.MuleApplicationClassLoader.findClass(MuleApplicationClassLoader.java:134)
at org.mule.module.launcher.FineGrainedControlClassLoader.loadClass(FineGrainedControlClassLoader.java:119)
at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:169)
at com.amazonaws.1.9.39.shade.jmx.spi.SdkMBeanRegistry$Factory.<clinit>(SdkMBeanRegistry.java:46)
at com.amazonaws.1.9.39.shade.metrics.AwsSdkMetrics.registerMetricAdminMBean(AwsSdkMetrics.java:351)
at com.amazonaws.1.9.39.shade.metrics.AwsSdkMetrics.<clinit>(AwsSdkMetrics.java:316)
at com.amazonaws.1.9.39.shade.AmazonWebServiceClient.requestMetricCollector(AmazonWebServiceClient.java:629)
at com.amazonaws.1.9.39.shade.AmazonWebServiceClient.isRMCEnabledAtClientOrSdkLevel(AmazonWebServiceClient.java:570)
at com.amazonaws.1.9.39.shade.AmazonWebServiceClient.isRequestMetricsEnabled(AmazonWebServiceClient.java:562)
at com.amazonaws.1.9.39.shade.AmazonWebServiceClient.createExecutionContext(AmazonWebServiceClient.java:523)
at com.amazonaws.1.9.39.shade.services.sqs.AmazonSQSClient.listQueues(AmazonSQSClient.java:1163)
at com.amazonaws.1.9.39.shade.services.sqs.AmazonSQSClient.listQueues(AmazonSQSClient.java:1501)
at org.mule.modules.sqs.connection.strategy.SQSConnectionManagement.connect(SQSConnectionManagement.java:173)
at org.mule.modules.sqs.connectivity.SQSConnectionManagementSQSConnectorAdapter.connect(SQSConnectionManagementSQSConnectorAdapter.java:21)
at org.mule.modules.sqs.connectivity.SQSConnectionManagementSQSConnectorAdapter.connect(SQSConnectionManagementSQSConnectorAdapter.java:9)
at org.mule.devkit.3.6.1.shade.connection.management.ConnectionManagementConnectorFactory.makeObject(ConnectionManagementConnectorFactory.java:47)
at org.mule.devkit.3.6.1.shade.connection.management.ConnectionManagementConnectorFactory.makeObject(ConnectionManagementConnectorFactory.java:15)
at org.apache.commons.pool.impl.GenericKeyedObjectPool.borrowObject(GenericKeyedObjectPool.java:1220)
at org.mule.modules.sqs.connectivity.SQSConnectorConfigConnectionManagementConnectionManager.acquireConnection(SQSConnectorConfigConnectionManagementConnectionManager.java:407)
at org.mule.modules.sqs.connectivity.SQSConnectorConfigConnectionManagementConnectionManager.acquireConnection(SQSConnectorConfigConnectionManagementConnectionManager.java:55)
at org.mule.devkit.3.6.1.shade.connection.management.ConnectionManagementProcessInterceptor.execute(ConnectionManagementProcessInterceptor.java:47)
at org.mule.devkit.3.6.1.shade.connection.management.ConnectionManagementProcessInterceptor.execute(ConnectionManagementProcessInterceptor.java:19)
at org.mule.security.oauth.process.RetryProcessInterceptor.execute(RetryProcessInterceptor.java:84)
at org.mule.devkit.3.6.1.shade.connection.management.ConnectionManagementProcessTemplate.execute(ConnectionManagementProcessTemplate.java:33)
at org.mule.modules.sqs.sources.ReceiveMessagesMessageSource.run(ReceiveMessagesMessageSource.java:134)
at java.lang.Thread.run(Thread.java:662)
It's true: the mule-module-sqs-3.0.0.jar downloaded by Maven doesn't contain such a class. I rebuilt the Amazon SQS connector's source code with a small change in the pom.xml: I set <minimizeJar>false</minimizeJar> for the maven-shade-plugin. Then the missing class appeared in the jar and I managed to run the project without errors.
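For reference, the change in the connector's pom.xml looks roughly like this (only the relevant part of the maven-shade-plugin configuration is shown):
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-shade-plugin</artifactId>
    <configuration>
        <!-- keep all classes from the shaded dependencies instead of stripping the
             ones the plugin considers unused (which dropped SdkMBeanRegistrySupport) -->
        <minimizeJar>false</minimizeJar>
    </configuration>
</plugin>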
I'm not sure whether this is a bug or not, but I don't like the idea of building the connector manually. I would really appreciate it if you could help me sort it out. Thanks.