Analyze SonarQube Eclipse - Caused by: org.apache.bcel.classfile.ClassFormatException: Invalid byte tag in constant pool: 15 - eclipse-plugin

I'm using Eclipse Luna (Build id: 20150109-0600, 64-bit) on Windows with the SonarQube plugin (SonarQube Java Analyser 3.4.0.20140404-0949-RELEASE).
When I try to Run Analyze, the plugin can connect to the server and download the issues from the last analysis (the problems are shown in the SonarQube Issues view), but after that I get this error:
java.lang.IllegalStateException: Error status [command: C:\Program Files\Java\jre8\bin\java.exe -cp D:\temp\sonar-runner-impl1326048247551966004.jar org.sonar.runner.impl.BatchLauncherMain D:\temp\sonar-project7307491695046280128.properties]: 1
at org.sonar.runner.api.ForkedRunner.fork(ForkedRunner.java:199)
at org.sonar.runner.api.ForkedRunner.doExecute(ForkedRunner.java:144)
at org.sonar.runner.api.Runner.execute(Runner.java:90)
at org.sonar.ide.eclipse.core.internal.jobs.AnalyseProjectJob.run(AnalyseProjectJob.java:343)
at org.sonar.ide.eclipse.core.internal.jobs.AnalyseProjectJob.run(AnalyseProjectJob.java:130)
at org.eclipse.core.internal.jobs.Worker.run(Worker.java:54)
The strange thing is that in D:\temp there is no sonar-runner-implXXX.jar or sonar-projectXXX.properties file, but there are sonar-runner-batchXXX.jar files (like sonar-runner-batch1316679692029245803.jar).
Another strange thing: the command line is using a JRE 8, but my JAVA_HOME is a JDK 7, the project where I'm running the analysis is configured to use a JDK 7, and all my "Installed JREs" in Eclipse are actually JDKs.
Can someone help me?
========================================================================
Hi
1 - About the temp files, they really are created and then deleted (I tested this by "monitoring" the temp files while running the analysis).
2 - I could even copy the files and run the command line by hand; in this case I could see the real stack trace:
C:\>c:\Progra~1\Java\jdk1.8.0\jre\bin\java.exe -cp D:\temp\sonar-runner-impl5987517469765765781.jar org.sonar.runner.impl.BatchLauncherMain D:\temp\sonar-project5904228863019510021.properties
INFO: SonarQube Server 4.3.2
14:01:54.841 INFO - Preview mode
14:01:54.851 INFO - Load batch settings
14:01:55.338 INFO - User cache: C:\Users\fred\.sonar\cache
14:01:55.353 INFO - Install plugins
14:01:55.391 INFO - Include plugins:
14:01:55.391 INFO - Exclude plugins: devcockpit, buildstability, pdfreport, report, buildbreaker, scmactivity, views, jira
Exception in thread "main" org.sonar.runner.impl.RunnerException: Unable to execute Sonar
at org.sonar.runner.impl.BatchLauncher$1.delegateExecution(BatchLauncher.java:91)
at org.sonar.runner.impl.BatchLauncher$1.run(BatchLauncher.java:75)
at java.security.AccessController.doPrivileged(Native Method)
at org.sonar.runner.impl.BatchLauncher.doExecute(BatchLauncher.java:69)
at org.sonar.runner.impl.BatchLauncher.execute(BatchLauncher.java:50)
at org.sonar.runner.impl.BatchLauncherMain.execute(BatchLauncherMain.java:41)
at org.sonar.runner.impl.BatchLauncherMain.main(BatchLauncherMain.java:59)
Caused by: org.sonar.api.utils.SonarException: You're not authorized to execute a dry run analysis. Please contact your SonarQube administrator.
at org.sonar.batch.bootstrap.ServerClient.handleHttpException(ServerClient.java:120)
at org.sonar.batch.bootstrap.ServerClient.download(ServerClient.java:71)
at org.sonar.batch.bootstrap.PreviewDatabase.downloadDatabase(PreviewDatabase.java:85)
at org.sonar.batch.bootstrap.PreviewDatabase.start(PreviewDatabase.java:67)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at org.picocontainer.lifecycle.ReflectionLifecycleStrategy.invokeMethod(ReflectionLifecycleStrategy.java:110)
at org.picocontainer.lifecycle.ReflectionLifecycleStrategy.start(ReflectionLifecycleStrategy.java:89)
at org.picocontainer.injectors.AbstractInjectionFactory$LifecycleAdapter.start(AbstractInjectionFactory.java:84)
at org.picocontainer.behaviors.AbstractBehavior.start(AbstractBehavior.java:169)
at org.picocontainer.behaviors.Stored$RealComponentLifecycle.start(Stored.java:132)
at org.picocontainer.behaviors.Stored.start(Stored.java:110)
at org.picocontainer.DefaultPicoContainer.potentiallyStartAdapter(DefaultPicoContainer.java:1015)
at org.picocontainer.DefaultPicoContainer.startAdapters(DefaultPicoContainer.java:1008)
at org.picocontainer.DefaultPicoContainer.start(DefaultPicoContainer.java:766)
at org.sonar.api.platform.ComponentContainer.startComponents(ComponentContainer.java:91)
at org.sonar.api.platform.ComponentContainer.execute(ComponentContainer.java:77)
at org.sonar.batch.bootstrapper.Batch.startBatch(Batch.java:92)
at org.sonar.batch.bootstrapper.Batch.execute(Batch.java:74)
at org.sonar.runner.batch.IsolatedLauncher.execute(IsolatedLauncher.java:45)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at org.sonar.runner.impl.BatchLauncher$1.delegateExecution(BatchLauncher.java:87)
... 6 more
C:\>
3 - The real problem seems to be a missing permission...
But how can I grant a user permission to run a "dry run analysis"?
========================================================================
3 - I solved the permission problem by adding the "Execute Preview Analysis" permission to the users.
4 - Now another error happens: Caused by: org.apache.bcel.classfile.ClassFormatException: Invalid byte tag in constant pool: 15
And I have no clue about it...
C:\>c:\Progra~1\Java\jdk1.8.0\jre\bin\java.exe -cp D:\temp\sonar-runner-impl5987517469765765781.jar org.sonar.runner.impl.BatchLauncherMain D:\temp\sonar-project5904228863019510021.properties
INFO: SonarQube Server 4.3.2
14:53:18.370 INFO - Preview mode
14:53:18.389 INFO - Load batch settings
14:53:18.794 INFO - User cache: C:\Users\fred\.sonar\cache
14:53:18.814 INFO - Install plugins
14:53:18.854 INFO - Include plugins:
14:53:18.854 INFO - Exclude plugins: devcockpit, buildstability, pdfreport, report, buildbreaker, scmactivity, views, jira
14:53:19.550 INFO - Create JDBC datasource for jdbc:h2:D:\workspaces\workspace-fred\.metadata\.plugins\org.eclipse.core.resources\.projects\sgl\org.sonar.ide.eclipse.core\.sonartmp\preview1422550399090-0
14:53:23.617 INFO - Initializing Hibernate
14:53:26.543 INFO - Load project settings
14:53:26.634 INFO - Apply project exclusions
14:53:26.862 INFO - ------------- Scan sgl
14:53:26.873 INFO - Load module settings
14:53:27.828 INFO - Loading technical debt model...
14:53:27.842 INFO - Loading technical debt model done: 14 ms
14:53:27.842 INFO - Loading rules...
14:53:28.111 INFO - Loading rules done: 269 ms
14:53:28.152 INFO - Configure Maven plugins
14:53:28.356 INFO - Compare to previous analysis (2015-01-29)
14:53:28.366 INFO - Compare over 30 days (2014-12-30, analysis of 2015-01-01 02:44:35.053)
14:53:28.376 INFO - Compare to previous version (2015-01-29)
14:53:28.376 INFO - No quality gate is configured.
14:53:28.485 INFO - Base dir: D:\workspaces\workspace-fred\sgl
14:53:28.485 INFO - Working dir: D:\workspaces\workspace-fred\.metadata\.plugins\org.eclipse.core.resources\.projects\sgl\org.sonar.ide.eclipse.core
14:53:28.485 INFO - Source dirs: D:\workspaces\workspace-fred\sgl\webApplication, D:\workspaces\workspace-fred\sgl\clientconf, D:\workspaces\workspace-fred\sgl\src\java, D:\workspaces\workspace-fred\sgl\src\conf, D:\workspaces\workspace-fred\sgl\conf
14:53:28.485 INFO - Test dirs: D:\workspaces\workspace-fred\sgl\src\test
14:53:28.485 INFO - Binary dirs: D:\workspaces\workspace-fred\sgl\webApplication\WEB-INF\classes
14:53:28.485 INFO - Source encoding: UTF-8, default locale: en_US
14:53:28.485 INFO - Index files
14:53:46.434 INFO - 959 files indexed
14:53:50.916 INFO - Quality profile for java: Minds Java Profile
14:53:51.088 INFO - Sensor JavaSquidSensor...
14:53:51.223 INFO - Java Main Files AST scan...
14:53:51.227 INFO - 946 source files to be analyzed
14:54:01.229 INFO - 127/946 files analyzed, current is D:\workspaces\workspace-fred\sgl\src\java\br\com\mindsatwork\sgl\model\DatabaseVersion.java
14:54:11.240 INFO - 176/946 files analyzed, current is D:\workspaces\workspace-fred\sgl\src\java\br\com\mindsatwork\sgl\model\MaterialReservationStatus.java
14:54:21.269 INFO - 240/946 files analyzed, current is D:\workspaces\workspace-fred\sgl\src\java\br\com\mindsatwork\sgl\model\strategy\AbstractTicketChangeRequestStatusStrategy.java
14:54:31.299 INFO - 321/946 files analyzed, current is D:\workspaces\workspace-fred\sgl\src\java\br\com\mindsatwork\sgl\permission\ApplicationModuleStrategyCourse.java
14:54:36.119 ERROR - Class not found: org.slf4j.Logger
14:54:41.320 INFO - 455/946 files analyzed, current is D:\workspaces\workspace-fred\sgl\src\java\br\com\mindsatwork\sgl\web\action\ActionInitSaveCorrectiveMaintenance.java
14:54:51.324 INFO - 541/946 files analyzed, current is D:\workspaces\workspace-fred\sgl\src\java\br\com\mindsatwork\sgl\web\action\ActionListProvider.java
14:55:01.370 INFO - 614/946 files analyzed, current is D:\workspaces\workspace-fred\sgl\src\java\br\com\mindsatwork\sgl\web\action\ActionSearch.java
14:55:11.395 INFO - 674/946 files analyzed, current is D:\workspaces\workspace-fred\sgl\src\java\br\com\mindsatwork\sgl\web\dwr\MessageManager.java
14:55:21.398 INFO - 748/946 files analyzed, current is D:\workspaces\workspace-fred\sgl\src\java\br\com\mindsatwork\sgl\web\service\model\xstream\WsPackageTrackingMainData.java
14:55:23.288 ERROR - Class not found: javax.el.ELContext
14:55:23.650 ERROR - Class not found: javax.el.ELContext
14:55:23.781 ERROR - Class not found: javax.el.ELContext
14:55:23.890 ERROR - Class not found: javax.el.ELContext
14:55:24.429 ERROR - Class not found: javax.el.ELContext
14:55:24.531 ERROR - Class not found: javax.el.ELContext
14:55:24.657 ERROR - Class not found: javax.el.ELContext
14:55:24.757 ERROR - Class not found: javax.el.ELContext
14:55:24.877 ERROR - Class not found: javax.el.ELContext
14:55:24.982 ERROR - Class not found: javax.el.ELContext
14:55:25.102 ERROR - Class not found: javax.el.ELContext
14:55:25.212 ERROR - Class not found: javax.el.ELContext
14:55:25.332 ERROR - Class not found: javax.el.ELContext
14:55:25.437 ERROR - Class not found: javax.el.ELContext
14:55:25.537 ERROR - Class not found: javax.el.ELContext
14:55:25.639 ERROR - Class not found: javax.el.ELContext
14:55:25.731 ERROR - Class not found: javax.el.ELContext
14:55:25.931 ERROR - Class not found: javax.el.ELContext
14:55:26.022 ERROR - Class not found: javax.el.ELContext
14:55:26.127 ERROR - Class not found: javax.el.ELContext
14:55:26.295 ERROR - Class not found: javax.el.ELContext
14:55:26.398 ERROR - Class not found: javax.el.ELContext
14:55:26.479 ERROR - Class not found: javax.el.ELContext
14:55:26.560 ERROR - Class not found: javax.el.ELContext
14:55:26.665 ERROR - Class not found: javax.el.ELContext
14:55:26.786 ERROR - Class not found: javax.el.ELContext
14:55:26.878 ERROR - Class not found: javax.el.ELContext
14:55:27.438 ERROR - Class not found: javax.el.ELContext
14:55:27.741 ERROR - Class not found: javax.el.ELContext
14:55:28.152 ERROR - Class not found: javax.el.ELContext
14:55:28.366 ERROR - Class not found: javax.el.ELContext
14:55:28.457 ERROR - Class not found: javax.el.ELContext
14:55:28.556 ERROR - Class not found: javax.el.ELContext
14:55:28.631 ERROR - Class not found: javax.el.ELContext
14:55:28.709 ERROR - Class not found: javax.el.ELContext
14:55:28.801 ERROR - Class not found: javax.el.ELContext
14:55:28.873 ERROR - Class not found: javax.el.ELContext
14:55:28.953 ERROR - Class not found: javax.el.ELContext
14:55:29.026 ERROR - Class not found: javax.el.ELContext
14:55:29.355 ERROR - Class not found: javax.el.ELContext
14:55:29.435 ERROR - Class not found: javax.el.ELContext
14:55:29.624 ERROR - Class not found: javax.el.ELContext
14:55:29.694 ERROR - Class not found: javax.el.ELContext
14:55:31.380 INFO - 848/946 files analyzed, current is D:\workspaces\workspace-fred\sgl\src\java\br\com\mindsatwork\sgl\web\taglib\display\export\ReportExcelHssfView.java
14:55:31.436 ERROR - Class not found: javax.el.ELContext
14:55:31.536 ERROR - Class not found: javax.el.ELContext
14:55:39.628 INFO - 946/946 source files analyzed
14:55:39.791 INFO - Java Main Files AST scan done: 108568 ms
14:55:39.880 INFO - Java bytecode scan...
14:55:41.734 INFO - Java bytecode scan done: 1854 ms
14:55:41.734 INFO - Java Test Files AST scan...
14:55:41.735 INFO - 13 source files to be analyzed
14:55:41.964 INFO - Java Test Files AST scan done: 230 ms
14:55:41.964 INFO - 13/13 source files analyzed
14:55:42.290 INFO - Package design analysis...
14:55:43.038 INFO - Package design analysis done: 748 ms
14:55:43.509 INFO - Sensor JavaSquidSensor done: 112421 ms
14:55:43.509 INFO - Sensor QProfileSensor...
14:55:43.512 INFO - Sensor QProfileSensor done: 3 ms
14:55:43.512 INFO - Sensor PmdSensor...
14:55:43.515 INFO - Execute PMD 5.1.1...
14:55:43.546 INFO - Java version: 1.7
14:55:43.593 INFO - PMD configuration: D:\workspaces\workspace-fred\.metadata\.plugins\org.eclipse.core.resources\.projects\sgl\org.sonar.ide.eclipse.core\pmd.xml
14:55:52.299 INFO - PMD configuration: D:\workspaces\workspace-fred\.metadata\.plugins\org.eclipse.core.resources\.projects\sgl\org.sonar.ide.eclipse.core\pmd-unit-tests.xml
14:55:52.300 INFO - Execute PMD 5.1.1 done: 8785 ms
14:55:52.328 INFO - Sensor PmdSensor done: 8816 ms
14:55:52.328 INFO - Sensor SurefireSensor...
14:55:52.330 INFO - parsing D:\workspaces\workspace-fred\.metadata\.plugins\org.eclipse.core.resources\.projects\sgl\org.sonar.ide.eclipse.core\build\surefire-reports
14:55:52.330 WARN - Reports path not found: D:\workspaces\workspace-fred\.metadata\.plugins\org.eclipse.core.resources\.projects\sgl\org.sonar.ide.eclipse.core\build\surefire-reports
14:55:52.330 INFO - Sensor SurefireSensor done: 2 ms
14:55:52.330 INFO - Sensor InitialOpenIssuesSensor...
14:55:52.373 INFO - Sensor InitialOpenIssuesSensor done: 43 ms
14:55:52.374 INFO - Sensor ProfileEventsSensor...
14:55:52.396 INFO - Sensor ProfileEventsSensor done: 22 ms
14:55:52.396 INFO - Sensor ProjectLinksSensor...
14:55:52.409 INFO - Sensor ProjectLinksSensor done: 13 ms
14:55:52.410 INFO - Sensor FindbugsSensor...
14:55:52.412 INFO - Execute Findbugs 2.0.3...
14:55:53.888 INFO - Findbugs output report: D:\workspaces\workspace-fred\.metadata\.plugins\org.eclipse.core.resources\.projects\sgl\org.sonar.ide.eclipse.core\findbugs-result.xml
Exception in thread "main" org.sonar.runner.impl.RunnerException: Unable to execute Sonar
at org.sonar.runner.impl.BatchLauncher$1.delegateExecution(BatchLauncher.java:91)
at org.sonar.runner.impl.BatchLauncher$1.run(BatchLauncher.java:75)
at java.security.AccessController.doPrivileged(Native Method)
at org.sonar.runner.impl.BatchLauncher.doExecute(BatchLauncher.java:69)
at org.sonar.runner.impl.BatchLauncher.execute(BatchLauncher.java:50)
at org.sonar.runner.impl.BatchLauncherMain.execute(BatchLauncherMain.java:41)
at org.sonar.runner.impl.BatchLauncherMain.main(BatchLauncherMain.java:59)
Caused by: org.sonar.api.utils.SonarException: Can not execute Findbugs
at org.sonar.plugins.findbugs.FindbugsExecutor.execute(FindbugsExecutor.java:154)
at org.sonar.plugins.findbugs.FindbugsSensor.analyse(FindbugsSensor.java:59)
at org.sonar.batch.phases.SensorsExecutor.executeSensor(SensorsExecutor.java:79)
at org.sonar.batch.phases.SensorsExecutor.execute(SensorsExecutor.java:70)
at org.sonar.batch.phases.PhaseExecutor.execute(PhaseExecutor.java:131)
at org.sonar.batch.scan.ModuleScanContainer.doAfterStart(ModuleScanContainer.java:178)
at org.sonar.api.platform.ComponentContainer.startComponents(ComponentContainer.java:92)
at org.sonar.api.platform.ComponentContainer.execute(ComponentContainer.java:77)
at org.sonar.batch.scan.ProjectScanContainer.scan(ProjectScanContainer.java:199)
at org.sonar.batch.scan.ProjectScanContainer.scanRecursively(ProjectScanContainer.java:194)
at org.sonar.batch.scan.ProjectScanContainer.doAfterStart(ProjectScanContainer.java:187)
at org.sonar.api.platform.ComponentContainer.startComponents(ComponentContainer.java:92)
at org.sonar.api.platform.ComponentContainer.execute(ComponentContainer.java:77)
at org.sonar.batch.scan.ScanTask.scan(ScanTask.java:56)
at org.sonar.batch.scan.ScanTask.execute(ScanTask.java:44)
at org.sonar.batch.bootstrap.TaskContainer.doAfterStart(TaskContainer.java:82)
at org.sonar.api.platform.ComponentContainer.startComponents(ComponentContainer.java:92)
at org.sonar.api.platform.ComponentContainer.execute(ComponentContainer.java:77)
at org.sonar.batch.bootstrap.BootstrapContainer.executeTask(BootstrapContainer.java:175)
at org.sonar.batch.bootstrap.BootstrapContainer.doAfterStart(BootstrapContainer.java:163)
at org.sonar.api.platform.ComponentContainer.startComponents(ComponentContainer.java:92)
at org.sonar.api.platform.ComponentContainer.execute(ComponentContainer.java:77)
at org.sonar.batch.bootstrapper.Batch.startBatch(Batch.java:92)
at org.sonar.batch.bootstrapper.Batch.execute(Batch.java:74)
at org.sonar.runner.batch.IsolatedLauncher.execute(IsolatedLauncher.java:45)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at org.sonar.runner.impl.BatchLauncher$1.delegateExecution(BatchLauncher.java:87)
... 6 more
Caused by: java.util.concurrent.ExecutionException: java.lang.ExceptionInInitializerError
at java.util.concurrent.FutureTask.report(FutureTask.java:122)
at java.util.concurrent.FutureTask.get(FutureTask.java:206)
at org.sonar.plugins.findbugs.FindbugsExecutor.execute(FindbugsExecutor.java:146)
... 35 more
Caused by: java.lang.ExceptionInInitializerError
at edu.umd.cs.findbugs.detect.SerializableIdiom.visit(SerializableIdiom.java:609)
at edu.umd.cs.findbugs.visitclass.BetterVisitor.visitField(BetterVisitor.java:286)
at org.apache.bcel.classfile.Field.accept(Field.java:92)
at edu.umd.cs.findbugs.visitclass.PreorderVisitor.doVisitField(PreorderVisitor.java:266)
at edu.umd.cs.findbugs.visitclass.PreorderVisitor.visitJavaClass(PreorderVisitor.java:349)
at org.apache.bcel.classfile.JavaClass.accept(JavaClass.java:214)
at edu.umd.cs.findbugs.detect.SerializableIdiom.visitClassContext(SerializableIdiom.java:133)
at edu.umd.cs.findbugs.DetectorToDetector2Adapter.visitClass(DetectorToDetector2Adapter.java:74)
at edu.umd.cs.findbugs.FindBugs2.analyzeApplication(FindBugs2.java:1209)
at edu.umd.cs.findbugs.FindBugs2.execute(FindBugs2.java:282)
at org.sonar.plugins.findbugs.FindbugsExecutor$FindbugsTask.call(FindbugsExecutor.java:201)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.bcel.classfile.ClassFormatException: Invalid byte tag in constant pool: 15
at org.apache.bcel.classfile.Constant.readConstant(Constant.java:147)
at org.apache.bcel.classfile.ConstantPool.<init>(ConstantPool.java:68)
at org.apache.bcel.classfile.ClassParser.readConstantPool(ClassParser.java:237)
at org.apache.bcel.classfile.ClassParser.parse(ClassParser.java:143)
at org.apache.bcel.util.SyntheticRepository.loadClass(SyntheticRepository.java:179)
at org.apache.bcel.util.SyntheticRepository.loadClass(SyntheticRepository.java:127)
at edu.umd.cs.findbugs.ba.AnalysisContext.lookupSystemClass(AnalysisContext.java:501)
at edu.umd.cs.findbugs.DeepSubtypeAnalysis.<clinit>(DeepSubtypeAnalysis.java:39)
... 15 more
C:\>
========================================================================
Hi
I found a link reporting the same error on Tomcat; it says the error only happens with JDK 7.
So I changed the JDK version used by the analysis (the project is compiled with JDK 7, 64-bit).
I tried:
JDK 6 32 bits: no error (but a lot of "java.lang.UnsupportedClassVersionError" warnings, as the project is compiled for Java 7)
JDK 7 32 bits: no error
JDK 7 64 bits: no error
JDK 8 64 bits: Caused by: org.apache.bcel.classfile.ClassFormatException: Invalid byte tag in constant pool: 15
It seems that Sonar only supports JDK 8 in recent versions, since 4.3.
But my Sonar server is 4.3.2, so it should be supported...
Note: if I compile the project with JDK 8 and then run an analysis, I get a different error:
...
Caused by: java.lang.ArrayIndexOutOfBoundsException: 7352
at org.objectweb.asm.ClassReader.readClass(Unknown Source)
at org.objectweb.asm.ClassReader.accept(Unknown Source)
at edu.umd.cs.findbugs.asm.FBClassReader.accept(FBClassReader.java:44)
at org.objectweb.asm.ClassReader.accept(Unknown Source)
at edu.umd.cs.findbugs.classfile.engine.ClassParserUsingASM.parse(ClassParserUsingASM.java:110)
at edu.umd.cs.findbugs.classfile.engine.ClassParserUsingASM.parse(ClassParserUsingASM.java:587)
at edu.umd.cs.findbugs.classfile.engine.ClassInfoAnalysisEngine.analyze(ClassInfoAnalysisEngine.java:76)
at edu.umd.cs.findbugs.classfile.engine.ClassInfoAnalysisEngine.analyze(ClassInfoAnalysisEngine.java:38)
at edu.umd.cs.findbugs.classfile.impl.AnalysisCache.getClassAnalysis(AnalysisCache.java:268)
at edu.umd.cs.findbugs.ba.XFactory.getXClass(XFactory.java:652)
at edu.umd.cs.findbugs.ba.AnalysisContext.setAppClassList(AnalysisContext.java:932)
at edu.umd.cs.findbugs.FindBugs2.setAppClassList(FindBugs2.java:997)
at edu.umd.cs.findbugs.FindBugs2.execute(FindBugs2.java:225)
at org.sonar.plugins.findbugs.FindbugsExecutor$FindbugsTask.call(FindbugsExecutor.java:201)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)

I had exactly the same problem.
The issue was that our SonarQube server was running on Java 7, while the Eclipse SonarQube plugin was running on Java 8.
(You can check that from Help -> About Eclipse -> Installation Details -> Configuration tab.)
I managed to solve the issue by adding the lines below to the eclipse.ini file (located in the Eclipse installation directory).
-vm
C:\Program Files (x86)\Java\jdk1.7.0_72\bin\javaw.exe
(these lines should come above -vmargs)
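For illustration, a minimal sketch of how the relevant part of eclipse.ini might look after the change; the JDK path is the one from the answer above and must be adjusted to your own installation, and the -Xms/-Xmx lines only stand in for whatever your file already has after -vmargs:
-vm
C:\Program Files (x86)\Java\jdk1.7.0_72\bin\javaw.exe
-vmargs
-Xms256m
-Xmx1024m
The important detail is the ordering: -vm and its path are two separate lines, and both come before -vmargs; otherwise the path would be interpreted as a JVM argument.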

Related

Kotlin VSCode extension unable to find the 'mvn' command

But in a terminal I can see a Maven version:
11:58 $ mvn --version
Apache Maven 3.8.4 (9b656c72d54e5bacbed989b64718c159fe39b537)
Maven home: /home/stephane/.asdf/installs/maven/3.8.4
Java version: 15.0.2, vendor: AdoptOpenJDK, runtime: /home/stephane/.asdf/installs/java/adoptopenjdk-15.0.2+7
Default locale: en_US, platform encoding: UTF-8
OS name: "linux", version: "5.4.0-91-generic", arch: "amd64", family: "unix"
I'm using the asdf tool:
12:07 $ cat .tool-versions
java adoptopenjdk-15.0.2+7
nodejs 12.13.1
tflint 0.28.1
terraform-validator 3.1.3
packer 1.7.2
terraform 0.15.3
adr-tools 3.0.0
pre-commit 1.21.0
maven 3.8.4
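Worth noting, purely as an assumption (nothing in the question confirms it): since mvn here is provided by an asdf shim, it may only be on the PATH of shells where asdf is initialised, and a VS Code launched from the desktop might not inherit that PATH. Two quick checks from a terminal:
which mvn
asdf which mvn
If these resolve fine in the terminal but the extension still reports "Unable to find the 'mvn' command", comparing the PATH seen by VS Code with the terminal's PATH is the next step.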
When loading the project, the bottom Output pane for the Kotlin console shows the following:
Dec 18, 2021 12:06:26 PM org.eclipse.lsp4j.jsonrpc.RemoteEndpoint fallbackResponseError
SEVERE: Internal error: java.lang.IllegalArgumentException: Unable to find the 'mvn' command
java.util.concurrent.CompletionException: java.lang.IllegalArgumentException: Unable to find the 'mvn' command
at java.base/java.util.concurrent.CompletableFuture.encodeThrowable(CompletableFuture.java:314)
at java.base/java.util.concurrent.CompletableFuture.completeThrowable(CompletableFuture.java:319)
at java.base/java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1702)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
at java.base/java.lang.Thread.run(Thread.java:829)
Caused by: java.lang.IllegalArgumentException: Unable to find the 'mvn' command
at org.javacs.kt.classpath.MavenClassPathResolverKt$mvnCommand$2.invoke(MavenClassPathResolver.kt:129)
at org.javacs.kt.classpath.MavenClassPathResolverKt$mvnCommand$2.invoke(MavenClassPathResolver.kt:128)
at kotlin.SynchronizedLazyImpl.getValue(LazyJVM.kt:74)
at org.javacs.kt.classpath.MavenClassPathResolverKt.getMvnCommand(MavenClassPathResolver.kt:128)
at org.javacs.kt.classpath.MavenClassPathResolverKt.generateMavenDependencyList(MavenClassPathResolver.kt:106)
at org.javacs.kt.classpath.MavenClassPathResolverKt.access$generateMavenDependencyList(MavenClassPathResolver.kt:1)
at org.javacs.kt.classpath.MavenClassPathResolver.getClasspathWithSources(MavenClassPathResolver.kt:37)
at org.javacs.kt.classpath.UnionClassPathResolver.<init>(ClassPathResolver.kt:57)
at org.javacs.kt.classpath.ClassPathResolverKt.plus(ClassPathResolver.kt:45)
at org.javacs.kt.classpath.ClassPathResolverKt.getJoined(ClassPathResolver.kt:40)
at org.javacs.kt.classpath.DefaultClassPathResolverKt.defaultClassPathResolver(DefaultClassPathResolver.kt:11)
at org.javacs.kt.CompilerClassPath.refresh(CompilerClassPath.kt:37)
at org.javacs.kt.CompilerClassPath.refresh$default(CompilerClassPath.kt:31)
at org.javacs.kt.CompilerClassPath.addWorkspaceRoot(CompilerClassPath.kt:98)
at org.javacs.kt.KotlinLanguageServer$initialize$1.invoke(KotlinLanguageServer.kt:117)
at org.javacs.kt.KotlinLanguageServer$initialize$1.invoke(KotlinLanguageServer.kt:71)
at org.javacs.kt.util.AsyncExecutor.compute$lambda-2(AsyncExecutor.kt:19)
at java.base/java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1700)
... 3 more
[Error - 11:51:28 AM] Starting client failed
Message: Internal error.
Code: -32603
java.util.concurrent.CompletionException: java.lang.IllegalArgumentException: Unable to find the 'mvn' command
at java.base/java.util.concurrent.CompletableFuture.encodeThrowable(CompletableFuture.java:314)
at java.base/java.util.concurrent.CompletableFuture.completeThrowable(CompletableFuture.java:319)
at java.base/java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1702)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
at java.base/java.lang.Thread.run(Thread.java:829)
Caused by: java.lang.IllegalArgumentException: Unable to find the 'mvn' command
at org.javacs.kt.classpath.MavenClassPathResolverKt$mvnCommand$2.invoke(MavenClassPathResolver.kt:129)
at org.javacs.kt.classpath.MavenClassPathResolverKt$mvnCommand$2.invoke(MavenClassPathResolver.kt:128)
at kotlin.SynchronizedLazyImpl.getValue(LazyJVM.kt:74)
at org.javacs.kt.classpath.MavenClassPathResolverKt.getMvnCommand(MavenClassPathResolver.kt:128)
at org.javacs.kt.classpath.MavenClassPathResolverKt.generateMavenDependencyList(MavenClassPathResolver.kt:106)
at org.javacs.kt.classpath.MavenClassPathResolverKt.access$generateMavenDependencyList(MavenClassPathResolver.kt:1)
at org.javacs.kt.classpath.MavenClassPathResolver.getClasspathWithSources(MavenClassPathResolver.kt:37)
at org.javacs.kt.classpath.UnionClassPathResolver.<init>(ClassPathResolver.kt:57)
at org.javacs.kt.classpath.ClassPathResolverKt.plus(ClassPathResolver.kt:45)
at org.javacs.kt.classpath.ClassPathResolverKt.getJoined(ClassPathResolver.kt:40)
at org.javacs.kt.classpath.DefaultClassPathResolverKt.defaultClassPathResolver(DefaultClassPathResolver.kt:11)
at org.javacs.kt.CompilerClassPath.refresh(CompilerClassPath.kt:37)
at org.javacs.kt.CompilerClassPath.refresh$default(CompilerClassPath.kt:31)
at org.javacs.kt.CompilerClassPath.addWorkspaceRoot(CompilerClassPath.kt:98)
at org.javacs.kt.KotlinLanguageServer$initialize$1.invoke(KotlinLanguageServer.kt:117)
at org.javacs.kt.KotlinLanguageServer$initialize$1.invoke(KotlinLanguageServer.kt:71)
at org.javacs.kt.util.AsyncExecutor.compute$lambda-2(AsyncExecutor.kt:19)
at java.base/java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1700)
... 3 more
The extension log shows:
[2021-12-18 11:51:28.974] [exthost] [error] Activating extension fwcd.kotlin failed due to an error:
[2021-12-18 11:51:28.975] [exthost] [error] Error: Internal error.
at /home/stephane/.vscode/extensions/fwcd.kotlin-0.2.23/dist/extension.js:2:815204
at /home/stephane/.vscode/extensions/fwcd.kotlin-0.2.23/dist/extension.js:2:815498
at Immediate.<anonymous> (/home/stephane/.vscode/extensions/fwcd.kotlin-0.2.23/dist/extension.js:2:815863)
at processImmediate (internal/timers.js:461:21)
It is a Gradle main project with a build.xml file in the project root directory, with Gradle subprojects and Maven subprojects.
There are many build.xml, pom.xml, gradlew, and gradlew.bat files in the subprojects.
Releases:
VSCode 1.63.2
Extensions:
Language Support for Java v1.2.0
Kotlin v0.2.23

Flink submit task failed

I am using Flink 1.6.1 and Hadoop 2.7.5. First I start a Flink YARN session:
bin/yarn-session.sh -n 2 -jm 1024 -tm 1024 -d
then I submit a task:
./bin/flink run ./examples/batch/WordCount.jar -input hdfs://CS-201:9000/LICENSE -output hdfs://CS-201:9000/wordcount-result.txt
I get this error:
[root@CS-201 flink-1.6.1]# ./bin/flink run ./examples/batch/WordCount.jar -input hdfs://CS-201:9000/LICENSE -output hdfs://CS-201:9000/wordcount-result.txt
2019-05-19 15:31:11,357 INFO org.apache.flink.yarn.cli.FlinkYarnSessionCli - Found Yarn properties file under /tmp/.yarn-properties-root.
2019-05-19 15:31:11,357 INFO org.apache.flink.yarn.cli.FlinkYarnSessionCli - Found Yarn properties file under /tmp/.yarn-properties-root.
2019-05-19 15:31:11,737 INFO org.apache.flink.yarn.cli.FlinkYarnSessionCli - YARN properties set default parallelism to 2
2019-05-19 15:31:11,737 INFO org.apache.flink.yarn.cli.FlinkYarnSessionCli - YARN properties set default parallelism to 2
YARN properties set default parallelism to 2
2019-05-19 15:31:11,777 INFO org.apache.hadoop.yarn.client.RMProxy - Connecting to ResourceManager at CS-201/192.168.1.201:8032
2019-05-19 15:31:11,887 INFO org.apache.flink.yarn.cli.FlinkYarnSessionCli - No path for the flink jar passed. Using the location of class org.apache.flink.yarn.YarnClusterDescriptor to locate the jar
2019-05-19 15:31:11,887 INFO org.apache.flink.yarn.cli.FlinkYarnSessionCli - No path for the flink jar passed. Using the location of class org.apache.flink.yarn.YarnClusterDescriptor to locate the jar
2019-05-19 15:31:11,891 WARN org.apache.flink.yarn.AbstractYarnClusterDescriptor - Neither the HADOOP_CONF_DIR nor the YARN_CONF_DIR environment variable is set. The Flink YARN Client needs one of these to be set to properly load the Hadoop configuration for accessing YARN.
2019-05-19 15:31:11,979 INFO org.apache.flink.yarn.AbstractYarnClusterDescriptor - Found application JobManager host name 'cs-202' and port '52389' from supplied application id 'application_1558248666499_0003'
Starting execution of program
------------------------------------------------------------
The program finished with the following exception:
org.apache.flink.client.program.ProgramInvocationException: Could not retrieve the execution result. (JobID: 471f0c2d047aba74ea621c5bfe782cbf)
at org.apache.flink.client.program.rest.RestClusterClient.submitJob(RestClusterClient.java:260)
at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:486)
at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:474)
at org.apache.flink.client.program.ContextEnvironment.execute(ContextEnvironment.java:62)
at org.apache.flink.examples.java.wordcount.WordCount.main(WordCount.java:85)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:529)
at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:421)
at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:426)
at org.apache.flink.client.cli.CliFrontend.executeProgram(CliFrontend.java:804)
at org.apache.flink.client.cli.CliFrontend.runProgram(CliFrontend.java:280)
at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:215)
at org.apache.flink.client.cli.CliFrontend.parseParameters(CliFrontend.java:1044)
at org.apache.flink.client.cli.CliFrontend.lambda$main$11(CliFrontend.java:1120)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1754)
at org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)
at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1120)
Caused by: org.apache.flink.runtime.client.JobSubmissionException: Failed to submit JobGraph.
at org.apache.flink.client.program.rest.RestClusterClient.lambda$submitJob$8(RestClusterClient.java:379)
at java.util.concurrent.CompletableFuture.uniExceptionally(CompletableFuture.java:870)
at java.util.concurrent.CompletableFuture$UniExceptionally.tryFire(CompletableFuture.java:852)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:474)
at java.util.concurrent.CompletableFuture.completeExceptionally(CompletableFuture.java:1977)
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$5(FutureUtils.java:213)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:760)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:736)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:474)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:561)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:929)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:442)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Exception is not retryable.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:911)
at java.util.concurrent.CompletableFuture$UniRelay.tryFire(CompletableFuture.java:899)
... 12 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Exception is not retryable.
... 10 more
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: [Job submission failed.]
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:911)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:953)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:926)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: [Job submission failed.]
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:310)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:294)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:952)
... 5 more
Why does this happen, and how can I fix it?

Apache flink - Timeout after submitting job on hadoop / yarn cluster

I am trying to upgrade our job from flink 1.4.2 to 1.7.1 but I keep running into timeouts after submitting the job. The flink job runs on our hadoop cluster (version 2.7) with Yarn.
I've seen the following behavior:
Using the same flink-conf.yaml as we used in 1.4.2: 1.5.6 / 1.6.3 / 1.7.1 all time out, while 1.4.2 works.
Using 1.5.6 with "mode: legacy" (to switch off flip-6) works
Using 1.7.1 with "mode: legacy" gives a timeout (I assume this option was removed, but the documentation is outdated? https://ci.apache.org/projects/flink/flink-docs-stable/ops/config.html#legacy)
When the timeout happens I get the following stacktrace:
INFO class java.time.Instant does not contain a getter for field seconds
INFO class com.bol.fin_hdp.cm1.domain.Cm1Transportable does not contain a getter for field globalId
INFO Submitting job 5af931bcef395a78b5af2b97e92dcffe (detached: false).
INFO ------------------------------------------------------------
INFO The program finished with the following exception:
INFO org.apache.flink.client.program.ProgramInvocationException: The main method caused an error.
INFO at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:545)
INFO at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:420)
INFO at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:404)
INFO at org.apache.flink.client.cli.CliFrontend.executeProgram(CliFrontend.java:798)
INFO at org.apache.flink.client.cli.CliFrontend.runProgram(CliFrontend.java:289)
INFO at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:215)
INFO at org.apache.flink.client.cli.CliFrontend.parseParameters(CliFrontend.java:1035)
INFO at org.apache.flink.client.cli.CliFrontend.lambda$main$9(CliFrontend.java:1111)
INFO at java.security.AccessController.doPrivileged(Native Method)
INFO at javax.security.auth.Subject.doAs(Subject.java:422)
INFO at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1754)
INFO at org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)
INFO at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1111)
INFO Caused by: java.lang.RuntimeException: org.apache.flink.client.program.ProgramInvocationException: Could not retrieve the execution result.
INFO at com.bol.fin_hdp.job.starter.IntervalJobStarter.startJob(IntervalJobStarter.java:43)
INFO at com.bol.fin_hdp.job.starter.IntervalJobStarter.startJobWithConfig(IntervalJobStarter.java:32)
INFO at com.bol.fin_hdp.Main.main(Main.java:8)
INFO at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
INFO at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
INFO at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
INFO at java.lang.reflect.Method.invoke(Method.java:498)
INFO at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:528)
INFO ... 12 more
INFO Caused by: org.apache.flink.client.program.ProgramInvocationException: Could not retrieve the execution result.
INFO at org.apache.flink.client.program.rest.RestClusterClient.submitJob(RestClusterClient.java:258)
INFO at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:464)
INFO at org.apache.flink.streaming.api.environment.StreamContextEnvironment.execute(StreamContextEnvironment.java:66)
INFO at com.bol.fin_hdp.cm1.job.Job.execute(Job.java:54)
INFO at com.bol.fin_hdp.job.starter.IntervalJobStarter.startJob(IntervalJobStarter.java:41)
INFO ... 19 more
INFO Caused by: org.apache.flink.runtime.client.JobSubmissionException: Failed to submit JobGraph.
INFO at org.apache.flink.client.program.rest.RestClusterClient.lambda$submitJob$8(RestClusterClient.java:371)
INFO at java.util.concurrent.CompletableFuture.uniExceptionally(CompletableFuture.java:870)
INFO at java.util.concurrent.CompletableFuture$UniExceptionally.tryFire(CompletableFuture.java:852)
INFO at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:474)
INFO at java.util.concurrent.CompletableFuture.completeExceptionally(CompletableFuture.java:1977)
INFO at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$5(FutureUtils.java:216)
INFO at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:760)
INFO at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:736)
INFO at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:474)
INFO at java.util.concurrent.CompletableFuture.completeExceptionally(CompletableFuture.java:1977)
INFO at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$1(RestClient.java:301)
INFO at org.apache.flink.shaded.netty4.io.netty.util.concurrent.DefaultPromise.notifyListener0(DefaultPromise.java:680)
INFO at org.apache.flink.shaded.netty4.io.netty.util.concurrent.DefaultPromise.notifyListeners0(DefaultPromise.java:603)
INFO at org.apache.flink.shaded.netty4.io.netty.util.concurrent.DefaultPromise.notifyListeners(DefaultPromise.java:563)
INFO at org.apache.flink.shaded.netty4.io.netty.util.concurrent.DefaultPromise.tryFailure(DefaultPromise.java:424)
INFO at org.apache.flink.shaded.netty4.io.netty.channel.nio.AbstractNioChannel$AbstractNioUnsafe$1.run(AbstractNioChannel.java:214)
INFO at org.apache.flink.shaded.netty4.io.netty.util.concurrent.PromiseTask$RunnableAdapter.call(PromiseTask.java:38)
INFO at org.apache.flink.shaded.netty4.io.netty.util.concurrent.ScheduledFutureTask.run(ScheduledFutureTask.java:120)
INFO at org.apache.flink.shaded.netty4.io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:357)
INFO at org.apache.flink.shaded.netty4.io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:357)
INFO at org.apache.flink.shaded.netty4.io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
INFO at org.apache.flink.shaded.netty4.io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:137)
INFO at java.lang.Thread.run(Thread.java:748)
INFO Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
INFO at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$5(FutureUtils.java:213)
INFO ... 17 more
INFO Caused by: java.util.concurrent.CompletionException: org.apache.flink.shaded.netty4.io.netty.channel.ConnectTimeoutException: connection timed out: shd-hdp-b-slave-01...
INFO at java.util.concurrent.CompletableFuture.encodeThrowable(CompletableFuture.java:292)
INFO at java.util.concurrent.CompletableFuture.completeThrowable(CompletableFuture.java:308)
INFO at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:943)
INFO at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:926)
INFO ... 15 more
INFO Caused by: org.apache.flink.shaded.netty4.io.netty.channel.ConnectTimeoutException: connection timed out: shd-hdp-b-slave-017.example.com/some.ip.address:46500
INFO at org.apache.flink.shaded.netty4.io.netty.channel.nio.AbstractNioChannel$AbstractNioUnsafe$1.run(AbstractNioChannel.java:212)
INFO ... 7 more
What changed in flip-6 that might cause this behavior and how can I fix this?
For our jobs on YARN w/Flink 1.6, we had to bump up the web.timeout setting via -yD web.timeout=100000.
In our case, there was a firewall between the machine submitting the job and our Hadoop cluster.
In newer Flink versions (1.7 and up) Flink uses REST to submit jobs. The port number for this REST service is random on YARN setups and could not be set.
Flink 1.8.0 introduced a config option to set this to a port or port range using:
rest.bind-port: 55520-55530
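As a sketch only (standard conventions, not verified against this particular cluster): the option can go into the flink-conf.yaml used when the YARN session or per-job cluster is started, or be passed as a dynamic property, and the chosen range should be one the firewall already allows:
# flink-conf.yaml (Flink 1.8.0 or later)
rest.bind-port: 55520-55530
# or as a dynamic property when starting the session
./bin/yarn-session.sh -Drest.bind-port=55520-55530 ...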

Error starting PIG: ERROR 2998: Unhandled internal error. Found interface jline.Terminal, but class was expected

I downloaded Pig from Apache, installed it, and tried to run it using pig -x local.
This is what I get:
15/12/10 15:06:26 INFO pig.ExecTypeProvider: Trying ExecType : LOCAL
15/12/10 15:06:26 INFO pig.ExecTypeProvider: Trying ExecType : MAPREDUCE
15/12/10 15:06:26 INFO pig.ExecTypeProvider: Picked MAPREDUCE as the ExecType
2015-12-10 15:06:26,063 [main] INFO org.apache.pig.Main - Apache Pig version 0.15.0 (r1682971) compiled Jun 01 2015, 11:44:35
2015-12-10 15:06:26,063 [main] INFO org.apache.pig.Main - Logging error messages to: /usr/local/pig/pig_1449756386061.log
2015-12-10 15:06:26,097 [main] INFO org.apache.pig.impl.util.Utils - Default bootup file /home/ubuntu/.pigbootup not found
2015-12-10 15:06:26,132 [main] ERROR org.apache.pig.Main - ERROR 2998: Unhandled internal error. Found interface jline.Terminal, but class was expected
Details at logfile: /usr/local/pig/pig_1449756386061.log
2015-12-10 15:06:26,157 [main] INFO org.apache.pig.Main - Pig script completed in 206 milliseconds (206 ms)
My log file contains the following:
Error before Pig is launched
----------------------------
ERROR 2998: Unhandled internal error. Found interface jline.Terminal, but class was expected
java.lang.IncompatibleClassChangeError: Found interface jline.Terminal, but class was expected
at jline.ConsoleReader.<init>(ConsoleReader.java:174)
at jline.ConsoleReader.<init>(ConsoleReader.java:169)
at org.apache.pig.Main.run(Main.java:556)
at org.apache.pig.Main.main(Main.java:177)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
================================================================================
After I downloaded and extracted the package, I did the following (pig is in /usr/local/pig):
export JAVA_HOME=/usr/lib/jvm/jdk1.7.0_79
export PIG_PREFIX=/usr/local/pig
export PATH=$PATH:$PIG_PREFIX/bin
Any ideas what is wrong?
Thanks,
Serban
Add this:
export HADOOP_USER_CLASSPATH_FIRST=true
Refer to:
https://issues.apache.org/jira/browse/PIG-3851
hive startup -[ERROR] Terminal initialization failed; falling back to unsupported
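A minimal sketch of how the HADOOP_USER_CLASSPATH_FIRST fix above is typically applied (assuming a bash shell; put the export in ~/.bashrc to make it permanent). As I understand it, the flag makes the jline version shipped with Pig take precedence over the conflicting one on Hadoop's classpath, which is what triggers the IncompatibleClassChangeError:
export HADOOP_USER_CLASSPATH_FIRST=true
pig -x local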

ERROR 1066: Unable to open iterator for alias

I am getting an "Unable to open iterator for alias" error for a simple LOAD and DUMP operation with Pig. I already took a look at the answers at:
ERROR 1066: Unable to open iterator for alias - Pig
But they don't help me.
My environment:
OS: Windows 7
Pig version: 0.13.0
Mode: Local
The error shows that the exception is 'Caused by' a failure to change the permission of a file in the TMP directory. But when I checked the TMP directory, there was no such file (probably it got deleted after the command finished?).
Logs below (with the -v and -w options):
'D:\H\HADOOP-2.6.0\bin\hadoop-config.cmd' is not recognized as an internal or external command,
operable program or batch file.
15/01/24 09:20:22 INFO pig.ExecTypeProvider: Trying ExecType : LOCAL
15/01/24 09:20:22 INFO pig.ExecTypeProvider: Picked LOCAL as the ExecType
2015-01-24 09:20:22,909 [main] INFO org.apache.pig.Main - Apache Pig version 0.13.0 (r1606446) compiled Jun 29 2014, 02:29:34
2015-01-24 09:20:22,909 [main] INFO org.apache.pig.Main - Logging error messages to: d:\Pig\pig-0.13.0\bin\
2015-01-24 09:20:24,267 [main] INFO org.apache.pig.impl.util.Utils - Default bootup file C:\Users\Venkat/.pigbootup not found
2015-01-24 09:20:24,438 [main] INFO org.apache.pig.backend.hadoop.executionengine.HExecutionEngine - Connecting to hadoop file system at: file:///
2015-01-24 09:20:26,205 [main] WARN org.apache.hadoop.util.NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2015-01-24 09:20:26,236 [main] ERROR org.apache.pig.tools.grunt.Grunt - ERROR 1066: Unable to open iterator for alias a
2015-01-24 09:20:26,236 [main] ERROR org.apache.pig.tools.grunt.Grunt - org.apache.pig.impl.logicalLayer.FrontendException: ERROR 1066: Unable to open iterator for alias a
at org.apache.pig.PigServer.openIterator(PigServer.java:912)
at org.apache.pig.tools.grunt.GruntParser.processDump(GruntParser.java:752)
at org.apache.pig.tools.pigscript.parser.PigScriptParser.parse(PigScriptParser.java:372)
at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:228)
at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:203)
at org.apache.pig.tools.grunt.Grunt.exec(Grunt.java:81)
at org.apache.pig.Main.run(Main.java:479)
at org.apache.pig.Main.main(Main.java:156)
Caused by: org.apache.pig.backend.datastorage.DataStorageException: ERROR 0: java.io.IOException: Failed to set permissions of path: \tmp\temp946561981 to 0700
at org.apache.pig.impl.io.FileLocalizer.relativeRoot(FileLocalizer.java:484)
at org.apache.pig.impl.io.FileLocalizer.getTemporaryPath(FileLocalizer.java:515)
at org.apache.pig.impl.io.FileLocalizer.getTemporaryPath(FileLocalizer.java:511)
at org.apache.pig.PigServer.openIterator(PigServer.java:887)
... 7 more
Caused by: java.io.IOException: Failed to set permissions of path: \tmp\temp946561981 to 0700
at org.apache.hadoop.fs.FileUtil.checkReturnValue(FileUtil.java:689)
at org.apache.hadoop.fs.FileUtil.setPermission(FileUtil.java:662)
at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:509)
at org.apache.hadoop.fs.FilterFileSystem.setPermission(FilterFileSystem.java:286)
at org.apache.pig.backend.hadoop.datastorage.HPath.setPermission(HPath.java:122)
at org.apache.pig.impl.io.FileLocalizer.createRelativeRoot(FileLocalizer.java:495)
at org.apache.pig.impl.io.FileLocalizer.relativeRoot(FileLocalizer.java:481)
... 10 more
Details also at logfile: D:\Pig\pig-0.13.0\bin\data-1.txt1422071424329.log
Pig Stack Trace
ERROR 1066: Unable to open iterator for alias a
org.apache.pig.impl.logicalLayer.FrontendException: ERROR 1066: Unable to open iterator for alias a
at org.apache.pig.PigServer.openIterator(PigServer.java:912)
at org.apache.pig.tools.grunt.GruntParser.processDump(GruntParser.java:752)
at org.apache.pig.tools.pigscript.parser.PigScriptParser.parse(PigScriptParser.java:372)
at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:228)
at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:203)
at org.apache.pig.tools.grunt.Grunt.exec(Grunt.java:81)
at org.apache.pig.Main.run(Main.java:479)
at org.apache.pig.Main.main(Main.java:156)
Caused by: org.apache.pig.backend.datastorage.DataStorageException: ERROR 0: java.io.IOException: Failed to set permissions of path: \tmp\temp946561981 to 0700
at org.apache.pig.impl.io.FileLocalizer.relativeRoot(FileLocalizer.java:484)
at org.apache.pig.impl.io.FileLocalizer.getTemporaryPath(FileLocalizer.java:515)
at org.apache.pig.impl.io.FileLocalizer.getTemporaryPath(FileLocalizer.java:511)
at org.apache.pig.PigServer.openIterator(PigServer.java:887)
... 7 more
Caused by: java.io.IOException: Failed to set permissions of path: \tmp\temp946561981 to 0700
at org.apache.hadoop.fs.FileUtil.checkReturnValue(FileUtil.java:689)
at org.apache.hadoop.fs.FileUtil.setPermission(FileUtil.java:662)
at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:509)
at org.apache.hadoop.fs.FilterFileSystem.setPermission(FilterFileSystem.java:286)
at org.apache.pig.backend.hadoop.datastorage.HPath.setPermission(HPath.java:122)
at org.apache.pig.impl.io.FileLocalizer.createRelativeRoot(FileLocalizer.java:495)
at org.apache.pig.impl.io.FileLocalizer.relativeRoot(FileLocalizer.java:481)
... 10 more
Pig file contents:
a = LOAD 's.csv' AS (NAME:chararray,COUNTRY:chararray,YEAR:int,SPORT:chararray,GOLD:int,SILVER:int,BRONZE:int,TOTAL:int);
DUMP a;
Contents of s.csv:
Yang Yilin China 2008 Gymnastics 1 0 2 3
Leisel Jones Australia 2000 Swimming 0 2 0 2
Is there anything wrong with the syntax of the LOAD statement?
Are there any environment variables that need to be set specifically, other than JAVA and JAVA_HOME?
First check your Hadoop permission status; if it is owned by root, change it to your user:
$ sudo chown -R testuser:testuser /(path of hadoop folder)
The permission problem will be solved.
I hope this will work.
Generally we get this error ("ERROR org.apache.pig.tools.grunt.Grunt - ERROR 1066: Unable to open iterator for alias") due to an incorrect path or not being able to access the file path.
Check whether the file location is correct.
Check whether you are able to access the file.
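Purely as an illustration of that advice (the absolute path below is hypothetical; PigStorage's default delimiter is already tab, so spelling it out only makes the assumption explicit):
a = LOAD 'D:/data/s.csv' USING PigStorage('\t') AS (NAME:chararray, COUNTRY:chararray, YEAR:int, SPORT:chararray, GOLD:int, SILVER:int, BRONZE:int, TOTAL:int);
DUMP a;
If the file cannot be found or read at that path, the DUMP typically fails with this same ERROR 1066.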
I got this issue when the Hadoop namenode safemode was on.
So I ran the command below to turn safemode OFF:
sudo -u hdfs hadoop dfsadmin -safemode leave
Then it worked for me.
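For completeness, a small sketch using standard dfsadmin commands (the hdfs user is whatever HDFS superuser your distribution uses) to check the state before leaving safemode:
sudo -u hdfs hadoop dfsadmin -safemode get
sudo -u hdfs hadoop dfsadmin -safemode leave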