Error using Microsoft Access Input to read .mdb file with Spoon - pentaho

I'm trying to read an Access database (.mdb) file with Spoon, using the Microsoft Access Input step from the Design tab, but it is not working.
When I press Get Tables in the Content tab I get this error: Looking for usage map at page 32000, but page type is 0
If I enter a table and press Preview rows I get this error:
2019/08/27 11:35:32 - Spoon - Spoon
2019/08/27 11:52:22 - /Transformación 1 - Dispatching started for transformation [/Transformación 1]
2019/08/27 11:52:22 - Microsoft Access Input.0 - ERROR (version 8.0.0.0-28, build 8.0.0.0-28 from 2017-11-05 07.27.50 by buildguy) : Couldn't open file #1 : file:///home/pentaho/Escritorio/general.mdb --> java.io.IOException: Looking for usage map at page 32000, but page type is 0
2019/08/27 11:52:22 - Microsoft Access Input.0 - Finished processing (I=0, O=0, R=0, W=0, U=0, E=1)
2019/08/27 11:52:22 - /Transformación 1 - Transformation detected one or more steps with errors
2019/08/27 11:52:22 - /Transformación 1 - Transformation is killing the other steps!
Thank you.
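The "Looking for usage map at page 32000, but page type is 0" message comes from Jackcess, the Java library the Microsoft Access Input step uses to read .mdb files, and it is often reported for corrupted files or Access formats the library can't handle. One way to narrow it down is to open the file with Jackcess directly, outside Spoon. A minimal sketch (assuming the Jackcess jar from Spoon's lib folder is on the classpath; the file path is the one from the log):

import java.io.File;
import com.healthmarketscience.jackcess.Database;
import com.healthmarketscience.jackcess.DatabaseBuilder;

public class MdbCheck {
    public static void main(String[] args) throws Exception {
        // open the same file Spoon fails on, read-only
        try (Database db = new DatabaseBuilder(new File("/home/pentaho/Escritorio/general.mdb"))
                .setReadOnly(true)
                .open()) {
            System.out.println("File format: " + db.getFileFormat());
            System.out.println("Tables: " + db.getTableNames());
        }
    }
}

If this fails with the same IOException, the problem is in the file itself (or its format), not in the Spoon step.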

Related

IntelliJ inspection syntax highlighting not updating after being fixed

I recently updated IntelliJ from a 2018 version to the current version, and now the inspection syntax highlighting is wonky.
As I start to type something, for instance an assignment, I will get an inspection warning at the beginning.
field.value =
results in Expression statement is not an assignment or call, which is obvious, because I haven't finished typing the assignment.
When I finish typing the assignment, however, the inspection doesn't recognize that this is now a valid assignment, and the highlighting doesn't go away.
If I simply highlight the inspected part (field.value) and then cut it away and immediately paste it back, IntelliJ recognizes it as correctly written and the syntax highlighting goes away.
The above example is in JavaScript, but this happens with everything, everywhere. There is constantly incorrect, outdated syntax highlighting throughout my entire project that needs to be cut and pasted back in order to appear correct.
Here is another example of the inspection triggering on a "misspelled" word inside a string as I type it:
The word was finished, but the syntax highlighting still shows an error at the beginning of the word.
This was not the case with the older version of IntelliJ that I was previously using.
I have tried "Invalidate Caches / Restart", which gets rid of the current errors in syntax highlighting but doesn't prevent new ones from appearing.
What could be the problem here? How can I force IntelliJ to automatically reevaluate its syntax highlighting? How can I fix this?
Update
Based on a request in the comments, here are the logs from idea.log:
2020-10-19 12:39:15,589 [12546211] ERROR - aemon.impl.PassExecutorService - IntelliJ IDEA 2020.2.3 Build #IU-202.7660.26
2020-10-19 12:39:15,589 [12546211] ERROR - aemon.impl.PassExecutorService - JDK: 11.0.8; VM: OpenJDK 64-Bit Server VM; Vendor: JetBrains s.r.o.
2020-10-19 12:39:15,590 [12546212] ERROR - aemon.impl.PassExecutorService - OS: Windows 10
2020-10-19 12:39:15,591 [12546213] ERROR - aemon.impl.PassExecutorService - Last Action: EditorUp
2020-10-19 12:39:15,591 [12546213] ERROR - aemon.impl.PassExecutorService - Invalid range specified: (5, 1);
java.lang.IllegalArgumentException: Invalid range specified: (5, 1);
at com.intellij.openapi.util.TextRange.assertProperRange(TextRange.java:221)
at com.intellij.openapi.util.TextRange.assertProperRange(TextRange.java:216)
at com.intellij.openapi.util.TextRange.assertProperRange(TextRange.java:212)
at com.intellij.openapi.util.TextRange.<init>(TextRange.java:40)
at com.intellij.openapi.util.TextRange.<init>(TextRange.java:29)
at com.intellij.psi.util.PartiallyKnownString$mapRangeToHostRange$1.invoke(PartiallyKnownString.kt:189)
at com.intellij.psi.util.PartiallyKnownString.mapRangeToHostRange(PartiallyKnownString.kt:213)
at com.intellij.psi.util.PartiallyKnownString$split$2$2.invoke(PartiallyKnownString.kt:130)
at com.intellij.psi.util.PartiallyKnownString$split$2.invoke(PartiallyKnownString.kt:156)
at com.intellij.psi.util.PartiallyKnownString.split(PartiallyKnownString.kt:166)
at com.intellij.microservices.url.references.UrlPksParser.splitUrlPath(UrlPksParser.kt:129)
at com.intellij.microservices.url.references.UrlPksParser.parseFullUrl(UrlPksParser.kt:62)
at com.intellij.microservices.url.references.UrlPathReferenceInjector.buildFullUrlReference(UrlPathReferenceInjector.kt:67)
at com.intellij.javascript.microservices.JSUrlPathReferenceProvider.getReferencesByElement(JSUrlPathReferenceProvider.kt:17)
at com.intellij.psi.impl.source.resolve.reference.ReferenceProvidersRegistryImpl.getReferences(ReferenceProvidersRegistryImpl.java:202)
at com.intellij.psi.impl.source.resolve.reference.ReferenceProvidersRegistryImpl.mapNotEmptyReferencesFromProviders(ReferenceProvidersRegistryImpl.java:165)
at com.intellij.psi.impl.source.resolve.reference.ReferenceProvidersRegistryImpl.doGetReferencesFromProviders(ReferenceProvidersRegistryImpl.java:145)
at com.intellij.psi.impl.source.resolve.reference.ReferenceProvidersRegistry.lambda$getReferencesFromProviders$0(ReferenceProvidersRegistry.java:40)
at com.intellij.psi.util.CachedValuesManager$1.compute(CachedValuesManager.java:153)
at com.intellij.psi.impl.PsiCachedValueImpl.doCompute(PsiCachedValueImpl.java:54)
at com.intellij.util.CachedValueBase.lambda$getValueWithLock$1(CachedValueBase.java:235)
at com.intellij.openapi.util.RecursionManager$1.doPreventingRecursion(RecursionManager.java:112)
at com.intellij.openapi.util.RecursionManager.doPreventingRecursion(RecursionManager.java:71)
at com.intellij.util.CachedValueBase.getValueWithLock(CachedValueBase.java:236)
at com.intellij.psi.impl.PsiCachedValueImpl.getValue(PsiCachedValueImpl.java:43)
at com.intellij.util.CachedValuesManagerImpl.getCachedValue(CachedValuesManagerImpl.java:76)
at com.intellij.psi.util.CachedValuesManager.getCachedValue(CachedValuesManager.java:150)
at com.intellij.psi.util.CachedValuesManager.getCachedValue(CachedValuesManager.java:120)
at com.intellij.psi.impl.source.resolve.reference.ReferenceProvidersRegistry.getReferencesFromProviders(ReferenceProvidersRegistry.java:39)
at com.intellij.psi.impl.source.resolve.reference.ReferenceProvidersRegistry.getReferencesFromProviders(ReferenceProvidersRegistry.java:32)
at com.intellij.lang.javascript.psi.impl.JSLiteralExpressionImpl.createRefs(JSLiteralExpressionImpl.java:103)
at com.intellij.lang.javascript.psi.impl.JSLiteralExpressionImpl.lambda$static$0(JSLiteralExpressionImpl.java:78)
at com.intellij.psi.impl.PsiParameterizedCachedValue.doCompute(PsiParameterizedCachedValue.java:46)
at com.intellij.util.CachedValueBase.lambda$getValueWithLock$1(CachedValueBase.java:235)
at com.intellij.openapi.util.RecursionManager$1.doPreventingRecursion(RecursionManager.java:112)
at com.intellij.openapi.util.RecursionManager.doPreventingRecursion(RecursionManager.java:71)
at com.intellij.util.CachedValueBase.getValueWithLock(CachedValueBase.java:236)
at com.intellij.psi.impl.PsiParameterizedCachedValue.getValue(PsiParameterizedCachedValue.java:35)
at com.intellij.psi.util.CachedValuesManager.getParameterizedCachedValue(CachedValuesManager.java:81)
at com.intellij.lang.javascript.psi.impl.JSLiteralExpressionImpl.getReferences(JSLiteralExpressionImpl.java:88)
at com.jetbrains.nodejs.codeInsight.NodeCoreCodingAssistanceInspection.handleFileReferenceHolder(NodeCoreCodingAssistanceInspection.java:50)
at com.jetbrains.nodejs.codeInsight.NodeCoreCodingAssistanceInspection.access$000(NodeCoreCodingAssistanceInspection.java:24)
at com.jetbrains.nodejs.codeInsight.NodeCoreCodingAssistanceInspection$1.visitJSLiteralExpression(NodeCoreCodingAssistanceInspection.java:39)
at com.intellij.lang.javascript.psi.impl.JSLiteralExpressionImpl.accept(JSLiteralExpressionImpl.java:128)
at com.intellij.codeInspection.InspectionEngine.acceptElements(InspectionEngine.java:65)
at com.intellij.codeInspection.InspectionEngine.createVisitorAndAcceptElements(InspectionEngine.java:56)
at com.intellij.codeInsight.daemon.impl.LocalInspectionsPass.runToolOnElements(LocalInspectionsPass.java:296)
at com.intellij.codeInsight.daemon.impl.LocalInspectionsPass.lambda$visitPriorityElementsAndInit$3(LocalInspectionsPass.java:265)
at com.intellij.util.AstLoadingFilter.forceAllowTreeLoading(AstLoadingFilter.java:155)
at com.intellij.util.AstLoadingFilter.forceAllowTreeLoading(AstLoadingFilter.java:147)
at com.intellij.codeInsight.daemon.impl.LocalInspectionsPass.lambda$visitPriorityElementsAndInit$4(LocalInspectionsPass.java:264)
at com.intellij.util.AstLoadingFilter.disallowTreeLoading(AstLoadingFilter.java:126)
at com.intellij.util.AstLoadingFilter.disallowTreeLoading(AstLoadingFilter.java:115)
at com.intellij.codeInsight.daemon.impl.LocalInspectionsPass.lambda$visitPriorityElementsAndInit$5(LocalInspectionsPass.java:264)
at com.intellij.concurrency.ApplierCompleter.execAndForkSubTasks(ApplierCompleter.java:149)
at com.intellij.concurrency.ApplierCompleter.execAndForkSubTasks(ApplierCompleter.java:162)
at com.intellij.concurrency.ApplierCompleter.execAndForkSubTasks(ApplierCompleter.java:162)
at com.intellij.concurrency.ApplierCompleter.execAndForkSubTasks(ApplierCompleter.java:162)
at com.intellij.openapi.application.impl.ApplicationImpl.tryRunReadAction(ApplicationImpl.java:1110)
at com.intellij.concurrency.ApplierCompleter.lambda$wrapInReadActionAndIndicator$1(ApplierCompleter.java:105)
at com.intellij.openapi.progress.impl.CoreProgressManager.registerIndicatorAndRun(CoreProgressManager.java:629)
at com.intellij.openapi.progress.impl.CoreProgressManager.executeProcessUnderProgress(CoreProgressManager.java:581)
at com.intellij.openapi.progress.impl.ProgressManagerImpl.executeProcessUnderProgress(ProgressManagerImpl.java:60)
at com.intellij.concurrency.ApplierCompleter.wrapInReadActionAndIndicator(ApplierCompleter.java:117)
at com.intellij.concurrency.ApplierCompleter.lambda$compute$0(ApplierCompleter.java:96)
at com.intellij.openapi.application.impl.ReadMostlyRWLock.executeByImpatientReader(ReadMostlyRWLock.java:170)
at com.intellij.openapi.application.impl.ApplicationImpl.executeByImpatientReader(ApplicationImpl.java:182)
at com.intellij.concurrency.ApplierCompleter.compute(ApplierCompleter.java:96)
at java.base/java.util.concurrent.CountedCompleter.exec(CountedCompleter.java:746)
at java.base/java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:290)
at java.base/java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1020)
at java.base/java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1656)
at java.base/java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1594)
at java.base/java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:183)
2020-10-19 12:39:15,592 [12546214] ERROR - aemon.impl.PassExecutorService - IntelliJ IDEA 2020.2.3 Build #IU-202.7660.26
2020-10-19 12:39:15,592 [12546214] ERROR - aemon.impl.PassExecutorService - JDK: 11.0.8; VM: OpenJDK 64-Bit Server VM; Vendor: JetBrains s.r.o.
2020-10-19 12:39:15,592 [12546214] ERROR - aemon.impl.PassExecutorService - OS: Windows 10
2020-10-19 12:39:15,592 [12546214] ERROR - aemon.impl.PassExecutorService - Last Action: EditorUp
2020-10-19 12:39:19,268 [12549890] INFO - re.component.SaveActionManager - [+] Start SaveActionManager#beforeAllDocumentsSaving
2020-10-19 12:39:19,268 [12549890] INFO - re.component.SaveActionManager - Locating psi files for 0 documents: []
2020-10-19 12:39:19,268 [12549890] INFO - re.component.SaveActionManager - End SaveActionManager#beforeAllDocumentsSaving
2020-10-19 12:39:19,809 [12550431] INFO - rationStore.ComponentStoreImpl - Saving Project(name=parent, containerState=ACTIVE, componentStore=C:\Projects\xxxxxxx)RunManager took 15 ms
2020-10-19 12:39:22,061 [12552683] INFO - .clearcut.ClearcutLogScheduler - Flushing logs.
2020-10-19 12:39:22,061 [12552683] INFO - .clearcut.ClearcutLogScheduler - Scheduling next log flush in 60000 ms.
2020-10-19 12:40:00,335 [12590957] INFO - j.ide.actions.RevealFileAction - Exit code 1
2020-10-19 12:40:00,608 [12591230] INFO - re.component.SaveActionManager - [+] Start SaveActionManager#beforeAllDocumentsSaving
2020-10-19 12:40:00,608 [12591230] INFO - re.component.SaveActionManager - Locating psi files for 0 documents: []
2020-10-19 12:40:00,608 [12591230] INFO - re.component.SaveActionManager - End SaveActionManager#beforeAllDocumentsSaving
2020-10-19 12:40:01,085 [12591707] INFO - rationStore.ComponentStoreImpl - Saving Project(name=parent, containerState=ACTIVE, componentStore=C:\Projects\xxxxxxx)ArquillianContainersManager took 29 ms, RunManager took 12 ms
That leads me to believe there's a problem with the SaveActionManager... Investigating further.
Update 2:
Removed the SaveActionManager plugin and restarted, which did not fix things; however, it led to a more direct error that is specific to the file I'm currently editing and deals directly with syntax highlighting:
2020-10-19 12:50:45,329 [ 122921] INFO - ramework.JobViewEditorProvider - start JobViewEditorProvider
2020-10-19 12:50:49,237 [ 126829] ERROR - n.impl.GeneralHighlightingPass - In file: file://C:/Projects/xxxxxx/automation-test/cypress/integration/050-environments/140-xxxxxx.spec.js
java.lang.IllegalArgumentException: Invalid range specified: (5, 1);
at com.intellij.openapi.util.TextRange.assertProperRange(TextRange.java:221)
at com.intellij.openapi.util.TextRange.assertProperRange(TextRange.java:216)
at com.intellij.openapi.util.TextRange.assertProperRange(TextRange.java:212)
at com.intellij.openapi.util.TextRange.<init>(TextRange.java:40)
at com.intellij.openapi.util.TextRange.<init>(TextRange.java:29)
at com.intellij.psi.util.PartiallyKnownString$mapRangeToHostRange$1.invoke(PartiallyKnownString.kt:189)
at com.intellij.psi.util.PartiallyKnownString.mapRangeToHostRange(PartiallyKnownString.kt:213)
at com.intellij.psi.util.PartiallyKnownString$split$2$2.invoke(PartiallyKnownString.kt:130)
at com.intellij.psi.util.PartiallyKnownString$split$2.invoke(PartiallyKnownString.kt:156)
at com.intellij.psi.util.PartiallyKnownString.split(PartiallyKnownString.kt:166)
at com.intellij.microservices.url.references.UrlPksParser.splitUrlPath(UrlPksParser.kt:129)
at com.intellij.microservices.url.references.UrlPksParser.parseFullUrl(UrlPksParser.kt:62)
at com.intellij.microservices.url.references.UrlPathReferenceInjector.buildFullUrlReference(UrlPathReferenceInjector.kt:67)
at com.intellij.javascript.microservices.JSUrlPathReferenceProvider.getReferencesByElement(JSUrlPathReferenceProvider.kt:17)
at com.intellij.psi.impl.source.resolve.reference.ReferenceProvidersRegistryImpl.getReferences(ReferenceProvidersRegistryImpl.java:202)
at com.intellij.psi.impl.source.resolve.reference.ReferenceProvidersRegistryImpl.mapNotEmptyReferencesFromProviders(ReferenceProvidersRegistryImpl.java:165)
at com.intellij.psi.impl.source.resolve.reference.ReferenceProvidersRegistryImpl.doGetReferencesFromProviders(ReferenceProvidersRegistryImpl.java:145)
at com.intellij.psi.impl.source.resolve.reference.ReferenceProvidersRegistry.lambda$getReferencesFromProviders$0(ReferenceProvidersRegistry.java:40)
at com.intellij.psi.util.CachedValuesManager$1.compute(CachedValuesManager.java:153)
at com.intellij.psi.impl.PsiCachedValueImpl.doCompute(PsiCachedValueImpl.java:54)
at com.intellij.util.CachedValueBase.lambda$getValueWithLock$1(CachedValueBase.java:235)
at com.intellij.openapi.util.RecursionManager$1.doPreventingRecursion(RecursionManager.java:112)
at com.intellij.openapi.util.RecursionManager.doPreventingRecursion(RecursionManager.java:71)
at com.intellij.util.CachedValueBase.getValueWithLock(CachedValueBase.java:236)
at com.intellij.psi.impl.PsiCachedValueImpl.getValue(PsiCachedValueImpl.java:43)
at com.intellij.util.CachedValuesManagerImpl.getCachedValue(CachedValuesManagerImpl.java:76)
at com.intellij.psi.util.CachedValuesManager.getCachedValue(CachedValuesManager.java:150)
at com.intellij.psi.util.CachedValuesManager.getCachedValue(CachedValuesManager.java:120)
at com.intellij.psi.impl.source.resolve.reference.ReferenceProvidersRegistry.getReferencesFromProviders(ReferenceProvidersRegistry.java:39)
at com.intellij.psi.impl.source.resolve.reference.ReferenceProvidersRegistry.getReferencesFromProviders(ReferenceProvidersRegistry.java:32)
at com.intellij.lang.javascript.psi.impl.JSLiteralExpressionImpl.createRefs(JSLiteralExpressionImpl.java:103)
at com.intellij.lang.javascript.psi.impl.JSLiteralExpressionImpl.lambda$static$0(JSLiteralExpressionImpl.java:78)
at com.intellij.psi.impl.PsiParameterizedCachedValue.doCompute(PsiParameterizedCachedValue.java:46)
at com.intellij.util.CachedValueBase.lambda$getValueWithLock$1(CachedValueBase.java:235)
at com.intellij.openapi.util.RecursionManager$1.doPreventingRecursion(RecursionManager.java:112)
at com.intellij.openapi.util.RecursionManager.doPreventingRecursion(RecursionManager.java:71)
at com.intellij.util.CachedValueBase.getValueWithLock(CachedValueBase.java:236)
at com.intellij.psi.impl.PsiParameterizedCachedValue.getValue(PsiParameterizedCachedValue.java:35)
at com.intellij.psi.util.CachedValuesManager.getParameterizedCachedValue(CachedValuesManager.java:81)
at com.intellij.lang.javascript.psi.impl.JSLiteralExpressionImpl.getReferences(JSLiteralExpressionImpl.java:88)
at com.intellij.codeInsight.highlighting.HyperlinkAnnotator.annotate(HyperlinkAnnotator.java:44)
at com.intellij.codeInsight.daemon.impl.DefaultHighlightVisitor.runAnnotators(DefaultHighlightVisitor.java:136)
at com.intellij.codeInsight.daemon.impl.DefaultHighlightVisitor.visit(DefaultHighlightVisitor.java:116)
at com.intellij.codeInsight.daemon.impl.GeneralHighlightingPass.runVisitors(GeneralHighlightingPass.java:338)
at com.intellij.codeInsight.daemon.impl.GeneralHighlightingPass.lambda$collectHighlights$5(GeneralHighlightingPass.java:277)
at com.intellij.codeInsight.daemon.impl.GeneralHighlightingPass.analyzeByVisitors(GeneralHighlightingPass.java:297)
at com.intellij.codeInsight.daemon.impl.GeneralHighlightingPass.lambda$analyzeByVisitors$6(GeneralHighlightingPass.java:300)
at com.intellij.codeInsight.daemon.impl.analysis.XmlHighlightVisitor.analyze(XmlHighlightVisitor.java:598)
at com.intellij.codeInsight.daemon.impl.GeneralHighlightingPass.analyzeByVisitors(GeneralHighlightingPass.java:300)
at com.intellij.codeInsight.daemon.impl.GeneralHighlightingPass.lambda$analyzeByVisitors$6(GeneralHighlightingPass.java:300)
at com.intellij.codeInsight.daemon.impl.DefaultHighlightVisitor.analyze(DefaultHighlightVisitor.java:96)
at com.intellij.codeInsight.daemon.impl.GeneralHighlightingPass.analyzeByVisitors(GeneralHighlightingPass.java:300)
at com.intellij.codeInsight.daemon.impl.GeneralHighlightingPass.collectHighlights(GeneralHighlightingPass.java:268)
at com.intellij.codeInsight.daemon.impl.GeneralHighlightingPass.collectInformationWithProgress(GeneralHighlightingPass.java:214)
at com.intellij.codeInsight.daemon.impl.ProgressableTextEditorHighlightingPass.doCollectInformation(ProgressableTextEditorHighlightingPass.java:80)
at com.intellij.codeHighlighting.TextEditorHighlightingPass.collectInformation(TextEditorHighlightingPass.java:54)
at com.intellij.codeInsight.daemon.impl.PassExecutorService$ScheduledPass.lambda$doRun$1(PassExecutorService.java:399)
at com.intellij.openapi.application.impl.ApplicationImpl.tryRunReadAction(ApplicationImpl.java:1110)
at com.intellij.codeInsight.daemon.impl.PassExecutorService$ScheduledPass.lambda$doRun$2(PassExecutorService.java:392)
at com.intellij.openapi.progress.impl.CoreProgressManager.registerIndicatorAndRun(CoreProgressManager.java:629)
at com.intellij.openapi.progress.impl.CoreProgressManager.executeProcessUnderProgress(CoreProgressManager.java:581)
at com.intellij.openapi.progress.impl.ProgressManagerImpl.executeProcessUnderProgress(ProgressManagerImpl.java:60)
at com.intellij.codeInsight.daemon.impl.PassExecutorService$ScheduledPass.doRun(PassExecutorService.java:391)
at com.intellij.codeInsight.daemon.impl.PassExecutorService$ScheduledPass.lambda$run$0(PassExecutorService.java:367)
at com.intellij.openapi.application.impl.ReadMostlyRWLock.executeByImpatientReader(ReadMostlyRWLock.java:170)
at com.intellij.openapi.application.impl.ApplicationImpl.executeByImpatientReader(ApplicationImpl.java:182)
at com.intellij.codeInsight.daemon.impl.PassExecutorService$ScheduledPass.run(PassExecutorService.java:365)
at com.intellij.concurrency.JobLauncherImpl$VoidForkJoinTask$1.exec(JobLauncherImpl.java:181)
at java.base/java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:290)
at java.base/java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1020)
at java.base/java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1656)
at java.base/java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1594)
at java.base/java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:183)
2020-10-19 12:50:49,242 [ 126834] ERROR - n.impl.GeneralHighlightingPass - IntelliJ IDEA 2020.2.3 Build #IU-202.7660.26
2020-10-19 12:50:49,242 [ 126834] ERROR - n.impl.GeneralHighlightingPass - JDK: 11.0.8; VM: OpenJDK 64-Bit Server VM; Vendor: JetBrains s.r.o.
2020-10-19 12:50:49,243 [ 126835] ERROR - n.impl.GeneralHighlightingPass - OS: Windows 10
2020-10-19 12:50:49,252 [ 126844] ERROR - n.impl.GeneralHighlightingPass - Last Action: EditorEnter
2020-10-19 12:50:51,908 [ 129500] INFO - dea.updater.SdkComponentSource - File C:\Users\xxxxx\.android\repositories.cfg could not be loaded.
Update 3:
Based on a suggestion from the comments, I opened a bug ticket: https://youtrack.jetbrains.com/issue/IDEA-253270
This is an IntelliJ bug.
The advice from Bas Leijdekkers guided me toward creating a bug ticket, which was marked as a duplicate of a bug that is being fixed in the next release. Thanks for the assistance, Bas.

Avro : java.lang.RuntimeException: Unsupported type in record

Input: test.csv
100
101
102
Pig script:
-- REGISTER the required piggybank and Avro jars here
A = LOAD 'test.csv' USING org.apache.pig.piggybank.storage.CSVExcelStorage() AS (code:chararray);
STORE A INTO 'test' USING org.apache.pig.piggybank.storage.avro.AvroStorage
    ('schema',
     '{"namespace":"com.pig.test.avro","type":"record","name":"Avro_Test","doc":"Avro Test Schema",
       "fields":[
         {"name":"code","type":["string","null"],"default":null}
       ]}'
    );
I get a runtime error during the STORE. Any input on how to resolve this?
Error Log :
ERROR org.apache.pig.tools.pigstats.SimplePigStats - ERROR 2997: Unable to recreate exception from backed error: org.apache.avro.file.DataFileWriter$AppendWriteException: java.lang.RuntimeException: Unsupported type in record:class java.lang.String
at org.apache.avro.file.DataFileWriter.append(DataFileWriter.java:263)
at org.apache.pig.piggybank.storage.avro.PigAvroRecordWriter.write(PigAvroRecordWriter.java:49)
at org.apache.pig.piggybank.storage.avro.AvroStorage.putNext(AvroStorage.java:722)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigOutputFormat$PigRecordWriter.write(PigOutputFormat.java:139)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigOutputFormat$PigRecordWriter.write(PigOutputFormat.java:98)
at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:558)
at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:85)
at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:106)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigMapOnly$Map.collect(PigMap
2015-06-02 23:06:03,934 [main] ERROR org.apache.pig.tools.pigstats.PigStatsUtil - 1 map reduce job(s) failed!
2015-06-02 23:06:03,934 [main] INFO org.apache.pig.tools.pigstats.SimplePigStats - Script Statistics:
Looks like this is a bug: https://issues.apache.org/jira/browse/PIG-3358
If you can, try updating to Pig 0.14; according to the comments, this has been fixed there.
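If upgrading isn't possible right away, one workaround sometimes suggested for this error is to drop the ["string","null"] union from the output schema, since the union branch appears to be what the older piggybank writer trips over. A sketch only (untested here, and it loses the nullable default):

STORE A INTO 'test' USING org.apache.pig.piggybank.storage.avro.AvroStorage
    ('schema',
     '{"namespace":"com.pig.test.avro","type":"record","name":"Avro_Test",
       "fields":[ {"name":"code","type":"string"} ]}'
    );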

Failed to load data from S3

I launched two m1.medium nodes on Amazon EC2 to execute my Pig script, but it looks like it failed at the first line (even before MapReduce started):
raw = LOAD 's3n://uw-cse-344-oregon.aws.amazon.com/btc-2010-chunk-000' USING TextLoader as (line:chararray);
The error message I got:
2015-02-04 02:15:39,804 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - 1 map-reduce job(s) waiting for submission.
2015-02-04 02:15:39,821 [JobControl] INFO org.apache.hadoop.mapred.JobClient - Default number of map tasks: null
2015-02-04 02:15:39,822 [JobControl] INFO org.apache.hadoop.mapred.JobClient - Setting default number of map tasks based on cluster size to : 20
... (omitted)
2015-02-04 02:18:40,955 [main] WARN org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - Ooops! Some job has failed! Specify -stop_on_failure if you want Pig to stop immediately on failure.
2015-02-04 02:18:40,956 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - job job_201502040202_0002 has failed! Stop running all dependent jobs
2015-02-04 02:18:40,956 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - 100% complete
2015-02-04 02:18:40,997 [main] ERROR org.apache.pig.tools.pigstats.SimplePigStats - ERROR 2997: Unable to recreate exception from backed error: Error: Java heap space
2015-02-04 02:18:40,997 [main] ERROR org.apache.pig.tools.pigstats.PigStatsUtil - 1 map reduce job(s) failed!
2015-02-04 02:18:40,997 [main] INFO org.apache.pig.tools.pigstats.SimplePigStats - Script Statistics:
HadoopVersion PigVersion UserId StartedAt FinishedAt Features
1.0.3 0.11.1.1-amzn hadoop 2015-02-04 02:15:32 2015-02-04 02:18:40 GROUP_BY
Failed!
Failed Jobs:
JobId Alias Feature Message Outputs
job_201502050202_0002 ngroup,raw,triples,tt GROUP_BY,COMBINER Message: Job failed! Error - # of failed Map Tasks exceeded allowed limit. FailedCount: 1. LastFailedTask: task_201502050202_0002_m_000022
Input(s):
Failed to read data from "s3n://uw-cse-344-oregon.aws.amazon.com/btc-2010-chunk-000"
Output(s):
Counters:
Total records written : 0
Total bytes written : 0
Spillable Memory Manager spill count : 0
Total bags proactively spilled: 0
Total records proactively spilled: 0
I think the code should be fine, since I have successfully loaded other data with the same syntax before, and the link to s3n://uw-cse-344-oregon.aws.amazon.com/btc-2010-chunk-000 looks valid. I suspect it might be related to some of my EC2 settings, but I am not sure how to investigate further or narrow down the problem. Does anyone have a clue?
"Java heap space" error message gives some clues. Your files seem to be quite large (~2GB). Make sure that you have enough memory for each task runner to read the data.
The problem was solved by changing my nodes from m1.medium to m3.large; thanks for the good hint from Nat, who pointed out the error message regarding Java heap space. I'll update with more details later.

Kettle '?' not working Table Input Step

I want to get all the table names from the database and then get all the rows from the tables. So I created a transformation like this:
Get Table Names: Added the database connection and stored the table name in an output field called "tablename".
Table Input: Marked "Replace variables in script" and "Execute for each row". Added the first step in "Insert data from step". SQL is "SELECT * from ?".
I have read a lot of tutorials online, including the documentation.
My problem is that everywhere it says my "?" should be replaced with the parameter, but this does not happen. Here are the logs:
2013/06/22 03:33:25 - Get table names.0 - Starting to run...
2013/06/22 03:33:25 - Postgres 9.1.9 RO - read :9 table names from db meta-data.
2013/06/22 03:33:25 - Table input.0 - Query parameters found = [stackexchange2]
2013/06/22 03:33:25 - Table input.0 - SQL query : SELECT * from ?
2013/06/22 03:33:25 - Table input.0 - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : Unexpected error
2013/06/22 03:33:25 - Table input.0 - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : org.pentaho.di.core.exception.KettleDatabaseException:
2013/06/22 03:33:25 - Table input.0 - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : An error occurred executing SQL:
2013/06/22 03:33:25 - Table input.0 - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : SELECT * from ?
2013/06/22 03:33:25 - Table input.0 - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : ERROR: syntax error at or near "$1"
Position: 16
2013/06/22 03:33:25 - Table input.0 - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) :
I am using Kettle 4.4. I downloaded the Spoon client from here.
UPDATE
I just want to make this work. I am learning the tool right now, and it would be good to know how '?' works.
To solve your situation, I prefer to work with jobs. Please look at {kettle_installation_folder_path}/examples/jobs/process all tables/Process all tables.kjb, because your case is a simplification of that example.
Actually, you can do this, and you absolutely would do this. Take a look at "Metadata Injection".
Also, you can't parameterise a table name; JDBC doesn't allow this. However, you can get away with using the SQL step rather than Table Input and generating the whole SQL string in a field, if you really want to. I wouldn't really recommend that, though.
As said above, please describe further exactly what you're trying to do. If you're just migrating a database without doing any transformation of the data, i.e. from one db to another, then don't bother with Kettle, as that's not what it's for.
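A note on why the ? can't work here: Table Input hands the SQL to the database as a JDBC prepared statement and binds the incoming field to the ?, and JDBC bind parameters can carry values only, never identifiers such as table names. A rough Java sketch of the distinction (hypothetical connection URL and table):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class JdbcParamDemo {
    public static void main(String[] args) throws Exception {
        try (Connection con = DriverManager.getConnection("jdbc:postgresql://localhost/db", "user", "pass");
             // fine: '?' binds a value
             PreparedStatement ok = con.prepareStatement("SELECT * FROM users WHERE name = ?")) {
            ok.setString(1, "alice");
            ok.executeQuery();
        }
        // what the transformation effectively attempts: a table name as a bind
        // parameter, which PostgreSQL rejects with: syntax error at or near "$1"
        // con.prepareStatement("SELECT * FROM ?");
    }
}

If the goal is just SELECT * from every table, the usual Kettle pattern is variable substitution instead of a ? parameter: set a variable from the table name in an earlier transformation of the job (the variable name below is hypothetical), tick "Replace variables in script", and write the Table Input SQL as:

-- ${TABLENAME} is substituted as plain text before the statement is prepared
SELECT * FROM ${TABLENAME}

A variable set in one transformation is only visible to later transformations in the job, not within the same one, which is why the example mentioned above is a job rather than a single transformation.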

Pig ORDER command fails

I am trying to analyze an Apache log; the goal is to find all user agents and their usage percentage. The following program works fine up to the line where result contains each user agent, its count, and its percentage. The program fails at the last line, when it tries to order by most used. Could someone help?
logs = LOAD '$LOGS' USING ApacheCombinedLogLoader AS (remoteHost, hyphen, user, time, method, uri, protocol, statusCode, responseSize, referer, userAgent);
uarows = FOREACH logs GENERATE userAgent;
total = FOREACH (GROUP uarows ALL) GENERATE COUNT(uarows) as count;
dump total;
gpuarows = GROUP uarows BY userAgent;
result = FOREACH gpuarows {
    subtotal = COUNT(uarows);
    GENERATE flatten(group) as ua, subtotal AS SUB_TOTAL, 100*(double)subtotal/(double)total.count AS percentage;
};
orderresult = ORDER result BY SUB_TOTAL DESC;
dump orderresult;
What's weird is that 'dump result' works just fine, so it's the ORDER line that makes trouble.
Errors:
2013-04-13 11:33:09,976 [Thread-48] INFO org.apache.hadoop.mapred.MapTask - data buffer = 79691776/99614720
2013-04-13 11:33:09,976 [Thread-48] INFO org.apache.hadoop.mapred.MapTask - record buffer = 262144/327680
2013-04-13 11:33:09,995 [Thread-48] WARN org.apache.hadoop.mapred.LocalJobRunner - job_local_0005
java.lang.RuntimeException: org.apache.hadoop.mapreduce.lib.input.InvalidInputException: Input path does not exist: file:/home/dliu/ApacheLogAnalysisWithPig/pigsample_1573648613_1365823989735
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.partitioners.WeightedRangePartitioner.setConf(WeightedRangePartitioner.java:157)
at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:62)
at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:117)
at org.apache.hadoop.mapred.MapTask$NewOutputCollector.<init>(MapTask.java:677)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:756)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:212)
Caused by: org.apache.hadoop.mapreduce.lib.input.InvalidInputException: Input path does not exist: file:/home/dliu/ApacheLogAnalysisWithPig/pigsample_1573648613_1365823989735
at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.listStatus(FileInputFormat.java:235)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigFileInputFormat.listStatus(PigFileInputFormat.java:37)
at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.getSplits(FileInputFormat.java:252)
at org.apache.pig.impl.io.ReadToEndLoader.init(ReadToEndLoader.java:177)
at org.apache.pig.impl.io.ReadToEndLoader.<init>(ReadToEndLoader.java:124)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.partitioners.WeightedRangePartitioner.setConf(WeightedRangePartitioner.java:131)
... 6 more
2013-04-13 11:33:10,276 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - HadoopJobId: job_local_0005
2013-04-13 11:33:10,276 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - Processing aliases orderresult
2013-04-13 11:33:10,276 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - detailed locations: M: orderresult[16,14] C: R:
2013-04-13 11:33:15,286 [main] WARN org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - Ooops! Some job has failed! Specify -stop_on_failure if you want Pig to stop immediately on failure.
2013-04-13 11:33:15,286 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - job job_local_0005 has failed! Stop running all dependent jobs
2013-04-13 11:33:15,287 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - 100% complete
2013-04-13 11:33:15,287 [main] ERROR org.apache.pig.tools.pigstats.PigStatsUtil - 1 map reduce job(s) failed!
2013-04-13 11:33:15,288 [main] INFO org.apache.pig.tools.pigstats.SimplePigStats - Script Statistics:
HadoopVersion PigVersion UserId StartedAt FinishedAt Features
1.0.4 0.11.0 dliu 2013-04-13 11:32:27 2013-04-13 11:33:15 GROUP_BY,ORDER_BY
Some jobs have failed! Stop running all dependent jobs
Job Stats (time in seconds):
JobId Maps Reduces MaxMapTime MinMapTIme AvgMapTime MedianMapTime MaxReduceTime MinReduceTime AvgReduceTime MedianReducetime Alias Feature Outputs
job_local_0002 1 1 n/a n/a n/a n/a n/a n/a 1-18,logs,total,uarows MULTI_QUERY,COMBINER
job_local_0003 1 1 n/a n/a n/a n/a n/a n/a gpuarows,result GROUP_BY,COMBINER
job_local_0004 1 1 n/a n/a n/a n/a n/a n/a orderresult SAMPLER
Failed Jobs:
JobId Alias Feature Message Outputs
job_local_0005 orderresult ORDER_BY Message: Job failed! Error - NA file:/tmp/temp265162785/tmp896004388,
Input(s):
Successfully read 0 records from: "file:///home/dliu/ApacheLogAnalysisWithPig/access.log"
Output(s):
Failed to produce result in "file:/tmp/temp265162785/tmp896004388"
Counters:
Total records written : 0
Total bytes written : 0
Spillable Memory Manager spill count : 0
Total bags proactively spilled: 0
Total records proactively spilled: 0
Job DAG:
job_local_0002 -> job_local_0003,
job_local_0003 -> job_local_0004,
job_local_0004 -> job_local_0005,
job_local_0005
2013-04-13 11:33:15,291 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - Some jobs have failed! Stop running all dependent jobs
2013-04-13 11:33:15,297 [main] ERROR org.apache.pig.tools.grunt.Grunt - ERROR 1066: Unable to open iterator for alias orderresult
Details at logfile: /home/dliu/ApacheLogAnalysisWithPig/pig_1365823931459.log
Make sure of two things:
1) Run Pig in local mode: pig -x local
2) Set either the PIG_HOME or PIG_INSTALL environment variable to point to the Pig installation directory (see the sketch after this list)
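For example (hypothetical install path and script name; adjust to wherever Pig is unpacked):

# point PIG_HOME (or PIG_INSTALL) at the Pig installation directory
export PIG_HOME=/usr/local/pig
export PATH="$PIG_HOME/bin:$PATH"
# run the script in local mode
pig -x local apachelog.pig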
Please check that you don't already have the file /tmp/temp265162785/tmp896004388.
You can't use the same file/directory for different tasks.