Kotlin 1.4.20 Gradle KTS issue in IntelliJ

I have been facing constant issues with my build.gradle.kts since migrating to Kotlin 1.4.20: the whole file is essentially highlighted red. The Gradle version is 6.7.1.
Has anyone seen similar issues, or is it just me? Are there any workarounds?
When run through the CLI the build works perfectly fine, but in IntelliJ the whole script is marked red.
Exception in IntelliJ
java.lang.IllegalArgumentException: declarationDescriptor is null for constructor = TypeVariable(K) with class org.jetbrains.kotlin.resolve.calls.inference.model.TypeVariableTypeConstructor
at org.jetbrains.kotlin.js.descriptorUtils.DescriptorUtilsKt.getJetTypeFqName(descriptorUtils.kt:23)
at org.jetbrains.kotlin.js.descriptorUtils.DescriptorUtilsKt$getJetTypeFqName$typeArgumentsAsString$joinedTypeArguments$1.fun(descriptorUtils.kt:32)
at org.jetbrains.kotlin.js.descriptorUtils.DescriptorUtilsKt$getJetTypeFqName$typeArgumentsAsString$joinedTypeArguments$1.fun(descriptorUtils.kt)
at com.intellij.openapi.util.text.StringUtil.join(StringUtil.java:1337)
at com.intellij.openapi.util.text.StringUtil.join(StringUtil.java:1327)
at com.intellij.openapi.util.text.StringUtil.join(StringUtil.java:1307)
at org.jetbrains.kotlin.js.descriptorUtils.DescriptorUtilsKt.getJetTypeFqName(descriptorUtils.kt:32)
at org.jetbrains.hdfsplugin.highlight.kotlin.KotlinSignatureCondition.getReturnTypeFqn(KotlinSignatureCondition.kt:18)
at org.jetbrains.hdfsplugin.highlight.kotlin.KotlinSignatureCondition.getReturnTypeFqn(KotlinSignatureCondition.kt:8)
at org.jetbrains.hdfsplugin.highlight.SignatureCondition.checkCondition(SignatureCondition.kt:7)
at org.jetbrains.hdfsplugin.highlight.inspection.invalidFormat.InvalidFormatInspection.visitCall(InvalidFormatInspection.kt:13)
at org.jetbrains.hdfsplugin.highlight.inspection.invalidFormat.KotlinHdfsInvalidFormatInspection$buildVisitor$1.visitCallExpression(KotlinHdfsInvalidFormatInspection.kt:15)
at org.jetbrains.kotlin.psi.KtVisitorVoid.visitCallExpression(KtVisitorVoid.java:803)
at org.jetbrains.kotlin.psi.KtVisitorVoid.visitCallExpression(KtVisitorVoid.java:21)
at org.jetbrains.kotlin.psi.KtCallExpression.accept(KtCallExpression.java:35)
at org.jetbrains.kotlin.psi.KtElementImpl.accept(KtElementImpl.java:51)
at com.intellij.codeInspection.InspectionEngine.acceptElements(InspectionEngine.java:65)
at com.intellij.codeInspection.InspectionEngine.createVisitorAndAcceptElements(InspectionEngine.java:56)
at com.intellij.codeInsight.daemon.impl.LocalInspectionsPass.runToolOnElements(LocalInspectionsPass.java:296)
at com.intellij.codeInsight.daemon.impl.LocalInspectionsPass.lambda$visitPriorityElementsAndInit$3(LocalInspectionsPass.java:265)
at com.intellij.util.AstLoadingFilter.forceAllowTreeLoading(AstLoadingFilter.java:155)
at com.intellij.util.AstLoadingFilter.forceAllowTreeLoading(AstLoadingFilter.java:147)
at com.intellij.codeInsight.daemon.impl.LocalInspectionsPass.lambda$visitPriorityElementsAndInit$4(LocalInspectionsPass.java:264)
at com.intellij.util.AstLoadingFilter.disallowTreeLoading(AstLoadingFilter.java:126)
at com.intellij.util.AstLoadingFilter.disallowTreeLoading(AstLoadingFilter.java:115)
at com.intellij.codeInsight.daemon.impl.LocalInspectionsPass.lambda$visitPriorityElementsAndInit$5(LocalInspectionsPass.java:264)
at com.intellij.concurrency.ApplierCompleter.execAndForkSubTasks(ApplierCompleter.java:149)
at com.intellij.openapi.application.impl.ApplicationImpl.tryRunReadAction(ApplicationImpl.java:1110)
at com.intellij.concurrency.ApplierCompleter.lambda$wrapInReadActionAndIndicator$1(ApplierCompleter.java:105)
at com.intellij.openapi.progress.impl.CoreProgressManager.registerIndicatorAndRun(CoreProgressManager.java:629)
at com.intellij.openapi.progress.impl.CoreProgressManager.executeProcessUnderProgress(CoreProgressManager.java:581)
at com.intellij.openapi.progress.impl.ProgressManagerImpl.executeProcessUnderProgress(ProgressManagerImpl.java:60)
at com.intellij.concurrency.ApplierCompleter.wrapInReadActionAndIndicator(ApplierCompleter.java:117)
at com.intellij.concurrency.ApplierCompleter.lambda$compute$0(ApplierCompleter.java:96)
at com.intellij.openapi.application.impl.ReadMostlyRWLock.executeByImpatientReader(ReadMostlyRWLock.java:170)
at com.intellij.openapi.application.impl.ApplicationImpl.executeByImpatientReader(ApplicationImpl.java:182)
at com.intellij.concurrency.ApplierCompleter.compute(ApplierCompleter.java:96)
at java.base/java.util.concurrent.CountedCompleter.exec(CountedCompleter.java:746)
at java.base/java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:290)
at java.base/java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1020)
at java.base/java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1656)
at java.base/java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1594)
at java.base/java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:

Try deleting the IDEA cache folder:
~/Library/Caches/JetBrains/<product><version> on macOS
~/.cache/JetBrains/<product><version> on Linux
%LOCALAPPDATA%\JetBrains\<product><version> on Windows
If that doesn't help, please file an issue on YouTrack.

Related

UnsupportedOperationException when using enum in Java 11, works fine in Java 8

I am porting an existing application from Java 7 to Java 11 (Drools 7.42.0). The application works as expected if I compile it as Java 1.8 and deploy it on a Java 11 container (WebLogic 14), but if I compile the same code as Java 11, I get the following error. The only change between 1.8 and 11 is the Java compiler level in the Maven POM.
I have ASM7 as a dependency in Maven.
Caused by: java.lang.UnsupportedOperationException: This feature requires ASM7
at org.mvel2.asm.ClassVisitor.visitNestMember(ClassVisitor.java:236)
at org.mvel2.asm.ClassReader.accept(ClassReader.java:651)
at org.mvel2.asm.ClassReader.accept(ClassReader.java:391)
at org.drools.core.util.asm.ClassFieldInspector.processClassWithByteCode(ClassFieldInspector.java:107)
at org.drools.core.util.asm.ClassFieldInspector.processClassWithByteCode(ClassFieldInspector.java:114)
at org.drools.core.util.asm.ClassFieldInspector.<init>(ClassFieldInspector.java:86)
at org.drools.core.util.asm.ClassFieldInspector.<init>(ClassFieldInspector.java:74)
at org.drools.core.base.ClassFieldAccessorFactory.getClassFieldInspector(ClassFieldAccessorFactory.java:157)
at org.drools.core.base.ClassFieldAccessorFactory.getFieldType(ClassFieldAccessorFactory.java:141)
at org.drools.core.base.ClassFieldAccessorStore.getFieldType(ClassFieldAccessorStore.java:295)
at org.drools.compiler.rule.builder.PatternBuilder.getValueType(PatternBuilder.java:1079)
at org.drools.compiler.rule.builder.PatternBuilder.normalizeExpression(PatternBuilder.java:1047)
at org.drools.compiler.rule.builder.PatternBuilder.buildExpression(PatternBuilder.java:965)
at org.drools.compiler.rule.builder.PatternBuilder.buildCcdDescr(PatternBuilder.java:942)
at org.drools.compiler.rule.builder.PatternBuilder.build(PatternBuilder.java:767)
at org.drools.compiler.rule.builder.PatternBuilder.processConstraintsAndBinds(PatternBuilder.java:620)
at org.drools.compiler.rule.builder.PatternBuilder.build(PatternBuilder.java:187)
at org.drools.compiler.rule.builder.PatternBuilder.build(PatternBuilder.java:154)
at org.drools.compiler.rule.builder.PatternBuilder.build(PatternBuilder.java:136)
at org.drools.compiler.rule.builder.GroupElementBuilder.build(GroupElementBuilder.java:66)
at org.drools.compiler.rule.builder.RuleBuilder.build(RuleBuilder.java:107)
at org.drools.compiler.builder.impl.KnowledgeBuilderImpl.internalAddRule(KnowledgeBuilderImpl.java:1183)
at org.drools.compiler.builder.impl.KnowledgeBuilderImpl.addRule(KnowledgeBuilderImpl.java:1178)
at org.drools.compiler.builder.impl.KnowledgeBuilderImpl.lambda$null$3(KnowledgeBuilderImpl.java:1134)
at java.base/java.util.stream.ForEachOps$ForEachOp$OfRef.accept(ForEachOps.java:183)
at java.base/java.util.stream.ReferencePipeline$2$1.accept(ReferencePipeline.java:177)
at java.base/java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1655)
at java.base/java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:484)
at java.base/java.util.stream.ForEachOps$ForEachTask.compute(ForEachOps.java:290)
at java.base/java.util.concurrent.CountedCompleter.exec(CountedCompleter.java:746)
I noticed while debugging that this was caused by an enum in our code. Drools works fine if we remove the enum and use constants instead. The error seems to suggest that this could happen with any nested class. Is there a better way to handle this other than changing our code to remove nested classes and enums?
I see that in the class org.drools.core.util.asm.ClassFieldInspector the API level is set to ASM5, but MVEL expects ASM7. I am not sure, though, why it works on Java 8 and not on Java 11. Any suggestions?
ClassFieldVisitor(final Class<?> cls,
                  final boolean includeFinalMethods,
                  final ClassFieldInspector inspector) {
    super(Opcodes.ASM5);
    this.clazz = cls;
    this.includeFinalMethods = includeFinalMethods;
    this.inspector = inspector;
}
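For context on why this only breaks on Java 11: classes with nested members compiled by javac 11 carry the NestHost/NestMembers attributes (JEP 181), and ASM only delivers the corresponding visitNestMember callback to visitors created with the ASM7 api level; a visitor constructed with Opcodes.ASM5, as above, makes the base ClassVisitor throw "This feature requires ASM7". The following is a minimal sketch of that behaviour against the standalone org.objectweb.asm artifact, assuming an ASM version that itself understands Java 11 class files (7.0 or newer); the relocated org.mvel2.asm copy behaves the same way, and the class and names here are illustrative only, not Drools code.

import org.objectweb.asm.ClassReader;
import org.objectweb.asm.ClassVisitor;
import org.objectweb.asm.Opcodes;

import java.io.IOException;

// Illustrative repro: compile with Java 11+ so the outer class gets a NestMembers
// attribute, then scan it with visitors at two different api levels.
public class NestMemberRepro {

    // Any nested type (an enum here, as in the question) makes javac 11 emit
    // NestHost/NestMembers attributes.
    enum Status { OK, FAILED }

    public static void main(String[] args) throws IOException {
        ClassReader reader = new ClassReader(NestMemberRepro.class.getName());

        // Same api level as ClassFieldVisitor: the base ClassVisitor rejects
        // visitNestMember with "This feature requires ASM7".
        ClassVisitor asm5Visitor = new ClassVisitor(Opcodes.ASM5) { };
        try {
            reader.accept(asm5Visitor, 0);
        } catch (UnsupportedOperationException e) {
            System.out.println("ASM5 visitor failed: " + e.getMessage());
        }

        // Raising the api level to ASM7 lets the same traversal succeed.
        ClassVisitor asm7Visitor = new ClassVisitor(Opcodes.ASM7) { };
        reader.accept(asm7Visitor, 0);
        System.out.println("ASM7 visitor succeeded");
    }
}

The same class compiled with Java 8 has no NestMembers attribute, so the ASM5 visitor never reaches the unsupported callback, which matches the observation that only the Java 11 build fails.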
Disclaimer: I'm one of the Drools developers.
That's interesting, as we might not be emitting valid ASM code.
Please file a bug on https://issues.redhat.com so that we can fix it.
Thank you very much.

nutch 1.16 crawl example from NutchTutorial returns NoSuchMethodError on org.apache.commons.cli.OptionBuilder (Windows 10)

I have been trying to run a Nutch 1.16 crawler using the code example and instructions from https://cwiki.apache.org/confluence/display/NUTCH/NutchTutorial, but no matter what I do, I get stuck when initiating the actual crawl.
I'm running it through Cygwin64 on a Windows 10 machine, using a binary installation (though I have also tried compiling one, with the same results). Initially, Nutch would throw an UnsatisfiedLinkError (NativeIO$Windows.access0), which I fixed by adding the libraries suggested in several other answers for the same issue. After that I could at least start a server, but trying to crawl through Nutch itself returned a NoSuchMethodError no matter what I did. nutch-site.xml contains only the http.agent.name and plugin.includes options, both taken from the same example.
This is the error message (I also tried omitting seed.txt):
$ bin/nutch inject crawl/crawldb urls/seed.txt
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.commons.cli.OptionBuilder.withArgPattern(Ljava/lang/String;I)Lorg/apache/commons/cli/OptionBuilder;
at org.apache.hadoop.util.GenericOptionsParser.buildGeneralOptions(GenericOptionsParser.java:207)
at org.apache.hadoop.util.GenericOptionsParser.parseGeneralOptions(GenericOptionsParser.java:370)
at org.apache.hadoop.util.GenericOptionsParser.<init>(GenericOptionsParser.java:153)
at org.apache.hadoop.util.GenericOptionsParser.<init>(GenericOptionsParser.java:138)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:59)
at org.apache.nutch.crawl.Injector.main(Injector.java:534)
The following is the list of libraries currently present in the lib directory:
activation-1.1.jar
amqp-client-5.2.0.jar
animal-sniffer-annotations-1.14.jar
antlr-runtime-3.5.2.jar
antlr4-4.5.1.jar
aopalliance-1.0.jar
apache-nutch-1.16.jar
apacheds-i18n-2.0.0-M15.jar
apacheds-kerberos-codec-2.0.0-M15.jar
api-asn1-api-1.0.0-M20.jar
api-util-1.0.0-M20.jar
args4j-2.0.16.jar
ascii-utf-themes-0.0.1.jar
asciitable-0.3.2.jar
asm-3.3.1.jar
asm-7.1.jar
avro-1.7.7.jar
bootstrap-3.0.3.jar
cglib-2.2.1-v20090111.jar
cglib-2.2.2.jar
char-translation-0.0.2.jar
checker-compat-qual-2.0.0.jar
closure-compiler-v20130603.jar
commons-beanutils-1.7.0.jar
commons-beanutils-core-1.8.0.jar
commons-cli-1.2-sources.jar
commons-cli-1.2.jar
commons-codec-1.11.jar
commons-collections-3.2.2.jar
commons-collections4-4.2.jar
commons-compress-1.18.jar
commons-configuration-1.6.jar
commons-daemon-1.0.13.jar
commons-digester-1.8.jar
commons-el-1.0.jar
commons-httpclient-3.1.jar
commons-io-2.4.jar
commons-jexl-2.1.1.jar
commons-lang-2.6.jar
commons-lang3-3.8.1.jar
commons-logging-1.1.3.jar
commons-math3-3.1.1.jar
commons-net-3.1.jar
crawler-commons-1.0.jar
curator-client-2.7.1.jar
curator-framework-2.7.1.jar
curator-recipes-2.7.1.jar
cxf-core-3.3.3.jar
cxf-rt-bindings-soap-3.3.3.jar
cxf-rt-bindings-xml-3.3.3.jar
cxf-rt-databinding-jaxb-3.3.3.jar
cxf-rt-frontend-jaxrs-3.3.3.jar
cxf-rt-frontend-jaxws-3.3.3.jar
cxf-rt-frontend-simple-3.3.3.jar
cxf-rt-security-3.3.3.jar
cxf-rt-transports-http-3.3.3.jar
cxf-rt-transports-http-jetty-3.3.3.jar
cxf-rt-ws-addr-3.3.3.jar
cxf-rt-ws-policy-3.3.3.jar
cxf-rt-wsdl-3.3.3.jar
dom4j-1.6.1.jar
ehcache-3.3.1.jar
elasticsearch-0.90.1.jar
error_prone_annotations-2.1.3.jar
FastInfoset-1.2.16.jar
geronimo-jcache_1.0_spec-1.0-alpha-1.jar
gora-hbase-0.3.jar
gson-2.2.4.jar
guava-25.0-jre.jar
guice-3.0.jar
guice-servlet-3.0.jar
h2-1.4.197.jar
hadoop-0.20.0-ant.jar
hadoop-0.20.0-core.jar
hadoop-0.20.0-examples.jar
hadoop-0.20.0-test.jar
hadoop-0.20.0-tools.jar
hadoop-annotations-2.9.2.jar
hadoop-auth-2.9.2.jar
hadoop-common-2.9.2.jar
hadoop-core-1.2.1.jar
hadoop-core_0.20.0.xml
hadoop-core_0.21.0.xml
hadoop-core_0.22.0.xml
hadoop-hdfs-2.9.2.jar
hadoop-hdfs-client-2.9.2.jar
hadoop-mapreduce-client-common-2.2.0.jar
hadoop-mapreduce-client-common-2.9.2.jar
hadoop-mapreduce-client-core-2.2.0.jar
hadoop-mapreduce-client-core-2.9.2.jar
hadoop-mapreduce-client-jobclient-2.2.0.jar
hadoop-mapreduce-client-jobclient-2.9.2.jar
hadoop-mapreduce-client-shuffle-2.2.0.jar
hadoop-mapreduce-client-shuffle-2.9.2.jar
hadoop-yarn-api-2.9.2.jar
hadoop-yarn-client-2.9.2.jar
hadoop-yarn-common-2.9.2.jar
hadoop-yarn-registry-2.9.2.jar
hadoop-yarn-server-common-2.9.2.jar
hadoop-yarn-server-nodemanager-2.9.2.jar
hbase-0.90.0-tests.jar
hbase-0.90.0.jar
hbase-0.92.1.jar
hbase-client-0.98.0-hadoop2.jar
hbase-common-0.98.0-hadoop2.jar
hbase-protocol-0.98.0-hadoop2.jar
HikariCP-java7-2.4.12.jar
htmlparser-1.6.jar
htrace-core-2.04.jar
htrace-core4-4.1.0-incubating.jar
httpclient-4.5.6.jar
httpcore-4.4.9.jar
httpcore-nio-4.4.9.jar
icu4j-61.1.jar
istack-commons-runtime-3.0.8.jar
j2objc-annotations-1.1.jar
jackson-annotations-2.9.9.jar
jackson-core-2.9.9.jar
jackson-core-asl-1.9.13.jar
jackson-databind-2.9.9.jar
jackson-dataformat-cbor-2.9.9.jar
jackson-jaxrs-1.9.13.jar
jackson-jaxrs-base-2.9.9.jar
jackson-jaxrs-json-provider-2.9.9.jar
jackson-mapper-asl-1.9.13.jar
jackson-module-jaxb-annotations-2.9.9.jar
jackson-xc-1.9.13.jar
jakarta.activation-api-1.2.1.jar
jakarta.ws.rs-api-2.1.5.jar
jakarta.xml.bind-api-2.3.2.jar
jasper-compiler-5.5.12.jar
jasper-runtime-5.5.12.jar
java-xmlbuilder-0.4.jar
javassist-3.12.1.GA.jar
javax.annotation-api-1.3.2.jar
javax.inject-1.jar
javax.persistence-2.2.0.jar
javax.servlet-api-3.1.0.jar
jaxb-api-2.2.2.jar
jaxb-impl-2.2.3-1.jar
jaxb-runtime-2.3.2.jar
jcip-annotations-1.0-1.jar
jersey-client-1.19.4.jar
jersey-core-1.9.jar
jersey-guice-1.9.jar
jersey-json-1.9.jar
jersey-server-1.9.jar
jets3t-0.9.0.jar
jettison-1.1.jar
jetty-6.1.26.jar
jetty-client-6.1.22.jar
jetty-continuation-9.4.19.v20190610.jar
jetty-http-9.4.19.v20190610.jar
jetty-io-9.4.19.v20190610.jar
jetty-security-9.4.19.v20190610.jar
jetty-server-9.4.19.v20190610.jar
jetty-sslengine-6.1.26.jar
jetty-util-6.1.26.jar
jetty-util-9.4.19.v20190610.jar
joda-time-2.3.jar
jquery-2.0.3-1.jar
jquery-selectors-0.0.3.jar
jquery-ui-1.10.2-1.jar
jquerypp-1.0.1.jar
jsch-0.1.54.jar
json-smart-1.3.1.jar
jsp-2.1-6.1.14.jar
jsp-api-2.1-6.1.14.jar
jsp-api-2.1.jar
jsr305-3.0.0.jar
junit-3.8.1.jar
juniversalchardet-1.0.3.jar
leveldbjni-all-1.8.jar
log4j-1.2.17.jar
lucene-analyzers-common-4.3.0.jar
lucene-codecs-4.3.0.jar
lucene-core-4.3.0.jar
lucene-grouping-4.3.0.jar
lucene-highlighter-4.3.0.jar
lucene-join-4.3.0.jar
lucene-memory-4.3.0.jar
lucene-queries-4.3.0.jar
lucene-queryparser-4.3.0.jar
lucene-sandbox-4.3.0.jar
lucene-spatial-4.3.0.jar
lucene-suggest-4.3.0.jar
maven-parent-config-0.3.4.jar
metrics-core-3.0.1.jar
modernizr-2.6.2-1.jar
mssql-jdbc-6.2.1.jre7.jar
neethi-3.1.1.jar
netty-3.6.2.Final.jar
netty-all-4.0.23.Final.jar
nimbus-jose-jwt-4.41.1.jar
okhttp-2.7.5.jar
okio-1.6.0.jar
org.apache.commons.cli-1.2.0.jar
ormlite-core-5.1.jar
ormlite-jdbc-5.1.jar
oro-2.0.8.jar
paranamer-2.3.jar
protobuf-java-2.5.0.jar
reflections-0.9.8.jar
servlet-api-2.5-20081211.jar
servlet-api-2.5.jar
skb-interfaces-0.0.1.jar
slf4j-api-1.7.26.jar
slf4j-log4j12-1.7.25.jar
snappy-java-1.0.5.jar
spatial4j-0.3.jar
spring-aop-4.0.9.RELEASE.jar
spring-beans-4.0.9.RELEASE.jar
spring-context-4.0.9.RELEASE.jar
spring-core-4.0.9.RELEASE.jar
spring-expression-4.0.9.RELEASE.jar
spring-web-4.0.9.RELEASE.jar
ST4-4.0.8.jar
stax-api-1.0-2.jar
stax-ex-1.8.1.jar
stax2-api-3.1.4.jar
t-digest-3.2.jar
tika-core-1.22.jar
txw2-2.3.2.jar
typeaheadjs-0.9.3.jar
warc-hadoop-0.1.0.jar
webarchive-commons-1.1.5.jar
wicket-bootstrap-core-0.9.2.jar
wicket-bootstrap-extensions-0.9.2.jar
wicket-core-6.17.0.jar
wicket-extensions-6.13.0.jar
wicket-ioc-6.17.0.jar
wicket-request-6.17.0.jar
wicket-spring-6.17.0.jar
wicket-util-6.17.0.jar
wicket-webjars-0.4.0.jar
woodstox-core-5.0.3.jar
wsdl4j-1.6.3.jar
xercesImpl-2.12.0.jar
xml-apis-1.4.01.jar
xml-resolver-1.2.jar
xmlenc-0.52.jar
xmlParserAPIs-2.6.2.jar
xmlschema-core-2.2.4.jar
zookeeper-3.4.6.jar
This is my java version:
java version "1.8.0_241"
Java(TM) SE Runtime Environment (build 1.8.0_241-b07)
Java HotSpot(TM) 64-Bit Server VM (build 25.241-b07, mixed mode)
I'd also like to point out that, despite what another answer may have said, Nutch 1.4 (or any other version of Nutch, for that matter) did NOT resolve the issue, at least on Windows.
EDIT: The following answer worked for me, but I left the original one in place because it may still be useful to someone working with other versions of Nutch.
Again, thanks to Sebastian Nagel: to get around the NoSuchMethodError, edit ivy\ivy.xml to reference a different version of the Hadoop libraries. In my case I installed Hadoop 3.1.3 and also added the corresponding 3.1.3 versions of winutils.exe and hadoop.dll to the hadoop\bin directory referenced by HADOOP_HOME. After that, bin/crawl seems to work correctly.
Outdated answer: After working on the source code itself (courtesy of https://github.com/apache/commons-cli) at the suggestion of Sebastian Nagel, I was able to find the (very simple) implementation of the missing method (https://github.com/marcelmaatkamp/EntityExtractorUtils/blob/master/src/main/java/org/apache/commons/cli/OptionBuilder.java):
/**
 * The next Option created will have an argument pattern and
 * a limit on the number of pattern occurrences.
 *
 * @param argPattern string representing a pattern regex
 * @param limit      the number of pattern occurrences in the argument
 * @return the OptionBuilder instance
 */
public static OptionBuilder withArgPattern( String argPattern,
                                            int limit )
{
    OptionBuilder.argPattern = argPattern;
    OptionBuilder.limit = limit;
    // The quoted snippet was cut off here; like the other with* builders it
    // presumably finishes by returning the shared static instance.
    return instance;
}
Using Maven I was then able to compile the code into its own jar file, which I then added to the lib folder of Apache Nutch.
This still did not completely resolve my problem, as the Nutch framework seems to rely on other missing or deprecated methods, which will probably mean more work of the same kind (for instance, right after adding the new jar I got another NoSuchMethodError, this time over org.apache.hadoop.mapreduce.Job.getInstance).
I leave this answer here as a temporary solution for anyone who may have gotten stuck on the same issue, but I wish there were an easier way of finding out which methods appear in which jar file without exploring their entire file structure, although perhaps I am simply not aware of one.
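One generic way to answer the "which jar provides this method" question is plain reflection. The sketch below is a minimal, standalone diagnostic, not part of Nutch or of the original answer; the class and method names are just examples. It prints which jar a class was loaded from and whether that class declares a given method.

import java.lang.reflect.Method;

// Minimal classpath diagnostic (illustrative): prints which jar a class was loaded
// from and whether it declares a given method.
public class WhichJar {
    public static void main(String[] args) throws Exception {
        Class<?> cls = Class.forName("org.apache.commons.cli.OptionBuilder");
        System.out.println("Loaded from: "
                + cls.getProtectionDomain().getCodeSource().getLocation());

        boolean found = false;
        for (Method m : cls.getDeclaredMethods()) {
            if (m.getName().equals("withArgPattern")) {
                System.out.println("Found: " + m);
                found = true;
            }
        }
        if (!found) {
            System.out.println("withArgPattern is not declared in this jar; "
                    + "a conflicting commons-cli version is probably earlier on the classpath.");
        }
    }
}

Run it with the same classpath as the failing command (for Nutch, everything in the lib directory) and the first line shows which of the overlapping jars, e.g. commons-cli-1.2.jar versus org.apache.commons.cli-1.2.0.jar in the list above, actually wins.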

Blazegraph INSERT DATA crashes with NoSuchMethodError

In Blazegraph I attempted the following query:
INSERT DATA {
<http://my.site/User/instances/1>
:comment
<http://my.site/Comment/instances/16>.
}
and it crashes with the following exception trace:
java.lang.NoSuchMethodError: com.bigdata.rdf.sail.sparql.SPARQLStarUpdateDataBlockParser.read()I
at com.bigdata.rdf.sail.sparql.SPARQLStarUpdateDataBlockParser.checkSparqlStarSyntax(SPARQLStarUpdateDataBlockParser.java:107)
at com.bigdata.rdf.sail.sparql.SPARQLStarUpdateDataBlockParser.parseValue(SPARQLStarUpdateDataBlockParser.java:100)
at org.openrdf.rio.trig.TriGParser.parseGraph(TriGParser.java:158)
at org.openrdf.repository.sail.helpers.SPARQLUpdateDataBlockParser.parseGraph(SPARQLUpdateDataBlockParser.java:87)
at org.openrdf.rio.trig.TriGParser.parseStatement(TriGParser.java:128)
at org.openrdf.rio.turtle.TurtleParser.parse(TurtleParser.java:214)
at com.bigdata.rdf.sail.sparql.UpdateExprBuilder.doUnparsedQuadsDataBlock(UpdateExprBuilder.java:746)
at com.bigdata.rdf.sail.sparql.UpdateExprBuilder.visit(UpdateExprBuilder.java:161)
at com.bigdata.rdf.sail.sparql.UpdateExprBuilder.visit(UpdateExprBuilder.java:119)
at com.bigdata.rdf.sail.sparql.ast.ASTInsertData.jjtAccept(ASTInsertData.java:23)
at com.bigdata.rdf.sail.sparql.Bigdata2ASTSPARQLParser.parseUpdate2(Bigdata2ASTSPARQLParser.java:289)
at com.bigdata.rdf.sail.BigdataSailRepositoryConnection.prepareNativeSPARQLUpdate(BigdataSailRepositoryConnection.java:278)
at com.bigdata.rdf.sail.BigdataSailRepositoryConnection.prepareUpdate(BigdataSailRepositoryConnection.java:182)
at org.openrdf.repository.base.RepositoryConnectionBase.prepareUpdate(RepositoryConnectionBase.java:180)
However, normal DELETE/INSERT WHERE queries work fine.
Any ideas how to solve this?
I simply removed the following line from my build.gradle:
compile('org.openrdf.sesame:sesame-runtime:2.8.6')
and refreshed the dependencies. You simply don't need to declare a Sesame dependency if you're using bigdata-core, since Sesame is already a transitive dependency of it.
The reason for the crash was that bigdata-core relies on Sesame 2.7.12, which was being superseded by my 2.8.6 declaration; 2.8.6 does not have an implementation of the method being called, hence the crash.
Thanks to @AKSW for the guidance behind this solution.

java.lang.VerifyError: JVMVRFY007 final method overridden; class=com/google/common/collect/NullsLastOrdering

I am in the process of migrating an application from WAS 7 to WAS 8.5. I get the following error during application start-up. The error concerns the Guava library, and the project is using guava-1.5.jar. I am not sure if it's an issue with the particular version I am using. I would be glad if someone could throw some light on this issue.
Caused by: java.lang.VerifyError: JVMVRFY007 final method overridden; class=com/google/common/collect/NullsLastOrdering, method=reverse()Lcom/google/common/collect/Ordering;, pc=0
at java.lang.ClassLoader.defineClassImpl(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:262)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:69)
at com.ibm.ws.classloader.CompoundClassLoader._defineClass(CompoundClassLoader.java:853)
at com.ibm.ws.classloader.CompoundClassLoader.localFindClass(CompoundClassLoader.java:763)
at com.ibm.ws.classloader.CompoundClassLoader.loadClass(CompoundClassLoader.java:586)
at java.lang.ClassLoader.loadClass(ClassLoader.java:627)
at com.ibm.ws.webbeans.services.ScannerServiceImpl.convertClassNamesToClass(ScannerServiceImpl.java:490)
at com.ibm.ws.webbeans.services.ScannerServiceImpl.ecsScan(ScannerServiceImpl.java:422)
at com.ibm.ws.webbeans.services.ScannerServiceImpl.populateBeans(ScannerServiceImpl.java:231)
at com.ibm.ws.webbeans.services.ScannerServiceImpl.populateBeans(ScannerServiceImpl.java:241)
at com.ibm.ws.webbeans.services.JCDIComponentImpl.populateOneDeployedObject(JCDIComponentImpl.java:331)
at com.ibm.ws.webbeans.services.JCDIComponentImpl.isJCDIEnabled(JCDIComponentImpl.java:835)
at com.ibm.ws.jaxrs.metadata.JAXRSServerMetaDataBuilder.buildJAXRSMetaData(JAXRSServerMetaDataBuilder.java:72)
at com.ibm.ws.jaxrs.component.JAXRSComponentImpl.stateChanged(JAXRSComponentImpl.java:269)
at com.ibm.ws.runtime.component.ApplicationMgrImpl.stateChanged(ApplicationMgrImpl.java:1118)
at com.ibm.ws.runtime.component.DeployedApplicationImpl.fireDeployedObjectEvent(DeployedApplicationImpl.java:1353)
at com.ibm.ws.runtime.component.DeployedModuleImpl.setState(DeployedModuleImpl.java:248)
at com.ibm.ws.runtime.component.DeployedModuleImpl.start(DeployedModuleImpl.java:636)
at com.ibm.ws.runtime.component.DeployedApplicationImpl.start(DeployedApplicationImpl.java:968)
... 62 more
Guava's Ordering#reverse method has never been final in any Guava release. It was only final in a very, very old version of Google Collections: https://code.google.com/p/google-collections/source/diff?r=98&old=92&path=/trunk/src/com/google/common/collect/Ordering.java
So you should look for google-collect*.jar on your classpath and get rid of it. It really has no reason to exist any more.
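If it isn't obvious which jar is supplying the class, a quick reflection check can confirm it. The following is a small, illustrative sketch (not part of the original answer) that prints where Ordering was loaded from and whether its reverse() method is final, which it never is in a real Guava jar.

import com.google.common.collect.Ordering;

import java.lang.reflect.Method;
import java.lang.reflect.Modifier;

// Illustrative check: where does Ordering come from, and is reverse() final there?
public class OrderingCheck {
    public static void main(String[] args) throws Exception {
        System.out.println("Ordering loaded from: "
                + Ordering.class.getProtectionDomain().getCodeSource().getLocation());

        // In every Guava release reverse() is overridable; "true" here means an old
        // google-collections jar is shadowing Guava on the classpath.
        Method reverse = Ordering.class.getMethod("reverse");
        System.out.println("reverse() is final: " + Modifier.isFinal(reverse.getModifiers()));
    }
}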

How to resolve "disagrees about version of symbol" (for example for journal_restart)?

I tried to modify the jbd Module.symvers and make the versions of the jbd functions match what is in /proc/kallsyms (for all the Module.symvers files: ext3, the root kernel directory, and jbd), but after compiling, jbd/Module.symvers still contains the same old values (and they do not show up in jbd.mod.c). I have looked at scripts/mod/modpost.c and found no instance of those symbols (such as journal_restart). In dmesg I see a lot of "disagrees about version of symbol ..." messages, each followed on the next line by an "Unknown symbol ..." message for the same symbol.
How can I resolve this problem, for example for journal_restart?
Thanks!