Reduce on Range throws NoClassDefFoundError: kotlin/IntIterator - kotlin

Test:
fun main(args: Array<String>) {
    val total = (0..10).reduce { total, next -> total + next }
}
Stacktrace:
Exception in thread "main" java.lang.NoClassDefFoundError: kotlin/IntIterator
at TestReduceKt.main(TestReduce.kt:6)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:144)
Caused by: java.lang.ClassNotFoundException: kotlin.IntIterator
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 6 more

This is a bug in Kotlin 1.0. It will be fixed in Kotlin 1.0.1.
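Until 1.0.1 is available, a possible interim workaround (a sketch only, not verified against the affected compiler) is to avoid inlining the range iteration into your own class, for example by materializing the range into a list first or by using a non-inline stdlib call such as sum():
fun main(args: Array<String>) {
    // Hypothetical workaround: iterating a List<Int> goes through a plain Iterator,
    // so the inlined reduce should not reference kotlin.IntIterator in the generated bytecode.
    val total = (0..10).toList().reduce { total, next -> total + next }

    // For this particular computation, sum() expresses the same result.
    val sameTotal = (0..10).sum()
    println("$total $sameTotal")
}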

Related

Why am I getting an IllegalStateException trying to load resource by related path to .fxml file using TornadoFX?

I am working on a UI using TornadoFX with Kotlin. I'm trying to load the start page of my app; here is the code for the main function:
fun main() {
    launch<UIApp>()
}
The class inheriting App:
class UIApp: App(AuthorizationView::class)
and the class inheriting View:
class AuthorizationView : View() {
    override val root: AnchorPane by fxml("./src/main/resources/authorizationView.fxml")
}
The directory tree is the following:
[directory tree screenshot]
When I run the main fun I get this trace:
SEVERE: Uncaught error
java.lang.IllegalStateException: component.javaClass.getResource(resource) must not be null
at tornadofx.ResourceLookup.url(Component.kt:1304)
at tornadofx.FX$Companion$fxmlLocator$1.invoke(FX.kt:114)
at tornadofx.FX$Companion$fxmlLocator$1.invoke(FX.kt:88)
at tornadofx.UIComponent.loadFXML(Component.kt:1119)
at tornadofx.UIComponent$fxml$1.<init>(Component.kt:1113)
at tornadofx.UIComponent.fxml(Component.kt:1112)
at tornadofx.UIComponent.fxml$default(Component.kt:1112)
at com.lowlevel.librarysheet.view.AuthorizationView.<init>(AuthorizationView.kt:8)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at java.lang.Class.newInstance(Class.java:442)
at tornadofx.FXKt.find(FX.kt:434)
at tornadofx.FXKt.find$default(FX.kt:423)
at tornadofx.App.start(App.kt:83)
at com.sun.javafx.application.LauncherImpl.lambda$launchApplication1$8(LauncherImpl.java:863)
at com.sun.javafx.application.PlatformImpl.lambda$runAndWait$7(PlatformImpl.java:326)
at com.sun.javafx.application.PlatformImpl.lambda$null$5(PlatformImpl.java:295)
at java.security.AccessController.doPrivileged(Native Method)
at com.sun.javafx.application.PlatformImpl.lambda$runLater$6(PlatformImpl.java:294)
at com.sun.glass.ui.InvokeLaterDispatcher$Future.run(InvokeLaterDispatcher.java:95)
at com.sun.glass.ui.win.WinApplication._runLoop(Native Method)
at com.sun.glass.ui.win.WinApplication.lambda$null$4(WinApplication.java:185)
at java.lang.Thread.run(Thread.java:748)
Exception in Application stop method
Jan 06, 2022 5:06:35 PM tornadofx.DefaultErrorHandler uncaughtException
SEVERE: Uncaught error
java.lang.RuntimeException: Exception in Application stop method
at com.sun.javafx.application.LauncherImpl.launchApplication1(LauncherImpl.java:922)
at com.sun.javafx.application.LauncherImpl.lambda$launchApplication$1(LauncherImpl.java:182)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.IllegalStateException: component.javaClass.getResource(resource) must not be null
at tornadofx.ResourceLookup.url(Component.kt:1304)
at tornadofx.FX$Companion$fxmlLocator$1.invoke(FX.kt:114)
at tornadofx.FX$Companion$fxmlLocator$1.invoke(FX.kt:88)
at tornadofx.UIComponent.loadFXML(Component.kt:1119)
at tornadofx.UIComponent$fxml$1.<init>(Component.kt:1113)
at tornadofx.UIComponent.fxml(Component.kt:1112)
at tornadofx.UIComponent.fxml$default(Component.kt:1112)
at com.lowlevel.librarysheet.view.AuthorizationView.<init>(AuthorizationView.kt:8)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at java.lang.Class.newInstance(Class.java:442)
at tornadofx.FXKt.find(FX.kt:434)
at tornadofx.FXKt.find$default(FX.kt:423)
at tornadofx.App.stop(App.kt:139)
at com.sun.javafx.application.LauncherImpl.lambda$launchApplication1$9(LauncherImpl.java:882)
at com.sun.javafx.application.PlatformImpl.lambda$runAndWait$7(PlatformImpl.java:326)
at com.sun.javafx.application.PlatformImpl.lambda$null$5(PlatformImpl.java:295)
at java.security.AccessController.doPrivileged(Native Method)
at com.sun.javafx.application.PlatformImpl.lambda$runLater$6(PlatformImpl.java:294)
at com.sun.glass.ui.InvokeLaterDispatcher$Future.run(InvokeLaterDispatcher.java:95)
at com.sun.glass.ui.win.WinApplication._runLoop(Native Method)
at com.sun.glass.ui.win.WinApplication.lambda$null$4(WinApplication.java:185)
... 1 more
I've tried different ways of setting the path to the .fxml view, including an absolute path, but I still get this exception.
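For illustration only: TornadoFX resolves the string passed to fxml() against the classpath (via getResource), not against the project's file-system layout, so a file placed directly under src/main/resources is normally referenced root-relative. A hedged sketch of that convention, assuming the resource name matches the file on disk:
import javafx.scene.layout.AnchorPane
import tornadofx.View

class AuthorizationView : View() {
    // The leading "/" points at the classpath root, i.e. the contents of
    // src/main/resources after the build copies them into the output.
    override val root: AnchorPane by fxml("/authorizationView.fxml")
}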

Oozie Spark access to hive with kerberos

When I execute a Spark process in Oozie I get the following error: the database is not found.
2018-09-26 15:27:23,576 INFO [main] org.apache.spark.deploy.yarn.Client:
client token: Token { kind: YARN_CLIENT_TOKEN, service: }
diagnostics: User class threw exception: org.apache.spark.sql.catalyst.analysis.NoSuchDatabaseException: Database 'zdm_datos_externos' not found;
ApplicationMaster host: 10.74.234.6
ApplicationMaster RPC port: 0
queue: default
queue user: administrador
start time: 1537986410475
final status: FAILED
tracking URL: https://BR-PC-CentOS-02:26001/proxy/application_1537467570666_4127/
user: administrador
This is my Spark config:
String warehouseLocation = new File("spark-warehouse").getAbsolutePath();
SparkSession spark = SparkSession
        .builder()
        .appName("Java Spark Hive Example")
        .master("yarn")
        .config("spark.sql.warehouse.dir", warehouseLocation)
        .config("spark.driver.maxResultSize", "3g")
        .config("spark.debug.maxToStringFields", "10000")
        .config("spark.sql.crossJoin.enabled", "true")
        .enableHiveSupport()
        .getOrCreate();
spark.conf().set("spark.driver.maxResultSize", "3g");
metastore, current connections: 1
2018-09-26 17:31:42,598 WARN [main] hive.metastore: set_ugi() not successful, Likely cause: new client talking to old server. Continuing without it.
org.apache.thrift.transport.TTransportException
at org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:132)
at org.apache.thrift.transport.TTransport.readAll(TTransport.java:86)
at org.apache.thrift.protocol.TBinaryProtocol.readStringBody(TBinaryProtocol.java:380)
at org.apache.thrift.protocol.TBinaryProtocol.readMessageBegin(TBinaryProtocol.java:230)
at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:77)
at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_set_ugi(ThriftHiveMetastore.java:3748)
at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.set_ugi(ThriftHiveMetastore.java:3734)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:557)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:249)
at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1533)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3157)
at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3176)
at org.apache.hadoop.hive.ql.metadata.Hive.getAllFunctions(Hive.java:3409)
at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:178)
at org.apache.hadoop.hive.ql.metadata.Hive.<init>(Hive.java:170)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.yarn.security.HiveCredentialProvider$$anonfun$obtainCredentials$1.apply$mcV$sp(HiveCredentialProvider.scala:91)
at org.apache.spark.deploy.yarn.security.HiveCredentialProvider$$anonfun$obtainCredentials$1.apply(HiveCredentialProvider.scala:90)
at org.apache.spark.deploy.yarn.security.HiveCredentialProvider$$anonfun$obtainCredentials$1.apply(HiveCredentialProvider.scala:90)
at org.apache.spark.deploy.yarn.security.HiveCredentialProvider$$anon$1.run(HiveCredentialProvider.scala:124)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1778)
at org.apache.spark.deploy.yarn.security.HiveCredentialProvider.doAsRealUser(HiveCredentialProvider.scala:123)
at org.apache.spark.deploy.yarn.security.HiveCredentialProvider.obtainCredentials(HiveCredentialProvider.scala:90)
at org.apache.spark.deploy.yarn.security.ConfigurableCredentialManager$$anonfun$obtainCredentials$2.apply(ConfigurableCredentialManager.scala:82)
at org.apache.spark.deploy.yarn.security.ConfigurableCredentialManager$$anonfun$obtainCredentials$2.apply(ConfigurableCredentialManager.scala:80)
at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:241)
at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:241)
at scala.collection.Iterator$class.foreach(Iterator.scala:893)
at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
at scala.collection.MapLike$DefaultValuesIterable.foreach(MapLike.scala:206)
at scala.collection.TraversableLike$class.flatMap(TraversableLike.scala:241)
at scala.collection.AbstractTraversable.flatMap(Traversable.scala:104)
at org.apache.spark.deploy.yarn.security.ConfigurableCredentialManager.obtainCredentials(ConfigurableCredentialManager.scala:80)
at org.apache.spark.deploy.yarn.Client.prepareLocalResources(Client.scala:430)
at org.apache.spark.deploy.yarn.Client.createContainerLaunchContext(Client.scala:915)
at org.apache.spark.deploy.yarn.Client.submitApplication(Client.scala:195)
at org.apache.spark.deploy.yarn.Client.run(Client.scala:1205)
at org.apache.spark.deploy.yarn.Client$.main(Client.scala:1261)
at org.apache.spark.deploy.yarn.Client.main(Client.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:761)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:190)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:215)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:129)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
at org.apache.oozie.action.hadoop.SparkMain.runSpark(SparkMain.java:113)
at org.apache.oozie.action.hadoop.SparkMain.run(SparkMain.java:104)
at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:47)
at org.apache.oozie.action.hadoop.SparkMain.main(SparkMain.java:38)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:238)
at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:459)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:187)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
What version of Spark are you using? Did you enable Hive support on your SparkSession?
sparkBuilder.enableHiveSupport().appName(appName).getOrCreate()

Exception while analyzing expression at AppDatabase

Here is my AppDatabase.kt
@Database(entities = arrayOf(Loan::class), version = 3)
abstract class AppDatabase : RoomDatabase() {
    abstract fun loanModel(): LoanDao

    companion object {
        private const val DB_NAME = "loans.db"

        fun createPersistentDatabase(context: Context): AppDatabase =
            Room.databaseBuilder(context.applicationContext, AppDatabase::class.java, DB_NAME).build()
    }
}
I am getting this error:
e: org.jetbrains.kotlin.util.KotlinFrontEndException: Exception while analyzing expression at (11,54) in /Users/mladenrakonjac/MyFirstKotlinApp/app/src/main/java/me/mnemonic/myloan/data/AppDatabase.kt:
3
at org.jetbrains.kotlin.types.expressions.ExpressionTypingVisitorDispatcher.logOrThrowException(ExpressionTypingVisitorDispatcher.java:250)
at org.jetbrains.kotlin.types.expressions.ExpressionTypingVisitorDispatcher.lambda$getTypeInfo$0(ExpressionTypingVisitorDispatcher.java:221)
at org.jetbrains.kotlin.util.PerformanceCounter.time(PerformanceCounter.kt:90)
at org.jetbrains.kotlin.types.expressions.ExpressionTypingVisitorDispatcher.getTypeInfo(ExpressionTypingVisitorDispatcher.java:171)
at org.jetbrains.kotlin.types.expressions.ExpressionTypingVisitorDispatcher.getTypeInfo(ExpressionTypingVisitorDispatcher.java:142)
at org.jetbrains.kotlin.types.expressions.ExpressionTypingServices.getTypeInfo(ExpressionTypingServices.java:121)
at org.jetbrains.kotlin.resolve.calls.ArgumentTypeResolver.getArgumentTypeInfo(ArgumentTypeResolver.java:235)
at org.jetbrains.kotlin.resolve.calls.ArgumentTypeResolver.analyzeArgumentsAndRecordTypes(ArgumentTypeResolver.java:379)
at org.jetbrains.kotlin.resolve.calls.CallResolver.doResolveCall(CallResolver.java:600)
at org.jetbrains.kotlin.resolve.calls.CallResolver.doResolveCallOrGetCachedResults(CallResolver.java:529)
at org.jetbrains.kotlin.resolve.calls.CallResolver.lambda$computeTasksFromCandidatesAndResolvedCall$1(CallResolver.java:210)
at org.jetbrains.kotlin.util.PerformanceCounter.time(PerformanceCounter.kt:90)
at org.jetbrains.kotlin.resolve.calls.CallResolver.computeTasksFromCandidatesAndResolvedCall(CallResolver.java:206)
at org.jetbrains.kotlin.resolve.calls.CallResolver.computeTasksFromCandidatesAndResolvedCall(CallResolver.java:196)
at org.jetbrains.kotlin.resolve.calls.CallResolver.resolveConstructorCall(CallResolver.java:365)
at org.jetbrains.kotlin.resolve.calls.CallResolver.resolveCallForConstructor(CallResolver.java:350)
at org.jetbrains.kotlin.resolve.calls.CallResolver.resolveFunctionCall(CallResolver.java:282)
at org.jetbrains.kotlin.resolve.calls.CallResolver.resolveFunctionCall(CallResolver.java:253)
at org.jetbrains.kotlin.resolve.AnnotationResolverImpl.resolveAnnotationCall(AnnotationResolverImpl.java:151)
at org.jetbrains.kotlin.resolve.lazy.descriptors.LazyAnnotationDescriptor.computeValueArguments(LazyAnnotations.kt:143)
at org.jetbrains.kotlin.resolve.lazy.descriptors.LazyAnnotationDescriptor.access$computeValueArguments(LazyAnnotations.kt:108)
at org.jetbrains.kotlin.resolve.lazy.descriptors.LazyAnnotationDescriptor$valueArguments$1.invoke(LazyAnnotations.kt:126)
at org.jetbrains.kotlin.resolve.lazy.descriptors.LazyAnnotationDescriptor$valueArguments$1.invoke(LazyAnnotations.kt:108)
at org.jetbrains.kotlin.storage.LockBasedStorageManager$LockBasedLazyValue.invoke(LockBasedStorageManager.java:323)
at org.jetbrains.kotlin.storage.LockBasedStorageManager$LockBasedNotNullLazyValue.invoke(LockBasedStorageManager.java:364)
at org.jetbrains.kotlin.resolve.lazy.descriptors.LazyAnnotationDescriptor.getAllValueArguments(LazyAnnotations.kt:140)
at org.jetbrains.kotlin.resolve.lazy.descriptors.LazyAnnotationDescriptor.forceResolveAllContents(LazyAnnotations.kt:161)
at org.jetbrains.kotlin.resolve.lazy.ForceResolveUtil.doForceResolveAllContents(ForceResolveUtil.java:75)
at org.jetbrains.kotlin.resolve.lazy.ForceResolveUtil.forceResolveAllContents(ForceResolveUtil.java:68)
at org.jetbrains.kotlin.resolve.lazy.descriptors.LazyClassDescriptor.resolveMemberHeaders(LazyClassDescriptor.java:535)
at org.jetbrains.kotlin.resolve.lazy.descriptors.LazyClassDescriptor.doForceResolveAllContents(LazyClassDescriptor.java:521)
at org.jetbrains.kotlin.resolve.lazy.descriptors.LazyClassDescriptor.lambda$new$4(LazyClassDescriptor.java:220)
at org.jetbrains.kotlin.storage.LockBasedStorageManager$LockBasedLazyValue.invoke(LockBasedStorageManager.java:323)
at org.jetbrains.kotlin.resolve.lazy.descriptors.LazyClassDescriptor.forceResolveAllContents(LazyClassDescriptor.java:517)
at org.jetbrains.kotlin.resolve.lazy.ForceResolveUtil.doForceResolveAllContents(ForceResolveUtil.java:75)
at org.jetbrains.kotlin.resolve.lazy.ForceResolveUtil.forceResolveAllContents(ForceResolveUtil.java:41)
at org.jetbrains.kotlin.resolve.jvm.extensions.PartialAnalysisHandlerExtension$doAnalysis$1.invoke(PartialAnalysisHandlerExtension.kt:66)
at org.jetbrains.kotlin.resolve.jvm.extensions.PartialAnalysisHandlerExtension$doAnalysis$1.invoke(PartialAnalysisHandlerExtension.kt:34)
at org.jetbrains.kotlin.resolve.jvm.extensions.PartialAnalysisHandlerExtension.doForEachDeclaration(PartialAnalysisHandlerExtension.kt:119)
at org.jetbrains.kotlin.resolve.jvm.extensions.PartialAnalysisHandlerExtension.doForEachDeclaration(PartialAnalysisHandlerExtension.kt:133)
at org.jetbrains.kotlin.resolve.jvm.extensions.PartialAnalysisHandlerExtension.doAnalysis(PartialAnalysisHandlerExtension.kt:61)
at org.jetbrains.kotlin.kapt3.AbstractKapt3Extension.doAnalysis(Kapt3Extension.kt:145)
at org.jetbrains.kotlin.resolve.jvm.TopDownAnalyzerFacadeForJVM.analyzeFilesWithJavaIntegration(TopDownAnalyzerFacadeForJVM.kt:97)
at org.jetbrains.kotlin.resolve.jvm.TopDownAnalyzerFacadeForJVM.analyzeFilesWithJavaIntegration$default(TopDownAnalyzerFacadeForJVM.kt:76)
at org.jetbrains.kotlin.cli.jvm.compiler.KotlinToJVMBytecodeCompiler$analyze$1.analyze(KotlinToJVMBytecodeCompiler.kt:365)
at org.jetbrains.kotlin.cli.common.messages.AnalyzerWithCompilerReport.analyzeAndReport(AnalyzerWithCompilerReport.kt:105)
at org.jetbrains.kotlin.cli.jvm.compiler.KotlinToJVMBytecodeCompiler.analyze(KotlinToJVMBytecodeCompiler.kt:354)
at org.jetbrains.kotlin.cli.jvm.compiler.KotlinToJVMBytecodeCompiler.compileModules(KotlinToJVMBytecodeCompiler.kt:139)
at org.jetbrains.kotlin.cli.jvm.K2JVMCompiler.doExecute(K2JVMCompiler.kt:167)
at org.jetbrains.kotlin.cli.jvm.K2JVMCompiler.doExecute(K2JVMCompiler.kt:55)
at org.jetbrains.kotlin.cli.common.CLICompiler.exec(CLICompiler.java:182)
at org.jetbrains.kotlin.incremental.IncrementalJvmCompilerRunner.compileChanged(IncrementalJvmCompilerRunner.kt:443)
at org.jetbrains.kotlin.incremental.IncrementalJvmCompilerRunner.compileIncrementally(IncrementalJvmCompilerRunner.kt:301)
at org.jetbrains.kotlin.incremental.IncrementalJvmCompilerRunner.compile(IncrementalJvmCompilerRunner.kt:128)
at org.jetbrains.kotlin.daemon.CompileServiceImpl.execIncrementalCompiler(CompileServiceImpl.kt:452)
at org.jetbrains.kotlin.daemon.CompileServiceImpl.access$execIncrementalCompiler(CompileServiceImpl.kt:99)
at org.jetbrains.kotlin.daemon.CompileServiceImpl$compile$1$$special$$inlined$withIC$lambda$1.invoke(CompileServiceImpl.kt:379)
at org.jetbrains.kotlin.daemon.CompileServiceImpl$compile$1$$special$$inlined$withIC$lambda$1.invoke(CompileServiceImpl.kt:99)
at org.jetbrains.kotlin.daemon.CompileServiceImpl$doCompile$2$$special$$inlined$withValidClientOrSessionProxy$lambda$1.invoke(CompileServiceImpl.kt:798)
at org.jetbrains.kotlin.daemon.CompileServiceImpl$doCompile$2$$special$$inlined$withValidClientOrSessionProxy$lambda$1.invoke(CompileServiceImpl.kt:99)
at org.jetbrains.kotlin.daemon.common.DummyProfiler.withMeasure(PerfUtils.kt:137)
at org.jetbrains.kotlin.daemon.CompileServiceImpl.checkedCompile(CompileServiceImpl.kt:825)
at org.jetbrains.kotlin.daemon.CompileServiceImpl.access$checkedCompile(CompileServiceImpl.kt:99)
at org.jetbrains.kotlin.daemon.CompileServiceImpl$doCompile$2.invoke(CompileServiceImpl.kt:797)
at org.jetbrains.kotlin.daemon.CompileServiceImpl$doCompile$2.invoke(CompileServiceImpl.kt:99)
at org.jetbrains.kotlin.daemon.CompileServiceImpl.ifAlive(CompileServiceImpl.kt:1004)
at org.jetbrains.kotlin.daemon.CompileServiceImpl.ifAlive$default(CompileServiceImpl.kt:865)
at org.jetbrains.kotlin.daemon.CompileServiceImpl.doCompile(CompileServiceImpl.kt:791)
at org.jetbrains.kotlin.daemon.CompileServiceImpl.access$doCompile(CompileServiceImpl.kt:99)
at org.jetbrains.kotlin.daemon.CompileServiceImpl$compile$1.invoke(CompileServiceImpl.kt:378)
at org.jetbrains.kotlin.daemon.CompileServiceImpl$compile$1.invoke(CompileServiceImpl.kt:99)
at org.jetbrains.kotlin.daemon.CompileServiceImpl.ifAlive(CompileServiceImpl.kt:1004)
at org.jetbrains.kotlin.daemon.CompileServiceImpl.ifAlive$default(CompileServiceImpl.kt:865)
at org.jetbrains.kotlin.daemon.CompileServiceImpl.compile(CompileServiceImpl.kt:336)
at sun.reflect.GeneratedMethodAccessor89.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at sun.rmi.server.UnicastServerRef.dispatch(UnicastServerRef.java:346)
at sun.rmi.transport.Transport$1.run(Transport.java:200)
at sun.rmi.transport.Transport$1.run(Transport.java:197)
at java.security.AccessController.doPrivileged(Native Method)
at sun.rmi.transport.Transport.serviceCall(Transport.java:196)
at sun.rmi.transport.tcp.TCPTransport.handleMessages(TCPTransport.java:568)
at sun.rmi.transport.tcp.TCPTransport$ConnectionHandler.run0(TCPTransport.java:826)
at sun.rmi.transport.tcp.TCPTransport$ConnectionHandler.lambda$run$0(TCPTransport.java:683)
at java.security.AccessController.doPrivileged(Native Method)
at sun.rmi.transport.tcp.TCPTransport$ConnectionHandler.run(TCPTransport.java:682)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.OutOfMemoryError: GC overhead limit exceeded
The last line of that log says:
Caused by: java.lang.OutOfMemoryError: GC overhead limit exceeded
If your computer has a reasonable amount of RAM, you should file a bug report with JetBrains.
This question has lots of good information on that error: Error java.lang.OutOfMemoryError: GC overhead limit exceeded
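As a hedged aside: this error during Kotlin compilation can often be worked around by giving the Gradle daemon more heap in gradle.properties; the values below are placeholders to adjust for your machine:
# gradle.properties (illustrative values only)
org.gradle.jvmargs=-Xmx2g -XX:MaxMetaspaceSize=512m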

java.lang.ClassNotFoundException: 'org.h2.jdbcx.JdbcDataSource'

I am trying to use HikariCP with an H2 database. Here is my class responsible for SQL:
@Singleton
@Log
class SqlPersistence {
    def config
    {
        def props = new Properties()
        new File("build/resources/main/db.properties").withInputStream {
            props.load(it)
        }
        log.info "Loaded properties: $props"
        config = new HikariConfig(props)
    }
    @Lazy def ds = new HikariDataSource(config)
    @Lazy def sql = new Sql(ds)
}
But when I try to run it using a Gradle task I get this error:
Exception in thread "main" java.lang.RuntimeException: java.lang.ClassNotFoundException: 'org.h2.jdbcx.JdbcDataSource'
at com.zaxxer.hikari.util.UtilityElf.createInstance(UtilityElf.java:89)
at com.zaxxer.hikari.pool.PoolBase.initializeDataSource(PoolBase.java:293)
at com.zaxxer.hikari.pool.PoolBase.<init>(PoolBase.java:90)
at com.zaxxer.hikari.pool.HikariPool.<init>(HikariPool.java:101)
at com.zaxxer.hikari.HikariDataSource.<init>(HikariDataSource.java:71)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at org.codehaus.groovy.reflection.CachedConstructor.invoke(CachedConstructor.java:77)
at org.codehaus.groovy.runtime.callsite.ConstructorSite$ConstructorSiteNoUnwrapNoCoerce.callConstructor(ConstructorSite.java:102)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCallConstructor(CallSiteArray.java:57)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callConstructor(AbstractCallSite.java:232)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callConstructor(AbstractCallSite.java:244)
at repository.persistence.SqlPersistence.getDs(SqlPersistence.groovy:27)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.codehaus.groovy.reflection.CachedMethod.invoke(CachedMethod.java:90)
at groovy.lang.MetaMethod.doMethodInvoke(MetaMethod.java:324)
at org.codehaus.groovy.runtime.metaclass.MethodMetaProperty$GetBeanMethodMetaProperty.getProperty(MethodMetaProperty.java:73)
at org.codehaus.groovy.runtime.callsite.GetEffectivePogoPropertySite.getProperty(GetEffectivePogoPropertySite.java:82)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callGroovyObjectGetProperty(AbstractCallSite.java:304)
at repository.persistence.SqlPersistence.getSql(SqlPersistence.groovy:28)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.codehaus.groovy.reflection.CachedMethod.invoke(CachedMethod.java:90)
at groovy.lang.MetaMethod.doMethodInvoke(MetaMethod.java:324)
at org.codehaus.groovy.runtime.metaclass.MethodMetaProperty$GetBeanMethodMetaProperty.getProperty(MethodMetaProperty.java:73)
at org.codehaus.groovy.runtime.callsite.GetEffectivePogoPropertySite.getProperty(GetEffectivePogoPropertySite.java:82)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callGroovyObjectGetProperty(AbstractCallSite.java:304)
at repository.persistence.SqlPersistence.selectAll(SqlPersistence.groovy:42)
at repository.persistence.SqlPersistence$selectAll.call(Unknown Source)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:45)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:110)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:122)
at repository.BookRepository.findAll(BookRepository.groovy:11)
at repository.BookRepository$findAll.call(Unknown Source)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:45)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:110)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:114)
at service.BookService.findAll(BookService.groovy:13)
at service.BookService$findAll.call(Unknown Source)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:45)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:110)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:114)
at script.run(script.groovy:7)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.codehaus.groovy.reflection.CachedMethod.invoke(CachedMethod.java:90)
at groovy.lang.MetaMethod.doMethodInvoke(MetaMethod.java:324)
at groovy.lang.MetaClassImpl.invokeMethod(MetaClassImpl.java:1207)
at groovy.lang.MetaClassImpl.invokeMethod(MetaClassImpl.java:1016)
at org.codehaus.groovy.runtime.InvokerHelper.invokePogoMethod(InvokerHelper.java:914)
at org.codehaus.groovy.runtime.InvokerHelper.invokeMethod(InvokerHelper.java:897)
at org.codehaus.groovy.runtime.InvokerHelper.runScript(InvokerHelper.java:407)
at org.codehaus.groovy.runtime.InvokerHelper$runScript.call(Unknown Source)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:45)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:110)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:130)
at script.main(script.groovy)
Caused by: java.lang.ClassNotFoundException: 'org.h2.jdbcx.JdbcDataSource'
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
at com.zaxxer.hikari.util.UtilityElf.createInstance(UtilityElf.java:76)
... 65 more
My build.gradle looks like this
apply plugin: 'groovy'

repositories {
    jcenter()
}

dependencies {
    compile 'org.codehaus.groovy:groovy-all:2.4.3'
    compile 'com.h2database:h2:1.4.191'
    compile 'com.zaxxer:HikariCP:2.4.5'
    testCompile 'org.spockframework:spock-core:1.0-groovy-2.4'
    testCompile 'junit:junit:4.12'
    testCompile 'cglib:cglib:3.2.1'
}

task runScript(dependsOn: 'compileJava', type: JavaExec) {
    main = 'script'
    classpath = sourceSets.main.runtimeClasspath + sourceSets.main.output
}
Why is it unable to find the driver?
P.S. There is always some issue with using SQL drivers...
It looks like your script is Java, but the snippet you posted is Groovy. So I think this is either an issue with how you call the Groovy part, so that the H2 lib is not on the Groovy classpath (if I translate your code to Java and put it in script.java, it works fine with your build.gradle), or a Groovy class-loading problem. When I tried to access a database from a Gradle script (which essentially is also a Groovy script), I had to do the following to get it working correctly:
// load JDBC drivers to the Groovy class loader to overcome java.sql.DriverManager quirks
Sql.classLoader.addURL new File('driver.jar').toURI().toURL()
Sql.withInstance('my:connection:string', 'user', 'pass', 'my.jdbc.DriverClass') {
    it.execute 'select * from foo'
}
Maybe you need something similar.

sparkContext broadcast JedisPool does not work

I use sparkContext.broadcast in my Spark Streaming application to share a Redis connection pool (JedisPool).
The code looks like this:
lazy val redisPool = {
  val pool = Redis.createRedisPool(redisHost, redisPort)
  ssc.sparkContext.broadcast(pool)
}
Redis.createRedisPool is:
object Redis {
  def createRedisPool(host: String, port: Int, maxIdle: Int, maxTotal: Int, timeout: Int): JedisPool = {
    val pc = new JedisPoolConfig
    pc.setMaxIdle(maxIdle)
    pc.setMaxTotal(maxTotal)
    pc.setMaxWaitMillis(timeout)
    new JedisPool(pc, host, port)
  }

  def createRedisPool(host: String, port: Int): JedisPool = {
    createRedisPool(host, port, 5, 5, 5000)
  }
}
It works in local deploy mode, but when I run it in yarn/standalone mode like
spark-submit --master "yarn-client" --class ...
I get an error:
Exception in thread "main" java.io.NotSerializableException: redis.clients.jedis.JedisPool
at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1165)
at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:329)
at org.apache.spark.serializer.JavaSerializationStream.writeObject(JavaSerializer.scala:42)
at org.apache.spark.broadcast.TorrentBroadcast$.blockifyObject(TorrentBroadcast.scala:210)
at org.apache.spark.broadcast.TorrentBroadcast.writeBlocks(TorrentBroadcast.scala:83)
at org.apache.spark.broadcast.TorrentBroadcast.<init>(TorrentBroadcast.scala:68)
at org.apache.spark.broadcast.TorrentBroadcastFactory.newBroadcast(TorrentBroadcastFactory.scala:36)
at org.apache.spark.broadcast.TorrentBroadcastFactory.newBroadcast(TorrentBroadcastFactory.scala:29)
at org.apache.spark.broadcast.BroadcastManager.newBroadcast(BroadcastManager.scala:62)
at org.apache.spark.SparkContext.broadcast(SparkContext.scala:809)
at org.culiu.bd.streaming.AdSysStreaming$.redisPool$lzycompute$1(AdSysStreaming.scala:84)
at org.culiu.bd.streaming.AdSysStreaming$.redisPool$1(AdSysStreaming.scala:82)
at org.culiu.bd.streaming.AdSysStreaming$.main(AdSysStreaming.scala:154)
at org.culiu.bd.streaming.AdSysStreaming.main(AdSysStreaming.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:328)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
I have tried setting spark.serializer = org.apache.spark.serializer.KryoSerializer in my application, and then got an error like:
Exception in thread "main" com.esotericsoftware.kryo.KryoException: java.util.ConcurrentModificationException
Serialization trace:
classes (sun.misc.Launcher$AppClassLoader)
classloader (java.security.ProtectionDomain)
context (java.security.AccessControlContext)
acc (org.apache.spark.executor.ExecutorURLClassLoader)
factoryClassLoader (org.apache.commons.pool2.impl.GenericObjectPool)
internalPool (redis.clients.jedis.JedisPool)
at com.esotericsoftware.kryo.serializers.FieldSerializer$ObjectField.write(FieldSerializer.java:585)
at com.esotericsoftware.kryo.serializers.FieldSerializer.write(FieldSerializer.java:213)
at com.esotericsoftware.kryo.Kryo.writeObject(Kryo.java:501)
at com.esotericsoftware.kryo.serializers.FieldSerializer$ObjectField.write(FieldSerializer.java:564)
at com.esotericsoftware.kryo.serializers.FieldSerializer.write(FieldSerializer.java:213)
at com.esotericsoftware.kryo.Kryo.writeClassAndObject(Kryo.java:568)
at com.esotericsoftware.kryo.serializers.DefaultArraySerializers$ObjectArraySerializer.write(DefaultArraySerializers.java:318)
at com.esotericsoftware.kryo.serializers.DefaultArraySerializers$ObjectArraySerializer.write(DefaultArraySerializers.java:293)
at com.esotericsoftware.kryo.Kryo.writeObject(Kryo.java:501)
at com.esotericsoftware.kryo.serializers.FieldSerializer$ObjectField.write(FieldSerializer.java:564)
at com.esotericsoftware.kryo.serializers.FieldSerializer.write(FieldSerializer.java:213)
at com.esotericsoftware.kryo.Kryo.writeObjectOrNull(Kryo.java:549)
at com.esotericsoftware.kryo.serializers.FieldSerializer$ObjectField.write(FieldSerializer.java:570)
at com.esotericsoftware.kryo.serializers.FieldSerializer.write(FieldSerializer.java:213)
at com.esotericsoftware.kryo.Kryo.writeObject(Kryo.java:501)
at com.esotericsoftware.kryo.serializers.FieldSerializer$ObjectField.write(FieldSerializer.java:564)
at com.esotericsoftware.kryo.serializers.FieldSerializer.write(FieldSerializer.java:213)
at com.esotericsoftware.kryo.Kryo.writeObject(Kryo.java:501)
at com.esotericsoftware.kryo.serializers.FieldSerializer$ObjectField.write(FieldSerializer.java:564)
at com.esotericsoftware.kryo.serializers.FieldSerializer.write(FieldSerializer.java:213)
at com.esotericsoftware.kryo.Kryo.writeClassAndObject(Kryo.java:568)
at org.apache.spark.serializer.KryoSerializationStream.writeObject(KryoSerializer.scala:119)
at org.apache.spark.broadcast.TorrentBroadcast$.blockifyObject(TorrentBroadcast.scala:210)
at org.apache.spark.broadcast.TorrentBroadcast.writeBlocks(TorrentBroadcast.scala:83)
at org.apache.spark.broadcast.TorrentBroadcast.<init>(TorrentBroadcast.scala:68)
at org.apache.spark.broadcast.TorrentBroadcastFactory.newBroadcast(TorrentBroadcastFactory.scala:36)
at org.apache.spark.broadcast.TorrentBroadcastFactory.newBroadcast(TorrentBroadcastFactory.scala:29)
at org.apache.spark.broadcast.BroadcastManager.newBroadcast(BroadcastManager.scala:62)
at org.apache.spark.SparkContext.broadcast(SparkContext.scala:809)
at org.culiu.bd.streaming.AdSysStreaming$.redisPool$lzycompute$1(AdSysStreaming.scala:85)
at org.culiu.bd.streaming.AdSysStreaming$.redisPool$1(AdSysStreaming.scala:83)
at org.culiu.bd.streaming.AdSysStreaming$.main(AdSysStreaming.scala:155)
at org.culiu.bd.streaming.AdSysStreaming.main(AdSysStreaming.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:328)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.util.ConcurrentModificationException
at java.util.AbstractList$Itr.checkForComodification(AbstractList.java:372)
at java.util.AbstractList$Itr.next(AbstractList.java:343)
at com.esotericsoftware.kryo.serializers.CollectionSerializer.write(CollectionSerializer.java:74)
at com.esotericsoftware.kryo.serializers.CollectionSerializer.write(CollectionSerializer.java:18)
at com.esotericsoftware.kryo.Kryo.writeObject(Kryo.java:501)
at com.esotericsoftware.kryo.serializers.FieldSerializer$ObjectField.write(FieldSerializer.java:564)
... 39 more
How can I solve this?
It looks like the problem here is that the redis.clients.jedis.JedisPool class is not serializable. This doesn't seem like a Spark-specific issue, since I think that any attempt to serialize that class would fail.
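For illustration only (this goes beyond what the answer above states, and the helper name is made up): since the pool itself cannot be serialized, a common pattern is not to broadcast it at all, but to ship only the plain host/port values and build one pool lazily per executor JVM, for example inside a singleton object:
import redis.clients.jedis.{JedisPool, JedisPoolConfig}

// Hypothetical helper: one JedisPool per executor JVM; only the serializable
// host/port values travel with the closure, never the pool itself.
object ExecutorRedis {
  private var pool: JedisPool = _

  def getPool(host: String, port: Int): JedisPool = synchronized {
    if (pool == null) {
      val pc = new JedisPoolConfig
      pc.setMaxIdle(5)
      pc.setMaxTotal(5)
      pool = new JedisPool(pc, host, port)
    }
    pool
  }
}

// Sketch of use inside the streaming job (redisHost/redisPort captured from the driver):
// stream.foreachRDD { rdd =>
//   rdd.foreachPartition { records =>
//     val jedis = ExecutorRedis.getPool(redisHost, redisPort).getResource
//     try records.foreach(r => jedis.set(r.toString, "1"))
//     finally jedis.close()
//   }
// }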