Spark submit failing with Hive

I am trying to get a Spark 1.1.0 program written in Scala to work, but I'm having a hard time with it. I have a Hive query that is very simple:
select json, score from data
When I run the following command from spark-shell, everything works (I need MYSQL_CONN on the driver class path because I'm using Hive with a MySQL metastore):
bin/spark-shell --master $SPARK_URL --driver-class-path $MYSQL_CONN
import org.apache.spark.sql.hive.HiveContext
val sqlContext = new HiveContext(sc)
sqlContext.sql("select json from data").map(t => t.getString(0)).take(10).foreach(println)
I get ten lines of JSON, just like I want. However, when I run this with spark-submit as follows, I get a problem:
bin/spark-submit --master $SPARK_URL --class spark.Main --driver-class-path $MYSQL_CONN target/spark-testing-1.0-SNAPSHOT.jar
Here is my whole Spark program:
package spark

import org.apache.spark.sql.hive.HiveContext
import org.apache.spark.{SparkContext, SparkConf}

object Main {
  def main(args: Array[String]) {
    val sc = new SparkContext(new SparkConf().setAppName("Gathering Data"))
    val sqlContext = new HiveContext(sc)
    sqlContext.sql("select json from data").map(t => t.getString(0)).take(10).foreach(println)
  }
}
And here is the resulting stack trace:
14/12/01 21:30:04 WARN TaskSetManager: Lost task 0.0 in stage 0.0 (TID 0, match1hd17.dc1): java.lang.ClassNotFoundException: spark.Main$$anonfun$main$1
java.net.URLClassLoader$1.run(URLClassLoader.java:200)
java.security.AccessController.doPrivileged(Native Method)
java.net.URLClassLoader.findClass(URLClassLoader.java:188)
java.lang.ClassLoader.loadClass(ClassLoader.java:307)
java.lang.ClassLoader.loadClass(ClassLoader.java:252)
java.lang.ClassLoader.loadClassInternal(ClassLoader.java:320)
java.lang.Class.forName0(Native Method)
java.lang.Class.forName(Class.java:247)
org.apache.spark.serializer.JavaDeserializationStream$$anon$1.resolveClass(JavaSerializer.scala:59)
java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1575)
java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1496)
java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1732)
java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1329)
java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1947)
java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1871)
java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1753)
java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1329)
java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1947)
java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1871)
java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1753)
java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1329)
java.io.ObjectInputStream.readObject(ObjectInputStream.java:351)
org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:62)
org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:87)
org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:57)
org.apache.spark.scheduler.Task.run(Task.scala:54)
org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:177)
java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
java.lang.Thread.run(Thread.java:619)
14/12/01 21:30:10 ERROR TaskSetManager: Task 0 in stage 0.0 failed 4 times; aborting job
Exception in thread "main" org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 0.0 failed 4 times, most recent failure: Lost task 0.3 in stage 0.0 (TID 3, match1hd12.dc1m): java.lang.ClassNotFoundException: spark.Main$$anonfun$main$1
java.net.URLClassLoader$1.run(URLClassLoader.java:200)
java.security.AccessController.doPrivileged(Native Method)
java.net.URLClassLoader.findClass(URLClassLoader.java:188)
java.lang.ClassLoader.loadClass(ClassLoader.java:307)
java.lang.ClassLoader.loadClass(ClassLoader.java:252)
java.lang.ClassLoader.loadClassInternal(ClassLoader.java:320)
java.lang.Class.forName0(Native Method)
java.lang.Class.forName(Class.java:247)
org.apache.spark.serializer.JavaDeserializationStream$$anon$1.resolveClass(JavaSerializer.scala:59)
java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1575)
java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1496)
java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1732)
java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1329)
java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1947)
java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1871)
java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1753)
java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1329)
java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1947)
java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1871)
java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1753)
java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1329)
java.io.ObjectInputStream.readObject(ObjectInputStream.java:351)
org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:62)
org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:87)
org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:57)
org.apache.spark.scheduler.Task.run(Task.scala:54)
org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:177)
java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
java.lang.Thread.run(Thread.java:619)
Driver stacktrace:
at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1185)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1174)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1173)
at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1173)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:688)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:688)
at scala.Option.foreach(Option.scala:236)
at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:688)
at org.apache.spark.scheduler.DAGSchedulerEventProcessActor$$anonfun$receive$2.applyOrElse(DAGScheduler.scala:1391)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:498)
at akka.actor.ActorCell.invoke(ActorCell.scala:456)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:237)
at akka.dispatch.Mailbox.run(Mailbox.scala:219)
at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:386)
at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
I have spent hours on this already, and I have no idea why it only works from spark-shell. I looked at the stderr output on the individual nodes, and they show the same cryptic error message. If anyone can shed some light on why this works with spark-shell but not spark-submit, that would be awesome.
Thanks
UPDATE:
I've been playing around and the following program works fine.
package spark

import org.apache.spark.sql.hive.HiveContext
import org.apache.spark.{SparkContext, SparkConf}

object Main {
  def main(args: Array[String]) {
    val sc = new SparkContext(new SparkConf().setAppName("Gathering Data"))
    val sqlContext = new HiveContext(sc)
    sqlContext.sql("select json from data").take(10).map(t => t.getString(0)).foreach(println)
  }
}
Obviously this won't work for a large amount of data, but it shows that the problem appears to be in the SchemaRDD.map() function: take(10) pulls the rows to the driver first, so the subsequent map runs locally and no closure class ever has to be shipped to an executor.
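If that is the diagnosis, one way to make spark.Main$$anonfun$main$1 loadable on the executors is to ship the application jar with the job explicitly. A minimal sketch, assuming the jar path from the spark-submit command above (passing the jar via spark-submit's --jars flag should have the same effect):

import org.apache.spark.sql.hive.HiveContext
import org.apache.spark.{SparkContext, SparkConf}

object Main {
  def main(args: Array[String]) {
    val conf = new SparkConf()
      .setAppName("Gathering Data")
      // Ship the application jar so executors can load classes such as
      // spark.Main$$anonfun$main$1 when deserializing tasks.
      .setJars(Seq("target/spark-testing-1.0-SNAPSHOT.jar"))
    val sc = new SparkContext(conf)
    val sqlContext = new HiveContext(sc)
    sqlContext.sql("select json from data").map(t => t.getString(0)).take(10).foreach(println)
  }
}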

It seems there is a problem with the Spark context initialization.
Please try the code below:
val sparkConf = new SparkConf().setAppName("Gathering Data")
val sc = new SparkContext(sparkConf)

Related

Read Remote S3 File Using Databricks Connect

I am trying to read a file in an S3 bucket using Spark through Databricks Connect.
This is the code that I am using:
from pyspark import SparkConf
from pyspark.sql import SparkSession
conf = SparkConf()
conf.set('spark.jars.packages', 'org.apache.hadoop:hadoop-aws:3.3.0')
conf.set('spark.hadoop.fs.s3a.access.key', access_key)
conf.set('spark.hadoop.fs.s3a.secret.key', secret_access_key)
spark = SparkSession.builder.config(conf=conf).getOrCreate()
df = spark.read.format("csv").option("header",True).load('s3a://container/path/to/file.csv')
df.show()
This works completely fine when I execute it using a Docker container that I spin up; however, it fails with Databricks Connect with the following error:
pyspark.dbutils.ExecutionError: An error occurred while calling o48.ls.
: com.databricks.service.SparkServiceRemoteException: java.nio.file.AccessDeniedException: getFileStatus on com.amazonaws.services.s3.model.AmazonS3Exception: Forbidden; request: HEAD Forbidden
at shaded.databricks.org.apache.hadoop.fs.s3a.S3AUtils.translateException(S3AUtils.java:244)
at shaded.databricks.org.apache.hadoop.fs.s3a.S3AUtils.translateException(S3AUtils.java:155)
at shaded.databricks.org.apache.hadoop.fs.s3a.S3AFileSystem.s3GetFileStatus(S3AFileSystem.java:2870)
at shaded.databricks.org.apache.hadoop.fs.s3a.S3AFileSystem.innerGetFileStatus(S3AFileSystem.java:2840)
at shaded.databricks.org.apache.hadoop.fs.s3a.S3AFileSystem.getFileStatus(S3AFileSystem.java:2779)
at shaded.databricks.org.apache.hadoop.fs.s3a.S3AFileSystem.innerListStatus(S3AFileSystem.java:2449)
at shaded.databricks.org.apache.hadoop.fs.s3a.S3AFileSystem.lambda$listStatus$11(S3AFileSystem.java:2428)
at shaded.databricks.org.apache.hadoop.fs.s3a.Invoker.once(Invoker.java:118)
at shaded.databricks.org.apache.hadoop.fs.s3a.Invoker.once(Invoker.java:112)
at shaded.databricks.org.apache.hadoop.fs.s3a.S3AFileSystem.listStatus(S3AFileSystem.java:2428)
at com.databricks.service.SparkServiceImpl$.$anonfun$fileSystemOperation0$2(SparkServiceImpl.scala:617)
at com.databricks.service.SparkServiceImpl$.withFileSystemExceptionHandler(SparkServiceImpl.scala:647)
at com.databricks.service.SparkServiceImpl$.fileSystemOperation0(SparkServiceImpl.scala:617)
at com.databricks.service.SparkServiceImpl$.$anonfun$fileSystemOperation$1(SparkServiceImpl.scala:184)
at com.databricks.logging.UsageLogging.$anonfun$recordOperation$4(UsageLogging.scala:431)
at com.databricks.logging.UsageLogging.$anonfun$withAttributionContext$1(UsageLogging.scala:239)
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)
at com.databricks.logging.UsageLogging.withAttributionContext(UsageLogging.scala:234)
at com.databricks.logging.UsageLogging.withAttributionContext$(UsageLogging.scala:231)
at com.databricks.spark.util.PublicDBLogging.withAttributionContext(DatabricksSparkUsageLogger.scala:19)
at com.databricks.logging.UsageLogging.withAttributionTags(UsageLogging.scala:276)
at com.databricks.logging.UsageLogging.withAttributionTags$(UsageLogging.scala:269)
at com.databricks.spark.util.PublicDBLogging.withAttributionTags(DatabricksSparkUsageLogger.scala:19)
at com.databricks.logging.UsageLogging.recordOperation(UsageLogging.scala:412)
at com.databricks.logging.UsageLogging.recordOperation$(UsageLogging.scala:338)
at com.databricks.spark.util.PublicDBLogging.recordOperation(DatabricksSparkUsageLogger.scala:19)
at com.databricks.spark.util.PublicDBLogging.recordOperation0(DatabricksSparkUsageLogger.scala:56)
at com.databricks.spark.util.DatabricksSparkUsageLogger.recordOperation(DatabricksSparkUsageLogger.scala:131)
at com.databricks.spark.util.UsageLogger.recordOperation(UsageLogger.scala:71)
at com.databricks.spark.util.UsageLogger.recordOperation$(UsageLogger.scala:58)
at com.databricks.spark.util.DatabricksSparkUsageLogger.recordOperation(DatabricksSparkUsageLogger.scala:85)
at com.databricks.spark.util.UsageLogging.recordOperation(UsageLogger.scala:401)
at com.databricks.spark.util.UsageLogging.recordOperation$(UsageLogger.scala:380)
at com.databricks.service.SparkServiceImpl$.recordOperation(SparkServiceImpl.scala:92)
at com.databricks.service.SparkServiceImpl$.fileSystemOperation(SparkServiceImpl.scala:184)
at com.databricks.service.SparkServiceRPCHandler.execute0(SparkServiceRPCHandler.scala:663)
at com.databricks.service.SparkServiceRPCHandler.$anonfun$executeRPC0$1(SparkServiceRPCHandler.scala:451)
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)
at com.databricks.service.SparkServiceRPCHandler.executeRPC0(SparkServiceRPCHandler.scala:351)
at com.databricks.service.SparkServiceRPCHandler$$anon$2.call(SparkServiceRPCHandler.scala:302)
at com.databricks.service.SparkServiceRPCHandler$$anon$2.call(SparkServiceRPCHandler.scala:288)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at com.databricks.service.SparkServiceRPCHandler.$anonfun$executeRPC$1(SparkServiceRPCHandler.scala:338)
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)
at com.databricks.service.SparkServiceRPCHandler.executeRPC(SparkServiceRPCHandler.scala:315)
at com.databricks.service.SparkServiceRPCServlet.doPost(SparkServiceRPCServer.scala:152)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:707)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:873)
at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:542)
at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:205)
at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:480)
at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:144)
at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:132)
at org.eclipse.jetty.server.Server.handle(Server.java:505)
at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:370)
at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:267)
at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:305)
at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:103)
at org.eclipse.jetty.io.ChannelEndPoint$2.run(ChannelEndPoint.java:117)
at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:333)
at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:310)
at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:168)
at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:126)
at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:366)
at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:698)
at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:804)
at java.lang.Thread.run(Thread.java:750)
Caused by: com.amazonaws.services.s3.model.AmazonS3Exception: Forbidden; request: HEAD
Note: I've scrubbed some information related to the path of my file in the above error trace.

ClassNotFoundException in CacheJdbcStoreExample run on cluster

I have a cluster with 2 nodes, and I tried to run the CacheJdbcStoreExample from apache-ignite-fabric-2.1.0-bin/examples. But I got the following exception:
visor> [06:51:41,113][SEVERE][tcp-disco-msg-worker-#13%null%][TcpDiscoverySpi] Failed to unmarshal discovery custom message.
class org.apache.ignite.IgniteCheckedException: Failed to find class with given class loader for unmarshalling (make sure same versions of all classes are available on all nodes or enable peer-class-loading) [clsLdr=sun.misc.Launcher$AppClassLoader@4aa4ceeb, cls=org.apache.ignite.examples.datagrid.store.jdbc.CacheJdbcStoreExample$1]
at org.apache.ignite.marshaller.jdk.JdkMarshaller.unmarshal0(JdkMarshaller.java:124)
at org.apache.ignite.marshaller.AbstractNodeNameAwareMarshaller.unmarshal(AbstractNodeNameAwareMarshaller.java:94)
at org.apache.ignite.marshaller.jdk.JdkMarshaller.unmarshal0(JdkMarshaller.java:143)
at org.apache.ignite.marshaller.AbstractNodeNameAwareMarshaller.unmarshal(AbstractNodeNameAwareMarshaller.java:82)
at org.apache.ignite.internal.util.IgniteUtils.unmarshal(IgniteUtils.java:9733)
at org.apache.ignite.spi.discovery.tcp.messages.TcpDiscoveryCustomEventMessage.message(TcpDiscoveryCustomEventMessage.java:81)
at org.apache.ignite.spi.discovery.tcp.ServerImpl$RingMessageWorker.notifyDiscoveryListener(ServerImpl.java:5436)
at org.apache.ignite.spi.discovery.tcp.ServerImpl$RingMessageWorker.processCustomMessage(ServerImpl.java:5321)
at org.apache.ignite.spi.discovery.tcp.ServerImpl$RingMessageWorker.processMessage(ServerImpl.java:2629)
at org.apache.ignite.spi.discovery.tcp.ServerImpl$RingMessageWorker.processMessage(ServerImpl.java:2420)
at org.apache.ignite.spi.discovery.tcp.ServerImpl$MessageWorkerAdapter.body(ServerImpl.java:6576)
at org.apache.ignite.spi.discovery.tcp.ServerImpl$RingMessageWorker.body(ServerImpl.java:2506)
at org.apache.ignite.spi.IgniteSpiThread.run(IgniteSpiThread.java:62)
Caused by: java.lang.ClassNotFoundException: org.apache.ignite.examples.datagrid.store.jdbc.CacheJdbcStoreExample$1
at java.net.URLClassLoader$1.run(URLClassLoader.java:359)
at java.net.URLClassLoader$1.run(URLClassLoader.java:348)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:347)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:278)
at org.apache.ignite.internal.util.IgniteUtils.forName(IgniteUtils.java:8465)
at org.apache.ignite.marshaller.jdk.JdkMarshallerObjectInputStream.resolveClass(JdkMarshallerObjectInputStream.java:54)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1817)
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1711)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1982)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1533)
at java.io.ObjectInputStream.readArray(ObjectInputStream.java:1917)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1527)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2227)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2151)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2009)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1533)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2227)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2151)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2009)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1533)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:420)
at java.util.ArrayList.readObject(ArrayList.java:771)
at sun.reflect.GeneratedMethodAccessor16.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1058)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2118)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2009)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1533)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2227)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2151)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2009)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1533)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2227)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2151)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2009)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1533)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:420)
at org.apache.ignite.marshaller.jdk.JdkMarshaller.unmarshal0(JdkMarshaller.java:121)
... 12 more
I decompiled CacheJdbcStoreExample$1.class and got the following code:
CacheJdbcStoreExample$1
package org.apache.ignite.examples.datagrid.store.jdbc;

import javax.cache.configuration.Factory;
import org.apache.ignite.cache.store.CacheStoreSessionListener;
import org.apache.ignite.cache.store.jdbc.CacheJdbcStoreSessionListener;
import org.h2.jdbcx.JdbcConnectionPool;

class CacheJdbcStoreExample$1
    implements Factory<CacheStoreSessionListener>
{
    public CacheStoreSessionListener create()
    {
        CacheJdbcStoreSessionListener lsnr = new CacheJdbcStoreSessionListener();
        lsnr.setDataSource(JdbcConnectionPool.create("jdbc:h2:tcp://localhost/mem:ExampleDb", "sa", ""));
        return lsnr;
    }
}
So I think there is something wrong at line 90 in the CacheJdbcStoreExample source code:
// Configure JDBC session listener.
cacheCfg.setCacheStoreSessionListenerFactories(new Factory<CacheStoreSessionListener>() {
    @Override public CacheStoreSessionListener create() {
        CacheJdbcStoreSessionListener lsnr = new CacheJdbcStoreSessionListener();
        lsnr.setDataSource(JdbcConnectionPool.create("jdbc:h2:tcp://localhost/mem:ExampleDb", "sa", ""));
        return lsnr;
    }
});
If I run the example on only one node, not a cluster, it's OK.
What should I do to fix it?
The problem is that you configured a cache with a factory of CacheStoreSessionListeners, but this factory is not visible from the other nodes, as they don't have it on their classpath.
You should either start the additional remote nodes with the org.apache.ignite.examples.ExampleNodeStartup class from the examples module, or add the examples module to the classpath of the other nodes.
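For reference, ExampleNodeStartup does little more than start a node from the examples configuration; what matters is that the examples classes are on that JVM's classpath. A minimal Scala sketch of such a startup class, assuming the config path shipped with the Ignite distribution:

import org.apache.ignite.Ignition

object ExampleNodeStartupSketch {
  def main(args: Array[String]): Unit = {
    // Start a node in a JVM whose classpath includes the examples module,
    // so classes like CacheJdbcStoreExample$1 can be unmarshalled here.
    Ignition.start("examples/config/example-ignite.xml")
  }
}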

Flink program works from IDE but not from terminal

I have 5 different jobs; some of them have an InputFormat and others use env.fromElements(...).
When I execute the jobs from IntelliJ, all of them work correctly. But when I execute them from the terminal, only the jobs which use env.fromElements() work.
Here is the error message:
Cluster configuration: Standalone cluster with JobManager at localhost/127.0.0.1:6123
Using address localhost:6123 to connect to JobManager.
JobManager web interface address http://localhost:8081
Starting execution of program
Submitting job with JobID: ab41f30df416c654b406c9b13e70f62e. Waiting for job completion.
Connected to JobManager at Actor[akka.tcp://flink@localhost:6123/user/jobmanager#1320020477]
The program finished with the following exception:
org.apache.flink.client.program.ProgramInvocationException: The program execution failed: Cannot initialize task 'CHAIN DataSource (at main(JobSource.java:49) (es.mypackage.flink.Sources.MyInputFormat)) -> Map (Map at main(JobSource.java:53))': Configuring the InputFormat (es.mypackage.flink.Sources.MyInputFormat@1f81aa00) failed: null
at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:427)
at org.apache.flink.client.program.StandaloneClusterClient.submitJob(StandaloneClusterClient.java:101)
at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:400)
at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:387)
at org.apache.flink.client.program.ContextEnvironment.execute(ContextEnvironment.java:62)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:926)
at es.mypackage.flink.Job.JobSource.main(JobSource.java:60)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:528)
at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:419)
at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:339)
at org.apache.flink.client.CliFrontend.executeProgram(CliFrontend.java:831)
at org.apache.flink.client.CliFrontend.run(CliFrontend.java:256)
at org.apache.flink.client.CliFrontend.parseParameters(CliFrontend.java:1073)
at org.apache.flink.client.CliFrontend$2.call(CliFrontend.java:1120)
at org.apache.flink.client.CliFrontend$2.call(CliFrontend.java:1117)
at org.apache.flink.runtime.security.HadoopSecurityContext$1.run(HadoopSecurityContext.java:43)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
at org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:40)
at org.apache.flink.client.CliFrontend.main(CliFrontend.java:1116)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Cannot initialize task 'CHAIN DataSource (at main(JobSource.java:49) (es.mypackage.flink.Sources.MyInputFormat)) -> Map (Map at main(JobSource.java:53))': Configuring the InputFormat (es.mypackage.flink.Sources.MyInputFormat@1f81aa00) failed: null
at org.apache.flink.runtime.executiongraph.ExecutionGraphBuilder.buildGraph(ExecutionGraphBuilder.java:136)
at org.apache.flink.runtime.jobmanager.JobManager.org$apache$flink$runtime$jobmanager$JobManager$$submitJob(JobManager.scala:1274)
at org.apache.flink.runtime.jobmanager.JobManager$$anonfun$handleMessage$1.applyOrElse(JobManager.scala:477)
at scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:36)
at org.apache.flink.runtime.LeaderSessionMessageFilter$$anonfun$receive$1.applyOrElse(LeaderSessionMessageFilter.scala:36)
at scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:36)
at org.apache.flink.runtime.LogMessages$$anon$1.apply(LogMessages.scala:33)
at org.apache.flink.runtime.LogMessages$$anon$1.apply(LogMessages.scala:28)
at scala.PartialFunction$class.applyOrElse(PartialFunction.scala:123)
at org.apache.flink.runtime.LogMessages$$anon$1.applyOrElse(LogMessages.scala:28)
at akka.actor.Actor$class.aroundReceive(Actor.scala:467)
at org.apache.flink.runtime.jobmanager.JobManager.aroundReceive(JobManager.scala:118)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:516)
at akka.actor.ActorCell.invoke(ActorCell.scala:487)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:238)
at akka.dispatch.Mailbox.run(Mailbox.scala:220)
at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:397)
at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: java.lang.Exception: Configuring the InputFormat (es.mypackage.flink.Sources.MyInputFormat@1f81aa00) failed: null
at org.apache.flink.runtime.jobgraph.InputFormatVertex.initializeOnMaster(InputFormatVertex.java:89)
at org.apache.flink.runtime.executiongraph.ExecutionGraphBuilder.buildGraph(ExecutionGraphBuilder.java:133)
... 20 more
Any help is much appreciated.
Edit: here is the configure() method of MyInputFormat:
public void configure(Configuration parameters) {
    /* This line causes the problem */
    this.client = new KuduClient.KuduClientBuilder(KUDU_MASTER).build();
    table = createTable(TABLE_NAME);
    if (table != null) {
        scanner = client.newScannerBuilder(table)
                .setProjectedColumnNames(projectColumns)
                .build();
    }
}
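The thread records no accepted fix, but the stack trace shows that configure() is also invoked on the JobManager while the execution graph is built (InputFormatVertex.initializeOnMaster), not only on the workers. One commonly suggested pattern, sketched here as an assumption rather than a confirmed fix, is to keep configure() side-effect free and create the client in open(), which runs on the TaskManagers. The names below (kuduMaster, the KuduClient import, the placeholder record logic) mirror the question and are assumptions:

import org.apache.flink.api.common.io.GenericInputFormat
import org.apache.flink.configuration.Configuration
import org.apache.flink.core.io.GenericInputSplit
import org.apache.kudu.client.KuduClient // assumed Kudu client artifact

class MyInputFormat(kuduMaster: String) extends GenericInputFormat[String] {
  @transient private var client: KuduClient = _

  override def configure(parameters: Configuration): Unit = {
    // Intentionally empty: this also runs on the JobManager, so avoid
    // opening connections or doing other heavyweight work here.
  }

  override def open(split: GenericInputSplit): Unit = {
    super.open(split)
    // Runs on the TaskManagers, where the client is actually used.
    client = new KuduClient.KuduClientBuilder(kuduMaster).build()
  }

  // Placeholders standing in for the question's scanner-driven logic.
  override def reachedEnd(): Boolean = true
  override def nextRecord(reuse: String): String = reuse
}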

sparkContext broadcast JedisPool not work

I was using sparkContext.broadcast in my Spark Streaming application to share a Redis connection pool (JedisPool).
The code looks like this:
lazy val redisPool = {
  val pool = Redis.createRedisPool(redisHost, redisPort)
  ssc.sparkContext.broadcast(pool)
}
Redis.createRedisPool is:
object Redis {
  def createRedisPool(host: String, port: Int, maxIdle: Int, maxTotal: Int, timeout: Int): JedisPool = {
    val pc = new JedisPoolConfig
    pc.setMaxIdle(maxIdle)
    pc.setMaxTotal(maxTotal)
    pc.setMaxWaitMillis(timeout)
    new JedisPool(pc, host, port)
  }

  def createRedisPool(host: String, port: Int): JedisPool = {
    createRedisPool(host, port, 5, 5, 5000)
  }
}
It works in local deploy mode, but when I run it in yarn/standalone mode like
spark-submit --master "yarn-client" --class ...
I get an error:
Exception in thread "main" java.io.NotSerializableException: redis.clients.jedis.JedisPool
at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1165)
at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:329)
at org.apache.spark.serializer.JavaSerializationStream.writeObject(JavaSerializer.scala:42)
at org.apache.spark.broadcast.TorrentBroadcast$.blockifyObject(TorrentBroadcast.scala:210)
at org.apache.spark.broadcast.TorrentBroadcast.writeBlocks(TorrentBroadcast.scala:83)
at org.apache.spark.broadcast.TorrentBroadcast.<init>(TorrentBroadcast.scala:68)
at org.apache.spark.broadcast.TorrentBroadcastFactory.newBroadcast(TorrentBroadcastFactory.scala:36)
at org.apache.spark.broadcast.TorrentBroadcastFactory.newBroadcast(TorrentBroadcastFactory.scala:29)
at org.apache.spark.broadcast.BroadcastManager.newBroadcast(BroadcastManager.scala:62)
at org.apache.spark.SparkContext.broadcast(SparkContext.scala:809)
at org.culiu.bd.streaming.AdSysStreaming$.redisPool$lzycompute$1(AdSysStreaming.scala:84)
at org.culiu.bd.streaming.AdSysStreaming$.redisPool$1(AdSysStreaming.scala:82)
at org.culiu.bd.streaming.AdSysStreaming$.main(AdSysStreaming.scala:154)
at org.culiu.bd.streaming.AdSysStreaming.main(AdSysStreaming.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:328)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
I have tried setting spark.serializer = org.apache.spark.serializer.KryoSerializer in my application, and then got an error like:
Exception in thread "main" com.esotericsoftware.kryo.KryoException: java.util.ConcurrentModificationException
Serialization trace:
classes (sun.misc.Launcher$AppClassLoader)
classloader (java.security.ProtectionDomain)
context (java.security.AccessControlContext)
acc (org.apache.spark.executor.ExecutorURLClassLoader)
factoryClassLoader (org.apache.commons.pool2.impl.GenericObjectPool)
internalPool (redis.clients.jedis.JedisPool)
at com.esotericsoftware.kryo.serializers.FieldSerializer$ObjectField.write(FieldSerializer.java:585)
at com.esotericsoftware.kryo.serializers.FieldSerializer.write(FieldSerializer.java:213)
at com.esotericsoftware.kryo.Kryo.writeObject(Kryo.java:501)
at com.esotericsoftware.kryo.serializers.FieldSerializer$ObjectField.write(FieldSerializer.java:564)
at com.esotericsoftware.kryo.serializers.FieldSerializer.write(FieldSerializer.java:213)
at com.esotericsoftware.kryo.Kryo.writeClassAndObject(Kryo.java:568)
at com.esotericsoftware.kryo.serializers.DefaultArraySerializers$ObjectArraySerializer.write(DefaultArraySerializers.java:318)
at com.esotericsoftware.kryo.serializers.DefaultArraySerializers$ObjectArraySerializer.write(DefaultArraySerializers.java:293)
at com.esotericsoftware.kryo.Kryo.writeObject(Kryo.java:501)
at com.esotericsoftware.kryo.serializers.FieldSerializer$ObjectField.write(FieldSerializer.java:564)
at com.esotericsoftware.kryo.serializers.FieldSerializer.write(FieldSerializer.java:213)
at com.esotericsoftware.kryo.Kryo.writeObjectOrNull(Kryo.java:549)
at com.esotericsoftware.kryo.serializers.FieldSerializer$ObjectField.write(FieldSerializer.java:570)
at com.esotericsoftware.kryo.serializers.FieldSerializer.write(FieldSerializer.java:213)
at com.esotericsoftware.kryo.Kryo.writeObject(Kryo.java:501)
at com.esotericsoftware.kryo.serializers.FieldSerializer$ObjectField.write(FieldSerializer.java:564)
at com.esotericsoftware.kryo.serializers.FieldSerializer.write(FieldSerializer.java:213)
at com.esotericsoftware.kryo.Kryo.writeObject(Kryo.java:501)
at com.esotericsoftware.kryo.serializers.FieldSerializer$ObjectField.write(FieldSerializer.java:564)
at com.esotericsoftware.kryo.serializers.FieldSerializer.write(FieldSerializer.java:213)
at com.esotericsoftware.kryo.Kryo.writeClassAndObject(Kryo.java:568)
at org.apache.spark.serializer.KryoSerializationStream.writeObject(KryoSerializer.scala:119)
at org.apache.spark.broadcast.TorrentBroadcast$.blockifyObject(TorrentBroadcast.scala:210)
at org.apache.spark.broadcast.TorrentBroadcast.writeBlocks(TorrentBroadcast.scala:83)
at org.apache.spark.broadcast.TorrentBroadcast.<init>(TorrentBroadcast.scala:68)
at org.apache.spark.broadcast.TorrentBroadcastFactory.newBroadcast(TorrentBroadcastFactory.scala:36)
at org.apache.spark.broadcast.TorrentBroadcastFactory.newBroadcast(TorrentBroadcastFactory.scala:29)
at org.apache.spark.broadcast.BroadcastManager.newBroadcast(BroadcastManager.scala:62)
at org.apache.spark.SparkContext.broadcast(SparkContext.scala:809)
at org.culiu.bd.streaming.AdSysStreaming$.redisPool$lzycompute$1(AdSysStreaming.scala:85)
at org.culiu.bd.streaming.AdSysStreaming$.redisPool$1(AdSysStreaming.scala:83)
at org.culiu.bd.streaming.AdSysStreaming$.main(AdSysStreaming.scala:155)
at org.culiu.bd.streaming.AdSysStreaming.main(AdSysStreaming.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:328)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.util.ConcurrentModificationException
at java.util.AbstractList$Itr.checkForComodification(AbstractList.java:372)
at java.util.AbstractList$Itr.next(AbstractList.java:343)
at com.esotericsoftware.kryo.serializers.CollectionSerializer.write(CollectionSerializer.java:74)
at com.esotericsoftware.kryo.serializers.CollectionSerializer.write(CollectionSerializer.java:18)
at com.esotericsoftware.kryo.Kryo.writeObject(Kryo.java:501)
at com.esotericsoftware.kryo.serializers.FieldSerializer$ObjectField.write(FieldSerializer.java:564)
... 39 more
How can I solve this?
It looks like the problem here is that the redis.clients.jedis.JedisPool class is not serializable. This doesn't seem like a Spark-specific issue, since I think that any attempt to serialize that class would fail.
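Since the pool cannot be serialized, a common workaround is to not broadcast it at all and instead create it lazily once per executor JVM. A minimal sketch reusing the question's Redis helper; stream, redisHost and redisPort are placeholders for the application's own values, with the host and port being plain serializable values captured by the closure:

import redis.clients.jedis.JedisPool

object RedisConnection {
  // One pool per executor JVM: singleton objects are instantiated locally on
  // each executor, so the pool itself is never serialized or shipped.
  private var pool: JedisPool = _

  def getPool(host: String, port: Int): JedisPool = synchronized {
    if (pool == null) pool = Redis.createRedisPool(host, port)
    pool
  }
}

// Usage inside the streaming job: obtain the pool on the executor side.
stream.foreachRDD { rdd =>
  rdd.foreachPartition { records =>
    val jedis = RedisConnection.getPool(redisHost, redisPort).getResource
    try {
      records.foreach { record =>
        // ... write record to Redis ...
      }
    } finally {
      jedis.close() // on older Jedis versions, use pool.returnResource(jedis)
    }
  }
}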

ClassNotFoundException on Marshal.load

I'm trying to do a marshal dump and load in Scala...
import scala.util.Marshal
case class Test(test: String)
val t = Test("hello")
val bytes = Marshal.dump(t)
Marshal.load[Test](bytes)
...but the call to Marshal.load is throwing a ClassNotFoundException...
java.lang.ClassNotFoundException: Test
at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
at java.lang.ClassLoader.loadClass(ClassLoader.java:248)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:247)
at java.io.ObjectInputStream.resolveClass(ObjectInputStream.java:603)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1574)
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1495)
at java.io.ObjectInputStream.readClass(ObjectInputStream.java:1461)
at java.io.Ob...
I've also tried using the fully qualified class name but with no luck. What am I missing?
Which version of SBT are you using? I had similar problems with SBT 0.7.7 and resolved them by adding the following to my build:
// cause the SBT "run" action to fork
override def fork = Some(new ForkScalaRun() {
  override def scalaJars = Seq(buildLibraryJar.asFile, buildCompilerJar.asFile)
})
Causing the "run" action to fork seems to resolve class loader problems, and problems with trapping System.exit calls:
http://code.google.com/p/simple-build-tool/wiki/Forking
http://code.google.com/p/simple-build-tool/wiki/RunningProjectCode
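(On later sbt versions the same effect is a one-line setting; a sketch, assuming sbt 0.13 or newer:)

// build.sbt: fork the "run" action into a separate JVM
fork in run := true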