Spark Worker node stops automatically

Pop*_*ppy 8 java apache-spark

I am running a Spark Standalone cluster, and when I submit an application the Spark driver stops with the following error.

16/01/12 23:26:14 INFO Worker: Asked to kill executor app-20160112232613-0012/0
16/01/12 23:26:14 INFO ExecutorRunner: Runner thread for executor app-20160112232613-0012/0 interrupted
16/01/12 23:26:14 INFO ExecutorRunner: Killing process!
16/01/12 23:26:14 ERROR FileAppender: Error writing stream to file /spark/spark-1.4.1/work/app-20160112232613-0012/0/stderr
java.io.IOException: Stream closed
        at java.io.BufferedInputStream.getBufIfOpen(BufferedInputStream.java:170)
        at java.io.BufferedInputStream.read1(BufferedInputStream.java:283)
        at java.io.BufferedInputStream.read(BufferedInputStream.java:345)
        at java.io.FilterInputStream.read(FilterInputStream.java:107)
        at org.apache.spark.util.logging.FileAppender.appendStreamToFile(FileAppender.scala:70)
        at org.apache.spark.util.logging.FileAppender$$anon$1$$anonfun$run$1.apply$mcV$sp(FileAppender.scala:39)
        at org.apache.spark.util.logging.FileAppender$$anon$1$$anonfun$run$1.apply(FileAppender.scala:39)
        at org.apache.spark.util.logging.FileAppender$$anon$1$$anonfun$run$1.apply(FileAppender.scala:39)
        at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1772)
        at org.apache.spark.util.logging.FileAppender$$anon$1.run(FileAppender.scala:38)
16/01/12 23:26:14 INFO Worker: Executor app-20160112232613-0012/0 finished with state KILLED exitStatus 143
16/01/12 23:26:14 INFO Worker: Cleaning up local directories for application app-20160112232613-0012

I am new to Spark and how it works. Please help me resolve this issue.

Pop*_*ppy 2

In my case, the problem was that the Spark driver could not pick up the dependencies from the executable jar I submitted. I merged all the dependencies into a single executable (uber) jar, and that resolved the issue.

Please bear with my terminology :)
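
For reference, here is a minimal sketch of how such an uber jar can be built with the Maven Shade plugin, assuming a Maven-based project; the main class com.example.MyApp and the Spark dependency below are placeholders for your own entry point and dependencies.

    <!-- pom.xml (excerpt): bundle all dependencies into one shaded jar -->
    <dependencies>
      <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.10</artifactId>
        <version>1.4.1</version>
        <!-- provided: the cluster already ships Spark itself -->
        <scope>provided</scope>
      </dependency>
    </dependencies>

    <build>
      <plugins>
        <plugin>
          <groupId>org.apache.maven.plugins</groupId>
          <artifactId>maven-shade-plugin</artifactId>
          <version>2.4.3</version>
          <executions>
            <execution>
              <phase>package</phase>
              <goals>
                <goal>shade</goal>
              </goals>
              <configuration>
                <transformers>
                  <!-- set the Main-Class of the shaded jar (placeholder name) -->
                  <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
                    <mainClass>com.example.MyApp</mainClass>
                  </transformer>
                </transformers>
                <filters>
                  <!-- drop signature files so the merged jar is not rejected -->
                  <filter>
                    <artifact>*:*</artifact>
                    <excludes>
                      <exclude>META-INF/*.SF</exclude>
                      <exclude>META-INF/*.DSA</exclude>
                      <exclude>META-INF/*.RSA</exclude>
                    </excludes>
                  </filter>
                </filters>
              </configuration>
            </execution>
          </executions>
        </plugin>
      </plugins>
    </build>

After running mvn package, you would submit the shaded jar produced in target/ with spark-submit as usual, so all of the application's dependencies travel with it to the workers.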