I am running a Spark Standalone cluster, and when I submit an application the Spark driver stops with the following error:
16/01/12 23:26:14 INFO Worker: Asked to kill executor app-20160112232613-0012/0
16/01/12 23:26:14 INFO ExecutorRunner: Runner thread for executor app-20160112232613-0012/0 interrupted
16/01/12 23:26:14 INFO ExecutorRunner: Killing process!
16/01/12 23:26:14 ERROR FileAppender: Error writing stream to file /spark/spark-1.4.1/work/app-20160112232613-0012/0/stderr
java.io.IOException: Stream closed
    at java.io.BufferedInputStream.getBufIfOpen(BufferedInputStream.java:170)
    at java.io.BufferedInputStream.read1(BufferedInputStream.java:283)
    at java.io.BufferedInputStream.read(BufferedInputStream.java:345)
    at java.io.FilterInputStream.read(FilterInputStream.java:107)
    at org.apache.spark.util.logging.FileAppender.appendStreamToFile(FileAppender.scala:70)
    at org.apache.spark.util.logging.FileAppender$$anon$1$$anonfun$run$1.apply$mcV$sp(FileAppender.scala:39)
    at org.apache.spark.util.logging.FileAppender$$anon$1$$anonfun$run$1.apply(FileAppender.scala:39)
    at org.apache.spark.util.logging.FileAppender$$anon$1$$anonfun$run$1.apply(FileAppender.scala:39)
    at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1772)
    at org.apache.spark.util.logging.FileAppender$$anon$1.run(FileAppender.scala:38)
16/01/12 23:26:14 INFO Worker: Executor app-20160112232613-0012/0 finished with state KILLED exitStatus 143
16/01/12 23:26:14 INFO Worker: Cleaning up local directories for application app-20160112232613-0012
I am new to Spark and to how it processes applications. Please help me resolve this issue.
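For context, the application is submitted to the standalone master with a command along these lines (the master URL, class name, and jar path here are generic placeholders, not the exact values from my setup):

./bin/spark-submit --master spark://<master-host>:7077 --class com.example.MyApp /path/to/my-app.jar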