Posted by Har*_*rry

Why does Spark exit with exitCode: 16?

I am using Spark 2.0.0 with Hadoop 2.7 in yarn-cluster mode. Every time, I get the following error:

17/01/04 11:18:04 INFO spark.SparkContext: Successfully stopped SparkContext
17/01/04 11:18:04 INFO yarn.ApplicationMaster: Final app status: FAILED, exitCode: 16, (reason: Shutdown hook called before final status was reported.)
17/01/04 11:18:04 INFO util.ShutdownHookManager: Shutdown hook called
17/01/04 11:18:04 INFO util.ShutdownHookManager: Deleting directory /tmp/hadoop-hduser/nm-local-dir/usercache/harry/appcache/application_1475261544699_0833/spark-42e40ac3-279f-4c3f-ab27-9999d20069b8
17/01/04 11:18:04 INFO spark.SparkContext: SparkContext already stopped.

However, I do get the correct printed output. The same code works fine with Spark 1.4.0 and Hadoop 2.4.0, and there I don't see any exit code at all.
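For context, the original driver code is not shown; a minimal Spark 2.0 driver along these lines could reproduce the setup described (the object name, app name, and the explicit stop() call are assumptions, not the asker's actual code):

import org.apache.spark.sql.SparkSession

// Hypothetical minimal driver for Spark 2.0.0, submitted with
// --master yarn --deploy-mode cluster. Names are illustrative only.
object ExitCodeRepro {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("exit-code-16-repro")
      .getOrCreate()

    // Trivial job: the printed result comes out correctly,
    // matching the behaviour described in the question.
    val count = spark.sparkContext.parallelize(1 to 100).count()
    println(s"count = $count")

    // Stopping the session explicitly before main() returns; if the
    // SparkContext is instead torn down by a JVM shutdown hook, the
    // ApplicationMaster may report "Shutdown hook called before final
    // status was reported" as in the log above.
    spark.stop()
  }
}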

apache-spark
