Related troubleshooting questions (0)

Spark running on a YARN cluster fails with exitCode = 13:

I'm new to Spark/YARN. When I submit a Spark job to the YARN cluster, it fails with exitCode = 13. When the Spark job runs in local mode, everything works fine.

The command I used is:

/usr/hdp/current/spark-client/bin/spark-submit --class com.test.sparkTest --master yarn --deploy-mode cluster --num-executors 40 --executor-cores 4 --driver-memory 17g --executor-memory 22g --files /usr/hdp/current/spark-client/conf/hive-site.xml /home/user/sparkTest.jar

Spark error log:

16/04/12 17:59:30 INFO Client:
         client token: N/A
         diagnostics: Application application_1459460037715_23007 failed 2 times due to AM Container for appattempt_1459460037715_23007_000002 exited with  exitCode: 13
For more detailed output, check application tracking page:http://b-r06f2-prod.phx2.cpe.net:8088/cluster/app/application_1459460037715_23007Then, click on links to logs of each attempt.
Diagnostics: Exception from container-launch.
Container id: container_e40_1459460037715_23007_02_000001
Exit code: 13
Stack trace: ExitCodeException exitCode=13:
        at org.apache.hadoop.util.Shell.runCommand(Shell.java:576)
        at org.apache.hadoop.util.Shell.run(Shell.java:487) …
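For context, a frequently reported cause of exitCode 13 when a job works locally but fails in yarn-cluster mode is a master hard-coded in the application (e.g. `setMaster("local[*]")`) that conflicts with `--master yarn` on the command line. This is an assumption about this particular job, not a confirmed diagnosis; a minimal Scala sketch that leaves the master entirely to spark-submit (class and app names here are illustrative):

```scala
import org.apache.spark.{SparkConf, SparkContext}

object SparkTest {
  def main(args: Array[String]): Unit = {
    // Do NOT call setMaster("local[*]") here: in yarn-cluster mode a
    // master hard-coded in the code conflicts with --master yarn, and the
    // ApplicationMaster can exit with code 13.
    val conf = new SparkConf().setAppName("sparkTest")
    val sc = new SparkContext(conf)
    try {
      // ... job logic ...
    } finally {
      sc.stop()
    }
  }
}
```

With the master left unset in code, the same jar can be submitted with `--master yarn --deploy-mode cluster` or run locally with `--master "local[*]"` without rebuilding.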

hadoop scala hadoop-yarn apache-spark

9 votes · 2 answers · 20K views

How do I set the Spark application exit status?

I'm writing a Spark application and running it with the spark-submit shell script (using yarn-cluster/yarn-client).

As far as I can see, spark-submit's exit code is determined by the state of the corresponding YARN application: 0 if the state is SUCCEEDED, 1 otherwise.

I'd like the option to return a different exit code, for the case where my application succeeds but some errors occur.

Is it possible to return a different exit code from the application?

I've tried using System.exit() but without success...

Thanks.
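One relevant detail: in yarn-cluster mode the driver runs inside the YARN ApplicationMaster, so a `System.exit(n)` in the driver never reaches the spark-submit process; spark-submit only maps the final YARN state (SUCCEEDED/FAILED) to 0 or 1. A common workaround is to run in yarn-client mode, where the driver runs inside the spark-submit JVM and its exit code is what the shell sees. This sketch assumes that setup; the flag and object names are hypothetical:

```scala
import org.apache.spark.{SparkConf, SparkContext}

object ExitStatusApp {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("exit-status-demo"))
    try {
      // ... job logic; compute a custom status ...
      val partialFailure = true // hypothetical flag: "succeeded, but with errors"
      sc.stop()
      if (partialFailure) {
        // In yarn-client mode this exit code becomes spark-submit's exit
        // code ($? in the shell). In yarn-cluster mode it is only seen by
        // the ApplicationMaster, and spark-submit still returns 0 or 1.
        System.exit(2)
      }
    } catch {
      case e: Exception =>
        sc.stop()
        // Rethrowing marks the YARN application FAILED, so spark-submit exits 1.
        throw e
    }
  }
}
```

In yarn-cluster mode the only signal you can reliably send through spark-submit's exit code is success (0) versus failure (1), so any finer-grained status has to be communicated out-of-band, for example by writing a status file to HDFS.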

exit-code hadoop-yarn apache-spark spark-submit

5 votes · 1 answer · 4,923 views