jac*_*kar · 9 · tags: hadoop, hadoop-yarn, apache-spark
I am running Spark 1.1.0 on a kerberized cluster with HDP 2.1. I can successfully run spark-submit with --master yarn-client and the results are written correctly to HDFS; however, the job does not show up on the Hadoop All Applications page. I want to run spark-submit with --master yarn-cluster, but I keep getting this error:
appDiagnostics: Application application_1417686359838_0012 failed 2 times due to AM Container
for appattempt_1417686359838_0012_000002 exited with exitCode: -1000 due to: File does not
exist: hdfs://<HOST>/user/<username>/.sparkStaging/application_<numbers>_<more numbers>/spark-assembly-1.1.0-hadoop2.4.0.jar
.Failing this attempt.. Failing the application.
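For reference, a minimal sketch of the two invocations described above; the application class and jar names (com.example.MyApp, my-app.jar) are placeholders and not taken from the original post:

    # Works: client mode; results land in HDFS, but the job is not listed
    # on the Hadoop All Applications page.
    spark-submit --master yarn-client --class com.example.MyApp my-app.jar

    # Fails: cluster mode; the AM container exits with -1000 because the
    # staged spark-assembly jar is reported missing from .sparkStaging on HDFS.
    spark-submit --master yarn-cluster --class com.example.MyApp my-app.jar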
I have configured access to the cluster for my account. I have configured yarn-site.xml. I have cleared .sparkStaging. I have tried including --jars [path to my spark assembly in spark/lib]. I found this question, which is very similar, but it has no answer. I can't tell whether this is an HDP 2.1 issue, a Spark 1.1.0 issue, the kerberized cluster, the configuration, or something else. Any help would be much appreciated.
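A rough sketch of the workarounds mentioned above. The assembly path /usr/lib/spark/lib/... and the class/jar names are illustrative assumptions; the original post only says "path to my spark assembly in spark/lib", and <username> is left as in the post:

    # Clear stale staging directories left over from failed attempts.
    hdfs dfs -rm -r /user/<username>/.sparkStaging

    # Retry in cluster mode, explicitly shipping the local assembly jar
    # (path is an assumption about the HDP 2.1 layout).
    spark-submit --master yarn-cluster \
        --jars /usr/lib/spark/lib/spark-assembly-1.1.0-hadoop2.4.0.jar \
        --class com.example.MyApp my-app.jar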