I am new to Spark. I have tried searching but could not find a suitable solution. I installed Hadoop 2.7.2 on two boxes (one master node and one worker node) and set up the cluster following this link: http://javadev.org/docs/hadoop/centos/6/installation/multi-node-installation-on-centos-6-non-sucure-mode/ I am running the Hadoop and Spark applications as the root user to test the cluster.
I installed Spark on the master node, and Spark starts without any errors. However, when I submit a job with spark-submit, I get a FileNotFoundException, even though the file exists on the master node at the very location named in the error. I am executing the spark-submit command below; please find the log output beneath the command.
/bin/spark-submit --class com.test.Engine --master yarn --deploy-mode cluster /app/spark-test.jar
16/04/21 19:16:13 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
16/04/21 19:16:13 INFO RMProxy: Connecting to ResourceManager at /0.0.0.0:8032
16/04/21 19:16:14 INFO Client: Requesting a new application from cluster with 1 NodeManagers
16/04/21 19:16:14 INFO Client: Verifying our application has not requested more than the maximum memory capability of the cluster (8192 MB per …