Here is my project structure:
Spark application:
scala1.scala // I call the Java class from this class.
java.java // This submits another Spark application to the YARN cluster.
Spark application triggered by the Java class:
scala2.scala
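For context, a minimal sketch of how the Java class might assemble the nested `spark-submit` command (e.g. for a `ProcessBuilder`). Everything here is an assumption on my part, not the asker's actual code: the class name `SubmitScala2`, the application class `com.example.Scala2`, the jar path, and the HDFS location in `spark.yarn.jars` are all hypothetical placeholders. The `spark.yarn.jars` setting is shown because the reported error (`Could not find or load main class org.apache.spark.deploy.yarn.ApplicationMaster`) typically means the YARN containers cannot locate the Spark runtime jars:

```java
import java.util.ArrayList;
import java.util.List;

public class SubmitScala2 {

    // Build the spark-submit command that the Java class would run to
    // launch the second application (scala2.scala) on the YARN cluster.
    // All paths and class names are hypothetical placeholders.
    static List<String> buildCommand() {
        List<String> cmd = new ArrayList<>();
        cmd.add("spark-submit");
        cmd.add("--master");
        cmd.add("yarn");
        cmd.add("--deploy-mode");
        cmd.add("cluster");
        // Point the containers at the Spark runtime jars so YARN can
        // resolve org.apache.spark.deploy.yarn.ApplicationMaster
        // (hypothetical HDFS location):
        cmd.add("--conf");
        cmd.add("spark.yarn.jars=hdfs:///spark-jars/*.jar");
        cmd.add("--class");
        cmd.add("com.example.Scala2");   // hypothetical main class
        cmd.add("/path/to/scala2-app.jar"); // hypothetical jar path
        return cmd;
    }

    public static void main(String[] args) {
        // Print the command; the real Java class would hand it to a
        // ProcessBuilder and start() it (omitted so the sketch runs
        // without a Spark installation).
        System.out.println(String.join(" ", buildCommand()));
    }
}
```

The sketch deliberately stops at building the command instead of executing it, so it runs anywhere; the actual submission step depends on the cluster environment.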
The tutorial I am following is here.
When I run scala1.scala via spark-submit in local mode, my Java class triggers the second Spark application (scala2.scala), and it works as expected.
However, when I run the same application with spark-submit on the YARN cluster, it shows the following error:
Error: Could not find or load main class
org.apache.spark.deploy.yarn.ApplicationMaster
Application application_1493671618562_0072 failed 5 times due to AM Container for appattempt_1493671618562_0072_000005 exited with exitCode: 1
For more detailed output, check the application tracking page: http://headnode.internal.cloudapp.net:8088/cluster/app/application_1493671618562_0072 Then click on links to logs of each attempt.
Diagnostics: Exception from container-launch.
Container id: container_e02_1493671618562_0072_05_000001
Exit code: 1
Exception message: /mnt/resource/hadoop/yarn/local/usercache/helixuser/appcache/application_1493671618562_0072/container_e02_1493671618562_0072_05_000001/launch_container.sh: …