I am trying to run my own Spark application, but when I use the spark-submit command I get this error:
Users/_name_here/dev/sp/target/scala-2.10/sp_2.10-0.1-SNAPSHOT.jar --stacktrace
java.lang.ClassNotFoundException: /Users/_name_here/dev/sp/mo/src/main/scala/MySimpleApp
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:340)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:633)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:169)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:192)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:111)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
I am using the following command:
/Users/_name_here/dev/spark/bin/spark-submit
--class "/Users/_name_here/dev/sp/mo/src/main/scala/MySimpleApp"
--master local[4] /Users/_name_here/dev/sp/target/scala-2.10/sp_2.10-0.1-SNAPSHOT.jar
My build.sbt looks like this:
name := "mo"
version := "1.0"
scalaVersion := "2.10.4"
libraryDependencies ++= Seq(
"org.apache.spark" % "spark-core_2.10" % "1.4.0",
"org.postgresql" % "postgresql" % "9.4-1201-jdbc41",
"org.apache.spark" % "spark-sql_2.10" % "1.4.0",
"org.apache.spark" % "spark-mllib_2.10" % "1.4.0",
"org.tachyonproject" % "tachyon-client" % "0.6.4",
"org.postgresql" % "postgresql" % "9.4-1201-jdbc41",
"org.apache.spark" % "spark-hive_2.10" % "1.4.0",
"com.typesafe" % "config" % "1.2.1"
)
resolvers += "Typesafe Repo" at "http://repo.typesafe.com/typesafe/releases/"
My plugin.sbt:
logLevel := Level.Warn
resolvers += "Sonatype snapshots" at "https://oss.sonatype.org/content/repositories/snapshots/"
addSbtPlugin("com.github.mpeltonen" % "sbt-idea" % "1.6.0")
addSbtPlugin("com.eed3si9n" % "sbt-assembly" %"0.11.2")
I am using the prebuilt package from spark.apache.org. I installed sbt via brew, and Scala as well. Running sbt package from the spark root folder works fine and creates the jar, but using assembly doesn't work at all, probably because it's missing in the rebuilt spark folder. I would appreciate any help, since I'm quite new to Spark. Oh, and by the way, Spark runs fine in IntelliJ.
You should not reference your class by its directory path, but by its package path. Example:
/Users/_name_here/dev/spark/bin/spark-submit
--master local[4]
--class com.example.MySimpleApp /Users/_name_here/dev/sp/target/scala-2.10/sp_2.10-0.1-SNAPSHOT.jar
From what I can see, you don't have MySimpleApp in any package, so just --class "MySimpleApp" should do.
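For example (the package name com.example below is only illustrative, it is not taken from your project), a MySimpleApp.scala that declares a package would look like the sketch below, and the --class argument then has to be the fully qualified name com.example.MySimpleApp:

// src/main/scala/com/example/MySimpleApp.scala -- illustrative sketch only
package com.example

import org.apache.spark.{SparkConf, SparkContext}

object MySimpleApp {
  def main(args: Array[String]): Unit = {
    // Standard Spark 1.x entry point: build a config, then a context
    val conf = new SparkConf().setAppName("MySimpleApp")
    val sc = new SparkContext(conf)
    // ... job logic goes here ...
    sc.stop()
  }
}

Without any package declaration the class lives in the default package, and --class MySimpleApp is all spark-submit needs.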
Apparently there was something wrong with my project structure in general, because after creating a new project with sbt and Sublime I can now use spark-submit. It's really strange though, since I didn't change anything about the default structure of an sbt project as provided in IntelliJ. The project structure now works like a charm:
Macbook:sp user$ find .
.
./build.sbt
./project
./project/plugin.sbt
./src
./src/main
./src/main/scala
./src/main/scala/MySimpleApp.scala
Thanks for your help!
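For reference, since the file contents weren't posted: a minimal MySimpleApp.scala sitting directly at ./src/main/scala/MySimpleApp.scala (no package declaration) could look roughly like the sketch below, and is then submitted with --class MySimpleApp as in the answer above.

// ./src/main/scala/MySimpleApp.scala -- minimal sketch, not the original file
import org.apache.spark.{SparkConf, SparkContext}

object MySimpleApp {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("MySimpleApp"))
    // Trivial job just to verify that packaging and spark-submit work
    println(sc.parallelize(1 to 100).count())
    sc.stop()
  }
}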