java.lang.ClassNotFoundException when I use "spark-submit" with the new class name instead of "SimpleApp"

zha*_*ang 6 scala apache-spark

I wrote a Spark program in Scala, but when I submit my project with "spark-submit" I get a java.lang.ClassNotFoundException.

My .sbt file:

name := "Spark Project"

version := "1.0"

scalaVersion := "2.10.5"

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.3.0"
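One common cause of this error is a mismatch between the name passed to `--class` and the fully qualified name of the compiled class inside the jar. As a sanity check (using the jar path from the question), you can list the jar's contents after packaging and look for the expected `.class` entry:

```shell
# Build the jar with sbt (run from the project root)
sbt package

# List the jar's contents and search for the compiled class.
# For an object declared with no package, the entry should sit at the
# top level, e.g. "SparkProject.class" and "SparkProject$.class".
jar tf target/scala-2.10/spark-project_2.10-1.0.jar | grep SparkProject
```

Whatever path appears before `.class` (with `/` replaced by `.`) is the exact string to pass to `--class`.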

My .scala file is named SparkProject.scala, and the object inside it is also named SparkProject.

/* SparkProject.scala */
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf

object SparkProject {
  def main(args: Array[String]) {
    val logFile = "YOUR_SPARK_HOME/README.md" // Should be some file on your system
    val conf = new SparkConf().setAppName("Simple Application")
    val sc = new SparkContext(conf)
    val logData = sc.textFile(logFile, 2).cache()
    val numAs = logData.filter(line => line.contains("a")).count()
    val numBs = logData.filter(line => line.contains("b")).count()
    println("Lines with a: %s, Lines with b: %s".format(numAs, numBs))
  }
}

The command I use to submit the project is:

spark-submit --class "SparkProject" --master local[12] target/scala-2.10/spark-project_2.10-1.0.jar

Does anyone know how to fix this? What confuses me most is that when I try the example from the quick-start guide [http://spark.apache.org/docs/latest/quick-start.html], it runs fine; but when I build a new project and submit it, I get the error. Any help would be greatly appreciated.

小智 6

Adding a package name worked for me.

My code is also very simple:

package spark.wordcount

/* SimpleApp.scala */
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf

object WordCount {
  def main(args: Array[String]) {
    val infile = "/input" // Should be some file on your system
    val conf = new SparkConf().setAppName("word count")
    val sc = new SparkContext(conf)
    val indata = sc.textFile(infile, 2).cache()
    val words = indata.flatMap(line => line.split(" ")).map(word => (word,1)).reduceByKey((a,b) => (a+b))
    words.saveAsTextFile("/output")
    println("All words are counted!")
  }
}

I ran spark-submit like this: `[root@sparkmaster bin]# ./spark-submit --class spark.wordcount.WordCount /opt/spark-wordcount-in-scala.jar` and it ran successfully.


Vib*_*uti 1

Removing the package name worked for me.
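Both answers come down to the same rule: the value of `--class` must exactly match the fully qualified name of the object as compiled, package included. A sketch of the two cases (jar paths taken from the answers above, purely illustrative):

```shell
# Object declared with "package spark.wordcount" -> use the full name:
spark-submit --class spark.wordcount.WordCount /opt/spark-wordcount-in-scala.jar

# Object declared with no package at all -> use the bare object name:
spark-submit --class SparkProject target/scala-2.10/spark-project_2.10-1.0.jar
```

Either adding the package to the `--class` argument or removing the `package` declaration from the source works, as long as the two sides agree.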