Compilation fails when I build a Spark self-contained application with sbt

Lia*_*eng (tags: sbt, apache-spark)

I am trying to build a self-contained Spark application with sbt.

代码:https://github.com/apache/spark/blob/master/examples/src/main/scala/org/apache/spark/examples/ml/ALSExample.scala

教程:http://spark.apache.org/docs/latest/quick-start.html#self-contained-applications

But compilation fails with the following errors:

[error] /home/zeng/workspace/spark/als/src/main/scala/ALSExample.scala:22: object ml is not a member of package org.apache.spark
[error] import org.apache.spark.ml.evaluation.RegressionEvaluator
[error]                         ^
[error] /home/zeng/workspace/spark/als/src/main/scala/ALSExample.scala:23: object ml is not a member of package org.apache.spark
[error] import org.apache.spark.ml.recommendation.ALS
[error]                         ^
[error] /home/zeng/workspace/spark/als/src/main/scala/ALSExample.scala:25: object sql is not a member of package org.apache.spark
[error] import org.apache.spark.sql.SparkSession
[error]                         ^
[error] /home/zeng/workspace/spark/als/src/main/scala/ALSExample.scala:46: not found: value SparkSession
[error]     val spark = SparkSession
[error]                 ^
[error] /home/zeng/workspace/spark/als/src/main/scala/ALSExample.scala:61: not found: type ALS
[error]     val als = new ALS()
[error]                   ^
[error] 5 errors found
[error] (compile:compileIncremental) Compilation failed

Why does this happen? BTW, the Spark version is 2.0.0.

Gam*_*ows answered:

So, as suspected, the error indicates that you have not included all of the required Spark libraries in your build file. What you are missing is:

 "org.apache.spark" %% "spark-mllib" % "2.0.0"

If you are also using DataFrames, you will additionally need:

 "org.apache.spark" %% "spark-sql" % "2.0.0"
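Putting it together, a minimal `build.sbt` for compiling the ALSExample code might look like the sketch below. The project name and version lines are illustrative; the artifact names and the 2.0.0 version come from the dependency lines above, and Spark 2.0.0 is published for Scala 2.11.

```scala
// build.sbt -- minimal sketch for compiling ALSExample against Spark 2.0.0
name := "als-example"          // illustrative project name
version := "1.0"

scalaVersion := "2.11.8"       // Spark 2.0.0 artifacts are built for Scala 2.11

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core"  % "2.0.0",
  "org.apache.spark" %% "spark-sql"   % "2.0.0",  // SparkSession / org.apache.spark.sql
  "org.apache.spark" %% "spark-mllib" % "2.0.0"   // org.apache.spark.ml (ALS, RegressionEvaluator)
)
```

The `%%` operator appends the Scala binary version to the artifact name (so `spark-mllib` resolves to `spark-mllib_2.11`). After updating `build.sbt`, rerun `sbt package` as described in the quick-start guide.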