
Why do we need to add `fork in run := true` when running a Spark application with sbt?

I have built a simple Spark application with sbt. Here is my code:

import org.apache.spark.sql.SparkSession

object HelloWorld {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().master("local").appName("BigApple").getOrCreate()

    import spark.implicits._

    val ds = Seq(1, 2, 3).toDS()
    ds.map(_ + 1).foreach(x => println(x))
  }
}

Here is my build.sbt:

name := """sbt-sample-app"""

version := "1.0"

scalaVersion := "2.11.7"

libraryDependencies += "org.scalatest" %% "scalatest" % "2.2.6" % "test"
libraryDependencies += "org.apache.spark" % "spark-sql_2.11" % "2.1.1"

Now, when I try `sbt run`, it gives me the following error:

$ sbt run
[info] Loading global plugins from /home/user/.sbt/0.13/plugins
[info] Loading project definition from /home/user/Projects/sample-app/project
[info] …
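The fix the question's title refers to is forking the run in a separate JVM. A minimal sketch of the added build.sbt setting (the comments describe the commonly cited reason, assuming the error here is the usual class-loading/`System.exit` clash between Spark and sbt's own JVM; the truncated log above does not show the exact failure):

```scala
// build.sbt (sketch)
// By default, `sbt run` executes the application inside sbt's own JVM,
// using sbt's layered class loaders. Spark's reflection-based class
// loading and its calls to System.exit do not play well with that setup.
// Forking launches the app in a fresh JVM with a plain classpath,
// isolating it from sbt's process:
fork in run := true
```

In more recent sbt versions the same setting is written with slash syntax as `Compile / run / fork := true`.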

Tags: scala, sbt, apache-spark

13 upvotes · 1 answer · 4788 views
