Unable to import Spark implicits in ScalaTest

Asked by him*_*ian · 13 votes · tags: scala, implicit, scalatest, apache-spark, apache-spark-sql

I am writing test cases for Spark using ScalaTest.

import org.apache.spark.sql.SparkSession
import org.scalatest.{BeforeAndAfterAll, FlatSpec}

class ClassNameSpec extends FlatSpec with BeforeAndAfterAll {
  var spark: SparkSession = _
  var className: ClassName = _

  override def beforeAll(): Unit = {
    spark = SparkSession.builder().master("local").appName("class-name-test").getOrCreate()
    className = new ClassName(spark)
  }

  it should "return data" in {
    import spark.implicits._
    val result = className.getData(input)

    assert(result.count() == 3)
  }

  override def afterAll(): Unit = {
    spark.stop()
  }
}

When I try to compile the test suite, it gives me the following error:

stable identifier required, but ClassNameSpec.this.spark.implicits found.
[error]     import spark.implicits._
[error]                  ^
[error] one error found
[error] (test:compileIncremental) Compilation failed

I can't understand why I can't `import spark.implicits._` inside the test suite.

Any help is appreciated!

Answered by Ass*_*son · 30 votes

To do an import you need a "stable identifier", as the error message says. This means you need a val, not a var. Since you defined spark as a var, the Scala compiler cannot import from it.

To work around this, you can do the following inside the test body:

val spark2 = spark
import spark2.implicits._

Or change the original var to a val, e.g.:

lazy val spark: SparkSession = SparkSession.builder().master("local").appName("class-name-test").getOrCreate()
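The stable-identifier rule can be seen without Spark at all. A minimal, self-contained sketch (all names here are hypothetical, chosen only for illustration) of why importing from a val compiles while importing from a var does not:

```scala
object StableImportDemo extends App {
  // Stand-in for SparkSession.implicits: an object whose member we want to import.
  class Implicits {
    implicit val multiplier: Int = 3
  }

  // Like the spec's `var spark`: a var is NOT a stable identifier.
  var mutable: Implicits = new Implicits
  // import mutable.multiplier        // does not compile: "stable identifier required"

  // Like the answer's `val spark2 = spark`: snapshot the var into a val.
  val stable = mutable
  import stable.multiplier            // compiles: `stable` is a val

  // Uses the imported implicit value (3) to scale its argument.
  def scale(n: Int)(implicit m: Int): Int = n * m
  println(scale(4))                   // prints 12
}
```

The compiler rejects the var because the object it points to could change between the import and its use, so the imported member's path would no longer be meaningful; a val pins the path down.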