
How do I correctly use Spark in ScalaTest tests?

I have several ScalaTest classes that use BeforeAndAfterAll to construct a SparkContext and stop it afterwards, like this:

import org.apache.spark.SparkContext
import org.scalatest.{BeforeAndAfterAll, FlatSpec, Matchers}

class MyTest extends FlatSpec with Matchers with BeforeAndAfterAll {

  private var sc: SparkContext = null

  override protected def beforeAll(): Unit = {
    sc = ... // Create SparkContext
  }

  override protected def afterAll(): Unit = {
    sc.stop()
  }

  // my tests follow
}

These tests run fine when launched from IntelliJ IDEA, but when running `sbt test` I get: WARN SparkContext: Another SparkContext is being constructed (or threw an exception in its constructor). This may indicate an error, since only one …
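This warning typically appears because sbt runs test suites in parallel within one JVM, so several suites try to create a SparkContext at the same time. A minimal sketch of one common workaround, assuming a hypothetical `SparkContextSetup` mixin trait and a local master:

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.scalatest.{BeforeAndAfterAll, Suite}

// Hypothetical mixin: gives each suite a local SparkContext
// and guarantees it is stopped after the suite finishes.
trait SparkContextSetup extends BeforeAndAfterAll { this: Suite =>

  @transient protected var sc: SparkContext = _

  override protected def beforeAll(): Unit = {
    super.beforeAll()
    sc = new SparkContext(
      new SparkConf().setMaster("local[2]").setAppName(suiteName))
  }

  override protected def afterAll(): Unit = {
    if (sc != null) sc.stop()
    super.afterAll()
  }
}
```

On its own this does not prevent two suites from overlapping; for that, suites must also run sequentially, e.g. by adding `Test / parallelExecution := false` to `build.sbt` so only one SparkContext exists at a time. This is a sketch of one approach, not the only fix (a single shared context across all suites is another).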

scala scalatest apache-spark

5 votes · 1 answer · 1,860 views
