rab*_*ens 5 scala scalatest apache-spark
I have multiple ScalaTest classes that use BeforeAndAfterAll to construct a SparkContext and stop it afterwards, like this:
import org.apache.spark.SparkContext
import org.scalatest.{BeforeAndAfterAll, FlatSpec, Matchers}

class MyTest extends FlatSpec with Matchers with BeforeAndAfterAll {
  private var sc: SparkContext = _

  override protected def beforeAll(): Unit = {
    sc = ... // Create SparkContext
  }

  override protected def afterAll(): Unit = {
    sc.stop()
  }

  // my tests follow
}
These tests run fine when launched from IntelliJ IDEA, but when running sbt test I get WARN SparkContext: Another SparkContext is being constructed (or threw an exception in its constructor). This may indicate an error, since only one SparkContext may be running in this JVM (see SPARK-2243)., followed by a bunch of other exceptions that I believe are related to this problem.
What is the right way to use Spark here? Do I have to create a global SparkContext for the entire test suite, and if so, how do I do that?
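One common way to share the lifecycle across suites is to pull the setup/teardown into a reusable mixin so each suite manages exactly one local SparkContext. The sketch below is illustrative rather than from the original post; the trait name SharedSparkContext and the local[2] master are assumptions:

import org.apache.spark.{SparkConf, SparkContext}
import org.scalatest.{BeforeAndAfterAll, Suite}

// Hypothetical mixin: one local SparkContext per suite,
// created before the tests and stopped afterwards.
trait SharedSparkContext extends BeforeAndAfterAll { self: Suite =>
  @transient private var _sc: SparkContext = _
  protected def sc: SparkContext = _sc

  override protected def beforeAll(): Unit = {
    super.beforeAll()
    _sc = new SparkContext(
      new SparkConf().setMaster("local[2]").setAppName(suiteName))
  }

  override protected def afterAll(): Unit = {
    try if (_sc != null) _sc.stop()
    finally super.afterAll()
  }
}

A suite would then simply mix it in: class MyTest extends FlatSpec with Matchers with SharedSparkContext. Note this alone does not help if sbt runs several such suites in parallel in the same JVM, which is what the accepted fix below addresses.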
It seems I couldn't see the forest for the trees: I had forgotten the following line in my build.sbt:
parallelExecution in Test := false
With this line, the tests run. The underlying issue was that sbt executes test suites in parallel by default, so several suites were constructing a SparkContext concurrently in the same JVM.
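For what it's worth, on sbt 1.x the same setting is usually written with the slash syntax in build.sbt:

// build.sbt (sbt 1.x slash syntax, equivalent to the line above)
Test / parallelExecution := false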