Running SparkContext under sbt fails: There is already an RpcEndpoint called LocalBackendEndpoint

jav*_*dba 3 apache-spark

A Spark test case passes when run from IntelliJ, but fails under sbt test.

The failure happens during netty server creation, while the SparkContext is being constructed:

val sc = new SparkContext("local", "SamplingTest", new SparkConf())

The error is java.lang.IllegalArgumentException: There is already an RpcEndpoint called LocalBackendEndpoint. Here is the stack trace:

[info] SamplingSpec:
[info] Factorization
[info] - should factorize *** FAILED *** (1 second, 957 milliseconds)
[info]   java.lang.IllegalArgumentException: There is already an RpcEndpoint called LocalBackendEndpoint
[info]   at org.apache.spark.rpc.netty.Dispatcher.registerRpcEndpoint(Dispatcher.scala:65)
[info]   at org.apache.spark.rpc.netty.NettyRpcEnv.setupEndpoint(NettyRpcEnv.scala:136)
[info]   at org.apache.spark.scheduler.local.LocalBackend.start(LocalBackend.scala:126)
[info]   at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:144)
[info]   at org.apache.spark.SparkContext.<init>(SparkContext.scala:530)
[info]   at org.vz.datasci.spark.ml.SamplingSpec$$anonfun$1.apply$mcV$sp(SamplingSpec.scala:13)

Only one SparkContext is being created, and ssh to localhost works fine. What else should be considered?

小智 8

I was getting the same error because my tests were running in parallel.

In my build.sbt I added:

parallelExecution in Test := false

I'm not sure it's the best solution, but it fixed the problem for me.
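
Even with suites running sequentially, the same error can reappear if an earlier suite leaves its SparkContext running when the next one starts. A common complement to the build.sbt setting is to have each suite stop its context when it finishes. Below is a minimal sketch of that pattern, assuming ScalaTest's FlatSpec with BeforeAndAfterAll; the test body is illustrative, not taken from the question:

import org.apache.spark.{SparkConf, SparkContext}
import org.scalatest.{BeforeAndAfterAll, FlatSpec, Matchers}

class SamplingSpec extends FlatSpec with Matchers with BeforeAndAfterAll {

  private var sc: SparkContext = _

  override def beforeAll(): Unit = {
    // One SparkContext per suite, created before any test runs
    sc = new SparkContext("local", "SamplingTest", new SparkConf())
  }

  override def afterAll(): Unit = {
    // Stop the context so its RPC endpoints (including LocalBackendEndpoint)
    // are torn down before the next suite creates a fresh context
    if (sc != null) sc.stop()
  }

  "Factorization" should "factorize" in {
    // placeholder assertion; the real test logic goes here
    sc.parallelize(1 to 10).count() shouldBe 10L
  }
}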