
How do I suppress Spark logging in unit tests?

Following the easily googled blog posts, I tried:

import org.apache.log4j.{Level, Logger}
import org.apache.spark.{SparkConf, SparkContext}
import org.specs2.mutable.Specification

class SparkEngineSpecs extends Specification {
  sequential

  // Set the given level on each named logger, returning the previous levels
  def setLogLevels(level: Level, loggers: Seq[String]): Map[String, Level] = loggers.map { loggerName =>
    val logger = Logger.getLogger(loggerName)
    val prevLevel = logger.getLevel
    logger.setLevel(level)
    loggerName -> prevLevel
  }.toMap

  setLogLevels(Level.WARN, Seq("spark", "org.eclipse.jetty", "akka"))

  val sc = new SparkContext(new SparkConf().setMaster("local").setAppName("Test Spark Engine"))

  // ... my unit tests
}

Unfortunately it doesn't work; I still get a lot of Spark output, e.g.:

14/12/02 12:01:56 INFO MemoryStore: Block broadcast_4 of size 4184 dropped from memory (free 583461216)
14/12/02 12:01:56 INFO ContextCleaner: Cleaned broadcast 4
14/12/02 12:01:56 INFO ContextCleaner: Cleaned shuffle 4
14/12/02 …
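One likely cause, offered as a hedged guess: the logger names in the snippet (`"spark"`) don't match Spark's actual logger hierarchy, which lives under `org.apache.spark`, and the levels are also set before the `SparkContext` re-initializes logging. A common alternative, assuming log4j 1.x is on the test classpath, is to drop a `log4j.properties` into `src/test/resources` so the configuration is applied before any Spark code runs:

```properties
# Root at WARN drops INFO lines like the MemoryStore/ContextCleaner ones above
log4j.rootCategory=WARN, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n

# Quiet the chattiest packages explicitly (fully qualified names, not "spark")
log4j.logger.org.apache.spark=WARN
log4j.logger.org.eclipse.jetty=WARN
log4j.logger.akka=WARN
```

Because the file is picked up from the classpath at log4j initialization, no per-test `setLogLevels` call is needed.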

log4j scala apache-spark

