I want to stop the various messages appearing in the Spark shell.
I tried editing the log4j.properties file to suppress these messages.
Here are the contents of log4j.properties:
# Define the root logger with appender file
log4j.rootCategory=WARN, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
# Settings to quiet third party logs that are too verbose
log4j.logger.org.eclipse.jetty=WARN
log4j.logger.org.eclipse.jetty.util.component.AbstractLifeCycle=ERROR
log4j.logger.org.apache.spark.repl.SparkIMain$exprTyper=INFO
log4j.logger.org.apache.spark.repl.SparkILoop$SparkILoopInterpreter=INFO
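One likely reason the edited file is being ignored: spark-shell reads log4j.properties from Spark's conf directory, not from the current working directory. A sketch of two ways to make the file take effect (the paths and $SPARK_HOME are assumptions about this installation):

```shell
# Place the edited file where Spark looks for it by default
# ($SPARK_HOME is assumed to point at the Spark installation):
cp log4j.properties "$SPARK_HOME/conf/log4j.properties"

# Or point the shell at an explicit file for a single session:
spark-shell --driver-java-options "-Dlog4j.configuration=file:/path/to/log4j.properties"
```

Alternatively, once the shell has started, the level can be lowered programmatically with sc.setLogLevel("WARN").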
But the messages are still being displayed on the console.
Here are some sample messages:
15/01/05 15:11:45 INFO SparkEnv: Registering BlockManagerMaster
15/01/05 15:11:45 INFO DiskBlockManager: Created local directory at /tmp/spark-local-20150105151145-b1ba
15/01/05 15:11:45 INFO MemoryStore: MemoryStore started with capacity 0.0 B.
15/01/05 15:11:45 INFO ConnectionManager: Bound socket to port 44728 with id = ConnectionManagerId(192.168.100.85,44728)
15/01/05 …

I have a Scala Maven project that uses Spark, and I am trying to implement logging with Logback. I compile my application into a jar and deploy it to an EC2 instance where a Spark distribution is installed. My pom.xml contains the dependencies for Spark and Logback as follows:
<dependency>
    <groupId>ch.qos.logback</groupId>
    <artifactId>logback-classic</artifactId>
    <version>1.1.7</version>
</dependency>
<dependency>
    <groupId>org.slf4j</groupId>
    <artifactId>log4j-over-slf4j</artifactId>
    <version>1.7.7</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_${scala.binary.version}</artifactId>
    <version>${spark.version}</version>
    <exclusions>
        <exclusion>
            <groupId>org.slf4j</groupId>
            <artifactId>slf4j-log4j12</artifactId>
        </exclusion>
        <exclusion>
            <groupId>log4j</groupId>
            <artifactId>log4j</artifactId>
        </exclusion>
    </exclusions>
</dependency>
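To confirm whether the exclusions above actually removed the log4j bindings from the dependency graph, a check along these lines may help (a sketch; -Dincludes uses Maven's standard groupId filter syntax):

```shell
# List any slf4j/log4j artifacts still pulled in transitively:
mvn dependency:tree -Dincludes=org.slf4j,log4j
```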
When I submit my Spark application, I print the slf4j binding from within the program. If I execute the jar directly with java, the binding is Logback. However, if I launch it with Spark (i.e. spark-submit), the binding is log4j.
import org.apache.spark.SparkContext
import org.slf4j.{Logger, LoggerFactory}
import org.slf4j.impl.StaticLoggerBinder

val logger: Logger = LoggerFactory.getLogger(this.getClass)
val sc: SparkContext = new SparkContext()
val rdd = sc.textFile("myFile.txt")

// Inspect which slf4j binding was actually picked up at runtime
val slb: StaticLoggerBinder = StaticLoggerBinder.getSingleton
System.out.println("Logger Instance: " + slb.getLoggerFactory)
System.out.println("Logger Class Type: " + slb.getLoggerFactoryClassStr)
The output is:
Logger Instance: org.slf4j.impl.Log4jLoggerFactory@a64e035
Logger Class Type: org.slf4j.impl.Log4jLoggerFactory
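For reference, the comparison described above can be reproduced roughly like this (the jar and class names are placeholders, not from the original project):

```shell
# Running the assembled jar directly: the binding reported is Logback.
java -cp myapp-jar-with-dependencies.jar com.example.MyApp

# Launching through Spark: the binding reported is log4j.
spark-submit --class com.example.MyApp myapp.jar
```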
As I understand it, both log4j-1.2.17.jar and slf4j-log4j12-1.7.16.jar are in /usr/local/spark/jars, and Spark is most likely picking up those jars despite the exclusions in my pom.xml, because if I delete them I get a ClassNotFoundException at spark-submit time.
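One commonly suggested workaround for this kind of classpath-precedence problem, offered only as a hedged sketch (the class and jar names are placeholders), is to ask Spark to prefer the application's classpath over its own jars:

```shell
spark-submit \
  --conf spark.driver.userClassPathFirst=true \
  --conf spark.executor.userClassPathFirst=true \
  --class com.example.MyApp \
  myapp.jar
```

These settings are documented as experimental and can introduce other dependency conflicts, so they may or may not be acceptable here.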
My question is: is there a way to use Logback for my application's own logging while preserving Spark's internal logging? Ideally, I would like Logback to write the application logs to a file while the Spark logs still appear on STDOUT.