Does anyone know how to write a log4j2 properties file that outputs logs to the console as JSON?
I've seen this link https://logging.apache.org/log4j/2.x/manual/layouts.html#JSONLayout, but it's not clear to me how to configure it in a properties file.
Thanks, Ilan
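One way to express the JSONLayout from that page in the properties format is sketched below. This is an untested sketch; the appender name is arbitrary, and `JsonLayout` requires the Jackson libraries (`jackson-core`, `jackson-databind`, `jackson-annotations`) on the classpath.

```properties
# Minimal log4j2.properties sketch: JSON logs to the console.
# Assumes Jackson is on the classpath (JsonLayout depends on it).
status = warn

appender.console.type = Console
appender.console.name = ConsoleJson
appender.console.target = SYSTEM_OUT
appender.console.layout.type = JsonLayout
# compact + eventEol gives one JSON object per line
appender.console.layout.compact = true
appender.console.layout.eventEol = true

rootLogger.level = info
rootLogger.appenderRef.stdout.ref = ConsoleJson
```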
I'm trying to write logs as JSON from Spark 2.0 on EMR. I'm able to use a custom log4j.properties file.
But when I try to switch the output to JSON with a custom layout class (net.logstash.log4j.JSONEventLayoutV1), I get the following exception:
log4j:ERROR Could not instantiate class [net.logstash.log4j.JSONEventLayoutV1].
java.lang.ClassNotFoundException: net.logstash.log4j.JSONEventLayoutV1
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:264)
at org.apache.log4j.helpers.Loader.loadClass(Loader.java:198)
at org.apache.log4j.helpers.OptionConverter.instantiateByClassName(OptionConverter.java:327)
at org.apache.log4j.helpers.OptionConverter.instantiateByKey(OptionConverter.java:124)
at org.apache.log4j.PropertyConfigurator.parseAppender(PropertyConfigurator.java:797)
at org.apache.log4j.PropertyConfigurator.parseCategory(PropertyConfigurator.java:768)
at org.apache.log4j.PropertyConfigurator.configureRootCategory(PropertyConfigurator.java:648)
at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:514)
at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:580)
at org.apache.log4j.helpers.OptionConverter.selectAndConfigure(OptionConverter.java:526)
at org.apache.log4j.LogManager.<clinit>(LogManager.java:127)
at org.apache.spark.internal.Logging$class.initializeLogging(Logging.scala:117)
at org.apache.spark.internal.Logging$class.initializeLogIfNecessary(Logging.scala:102)
at org.apache.spark.executor.CoarseGrainedExecutorBackend$.initializeLogIfNecessary(CoarseGrainedExecutorBackend.scala:161)
at org.apache.spark.internal.Logging$class.log(Logging.scala:46)
at org.apache.spark.executor.CoarseGrainedExecutorBackend$.log(CoarseGrainedExecutorBackend.scala:161)
at org.apache.spark.executor.CoarseGrainedExecutorBackend$.run(CoarseGrainedExecutorBackend.scala:172)
at org.apache.spark.executor.CoarseGrainedExecutorBackend$.main(CoarseGrainedExecutorBackend.scala:270)
at org.apache.spark.executor.CoarseGrainedExecutorBackend.main(CoarseGrainedExecutorBackend.scala)
Here is what log4j.properties looks like:
log4j.rootCategory=INFO, json
log4j.appender.json=org.apache.log4j.ConsoleAppender
log4j.appender.json.target=System.err
log4j.appender.json.layout=net.logstash.log4j.JSONEventLayoutV1
The "jsonevent-layout" artifact is assembled into the fat JAR.
Does anyone know how to fix this?
Thanks, Ilan
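The stack trace shows log4j being initialized from `LogManager.<clinit>` while the executor backend starts up, i.e. before Spark has added the application fat JAR to the classpath, so the layout class cannot be found there. A common workaround is to ship the layout JAR separately and put it on the JVM classpath directly. The sketch below is hedged: the file names, paths, and version are assumptions, and jsonevent-layout has its own dependencies (e.g. json-smart) that would need the same treatment.

```shell
# Sketch (paths and jar version are assumptions): put the layout jar on the
# driver/executor JVM classpath so it is visible when log4j initializes,
# instead of relying on the fat jar.
spark-submit \
  --master yarn \
  --files log4j.properties \
  --jars jsonevent-layout-1.7.jar \
  --conf spark.driver.extraClassPath=jsonevent-layout-1.7.jar \
  --conf spark.executor.extraClassPath=jsonevent-layout-1.7.jar \
  --conf "spark.driver.extraJavaOptions=-Dlog4j.configuration=log4j.properties" \
  --conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=log4j.properties" \
  your-app-assembly.jar
```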
I want to read data from MySQL with Spark. The API I've seen reads data from a specific table, something like:
val prop = new java.util.Properties
prop.setProperty("user", "<username>")
prop.setProperty("password", "<password>")
sparkSession.read.jdbc("jdbc:mysql://????:3306/???", "some-table", prop)
Now, I want to run a query over joined tables. Does anyone know how to do that (on the database side, not via Spark SQL)?
Thanks,
Ilan
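One approach that should work here: the table argument of `read.jdbc` can be any SQL expression that is valid in a `FROM` clause, so a parenthesized subquery with an alias pushes the join down to MySQL. A sketch, where the table and column names (`orders`, `customers`, etc.) are hypothetical:

```scala
import java.util.Properties

val prop = new Properties
prop.setProperty("user", "<username>")
prop.setProperty("password", "<password>")

// Any valid MySQL SELECT can stand in for the table name, as long as it
// is wrapped in parentheses and given an alias; MySQL executes the join.
val joinQuery =
  """(SELECT o.id, o.amount, c.name
    |FROM orders o
    |JOIN customers c ON o.customer_id = c.id) AS joined""".stripMargin

val df = sparkSession.read.jdbc("jdbc:mysql://????:3306/???", joinQuery, prop)
```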