col*_*ang · apache-spark · pyspark · jupyter-notebook
I know I can set the log level with spark.sparkContext.setLogLevel('INFO'), and log output like the following then appears in the terminal, but not in the Jupyter notebook.
2019-03-25 11:42:37 WARN NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
2019-03-25 11:42:37 WARN SparkConf:66 - In Spark 1.0 and later spark.local.dir will be overridden by the value set by the cluster manager (via SPARK_LOCAL_DIRS in mesos/standalone and LOCAL_DIRS in YARN).
2019-03-25 11:42:38 WARN Utils:66 - Service 'SparkUI' could not bind on port 4040. Attempting port 4041.
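For reference, the call mentioned above is only a level switch (a minimal one-line sketch): it changes which messages log4j emits, not where they are written.

# Raise or lower the log4j level on the existing context. The output still
# goes to the driver JVM's stderr, i.e. the terminal that launched Jupyter.
spark.sparkContext.setLogLevel('INFO')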
The Spark session is created in local mode in a Jupyter notebook cell:
from pyspark.sql import SparkSession

# Local-mode session created directly in a notebook cell
spark = SparkSession \
    .builder \
    .master('local[7]') \
    .appName('Notebook') \
    .getOrCreate()
Is there a way to forward these logs into the Jupyter notebook?
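One direction that might work (a sketch, not a confirmed answer, assuming Spark 2.x with the bundled log4j 1.x on the driver): attach an extra FileAppender to the JVM root logger through the py4j gateway and read that file back from the notebook. The path /tmp/spark-driver.log is an arbitrary choice.

# Sketch: add a log4j FileAppender on the driver JVM so its log output also
# lands in a file readable from Python (assumes log4j 1.x, i.e. Spark 2.x).
log4j = spark.sparkContext._jvm.org.apache.log4j
layout = log4j.PatternLayout("%d %p %c: %m%n")
appender = log4j.FileAppender(layout, "/tmp/spark-driver.log", True)
log4j.LogManager.getRootLogger().addAppender(appender)

spark.range(10).count()  # run something that produces log output

# Show the captured driver logs inside the notebook cell output.
with open("/tmp/spark-driver.log") as f:
    print(f.read())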
Viewed: 649 times