Airflow + Python logging module doesn't write to the log file

ben*_*ten 9 python logging airflow

I'm trying to write "hello world" to the Airflow logs (Airflow 1.10.3). Based on the SO solutions described here and here, I should be able to just import logging and call logging.info('hello world'). That doesn't seem to work for me:

import logging
from datetime import datetime, timedelta
from airflow import DAG
from airflow.operators.python_operator import PythonOperator

default_args = {
    'owner': 'benten',
    'depends_on_past': False,
    'start_date': datetime(2019, 7, 25),
    'email_on_failure': False,
    'retries': 1,
    'retry_delay': timedelta(minutes=1),
    }

def logging_is_fun():
    logging.debug("hello world")
    logging.info("hello world")
    logging.critical("hello world")
    return None

with DAG('fun_logs', schedule_interval='45 * * * *', default_args=default_args) as dag:
    log_task = PythonOperator(python_callable=logging_is_fun, task_id='log_test_task')
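Outside Airflow, the stdlib level threshold alone would explain the missing DEBUG and INFO lines; here is a minimal standalone sketch of that mechanic (note this doesn't model Airflow's own task-log handlers, which it installs on top of the root logger):

```python
import logging

# Root logger at WARNING, mirroring logging_level = WARN in airflow.cfg.
logging.basicConfig(level=logging.WARNING,
                    format="%(levelname)s - %(message)s")

logging.debug("hello world")     # dropped: DEBUG (10) < WARNING (30)
logging.info("hello world")      # dropped: INFO (20) < WARNING (30)
logging.critical("hello world")  # emitted: CRITICAL (50) >= WARNING (30)
```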

I triggered the DAG manually and the task executed with no problem. But alas, when I check the logs, all I see is:

*** Reading local file: /home/ubuntu/airflow/logs/fun_logs/log_test_task/2019-08-31T19:22:49.653712+00:00/1.log

Where are my amazing "hello world" statements? Given my logging level setting, I don't expect to see all of them, but I do expect to see the critical one.

I have the following in my airflow.cfg (all default settings, as far as I can tell):

# The folder where airflow should store its log files
# This path must be absolute
base_log_folder = /home/ubuntu/airflow/logs

# Airflow can store logs remotely in AWS S3, Google Cloud Storage or Elastic Search.
# Users must supply an Airflow connection id that provides access to the storage
# location. If remote_logging is set to true, see UPDATING.md for additional
# configuration requirements.
remote_logging = False
remote_log_conn_id =
remote_base_log_folder =
encrypt_s3_logs = False

# Logging level
logging_level = WARN
fab_logging_level = WARN

# Logging class
# Specify the class that will specify the logging configuration
# This class has to be on the python classpath
# logging_config_class = my.path.default_local_settings.LOGGING_CONFIG
logging_config_class =

# Log format
log_format = [%%(asctime)s] {%%(filename)s:%%(lineno)d} %%(levelname)s - %%(message)s
simple_log_format = %%(asctime)s %%(levelname)s - %%(message)s

# Log filename format
log_filename_template = {{ ti.dag_id }}/{{ ti.task_id }}/{{ ts }}/{{ try_number }}.log
log_processor_filename_template = {{ filename }}.log
dag_processor_manager_log_location = /home/ubuntu/airflow/logs/dag_processor_manager/dag_processor_manager.log
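For context, the names accepted by logging_level are the standard Python logging thresholds (in the stdlib, WARN is an alias for WARNING); a quick way to inspect their numeric values:

```python
import logging

# Print the numeric threshold behind each level name usable in airflow.cfg.
for name in ("DEBUG", "INFO", "WARN", "ERROR", "CRITICAL"):
    print(name, logging.getLevelName(name))
# DEBUG 10, INFO 20, WARN 30, ERROR 40, CRITICAL 50
```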

Ami*_*ngh 7

Set logging_level = INFO instead of WARN in airflow.cfg and you should be able to see your logs.

Reason:

logging_level controls which Airflow events get recorded once they reach that level. For example, using a deprecated Airflow operator generates an Airflow event that is logged as a WARN.

As far as your code is concerned, these are just normal Python statements that you want logged, and they effectively fall under the INFO log level in Airflow. So if you set logging_level to INFO, you should be able to see your logging statements.
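As a diagnostic, the callable can log its own effective threshold, which helps confirm the airflow.cfg change actually took effect inside the task (a sketch using only plain stdlib calls, no Airflow-specific API; the message text is illustrative):

```python
import logging

def logging_is_fun():
    log = logging.getLogger(__name__)
    # CRITICAL always passes the threshold, so this line should show up
    # in the task log regardless of the configured logging_level.
    log.critical("effective level: %s",
                 logging.getLevelName(log.getEffectiveLevel()))
    log.info("hello world")  # visible once logging_level = INFO
```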