Celery tasks don't email admins on logger critical messages

lui*_*stm 5 python django celery

My Celery tasks never send an email to the application admins when logger.critical is called.

I'm building a Django application. My project's current configuration emails the application admins every time a logger.critical message is created. It was very simple to set up; I just followed the documentation of both projects (Celery and Django). For some reason I can't determine, code running inside a Celery task doesn't behave the same way: it never emails the application admins when a logger.critical message is created.

Does Celery actually support this? Am I missing some configuration? Has anyone run into this problem and managed to solve it?

Using:

  • Django 1.11
  • Celery 4.3

Thanks for your help.

bug*_*bug 5

As stated in the documentation, Celery overrides the current logging configuration to apply its own. The docs also say you can set CELERYD_HIJACK_ROOT_LOGGER to False in your Django settings to prevent this behavior, but what is not well documented is that this doesn't really work at the moment.

In my opinion you have 2 options:

1. Prevent Celery from overriding your configuration (for real) using the setup_logging signal

Open your celery.py file and add the following:

from celery.signals import setup_logging

@setup_logging.connect
def config_loggers(*args, **kwargs):
    pass
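Rather than leaving the handler empty, a common variant of this pattern is to re-apply your Django LOGGING dict inside the signal handler, so worker processes use exactly your configuration instead of Celery's. A minimal, runnable sketch; the inline LOGGING dict is a stand-in (assumption) for whatever your Django settings actually define:

```python
import logging
import logging.config

# Stand-in for settings.LOGGING; in a real celery.py you would import
# it with `from django.conf import settings` and pass settings.LOGGING.
LOGGING = {
    'version': 1,
    'disable_existing_loggers': False,
    'loggers': {
        'my_custom_logger': {
            'level': 'CRITICAL',
        },
    },
}

# In celery.py this function would carry the @setup_logging.connect
# decorator; re-applying the dict here keeps your handlers in workers.
def config_loggers(*args, **kwargs):
    logging.config.dictConfig(LOGGING)

config_loggers()
print(logging.getLogger('my_custom_logger').level)  # 50 == logging.CRITICAL
```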

After that your file should look more or less like this:

from __future__ import absolute_import, unicode_literals
import os
from celery import Celery
from celery.signals import setup_logging

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'myproject.settings')

app = Celery('myproject')
app.config_from_object('django.conf:settings', namespace='CELERY')
app.autodiscover_tasks()

@setup_logging.connect
def config_loggers(*args, **kwargs):
    pass

However, I would avoid this option unless you have a really good reason, because this way you lose the default task logging handled by Celery, which is quite good to have.

2. Use a specific logger

You can define a custom logger in your Django LOGGING configuration and use it in your tasks, e.g.:

Django settings:

LOGGING = {
    # ... other configs ...
    'handlers': {
        'my_email_handler': {
            # ... handler configuration ...
        },
    },
    'loggers': {
        # ... other loggers ...
        'my_custom_logger': {
            'handlers': ['my_email_handler'],
            'level': 'CRITICAL',
            'propagate': True,
        },
    },
}
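The elided handler configuration above would typically point at Django's built-in AdminEmailHandler, which is the class that emails settings.ADMINS. A hedged sketch of what that entry might look like (the 'include_html' flag and level are assumptions; adjust to your needs):

```python
# Sketch of the LOGGING['handlers']['my_email_handler'] entry, wired to
# Django's AdminEmailHandler (the handler that emails settings.ADMINS).
my_email_handler = {
    'level': 'CRITICAL',
    'class': 'django.utils.log.AdminEmailHandler',
    'include_html': False,  # set True to attach the full debug page
}
```

Note that emails will only go out if EMAIL_BACKEND and ADMINS are configured in your Django settings.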

Tasks:

import logging

from celery import shared_task

logger = logging.getLogger('my_custom_logger')

@shared_task
def log():
    logger.critical('Something bad happened!')
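To see why this works without running a worker: the routing is plain Python logging, so any handler attached to 'my_custom_logger' receives CRITICAL records emitted from the task body. A stdlib sketch, with a list-collecting handler standing in for the email handler (names follow the snippets above):

```python
import logging

# Collects records instead of emailing them; a stand-in for the
# 'my_email_handler' entry in the LOGGING configuration above.
class CollectingHandler(logging.Handler):
    def __init__(self):
        super().__init__(level=logging.CRITICAL)
        self.records = []

    def emit(self, record):
        self.records.append(record)

logger = logging.getLogger('my_custom_logger')
logger.setLevel(logging.CRITICAL)
handler = CollectingHandler()
logger.addHandler(handler)

def log():  # the @shared_task body, minus Celery
    logger.critical('Something bad happened!')

log()
print(len(handler.records))  # 1 record reached the handler
```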

I believe this is the best approach for you because, as far as I understand, you need to log messages manually, and it lets you keep using the Celery logging system.