Celery worker in Docker won't pick up the correct message broker

Tags: python, celery, flask, docker, docker-compose

I'm building a Flask service using the application factory pattern, and I need Celery for asynchronous tasks. I'm also using Docker and docker-compose to containerize and run everything. My project structure looks like this:

```
server
 |
 +-- manage.py
 +-- docker-compose.yml
 +-- requirements.txt
 +-- Dockerfile
 |
 +-- project
     |
     +-- __init__.py
     +-- api
         |
         +-- tasks.py
```

My tasks.py file looks like this:

```python
from project import celery_app

@celery_app.task
def celery_check(test):
    print(test)
```

I run the app through manage.py, which looks like this:

```python
# manage.py

from flask_script import Manager
from project import create_app

app = create_app()
manager = Manager(app)

if __name__ == '__main__':
    manager.run()
```

My __init__.py looks like this:

```python
# project/__init__.py

import os
import json
from flask_mongoalchemy import MongoAlchemy
from flask_cas import CAS
from flask import Flask
from itsdangerous import JSONWebSignatureSerializer as JWT
from flask_httpauth import HTTPTokenAuth
from celery import Celery

# instantiate the database and CAS
db = MongoAlchemy()
cas = CAS()

# Auth stuff (ReplaceMe is replaced below in create_app())
jwt = JWT("ReplaceMe")
auth = HTTPTokenAuth('Bearer')
celery_app = Celery(__name__, broker=os.environ.get("CELERY_BROKER_URL"))


def create_app():
    # instantiate the app
    app = Flask(__name__, template_folder='client/templates', static_folder='client/static')

    # set config
    app_settings = os.getenv('APP_SETTINGS')
    app.config.from_object(app_settings)

    # Send new static files every time if debug is enabled
    if app.debug:
        app.config['SEND_FILE_MAX_AGE_DEFAULT'] = 0

    # Get the secret keys
    parse_secret(app.config['CONFIG_FILE'], app)

    celery_app.conf.update(app.config)
    print(celery_app.conf)

    # set up extensions
    db.init_app(app)
    cas.init_app(app)
    # Replace the secret key with the app's
    jwt.secret_key = app.config["SECRET_KEY"]

    parse_config(app.config['CONFIG_FILE'])

    # register blueprints
    from project.api.views import twist_blueprint
    app.register_blueprint(twist_blueprint)

    return app
```
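Note that the broker is resolved once, at import time, from the container's environment: `os.environ.get("CELERY_BROKER_URL")` returns `None` when the variable is unset, and Celery then falls back to its built-in default broker, `amqp://guest@127.0.0.1:5672//`. A minimal stdlib sketch of that lookup (`resolve_broker` and `DEFAULT_BROKER` are illustrative names, not Celery API):

```python
import os

# Celery's built-in default broker when none is configured
DEFAULT_BROKER = "amqp://guest:guest@127.0.0.1:5672//"

def resolve_broker(env):
    # Mirrors Celery(broker=os.environ.get("CELERY_BROKER_URL")):
    # a missing variable yields None, and Celery then falls back
    # to the amqp default above.
    return env.get("CELERY_BROKER_URL") or DEFAULT_BROKER

# A container that defines the variable gets redis...
print(resolve_broker({"CELERY_BROKER_URL": "redis://redis:6379/0"}))
# ...a container that does not gets the amqp default.
print(resolve_broker({}))
```

So whichever container runs the worker must have `CELERY_BROKER_URL` in its own environment; the variable being set in a sibling service does not help.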

In my docker-compose file I start a worker and define some environment variables, like this:

```yaml
version: '2.1'

services:
  twist-service:
    container_name: twist-service
    build: .
    volumes:
      - '.:/usr/src/app'
    ports:
      - 5001:5000 # expose ports - HOST:CONTAINER
    environment:
      - APP_SETTINGS=project.config.DevelopmentConfig
      - DATABASE_NAME_TESTING=testing
      - DATABASE_NAME_DEV=dev
      - DATABASE_URL=twist-database
      - CONFIG_FILE=./project/default_config.json
      - MONGO_PASSWORD=user
      - CELERY_RESULT_BACKEND=redis://redis:6379
      - CELERY_BROKER_URL=redis://redis:6379/0
      - MONGO_PORT=27017
    depends_on:
      - celery
      - twist-database
  celery:
    container_name: celery
    build: .
    command: celery -A project.api.tasks --loglevel=debug worker
    volumes:
      - '.:/usr/src/app'
  twist-database:
    image: mongo:latest
    container_name: "twist-database"
    environment:
      - MONGO_DATA_DIR=/data/db
      - MONGO_USER=mongo
    volumes:
      - /data/db
    ports:
      - 27017:27017  # expose ports - HOST:CONTAINER
    command: mongod
  redis:
    image: "redis:alpine"
    command: redis-server
    volumes:
      - '/redis'
    ports:
      - '6379:6379'
```

However, when I run the docker-compose file and the containers come up, I end up seeing the following in the celery worker logs:

```
[2017-07-20 16:53:06,721: ERROR/MainProcess] consumer: Cannot connect to amqp://guest:**@127.0.0.1:5672//: [Errno 111] Connection refused.
```

This means the worker is ignoring the redis broker configured when the Celery app is created and is trying to use RabbitMQ's default instead. I tried changing project.api.tasks to project and to project.celery_app, but to no avail.

Answer by Ani*_*nis (5 votes):

It seems to me that the celery service should also be given the CELERY_RESULT_BACKEND and CELERY_BROKER_URL environment variables.
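A minimal sketch of that fix, reusing the service names from the compose file above (only the celery service is shown; whether the worker also needs APP_SETTINGS or CONFIG_FILE depends on what project/__init__.py reads at import time):

```yaml
  celery:
    container_name: celery
    build: .
    command: celery -A project.api.tasks worker --loglevel=debug
    volumes:
      - '.:/usr/src/app'
    environment:
      # Same broker/backend URLs as twist-service, so the worker
      # and the Flask app talk to the same redis instance
      - CELERY_RESULT_BACKEND=redis://redis:6379
      - CELERY_BROKER_URL=redis://redis:6379/0
    depends_on:
      - redis
```

Note the subcommand order (`worker` before `--loglevel`), which is the form the celery CLI expects, and the `depends_on: redis`, which makes compose start redis before the worker (start order only, not readiness).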