I have two separate celeryd processes running on my server, managed by supervisor. They are set up to listen on separate queues:
[program:celeryd1]
command=/path/to/celeryd --pool=solo --queues=queue1
...
[program:celeryd2]
command=/path/to/celeryd --pool=solo --queues=queue2
...
My celeryconfig looks like this:
from celery.schedules import crontab
BROKER_URL = "amqp://guest:guest@localhost:5672//"
CELERY_DISABLE_RATE_LIMITS = True
CELERYD_CONCURRENCY = 1
CELERY_IGNORE_RESULT = True
CELERY_DEFAULT_QUEUE = 'default'
CELERY_QUEUES = {
    'default': {
        'exchange': 'default',
        'binding_key': 'default',
    },
    'queue1': {
        'exchange': 'queue1',
        'routing_key': 'queue1',
    },
    'queue2': {
        'exchange': 'queue2',
        'routing_key': 'queue2',
    },
}
CELERY_IMPORTS = ('tasks', )
CELERYBEAT_SCHEDULE = {
    'first-queue': {
        'task': 'tasks.sync',
        'schedule': crontab(hour=2, minute=0),  # note: 02/00 are leading-zero literals, a SyntaxError in Python 3
        'kwargs': {'client': 'client_1'},
        'options': {'queue': 'queue1'},
    },
    'second-queue': …

I can't send tasks to Celery when trying to create two separate dedicated workers. I have read the documentation and this question, but it hasn't improved my situation.
My configuration is as follows:
from kombu import Exchange, Queue

CELERY_RESULT_BACKEND = 'django-db'
CELERY_BROKER_URL = f'redis://{env("REDIS_HOST")}:{env("REDIS_PORT")}/{env("REDIS_CELERY_DB")}'
CELERY_DEFAULT_QUEUE = 'default'
CELERY_DEFAULT_EXCHANGE_TYPE = 'topic'
CELERY_DEFAULT_ROUTING_KEY = 'default'
CELERY_QUEUES = (
    Queue('default', Exchange('default'), routing_key='default'),
    Queue('media', Exchange('media'), routing_key='media'),
)
CELERY_ROUTES = {
    'books.tasks.resize_book_photo': {
        'queue': 'media',
        'routing_key': 'media',
    },
}
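One thing worth double-checking with `CELERY_ROUTES` is that the key has to be the task's fully qualified dotted name. A minimal sketch of the lookup (my illustration, not Celery's actual router code):

```python
# Simplified sketch of static route lookup (illustration only): the task's
# fully qualified name is the dict key, so a typo or a different module
# path silently falls through to the default queue.
CELERY_ROUTES = {
    'books.tasks.resize_book_photo': {'queue': 'media', 'routing_key': 'media'},
}

def route_for(task_name, default_queue='default'):
    # Fall back to the default queue when no explicit route matches.
    return CELERY_ROUTES.get(task_name, {'queue': default_queue})

print(route_for('books.tasks.resize_book_photo')['queue'])  # media
print(route_for('books.tasks.some_other_task')['queue'])    # default
```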
The tasks are defined in tasks.py as follows:
import logging
import time

from celery import shared_task

from books.models import Author, Book
from books.commands import resize_book_photo as resize_book_photo_command

logger = logging.getLogger(__name__)


@shared_task
def list_test_books_per_author():
    time.sleep(5)
    queryset = Author.objects.all()
    for author in queryset:
        for book in author.testing_books: …