My problem is the following:
I have a Django application to which files can be uploaded. When a file is uploaded, a Celery task is launched to process the file on a queue specific to its version, like this:
import my_library

@app.task()
def process_file(file):
    result = my_library.process(file)
    model = MyModel(result=result)
    model.save()

def file_upload(request):
    file = request.FILES['file']  # request.FILES, not request.FILE
    version = parse_version(file)
    # Celery's API is apply_async; run_async does not exist
    process_file.apply_async(args=(file,), queue=version)
So there is one queue per version of my library. My idea is to create several daemons, one per queue/version of my library, each using a virtualenv with the correct version of my_library.
But I don't know how to do this properly.
Different versions of my_library cannot coexist, because they contain Cython functions that depend on a custom C library with the same version number.
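One workable layout is to derive the queue name from the parsed version and run one worker per virtualenv, each subscribed only to its own queue. A minimal sketch; the `queue_for_version` helper, the queue-name scheme, and the virtualenv paths in the comments are assumptions, not part of the original setup:

```python
# One worker per library version, each started from the matching
# virtualenv and listening only to its own queue, e.g. (hypothetical paths):
#   /envs/mylib-1.2/bin/celery -A proj worker -Q mylib-1.2
#   /envs/mylib-2.0/bin/celery -A proj worker -Q mylib-2.0
def queue_for_version(version):
    """Map a parsed version string to a Celery queue name."""
    return "mylib-{}".format(version)

# At upload time the task is routed explicitly, e.g.:
#   process_file.apply_async(args=(file,), queue=queue_for_version(version))
print(queue_for_version("1.2"))  # -> mylib-1.2
```

Because the routing happens at call time, the web application itself never imports my_library, so it does not matter that the versions cannot coexist in one environment.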
I am trying to set up a dummy task in Celery that runs every 3 seconds, but so far with little success. Here is the output I get:
I have set up Celery as follows.
In settings.py:
from datetime import timedelta

BROKER_URL = 'redis://localhost:6379/0'
CELERY_RESULT_BACKEND = 'redis://localhost:6379'
CELERY_ACCEPT_CONTENT = ['application/json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
CELERY_TIMEZONE = 'UTC'
CELERY_IMPORTS = ("api.tasks",)  # note the comma: without it this is a plain string, not a tuple
CELERYBEAT_SCHEDULE = {
    'add_job': {
        'task': 'add_job',
        'schedule': timedelta(seconds=3),
        'args': (16, 16)
    },
}
In celery.py:
from __future__ import absolute_import
import os
from celery import Celery
from django.conf import settings
# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'blogpodapi.settings') …

I have read that message queues are preferable to subprocess.Popen(). It is said that a message queue is the scalable solution. I would like to understand how that is so.
I just want to list the benefits of a message queue over subprocess.Popen(), so that I can convince my superiors to use a message queue instead of subprocess.
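The core benefit is decoupling: the web request only enqueues a message and returns, while any number of workers, on any machine, consume the queue; with subprocess.Popen() every job spawns a process on the web host itself, and jobs are lost if that host dies. A toy in-process sketch of the producer/consumer pattern, with queue.Queue standing in for the broker and threads standing in for Celery workers (illustrative only, not a real broker):

```python
import queue
import threading

broker = queue.Queue()   # stands in for Redis/RabbitMQ
results = []
results_lock = threading.Lock()

def worker():
    """Consume jobs until a None sentinel arrives."""
    while True:
        job = broker.get()
        if job is None:
            break
        with results_lock:
            results.append(job * 2)   # "process" the job
        broker.task_done()

# Scaling out = starting more consumers; the producer code never changes.
workers = [threading.Thread(target=worker) for _ in range(4)]
for w in workers:
    w.start()

for job in range(10):   # the "web request" just enqueues and returns
    broker.put(job)

broker.join()           # wait for all jobs to be processed
for _ in workers:
    broker.put(None)    # shut the workers down
for w in workers:
    w.join()

print(sorted(results))  # -> [0, 2, 4, 6, 8, 10, 12, 14, 16, 18]
```

With a real broker the queue additionally survives process restarts and can be consumed from other machines, which is what makes the message-queue approach scalable in a way Popen() is not.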
Sometimes, after restarting celerybeat, I get the following error. I have set celerybeat up as a service backed by Redis:

sudo service celerybeat restart
Below is the exception traceback:
Traceback (most recent call last):
  File "/home/ec2-user/pyenv/local/lib/python3.4/site-packages/celery/beat.py", line 484, in start
    time.sleep(interval)
  File "/home/ec2-user/pyenv/local/lib/python3.4/site-packages/celery/apps/beat.py", line 148, in _sync
    beat.sync()
  File "/home/ec2-user/pyenv/local/lib/python3.4/site-packages/celery/beat.py", line 493, in sync
    self.scheduler.close()
  File "/home/ec2-user/pyenv/local/lib/python3.4/site-packages/redbeat/schedulers.py", line 272, in close
    self.lock.release()
  File "/home/ec2-user/pyenv/local/lib/python3.4/site-packages/redis/lock.py", line 135, in release
    self.do_release(expected_token)
  File "/home/ec2-user/pyenv/local/lib/python3.4/site-packages/redis/lock.py", line 264, in do_release
    raise LockError("Cannot release a lock that's no longer owned")
redis.exceptions.LockError: Cannot release a lock that's no longer owned
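This LockError usually means the beat scheduler held a Redis lock whose TTL expired while the process was paused (note the time.sleep(interval) at the top of the trace): by the time shutdown calls lock.release(), the token no longer matches and redis-py refuses the release. A toy stdlib model of such a token-checked expiring lock, showing the same failure mode (the class and its behavior are a simplified sketch, not redis-py's actual implementation):

```python
import time

class ExpiringLock:
    """Toy model of a token-checked lock with a TTL."""
    def __init__(self, timeout):
        self.timeout = timeout
        self.token = None
        self.acquired_at = None

    def acquire(self, token):
        self.token = token
        self.acquired_at = time.monotonic()

    def release(self, token):
        # Once the TTL has elapsed the lock is no longer ours to release,
        # which is exactly what redis-py's Lock raises LockError for.
        expired = time.monotonic() - self.acquired_at > self.timeout
        if expired or token != self.token:
            raise RuntimeError("Cannot release a lock that's no longer owned")
        self.token = None

lock = ExpiringLock(timeout=0.05)
lock.acquire("beat-1")
time.sleep(0.1)              # models beat sleeping past the lock's TTL
try:
    lock.release("beat-1")
    err = None
except RuntimeError as exc:
    err = str(exc)
print(err)  # -> Cannot release a lock that's no longer owned
```

In practice, raising redbeat's lock timeout above the longest pause you expect (redbeat exposes a lock-timeout setting for this), or catching LockError around shutdown, avoids the crash on restart.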
During handling of the …

I am trying to configure Django and Celery, and I run into a problem when I import a task in my models.py file and at the same time import a model in my tasks.py file. Otherwise Celery works. See the code below...
core/celery.py
from __future__ import absolute_import, unicode_literals
import os
from celery import Celery
from django.conf import settings

# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'core.settings')

app = Celery('core')
app.config_from_object('django.conf:settings')
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)

@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))
users/models.py
from django.contrib.auth.models import AbstractUser  # needed for AbstractUser
from django.db import models
from django.utils.translation import gettext_lazy as _  # needed for _()
from core.tasks import celery_test  # this triggers the error

class CustomUser(AbstractUser):
    username = None
    email = models.EmailField(_('email address'), unique=True)

    def __str__(self):
        return self.email
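The error is a circular import: models.py imports tasks.py at module level while tasks.py imports models.py at module level, so one of them is only half-loaded when the other needs it. The standard fix is to defer one of the imports to call time. A runnable demonstration of the pattern using two throwaway modules written to a temp directory (the module names `tasks_mod`/`models_mod` and the `celery_test`/`LABEL` contents are illustrative stand-ins, not the original code):

```python
import importlib
import sys
import tempfile
import textwrap
from pathlib import Path

pkg = Path(tempfile.mkdtemp())

# The "tasks" module defers its import of the "models" module until the
# task body runs, when both modules are guaranteed to be fully loaded.
(pkg / "tasks_mod.py").write_text(textwrap.dedent("""
    def celery_test(email):
        from models_mod import LABEL  # deferred (function-level) import
        return "{}: {}".format(LABEL, email)
"""))

# The "models" module may keep its top-level import this way round.
(pkg / "models_mod.py").write_text(textwrap.dedent("""
    from tasks_mod import celery_test
    LABEL = "user"
"""))

sys.path.insert(0, str(pkg))
models_mod = importlib.import_module("models_mod")  # no ImportError
print(models_mod.celery_test("a@b.com"))  # -> user: a@b.com
```

Applied to the question's code, moving `from users.models import CustomUser` inside the task function in core/tasks.py (while models.py keeps its top-level import) breaks the cycle.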
core/tasks.py
from celery.decorators import task
from …

I am using Django as the framework for my web application. To use Celery, I installed django-celery, celery, and celery[redis]. When it tries to start the Celery worker, it shows the error:
Cannot connect to redis://localhost:6379/0: Error 10061 connecting to localhost:6379. No connection could be made because the target machine actively refused it. Trying again in 6.00 seconds...

I am using a Windows laptop. How do I start the redis://localhost:6379/0 server?
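Error 10061 means nothing is listening on port 6379: the pip packages install only the Python Redis client, never the Redis server itself, and Redis has no official native Windows build (running it under WSL or Docker are the usual options; that suggestion is mine, not from the question). A quick stdlib check for whether anything is actually listening on the broker port:

```python
import socket

def port_open(host, port, timeout=1.0):
    """Return True if something accepts TCP connections at host:port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# True only once a Redis server is actually running on this machine.
print(port_open("localhost", 6379))
```

Until this prints True, the Celery worker will keep retrying with exactly the error shown above.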
This is the result of running the worker
$ celery worker -A myemail.celery -l info
-------------- celery@LAPTOP-ERVJPN6C v4.3.0 (rhubarb)
---- **** -----
--- * *** * -- Windows-10-10.0.18362-SP0 2019-12-30 19:35:13
-- * - **** ---
- ** ---------- [config]
- ** ---------- .> app: myemail:0x38d56d0
- ** ---------- .> transport: redis://localhost:6379/0
- ** ---------- …

I am using Celery with Redis as the broker, and I can see that the queue is actually a Redis list with the serialized tasks as its items.
My question is: if I have an AsyncResult object as the result of calling <task>.delay(), is there a way to determine the item's position in the queue?
Update:
I was finally able to get the position using:
from celery.task.control import inspect
i = inspect()
i.reserved()
But it is a bit slow, because it has to communicate with all the workers.

I have a Django application running on a Linux (Debian) machine. For a long time the application ran perfectly. Recently I had to restart the machine after Celery tasks started hanging, and purging the tasks did not have the desired effect. When I now try to start Celery with
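A faster alternative to inspect() is to read the Redis list directly (LRANGE on the queue key) and scan the messages for the task id, which with the default JSON serializer sits in each message's headers. A sketch against sample payloads; the exact message layout depends on the Celery message-protocol version, so treat the parsing as an assumption to verify against your own queue contents:

```python
import json

def task_position(raw_messages, task_id):
    """raw_messages: the items of LRANGE <queue> 0 -1.
    Redis LPUSHes new tasks, so the HEAD of the queue is the
    END of the list; position 0 means 'next to be consumed'."""
    for pos, raw in enumerate(reversed(raw_messages)):
        headers = json.loads(raw).get("headers", {})
        if headers.get("id") == task_id:
            return pos
    return None  # not in the queue (already consumed, or elsewhere)

# Illustrative payloads shaped like Celery JSON messages, newest first:
msgs = [json.dumps({"headers": {"id": t}}) for t in ("c", "b", "a")]
print(task_position(msgs, "b"))  # -> 1 (one task ahead of it)
```

This only needs one round-trip to Redis instead of polling every worker, but note it cannot see tasks a worker has already prefetched off the list.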
sudo celery -A myapp.tasks worker -Ofair
I get the following traceback:
Traceback (most recent call last):
  File "/usr/local/bin/celery", line 9, in <module>
    load_entry_point('celery==3.1.11', 'console_scripts', 'celery')()
  File "/usr/local/lib/python2.7/dist-packages/celery/__main__.py", line 30, in main
    main()
  File "/usr/local/lib/python2.7/dist-packages/celery/bin/celery.py", line 81, in main
    cmd.execute_from_commandline(argv)
  File "/usr/local/lib/python2.7/dist-packages/celery/bin/celery.py", line 769, in execute_from_commandline
    super(CeleryCommand, self).execute_from_commandline(argv)))
  File "/usr/local/lib/python2.7/dist-packages/celery/bin/base.py", line 304, in execute_from_commandline
    argv = self.setup_app_from_commandline(argv)
  File "/usr/local/lib/python2.7/dist-packages/celery/bin/base.py", line 464, in setup_app_from_commandline
    self.app = self.find_app(app)
  File "/usr/local/lib/python2.7/dist-packages/celery/bin/base.py", line 484, in find_app
    return find_app(app, symbol_by_name=self.symbol_by_name)
  File "/usr/local/lib/python2.7/dist-packages/celery/app/utils.py", line 225, in find_app
    sym = imp(app) …

I have installed celery and redis in my virtualenv 'djangoscrape' using pip install redis celery. Typing redis-server gives -bash: redis-server: command not found. What am I doing wrong?
I also typed:

/Users/Me/.virtualenvs/djangoscrape/bin/celery --app=scraper.celery_tasks:app worker --loglevel=INFO

and the result was:
-------------- celery@MikkyPro v3.1.18 (Cipater)
---- **** -----
--- * *** * -- Darwin-14.5.0-x86_64-i386-64bit
-- * - **** ---
- ** ---------- [config]
- ** ---------- .> app: scraper:0x1084719d0
- ** ---------- .> transport: redis://localhost:6379/0
- ** ---------- .> results: djcelery.backends.database:DatabaseBackend
- *** --- * --- .> concurrency: 8 (prefork)
-- ******* ----
--- ***** …

This configuration turned out to be correct; I was starting Celery the wrong way :( without specifying the project name (celery worker -A hockey_manager -l info).
I upgraded from Django 1.6.5 to Django 1.9 and can no longer get the Celery configuration working.
After nearly two days of searching for a solution, I have found nothing that works.
Celery does not detect my tasks. I have tried:
Dependencies
amqp==2.0.3
celery==3.1.23
Django==1.9.8
django-celery==3.1.17
kombu==3.0.35
Project structure
hockey_manager/__init__.py
from __future__ import absolute_import
# This will make sure the app is always imported when
# Django starts so that shared_task will use this app.
from .celery import app as celery_app
hockey_manager/celery.py
from __future__ import absolute_import
import os
from celery import Celery
# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'hockey_manager.settings.common')
app = Celery('hockey_manager') …
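For tasks to be detected with celery 3.1 under a newer Django, the celery.py module must both load the settings and call autodiscover explicitly. A commonly used shape for such a file, completing the pattern the snippets above start; the settings-module path and the explicit INSTALLED_APPS lambda are assumptions based on the question's own imports, not verbatim from it:

```python
from __future__ import absolute_import
import os
from celery import Celery

# Settings must be configured before the app reads them.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'hockey_manager.settings.common')

app = Celery('hockey_manager')
app.config_from_object('django.conf:settings')

# On celery 3.1 the list of apps must be passed explicitly; without this
# call each app's tasks.py is never imported and no tasks are found.
from django.conf import settings  # noqa: E402
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)
```

Each Django app then needs a tasks.py whose tasks use @shared_task (or the app instance above), and the worker must be started with -A hockey_manager so this module is actually loaded.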