You can use Celery to call a task by name that is registered in a different process (or even on a different machine):
celery.send_task(task_name, args=args, kwargs=kwargs)
(http://celery.readthedocs.org/en/latest/reference/celery.html#celery.Celery.send_task)
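For context, a send_task call returns an AsyncResult handle for the remote task. A minimal sketch (the task name and arguments are placeholders, not from the question):

# 'app' is the Celery application configured in your project.
# send_task only needs the registered task *name*, not its code,
# so the calling process never has to import the task module.
result = app.send_task("tasks.add", args=(2, 3))
print(result.id)  # task id; the AsyncResult can be inspected later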
I would now like to add a callback that is executed as soon as the task finishes, and that runs within the process that called the task.
My Setup
I have a server A that runs a Django-powered website, and I use a basic Celery setup as described here. I don't run a Celery worker on server A.
Then there is server B, which runs (several) Celery workers.
So far, this setup seems to work pretty well. I can send tasks from server A and they get executed on the remote server B.
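For illustration, in a split setup like this, server A only needs a Celery app pointed at the broker that server B's workers consume from; the worker command runs only on B. A rough sketch, with hostnames and URLs being assumptions rather than details from the question:

# On server A (no worker process is started here):
from celery import Celery

app = Celery(
    "website",
    broker="amqp://user:password@server-b:5672//",
    backend="redis://server-b:6379/0",  # needed to fetch results back on A
)

# On server B, workers are started against the same broker, e.g.:
#   celery -A tasks worker --loglevel=info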
The Problem
The only problem is that I'm not able to add a callback function.
The docs say that you can add a callback by providing a follow-up task, so I could do something like this:
@celery.task
def result_handler(result):
    print("YEAH")

celery.send_task(task_name, args=args, kwargs=kwargs, link=result_handler.s())
However, this means I would have to start a worker on server A that registers the task "result_handler". And even if I did, the handler would be called in a process spawned by that worker, not in the Django process that called the task.
The only solution I can come up with is an infinite loop that checks every 2 seconds whether the task is ready, but I think there should be a simpler solution.
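For completeness, the polling workaround described above might look roughly like this (the 2-second interval comes from the question; the task name is a placeholder):

import time

result = app.send_task("tasks.some_task")

# Block the calling (Django) process until the worker on server B
# has finished, then run the "callback" logic right here.
while not result.ready():
    time.sleep(2)

print("YEAH", result.get())  # requires a result backend to be configured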