dim*_*myG 3 python event-loop coroutine python-asyncio python-multiprocessing
If I run the following code:
import asyncio
import time
import concurrent.futures

def cpu_bound(mul):
    for i in range(mul * 10**8):
        i += 1
    print('result = ', i)
    return i

async def say_after(delay, what):
    print('sleeping async...')
    await asyncio.sleep(delay)
    print(what)

# The run_in_pool function must not block the event loop
async def run_in_pool():
    with concurrent.futures.ProcessPoolExecutor() as executor:
        result = executor.map(cpu_bound, [1, 1, 1])

async def main():
    task1 = asyncio.create_task(say_after(0.1, 'hello'))
    task2 = asyncio.create_task(run_in_pool())
    task3 = asyncio.create_task(say_after(0.1, 'world'))
    print(f"started at {time.strftime('%X')}")
    await task1
    await task2
    await task3
    print(f"finished at {time.strftime('%X')}")

if __name__ == '__main__':
    asyncio.run(main())
the output is:
started at 18:19:28
sleeping async...
result = 100000000
result = 100000000
result = 100000000
sleeping async...
hello
world
finished at 18:19:34
This shows that the event loop blocks until the CPU-bound jobs (task2) have finished, and only then proceeds with task3.
If instead I run only a single CPU-bound job (i.e. run_in_pool is the following):
async def run_in_pool():
    loop = asyncio.get_running_loop()
    with concurrent.futures.ProcessPoolExecutor() as executor:
        result = await loop.run_in_executor(executor, cpu_bound, 1)
then the event loop does not seem to block, since the output is:
started at 18:16:23
sleeping async...
sleeping async...
hello
world
result = 100000000
finished at 18:16:28
How can I run many CPU-bound jobs (in task2) in a process pool without blocking the event loop?
use*_*342 13
As you discovered, you need to use asyncio's own run_in_executor to wait for the submitted jobs to finish without blocking the event loop. Asyncio doesn't provide an equivalent of map, but it's not hard to emulate one:
async def run_in_pool():
    loop = asyncio.get_running_loop()
    with concurrent.futures.ProcessPoolExecutor() as executor:
        futures = [loop.run_in_executor(executor, cpu_bound, i)
                   for i in (1, 1, 1)]
        result = await asyncio.gather(*futures)
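For completeness, here is a minimal self-contained sketch (my own variation, not part of the original answer) that returns the gathered results to the caller; asyncio.gather yields the results in the same order as the submitted jobs:

import asyncio
import concurrent.futures

def cpu_bound(mul):
    # Same CPU-bound helper as in the question, without the print.
    for i in range(mul * 10**8):
        i += 1
    return i

async def run_in_pool():
    loop = asyncio.get_running_loop()
    with concurrent.futures.ProcessPoolExecutor() as executor:
        futures = [loop.run_in_executor(executor, cpu_bound, i)
                   for i in (1, 1, 1)]
        # gather preserves input order: results[k] is the result of the k-th job
        return await asyncio.gather(*futures)

async def main():
    results = await run_in_pool()
    print('results =', results)

if __name__ == '__main__':
    asyncio.run(main())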