fre*_*bie · Tags: python, coroutine, control-flow, async-await, python-asyncio
I'd like to be able to yield from multiple async coroutines. asyncio's as_completed comes close to what I'm looking for (i.e. I want any of the coroutines to be able to yield back to the caller at any time and then resume), but it only seems to allow regular coroutines that return once.

Here's what I have so far:
import asyncio


async def test(id_):
    print(f'{id_} sleeping')
    await asyncio.sleep(id_)
    return id_


async def test_gen(id_):
    count = 0
    while True:
        print(f'{id_} sleeping')
        await asyncio.sleep(id_)
        yield id_
        count += 1
        if count > 5:
            return


async def main():
    runs = [test(i) for i in range(3)]
    for i in asyncio.as_completed(runs):
        i = await i
        print(f'{i} yielded')


if __name__ == '__main__':
    loop = asyncio.get_event_loop()
    loop.run_until_complete(main())
    loop.close()
Replacing runs = [test(i) for i in range(3)] with runs = [test_gen(i) for i in range(3)], and having for i in asyncio.as_completed(runs) iterate over every yield, is what I'm after.

Is this expressible in Python, and are there any third-party libraries that offer more options for coroutine control flow than the standard library does?

Thanks
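For reference, the desired behaviour can also be expressed with only the standard library by having each async generator push its values into a shared queue. This is a rough sketch, not a standard API: merge_gens, ticker and _SENTINEL are names I made up, and the sleeps are shortened so the example runs quickly.

```python
import asyncio

_SENTINEL = object()  # marks one producer as finished


async def merge_gens(*gens):
    """Yield items from several async generators as they arrive."""
    queue = asyncio.Queue()

    async def drain(gen):
        # Forward everything this generator yields, then signal completion
        async for item in gen:
            await queue.put(item)
        await queue.put(_SENTINEL)

    tasks = [asyncio.ensure_future(drain(g)) for g in gens]
    finished = 0
    while finished < len(tasks):
        item = await queue.get()
        if item is _SENTINEL:
            finished += 1
        else:
            yield item


async def ticker(id_, n):
    # Stand-in for test_gen, with short sleeps
    for _ in range(n):
        await asyncio.sleep(0.01 * id_)
        yield id_


async def main():
    async for x in merge_gens(ticker(1, 3), ticker(2, 3)):
        print(f'{x} yielded')


asyncio.run(main())
```

Note that this sketch doesn't handle cancellation or producer exceptions; a library like the one in the answer below takes care of those cases.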
You can use aiostream.stream.merge:
from aiostream import stream


async def main():
    runs = [test_gen(i) for i in range(3)]
    async for x in stream.merge(*runs):
        print(f'{x} yielded')
Run it within a safe context to make sure the generators are properly cleaned up after the iteration:
async def main():
    runs = [test_gen(i) for i in range(3)]
    merged = stream.merge(*runs)
    async with merged.stream() as streamer:
        async for x in streamer:
            print(f'{x} yielded')
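The cleanup concern here is general to async generators, not specific to aiostream: breaking out of an async for leaves the generator suspended, and its finally block only runs once the generator is closed. A minimal stdlib illustration (gen and events are names of my own choosing):

```python
import asyncio

events = []


async def gen():
    try:
        while True:
            await asyncio.sleep(0)
            yield 1
    finally:
        events.append('cleaned up')


async def main():
    g = gen()
    async for _ in g:
        break              # stop iterating early; finally has NOT run yet
    assert events == []    # generator is merely suspended, not finalized
    await g.aclose()       # closing it raises GeneratorExit inside, running finally
    assert events == ['cleaned up']


asyncio.run(main())
```

aiostream's async with ... .stream() context performs this kind of finalization for every merged generator, even if the iteration exits early or raises.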
Or use pipes to make it more compact:
from aiostream import stream, pipe


async def main():
    runs = [test_gen(i) for i in range(3)]
    await (stream.merge(*runs) | pipe.print('{} yielded'))
There are more examples in the documentation.
In reply to @nirvana-msu's comment:

It is possible to identify which generator a given value came from by preparing the sources accordingly:
async def main():
    runs = [test_gen(i) for i in range(3)]
    # Bind i as a default argument so each lambda keeps its own index
    sources = [stream.map(xs, lambda x, i=i: (i, x)) for i, xs in enumerate(runs)]
    async for i, x in stream.merge(*sources):
        print(f'ID {i}: {x}')
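Note that when tagging sources with a lambda inside a comprehension, the loop index should be bound as a default argument (i=i): Python closures are late-binding, so otherwise every lambda would see the final value of i by the time the merged stream calls it. A quick stdlib illustration:

```python
# Closures capture variables, not values, so all three "late" lambdas
# end up sharing the final value of i. Binding i as a default argument
# freezes the value at definition time.
late = [lambda x: (i, x) for i in range(3)]
bound = [lambda x, i=i: (i, x) for i in range(3)]

print([f(0) for f in late])   # every lambda sees i == 2
print([f(0) for f in bound])  # each lambda keeps its own index
```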