Multiple loops with asyncio

Asked by Mat*_*att (18 votes) — tags: python, asynchronous, python-3.x, python-asyncio, aiohttp

Is it possible to have multiple event loops with asyncio? If the answer is yes, how can I do that? My use case is:

  • I extract URLs from a list of websites asynchronously
  • For each "sub-URL list", I crawl them asynchronously

An example of extracting the URLs:

import asyncio
import aiohttp
from suburls import extractsuburls

@asyncio.coroutine
def extracturls(url):
    subtasks = []
    response = yield from aiohttp.request('GET', url)
    suburl_list = yield from response.text()
    for suburl in suburl_list:
        subtasks.append(asyncio.Task(extractsuburls(suburl)))
    loop = asyncio.get_event_loop()
    loop.run_until_complete(asyncio.gather(*subtasks))

if __name__ == '__main__':
    urls_list = ['http://example1.com', 'http://example2.com']
    subtasks = []
    for url in urls_list:
        subtasks.append(asyncio.Task(extracturls(url)))
    loop = asyncio.get_event_loop()
    loop.run_until_complete(asyncio.gather(*subtasks))
    loop.close()

If I run this code, I get an error when Python tries to start the second loop, saying that the loop is already running.
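That error can be reproduced without aiohttp: calling `run_until_complete` on a loop that is already running raises a `RuntimeError`. A minimal stdlib-only sketch (the coroutine names here are illustrative, not from the question):

```python
import asyncio

async def inner():
    return "done"

async def outer():
    loop = asyncio.get_running_loop()
    coro = inner()
    # Starting the loop again from inside a coroutine is not allowed:
    # run_until_complete() raises RuntimeError because this loop is
    # already running.
    try:
        loop.run_until_complete(coro)
    except RuntimeError as exc:
        coro.close()  # avoid a "coroutine was never awaited" warning
        return str(exc)

loop = asyncio.new_event_loop()
msg = loop.run_until_complete(outer())
loop.close()
print(msg)  # → This event loop is already running
```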

PS: my module "suburls" uses aiohttp to perform the web requests.

Edit:

OK, I tried this solution:

import asyncio
import aiohttp
from suburls import extractsuburls

@asyncio.coroutine
def extracturls(url):
    subtasks = []
    response = yield from aiohttp.request('GET', url)
    suburl_list = yield from response.text()
    jobs_loop = asyncio.new_event_loop()
    for suburl in suburl_list:
        subtasks.append(asyncio.Task(extractsuburls(suburl)))
    asyncio.set_event_loop(jobs_loop)
    jobs_loop.run_until_complete(asyncio.gather(*subtasks))
    jobs_loop.close()

if __name__ == '__main__':
    urls_list = ['http://example1.com', 'http://example2.com']
    subtasks = []
    for url in urls_list:
        subtasks.append(asyncio.Task(extracturls(url)))
    loop = asyncio.get_event_loop()
    loop.run_until_complete(asyncio.gather(*subtasks))
    loop.close()

But I get this error: `loop argument must agree with Future`.

Any ideas?

Answered by And*_*lov (31 votes)

You don't need several event loops, just use `yield from asyncio.gather(*subtasks)` inside the `extracturls()` coroutine:

import asyncio
import aiohttp
from suburls import extractsuburls

@asyncio.coroutine
def extracturls(url):
    subtasks = []
    response = yield from aiohttp.request('GET', url)
    suburl_list = yield from response.text()
    for suburl in suburl_list:
        subtasks.append(extractsuburls(suburl))
    yield from asyncio.gather(*subtasks)

if __name__ == '__main__':
    urls_list = ['http://example1.com', 'http://example2.com']
    subtasks = []
    for url in urls_list:
        subtasks.append(extracturls(url))
    loop = asyncio.get_event_loop()
    loop.run_until_complete(asyncio.gather(*subtasks))
    loop.close()

As a result, you wait on the subtasks until `extracturls` finishes.
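For reference, on Python 3.5+ the same one-loop, nested-`gather` pattern reads more naturally with `async`/`await`. The sketch below is self-contained: `SITE_MAP` and the `asyncio.sleep` call are stand-ins for the aiohttp request, which this snippet does not perform:

```python
import asyncio

# Stand-in for the web requests in the answer above: maps each URL
# to its list of sub-URLs (hypothetical data, for illustration only).
SITE_MAP = {
    'http://example1.com': ['http://example1.com/a', 'http://example1.com/b'],
    'http://example2.com': ['http://example2.com/a'],
}

crawled = []

async def extracturls(url):
    await asyncio.sleep(0)  # simulate network I/O
    crawled.append(url)
    suburls = SITE_MAP.get(url, [])
    # One gather per level: subtasks run concurrently on the same loop,
    # and this coroutine waits until all of them finish.
    await asyncio.gather(*(extracturls(s) for s in suburls))

async def main(urls):
    await asyncio.gather(*(extracturls(u) for u in urls))

asyncio.run(main(['http://example1.com', 'http://example2.com']))
print(len(crawled))  # → 5 (two top-level URLs plus three sub-URLs)
```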

  • `asyncio.gather` accepts both coroutines and tasks, so you can skip the two explicit `asyncio.Task` calls. (3 upvotes)
  • Of course! Just wrap the task coroutine in a `wait_for` call, like `asyncio.Task(asyncio.wait_for(extractsuburls(url), 10.0))`. (2 upvotes)
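A stdlib-only sketch of the timeout idea from the comment above: each coroutine is wrapped in `asyncio.wait_for`, and `return_exceptions=True` lets `gather` report a timeout in one slot without cancelling its siblings. `slow_fetch` is a stand-in for a real request, not part of the original code:

```python
import asyncio

async def slow_fetch(url, delay):
    # Stand-in for a network request that takes `delay` seconds.
    await asyncio.sleep(delay)
    return url

async def main():
    tasks = [
        asyncio.wait_for(slow_fetch('http://example1.com', 0.01), timeout=1.0),
        asyncio.wait_for(slow_fetch('http://example2.com', 10), timeout=0.05),
    ]
    # return_exceptions=True keeps the TimeoutError from cancelling the
    # whole gather; each result slot holds a value or an exception.
    return await asyncio.gather(*tasks, return_exceptions=True)

results = asyncio.run(main())
print(results)
```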