I have two scripts, scraper.py and db_control.py. In scraper.py I have something like this:
...
def scrap(category, field, pages, search, use_proxy, proxy_file):
    ...
    loop = asyncio.get_event_loop()
    to_do = [ get_pages(url, params, conngen) for url in urls ]
    wait_coro = asyncio.wait(to_do)
    res, _ = loop.run_until_complete(wait_coro)
    ...
    loop.close()
    return [ x.result() for x in res ]
...
And in db_control.py:
from scraper import scrap
...
while new < 15:
    data = scrap(category, field, pages, search, use_proxy, proxy_file)
    ...
...
In theory, the scraper should keep restarting at unpredictable times until it has gathered enough data. But when new is not immediately > 15, this error occurs:
File "/usr/lib/python3.4/asyncio/base_events.py", line 293, in run_until_complete
self._check_closed()
File "/usr/lib/python3.4/asyncio/base_events.py", line 265, in _check_closed
raise RuntimeError('Event loop is closed')
RuntimeError: Event loop is closed
But if I run scrap() just once, the script works fine. So I guess there is some problem with re-creating the loop via loop = asyncio.get_event_loop(); I tried that, but nothing changed. How can I solve this? These are of course only snippets of my code; if you think the problem may be elsewhere, the full code is here.
The methods run_until_complete, run_forever, run_in_executor, create_task, and call_at explicitly check the loop and raise an exception if it is closed.
Quote from the docs - BaseEventLoop.close:
This is idempotent and irreversible.
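A minimal standalone sketch of that behaviour (using a toy coroutine, not the question's code) reproduces the exact error:

import asyncio

loop = asyncio.get_event_loop()
loop.run_until_complete(asyncio.sleep(0))  # runs fine
loop.close()                               # irreversible: the loop can never run again
loop.run_until_complete(asyncio.sleep(0))  # RuntimeError: Event loop is closed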
Unless you have some (good) reason to do otherwise, you can probably just omit the close line:
def scrap(category, field, pages, search, use_proxy, proxy_file):
    # ...
    loop = asyncio.get_event_loop()
    to_do = [ get_pages(url, params, conngen) for url in urls ]
    wait_coro = asyncio.wait(to_do)
    res, _ = loop.run_until_complete(wait_coro)
    # ...
    # loop.close()
    return [ x.result() for x in res ]
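With close() gone, every call reuses the same default loop, so the while loop in db_control.py can call scrap() as many times as it likes. A standalone sketch of the pattern (toy coroutine standing in for get_pages):

import asyncio

def scrap_once():
    loop = asyncio.get_event_loop()   # same default loop on every call
    return loop.run_until_complete(asyncio.sleep(0, result="data"))

for _ in range(3):
    print(scrap_once())               # no RuntimeError: the loop was never closed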
If you want a completely fresh loop each time, you have to create it manually and set it as the default:
def scrap(category, field, pages, search, use_proxy, proxy_file):
    # ...
    loop = asyncio.new_event_loop()
    asyncio.set_event_loop(loop)
    to_do = [ get_pages(url, params, conngen) for url in urls ]
    wait_coro = asyncio.wait(to_do)
    res, _ = loop.run_until_complete(wait_coro)
    # ...
    return [ x.result() for x in res ]
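And if you do want each call to clean up after itself, the two approaches combine naturally: create a fresh loop per call and close it in a finally block. A sketch of that variant, still assuming the question's get_pages, urls, params, and conngen:

import asyncio

def scrap(category, field, pages, search, use_proxy, proxy_file):
    # ... build urls, params, conngen as in the question ...
    loop = asyncio.new_event_loop()
    asyncio.set_event_loop(loop)
    try:
        to_do = [ get_pages(url, params, conngen) for url in urls ]
        res, _ = loop.run_until_complete(asyncio.wait(to_do))
        return [ x.result() for x in res ]
    finally:
        loop.close()  # safe here: the next call creates a brand-new loop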