python-3.x python-requests python-asyncio aiohttp
I've just started using the asyncio library in Python 3.4 and wrote a small program that tries to fetch 50 web pages concurrently. The program blows up with a "too many open files" exception after a few hundred requests.
I thought my fetch method closed the connection via the 'response.read_and_close()' call.
Any idea what's going on here? Am I approaching this problem the right way?
import asyncio
import aiohttp

@asyncio.coroutine
def fetch(url):
    response = yield from aiohttp.request('GET', url)
    response = yield from response.read_and_close()
    return response.decode('utf-8')

@asyncio.coroutine
def print_page(url):
    page = yield from fetch(url)
    # print(page)

@asyncio.coroutine
def process_batch_of_urls(round, urls):
    print("Round starting: %d" % round)
    coros = []
    for url in urls:
        coros.append(asyncio.Task(print_page(url)))
    yield from asyncio.gather(*coros)
    print("Round finished: %d" % round)

@asyncio.coroutine
def process_all():
    api_url = 'https://google.com'
    for i in range(10):
        urls = []
        for url in range(50):
            urls.append(api_url)
        yield from process_batch_of_urls(i, urls)

loop = asyncio.get_event_loop()
loop.run_until_complete(process_all())
The error I get is:
Traceback (most recent call last):
File "/usr/local/lib/python3.4/site-packages/aiohttp/client.py", line 106, in request
File "/usr/local/lib/python3.4/site-packages/aiohttp/connector.py", line 135, in connect
File "/usr/local/lib/python3.4/site-packages/aiohttp/connector.py", line 242, in _create_connection
File "/usr/local/Cellar/python3/3.4.1/Frameworks/Python.framework/Versions/3.4/lib/python3.4/asyncio/base_events.py", line 424, in create_connection
File "/usr/local/Cellar/python3/3.4.1/Frameworks/Python.framework/Versions/3.4/lib/python3.4/asyncio/base_events.py", line 392, in create_connection
File "/usr/local/Cellar/python3/3.4.1/Frameworks/Python.framework/Versions/3.4/lib/python3.4/socket.py", line 123, in __init__
OSError: [Errno 24] Too many open files
During handling of the above exception, another exception occurred:
Aha, I see your problem.
An explicit connector definitely resolves the issue.
https://github.com/KeepSafe/aiohttp/pull/79 should also fix it for the implicit connector.
Thank you very much for finding this resource leak in aiohttp.
UPD. aiohttp 0.8.2 does not have this problem.
OK, I finally got it to work.
It turns out I had to use a TCPConnector to pool connections.
So I made this variable:
connector = aiohttp.TCPConnector(share_cookies=True, loop=loop)
and passed it into every GET request. My new fetch routine looks like this:
@asyncio.coroutine
def fetch(url):
    data = ""
    try:
        yield from asyncio.sleep(1)
        response = yield from aiohttp.request('GET', url, connector=connector)
    except Exception as exc:
        print('...', url, 'has error', repr(str(exc)))
    else:
        data = (yield from response.read()).decode('utf-8', 'replace')
        response.close()
    return data
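Besides pooling through a shared connector, another common way to avoid "too many open files" is to cap how many requests run at once with an asyncio.Semaphore. A minimal sketch of the idea, using modern async/await syntax and a dummy sleep in place of the real aiohttp call (the counter dict is just instrumentation added here to show the cap holds):

```python
import asyncio

async def fetch(url, sem, counter):
    # The semaphore admits at most `limit` tasks into this block at a
    # time, so at most `limit` sockets would ever be open at once.
    async with sem:
        counter["now"] += 1
        counter["peak"] = max(counter["peak"], counter["now"])
        await asyncio.sleep(0.01)  # stand-in for the real network I/O
        counter["now"] -= 1
        return url

async def main(urls, limit=10):
    sem = asyncio.Semaphore(limit)
    counter = {"now": 0, "peak": 0}
    results = await asyncio.gather(*(fetch(u, sem, counter) for u in urls))
    return results, counter["peak"]

results, peak = asyncio.run(main(["https://google.com"] * 50))
print(len(results), peak)  # peak stays <= 10 even with 50 tasks scheduled
```

This scales to any number of URLs: all 50 coroutines are scheduled immediately, but only `limit` of them hold a connection at any moment.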