Multiple (asynchronous) connections with urllib2 or another HTTP library?

12 python asynchronous urllib2 python-2.5

I have code like this:

results = []
for p in range(1, 1000):
    result = False
    while result is False:
        ret = urllib2.Request('http://server/?' + str(p))
        try:
            result = process(urllib2.urlopen(ret).read())
        except (urllib2.HTTPError, urllib2.URLError):
            # retry this page until it succeeds
            pass
    results.append(result)

I'd like to make two or three requests at the same time to speed this up. Can I use urllib2 for that, and if so, how? If not, which other library should I use? Thanks.
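One stdlib-only way to get "two or three requests at a time" is to run the blocking calls in a small thread pool. A minimal sketch of the pattern (the `fetch` function here is a stand-in for the real `urllib2.urlopen` call, since `http://server/` is not a reachable host):

```python
from multiprocessing.dummy import Pool  # a thread pool with the multiprocessing.Pool API


def fetch(p):
    # stand-in for: urllib2.urlopen('http://server/?' + str(p)).read()
    return 'response for %d' % p


pool = Pool(3)  # two or three concurrent workers, as asked
results = pool.map(fetch, range(1, 10))  # preserves input order
pool.close()
pool.join()
print(results[0])
```

`Pool.map` blocks until every worker finishes and returns the results in input order, so the surrounding code stays as simple as the original sequential loop.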

Pio*_*ost 10

You can do this with asynchronous I/O.

requests + gevent = grequests

GRequests allows you to use Requests with Gevent to make asynchronous HTTP requests easily.

import grequests

urls = [
    'http://www.heroku.com',
    'http://tablib.org',
    'http://httpbin.org',
    'http://python-requests.org',
    'http://kennethreitz.com'
]

rs = (grequests.get(u) for u in urls)
grequests.map(rs)


小智 9

Take a look at gevent, a coroutine-based Python networking library that uses greenlet to provide a high-level synchronous API on top of the libevent event loop.

Example:

#!/usr/bin/python
# Copyright (c) 2009 Denis Bilenko. See LICENSE for details.

"""Spawn multiple workers and wait for them to complete"""

urls = ['http://www.google.com', 'http://www.yandex.ru', 'http://www.python.org']

import gevent
from gevent import monkey

# patches stdlib (including socket and ssl modules) to cooperate with other greenlets
monkey.patch_all()

import urllib2


def print_head(url):
    print 'Starting %s' % url
    data = urllib2.urlopen(url).read()
    print '%s: %s bytes: %r' % (url, len(data), data[:50])

jobs = [gevent.spawn(print_head, url) for url in urls]

gevent.joinall(jobs)


Mes*_*ssa 9

So, it's 2016 and we have Python 3.4+ with the built-in asyncio module for asynchronous I/O. We can use aiohttp as an HTTP client to download multiple URLs in parallel.

import asyncio
from aiohttp import ClientSession

async def fetch(url):
    async with ClientSession() as session:
        async with session.get(url) as response:
            return await response.read()

async def run(loop, r):
    url = "http://localhost:8080/{}"
    tasks = []
    for i in range(r):
        task = asyncio.ensure_future(fetch(url.format(i)))
        tasks.append(task)

    responses = await asyncio.gather(*tasks)
    # you now have all response bodies in this variable
    print(responses)

loop = asyncio.get_event_loop()
future = asyncio.ensure_future(run(loop, 4))
loop.run_until_complete(future)

Source: copied from http://pawelmhm.github.io/asyncio/python/aiohttp/2016/04/22/asyncio-aiohttp.html
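The fan-out in the answer above does not depend on aiohttp itself: `asyncio.gather` collects the results of many tasks in task order. A network-free sketch of the same pattern, with `asyncio.sleep` standing in for the HTTP round-trip:

```python
import asyncio


async def fetch(i):
    # stand-in for an aiohttp request; real code would `await session.get(...)`
    await asyncio.sleep(0.01)
    return 'body %d' % i


async def run(r):
    tasks = [asyncio.ensure_future(fetch(i)) for i in range(r)]
    # all fetches run concurrently; gather returns results in task order
    return await asyncio.gather(*tasks)


loop = asyncio.new_event_loop()
responses = loop.run_until_complete(run(4))
loop.close()
print(responses)
```

On Python 3.7+ the last four lines can be replaced by `responses = asyncio.run(run(4))`.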