Posted by Mat*_*att

Pausing a Python generator

I have a Python generator that produces a large amount of data, which uses up a lot of memory. Is there a way to detect whether the produced data has been "consumed" by the code using the generator, and if so, pause until it is consumed?

def multi_grab(urls, proxy=None, ref=None, xpath=False, compress=True, delay=10, pool_size=50, retries=1, http_obj=None):
    # `web`, `pool` and `grab` come from the asker's scraping helpers (a gevent-style pool)
    if proxy is not None:
        proxy = web.ProxyManager(proxy, delay=delay)
        pool_size = len(proxy.records)  # was len(pool_size.records) -- pool_size is an int, so that would raise AttributeError
    work_pool = pool.Pool(pool_size)
    partial_grab = partial(grab, proxy=proxy, post=None, ref=ref, xpath=xpath, compress=compress, include_url=True, retries=retries, http_obj=http_obj)
    for result in work_pool.imap_unordered(partial_grab, urls):
        if result:
            yield result
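For what it's worth, a plain generator already does this: the body only runs when the consumer asks for the next item, so execution is suspended at every `yield` until `next()` is called. A minimal demonstration (not the asker's code):

```python
def numbers():
    for i in range(3):
        print('producing %d' % i)  # runs lazily, one step per next() call
        yield i

gen = numbers()
print('generator created, nothing produced yet')
first = next(gen)   # executes the body up to the first yield, then pauses
print('consumed %d' % first)
```

So if the consumer is slow, `multi_grab` itself is paused at `yield result`; any memory growth comes from work the pool does ahead of that point.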

Run from:

if __name__ == '__main__':
    links = set(link for link in grab('http://www.reddit.com',xpath=True).xpath('//a/@href') if link.startswith('http') and 'reddit' not in link)
    print '%s links' % len(links)
    counter = 1
    for url, data in multi_grab(links,pool_size=10):
        print 'got', url, counter, len(data)
        counter += 1
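If the concern is the pool racing ahead of the `for` loop (e.g. `imap_unordered` buffering finished results), a bounded queue between producer and consumer gives explicit backpressure. A hedged sketch using only the stdlib, where `threading`/`queue` stand in for the question's gevent pool and the names `fetch` and `bounded_results` are illustrative:

```python
import threading
import queue

def fetch(url):
    # Hypothetical stand-in for the real grab() call
    return 'data for %s' % url

def bounded_results(urls, maxsize=2):
    """Yield fetched results, never letting more than `maxsize`
    unconsumed results accumulate in memory."""
    q = queue.Queue(maxsize=maxsize)   # put() blocks when the queue is full
    sentinel = object()                # marks end of production

    def producer():
        for url in urls:
            q.put(fetch(url))          # pauses here until the consumer catches up
        q.put(sentinel)

    threading.Thread(target=producer, daemon=True).start()
    while True:
        item = q.get()
        if item is sentinel:
            return
        yield item

for result in bounded_results(['u1', 'u2', 'u3']):
    print(result)
```

The producer thread blocks on `q.put()` once `maxsize` results are waiting, so at most `maxsize` items are ever held in memory regardless of how fast the workers run.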

Tags: python, generator

3 votes · 1 answer · 956 views