Python script using gevent Pool consumes large amounts of memory, locks up

Nik*_*hil 5 python gevent python-2.7

I have a fairly simple Python script that downloads URLs using gevent.pool (see below). The script runs for a few days and then locks up. I noticed that memory usage is very high by that point. Am I using gevent incorrectly?

import sys

from gevent import monkey
monkey.patch_all()
import urllib2

from gevent.pool import Pool

inputFile = open(sys.argv[1], 'r')
urls = []
counter = 0
for line in inputFile:
    counter += 1
    urls.append(line.strip())
inputFile.close()

outputDirectory = sys.argv[2]

def fetch(url):
    try:
        body = urllib2.urlopen("http://" + url, None, 5).read()
        if len(body) > 0:
            outputFile = open(outputDirectory + "/" + url, 'w')
            outputFile.write(body)
            outputFile.close()
            print "Success", url
    except:
        pass

pool = Pool(int(sys.argv[3]))
pool.map(fetch, urls)

fal*_*tru 2

        body = urllib2.urlopen("http://" + url, None, 5).read()

The line above reads the entire response body into memory as a single string. To prevent this, change fetch() to read in chunks, as follows:

def fetch(url):
    try:
        u = urllib2.urlopen("http://" + url, None, 5)
        try:
            with open(outputDirectory + "/" + url, 'w') as outputFile:
                while True:
                    chunk = u.read(65536)
                    if not chunk:
                        break
                    outputFile.write(chunk)
        finally:
            u.close()
        print "Success", url
    except Exception:
        print "Fail", url
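The chunked while-loop above is essentially what the standard library's `shutil.copyfileobj` does. A minimal sketch of the same idea (Python 3 syntax, with an in-memory buffer standing in for the HTTP response so it runs without network access):

```python
import io
import os
import shutil
import tempfile

# Stand-in for the urlopen() response object: a large in-memory byte stream.
# In the real script this would be the HTTP response from urlopen().
source = io.BytesIO(b"x" * (256 * 1024))

with tempfile.NamedTemporaryFile(delete=False) as outputFile:
    # Copy in 64 KiB chunks, like the explicit while-loop in the answer,
    # so only one chunk is held in memory at a time.
    shutil.copyfileobj(source, outputFile, length=65536)
    path = outputFile.name

copied_size = os.path.getsize(path)
print(copied_size)  # 262144
os.remove(path)
```

Either form keeps peak memory per greenlet bounded by the chunk size instead of the full response body.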