Closing urllib2 connections

Big*_*gie 9 python ftp connection urllib2

I'm using urllib2 to load files from FTP and HTTP servers.

Some servers support only one connection per IP. The problem is that urllib2 does not close the connection immediately. Look at this example program:

from urllib2 import urlopen
from time import sleep

url = 'ftp://user:pass@host/big_file.ext'

def load_file(url):
    f = urlopen(url)
    loaded = 0
    while True:
        data = f.read(1024)
        if data == '':  # in Python 2, read() returns '' at EOF
            break
        loaded += len(data)
    f.close()
    #sleep(1)
    print('loaded {0}'.format(loaded))

load_file(url)
load_file(url)

The code loads two files (here the two files are the same) from an FTP server that supports only one connection. This prints the following log:

loaded 463675266
Traceback (most recent call last):
  File "conection_test.py", line 20, in <module>
    load_file(url)
  File "conection_test.py", line 7, in load_file
    f = urlopen(url)
  File "/usr/lib/python2.6/urllib2.py", line 126, in urlopen
    return _opener.open(url, data, timeout)
  File "/usr/lib/python2.6/urllib2.py", line 391, in open
    response = self._open(req, data)
  File "/usr/lib/python2.6/urllib2.py", line 409, in _open
    '_open', req)
  File "/usr/lib/python2.6/urllib2.py", line 369, in _call_chain
    result = func(*args)
  File "/usr/lib/python2.6/urllib2.py", line 1331, in ftp_open
    fw = self.connect_ftp(user, passwd, host, port, dirs, req.timeout)
  File "/usr/lib/python2.6/urllib2.py", line 1352, in connect_ftp
    fw = ftpwrapper(user, passwd, host, port, dirs, timeout)
  File "/usr/lib/python2.6/urllib.py", line 854, in __init__
    self.init()
  File "/usr/lib/python2.6/urllib.py", line 860, in init
    self.ftp.connect(self.host, self.port, self.timeout)
  File "/usr/lib/python2.6/ftplib.py", line 134, in connect
    self.welcome = self.getresp()
  File "/usr/lib/python2.6/ftplib.py", line 216, in getresp
    raise error_temp, resp
urllib2.URLError: <urlopen error ftp error: 421 There are too many connections from your internet address.>

So the first file is loaded and the second fails, because the first connection was not closed.

But when I use sleep(1) after f.close(), the error does not occur:

loaded 463675266
loaded 463675266

Is there a way to force the connection to close, so that the second download does not fail?
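One workaround, not discussed in the original post, is to bypass urllib2 for FTP downloads and drive ftplib directly, so the control connection can be torn down explicitly with quit(). This is a sketch; the function name and parameters are made up for illustration, and it assumes a plain (non-TLS) FTP server:

```python
from ftplib import FTP

def load_file_ftplib(host, user, passwd, path, blocksize=1024):
    """Download `path` and return the number of bytes read.

    Unlike urllib2's ftpwrapper, the control connection is
    closed explicitly with quit() before returning, so a
    one-connection-per-IP server will accept the next call.
    """
    loaded = [0]  # mutable cell so the callback can update it

    def count(chunk):
        loaded[0] += len(chunk)

    ftp = FTP(host)
    try:
        ftp.login(user, passwd)
        ftp.retrbinary('RETR ' + path, count, blocksize)
    finally:
        ftp.quit()  # sends QUIT and closes the control connection
    return loaded[0]
```

Because quit() runs in a finally block, the server sees the session end immediately, instead of whenever urllib2's wrapper happens to be garbage-collected.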

San*_*nda 0

Alex Martelli answered a similar question. Read this: Should I call close() after urllib.urlopen()?

In short:

import contextlib
import urllib

with contextlib.closing(urllib.urlopen(u)) as x:
    # ... work with x; close() is guaranteed on exit

  • As you can see [here](http://docs.python.org/library/contextlib.html#contextlib.closing), `contextlib.closing` merely calls `close()`, which is exactly what I already do manually in the code above. So the problem remains: the second download will fail, because the first connection is not closed immediately by `close()`. (4 upvotes)
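The commenter's point, that contextlib.closing guarantees nothing beyond a close() call on the wrapped object, can be verified without any network access using a stand-in object (`DummyConn` is made up for illustration, not part of urllib):

```python
import contextlib

class DummyConn(object):
    """Stand-in for a urllib response object with a close() method."""
    def __init__(self):
        self.closed = False

    def close(self):
        self.closed = True

conn = DummyConn()
with contextlib.closing(conn):
    pass  # work with the connection here

print(conn.closed)  # -> True: the wrapper called close() on exit
```

Whether that close() actually tears down the underlying FTP control connection is up to the wrapped object, which is why the original problem can persist even with `contextlib.closing`.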