Cannot serialize a bytes object larger than 4 GiB in deep learning

Nor*_*ran 1 neural-network deep-learning keras tensorflow google-colaboratory

I am building a Siamese network with Keras in a Google Colab environment to verify images, using this code from GitHub. But when I try to run the pickle.dump code, I get an error:

import os
import pickle

with open(os.path.join(save_path, "train.pickle"), "wb") as f:
    pickle.dump((X, c), f)

The error message is:

---------------------------------------------------------------------------
OverflowError                             Traceback (most recent call last)
<ipython-input-7-af9d0618d385> in <module>()
      3 
      4 with open(os.path.join(save_path,"train.pickle"), "wb") as f:
----> 5         pickle.dump((X,c),f)
      6 
      7 

OverflowError: cannot serialize a bytes object larger than 4 GiB

I found some related questions on this site, but none of the answers helped. How can I fix this error?

Bob*_*ith 5

Use pickle with protocol=4. Pickle protocol 3 (the default in Python 3.0–3.7) cannot serialize a single bytes object larger than 4 GiB; protocol 4 (available since Python 3.4) removed that limit. For example:

pickle.dump((X,c), f, protocol=4)
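As a minimal sketch of the round trip (using small stand-in data and an in-memory buffer rather than the questioner's `train.pickle` file, since the real `X` and `c` are not shown):

```python
import io
import pickle

# Small stand-in for the large (X, c) tuple from the question.
# Protocol 4 lifts protocol 3's 4 GiB limit on individual bytes objects.
data = (b"x" * 1024, [1, 2, 3])

buf = io.BytesIO()
pickle.dump(data, buf, protocol=4)  # explicit protocol=4 avoids the OverflowError

# Loading needs no protocol argument; it is detected from the stream.
buf.seek(0)
restored = pickle.load(buf)
assert restored == data
```

Note that on Python 3.8+ the default protocol is already 5, so the explicit argument only matters on older interpreters such as the one Colab ran at the time.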