Post by Abh*_*bey

How to handle very large 3D data in deep learning

I have a large input feature: a 3D array of size 500x500x500, with 10000 such samples, and labels of size 500x500x500x500. I created a model with input shape 500x500x500 that uses only one Conv3D layer at the input and one Dense layer at the output (I have my own reasons for using a Dense layer there); the output shape of the network is 500x500x500x500.

Here is the minimal model I am using:

from keras.layers import Input, Conv3D, Dense
from keras.models import Model

ip = Input(shape=(500, 500, 500, 1))                       # one 500x500x500 volume, single channel
x = Conv3D(100, 3, activation="relu", padding='same')(ip)  # 100 filters, 3x3x3 kernel
x = Dense(500, activation="softmax")(x)                    # Dense applied along the last axis
nn = Model(inputs=ip, outputs=x)

Here is the summary:

_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
input_5 (InputLayer)         (None, 500, 500, 500, 1)  0         
_________________________________________________________________
conv3d_4 (Conv3D)            (None, 500, 500, 500, 100) 2800     
_________________________________________________________________
dense_4 (Dense)              (None, 500, 500, 500, 500) 50500    
=================================================================
Total params: 53,300
Trainable params: 53,300
Non-trainable params: 0
_________________________________________________________________

When I run the model I get a memory error, as I only have 64 GB of RAM …
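As a back-of-the-envelope check (a sketch, not from the original post; the 4 bytes/element float32 size and a batch size of 1 are assumptions), the activations of this model alone dwarf 64 GB:

# Rough float32 activation memory per sample (4 bytes/element is an assumption)
GB = 1024 ** 3
conv_out = 500 * 500 * 500 * 100 * 4 / GB    # Conv3D output: ~46.6 GB
dense_out = 500 * 500 * 500 * 500 * 4 / GB   # Dense output: ~232.8 GB
print(f"Conv3D: {conv_out:.1f} GB, Dense: {dense_out:.1f} GB")

A single forward pass would therefore need hundreds of gigabytes for activations alone; the 53,300 parameters are negligible by comparison, which is consistent with the reported memory error.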

artificial-intelligence machine-learning deep-learning keras data-science

5 votes · 1 answer · 105 views