Tags: keras, tensorflow, keras-layer, tf.keras
I am new to Keras and am working on a regression problem with TensorFlow as the backend. Here is my code:
X1 = TrainingSet[:,0:603]
Y1 = TrainingSet[:,603:607]
###################################
#reshape Xtrain for CNN
X1 = X1.reshape(9999,3,201,1)
###################################
# create model
model = Sequential()
model.add(Conv2D(32, kernel_size=(3, 3), strides=(1, 1),
                 activation='relu',
                 input_shape=(3, 201, 1), data_format='channels_first'))
model.add(MaxPooling2D(pool_size=(2, 2), dim_ordering="tf", strides=(1, 1)))
model.add(Conv2D(32, (3, 3), activation='relu', data_format='channels_first'))
model.add(MaxPooling2D(pool_size=(2, 2), dim_ordering="tf", strides=(1, 1)))
model.add(Flatten())
model.add(Dense(1000, activation="tanh", kernel_initializer="uniform"))
model.add(Dense(4, activation="relu", kernel_initializer="uniform"))
# Compile model
model.compile(loss='mse', optimizer='adam', metrics=['mae'])
# Fit the model
history = model.fit(X1, Y1, validation_split=0.1, epochs=100, batch_size=100, verbose=1)
# Calculate predictions
PredTestSet = model.predict(X1)
I get the following value error:
ValueError: Negative dimension size caused by subtracting 3 from 1 for 'conv2d_26/convolution' (op:
'Conv2D') with input shapes: [?,201,1,3], [3,3,3,32].
I have already looked at similar questions, for example https://github.com/keras-team/keras/issues/7611, but their solutions did not work for me.
I was able to reproduce your error:
import tensorflow as tf
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Dropout, Activation, Flatten
from tensorflow.keras.layers import Conv2D, MaxPooling2D
import numpy as np
model = Sequential()
model.add(Conv2D(32, kernel_size=(3, 3), strides=(1, 1),
                 activation='relu',
                 input_shape=(3, 201, 1), data_format='channels_first'))
model.add(MaxPooling2D(pool_size=(2, 2), dim_ordering="tf", strides=(1, 1)))
model.add(Conv2D(32, (3, 3), activation='relu', data_format='channels_first'))
model.add(MaxPooling2D(pool_size=(2, 2), dim_ordering="tf", strides=(1, 1)))
model.add(Flatten())
model.add(Dense(1000, activation="tanh", kernel_initializer="uniform"))
model.add(Dense(4, activation="relu", kernel_initializer="uniform"))
model.summary()
Output:
---------------------------------------------------------------------------
InvalidArgumentError Traceback (most recent call last)
/usr/local/lib/python3.6/dist-packages/tensorflow/python/framework/ops.py in _create_c_op(graph, node_def, inputs, control_inputs, op_def)
1653 try:
-> 1654 c_op = pywrap_tf_session.TF_FinishOperation(op_desc)
1655 except errors.InvalidArgumentError as e:
InvalidArgumentError: Negative dimension size caused by subtracting 3 from 1 for '{{node conv2d/Conv2D}} = Conv2D[T=DT_FLOAT, data_format="NCHW", dilations=[1, 1, 1, 1], explicit_paddings=[], padding="VALID", strides=[1, 1, 1, 1], use_cudnn_on_gpu=true](conv2d_input, conv2d/Conv2D/ReadVariableOp)' with input shapes: [?,3,201,1], [3,3,3,32].
During handling of the above exception, another exception occurred:
ValueError Traceback (most recent call last)
14 frames
/usr/local/lib/python3.6/dist-packages/tensorflow/python/framework/ops.py in _create_c_op(graph, node_def, inputs, control_inputs, op_def)
1655 except errors.InvalidArgumentError as e:
1656 # Convert to ValueError for backwards compatibility.
-> 1657 raise ValueError(str(e))
1658
1659 return c_op
ValueError: Negative dimension size caused by subtracting 3 from 1 for '{{node conv2d/Conv2D}} = Conv2D[T=DT_FLOAT, data_format="NCHW", dilations=[1, 1, 1, 1], explicit_paddings=[], padding="VALID", strides=[1, 1, 1, 1], use_cudnn_on_gpu=true](conv2d_input, conv2d/Conv2D/ReadVariableOp)' with input shapes: [?,3,201,1], [3,3,3,32].
Solution:

The problem can be fixed by adding padding='same' to all the convolution and max-pooling layers, so that these layers preserve the spatial dimensionality. With the default behaviour (padding='valid'), the spatial dimensions shrink during convolution and max pooling, and that is what produces the negative-dimension error here.
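As a rough illustration (my own sketch, not part of the original answer), the output length along one axis can be computed as below; with an input length of 1 and a kernel size of 3, 'valid' padding gives the -1 reported in the error, while 'same' padding keeps the length unchanged:

# Illustrative helper, not from the answer: output length of a conv/pool window
# along a single axis for the two Keras padding modes.
def conv_out_len(n, k, s=1, padding="valid"):
    if padding == "valid":
        return (n - k) // s + 1
    return -(-n // s)  # ceil(n / s) for padding='same'

print(conv_out_len(1, 3))                  # -1 -> "Negative dimension size"
print(conv_out_len(1, 3, padding="same"))  # 1  -> dimensionality preserved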
While fixing the above, another problem comes from the dim_ordering argument in the MaxPooling layers. This option was renamed in Keras 2: dim_ordering='tf' corresponds to data_format='channels_last'.

I changed the network accordingly:
model = Sequential()
model.add(Conv2D(32, kernel_size=(3, 3), strides=(1, 1),
                 activation='relu',
                 input_shape=(3, 201, 1), padding='same', data_format='channels_first'))
model.add(MaxPooling2D(pool_size=(2, 2), padding='same', data_format='channels_last', strides=(1, 1)))
model.add(Conv2D(32, (3, 3), activation='relu', padding='same', data_format='channels_first'))
model.add(MaxPooling2D(pool_size=(2, 2), padding='same', data_format='channels_last', strides=(1, 1)))
model.add(Flatten())
model.add(Dense(1000, activation="tanh", kernel_initializer="uniform"))
model.add(Dense(4, activation="relu", kernel_initializer="uniform"))
model.summary()
Output:
Model: "sequential_7"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
conv2d_8 (Conv2D) (None, 32, 201, 1) 896
_________________________________________________________________
max_pooling2d_2 (MaxPooling2 (None, 32, 201, 1) 0
_________________________________________________________________
conv2d_9 (Conv2D) (None, 32, 201, 1) 9248
_________________________________________________________________
max_pooling2d_3 (MaxPooling2 (None, 32, 201, 1) 0
_________________________________________________________________
flatten_1 (Flatten) (None, 6432) 0
_________________________________________________________________
dense_2 (Dense) (None, 1000) 6433000
_________________________________________________________________
dense_3 (Dense) (None, 4) 4004
=================================================================
Total params: 6,447,148
Trainable params: 6,447,148
Non-trainable params: 0
_________________________________________________________________
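As a quick sanity check (my addition, using a hypothetical dummy_x array rather than the question's data), you can feed the corrected model a random batch shaped like the reshaped X1 and confirm that it returns four regression outputs per sample:

import numpy as np

# Illustrative only: random data shaped like the question's reshaped X1,
# i.e. (samples, channels=3, 201, 1) for data_format='channels_first'.
dummy_x = np.random.rand(4, 3, 201, 1).astype("float32")
print(model.predict(dummy_x).shape)  # expected: (4, 4)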