I have an example of a two-layer neural network. The first layer takes two inputs and has one output. The second layer should take one input that is the result of the first layer, plus one additional input. It should look like this:
x1  x2  x3
 \  /   /
  y1   /
   \  /
    y2
So I created a model with two layers and tried to merge them, but it returns an error: The first layer in a Sequential model must get an "input_shape" or "batch_input_shape" argument. The error occurs on the line result.add(merged).
The model:
from keras.models import Sequential
from keras.layers import Dense, Concatenate
from keras.optimizers import Adagrad

first = Sequential()
first.add(Dense(1, input_shape=(2,), activation='sigmoid'))

second = Sequential()
second.add(Dense(1, input_shape=(1,), activation='sigmoid'))

result = Sequential()
merged = Concatenate([first, second])
ada_grad = Adagrad(lr=0.1, epsilon=1e-08, decay=0.0)
result.add(merged)  # <-- this line raises the error
result.compile(optimizer=ada_grad, loss=_loss_tensor, metrics=['accuracy'])  # _loss_tensor is a custom loss defined elsewhere
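For reference, here is a minimal sketch of the same x1/x2/x3 topology built with the Keras functional API instead of stacked Sequential models (Keras 2.x assumed; the input names are illustrative, and _loss_tensor is the custom loss from the snippet above):

from keras.layers import Input, Dense, concatenate
from keras.models import Model
from keras.optimizers import Adagrad

first_in = Input(shape=(2,))            # x1, x2
y1 = Dense(1, activation='sigmoid')(first_in)

second_in = Input(shape=(1,))           # x3
merged = concatenate([y1, second_in])   # feed y1 together with x3 into the second layer
y2 = Dense(1, activation='sigmoid')(merged)

model = Model(inputs=[first_in, second_in], outputs=y2)
model.compile(optimizer=Adagrad(lr=0.1, epsilon=1e-08, decay=0.0),
              loss=_loss_tensor, metrics=['accuracy'])

With the functional API the shapes propagate from the Input layers, so no layer needs an explicit input_shape and the error above does not arise.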
I am trying to merge two Sequential models in Keras 2.0, using the following line:
merged_model.add(Merge([model1, model2], mode='concat'))
This still works fine, but it emits a warning:
"The `Merge` layer is deprecated and will be removed after 08/2017. Use
instead layers from `keras.layers.merge`, e.g. `add`, `concatenate`, etc."
However, studying the Keras documentation and trying Add() did not produce anything that works. I have read several posts from people with the same problem, but none of the solutions I found apply to my case. Any suggestions?
from keras.models import Sequential
from keras.layers import Dense, Merge  # Merge is deprecated in Keras 2 but still importable
from keras.callbacks import ModelCheckpoint, EarlyStopping

model = Sequential()

model1 = Sequential()
model1.add(Dense(300, input_dim=40, activation='relu', name='layer_1'))

model2 = Sequential()
model2.add(Dense(300, input_dim=40, activation='relu', name='layer_2'))

merged_model = Sequential()
merged_model.add(Merge([model1, model2], mode='concat'))  # line that triggers the warning
merged_model.add(Dense(1, activation='softmax', name='output_layer'))
merged_model.compile(loss='binary_crossentropy', optimizer='adam',
                     metrics=['accuracy'])

checkpoint = ModelCheckpoint('weights.h5', monitor='val_acc',
                             save_best_only=True, verbose=2)
early_stopping = EarlyStopping(monitor="val_loss", patience=5)

# x1, x2 and y are the training arrays (defined elsewhere)
merged_model.fit([x1, x2], y=y, batch_size=384, epochs=200,
                 verbose=1, validation_split=0.1, shuffle=True,
                 callbacks=[early_stopping, checkpoint])
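For comparison, here is a minimal sketch of the same two-branch network written with the functional API and keras.layers.concatenate, the non-deprecated replacement for Merge (Keras 2.x assumed; the variable names are illustrative, and x1, x2, y are the same training arrays as in the snippet above):

from keras.layers import Input, Dense, concatenate
from keras.models import Model

input_1 = Input(shape=(40,))
input_2 = Input(shape=(40,))
branch_1 = Dense(300, activation='relu', name='layer_1')(input_1)
branch_2 = Dense(300, activation='relu', name='layer_2')(input_2)

merged = concatenate([branch_1, branch_2])  # replaces Merge(..., mode='concat')
output = Dense(1, activation='softmax', name='output_layer')(merged)

merged_model = Model(inputs=[input_1, input_2], outputs=output)
merged_model.compile(loss='binary_crossentropy', optimizer='adam',
                     metrics=['accuracy'])

Training is unchanged: merged_model.fit([x1, x2], y=y, ...) with the same callbacks as above.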
Edit: when I tried the following (as suggested below by Kent Sommer):
from keras.layers.merge …