How to concatenate two layers in Keras?

rdo*_*rdo 79 python machine-learning neural-network keras

I have an example of a neural network with two layers. The first layer takes two arguments and has one output. The second should take one argument as the result of the first layer and one additional argument. It should look like this:

x1  x2  x3
 \  /   /
  y1   /
   \  /
    y2

So, I created a model with two layers and tried to merge them, but it returns the error `The first layer in a Sequential model must get an "input_shape" or "batch_input_shape" argument.` on the line `result.add(merged)`.

The model:

first = Sequential()
first.add(Dense(1, input_shape=(2,), activation='sigmoid'))

second = Sequential()
second.add(Dense(1, input_shape=(1,), activation='sigmoid'))

result = Sequential()
merged = Concatenate([first, second])
ada_grad = Adagrad(lr=0.1, epsilon=1e-08, decay=0.0)
result.add(merged)
result.compile(optimizer=ada_grad, loss=_loss_tensor, metrics=['accuracy'])

ors*_*ady 100

The error you're getting is because `result`, as defined, is just a container for the model and you have not defined an input for it.

Given what you're trying to build, set `result` up to take the third input `x3`:

first = Sequential()
first.add(Dense(1, input_shape=(2,), activation='sigmoid'))

second = Sequential()
second.add(Dense(1, input_shape=(1,), activation='sigmoid'))

third = Sequential()
# of course you must provide the input to result, which will be your x3
third.add(Dense(1, input_shape=(1,), activation='sigmoid'))

# lets say you add a few more layers to first and second.
# concatenate them
merged = Concatenate([first, second])

# then concatenate the two outputs

result = Concatenate([merged,  third])

ada_grad = Adagrad(lr=0.1, epsilon=1e-08, decay=0.0)

result.compile(optimizer=ada_grad, loss='binary_crossentropy',
               metrics=['accuracy'])

However, my preferred way of building a model with this type of input structure is to use the functional API.

Here is an implementation of your requirements to get you started:

from keras.models import Model
from keras.layers import Concatenate, Dense, LSTM, Input, concatenate
from keras.optimizers import Adagrad

first_input = Input(shape=(2, ))
first_dense = Dense(1, )(first_input)

second_input = Input(shape=(2, ))
second_dense = Dense(1, )(second_input)

merge_one = concatenate([first_dense, second_dense])

third_input = Input(shape=(1, ))
merge_two = concatenate([merge_one, third_input])

model = Model(inputs=[first_input, second_input, third_input], outputs=merge_two)
ada_grad = Adagrad(lr=0.1, epsilon=1e-08, decay=0.0)
model.compile(optimizer=ada_grad, loss='binary_crossentropy',
               metrics=['accuracy'])

To answer the questions in the comments:

1) How are result and merged connected? Assuming you mean how they are concatenated:

Concatenation works like this:

  a        b         c
a b c   g h i    a b c g h i
d e f   j k l    d e f j k l

i.e. the rows are just joined.
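The row-joining behavior can be checked with a plain NumPy sketch (the arrays here are made up purely for illustration):

```python
import numpy as np

a = np.array([[1, 2, 3],
              [4, 5, 6]])
b = np.array([[7, 8, 9],
              [10, 11, 12]])

# joining along the last axis extends each row of a with the matching row of b
c = np.concatenate([a, b], axis=-1)
print(c)
# [[ 1  2  3  7  8  9]
#  [ 4  5  6 10 11 12]]
```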

2) Now, `x1` is input to first, `x2` is input to second, and `x3` is input to third.

  • @putonspectacles The second way, using the functional API, works; however, the first way, using a Sequential model, does not work for me in Keras 2.0.2. I roughly checked the implementation, and calling `Concatenate([...])` does not do much, and furthermore, you cannot add it to a Sequential model. I actually think one still needs to use the deprecated method `Merge([...], 'concat')` until they update Keras. What do you think? (2 upvotes)
  • What is the difference between the `Concatenate()` and `concatenate()` layers in Keras? (2 upvotes)
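On the `Concatenate()` vs `concatenate()` question raised in the comments: `Concatenate` is the layer class, while `concatenate` is a thin convenience function that instantiates the class and applies it in one call. A minimal sketch (using the `tf.keras` namespace; the shapes are made up):

```python
from tensorflow.keras.layers import Input, Concatenate, concatenate

x = Input(shape=(4,))
y = Input(shape=(4,))

# Concatenate is the layer class: instantiate it (optionally with an axis),
# then call the instance on a list of tensors
merged_a = Concatenate(axis=-1)([x, y])

# concatenate is a functional shortcut that builds the layer and applies it in one step
merged_b = concatenate([x, y], axis=-1)

# both produce a tensor whose feature dimensions are joined: (None, 8)
print(merged_a.shape, merged_b.shape)
```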

Pra*_*rni 11

Adding to the accepted answer above, so that it helps those who are using TensorFlow 2.0:


import tensorflow as tf

# some data
c1 = tf.constant([[1, 1, 1], [2, 2, 2]], dtype=tf.float32)
c2 = tf.constant([[2, 2, 2], [3, 3, 3]], dtype=tf.float32)
c3 = tf.constant([[3, 3, 3], [4, 4, 4]], dtype=tf.float32)

# bake layers x1, x2, x3
x1 = tf.keras.layers.Dense(10)(c1)
x2 = tf.keras.layers.Dense(10)(c2)
x3 = tf.keras.layers.Dense(10)(c3)

# merged layer y1
y1 = tf.keras.layers.Concatenate(axis=1)([x1, x2])

# merged layer y2
y2 = tf.keras.layers.Concatenate(axis=1)([y1, x3])

# print info
print("-"*30)
print("x1", x1.shape, "x2", x2.shape, "x3", x3.shape)
print("y1", y1.shape)
print("y2", y2.shape)
print("-"*30)

Result:

------------------------------
x1 (2, 10) x2 (2, 10) x3 (2, 10)
y1 (2, 20)
y2 (2, 30)
------------------------------


o0o*_*o0o 6

You can experiment with `model.summary()` (note the size of the `concatenate_XX` (Concatenate) layers):

# the snippets below assume these imports
from keras.layers import Input, Dense, concatenate
from keras.models import Model

# merging samples: the two inputs must have the same shape
inp1 = Input(shape=(10,32))
inp2 = Input(shape=(10,32))
cc1 = concatenate([inp1, inp2], axis=0)  # merged data must have the same rows and columns
output = Dense(30, activation='relu')(cc1)
model = Model(inputs=[inp1, inp2], outputs=output)
model.summary()

# merging rows: the column size must match
inp1 = Input(shape=(20,10))
inp2 = Input(shape=(32,10))
cc1 = concatenate([inp1, inp2],axis=1)
output = Dense(30, activation='relu')(cc1)
model = Model(inputs=[inp1, inp2], outputs=output)
model.summary()

# merging columns: the row size must match
inp1 = Input(shape=(10,20))
inp2 = Input(shape=(10,32))
cc1 = concatenate([inp1, inp2],axis=2)
output = Dense(30, activation='relu')(cc1)
model = Model(inputs=[inp1, inp2], outputs=output)
model.summary()

You can see the notebook with details here: https://nbviewer.jupyter.org/github/anhhh11/DeepLearning/blob/master/Concanate_two_layer_keras.ipynb

  • What is the difference between the `Concatenate()` and `concatenate()` layers in Keras? (4 upvotes)
  • Did you find the difference? One is a Keras class and the other is a TensorFlow method. (2 upvotes)