
How to add an attention layer to a Bi-LSTM

I am developing a Bi-LSTM model and want to add an attention layer to it, but I don't know how to add it.

My current model code is:

# imports assumed (tf.keras); adjust if using standalone keras
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import (Embedding, BatchNormalization, Activation,
                                     Dropout, Bidirectional, LSTM, Dense)

model = Sequential()
model.add(Embedding(max_words, 1152, input_length=max_len, weights=[embeddings]))
model.add(BatchNormalization())
model.add(Activation('tanh'))
model.add(Dropout(0.5))
model.add(Bidirectional(LSTM(32)))
model.add(BatchNormalization())
model.add(Activation('tanh'))
model.add(Dropout(0.5))
model.add(Dense(1, activation='sigmoid'))
model.summary()

The model summary is:

Model: "sequential_1"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
embedding_1 (Embedding)      (None, 1152, 1152)        278396928 
_________________________________________________________________
batch_normalization_1 (Batch (None, 1152, 1152)        4608      
_________________________________________________________________
activation_1 (Activation)    (None, 1152, 1152)        0         
_________________________________________________________________
dropout_1 (Dropout)          (None, 1152, 1152)        0         
_________________________________________________________________
bidirectional_1 (Bidirection (None, 64)                303360    
_________________________________________________________________
batch_normalization_2 (Batch (None, 64)                256       
_________________________________________________________________
activation_2 (Activation)    (None, 64)                0         
_________________________________________________________________
dropout_2 (Dropout) …
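For reference, one common approach is to return the full sequence of Bi-LSTM outputs (return_sequences=True) and pool them over the timesteps with a small attention layer. The sketch below is a minimal example assuming TensorFlow 2.x (tf.keras); AttentionPooling is a hypothetical custom layer written here for illustration, not a built-in Keras layer or part of the original model.

# A minimal sketch, assuming tf.keras (TensorFlow 2.x).
# AttentionPooling is a hypothetical custom layer added for illustration.
import tensorflow as tf
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import (Layer, Embedding, BatchNormalization,
                                     Activation, Dropout, Bidirectional, LSTM, Dense)

class AttentionPooling(Layer):
    # Additive attention over timesteps: pools (batch, timesteps, features)
    # down to (batch, features) using learned weights.
    def build(self, input_shape):
        self.w = self.add_weight(name='att_weight',
                                 shape=(int(input_shape[-1]), 1),
                                 initializer='glorot_uniform',
                                 trainable=True)
        super().build(input_shape)

    def call(self, h):
        score = tf.tanh(tf.tensordot(h, self.w, axes=1))  # (batch, timesteps, 1)
        alpha = tf.nn.softmax(score, axis=1)              # attention weights over timesteps
        return tf.reduce_sum(alpha * h, axis=1)           # weighted sum of timestep outputs

model = Sequential()
model.add(Embedding(max_words, 1152, input_length=max_len, weights=[embeddings]))
model.add(BatchNormalization())
model.add(Activation('tanh'))
model.add(Dropout(0.5))
# return_sequences=True keeps one output per timestep so attention has something to weight
model.add(Bidirectional(LSTM(32, return_sequences=True)))
model.add(AttentionPooling())
model.add(BatchNormalization())
model.add(Activation('tanh'))
model.add(Dropout(0.5))
model.add(Dense(1, activation='sigmoid'))
model.summary()

The rest of the stack stays the same; the attention layer simply replaces the "last hidden state only" behaviour of LSTM(32) with a learned weighted average over all timesteps.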

Tags: nlp, machine-learning, python-3.x, keras, tensorflow

