
How to visualize an attention LSTM built with the keras-self-attention package?

I am implementing an attention LSTM in Keras with the keras-self-attention package. After training the model, how can I visualize the attention weights? This is a time-series forecasting case.

from keras.models import Sequential
from keras_self_attention import SeqSelfAttention
from keras.layers import LSTM, Dense, Flatten

model = Sequential()
# Return the full sequence so the attention layer can weight every time step.
model.add(LSTM(units=200, activation='tanh', return_sequences=True,
               input_shape=(TrainD[0].shape[1], TrainD[0].shape[2])))
model.add(SeqSelfAttention())
model.add(Flatten())
model.add(Dense(1, activation='relu'))

model.compile(optimizer='adam', loss='mse')
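A minimal sketch of one way to get at the weights, using the package's return_attention=True option: rebuild the model with the functional API so SeqSelfAttention returns its attention matrix as a second tensor, then wrap that tensor in a second Model and plot it as a heatmap. The training call, the epoch count, and the assumption that TrainD[1] holds the targets are illustrative, not from the original post.

import matplotlib.pyplot as plt
from keras.models import Model
from keras.layers import Input, LSTM, Dense, Flatten
from keras_self_attention import SeqSelfAttention

# Same architecture as above, but functional, so the attention
# scores are available as a named tensor.
inputs = Input(shape=(TrainD[0].shape[1], TrainD[0].shape[2]))
x = LSTM(units=200, activation='tanh', return_sequences=True)(inputs)
# return_attention=True makes the layer return [output, attention].
x, attention = SeqSelfAttention(return_attention=True)(x)
x = Flatten()(x)
outputs = Dense(1, activation='relu')(x)

model = Model(inputs, outputs)
model.compile(optimizer='adam', loss='mse')
model.fit(TrainD[0], TrainD[1], epochs=10)  # TrainD[1] as targets is an assumption

# A second model sharing the trained layers; it outputs the
# (batch, timesteps, timesteps) attention matrix for any input.
attention_model = Model(inputs, attention)
scores = attention_model.predict(TrainD[0][:1])

# Heatmap for the first sample: rows are query time steps,
# columns the time steps they attend to.
plt.imshow(scores[0], cmap='viridis')
plt.xlabel('attended time step')
plt.ylabel('query time step')
plt.colorbar()
plt.show()

The Sequential API cannot express a layer with two outputs, which is why the sketch switches to the functional Model. The trained weights are shared between model and attention_model, so no retraining is needed for the visualization.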

Tags: python, lstm, keras, tensorflow, attention-model

Score: 5 · Answers: 1 · Views: 266
