I have built a language model with a Keras LSTM, and now I want to evaluate how good it is, so I would like to compute its perplexity.
What is the best way to calculate a model's perplexity in Python?
I have come up with two versions and attached the corresponding sources; feel free to check the links.
from keras import backend as K

def perplexity_raw(y_true, y_pred):
    """
    The perplexity metric. Why isn't this part of Keras yet?!
    /sf/ask/2931691591/
    https://github.com/keras-team/keras/issues/8267
    """
    # cross_entropy = K.sparse_categorical_crossentropy(y_true, y_pred)
    # Compares the target with the argmax of the prediction and casts the
    # result to float, then exponentiates it per token.
    cross_entropy = K.cast(K.equal(K.max(y_true, axis=-1),
                                   K.cast(K.argmax(y_pred, axis=-1), K.floatx())),
                           K.floatx())
    perplexity = K.exp(cross_entropy)
    return perplexity

def perplexity(y_true, y_pred):
    """
    The perplexity metric. Why isn't this part of Keras yet?!
    /sf/ask/2931691591/
    https://github.com/keras-team/keras/issues/8267
    """
    # Per-token sparse categorical cross-entropy between the integer targets
    # and the softmax output, exponentiated to give a per-token perplexity.
    cross_entropy = K.sparse_categorical_crossentropy(y_true, y_pred)
    perplexity = K.exp(cross_entropy)
    return perplexity
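For context, this is roughly how I would plug one of the metrics in and read off a corpus-level perplexity. It is only a sketch: model, x_train/y_train and x_test/y_test are placeholders for my own LSTM and data, and it assumes the model outputs a softmax over the vocabulary and the targets are integer word indices.

import numpy as np

# model is a Keras LSTM language model with a softmax output over the vocabulary.
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=[perplexity])

model.fit(x_train, y_train, epochs=10, validation_split=0.1)

# Corpus-level perplexity from the mean per-token cross-entropy on held-out data:
test_loss = model.evaluate(x_test, y_test, verbose=0)[0]
print("test perplexity:", np.exp(test_loss))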