LSTM/RNN can be used for text generation. This shows how to use pre-trained GloVe word embeddings with a Keras model. I tried the sample approach:
# Sample code to prepare word2vec word embeddings
import gensim
documents = ["Human machine interface for lab abc computer applications",
"A survey of user opinion of computer system response time",
"The EPS user interface management system",
"System and human system engineering testing of EPS",
"Relation of user perceived response time to error measurement",
"The generation of random binary unordered trees",
"The intersection graph of paths in trees",
"Graph minors …Run Code Online (Sandbox Code Playgroud) 语言模型的tensorflow教程允许计算句子的概率:
probabilities = tf.nn.softmax(logits)
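As a sanity check on what that line produces, the softmax that tf.nn.softmax applies can be reproduced in plain numpy; the logits values here are made up for illustration:

```python
import numpy as np

def softmax(logits):
    # Subtract the max for numerical stability before exponentiating.
    shifted = logits - np.max(logits)
    exps = np.exp(shifted)
    return exps / np.sum(exps)

# Hypothetical logits for a 4-word vocabulary.
logits = np.array([2.0, 1.0, 0.1, -1.0])
probabilities = softmax(logits)

print(probabilities)        # one probability per vocabulary word
print(probabilities.sum())  # sums to 1.0
```

The result is a probability distribution over the vocabulary: one entry per word, all entries non-negative and summing to 1.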
In a comment below it, a way to predict the next word rather than its probability is also mentioned, but it is not explained how to do this. So how can I use this example to output a word instead of a probability?
lstm = rnn_cell.BasicLSTMCell(lstm_size)
# Initial state of the LSTM memory.
state = tf.zeros([batch_size, lstm.state_size])
loss = 0.0
for current_batch_of_words in words_in_dataset:
    # The value of state is updated after processing each batch of words.
    output, state = lstm(current_batch_of_words, state)
    # The LSTM output can be used to make next word predictions
    logits = tf.matmul(output, softmax_w) + softmax_b
    probabilities = tf.nn.softmax(logits)
    loss += loss_function(probabilities, target_words)
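The tutorial pseudocode above stops at the probabilities. One common way to turn the probability vector into an actual word (a sketch, not from the tutorial itself; id_to_word is a hypothetical reverse-vocabulary mapping) is to take the argmax and look the index up in the vocabulary:

```python
import numpy as np

# Hypothetical probability vector over a 4-word vocabulary,
# as produced by tf.nn.softmax(logits) for one input position.
probabilities = np.array([0.1, 0.6, 0.2, 0.1])

# Hypothetical mapping from word ids back to word strings.
id_to_word = {0: "the", 1: "cat", 2: "sat", 3: "mat"}

# The most likely next word is the index with the highest probability.
next_word_id = int(np.argmax(probabilities))
next_word = id_to_word[next_word_id]
print(next_word)  # prints "cat"
```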
I am trying to use the tensorflow LSTM model for next-word prediction.
As described in this related question (which has no accepted answer), the example contains pseudocode for extracting next-word probabilities:
lstm = rnn_cell.BasicLSTMCell(lstm_size)
# Initial state of the LSTM memory.
state = tf.zeros([batch_size, lstm.state_size])
loss = 0.0
for current_batch_of_words in words_in_dataset:
    # The value of state is updated after processing each batch of words.
    output, state = lstm(current_batch_of_words, state)
    # The LSTM output can be used to make next word predictions
    logits = tf.matmul(output, softmax_w) + softmax_b
    probabilities = tf.nn.softmax(logits)
    loss += loss_function(probabilities, target_words)
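Besides always taking the single most likely word, the next word can also be sampled from the probability vector, optionally with a temperature parameter that controls randomness (a generic text-generation technique, not part of the tutorial code; the probability values are made up):

```python
import numpy as np

def sample_next_word_id(probabilities, temperature=1.0, rng=None):
    # Re-scale the distribution: temperature < 1 sharpens it toward
    # the argmax, temperature > 1 flattens it toward uniform.
    rng = rng or np.random.default_rng(0)
    logits = np.log(np.asarray(probabilities)) / temperature
    exps = np.exp(logits - np.max(logits))
    p = exps / exps.sum()
    # Draw one word id according to the re-scaled distribution.
    return int(rng.choice(len(p), p=p))

probabilities = [0.1, 0.6, 0.2, 0.1]
word_id = sample_next_word_id(probabilities, temperature=0.5)
```

With a very low temperature this behaves like argmax; with temperature 1.0 it samples from the distribution as-is.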
I am confused about how to interpret the probability vector. I modified the __init__ function of PTBModel in ptb_word_lm.py to store the probabilities and the logits:
class PTBModel(object):
"""The …Run Code Online (Sandbox Code Playgroud)