Posted by Bas*_*our

Is there a way to apply gradients to multiple output layers inside tf.GradientTape?

I am trying to apply gradients to a model with two outputs, but the model is not learning and the loss does not decrease. I would appreciate your help, thank you.

@tf.function
def train_step(inp, targ, intent, enc_hidden):
    loss = 0
    intent_loss = 0

    with tf.GradientTape(persistent=True) as tape:
        enc_output, enc_hidden = encoder(inp, enc_hidden)

        dec_hidden = enc_hidden

        dec_input = tf.expand_dims([targ_lang.word_index['<start>']] * BATCH_SIZE, 1)

        # Teacher forcing - feeding the target as the next input
        for t in range(1, targ.shape[1]):
            # passing enc_output to the decoder
            predictions, dec_hidden, _ = slot_decoder(dec_input, dec_hidden, enc_output)
            intent_pred, _ = intent_decoder(dec_hidden, enc_output)

            loss += loss_function(targ[:, t], predictions)
            intent_loss = loss_function(intent, intent_pred)

            # using teacher forcing
            dec_input = …
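For reference, here is a minimal self-contained sketch (not the poster's code; names such as slot_head, intent_head, and optimizer are illustrative) of one way to take gradients for two output heads from a single persistent GradientTape and apply them once over all trainable variables:

import tensorflow as tf

# Toy two-output model: a shared layer feeding a "slot" head and an "intent" head.
inputs = tf.keras.Input(shape=(8,))
shared = tf.keras.layers.Dense(16, activation="relu")(inputs)
slot_head = tf.keras.layers.Dense(4)(shared)
intent_head = tf.keras.layers.Dense(3)(shared)
model = tf.keras.Model(inputs, [slot_head, intent_head])

optimizer = tf.keras.optimizers.Adam()
loss_fn = tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True)

# Dummy batch with a separate target for each head.
x = tf.random.normal((32, 8))
y_slot = tf.random.uniform((32,), maxval=4, dtype=tf.int32)
y_intent = tf.random.uniform((32,), maxval=3, dtype=tf.int32)

with tf.GradientTape(persistent=True) as tape:
    slot_logits, intent_logits = model(x, training=True)
    slot_loss = loss_fn(y_slot, slot_logits)
    intent_loss = loss_fn(y_intent, intent_logits)
    total_loss = slot_loss + intent_loss  # combine both objectives

# One backward pass over the combined loss updates every trainable variable.
grads = tape.gradient(total_loss, model.trainable_variables)
optimizer.apply_gradients(zip(grads, model.trainable_variables))
del tape  # a persistent tape must be released manually

With a persistent tape one could instead call tape.gradient once per loss and apply the two gradient lists separately; summing the losses first is the simpler option when both heads share one optimizer.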

python keras tensorflow

