How to get the gradient of the loss with respect to activations in TensorFlow

and*_*ers 4 tensorflow

In the cifar10 example, the gradients of the loss with respect to the parameters can be computed as follows:

grads_and_vars = opt.compute_gradients(loss)
for grad, var in grads_and_vars:
    # ...

Is there a way to get the gradients of the loss with respect to the activations (rather than the parameters), and observe them in TensorBoard?

mrr*_*rry 5

You can use the tf.gradients() function to compute the gradient of any scalar tensor with respect to any other tensor (assuming that gradients are defined for all of the ops between those two tensors):

activations = ...
loss = f(..., activations)  # `loss` is some function of `activations`.

grad_wrt_activations, = tf.gradients(loss, [activations])
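For intuition, here is a framework-free sketch of what tf.gradients() computes in this case, using NumPy with a hypothetical squared-error loss (the loss function, target, and shapes are assumptions for illustration, not part of the original answer). The analytic gradient is verified against a finite-difference approximation:

```python
import numpy as np

# Hypothetical setup: the activations feed a squared-error loss,
#   loss = 0.5 * sum((activations - target)**2)
# For this loss, the gradient w.r.t. the activations is simply
# (activations - target) -- the same value tf.gradients(loss,
# [activations]) would return symbolically.
rng = np.random.default_rng(0)
activations = rng.standard_normal((4, 3))
target = rng.standard_normal((4, 3))

def loss(a):
    return 0.5 * np.sum((a - target) ** 2)

grad_wrt_activations = activations - target  # analytic gradient

# Finite-difference check: perturb each activation individually.
eps = 1e-6
numeric = np.zeros_like(activations)
for idx in np.ndindex(activations.shape):
    a_plus = activations.copy()
    a_minus = activations.copy()
    a_plus[idx] += eps
    a_minus[idx] -= eps
    numeric[idx] = (loss(a_plus) - loss(a_minus)) / (2 * eps)

assert np.allclose(grad_wrt_activations, numeric, atol=1e-5)
```

Note that the gradient has the same shape as the activations themselves, which is the point made below about why visualizing it directly is awkward.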

Visualizing this in TensorBoard is tricky in general, because grad_wrt_activations is (usually) a tensor with the same shape as activations. Adding a tf.histogram_summary() op is probably the easiest way to visualize it:

# Adds a histogram of `grad_wrt_activations` to the graph, which will be logged
# with the other summaries, and shown in TensorBoard.
tf.histogram_summary("Activation gradient", grad_wrt_activations)