I have a loss value/function and I want to compute its second derivatives with respect to a tensor f (of size n). I managed to use tf.gradients twice, but when applying it the second time, it sums the derivatives over the components of the first gradient (see second_derivatives in my code), so I get the row sums of the Hessian rather than its entries.
I also managed to retrieve the full Hessian matrix, but I would like to compute only its diagonal to avoid the extra computation.
import tensorflow as tf
import numpy as np

f = tf.Variable(np.array([[1., 2., 0.]]).T)
loss = tf.reduce_prod(f ** 2 - 3 * f + 1)

first_derivatives = tf.gradients(loss, f)[0]
# Note: this returns the row sums of the Hessian, not its diagonal.
second_derivatives = tf.gradients(first_derivatives, f)[0]

# Full Hessian: one gradients call per component of f.
hessian = [tf.gradients(first_derivatives[i, 0], f)[0][:, 0] for i in range(3)]

model = tf.initialize_all_variables()
with tf.Session() as sess:
    sess.run(model)
    print("\nloss\n", sess.run(loss))
    print("\nloss'\n", sess.run(first_derivatives))
    print("\nloss''\n", sess.run(second_derivatives))
    hessian_value = np.array(sess.run(hessian))
    print("\nHessian\n", hessian_value)
My idea was that tf.gradients(first_derivatives, f[0, 0])[0] could retrieve, for example, the second derivative with respect to f_0, but it seems TensorFlow does not allow differentiating with respect to a slice of a tensor.
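For this particular loss the expected values can be checked by hand: with loss = prod_i(f_i**2 - 3*f_i + 1), the diagonal entries of the Hessian are 2 * prod_{j != i} g_j where g_j = f_j**2 - 3*f_j + 1, since the product over j != i does not depend on f_i. A minimal NumPy sketch (independent of TensorFlow, using the same f as above) that computes the analytic gradient, Hessian, and the row sums that the double tf.gradients call returns:

```python
import numpy as np

# Same f as in the question; loss = prod_i (f_i**2 - 3*f_i + 1)
f = np.array([1., 2., 0.])
g = f ** 2 - 3 * f + 1                  # elementwise factors of the product
loss = np.prod(g)

# Product of all factors except the i-th
prod_except = np.array([np.prod(np.delete(g, i)) for i in range(len(f))])

grad = (2 * f - 3) * prod_except        # dL/df_i
hess_diag = 2 * prod_except             # d2L/df_i2 (prod_except is constant in f_i)

# Full analytic Hessian, to show what tf.gradients applied twice returns
n = len(f)
H = np.empty((n, n))
for i in range(n):
    for k in range(n):
        if i == k:
            H[i, i] = hess_diag[i]
        else:
            rest = np.prod(np.delete(g, [i, k]))
            H[i, k] = (2 * f[i] - 3) * (2 * f[k] - 3) * rest

row_sums = H.sum(axis=1)  # what tf.gradients(first_derivatives, f)[0] computes

print(loss)       # 1.0
print(grad)       # [ 1. -1. -3.]
print(hess_diag)  # [-2. -2.  2.]
print(row_sums)   # [-6.  0.  2.]
```

Note that row_sums differs from hess_diag, which is why the double tf.gradients call cannot be used directly to get the diagonal.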