Selectively zeroing weights in TensorFlow?

Wil*_*den 5 neural-network tensorflow

Suppose I have an N×M weight variable `weights` and a constant N×M matrix `mask` of 1s and 0s.

If a layer of my network is defined like this (with other layers defined similarly):

masked_weights = mask * weights
layer1 = tf.nn.relu(tf.matmul(layer0, masked_weights) + biases1)

During training, will this network behave as if the weights zeroed out by `mask` are truly zero, i.e. as if the connections those weights represent had been removed from the network entirely?

If not, how can I achieve this in TensorFlow?

Ten*_*rye 4

The answer is yes, as the following experiment shows.

The implementation is:

import numpy as np
import tensorflow as tf

x = tf.placeholder(tf.float32, shape=(None, 3))
weights = tf.get_variable("weights", [3, 2])
bias = tf.get_variable("bias", [2])
mask = tf.constant(np.asarray([[0, 1], [1, 0], [0, 1]], dtype=np.float32))  # constant mask

masked_weights = tf.multiply(weights, mask)
y = tf.nn.relu(tf.nn.bias_add(tf.matmul(x, masked_weights), bias))
loss = tf.losses.mean_squared_error(
    tf.constant(np.asarray([[1, 1]], dtype=np.float32)), y)

weights_grad = tf.gradients(loss, weights)

sess = tf.Session()
sess.run(tf.global_variables_initializer())
print("Masked weights=\n", sess.run(masked_weights))
data = np.random.rand(1, 3)
print("Gradient of weights=\n", sess.run(weights_grad, feed_dict={x: data}))
sess.close()

After running the code above, you will see that the gradients are masked as well: by the chain rule, the gradient with respect to `weights` is the gradient with respect to `masked_weights` multiplied elementwise by `mask`, so it is exactly zero wherever the mask is zero. In my case they are:

Gradient of weights=
 [array([[ 0.        , -0.40866762],
       [ 0.34265977, -0.        ],
       [ 0.        , -0.35294518]], dtype=float32)]
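For newer TensorFlow (2.x), the same check can be sketched with `tf.GradientTape` in eager mode. This is my own minimal reconstruction of the experiment above (variable names and the all-ones initialization are assumptions, not from the original answer):

```python
import numpy as np
import tensorflow as tf

# Same 3x2 layer as above, rewritten in TF 2.x eager style.
weights = tf.Variable(tf.ones([3, 2]))
bias = tf.Variable(tf.zeros([2]))
mask = tf.constant([[0., 1.], [1., 0.], [0., 1.]])  # constant mask

x = tf.constant(np.random.rand(1, 3), dtype=tf.float32)
target = tf.constant([[1., 1.]])

with tf.GradientTape() as tape:
    masked_weights = weights * mask   # zero out the pruned connections
    y = tf.nn.relu(tf.matmul(x, masked_weights) + bias)
    loss = tf.reduce_mean(tf.square(target - y))

grad = tape.gradient(loss, weights)
print(grad)  # entries at masked positions are exactly zero
```

Note that the masked weights still participate in the matmul (the FLOPs are not saved); the mask only guarantees they contribute nothing to the output and receive zero gradient.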