Post by Tri*_*ood

Trying to implement a Bayesian neural network with Edward

I am trying to fit a Bayesian neural network to some real-world data for non-linear regression, following Torsten Scholak's PyCon presentation, and I am getting strange results: the fit is reasonable up to a point, but then it flattens out. Any ideas about what I am doing wrong? I am using the Edward library.

# Imports assumed but not shown in the original post;
# N (number of examples) and D (input dimension) are defined elsewhere.
import tensorflow as tf
from edward.models import Normal

def neural_network_with_2_layers(x, W_0, W_1, b_0, b_1):
    # Two-layer network: tanh hidden layer followed by a linear output layer
    h = tf.nn.tanh(tf.matmul(x, W_0) + b_0)
    h = tf.matmul(h, W_1) + b_1
    return tf.reshape(h, [-1])

dim = 10  # layer dimensions
W_0 = Normal(loc=tf.zeros([D, dim]),
             scale=tf.ones([D, dim]))
W_1 = Normal(loc=tf.zeros([dim, 1]),
             scale=tf.ones([dim, 1]))
b_0 = Normal(loc=tf.zeros(dim),
             scale=tf.ones(dim))
b_1 = Normal(loc=tf.zeros(1),
             scale=tf.ones(1))

x = tf.placeholder(tf.float32, [N, D])

# Reshape the flat network output back into a column vector
a = neural_network_with_2_layers(x, W_0, W_1, b_0, b_1)
b = tf.reshape(a, [N, 1])  # N == len(X_train)
y = Normal(loc=b, scale=tf.ones([N, 1]) * 0.1)  # constant observation noise


# BACKWARD MODEL A (variational approximation)

q_W_0 = Normal(loc=tf.Variable(tf.random_normal([D, dim])),
               scale=tf.nn.softplus(tf.Variable(tf.random_normal([D, dim]))))
q_W_1 = Normal(loc=tf.Variable(tf.random_normal([dim, 1])), …
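As a side note, the forward pass used above can be sanity-checked for shapes in plain NumPy. This is a minimal sketch; the sizes `N = 50` and `D = 1` are illustrative assumptions, since the post does not state them:

```python
import numpy as np

N, D, dim = 50, 1, 10  # illustrative sizes; the post does not state N or D

def forward(x, W_0, W_1, b_0, b_1):
    # Same two-layer tanh network as the Edward model above
    h = np.tanh(x @ W_0 + b_0)   # shape (N, dim)
    h = h @ W_1 + b_1            # shape (N, 1)
    return h.reshape(-1)         # shape (N,)

rng = np.random.default_rng(0)
x = rng.standard_normal((N, D))
out = forward(x,
              rng.standard_normal((D, dim)),
              rng.standard_normal((dim, 1)),
              rng.standard_normal(dim),
              rng.standard_normal(1))
print(out.shape)  # (50,)
```

This confirms that the flat `[-1]` output of the network must be reshaped back to `(N, 1)` before it can serve as the `loc` of the `(N, 1)`-shaped observation distribution, which is what the `tf.reshape(a, [N, 1])` line in the question does.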

python bayesian-networks deep-learning tensorflow edward

Score: 5 · Answers: 0 · Views: 629