
How to implement weighted binary cross-entropy in Theano?

My convolutional neural network only outputs predictions between 0 and 1 (sigmoid).

I want to penalize my predictions in the following way:

[Cost table]

Basically, I want to penalize the model more heavily when it predicts 0 but the ground truth is 1.

Question: How do I create this weighted binary cross-entropy function using Theano and Lasagne?
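(For reference, one common way to write such a loss, with assumed per-class weights w0 and w1 that are not part of the original post, is loss = -mean(w1 * t * log(p) + w0 * (1 - t) * log(1 - p)), where t is the true label and p the predicted probability; choosing w1 > w0 makes predicting p close to 0 when t = 1 more costly.)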

I tried this:

import theano
import theano.tensor as T
import lasagne

prediction = lasagne.layers.get_output(model)

def weighted_crossentropy(predictions, targets):

    # Copy the tensor
    tgt = targets.copy("tgt")

    # Make it a vector
    # tgt = tgt.flatten()
    # tgt = tgt.reshape(3000)
    # tgt = tgt.dimshuffle(1,0)

    newshape = (T.shape(tgt)[0],)
    tgt = T.reshape(tgt, newshape)

    # Process it so [index] < 0.5 = 0, and [index] >= 0.5 = 1

    # Make it an integer.
    tgt = T.cast(tgt, 'int32')

    weights_per_label = theano.shared(lasagne.utils.floatX([0.2, 0.4])) …
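
For comparison, below is a minimal sketch of a weighted binary cross-entropy that weights the standard T.nnet.binary_crossentropy per element, instead of indexing a shared weight vector with integer-cast targets. The weights 0.2/0.4 are taken from the snippet above; model, input_var and target_var are assumed names, and this is an illustrative sketch rather than a confirmed solution.

import theano
import theano.tensor as T
import lasagne

def weighted_binary_crossentropy(predictions, targets, w0=0.2, w1=0.4):
    # Element-wise binary cross-entropy:
    #   -(t * log(p) + (1 - t) * log(1 - p))
    bce = T.nnet.binary_crossentropy(predictions, targets)
    # Weight each element according to its true label:
    # w1 where the label is 1, w0 where it is 0.
    weights = targets * w1 + (1.0 - targets) * w0
    return T.mean(weights * bce)

# Wiring it into a Lasagne training function (input_var is assumed to be the
# variable the network's InputLayer was built with; target_var matches the
# shape of the sigmoid output).
prediction = lasagne.layers.get_output(model)
target_var = T.matrix('targets')
loss = weighted_binary_crossentropy(prediction, target_var)
params = lasagne.layers.get_all_params(model, trainable=True)
updates = lasagne.updates.nesterov_momentum(loss, params, learning_rate=0.01)
train_fn = theano.function([input_var, target_var], loss, updates=updates)

With w1 > w0, examples whose true label is 1 contribute more to the loss, so predicting them as 0 is penalized more heavily, which matches the intent of the cost table above.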

python theano keras lasagne cross-entropy

Score: 8 · Answers: 1 · Views: 1845
