Post by Min*_*ing

Using binary_crossentropy loss in Keras (TensorFlow backend)

In the training examples in the Keras documentation,

https://keras.io/getting-started/sequential-model-guide/#training

binary_crossentropy is used and a sigmoid activation is added to the last layer of the network. But is it actually necessary to put a sigmoid on the last layer? As I found in the source code:

def binary_crossentropy(output, target, from_logits=False):
  """Binary crossentropy between an output tensor and a target tensor.
  Arguments:
      output: A tensor.
      target: A tensor with the same shape as `output`.
      from_logits: Whether `output` is expected to be a logits tensor.
          By default, we consider that `output`
          encodes a probability distribution.
  Returns:
      A tensor.
  """
  # Note: nn.softmax_cross_entropy_with_logits
  # expects logits, Keras expects probabilities.
  if not from_logits:
    # transform back to logits
    epsilon = _to_tensor(_EPSILON, output.dtype.base_dtype) …
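For context, the setup described above looks roughly like this (a minimal sketch in the style of the linked guide, not the exact example from the documentation; layer sizes and data are illustrative):

```python
import numpy as np
from keras.models import Sequential
from keras.layers import Dense

# Dummy binary-classification data (shapes are illustrative).
x_train = np.random.random((1000, 20))
y_train = np.random.randint(2, size=(1000, 1))

model = Sequential()
model.add(Dense(64, input_dim=20, activation='relu'))
# Final layer: the sigmoid squashes the output into (0, 1), so the model
# emits a probability, which binary_crossentropy expects by default
# (from_logits=False in the backend function quoted above).
model.add(Dense(1, activation='sigmoid'))

model.compile(optimizer='rmsprop',
              loss='binary_crossentropy',
              metrics=['accuracy'])

model.fit(x_train, y_train, epochs=5, batch_size=32)
```

Per the docstring quoted above, binary_crossentropy by default treats the model output as encoding a probability distribution, which is what the sigmoid on the last layer provides.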

keras tensorflow

6 votes · 1 answer · 6338 views
