Weighted cross entropy loss function: tf.nn.weighted_cross_entropy_with_logits

tf.nn.weighted_cross_entropy_with_logits Function

tf.nn.weighted_cross_entropy_with_logits(
  targets,
  logits,
  pos_weight,
  name=None
)

Defined in: tensorflow/python/ops/nn_impl.py.

Computes a weighted cross entropy.

This is like sigmoid_cross_entropy_with_logits(), except that pos_weight allows one to trade off recall and precision by up- or down-weighting the cost of a positive error relative to a negative error.
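For example (a minimal sketch assuming the TF 1.x-style API shown above; the tensor values are made up for illustration), pos_weight = 1.0 reproduces the plain sigmoid cross entropy, while pos_weight = 2.0 doubles the cost of errors on positive targets:

import tensorflow as tf

targets = tf.constant([1.0, 1.0, 0.0, 0.0])
logits = tf.constant([0.5, -1.0, 2.0, -2.0])

# With pos_weight = 1.0 this would reduce to the ordinary sigmoid cross entropy.
plain = tf.nn.sigmoid_cross_entropy_with_logits(labels=targets, logits=logits)
weighted = tf.nn.weighted_cross_entropy_with_logits(
    targets=targets, logits=logits, pos_weight=2.0)

with tf.Session() as sess:
    print(sess.run(plain))     # per-element baseline losses
    print(sess.run(weighted))  # positive-target terms scaled by 2.0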

The usual cross-entropy cost is defined as:

targets * -log(sigmoid(logits)) +
  (1 - targets) * -log(1 - sigmoid(logits))

A value pos_weight > 1 decreases the false negative count, hence increasing the recall. Conversely, setting pos_weight < 1 decreases the false positive count and increases the precision. This can be seen from the following expression, where pos_weight is introduced as a multiplicative coefficient for the positive targets term in the loss:

targets * -log(sigmoid(logits)) * pos_weight +
  (1 - targets) * -log(1 - sigmoid(logits))
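As a concrete check (a NumPy sketch with made-up values), evaluating the two expressions above element by element shows that only the positive-target terms are scaled by pos_weight:

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

z = np.array([1.0, 1.0, 0.0, 0.0])    # targets
x = np.array([0.5, -1.0, 2.0, -2.0])  # logits
q = 2.0                               # pos_weight

unweighted = z * -np.log(sigmoid(x)) + (1 - z) * -np.log(1 - sigmoid(x))
weighted = z * -np.log(sigmoid(x)) * q + (1 - z) * -np.log(1 - sigmoid(x))

# Only the entries with z == 1 change; the negative-target entries are identical.
print(unweighted)
print(weighted)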

For brevity, let x = logits, z = targets, q = pos_weight. The loss is:

 qz * -log(sigmoid(x)) + (1 - z) * -log(1 - sigmoid(x))
= qz * -log(1/(1 + exp(-x))) + (1 - z) * -log(exp(-x)/(1 + exp(-x)))
= qz * log(1 + exp(-x)) + (1 - z) * (-log(exp(-x)) + log(1 + exp(-x)))
= qz * log(1 + exp(-x)) + (1 - z) * (x + log(1 + exp(-x)))
= (1 - z) * x + (qz + 1 - z) * log(1 + exp(-x))
= (1 - z) * x + (1 + (q - 1) * z) * log(1 + exp(-x))

Setting l = (1 + (q - 1) * z), to ensure stability and avoid overflow, the implementation uses:

(1 - z) * x + l * (log(1 + exp(-abs(x))) + max(-x, 0))
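A quick numerical sanity check of this rearrangement (a NumPy sketch, values chosen only for illustration): the stable form agrees with the naive loss for moderate logits and stays finite for saturated logits, where the naive form produces inf:

import numpy as np

def naive_loss(x, z, q):
    # qz * -log(sigmoid(x)) + (1 - z) * -log(1 - sigmoid(x))
    s = 1.0 / (1.0 + np.exp(-x))
    return q * z * -np.log(s) + (1 - z) * -np.log(1 - s)

def stable_loss(x, z, q):
    # (1 - z) * x + l * (log(1 + exp(-abs(x))) + max(-x, 0)), with l = 1 + (q - 1) * z
    l = 1 + (q - 1) * z
    return (1 - z) * x + l * (np.log1p(np.exp(-np.abs(x))) + np.maximum(-x, 0))

z = np.array([1.0, 0.0, 1.0, 0.0])
x = np.array([0.5, -1.5, -800.0, 800.0])  # last two logits are saturated
q = 2.0

print(naive_loss(x, z, q))   # exp(-x) over/underflows, giving inf for the saturated entries
print(stable_loss(x, z, q))  # finite everywhere, matches the naive loss for moderate x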

logits and targets must have the same type and shape.

Parameters:

targets: A Tensor of the same type and shape as logits.

logits: A Tensor of type float32 or float64.

pos_weight: A coefficient to use on the positive examples.

name: A name for the operation (optional).

Returns:

A Tensor of the same shape as logits with the componentwise weighted logistic losses.

Possible exceptions:

ValueError: If logits and targets do not have the same shape.
