
BCEWithLogitsLoss in Keras

How can I implement BCEWithLogitsLoss in Keras and use it as a custom loss function with TensorFlow as the backend? I have used BCEWithLogitsLoss in PyTorch, where it is defined in torch.nn.

Solution 1:

In TensorFlow, you can directly call tf.nn.sigmoid_cross_entropy_with_logits, which works in both TensorFlow 1.x and 2.x.
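For illustration, here is a minimal sketch of the direct call (the tensor values are made up, not from the original post); it returns per-example losses, matching PyTorch's BCEWithLogitsLoss with reduction='none':

```python
import tensorflow as tf

# Element-wise binary cross-entropy on raw logits; the sigmoid is applied
# internally, which is numerically more stable than sigmoid + log by hand.
logits = tf.constant([0.0, 2.0, -1.0])
labels = tf.constant([1.0, 1.0, 0.0])

per_example = tf.nn.sigmoid_cross_entropy_with_logits(labels=labels, logits=logits)
loss = tf.reduce_mean(per_example)  # PyTorch's default reduction='mean'
```

For logit 0.0 and label 1.0, the per-example loss is log(2) ≈ 0.6931, which is easy to verify by hand.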

If you want to stick to the Keras API, use tf.keras.losses.BinaryCrossentropy and pass from_logits=True to the constructor.

Unlike PyTorch, the constructor does not take explicit per-example weights. You can instead set reduction=tf.keras.losses.Reduction.NONE on the loss, apply your weighting by explicit multiplication, and reduce the result with tf.reduce_mean.

xent = tf.keras.losses.BinaryCrossentropy(  # note the lowercase "entropy"
    from_logits=True,
    reduction=tf.keras.losses.Reduction.NONE)
# per-example losses, weighted, then reduced to a scalar
loss = tf.reduce_mean(xent(targets, pred) * weights)
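To answer the "use it as a custom loss function" part of the question: when no per-example weighting is needed, the loss object can be passed straight to compile(). A minimal sketch, with an illustrative one-layer model not taken from the original post:

```python
import tensorflow as tf

# The model outputs raw logits: no sigmoid on the final layer, because
# BinaryCrossentropy(from_logits=True) applies it internally.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(1)
])
model.compile(
    optimizer="adam",
    loss=tf.keras.losses.BinaryCrossentropy(from_logits=True),
)
```

If you do need weighting, wrap the NONE-reduction version above in a function of (y_true, y_pred) and pass that function to compile() instead.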
