# Binary Cross Entropy (BCE)

Binary Cross Entropy (BCE) is a common loss function used in binary classification problems. It is also known as the log loss function.


## Algorithms Using BCE

BCE appears as the training objective in several families of classifiers:

  • Logistic Regression (BCE is exactly its negative log-likelihood objective; see the sketch after this list)
  • Neural Networks (the standard loss for binary classifiers)
  • Gradient Boosting (as the "logistic" / log-loss objective, e.g. in XGBoost and LightGBM)

(Support Vector Machines and Random Forests, by contrast, are usually trained with the hinge loss and impurity criteria respectively, not BCE.)
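
As a concrete illustration, scikit-learn's LogisticRegression fits its weights by minimizing exactly this log loss (plus a regularization penalty). A minimal sketch with made-up toy data:

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import log_loss

# Toy data: one feature, binary labels (illustrative values)
X = np.array([[0.1], [0.4], [0.6], [0.9], [1.2], [1.5]])
y = np.array([0, 0, 0, 1, 1, 1])

model = LogisticRegression().fit(X, y)
# log_loss on the fitted probabilities is the quantity the model minimized
print(log_loss(y, model.predict_proba(X)[:, 1]))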

## Formula

For $N$ samples with actual labels $y_i \in \{0, 1\}$ and predicted probabilities $\hat{y}_i \in (0, 1)$:

$$\mathrm{BCE} = -\frac{1}{N}\sum_{i=1}^{N}\left[\,y_i \log(\hat{y}_i) + (1 - y_i)\log(1 - \hat{y}_i)\,\right]$$

where:

  • $y_i$ is the actual value.
  • $\hat{y}_i$ is the predicted value.
  • $N$ is the number of samples.

## Properties

  • The BCE is always non-negative.
  • The BCE is unbounded above: it grows without limit as the predicted probability approaches the wrong label.
  • The BCE is equal to 0 when the predicted probability exactly matches the actual label.
  • The per-sample BCE is equal to ln 2 ≈ 0.693 when the predicted value is 0.5 (exactly 1 if logarithms are taken in base 2).
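
These properties are easy to check numerically. A minimal sketch in NumPy (the helper below is defined only for this check):

import numpy as np

def bce(y, y_hat):
    # Per-sample binary cross entropy with the natural logarithm
    return -(y * np.log(y_hat) + (1 - y) * np.log(1 - y_hat))

print(bce(1, 0.5))    # ln 2 ≈ 0.693: the loss for a maximally uncertain prediction
print(bce(1, 0.999))  # ≈ 0.001: near zero for a near-perfect prediction
print(bce(1, 0.001))  # ≈ 6.91: grows without bound as y_hat -> 0 while y = 1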

## Example

Let's say we have a dataset of 5 samples (the values below are illustrative):

  • The actual values are y = [1, 0, 1, 1, 0].
  • The predicted values are ŷ = [0.9, 0.1, 0.8, 0.7, 0.2].

The BCE is calculated as follows (each y = 0 sample contributes $\log(1 - \hat{y}_i)$):

$$\mathrm{BCE} = -\tfrac{1}{5}\left(\ln 0.9 + \ln 0.9 + \ln 0.8 + \ln 0.7 + \ln 0.8\right) \approx 0.2027$$
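
The same arithmetic can be checked in NumPy:

import numpy as np

y = np.array([1, 0, 1, 1, 0])
y_hat = np.array([0.9, 0.1, 0.8, 0.7, 0.2])
print(-np.mean(y * np.log(y_hat) + (1 - y) * np.log(1 - y_hat)))  # ≈ 0.2027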

## Advantages and Disadvantages

  • Advantages

    • It is easy to implement.
    • It is easy to interpret: it is the negative log-likelihood of a Bernoulli model.
    • It is differentiable, so it works with gradient-based optimizers.
    • It is a continuous function.
    • For a positive sample, the loss decreases monotonically as the predicted probability increases (and symmetrically for a negative sample).
    • It is symmetric between the classes: BCE(y, ŷ) = BCE(1 − y, 1 − ŷ).
    • It is strictly convex in the predicted probability.
  • Disadvantages (the first point is illustrated by the sketch after this list)

    • It is sensitive to outliers: a single confidently wrong prediction can dominate the mean loss.
    • It is not robust to label noise, since mislabeled samples incur very large losses.
    • It does not handle missing values; labels and predictions must be complete.
    • It is not robust to class imbalance: the majority class can dominate the average loss.
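
A minimal sketch of the outlier sensitivity, using made-up predictions in which a single confident mistake dominates the mean loss:

import numpy as np

def bce(y, y_hat):
    return -np.mean(y * np.log(y_hat) + (1 - y) * np.log(1 - y_hat))

y       = np.array([1, 1, 1, 1, 1])
good    = np.array([0.9, 0.9, 0.9, 0.9, 0.9])
outlier = np.array([0.9, 0.9, 0.9, 0.9, 0.001])  # one confidently wrong prediction

print(bce(y, good))     # ≈ 0.105
print(bce(y, outlier))  # ≈ 1.466, driven almost entirely by the single outlier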

## Implementation

### Python

import numpy as np

def binary_cross_entropy(y, y_hat, eps=1e-12):
    # Clip predictions away from 0 and 1 to avoid log(0)
    y_hat = np.clip(y_hat, eps, 1 - eps)
    return -np.mean(y * np.log(y_hat) + (1 - y) * np.log(1 - y_hat))
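
For example, on the values from the worked example above:

y = np.array([1, 0, 1, 1, 0])
y_hat = np.array([0.9, 0.1, 0.8, 0.7, 0.2])
print(binary_cross_entropy(y, y_hat))  # ≈ 0.2027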

### R

binary_cross_entropy <- function(y, y_hat, eps = 1e-12) {
    # Clip predictions away from 0 and 1 to avoid log(0)
    y_hat <- pmin(pmax(y_hat, eps), 1 - eps)
    return(-mean(y * log(y_hat) + (1 - y) * log(1 - y_hat)))
}

### Julia

using Statistics  # for mean

function binary_cross_entropy(y, y_hat; eps=1e-12)
    # Clamp predictions away from 0 and 1 to avoid log(0)
    y_hat = clamp.(y_hat, eps, 1 - eps)
    return -mean(y .* log.(y_hat) .+ (1 .- y) .* log.(1 .- y_hat))
end

### Scikit-learn

from sklearn.metrics import log_loss

def binary_cross_entropy(y, y_hat):
    # log_loss clips the probabilities internally, so no manual epsilon is needed
    return log_loss(y, y_hat)
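
For example, reusing the values from the worked example:

y = [1, 0, 1, 1, 0]
y_hat = [0.9, 0.1, 0.8, 0.7, 0.2]
print(binary_cross_entropy(y, y_hat))  # ≈ 0.2027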

### TensorFlow

import tensorflow as tf

def binary_cross_entropy(y, y_hat):
    # binary_crossentropy averages over the last axis; reduce_mean then
    # collapses any remaining batch dimensions to a single scalar
    return tf.reduce_mean(tf.keras.losses.binary_crossentropy(y, y_hat))
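
For example, with the same values as above (TensorFlow 2.x eager mode assumed):

y = tf.constant([1.0, 0.0, 1.0, 1.0, 0.0])
y_hat = tf.constant([0.9, 0.1, 0.8, 0.7, 0.2])
print(float(binary_cross_entropy(y, y_hat)))  # ≈ 0.2027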

### PyTorch

import torch

def binary_cross_entropy(y, y_hat):
    # Note the argument order: PyTorch expects (input, target),
    # i.e. the predictions first and the labels second
    return torch.nn.functional.binary_cross_entropy(y_hat, y)
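
For example, with the same values as above (both arguments must be float tensors):

y = torch.tensor([1.0, 0.0, 1.0, 1.0, 0.0])
y_hat = torch.tensor([0.9, 0.1, 0.8, 0.7, 0.2])
print(binary_cross_entropy(y, y_hat).item())  # ≈ 0.2027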