# Softmax Function

The softmax function, also known as the normalized exponential function, is a common activation function in neural networks. It maps a vector of real-valued scores to a probability distribution over the same number of entries.

## Algorithms Using Softmax

- Neural networks, where softmax is the standard output activation for multi-class classification (a minimal sketch follows this list)
- Deep learning architectures more broadly, for example attention mechanisms, which apply a softmax to similarity scores to obtain attention weights
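
As a concrete illustration of the classification use, here is a minimal NumPy sketch; the logits are made-up values standing in for the raw outputs of a network's final layer:

```python
import numpy as np

# Hypothetical raw scores (logits) from a 3-class classifier.
logits = np.array([2.0, 1.0, 0.1])

# Softmax turns them into class probabilities.
e = np.exp(logits - logits.max())   # max-shift for numerical stability
probs = e / e.sum()

print(probs)             # ~[0.659 0.242 0.099], sums to 1
print(probs.argmax())    # 0 -> predicted class index
```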

## Formula

$$\sigma(x)_j = \frac{e^{x_j}}{\sum_{i=1}^{n} e^{x_i}}, \qquad j = 1, \dots, n$$

- $x = (x_1, \dots, x_n)$ is the input vector, and $\sigma(x)_j$ is the $j$-th component of the output.
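
A short worked example makes the formula concrete. For $x = (1, 2, 3)$, the denominator is $e^{1} + e^{2} + e^{3} \approx 30.19$, so

$$\sigma(x) \approx \left(\frac{2.72}{30.19},\ \frac{7.39}{30.19},\ \frac{20.09}{30.19}\right) \approx (0.090,\ 0.245,\ 0.665),$$

which is positive, sums to 1, and gives the largest weight to the largest input.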

## Properties

- Every output of the softmax function is strictly positive.
- Every output is at most 1 (and strictly less than 1 whenever the input has more than one component).
- The outputs sum to 1, so they can be read as a probability distribution.
- The output is unchanged when the same constant is added to every input (shift invariance), which is checked numerically in the sketch after this list.
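
These properties are easy to verify numerically. A minimal check, assuming NumPy; the test vector is arbitrary:

```python
import numpy as np

def softmax(x):
    e = np.exp(x - np.max(x))  # max-shift; shift invariance keeps the result the same
    return e / e.sum()

x = np.array([-1.0, 0.0, 2.5])            # arbitrary test input
p = softmax(x)

assert np.all((p > 0) & (p < 1))          # strictly between 0 and 1
assert np.isclose(p.sum(), 1.0)           # outputs sum to 1
assert np.allclose(p, softmax(x + 7.3))   # shift invariance
```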

## Advantages and Disadvantages

- Advantages

  - The softmax function is differentiable everywhere; its Jacobian has the simple closed form $\partial \sigma_i / \partial x_j = \sigma_i (\delta_{ij} - \sigma_j)$.
  - It is monotonic: increasing one input increases the corresponding output.
  - Its outputs are bounded between 0 and 1.

- Disadvantages

  - A naive implementation can overflow or underflow for inputs of large magnitude; the standard remedy is to subtract the maximum input before exponentiating, as the sketch after this list demonstrates.
  - The normalizing sum runs over every class, which becomes costly when the number of classes is very large.
  - It can saturate: when one input is much larger than the rest, gradients with respect to the other inputs become very small.
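
The overflow problem from the first disadvantage is easy to reproduce. A minimal sketch, assuming NumPy; the inputs around 1000 are chosen only to force the overflow:

```python
import numpy as np

x = np.array([1000.0, 1000.5, 1001.0])

# Naive softmax: np.exp(1000) overflows to inf, so the result is all nan.
naive = np.exp(x) / np.exp(x).sum()
print(naive)                     # [nan nan nan] (with a RuntimeWarning)

# Stabilized softmax: subtract the max first; by shift invariance the
# mathematical result is identical.
e = np.exp(x - x.max())
print(e / e.sum())               # [0.1863 0.3072 0.5065]
```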

## Implementation

### Python

```python
import numpy as np

def softmax(x):
    # Shift by the max before exponentiating: shift invariance means the
    # result is unchanged, and large inputs no longer overflow.
    e = np.exp(x - np.max(x))
    return e / np.sum(e)
```
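
For example, applied to the vector from the worked example above:

```python
print(softmax(np.array([1.0, 2.0, 3.0])))
# [0.09003057 0.24472847 0.66524096]
```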

### R

```r
softmax <- function(x) {
  # Shift by the max for numerical stability
  e <- exp(x - max(x))
  e / sum(e)
}
```

### Julia

```julia
function softmax(x)
    # Broadcast over the elements; shift by the maximum for stability
    e = exp.(x .- maximum(x))
    e ./ sum(e)
end
```

### TensorFlow

```python
import tensorflow as tf

# tf.nn.softmax normalizes over the last axis by default (axis=-1).
def softmax(x):
    return tf.nn.softmax(x)
```
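
For example, with eager execution (the default in TensorFlow 2):

```python
x = tf.constant([1.0, 2.0, 3.0])
print(softmax(x).numpy())   # ~[0.09 0.245 0.665]
```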

### PyTorch

```python
import torch
import torch.nn.functional as F

# dim=0 normalizes over the first dimension, i.e. the whole vector
# for 1-D input.
def softmax(x):
    return F.softmax(x, dim=0)
```
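
Applied to a 1-D tensor, dim=0 normalizes over the whole vector:

```python
x = torch.tensor([1.0, 2.0, 3.0])
print(softmax(x))   # tensor([0.0900, 0.2447, 0.6652])
```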