The softmax function, also known as the normalized exponential function, is a common activation function in neural networks: it maps a vector of real numbers to a probability distribution, with every output in (0, 1) and all outputs summing to 1.
$$\sigma(x)_{j} = \frac{e^{x_{j}}}{\sum_{i=1}^{K} e^{x_{i}}}, \qquad j = 1, \dots, K,$$

where $K$ is the number of components of the input vector $x$.
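As a quick sanity check, a worked example (values rounded to three decimals):

$$\sigma((1, 2, 3)) = \left(\tfrac{e^{1}}{e^{1}+e^{2}+e^{3}},\; \tfrac{e^{2}}{e^{1}+e^{2}+e^{3}},\; \tfrac{e^{3}}{e^{1}+e^{2}+e^{3}}\right) \approx (0.090,\, 0.245,\, 0.665),$$

which sums to 1, as expected.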
## Advantages

The outputs are strictly positive and sum to one, so they can be read directly as class probabilities, and the function is smooth and differentiable, which suits gradient-based training.

## Disadvantages

The exponentials can overflow for inputs with large magnitude. In practice this is avoided by subtracting the maximum element of $x$ before exponentiating, which leaves the result unchanged (as in the NumPy implementation below).
## Implementations

Python (NumPy):

```python
import numpy as np

def softmax(x):
    # Subtracting the max avoids overflow and does not change the result
    e = np.exp(x - np.max(x))
    return e / np.sum(e, axis=0)
```
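A quick usage check:

```python
x = np.array([1.0, 2.0, 3.0])
print(softmax(x))  # [0.09003057 0.24472847 0.66524096]
```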
R:

```r
softmax <- function(x) {
  # Element-wise exponentials, normalized so the result sums to one
  exp(x) / sum(exp(x))
}
```
Julia:

```julia
function softmax(x)
    # Dot syntax broadcasts exp and the division across the vector
    exp.(x) ./ sum(exp.(x))
end
```
TensorFlow:

```python
import tensorflow as tf

def softmax(x):
    # tf.nn.softmax normalizes along the last axis by default
    return tf.nn.softmax(x)
```
PyTorch:

```python
import torch.nn.functional as F

def softmax(x):
    # dim=0 normalizes over the first dimension, appropriate for a 1-D tensor
    return F.softmax(x, dim=0)
```
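For batched input of shape `(batch, classes)`, the softmax should be taken over the class dimension rather than `dim=0`. A minimal sketch (the tensor values here are illustrative):

```python
import torch
import torch.nn.functional as F

logits = torch.tensor([[1.0, 2.0, 3.0],
                       [0.5, 1.5, 0.5]])  # shape (2, 3): batch of 2, 3 classes
probs = F.softmax(logits, dim=-1)         # each row sums to 1
```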