The ReLU, or rectified linear unit, is a common activation function in neural networks. It is defined as relu(x) = max(0, x): it returns its input unchanged when the input is positive and returns 0 otherwise.
Advantages
ReLU is cheap to compute, and because it outputs 0 for every negative input it tends to produce sparse activations. For positive inputs its gradient is 1, so gradients flow through deep networks better than with saturating functions such as sigmoid or tanh.

Disadvantages
The function is not differentiable at 0, its output is not zero-centered, and a unit can end up outputting 0 for every input it sees, after which its gradient is always 0 and it stops learning (the "dying ReLU" problem).
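Both trade-offs come from the gradient: ReLU passes gradients through unchanged wherever the input is positive and blocks them entirely wherever it is negative. A minimal NumPy sketch (the function name relu_grad and the choice of 0 at the origin are ours):

import numpy as np

def relu_grad(x):
    # Subgradient of max(0, x): 1 where x > 0, 0 elsewhere
    # (the value at exactly x = 0 is a convention; 0 is used here)
    return (x > 0).astype(x.dtype)

x = np.array([-3.0, -1.0, 0.0, 1.0, 3.0])
print(relu_grad(x))  # [0. 0. 0. 1. 1.]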
In plain Python:

def relu(x):
    # Return x when it is positive, 0 otherwise
    return max(0, x)
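The scalar version above handles one number at a time; in a network, ReLU is applied elementwise to whole tensors. A NumPy sketch of the elementwise form:

import numpy as np

x = np.array([-3.0, -1.0, 0.0, 1.0, 3.0])
print(np.maximum(0, x))  # [0. 0. 0. 1. 3.]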
In R, where pmax computes the elementwise maximum so the function also works on vectors:

relu <- function(x) {
  # Elementwise maximum of 0 and x
  return(pmax(0, x))
}
In Julia:

function relu(x)
    # Return the larger of 0 and x
    return max(0, x)
end
With TensorFlow (in 2.x, operations execute eagerly, so no Session is needed):

import tensorflow as tf

x = tf.constant([-3.0, -1.0, 0.0, 1.0, 3.0])
# tf.nn.relu zeroes out the negative entries
print(tf.nn.relu(x))  # tf.Tensor([0. 0. 0. 1. 3.], shape=(5,), dtype=float32)
With PyTorch:

import torch

x = torch.tensor([-3.0, -1.0, 0.0, 1.0, 3.0])
# Functional form; torch.nn.ReLU() is the equivalent module
print(torch.nn.functional.relu(x))  # tensor([0., 0., 0., 1., 3.])
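Inside a model, ReLU is typically inserted as a layer between linear layers. A minimal PyTorch sketch (the layer sizes here are arbitrary):

import torch

model = torch.nn.Sequential(
    torch.nn.Linear(4, 8),   # hidden layer
    torch.nn.ReLU(),         # elementwise ReLU activation
    torch.nn.Linear(8, 1),   # output layer
)
print(model(torch.randn(2, 4)).shape)  # torch.Size([2, 1])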