Applies the rectified linear unit function element-wise: $$\mbox{ReLU}(x) = (x)^+ = \max(0, x)$$
Shape
Input: \((N, *)\), where \(*\) means any number of additional dimensions.
Output: \((N, *)\), same shape as the input.
Examples
if (torch_is_installed()) {
  m <- nn_relu()
  input <- torch_randn(2)
  m(input)
}
#> torch_tensor
#> 0.2521
#> 0.0000
#> [ CPUFloatType{2} ]
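Because `torch_randn()` draws random values, the output above varies between runs. A sketch with a fixed input makes the element-wise definition \(\max(0, x)\) easier to verify: negative entries map to zero and non-negative entries pass through unchanged.

```r
library(torch)

# Fixed input so the result is deterministic
m <- nn_relu()
x <- torch_tensor(c(-1.5, 0, 2.5))
y <- m(x)

# Negative entry becomes 0; 0 and 2.5 are unchanged
as_array(y)
```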