
Applies the randomized leaky rectified linear unit function, element-wise, as described in the paper Empirical Evaluation of Rectified Activations in Convolutional Network.

Usage

nn_rrelu(lower = 1/8, upper = 1/3, inplace = FALSE)

Arguments

lower

lower bound of the uniform distribution. Default: 1/8

upper

upper bound of the uniform distribution. Default: 1/3

inplace

if TRUE, performs the operation in-place on the input tensor (see the sketch after this list). Default: FALSE
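
A minimal sketch of the effect of inplace = TRUE; the variable names input and out are illustrative, not part of the API:

if (torch_is_installed()) {
  m <- nn_rrelu(inplace = TRUE)
  input <- torch_randn(2)
  out <- m(input)
  # input is expected to now hold the activated values;
  # out and input refer to the same underlying tensor
}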

Details

The function is defined as:

\text{RReLU}(x) = \begin{cases} x & \text{if } x \geq 0 \\ ax & \text{otherwise} \end{cases}

where a is randomly sampled from the uniform distribution U(lower, upper). See: Empirical Evaluation of Rectified Activations in Convolutional Network, https://arxiv.org/pdf/1505.00853.pdf
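
For intuition, here is a minimal sketch of the training-time forward pass, assuming one slope is drawn per element; rrelu_sketch is a hypothetical helper, not part of the torch API:

if (torch_is_installed()) {
  rrelu_sketch <- function(x, lower = 1/8, upper = 1/3) {
    # draw one slope per element from U(lower, upper)
    a <- lower + (upper - lower) * torch_rand_like(x)
    # identity for non-negative inputs, slope a elsewhere
    torch_where(x >= 0, x, a * x)
  }
  rrelu_sketch(torch_randn(4))
}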

Shape

  • Input: (N, *) where * means any number of additional dimensions

  • Output: (N, *), same shape as the input
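
Because the operation is element-wise, inputs of any dimensionality pass through with their shape unchanged; a quick check (the shape 2 x 3 x 4 is chosen arbitrarily):

if (torch_is_installed()) {
  m <- nn_rrelu()
  x <- torch_randn(2, 3, 4)
  dim(m(x)) # same shape as the input: 2 3 4
}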

Examples

if (torch_is_installed()) {
m <- nn_rrelu(0.1, 0.3) # slopes drawn from U(0.1, 0.3) in training mode
input <- torch_randn(2)
m(input)
}
#> torch_tensor
#> -0.1234
#> -0.2292
#> [ CPUFloatType{2} ]
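
The random slope is only drawn in training mode; in evaluation mode the module uses a fixed negative slope (the midpoint (lower + upper) / 2 in PyTorch's reference implementation), so outputs become deterministic. A brief sketch, assuming the same behavior here:

if (torch_is_installed()) {
  m <- nn_rrelu(0.1, 0.3)
  m$eval() # evaluation mode: fixed slope instead of a random draw
  input <- torch_randn(2)
  m(input)
}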