Applies the Softmax function to an n-dimensional input Tensor, rescaling it so that
the elements of the n-dimensional output Tensor lie in the range [0, 1] and sum to 1
along the given dimension.
Softmax is defined as:

\text{Softmax}(x_i) = \frac{\exp(x_i)}{\sum_j \exp(x_j)}
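As a quick sanity check, the definition can be reproduced in plain R and compared
against the module; this is a minimal sketch, with a tiny illustrative vector x and
as_array() used to convert the tensor result back to an R vector.

if (torch_is_installed()) {
  x <- c(1, 2, 3)
  # softmax computed directly from the definition
  exp(x) / sum(exp(x))
  # the same values via nn_softmax; dim = 1 is the only dimension of a 1-d tensor
  m <- nn_softmax(dim = 1)
  as_array(m(torch_tensor(x)))
}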
Note
This module doesn't work directly with NLLLoss (nn_nll_loss()), which expects
log-probabilities as input, i.e. the log should be computed between the softmax
and the loss. Use LogSoftmax (nn_log_softmax()) instead: it's faster and has
better numerical properties.
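A minimal sketch of the recommended pairing, assuming the usual N x C input layout;
the specific class labels in target are illustrative only, and their encoding is an
assumption (see nn_nll_loss() for the exact requirements).

if (torch_is_installed()) {
  # log-probabilities over the class dimension (dim = 2 for an N x C input)
  log_softmax <- nn_log_softmax(dim = 2)
  loss_fn <- nn_nll_loss()
  input <- torch_randn(3, 5)                                 # 3 observations, 5 classes
  target <- torch_tensor(c(2, 1, 4), dtype = torch_long())   # illustrative class labels
  loss_fn(log_softmax(input), target)
}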
Examples
if (torch_is_installed()) {
  # softmax over the first dimension of a 2 x 3 tensor (dim is 1-based in torch for R)
  m <- nn_softmax(1)
  input <- torch_randn(2, 3)
  output <- m(input)
}
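As a follow-up check (a sketch, not part of the original example), the probabilities
produced with dimension 1 sum to 1 along that dimension, i.e. down each column of a
2 x 3 input:

if (torch_is_installed()) {
  m <- nn_softmax(1)
  output <- m(torch_randn(2, 3))
  output$sum(dim = 1)   # a length-3 tensor of ones: each column sums to 1
}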