The softmax function is an activation function that turns a vector of real numbers
into probabilities that sum to one.
It outputs a vector representing a probability distribution
over a list of potential outcomes.
In deep learning, this vector is typically produced by the last layer of a neural network;
its raw, unnormalized output scores are called logits.
We use softmax to map the non-normalized logits of a network to a
probability distribution over the predicted output classes.
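As a sketch, the mapping from logits to probabilities can be written in a few lines of NumPy. Subtracting the maximum logit before exponentiating is a standard numerical-stability trick, not part of the mathematical definition; it leaves the result unchanged because softmax is invariant to shifting all inputs by a constant.

```python
import numpy as np

def softmax(logits):
    # Shift logits for numerical stability; the output is unaffected.
    shifted = logits - np.max(logits)
    exps = np.exp(shifted)
    # Normalize so the outputs form a probability distribution.
    return exps / np.sum(exps)

# Example: three raw scores from a hypothetical final layer.
logits = np.array([2.0, 1.0, 0.1])
probs = softmax(logits)
print(probs)        # probabilities, largest for the largest logit
print(probs.sum())  # sums to 1.0
```

Note that softmax preserves the ordering of the logits, so the class with the highest raw score also receives the highest probability.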

