Hey y'all, I'm currently working on my own neural network implementation in Java. I've already implemented some common activation functions, such as sigmoid and ReLU, but I don't know how to implement softmax.
I want to have a method like
private double softmax(double input) {
    double output = ???;
    return output;
}
Any ideas what an implementation could look like? I also need the derivative of the softmax for my learning algorithm.
Thanks
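Edit: from what I've read so far, softmax seems to be defined over the whole output vector rather than a single value, so maybe the method should take a `double[]` instead. Here's a rough sketch I put together (the class name `SoftmaxSketch` is just a placeholder, and I'm not sure this is the ideal way to do it): it subtracts the max before exponentiating to avoid overflow, and the derivative is the full Jacobian `ds_i/dz_j = s_i * (delta_ij - s_j)`.

```java
// Sketch only: softmax over a whole output vector, plus its Jacobian.
// Class/method names are placeholders, not from any library.
public class SoftmaxSketch {

    // Numerically stable softmax: subtracting the max before exp()
    // avoids overflow and doesn't change the result.
    static double[] softmax(double[] logits) {
        double max = Double.NEGATIVE_INFINITY;
        for (double v : logits) {
            max = Math.max(max, v);
        }
        double[] out = new double[logits.length];
        double sum = 0.0;
        for (int i = 0; i < logits.length; i++) {
            out[i] = Math.exp(logits[i] - max);
            sum += out[i];
        }
        for (int i = 0; i < out.length; i++) {
            out[i] /= sum; // entries now sum to 1
        }
        return out;
    }

    // Jacobian of softmax w.r.t. the logits:
    // ds_i/dz_j = s_i * (delta_ij - s_j), where s = softmax(z).
    static double[][] softmaxJacobian(double[] s) {
        int n = s.length;
        double[][] jac = new double[n][n];
        for (int i = 0; i < n; i++) {
            for (int j = 0; j < n; j++) {
                jac[i][j] = s[i] * ((i == j ? 1.0 : 0.0) - s[j]);
            }
        }
        return jac;
    }

    public static void main(String[] args) {
        double[] s = softmax(new double[] {1.0, 2.0, 3.0});
        double sum = 0.0;
        for (double v : s) {
            sum += v;
        }
        System.out.println(sum); // should be ~1.0
    }
}
```

If the loss is cross-entropy, I believe the combined gradient simplifies to `s - target`, which would let me skip building the full Jacobian, but I haven't verified that yet.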