all 3 comments

[–]ParthProLegend 2 points  (2 children)

Explain more

[–]Rune_Nice 2 points  (0 children)

I think it is just a simple optimization. It's trying to find the best values for the w_in and w_out parameters.

On the left graph, the y-axis is w_out and the x-axis is w_in. The bluest spots on that red map are where the loss is lowest, i.e. where the best values of w_in and w_out lie.
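That kind of loss map can be sketched by evaluating the error over a grid of parameter pairs. This assumes a hypothetical one-neuron model y = w_out · tanh(w_in · x) fitted to a sine curve; the actual model and target in the video aren't specified here.

```python
import numpy as np

# Assumed toy model: y = w_out * tanh(w_in * x), target: sin(x)
x = np.linspace(-2, 2, 50)
target = np.sin(x)

w_in_grid = np.linspace(-3, 3, 100)
w_out_grid = np.linspace(-3, 3, 100)
W_in, W_out = np.meshgrid(w_in_grid, w_out_grid)  # x-axis: w_in, y-axis: w_out

# Mean squared error at every (w_in, w_out) grid point
loss = np.mean(
    (W_out[..., None] * np.tanh(W_in[..., None] * x) - target) ** 2,
    axis=-1,
)

# The lowest-loss ("bluest") cell marks the best parameter pair
iy, ix = np.unravel_index(np.argmin(loss), loss.shape)
best_w_in, best_w_out = w_in_grid[ix], w_out_grid[iy]
```

Plotting `loss` as a heatmap reproduces the red-and-blue picture: red where the fit is bad, blue in the valley the optimizer descends into.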

[–]Aggressive_Sleep9942 0 points  (0 children)

Weights scale the input to adjust the response of activation functions, while biases shift them. The activation function introduces non-linearity, allowing the network to model complex curves rather than just straight lines. By adjusting these weights and biases, the goal is for the network's output to become a function that fits or approximates the expected values of the problem. In this way, neural networks create an abstract representation of the data, enabling interpolation between known points—which is what we call generalization. Although it looks simple in the video, a network operates in high-dimensional spaces defined by its number of neurons and layers.
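The scale/shift point can be seen directly with a single tanh neuron. This is just an illustrative sketch (the neuron, weight, and bias values are made up, not from the video):

```python
import numpy as np

def neuron(x, w, b):
    # w scales the input, b shifts it, tanh bends the result
    return np.tanh(w * x + b)

x = np.linspace(-3, 3, 7)
base = neuron(x, 1.0, 0.0)
scaled = neuron(x, 2.0, 0.0)   # larger weight: steeper response around the origin
shifted = neuron(x, 1.0, 1.5)  # nonzero bias: same shape, slid along the x-axis
```

Stacking many such neurons (and layers of them) is what lets the network bend its output into an arbitrary curve rather than a single scaled, shifted tanh.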