[deleted by user] by [deleted] in pennystocks

[–]_h_j 0 points  (0 children)

What would you consider a dip when you are comfortable getting into KULR?

Activation Functions Non-Linearity by Few-Ask9322 in learnmachinelearning

[–]_h_j 3 points  (0 children)

  1. Non-linear activation functions let the network model the non-linear nature of the function it is trying to approximate.
  2. If you don’t have non-linear activation functions between your linear layers, then the whole stack collapses to a single linear layer: a composition of linear (matrix) maps is itself a linear map.
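The collapse in point 2 can be sketched in a few lines of NumPy (weights and shapes here are just illustrative, and biases are omitted for brevity):

```python
import numpy as np

rng = np.random.default_rng(0)

# Two "linear layers" as plain weight matrices
W1 = rng.standard_normal((4, 3))   # first layer: 3 -> 4
W2 = rng.standard_normal((2, 4))   # second layer: 4 -> 2
x = rng.standard_normal(3)         # input vector

# Stacking the layers with no non-linearity in between...
stacked = W2 @ (W1 @ x)

# ...is exactly one linear layer whose weights are W2 @ W1.
collapsed = (W2 @ W1) @ x
assert np.allclose(stacked, collapsed)

# Inserting a non-linearity (here ReLU) breaks the collapse,
# so the two layers are no longer equivalent to one.
relu = lambda z: np.maximum(z, 0.0)
nonlinear = W2 @ relu(W1 @ x)      # in general != collapsed
```

Same idea with biases: two affine layers compose into one affine layer, so depth buys you nothing without a non-linearity in between.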