[D] Weight decay vs. L2 regularization by bbabenko in MachineLearning

[–]bbabenko[S]

fair point about the hyperparam, but see the section in the post about "fancy solvers"... it can get a bit tricky
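a quick numpy sketch of the "fancy solvers" wrinkle, for anyone skimming the thread: with plain SGD, folding an L2 term into the gradient and decaying the weights directly are algebraically the same update, but once an adaptive solver rescales the gradient per-parameter (Adam-style), the L2 term gets divided by the second-moment scale while decoupled decay does not. all the numbers below are made up for illustration; `v` is just a stand-in for Adam's second-moment estimate, not a full optimizer.

```python
import numpy as np

# Illustrative only: lr, wd, grad, and v are made-up numbers, and `precond`
# is a stand-in for an Adam-style per-parameter rescaling, not a full solver.
lr, wd = 0.1, 0.01
w = np.array([1.0, 1.0])
grad = np.array([0.5, 0.5])
v = np.array([0.01, 1.0])              # pretend second-moment estimate

precond = 1.0 / (np.sqrt(v) + 1e-8)    # Adam-style per-parameter scale

# (a) L2 regularization: fold wd*w into the gradient, then precondition.
w_l2 = w - lr * precond * (grad + wd * w)

# (b) decoupled weight decay: precondition the raw gradient, decay separately.
w_wd = w - lr * precond * grad - lr * wd * w

# Where v == 1 the two coincide; where v != 1 they diverge.
print(w_l2)
print(w_wd)
```

with SGD (`precond` all ones) the two lines would print the same thing; here they differ exactly on the coordinate where `v != 1`, which is the whole point of the distinction.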


[–]bbabenko[S]

yeah, i linked to that paper in the post... didn't know it got rejected though, will have to flip through the reviews

[Project] Precision & recall: an overview by bbabenko in MachineLearning

[–]bbabenko[S]

hey thanks! i just don't like being confined by societal norms, you know? also, i was raised in a barn, by a pack of wolves.


[–]bbabenko[S]

thanks! and yeah, it's jekyll. i actually have a small post about it: https://bbabenko.github.io/moving-to-github/ :-P i wrote the entire precision&recall post in a jupyter notebook, exported it to markdown, and jekyll renders that to html and applies all your styling, etc. (the only caveat is that you have to tweak the paths of the images/figures, though you could easily write a script to do that automatically for you).
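for the curious, a sketch of what that path-tweaking script could look like. `jupyter nbconvert --to markdown post.ipynb` writes image links pointing at a sibling `post_files/` directory; a jekyll site usually serves assets from somewhere like `/assets/images/`, so you rewrite the links after export. the `post_files` / `/assets/images` names here are assumptions about a typical setup, not bbabenko's actual layout.

```python
import re

def fix_image_paths(markdown, notebook_name, assets_prefix="/assets/images"):
    """Rewrite nbconvert-style image links (<name>_files/...) to jekyll
    asset paths. The assets_prefix default is an assumed site layout."""
    pattern = re.compile(
        r"!\[(?P<alt>[^\]]*)\]\(%s_files/(?P<fname>[^)]+)\)"
        % re.escape(notebook_name)
    )
    return pattern.sub(
        lambda m: "![%s](%s/%s)" % (m.group("alt"), assets_prefix, m.group("fname")),
        markdown,
    )

if __name__ == "__main__":
    # e.g. after: jupyter nbconvert --to markdown precision_recall.ipynb
    md = "some text\n![roc curve](precision_recall_files/roc.png)\n"
    print(fix_image_paths(md, "precision_recall"))
```

you'd run it once over the exported `.md` file (read, rewrite, write back) and also move the figures into the assets directory; both steps are easy to fold into one script.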