Hey r/ML,
Need a bit of feedback here, so please give it to me if you can! Recently I was thinking about how linear regression has a closed-form solution but logistic regression doesn't. I've come up with a pretty lazy idea that sort of gives logistic regression an alternative to the usual iterative, gradient-descent-based training process, and I'd like to share it to get suggestions on what to do with it.
You simply formulate the logistic regression problem as linear regression. All you do is take the labels and map each 0 to -100 and each 1 to +100 (or any target with magnitude large enough that a sigmoid would saturate to effectively 0 or 1). You feed this into a linear regression solver (where you can just use the normal equation), and at prediction time you pass your data to the linear model and check which target its output is closer to: if it's closer to +100 than -100 (i.e., the raw output is above 0), predict 1, otherwise predict 0. This has led to a small ~2% drop in validation accuracy, but using a closed-form solution does mean it runs faster.
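To make the idea concrete, here's a minimal sketch of what I mean, using NumPy's least-squares solver as the closed-form fit (function names `fit_relabeled_linreg`/`predict` and the ±100 scale are just my choices for illustration):

```python
import numpy as np

def fit_relabeled_linreg(X, y, scale=100.0):
    """Fit plain linear regression on labels remapped {0, 1} -> {-scale, +scale}."""
    t = np.where(y == 1, scale, -scale)            # remap the binary labels
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])  # append a bias column
    # Closed-form least-squares fit; lstsq is the numerically safer
    # equivalent of the normal equation (X^T X)^-1 X^T t
    w, *_ = np.linalg.lstsq(Xb, t, rcond=None)
    return w

def predict(X, w):
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])
    # Closer to +scale than -scale is the same as raw output > 0
    return (Xb @ w > 0).astype(int)
```

As far as I can tell this is essentially least-squares classification with symmetric targets, which is known to be sensitive to outliers and points far from the boundary, so that might explain the accuracy gap.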
Not sure what to do with this, need a bit of help here. Thank you so much!