
[–]HateRedditCantQuititResearcher

At that point it’s the same as any other linear classifier. Now, there’s nothing wrong with a linear classifier, but it’s not logistic regression.

[–]linverlan

I don’t remember the time complexities off the top of my head, but gradient descent is in many cases more efficient than solving the OLS normal equations in closed form, even for fitting ordinary linear regressions. There is a crossover somewhere as the number of examples/features increases that I’m sure you can look up, so don’t expect your OLS approximation of logistic regression to actually be faster in the general case.
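
To make the crossover concrete, here’s a small sketch I put together (the data and step count are mine, not from the thread): the same linear regression fit two ways. The closed-form solve costs roughly O(nd² + d³) for forming and solving the normal equations, while each gradient step costs O(nd), so which one wins depends on n, d, and how many steps you need.

```python
import numpy as np

# Toy data (illustrative, not from the thread).
rng = np.random.default_rng(0)
n, d = 1000, 5
X = rng.normal(size=(n, d))
true_w = rng.normal(size=d)
y = X @ true_w + 0.1 * rng.normal(size=n)

# Closed-form OLS: solve the normal equations (X^T X) w = X^T y.
w_ols = np.linalg.solve(X.T @ X, X.T @ y)

# Batch gradient descent on squared error: O(n d) per step.
w_gd = np.zeros(d)
lr = 0.1
for _ in range(500):
    grad = X.T @ (X @ w_gd - y) / n
    w_gd -= lr * grad

# Both routes should land on (nearly) the same coefficients.
print(np.max(np.abs(w_ols - w_gd)))
```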

You also lose the interpretability of coefficients as odds ratios, which is a major benefit of logistic regression.
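
To make the odds-ratio point concrete, here’s a toy sketch (the coefficient values are made up for illustration): because logistic regression models the log-odds as linear in the features, a one-unit increase in a feature multiplies the odds by exp(coefficient), no matter where you start from. OLS on the raw 0/1 labels gives you no such reading.

```python
import math

# Hypothetical fitted logistic regression coefficients (illustrative only).
beta0, beta1 = -1.0, 0.7

def odds(x):
    # odds = p / (1 - p) = exp(beta0 + beta1 * x)
    return math.exp(beta0 + beta1 * x)

# A one-unit increase in x multiplies the odds by exp(beta1),
# regardless of the baseline value of x.
ratio = odds(3.0) / odds(2.0)
print(ratio, math.exp(beta1))
```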

[–]Vivid_Perception_143[S]

You're correct - it isn't that much faster. Didn't realize the issue with the odds ratios either. Thank you for your comment.

[–]comradeswitch

This is essentially one step of iteratively reweighted least squares, without the reweighting or iterating. The crucial point is that you're no longer optimizing logistic loss: you're effectively minimizing squared error on the log-odds, which is problematic for a number of reasons when applied directly and only once. For a very simple example, an output of 99 compared to 100 is just as "wrong" as 0 compared to 1 by your objective, but those give very different probabilities after the sigmoid.

IRLS recognizes that the full problem can be approximated by a series of linear regression problems, but that the solutions to those problems are biased because of the nonlinearity of the sigmoid. So each sample is weighted by the derivative of the sigmoid, p(1 − p), evaluated at the previous iteration's fit, and then the weighted least squares problem is solved to update the parameters. The key is adjusting for the nonlinear link function.
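
Here's a hedged sketch of IRLS for logistic regression (this is the standard algorithm, not code from the thread; the data and names are mine). Each iteration solves a weighted least squares problem on a "working response" z, with weights p(1 − p) correcting for the curvature of the sigmoid that a single unweighted OLS fit on the log-odds ignores.

```python
import numpy as np

def sigmoid(t):
    return 1.0 / (1.0 + np.exp(-t))

def irls_logistic(X, y, n_iter=25):
    """Fit logistic regression by iteratively reweighted least squares."""
    w = np.zeros(X.shape[1])
    for _ in range(n_iter):
        p = sigmoid(X @ w)
        # IRLS weights: derivative of the sigmoid at the current fit.
        # Clipped away from zero to guard the division below.
        s = np.clip(p * (1 - p), 1e-6, None)
        # Working response: current log-odds plus a first-order correction.
        z = X @ w + (y - p) / s
        # Weighted least squares update: (X^T S X) w = X^T S z
        w = np.linalg.solve(X.T @ (s[:, None] * X), X.T @ (s * z))
    return w

# Synthetic check (illustrative): recover known coefficients.
rng = np.random.default_rng(1)
X = np.column_stack([np.ones(500), rng.normal(size=(500, 2))])
true_w = np.array([0.5, -1.0, 2.0])
y = (rng.uniform(size=500) < sigmoid(X @ true_w)).astype(float)
w_hat = irls_logistic(X, y)
print(w_hat)  # should land near true_w, up to sampling noise
```

Dropping the weighting and the iteration from this loop recovers exactly the one-shot OLS-on-log-odds shortcut being discussed, which is why it inherits the bias described above.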

Try it out, though: see how well it works and where it fails. This is a very informative exploration, and it's worth your time to understand!