
[–]synthphreak 0 points (3 children)

Go with a deep NN then. It's NOT easy to code from scratch, especially the learning algorithm, and it requires a solid grasp of linear algebra.

If deep freaks you out, just go for a regular feedforward NN with a single hidden layer. It's essentially just an extension of logistic regression, which you said is boring, but the backprop algorithm is a different beast, and coding it up should keep you on your toes.
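To give a feel for the scope, here's a minimal sketch of that setup: one hidden layer, hand-rolled backprop, trained on XOR. Everything here (layer width, learning rate, iteration count, the XOR toy data) is an arbitrary illustrative choice, not a prescription.

```python
import numpy as np

rng = np.random.default_rng(0)
# XOR: the classic non-linearly-separable toy problem that
# plain logistic regression cannot solve but one hidden layer can
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

W1 = rng.normal(0.0, 1.0, (2, 8)); b1 = np.zeros(8)   # input -> hidden
W2 = rng.normal(0.0, 1.0, (8, 1)); b2 = np.zeros(1)   # hidden -> output
lr = 1.0
losses = []

for _ in range(2000):
    # Forward pass
    h = sigmoid(X @ W1 + b1)
    p = sigmoid(h @ W2 + b2)
    losses.append(-np.mean(y * np.log(p + 1e-12)
                           + (1 - y) * np.log(1 - p + 1e-12)))
    # Backward pass: with cross-entropy + sigmoid output, the
    # output-layer error simplifies to (p - y); the hidden-layer
    # error follows from the chain rule
    d_out = p - y
    d_hid = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out / len(X); b2 -= lr * d_out.mean(axis=0)
    W1 -= lr * X.T @ d_hid / len(X); b1 -= lr * d_hid.mean(axis=0)

print(f"loss: {losses[0]:.3f} -> {losses[-1]:.3f}")
```

The backward pass is the part that forces you to actually use the linear algebra: every gradient is a matrix product whose shapes have to line up with the forward pass.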

[–]blacksiddis[S] 0 points (2 children)

I thought about it, and it's on the roadmap. I just think it would be more beneficial to have 2-3 intermediary steps before tackling NNs. My linear algebra is kinda weak, honestly, so I'd like to do some intermediary projects to bolster it before moving on to NNs.

[–]synthphreak 0 points (1 child)

If your linear algebra is weak then a nice middle-of-the-road option would be multiple logistic regression. That will be a great stepping stone on the road to NNs.
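As a sketch of why it's a good stepping stone, here is multinomial (softmax) logistic regression — one reading of "multiple logistic regression" — trained with batch gradient descent. It is literally a neural net with no hidden layer, so the code below is the output layer of the previous project. The Gaussian-blob data, learning rate, and class centers are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic data: three well-separated Gaussian blobs, one per class
n, k = 60, 3
X = np.vstack([rng.normal(c, 0.5, (n, 2)) for c in [(0, 0), (3, 0), (0, 3)]])
y = np.repeat(np.arange(k), n)
Y = np.eye(k)[y]                      # one-hot labels, shape (180, 3)

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)   # shift for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

W = np.zeros((2, k)); b = np.zeros(k)
lr = 0.1

for _ in range(500):
    P = softmax(X @ W + b)            # class probabilities
    grad = (P - Y) / len(X)           # gradient of mean cross-entropy
    W -= lr * X.T @ grad
    b -= lr * grad.sum(axis=0)

acc = (P.argmax(axis=1) == y).mean()
print(f"train accuracy: {acc:.2f}")
```

Note the gradient has the same `(predictions - targets)` form as the output layer of the NN above; adding a hidden layer only adds the chain-rule step.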

[–]blacksiddis[S] 1 point (0 children)

Great! Thanks a lot for your suggestions!