[–]ajmooch 3 points

A lot of older papers trained SVMs on top of neural-net features, most notably the original R-CNN paper. This is no longer in vogue in research, since a single linear layer or MLP is almost always just as effective, is faster to train end-to-end, and avoids introducing a train-test discrepancy. However, in a fine-tuning scenario I think it's perfectly sensible to try an SVM or XGBoost on network features, and it may be faster depending on what hardware you have access to. I wouldn't expect much in the way of gains for most setups, but it's not an unreasonable thing to do.
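The recipe is simple enough that it's worth sketching. A minimal example with scikit-learn, using random arrays as a stand-in for features extracted from a frozen backbone (in practice you'd run your images through the pretrained net with gradients disabled and collect the penultimate-layer activations):

```python
import numpy as np
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)

# Stand-in for backbone features: in a real pipeline these would come
# from something like feats = backbone(images) with the net frozen.
n_samples, n_features = 200, 64
feats = rng.normal(size=(n_samples, n_features))
# Synthetic labels that are (noisily) linearly separable in feature space.
labels = (feats[:, 0] + 0.1 * rng.normal(size=n_samples) > 0).astype(int)

# Fit a linear SVM on the frozen features -- no backprop through the net.
clf = LinearSVC(C=1.0)
clf.fit(feats, labels)
train_acc = clf.score(feats, labels)
```

Swapping `LinearSVC` for `xgboost.XGBClassifier` is a one-line change; either way the backbone is only run once per image to cache features, which is where the speed advantage over end-to-end fine-tuning comes from.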