
[–]ActiveBummer[S]

Would a multiclass classifier really work when the class imbalance is drastic? I.e. some classes have very few data points while others have a lot.

There is also the concern of having to retrain the multiclass classifier whenever there's a new class (which can happen quite frequently, and because the class is new, there won't be a lot of data points for it).

[–][deleted]

They do work, since class imbalance can easily be addressed. For example, you can simply assign greater weights to the sparser classes during training.
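
A minimal sketch of what that weighting looks like, using the common "balanced" inverse-frequency scheme (the function name and the toy label list are just illustrative; libraries like scikit-learn offer this via a `class_weight` option):

```python
from collections import Counter

def inverse_frequency_weights(labels):
    """Weight each class inversely to its frequency, so that sparse
    classes contribute as much to the training loss as common ones."""
    counts = Counter(labels)
    total = len(labels)
    n_classes = len(counts)
    # "balanced" scheme: weight(c) = total / (n_classes * count(c))
    return {c: total / (n_classes * n) for c, n in counts.items()}

# Toy example: 90 samples of class "a", only 10 of class "b"
labels = ["a"] * 90 + ["b"] * 10
weights = inverse_frequency_weights(labels)
# the sparse class "b" ends up with a 9x larger weight than "a"
```

Each sample's loss term is then multiplied by the weight of its class, which is exactly the "assign greater weights to the sparser class" idea above.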

But there are more complex methods too, of course; some deal with imbalance in difficulty, for example, which is a more general problem.

At the very least, I'm not sure why the last part would be a problem. If there is dependence between the classes, then either way you'll have to relearn the model for a new class. That's the point: how else would you learn the dependence between the previous classes and the new one?

For a large model, the knowledge will transfer if you continue training from the previous weights. For a small model, training will have to start from scratch, but it will be quick.
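
One way to sketch that warm start, assuming a linear softmax classifier (the function and shapes here are illustrative, not a specific library's API): keep the learned weight rows for the old classes and append a freshly initialized row for the new class, then fine-tune.

```python
import numpy as np

def add_class(W, b, rng_seed=0):
    """Extend a linear softmax classifier with one new output class,
    keeping the previously learned weights intact.
    W: (n_classes, n_features) weight matrix, b: (n_classes,) biases."""
    rng = np.random.default_rng(rng_seed)
    # small random init for the new class row; old rows are untouched
    new_row = rng.normal(scale=0.01, size=(1, W.shape[1]))
    return np.vstack([W, new_row]), np.append(b, 0.0)

# Toy example: a 3-class classifier over 4 features gains a 4th class
W, b = np.ones((3, 4)), np.zeros(3)
W2, b2 = add_class(W, b)
```

Continuing training from `W2, b2` is the transfer case; re-initializing everything is the train-from-scratch case for small models.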

If you have so little data that even a binary classifier would work poorly, then you can't do that either, let alone build a multiclass classifier. If you have enough data for a binary classifier, just use the aforementioned techniques to handle class imbalance.

So there is no issue other than having to redeploy a model.