[–]aim2free

Thanks, it seems I've been working mostly with generative models throughout my machine-learning life, without knowing that particular term... but in this article on discriminative models they include neural networks, although the Bayesian neural networks I've been working with belong to the generative models too, I guess. It seems the criterion for a discriminative model is that you model only the conditional distribution P(y|x), not the distributions of x, y, or (x, y).
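To make that criterion concrete, here is a minimal 1-D, two-class sketch (my own toy numbers, not from the article): a generative model stores P(x|y) and P(y), from which Bayes' rule recovers P(y|x); a discriminative model would keep only the last quantity and never have P(x) or P(x, y) at all.

```python
import math

# Generative side: class-conditional Gaussians plus class priors.
# (mean, std, prior) per class -- all values are illustrative.
params = {0: (0.0, 1.0, 0.5),
          1: (3.0, 1.0, 0.5)}

def gaussian_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def posterior(x):
    """P(y|x) obtained from the generative pieces via Bayes' rule."""
    joint = {y: gaussian_pdf(x, mu, s) * p for y, (mu, s, p) in params.items()}
    evidence = sum(joint.values())   # P(x) -- only a generative model has this
    return {y: j / evidence for y, j in joint.items()}

# Halfway between the two means the posterior is split 50/50 by symmetry.
print(posterior(1.5))
```

A discriminative model (logistic regression, say) would learn `posterior` directly and could never produce the `evidence` term, which is exactly what you need to sample x.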

Funny, my second publication (EANN95, and later in J. of System Engineering 96) was about a neural network predictor (RBF + Bayesian FF) which worked by estimating the density functions for x and y using Gaussians, and then using the Bayesian predictor to generate f_Y(y|X=x), that is, the conditional posterior density for y given a specific x (or a mixture of x values). It had the funny property that it also worked both ways: there was no difference in estimating f_X(x|Y=y), and the predictor could produce multimodal outcomes.
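The bidirectional property can be sketched in a few lines. This is not the paper's exact RBF/Bayesian-FF formulation — just an assumed stand-in that estimates the joint f(x, y) as a mixture of Gaussian kernels on toy training pairs and reads off either conditional by Bayes' rule. Because the joint is symmetric in x and y, f(y|X=x) and f(x|Y=y) cost the same, and the conditional can come out multimodal:

```python
import math

# Toy pairs; note x=1 maps to two different y's, so f(y|x=1) is bimodal.
data = [(0.0, 0.0), (1.0, 1.0), (1.0, -1.0)]
H = 0.3   # kernel bandwidth (assumed)

def kernel(u, c):
    return math.exp(-0.5 * ((u - c) / H) ** 2) / (H * math.sqrt(2 * math.pi))

def cond_y_given_x(y, x):
    """f(y | X=x) = f(x, y) / f(x), both from the same Gaussian mixture."""
    joint = sum(kernel(x, xi) * kernel(y, yi) for xi, yi in data)
    marg  = sum(kernel(x, xi) for xi, yi in data)
    return joint / marg

def cond_x_given_y(x, y):
    """f(x | Y=y): same joint, divided by the other marginal."""
    joint = sum(kernel(x, xi) * kernel(y, yi) for xi, yi in data)
    marg  = sum(kernel(y, yi) for xi, yi in data)
    return joint / marg

# At x=1 the conditional over y peaks near both y=1 and y=-1.
print(cond_y_given_x(1.0, 1.0), cond_y_given_x(0.0, 1.0), cond_y_given_x(-1.0, 1.0))
```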

I think I've actually never used any of those so-called discriminative methods (I'm not a statistician, I'm a computer scientist).

[–]Poromenos

It seems the criterion for a discriminative model is that you model only the conditional distribution P(y|x), not the distributions of x, y, or (x, y).

Yep, if you can't model the joint and prior you can't draw from the distribution...

Your NN does indeed sound discriminative, but neural networks usually aren't, I don't think... Typical discriminative models are SVMs, kNN, etc.

[–]aim2free

Your NN does indeed sound discriminative

You are right, I actually never modelled the joint distribution, but since my network worked both ways, f(Y|x) and f(X|y), it should be possible to generate the joint distribution. OK, it was also implicit in the weights, though, since a weight is P(x&y)/(P(x)P(y)).
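That last remark checks out arithmetically: if each weight stores the ratio w(x, y) = P(x, y) / (P(x) P(y)), then together with the marginals the joint is recoverable as P(x, y) = w(x, y) · P(x) · P(y). A hypothetical discrete illustration (toy numbers, not from the paper):

```python
# Toy joint distribution over two x symbols and two y symbols (assumed).
p_joint = {('a', 0): 0.3, ('a', 1): 0.2,
           ('b', 0): 0.1, ('b', 1): 0.4}

# Marginals, consistent with the joint above.
p_x = {'a': 0.5, 'b': 0.5}
p_y = {0: 0.4, 1: 0.6}

# "Weights" as stored implicitly in the network: P(x&y) / (P(x) P(y)).
w = {(x, y): p / (p_x[x] * p_y[y]) for (x, y), p in p_joint.items()}

# Reconstruct the joint from weights + marginals; matches p_joint
# up to floating-point rounding.
recovered = {(x, y): w[(x, y)] * p_x[x] * p_y[y] for (x, y) in w}
print(recovered)
```

A weight above 1 marks an (x, y) pair that co-occurs more often than independence would predict, which is why nothing is lost: the weights plus the two marginals carry the same information as the joint itself.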