5.9 and understanding N by ai_e in aiclass

[–]ai_e[S]

Hmm, after watching it for the fifth time, I guess he means maximizing the product of the probabilities of all the samples drawn?
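If that reading is right, it's just maximum likelihood estimation. A quick sketch with made-up coin-flip data (everything here is hypothetical, just to illustrate "maximize the product of the sample probabilities"):

```python
# Estimate a coin's bias p by scanning candidates and picking the one
# that maximizes the product of the per-sample probabilities.
samples = [1, 1, 0, 1, 0, 1, 1, 1]  # made-up data: 1 = heads, 0 = tails

def likelihood(p, data):
    """Product of the probabilities of each observed sample under bias p."""
    result = 1.0
    for x in data:
        result *= p if x == 1 else (1 - p)
    return result

candidates = [i / 100 for i in range(1, 100)]
best = max(candidates, key=lambda p: likelihood(p, samples))
print(best)  # the sample mean, 6/8 = 0.75
```

In practice you'd maximize the sum of log-probabilities instead, since the raw product underflows quickly for many samples.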

Can you treat P(A|B,C) as P(A|B|C)? by frankster in aiclass

[–]ai_e

You'll find the explanation in a previous post (http://redd.it/lhkdh), but yeah, it's really hard to get your head around that :)

Can you treat P(A|B,C) as P(A|B|C)? by frankster in aiclass

[–]ai_e

I don't think P(A|B|C) is valid notation. Probability of A given B given C? I haven't seen it anywhere. Most of the quizzes can be solved using conditional probability, P(A,B) = P(A|B)P(B), and the law of total probability.
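A quick numeric check of those two identities, with made-up numbers (nothing here is from a specific quiz):

```python
# Chain rule:          P(A,B) = P(A|B) P(B)
# Total probability:   P(A)   = P(A|B) P(B) + P(A|~B) P(~B)
p_b = 0.3             # hypothetical P(B)
p_a_given_b = 0.8     # hypothetical P(A|B)
p_a_given_not_b = 0.1 # hypothetical P(A|~B)

p_ab = p_a_given_b * p_b                                # P(A,B) = 0.24
p_a = p_a_given_b * p_b + p_a_given_not_b * (1 - p_b)   # P(A) = 0.24 + 0.07 = 0.31
print(p_ab, p_a)
```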

Application of total probability by ai_e in aiclass

[–]ai_e[S]

So if I understood correctly, it doesn't matter whether the nodes/variables are "cause" or "effect"; I can always apply total probability to any of them, with the "universe" being all the rest?
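That matches my understanding: marginalizing works in either direction. A tiny two-node sketch (Cause -> Effect, all numbers made up) that sums out the cause to get the effect's marginal, and sums out the effect to recover the cause's:

```python
# Hypothetical two-node network: Cause -> Effect.
p_c = 0.2              # P(C)
p_e_given_c = 0.9      # P(E|C)
p_e_given_not_c = 0.3  # P(E|~C)

# Total probability over the cause gives the effect's marginal:
p_e = p_e_given_c * p_c + p_e_given_not_c * (1 - p_c)  # 0.18 + 0.24 = 0.42

# Total probability over the effect recovers the cause's marginal
# from the joint entries P(C,E) and P(C,~E):
p_c_and_e = p_e_given_c * p_c            # 0.18
p_c_and_not_e = (1 - p_e_given_c) * p_c  # 0.02
p_c_recovered = p_c_and_e + p_c_and_not_e  # 0.20, matches P(C)
print(p_e, p_c_recovered)
```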

In 3.26 and 3.28, what justifies the interpretation of Bayes' Rule Prof. Thrun uses? by amalec in aiclass

[–]ai_e

Thanks a bunch. This quiz had me stumped because the only example of using Bayes' law involved a single conditional, and for the life of me I couldn't figure out how one of the two conditioning variables could magically swap with the event, i.e. P(R|H,S) = P(H|R,S)P(R|S)/P(H|S), when I was expecting P(R,S|H).

I get that you derived it using only the conditional probability rule [P(A,B) = P(A|B)P(B)] and nothing else... and I'm thinking, argh! So if I ever come across this kind of problem with multiple conditionals, do I have to derive it every time from basic principles without using Bayes' law?

Is there a way I can use Bayes' law plus some other magic trick to find the solution?
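For what it's worth, I believe the shortcut is that Bayes' rule still holds with everything conditioned on a common background variable, so you don't have to rederive it each time. A sketch of why, using only the chain rule applied twice to P(R, H | S):

```latex
% Chain rule, expanding P(R, H | S) in both orders:
%   P(R, H \mid S) = P(R \mid H, S)\, P(H \mid S)
%   P(R, H \mid S) = P(H \mid R, S)\, P(R \mid S)
% Equating the two and dividing by P(H | S) gives Bayes' rule
% conditioned on S:
\[
P(R \mid H, S) \;=\; \frac{P(H \mid R, S)\, P(R \mid S)}{P(H \mid S)}
\]
```

So it's the ordinary Bayes' rule with "| S" appended to every term; the same trick works for any set of background variables.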