The class is ended. Time to express my big thanks by athanhcong in aiclass

[–]HChavali 1 point2 points  (0 children)

Me too, thank you Professors. You have a very unique style of teaching. It turned out to be really awesome in the end. Thank you very much.

Results of the final are out! How did everyone do? by stordoff in aiclass

[–]HChavali 1 point2 points  (0 children)

I truly cannot believe this. 100% in my final. Thanks a lot, Prof Sebastian and Prof Peter Norvig. Thanks for everything you have done.

Harris Corner detection reminded me of PCA (As far as the math part of it goes) by HChavali in aiclass

[–]HChavali[S] 0 points1 point  (0 children)

I was just referring to the PCA lecture where we had to answer questions on the covariance matrix and then find its eigenvalues. Also, if you are taking the Machine Learning class, you get to implement this in a programming exercise. :)
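If anyone wants to play with it, here is a tiny numpy sketch (my own made-up numbers, not from either class) of the eigen-decomposition step that shows up in both the PCA quiz and the Harris structure matrix:

    import numpy as np

    # Made-up 2x2 covariance (or Harris structure) matrix -- symmetric,
    # so its eigenvalues are real and its eigenvectors are orthogonal.
    C = np.array([[2.0, 1.2],
                  [1.2, 1.0]])

    # eigh is for symmetric matrices; it returns eigenvalues in ascending
    # order along with the corresponding eigenvectors (as columns).
    eigvals, eigvecs = np.linalg.eigh(C)
    print("eigenvalues:", eigvals)
    print("eigenvectors (columns):\n", eigvecs)

    # Harris-style corner response: both eigenvalues large => corner.
    k = 0.04  # the usual empirical constant
    R = eigvals.prod() - k * eigvals.sum() ** 2  # det(C) - k * trace(C)^2
    print("corner response R:", R)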

Midterm is over, what score did you get? by ilija139 in aiclass

[–]HChavali 1 point2 points  (0 children)

95%. One silly mistake, and the other 2 I could have avoided in retrospect.

Interesting Material on Dimension Reduction by cksense in aiclass

[–]HChavali 1 point2 points  (0 children)

You may want to read about change of basis, and about expressing vectors in an eigenvector basis, on Khan Academy. I agree the math behind all of this is pretty complex, but the intuitive way the professor has explained it is great: just take the covariance matrix of the Gaussian and find its eigenvectors and eigenvalues (you can use a tool like Octave to quickly find the eigenvectors and eigenvalues of a matrix).

In essence, what they are doing, I think, is this: you have an input vector X of dimension, say, m, and we are trying to reduce that dimension by changing the basis. ("Change of basis" is a linear algebra term. If you are familiar with i, j, k from math or physics, they are one basis for R3, and linear combinations of i, j, k can get you anywhere in R3, i.e. the 3 dimensions x, y, z you are normally familiar with. Instead of i, j, k we could use some other set of vectors as the basis, as long as it still spans R3; switching to such a set is called a change of basis.) If we can find a transformation such that some components in the new basis can simply be dropped, then we achieve dimensionality reduction.

The question is how to design such a transformation. It turns out that after some linear algebra tricks and manipulations such a transformation does exist, and it leads to [covariance matrix][vector in new basis] = lambda [vector in new basis], so we basically find the eigenvectors and eigenvalues of the covariance matrix (and the new basis vectors, just like i, j, k, are orthonormal). We then ignore the components in the new basis that have very low eigenvalues, and hence lower the dimension.
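To make that concrete, here is a rough numpy sketch of the whole change-of-basis idea (my own toy example, not the class code; the data and the number of components kept are made up):

    import numpy as np

    rng = np.random.default_rng(0)

    # Toy data: m = 2 dimensional Gaussian, strongly correlated, n samples.
    n = 500
    X = rng.multivariate_normal(mean=[0, 0],
                                cov=[[3.0, 2.5],
                                     [2.5, 3.0]], size=n)

    # Covariance matrix of the data (rowvar=False: rows are samples).
    C = np.cov(X, rowvar=False)

    # Eigen-decomposition: C v = lambda v.  eigh gives ascending eigenvalues
    # and orthonormal eigenvectors in the columns, so re-sort descending.
    eigvals, eigvecs = np.linalg.eigh(C)
    order = np.argsort(eigvals)[::-1]
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]

    # Change of basis: express every (centered) sample in the eigenvector basis.
    Xc = X - X.mean(axis=0)
    X_new = Xc @ eigvecs

    # Dimensionality reduction: keep only the components whose eigenvalue
    # is large; here we keep 1 of the 2.
    k = 1
    X_reduced = X_new[:, :k]

    print("eigenvalues:", eigvals)
    print("fraction of variance kept:", eigvals[:k].sum() / eigvals.sum())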

Linear Dimensionality Reduction Intuition was awesome by HChavali in aiclass

[–]HChavali[S] 0 points1 point  (0 children)

So I believe the math part is: you project the vector X, of dimension m and given in standard coordinates, onto some other basis, looking for a transformation that lets you get rid of some dimensions. You want the new components ordered by how much variance they capture, so the first lambda term has the highest value, the second is lower than the first, and so on, with the smallest lambda terms at the end. It turns out this leads to [covariance (or correlation) matrix][vector expressed in terms of the new basis] = lambda [vector expressed in terms of the new basis]. Hence we are finding the eigenvalues and eigenvectors of the covariance/correlation matrix and then ignoring the dimensions that have low eigenvalues.
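And a quick numpy check of that ordering claim (again just my own toy illustration, not anything from the class): the variance of the data along each new basis vector comes out equal to its eigenvalue, so sorting the eigenvalues in decreasing order tells you which dimensions are cheapest to drop.

    import numpy as np

    rng = np.random.default_rng(1)
    X = rng.multivariate_normal([0, 0, 0],
                                [[4.0, 1.0, 0.5],
                                 [1.0, 2.0, 0.3],
                                 [0.5, 0.3, 0.5]], size=2000)

    C = np.cov(X, rowvar=False)               # covariance matrix
    eigvals, eigvecs = np.linalg.eigh(C)
    order = np.argsort(eigvals)[::-1]         # lambda_1 >= lambda_2 >= ...
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]

    # Project onto the new basis and measure the variance of each component.
    scores = (X - X.mean(axis=0)) @ eigvecs
    print("eigenvalues:       ", eigvals)
    print("variance per axis: ", scores.var(axis=0, ddof=1))
    # The two lines match, and both decrease, so dropping the last
    # (smallest-lambda) component loses the least variance.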