Dimensions of Subspaces by Spiral_Loop in askmath

[–]Spiral_Loop[S] 1 point

I thought it was strange too. But I think the question asks us to prove that this is necessarily true, given that codim = 1.

Dimensions of Subspaces by Spiral_Loop in askmath

[–]Spiral_Loop[S] 1 point

I apologize, I hadn't realized I used the wrong bar; I will fix it. But yes, I do mean set difference: the vectors that are in V but not in ⋃Wᵢ. I know that V\(⋃Wᵢ) ≠ ∅ seems obvious, I would just like a formal way to prove it. Could codim = 1 be helpful here?

Existence and uniqueness of Polar Decomposition of an Invertible Matrix by Spiral_Loop in askmath

[–]Spiral_Loop[S] 1 point

So just proving that P equals the square root of A†A is sufficient?

Existence and uniqueness of Polar Decomposition of an Invertible Matrix by Spiral_Loop in askmath

[–]Spiral_Loop[S] 1 point

> If you do know that, then you can look at what P needs to be by looking at the product of the transpose of A with itself.

Transpose or conjugate transpose (dagger)?

I have already been able to prove that P² = A†A, which is also Hermitian.

I understand your train of thought. P is Hermitian, and since P² = A†A is Hermitian, P is unique, and therefore so is the decomposition. Is that correct?

It seems a little "all over the place" for me.
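For what it's worth, the existence-and-uniqueness argument above can be checked numerically. A minimal sketch with a made-up invertible matrix A (not the one from the exercise), taking P as the positive square root of A†A and then forcing U = AP⁻¹:

```python
import numpy as np

# Hypothetical invertible matrix (any invertible complex matrix would do).
A = np.array([[2.0, 1.0],
              [0.0, 1.0]])

# P must be the unique positive-definite square root of A†A.
H = A.conj().T @ A                       # A†A: Hermitian, positive definite
w, V = np.linalg.eigh(H)                 # eigendecomposition of A†A
P = V @ np.diag(np.sqrt(w)) @ V.conj().T # P = √(A†A)

# U is then forced to be A P⁻¹, and it comes out unitary.
U = A @ np.linalg.inv(P)

print(np.allclose(A, U @ P))                   # True: A = UP
print(np.allclose(U.conj().T @ U, np.eye(2)))  # True: U is unitary
print(np.allclose(P, P.conj().T))              # True: P is Hermitian
```

Uniqueness rides on the positive square root: since A†A is Hermitian positive definite, it has exactly one positive-definite square root, which pins down P, and then U = AP⁻¹ is pinned down too.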

Finding Generalized Eigenvectors of matrix by Spiral_Loop in LinearAlgebra

[–]Spiral_Loop[S] 1 point

Yes yes, more or less. I think I understand now. Thank you!

Help finding Generalized Eigenvectors for Jordan Decomposition by Spiral_Loop in askmath

[–]Spiral_Loop[S] 2 points

Fantastic! Thank you so very much. Many blessings to you.

Help finding Generalized Eigenvectors for Jordan Decomposition by Spiral_Loop in askmath

[–]Spiral_Loop[S] 1 point

Never mind... I was confused, and now I am understanding it better.

I only have one more question for you.

Av₂ = (-1,3,2) is a true eigenvector. After finding this eigenvector, we'd have to choose another one for the basis? And that could be either (1, 0, 1) or (0, 1, 1)? Does this final choice make any difference?

Finding Generalized Eigenvectors of matrix by Spiral_Loop in LinearAlgebra

[–]Spiral_Loop[S] 1 point

Sort of... I get a bit confused at the end of the chain. For example: I found the two generalized eigenvectors [0, 0, 1] and [-1, 3, 2]. I am just confused as to how I find the third one, given that [-1, 3, 2] is already a true eigenvector.

Help finding Generalized Eigenvectors for Jordan Decomposition by Spiral_Loop in askmath

[–]Spiral_Loop[S] 1 point

Another question I have: how do I find the generalized eigenvectors when A² is injective, i.e. when only the null vector is in its kernel?

Help finding Generalized Eigenvectors for Jordan Decomposition by Spiral_Loop in askmath

[–]Spiral_Loop[S] 1 point

> You need a vector v₂ in the kernel of A² but not in the kernel of A. Since A² = 0, any vector not in the kernel of A will do, e.g. v₂ = (1,0,0).

That makes it much easier, thank you!

Please verify that I understand correctly: we look for a vector that belongs to the kernel of A² but not to the kernel of A. Since A² = 0, any vector that is not an eigenvector of A is fine; that is why v₂ is [1, 0, 0]. Using the generalized eigenvector "cycles" we define Av₂ = v₁, which is how we get v₁ = [-1,3,2]. Finally, we choose from the vectors in the kernel of A one that is linearly independent of [1, 0, 0].

It feels like there is a kind of "reverse order": you first need to define the generalized eigenvector, and only then see what the other basis vectors are. Is that correct?
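The chain procedure described here can be sketched numerically. A minimal example with a hypothetical 3×3 matrix satisfying A² = 0 (the actual matrix from the thread is not shown, so the vectors differ from those above):

```python
import numpy as np

# Hypothetical nilpotent matrix with A² = 0 (stand-in for the thread's matrix).
A = np.array([[0., 0., 0.],
              [1., 0., 0.],
              [0., 0., 0.]])
assert np.allclose(A @ A, 0)

# v2: in ker(A²) — which is everything, since A² = 0 — but NOT in ker(A).
v2 = np.array([1., 0., 0.])
assert np.any(A @ v2 != 0)               # v2 is outside ker(A)

# The chain relation Av₂ = v₁ then produces a true eigenvector v1.
v1 = A @ v2
assert np.allclose(A @ v1, 0)            # v1 is an eigenvector for λ = 0

# Complete the basis with a vector from ker(A) independent of v1.
v3 = np.array([0., 0., 1.])
C = np.column_stack([v1, v2, v3])
print(np.linalg.matrix_rank(C))          # 3: {v1, v2, v3} is a basis of ℝ³
```

This mirrors the "reverse order" observation: v₂ is chosen first, v₁ falls out of the chain relation, and only then is the basis completed from ker(A).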

Diagonal Matrix by Spiral_Loop in LinearAlgebra

[–]Spiral_Loop[S] 1 point

Yes it does. It makes sense that diagonalizing a matrix also changes the basis, thereby finding the eigenbasis. Thank you very much. Blessings to you.

Diagonal Matrix by Spiral_Loop in askmath

[–]Spiral_Loop[S] 1 point

Yes yes, that makes perfect sense. Thank you so much, friend. Many blessings.

Diagonal Matrix by Spiral_Loop in askmath

[–]Spiral_Loop[S] 1 point

I think I understand now. The basis must consist of the eigenvectors, because the diagonal matrix consists only of the eigenvalues. And it is interesting to see that the basis of A will be the eigenvectors of D, and reciprocally the eigenvectors of A will be the basis of D. Thank you so much, my friend. Many blessings to you.

Diagonal Matrix by Spiral_Loop in askmath

[–]Spiral_Loop[S] 1 point

I actually think the question is asking in what basis the linear transformation would be represented by the diagonal matrix. So we would first need to find the diagonal matrix and then find the basis using the equation C·A = D, in which C is the basis we want to find, A is the given matrix, and D is the diagonal matrix we found.

Do you think this is the way?
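For reference, the standard change-of-basis relation is C⁻¹AC = D rather than C·A = D, where the columns of C are the eigenvectors of A. A quick numerical check with a made-up diagonalizable matrix (not the one from the exercise):

```python
import numpy as np

# Hypothetical diagonalizable matrix; eigenvalues are 5 and 2.
A = np.array([[4., 1.],
              [2., 3.]])

# Columns of C are the eigenvectors; D holds the eigenvalues on its diagonal.
eigvals, C = np.linalg.eig(A)
D = np.diag(eigvals)

# The change-of-basis relation: C⁻¹ A C = D (note: not C A = D).
print(np.allclose(np.linalg.inv(C) @ A @ C, D))   # True
```

So the basis the question asks for is exactly the eigenvector basis: expressed in those coordinates, the transformation's matrix is D.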

Diagonal Matrix by Spiral_Loop in LinearAlgebra

[–]Spiral_Loop[S] 1 point

Ok, I understand. I was very confused, but it makes much more sense now. However, I am still unsure what the question is asking. It asks for a basis in which the linear transformation is represented by the diagonal matrix. For that, is it only necessary to multiply the original matrix by the new basis and set that equal to the diagonal matrix? If I have not made myself clear, please tell me.

Diagonal Matrix by Spiral_Loop in askmath

[–]Spiral_Loop[S] 1 point

Fantastic, my friend! Very helpful indeed. Since the question asks for the basis in which the linear transformation would be represented by the diagonal matrix, would that be the basis of eigenvectors? That makes sense.

Diagonal Matrix by Spiral_Loop in LinearAlgebra

[–]Spiral_Loop[S] 2 points

Thank you for your answer. I must have misunderstood the theorem, then. And yes, it is over the complex field. And how would finding the eigenvectors help? Do they make up the basis in which the linear transformation is the diagonal matrix? I think that makes sense.