Talk:Eigendecomposition of a matrix
Not every basis is orthogonal
[edit]"Any eigenvector basis for a real symmetric matrix is orthogonal" --- really? The zero matrix is symmetric, and all vectors are its eigenvectors. Yes, the orthogonal basis can be chosen; but the given formulation is inaccurate. Boris Tsirelson (talk) 20:06, 15 April 2009 (UTC)
I revert the edit of 143.167.67.175. It is far from enough to require that the matrix is not zero; the zero matrix is only the simplest example. Please do not add statements unless you find them in a source. Boris Tsirelson (talk) 15:16, 21 April 2009 (UTC)
- Just out of interest, can you specify a counterexample? Oli Filth(talk|contribs) 16:59, 21 April 2009 (UTC)
- Sure. It is a matter of multiplicity. If at least one eigenvalue is of multiplicity 2 or more then we can choose non-orthogonal basis vectors from the corresponding eigenspace. Boris Tsirelson (talk) 17:39, 21 April 2009 (UTC)
- Then can we add in parentheses: (If there are n distinct eigenvalues, the eigenvectors must be orthogonal.)? 132.67.97.20 (talk) 13:39, 28 November 2010 (UTC)
The eigenspaces are orthogonal. If they are all one-dimensional, then the corresponding eigenvectors are orthogonal. LMSchmitt 21:25, 26 May 2021 (UTC)
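For concreteness, a minimal numpy sketch of the multiplicity point (the matrix diag(1, 1, 2) is a toy example of my own, not taken from the article):

<syntaxhighlight lang="python">
import numpy as np

# Symmetric matrix whose eigenvalue 1 has multiplicity 2.
A = np.diag([1.0, 1.0, 2.0])

v1 = np.array([1.0, 0.0, 0.0])  # eigenvector for eigenvalue 1
v2 = np.array([1.0, 1.0, 0.0])  # also an eigenvector for eigenvalue 1
v3 = np.array([0.0, 0.0, 1.0])  # eigenvector for eigenvalue 2

# {v1, v2, v3} is a basis of eigenvectors of A ...
print(np.allclose(A @ v1, v1), np.allclose(A @ v2, v2), np.allclose(A @ v3, 2 * v3))
# ... but it is not orthogonal:
print(v1 @ v2)  # 1.0, not 0
</syntaxhighlight>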
Regarding "Eigendecomposition of a matrix"
Is the beginning of the "Eigendecomposition of a matrix" paragraph correct?
I tend to think it should be phrased
- "Let A be a square (N×N) matrix with N linearly independent
eigenvectors, q_icolumns, a_i \,\, (i = 1, \dots, N). Then A can be factorized as \mathbf{A}=\mathbf{Q}\mathbf{\Lambda}\mathbf{Q}^{-1} where Q is the square (N×N) matrix whose ith column is the eigenvector q_i of A ..."
Chlopisko (talk) 04:52, 12 July 2012 (UTC)
- Correct. N linearly independent columns is merely equivalent to A being invertible, which is insufficient to make A diagonalizable (for example, a defective matrix such as \begin{pmatrix} 1 & 1 \\ 0 & 1 \end{pmatrix} is invertible but not diagonalizable). N linearly independent eigenvectors, however, means that A cannot have any strictly generalized eigenvectors, so its Jordan form has to be diagonal. I'll change it. 68.102.164.94 (talk) 03:17, 2 October 2013 (UTC)
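A quick numerical illustration of both halves of this (the 2×2 matrices below are my own examples):

<syntaxhighlight lang="python">
import numpy as np

# Diagonalizable case: A is reconstructed from its eigendecomposition.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
lam, Q = np.linalg.eig(A)                    # columns of Q are eigenvectors
print(np.allclose(A, Q @ np.diag(lam) @ np.linalg.inv(Q)))  # True

# Defective (but invertible) case: only one independent eigenvector,
# so Q is numerically singular and A = Q Lambda Q^{-1} is impossible.
D = np.array([[1.0, 1.0],
              [0.0, 1.0]])
lam_d, Q_d = np.linalg.eig(D)
print(np.linalg.matrix_rank(Q_d))            # 1: eigenvector columns are dependent
</syntaxhighlight>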
Confusion about (in?) Eigendecomposition of a matrix: Example
I am trying to understand this topic/example. In the text, Λ (the matrix with eigenvalues on the diagonal) is surrounded by Q and Q−1, while A is on the other side of the equation. But in the example, it is the other way around. What am I missing? Or what is the article missing? Orehet (talk) 11:24, 13 November 2013 (UTC)
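The two arrangements are the same identity read in opposite directions: multiplying A = QΛQ−1 by Q−1 on the left and by Q on the right gives Λ = Q−1AQ. A small numpy check (the matrix is an arbitrary diagonalizable example of my own):

<syntaxhighlight lang="python">
import numpy as np

A = np.array([[1.0, 0.0],
              [1.0, 3.0]])     # any diagonalizable matrix will do
lam, Q = np.linalg.eig(A)
Qinv = np.linalg.inv(Q)

print(np.allclose(A, Q @ np.diag(lam) @ Qinv))    # A = Q Lambda Q^{-1}
print(np.allclose(np.diag(lam), Qinv @ A @ Q))    # Lambda = Q^{-1} A Q
</syntaxhighlight>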
Functional Calculus and orthogonality condition for Q
In the section "Eigendecomposition of a matrix" it is mentioned that the eigenvectors are usually normalized, but they need not be. While that's true, we should note that for the formulas in the "Functional calculus" section to hold, we need Q to be an orthogonal matrix. I just found that this is never mentioned in the text, which could be misleading. 136.159.49.121 (talk) 21:24, 2 December 2015 (UTC)
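A sketch of the point (my own 2×2 example): for diagonalizable A, f(A) = Q f(Λ) Q−1 holds for any valid Q, but replacing Q−1 by QT is only legitimate when Q has orthonormal columns.

<syntaxhighlight lang="python">
import numpy as np

# Symmetric A: eigh returns an orthonormal eigenvector matrix Q.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
lam, Q = np.linalg.eigh(A)

expA = Q @ np.diag(np.exp(lam)) @ np.linalg.inv(Q)          # f(A) = Q f(Lambda) Q^{-1}
print(np.allclose(expA, Q @ np.diag(np.exp(lam)) @ Q.T))    # True: Q is orthogonal

# Rescale one eigenvector: still a valid eigendecomposition with Q^{-1} ...
Q2 = Q.copy()
Q2[:, 0] *= 3.0
print(np.allclose(A, Q2 @ np.diag(lam) @ np.linalg.inv(Q2)))   # True
# ... but Q2 is no longer orthogonal, so the transpose shortcut breaks:
print(np.allclose(expA, Q2 @ np.diag(np.exp(lam)) @ Q2.T))     # False
</syntaxhighlight>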
Example correction
Shouldn't we add the conditions c ≠ 0 and d ≠ 0 in the Example? Pokyrek 8. March 2016 —Preceding undated comment added 07:37, 8 March 2016 (UTC)
Conclusion from Eigenproblem to matrix decomposition is missing parts
[edit]"The decomposition can be derived from the fundamental property of eigenvectors:" This section needs to be improved with the matrix diagonalization theorem for which even no wikipedia article exists. Without this theorem the result is plainly wrong, since the vectors may not be linear independent etc. Possible references/how to fix this: https://nlp.stanford.edu/IR-book/html/htmledition/matrix-decompositions-1.html [Just using/stating the theorem] http://mathworld.wolfram.com/EigenDecompositionTheorem.html
Proofs may not be feasible in wikipedia though.
One can at least link to the spectral theorem, giving a hint that this does not hold in general, since this may be confusing. https://wiki.riteme.site/wiki/Spectral_theorem — Preceding unsigned comment added by 134.61.83.68 (talk) 13:45, 4 January 2018 (UTC)
Only v1*Bv2 = 0, or also more generally vi*Bvj = 0?
The article contains the text:
"If A and B are both symmetric or Hermitian, and B is also a positive-definite matrix, the eigenvalues λi are real and eigenvectors v1 and v2 with distinct eigenvalues are B-orthogonal (v1*Bv2 = 0).[1]"
This way of writing suggests that something is stated only about v1 and v2.
I presume the statement may be true for vi and vj with i and j more general (but still with distinct eigenvalues).
If my understanding is correct, I propose the text be changed in a way to reflect this greater generality more clearly. Redav (talk) 20:04, 26 May 2021 (UTC)
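For what it's worth, scipy's generalized eigensolver illustrates the more general reading (A and B below are random matrices of my own construction):

<syntaxhighlight lang="python">
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(0)
M = rng.standard_normal((4, 4))
A = M + M.T                      # symmetric
B = M @ M.T + 4 * np.eye(4)      # symmetric positive-definite

lam, V = eigh(A, B)              # generalized problem A v = lambda B v
print(lam)                       # eigenvalues are real
# V^T B V is diagonal (the identity, by scipy's normalization), i.e.
# v_i^* B v_j = 0 for all i != j, not just for v1 and v2:
print(np.allclose(V.T @ B @ V, np.eye(4)))   # True
</syntaxhighlight>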
References
- ^ Parlett, Beresford N. (1998). The symmetric eigenvalue problem (Reprint. ed.). Philadelphia: Society for Industrial and Applied Mathematics. p. 345. doi:10.1137/1.9781611971163. ISBN 978-0-89871-402-9.
Regarding the section titled "Useful Facts"
This section title doesn't seem very encyclopedic in my opinion. Would a better name be "Properties of Eigendecompositions"? Nbennett320 (talk) 02:45, 3 November 2021 (UTC)
Normal Matrices
The discussion on unitary and normal matrices contains the line "If A is restricted to be a Hermitian matrix (A = A*)", which implies a Hermitian matrix is equal to its own conjugate and not to its own conjugate transpose. This would limit Hermitian matrices to having only real-valued entries and, arguably worse, any matrix at all with only real-valued entries would then be Hermitian, i.e. it implies a matrix need not even be square to be Hermitian.
I've never edited a wikipedia page in my life and am not starting now but stopped by the talk page to say that this warrants a correction so somebody else can do the honors. 2605:6440:300A:3003:0:0:0:5732 (talk) 06:52, 15 September 2023 (UTC)
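For reference, a quick numpy check of the two readings (the example matrix is mine), assuming A* denotes the conjugate transpose, as is standard:

<syntaxhighlight lang="python">
import numpy as np

# A Hermitian matrix with genuinely complex entries: it equals its
# conjugate TRANSPOSE, not its plain (element-wise) conjugate.
A = np.array([[2.0 + 0j, 1.0 - 1j],
              [1.0 + 1j, 3.0 + 0j]])

print(np.allclose(A, A.conj().T))   # True: Hermitian, A = A*
print(np.allclose(A, A.conj()))     # False: not equal to its mere conjugate
print(np.linalg.eigvalsh(A))        # real eigenvalues, as expected
</syntaxhighlight>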
Section "Useful Facts": eigenvalues
It must be added that not all of this holds for complex matrices. E.g. "The product of the eigenvalues is equal to the determinant of A" is wrong for a unitary matrix whose determinant is complex (but has modulus 1). 2A02:1210:2E1A:500:ED3C:6C13:6A21:383 (talk) 15:21, 3 August 2024 (UTC)
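A quick numerical check one can run on this claim (the random unitary matrix below is my own construction, via QR of a complex Gaussian matrix):

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(1)
Z = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
U, _ = np.linalg.qr(Z)               # U is unitary

lam = np.linalg.eigvals(U)
d = np.linalg.det(U)
print(d, abs(d))                     # complex determinant with modulus 1
print(np.allclose(np.prod(lam), d))  # compare product of eigenvalues with det(U)
</syntaxhighlight>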