Talk:Eigenvalue algorithm


Merge

As a learner, I think it is a good idea to merge them. On the other hand, the "algorithm" is not really an algorithm; it would be better rephrased as "calculation of eigenvalues"... —The preceding unsigned comment was added by 202.81.251.173 (talk) 09:22, 17 November 2006

I agree. I just think there should be a way to merge the discussions of a merger as well; it seems like an obvious thing to overlook. --ANONYMOUS COWARD0xC0DE 07:44, 25 November 2006 (UTC)

As per my arrogant declaration, all further discussion regarding this merge shall take place on Talk:Symbolic_computation_of_matrix_eigenvalues, simply because more comments have been made there. --ANONYMOUS COWARD0xC0DE 08:06, 23 December 2006 (UTC)

Character set

There are characters used on this page that my browser (Firefox 1.5) does not display. Shouldn't we be sticking to standard character sets? --jdege 11:37, 25 April 2007 (UTC)

Computational complexity?

What is the computational complexity (in the big-O sense) of these eigenvalue algorithms? —Ben FrantzDale (talk) 18:10, 6 December 2008 (UTC)

Eigenvalues of a Symmetric 3x3 Matrix

Hi, does anyone know a derivation (or a proof) for the symmetric 3x3 algorithm? (perhaps the source where it came from) And I would suggest to write instead of , so it works even for eye-matrices. --134.102.204.123 (talk) 11:01, 28 July 2009 (UTC)

If someone is interested, I found a derivation here: [1] --134.102.204.123 (talk) 15:55, 28 July 2009 (UTC)

Some questions/suggestions about the algorithm:

  1. afaik, acos will never result in values < 0, so the if-clause is redundant.
  2. For the Matlab implementation one can write p=norm(K,'fro')/6; instead of the ridiculous loop (alternatively: p=sum(K(:).^2)/6). A NumPy equivalent is sketched after this comment.
  3. Is the Python code really significant? I hardly see a difference from the Matlab implementation (except for syntax).
  4. Why wouldn't this algorithm work for singular matrices? The only problem I see would be the case of a scaled unit-matrix (making K=zeros(3) and p=0).

--92.76.248.104 (talk) 04:57, 11 June 2012 (UTC)
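
A minimal NumPy sketch of the loop-free computation suggested in point 2. It assumes K denotes the shifted matrix A - q*I with q = trace(A)/3, as in the Matlab snippet under discussion, and (as the reply below also notes) the Frobenius norm is itself a square root, so it has to be squared to reproduce sum(K(:).^2)/6:

    import numpy as np

    def p_without_loop(A):
        """Scale factor p = (sum of squared entries of K) / 6, computed
        without an explicit loop. K = A - q*I and q = trace(A)/3 are
        assumptions taken from the Matlab code discussed above."""
        q = np.trace(A) / 3.0
        K = A - q * np.eye(A.shape[0])
        # norm(K, 'fro') is sqrt(sum of squares), hence the square below.
        return np.linalg.norm(K, 'fro') ** 2 / 6.0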

  1. You are correct. I suspect that was a holdover from Oliver Smith's version of the algorithm, which relied on atan, not acos. What is missing from this version is the abs(q) vs p^(3/2) comparison: if q < -p^(3/2) then phi = PI/3, and if q > p^(3/2) then phi = 0 (a short code sketch of this clamping follows this list).
  2. The purpose of these is to show people what the algorithm needs to be in general, not to give a Matlab-specific version. Therefore spelling out how p is to be calculated is a better choice than using a platform-specific shortcut whose meaning is not easily deduced by those unfamiliar with Matlab. (Besides, the Frobenius norm gives the square root of the sum of squares, not the sum itself.)
  3. I agree. It would be better if both the Matlab and the Python versions were replaced with a single generic pseudocode.
  4. Someone was sloppy in their remarks. The algorithm is obviously singular when p = 0, which occurs iff all three eigenvalues are equal, in which case M must be a multiple of I. The only other issue with the algorithm occurs when abs(q) << p^(3/2) (in which case K is approximately singular, not M). In this situation, the calculation of q is numerically unstable: because of cancellation, q will have significantly less accuracy than the entries of M, and that loss of accuracy carries over to the eigenvalues calculated from q. This may be what the author of the comment was referring to. — Preceding unsigned comment added by 68.102.164.99 (talk) 02:15, 21 June 2012 (UTC)
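
A minimal Python sketch of the clamping described in reply 1, using the p/q notation from that reply with p^(3/2) as the comparison threshold. This is an illustration of the idea, not the article's exact code:

    import math

    def phi_clamped(p, q):
        """Angle phi for the trigonometric eigenvalue formula, with the
        acos argument clamped as described in reply 1: q is compared
        against p^(3/2) so acos never sees a value outside [-1, 1]."""
        threshold = p ** 1.5  # p^(3/2)
        if q <= -threshold:
            return math.pi / 3.0   # acos(-1)/3
        if q >= threshold:
            return 0.0             # acos(1)/3
        return math.acos(q / threshold) / 3.0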

Corrected a mistake in the algorithm. It said B=A/p-q*I, but it should read B=(1/p)*(A-q*I). Checked it myself when implementing it in Matlab; the corrected version yields the right solution. Also slightly changed the statements above (it used to say B=pA+qI, but it is the other way around: A=pB+qI). The value of p using this form also changes accordingly. Now everything makes sense, and the statement and algorithm are easily seen to correspond. — Preceding unsigned comment added by 190.158.28.20 (talk) 04:46, 19 September 2012 (UTC)
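
A quick numerical check of the corrected relation (illustrative NumPy only; the symmetric matrix A below is an arbitrary example of my own, not one from the article): B = (A - q*I)/p is equivalent to A = p*B + q*I, so eig(A) = p*eig(B) + q for any nonzero p.

    import numpy as np

    # Arbitrary symmetric 3x3 example (not from the article).
    A = np.array([[4.0, 1.0, 2.0],
                  [1.0, 3.0, 0.5],
                  [2.0, 0.5, 5.0]])

    q = np.trace(A) / 3.0
    p = np.sqrt(np.sum((A - q * np.eye(3)) ** 2) / 6.0)  # any nonzero p would do
    B = (A - q * np.eye(3)) / p                          # corrected form B = (1/p)*(A - q*I)

    # eig(A) = p*eig(B) + q, consistent with A = p*B + q*I.
    assert np.allclose(np.linalg.eigvalsh(A), p * np.linalg.eigvalsh(B) + q)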

QR algorithm redundancy

In section "Advanced methods" the QR algorithm is mentioned two times. Is that how it should be?

Mortense (talk) 19:40, 12 October 2009 (UTC)

The section "Power iteration" is obviously wrong

If the eigenvalues are complex, there is no chance that the "Power iteration algorithm" will converge to an eigenvector. I am making an obvious correction now (by saying the algorithm converges *on the condition* that there is a dominant eigenvalue). This is a very major issue: for a matrix with real coefficients, if the eigenvalue of largest |λ| is complex, then its complex conjugate is necessarily also an eigenvalue; the conjugate clearly has the same |λ|, so the "power iteration" algorithm is going to fail for this completely generic possibility.

This section needs considerably more work! Tition1 (talk) 01:22, 12 September 2011 (UTC)

(Note that in practice the case of two conjugate eigenvalues is easily handled: e.g., if you take two subsequent iterations, the vectors generally span the two conjugate eigenvectors, so you can solve a small 2×2 eigenproblem with them to get the two eigenvalues and eigenvectors. — Steven G. Johnson (talk) 01:16, 27 October 2011 (UTC))
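
To illustrate the fix described in the note above (a sketch of my own, not code from the article): after a number of power-iteration steps, two consecutive iterates approximately span the invariant subspace of a dominant complex-conjugate pair, so a Rayleigh-Ritz step on that 2-D subspace recovers both eigenvalues. The example matrix below is an arbitrary illustration.

    import numpy as np

    def dominant_conjugate_pair(A, iters=200, seed=0):
        """Power iteration followed by a Rayleigh-Ritz step on span{x, A@x},
        which handles a dominant complex-conjugate pair as described above.
        Illustrative sketch only."""
        rng = np.random.default_rng(seed)
        x = rng.standard_normal(A.shape[0])
        for _ in range(iters):
            x = A @ x
            x /= np.linalg.norm(x)          # renormalize to avoid overflow
        Q, _ = np.linalg.qr(np.column_stack([x, A @ x]))  # basis for the 2-D subspace
        return np.linalg.eigvals(Q.T @ A @ Q)             # eigenvalues of the 2x2 projection

    # Example: a real matrix whose dominant eigenvalues are the conjugate pair ±2i.
    A = np.array([[0.0, -2.0, 0.0],
                  [2.0,  0.0, 0.0],
                  [0.0,  0.0, 1.0]])
    print(dominant_conjugate_pair(A))   # approximately [2j, -2j]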

Tridiagonal eigenvalue complexity

The claim that the eigenvalues of a tridiagonal matrix can be computed in O(n) time is highly dubious. It takes O(n) time just to evaluate the characteristic polynomial at one point, so even if a large (but, due to the given precision, fixed) number of bisections is hidden in the constant, you cannot bisect your way to all the separate eigenvalues in just O(n) time.
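
For context, the linear-time cost of a single evaluation comes from the standard three-term determinant recurrence for a symmetric tridiagonal matrix; a minimal sketch in Python, with names of my own choosing:

    def tridiag_charpoly_at(diag, off, x):
        """Evaluate det(T - x*I) for a symmetric tridiagonal T with main
        diagonal `diag` (length n) and off-diagonal `off` (length n-1),
        via the recurrence f_k = (diag[k]-x)*f_{k-1} - off[k-1]**2 * f_{k-2}.
        One pass over the matrix, i.e. O(n) work per evaluation point."""
        f_prev, f = 1.0, diag[0] - x
        for k in range(1, len(diag)):
            f_prev, f = f, (diag[k] - x) * f - off[k - 1] ** 2 * f_prev
        return f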

A recent paper (Coakley, Ed S.; Rokhlin, Vladimir (2013). "A fast divide-and-conquer algorithm for computing the spectra of real symmetric tridiagonal matrices". Applied and Computational Harmonic Analysis. 34 (3): 379–414. doi:10.1016/j.acha.2012.06.003.) considers it to be fast to compute all eigenvalues in O(n log n) time. 78.73.97.76 (talk) 20:57, 24 June 2018 (UTC)

Not sure if that is what's going on here, but in some circles it's common to abuse notation and take Landau symbols to be ignoring logarithmic factors. 91.89.20.12

Fair enough, but there is no reason for us to abuse Landau notation like that. And no reason to state a result for general tridiagonal matrices when we can only find a reference for the symmetric case. I have updated the page accordingly. David9550 (talk) 16:37, 5 October 2019 (UTC)

Non-symmetric matrix example in the symmetric matrix section?

In section "Symmetric 3×3 matrices" it provides an example using 3x3 matrix "A". However, matrix A is NOT symmetric. — Preceding unsigned comment added by 81.187.174.42 (talk) 15:31, 16 November 2024 (UTC)[reply]