Talk:Sylvester's law of inertia

This isn't as clear as it might be. Nowhere is it said that A is a symmetric matrix. If A is symmetric, one can get away with saying that the eigenvalues of A are also its diagonal entries when it is diagonalized as a quadratic form. If not, then the quadratic form interpretation picks up only the symmetric part of A, throwing away the anti-symmetric part.

The simplest thing would be to clarify by making A symmetric at the outset.

Charles Matthews 09:37, 20 Apr 2005 (UTC)
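
(Not part of the comment above; just an illustration. A minimal NumPy sketch, using arbitrary test data, of the point that a quadratic form sees only the symmetric part of A:)

```python
import numpy as np

# x^T A x depends only on the symmetric part (A + A^T)/2 of A,
# because x^T K x = 0 for every anti-symmetric K.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))   # generic, non-symmetric test matrix
sym = (A + A.T) / 2               # symmetric part of A
x = rng.standard_normal(4)

assert np.isclose(x @ A @ x, x @ sym @ x)
```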

much more general and without eigenvalues

Hello. Sorry, but I am always looking for more general definitions. On one hand this avoids confusion; on the other hand it becomes increasingly difficult to understand a simple case of something.

Anyway, the form of the law I use in my Symmetric bilinear form article only uses ordered fields and orthogonal bases. Eigenvalues need not exist.
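
(For reference, here is a sketch of that more general formulation, in my own notation rather than the article's. Take an ordered field K, a finite-dimensional K-vector space V, a symmetric bilinear form b on V, and an orthogonal basis e_1, ..., e_n for b. The three counts

```latex
n_{+} = \#\{\, i \mid b(e_i, e_i) > 0 \,\}, \qquad
n_{-} = \#\{\, i \mid b(e_i, e_i) < 0 \,\}, \qquad
n_{0} = \#\{\, i \mid b(e_i, e_i) = 0 \,\}
```

are then the same for every orthogonal basis, and eigenvalues are never mentioned.)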

different theorem?

According to Horn and Johnson, Thm 4.5.8, Sylvester's law of inertia says two Hermitian matrices A and B are *congruent (B = SAS^* with S nonsingular) iff they have the same inertia. If I'm not mistaken, what is here is trivial: eigenvalues don't change under similarity transformations, so of course the inertia doesn't change. —The preceding unsigned comment was added by Swiftset (talkcontribs) 22:21, 5 March 2007 (UTC).

  • I agree with this, and I'm going to change the statement of the theorem to one based on congruence transformations and emphasize the difference you pointed out. Akriasas 18:54, 6 October 2007 (UTC)

Diagonalization for eigenvalues

Someone just added this explanation to the statement that "the law can be formulated in terms of the eigenvalues":

Let Q be a matrix that diagonalizes A, i.e., QAQ^T = D is diagonal. Then A can also be diagonalized by any matrix of the form WQ, where W is any invertible diagonal matrix, i.e., WQAQ^TW^T = WDW^T is diagonal. Since WDW^T = DWW^T (diagonal matrices commute), the signs in W can have no effect: two negatives multiplied together produce a positive. The matrix Q consists of A's eigenvectors (as its rows). When two or more of the eigenvalues along the diagonal have the same value, the associated eigenvectors can form combinations that are eigenvectors too. Other than that, the eigenvectors are unique for a given symmetric A, except that each eigenvector can be rescaled and sign-flipped arbitrarily.

Apart from the tone being more appropriate for a textbook than for an encyclopedia (where results are usually stated without proof), I wonder whether this description is correct, since the diagonalization for eigenvalues usually has S^-1 rather than S^T on the right side. If the passage is OK, please forgive my ignorance; but perhaps it should be rephrased and provided with a reference. All the best, --Jorge Stolfi (talk) 18:01, 16 January 2010 (UTC)

The matrix in question is symmetric, and the eigendecomposition of a symmetric matrix is of the form QDQ^T, as stated at Eigendecomposition_of_a_matrix. I don't think the above statement is in need of a reference because, for anyone who knows what an eigendecomposition of a matrix is, it's an elementary, obvious statement. For anyone who doesn't know what eigenvalues are, it's unintelligible and no reference could redeem it. Symmetric matrices loom large in eigenvalue studies. I'm well able to forgive your ignorance about this, but I also believe you shouldn't be spending your time editing this kind of material; I believe you'd be better off spending your time learning it, or else editing other material where you're more knowledgeable. Seanwal111111 (talk) 18:00, 17 January 2010 (UTC)
Sorry, I forgot that the matrix was symmetric. I will try to restore the paragraph. All the best, --Jorge Stolfi (talk) 20:32, 17 January 2010 (UTC)
Restored it, please check. By the way, the eigendecomposition of a matrix article was incorrect: it said orthogonal matrix where it should say orthonormal matrix. All the best, --Jorge Stolfi (talk) 20:58, 17 January 2010 (UTC)
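
(An illustrative aside, not part of the exchange above: a minimal NumPy sketch of the quoted paragraph. It assumes NumPy's convention that eigh returns A = V diag(w) V^T for a symmetric A, so Q = V^T satisfies QAQ^T = D.)

```python
import numpy as np

rng = np.random.default_rng(1)
B = rng.standard_normal((4, 4))
A = B + B.T                        # a generic symmetric test matrix

w, V = np.linalg.eigh(A)           # A = V @ diag(w) @ V.T
Q = V.T                            # rows of Q are eigenvectors of A
assert np.allclose(Q @ A @ Q.T, np.diag(w))

# Any invertible diagonal W keeps WQ a diagonalizing congruence, and
# since W diag(w) W^T = diag(w) W^2, the diagonal signs cannot change.
W = np.diag([2.0, -3.0, 0.5, -1.0])
D2 = (W @ Q) @ A @ (W @ Q).T
assert np.allclose(D2, np.diag(w) @ W @ W)
assert np.array_equal(np.sign(np.diag(D2)), np.sign(w))
```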

The current section entitled "Statement in terms of eigenvalues" doesn't actually state the law, so I added a concise statement along with a reference. Adam Marsh (talk) 01:37, 10 March 2018 (UTC)

Why is S^T interpreted as a change of basis as opposed to S^-1?

I am confused about the statement that S A S^T can be interpreted as a change of basis. Should it not be S A S^-1? This also appears in another comment below. — Preceding unsigned comment added by Rejapoci (talkcontribs) 15:38, 31 January 2014 (UTC)

If a change of coordinates sends a row vector x to xS, then a bilinear form expressed as xAx^T changes to (xS)A(xS)^T = x(SAS^T)x^T. The effect is the same as changing A to SAS^T. Deltahedron (talk) 19:15, 31 January 2014 (UTC)
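
(Illustration only, not part of the reply above: a NumPy sketch of this row-vector convention, together with a check that the congruence SAS^T preserves the signs of the eigenvalues, i.e. the inertia, even though it is not a similarity and the eigenvalues themselves change.)

```python
import numpy as np

rng = np.random.default_rng(2)
B = rng.standard_normal((3, 3))
A = B + B.T                        # matrix of a symmetric bilinear form
S = rng.standard_normal((3, 3))    # generic (hence invertible) S
x = rng.standard_normal(3)

# Evaluating A at the transformed vector xS equals evaluating SAS^T at x.
assert np.isclose((x @ S) @ A @ (x @ S), x @ (S @ A @ S.T) @ x)

# Congruence changes the eigenvalues but not their signs (the inertia):
w_A = np.linalg.eigvalsh(A)
w_C = np.linalg.eigvalsh(S @ A @ S.T)
assert np.array_equal(np.sign(w_A), np.sign(w_C))
```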

Generalization to arbitrary fields

Is there any published work on the generalization to arbitrary fields (presumably with the exception of fields of characteristic 2)? This seems fairly straightforward, and seems like an appropriate generalization to include in this article. —Quondum 21:54, 8 July 2015 (UTC)

That seems to confirm it (for all fields), and I can see smatterings of this in various books, but I don't see anything that I can interpret directly. See, for example, The Collected Mathematical Papers of Leonard Eugene Dickson, Volume 6, p. 330. It seems to be true for all fields, with characteristic 2 having only the quirk that not all symmetric bilinear forms can be diagonalized, so I suppose a signature is difficult to define for those that cannot. Perhaps we could put it in, and hope someone will reference it? There are a few caveats to mention, such as with the complex numbers, where every number is a square, so the signature carries no more information than the rank (the diagonal can be reduced to 0s and 1s; no −1s are needed). Unfortunately, I do not have enough knowledge to include all this in the article. —Quondum 02:38, 10 July 2015 (UTC)
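
(A quick illustration of the complex-number caveat, not part of the comment above: over C every nonzero number is a square, so each nonzero diagonal entry can be rescaled to 1 and only the rank survives. A minimal NumPy sketch with arbitrary test entries:)

```python
import numpy as np

# Over C, diag(d_1, ..., d_n) with nonzero d_i is congruent to the
# identity: take W = diag(1/sqrt(d_i)), so that W D W^T = I.
d = np.array([2.0, -3.0, 1.0 + 4.0j])   # nonzero complex entries
D = np.diag(d)
W = np.diag(1.0 / np.sqrt(d))
assert np.allclose(W @ D @ W.T, np.eye(3))
```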