
Introduction to Linear Algebra (7): Symmetric Matrices and Quadratic Forms



Diagonalization of Symmetric Matrices

If $A$ is symmetric, then any two eigenvectors from different eigenspaces are orthogonal.
PROOF: $\lambda_1 v_1 \cdot v_2 = (\lambda_1 v_1)^T v_2 = (Av_1)^T v_2 = (v_1^T A^T) v_2 = v_1^T (Av_2) = \lambda_2 v_1 \cdot v_2$, so $(\lambda_1 - \lambda_2)(v_1 \cdot v_2) = 0$. Since $\lambda_1 \ne \lambda_2$, it follows that $v_1 \cdot v_2 = 0$.
An $n \times n$ matrix $A$ is orthogonally diagonalizable if and only if $A$ is a symmetric matrix.
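As a quick numerical illustration (a minimal NumPy sketch of my own; the post itself contains no code and the matrix entries here are made up for the demo), `np.linalg.eigh` produces exactly such an orthogonal diagonalization $A = PDP^T$:

```python
import numpy as np

# A small symmetric matrix; the values are chosen only for illustration.
A = np.array([[6., -2., -1.],
              [-2., 6., -1.],
              [-1., -1., 5.]])

# eigh is NumPy's routine for symmetric (Hermitian) matrices:
# it returns real eigenvalues and orthonormal eigenvectors.
eigvals, P = np.linalg.eigh(A)
D = np.diag(eigvals)

# P is orthogonal (P^T P = I) and A = P D P^T.
assert np.allclose(P.T @ P, np.eye(3))
assert np.allclose(P @ D @ P.T, A)
```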
The Spectral Theorem
An $n \times n$ symmetric matrix $A$ has the following properties:
a. $A$ has $n$ real eigenvalues, counting multiplicities.
b. The dimension of the eigenspace for each eigenvalue $\lambda$ equals the multiplicity of $\lambda$ as a root of the characteristic equation.
c. The eigenspaces are mutually orthogonal, in the sense that eigenvectors corresponding to different eigenvalues are orthogonal.
d. $A$ is orthogonally diagonalizable.
Spectral Decomposition
If $A$ is a symmetric matrix with orthonormal eigenvectors $u_1, \ldots, u_n$ and corresponding eigenvalues $\lambda_1, \ldots, \lambda_n$, then $A$ can be written as $A = \lambda_1 u_1 u_1^T + \lambda_2 u_2 u_2^T + \cdots + \lambda_n u_n u_n^T$.
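A short NumPy check of this decomposition (again an illustrative sketch with made-up values): rebuild $A$ from the rank-1 terms $\lambda_i u_i u_i^T$.

```python
import numpy as np

A = np.array([[7., 2.],
              [2., 4.]])

lam, U = np.linalg.eigh(A)   # columns of U are orthonormal eigenvectors u_i

# Rebuild A as the sum of rank-1 projections lambda_i * u_i u_i^T.
A_rebuilt = sum(lam[i] * np.outer(U[:, i], U[:, i]) for i in range(len(lam)))
assert np.allclose(A_rebuilt, A)
```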

Quadratic Forms

The Principal Axes Theorem
Let $A$ be an $n \times n$ symmetric matrix. Then there is an orthogonal change of variable, $x = Py$, that transforms the quadratic form $x^T A x$ into a quadratic form $y^T D y$ with no cross-product term.
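For instance, taking $Q(x) = x_1^2 - 8x_1x_2 - 5x_2^2$ (an illustrative choice), a sketch of the change of variable in NumPy:

```python
import numpy as np

# Q(x) = x1^2 - 8 x1 x2 - 5 x2^2, written as x^T A x with a symmetric A.
A = np.array([[1., -4.],
              [-4., -5.]])

lam, P = np.linalg.eigh(A)   # x = P y is the orthogonal change of variable
x = np.array([2., -1.])      # an arbitrary test point
y = P.T @ x                  # y = P^{-1} x = P^T x, since P is orthogonal

# x^T A x equals y^T D y = sum(lambda_i * y_i^2): no cross-product term remains.
assert np.isclose(x @ A @ x, np.sum(lam * y**2))
```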

Classifying Quadratic Forms

A quadratic form $Q$ is:
a. positive definite if $Q(x) > 0$ for all $x \ne 0$,
b. negative definite if $Q(x) < 0$ for all $x \ne 0$,
c. indefinite if $Q(x)$ assumes both positive and negative values.
Quadratic Forms and Eigenvalues
Let $A$ be an $n \times n$ symmetric matrix. Then a quadratic form $x^T A x$ is:
a. positive definite if and only if the eigenvalues of $A$ are all positive,
b. negative definite if and only if the eigenvalues of $A$ are all negative,
c. indefinite if and only if $A$ has both positive and negative eigenvalues.
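A small helper that applies this classification numerically (a sketch assuming NumPy; the `classify` name and the test matrices are mine):

```python
import numpy as np

def classify(A: np.ndarray) -> str:
    """Classify the quadratic form x^T A x by the signs of A's eigenvalues."""
    lam = np.linalg.eigvalsh(A)
    if np.all(lam > 0):
        return "positive definite"
    if np.all(lam < 0):
        return "negative definite"
    if lam.min() < 0 < lam.max():
        return "indefinite"
    return "semidefinite"          # some eigenvalue is zero

print(classify(np.array([[3., 2.], [2., 3.]])))     # eigenvalues 5, 1 -> positive definite
print(classify(np.array([[1., -4.], [-4., -5.]])))  # eigenvalues 3, -7 -> indefinite
```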
Let $A$ be a symmetric matrix, and define $m = \min\{x^T A x : x^T x = 1\}$ and $M = \max\{x^T A x : x^T x = 1\}$. Then $M$ is the greatest eigenvalue $\lambda_1$ of $A$ and $m$ is the least eigenvalue of $A$. The value of $x^T A x$ is $M$ when $x$ is a unit eigenvector $u_1$ corresponding to $M$, and the value of $x^T A x$ is $m$ when $x$ is a unit eigenvector corresponding to $m$.
Let $A$, $\lambda_1$, and $u_1$ be as in the preceding theorem. Then the maximum value of $x^T A x$ subject to the constraints $x^T x = 1$, $x^T u_1 = 0$ is the second greatest eigenvalue, $\lambda_2$, and this maximum is attained when $x$ is an eigenvector $u_2$ corresponding to $\lambda_2$. The theorem extends to $\lambda_k$: the maximum of $x^T A x$ subject to $x^T x = 1$ and $x^T u_1 = \cdots = x^T u_{k-1} = 0$ is $\lambda_k$, attained at a corresponding unit eigenvector $u_k$.
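These constrained-maximum statements can be sanity-checked numerically (an illustrative NumPy sketch; the random sampling is only a heuristic check, not a proof):

```python
import numpy as np

rng = np.random.default_rng(0)
A = np.array([[3., 2., 1.],
              [2., 3., 1.],
              [1., 1., 4.]])       # symmetric; values are illustrative

lam, U = np.linalg.eigh(A)         # ascending order, so lam[-1] is the max eigenvalue

# Random unit vectors never beat the top eigenvalue, and u1 attains it.
samples = rng.normal(size=(10000, 3))
samples /= np.linalg.norm(samples, axis=1, keepdims=True)
assert np.all(np.einsum('ij,jk,ik->i', samples, A, samples) <= lam[-1] + 1e-9)
assert np.isclose(U[:, -1] @ A @ U[:, -1], lam[-1])

# Restricting to x orthogonal to u1, the maximum drops to the second eigenvalue.
assert np.isclose(U[:, -2] @ A @ U[:, -2], lam[-2])
```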

The Singular Value Decomposition

Singular Value Decomposition: a factorization $A = QDP^{-1}$ is possible for any $m \times n$ matrix $A$. The singular values of $A$ are $\sigma_i = \sqrt{\lambda_i}$, the square roots of the eigenvalues of $A^T A$, arranged in decreasing order; equivalently, $\sigma_i = \|Av_i\|$ for the corresponding unit eigenvector $v_i$ of $A^T A$.
The Singular Value Decomposition Theorem
Let $A$ be an $m \times n$ matrix with rank $r$. Then there exists an $m \times n$ matrix $\Sigma = \begin{bmatrix} D & 0 \\ 0 & 0 \end{bmatrix}$ for which the diagonal entries in $D$ are the first $r$ singular values of $A$, $\sigma_1 \ge \sigma_2 \ge \cdots \ge \sigma_r > 0$, and there exist an $m \times m$ orthogonal matrix $U$ and an $n \times n$ orthogonal matrix $V$ such that $A = U \Sigma V^T$.
PROOF:
Let $\lambda_i$ and $v_i$ be the eigenvalues and corresponding orthonormal eigenvectors of $A^T A$, so that $\{Av_1, \cdots, Av_r\}$ is an orthogonal basis for $\mathrm{Col}\,A$. Normalize each $Av_i$ to obtain an orthonormal basis $\{u_1, \cdots, u_r\}$, where $u_i = \frac{Av_i}{\|Av_i\|} = \frac{Av_i}{\sigma_i}$, so that $Av_i = \sigma_i u_i$ $(1 \le i \le r)$.
Now extend $\{u_1, \cdots, u_r\}$ to an orthonormal basis $\{u_1, \cdots, u_m\}$ of $R^m$, and let $U = [u_1 \; u_2 \; \cdots \; u_m]$ and $V = [v_1 \; v_2 \; \cdots \; v_n]$.
By construction, $U$ and $V$ are orthogonal matrices, and $AV = [Av_1 \cdots Av_r \; 0 \cdots 0] = [\sigma_1 u_1 \cdots \sigma_r u_r \; 0 \cdots 0] = U\Sigma$.
Thus $A = U\Sigma V^{-1} = U\Sigma V^T$.
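The construction in the proof matches what `np.linalg.svd` returns; here is a sketch on a small $2 \times 3$ example (values chosen only for illustration):

```python
import numpy as np

A = np.array([[4., 11., 14.],
              [8., 7., -2.]])      # a 2x3 example matrix

U, s, Vt = np.linalg.svd(A)        # s holds the singular values, in decreasing order

# Assemble the m x n matrix Sigma with the singular values on its diagonal.
Sigma = np.zeros(A.shape)
Sigma[:len(s), :len(s)] = np.diag(s)

assert np.allclose(U @ Sigma @ Vt, A)
# The singular values are the square roots of the eigenvalues of A^T A.
assert np.allclose(np.sort(s**2), np.sort(np.linalg.eigvalsh(A.T @ A))[-len(s):])
```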
The Invertible Matrix Theorem (concluded)
Let $A$ be an $n \times n$ matrix. Then the following statements are each equivalent to the statement that $A$ is an invertible matrix.
u. $(\mathrm{Col}\,A)^{\perp} = \{0\}$.
v. $(\mathrm{Nul}\,A)^{\perp} = R^n$.
w. $\mathrm{Row}\,A = R^n$.
x. $A$ has $n$ nonzero singular values.
Reduced SVD and the Pseudoinverse of $A$
Let $r = \mathrm{rank}\,A$, and partition $U$ and $V$ into blocks whose first parts contain $r$ columns: $U = [U_r \; U_{m-r}]$ and $V = [V_r \; V_{n-r}]$. Then $A = U_r D V_r^T$.
This factorization of $A$ is called a reduced singular value decomposition of $A$. The following matrix is called the pseudoinverse of $A$: $A^+ = V_r D^{-1} U_r^T$.
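A sketch of the reduced SVD and the pseudoinverse formula in NumPy (the slicing below assumes the singular values come back in decreasing order, which `np.linalg.svd` guarantees):

```python
import numpy as np

A = np.array([[4., 11., 14.],
              [8., 7., -2.]])
r = np.linalg.matrix_rank(A)       # r = 2 for this example

U, s, Vt = np.linalg.svd(A)
Ur, D, Vr = U[:, :r], np.diag(s[:r]), Vt[:r, :].T

# Reduced SVD: A = U_r D V_r^T, and the pseudoinverse A^+ = V_r D^{-1} U_r^T.
assert np.allclose(Ur @ D @ Vr.T, A)
A_plus = Vr @ np.linalg.inv(D) @ Ur.T
assert np.allclose(A_plus, np.linalg.pinv(A))
```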

Principal Component Analysis

For simplicity, assume that the matrix $[X_1 \cdots X_N]$ is already in mean-deviation form. The goal of principal component analysis is to find an orthogonal $p \times p$ matrix $P = [u_1 \cdots u_p]$ that determines a change of variable, $X = PY$, or
$$\begin{bmatrix} x_1 \\ x_2 \\ \vdots \\ x_p \end{bmatrix} = \begin{bmatrix} u_1 & u_2 & \cdots & u_p \end{bmatrix} \begin{bmatrix} y_1 \\ y_2 \\ \vdots \\ y_p \end{bmatrix},$$
with the property that the new variables $y_1, \ldots, y_p$ are uncorrelated and arranged in order of decreasing variance.
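A minimal PCA sketch along these lines (synthetic data, assuming NumPy): form the sample covariance matrix, take $P$ from its eigenvectors, and check that the new variables are uncorrelated.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(3, 50))           # 50 observations of 3 variables, one per column
X = X - X.mean(axis=1, keepdims=True)  # put the data in mean-deviation form

S = (X @ X.T) / (X.shape[1] - 1)       # sample covariance matrix (p x p)

# The principal components are the unit eigenvectors of S; eigh returns them
# with eigenvalues ascending, so reverse to get decreasing variance.
lam, P = np.linalg.eigh(S)
lam, P = lam[::-1], P[:, ::-1]

Y = P.T @ X                            # new variables y_i = u_i^T x
S_Y = (Y @ Y.T) / (Y.shape[1] - 1)
# The y_i are uncorrelated: the covariance of Y is diagonal, with entries lam.
assert np.allclose(S_Y, np.diag(lam), atol=1e-10)
```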

Source: https://blog.csdn.net/zbzhzhy/article/details/88218007