
Definition and Theorems of Chapter 2


Definition. A vector space (or linear space) consists of the following:

  1. a field $F$ of scalars;
  2. a set $V$ of objects, called vectors;
  3. a rule (or operation), called vector addition, which associates with each pair of vectors $\alpha, \beta$ in $V$ a vector $\alpha+\beta$ in $V$, called the sum of $\alpha$ and $\beta$, in such a way that
    ( a ) addition is commutative, $\alpha+\beta=\beta+\alpha$;
    ( b ) addition is associative, $\alpha+(\beta+\gamma)=(\alpha+\beta)+\gamma$;
    ( c ) there is a unique vector $0$ in $V$, called the zero vector, such that $\alpha+0=\alpha$ for all $\alpha$ in $V$;
    ( d ) for each vector $\alpha$ in $V$ there is a unique vector $-\alpha$ in $V$ such that $\alpha+(-\alpha)=0$;
  4. a rule (or operation), called scalar multiplication, which associates with each scalar $c$ in $F$ and vector $\alpha$ in $V$ a vector $c\alpha$ in $V$, called the product of $c$ and $\alpha$, in such a way that
    ( a ) $1\alpha=\alpha$ for every $\alpha$ in $V$;
    ( b ) $(c_1c_2)\alpha=c_1(c_2\alpha)$;
    ( c ) $c(\alpha+\beta)=c\alpha+c\beta$;
    ( d ) $(c_1+c_2)\alpha=c_1\alpha+c_2\alpha$.

Definition. A vector $\beta$ in $V$ is said to be a linear combination of the vectors $\alpha_1,\dots,\alpha_n$ in $V$ provided there exist scalars $c_1,\dots,c_n$ in $F$ such that
$$\beta=c_1\alpha_1+\cdots+c_n\alpha_n=\sum_{i=1}^n c_i\alpha_i$$
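The defining sum can be computed directly; here is a minimal pure-Python sketch (vectors as tuples of integers standing in for elements of $F^3$; the function name and the sample vectors are illustrative, not from the text):

```python
# Compute beta = c1*a1 + ... + cn*an for vectors given as tuples.

def linear_combination(coeffs, vectors):
    """Return sum_i coeffs[i] * vectors[i], componentwise."""
    n = len(vectors[0])
    return tuple(sum(c * v[k] for c, v in zip(coeffs, vectors)) for k in range(n))

a1 = (1, 0, 2)
a2 = (0, 1, 1)
beta = linear_combination([2, -1], [a1, a2])
print(beta)  # (2, -1, 3)
```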

Definition. Let $V$ be a vector space over the field $F$. A subspace of $V$ is a subset $W$ of $V$ which is itself a vector space over $F$ with the operations of vector addition and scalar multiplication on $V$.

Theorem 1. A non-empty subset $W$ of $V$ is a subspace of $V$ if and only if for each pair of vectors $\alpha, \beta$ in $W$ and each scalar $c$ in $F$ the vector $c\alpha+\beta$ is in $W$.

Lemma. If $A$ is an $m\times n$ matrix over $F$ and $B, C$ are $n\times p$ matrices over $F$, then
$$A(dB+C)=d(AB)+AC,\quad \forall\, d\in F$$
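A numerical spot-check of the lemma (not a proof) is easy to write with plain nested-list matrices; all matrices and the scalar below are arbitrary small examples chosen for illustration:

```python
# Check A(dB + C) == d(AB) + AC for one concrete choice of A, B, C, d.

def mat_mul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def mat_add(A, B):
    return [[A[i][j] + B[i][j] for j in range(len(A[0]))] for i in range(len(A))]

def mat_scale(d, A):
    return [[d * x for x in row] for row in A]

A = [[1, 2], [3, 4]]   # 2x2
B = [[0, 1], [1, 0]]   # 2x2
C = [[2, 0], [0, 2]]   # 2x2
d = 3

lhs = mat_mul(A, mat_add(mat_scale(d, B), C))
rhs = mat_add(mat_scale(d, mat_mul(A, B)), mat_mul(A, C))
print(lhs == rhs)  # True
```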

Theorem 2. Let $V$ be a vector space over the field $F$. Then the intersection of any collection of subspaces of $V$ is a subspace of $V$.

Definition. Let $S$ be a set of vectors in a vector space $V$. The subspace spanned by $S$ is defined to be the intersection $W$ of all subspaces of $V$ which contain $S$. When $S$ is a finite set of vectors, $S=\{\alpha_1,\alpha_2,\dots,\alpha_n\}$, we shall simply call $W$ the subspace spanned by the vectors $\alpha_1,\alpha_2,\dots,\alpha_n$.

Theorem 3. The subspace spanned by a non-empty subset $S$ of a vector space $V$ is the set of all linear combinations of vectors in $S$.
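For $V=F^n$ this gives a concrete membership test: $\beta$ lies in the span of a finite $S$ exactly when adjoining $\beta$ to $S$ does not raise the rank. A small sketch using exact rational arithmetic (the helper names and sample vectors are illustrative):

```python
from fractions import Fraction

def rank(rows):
    # Rank via Gaussian elimination over the rationals (exact, no float error).
    M = [[Fraction(x) for x in row] for row in rows]
    r = 0
    for col in range(len(M[0])):
        piv = next((i for i in range(r, len(M)) if M[i][col] != 0), None)
        if piv is None:
            continue
        M[r], M[piv] = M[piv], M[r]
        for i in range(len(M)):
            if i != r and M[i][col] != 0:
                f = M[i][col] / M[r][col]
                M[i] = [a - f * b for a, b in zip(M[i], M[r])]
        r += 1
        if r == len(M):
            break
    return r

def in_span(beta, S):
    """beta is a linear combination of S iff adjoining beta keeps the rank."""
    return rank(list(S) + [beta]) == rank(list(S))

S = [(1, 0, 2), (0, 1, 1)]
print(in_span((2, -1, 3), S))  # True: (2,-1,3) = 2*(1,0,2) - (0,1,1)
print(in_span((0, 0, 1), S))   # False
```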

Definition. If $S_1,S_2,\dots,S_k$ are subsets of a vector space $V$, the set of all sums $\alpha_1+\alpha_2+\cdots+\alpha_k$ of vectors $\alpha_i$ in $S_i$ is called the sum of the subsets $S_1,S_2,\dots,S_k$ and is denoted by $S_1+S_2+\cdots+S_k$ or by $\sum_{i=1}^k S_i$.

Definition. Let $V$ be a vector space over $F$. A subset $S$ of $V$ is said to be linearly dependent (or simply, dependent) if there exist distinct vectors $\alpha_1,\alpha_2,\dots,\alpha_n$ in $S$ and scalars $c_1,c_2,\dots,c_n$ in $F$, not all of which are $0$, such that
$$c_1\alpha_1+\cdots+c_n\alpha_n=0$$
A set which is not linearly dependent is called linearly independent. If the set $S$ contains only finitely many vectors $\alpha_1,\alpha_2,\dots,\alpha_n$, we sometimes say that $\alpha_1,\alpha_2,\dots,\alpha_n$ are dependent (or independent) instead of saying $S$ is dependent (or independent).
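In $F^n$, a finite list of vectors is dependent exactly when its rank is smaller than the number of vectors. A self-contained sketch (exact rational arithmetic; sample vectors are illustrative):

```python
from fractions import Fraction

def rank(rows):
    # Rank via Gaussian elimination over the rationals.
    M = [[Fraction(x) for x in row] for row in rows]
    r = 0
    for col in range(len(M[0])):
        piv = next((i for i in range(r, len(M)) if M[i][col] != 0), None)
        if piv is None:
            continue
        M[r], M[piv] = M[piv], M[r]
        for i in range(len(M)):
            if i != r and M[i][col] != 0:
                f = M[i][col] / M[r][col]
                M[i] = [a - f * b for a, b in zip(M[i], M[r])]
        r += 1
        if r == len(M):
            break
    return r

def is_dependent(vectors):
    """True iff some nontrivial combination of the vectors is zero."""
    return rank(vectors) < len(vectors)

print(is_dependent([(1, 0, 2), (0, 1, 1), (1, 1, 3)]))  # True: v3 = v1 + v2
print(is_dependent([(1, 0, 0), (0, 1, 0)]))             # False
```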

Definition. Let $V$ be a vector space. A basis for $V$ is a linearly independent set of vectors in $V$ which spans the space $V$. The space $V$ is finite-dimensional if it has a finite basis.

Theorem 4. Let $V$ be a vector space which is spanned by a finite set of vectors $\beta_1,\beta_2,\dots,\beta_m$. Then any independent set of vectors in $V$ is finite and contains no more than $m$ elements.
Corollary 1. If $V$ is a finite-dimensional vector space, then any two bases of $V$ have the same (finite) number of elements.
Corollary 2. Let $V$ be a finite-dimensional vector space and let $n=\dim V$. Then
( a ) any subset of $V$ which contains more than $n$ vectors is linearly dependent;
( b ) no subset of $V$ which contains fewer than $n$ vectors can span $V$.

Theorem 5. If $W$ is a subspace of a finite-dimensional vector space $V$, every linearly independent subset of $W$ is finite and is part of a (finite) basis for $W$.
Corollary 1. If $W$ is a proper subspace of a finite-dimensional vector space $V$, then $W$ is finite-dimensional and $\dim W<\dim V$.
Corollary 2. In a finite-dimensional vector space $V$ every non-empty linearly independent set of vectors is part of a basis.
Corollary 3. Let $A$ be an $n\times n$ matrix over a field $F$, and suppose the row vectors of $A$ form a linearly independent set of vectors in $F^n$. Then $A$ is invertible.

Theorem 6. If $W_1$ and $W_2$ are finite-dimensional subspaces of a vector space $V$, then $W_1+W_2$ is finite-dimensional and
$$\dim W_1+\dim W_2=\dim(W_1\cap W_2)+\dim(W_1+W_2)$$
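The formula can be checked numerically in $F^n$: $\dim(W_1+W_2)$ is the rank of the stacked spanning sets, and the theorem then forces the value of $\dim(W_1\cap W_2)$. A sketch with an example where the intersection is visibly one-dimensional (subspaces chosen for illustration):

```python
from fractions import Fraction

def rank(rows):
    # Rank via Gaussian elimination over the rationals.
    M = [[Fraction(x) for x in row] for row in rows]
    r = 0
    for col in range(len(M[0])):
        piv = next((i for i in range(r, len(M)) if M[i][col] != 0), None)
        if piv is None:
            continue
        M[r], M[piv] = M[piv], M[r]
        for i in range(len(M)):
            if i != r and M[i][col] != 0:
                f = M[i][col] / M[r][col]
                M[i] = [a - f * b for a, b in zip(M[i], M[r])]
        r += 1
        if r == len(M):
            break
    return r

# W1 = span{e1, e2}, W2 = span{e2, e3} in F^3, so W1 ∩ W2 = span{e2}.
W1 = [(1, 0, 0), (0, 1, 0)]
W2 = [(0, 1, 0), (0, 0, 1)]
dim_W1, dim_W2 = rank(W1), rank(W2)
dim_sum = rank(W1 + W2)              # W1 + W2 is spanned by the union of spanning sets
dim_int = dim_W1 + dim_W2 - dim_sum  # forced by Theorem 6
print(dim_W1, dim_W2, dim_sum, dim_int)  # 2 2 3 1
```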

Definition. If $V$ is a finite-dimensional vector space, an ordered basis for $V$ is a finite sequence of vectors which is linearly independent and spans $V$.

Theorem 7. Let $V$ be an $n$-dimensional vector space over the field $F$, and let $\mathfrak B$ and $\mathfrak B'$ be two ordered bases of $V$. Then there is a unique, necessarily invertible, $n\times n$ matrix $P$ with entries in $F$ such that
$$[\alpha]_{\mathfrak B}=P[\alpha]_{\mathfrak B'},\qquad [\alpha]_{\mathfrak B'}=P^{-1}[\alpha]_{\mathfrak B}$$
for every vector $\alpha$ in $V$. The columns of $P$ are given by
$$P_j=[\alpha_j']_{\mathfrak B},\qquad j=1,\dots,n$$

Theorem 8. Suppose $P$ is an $n\times n$ invertible matrix over $F$. Let $V$ be an $n$-dimensional vector space over $F$, and let $\mathfrak B$ be an ordered basis of $V$. Then there is a unique basis $\mathfrak B'$ of $V$ such that
$$[\alpha]_{\mathfrak B}=P[\alpha]_{\mathfrak B'},\qquad [\alpha]_{\mathfrak B'}=P^{-1}[\alpha]_{\mathfrak B}$$
for every vector $\alpha$ in $V$.

Theorem 9. Row-equivalent matrices have the same row space.

Theorem 10. Let $R$ be a non-zero row-reduced echelon matrix. Then the non-zero row vectors of $R$ form a basis for the row space of $R$.

Theorem 11. Let $m$ and $n$ be positive integers and let $F$ be a field. Suppose $W$ is a subspace of $F^n$ and $\dim W\leq m$. Then there is precisely one $m\times n$ row-reduced echelon matrix over $F$ which has $W$ as its row space.
Corollary. Each $m\times n$ matrix $A$ is row-equivalent to one and only one row-reduced echelon matrix.
Corollary. Let $A$ and $B$ be $m\times n$ matrices over the field $F$. Then $A$ and $B$ are row-equivalent if and only if they have the same row space.
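The uniqueness statement means the row-reduced echelon form is a canonical form: two matrices are row-equivalent exactly when they reduce to the same RREF. A self-contained sketch (exact rational arithmetic; the two matrices below are an illustrative row-equivalent pair, with $B$ obtained from $A$ by adding row 2 to row 1 and swapping rows):

```python
from fractions import Fraction

def rref(rows):
    """Row-reduced echelon form over the rationals."""
    M = [[Fraction(x) for x in row] for row in rows]
    r = 0
    for col in range(len(M[0])):
        piv = next((i for i in range(r, len(M)) if M[i][col] != 0), None)
        if piv is None:
            continue
        M[r], M[piv] = M[piv], M[r]
        M[r] = [x / M[r][col] for x in M[r]]          # make the pivot 1
        for i in range(len(M)):
            if i != r and M[i][col] != 0:
                f = M[i][col]
                M[i] = [a - f * b for a, b in zip(M[i], M[r])]
        r += 1
        if r == len(M):
            break
    return M

A = [[1, 2, 1], [2, 4, 3]]
B = [[3, 6, 4], [1, 2, 1]]   # row-equivalent to A
print(rref(A) == rref(B))    # True: same row space, same canonical form
```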


Source: https://blog.csdn.net/christangdt/article/details/104430648