
    Golovizin V.V. Lectures on algebra and geometry.

    Lectures on algebra and geometry. Semester 2.

    Lecture 23. Basis of a vector space.

    Summary: a criterion for the linear dependence of a system of non-zero vectors; subsystems of a system of vectors; generating systems of vectors; minimal generating systems and maximal linearly independent systems; a basis of a vector space and its four equivalent definitions; the dimension of a vector space; finite-dimensional vector spaces and the existence of a basis; complement to a basis.

    Section 1. A criterion for the linear dependence of a system of nonzero vectors.

    Theorem. A system of nonzero vectors is linearly dependent if and only if there is a vector of the system that is linearly expressed in terms of the previous vectors of this system.

    Proof. Let the system $a_1, a_2, \dots, a_n$ consist of nonzero vectors and be linearly dependent. Consider the system consisting of the single vector $a_1$. Because $a_1 \ne 0$, this system is linearly independent. Attach the vector $a_2$ to it. If the resulting system $a_1, a_2$ is linearly independent, attach the next vector $a_3$, and so on: we continue until we obtain a linearly dependent system $a_1, a_2, \dots, a_k$, where $2 \le k \le n$. Such a number $k$ certainly exists, because the source system $a_1, a_2, \dots, a_n$ is linearly dependent by hypothesis.

    So, by construction, we obtain a linearly dependent system $a_1, a_2, \dots, a_k$ in which the subsystem $a_1, a_2, \dots, a_{k-1}$ is linearly independent.

    The system $a_1, a_2, \dots, a_k$ represents the zero vector non-trivially, i.e. there is a non-zero set of scalars $\alpha_1, \alpha_2, \dots, \alpha_k$ such that

    $\alpha_1 a_1 + \alpha_2 a_2 + \dots + \alpha_k a_k = 0$,

    where the scalar $\alpha_k \ne 0$.

    Indeed, if we had $\alpha_k = 0$, this equality would be a nontrivial representation of the zero vector by the linearly independent system $a_1, \dots, a_{k-1}$, which is impossible.

    Dividing the last equality by the non-zero scalar $\alpha_k$, we can express the vector $a_k$ from it:

    $a_k = -\dfrac{\alpha_1}{\alpha_k}\, a_1 - \dots - \dfrac{\alpha_{k-1}}{\alpha_k}\, a_{k-1}$,

    i.e. the vector $a_k$ is linearly expressed through the preceding vectors of the system. Since the converse is obvious, the theorem is proven.
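    The inductive step of this proof, attaching vectors one at a time until the system first becomes linearly dependent, can be illustrated numerically. The following Python sketch (an illustration for concreteness, not part of the lecture; the function name is ours) uses exact rational arithmetic to find the first index k such that $a_k$ is linearly expressed through the preceding vectors:

```python
from fractions import Fraction

def first_dependent_index(vectors):
    """Return the first k (0-based) such that vectors[k] is a linear
    combination of vectors[0..k-1]; return None if the whole system
    is linearly independent.  Exact arithmetic over the rationals."""
    pivots = {}                          # pivot position -> reduced row
    for k, v in enumerate(vectors):
        w = [Fraction(x) for x in v]
        while any(w):
            p = next(i for i, x in enumerate(w) if x != 0)
            if p not in pivots:          # w brings a new pivot: independent
                pivots[p] = w
                break
            row = pivots[p]
            c = w[p] / row[p]
            w = [a - c * b for a, b in zip(w, row)]
        else:
            return k                     # w reduced to zero: dependent

    return None

# (2, 2) = 2 * (1, 1), so the system first becomes dependent at index 1
print(first_dependent_index([(1, 1), (2, 2), (0, 1)]))  # → 1
print(first_dependent_index([(1, 0), (0, 1)]))          # → None
```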

    Section 2. Subsystems of a system of vectors of a vector space.

    Definition. Any non-empty subset of a system of vectors $a_1, a_2, \dots, a_n$ is called a subsystem of the given system of vectors.

    Example. Let $a_1, a_2, \dots, a_{10}$ be a system of 10 vectors. Then, for instance, the systems of vectors $a_1, a_3, a_5$ and $a_2, a_4, a_6, a_8, a_{10}$ are subsystems of the given system of vectors.

    Theorem. If a system of vectors contains a linearly dependent subsystem, then the system of vectors itself is also linearly dependent.

    Proof. Let a system of vectors $a_1, a_2, \dots, a_n$ be given and let, for definiteness, its subsystem $a_1, a_2, \dots, a_k$, where $k < n$, be linearly dependent. Then this subsystem represents the zero vector in a non-trivial way:

    $\alpha_1 a_1 + \alpha_2 a_2 + \dots + \alpha_k a_k = 0$,

    where among the coefficients $\alpha_1, \dots, \alpha_k$ there is at least one that is not equal to zero. But then the following equality is a non-trivial representation of the zero vector:

    $\alpha_1 a_1 + \dots + \alpha_k a_k + 0 \cdot a_{k+1} + \dots + 0 \cdot a_n = 0$,

    which, by definition, implies the linear dependence of the system $a_1, a_2, \dots, a_n$, as required.

    The theorem has been proven.

    Corollary. Any subsystem of a linearly independent system of vectors is linearly independent.

    Proof. Assume the opposite: let some subsystem of the given system be linearly dependent. Then the theorem implies the linear dependence of the whole system, which contradicts the hypothesis.

    The corollary is proven.

    Section 3. Systems of columns of the arithmetic vector space of columns.

    From the results of the previous sections, the following theorem is obtained as a special case.

    1) A system of columns is linearly dependent if and only if there is at least one column in the system that is linearly expressed through other columns of this system.

    2) A system of columns is linearly independent if and only if no column of the system is linearly expressed in terms of other columns of the system.

    3) A system of columns containing a zero column is linearly dependent.

    4) A column system containing two equal columns is linearly dependent.

    5) A column system containing two proportional columns is linearly dependent.

    6) A system of columns containing a linearly dependent subsystem is linearly dependent.

    7) Any subsystem of a linearly independent system of columns is linearly independent.

    The only thing that may need to be clarified here is the concept of proportional columns.

    Definition. Two non-zero columns $A = (a_1, a_2, \dots, a_n)^T$ and $B = (b_1, b_2, \dots, b_n)^T$ are called proportional if there is a scalar $\lambda$ such that $B = \lambda A$, or, in coordinates,

    $b_1 = \lambda a_1$, $b_2 = \lambda a_2$, …, $b_n = \lambda a_n$.

    Example. The system of columns $(1, 2)^T$, $(2, 4)^T$, $(0, 1)^T$ is linearly dependent, since its first two columns are proportional.
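    As a quick sanity check, proportionality of two columns can be tested componentwise; a small sketch (the helper name is ours, entries assumed integer or rational):

```python
from fractions import Fraction

def proportional(a, b):
    """True if the non-zero columns a, b satisfy b = t*a for one scalar t."""
    t = None
    for x, y in zip(a, b):
        if x == 0:
            if y != 0:
                return False             # no t can map 0 to a non-zero entry
        else:
            r = Fraction(y) / Fraction(x)
            if t is None:
                t = r                    # first candidate ratio
            elif r != t:
                return False             # ratios disagree
    return True

print(proportional((1, 2, 3), (2, 4, 6)))  # → True  (t = 2)
print(proportional((1, 2, 3), (2, 4, 5)))  # → False
```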

    Comment. We already know (see Lecture 21) that a determinant is equal to zero if the system of its columns (rows) is linearly dependent. In the future, it will be proven that the converse statement is also true: if the determinant is equal to zero, then the system of its columns and the system of its rows are linearly dependent.

    Section 4. Basis of a vector space.

    Definition. A system of vectors $a_1, a_2, \dots, a_n$ of a vector space V over a field K is called a generating (spanning) system of vectors of this vector space if it represents every one of its vectors, i.e. if for every vector $x \in V$ there is a set of scalars $\alpha_1, \alpha_2, \dots, \alpha_n \in K$ such that $x = \alpha_1 a_1 + \alpha_2 a_2 + \dots + \alpha_n a_n$.

    Definition. A system of vectors in a vector space is called a minimal generating system if, when any vector is removed from this system, it ceases to be a generating system.

    Comment. It immediately follows from the definition that if the generating system of vectors is not minimal, then there is at least one vector of the system such that when removed from the system, the remaining system of vectors will still be generating.

    Lemma (On a linearly dependent generating system.)

    If in a linearly dependent and generating system of vectors one of the vectors is linearly expressed through the others, then it can be removed from the system and the remaining system of vectors will be generating.

    Proof. Let the system $a_1, a_2, \dots, a_n$ be linearly dependent and generating, and let one of its vectors be linearly expressed through the other vectors of this system.

    For definiteness and simplicity of notation, assume that it is the last vector:

    $a_n = \beta_1 a_1 + \beta_2 a_2 + \dots + \beta_{n-1} a_{n-1}$.

    Because $a_1, \dots, a_n$ is a generating system, for every vector $x \in V$ there is a set of scalars $\alpha_1, \dots, \alpha_n$ such that

    $x = \alpha_1 a_1 + \dots + \alpha_{n-1} a_{n-1} + \alpha_n a_n$.

    From here we get

    $x = (\alpha_1 + \alpha_n \beta_1) a_1 + \dots + (\alpha_{n-1} + \alpha_n \beta_{n-1}) a_{n-1}$,

    i.e. any vector x is linearly expressed through the vectors of the system $a_1, \dots, a_{n-1}$, which means that it is a generating system, as required.

    Corollary 1. A linearly dependent and generating system of vectors is not minimal.

    Proof. This immediately follows from the lemma and the definition of a minimal generating system of vectors.

    Corollary 2. A minimal generating system of vectors is linearly independent.

    Proof. Assuming the opposite, we arrive at a contradiction with Corollary 1.

    Definition. A system of vectors in a vector space is called a maximal linearly independent system if, when any vector is added to this system, it becomes linearly dependent.

    Comment. It immediately follows from the definition that if a system is linearly independent, but not maximal, then there is a vector that, when added to the system, produces a linearly independent system.

    Definition. A basis of a vector space V over a field K is an ordered system of its vectors that represents every vector of the vector space in a unique way.

    In other words, a system of vectors $e_1, e_2, \dots, e_n$ of a vector space V over a field K is called its basis if for every vector $x \in V$ there is exactly one set of scalars $\alpha_1, \alpha_2, \dots, \alpha_n$ such that $x = \alpha_1 e_1 + \alpha_2 e_2 + \dots + \alpha_n e_n$.

    Theorem. (On four equivalent definitions of basis.)

    Let $e_1, e_2, \dots, e_n$ be an ordered system of vectors of a vector space. Then the following statements are equivalent:

    1. The system $e_1, \dots, e_n$ is a basis.

    2. The system $e_1, \dots, e_n$ is a linearly independent and generating system of vectors.

    3. The system $e_1, \dots, e_n$ is a maximal linearly independent system of vectors.

    4. The system $e_1, \dots, e_n$ is a minimal generating system of vectors.

    Proof.

    1 ⇒ 2. Let the system of vectors $e_1, \dots, e_n$ be a basis. From the definition of a basis it immediately follows that this system is a generating system of the vector space, so we only need to prove its linear independence.

    Assume that this system of vectors is linearly dependent. Then the zero vector has two representations, a trivial one and a non-trivial one, which contradicts the definition of a basis.

    2 ⇒ 3. Let the system of vectors $e_1, \dots, e_n$ be linearly independent and generating. We need to prove that this linearly independent system is maximal.

    Assume the opposite: let this linearly independent system of vectors not be maximal. Then, by the remark above, there is a vector that can be added to the system so that the resulting system of vectors remains linearly independent. On the other hand, the added vector can be represented as a linear combination of the original system of vectors, because that system is generating.

    So in the new, expanded system of vectors one of the vectors is linearly expressed through the other vectors of the system, and such a system of vectors is linearly dependent. We have obtained a contradiction.

    3 ⇒ 4. Let the system of vectors $e_1, \dots, e_n$ of the vector space be a maximal linearly independent system. Let us prove that it is a minimal generating system.

    a) First we prove that it is a generating system.

    Note that, due to linear independence, the system $e_1, \dots, e_n$ does not contain the zero vector. Let x be an arbitrary nonzero vector; add it to the system: $e_1, \dots, e_n, x$. The resulting system of non-zero vectors is linearly dependent, because the original system of vectors is maximal linearly independent. This means that in this system there is a vector that is linearly expressed through the preceding ones. In the original linearly independent system $e_1, \dots, e_n$ none of the vectors can be expressed through the preceding ones; therefore only the vector x can be so expressed. Thus the system $e_1, \dots, e_n$ represents any non-zero vector. It remains to note that this system obviously also represents the zero vector, i.e. the system $e_1, \dots, e_n$ is generating.

    b) Now let us prove its minimality. Assume the opposite. Then one of the vectors of the system can be removed so that the remaining system of vectors is still generating; consequently, the removed vector is linearly expressed through the remaining vectors of the system, which contradicts the linear independence of the original system of vectors.

    4 ⇒ 1. Let the system of vectors $e_1, \dots, e_n$ of the vector space be a minimal generating system. Then it represents every vector of the vector space; we need to prove the uniqueness of the representation.

    Assume the opposite: let some vector x be linearly expressed through the vectors of the given system in two different ways:

    $x = \alpha_1 e_1 + \dots + \alpha_n e_n = \beta_1 e_1 + \dots + \beta_n e_n$.

    Subtracting one equality from the other, we get:

    $(\alpha_1 - \beta_1) e_1 + \dots + (\alpha_n - \beta_n) e_n = 0$.

    By Corollary 2, the system $e_1, \dots, e_n$ is linearly independent, i.e. it represents the zero vector only trivially, so all coefficients of this linear combination must be zero:

    $\alpha_1 = \beta_1, \dots, \alpha_n = \beta_n$.

    Thus, any vector x is linearly expressed through the vectors of the given system in a unique way, as required.

    The theorem has been proven.
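    The unique representation in statement 1 can be computed explicitly: given a basis of $Q^n$, the coordinates of a vector are found by solving one linear system. A small sketch in exact arithmetic (the function name is ours; it assumes the input really is a basis):

```python
from fractions import Fraction

def coordinates(basis, x):
    """Solve x = c_1*e_1 + ... + c_n*e_n for the unique coordinate
    list c, assuming `basis` is a basis of Q^n.  Exact arithmetic."""
    n = len(basis)
    # augmented matrix [e_1 ... e_n | x]; the basis vectors are columns
    M = [[Fraction(basis[j][i]) for j in range(n)] + [Fraction(x[i])]
         for i in range(n)]
    for col in range(n):                       # Gauss-Jordan elimination
        piv = next(r for r in range(col, n) if M[r][col] != 0)
        M[col], M[piv] = M[piv], M[col]
        for r in range(n):
            if r != col and M[r][col] != 0:
                c = M[r][col] / M[col][col]
                M[r] = [a - c * b for a, b in zip(M[r], M[col])]
    return [M[i][n] / M[i][i] for i in range(n)]

e1, e2 = (1, 1), (1, -1)
print(coordinates([e1, e2], (3, 1)))   # x = 2*e1 + 1*e2
```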

    Section 5. Dimension of a vector space.

    Theorem 1. (On the number of vectors in linearly independent and generating systems of vectors.) The number of vectors in any linearly independent system of vectors does not exceed the number of vectors in any generating system of vectors of the same vector space.

    Proof. Let $b_1, b_2, \dots, b_k$ be an arbitrary linearly independent system of vectors and $a_1, a_2, \dots, a_m$ an arbitrary generating system. Assume that $k > m$.

    Because $a_1, \dots, a_m$ is a generating system, it represents any vector of the space, including the vector $b_1$. Attach it to this system. We obtain a linearly dependent and generating system of vectors:

    $b_1, a_1, a_2, \dots, a_m$.

    Then there is a vector of this system that is linearly expressed through the preceding vectors of the system; it cannot be $b_1$, so it is one of the vectors $a_i$, and by the lemma it can be removed from the system, the remaining system of vectors still being generating. After renumbering we obtain the generating system

    $b_1, a_1, \dots, a_{m-1}$.

    Because this system is generating, it represents the vector $b_2$, and, attaching it, we again obtain a linearly dependent and generating system: $b_1, b_2, a_1, \dots, a_{m-1}$.

    Then everything is repeated. There is a vector in this system that is linearly expressed through the preceding ones, and it cannot be $b_2$, because the source system $b_1, \dots, b_k$ is linearly independent and the vector $b_2$ cannot be linearly expressed through the vector $b_1$. This means it can only be one of the vectors $a_1, \dots, a_{m-1}$. Removing it from the system, we obtain, after renumbering, the generating system $b_1, b_2, a_1, \dots, a_{m-2}$. Continuing this process, after m steps we obtain a generating system of vectors $b_1, b_2, \dots, b_m$, where $m < k$ by our assumption. This system, as a generating system, also represents the vector $b_{m+1}$, which contradicts the linear independence of the system $b_1, \dots, b_k$.

    Theorem 1 is proven.

    Theorem 2. (On the number of vectors in a basis.) Any basis of a vector space contains the same number of vectors.

    Proof. Let $e_1, \dots, e_n$ and $f_1, \dots, f_m$ be two arbitrary bases of the vector space. Any basis is a linearly independent and generating system of vectors.

    Because the first system is linearly independent and the second is generating, Theorem 1 gives $n \le m$.

    Similarly, the second system is linearly independent and the first is generating, so $m \le n$. It follows that $n = m$, as required.

    Theorem 2 is proven.

    This theorem allows us to introduce the following definition.

    Definition. The dimension of a vector space V over a field K is the number of vectors in its basis.

    Notation: $\dim V$ or $\dim_K V$.

    Section 6. Existence of a basis of a vector space.

    Definition. A vector space is called finite-dimensional if it has a finite generating system of vectors.

    Comment. We will study only finite-dimensional vector spaces. Despite the fact that we already know quite a lot about the basis of a finite-dimensional vector space, we are not sure that the basis of such a space exists at all. All previously obtained properties were obtained under the assumption that the basis exists. The following theorem closes this question.

    Theorem. (On the existence of a basis for a finite-dimensional vector space.) Any finite-dimensional vector space has a basis.

    Proof. By hypothesis, there exists a finite generating system of vectors of the given finite-dimensional vector space V:

    $a_1, a_2, \dots, a_m$.

    Let us note at once that if the generating system of vectors is empty, i.e. contains no vectors, then by definition the vector space is taken to be the zero space, i.e. $V = \{0\}$. In this case, by definition, the basis of the zero vector space is the empty basis, and its dimension is by definition taken to be zero.

    Let now V be a nonzero vector space and $a_1, \dots, a_m$ a finite generating system of nonzero vectors. If it is linearly independent, then everything is proven, because a linearly independent and generating system of vectors of a vector space is its basis. If the given system of vectors is linearly dependent, then one of the vectors of this system is linearly expressed through the remaining ones and, by the lemma on a linearly dependent generating system, can be removed from the system, the remaining system of vectors still being generating.

    Renumber the remaining system of vectors: $a_1, \dots, a_{m-1}$. The reasoning is then repeated. If this system is linearly independent, then it is a basis. If not, then again there is a vector in this system that can be removed, and the remaining system will be generating.

    By repeating this process, we cannot be left with an empty system of vectors, because in the most extreme case we will arrive at a generating system of one non-zero vector, which is linearly independent, and therefore a basis. Therefore, at some step we arrive at a linearly independent and generating system of vectors, i.e. to the base.

    The theorem has been proven.
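    The pruning argument of this proof is directly algorithmic: scan the generating system and keep a vector only if it is not already in the span of the vectors kept so far. A sketch for systems in $Q^n$ (an illustrative helper, not from the lecture):

```python
from fractions import Fraction

def prune_to_basis(generators):
    """Keep each vector only if it is not in the span of those already
    kept; the survivors form a basis of the span of the input system."""
    kept, pivots = [], {}                # pivot position -> reduced row
    for v in generators:
        w = [Fraction(x) for x in v]
        while any(w):
            p = next(i for i, x in enumerate(w) if x != 0)
            if p not in pivots:
                break                    # new direction: v will be kept
            row = pivots[p]
            c = w[p] / row[p]
            w = [a - c * b for a, b in zip(w, row)]
        if any(w):
            pivots[p] = w
            kept.append(v)
        # otherwise v is linearly expressed through kept vectors: drop it
    return kept

gens = [(1, 0, 1), (2, 0, 2), (0, 1, 0), (1, 1, 1)]
print(prune_to_basis(gens))              # → [(1, 0, 1), (0, 1, 0)]
```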

    Lemma. (On systems of vectors in an n-dimensional vector space.) Let $\dim V = n$. Then:

    1. Any system of $n + 1$ vectors is linearly dependent.

    2. Any linearly independent system of $n$ vectors is its basis.

    Proof. 1) By the hypothesis of the lemma, the number of vectors in a basis is equal to n, and a basis is a generating system; therefore, by Theorem 1 of Section 5, the number of vectors in any linearly independent system cannot exceed n, i.e. any system of $n + 1$ vectors is linearly dependent.

    2) From what has just been proven it follows that any linearly independent system of n vectors of this vector space is maximal, and therefore a basis.

    The lemma is proven.

    Theorem (On complementation to a basis.) Any linearly independent system of vectors in a vector space can be complemented to a basis of this space.

    Proof. Let V be a vector space of dimension n and $a_1, \dots, a_k$ some linearly independent system of its vectors. Then $k \le n$.

    If $k = n$, then by the previous lemma this system is a basis and there is nothing to prove.

    If $k < n$, then this system is not a maximal linearly independent system (otherwise it would be a basis, which is impossible, because $k \ne n$). Therefore there is a vector $a_{k+1} \in V$ such that the system $a_1, \dots, a_k, a_{k+1}$ is linearly independent.

    If now $k + 1 = n$, then the system $a_1, \dots, a_{k+1}$ is a basis.

    If $k + 1 < n$, everything repeats. The process of enlarging the system cannot continue indefinitely, because at each step we obtain a linearly independent system of vectors of the space, and by the previous lemma the number of vectors in such a system cannot exceed the dimension of the space. Consequently, at some step we arrive at a basis of this space.

    The theorem has been proven.
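    The proof of the complement theorem is likewise constructive: keep adjoining vectors that preserve independence until the count reaches $n = \dim V$. In $Q^n$ the canonical columns always suffice as candidates; a sketch (helper names are ours):

```python
from fractions import Fraction

def extend_to_basis(independent, n):
    """Extend a linearly independent system in Q^n to a basis of Q^n
    by trying the canonical vectors e_1, ..., e_n in turn."""
    basis, pivots = [], {}               # pivot position -> reduced row

    def try_add(v):
        w = [Fraction(x) for x in v]
        while any(w):
            p = next(i for i, x in enumerate(w) if x != 0)
            if p not in pivots:
                pivots[p] = w
                basis.append(tuple(v))
                return True              # v enlarges the span: keep it
            row = pivots[p]
            c = w[p] / row[p]
            w = [a - c * b for a, b in zip(w, row)]
        return False                     # v already lies in the span

    for v in independent:
        assert try_add(v), "input system is linearly dependent"
    for i in range(n):                   # candidates: canonical columns
        if len(basis) == n:
            break
        try_add(tuple(1 if j == i else 0 for j in range(n)))
    return basis

print(extend_to_basis([(1, 1, 0)], 3))   # → [(1, 1, 0), (1, 0, 0), (0, 0, 1)]
```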

    Section 7. Example.

    1. Let K be an arbitrary field and $K^n$ the arithmetic vector space of columns of height n. Then $\dim K^n = n$. To prove this, consider the system of columns of this space

    $e_1 = (1, 0, \dots, 0)^T$, $e_2 = (0, 1, \dots, 0)^T$, …, $e_n = (0, 0, \dots, 1)^T$.

    This system is linearly independent and generating, and hence a basis: any column $x = (x_1, \dots, x_n)^T$ is represented uniquely as $x = x_1 e_1 + \dots + x_n e_n$.


    Definition. The basis $e_1, e_2, \dots, e_n$ of the arithmetic vector space of columns of height n is called the canonical, or natural, basis.
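    A short check that the canonical basis behaves as stated: the coordinates of a column in the canonical basis are just its own entries (an illustrative snippet):

```python
def canonical_basis(n):
    """The canonical (natural) basis of the space of columns of height n."""
    return [tuple(1 if j == i else 0 for j in range(n)) for i in range(n)]

e = canonical_basis(3)
print(e)          # → [(1, 0, 0), (0, 1, 0), (0, 0, 1)]

# any column decomposes with its own entries as the coordinates:
x = (5, -2, 7)
y = tuple(sum(x[i] * e[i][j] for i in range(3)) for j in range(3))
print(y == x)     # → True
```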

    Let V be a vector space over a field P, and let S be a system of vectors from V.

    Definition 1. A basis of the system of vectors S is an ordered linearly independent subsystem $b_1, b_2, \dots, b_r$ of the system S such that every vector of S is a linear combination of the vectors $b_1, b_2, \dots, b_r$.

    Definition 2. The rank of the system of vectors S is the number of vectors in a basis of the system S. The rank of the system S is denoted by $r = \operatorname{rank} S$.

    If $S = \{0\}$, then the system has no basis and it is assumed that $\operatorname{rank} S = 0$.

    Example 1. Let the system of vectors $a_1 = (1, 2)$, $a_2 = (2, 3)$, $a_3 = (3, 5)$, $a_4 = (1, 3)$ be given. The vectors $a_1, a_2$ form a basis of this system, since they are linearly independent (see Example 3.1) and $a_3 = a_1 + a_2$, $a_4 = 3a_1 - a_2$. The rank of this system of vectors is two.
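    The claims of this example are easy to verify mechanically; a sketch computing the rank by Gaussian elimination over the rationals (the `rank` helper is ours):

```python
from fractions import Fraction

def rank(vectors):
    """Rank of a finite system of vectors: the size of a maximal
    linearly independent subsystem (elimination over the rationals)."""
    pivots = {}                          # pivot position -> reduced row
    for v in vectors:
        w = [Fraction(x) for x in v]
        while any(w):
            p = next(i for i, x in enumerate(w) if x != 0)
            if p not in pivots:
                pivots[p] = w            # w contributes a new pivot
                break
            row = pivots[p]
            c = w[p] / row[p]
            w = [a - c * b for a, b in zip(w, row)]
    return len(pivots)

a1, a2, a3, a4 = (1, 2), (2, 3), (3, 5), (1, 3)
print(rank([a1, a2, a3, a4]))                            # → 2
# and indeed a3 = a1 + a2, a4 = 3*a1 - a2:
print(tuple(x + y for x, y in zip(a1, a2)) == a3)        # → True
print(tuple(3 * x - y for x, y in zip(a1, a2)) == a4)    # → True
```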

    Theorem 1 (theorem on bases). Let S be a finite system of vectors from V, $S \ne \{0\}$. Then the following statements hold.

    1° Any linearly independent subsystem of the system S can be extended to a basis of S.

    2° The system S has a basis.

    3° Any two bases of the system S contain the same number of vectors, i.e. the rank of the system does not depend on the choice of basis.

    4° If $r = \operatorname{rank} S$, then any r linearly independent vectors of S form a basis of the system S.

    5° If $r = \operatorname{rank} S$, then any $k > r$ vectors of the system S are linearly dependent.

    6° Any vector $a \in S$ is uniquely linearly expressed through the basis vectors, i.e. if $b_1, b_2, \dots, b_r$ is a basis of the system S, then

    $a = \alpha_1 b_1 + \alpha_2 b_2 + \dots + \alpha_r b_r$, $\alpha_1, \alpha_2, \dots, \alpha_r \in P$, (1)

    and this representation is unique.

    By 5°, a basis is a maximal linearly independent subsystem of the system S, and the rank of S is the number of vectors in such a subsystem.

    The representation of a vector a in the form (1) is called the decomposition of a with respect to the basis vectors, and the scalars $\alpha_1, \alpha_2, \dots, \alpha_r$ are called the coordinates of a in this basis.

    Proof. 1° Let $b_1, b_2, \dots, b_k$ be a linearly independent subsystem of the system S. If each vector of S is linearly expressed through the vectors of this subsystem, then by definition it is a basis of S.

    If there is a vector in S that is not linearly expressed through $b_1, b_2, \dots, b_k$, then denote it by $b_{k+1}$. Then the system $b_1, b_2, \dots, b_k, b_{k+1}$ is linearly independent. If each vector of S is linearly expressed through the vectors of this subsystem, then by definition it is a basis of S.

    If there is a vector in S that is not linearly expressed through $b_1, b_2, \dots, b_k, b_{k+1}$, then we repeat the reasoning. Continuing this process, we either arrive at a basis of S or increase the number of vectors in a linearly independent system by one. Since S contains a finite number of vectors, the second alternative cannot continue indefinitely, and at some step we obtain a basis of the system S.

    2° Let S be a finite system of vectors and $S \ne \{0\}$. Then S contains a vector $b_1 \ne 0$, which forms a linearly independent subsystem of S. By part 1° it can be extended to a basis of S. Thus S has a basis.

    3° Assume that the system S has two bases:

    $b_1, b_2, \dots, b_r$, (2)

    $c_1, c_2, \dots, c_s$. (3)

    By the definition of a basis, the system of vectors (2) is linearly independent and (2) ⊆ S. Further, by the definition of a basis, each vector of system (2) is a linear combination of the vectors of system (3). Then, by the main theorem on two systems of vectors, $r \le s$. Similarly one proves that $s \le r$. From these two inequalities it follows that $r = s$.

    4° Let $r = \operatorname{rank} S$ and let $a_1, a_2, \dots, a_r$ be a linearly independent subsystem of S. Let us show that it is a basis of the system S. If it is not a basis, then by part 1° it can be extended to a basis, and we obtain a basis $a_1, \dots, a_r, a_{r+1}, \dots, a_{r+t}$ containing more than r vectors. This contradicts what was proven in part 3°.

    5° If k vectors $a_1, a_2, \dots, a_k$ ($k > r$) of the system S were linearly independent, then by part 1° this system of vectors could be extended to a basis, and we would obtain a basis $a_1, \dots, a_k, a_{k+1}, \dots, a_{k+t}$ containing more than r vectors. This contradicts what was proven in part 3°.

    6° Let $b_1, b_2, \dots, b_r$ be a basis of the system S. By the definition of a basis, any vector $a \in S$ is a linear combination of the basis vectors:

    $a = \alpha_1 b_1 + \alpha_2 b_2 + \dots + \alpha_r b_r$.

    To prove the uniqueness of this representation, assume the opposite: that there is another representation

    $a = \beta_1 b_1 + \beta_2 b_2 + \dots + \beta_r b_r$.

    Subtracting the equalities term by term, we find

    $0 = (\alpha_1 - \beta_1) b_1 + (\alpha_2 - \beta_2) b_2 + \dots + (\alpha_r - \beta_r) b_r$.

    Since the basis $b_1, b_2, \dots, b_r$ is a linearly independent system, all the coefficients $\alpha_i - \beta_i = 0$, $i = 1, 2, \dots, r$. Therefore $\alpha_i = \beta_i$, $i = 1, 2, \dots, r$, and uniqueness is proven.