• Definition of linear dependence of rows and columns. Linear combinations of rows or columns of matrices. §4.9. Matrix rank

    Let A be a matrix of size m×n with columns A1, A2, …, An. A linear combination of the columns of the matrix is a column matrix of the form λ1A1 + λ2A2 + … + λnAn, where λ1, λ2, …, λn are some real or complex numbers called the coefficients of the linear combination. If in a linear combination we take all the coefficients equal to zero, then the linear combination is equal to the zero column matrix.

    The columns of the matrix are called linearly independent if their linear combination is equal to zero only when all the coefficients of the linear combination are equal to zero. The columns of the matrix are called linearly dependent if there is a set of numbers λ1, λ2, …, λn, among which at least one is non-zero, such that the linear combination of the columns with these coefficients is equal to zero.

    Similarly, the definitions of linear dependence and linear independence of matrix rows can be given. In what follows, all theorems are formulated for the columns of the matrix.
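    As a small numerical illustration of these definitions, the sketch below (Python with NumPy; the matrix and the coefficients are chosen here purely as an assumed example) exhibits a non-trivial linear combination of columns that equals the zero column, so the columns are linearly dependent.

```python
import numpy as np

# The third column is the sum of the first two, so the columns are linearly dependent.
A = np.array([[1.0, 2.0, 3.0],
              [0.0, 1.0, 1.0],
              [4.0, 1.0, 5.0]])

# Non-trivial coefficients: 1*A1 + 1*A2 + (-1)*A3 = 0.
coeffs = np.array([1.0, 1.0, -1.0])
print(A @ coeffs)                    # [0. 0. 0.]

# The rank is smaller than the number of columns, confirming the dependence.
print(np.linalg.matrix_rank(A))      # 2
```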

    Theorem 5

    If there is a zero column among the columns of the matrix, then the columns of the matrix are linearly dependent.

    Proof. Consider the linear combination in which the coefficients of all non-zero columns are equal to zero and the coefficient of the zero column is equal to one. It is equal to zero, and among the coefficients of the linear combination there is a non-zero one. Therefore, the columns of the matrix are linearly dependent.

    Theorem 6

    If some of the columns of a matrix are linearly dependent, then all of the columns of the matrix are linearly dependent.

    Proof. For definiteness, we will assume that the first k columns of the matrix are linearly dependent. Then, by the definition of linear dependence, there is a set of numbers λ1, λ2, …, λk, among which at least one is non-zero, such that

    λ1A1 + λ2A2 + … + λkAk = 0.

    Let us form a linear combination of all the columns of the matrix, including the remaining columns with zero coefficients:

    λ1A1 + λ2A2 + … + λkAk + 0·Ak+1 + … + 0·An.

    But this combination equals zero, while among its coefficients at least one is non-zero. Therefore, all the columns of the matrix are linearly dependent.

    Corollary. Any subset of linearly independent columns of a matrix is itself linearly independent. (This statement is easily proved by contradiction.)

    Theorem 7

    In order for the columns of a matrix to be linearly dependent, it is necessary and sufficient that at least one column of the matrix be a linear combination of the others.

    Proof.

    Necessity. Let the columns of the matrix be linearly dependent, that is, there is a set of numbers λ1, λ2, …, λn, among which at least one is different from zero, such that the linear combination of the columns with these coefficients is equal to zero:

    λ1A1 + λ2A2 + … + λnAn = 0.

    Let us assume for definiteness that λ1 ≠ 0. Then

    A1 = −(λ2/λ1)A2 − (λ3/λ1)A3 − … − (λn/λ1)An,

    that is, the first column is a linear combination of the rest.



    Sufficiency. Let at least one column of the matrix be a linear combination of the others, for example

    A1 = λ2A2 + λ3A3 + … + λnAn,

    where λ2, λ3, …, λn are some numbers.

    Then (−1)A1 + λ2A2 + λ3A3 + … + λnAn = 0, that is, a linear combination of the columns is equal to zero, and among its coefficients at least one (the coefficient −1 of A1) is different from zero.

    Let the rank of the matrix be r. Any non-zero minor of order r is called a basis minor. Rows and columns at the intersection of which a basis minor is located are called basis rows and basis columns.

    The concept of matrix rank is closely related to the concept of linear dependence (independence) of its rows or columns. In the future we will present the material for rows; for columns the presentation is similar.

    Let us denote the rows of the matrix A as follows:

    e1 = (a11, a12, …, a1n), e2 = (a21, a22, …, a2n), …, em = (am1, am2, …, amn).

    Two rows of a matrix are said to be equal if their corresponding elements are equal: ek = es if akj = asj, j = 1, 2, …, n.

    Arithmetic operations on matrix rows (multiplying a row by a number, adding rows) are introduced as operations carried out element by element:

    λek = (λak1, λak2, …, λakn); ek + es = (ak1 + as1, ak2 + as2, …, akn + asn).

    A row e is called a linear combination of the rows e1, e2, …, ek of the matrix if it is equal to the sum of the products of these rows by arbitrary real numbers:

    e = λ1e1 + λ2e2 + … + λkek.

    The rows of the matrix are called linearly dependent if there are numbers λ1, λ2, …, λm, not simultaneously equal to zero, such that a linear combination of the matrix rows is equal to the zero row:

    λ1e1 + λ2e2 + … + λmem = 0, where 0 = (0, 0, …, 0). (3.3)

    Theorem 3.3. The rows of a matrix are linearly dependent if and only if at least one row of the matrix is a linear combination of the others.

    □ Indeed, let, for definiteness, λm ≠ 0 in formula (3.3). Then

    em = −(λ1/λm)e1 − (λ2/λm)e2 − … − (λm−1/λm)em−1.

    Thus, the row em is a linear combination of the remaining rows. ■

    If a linear combination of rows (3.3) is equal to the zero row only when all the coefficients are equal to zero, then the rows are called linearly independent.

    Theorem 3.4 (on the rank of a matrix). The rank of a matrix is equal to the maximum number of its linearly independent rows (columns), through which all of its other rows (columns) can be linearly expressed.

    □ Let a matrix A of size m×n have rank r (r ≤ min(m, n)). This means that there exists a non-zero minor of order r. Any non-zero minor of order r will be called a basis minor.

    For definiteness, let the basis minor be the leading (corner) minor of order r, and denote it by D. Then the first r rows of the matrix are linearly independent. Let us assume the opposite, that is, that one of these rows, for example the r-th, is a linear combination of the others: er = λ1e1 + λ2e2 + … + λr−1er−1. Subtract from the elements of the r-th row the elements of the 1st row multiplied by λ1, then the elements of the 2nd row multiplied by λ2, …, and the elements of the (r−1)-th row multiplied by λr−1. Based on property 8, such transformations of the matrix do not change its determinant D, but since the r-th row now consists only of zeros, D = 0 — a contradiction. Therefore, our assumption that the first r rows of the matrix are linearly dependent is false.

    Let us call the rows e1, e2, …, er the basis rows. Let us show that any (r + 1) rows of the matrix are linearly dependent, i.e. that any row is expressed in terms of the basis rows.

    Consider the minor of order (r + 1) obtained by supplementing the basis minor with the elements of some row i and some column j. This minor is equal to zero, since the rank of the matrix is r, so any minor of higher order is zero.

    Expanding it along the elements of the last (added) column, we get

    a1jA1j + a2jA2j + … + arjArj + aijAij = 0,

    where the last algebraic complement Aij coincides with the basis minor D and is therefore different from zero, i.e. Aij ≠ 0.


    Consider an arbitrary, not necessarily square, matrix A of size m×n.

    Matrix rank.

    The concept of matrix rank is associated with the concept of linear dependence (independence) of the rows (columns) of the matrix. Let us consider this concept for rows; for columns the treatment is similar.

    Let us denote the rows of the matrix A as follows:

    e1 = (a11, a12, …, a1n); e2 = (a21, a22, …, a2n); …; em = (am1, am2, …, amn).

    Two rows are said to be equal if their corresponding elements are equal: ek = es if akj = asj, j = 1, 2, …, n.

    Arithmetic operations on matrix rows (addition, multiplication by a number) are introduced as operations carried out element by element:

    λek = (λak1, λak2, …, λakn);

    ek + es = ((ak1 + as1), (ak2 + as2), …, (akn + asn)).

    A row e is called a linear combination of the rows e1, e2, …, ek if it is equal to the sum of the products of these rows by arbitrary real numbers:

    e = λ1e1 + λ2e2 + … + λkek.

    The rows e1, e2, …, em are called linearly dependent if there exist real numbers λ1, λ2, …, λm, not all equal to zero, such that the linear combination of these rows is equal to the zero row: λ1e1 + λ2e2 + … + λmem = 0, where 0 = (0, 0, …, 0). (1)

    If the linear combination (1) is equal to the zero row only when all the coefficients λi are equal to zero (λ1 = λ2 = … = λm = 0), then the rows e1, e2, …, em are called linearly independent.

    Theorem 1. In order for the rows e1, e2, …, em to be linearly dependent, it is necessary and sufficient that one of these rows be a linear combination of the remaining rows.

    Proof. Necessity. Let the rows e1, e2, …, em be linearly dependent. Let, for definiteness, λm ≠ 0 in (1); then

    em = −(λ1/λm)e1 − (λ2/λm)e2 − … − (λm−1/λm)em−1.

    Thus, the row em is a linear combination of the remaining rows. Q.E.D.

    Sufficiency. Let one of the rows, for example em, be a linear combination of the remaining rows. Then there exist numbers λ1, λ2, …, λm−1 such that the equality em = λ1e1 + λ2e2 + … + λm−1em−1 holds, which can be rewritten in the form

    λ1e1 + λ2e2 + … + λm−1em−1 + (−1)em = 0,

    where at least one of the coefficients, namely (−1), is not equal to zero. That is, the rows are linearly dependent. Q.E.D.

    Definition. A minor of order k of a matrix A of size m×n is a determinant of order k whose elements lie at the intersection of any k rows and any k columns of the matrix A (k ≤ min(m, n)).

    Example. For a square matrix of order 3, the minors of the 1st order are its individual elements, the minors of the 2nd order are the determinants formed by any two rows and any two columns, and the only minor of the 3rd order is the determinant of the matrix itself. Thus a matrix of order 3 has 9 minors of the 1st order, 9 minors of the 2nd order and 1 minor of the 3rd order (the determinant of this matrix).
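    For illustration, here is a short Python sketch (the matrix and the helper function minors are assumed for the example) that enumerates all minors of a given order as determinants of k×k submatrices; for a 3×3 matrix it reproduces the counts 9, 9 and 1 mentioned above.

```python
from itertools import combinations
import numpy as np

def minors(A, k):
    """Return all k-th order minors of A as a list of determinants."""
    m, n = A.shape
    result = []
    for rows in combinations(range(m), k):
        for cols in combinations(range(n), k):
            sub = A[np.ix_(rows, cols)]          # k x k submatrix
            result.append(np.linalg.det(sub))
    return result

A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0],
              [7.0, 8.0, 9.0]])

print(len(minors(A, 1)), len(minors(A, 2)), len(minors(A, 3)))   # 9 9 1
print(round(minors(A, 3)[0]))   # 0, so the rank of this particular matrix is less than 3
```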

    Definition. The rank of a matrix A is the highest order of the non-zero minors of this matrix. Notation: rg A or r(A).

    Matrix rank properties.

    1) The rank of a matrix A of size m×n does not exceed the smaller of its dimensions, i.e.

    r(A) ≤ min(m, n).

    2) r(A) = 0 if and only if all elements of the matrix are equal to 0, i.e. A = 0.

    3) For a square matrix A of order n, r(A) = n if and only if A is non-singular.

    (The rank of a diagonal matrix is equal to the number of its non-zero diagonal elements.)

    4) If the rank of a matrix is equal to r, then the matrix has at least one minor of order r that is not equal to zero, and all minors of higher orders are equal to zero.

    The following relations hold for matrix ranks:

    1) r(A + B) ≤ r(A) + r(B);

    2) r(AB) ≤ min(r(A), r(B));

    3) r(A + B) ≥ |r(A) − r(B)|;

    4) r(AᵀA) = r(A);

    5) r(AB) = r(A), if B is a square non-singular matrix;

    6) r(AB) ≥ r(A) + r(B) − n, where n is the number of columns of the matrix A (equal to the number of rows of the matrix B).
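    These relations are easy to spot-check numerically. The following minimal Python sketch (random example matrices chosen here, not taken from the text) verifies relations 1), 2), 4) and 6) for one instance using NumPy.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 5))
B = rng.standard_normal((4, 5))
C = rng.standard_normal((5, 3))
rk = np.linalg.matrix_rank

assert rk(A + B) <= rk(A) + rk(B)          # relation 1)
assert rk(A @ C) <= min(rk(A), rk(C))      # relation 2)
assert rk(A.T @ A) == rk(A)                # relation 4)
assert rk(A @ C) >= rk(A) + rk(C) - 5      # relation 6), n = 5 columns of A
print("all checked rank relations hold for this example")
```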

    Definition. A non-zero minor of order r(A) is called a basis minor. (A matrix A may have several basis minors.) Rows and columns at the intersection of which a basis minor is located are called, respectively, basis rows and basis columns.

    Theorem 2 (on the basis minor). The basis rows (columns) are linearly independent. Any row (any column) of the matrix A is a linear combination of the basis rows (columns).

    Proof. (For rows.) If the basis rows were linearly dependent, then by Theorem 1 one of these rows would be a linear combination of the other basis rows; then, without changing the value of the basis minor, we could subtract the indicated linear combination from this row and obtain a zero row, and this contradicts the fact that the basis minor is different from zero. Thus, the basis rows are linearly independent.

    Let us prove that any row of the matrix A is a linear combination of the basis rows. Since under permutations of rows (columns) a determinant keeps the property of being equal (or not equal) to zero, we may assume without loss of generality that the basis minor is located in the upper left corner of the matrix A, i.e. on the first r rows and the first r columns. Let 1 ≤ j ≤ n, 1 ≤ i ≤ m. Let us show that the determinant of order (r + 1) obtained by bordering the basis minor with the elements of row i and column j is equal to zero.

    If j ≤ r or i ≤ r, then this determinant is equal to zero, because it will have two identical columns or two identical rows.

    If j > r and i > r, then this determinant is a minor of order (r + 1) of the matrix A. Since the rank of the matrix is r, any minor of higher order is equal to 0.

    Expanding it according to the elements of the last (added) column, we get

    a1jA1j + a2jA2j + … + arjArj + aijAij = 0, where the last algebraic complement Aij coincides with the basis minor Mr and therefore Aij = Mr ≠ 0.

    Dividing the last equality by Aij, we can express the element aij as a linear combination:

    aij = c1a1j + c2a2j + … + crarj, where ck = −Akj/Aij, k = 1, 2, …, r.

    Let us fix the value of i (i > r). Then for any j (j = 1, 2, …, n) the elements of the i-th row ei are linearly expressed through the elements of the rows e1, e2, …, er, i.e. the i-th row is a linear combination of the basis rows:

    ei = c1e1 + c2e2 + … + crer. Q.E.D.

    Theorem 3. (necessary and sufficient condition for the determinant to be equal to zero). In order for the nth order determinant D to be equal to zero, it is necessary and sufficient that its rows (columns) be linearly dependent.

    Proof. Necessity. If the determinant D of order n is equal to zero, then the basis minor of its matrix is of order r < n. Then, by Theorem 2 on the basis minor, at least one of the rows is a linear combination of the basis rows, and hence of the other rows. Then, by Theorem 1, the rows of the determinant are linearly dependent.

    Sufficiency. If the rows of D are linearly dependent, then by Theorem 1 one row is a linear combination of the remaining rows. Subtracting the indicated linear combination from this row without changing the value of D, we obtain a zero row. Therefore, by the properties of determinants, D = 0. Q.E.D.
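    A quick numerical illustration of Theorem 3 (assumed example matrix): a square matrix whose third row is the sum of the first two has zero determinant and deficient rank.

```python
import numpy as np

D = np.array([[1.0, 2.0, 0.0],
              [3.0, 1.0, 4.0],
              [4.0, 3.0, 4.0]])           # row 3 = row 1 + row 2

print(round(np.linalg.det(D), 10))        # 0.0
print(np.linalg.matrix_rank(D))           # 2 < 3
```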

    Theorem 4. During elementary transformations, the rank of the matrix does not change.

    Proof. As was shown when considering the properties of determinants, under elementary transformations of square matrices their determinants either do not change, or are multiplied by a non-zero number, or change sign. Hence the highest order of the non-zero minors of the original matrix is preserved, i.e. the rank of the matrix does not change. Q.E.D.

    If r(A)=r(B), then A and B are equivalent: A~B.

    Theorem 5. Using elementary transformations, any matrix can be reduced to echelon (step) form. A matrix is called an echelon matrix if it has the form

    A = ( a11 a12 … a1r … a1k )
        (  0  a22 … a2r … a2k )
        (  …   …  …  …  …  …  )
        (  0   0  … arr … ark )
        (  0   0  …  0  …  0  )
        (  …   …  …  …  …  …  )
        (  0   0  …  0  …  0  ),

    where aii ≠ 0, i = 1, 2, …, r; r ≤ k.

    The condition r≤k can always be achieved by transposing.

    Theorem 6. The rank of an echelon matrix is equal to the number of its non-zero rows.

    That is, the rank of the echelon matrix above is equal to r, because it has a non-zero minor of order r: the minor formed by the first r rows and the first r columns equals a11a22…arr ≠ 0.

    Note that the rows and columns of a matrix can be considered as arithmetic vectors of dimensions n and m, respectively. Thus, a matrix of size m×n can be interpreted as a set of m n-dimensional or n m-dimensional arithmetic vectors. By analogy with geometric vectors, we introduce the concepts of linear dependence and linear independence of the rows and columns of a matrix.

    4.8.1. Definition. A row e is called a linear combination of the rows e1, e2, …, ek with coefficients λ1, λ2, …, λk if all the elements of this row satisfy the equality

    e = λ1e1 + λ2e2 + … + λkek,

    understood element by element.

    4.8.2. Definition. The rows e1, e2, …, em are called linearly dependent if there is a non-trivial linear combination of them equal to the zero row, i.e. there are numbers λ1, λ2, …, λm, not all equal to zero, such that

    λ1e1 + λ2e2 + … + λmem = 0.

    4.8.3. Definition. The rows e1, e2, …, em are called linearly independent if only their trivial linear combination is equal to the zero row, i.e.

    λ1e1 + λ2e2 + … + λmem = 0 only when λ1 = λ2 = … = λm = 0.

    4.8.4. Theorem. (Criterion for linear dependence of matrix rows)

    In order for the rows to be linearly dependent, it is necessary and sufficient that at least one of them is a linear combination of the others.

    Proof:

    Necessity. Let the rows e1, e2, …, em be linearly dependent; then there is a non-trivial linear combination of them equal to the zero row:

    λ1e1 + λ2e2 + … + λmem = 0.

    Without loss of generality, assume that the first of the coefficients of the linear combination is non-zero, λ1 ≠ 0 (otherwise the rows can be renumbered). Dividing this relation by λ1, we get

    e1 = −(λ2/λ1)e2 − (λ3/λ1)e3 − … − (λm/λ1)em,
    that is, the first row is a linear combination of the others.

    Sufficiency. Let one of the rows, for example e1, be a linear combination of the others:

    e1 = λ2e2 + λ3e3 + … + λmem.

    Then (−1)e1 + λ2e2 + λ3e3 + … + λmem = 0, that is, there is a non-trivial linear combination of the rows e1, e2, …, em equal to the zero row, which means the rows e1, e2, …, em are linearly dependent, which is what needed to be proven.

    Comment.

    Similar definitions and statements can be formulated for the columns of the matrix.

    §4.9. Matrix rank.

    4.9.1. Definition. A minor of order k of a matrix of size m×n is a determinant of order k with elements located at the intersection of some k of its rows and some k of its columns.

    4.9.2. Definition. A non-zero minor of order r of a matrix of size m×n is called a basis minor if all minors of the matrix of order r + 1 are equal to zero.

    Comment. A matrix can have several basis minors. Obviously, they will all be of the same order. It is also possible that for a matrix of size m×n a minor of order r = min(m, n) is different from zero while minors of order r + 1 simply do not exist; in this case the basis minor has order min(m, n).

    4.9.3. Definition. The rows (columns) that form the basis minor are called basic rows (columns).

    4.9.4. Definition. The rank of a matrix is the order of its basis minor. The rank of a matrix A is denoted by rg A or r(A).

    Comment.

    Note that since the rows and columns of a determinant are interchangeable (a determinant does not change under transposition), the rank of a matrix does not change when it is transposed.
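    This is easy to confirm numerically; a minimal check with NumPy on an arbitrary example matrix chosen here:

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0, 4.0],
              [2.0, 4.0, 6.0, 8.0],       # proportional to the first row
              [0.0, 1.0, 0.0, 1.0]])

print(np.linalg.matrix_rank(A), np.linalg.matrix_rank(A.T))   # 2 2
```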

    4.9.5. Theorem. (Invariance of matrix rank under elementary transformations)

    The rank of a matrix does not change during its elementary transformations.

    Without proof.

    4.9.6. Theorem. (About the basic minor).

    The basis rows (columns) are linearly independent. Any row (column) of the matrix can be represented as a linear combination of its basis rows (columns).

    Proof:

    Let us carry out the proof for rows; the proof of the statement for columns is analogous.

    Let the rank of a matrix A of size m×n be equal to r, and let M be a basis minor. Without loss of generality, we assume that the basis minor is located in the upper left corner (otherwise the matrix can be brought to this form by elementary transformations), i.e. it occupies the first r rows and the first r columns of A.

    Let us first prove the linear independence of the basis rows. We will carry out the proof by contradiction. Suppose that the basis rows are linearly dependent. Then, according to Theorem 4.8.4, one of these rows can be represented as a linear combination of the remaining basis rows. Therefore, if we subtract the indicated linear combination from this row, we get a zero row, which means that the minor M is equal to zero; this contradicts the definition of a basis minor. Thus we have obtained a contradiction, and the linear independence of the basis rows is proven.

    Let us now prove that every row of the matrix can be represented as a linear combination of the basis rows. If the number k of the row in question is from 1 to r, then obviously it can be represented as a linear combination with coefficient 1 for the row ek itself and zero coefficients for the remaining rows. Let us now show that if the row number k is from r + 1 to m, then it can also be represented as a linear combination of the basis rows. Consider the minor Mk,j of the matrix obtained from the basis minor M by adding row k and an arbitrary column j.

    Let us show that this minor Mk,j is equal to zero for any row number k from r + 1 to m and for any column number j from 1 to n.

    Indeed, if the column number j is from 1 to r, then we have a determinant with two identical columns, which is obviously equal to zero. If the column number j is from r + 1 to n and the row number k is from r + 1 to m, then Mk,j is a minor of the original matrix of higher order than the basis minor, which means that it is equal to zero by the definition of the basis minor. Thus, it has been proven that the minor Mk,j is zero for any row number k from r + 1 to m and for any column number j from 1 to n. Expanding it over the last column, we get:

    a1jA1 + a2jA2 + … + arjAr + akjA = 0.

    Here A1, A2, …, Ar, A are the corresponding algebraic complements of the elements of the last column. Note that A ≠ 0, since deleting the added row and column from Mk,j leaves precisely the basis minor M. Therefore, the elements of row k can be represented as a linear combination of the corresponding elements of the basis rows with coefficients independent of the column number j:

    akj = c1a1j + c2a2j + … + crarj, where cs = −As/A, s = 1, 2, …, r (j = 1, 2, …, n).

    Thus, we have proven that an arbitrary row of a matrix can be represented as a linear combination of its basis rows. The theorem has been proven.
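    As a numerical illustration of the basis minor theorem (assumed example matrix), the sketch below takes a rank-2 matrix whose basis minor sits in the first two rows and columns and recovers the coefficients that express the third row through the two basis rows, using a least-squares solve (exact here).

```python
import numpy as np

A = np.array([[1.0, 0.0, 2.0, 1.0],
              [0.0, 1.0, 1.0, 3.0],
              [2.0, 3.0, 7.0, 11.0]])     # row 3 = 2*row 1 + 3*row 2

print(np.linalg.matrix_rank(A))           # 2; the 2x2 corner minor is a basis minor

basis_rows = A[:2]                        # e1, e2
target = A[2]                             # e3
coeffs, residual, *_ = np.linalg.lstsq(basis_rows.T, target, rcond=None)
print(coeffs)                             # [2. 3.]
```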

    Lecture 13

    4.9.7. Theorem. (On the rank of a non-singular square matrix)

    In order for a square matrix to be non-singular, it is necessary and sufficient that the rank of the matrix be equal to the size of this matrix.

    Proof:

    Necessity. Let the square matrix A of size n be non-singular; then det A ≠ 0, and therefore the determinant of the matrix is a basis minor, i.e. r(A) = n.

    Sufficiency. Let r(A) = n; then the order of the basis minor is equal to the size of the matrix, therefore the basis minor is the determinant of the matrix A, i.e. det A ≠ 0 by the definition of a basis minor.

    Corollary.

    In order for a square matrix to be non-singular, it is necessary and sufficient that its rows be linearly independent.

    Proof:

    Necessity. Since the square matrix is non-singular, its rank is equal to the size of the matrix, r(A) = n; that is, the determinant of the matrix is a basis minor. Therefore, by Theorem 4.9.6 on the basis minor, the rows of the matrix are linearly independent.

    Sufficiency. Since all the rows of the matrix are linearly independent, its rank is not less than the size of the matrix, which means r(A) = n; therefore, by the previous Theorem 4.9.7, the matrix is non-singular.

    4.9.8. The method of bordering minors for finding the rank of a matrix.

    Note that part of this method has already been implicitly described in the proof of the basis minor theorem.

    4.9.8.1. Definition. A minor M′ is called a bordering minor relative to a minor M if it is obtained from the minor M by adding one new row and one new column of the original matrix.

    4.9.8.2. The procedure for finding the rank of a matrix using the bordering minors method.

      We find some non-zero minor of the matrix and take it as the current minor.

      We calculate all minors bordering it.

      If they are all equal to zero, then the current minor is a basis one, and the rank of the matrix is ​​equal to the order of the current minor.

      If among the bordering minors there is at least one non-zero, then it is considered current and the procedure continues.

    Using the method of bordering minors, we find the rank of the matrix

    .

    It is easy to specify the current non-zero second order minor, e.g.

    .

    We calculate the minors bordering it:




    Consequently, since all the bordering minors of the third order are equal to zero, the second-order minor under consideration is a basis minor, that is, r(A) = 2.

    Comment. From the example considered, it is clear that the method is quite labor-intensive. Therefore, in practice, the method of elementary transformations is much more often used, which will be discussed below.
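    To make the procedure concrete, here is a small Python sketch of the bordering-minors method on an assumed example matrix (the helper function bordering_rank is illustrative); it follows the steps listed in 4.9.8.2: find a non-zero minor, examine all minors bordering it, and stop when they all vanish.

```python
from itertools import product
import numpy as np

def bordering_rank(A, tol=1e-10):
    """Rank of A computed by the method of bordering minors."""
    m, n = A.shape
    # Step 1: find a non-zero first-order minor (a non-zero element).
    start = [(i, j) for i, j in product(range(m), range(n)) if abs(A[i, j]) > tol]
    if not start:
        return 0
    rows, cols = [start[0][0]], [start[0][1]]
    while True:
        # Step 2: compute all minors bordering the current one.
        for i, j in product(range(m), range(n)):
            if i in rows or j in cols:
                continue
            sub = A[np.ix_(rows + [i], cols + [j])]
            if abs(np.linalg.det(sub)) > tol:
                # Step 4: a non-zero bordering minor becomes the current one.
                rows.append(i)
                cols.append(j)
                break
        else:
            # Step 3: all bordering minors vanish, so the current minor is a basis minor.
            return len(rows)

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0],
              [1.0, 0.0, 1.0]])
print(bordering_rank(A), np.linalg.matrix_rank(A))   # 2 2
```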

    4.9.9. Finding the rank of a matrix using the method of elementary transformations.

    Based on Theorem 4.9.5, it can be argued that the rank of a matrix does not change under elementary transformations (that is, the ranks of equivalent matrices are equal). Therefore, the rank of the matrix is equal to the rank of the echelon matrix obtained from the original one by elementary transformations. The rank of an echelon matrix is obviously equal to the number of its non-zero rows.

    Let's determine the rank of the matrix

    by the method of elementary transformations.

    Let us reduce the matrix to echelon form:

    The number of non-zero rows of the resulting echelon matrix is three; therefore, r(A) = 3.
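    A small sketch of this method using SymPy on an assumed example matrix: Matrix.rref carries out the elementary row transformations, and the rank is read off as the number of non-zero rows of the reduced matrix (it agrees with Matrix.rank).

```python
import sympy as sp

A = sp.Matrix([[1, 2, 1, 0],
               [2, 4, 3, 1],
               [3, 6, 4, 1]])            # row 3 = row 1 + row 2

R, pivots = A.rref()                     # reduced row echelon form and pivot columns
nonzero_rows = sum(1 for i in range(R.rows) if any(R.row(i)))
print(R)
print(nonzero_rows, len(pivots), A.rank())   # 2 2 2
```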

    4.9.10. Rank of a system of linear space vectors.

    Consider a system of vectors x1, x2, …, xk of some linear space. If it is linearly dependent, then a linearly independent subsystem can be distinguished in it.

    4.9.10.1. Definition. The rank of a system of vectors of a linear space is the maximum number of linearly independent vectors of this system. The rank of the system of vectors x1, x2, …, xk is denoted by rg(x1, x2, …, xk).

    Comment. If a system of vectors is linearly independent, then its rank is equal to the number of vectors in the system.

    Let us formulate a theorem showing the connection between the concepts of the rank of a system of vectors in a linear space and the rank of a matrix.

    4.9.10.2. Theorem. (On the rank of a system of vectors in linear space)

    The rank of a system of vectors in a linear space is equal to the rank of a matrix whose columns or rows are the coordinates of vectors in some basis of the linear space.

    Without proof.

    Corollary.

    In order for a system of vectors in a linear space to be linearly independent, it is necessary and sufficient that the rank of the matrix, the columns or rows of which are the coordinates of vectors in a certain basis, is equal to the number of vectors in the system.

    The proof is obvious.
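    A minimal illustration of this corollary in Python (the three vectors and their coordinates are assumed for the example): the coordinate columns are collected into a matrix and its rank is compared with the number of vectors.

```python
import numpy as np

# Coordinate columns of three vectors in some basis of a 3-dimensional space.
V = np.column_stack([[1.0, 0.0, 1.0],
                     [0.0, 1.0, 1.0],
                     [1.0, 1.0, 2.0]])   # third vector = first + second

k = V.shape[1]                           # number of vectors in the system
print(np.linalg.matrix_rank(V), k)       # 2 3 -> rank < k, so the system is linearly dependent
```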

    4.9.10.3. Theorem (On the dimension of the linear span).

    The dimension of the linear span of the vectors x1, x2, …, xk of a linear space is equal to the rank of this system of vectors:

    dim L(x1, x2, …, xk) = rg(x1, x2, …, xk).

    Without proof.