Article

Forward Order Law for the Reflexive Inner Inverse of Multiple Matrix Products †

Wanna Zhou, Zhiping Xiong and Yingying Qin
School of Mathematics and Computational Science, Wuyi University, Jiangmen 529020, China
* Author to whom correspondence should be addressed.
† This work was supported by the Project for Characteristic Innovation of 2018 Guangdong University (No: 2018KTSCX234), the Natural Science Foundation of Guangdong (No: 2014A030313625), the Basic Theory and Scientific Research of the Science and Technology Project of Jiangmen City, China (No: 2020JC01010, 2021030102610005049), and the Joint Research and Development Fund of Wuyi University, Hong Kong and Macao (No: 2019WGALH20).
Submission received: 10 December 2021 / Revised: 4 March 2022 / Accepted: 4 March 2022 / Published: 10 March 2022
(This article belongs to the Special Issue Algebra, Logic and Applications)

Abstract

The generalized inverse has numerous important applications in theoretical research on matrices and in statistics. One of the core problems of the generalized inverse is finding necessary and sufficient conditions for the reverse (or the forward) order laws for the generalized inverse of matrix products. In this paper, by using the extremal ranks of the generalized Schur complement, some necessary and sufficient conditions are given for the forward order law $A_1\{1,2\}A_2\{1,2\}\cdots A_n\{1,2\}\subseteq(A_1A_2\cdots A_n)\{1,2\}$.

1. Introduction

Throughout this paper, all matrices are over the complex number field $\mathbb{C}$. $\mathbb{C}^{m\times n}$ and $\mathbb{C}^m$ denote the set of $m\times n$ complex matrices and the set of $m$-dimensional complex vectors, respectively. For a matrix $A\in\mathbb{C}^{m\times n}$, the symbols $r(A)$ and $A^*$ denote the rank and the conjugate transpose of $A$, respectively. As usual, the identity matrix of order $k$ is denoted by $I_k$, and the $m\times n$ matrix of all zero entries is denoted by $O_{m\times n}$ (if no confusion occurs, we will drop the subscripts).
For various applications, we introduce some generalized inverses of matrices. Let $A\in\mathbb{C}^{m\times n}$ and let $\eta\subseteq\{1,2,3,4\}$ be a nonempty set. If $X\in\mathbb{C}^{n\times m}$ satisfies equation $(i)$ below for all $i\in\eta$:
$$(1)\ AXA=A;\quad(2)\ XAX=X;\quad(3)\ (AX)^*=AX;\quad(4)\ (XA)^*=XA,$$
then $X$ is said to be an $\eta$-inverse of $A$, denoted by $X=A^{(\eta)}$. The set of all $\eta$-inverses of $A$ is denoted by $A\{\eta\}$. For example, $X$ is called a $\{1\}$-inverse or an inner inverse of $A$ if it satisfies equation (1), which is denoted by $X=A^{(1)}\in A\{1\}$. An $n\times m$ matrix $X$ in the set $A\{1,2\}$ is called a $\{1,2\}$-inverse or a reflexive inner inverse of $A$ and is denoted by $X=A^{(1,2)}\in A\{1,2\}$. The unique $\{1,2,3,4\}$-inverse of $A$ is denoted by $X=A^{(1,2,3,4)}=A^{\dagger}$, which is also called the Moore-Penrose inverse of $A$. As is well known, each kind of $\eta$-inverse has its own properties and applications; see [1,2,3,4].
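As a quick numerical illustration, NumPy's `numpy.linalg.pinv` computes the Moore-Penrose inverse $A^{\dagger}$, so it should satisfy all four equations above; the following sketch (matrix size and seed are arbitrary choices, not from the paper) checks this for a random rectangular matrix:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 3))
X = np.linalg.pinv(A)  # the Moore-Penrose inverse, i.e., the unique {1,2,3,4}-inverse

# The four Penrose equations:
assert np.allclose(A @ X @ A, A)             # (1) AXA = A
assert np.allclose(X @ A @ X, X)             # (2) XAX = X
assert np.allclose((A @ X).conj().T, A @ X)  # (3) (AX)* = AX
assert np.allclose((X @ A).conj().T, X @ A)  # (4) (XA)* = XA
```

Dropping equations (3) and (4) from the checks is exactly what distinguishes a reflexive inner inverse from the Moore-Penrose inverse: a $\{1,2\}$-inverse only has to pass the first two assertions.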
Theories and computations of the reverse (or the forward) order laws for generalized inverses are important in many branches of the applied sciences, such as non-linear control theory [2], matrix analysis [1,4], statistics [5,6], and numerical linear algebra [1,5,7]. Suppose that $A_i\in\mathbb{C}^{m\times m}$, $i=1,2,\ldots,n$, and $b\in\mathbb{C}^m$. The least squares (LS) problem
$$\min_{x\in\mathbb{C}^m}\|(A_1A_2\cdots A_n)x-b\|_2$$
arises in many practical scientific problems; see [4,5,6,7,8,9]. If the above LS problem is consistent, then any solution $x$ can be expressed as $x=(A_1A_2\cdots A_n)^{(1,j,k)}b$, where $\{j,k\}\subseteq\{2,3,4\}$. For example, the minimum norm solution has the form $x=(A_1A_2\cdots A_n)^{(1,4)}b$. The unique minimal norm least squares solution of the above LS problem is $x=(A_1A_2\cdots A_n)^{\dagger}b$.
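As a small numerical aside (sizes, ranks, and seed below are illustrative choices, not from the paper): for a rank-deficient product, `numpy.linalg.lstsq` returns the minimal norm least squares solution, which should coincide with $(A_1A_2)^{\dagger}b$.

```python
import numpy as np

rng = np.random.default_rng(1)
m = 5
A1 = rng.standard_normal((m, m))
A2 = rng.standard_normal((m, 3)) @ rng.standard_normal((3, m))  # rank-deficient factor

b = rng.standard_normal(m)
M = A1 @ A2                                     # the product A1 A2, rank <= 3
x_pinv = np.linalg.pinv(M) @ b                  # x = (A1 A2)^dagger b
x_lstsq = np.linalg.lstsq(M, b, rcond=None)[0]  # minimal norm least squares solution
assert np.allclose(x_pinv, x_lstsq)
```

The question the paper studies is when such a generalized inverse of the product can instead be assembled from generalized inverses of the individual factors.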
One of the core problems related to the above LS problem is identifying the conditions under which the following reverse order laws hold:
$$A_n^{(1,j,\ldots,k)}A_{n-1}^{(1,j,\ldots,k)}\cdots A_1^{(1,j,\ldots,k)}\in(A_1A_2\cdots A_n)\{1,j,\ldots,k\}.\qquad(1)$$
Another core problem is identifying the conditions under which the following forward order laws hold:
$$A_1^{(1,j,\ldots,k)}A_2^{(1,j,\ldots,k)}\cdots A_n^{(1,j,\ldots,k)}\in(A_1A_2\cdots A_n)\{1,j,\ldots,k\}.\qquad(2)$$
The reverse order laws for the generalized inverse of multiple matrix products (1) yield a class of interesting problems that are fundamental in the theory of the generalized inverse of matrices; see [1,4,5,6]. As a hot topic in current matrix research, the necessary and sufficient conditions for the reverse order laws for the generalized inverse of matrix products are useful in both theoretical study and practical scientific computing; hence, this has attracted considerable attention and several interesting results have been obtained; see [10,11,12,13,14,15,16,17,18,19,20,21,22,23].
The forward order law for the generalized inverse of multiple matrix products (2) originally arose in the study of the inverse of multiple matrix Kronecker products; see [1,4]. Recently, Xiong et al. studied the forward order laws for some generalized inverses of multiple matrix products by using the maximal and minimal ranks of the generalized Schur complement; see [24,25,26,27]. To our knowledge, the forward order law for the reflexive inner inverse of multiple matrix products has not yet been studied in the literature. In this paper, by using the extremal ranks of the generalized Schur complement, we will provide some necessary and sufficient conditions for the forward order law:
$$A_1\{1,2\}A_2\{1,2\}\cdots A_n\{1,2\}\subseteq(A_1A_2\cdots A_n)\{1,2\}.\qquad(3)$$
As is well known, the most widely used generalized inverses of matrices, such as the M-P inverse, the Drazin inverse, the group inverse, etc., are special $\{1,2\}$-inverses. Therefore, the forward order law for the $\{1,2\}$-inverse of a multiple matrix product studied in this paper is broad and general: it contains the forward order laws for the above-mentioned generalized inverses.
The main tools of the later discussion are the following lemmas.
Lemma 1
([1]). Let $A\in\mathbb{C}^{m\times n}$ and $X\in\mathbb{C}^{n\times m}$. Then,
$$X\in A\{1,2\}\iff AXA=A\ \text{and}\ r(X)\le r(A).$$
Lemma 2
([28]). Let $A\in\mathbb{C}^{m\times n}$, $B\in\mathbb{C}^{m\times k}$, $C\in\mathbb{C}^{l\times n}$, and $D\in\mathbb{C}^{l\times k}$. Then,
$$\max_{A^{(1,2)}}r(D-CA^{(1,2)}B)=\min\left\{r(A)+r(D),\ r(C,\ D),\ r\begin{pmatrix}B\\D\end{pmatrix},\ r\begin{pmatrix}A&B\\C&D\end{pmatrix}-r(A)\right\},$$
where $A^{(1,2)}\in A\{1,2\}$.
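Lemma 2 can be probed numerically. The sketch below is an illustration with arbitrarily chosen sizes; the parametrization $X=(A^{\dagger}+(I-A^{\dagger}A)Y)A(A^{\dagger}+Z(I-AA^{\dagger}))$ used to generate $\{1,2\}$-inverses is a known device, not taken from this paper. Every sampled $r(D-CA^{(1,2)}B)$ must stay below the minimum on the right-hand side.

```python
import numpy as np

rng = np.random.default_rng(2)
r = np.linalg.matrix_rank
m, n, k, l = 4, 5, 4, 4
A = rng.standard_normal((m, 2)) @ rng.standard_normal((2, n))  # r(A) = 2
B = rng.standard_normal((m, k))
C = rng.standard_normal((l, n))
D = rng.standard_normal((l, 1)) @ rng.standard_normal((1, k))  # r(D) = 1

def random_12_inverse(A, rng):
    # Generate a {1,2}-inverse as X = (A^+ + (I - A^+A)Y) A (A^+ + Z(I - AA^+)).
    Ap = np.linalg.pinv(A)
    n_, m_ = Ap.shape
    Y = rng.standard_normal((n_, m_))
    Z = rng.standard_normal((n_, m_))
    return (Ap + (np.eye(n_) - Ap @ A) @ Y) @ A @ (Ap + Z @ (np.eye(m_) - A @ Ap))

bound = min(r(A) + r(D),
            r(np.hstack([C, D])),
            r(np.vstack([B, D])),
            r(np.block([[A, B], [C, D]])) - r(A))

for _ in range(100):
    X = random_12_inverse(A, rng)
    assert np.allclose(A @ X @ A, A) and np.allclose(X @ A @ X, X)  # X is a {1,2}-inverse
    assert r(D - C @ X @ B) <= bound
```

Here the binding term is $r(A)+r(D)=3$, so the check is non-trivial even though $D-CXB$ is a $4\times 4$ matrix.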
Lemma 3
([27]). Let $A_i\in\mathbb{C}^{m\times m}$, $i=1,2,\ldots,n$. Then,
$$(n-1)m+r(A_1A_2\cdots A_n)\ge r(A_1)+r(A_2)+\cdots+r(A_n).$$
Lemma 4
([29]). Let $A$, $B$ have suitable sizes. Then,
$$r(A,\ B)\le r(A)+r(B)\quad\text{and}\quad r(A,\ B)\ge\max\{r(A),\ r(B)\}.$$
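Both rank bounds are easy to sanity-check numerically; the sketch below (an illustration with arbitrarily chosen sizes and ranks) verifies Lemma 3 and Lemma 4 on random low-rank matrices.

```python
import numpy as np

rng = np.random.default_rng(3)
r = np.linalg.matrix_rank
m = 6
# Three m x m matrices of ranks 2, 3, and 4:
As = [rng.standard_normal((m, q)) @ rng.standard_normal((q, m)) for q in (2, 3, 4)]
n = len(As)

# Lemma 3 (generalized Sylvester rank inequality):
# (n-1)m + r(A1 A2 ... An) >= r(A1) + ... + r(An)
prod = As[0] @ As[1] @ As[2]
assert (n - 1) * m + r(prod) >= sum(r(A) for A in As)

# Lemma 4: max{r(A), r(B)} <= r([A, B]) <= r(A) + r(B)
A, B = As[0], As[1]
AB = np.hstack([A, B])
assert max(r(A), r(B)) <= r(AB) <= r(A) + r(B)
```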

2. Main Results

In this section, by using the extremal ranks of the generalized Schur complement, we will give some necessary and sufficient conditions for the forward order law for the reflexive inner inverse of multiple matrix products (3).
Let
$$S_{A_1A_2\cdots A_n}=A_1A_2\cdots A_n-A_1A_2\cdots A_nX_1X_2\cdots X_nA_1A_2\cdots A_n,\qquad(4)$$
where $A_i\in\mathbb{C}^{m\times m}$, $X_i\in A_i\{1,2\}$, $i=1,2,\ldots,n$. From Lemma 1, we know that (3) holds if and only if
$$S_{A_1A_2\cdots A_n}=O\quad\text{and}\quad r(X_1X_2\cdots X_n)\le r(A_1A_2\cdots A_n)\qquad(5)$$
hold for any $X_i\in A_i\{1,2\}$, $i=1,2,\ldots,n$, which are respectively equivalent to the following two identities:
$$\max_{X_1,X_2,\ldots,X_n}r(S_{A_1A_2\cdots A_n})=0\qquad(6)$$
and
$$\max_{X_1,X_2,\ldots,X_n}r(X_1X_2\cdots X_n)\le r(A_1A_2\cdots A_n).\qquad(7)$$
Hence, we can present equivalent conditions for the forward order law (3) once concrete expressions for the maximal ranks involved in (6) and (7) are derived. The relevant results are contained in the following three theorems.
Theorem 1.
Let $A_i\in\mathbb{C}^{m\times m}$, $X_i\in A_i\{1,2\}$, $i=1,2,\ldots,n$, and let $S_{A_1A_2\cdots A_n}$ be as in (4). Then,
$$\max_{X_1,X_2,\ldots,X_n}r(S_{A_1A_2\cdots A_n})=\min\Big\{r(A_1A_2\cdots A_n),\ r(A_nA_{n-1}\cdots A_1-A_1A_2\cdots A_n)+r(A_1A_2\cdots A_n)+(n-1)m-\sum_{l=1}^{n}r(A_l)\Big\}.\qquad(8)$$
Proof. 
Suppose that $X_0=I_m$. For $1\le i\le n-1$, we first prove the following:
$$\max_{X_{n-i}}r(A_nA_{n-1}\cdots A_{n-i+1}-A_1A_2\cdots A_nX_1X_2\cdots X_{n-i})=\min\Big\{r(A_1A_2\cdots A_nX_1X_2\cdots X_{n-i-1},\ A_nA_{n-1}\cdots A_{n-i+1}),\ r(A_nA_{n-1}\cdots A_{n-i}-A_1A_2\cdots A_nX_1X_2\cdots X_{n-i-1})+m-r(A_{n-i})\Big\}.\qquad(9)$$
In fact, by Lemma 2, we have the equations below:
$$\begin{aligned}\max_{X_{n-i}}r(A_nA_{n-1}\cdots A_{n-i+1}-A_1A_2\cdots A_nX_1X_2\cdots X_{n-i})=\min\Big\{&r(A_{n-i})+r(A_nA_{n-1}\cdots A_{n-i+1}),\ r(A_1A_2\cdots A_nX_1X_2\cdots X_{n-i-1},\ A_nA_{n-1}\cdots A_{n-i+1}),\\ &r\begin{pmatrix}I_m\\A_nA_{n-1}\cdots A_{n-i+1}\end{pmatrix},\ r\begin{pmatrix}A_{n-i}&I_m\\A_1A_2\cdots A_nX_1X_2\cdots X_{n-i-1}&A_nA_{n-1}\cdots A_{n-i+1}\end{pmatrix}-r(A_{n-i})\Big\}\\=\min\Big\{&r(A_1A_2\cdots A_nX_1X_2\cdots X_{n-i-1},\ A_nA_{n-1}\cdots A_{n-i+1}),\ r(A_nA_{n-1}\cdots A_{n-i}-A_1A_2\cdots A_nX_1X_2\cdots X_{n-i-1})+m-r(A_{n-i})\Big\},\end{aligned}$$
where the second equality holds since, by Lemma 4, we have:
$$r(A_1A_2\cdots A_nX_1X_2\cdots X_{n-i-1},\ A_nA_{n-1}\cdots A_{n-i+1})\le r(A_1A_2\cdots A_nX_1X_2\cdots X_{n-i-1})+r(A_nA_{n-1}\cdots A_{n-i+1})\le r(A_{n-i})+r(A_nA_{n-1}\cdots A_{n-i+1})$$
and
$$r(A_1A_2\cdots A_nX_1X_2\cdots X_{n-i-1},\ A_nA_{n-1}\cdots A_{n-i+1})\le m=r\begin{pmatrix}I_m\\A_nA_{n-1}\cdots A_{n-i+1}\end{pmatrix}.$$
More specifically, when $i=n-1$, we have the following:
$$\max_{X_1}r(A_nA_{n-1}\cdots A_2-A_1A_2\cdots A_nX_1)=\min\Big\{r(A_1A_2\cdots A_n,\ A_nA_{n-1}\cdots A_2),\ r(A_nA_{n-1}\cdots A_1-A_1A_2\cdots A_n)+m-r(A_1)\Big\}.\qquad(10)$$
We now prove (8). Again, by Lemma 2, we have the following equations:
$$\begin{aligned}\max_{X_n}r(S_{A_1A_2\cdots A_n})&=\max_{X_n}r(A_1A_2\cdots A_n-A_1A_2\cdots A_nX_1X_2\cdots X_nA_1A_2\cdots A_n)\\&=\min\Big\{r(A_n)+r(A_1A_2\cdots A_n),\ r(A_1A_2\cdots A_nX_1X_2\cdots X_{n-1},\ A_1A_2\cdots A_n),\ r\begin{pmatrix}A_1A_2\cdots A_n\\A_1A_2\cdots A_n\end{pmatrix},\ r\begin{pmatrix}A_n&A_1A_2\cdots A_n\\A_1A_2\cdots A_nX_1X_2\cdots X_{n-1}&A_1A_2\cdots A_n\end{pmatrix}-r(A_n)\Big\}\\&=\min\Big\{r(A_1A_2\cdots A_n),\ r(A_n-A_1A_2\cdots A_nX_1X_2\cdots X_{n-1})+r(A_1A_2\cdots A_n)-r(A_n)\Big\},\end{aligned}\qquad(11)$$
where the third equality holds since, by Lemma 4, we have:
$$r\begin{pmatrix}A_1A_2\cdots A_n\\A_1A_2\cdots A_n\end{pmatrix}=r(A_1A_2\cdots A_n)\le r(A_1A_2\cdots A_nX_1X_2\cdots X_{n-1},\ A_1A_2\cdots A_n)$$
and
$$r\begin{pmatrix}A_1A_2\cdots A_n\\A_1A_2\cdots A_n\end{pmatrix}=r(A_1A_2\cdots A_n)\le r(A_n)+r(A_1A_2\cdots A_n).$$
Combining (9) with (11), we obtain the following equations:
$$\begin{aligned}\max_{X_{n-1},X_n}r(S_{A_1A_2\cdots A_n})&=\min\Big\{r(A_1A_2\cdots A_n),\ \max_{X_{n-1}}r(A_n-A_1A_2\cdots A_nX_1X_2\cdots X_{n-1})+r(A_1A_2\cdots A_n)-r(A_n)\Big\}\\&=\min\Big\{r(A_1A_2\cdots A_n),\ r(A_1A_2\cdots A_nX_1X_2\cdots X_{n-2},\ A_n)+r(A_1A_2\cdots A_n)-r(A_n),\\&\qquad\quad r(A_nA_{n-1}-A_1A_2\cdots A_nX_1X_2\cdots X_{n-2})+r(A_1A_2\cdots A_n)+m-r(A_{n-1})-r(A_n)\Big\}\\&=\min\Big\{r(A_1A_2\cdots A_n),\ r(A_nA_{n-1}-A_1A_2\cdots A_nX_1X_2\cdots X_{n-2})+r(A_1A_2\cdots A_n)+m-r(A_{n-1})-r(A_n)\Big\},\end{aligned}$$
where the third equality holds since, by Lemma 4, we have:
$$r(A_1A_2\cdots A_nX_1X_2\cdots X_{n-2},\ A_n)\ge r(A_n).$$
In general, for $1\le i\le n-2$, we have the equation below:
$$\max_{X_{n-i},X_{n-i+1},\ldots,X_n}r(S_{A_1A_2\cdots A_n})=\min\Big\{r(A_1A_2\cdots A_n),\ r(A_nA_{n-1}\cdots A_{n-i}-A_1A_2\cdots A_nX_1X_2\cdots X_{n-i-1})+r(A_1A_2\cdots A_n)+im-\sum_{l=n-i}^{n}r(A_l)\Big\}.\qquad(12)$$
Equation (12) can be proved by using induction on $i$. In fact, for $i=1$, the statement in (12) has been proved. Assume the statement in (12) is true for $i-1$, that is,
$$\max_{X_{n-i+1},X_{n-i+2},\ldots,X_n}r(S_{A_1A_2\cdots A_n})=\min\Big\{r(A_1A_2\cdots A_n),\ r(A_nA_{n-1}\cdots A_{n-i+1}-A_1\cdots A_nX_1\cdots X_{n-i})+r(A_1\cdots A_n)+(i-1)m-\sum_{l=n-i+1}^{n}r(A_l)\Big\}.\qquad(13)$$
We now prove that (12) is also true for $i$. By (9) and (13), we have the equations below:
$$\begin{aligned}\max_{X_{n-i},X_{n-i+1},\ldots,X_n}r(S_{A_1A_2\cdots A_n})&=\min\Big\{r(A_1A_2\cdots A_n),\ \max_{X_{n-i}}r(A_n\cdots A_{n-i+1}-A_1\cdots A_nX_1\cdots X_{n-i})+r(A_1\cdots A_n)+(i-1)m-\sum_{l=n-i+1}^{n}r(A_l)\Big\}\\&=\min\Big\{r(A_1A_2\cdots A_n),\ r(A_1\cdots A_nX_1\cdots X_{n-i-1},\ A_n\cdots A_{n-i+1})+r(A_1\cdots A_n)+(i-1)m-\sum_{l=n-i+1}^{n}r(A_l),\\&\qquad\quad r(A_n\cdots A_{n-i}-A_1\cdots A_nX_1\cdots X_{n-i-1})+m+r(A_1\cdots A_n)-r(A_{n-i})+(i-1)m-\sum_{l=n-i+1}^{n}r(A_l)\Big\}.\end{aligned}$$
From Lemma 4, we have the following:
$$r(A_1A_2\cdots A_nX_1X_2\cdots X_{n-i-1},\ A_nA_{n-1}\cdots A_{n-i+1})\ge r(A_nA_{n-1}\cdots A_{n-i+1}),$$
and from Lemma 3, we have:
$$r(A_nA_{n-1}\cdots A_{n-i+1})+(i-1)m\ge r(A_{n-i+1})+r(A_{n-i+2})+\cdots+r(A_n).$$
Then, we see that (12) holds; that is, for $1\le i\le n-2$,
$$\max_{X_{n-i},X_{n-i+1},\ldots,X_n}r(S_{A_1A_2\cdots A_n})=\min\Big\{r(A_1A_2\cdots A_n),\ r(A_n\cdots A_{n-i}-A_1\cdots A_nX_1\cdots X_{n-i-1})+r(A_1A_2\cdots A_n)+im-\sum_{l=n-i}^{n}r(A_l)\Big\}.$$
When $i=n-2$, we get the following from (12):
$$\max_{X_2,X_3,\ldots,X_n}r(S_{A_1A_2\cdots A_n})=\min\Big\{r(A_1A_2\cdots A_n),\ r(A_nA_{n-1}\cdots A_2-A_1\cdots A_nX_1)+r(A_1\cdots A_n)+(n-2)m-\sum_{l=2}^{n}r(A_l)\Big\}.\qquad(14)$$
Hence, by (10) and (14), we have:
$$\begin{aligned}\max_{X_1,X_2,\ldots,X_n}r(S_{A_1A_2\cdots A_n})&=\min\Big\{r(A_1A_2\cdots A_n),\ \max_{X_1}r(A_n\cdots A_2-A_1\cdots A_nX_1)+r(A_1\cdots A_n)+(n-2)m-\sum_{l=2}^{n}r(A_l)\Big\}\\&=\min\Big\{r(A_1A_2\cdots A_n),\ r(A_1A_2\cdots A_n,\ A_nA_{n-1}\cdots A_2)+r(A_1\cdots A_n)+(n-2)m-\sum_{l=2}^{n}r(A_l),\\&\qquad\quad r(A_nA_{n-1}\cdots A_1-A_1\cdots A_n)-r(A_1)+m+r(A_1\cdots A_n)+(n-2)m-\sum_{l=2}^{n}r(A_l)\Big\}\\&=\min\Big\{r(A_1A_2\cdots A_n),\ r(A_nA_{n-1}\cdots A_1-A_1A_2\cdots A_n)+r(A_1A_2\cdots A_n)+(n-1)m-\sum_{l=1}^{n}r(A_l)\Big\},\end{aligned}$$
where the third equality holds since, by Lemma 4, we have:
$$r(A_1A_2\cdots A_n,\ A_nA_{n-1}\cdots A_2)\ge r(A_nA_{n-1}\cdots A_2)$$
and, by Lemma 3,
$$r(A_nA_{n-1}\cdots A_2)+(n-2)m\ge\sum_{l=2}^{n}r(A_l).\qquad\square$$
The next theorem gives an expression, in terms of the ranks of the known matrices, for
$$\max_{X_n,X_{n-1},\ldots,X_1}r(X_1X_2\cdots X_n),$$
where $X_i$ varies over $A_i\{1,2\}$ for $i=1,2,\ldots,n$.
Theorem 2.
Let $A_i\in\mathbb{C}^{m\times m}$, $X_i\in A_i\{1,2\}$, $i=1,2,\ldots,n$. Then,
$$\max_{X_n,X_{n-1},\ldots,X_1}r(X_1X_2\cdots X_n)=\min\{r(A_1),\ r(A_2),\ \ldots,\ r(A_n)\}.\qquad(15)$$
Proof. 
We divide the proof of Theorem 2 into two parts: first $n=2$, then $n\ge 3$. When $n=2$, according to Lemma 2 with $A=A_1$, $B=X_2$, $C=I_m$, and $D=O$, we have the following equations:
$$\max_{X_1}r(X_1X_2)=\min\Big\{r(A_1),\ r(I_m,\ O),\ r\begin{pmatrix}X_2\\O\end{pmatrix},\ r\begin{pmatrix}A_1&X_2\\I_m&O\end{pmatrix}-r(A_1)\Big\}=\min\{r(A_1),\ m,\ r(X_2),\ r(X_2)+m-r(A_1)\}=\min\{r(A_1),\ r(X_2)\}.\qquad(16)$$
Since $X_2\in A_2\{1,2\}$, we have $r(X_2)=r(A_2)$. Thus, by (16), we have the equation below:
$$\max_{X_2,X_1}r(X_1X_2)=\min\{r(A_1),\ r(A_2)\},$$
i.e., Theorem 2 is true when $n=2$.
When $n\ge 3$, by Lemma 2 with $A=A_1$, $B=X_2X_3\cdots X_n$, $C=I_m$, and $D=O$, we have:
$$\max_{X_1}r(X_1X_2\cdots X_n)=\min\Big\{r(A_1),\ r(I_m,\ O),\ r\begin{pmatrix}X_2X_3\cdots X_n\\O\end{pmatrix},\ r\begin{pmatrix}A_1&X_2X_3\cdots X_n\\I_m&O\end{pmatrix}-r(A_1)\Big\}=\min\{r(A_1),\ m,\ r(X_2X_3\cdots X_n),\ r(X_2X_3\cdots X_n)+m-r(A_1)\}=\min\{r(A_1),\ r(X_2X_3\cdots X_n)\}.$$
Again, by Lemma 2 with $A=A_2$, $B=X_3X_4\cdots X_n$, $C=I_m$, and $D=O$, we have:
$$\begin{aligned}\max_{X_2,X_1}r(X_1X_2\cdots X_n)&=\min\Big\{r(A_1),\ \max_{X_2}r(X_2X_3\cdots X_n)\Big\}\\&=\min\Big\{r(A_1),\ \min\Big\{r(A_2),\ r(I_m,\ O),\ r\begin{pmatrix}X_3X_4\cdots X_n\\O\end{pmatrix},\ r\begin{pmatrix}A_2&X_3X_4\cdots X_n\\I_m&O\end{pmatrix}-r(A_2)\Big\}\Big\}\\&=\min\{r(A_1),\ \min\{r(A_2),\ m,\ r(X_3X_4\cdots X_n),\ r(X_3X_4\cdots X_n)+m-r(A_2)\}\}\\&=\min\{r(A_1),\ r(A_2),\ r(X_3X_4\cdots X_n)\}.\end{aligned}$$
We claim that, for $2\le i\le n-1$,
$$\max_{X_i,X_{i-1},\ldots,X_1}r(X_1X_2\cdots X_n)=\min\{r(A_1),\ r(A_2),\ \ldots,\ r(A_i),\ r(X_{i+1}X_{i+2}\cdots X_n)\}.\qquad(20)$$
Equation (20) can be proved by using induction on $i$. In fact, for $i=2$, the statement in (20) has been proved. Assume the statement in (20) is true for $i-1$, that is,
$$\max_{X_{i-1},X_{i-2},\ldots,X_1}r(X_1X_2\cdots X_n)=\min\{r(A_1),\ r(A_2),\ \ldots,\ r(A_{i-1}),\ r(X_iX_{i+1}\cdots X_n)\}.\qquad(21)$$
We now prove that (20) is also true for $i$. By (21) and Lemma 2 with $A=A_i$, $B=X_{i+1}X_{i+2}\cdots X_n$, $C=I_m$, and $D=O$, we have the following:
$$\begin{aligned}\max_{X_i,X_{i-1},\ldots,X_1}r(X_1X_2\cdots X_n)&=\min\Big\{r(A_1),\ r(A_2),\ \ldots,\ r(A_{i-1}),\ \max_{X_i}r(X_iX_{i+1}\cdots X_n)\Big\}\\&=\min\Big\{r(A_1),\ \ldots,\ r(A_{i-1}),\ \min\Big\{r(A_i),\ r(I_m,\ O),\ r\begin{pmatrix}X_{i+1}X_{i+2}\cdots X_n\\O\end{pmatrix},\ r\begin{pmatrix}A_i&X_{i+1}X_{i+2}\cdots X_n\\I_m&O\end{pmatrix}-r(A_i)\Big\}\Big\}\\&=\min\{r(A_1),\ \ldots,\ r(A_{i-1}),\ \min\{r(A_i),\ m,\ r(X_{i+1}X_{i+2}\cdots X_n),\ r(X_{i+1}X_{i+2}\cdots X_n)+m-r(A_i)\}\}\\&=\min\{r(A_1),\ r(A_2),\ \ldots,\ r(A_{i-1}),\ r(A_i),\ r(X_{i+1}X_{i+2}\cdots X_n)\}.\end{aligned}$$
When $i=n-1$, from (20), we have the following equation:
$$\max_{X_{n-1},X_{n-2},\ldots,X_1}r(X_1X_2\cdots X_n)=\min\{r(A_1),\ r(A_2),\ \ldots,\ r(A_{n-1}),\ r(X_n)\}.\qquad(22)$$
Since $X_n\in A_n\{1,2\}$, we have $r(X_n)=r(A_n)$. Thus, by (22), we have the equation below:
$$\max_{X_n,X_{n-1},\ldots,X_1}r(X_1X_2\cdots X_n)=\min\{r(A_1),\ r(A_2),\ \ldots,\ r(A_n)\},$$
i.e., Theorem 2 is true when $n\ge 3$. □
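Theorem 2 can also be probed by sampling. The sketch below is an illustration with arbitrarily chosen ranks; the $\{1,2\}$-inverse parametrization $X=(A^{\dagger}+(I-A^{\dagger}A)Y)A(A^{\dagger}+Z(I-AA^{\dagger}))$ is a standard device, not taken from this paper. For $n=2$, no sampled $r(X_1X_2)$ may exceed $\min\{r(A_1),r(A_2)\}$.

```python
import numpy as np

rng = np.random.default_rng(4)
r = np.linalg.matrix_rank
m = 5
A1 = rng.standard_normal((m, 2)) @ rng.standard_normal((2, m))  # r(A1) = 2
A2 = rng.standard_normal((m, 3)) @ rng.standard_normal((3, m))  # r(A2) = 3

def random_12_inverse(A, rng):
    # Generate a {1,2}-inverse as X = (A^+ + (I - A^+A)Y) A (A^+ + Z(I - AA^+)).
    Ap = np.linalg.pinv(A)
    n_, m_ = Ap.shape
    Y = rng.standard_normal((n_, m_))
    Z = rng.standard_normal((n_, m_))
    return (Ap + (np.eye(n_) - Ap @ A) @ Y) @ A @ (Ap + Z @ (np.eye(m_) - A @ Ap))

ranks = [r(random_12_inverse(A1, rng) @ random_12_inverse(A2, rng))
         for _ in range(200)]
# Theorem 2 states the maximum over all choices equals min{r(A1), r(A2)} = 2;
# random sampling typically attains it, and can never exceed it:
assert max(ranks) <= min(r(A1), r(A2))
```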
Based on Theorems 1 and 2, we immediately obtain the main result of this paper.
Theorem 3.
Let $A_i\in\mathbb{C}^{m\times m}$, $i=1,2,\ldots,n$. Then, the following statements are equivalent:
(1) $A_1\{1,2\}A_2\{1,2\}\cdots A_n\{1,2\}\subseteq(A_1A_2\cdots A_n)\{1,2\}$;
(2) $\min\big\{r(A_1\cdots A_n),\ r(A_n\cdots A_1-A_1\cdots A_n)+r(A_1\cdots A_n)+(n-1)m-\sum_{l=1}^{n}r(A_l)\big\}=0$ and $\min\{r(A_1),r(A_2),\ldots,r(A_n)\}\le r(A_1A_2\cdots A_n)$;
(3) $A_1A_2\cdots A_n=O$, or $r(A_1\cdots A_n-A_n\cdots A_1)+r(A_1\cdots A_n)+(n-1)m-\sum_{l=1}^{n}r(A_l)=0$ and $\min\{r(A_1),r(A_2),\ldots,r(A_n)\}\le r(A_1A_2\cdots A_n)$;
(4) $A_1A_2\cdots A_n=O$, or $A_1\cdots A_n=A_n\cdots A_1$ and $r(A_1\cdots A_n)+(n-1)m=\sum_{l=1}^{n}r(A_l)$ and $\min\{r(A_1),r(A_2),\ldots,r(A_n)\}\le r(A_1A_2\cdots A_n)$;
(5) $A_1A_2\cdots A_n=O$, or $A_1\cdots A_n=A_n\cdots A_1$ and $r(A_1\cdots A_{n-i})+(n-i-1)m=\sum_{l=1}^{n-i}r(A_l)$, $i=0,\ldots,n-2$, and $\min\{r(A_1),r(A_2),\ldots,r(A_n)\}\le r(A_1A_2\cdots A_n)$.
Proof. 
(1) ⇔ (2). From Lemma 1, we know that (3) holds if and only if Equations (6) and (7) hold. Then, according to Equation (8) in Theorem 1 and Equation (15) in Theorem 2, we have (1) ⇔ (2) in Theorem 3.
(2) ⇔ (3). In fact, $r(A)=0$ if and only if $A=O$, so (2) ⇔ (3) is obvious.
(3) ⇔ (4). Since
$$r(A_1A_2\cdots A_n-A_nA_{n-1}\cdots A_1)\ge 0$$
and, from Lemma 3, we have
$$r(A_1A_2\cdots A_n)+(n-1)m\ge\sum_{l=1}^{n}r(A_l),$$
it is easy to obtain (3) ⇔ (4).
(4) ⇔ (5). In fact, (5) ⇒ (4) is obvious. We now show (4) ⇒ (5). For the case $i=0$, the rank equality in (5) is exactly the one in (4). Assume (5) holds for $i-1$, where $1\le i\le n-2$, i.e.,
$$r(A_1A_2\cdots A_{n-i+1})+(n-i)m=\sum_{l=1}^{n-i+1}r(A_l).\qquad(24)$$
We now prove that (5) is also true for $i$. Based on Lemma 3, we know that
$$r(A_1A_2\cdots A_{n-i+1})+m\ge r(A_1A_2\cdots A_{n-i})+r(A_{n-i+1}).\qquad(25)$$
From (24) and (25), we have the following:
$$\sum_{l=1}^{n-i}r(A_l)\ge r(A_1A_2\cdots A_{n-i})+(n-i-1)m.\qquad(26)$$
On the other hand, again by Lemma 3, we know the following:
$$r(A_1A_2\cdots A_{n-i})+(n-i-1)m\ge\sum_{l=1}^{n-i}r(A_l).\qquad(27)$$
Hence, from (26) and (27), we have:
$$\sum_{l=1}^{n-i}r(A_l)=r(A_1A_2\cdots A_{n-i})+(n-i-1)m.$$
This means that (4) ⇒ (5) holds. □
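As a concrete check of Theorem 3 (an illustration with arbitrarily chosen sizes): invertible diagonal matrices commute and satisfy the rank conditions in statement (4), so the forward order law must hold for them; indeed, for an invertible $A_i$ the only $\{1,2\}$-inverse is $A_i^{-1}$.

```python
import numpy as np

rng = np.random.default_rng(5)
r = np.linalg.matrix_rank
m, n = 4, 3
# Invertible diagonal matrices: they commute, so A1...An = An...A1,
# and r(A1...An) + (n-1)m = nm = sum of the r(Al).
As = [np.diag(rng.uniform(1.0, 2.0, size=m)) for _ in range(n)]

prod = As[0] @ As[1] @ As[2]
rev = As[2] @ As[1] @ As[0]
assert np.allclose(prod, rev)                          # A1 A2 A3 = A3 A2 A1
assert r(prod) + (n - 1) * m == sum(r(A) for A in As)  # rank condition in (4)
assert min(r(A) for A in As) <= r(prod)

# For an invertible matrix the only {1,2}-inverse is the ordinary inverse,
# so X1 X2 X3 must be a {1,2}-inverse of the product A1 A2 A3:
X = np.linalg.inv(As[0]) @ np.linalg.inv(As[1]) @ np.linalg.inv(As[2])
assert np.allclose(prod @ X @ prod, prod) and np.allclose(X @ prod @ X, X)
```

By contrast, two generic non-commuting invertible matrices fail condition (4), which is consistent with $A_1^{-1}A_2^{-1}=(A_2A_1)^{-1}$ generally differing from $(A_1A_2)^{-1}$.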

Author Contributions

All authors contributed equally to this work. All authors have read and approved the final manuscript.

Funding

This work was supported by the Project for Characteristic Innovation of 2018 Guangdong University (No: 2018KTSCX234), the Natural Science Foundation of Guangdong (No: 2014A030313625), the Basic Theory and Scientific Research of the Science and Technology Project of Jiangmen City, China (No: 2020JC01010, 2021030102610005049), and the Joint Research and Development Fund of Wuyi University, Hong Kong and Macao (No: 2019WGALH20).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Acknowledgments

The authors would like to thank Leila Zhang and the anonymous reviewers for their very detailed comments and constructive suggestions, which greatly improved the presentation of this paper.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Ben-Israel, A.; Greville, T.N.E. Generalized Inverses: Theory and Applications; John Wiley: New York, NY, USA, 2003.
  2. Moore, E.H. On the reciprocal of the general algebraic matrix (Abstract). Bull. Am. Math. Soc. 1920, 26, 394–395.
  3. Penrose, R. A generalized inverse for matrices. Proc. Camb. Philos. Soc. 1955, 51, 406–413.
  4. Wang, G.R.; Wei, Y.M.; Qiao, S.Z. Generalized Inverses: Theory and Computations; Science Press: Beijing, China, 2004.
  5. Campbell, S.L.; Meyer, C.D. Generalized Inverses of Linear Transformations; Dover: New York, NY, USA, 1979.
  6. Rao, C.R.; Mitra, S.K. Generalized Inverse of Matrices and Its Applications; Wiley: New York, NY, USA, 1971.
  7. Bjerhammar, A. Application of calculus of matrices to method of least squares with special reference to geodetic calculations. Trans. R. Inst. Technol. 1951, 49, 88–93.
  8. Simeon, B.; Führer, C.; Rentrop, P. The Drazin inverse in multibody system dynamics. Numer. Math. 1993, 64, 521–539.
  9. Stanimirovic, P.; Tasic, M. Computing generalized inverses using LU factorization of matrix product. Int. J. Comput. Math. 2008, 85, 1865–1878.
  10. Cvetkovic-Ilic, D.S.; Milosevic, J. Reverse order laws for {1,3}-generalized inverses. Linear Multilinear Algebra 2019, 67, 613–624.
  11. Djordjevic, D.S. Further results on the reverse order law for generalized inverses. SIAM J. Matrix Anal. Appl. 2007, 29, 1242–1246.
  12. Greville, T.N.E. Note on the generalized inverse of a matrix product. SIAM Rev. 1966, 8, 518–521.
  13. Hartwig, R.E. The reverse order law revisited. Linear Algebra Appl. 1986, 76, 241–246.
  14. Liu, D.; Yan, H. The reverse order law for {1,3,4}-inverse of the product of two matrices. Appl. Math. Comput. 2010, 215, 4293–4303.
  15. Liu, X.; Huang, S.; Cvetkovic-Ilic, D.S. Mixed-type reverse-order laws for {1,3}-inverses over Hilbert spaces. Appl. Math. Comput. 2012, 218, 8570–8577.
  16. Nikolov Radenkovic, J. Reverse order law for multiple operator product. Linear Multilinear Algebra 2016, 64, 1266–1282.
  17. Shinozaki, N.; Sibuya, M. The reverse order law (AB)^- = B^-A^-. Linear Algebra Appl. 1974, 9, 29–40.
  18. Tian, Y.G. Reverse order laws for generalized inverses of multiple matrix products. Linear Algebra Appl. 1994, 211, 85–100.
  19. Wei, M. Reverse order laws for generalized inverses of multiple matrix products. Linear Algebra Appl. 1999, 293, 273–288.
  20. Wei, M.; Gao, W. Reverse order laws for least squares g-inverses and minimum-norm g-inverses of products of two matrices. Linear Algebra Appl. 2002, 342, 117–132.
  21. Xiong, Z.P.; Zheng, B. The reverse order laws for {1,2,3}- and {1,2,4}-inverses of a two-matrix product. Appl. Math. Lett. 2008, 21, 649–655.
  22. Xiong, Z.P.; Liu, Z.S. The Forward Order Law for Least Square g-Inverse of Multiple Matrix Products. Mathematics 2019, 7, 277.
  23. Xiong, Z.P.; Qin, Y.Y. Reverse order law for weighted Moore-Penrose inverses of multiple matrix products. Math. Inequal. Appl. 2014, 17, 121–135.
  24. Liu, Z.S.; Xiong, Z.P. The forward order laws for {1,2,3}- and {1,2,4}-inverses of a three matrix products. Filomat 2018, 32, 589–598.
  25. Liu, Z.S.; Xiong, Z.P.; Qin, Y.Y. A note on the forward order law for least square g-inverse of three matrix products. Comput. Appl. Math. 2019, 38, 48.
  26. Qin, Y.Y.; Xiong, Z.P.; Zhou, W.N. On the Drazin inverse and M-P inverse for sum of matrices. Oper. Matrices 2021, 15, 209–223.
  27. Xiong, Z.P.; Zheng, B. Forward order law for the generalized inverses of multiple matrix products. J. Appl. Math. Comput. 2007, 25, 415–424.
  28. Tian, Y.G. More on maximal and minimal ranks of Schur complements with applications. Appl. Math. Comput. 2004, 152, 675–692.
  29. Marsaglia, G.; Styan, G.P.H. Equalities and inequalities for ranks of matrices. Linear Multilinear Algebra 1974, 2, 269–292.
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Zhou, W.; Xiong, Z.; Qin, Y. Forward Order Law for the Reflexive Inner Inverse of Multiple Matrix Products. Axioms 2022, 11, 123. https://0-doi-org.brum.beds.ac.uk/10.3390/axioms11030123