
On Column Competent Matrices and Linear Complementarity Problem

A. Dutta^{a,1}, R. Jana^{b,2}, A. K. Das^{b}
^{a}Jadavpur University, Kolkata, 700 032, India.
^{b}Indian Statistical Institute, 203 B. T. Road, Kolkata, 700 108, India.
^{1}Email: aritradutta001@gmail.com
Abstract

We revisit the class of column competent matrices and study some matrix theoretic properties of this class. Column competent matrices identify the local $w$-uniqueness of solutions to the linear complementarity problem. We establish some new results on the $w$-uniqueness property in connection with column competent matrices; these results are significant for matrix theory as well as for algorithms in operations research. We also prove some results on the locally $w$-unique solutions of linear complementarity problems with column competent matrices. Finally, we establish a connection between column competent matrices and column adequate matrices with the help of degree theory.

Keywords: Linear complementarity problem; column competent matrices; $w$-uniqueness; column adequate matrices.

^{1}Corresponding author.
^{2}The author R. Jana is presently working in an integrated steel plant.

1 Introduction

The $w$-uniqueness property is important in the context of dynamical systems under smooth unilateral constraints. Xu [18] introduced the class of column competent matrices. A large number of uniqueness results are available in the operations research literature. The uniqueness of solutions matters both for the theory of complementarity systems and for the methods used to compute solutions; for details see ([15], [11], [3], [10]). Ingleton [6] studied the $w$-uniqueness of solutions to the linear complementarity problem in the context of adequate matrices.

The linear complementarity problem can be stated as follows: given $A\in R^{n\times n}$ and a vector $q\in R^{n}$, the linear complementarity problem, denoted by LCP$(q,A)$, is to find $w\in R^{n}$ and $z\in R^{n}$ satisfying

$w - Az = q,\ z\geq 0,\ w\geq 0$ (1.1)
$w^{T}z = 0$ (1.2)

or to show that no $z\in R^{n}$ and $w\in R^{n}$ satisfy the system of linear inequalities (1.1) and the complementarity condition (1.2).
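For small instances, the LCP$(q,A)$ can be solved by enumerating complementary index sets: one guesses the support $\alpha$ of $z$, sets $w_{\alpha}=0$ and $z_{\bar\alpha}=0$, solves $A_{\alpha\alpha}z_{\alpha}=-q_{\alpha}$, and checks nonnegativity. The following is a minimal brute-force sketch in Python/NumPy (not part of the original paper, and it skips index sets with a singular $A_{\alpha\alpha}$); the data in the usage example are chosen here purely for illustration.

```python
import itertools
import numpy as np

def solve_lcp_bruteforce(q, A, tol=1e-10):
    """Enumerate complementary index sets alpha (z supported on alpha, w on the
    complement) and return every pair (w, z) with w = q + A z, w >= 0, z >= 0,
    w^T z = 0 found this way.  Index sets with singular A[alpha, alpha] are
    skipped in this rough sketch, so degenerate solutions may be missed."""
    n = len(q)
    solutions = []
    for r in range(n + 1):
        for alpha in itertools.combinations(range(n), r):
            alpha = list(alpha)
            z = np.zeros(n)
            if alpha:
                A_aa = A[np.ix_(alpha, alpha)]
                if abs(np.linalg.det(A_aa)) < tol:
                    continue
                z[alpha] = np.linalg.solve(A_aa, -q[alpha])
            w = q + A @ z
            if (z >= -tol).all() and (w >= -tol).all():
                solutions.append((w, z))
    return solutions

# illustrative data (not from the paper)
A = np.array([[2.0, 1.0], [1.0, 2.0]])
q = np.array([-1.0, -1.0])
for w, z in solve_lcp_bruteforce(q, A):
    print("z =", z, " w =", w)   # one complementary solution for this data
```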

Pang [14] studied local $z$-uniqueness of solutions of the linear complementarity problem. The LCP$(q,A)$ has a unique $z$-solution for all $q\in R^{n}$ if and only if $A$ is a $P$-matrix [1]. The $w$-uniqueness property is identified by a condition on $A$ related to the notion of sign reversing. Motivated by the $w$-uniqueness results, we consider column competent matrices in the context of LCP$(q,A)$. The sufficient matrices capture many properties of positive semidefinite matrices. The aim of this article is to study some matrix theoretic properties of the class of column competent matrices and to establish some new results which are useful for solving the LCP$(q,A)$.

The paper is organised as follows. Section 2 collects a few related notations and results. Section 3 presents some new results on column competent matrices; we develop several matrix theoretic results for this class which are related to the solution of the linear complementarity problem. Section 4 concludes the article.

2 Preliminaries

Throughout, any vector $z\in R^{n}$ is a column vector and $z^{T}$ denotes its transpose. We write $z=z^{+}-z^{-}$ where $z^{+}_{i}=\max(0,z_{i})$ and $z^{-}_{i}=\max(0,-z_{i})$ for every index $i$. If $A$ is a matrix of order $n$, $\alpha\subseteq\{1,2,\cdots,n\}$ and $\beta\subseteq\{1,2,\cdots,n\}\setminus\alpha$, then $A_{\alpha\beta}$ is the submatrix of $A$ with rows indexed by $\alpha$ and columns indexed by $\beta$. A principal submatrix and a principal minor of $A$ are denoted by $A_{\alpha\alpha}$ and $\det A_{\alpha\alpha}$ respectively. For $A\in R^{n\times n}$ and $q\in R^{n}$, the feasible set of LCP$(q,A)$ is FEA$(q,A)=\{z\in R^{n}:z\geq 0,\ q+Az\geq 0\}$ and the solution set is SOL$(q,A)=\{z\in$ FEA$(q,A):z^{T}(q+Az)=0\}$. A $z$-solution $\tilde{z}$ is called locally unique if there exists a neighborhood of $\tilde{z}$ within which $\tilde{z}$ is the only $z$-solution. A $w$-solution $\tilde{w}=A\tilde{z}+q$ is called locally unique if there exists a neighborhood of $\tilde{w}$ in which $\tilde{w}$ is the only $w$-solution. For a function $\psi:R^{n}\to R^{n}$, the kernel of $\psi$ is ker $\psi=\{z\in R^{n}:\psi(z)=0\}$, and the kernel of a matrix $A\in R^{n\times n}$ is ker $A=\{z\in R^{n}:Az=0\}$. Now we define the column competent matrix.

Definition 2.1.

[18] The matrix $A$ is said to be column competent if $z_{i}(Az)_{i}=0,\ i=1,2,\cdots,n \implies Az=0.$

Column competent matrices can be singular or nonsingular. Note that not every singular matrix is column competent. Consider the singular matrix $A=\left[\begin{array}{rr}1&0\\ 1&0\end{array}\right].$ The vectors $z$ satisfying $z_{i}(Az)_{i}=0,\ i=1,2,$ are exactly $z=\left[\begin{array}{r}0\\ k\end{array}\right],\ k\in R,$ and for each of them $Az=0$; hence $A$ is column competent. Consider another matrix $A=\left[\begin{array}{rr}1&1\\ 0&0\end{array}\right].$ It is easy to show that $z_{i}(Az)_{i}=0,\ i=1,2,$ does not imply $Az=0$; hence $A$ is not a column competent matrix. Let $A=\left[\begin{array}{rrr}1&4&3\\ 2&1&5\\ 3&2&0\end{array}\right].$ For $z=\left[\begin{array}{r}0\\ 0\\ 1\end{array}\right]$, $z_{i}(Az)_{i}=0,\ i=1,2,3,$ but $Az=\left[\begin{array}{r}3\\ 5\\ 0\end{array}\right]\neq 0.$ Here $A$ is nonsingular but not column competent.
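These examples can also be checked computationally. Since $z_{i}(Az)_{i}=0$ for all $i$ forces $(Az)_{i}=0$ on the support of $z$, column competence amounts to requiring, for every index set $\alpha$, that every vector in the kernel of $A_{\alpha\alpha}$ is annihilated by the whole column block $A_{\cdot\alpha}$; as ker $A_{\cdot\alpha}\subseteq$ ker $A_{\alpha\alpha}$ always holds, this is a rank comparison. The sketch below (our reformulation for testing purposes, not taken from [18]) reproduces the conclusions for the three matrices above.

```python
import itertools
import numpy as np

def is_column_competent(A, tol=1e-10):
    """Support-wise test: for every nonempty index set alpha, ker(A[alpha, alpha])
    must be contained in ker(A[:, alpha]); equivalently the two blocks must have
    equal rank."""
    n = A.shape[0]
    for r in range(1, n + 1):
        for alpha in itertools.combinations(range(n), r):
            alpha = list(alpha)
            rank_diag = np.linalg.matrix_rank(A[np.ix_(alpha, alpha)], tol=tol)
            rank_cols = np.linalg.matrix_rank(A[:, alpha], tol=tol)
            if rank_cols > rank_diag:
                return False
    return True

print(is_column_competent(np.array([[1.0, 0.0], [1.0, 0.0]])))   # True
print(is_column_competent(np.array([[1.0, 1.0], [0.0, 0.0]])))   # False
print(is_column_competent(np.array([[1.0, 4.0, 3.0],
                                    [2.0, 1.0, 5.0],
                                    [3.0, 2.0, 0.0]])))          # False
```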

Now we define $\psi:R^{n}\to R^{n}$ by $\psi(z)=z*(Az)$, where $z*(Az)$ is the Hadamard product defined by $(z*(Az))_{i}=z_{i}(Az)_{i}$ for all $i$. Note that the Hadamard product is associative, distributive and commutative.

Definition 2.2.

[18] In view of the Hadamard product, a matrix $A$ is said to be column competent if ker $\psi=$ ker $A.$

Column adequate matrices are related to column competent matrices. We start with the definition of column adequate matrices.

Definition 2.3.

[1] The matrix $A$ is said to be column adequate if $z_{i}(Az)_{i}\leq 0,\ i=1,2,\cdots,n \implies Az=0.$

We state the following lemma and theorems which are useful for the subsequent sections.

Lemma 2.1.

[18] The matrix $A$ is non-degenerate if and only if ker $\psi=\{0\}.$

Theorem 2.1.

[18] The following statements are equivalent.

  (i) $A$ is column competent.

  (ii) For every vector $q$, the LCP$(q,A)$ has a finite number (possibly zero) of $w$-solutions.

  (iii) For every vector $q$, any $w$-solution of the LCP$(q,A)$, if it exists, must be locally $w$-unique.

Theorem 2.2.

[18] The following statements are equivalent.

  (i) (a) $A$ is column competent, and (b) $A$ is a $P_{0}$-matrix.

  (ii) $A$ is column adequate.

Theorem 2.3.

[1] Let $A\in R^{n\times n}$ be an $E_{0}$-matrix. Then the following statements are equivalent.

  (i) $A\in R_{0}.$

  (ii) $A\in R.$

We say that $A\in R^{n\times n}$ is a

- $Q$-matrix if for every $q\in R^{n}$, LCP$(q,A)$ has a solution;
- $Q_{0}$-matrix if for every $q\in R^{n}$, feasibility implies solvability;
- $P$-matrix if for each vector $z\neq 0$, $\max_{z_{i}\neq 0} z_{i}(Az)_{i}>0$;
- $P_{0}$-matrix if for each vector $z\neq 0$, $\max_{z_{i}\neq 0} z_{i}(Az)_{i}\geq 0$;
- $R_{0}$-matrix if LCP$(0,A)$ has the unique solution $z=0$;
- principally non-degenerate matrix if no principal submatrix of $A$ has determinant zero, i.e. all principal minors of $A$ are nonzero.

For further details about the matrix classes arising in the linear complementarity problem see ([4], [9], [13], [7], [8]).
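For small matrices the $P$ and $P_{0}$ properties can be tested through principal minors: $A$ is a $P$-matrix if and only if all its principal minors are positive, and a $P_{0}$-matrix if and only if all its principal minors are nonnegative (see [1]). A short NumPy sketch of this standard test (the sample matrix is the $P_{0}$ example used later in Section 3):

```python
import itertools
import numpy as np

def principal_minors(A):
    """Yield det(A[alpha, alpha]) over all nonempty index sets alpha."""
    n = A.shape[0]
    for r in range(1, n + 1):
        for alpha in itertools.combinations(range(n), r):
            yield np.linalg.det(A[np.ix_(alpha, alpha)])

def is_P(A):
    return all(d > 0 for d in principal_minors(A))

def is_P0(A, tol=1e-12):
    return all(d >= -tol for d in principal_minors(A))

A = np.array([[2.0, -1.0], [-4.0, 2.0]])
print(is_P(A), is_P0(A))   # False True
```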

The principal pivot transform (PPT) plays an important role in the study of matrix classes and the linear complementarity problem. The principal pivot transform of $A\in R^{n\times n}$ with respect to $\alpha\subseteq\{1,2,\ldots,n\}$ is the matrix

$A^{\prime}=\left[\begin{array}{cc}A^{\prime}_{\alpha\alpha}&A^{\prime}_{\alpha\bar{\alpha}}\\ A^{\prime}_{\bar{\alpha}\alpha}&A^{\prime}_{\bar{\alpha}\bar{\alpha}}\end{array}\right]$

where $A^{\prime}_{\alpha\alpha}=(A_{\alpha\alpha})^{-1}$, $A^{\prime}_{\alpha\bar{\alpha}}=-(A_{\alpha\alpha})^{-1}A_{\alpha\bar{\alpha}}$, $A^{\prime}_{\bar{\alpha}\alpha}=A_{\bar{\alpha}\alpha}(A_{\alpha\alpha})^{-1}$ and $A^{\prime}_{\bar{\alpha}\bar{\alpha}}=A_{\bar{\alpha}\bar{\alpha}}-A_{\bar{\alpha}\alpha}(A_{\alpha\alpha})^{-1}A_{\alpha\bar{\alpha}}.$

The PPT is defined only with respect to those $\alpha$ for which $\det A_{\alpha\alpha}\neq 0.$ When $\alpha=\emptyset$, by convention $\det A_{\alpha\alpha}=1$ and $A^{\prime}=A.$ Here $A^{\prime}_{\bar{\alpha}\bar{\alpha}}$ is the Schur complement of $A_{\alpha\alpha}$ in $A.$ We denote the PPT of $A$ by $A^{\prime}=\mathcal{P}_{\alpha}(A).$ The Schur complement of $A_{\alpha\alpha}$ in $A=\left[\begin{array}{cc}A_{\alpha\alpha}&A_{\alpha\bar{\alpha}}\\ A_{\bar{\alpha}\alpha}&A_{\bar{\alpha}\bar{\alpha}}\end{array}\right]$ is a principal submatrix of the principal pivot transform $A^{\prime}$. For details of PPT see ([11], [12], [2]).
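The four block formulas above translate directly into a small routine; the following NumPy sketch computes $\mathcal{P}_{\alpha}(A)$ under the assumption that $A_{\alpha\alpha}$ is nonsingular.

```python
import numpy as np

def ppt(A, alpha):
    """Principal pivot transform of A with respect to the index set alpha;
    A[alpha, alpha] is assumed to be nonsingular."""
    n = A.shape[0]
    alpha = list(alpha)
    beta = [i for i in range(n) if i not in alpha]
    Aaa, Aab = A[np.ix_(alpha, alpha)], A[np.ix_(alpha, beta)]
    Aba, Abb = A[np.ix_(beta, alpha)], A[np.ix_(beta, beta)]
    Aaa_inv = np.linalg.inv(Aaa)
    M = np.empty_like(A, dtype=float)
    M[np.ix_(alpha, alpha)] = Aaa_inv
    M[np.ix_(alpha, beta)] = -Aaa_inv @ Aab
    M[np.ix_(beta, alpha)] = Aba @ Aaa_inv
    M[np.ix_(beta, beta)] = Abb - Aba @ Aaa_inv @ Aab   # Schur complement A/A_aa
    return M

# usage: PPT of a 2x2 matrix with respect to alpha = {1}
print(ppt(np.array([[2.0, 1.0], [1.0, 2.0]]), [0]))
```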

We establish a connection between column competent matrices and column adequate matrices using a degree theoretic approach. We provide brief details about degree theory in the following subsection.

2.1 Degree theory

Let $f_{A}:R^{n}\rightarrow R^{n}$ be the piecewise linear mapping associated with a given matrix $A\in R^{n\times n}$, defined by $f_{A}(e_{i})=e_{i}$ and $f_{A}(-e_{i})=-Ae_{i}$ for all $i$. For any $z\in R^{n}$ we write

$f_{A}(z)=z^{+}-Az^{-}.$

For details see [10]. It is clear that LCP$(q,A)$ is equivalent to finding a vector $z\in R^{n}$ such that $f_{A}(z)=q.$ If $z$ belongs to the interior of some orthant of $R^{n}$ and $\det A_{\alpha\alpha}\neq 0$ where $\alpha=\{i:z_{i}<0\}$, then the index of $f_{A}$ at $z$ is well defined and can be written as

$\text{ind}\, f_{A}(q,z)=\text{sgn}(\det A_{\alpha\alpha}).$

Note that the cardinality of $f_{A}^{-1}(q)$ is the number of solutions of LCP$(q,A).$ In particular, if $q$ is non-degenerate with respect to $A$, each index of $f_{A}$ is well defined and we can define the local degree of $A$ at $q$, denoted by $\deg_{A}(q).$ For details see ([1], chapter 6). We state the following theorem from [10], which will be required to prove one of our results.
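The map $f_{A}$ and the index formula can be transcribed directly; the sketch below is an illustration only (it does not compute the degree itself, which would additionally require counting the points of $f_{A}^{-1}(q)$ with their indices) and assumes $z$ lies in the interior of an orthant.

```python
import numpy as np

def f_A(A, z):
    """Piecewise linear map f_A(z) = z^+ - A z^-."""
    z_plus, z_minus = np.maximum(z, 0.0), np.maximum(-z, 0.0)
    return z_plus - A @ z_minus

def index_fA(A, z):
    """Index of f_A at a point z in the interior of an orthant:
    sgn(det A[alpha, alpha]) with alpha = {i : z_i < 0}."""
    alpha = [i for i in range(len(z)) if z[i] < 0]
    if not alpha:
        return 1   # det of the empty principal submatrix is 1 by convention
    return int(np.sign(np.linalg.det(A[np.ix_(alpha, alpha)])))
```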

Theorem 2.4.

Let $A\in R^{n\times n}.$ Let $K(A)$ denote the union of all the facets of the complementary cones of $(I,-A).$ Consider $q\in R^{n}\setminus K(A)$ where $q$ is non-degenerate with respect to $A.$ Let $\beta\subseteq\{1,2,\cdots,n\}$ be such that $\det A_{\beta\beta}\neq 0$, and suppose $A^{\prime}$ is the PPT of $A$ with respect to $\beta.$ Then $\deg_{A^{\prime}}(q^{\prime})=\text{sgn}(\det A_{\beta\beta})\cdot\deg_{A}(q).$

3 Results on Column Competent Matrices

The Hadamard product is important for characterizing the complementarity condition. Here we show that the column competent property is captured by the Hadamard product.

Theorem 3.1.

Suppose $A$ is a column competent matrix and let $\psi:R^{n}\to R^{n}$ be defined by $\psi(z)=z*(Az)$, where $z*(Az)$ is the Hadamard product. Then ker $\psi=$ ker $A.$

Proof.

Let $A$ be a column competent matrix. Then for a vector $z\in R^{n}$, $z_{i}(Az)_{i}=0,\ i=1,2,\cdots,n \implies Az=0.$ Hence $z\in$ ker $\psi$ implies $z\in$ ker $A$, so ker $\psi\subseteq$ ker $A.$ Again, by the definition of $\psi$, ker $A\subseteq$ ker $\psi.$ Therefore ker $\psi=$ ker $A.$ ∎

The following result provides a characterization of non-degenerate column competent matrices.

Theorem 3.2.

Let $A\in R^{n\times n}$ be a non-degenerate column competent matrix. Then $A\in R_{0}.$

Proof.

Let $A$ be a non-degenerate column competent matrix. By Lemma 2.1, ker $\psi=\{0\}$ where $\psi(z)=z*(Az).$ By Theorem 3.1, ker $\psi=$ ker $A=\{0\}.$ Let $z$ be a solution of LCP$(0,A).$ Then $z_{i}(Az)_{i}=0,\ i=1,2,\cdots,n,$ which implies $Az=0$, i.e. $z\in$ ker $A$, and hence $z=0.$ Therefore LCP$(0,A)$ has only the zero solution, so $A$ is an $R_{0}$-matrix. ∎

Note that a column competent matrix need not be a $P_{0}$-matrix in general. Consider the matrix $A=\left[\begin{array}{rr}2&1\\ 1&-1\end{array}\right].$ It can be shown that $A$ is a column competent matrix but not a $P_{0}$-matrix. Now we establish the following result.

Theorem 3.3.

Suppose $A$ is a column competent matrix with $A\in P_{0}.$ Then for any $0\neq z\geq 0$ with $z_{i}(Az)_{i}=0,\ i=1,2,\cdots,n,$ the pair $(z,0)$ is a solution of LCP$(0,A).$

Proof.

Let $A\in R^{n\times n}$ be a column competent matrix with $A\in P_{0}.$ Then for each $0\neq z$, $\max_{z_{i}\neq 0} z_{i}(Az)_{i}\geq 0.$ Suppose $0\neq z\geq 0$ satisfies $z_{i}(Az)_{i}=0,\ i=1,2,\cdots,n.$ Since $A$ is column competent, $Az=0.$ Hence $z\geq 0$, $w=Az=0$ and $z^{T}(Az)=0$, so $(z,0)$ is a solution of LCP$(0,A).$ ∎

Now consider the matrix $A=\left[\begin{array}{rr}2&-1\\ -4&2\end{array}\right]$, which is column competent as well as $P_{0}$, and $\left(\left[\begin{array}{r}1\\ 2\end{array}\right],\left[\begin{array}{r}0\\ 0\end{array}\right]\right)$ is a solution of LCP$(0,A).$ Note that this can be explained using Theorem 3.3.
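This can be verified directly from the definition of LCP$(0,A)$; a small numerical check:

```python
import numpy as np

A = np.array([[2.0, -1.0], [-4.0, 2.0]])
z = np.array([1.0, 2.0])
w = A @ z                                  # q = 0, so w = Az
print(np.allclose(w, 0.0))                 # True: w = (0, 0)
print((z >= 0).all() and (w >= 0).all() and abs(z @ w) < 1e-12)   # True: (z, 0) solves LCP(0, A)
```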

Theorem 3.4.

Let $A$ be a column competent matrix. Suppose $z\geq 0$ and $z_{i}(Az)_{i}=0,\ i=1,2,\cdots,n.$ Then LCP$(0,A)$ has the solution $(z,0).$

Proof.

Since $A$ is a column competent matrix, $z_{i}(Az)_{i}=0,\ i=1,2,\cdots,n,$ implies $Az=0.$ As $z\geq 0$, the pair $(z,0)$ is a solution of LCP$(0,A).$ ∎

Xu [18] showed that if $A$ is a column competent matrix then $DAD^{T}$ is a column competent matrix where $D$ is a diagonal matrix. In the next theorem we prove that column competent matrices, under an additional sign assumption, are invariant under principal rearrangement. For any principal submatrix $A_{\alpha\alpha}$ of $A$, it is possible to rearrange the rows and columns of $A$ principally in such a way that $A_{\alpha\alpha}$ becomes a leading principal submatrix of the rearranged matrix $PAP^{T}.$

Theorem 3.5.

Suppose $A$ is a column competent matrix. If for any $z\in R^{n}$, either $z_{i}(Az)_{i}\geq 0$ for all $i$ or $z_{i}(Az)_{i}\leq 0$ for all $i$, then $PAP^{T}$ is also column competent, where $P$ is a permutation matrix.

Proof.

Let $z\in R^{n}$ be arbitrary and put $y=Pz.$ Suppose $y_{i}(PAP^{T}y)_{i}=0$ for all $i,$ that is, $(Pz)_{i}(PAP^{T}Pz)_{i}=0$ for all $i.$ Summing over $i,$

$z^{T}P^{T}PAP^{T}Pz=\sum_{i=1}^{n}(Pz)_{i}(PAP^{T}Pz)_{i}=0.$

Hence $z^{T}Az=0$ as $P^{T}P=I$, i.e. $\sum_{i=1}^{n}z_{i}(Az)_{i}=0.$ Since by hypothesis all the terms $z_{i}(Az)_{i}$ have the same sign, this forces $z_{i}(Az)_{i}=0,\ i=1,2,\cdots,n.$ As $A$ is column competent, $Az=0,$ and therefore $AP^{T}Pz=0,$ i.e. $(PAP^{T})(Pz)=0.$ Hence $PAP^{T}$ is column competent. ∎

Theorem 3.6.

Let $A$ be a $Z$-matrix. Suppose $z_{i}(Az)_{i}=0$ for all $i$, $A|z|\geq 0$ and $Az\leq 0.$ Then $A$ is a column competent matrix.

Proof.

Suppose $A$ is a $Z$-matrix. Let $z$ satisfy $z_{i}(Az)_{i}=0$ for all $i$ with $A|z|\geq 0$ and $Az\leq 0.$ As $A$ is a $Z$-matrix, $Az\geq A|z|\geq 0.$ Together with $Az\leq 0$ this implies $Az=0.$ Therefore $A$ is a column competent matrix. ∎

Consider $A=\left[\begin{array}{rrr}1&1&4\\ 2&2&5\\ 3&4&1\end{array}\right].$ Note that $A$ is an $R_{0}$-matrix. For $z=\left[\begin{array}{r}1\\ -1\\ 0\end{array}\right],$ $z_{i}(Az)_{i}=0,\ i=1,2,3,$ but $Az\neq 0.$ Hence $A$ is not a column competent matrix. The class of non-degenerate matrices plays an important role in characterizing certain uniqueness properties of the solutions of LCP$(q,A).$ We prove the following theorem to establish the relation between principally non-degenerate matrices and column competent matrices.

Theorem 3.7.

Let $A$ be a principally non-degenerate matrix. Then $A$ is column competent.

Proof.

Let $A$ be a principally non-degenerate matrix. Assume that $A$ is not column competent. Then there exists $0\neq z\in R^{n}$ such that $z_{i}(Az)_{i}=0,\ i=1,2,\cdots,n,$ but $Az\neq 0.$ Without loss of generality write $z=\left[\begin{array}{r}z_{\alpha}\\ z_{\beta}\end{array}\right]\neq 0$ where $\alpha=\{i:z_{i}\neq 0\}$, so that every component of $z_{\alpha}$ is nonzero and $z_{\beta}=0$, and partition $A=\left[\begin{array}{rr}A_{\alpha\alpha}&A_{\alpha\beta}\\ A_{\beta\alpha}&A_{\beta\beta}\end{array}\right].$ We consider the following cases.

Case 1: $\alpha=\{1,2,\cdots,n\}$ and $\beta=\emptyset.$ Then $z=z_{\alpha}$ and $z_{i}(Az)_{i}=0$ with $z_{i}\neq 0$ for every $i,$ so $(Az)_{i}=0$ for all $i,$ i.e. $Az=0,$ contradicting $Az\neq 0.$

Case 2: $\alpha\subset\{1,2,\cdots,n\}$ and $\beta=\{1,2,\cdots,n\}\setminus\alpha.$ Since $z_{\beta}=0,$ we have $(Az)_{\alpha}=A_{\alpha\alpha}z_{\alpha}$ and $(z_{\alpha})_{i}(A_{\alpha\alpha}z_{\alpha})_{i}=0$ with $(z_{\alpha})_{i}\neq 0$ for every $i\in\alpha.$ This implies $A_{\alpha\alpha}z_{\alpha}=0.$ As $z_{\alpha}\neq 0,$ $A_{\alpha\alpha}$ is a singular matrix, which contradicts the assumption that $A$ is principally non-degenerate.

Therefore $A$ is a column competent matrix. ∎

Here we consider $A=\left[\begin{array}{rrr}3&-2&0\\ -2&1&1\\ -3&2&0\end{array}\right].$ For $z=\left[\begin{array}{r}2k\\ 3k\\ k\end{array}\right],\ k\in R,$ $z_{i}(Az)_{i}=0,\ i=1,2,3,$ implies that $Az=0.$ Hence $A$ is a column competent matrix. However $A$ is neither an adequate matrix nor a sufficient matrix. For details of sufficient matrices see ([17], [16], [5]). Now we develop a necessary and sufficient condition for column competent matrices.

Theorem 3.8.

Let $A\in R^{n\times n}$. The following two statements are equivalent:

  (a) $A$ is column competent.

  (b) For $0\neq z=\left[\begin{array}{r}z_{\alpha}\\ z_{\beta}\end{array}\right]\geq 0$ with $z_{\beta}=0$ and the submatrix $A_{\alpha\alpha}$ singular with $A_{\alpha\alpha}z_{\alpha}=0$, where $\alpha\cup\beta=\{1,2,\cdots,n\}$ and $\alpha\cap\beta=\emptyset$, the system

$\left[\begin{array}{rr}A_{\alpha\alpha}&A_{\alpha\beta}\\ A_{\beta\alpha}&A_{\beta\beta}\end{array}\right]\left[\begin{array}{r}z_{\alpha}\\ z_{\beta}\end{array}\right]\neq 0$ (3.1)

  has no solution.

Proof.

(a)$\implies$(b). Suppose $A$ is column competent and the system (3.1) is consistent, i.e. some $\left[\begin{array}{r}z_{\alpha}\\ z_{\beta}\end{array}\right]$ with $z_{\beta}=0$ and $A_{\alpha\alpha}$ singular with $A_{\alpha\alpha}z_{\alpha}=0$, where $\alpha\cup\beta=\{1,2,\cdots,n\}$ and $\alpha\cap\beta=\emptyset$, satisfies (3.1). Here $(z_{\alpha})_{i}(A_{\alpha\alpha}z_{\alpha})_{i}=0$ and $(z_{\beta})_{i}(A_{\beta\alpha}z_{\alpha})_{i}=0$, so $z_{i}(Az)_{i}=0$ for all $i$ and column competence gives $Az=0.$ But $\left[\begin{array}{r}z_{\alpha}\\ z_{\beta}\end{array}\right]$ satisfies (3.1), i.e. $Az\neq 0,$ a contradiction.

(b)$\implies$(a). Conversely, let $x\in R^{n}$ be a vector such that $x_{i}(Ax)_{i}=0$ for all $i$ and suppose $A$ is not column competent, i.e. $Ax\neq 0$ for some such $x.$ Writing $x_{\alpha}=z_{\alpha}$ and $x_{\beta}=z_{\beta}$ with $0\neq z=\left[\begin{array}{r}z_{\alpha}\\ z_{\beta}\end{array}\right]\geq 0,$ $z_{\beta}=0$ and the submatrix $A_{\alpha\alpha}$ singular with $A_{\alpha\alpha}z_{\alpha}=0$, where $\alpha\cup\beta=\{1,2,\cdots,n\}$ and $\alpha\cap\beta=\emptyset,$ such an $x$ would be a solution of (3.1). But by (b) the system (3.1) has no solution, i.e. $x$ cannot satisfy (3.1), so $Ax=0.$ Therefore $A$ is column competent. ∎

Now we prove the following sufficient condition related to the PPT of column competent matrices.

Theorem 3.9.

Let $A=\left[\begin{array}{rr}A_{\alpha\alpha}&A_{\alpha\beta}\\ A_{\beta\alpha}&A_{\beta\beta}\end{array}\right]$ be a square matrix, where $\alpha\cup\beta=\{1,2,\cdots,n\}$ and $\alpha\cap\beta=\emptyset,$ such that $A_{\alpha\alpha}$ and the Schur complement $A/A_{\alpha\alpha}$ are nonsingular. If $A$ is column competent, then $A^{\prime}=\mathcal{P}_{\alpha}(A)$ is column competent.

Proof.

Let $w=A^{\prime}z$ and $z*w=0$ where $*$ is the Hadamard product. Thus we write

$\left[\begin{array}{r}w_{\alpha}\\ w_{\beta}\end{array}\right]=\left[\begin{array}{rr}A^{\prime}_{\alpha\alpha}&A^{\prime}_{\alpha\beta}\\ A^{\prime}_{\beta\alpha}&A^{\prime}_{\beta\beta}\end{array}\right]\left[\begin{array}{r}z_{\alpha}\\ z_{\beta}\end{array}\right].$ (3.2)

The condition $z*w=0$ means $\left[\begin{array}{r}z_{\alpha}\\ z_{\beta}\end{array}\right]*\left[\begin{array}{r}w_{\alpha}\\ w_{\beta}\end{array}\right]=\left[\begin{array}{r}w_{\alpha}*z_{\alpha}\\ w_{\beta}*z_{\beta}\end{array}\right]=0.$ Since $A^{\prime}=\mathcal{P}_{\alpha}(A),$ we have

$\left[\begin{array}{r}z_{\alpha}\\ w_{\beta}\end{array}\right]=\left[\begin{array}{rr}A_{\alpha\alpha}&A_{\alpha\beta}\\ A_{\beta\alpha}&A_{\beta\beta}\end{array}\right]\left[\begin{array}{r}w_{\alpha}\\ z_{\beta}\end{array}\right].$ (3.3)

By (3.3), the vector $\left[\begin{array}{r}w_{\alpha}\\ z_{\beta}\end{array}\right]$ and its image $\left[\begin{array}{r}z_{\alpha}\\ w_{\beta}\end{array}\right]$ under $A$ have vanishing Hadamard product, since $w_{\alpha}*z_{\alpha}=0$ and $z_{\beta}*w_{\beta}=0.$ As $A$ is column competent, this implies $\left[\begin{array}{rr}A_{\alpha\alpha}&A_{\alpha\beta}\\ A_{\beta\alpha}&A_{\beta\beta}\end{array}\right]\left[\begin{array}{r}w_{\alpha}\\ z_{\beta}\end{array}\right]=0,$ and it follows that $\left[\begin{array}{r}z_{\alpha}\\ w_{\beta}\end{array}\right]=0.$ From (3.2), $w_{\beta}=A^{\prime}_{\beta\alpha}z_{\alpha}+A^{\prime}_{\beta\beta}z_{\beta}=0,$ and since $z_{\alpha}=0$ this gives $A^{\prime}_{\beta\beta}z_{\beta}=0,$ which implies $z_{\beta}=0$ as $A^{\prime}_{\beta\beta}=A/A_{\alpha\alpha}$ is nonsingular. Hence $z=0$ and therefore $\left[\begin{array}{r}w_{\alpha}\\ w_{\beta}\end{array}\right]=\left[\begin{array}{rr}A^{\prime}_{\alpha\alpha}&A^{\prime}_{\alpha\beta}\\ A^{\prime}_{\beta\alpha}&A^{\prime}_{\beta\beta}\end{array}\right]\left[\begin{array}{r}z_{\alpha}\\ z_{\beta}\end{array}\right]=0.$ Therefore $A^{\prime}$ is column competent. ∎
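Theorem 3.9 can be spot-checked numerically by combining the is_column_competent and ppt sketches given earlier (both are assumed to be in scope here); the matrix below is a nondegenerate, hence column competent, matrix chosen only for illustration, with $A_{\alpha\alpha}$ and the Schur complement nonsingular for the chosen $\alpha$.

```python
import numpy as np
# assumes is_column_competent(...) and ppt(...) from the earlier sketches are defined

A = np.array([[ 2.0, -1.0,  0.0],
              [-1.0,  2.0, -1.0],
              [ 0.0, -1.0,  2.0]])   # nondegenerate, hence column competent (Theorem 3.7)
alpha = [0, 1]                       # A[alpha, alpha] and A/A[alpha, alpha] are nonsingular here
print(is_column_competent(A), is_column_competent(ppt(A, alpha)))   # True True
```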

Theorem 3.10.

Let $A=\left[\begin{array}{rr}A_{\alpha\alpha}&A_{\alpha\beta}\\ A_{\beta\alpha}&A_{\beta\beta}\end{array}\right]$ be a column competent matrix such that $A_{\alpha\alpha}$ and the Schur complement $A/A_{\alpha\alpha}$ are nonsingular. If $A\in E_{0}\cap R_{0},$ then $A$ is column adequate.

Proof.

Suppose $A$ is column competent but not column adequate. By Theorem 2.2, $A$ is not a $P_{0}$-matrix, so there exists $\beta\subseteq\{1,2,\cdots,n\}$ such that $\det A_{\beta\beta}<0.$ Let $A\in E_{0}\cap R_{0}.$ It follows from Theorem 2.3 that $A\in R,$ so $\deg_{A}(q)=1$ for any $q.$ Let $A^{\prime}$ be the principal pivot transform of $A$ with respect to $\beta.$ Then $A^{\prime}\in R,$ hence $\deg_{A^{\prime}}(q^{\prime})=1.$ On the other hand, by Theorem 2.4, $\deg_{A^{\prime}}(q^{\prime})=\deg_{A}(q)\cdot\text{sgn}(\det A_{\beta\beta})=-1,$ a contradiction. Therefore $A$ is a $P_{0}$-matrix and hence, by Theorem 2.2, a column adequate matrix. ∎

3.1 Solution of Linear Complementarity Problem with Column Competent Matrices

We begin with some examples concerning $w$-uniqueness of the solution. Consider the column competent matrix $A=\left[\begin{array}{rr}-1&3\\ 2&-6\end{array}\right]$ and $q=\left[\begin{array}{r}1\\ -2\end{array}\right].$ This LCP$(q,A)$ has the solution $z=\left[\begin{array}{r}4\\ 1\end{array}\right]$, $w=\left[\begin{array}{r}0\\ 0\end{array}\right].$ In the neighbourhood of $z$ there is another solution $z^{\prime}=\left[\begin{array}{r}4.0100\\ 1.0033\end{array}\right]$ with $w^{\prime}=w=\left[\begin{array}{r}0\\ 0\end{array}\right].$
We consider another matrix $A=\left[\begin{array}{rrr}-2&1&3\\ 4&-2&-6\\ 1&-1&-1\end{array}\right]$ and $q=\left[\begin{array}{r}1\\ -2\\ 1\end{array}\right].$ For $z=\left[\begin{array}{r}2k\\ k\\ k\end{array}\right],\ k\in R,$ $z_{i}(Az)_{i}=0,\ i=1,2,\cdots,n,$ implies that $Az=0,$ so $A$ is a column competent matrix. This LCP$(q,A)$ has the solution $z=\left[\begin{array}{r}4\\ 4\\ 1\end{array}\right]$, $w=\left[\begin{array}{r}0\\ 0\\ 0\end{array}\right].$ In the neighbourhood of $z$ there is another solution $z^{\prime}=\left[\begin{array}{r}4.02\\ 4.01\\ 1.01\end{array}\right]$ with $w^{\prime}=w=\left[\begin{array}{r}0\\ 0\\ 0\end{array}\right].$
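The first example can be verified numerically: every $z\geq 0$ on the line $z_{1}=3z_{2}+1$ is a $z$-solution, while the corresponding $w$ stays at the origin. A small check (the nearby point below is chosen here for illustration and differs slightly from the one quoted above):

```python
import numpy as np

A = np.array([[-1.0, 3.0], [2.0, -6.0]])
q = np.array([1.0, -2.0])
for z in (np.array([4.0, 1.0]), np.array([4.03, 1.01])):   # two nearby z-solutions
    w = q + A @ z
    print("z =", z, " w =", np.round(w, 12), " z.w =", round(float(z @ w), 12))
```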

Now we prove two results in connection with the locally $w$-unique solutions of linear complementarity problems with column competent matrices. Together they give necessary and sufficient conditions, in terms of the linear complementarity system, for $A$ to be column competent.

Theorem 3.11.

Suppose $(w^{*},z^{*})$ is a solution of LCP$(q,A)$ with $w^{*}=q+Az^{*}.$ Let $\alpha=\{i:{w_{i}}^{*}>0\}$ and $\beta=\{i:{w_{i}}^{*}=0\}$ be the index sets, and suppose the submatrix $A_{\alpha\alpha}$ is nonsingular. If $A=\left[\begin{array}{rr}A_{\alpha\alpha}&A_{\alpha\beta}\\ A_{\beta\alpha}&A_{\beta\beta}\end{array}\right]$ is a column competent matrix, then $(w_{\alpha},z_{\beta})=(0,0)$ is the only solution of the system:

$\begin{split}z_{\alpha}&=A^{\prime}_{\alpha\alpha}w_{\alpha}+A^{\prime}_{\alpha\beta}z_{\beta}=0\\ w_{\beta}&=A^{\prime}_{\beta\alpha}w_{\alpha}+A^{\prime}_{\beta\beta}z_{\beta}=0\\ w_{\alpha}&>0\\ z_{\beta}&>0,\end{split}$ (3.4)

where $A^{\prime}_{\alpha\alpha}=(A_{\alpha\alpha})^{-1},$ $A^{\prime}_{\alpha\beta}=-(A_{\alpha\alpha})^{-1}A_{\alpha\beta},$ $A^{\prime}_{\beta\alpha}=A_{\beta\alpha}(A_{\alpha\alpha})^{-1}$ and $A^{\prime}_{\beta\beta}=A_{\beta\beta}-A_{\beta\alpha}(A_{\alpha\alpha})^{-1}A_{\alpha\beta}.$

Proof.

Let $A$ be a column competent matrix. Then by Theorem 2.1 every $w$-solution is locally $w$-unique; in particular $w^{*}$ is a locally unique $w$-solution of LCP$(q,A)$ with $w^{*}=q+Az^{*}.$ Suppose, to the contrary, that the system (3.4) has a nonzero solution $(\bar{w}_{\alpha},\bar{z}_{\beta}).$
Now $\left[\begin{array}{r}\bar{z}_{\alpha}\\ \bar{w}_{\beta}\end{array}\right]=\left[\begin{array}{rr}A^{\prime}_{\alpha\alpha}&A^{\prime}_{\alpha\beta}\\ A^{\prime}_{\beta\alpha}&A^{\prime}_{\beta\beta}\end{array}\right]\left[\begin{array}{r}\bar{w}_{\alpha}\\ \bar{z}_{\beta}\end{array}\right]=0$ implies that $\left[\begin{array}{r}\bar{w}_{\alpha}\\ \bar{w}_{\beta}\end{array}\right]=\left[\begin{array}{rr}A_{\alpha\alpha}&A_{\alpha\beta}\\ A_{\beta\alpha}&A_{\beta\beta}\end{array}\right]\left[\begin{array}{r}\bar{z}_{\alpha}\\ \bar{z}_{\beta}\end{array}\right].$ Clearly $\bar{w}=A\bar{z}$ and $(w^{*})^{T}\bar{z}=0,$ $(\bar{w})^{T}z^{*}=0.$ Hence $(w^{*}+\lambda\bar{w},z^{*}+\lambda\bar{z})$ solves LCP$(q,A)$ for all $\lambda\geq 0,$ which contradicts the local uniqueness of $w^{*}.$ Therefore $(w_{\alpha},z_{\beta})=(0,0)$ is the only solution of the system (3.4). ∎

Theorem 3.12.

Suppose $(w^{*},z^{*})$ is a solution of LCP$(q,A)$ with $w^{*}=q+Az^{*},$ where $\alpha=\{i:{w_{i}}^{*}>0\}$ and $\beta=\{i:{w_{i}}^{*}=0\}.$ Further suppose $\left[\begin{array}{r}z_{\alpha}\\ w_{\beta}\end{array}\right]=\left[\begin{array}{rr}A^{\prime}_{\alpha\alpha}&A^{\prime}_{\alpha\beta}\\ A^{\prime}_{\beta\alpha}&A^{\prime}_{\beta\beta}\end{array}\right]\left[\begin{array}{r}w_{\alpha}\\ z_{\beta}\end{array}\right]=0,$ $w_{\alpha}>0,$ $z_{\beta}>0.$ If $(z_{\alpha},z_{\beta})=(0,0)$ is the only solution of $w_{\beta}=A_{\beta\alpha}z_{\alpha}+A_{\beta\beta}z_{\beta}=0,$ then $A=\left[\begin{array}{rr}A_{\alpha\alpha}&A_{\alpha\beta}\\ A_{\beta\alpha}&A_{\beta\beta}\end{array}\right]$ is column competent.

Proof.

Suppose the matrix $A$ is not column competent. Then $w^{*}$ is not locally unique. Now $\left[\begin{array}{r}z_{\alpha}\\ w_{\beta}\end{array}\right]=\left[\begin{array}{rr}A^{\prime}_{\alpha\alpha}&A^{\prime}_{\alpha\beta}\\ A^{\prime}_{\beta\alpha}&A^{\prime}_{\beta\beta}\end{array}\right]\left[\begin{array}{r}w_{\alpha}\\ z_{\beta}\end{array}\right]=0$ implies that $\left[\begin{array}{r}w_{\alpha}\\ w_{\beta}\end{array}\right]=\left[\begin{array}{rr}A_{\alpha\alpha}&A_{\alpha\beta}\\ A_{\beta\alpha}&A_{\beta\beta}\end{array}\right]\left[\begin{array}{r}z_{\alpha}\\ z_{\beta}\end{array}\right]$ and $(w^{*})^{T}z=0,$ $w^{T}z^{*}=0.$ Hence $(w^{*}+\lambda w,z^{*}+\lambda z)$ solves LCP$(q,A)$ for all $\lambda\geq 0.$ Therefore there exists a sequence of vectors $\{\bar{w}^{k}\}$ converging to $w^{*}$ such that each $(\bar{w}^{k},\bar{z}^{k})=(w^{*}+\lambda^{k}w,z^{*}+\lambda^{k}z)$ is a solution of LCP$(q,A)$ with $\bar{w}^{k}=q+A\bar{z}^{k}.$ Since $\bar{w}^{k}\to w^{*}$ and $\bar{z}^{k}\to z^{*},$ it follows that $\bar{w}_{\alpha}^{k}>0$ and $\bar{z}_{\beta}^{k}>0$ for all large $k,$ and by complementarity $\bar{z}_{\alpha}^{k}=0,$ $\bar{w}_{\beta}^{k}=0.$ Consider $v^{k}=\bar{w}^{k}-w^{*}$ and $u^{k}=\bar{z}^{k}-z^{*}.$ The normalized sequence $\{v^{k}/\|v^{k}\|\}$ is bounded and converges to some $v^{*}\neq 0$ as $k\to\infty$; similarly, the normalized sequence $\{u^{k}/\|u^{k}\|\}$ is bounded and converges to some $u^{*}\neq 0$ as $k\to\infty.$ Now for all large $k$ we have $\bar{w}_{\beta}^{k}-w_{\beta}^{*}=\lambda^{k}w_{\beta}=0=A_{\beta\alpha}u_{\alpha}^{k}+A_{\beta\beta}u_{\beta}^{k}.$ Dividing by $\|u^{k}\|$ and letting $k\to\infty,$ we obtain $A_{\beta\alpha}{u_{\alpha}}^{*}+A_{\beta\beta}{u_{\beta}}^{*}=0.$ Therefore $u^{*}=\left[\begin{array}{r}{u_{\alpha}}^{*}\\ {u_{\beta}}^{*}\end{array}\right]\neq 0$ is a nonzero solution of the system $w_{\beta}=A_{\beta\alpha}z_{\alpha}+A_{\beta\beta}z_{\beta}=0,$ which contradicts the assumption that $(z_{\alpha},z_{\beta})=(0,0)$ is the only solution of that system. Hence $A$ is column competent. ∎

4 Conclusion

The complementarity condition is an important issue in operations research. Matrix theoretic approaches help to develop much of the theory of the linear complementarity problem. In this study we consider column competent matrices in the context of the local $w$-uniqueness property, which is important both for the theory and for the solution methods of complementarity problems. The results on $w$-uniqueness and the column competent matrix class motivate further study and applications in matrix theory.

5 Acknowledgement

The author A. Dutta is thankful to the Department of Science and Technology, Govt. of India, INSPIRE Fellowship Scheme for financial support.

References

  • [1] Richard W Cottle, Jong-Shi Pang, and Richard E Stone. The Linear Complementarity Problem, volume 60. SIAM, 1992.
  • [2] AK Das. Properties of some matrix classes based on principal pivot transform. Annals of Operations Research, 243(1-2):375–382, 2016.
  • [3] AK Das, R Jana, and Deepmala. Finiteness of criss-cross method in complementarity problem. In International Conference on Mathematics and Computing, pages 170–180. Springer, 2017.
  • [4] AK Das, R Jana, et al. On generalized positive subdefinite matrices and interior point algorithm. In International Conference on Frontiers in Optimization: Theory and Applications, pages 3–16. Springer, 2016.
  • [5] Dick den Hertog, Cornelis Roos, and Tamás Terlaky. The linear complementarity problem, sufficient matrices, and the criss-cross method. Linear Algebra and Its Applications, 187:1–14, 1993.
  • [6] Aubrey William Ingleton. A problem in linear inequalities. Proceedings of the London Mathematical Society, 3(1):519–536, 1966.
  • [7] R Jana, AK Das, and A Dutta. On hidden z-matrix and interior point algorithm. OPSEARCH, 56(4):1108–1116, 2019.
  • [8] R Jana, A Dutta, and AK Das. More on hidden z-matrices and linear complementarity problem. Linear and Multilinear Algebra, pages 1–10, 2019.
  • [9] SR Mohan, SK Neogy, and AK Das. More on positive subdefinite matrices and the linear complementarity problem. Linear Algebra and Its Applications, 338(1-3):275–285, 2001.
  • [10] SR Mohan, SK Neogy, and AK Das. On the classes of fully copositive and fully semimonotone matrices. Linear Algebra and Its Applications, 323(1-3):87–97, 2001.
  • [11] SK Neogy and AK Das. On almost type classes of matrices with q-property. Linear and Multilinear Algebra, 53(4):243–257, 2005.
  • [12] SK Neogy and AK Das. Principal pivot transforms of some classes of matrices. Linear Algebra and Its Applications, 400:243–252, 2005.
  • [13] SK Neogy and AK Das. Some properties of generalized positive subdefinite matrices. SIAM Journal on Matrix Analysis and Applications, 27(4):988–995, 2006.
  • [14] Jong-Shi Pang. Two characterization theorems in complementarity theory. Operations Research Letters, 7(1):27–31, 1988.
  • [15] Hans Samelson, Robert M Thrall, and Oscar Wesler. A partition theorem for euclidean n-space. Proceedings of the American Mathematical Society, 9(5):805–807, 1958.
  • [16] Jie Sun and Zheng-Hai Huang. A smoothing newton algorithm for the lcp with a sufficient matrix that terminates finitely at a maximally complementary solution. Optimisation Methods and Software, 21(4):597–615, 2006.
  • [17] Hannu Väliaho. P-matrices are just sufficient. Linear Algebra and Its Applications, 239:103–108, 1996.
  • [18] Song Xu. On local w-uniqueness of solutions to linear complementarity problem. Linear Algebra and Its Applications, 290(1-3):23–29, 1999.