2021
Peng-fei Huang [2]
[1] School of Mathematics and Statistics, Kashi University, Kashi 844008, P.R. China
[2] School of Mathematical Sciences and LPMC, Nankai University, Tianjin 300071, P.R. China
The low-rank approximation of fourth-order partial-symmetric and conjugate partial-symmetric tensors
Abstract
We present an orthogonal matrix outer product decomposition for fourth-order conjugate partial-symmetric (CPS) tensors and show that the greedy successive rank-one approximation (SROA) algorithm recovers this decomposition exactly. Based on this matrix decomposition, the CP rank of a CPS tensor can be bounded by a matrix rank, which can be applied to low-rank tensor completion. Additionally, we establish a rank-one equivalence property for CPS tensors based on the matrix SVD, which can be applied to the rank-one approximation of CPS tensors.
keywords:
Conjugate partial-symmetric tensor, approximation algorithm, rank-one equivalence property, convex relaxation
MSC Classification: 15A69, 15B57, 90C26, 41A50
1 Introduction
Tensor decomposition and approximation have significant applications in computer vision, data mining, statistical estimation, and so on; we refer to kolda2009tensor for a survey. Moreover, tensors arising from real applications often carry special symmetric structure. For instance, the symmetric outer product decomposition is particularly important in the process of blind identification of under-determined mixtures comon2008symmetric.
Jiang et al. jiang2016characterizing studied multivariate complex polynomial functions that always take real values. They proposed conjugate partial-symmetric (CPS) tensors to characterize such polynomial functions, generalizing Hermitian matrices. Various examples of conjugate partial-symmetric tensors are encountered in engineering applications arising from signal processing, electrical engineering, and control theory aubry2013ambiguity; de2007fourth. Ni et al. Ni2019Hermitian and Nie et al. Nie2019Hermitian studied Hermitian tensor decomposition. Motivated by Lieven et al. de2007fourth, we propose a new orthogonal matrix outer product decomposition model for CPS tensors, which exploits the orthogonality of the matrices in the decomposition.
It is well known that, unlike the matrix case, the best rank-$r$ ($r\ge 2$) approximation of a general tensor may not exist, and even when it does, it is NP-hard to compute de2008tensor. The greedy successive rank-one approximation (SROA) algorithm can be applied to compute a rank-$r$ approximation of a tensor. However, theoretical guarantees for obtaining the best rank-$r$ approximation are less developed. Zhang et al. zhang2001rank first proved that the successive algorithm exactly recovers the symmetric and orthogonal decomposition of real symmetrically and orthogonally decomposable tensors. Fu et al. 2018Successive showed that the SROA algorithm exactly recovers unitarily decomposable CPS tensors. We offer a theoretical guarantee for the SROA algorithm applied to our matrix decomposition model of CPS tensors.
Many multi-dimensional data sets from real practice are fourth-order tensors and can be formulated as low-CP-rank tensors. However, it is very difficult to compute the CP rank of a tensor. Jiang et al. Jiang2018Low showed that the CP rank can be bounded by the rank of the square unfolding matrix of the tensor. Following this idea, we study low-rank tensor completion for fourth-order partial-symmetric tensors in particular.
Recently, Jiang et al. 2015Tensor proposed convex relaxations for solving a tensor optimization problem closely related to the best rank-one approximation problem for symmetric tensors. They proved an equivalence property between a rank-one symmetric tensor and its unfolding matrix. Yang et al. Yuning2016Rank studied the rank-one equivalence property for general real tensors. Based on these rank-one equivalence properties, the above-mentioned tensor optimization problem can be cast into a matrix optimization problem, which alleviates the difficulty of solving the tensor problem. In line with this idea, we study the rank-one equivalence property for fourth-order CPS tensors and transform the best rank-one tensor approximation problem into a matrix optimization problem.
In Section 2, we give some notations and definitions. In Section 3, we propose the matrix-based outer product approximation model and give the successive matrix rank-one approximation (SMROA) algorithm to solve it. We show that the SMROA algorithm exactly recovers the matrix outer product decomposition or approximation of a CPS tensor in Section 4. Section LABEL:sec:application briefly discusses applications of our model. In Section LABEL:sec:rank1Equ, we present the rank-one equivalence property of fourth-order CPS tensors, and based on it an application is proposed. Numerical examples are reported in Section LABEL:sec:numerical.
2 Preliminary
All tensors in this paper are fourth-order. For any complex number $c$, $\bar{c}$ denotes the conjugate of $c$. "$\circ$" denotes the outer product of matrices, namely $\mathcal{F}=A\circ B$ means that $\mathcal{F}_{ijkl}=A_{ij}B_{kl}$.
$\mathbb{S}^{n\times n}$ denotes the set of $n\times n$ symmetric matrices; the entries of these matrices can be complex or only real according to the context, without causing ambiguity. The inner product between $A,B\in\mathbb{C}^{n\times n}$ is defined as $\langle A,B\rangle=\sum_{i,j}\bar{A}_{ij}B_{ij}$.
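As a concrete illustration, the matrix outer product and this inner product can be checked numerically; the following numpy sketch uses variable names of our own choosing, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 3
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
B = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))

# outer product of matrices: (A ∘ B)_{ijkl} = A_{ij} B_{kl}
F = np.einsum('ij,kl->ijkl', A, B)

# inner product <A, B> = sum_{ij} conj(A_{ij}) B_{ij}
ip = np.vdot(A, B)  # np.vdot conjugates its first argument
```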
Definition 1.
A fourth-order tensor $\mathcal{S}$ is called symmetric if $\mathcal{S}_{ijkl}$ is invariant under all permutations of its indices, i.e., $\mathcal{S}_{ijkl}=\mathcal{S}_{\pi(i)\pi(j)\pi(k)\pi(l)}$ for every permutation $\pi$ of the four index positions.
Definition 2.
Ni2019Hermitian A fourth-order complex tensor $\mathcal{H}$ is called a Hermitian tensor if $\mathcal{H}_{ijkl}=\overline{\mathcal{H}_{klij}}$ for all $i,j,k,l$.
Jiang et al. jiang2016characterizing introduced the concept of conjugate partial-symmetric tensors as follows.
Definition 3.
A fourth-order complex tensor $\mathcal{F}$ is called conjugate partial-symmetric (CPS) if $\mathcal{F}_{ijkl}=\mathcal{F}_{jikl}=\overline{\mathcal{F}_{klij}}$ for all $i,j,k,l$.
Definition 4.
A fourth-order real tensor $\mathcal{F}$ is called partial-symmetric if $\mathcal{F}_{ijkl}=\mathcal{F}_{jikl}=\mathcal{F}_{ijlk}=\mathcal{F}_{klij}$ for all $i,j,k,l$.
Example 1.
de2007fourth In the blind source separation problem, the fourth-order cumulant tensor of the observed signal $x\in\mathbb{C}^n$ is computed as $\mathcal{C}_{ijkl}=\mathrm{cum}(x_i,\bar{x}_j,x_k,\bar{x}_l)$.
By a permutation of the indices, it is in fact a conjugate partial-symmetric tensor.
Definition 5.
The square unfolding $M(\mathcal{F})\in\mathbb{C}^{n^2\times n^2}$ of a fourth-order tensor $\mathcal{F}\in\mathbb{C}^{n\times n\times n\times n}$ is defined as $M(\mathcal{F})_{(i-1)n+j,\,(k-1)n+l}=\mathcal{F}_{ijkl}$.
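With row-major storage, this square unfolding is a plain reshape in numpy. The sketch below (our own helper name, not from the paper) also checks that the unfolding of a simple CPS tensor is Hermitian:

```python
import numpy as np

def square_unfold(F):
    """M(F)_{(i-1)n+j, (k-1)n+l} = F_{ijkl}: a row-major reshape."""
    n = F.shape[0]
    return F.reshape(n * n, n * n)

rng = np.random.default_rng(0)
n = 3
X = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
M = (X + X.T) / 2                                  # complex symmetric matrix
F = 2.0 * np.einsum('ij,kl->ijkl', M, M.conj())    # a simple CPS tensor
U = square_unfold(F)                               # Hermitian for CPS input
```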
3 Matrix outer product approximation model
Jiang et al. Jiang2018Low introduced the notion of M-decomposition for an even-order tensor $\mathcal{F}$, which is exactly the rank-one decomposition of its square unfolding matrix $M(\mathcal{F})$, together with the notion of tensor M-rank.
For a fourth-order tensor $\mathcal{F}$, let $M(\mathcal{F})=\sum_{i=1}^{r}\sigma_i u_i v_i^H$ be the SVD of $M(\mathcal{F})$; then $\mathcal{F}$ has the following decomposition form
$$\mathcal{F}=\sum_{i=1}^{r}\sigma_i A_i\circ B_i, \qquad (1)$$
where $A_i,B_i\in\mathbb{C}^{n\times n}$ with $\mathrm{vec}(A_i)=u_i$, $\mathrm{vec}(B_i)=\bar{v}_i$ for $i=1,\dots,r$, $\sigma_i>0$, and $\langle A_i,A_j\rangle=\langle B_i,B_j\rangle=\delta_{ij}$.
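Equation (1) can be verified numerically: fold the singular vectors of the square unfolding back into matrices and re-sum. This is a sketch under our own naming.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 3
F = rng.standard_normal((n,) * 4) + 1j * rng.standard_normal((n,) * 4)

# SVD of the square unfolding: M(F) = sum_i sigma_i u_i v_i^H
U, s, Vh = np.linalg.svd(F.reshape(n * n, n * n))

# fold the singular vectors into matrices; rows of Vh are v_i^H = conj(v_i)^T
recon = np.zeros((n,) * 4, dtype=complex)
for sigma, u, vrow in zip(s, U.T, Vh):
    A = u.reshape(n, n)       # vec(A_i) = u_i
    B = vrow.reshape(n, n)    # vec(B_i) = conj(v_i)
    recon += sigma * np.einsum('ij,kl->ijkl', A, B)
```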
We are particularly interested in tensors with symmetric properties. Analogous to Lieven et al. de2007fourth, we prove that a CPS tensor has the following matrix-based decomposition.
Theorem 1.
If $\mathcal{F}$ is a conjugate partial-symmetric tensor, then it can be decomposed as
$$\mathcal{F}=\sum_{i=1}^{r}\lambda_i M_i\circ\overline{M}_i, \qquad (2)$$
where $\lambda_i\in\mathbb{R}$, the $M_i\in\mathbb{C}^{n\times n}$ are symmetric matrices, and $\langle M_i,M_j\rangle=\delta_{ij}$ for $i,j=1,\dots,r$. The decomposition is unique when the $\lambda_i$ are different from each other.
Proof: Since $\mathcal{F}$ is conjugate partial-symmetric, the unfolding matrix $M(\mathcal{F})$ is Hermitian and admits the eigen-decomposition
$$M(\mathcal{F})=\sum_{i=1}^{r}\lambda_i u_i u_i^H,$$
where $\lambda_i\in\mathbb{R}$ and $u_1,\dots,u_r$ are mutually orthonormal. Folding each $u_i$ into a matrix $M_i$ via $\mathrm{vec}(M_i)=u_i$, the $M_i$ are mutually orthogonal, that is, $\langle M_i,M_j\rangle=u_i^H u_j=\delta_{ij}$. In this case, we have $\mathcal{F}=\sum_{i=1}^{r}\lambda_i M_i\circ\overline{M}_i$.
From the eigen-decomposition of $M(\mathcal{F})$, we have $\lambda_t(M_t)_{ij}=\sum_{k,l}\mathcal{F}_{ijkl}(M_t)_{kl}$ for any $i,j$. Since $\mathcal{F}_{ijkl}=\mathcal{F}_{jikl}$ for all $i,j,k,l$, then $\lambda_t(M_t)_{ij}=\lambda_t(M_t)_{ji}$; thus $M_t$ is symmetric. The uniqueness of the decomposition follows naturally from the properties of the eigen-decomposition of a Hermitian matrix.
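The construction in this proof can be sketched numerically (our own minimal example, not from the paper): build a CPS tensor from complex symmetric matrices, eigen-decompose its Hermitian unfolding, fold the eigenvectors back, and confirm the tensor is reproduced.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 3
# build a CPS tensor as sum_i lambda_i * M_i ∘ conj(M_i), M_i complex symmetric
lams = [2.0, -1.0]
Ms = []
for _ in lams:
    X = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
    Ms.append((X + X.T) / 2)
F = sum(l * np.einsum('ij,kl->ijkl', M, M.conj()) for l, M in zip(lams, Ms))

# eigen-decomposition of the Hermitian unfolding M(F)
w, V = np.linalg.eigh(F.reshape(n * n, n * n))

# fold each eigenvector back into a matrix and re-sum
recon = np.zeros((n,) * 4, dtype=complex)
for lam, v in zip(w, V.T):
    Mi = v.reshape(n, n)
    recon += lam * np.einsum('ij,kl->ijkl', Mi, Mi.conj())
```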
Remark 1.
Jiang et al. jiang2016characterizing gave a decomposition theorem for CPS tensors similar to Theorem 1. However, they established it from the viewpoint of polynomial decomposition and did not explore the mutual orthogonality of the matrices in the decomposition model.
Definition 6.
Suppose $\mathcal{F}$ is a CPS tensor; define
$$\mathrm{rank}_{SM}(\mathcal{F})=\min\Big\{r:\ \mathcal{F}=\sum_{i=1}^{r}\lambda_i M_i\circ\overline{M}_i,\ \lambda_i\in\mathbb{R},\ M_i\ \text{symmetric}\Big\}.$$
The $\mathrm{rank}_{SM}$ is actually the strongly symmetric M-rank defined by Jiang et al. Jiang2018Low. For symmetric tensors, they also proved the equivalence between the strongly symmetric M-rank and $\mathrm{rank}(M(\mathcal{F}))$.
This is also true for fourth-order CPS tensors.
Theorem 2.
Let $\mathcal{F}$ be a CPS tensor; then $\mathrm{rank}_{SM}(\mathcal{F})=\mathrm{rank}(M(\mathcal{F}))$.
Proof: It is obvious that $\mathrm{rank}_{SM}(\mathcal{F})\ge\mathrm{rank}(M(\mathcal{F}))$. On the other hand, if $\mathrm{rank}(M(\mathcal{F}))=r$, Theorem 1 gives a decomposition $\mathcal{F}=\sum_{i=1}^{r}\lambda_i M_i\circ\overline{M}_i$ with symmetric $M_i$. Since $\mathrm{rank}_{SM}(\mathcal{F})\le r$, we obtain the desired conclusion.
Corollary 3.
Let $\mathcal{F}$ be a partial-symmetric tensor; then one has
$$\mathcal{F}=\sum_{i=1}^{r}\lambda_i M_i\circ M_i, \qquad (3)$$
where $\lambda_i\in\mathbb{R}$, the $M_i$ are real symmetric matrices, and $\langle M_i,M_j\rangle=\delta_{ij}$ for $i,j=1,\dots,r$, with $r\le n(n+1)/2$.
Proof: The first part is obvious according to Theorem 1. Since all symmetric matrices in $\mathbb{R}^{n\times n}$ form an $n(n+1)/2$-dimensional vector space, we have $r\le n(n+1)/2$.

Fu et al. fu2018decompositions gave a rank-one decomposition of vector form for the CPS tensor based on Theorem 1, as follows.
Theorem 4.
(fu2018decompositions, Theorem 3.2) $\mathcal{F}$ is CPS if and only if it has the following partial-symmetric decomposition
$$\mathcal{F}=\sum_{i=1}^{m}\lambda_i\, a_i\circ a_i\circ\bar{a}_i\circ\bar{a}_i,$$
where $\lambda_i\in\mathbb{R}$ and $a_i\in\mathbb{C}^n$. That is, a CPS tensor can be decomposed as the sum of rank-one CPS tensors.
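One can check directly that each term $a\circ a\circ\bar{a}\circ\bar{a}$ with a real weight is itself CPS; the small numerical check below is our own illustration.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 4
a = rng.standard_normal(n) + 1j * rng.standard_normal(n)
lam = -1.5  # real weight

# rank-one CPS term: F_{ijkl} = lam * a_i a_j conj(a_k) conj(a_l)
F = lam * np.einsum('i,j,k,l->ijkl', a, a, a.conj(), a.conj())

# CPS symmetries: F_{ijkl} = F_{jikl} = F_{ijlk} = conj(F_{klij})
sym12 = np.allclose(F, F.transpose(1, 0, 2, 3))
sym34 = np.allclose(F, F.transpose(0, 1, 3, 2))
conj_pair = np.allclose(F, F.transpose(2, 3, 0, 1).conj())
```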
However, when we restrict the decomposition to the real domain, such a decomposition no longer holds, since $\sum_i\lambda_i\,a_i\circ a_i\circ a_i\circ a_i$ with $\lambda_i\in\mathbb{R}$ and $a_i\in\mathbb{R}^n$ can only represent symmetric tensors. Thus, an extended rank-one approximation model for partial-symmetric tensors can be proposed based on Corollary 3.
Corollary 5.
Let $\mathcal{F}$ be a partial-symmetric tensor; then it can be decomposed as a sum of simple low-rank partial-symmetric tensors,
$$\mathcal{F}=\sum_{i=1}^{r}\lambda_i\sum_{j,k}\mu_{ij}\mu_{ik}\,u_{ij}\circ u_{ij}\circ u_{ik}\circ u_{ik}. \qquad (4)$$
Proof: From Corollary 3, a partial-symmetric tensor admits $\mathcal{F}=\sum_{i=1}^{r}\lambda_i M_i\circ M_i$, where the $M_i$ are real symmetric. Each $M_i$ can be decomposed as $M_i=\sum_{j}\mu_{ij}u_{ij}u_{ij}^T$, thus
$$M_i\circ M_i=\sum_{j,k}\mu_{ij}\mu_{ik}\,(u_{ij}u_{ij}^T)\circ(u_{ik}u_{ik}^T)=\sum_{j,k}\mu_{ij}\mu_{ik}\,u_{ij}\circ u_{ij}\circ u_{ik}\circ u_{ik}.$$
The desired decomposition form follows.
Remark 2.
From the proof of Corollary 5, we can see that if $\mathrm{rank}_{SM}(\mathcal{F})=r$, then $\mathcal{F}$ can be written as a sum of at most $rn^2$ such rank-one terms. Whether this decomposition form is the most compact one will be part of our future work.
We can discuss the case of skew partial-symmetric tensors in parallel.
Theorem 6.
We call $\mathcal{F}$ a skew partial-symmetric tensor if
$$\mathcal{F}_{ijkl}=\mathcal{F}_{jikl}=\mathcal{F}_{ijlk}=-\mathcal{F}_{klij}.$$
Then $M(\mathcal{F})$ is skew-symmetric, and one has
$$\mathcal{F}=\sum_{i=1}^{r}\sigma_i\,(A_i\circ B_i-B_i\circ A_i),$$
where the $A_i,B_i$ are symmetric and mutually orthogonal.
Proof: $M(\mathcal{F})$ is skew-symmetric according to the definition of the skew partial-symmetric tensor. Then $M(\mathcal{F})=\sum_{i=1}^{r}\sigma_i(u_iv_i^T-v_iu_i^T)$ with mutually orthonormal $u_i,v_i$. The rest of the proof is similar to that for partial-symmetric tensors, so we omit it here.
Based on Theorem 1, we propose a matrix outer product approximation model for the CPS tensor as follows:
$$\min_{\lambda_i\in\mathbb{R},\,M_i\in\mathbb{C}^{n\times n}}\ \Big\|\mathcal{F}-\sum_{i=1}^{r}\lambda_i M_i\circ\overline{M}_i\Big\|^2\quad\text{s.t.}\quad M_i^T=M_i,\ \langle M_i,M_j\rangle=\delta_{ij}. \qquad (5)$$
The successive rank-one approximation algorithm can also be applied to conjugate partial-symmetric tensors to find matrix outer product decompositions or approximations, as shown in Algorithm 1.
Given a CPS tensor $\mathcal{F}$, initialize $\mathcal{F}_0=\mathcal{F}$. For $k=1,2,\dots,r$, compute a matrix rank-one approximation $(\lambda_k,M_k)$ of $\mathcal{F}_{k-1}$ and update $\mathcal{F}_k=\mathcal{F}_{k-1}-\lambda_k M_k\circ\overline{M}_k$.
The main optimization problem in Algorithm 1 can be expressed as
$$\min_{\lambda\in\mathbb{R},\,M^T=M,\,\|M\|=1}\ \big\|\mathcal{F}-\lambda\,M\circ\overline{M}\big\|^2. \qquad (6)$$
The objective function of (6) can be rewritten as
$$\|\mathcal{F}\|^2-2\lambda\langle\mathcal{F},M\circ\overline{M}\rangle+\lambda^2,$$
from which we can derive that problem (6) is equivalent to
$$\max_{M^T=M,\,\|M\|=1}\ \big|\langle\mathcal{F},M\circ\overline{M}\rangle\big| \qquad (7)$$
with $\lambda=\langle\mathcal{F},M\circ\overline{M}\rangle$. We can solve (7) by transforming it into a matrix eigenproblem as follows:
$$\max_{\|m\|=1}\ \big|m^H M(\mathcal{F})\,m\big|,\qquad m=\mathrm{vec}(M), \qquad (8)$$
whose optimal solution is a unit eigenvector of $M(\mathcal{F})$ associated with the eigenvalue of largest absolute value.
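Putting (6)-(8) together, one greedy pass can be sketched as repeated extraction of the extremal eigenpair of the unfolding, followed by deflation. This is a sketch under our own naming, not the paper's reference implementation.

```python
import numpy as np

def smroa(F, r):
    """Greedy successive matrix rank-one approximation of a CPS tensor F."""
    n = F.shape[0]
    R = F.copy()
    lams, mats = [], []
    for _ in range(r):
        U = R.reshape(n * n, n * n)      # Hermitian unfolding of the residual
        w, V = np.linalg.eigh(U)
        k = np.argmax(np.abs(w))         # eigenpair solving (8)
        lam, m = w[k], V[:, k]
        M = m.reshape(n, n)
        lams.append(lam)
        mats.append(M)
        R = R - lam * np.einsum('ij,kl->ijkl', M, M.conj())  # deflation
    return lams, mats, R

# a CPS tensor built from two symmetric matrices is annihilated in two steps
rng = np.random.default_rng(4)
n = 3
Ms = []
for _ in range(2):
    X = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
    Ms.append((X + X.T) / 2)
F = 2.0 * np.einsum('ij,kl->ijkl', Ms[0], Ms[0].conj()) \
    - 1.0 * np.einsum('ij,kl->ijkl', Ms[1], Ms[1].conj())
lams, mats, R = smroa(F, 2)
```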
Remark 3.
Zhang et al. zhang proved that if $\mathcal{F}$ is symmetric, then
$$\max_{\|x\|=1}|\mathcal{F}(x,x,x,x)|=\max_{\|x_1\|=\cdots=\|x_4\|=1}|\mathcal{F}(x_1,x_2,x_3,x_4)|;$$
if $\mathcal{F}$ is symmetric with respect to the first two and the last two modes respectively, then
$$\max_{\|x\|=\|y\|=1}|\mathcal{F}(x,x,y,y)|=\max_{\|x_1\|=\cdots=\|x_4\|=1}|\mathcal{F}(x_1,x_2,x_3,x_4)|.$$
It is obvious that for a partial-symmetric tensor, we also have
$$\max_{\|x\|=\|y\|=1}|\mathcal{F}(x,x,y,y)|=\max_{\|x_1\|=\cdots=\|x_4\|=1}|\mathcal{F}(x_1,x_2,x_3,x_4)|.$$
Remark 4.
It is well known that (6) is equivalent to the nearest Kronecker product problem golub2013matrix as below:
$$\min_{\lambda\in\mathbb{R},\,M\in\mathbb{C}^{n\times n}}\ \big\|X-\lambda\,M\otimes\overline{M}\big\|_F,$$
where $X\in\mathbb{C}^{n^2\times n^2}$ with $X_{(i-1)n+k,\,(j-1)n+l}=\mathcal{F}_{ijkl}$, and "$\otimes$" denotes the Kronecker product of matrices.
4 Exact Recovery for CPS tensors
In this section, we give a theoretical analysis of the exact recovery of CPS tensors by the SMROA algorithm.
Theorem 7.
Let $\mathcal{F}$ be a CPS tensor with $\mathrm{rank}_{SM}(\mathcal{F})=r$, that is,
$$\mathcal{F}=\sum_{i=1}^{r}\lambda_i M_i\circ\overline{M}_i,\qquad \lambda_i\in\mathbb{R},\ M_i^T=M_i,\ \langle M_i,M_j\rangle=\delta_{ij}.$$
If the $\lambda_i$ are different from each other, then the SMROA algorithm obtains the exact decomposition of $\mathcal{F}$ after $r$ iterations.
We first state the following lemma before proving the above theorem.
Lemma 8.
Let $\mathcal{F}$ be a CPS tensor with $\mathrm{rank}_{SM}(\mathcal{F})=r$, that is,
$$\mathcal{F}=\sum_{i=1}^{r}\lambda_i M_i\circ\overline{M}_i,$$
where the $\lambda_i$ are different from each other. Suppose $(\lambda^*,M^*)$ is an optimal solution of problem (6) for $\mathcal{F}$. Then, there exists $j\in\{1,\dots,r\}$ such that
$$\lambda^*=\lambda_j,\qquad M^*\circ\overline{M^*}=M_j\circ\overline{M_j}.$$
Proof: According to Theorem 1, the $M_i$ are mutually orthogonal; thus $\{M_1,\dots,M_r\}$ is a subset of an orthonormal basis $\{M_1,\dots,M_{n^2}\}$ of $\mathbb{C}^{n\times n}$. Let $M^*=\sum_{i=1}^{n^2}c_i M_i$, where $c_i=\langle M_i,M^*\rangle$ for $i=1,\dots,n^2$. Since $\|M^*\|=1$, we have $\sum_{i=1}^{n^2}|c_i|^2=1$.