Here $y \in \mathbb{R}^m$ is a measurement vector, $A \in \mathbb{R}^{m \times n}$ is a sensing matrix, $x \in \mathbb{R}^n$ is a $K$-sparse signal (i.e., $|\mathrm{supp}(x)| \leq K$, where $\mathrm{supp}(x) = \{i : x_i \neq 0\}$ is the support of $x$ and $|\mathrm{supp}(x)|$ is the cardinality of $\mathrm{supp}(x)$), and $v \in \mathbb{R}^m$ represents the measurement noise. Then, under some conditions on $A$, CS can accurately recover the support of $x$ based on $y$ and $A$.
In applications such as direction of arrival estimation, the nonzero entries of $x$ occur in blocks (or clusters). Such signals are referred to as block sparse signals in this paper.
To mathematically define block sparsity, we view $x$ as a concatenation of $L$ blocks:
$$x = [\,x[1]^T, x[2]^T, \ldots, x[L]^T\,]^T,$$
where $x[i] \in \mathbb{R}^d$ denotes the $i$th block of $x$. Then,
(Definition 1) A vector $x$ is called block $K$-sparse if $x[i]$ is nonzero for at most $K$ indices $i$.
Then, by Definition 1, we have and .
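As a concrete illustration of Definition 1, the following Python/NumPy sketch (the helper names `block_support` and `is_block_k_sparse` are ours, not from the paper) checks block sparsity by viewing a vector as a concatenation of length-$d$ blocks:

```python
import numpy as np

def block_support(x, d):
    # View x as L = len(x) // d blocks of length d and return the
    # indices of the nonzero blocks (the "block support" of x).
    blocks = x.reshape(-1, d)
    return [i for i, b in enumerate(blocks) if np.any(b != 0)]

def is_block_k_sparse(x, d, K):
    # A vector is block K-sparse if at most K of its blocks are nonzero.
    return len(block_support(x, d)) <= K

# Example: L = 4 blocks of length d = 3; only blocks 1 and 3 are nonzero,
# so x is block 2-sparse but not block 1-sparse.
x = np.array([0, 0, 0, 1.0, -2.0, 0.5, 0, 0, 0, 3.0, 0, 0])
print(block_support(x, 3))         # [1, 3]
print(is_block_k_sparse(x, 3, 2))  # True
```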
Similar to $x$, we also represent $A$ as a concatenation of column-blocks $A[i]$ of size $m \times d$, i.e.,
Since block sparse signals arise in many fields, this paper focuses on studying the recovery of $x$ from the measurements
To this end, we introduce the definition of the block restricted isometry property (RIP).
To efficiently recover block sparse signals, the block OMP (BOMP) algorithm, described in Algorithm 1, has been proposed in the literature. Recently, some sufficient conditions for exact or stable recovery of block sparse signals with BOMP were investigated using the RIP, and these sufficient conditions were also proved to be sharp in the noiseless case.
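To make the greedy structure concrete, here is a minimal Python/NumPy sketch of a BOMP-style iteration; it follows the standard template (block-wise correlation with the residual, least-squares re-fit on the selected blocks, residual-norm stopping rule) and is not a verbatim transcription of Algorithm 1:

```python
import numpy as np

def bomp(A, y, d, K, tol=1e-10):
    # A has L = n // d column-blocks of size m x d; K >= 1 is the block
    # sparsity level. Returns an estimate of x and the selected block indices.
    n = A.shape[1]
    L = n // d
    S = []           # indices of the selected blocks
    r = y.copy()     # current residual
    for _ in range(K):
        # Pick the block whose columns are most correlated with the residual.
        corr = [np.linalg.norm(A[:, i*d:(i+1)*d].T @ r) for i in range(L)]
        S.append(int(np.argmax(corr)))
        cols = np.concatenate([np.arange(i*d, (i+1)*d) for i in S])
        # Least-squares re-fit on all selected blocks, then update residual.
        x_S, *_ = np.linalg.lstsq(A[:, cols], y, rcond=None)
        r = y - A[:, cols] @ x_S
        if np.linalg.norm(r) <= tol:   # stopping rule
            break
    x_hat = np.zeros(n)
    x_hat[cols] = x_S
    return x_hat, sorted(S)
```

With an orthonormal sensing matrix and noiseless measurements this sketch recovers the block support exactly; in general, when such recovery is guaranteed is precisely what RIP-based conditions of the kind studied in this paper characterize.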
In order to analyze the recoverability of BOMP in the noisy case, we investigate a sufficient condition for the exact support recovery of block sparse signals with $K$ iterations of BOMP. The condition reduces to the one for the noiseless case when the noise vanishes, in which case it coincides with the previously published result.
The rest of the paper is organized as follows. We present our new sufficient conditions in Section II and prove them in Section III. The paper is summarized in Section IV.
II Main Results
Similar to prior work, we define the mixed $\ell_2/\ell_1$-norm as
$$\|x\|_{2,1} = \sum_{i=1}^{L} \|x[i]\|_2,$$
where $x[i]$ denotes the $i$th block of $x$. Then our sufficient condition for the exact support recovery of block $K$-sparse signals with BOMP is as follows:
Equivalently, we need to show
Since , it is clear that (10) holds if
which is equivalent to
We have obtained a less restrictive sufficient condition for the exact support recovery of block $K$-sparse signals with the BOMP algorithm based on the RIC. The weaker the RIC bound is, the fewer measurements are needed. The improved RIC results can be used in many CS-based applications.
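For concreteness, assuming the mixed $\ell_2/\ell_1$-norm is the usual block norm $\|x\|_{2,1} = \sum_i \|x[i]\|_2$ (the standard convention in block-sparse recovery), it can be computed as follows; `mixed_norm_21` is an illustrative name of ours:

```python
import numpy as np

def mixed_norm_21(x, d):
    # View x as blocks of length d; sum the Euclidean norms of the blocks.
    return float(sum(np.linalg.norm(b) for b in x.reshape(-1, d)))

# Blocks [3, 4], [0, 0], [5, 12] have norms 5, 0, 13, so the value is 18.
x = np.array([3.0, 4.0, 0.0, 0.0, 5.0, 12.0])
print(mixed_norm_21(x, 2))  # 18.0
```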
In the following, we study the worst-case necessary condition for the exact support recovery by BOMP. Recall that BOMP may fail to recover the support of $x$ from the measurements under certain conditions [14, Theorem 2]. Therefore, this requirement naturally remains necessary in the noisy case. Thus, we want to obtain the corresponding worst-case necessary condition for the noisy setting.
For any given positive integers , , and any real number , , we construct a matrix function, a block $K$-sparse signal, and a noise vector. Let
where the first denotes the identity matrix and the second denotes the matrix with all of its entries equal to 0,
and is the first coordinate unit vector. So, is supported on , and .
By simple calculations we get
, the eigenvalues of are
When and , the eigenvalues of are
Thus, the RIP constant of is for .
In the following, we will show that the block RIP constant of is .
Consider any block $K$-sparse vector. Let , with for and . Then
On the other hand, we have
We now show that BOMP may fail to recover from
Recalling the BOMP algorithm, in order to prove this theorem, we only need to show
This completes the proof. ∎
III Proof of Theorem 1
where (a) follows from (4) and supp. The symbol denotes the orthogonal projection onto the range space of , and .
It is worth mentioning that the residual is orthogonal to the columns of , i.e.,
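This orthogonality is the defining property of the orthogonal projection used in the residual update; a quick numerical check on an arbitrary random instance (Python/NumPy) illustrates it:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((8, 6))
y = rng.standard_normal(8)

S_cols = [0, 1, 2]                 # columns of the selected blocks
A_S = A[:, S_cols]
P = A_S @ np.linalg.pinv(A_S)      # orthogonal projection onto range(A_S)
r = y - P @ y                      # residual after the projection

# The residual is (numerically) orthogonal to every selected column.
print(np.max(np.abs(A_S.T @ r)))
```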
III-A Main Analysis
The proof of Theorem 1 is related to that of prior work; we give a brief sketch here. The proof consists of two steps. In the first step, we show that BOMP chooses a correct index in each iteration. In the second step, we show that BOMP performs exactly $K$ iterations.
We prove the first step by induction. If BOMP selects a correct index at an iteration, we say that BOMP succeeds at that iteration. First, we present the condition guaranteeing that BOMP succeeds in the first iteration. Then, supposing that BOMP has succeeded in the first $k-1$ iterations, we show that BOMP also succeeds in the $k$th iteration. Here, we assume .
The proof for the first selection corresponds to the case $k=1$. Clearly, the induction hypothesis holds in this case since .
If BOMP has been successful in the previous iterations, then and . Hence, BOMP will succeed in the $k$th iteration provided that (see Algorithm 1). Based on step 2 of Algorithm 1 and (29), in order to show that in the $k$th iteration, we need to show
From (III-A), for any , it suffices to show
III-B Proof of inequality (31)
with and supp. For simplicity, we denote
By (32) and the Cauchy-Schwarz inequality, we have
where (a) follows from (6), (b) follows from the Cauchy-Schwarz inequality, (c) is from , and (d) from .
Now, we can present a lower bound for the left-hand side of (31).
So, to show (31), we only need to show .
By the property of block RIP, it follows that
It remains to show that BOMP stops under the stopping rule when it performs exactly iterations. Hence, we need to prove for and .
Similarly, from (28),
where (a) is from . Thus, BOMP performs exactly $K$ iterations.
In this paper, we have presented a sufficient condition, weaker than existing ones, for the exact support recovery of block $K$-sparse signals with $K$ iterations of BOMP in the noisy case.
V Proof of Proposition 1
VI Proof of Lemma 1