1 Introduction
A two-dimensional (2D) chirp signal model is expressed mathematically as follows:

(1)   y(m, n) = A⁰ cos(α⁰m + β⁰m² + γ⁰n + δ⁰n²) + B⁰ sin(α⁰m + β⁰m² + γ⁰n + δ⁰n²) + X(m, n),
      m = 1, …, M; n = 1, …, N.

Here the y(m, n)s are the signal observations, A⁰ and B⁰ are real-valued, nonzero amplitudes, and α⁰, γ⁰ and β⁰, δ⁰ are the frequencies and the frequency rates, respectively. The random variables {X(m, n)} form a sequence of stationary errors. The explicit assumptions on the error structure are provided in section 2. The above model has been considered in many areas of image processing, particularly in modeling gray images. Several estimation techniques for the unknown parameters of this model have been considered by different authors, for instance, Friedlander and Francos [7], Francos and Friedlander [5], [6], Lahiri [10], [11] and the references cited therein.
Our goal is to estimate the unknown parameters of the above model, primarily the nonlinear parameters: the frequencies α⁰, γ⁰ and the frequency rates β⁰, δ⁰, under certain suitable assumptions. One of the most straightforward and efficient ways to do so is the least squares estimation method. But since the least squares surface is highly nonlinear, iterative methods must be employed to compute the estimates, and for these methods to work we need good starting points for the unknown parameters.
One of the fundamental models in the statistical signal processing literature, among the 2D models, is the 2D sinusoidal model. This model has applications in many fields, such as biomedical spectral analysis and geophysical exploration. For references, see Barbieri and Barone [2], Cabrera and Bose [3], Hua [8], Zhang and Mandrekar [17], Prasad et al. [13], Nandi et al. [12] and Kundu and Nandi [9].
A 2D sinusoidal model has the following mathematical expression:

y(m, n) = A⁰ cos(λ⁰m + μ⁰n) + B⁰ sin(λ⁰m + μ⁰n) + X(m, n), m = 1, …, M; n = 1, …, N.

For this model as well, the least squares surface is highly nonlinear, and thus we need good initial values for any iterative procedure to work. One of the most prevalent methods to find the initial guesses for the 2D sinusoidal model is the periodogram estimator. These estimates are obtained by maximizing the 2D periodogram function, which is defined as follows:

I(λ, μ) = (1/MN) |∑_{m=1}^{M} ∑_{n=1}^{N} y(m, n) e^{−i(λm + μn)}|².

This periodogram function is maximized over the 2D Fourier frequencies, that is, at (2πj/M, 2πk/N), for j = 0, …, M − 1 and k = 0, …, N − 1. The estimators obtained by maximising the above periodogram function with respect to λ and μ simultaneously over the continuous space (0, π) × (0, π) are known as the approximate least squares estimators (ALSEs). Kundu and Nandi [9] proved that the ALSEs are consistent and asymptotically equivalent to the least squares estimators (LSEs).
Analogously, we define a periodogram-type function for the 2D chirp model defined in equation (1), as follows:

(2)   I(α, β, γ, δ) = (2/MN) |∑_{m=1}^{M} ∑_{n=1}^{N} y(m, n) e^{−i(αm + βm² + γn + δn²)}|².

To find the initial values, we propose to maximise the above function at the grid points (πj/M, πk/M², πl/N, πv/N²), for j = 1, …, M; k = 1, …, M²; l = 1, …, N; v = 1, …, N², analogous to the Fourier frequencies of the 2D sinusoidal model. These starting values can be used in any iterative procedure to compute the LSEs and ALSEs.
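The grid search above can be sketched numerically. This is a minimal illustration under the assumed model form y(m, n) = A cos(αm + βm² + γn + δn²) + B sin(·) + noise and the grid just described; the function and variable names are ours, not the paper's.

```python
import numpy as np

def chirp_grid_search(y):
    """Evaluate the periodogram-like function I(a, b, c, d) on the grid
    (pi j/M, pi k/M^2, pi l/N, pi v/N^2) and return the maximising point.
    Exploits the separable structure of the phase in m and n."""
    M, N = y.shape
    m = np.arange(1.0, M + 1)
    n = np.arange(1.0, N + 1)
    a = np.pi * np.arange(1, M + 1) / M            # alpha grid
    b = np.pi * np.arange(1, M**2 + 1) / M**2      # beta grid
    c = np.pi * np.arange(1, N + 1) / N            # gamma grid
    d = np.pi * np.arange(1, N**2 + 1) / N**2      # delta grid
    # U[(j,k), m] = exp(-i(a_j m + b_k m^2)); V[(l,v), n] analogous
    U = np.exp(-1j * (a[:, None, None] * m + b[None, :, None] * m**2)).reshape(-1, M)
    V = np.exp(-1j * (c[:, None, None] * n + d[None, :, None] * n**2)).reshape(-1, N)
    S = U @ y @ V.T                                # all grid sums at once
    I = 2.0 / (M * N) * np.abs(S) ** 2             # periodogram-like values
    i, j = np.unravel_index(np.argmax(I), I.shape)
    ja, kb = divmod(i, M**2)
    lc, vd = divmod(j, N**2)
    return a[ja], b[kb], c[lc], d[vd]
```

On a small noiseless chirp whose parameters lie on the grid, the search recovers them, up to the reflection ambiguity of real-valued chirps (a real chirp with parameters (α, β, γ, δ) is indistinguishable from one with (π − α, π − β, π − γ, π − δ) and a sign change in B).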
Next we propose to estimate the unknown parameters of model (1) by the approximate least squares estimation method. In this method, we maximize the periodogram-like function defined above with respect to α, β, γ and δ simultaneously, over the continuous parameter space. The details of the methodology to obtain the ALSEs are explained in section 3. We prove that these estimators are strongly consistent and asymptotically normally distributed under assumptions that are slightly milder than those required for the LSEs. Also, the convergence rates of the ALSEs are the same as those of the LSEs.
The rest of the paper is organized as follows. In the next section, we state the model assumptions, the notation and the preliminary results required. In section 3, we give a brief description of the methodology. In section 4, we study the asymptotic properties of the one component 2D chirp model, and in section 5, we propose a sequential method to obtain the LSEs and ALSEs for the multicomponent 2D chirp model and study their asymptotic properties. Numerical experiments and a simulated data analysis are illustrated in sections 6 and 7. In section 8, we conclude the paper. All the proofs are provided in the appendices.
2 Model Assumptions, Notations and Preliminary Results
Assumption 1. The error X(m, n) is stationary with the following linear process form:

X(m, n) = ∑_{j=−∞}^{∞} ∑_{k=−∞}^{∞} a(j, k) e(m − j, n − k),

where {e(m, n)} is a double array sequence of independently and identically distributed (i.i.d.) random variables with mean zero, variance σ² and finite fourth moment, and the a(j, k)s are real constants such that

∑_{j=−∞}^{∞} ∑_{k=−∞}^{∞} |a(j, k)| < ∞.

We will use the following notation: θ = (A, B, α, β, γ, δ), the parameter vector; θ⁰ = (A⁰, B⁰, α⁰, β⁰, γ⁰, δ⁰), the true parameter vector; Θ, the parameter space. Also, ϑ = (α, β, γ, δ), a vector of the nonlinear parameters.

Assumption 2. The true parameter vector θ⁰ is an interior point of Θ.

Note that the assumptions required to prove strong consistency of the LSEs of the unknown parameters in this case are slightly different from those required to prove the consistency of the ALSEs. For the LSEs, the parameter space for the linear parameters has to be bounded, whereas here we do not require that bound. For details on the assumptions for the consistency of the LSEs, see Lahiri [10].
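The error class of Assumption 1 can be illustrated by simulating a short 2D moving average, one simple member of the linear-process family. The coefficient array a(j, k) and the Gaussian innovations below are our illustrative choices, not values from the paper.

```python
import numpy as np

def linear_process_error(M, N, sigma=1.0, seed=None):
    """Generate X(m,n) = sum_{j,k} a(j,k) e(m-j, n-k) for a finite,
    absolutely summable coefficient array a(j,k) (illustrative values)."""
    rng = np.random.default_rng(seed)
    a = np.array([[1.0, 0.5],
                  [0.3, 0.0]])           # a(0,0), a(0,1); a(1,0), a(1,1)
    J, K = a.shape
    # pad the i.i.d. innovations so that e(m-j, n-k) exists for all (m, n)
    e = rng.normal(0.0, sigma, size=(M + J - 1, N + K - 1))
    X = np.zeros((M, N))
    for j in range(J):
        for k in range(K):
            X += a[j, k] * e[J - 1 - j : J - 1 - j + M, K - 1 - k : K - 1 - k + N]
    return X
```

The resulting field is stationary with mean zero and variance σ² ∑ a(j, k)², which is 1.34 σ² for the coefficients above.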
We need the following results to proceed further:
Lemma 1.
If (α, β, γ, δ) ∈ (0, π) × (0, π) × (0, π) × (0, π), then except for a countable number of points, and for s, t = 0, 1, 2, the following are true:








Proof.
Refer to Lahiri [10].
∎
Lemma 2.
If , then except for a countable number of points, the following holds true:
Proof.
Refer to Lahiri [10].
∎
Lemma 3.
If and , then except for a countable number of points, and for s, t = 0, 1, 2, the following are true:




Proof.
See Appendix D.
∎
3 Method to obtain ALSEs
Consider the periodogram-like function defined in (2). In matrix notation, it can be written as:

I(ϑ) = (2/MN) Yᵀ W(ϑ) W(ϑ)ᵀ Y.

Here, Y = (y(1, 1), …, y(M, N))ᵀ is the observed data vector, and W(ϑ) is the MN × 2 matrix whose two columns consist of cos(αm + βm² + γn + δn²) and sin(αm + βm² + γn + δn²), m = 1, …, M; n = 1, …, N. In matrix notation, equation (1) can be written as:

Y = W(ϑ⁰) φ⁰ + X,

where X is the error vector and φ⁰ = (A⁰, B⁰)ᵀ. The estimators obtained by maximising the function I(ϑ) are known as the approximate least squares estimators (ALSEs). We will show that the estimators obtained by maximising I(ϑ) are asymptotically equivalent to the estimators obtained by minimising the error sum of squares, that is, the LSEs, and hence the former are termed the ALSEs. To do so, we require the following lemma:
Lemma 4.
For ϑ ∈ (0, π) × (0, π) × (0, π) × (0, π), except for a countable number of points, we have the following result:

lim_{M,N→∞} (1/MN) W(ϑ)ᵀ W(ϑ) = (1/2) I₂,

where I₂ is the 2 × 2 identity matrix.
Proof.
Consider the following:

(1/MN) W(ϑ)ᵀ W(ϑ) = (1/MN) [ ∑∑ cos²(Φ(m, n))          ∑∑ cos(Φ(m, n)) sin(Φ(m, n)) ;
                              ∑∑ sin(Φ(m, n)) cos(Φ(m, n))   ∑∑ sin²(Φ(m, n)) ],

where Φ(m, n) = αm + βm² + γn + δn². Now using Lemma 1 (c), (e) and (f), it can be easily seen that the matrix on the right hand side of the above equation tends to (1/2) I₂, except for a countable number of points, and hence the result.
∎
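Lemma 4 is easy to check numerically: for generic nonlinear parameters, the normalised Gram matrix of the design is already close to (1/2) I₂ for moderate M, N. A sketch, assuming the phase form above:

```python
import numpy as np

def gram_over_mn(a, b, c, d, M, N):
    """Return (1/MN) W(theta)^T W(theta), where the two columns of W are
    cos and sin of the chirp phase over the M x N grid."""
    m = np.arange(1.0, M + 1)[:, None]
    n = np.arange(1.0, N + 1)[None, :]
    phi = a * m + b * m**2 + c * n + d * n**2
    W = np.column_stack([np.cos(phi).ravel(), np.sin(phi).ravel()])
    return (W.T @ W) / (M * N)
```

For example, gram_over_mn(1.5, 0.5, 2.5, 0.75, 400, 400) is close to 0.5·I₂, and the deviation shrinks as M and N grow, consistent with the lemma.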
We know that to find the LSEs, we minimise the following error sum of squares:

(3)   Q(A, B, ϑ) = ∑_{m=1}^{M} ∑_{n=1}^{N} (y(m, n) − A cos(αm + βm² + γn + δn²) − B sin(αm + βm² + γn + δn²))²

with respect to A, B, α, β, γ and δ. If we fix ϑ = (α, β, γ, δ), then the estimates of the linear parameters can be obtained by the separable regression technique of Richards [15], by minimizing Q(A, B, ϑ) with respect to A and B. Thus the estimate of (A, B) is given by:

(4)   (Â(ϑ), B̂(ϑ))ᵀ = (W(ϑ)ᵀ W(ϑ))⁻¹ W(ϑ)ᵀ Y.

Substituting Â(ϑ) and B̂(ϑ) in (3), we have:

R(ϑ) = Q(Â(ϑ), B̂(ϑ), ϑ) = Yᵀ (I_{MN} − W(ϑ)(W(ϑ)ᵀ W(ϑ))⁻¹ W(ϑ)ᵀ) Y.

Using Lemma 4, we have the following relationship between the function R(ϑ) and the periodogram-like function I(ϑ):

(1/MN) R(ϑ) = (1/MN) Yᵀ Y − (1/MN) I(ϑ) + o(1).

Here, a function f(M, N) is o(1) if f(M, N) → 0 as min{M, N} → ∞. Thus, the ϑ that minimises R(ϑ) is asymptotically equivalent to the ϑ that maximises I(ϑ).
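The separable regression step and the relationship between the profiled sum of squares and the periodogram-like function can be checked on synthetic data. A sketch, under the assumed model form and with our own names (not the paper's code):

```python
import numpy as np

def profile_criteria(y, a, b, c, d):
    """For fixed nonlinear parameters, return the linear LS estimates
    (A_hat, B_hat), the profiled sum of squares R, and the
    periodogram-like value I."""
    M, N = y.shape
    m = np.arange(1.0, M + 1)[:, None]
    n = np.arange(1.0, N + 1)[None, :]
    phi = a * m + b * m**2 + c * n + d * n**2
    W = np.column_stack([np.cos(phi).ravel(), np.sin(phi).ravel()])
    Y = y.ravel()
    AB = np.linalg.solve(W.T @ W, W.T @ Y)       # equation (4)
    R = Y @ Y - (W.T @ Y) @ AB                   # Y'Y - Y'W(W'W)^{-1}W'Y
    I = 2.0 / (M * N) * ((W.T @ Y) ** 2).sum()   # periodogram-like function
    return AB, R, I
```

On noiseless data the amplitudes are recovered exactly and R is numerically zero; moreover YᵀY − R and I(ϑ) agree up to a lower-order term, which is the content of the o(1) relationship above.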
4 Asymptotic Properties of ALSEs
In this section, we study the asymptotic properties of the proposed estimators, the ALSEs of model (1). The following theorem states the result on the consistency property of the ALSEs.
Theorem 1.
If assumptions 1 and 2 are satisfied, then θ̂, the ALSE of θ⁰, is a strongly consistent estimator of θ⁰, that is, θ̂ → θ⁰ almost surely as min{M, N} → ∞.
Proof.
See Appendix A.
∎
In the following theorem, we state the result obtained on the asymptotic distribution of the proposed estimators.
Theorem 2.
If assumptions 1 and 2 are true, then the asymptotic distribution of (θ̂ − θ⁰) D⁻¹ is the same as that of (θ̃ − θ⁰) D⁻¹ as min{M, N} → ∞, where
θ̂ = (Â, B̂, α̂, β̂, γ̂, δ̂) is the ALSE of θ⁰, θ̃ = (Ã, B̃, α̃, β̃, γ̃, δ̃) is the LSE of θ⁰, and D is a 6 × 6 diagonal matrix defined as:
D = diag(M^{−1/2} N^{−1/2}, M^{−1/2} N^{−1/2}, M^{−3/2} N^{−1/2}, M^{−5/2} N^{−1/2}, M^{−1/2} N^{−3/2}, M^{−1/2} N^{−5/2}).
Proof.
See Appendix B.
∎
5 Multiple Component 2D Chirp Model
In this section, we consider a 2D chirp model with multiple components, mathematically expressed in the following form:

(5)   y(m, n) = ∑_{k=1}^{p} [A_k⁰ cos(α_k⁰m + β_k⁰m² + γ_k⁰n + δ_k⁰n²) + B_k⁰ sin(α_k⁰m + β_k⁰m² + γ_k⁰n + δ_k⁰n²)] + X(m, n),
      m = 1, …, M; n = 1, …, N.

Here the y(m, n)s are the observed data, the A_k⁰s and B_k⁰s are the amplitudes, the α_k⁰s and γ_k⁰s are the frequencies, and the β_k⁰s and δ_k⁰s are the frequency rates. The sequence of random variables {X(m, n)} is a stationary error sequence. In practice, the number of components, p, is unknown and its estimation is an important and still open problem. For recent references on this model, see Zhang et al. [18] and Lahiri [10].

Here it is assumed that p is known, and our main purpose is to estimate the unknown parameters of this model, primarily the nonlinear parameters. Finding the ALSEs for the above model is computationally challenging, especially when the number of components, p, is large. Even when p = 2, we need to solve a 12D optimisation problem to obtain the ALSEs. Thus, we propose a sequential procedure to find these estimates. This method reduces the computational complexity without compromising the efficiency of the estimators. We prove that the ALSEs obtained by the proposed sequential procedure are strongly consistent and have the same rates of convergence as the LSEs.

In the following subsection, we provide the algorithm to obtain the sequential ALSEs of the unknown parameters of the p-component 2D chirp signal. Let us denote ϑ_k = (α_k, β_k, γ_k, δ_k).
5.1 Algorithm to find the ALSEs:
Step 1: Maximize the periodogram-like function

(6)   I₁(ϑ) = (2/MN) |∑_{m=1}^{M} ∑_{n=1}^{N} y(m, n) e^{−i(αm + βm² + γn + δn²)}|².

We first obtain the nonlinear parameter estimates ϑ̂₁ = (α̂₁, β̂₁, γ̂₁, δ̂₁). Then the linear parameter estimates can be obtained by substituting ϑ̂₁ in (4). Thus

(7)   (Â₁, B̂₁)ᵀ = (W(ϑ̂₁)ᵀ W(ϑ̂₁))⁻¹ W(ϑ̂₁)ᵀ Y.

Step 2: Now we have the estimates of the parameters of the first component of the observed signal. We subtract the contribution of the first component from the original signal vector to eliminate its effect and obtain a new data vector, say

y₁(m, n) = y(m, n) − Â₁ cos(α̂₁m + β̂₁m² + γ̂₁n + δ̂₁n²) − B̂₁ sin(α̂₁m + β̂₁m² + γ̂₁n + δ̂₁n²).

Step 3: Now we compute ϑ̂₂ = (α̂₂, β̂₂, γ̂₂, δ̂₂) by maximizing I₂(ϑ), which is obtained by replacing the original data vector by the new data vector y₁(m, n) in (6); the linear parameters Â₂ and B̂₂ can then be obtained by substituting ϑ̂₂ in (4).

Step 4: Continue the process up to p steps.
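The steps above can be sketched as follows. For brevity, the nonlinear maximisation is done over a supplied list of candidate (α, β, γ, δ) tuples rather than the full grid; the model form and all names are assumptions, not the paper's code.

```python
import numpy as np

def fit_one_component(y, candidates):
    """Step 1: maximise the periodogram-like function over the candidate
    nonlinear parameters, then get the linear estimates as in equation (4)."""
    M, N = y.shape
    m = np.arange(1.0, M + 1)[:, None]
    n = np.arange(1.0, N + 1)[None, :]
    best = None
    for theta in candidates:
        a, b, c, d = theta
        phi = a * m + b * m**2 + c * n + d * n**2
        W = np.column_stack([np.cos(phi).ravel(), np.sin(phi).ravel()])
        s = W.T @ y.ravel()
        I = 2.0 / (M * N) * (s @ s)
        if best is None or I > best[0]:
            AB = np.linalg.solve(W.T @ W, s)
            fitted = AB[0] * np.cos(phi) + AB[1] * np.sin(phi)
            best = (I, theta, AB, fitted)
    return best[1], best[2], best[3]

def sequential_alse(y, p, candidates):
    """Steps 2-4: subtract each fitted component and re-fit on the residual."""
    components, residual = [], y.copy()
    for _ in range(p):
        theta, AB, fitted = fit_one_component(residual, candidates)
        components.append((theta, AB))
        residual = residual - fitted
    return components
```

Because the first component has the largest amplitude (Assumption 4-type ordering), the first pass locks onto it, and the second pass picks up the weaker component from the residual.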
5.2 Asymptotic Properties
Further assumptions, required to study the consistency property and derive the asymptotic distribution of the proposed estimators, are stated as follows:
Assumption 3. ϑ_k⁰ is an interior point of the parameter space, for all k = 1, …, p, and the frequencies α_k⁰, γ_k⁰ and the frequency rates β_k⁰, δ_k⁰ are such that the tuples (α_k⁰, β_k⁰, γ_k⁰, δ_k⁰) are pairwise distinct.
Assumption 4. The A_k⁰s and B_k⁰s satisfy the following relationship:

∞ > A₁⁰² + B₁⁰² > A₂⁰² + B₂⁰² > ⋯ > A_p⁰² + B_p⁰² > 0.
In the following theorems, we state the results we obtained on the consistency of the proposed estimators.
Theorem 3.
Under assumptions 1, 3 and 4, θ̂₁ = (Â₁, B̂₁, α̂₁, β̂₁, γ̂₁, δ̂₁) is a strongly consistent estimator of θ₁⁰ = (A₁⁰, B₁⁰, α₁⁰, β₁⁰, γ₁⁰, δ₁⁰), that is, θ̂₁ → θ₁⁰ almost surely as min{M, N} → ∞.
Proof.
See Appendix C.
∎
Theorem 4.
If assumptions 1, 3 and 4 are satisfied and p ≥ 2, then θ̂₂ → θ₂⁰ almost surely as min{M, N} → ∞.
Proof.
See Appendix C.
∎
The result obtained in the above theorem can be extended up to the p-th step. Thus for any k ≤ p, the ALSEs obtained at the k-th step are strongly consistent.
Theorem 5.
If assumptions 1, 3 and 4 are satisfied, and if Â_k, B̂_k, α̂_k, β̂_k, γ̂_k and δ̂_k are the estimators obtained at the k-th step, with k > p, then Â_k → 0 and B̂_k → 0 almost surely as min{M, N} → ∞.
Proof.
See Appendix C.
∎
Next we derive the asymptotic distribution of the proposed estimators. In the following theorem, we state the results on the distribution of the sequential ALSEs.
Theorem 6.
If assumptions 1, 3 and 4 are satisfied, then
where D is the diagonal matrix as defined in Theorem 2 and
Proof.
See Appendix D. ∎
The above result holds true for all k ≤ p and is stated in the following theorem.
Theorem 7.
If assumptions 1, 3 and 4 are satisfied, then
where the asymptotic covariance matrix can be obtained by replacing A₁⁰ by A_k⁰ and B₁⁰ by B_k⁰ in the covariance matrix defined above.
Proof.
This proof can be obtained by proceeding exactly in the same manner as in the proof of Theorem 6.
∎
6 Simulation Studies
6.1 Simulation results for the one component model
We perform numerical simulations on model (1) with the true values of the nonlinear parameters taken as α⁰ = 1.5, β⁰ = 0.5, γ⁰ = 2.5 and δ⁰ = 0.75. The following error structures are used to generate the data:

(8)

(9)

Here {e(m, n)} is as defined in Assumption 1. For the simulations we consider different values of σ and different values of M and N, as can be seen in the tables. We estimate the parameters both by the least squares estimation method and by the approximate least squares estimation method. These estimates are obtained 1000 times each, and the averages, biases and MSEs are reported. We also compute the asymptotic variances to compare with the corresponding MSEs. From the tables, it is observed that as the error variance increases, the MSEs also increase for both the LSEs and the ALSEs. As the sample size increases, the estimates become closer to the corresponding true values, that is, the biases become small. Also, the MSEs decrease as the sample sizes M and N increase, and the order of the MSEs of both estimators is almost equal to the order of the asymptotic variances; hence one may conclude that they are well matched. The MSEs of the ALSEs get close to those of the LSEs as M and N increase, and hence to the theoretical asymptotic variances of the LSEs, showing that the two estimators are asymptotically equivalent.
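A compact sketch of how such Monte Carlo summaries (average, bias, MSE) are produced. To keep it short, only the linear amplitudes are re-estimated, at the true nonlinear parameters; the amplitudes, σ, grid size and replication count below are illustrative assumptions, not the paper's settings.

```python
import numpy as np

def monte_carlo_summary(A0=2.0, B0=1.0, theta=(1.5, 0.5, 2.5, 0.75),
                        M=30, N=30, sigma=0.5, reps=200, seed=0):
    """Repeatedly simulate model (1) with i.i.d. Gaussian errors and
    re-estimate (A, B) by least squares at the true nonlinear parameters;
    return the average, bias and MSE over the replications."""
    rng = np.random.default_rng(seed)
    a, b, c, d = theta
    m = np.arange(1.0, M + 1)[:, None]
    n = np.arange(1.0, N + 1)[None, :]
    phi = a * m + b * m**2 + c * n + d * n**2
    W = np.column_stack([np.cos(phi).ravel(), np.sin(phi).ravel()])
    G = np.linalg.inv(W.T @ W)
    est = np.empty((reps, 2))
    for r in range(reps):
        y = A0 * np.cos(phi) + B0 * np.sin(phi) + rng.normal(0, sigma, (M, N))
        est[r] = G @ (W.T @ y.ravel())
    avg = est.mean(axis=0)
    bias = avg - np.array([A0, B0])
    mse = ((est - np.array([A0, B0])) ** 2).mean(axis=0)
    return avg, bias, mse
```

With these settings the MSEs come out close to the theoretical variance of order 2σ²/MN, mirroring the MSE-versus-asymptotic-variance comparison reported in the tables.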
Parameters  
True values  1.5  0.5  2.5  0.75  1.5  0.5  2.5  0.75  
ALSEs  LSEs  
0.1  Avg  1.4910  0.5005  2.5194  0.7492  1.5000  0.4999  2.5000  0.7499 
Bias  0.0090  0.0005  0.0194  0.0008  3.85E05  1.66E06  1.22E05  3.97E07  
MSE  8.21E05  2.83E07  3.79E04  5.62E07  8.29E07  1.13E09  7.18E07  9.90E10  
AVar  7.56E07  1.13E09  7.56E07  1.13E09  7.56E07  1.13E09  7.56E07  1.13E09  
ALSEs  LSEs  
0.5  Avg  1.4912  0.5005  2.5196  0.7492  1.5000  0.5000  2.5003  0.7499 
Bias  0.0088  0.0005  0.0196  0.0008  3.01E05  1.29E06  0.0003  9.60E06  
MSE  9.78E05  3.08E07  4.10E04  6.03E07  2.03E05  2.76E08  2.10E05  2.96E08  
AVar  1.89E05  2.48E08  1.89E05  2.48E08  1.89E05  2.48E08  1.89E05  2.48E08  
ALSEs  LSEs  
1  Avg  1.4911  0.5005  2.5184  0.7492  1.5001  0.4999  2.4992  0.7500 
Bias  0.0089  0.0005  0.0184  0.0008  0.0001  1.15E06  0.0007  2.44E05  
MSE  1.52E04  3.87E07  4.21E04  6.23E07  8.64E05  1.18E07  7.82E05  1.09E07  
AVar  7.56E05  1.13E07  7.56E05  1.13E07  7.56E05  1.13E07  7.56E05  1.13E07 
Parameters  
True values  1.5  0.5  2.5  0.75  1.5  0.5  2.5  0.75  
ALSEs  LSEs  
0.1  Avg  1.5039  0.4999  2.4997  0.7500  1.5000  0.4999  2.5000  0.7499 
Bias  0.0039  9.22E05  0.0002  1.01E05  5.12E06  6.96E08  2.97E06  2.76E08  
MSE  1.95E05  9.75E09  3.22E07  2.03E10  2.73E08  1.04E11  3.07E08  1.14E11  
AVar  4.73E08  1.77E11  4.73E08  1.77E11  4.73E08  1.77E11  4.73E08  1.77E11  
ALSEs  LSEs  
0.5  Avg  1.5041  0.4999  2.4997  0.7500  1.5000  0.4999  2.5000  0.7499 
Bias  0.0041  9.53E05  0.0002  9.99E06  2.34E05  3.01E07  9.23E06  2.12E07  
MSE  2.11E05  1.04E08  1.67E06  6.70E10  9.53E07  3.44E10  8.90E07  3.33E10  
AVar  1.18E06  4.43E10  1.18E06  4.43E10  1.18E06  4.43E10  1.18E06  4.43E10  
ALSEs  LSEs  
1  Avg  1.504  0.4999  2.4997  0.7500  1.5000  0.4999  2.5000  0.7499 
Bias  0.0040  9.40E05  0.0002  1.01E05  8.76E05  1.28E06  1.66E05  9.05E08  
MSE  2.42E05  1.14E08  5.00E06  1.87E09  4.24E06  1.53E09  4.01E06  1.45E09  
AVar  4.73E06  1.77E09  4.73E06  1.77E09  4.73E06  1.77E09  4.73E06  1.77E09 
Parameters  
True values  1.5  0.5  2.5  0.75  1.5  0.5  2.5  0.75  
ALSEs  LSEs  
0.1  Avg  1.5005  0.4999  2.4997  0.7500  1.4999  0.5  2.5000  0.7499 
Bias  0.0005  7.67E06  0.0003  2.02E06  4.78E07  3.90E09  3.55E07  5.54E09  
MSE  3.15E07  6.20E11  1.69E07  1.68E11  8.29E09  1.33E12  7.29E10  2.04E13  
AVar  9.34E09  1.56E12  9.34E09  1.56E12  9.34E09  1.56E12  9.34E09  1.56E12  
ALSEs  LSEs  
0.5  Avg  1.5003  0.4999  2.4996  0.7500  1.5000  0.4999  2.5000  0.7499 
Bias  0.0003  5.85E06  0.0004  2.40E06  4.90E06  1.40E07  4.41E06  1.60E08  
MSE  3.80E07  7.20E11  3.14E07  4.12E11  1.55E07  2.62E11  1.07E07  1.88E11  
AVar  2.33E07  3.89E11  2.33E07  3.89E11  2.33E07  3.89E11  2.33E07  3.89E11  
ALSEs  LSEs  
1  Avg  1.5004  0.4999  2.4995  0.7500  1.5000  0.4999  2.4999  0.7500 
Bias  0.0004  6.70E06  0.0005  3.89E06  4.90E05  6.11E07  1.45E05  5.86E08  
MSE  1.01E06  1.73E10  9.37E07  1.38E10  7.11E07  1.17E10  5.98E07  9.97E11  
AVar  9.34E07  1.56E10  9.34E07  1.56E10  9.34E07  1.56E10  9.34E07  1.56E10 
Parameters  
True values  1.5  0.5  2.5  0.75  1.5  0.5  2.5  0.75  
ALSEs  LSEs  
0.1  Avg  1.4999  0.5000  2.4999  0.7500  1.5000  0.4999  2.5000  0.7500 
Bias  4.19E05  1.99E06  4.30E05  3.65E08  2.55E06  3.13E08  3.12E07  5.88E10  
MSE  1.60E08  5.19E12  2.18E08  1.78E12  5.38E10  5.93E14  7.86E10  8.54E14  
AVar  2.95E09  2.77E13  2.95E09  2.77E13  2.95E09  2.77E13  2.95E09  2.77E13  
ALSEs  LSEs  
0.5  Avg  1.4998  0.5000  2.4998  0.7500  1.5000  0.4999  2.5000  0.7500 
Bias  0.0002  2.77E06  0.0002  7.46E07  4.96E06  4.93E08  1.34E06  2.74E08  
MSE  8.14E08  1.38E11  9.44E08  8.14E12  3.83E08  3.75E12  3.64E08  3.66E12  
AVar  7.38E08  6.92E12  7.38E08  6.92E12  7.38E08  6.92E12  7.38E08  6.92E12  
ALSEs  LSEs  
1  Avg  1.4997  0.5000  2.4997  0.7500  1.5000  0.4999  2.5000  0.7499 
Bias  0.0003  3.60E06  0.0002  1.43E06  9.37E07  2.97E08  2.10E06  8.23E08  
MSE  2.35E07  3.09E11  2.71E07  2.35E11  1.60E07  1.57E11  1.91E07  1.79E11  
AVar  2.95E07  2.77E11  2.95E07  2.77E11  2.95E07  2.77E11  2.95E07  2.77E11 
Parameters  
True values  1.5  0.5  2.5  0.75  1.5  0.5  2.5  0.75  
ALSEs  LSEs  
0.1  Avg  1.4911  0.5005  2.5193  0.7492  1.4999  0.5000  2.5000  0.7499 
Bias  0.0089  0.0005  0.0193  0.0008  5.15E05  1.81E06  3.05E05  7.46E07  
MSE  8.28E05  2.84E07  3.78E04  5.60E07  1.13E06  1.70E09  1.12E06  1.57E09  
AVar  1.13E06  1.70E09  1.13E06  1.70E09  1.13E06  1.70E09  1.13E06  1.70E09  
ALSEs  LSEs  
0.5  Avg  1.4910  0.5005  2.5192  0.7492  1.4998  0.5000  2.5000  0.7500 
Bias  0.0090  0.0005  0.0192  0.0007  0.0002  6.45E06  2.31E06  1.59E06  
MSE  1.09E04  3.29E07  4.03E04  5.93E07  3.13E05  4.60E08  2.87E05  4.00E08  
AVar  2.84E05  4.25E08  2.84E05  4.25E08  2.84E05  4.25E08  2.84E05  4.25E08  
ALSEs  LSEs  
1  Avg  1.4910  0.5005  2.5195  0.7492  1.4997  0.5000  2.5002  0.7499 
Bias  0.0090  0.0005  0.0195  0.0008  0.0003  8.25E06  0.0002  6.10E06  
MSE  1.91E04  4.57E07  5.04E04  7.30E07  1.31E04  1.94E07  1.24E04  1.77E07  
AVar  1.13E04  1.70E07  1.13E04  1.70E07  1.13E04  1.70E07  1.13E04  1.70E07 
Parameters  
True values  1.5  0.5  2.5  0.75  1.5  0.5  2.5  0.75  
ALSEs  LSEs  
0.1  Avg  1.5039  0.4999  2.4997  0.7500  1.5000  0.4999  2.5000  0.7499 
Bias  0.0039  9.19E05  0.0003  1.05E05  1.26E05  2.32E07  3.69E06  7.66E08  
MSE  1.94E05  9.71E09  3.99E07  2.32E10  3.92E08  1.54E11  4.35E08  1.60E11  
AVar  7.09E08  2.66E11  7.09E08  2.66E11  7.09E08  2.66E11  7.09E08  2.66E11  
ALSEs  LSEs  
0.5  Avg  1.5042  0.4999  2.4998  0.7500  1.5000  0.4999  2.5000  0.7499 
Bias  0.0042  9.70E05  0.0002  8.66E06  6.93E05  1.12E06  4.43E05  1.17E06  
MSE  2.24E05  1.10E08  2.31E06  9.16E10  1.47E06  5.55E10  1.45E06  5.63E10  
AVar  1.77E06  6.65E10  1.77E06  6.65E10  1.77E06  6.65E10  1.77E06  6.65E10  
ALSEs  LSEs  
1  Avg  1.5041  0.4999  2.4998  0.7500  1.4999  0.5000  2.4999  0.7499 
Bias  0.0041  9.59E05  0.0002  8.50E06  3.56E05  1.71E07  2.04E05  1.20E07  
MSE  2.60E05  1.24E08  7.63E06  2.77E09  6.11E06  2.30E09  6.68E06  2.37E09  
AVar  7.09E06  2.66E09  7.09E06  2.66E09  7.09E06  2.66E09  7.09E06  2.66E09 
Parameters  
True values  1.5  0.5  2.5  0.75  1.5  0.5  2.5  0.75  
ALSEs  LSEs  
0.1  Avg  1.5005  0.4999  2.4997  0.7500  1.4999  0.5000  2.5000  0.7499 
Bias  0.0005  7.49E06  0.0002  1.92E06  1.18E06  1.58E08  1.46E06  1.12E08  
MSE  3.11E07  6.12E11  1.68E07  1.71E11  1.21E08  2.03E12  9.24E10  2.82E13  
AVar  1.40E08  2.33E12  1.40E08  2.33E12  1.40E08  2.33E12  1.40E08  2.33E12  
ALSEs  LSEs  
0.5  Avg  1.5004  0.4999  2.4996  0.7500  1.5000  0.4999  2.5000  0.7500 
Bias  0.0004  6.10E06  0.0004  3.16E06  5.48E06  8.28E08  1.95E06  4.45E08  
MSE  5.07E07  9.31E11  4.75E07  6.37E11  2.80E07  2.80E07  2.10E07  3.49E11  
AVar  3.50E07  5.83E11  3.50E07  5.83E11  3.50E07  5.83E11  3.50E07  5.83E11  
ALSEs  LSEs  
1  Avg  1.5004  0.4999  2.4995  0.7500  1.5000  0.4999  2.4999  0.7500 
Bias  0.0004  6.65E06  0.0005  4.30E06  3.76E05  6.39E07  1.91E05  1.19E07  
MSE  1.37E06  2.39E10  1.26E06  1.90E10  1.07E06  1.80E10  9.31E07  1.58E10  
AVar  1.40E06  2.33E10  1.40E06  2.33E10  1.40E06  2.33E10  1.40E06  2.33E10 
Parameters  
True values  1.5  0.5  2.5  0.75  1.5  0.5  2.5  0.75  
ALSEs  LSEs  
0.1  Avg  1.4999  0.5000  2.4999  0.7500  1.5000  0.4999  2.4999  0.7500 
Bias  4.14E05  1.98E06  4.85E05  9.25E08  3.60E06  3.84E08  9.42E07  1.36E08  
MSE  1.68E08  5.28E12  2.5063E08  2.02E12  9.26E10  1.07E13  1.81E09  1.82E13  
AVar  4.43E09  4.15E13  4.43E09  4.15E13  4.43E09  4.15E13  4.43E09  4.15E13  
ALSEs  LSEs  
0.5  Avg  1.4998  0.5000  2.4998  0.7500  1.4999  0.5000  2.4999  0.7500 
Bias  0.0002  3.21E06  0.0001  1.02E06  6.40E06  3.26E08  4.78E06  3.26E08  
MSE  1.36E07  2.00E11  1.36E07  1.16E11  6.31E08  6.15E12  6.12E08  5.81E12  
AVar  1.11E07  1.04E11  1.11E07  1.04E11  1.11E07  1.04E11  1.11E07  1.04E11  
ALSEs  LSEs  
1  Avg  1.4997  0.5000  2.4997  0.7500  1.4999  0.5000  2.5000  0.7499 
Bias  0.0003  3.94E06  0.0003  1.60E06  2.75E05  2.64E07  6.40E06  4.03E08  
MSE  3.66E07  4.48E11  3.67E07  3.29E11  2.73E07  2.67E11  2.78E07  2.67E11  
AVar  4.43E07  4.15E11  4.43E07  4.15E11  4.43E07  4.15E11  4.43E07  4.15E11 
6.2 Simulation results for the multiple component model with p = 2
Next we conduct numerical simulations on model (5) with p = 2 and the true values of the nonlinear parameters taken as α₁⁰ = 2.1, β₁⁰ = 0.1, γ₁⁰ = 1.25, δ₁⁰ = 0.25 for the first component and α₂⁰ = 1.5, β₂⁰ = 0.5, γ₂⁰ = 1.75, δ₂⁰ = 0.75 for the second component.
The error structures used to generate the data are the same as those used for the one component model; see equations (8) and (9). For the simulations we consider different values of σ and different values of M and N, again the same as for the one component model. We estimate the parameters both by the least squares estimation method and by the approximate least squares estimation method. These estimates are obtained 1000 times each, and the averages, biases, MSEs and asymptotic variances are computed. The results are reported in the following tables. From the tables, it can be seen that the estimates, both the ALSEs and the LSEs, are quite close to their true values. It is observed that the estimates of the second component are better than those of the first component, in the sense that their biases and MSEs are smaller and the MSEs are better matched with the corresponding asymptotic variances. For both estimators, as the sample size increases, the MSEs and the biases of the estimates of both components decrease, thus showing consistency.
Parameters  

True values  2.1  0.1  1.25  0.25  1.5  0.5  1.75  0.75  
ALSEs  
Average  2.1154  0.0994  1.2587  0.2500  1.5411  0.4988  1.7664  0.7493  
Bias  0.0154  0.0006  0.0087  1.01E05  0.0411  0.0012  0.0164  0.0007  
MSE  2.36E04  3.48E07  7.67E05  4.85E10  2.36E04  1.45E06  2.68E04  4.85E10  
0.1  LSEs  
Average  2.1031  0.0998  1.2565  0.2500  1.5017  0.5000  1.7510  0.7500  
Bias  0.0031  0.0002  0.0065  3.83E05  0.0017  2.16E05  0.0010  2.92E05  
MSE  9.70E06  3.14E08  4.23E05  1.85E09  3.71E06  1.75E09  1.93E06  2.11E09  
AVar  2.40E07  3.60E10  2.40E07  3.60E10  7.56E07  1.13E09  7.56E07  1.13E09  
ALSEs  
Average  2.1154  0.0994  1.2586  0.2500  1.5412  0.4988  1.7664  0.7493  
Bias  0.0154  0.0006  0.0086  1.49E05  0.0412  0.0012  0.0164  0.0007  
MSE  2.44E04  3.59E07  8.02E05  8.99E09  2.44E04  1.48E06  2.87E04  8.99E09  
0.5  LSEs  
Average  2.1031  0.0998  1.2563  0.2500  1.5017  0.5000  1.7510  0.7500  
Bias  0.0031  0.0002  0.0063  4.40E05  0.0017  2.25E05  0.0010  3.13E05  
MSE  1.66E05  4.03E08  4.63E05  1.04E08  2.48E05  3.16E08  2.55E05  3.50E08  
AVar  5.99E06  8.99E09  5.99E06  8.99E09  1.89E05  2.84E08  1.89E05  2.84E08  
ALSEs  
Average  2.1154  0.0994  1.2585  0.2500  1.5408  0.4988  1.7665  0.7493  
Bias  0.0154  0.0006  0.0085  1.88E05  0.0408  0.0012  0.0165  0.0007  
MSE  2.65E04  3.84E07  9.75E05  3.93E08  2.65E04  1.53E06  3.38E04  3.93E08  
1  LSEs  
Average  2.1031  0.0998  1.2563  0.2500  1.5015  0.5000  1.7513  0.7500  
Bias  0.0031  0.0002  0.0063  4.78E05  0.0015  1.40E05  0.0013  4.21E05  
MSE  3.63E05  6.50E08  6.50E05  3.98E08  8.57E05  1.22E07  8.44E05  1.18E07  
AVar  2.40E05  3.60E08  2.40E05  3.60E08  7.56E05  1.13E07  7.56E05  1.13E07 
Parameters  

True values  2.1  0.1  1.25  0.25  1.5  0.5  1.75  0.75  
ALSEs  
Average  2.1011  0.1000  1.2597  0.2499  1.5127  0.4997  1.7529  0.7499  
Bias  0.0011  1.36E05  0.0097  0.0001  0.0127  0.0003  0.0029  5.92E05  
MSE  1.16E06  1.92E10  9.37E05  2.07E08  1.16E06  7.47E08  8.34E06  2.07E08  
0.1  LSEs  
Average  2.1010  0.1000  1.2572  0.2499  1.5007  0.5000  1.7507  0.7500  
Bias  0.0010  1.07E05  0.0072  0.0001  0.0007  1.35E05  0.0007  1.19E05  
MSE  1.12E06  1.24E10  5.18E05  1.49E08  6.03E07  1.99E10  5.16E07  1.60E10  
AVar  1.50E08  5.62E12  1.50E08  5.62E12  4.73E08  1.77E11  4.73E08  1.77E11  
ALSEs  
Average  2.1011  0.1000  1.2597  0.2499  1.5127  0.4997  1.7529  0.7499  
Bias  0.0011  1.36E05  0.0097  0.0001  0.0127  0.0003  0.0029  5.94E05  
MSE  1.57E06  3.33E10  9.39E05  2.08E08  1.57E06  7.46E08  9.66E06  2.08E08  
0.5  LSEs  
Average  2.1011  0.1000  1.2572  0.2499  1.5007  0.5000  1.7507  0.7500  
Bias  0.0011  1.09E05  0.0072  0.0001  0.0007  1.27E05  0.0007  1.20E05  
MSE  1.53E06  2.59E10  5.22E05  1.50E08  1.75E06  5.97E10  1.67E06  5.74E10  
AVar  3.75E07  1.40E10  3.75E07  1.40E10  1.18E06  4.43E10  1.18E06  4.43E10  
ALSEs  
Average  2.1010  0.1000  1.2597  0.2499  1.5127  0.4997  1.7528  0.7499  
Bias  0.0010  1.32E05  0.0097  0.0001  0.0127  0.0003  0.0028  5.66E05  
MSE  2.69E06  7.54E10  9.51E05  2.13E08  2.69E06  7.60E08  1.28E05  2.13E08  
1  LSEs  
Average  2.1010  0.1000  1.2572  0.2499  1.5007  0.5000  1.7506  0.7500  
Bias  0.0010  1.03E05  0.0072  0.0001  0.0007  1.30E05  0.0006  9.31E06  
MSE  2.62E06  6.72E10  5.32E05  1.54E08  5.14E06  1.84E09  5.11E06  1.80E09  
AVar  1.50E06  5.62E10  1.50E06  5.62E10  4.73E06  1.77E09  4.73E06  1.77E09 
Parameters  

True values  2.1  0.1 