A New Algorithm using Component-wise Adaptive Trimming For Robust Mixture Regression

05/23/2020
by Wennan Chang, et al.
Indiana University

Mixture regression provides a statistical model for teasing out latent heterogeneous relationships between response and independent variables. Solving mixture regression with the EM algorithm is highly sensitive to outliers. To enable simultaneous outlier detection and robust parameter estimation, we propose a fast and efficient robust mixture regression algorithm based on Component-wise Adaptive Trimming (CAT). Compared with multiple existing algorithms, CAT strikes a good balance between computational efficiency and robustness across simulated scenarios with unequal component proportions and variances, different levels of outlier contamination, and different sample sizes. The adaptive trimming ability of CAT makes it a highly promising tool for mining latent relationships among variables in the big data era. CAT has been implemented in the R package 'RobMixReg', available on CRAN.

1 Introduction

Finite Mixture Gaussian Regression (FMGR) was first introduced by goldfeld1973estimation, and has been widely used to explore the relationships among variables coming from several unknown latent classes in many fields bohning1999computer; hennig2000identifiablity; jiang1999hierarchical; mclachlan2004finite; xu1996convergence; fruhwirth2006finite. Inference of the parameters in FMGR is usually carried out through the EM algorithm assuming normally distributed component errors, and is therefore vulnerable to outliers or heavy-tailed noise. Many algorithms have been developed to estimate the FMGR parameters robustly yu2020selective. To robustify the estimation procedure, Markatou markatou2000mixture and Shen et al. shen2004outlier proposed using a weight factor for each data point. García-Escudero et al. garcia2017robust proposed robust model estimation complemented with trimming and constrained estimation. Bai et al. bai2012robust proposed a modified EM algorithm that replaces the least squares criterion in the M-step with a robust bi-square criterion (MIXBI). Bashir and Carter bashir2012robust extended the idea of the S-estimator to mixtures of linear regressions. Yao et al. yao2014robust extended the mixture of t-distributions proposed by Peel and McLachlan peel2000robust from clustering to the regression setting (MIXT). Similarly, Song et al. song2014robust proposed using the Laplace distribution to model the error distribution (MIXL). These methods seek robust parameter estimates in the presence of outliers; however, the identities of the outliers remain unknown. The identities of the outliers are often of interest for two reasons: first, removal of the outliers could improve the parameter estimates; second, outlying samples could be caused by measurement errors or could represent a novel mechanism not represented by the current observations, both of which merit further investigation. To enable outlier detection, Neykov et al. neykov2007robust proposed robust fitting of mixtures using the trimmed likelihood estimator (TLE), where, given a pre-specified trimming proportion, the outliers are defined as the observations with the smallest sample likelihoods; Yu et al. yu2017new proposed a penalized mean-shift mixture model for simultaneous outlier detection and robust parameter estimation. The challenge with TLE and the mean-shift model is the involvement of hyperparameters, namely the trimming proportion in TLE and the penalty parameter in the mean-shift model, which could heavily impact the performance of the two algorithms. Yu et al. yu2017new proposed using a BIC procedure for hyperparameter selection; however, the BIC criterion becomes highly unstable when the total number of parameters, which grows with the total number of outliers, becomes large.

To address the challenges in simultaneous outlier detection and robust parameter estimation in FMGR, we adopted the idea of the Classification-Expectation-Maximization (CEM) algorithm celeux1992classification, in which, unlike in the EM algorithm, individual observations are assigned to a definite cluster as part of the maximization process. Essentially, CEM maximizes the complete-data likelihood, instead of the observed-data likelihood as in EM. Under CEM, each component has its exclusive members, which makes it possible to apply a trimmed likelihood approach designed for (single component) linear regression to those members, and hence enables both robust parameter estimation and outlier detection for the component. Our major contribution in this paper is the introduction of CEM to FMGR, which provides a platform that migrates the robustness issue from mixture regression to (single component) linear regression, for which robust estimators have been extensively studied and many algorithms with high breakdown points have been developed. The task of outlier detection is distributed to each component, making it possible to formally define outliers in FMGR. The resulting algorithm, namely Component-wise Adaptive Trimming (CAT), detects outliers in a data-driven fashion free of hyperparameters, and is hence computationally efficient.

The remainder of the article is organized as follows. In Section 2, we introduce the complete-data maximum likelihood and the CEM algorithm, based on which our component-wise adaptive trimming method is developed. In Section 3, we compare the performance of our method with five other state-of-the-art methods using synthetic datasets.

2 Component-wise adaptive trimming

2.1 The complete-data maximum likelihood estimation

Let $\{(x_i, y_i)\}_{i=1}^N$ be a finite set of observations, $X \in \mathbb{R}^{N \times p}$ the design matrix, and $y \in \mathbb{R}^N$ the response vector. Consider an FMGR model parameterized by $\theta = (\pi_1, \dots, \pi_K, \beta_1, \dots, \beta_K, \sigma_1, \dots, \sigma_K)$; it is assumed that when $(x_i, y_i)$ belongs to the k-th component, $y_i = x_i^T \beta_k + \epsilon_i$, where $\epsilon_i \sim N(0, \sigma_k^2)$. Then, the conditional density of $y$ given $x$ is $f(y \mid x; \theta) = \sum_{k=1}^K \pi_k \phi(y; x^T \beta_k, \sigma_k^2)$, where $\phi(\cdot; \mu, \sigma^2)$ is the normal density function with mean $\mu$ and variance $\sigma^2$. Let $z_i \in \{1, \dots, K\}$ be the membership indicator for observation $i$; then $P(z_i = k) = \pi_k$. The maximum likelihood estimate for $\theta$ is obtained by minimizing the following negative log likelihood:

$$-\ell(\theta) = -\sum_{i=1}^N \log \left( \sum_{k=1}^K \pi_k \, \phi(y_i; x_i^T \beta_k, \sigma_k^2) \right) \quad (1)$$
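For concreteness, the objective in (1) can be evaluated in a few lines of R; below is a minimal sketch in which `pi_k`, `beta` (a $(p+1) \times K$ coefficient matrix, intercepts first), and `sigma` are assumed user-supplied parameter values.

```r
# Negative log likelihood of an FMGR model, equation (1).
# X: N x p design matrix; y: length-N response;
# pi_k: K mixing proportions; beta: (p+1) x K coefficients (intercept first);
# sigma: K component standard deviations.
fmgr_nll <- function(X, y, pi_k, beta, sigma) {
  Xd <- cbind(1, X)
  dens <- sapply(seq_along(pi_k), function(k)
    pi_k[k] * dnorm(y, mean = Xd %*% beta[, k], sd = sigma[k]))
  -sum(log(rowSums(dens)))                  # -sum_i log sum_k pi_k * phi(.)
}
```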

The EM algorithm is usually applied to obtain the MLE, treating the cluster memberships $z_i$ as missing random variables.

Assume we are given a set of observations and assignments $z = (z_1, \dots, z_N)$. Then, the likelihood that all observations have been drawn from an FMGR and that each observation $i$ has been generated by the $z_i$-th component is given by

$$L_c(\theta; z) = \prod_{i=1}^N \pi_{z_i} \, \phi(y_i; x_i^T \beta_{z_i}, \sigma_{z_i}^2) \quad (2)$$

This is called the complete-data likelihood. Note that the assignments define a partition of the observations, $I_1, \dots, I_K$, such that $i \in I_k$ iff $z_i = k$. Hence, we can also rewrite Equation (2) in its negative logarithm form as

$$-\log L_c(\theta; z) = -\sum_{k=1}^K \sum_{i \in I_k} \left[ \log \pi_k + \log \phi(y_i; x_i^T \beta_k, \sigma_k^2) \right] \quad (3)$$

We introduce the complete-data maximum likelihood estimates (CMLE) as follows.

Definition 2.1.

(Complete-data Maximum Likelihood Estimates, CMLE) Let $X$ be the design matrix of the $N$ observations, and $y$ the response vector. Given an integer $K$, find a partition $I_1, \dots, I_K$ of the $N$ observations and FMGR parameters $\theta$ that minimize $-\log L_c(\theta; z)$ defined in Equation (3).

Note that CMLE is not well defined in this form. For example, for an observation $(x_i, y_i)$, if $\beta_k$ is chosen such that $y_i = x_i^T \beta_k$ and we let $\sigma_k \to 0$, then $\phi(y_i; x_i^T \beta_k, \sigma_k^2) \to \infty$, which results in an infinite likelihood. With some mild restrictions on the cluster sizes, the variance associated with each regression line can be bounded from below, and the CMLE is well defined blomer2016hard.

2.2 Alternating optimization scheme with the CEM algorithm

We introduce an alternating optimization algorithm to solve the CMLE problem. Clearly, fixing the partition $I_1, \dots, I_K$, the optimal mixture parameters are given by

$$\pi_k = \frac{|I_k|}{N}, \quad \beta_k = \mathrm{OLS}(X_{I_k}, y_{I_k}), \quad \sigma_k^2 = \frac{1}{|I_k|} \sum_{i \in I_k} (y_i - x_i^T \beta_k)^2.$$

Here, $|I_k|$ denotes the cardinality of the set $I_k$, and $\mathrm{OLS}(X_{I_k}, y_{I_k})$ denotes the ordinary least squares solution to regressing $y$ on $X$ using only observations from $I_k$.

Fixing the FMGR parameters $\theta$, the optimal partition is given by assigning each point to its most likely component, i.e.,

$$z_i = \arg\max_k \, p_{ik}, \quad \text{where} \quad p_{ik} = \frac{\pi_k \, \phi(y_i; x_i^T \beta_k, \sigma_k^2)}{\sum_{l=1}^K \pi_l \, \phi(y_i; x_i^T \beta_l, \sigma_l^2)},$$

which is the posterior probability that $(x_i, y_i)$ lies on the k-th regression line of the mixture. By alternately updating $\theta$ and $z$, we will show in Lemma 2.1 that the solution converges to a stationary point of the complete-data likelihood. We call this alternating scheme the CEM algorithm.

Input: response vector $y$; design matrix $X$; the number of components, $K$.
Initialization: initialize the parameters $\theta^{(0)}$ (or an initial partition).
For $t = 0, 1, 2, \dots$
E-step: Compute, for $i = 1, \dots, N$ and $k = 1, \dots, K$, the current posterior probabilities $p_{ik} = \pi_k^{(t)} \phi(y_i; x_i^T \beta_k^{(t)}, (\sigma_k^{(t)})^2) / \sum_l \pi_l^{(t)} \phi(y_i; x_i^T \beta_l^{(t)}, (\sigma_l^{(t)})^2)$.
C-step: For $i = 1, \dots, N$, assign $z_i^{(t+1)} = \arg\max_k p_{ik}$, yielding the partition $I_1, \dots, I_K$.
M-step: For $k = 1, \dots, K$, update the parameters by $\pi_k^{(t+1)} = |I_k| / N$, $\beta_k^{(t+1)} = \mathrm{OLS}(X_{I_k}, y_{I_k})$, $(\sigma_k^{(t+1)})^2 = \sum_{i \in I_k} (y_i - x_i^T \beta_k^{(t+1)})^2 / |I_k|$.
Stop if converged.
End
Output: $\hat\theta$ and the partition $I_1, \dots, I_K$.
Algorithm 1 CEM

Note: Here $X_{I_k}$ and $y_{I_k}$ denote the observations indexed by $I_k$; $\mathrm{OLS}(X_{I_k}, y_{I_k})$ denotes the ordinary least squares estimate of regressing $y_{I_k}$ on $X_{I_k}$.
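To make the alternating scheme concrete, here is a minimal R sketch of Algorithm 1; this is a bare-bones illustration under the notation above, not the RobMixReg implementation, and it assumes $X$ is an $N \times p$ matrix and that no component becomes empty during the iterations.

```r
# Minimal CEM for FMGR, following Algorithm 1.
cem_fmgr <- function(X, y, K, z_init, max_iter = 100) {
  N  <- length(y)
  z  <- z_init                               # initial hard assignments in 1..K
  Xd <- cbind(1, X)                          # design with intercept
  for (t in seq_len(max_iter)) {
    # M-step: component-wise OLS on the current partition
    pi_k  <- tabulate(z, K) / N
    fits  <- lapply(1:K, function(k) lm(y[z == k] ~ X[z == k, , drop = FALSE]))
    beta  <- sapply(fits, coef)              # (p+1) x K, intercept first
    sigma <- sapply(fits, function(f) sqrt(mean(resid(f)^2)))
    # E-step: unnormalized posteriors pi_k * phi(y_i; x_i' beta_k, sigma_k^2)
    dens <- sapply(1:K, function(k)
      pi_k[k] * dnorm(y, mean = Xd %*% beta[, k], sd = sigma[k]))
    # C-step: assign each point to its most likely component
    z_new <- max.col(dens)
    if (all(z_new == z)) break               # partition unchanged: converged
    z <- z_new
  }
  list(pi = pi_k, beta = beta, sigma = sigma, partition = z)
}
```

Note that the C-step can use the unnormalized quantities $\pi_k \phi(\cdot)$ directly, since the normalizing denominator of $p_{ik}$ is common across $k$ and does not change the argmax.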

Lemma 2.1.

The complete-data likelihood $L_c(\theta^{(t)}; z^{(t)})$ is non-decreasing for any sequence $(\theta^{(t)}, z^{(t)})$ defined as in Algorithm 1, and it converges to a stationary value. Moreover, if the maximum likelihood estimates of the parameters are well defined, the sequence $(\theta^{(t)}, z^{(t)})$ converges to a stationary position.

Proof.

We first show that the sequence $L_c(\theta^{(t)}; z^{(t)})$ is non-decreasing. Since $\theta^{(t+1)}$ maximizes $L_c(\theta; z^{(t+1)})$, we have

$$L_c(\theta^{(t+1)}; z^{(t+1)}) \ge L_c(\theta^{(t)}; z^{(t+1)}).$$

And since $z_i^{(t+1)} = \arg\max_k \pi_k^{(t)} \phi(y_i; x_i^T \beta_k^{(t)}, (\sigma_k^{(t)})^2)$ for all $i$, no factor of the complete-data likelihood can decrease when the assignments are updated, which implies

$$L_c(\theta^{(t)}; z^{(t+1)}) \ge L_c(\theta^{(t)}; z^{(t)}),$$

and hence $L_c(\theta^{(t+1)}; z^{(t+1)}) \ge L_c(\theta^{(t)}; z^{(t)})$. Since there is a finite number of partitions of the $N$ samples into $K$ clusters, the non-decreasing sequence $L_c(\theta^{(t)}; z^{(t)})$ takes a finite number of values, and thus converges to a stationary value. Hence $z^{(t+1)} = z^{(t)}$ for $t$ large enough; from the first inequality and from the assumption that the maximum likelihood estimates are well defined, we deduce that $\theta^{(t+1)} = \theta^{(t)}$. ∎

2.3 A new definition of outliers under CEM

In linear regression, outliers are understood as observations that deviate from the model assumptions, and, naturally, samples with lower likelihood are more likely to be outliers. Powerful tools that simultaneously identify and down-weight outlying data points in linear regression have been developed. In robust linear regression, the outliers are either identified as a fixed trimming ratio of the total observations with the lowest likelihood, or they are identified in a completely data-driven manner without the need of a pre-specified trimming ratio pison2002small; leroy1987robust; rousseeuw1984least; rousseeuw1999fast.

Unfortunately, such a definition of outliers becomes less applicable in the case of mixture regression. Given a robust mixture regression model and a trimming ratio, if we follow the same logic as in linear regression, then the observations with the smallest overall likelihood will be detected as outliers, as in neykov2007robust. This trimmed likelihood approach implies that an observation with lower overall likelihood is more likely to be an outlier than one with higher overall likelihood. However, the overall likelihood depends not only on the likelihood of the observation with respect to each component, but also on the proportion of each component, and such a criterion for outliers becomes problematic if the mixing components are unbalanced. In other words, a low $\pi_k$ will down-weight the "outlierness" of an observation from the k-th component. In addition, if we argue that, given any set of observations, we could always find some mixture model that explains it well, then there is no basis for calling any observation an outlier.

The complete-data likelihood approach based on the CEM algorithm disentangles the mixture distribution into exclusive clusters, within which the robustness issue can be much more easily handled given the tremendous amount of research conducted on robust linear regression. More importantly, we can introduce a more natural definition of outliers.

Definition 2.2.

(Outlier of FMGR) Given an FMGR model parameterized by $\theta$, under CMLE, an observation $(x_i, y_i)$ with $i \in I_k$ is considered an outlier if it is an outlier with respect to the linear regression fitted on component $k$. In other words, an observation is considered an outlier if it is an outlier to the component it belongs to.

This new definition shifts the robustness issue from the mixture model to its linear regression components, for which outliers are well defined and well studied; outlier-ness within a component can be judged by any established criterion for linear regression. Naturally, to confer robust parameter estimation on FMGR under CMLE, we can replace the least squares criterion in the M-step by a robust criterion; and further, to enable simultaneous outlier detection, we can choose any trimmed likelihood approach with a high breakdown point.
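For instance, the least trimmed squares (LTS) estimator implemented in the ltsReg function of the robustbase package flags outliers in a data-driven manner; a small sketch on simulated data of our own (not from the paper):

```r
library(robustbase)

set.seed(1)
n <- 100
x <- rnorm(n)
y <- 1 + 2 * x + rnorm(n, sd = 0.5)
y[1:5] <- y[1:5] + 10          # contaminate 5 observations with a mean shift

fit <- ltsReg(y ~ x)           # least trimmed squares regression
coef(fit)                      # robust intercept and slope, close to (1, 2)
which(fit$lts.wt == 0)         # observations flagged as outliers (weight 0)
```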

2.4 The robust CEM algorithm

Under Definition 2.2, detecting outliers of the FMGR model can be accomplished by detecting the component-wise outliers. The fact that outlier detection in linear regression can be completely data-adaptive makes it possible to develop a data-driven algorithm for simultaneous outlier detection and robust parameter estimation in FMGR.

Our Component-wise Adaptive Trimming method, namely CAT, starts by initializing the posterior probability matrix, $P$, as in Algorithm 2. For each $k = 1, \dots, K$, we randomly sample $n_0$ observations to build a robust linear regression model, and the posterior probability of sample $i$ for component $k$ is initialized as the normal density of the residual of sample $i$ with respect to the $k$-th robust regression line. For robust linear regression, we use the "ltsReg" function in the "robustbase" library in R pison2002small; leroy1987robust; rousseeuw1984least; rousseeuw1999fast, which detects the outliers in a data-driven manner in addition to providing robust parameter estimates.

Input: response vector $y$; independent variables in matrix $X$; the number of components, $K$; size of the initialization random sample, $n_0$.
For $k = 1, \dots, K$
Draw a random sample of size $n_0$ from $\{1, \dots, N\}$, indexed by $S_k$;
Run robust linear regression: $(\hat\beta_k, \hat\sigma_k) = \mathrm{RLM}(X_{S_k}, y_{S_k})$;
Initialize posterior probabilities: $P_{ik} = \phi(y_i - x_i^T \hat\beta_k; 0, \hat\sigma_k^2)$ for $i = 1, \dots, N$;
End
Output: posterior probability matrix $P$.
Algorithm 2 Init: initialize the posterior probability matrix

Note: Function RLM outputs the robust linear regression model and parameter estimates $(\hat\beta_k, \hat\sigma_k)$.
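A minimal R sketch of Algorithm 2 under the notation above follows; the function name init_posterior and the final row normalization are our illustrative choices.

```r
library(robustbase)

# Initialize the N x K posterior matrix from K random robust fits (Algorithm 2).
init_posterior <- function(X, y, K, n0) {
  N <- length(y)
  P <- matrix(0, N, K)
  for (k in 1:K) {
    idx <- sample(N, n0)                     # random subsample for component k
    fit <- ltsReg(X[idx, , drop = FALSE], y[idx])
    res <- y - cbind(1, X) %*% coef(fit)     # residuals of all N observations
    P[, k] <- dnorm(res, sd = fit$scale)     # normal density of the residuals
  }
  P / rowSums(P)                             # normalize rows to posteriors
}
```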

With each initialized $P$, CAT then runs a robust CEM algorithm in which the OLS estimates in the M-step are replaced by robust estimates from the trimmed likelihood method. At the end of each iteration, an MLE estimate is obtained on the samples excluding the outliers detected from all the components. The MLE estimates are computed using the function "flexmix" from the "flexmix" R package leisch2004flexmix. When the set of outliers does not change between two iterations (or a pre-specified number of iterations is reached), the outliers and parameter estimates are finalized for this random start.

CAT undergoes multiple random starts to stabilize the results, and selects the result from the start whose detected outliers are closest to the average outlier frequency across all the random starts. The complete CAT algorithm is summarized in Algorithm 3.

Input: response vector $y$; independent variables in matrix $X$; the number of mixing components, $K$; size of the initialization random sample, $n_0$; the maximum number of iterations, $T$; the number of random starts, $S$.
For $s = 1, \dots, S$
Initialization: $P \leftarrow \mathrm{Init}(X, y, K, n_0)$; $O \leftarrow \emptyset$; $t \leftarrow 0$;
While the outlier set $O$ changes and $t < T$
Let $I_k = \{ i : k = \arg\max_l P_{il} \}$ for $k = 1, \dots, K$;
For $k = 1, \dots, K$: fit robust linear regression on $(X_{I_k}, y_{I_k})$ and collect its detected outliers $O_k = \mathrm{outlier}(X_{I_k}, y_{I_k})$;
End
Let $O = \cup_k O_k$; update $\hat\theta$ by $\mathrm{MLE}$ on the observations not in $O$, and update $P$ accordingly; $t \leftarrow t + 1$;
End
End
Output: robust FMGR parameter estimate $\hat\theta$; outlier set $O$, taken from the selected random start.
Algorithm 3 CAT: Component-wise Adaptive Trimming

Note: Function RLM outputs the robust linear regression model and parameter estimates, with function ltsReg; function outlier outputs the outliers identified by robust linear regression using the trimmed likelihood method, with function ltsReg; function MLE outputs the regular MLE estimates of the FMGR model based on the EM algorithm, with function flexmix. $\mathbb{1}(\cdot)$ is an indicator function, taking value 1 if its argument holds and 0 otherwise.
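Putting the pieces together, here is a condensed R sketch of a single CAT random start; this is our illustrative reading of Algorithm 3, not the RobMixReg implementation, and init_posterior refers to the sketch given after Algorithm 2.

```r
library(robustbase)
library(flexmix)

# One CAT random start: robust CEM with component-wise trimming (Algorithm 3).
cat_one_start <- function(X, y, K, n0, max_iter = 20) {
  N <- length(y)
  P <- init_posterior(X, y, K, n0)
  outliers <- integer(0)
  em <- NULL
  for (t in seq_len(max_iter)) {
    z <- max.col(P)                          # C-step: hard assignments
    # Component-wise robust fits; collect each component's flagged outliers
    out_new <- sort(unique(unlist(lapply(1:K, function(k) {
      idx <- which(z == k)
      fit <- ltsReg(X[idx, , drop = FALSE], y[idx])
      idx[fit$lts.wt == 0]                   # observations trimmed by LTS
    }))))
    # MLE on the non-outlying observations via flexmix (EM)
    keep <- setdiff(seq_len(N), out_new)
    df   <- data.frame(y = y[keep], X[keep, , drop = FALSE])
    em   <- flexmix(y ~ ., data = df, k = K)
    converged <- identical(out_new, outliers)
    outliers  <- out_new
    if (converged) break                     # outlier set stabilized
    # Refresh posteriors: fitted values for kept points, flat for trimmed ones
    P <- matrix(1 / K, N, K)
    P[keep, seq_len(ncol(posterior(em)))] <- posterior(em)
  }
  list(model = em, outliers = outliers)
}
```

Across the $S$ random starts, CAT then keeps the start whose detected outlier set is closest to the average outlier frequency, as described above.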

3 Simulation studies

We evaluate the performance of CAT on synthetic datasets, and compare it with five existing methods: MLE, TLE, MIXT, MIXL and MIXBI. These stand for the MLE approach leisch2004flexmix, the trimmed likelihood approach neykov2007robust, the mixture t-distribution approach yao2014robust, the mixture Laplacian approach song2014robust, and the mixture bi-square approach bai2012robust.

To compare the methods' performance on simultaneous outlier detection and robust parameter estimation, we simulate data using the following mean-shift model yu2017new:

$$y_i = x_i^T \beta_{z_i} + \gamma_i + \epsilon_i,$$

where $\epsilon_i \sim N(0, \sigma_{z_i}^2)$. Here, for each observation, a mean-shift parameter, $\gamma_i$, is added to its mean structure in its mixture component, with $\gamma_i \neq 0$ only for the outlying observations. We consider scenarios in which the observations are drawn from mixture regression models with different component proportions, regression coefficients, variances and sample sizes, and contaminated with different levels of additive outliers.
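A minimal R sketch of drawing from such a contaminated two-component model follows; the parameter values here are illustrative placeholders, not the paper's scenario settings.

```r
set.seed(42)
N <- 200
z <- rbinom(N, 1, 0.5) + 1               # component indicator in {1, 2}
x <- rnorm(N)
beta  <- cbind(c(1, 2), c(-1, -2))       # (intercept, slope) per component
sigma <- c(0.5, 0.5)

gamma <- rep(0, N)
out_idx <- sample(N, ceiling(0.05 * N))  # 5% of points receive a mean shift
gamma[out_idx] <- 8

y <- beta[1, z] + beta[2, z] * x + gamma + rnorm(N, sd = sigma[z])
```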

Example 1: For each $i$, $y_i$ is independently generated with

$$y_i = \beta_{0 z_i} + \beta_{1 z_i} x_{i1} + \beta_{2 z_i} x_{i2} + \epsilon_{i z_i},$$

where $z_i \in \{1, 2\}$ is a component indicator generated from a Bernoulli distribution; $x_{i1}$ and $x_{i2}$ are independently generated from a $N(0, 1)$; and the error terms $\epsilon_{i1}$ and $\epsilon_{i2}$ are independently generated from normal distributions with component-specific variances.

Scenario 1:
Scenario 2:
Scenario 3:
Scenario 4:
Scenario 5:
Scenario 6:

Example 2: For each $i$, $y_i$ is independently generated with

$$y_i = \beta_{0 z_i} + \beta_{1 z_i} x_i + \epsilon_{i z_i},$$

where $z_i$ is a component indicator generated from a Bernoulli distribution; $x_i$ is independently generated from a $N(0, 1)$; and the component error terms are independently generated from normal distributions with component-specific variances.

Scenario 1:
Scenario 2:
Scenario 3:
Scenario 4:
Scenario 5:
Scenario 6:

For each scenario in Examples 1 and 2, we simulate data with sample sizes 200 and 400. The bias and MSE of the parameter estimates over 100 repetitions under each scenario are examined for each competing method, covering the linear regression coefficients and the mixing proportions of all the components. The label switching issue celeux2000computational; stephens2000dealing; yao2009bayesian complicates aligning the parameters of a component in the estimated model with those of the true model: different component orders may give totally different results, and there is no widely accepted method to adjust for this. In our simulation study, we simply order the components in the estimated parameter matrix by minimizing the Euclidean distance to the true parameter matrix.
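A small R sketch of this alignment step (our illustration; est and truth are parameter matrices with one column per component):

```r
# Align estimated components to the truth by choosing the column permutation
# that minimizes the total squared Euclidean distance between the matrices.
align_components <- function(est, truth) {
  K <- ncol(truth)
  perms <- function(v) {                   # all permutations of v, base R
    if (length(v) <= 1) return(list(v))
    out <- list()
    for (i in seq_along(v))
      for (p in perms(v[-i])) out <- c(out, list(c(v[i], p)))
    out
  }
  best <- NULL; best_d <- Inf
  for (p in perms(1:K)) {
    d <- sum((est[, p] - truth)^2)         # distance under this component order
    if (d < best_d) { best_d <- d; best <- p }
  }
  est[, best]                              # estimate with columns reordered
}
```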

Tables 1 and 2 report the bias and MSE of the parameter estimates of each method for sample sizes n=200 and n=400, respectively, in Example 1. Tables 3 and 4 report the same for sample sizes n=200 and n=400, respectively, in Example 2. In each table, the first rows correspond to the regression coefficients and the last rows to the mixing proportions. As seen from the four tables, CAT performs comparably to MLE when no outliers are present. When the observations are contaminated by high leverage outliers, CAT is able to trim off the outliers and reach robust parameter estimates, regardless of the component variances, feature numbers and sample sizes. Its performance is always better than or comparable to that of the five state-of-the-art methods.

4 Conclusion

We proposed solving FMGR using the CEM algorithm, under which outliers are more naturally defined and the robustness issue is more conveniently handled. The CEM algorithm is a variant of the EM algorithm that maximizes the complete-data likelihood. It enables a more natural definition of outliers for FMGR, and in turn the simultaneous detection of outliers and robust estimation of parameters. Most importantly, the adaptive trimming problem in FMGR boils down to that of single-component linear regression, for which many powerful tools have been developed. In summary, CAT is an automatic algorithm with high potential for mining heterogeneous relationships among variables in the big data era.

Scenario : CAT : MLE : TLE : MIXBI : MIXL : MIXT
1 -0.040(0.130) -0.030(0.120) -0.030(0.120) -0.030(0.130) -0.080(0.210) -0.030(0.130)
-0.010(0.120) -0.010(0.120) -0.010(0.120) -0.010(0.120) -0.020(0.130) -0.010(0.130)
0.010(0.130) 0.010(0.120) 0.010(0.120) 0.010(0.120) 0.040(0.160) 0.020(0.120)
-0.010(0.100) 0.000(0.100) 0.000(0.100) 0.000(0.100) -0.030(0.130) 0.010(0.100)
0.010(0.140) 0.010(0.130) 0.010(0.130) 0.010(0.130) -0.010(0.170) 0.020(0.140)
-0.020(0.110) -0.030(0.110) -0.030(0.110) -0.020(0.110) -0.020(0.140) -0.030(0.110)
-0.000(0.040) -0.000(0.040) -0.000(0.040) -0.000(0.040) -0.010(0.040) -0.010(0.040)
0.000(0.040) 0.000(0.040) 0.000(0.040) 0.000(0.040) 0.010(0.040) 0.010(0.040)
2 -0.030(0.430) -0.000(0.310) -0.000(0.310) -0.020(0.360) -0.060(0.550) 0.000(0.330)
0.020(0.270) 0.020(0.240) 0.020(0.240) 0.010(0.240) 0.000(0.330) 0.020(0.230)
0.020(0.390) 0.020(0.330) 0.020(0.330) 0.020(0.370) 0.130(0.470) 0.020(0.390)
0.010(0.320) 0.030(0.260) 0.030(0.260) -0.010(0.400) -0.040(0.360) 0.040(0.310)
-0.000(0.340) 0.010(0.270) 0.010(0.270) -0.020(0.300) -0.080(0.440) 0.000(0.280)
-0.020(0.310) -0.000(0.250) -0.000(0.250) -0.010(0.270) -0.020(0.340) 0.000(0.270)
-0.000(0.100) 0.000(0.080) 0.000(0.080) -0.010(0.110) -0.000(0.080) 0.000(0.090)
0.000(0.100) -0.000(0.080) -0.000(0.080) 0.010(0.110) 0.000(0.080) -0.000(0.090)
3 -0.100(0.170) -0.250(0.510) 0.220(1.500) -0.190(0.160) -0.290(0.290) 0.200(0.440)
-0.000(0.130) 0.330(0.340) 0.270(1.010) 0.060(0.130) 0.140(0.190) -0.010(0.210)
-0.090(0.190) 0.280(0.880) 1.780(2.400) -0.200(0.180) -0.220(0.380) 0.940(0.630)
0.020(0.150) 0.040(0.770) -1.160(2.080) 0.090(0.160) 0.180(0.180) -0.500(0.590)
-0.120(0.240) -0.440(1.020) -0.340(1.840) -0.310(0.240) -0.560(0.420) 0.520(0.710)
0.040(0.170) 0.710(0.640) 0.380(1.150) 0.140(0.170) 0.350(0.290) 0.060(0.400)
0.010(0.050) 0.000(0.150) 0.060(0.170) 0.010(0.050) -0.000(0.070) -0.060(0.150)
-0.010(0.050) -0.000(0.150) -0.060(0.170) -0.010(0.050) 0.000(0.070) 0.060(0.150)
4 0.030(0.550) -0.780(1.140) 0.000(2.670) -0.410(1.270) -0.520(1.050) 0.110(0.770)
0.040(0.270) 1.270(1.500) 0.870(2.730) 0.650(1.310) 0.900(1.240) 0.150(0.500)
0.010(0.560) 0.200(1.790) 1.540(3.220) 0.470(1.830) 0.620(1.730) 1.500(0.780)
-0.020(0.420) 0.560(2.050) -0.560(3.160) 0.180(1.450) -0.020(1.500) -1.230(0.620)
0.080(1.190) -1.700(2.570) -0.340(3.310) -1.020(2.310) -0.850(1.990) 0.210(1.130)
-0.050(0.310) 2.370(2.520) 1.110(3.260) 1.650(3.030) 1.770(2.400) 0.280(0.870)
0.000(0.110) 0.060(0.330) 0.080(0.180) 0.110(0.270) 0.060(0.320) -0.050(0.310)
-0.000(0.110) -0.060(0.330) -0.080(0.180) -0.110(0.270) -0.060(0.320) 0.050(0.310)
5 -0.000(0.140) -0.120(0.450) -0.030(1.120) -0.030(0.140) -0.100(0.170) 0.080(0.380)
0.030(0.120) 0.220(0.310) 0.020(1.140) 0.040(0.110) 0.080(0.160) 0.050(0.150)
-0.020(0.150) 0.340(0.780) 1.990(2.660) -0.070(0.160) -0.080(0.190) 0.500(0.630)
0.010(0.120) -0.050(0.580) -1.390(2.430) 0.030(0.120) 0.060(0.120) -0.190(0.460)
-0.040(0.140) -0.240(0.740) -0.220(2.000) -0.120(0.150) -0.260(0.200) 0.130(0.670)
0.020(0.120) 0.410(0.670) -0.580(1.910) 0.040(0.120) 0.150(0.160) 0.040(0.200)
0.010(0.050) -0.030(0.160) 0.070(0.080) 0.010(0.050) 0.000(0.050) -0.040(0.120)
-0.010(0.050) 0.030(0.160) -0.070(0.080) -0.010(0.050) -0.000(0.050) 0.040(0.120)
6 -0.020(0.380) -0.340(1.450) -0.130(2.570) -0.000(0.350) -0.660(1.090) 0.120(0.990)
-0.010(0.310) 0.640(1.380) -0.660(2.780) -0.020(0.350) 0.490(1.050) 0.080(0.680)
0.110(0.330) 0.600(1.690) 1.660(2.920) 0.330(0.580) 0.390(1.550) 1.070(0.930)
0.070(0.370) -0.280(2.000) -1.260(3.700) 0.030(0.380) 0.090(1.460) -0.890(0.880)
0.010(0.370) -0.940(2.730) -0.140(3.770) 0.000(0.330) -0.980(1.750) 0.050(1.300)
0.020(0.270) 1.370(2.180) -0.220(3.460) 0.030(0.260) 1.240(2.070) 0.390(1.230)
0.030(0.090) -0.050(0.400) 0.060(0.080) 0.050(0.140) 0.050(0.330) -0.060(0.330)
-0.030(0.090) 0.050(0.400) -0.060(0.080) -0.050(0.140) -0.050(0.330) 0.060(0.330)
Table 1: Bias (MSE in parentheses) of the parameter estimates; n = 200, K = 2, p = 2.
Scenario : CAT : MLE : TLE : MIXBI : MIXL : MIXT
1 -0.030(0.290) -0.080(0.430) -0.080(0.430) -0.060(0.330) -0.180(0.320) -0.170(0.430)
-0.020(0.180) 0.030(0.250) 0.030(0.250) -0.020(0.170) 0.000(0.210) 0.040(0.220)
-0.010(0.300) 0.190(0.550) 0.190(0.550) 0.090(0.390) 0.270(0.410) 0.310(0.720)
0.060(0.440) 0.090(0.530) 0.090(0.530) -0.010(0.240) -0.040(0.260) 0.040(0.430)
0.020(0.160) 0.030(0.160) 0.030(0.160) 0.030(0.140) -0.000(0.170) 0.070(0.280)
0.060(0.420) 0.330(0.890) 0.330(0.890) 0.120(0.560) 0.170(0.380) 0.520(1.010)
0.020(0.080) 0.030(0.100) 0.030(0.100) 0.010(0.100) 0.010(0.070) 0.050(0.120)
-0.010(0.050) -0.030(0.080) -0.030(0.080) -0.010(0.070) -0.000(0.060) -0.030(0.090)
-0.010(0.080) -0.000(0.080) -0.000(0.080) -0.000(0.100) -0.000(0.090) -0.020(0.110)
2 0.290(1.060) 0.120(1.150) 0.120(1.150) -0.110(0.860) -0.420(0.770) -0.050(0.930)
0.190(0.610) 0.260(1.040) 0.260(1.040) 0.150(0.660) 0.150(0.600) 0.240(0.650)
-0.150(0.880) 0.170(0.930) 0.170(0.930) 0.210(0.870) 0.550(0.750) 0.500(0.950)
0.330(1.020) 0.120(0.990) 0.120(0.990) 0.200(1.020) -0.020(0.770) -0.010(0.910)
-0.110(0.500) 0.100(0.500) 0.100(0.500) 0.090(0.620) -0.000(0.490) 0.340(0.730)
0.330(1.040) 0.400(0.990) 0.400(0.990) 0.340(1.120) 0.540(0.870) 0.610(1.130)
0.000(0.140) 0.020(0.140) 0.020(0.140) 0.030(0.180) 0.020(0.130) 0.030(0.180)
-0.040(0.090) -0.060(0.110) -0.060(0.110) -0.050(0.130) -0.040(0.110) -0.090(0.130)
0.040(0.130) 0.040(0.130) 0.040(0.130) 0.030(0.170) 0.020(0.140) 0.060(0.140)
3 -0.430(0.510) -1.300(0.890) -0.630(1.590) -0.910(0.630) -1.150(0.840) -0.810(0.450)
0.110(0.470) 0.720(0.770) -0.450(1.530) 0.260(0.500) 0.380(0.560) 0.140(0.330)
0.340(0.680) 0.190(1.690) 0.980(1.370) 0.510(1.070) 0.290(1.370) 1.160(0.650)
-0.170(0.670) -0.430(1.730) 1.520(2.690) -0.330(1.200) -0.650(1.360) 0.360(0.730)
0.220(0.670) 0.890(0.970) -0.910(2.240) 0.450(0.670) 0.480(0.610) 0.170(0.380)
0.330(0.930) -0.780(2.410) 0.900(2.280) 0.340(1.750) -0.670(2.650) 0.430(1.290)
0.030(0.120) 0.110(0.210) 0.010(0.160) 0.120(0.140) 0.050(0.170) 0.120(0.170)
-0.020(0.100) -0.090(0.120) -0.020(0.160) -0.040(0.100) -0.030(0.140) -0.040(0.090)
-0.010(0.100) -0.020(0.250) 0.010(0.150) -0.080(0.130) -0.020(0.160) -0.080(0.180)
4 -0.390(1.690) -1.880(1.970) 0.450(8.430) -0.720(1.350) -3.080(2.200) -0.490(0.800)
0.560(1.430) 2.560(3.520) 0.730(2.670) 0.910(2.100) 2.570(2.170) 0.060(0.750)
0.220(1.290) -0.970(3.150) 0.300(2.640) 0.340(1.920) 0.560(1.850) 0.980(0.850)
-0.570(2.610) -2.020(4.470) 0.310(6.740) -0.030(2.980) -4.100(4.190) 0.630(1.130)
1.240(2.820) 4.850(3.790) 0.750(4.060) 1.880(3.400) 4.390(3.230) 0.000(0.670)
0.480(2.010) -2.250(3.980) -0.100(3.310) -0.470(3.220) -0.250(3.030) 0.330(1.750)
0.040(0.170) 0.110(0.330) 0.020(0.150) 0.090(0.250) -0.050(0.230) 0.060(0.190)
-0.090(0.130) -0.220(0.140) -0.040(0.150) -0.100(0.200) -0.220(0.130) -0.070(0.140)
0.060(0.160) 0.110(0.340) 0.020(0.160) 0.010(0.200) 0.270(0.260) 0.010(0.200)
5 -0.170(0.410) -0.810(0.950) -0.620(1.270) -0.440(0.370) -0.770(0.590) -0.600(0.610)
0.040(0.170) 0.530(0.830) -0.870(1.160) 0.090(0.270) 0.390(0.580) 0.190(0.530)
0.100(0.460) 0.390(1.620) 0.790(1.350) 0.340(0.600) 0.590(1.240) 0.890(1.110)
-0.180(0.390) 0.330(1.570) 0.930(2.040) -0.100(0.360) -0.140(0.930) 0.550(1.050)
-0.010(0.400) 0.620(1.090) -2.060(2.550) 0.150(0.460) 0.430(0.680) 0.190(0.540)
0.120(0.540) -0.590(2.560) 0.350(2.110) 0.270(0.770) 0.080(2.460) -0.020(2.020)
0.000(0.090) 0.110(0.230) -0.010(0.180) 0.060(0.130) 0.090(0.150) 0.060(0.220)
-0.000(0.070) -0.080(0.150) -0.000(0.200) -0.010(0.090) -0.070(0.170) -0.040(0.130)
-0.000(0.100) -0.020(0.250) 0.010(0.170) -0.060(0.120) -0.020(0.150) -0.010(0.240)
6 -0.010(1.040) -1.110(1.350) -0.390(2.160) -0.560(0.670) -1.370(1.750) -0.640(0.920)
0.180(0.700) 1.660(2.570) -0.480(2.360) 0.270(1.210) 1.170(1.800) 0.180(0.790)
-0.200(0.910) 0.000(2.400) 0.920(2.480) 0.570(1.050) 0.340(1.930) 1.000(0.910)
0.290(0.770) -0.920(3.600) 1.910(3.900) 0.280(1.460) -0.780(2.980) 0.180(1.660)
0.010(0.550) 2.250(3.160) -0.430(3.360) 0.290(1.780) 2.080(2.970) 0.290(0.980)
0.570(1.070) -0.750(3.330) 1.600(3.190) 0.580(1.390) -0.370(3.470) 0.410(1.930)
0.050(0.120) 0.090(0.310) 0.020(0.170) 0.050(0.200) 0.080(0.210) 0.020(0.240)
-0.050(0.100) -0.200(0.160) 0.030(0.150) -0.090(0.180) -0.140(0.170) -0.090(0.180)
0.010(0.130) 0.110(0.330) -0.050(0.180) 0.040(0.230) 0.060(0.210) 0.070(0.230)
Table 2: Bias (MSE in parentheses) of the parameter estimates; n = 400, K = 2, p = 2.
Scenario : CAT : MLE : TLE : MIXBI : MIXL : MIXT
1 0.000(0.110) 0.000(0.100) 0.000(0.100) -0.000(0.100) -0.000(0.130) 0.000(0.100)
0.000(0.090) 0.010(0.080) 0.010(0.080) 0.010(0.080) 0.010(0.100) 0.010(0.080)
0.010(0.080) 0.000(0.080) 0.000(0.080) 0.000(0.080) 0.020(0.100) 0.010(0.080)
-0.020(0.080) -0.010(0.070) -0.010(0.070) -0.010(0.080) -0.030(0.100) -0.000(0.090)
0.010(0.100) 0.000(0.080) 0.000(0.080) 0.000(0.090) -0.000(0.120) 0.000(0.080)
-0.010(0.080) -0.000(0.080) -0.000(0.080) -0.010(0.080) -0.010(0.090) -0.000(0.080)
0.000(0.030) 0.000(0.030) 0.000(0.030) 0.000(0.030) -0.000(0.030) 0.000(0.030)
-0.000(0.030) -0.000(0.030) -0.000(0.030) -0.000(0.030) 0.000(0.030) -0.000(0.030)
2 0.020(0.240) 0.030(0.210) 0.030(0.210) 0.020(0.220) 0.010(0.280) 0.040(0.220)
-0.020(0.230) -0.020(0.190) -0.020(0.190) -0.010(0.200) -0.000(0.250) -0.020(0.190)
-0.040(0.270) -0.000(0.250) -0.000(0.250) -0.030(0.240) 0.070(0.290) -0.020(0.290)
-0.050(0.220) -0.010(0.190) -0.010(0.190) -0.040(0.290) -0.050(0.260) -0.010(0.220)
-0.050(0.250) -0.050(0.210) -0.050(0.210) -0.050(0.220) -0.100(0.280) -0.060(0.220)
0.030(0.210) 0.030(0.180) 0.030(0.180) 0.030(0.180) 0.050(0.220) 0.040(0.200)
0.000(0.070) -0.000(0.060) -0.000(0.060) -0.010(0.080) -0.000(0.060) -0.000(0.070)
-0.000(0.070) 0.000(0.060) 0.000(0.060) 0.010(0.080) 0.000(0.060) 0.000(0.070)
3 -0.060(0.120) -0.100(0.480) -0.010(0.740) -0.170(0.120) -0.280(0.150) 0.320(0.280)
0.010(0.090) 0.270(0.320) -0.010(0.560) 0.070(0.090) 0.170(0.120) 0.030(0.120)
-0.090(0.130) 0.330(0.790) 2.160(1.730) -0.220(0.130) -0.290(0.180) 1.010(0.400)
0.020(0.100) 0.040(0.650) -1.370(1.540) 0.090(0.100) 0.180(0.140) -0.500(0.550)
-0.130(0.180) -0.290(0.920) -0.130(1.140) -0.350(0.170) -0.590(0.200) 0.580(0.490)
0.020(0.110) 0.570(0.610) 0.030(1.130) 0.130(0.120) 0.330(0.150) 0.030(0.180)
0.010(0.040) -0.010(0.120) 0.060(0.160) 0.000(0.040) -0.010(0.050) -0.080(0.130)
-0.010(0.040) 0.010(0.120) -0.060(0.160) -0.000(0.040) 0.010(0.050) 0.080(0.130)
4 -0.010(0.230) -0.760(1.300) -0.260(1.370) -0.760(1.320) -0.620(1.030) 0.090(0.600)
0.000(0.190) 0.810(1.060) -0.020(1.630) 0.530(1.170) 0.640(0.970) 0.050(0.400)
-0.040(0.250) 0.170(1.700) 2.170(2.120) 0.070(1.870) 0.310(1.670) 1.530(0.710)
0.000(0.200) 0.030(1.800) -0.800(2.280) -0.160(1.120) -0.150(1.350) -1.370(0.470)
0.010(0.240) -1.840(2.820) -0.490(3.080) -1.630(2.600) -1.130(1.900) 0.250(0.980)
0.010(0.200) 1.720(2.060) 0.340(3.130) 1.180(2.540) 1.380(2.020) 0.190(0.770)
0.010(0.060) -0.040(0.310) 0.090(0.170) 0.040(0.270) -0.000(0.310) -0.070(0.310)
-0.010(0.060) 0.040(0.310) -0.090(0.170) -0.040(0.270) 0.000(0.310) 0.070(0.310)
5 -0.010(0.110) -0.040(0.380) -0.160(1.290) -0.050(0.110) -0.110(0.130) 0.180(0.250)
0.010(0.090) 0.210(0.260) 0.010(1.220) 0.030(0.090) 0.090(0.110) 0.040(0.160)
-0.020(0.110) 0.440(0.760) 1.480(2.270) -0.060(0.120) -0.090(0.120) 0.730(0.590)
0.010(0.080) -0.070(0.610) -1.400(2.200) 0.030(0.080) 0.070(0.100) -0.210(0.400)
-0.010(0.120) -0.130(0.700) 0.100(1.660) -0.100(0.130) -0.200(0.130) 0.370(0.430)
0.010(0.080) 0.360(0.450) 0.250(1.430) 0.050(0.090) 0.140(0.110) 0.040(0.310)
0.000(0.030) -0.020(0.130) 0.050(0.160) 0.000(0.030) -0.000(0.040) -0.020(0.120)
-0.000(0.030) 0.020(0.130) -0.050(0.160) -0.000(0.030) 0.000(0.040) 0.020(0.120)
6 0.020(0.240) -0.470(1.340) -0.200(1.580) 0.010(0.210) -0.520(1.020) 0.000(0.820)
0.010(0.190) 0.710(1.160) -0.130(1.940) 0.010(0.200) 0.600(1.130) 0.150(0.580)
-0.010(0.250) 0.650(1.610) 1.430(2.970) 0.190(0.400) 0.400(1.560) 1.070(0.950)
0.000(0.210) -0.260(2.070) -1.600(2.820) -0.000(0.250) -0.160(1.360) -1.110(0.780)
0.040(0.230) -0.970(2.260) -0.110(2.890) 0.020(0.210) -0.950(1.740) 0.020(1.150)
-0.020(0.200) 1.600(2.430) 0.440(3.360) -0.010(0.190) 1.330(2.120) 0.250(0.940)
0.020(0.060) -0.070(0.400) 0.050(0.170) 0.040(0.090) 0.030(0.340) -0.100(0.310)
-0.020(0.060) 0.070(0.400) -0.050(0.170) -0.040(0.090) -0.030(0.340) 0.100(0.310)
Table 3: Bias (MSE in parentheses) of the parameter estimates; n = 200, K = 3, p = 1.
Scenario : CAT : MLE : TLE : MIXBI : MIXL : MIXT
1 -0.000(0.220) -0.080(0.320) -0.080(0.320) -0.030(0.200) -0.140(0.230) -0.080(0.250)
-0.010(0.110) 0.020(0.180) 0.020(0.180) -0.000(0.110) -0.000(0.130) 0.000(0.110)
0.020(0.270) 0.160(0.510) 0.160(0.510) 0.010(0.200) 0.160(0.250) 0.070(0.310)
0.080(0.190) 0.100(0.450) 0.100(0.450) 0.010(0.150) -0.030(0.190) -0.020(0.170)
0.010(0.110) 0.020(0.120) 0.020(0.120) 0.020(0.090) 0.010(0.110) 0.020(0.130)
-0.040(0.350) 0.210(0.760) 0.210(0.760) -0.030(0.180) 0.070(0.210) 0.090(0.390)
0.000(0.060) 0.020(0.100) 0.020(0.100) 0.000(0.060) 0.000(0.050) 0.010(0.070)
-0.010(0.040) -0.020(0.060) -0.020(0.060) 0.000(0.040) 0.000(0.040) -0.000(0.050)
0.000(0.060) -0.000(0.070) -0.000(0.070) -0.000(0.070) -0.000(0.060) -0.010(0.070)
2 -0.070(0.920) -0.170(0.680) -0.170(0.680) -0.240(0.790) -0.310(0.660) -0.270(0.740)
0.240(0.580) 0.210(0.440) 0.210(0.440) 0.260(0.540) 0.100(0.480) 0.220(0.550)
-0.060(0.920) 0.180(0.780) 0.180(0.780) 0.220(0.780) 0.440(0.670) 0.570(0.800)
0.130(0.780) -0.060(0.620) -0.060(0.620) -0.050(0.590) 0.060(0.730) -0.080(0.830)
-0.040(0.420) 0.070(0.350) 0.070(0.350) 0.010(0.380) -0.000(0.400) 0.130(0.450)
0.570(0.980) 0.450(0.870) 0.450(0.870) 0.630(1.170) 0.420(0.800) 0.640(1.110)
0.030(0.130) 0.010(0.120) 0.010(0.120) 0.030(0.170) 0.010(0.100) 0.020(0.160)
-0.060(0.100) -0.050(0.100) -0.050(0.100) -0.050(0.120) -0.040(0.080) -0.070(0.110)
0.030(0.120) 0.030(0.110) 0.030(0.110) 0.020(0.170) 0.030(0.100) 0.050(0.140)
3 -0.410(0.500) -1.280(0.940) -0.540(0.920) -0.850(0.570) -1.120(0.670) -0.770(0.420)
0.040(0.210) 0.620(0.640) -0.370(1.010) 0.250(0.470) 0.570(0.630) 0.120(0.240)
0.320(0.510) 0.460(1.670) 1.060(1.240) 0.500(0.820) 0.580(1.500) 1.410(0.600)
-0.310(0.690) -0.530(1.960) 1.950(1.760) -0.370(1.130) -0.520(1.200) 0.440(0.730)
0.090(0.350) 0.770(0.790) -1.210(1.700) 0.380(0.570) 0.680(0.690) 0.130(0.210)
0.110(0.490) -0.740(2.460) 0.910(1.960) 0.330(1.280) -0.200(2.920) 0.670(1.120)
0.020(0.090) 0.100(0.220) 0.030(0.140) 0.110(0.120) 0.080(0.180) 0.150(0.150)
-0.010(0.050) -0.070(0.100) -0.030(0.160) -0.010(0.090) -0.070(0.160) -0.030(0.070)
-0.010(0.090) -0.030(0.230) 0.000(0.150) -0.100(0.110) -0.010(0.150) -0.110(0.160)
4 -0.450(1.470) -1.590(1.930) -0.560(1.310) -0.990(1.440) -2.340(2.190) -0.480(0.640)
0.310(0.890) 2.620(2.660) -0.260(1.560) 0.830(1.890) 2.270(2.110) -0.060(0.460)
-0.050(1.400) -0.630(2.830) 0.930(1.430) 0.440(1.860) 0.110(2.060) 1.180(0.480)
-0.510(2.190) -1.060(4.020) 1.040(2.770) -0.230(2.600) -2.490(4.180) 0.560(1.050)
0.620(2.040) 3.980(3.570) -1.320(2.350) 1.380(3.000) 3.570(3.260) 0.000(0.420)
0.040(2.320) -2.110(4.010) 0.300(2.340) -0.410(3.260) -1.180(3.680) 0.300(1.170)
0.030(0.140) 0.150(0.310) -0.000(0.170) 0.080(0.240) 0.030(0.260) 0.050(0.180)
-0.060(0.100) -0.210(0.140) -0.030(0.170) -0.100(0.180) -0.190(0.150) -0.050(0.110)
0.030(0.140) 0.060(0.310) 0.030(0.140) 0.010(0.210) 0.150(0.290) 0.010(0.190)
5 -0.070(0.290) -1.060(0.950) -0.660(1.410) -0.530(0.380) -0.870(0.600) -0.670(0.640)
0.030(0.140) 0.320(0.480) -0.270(1.340) 0.050(0.150) 0.260(0.460) 0.130(0.420)
0.070(0.330) 0.580(1.480) 1.330(1.260) 0.310(0.530) 0.560(0.900) 1.270(0.550)
0.010(0.220) -0.220(1.970) 1.080(2.750) -0.140(0.710) -0.400(0.930) 0.610(0.920)
0.030(0.120) 0.310(0.420) -1.390(2.090) 0.110(0.250) 0.320(0.700) 0.160(0.430)
0.010(0.490) -0.660(2.260) 0.740(2.180) 0.170(0.600) -0.060(1.820) 0.400(1.220)
0.010(0.070) 0.090(0.250) 0.040(0.160) 0.060(0.120) 0.030(0.160) 0.100(0.220)
-0.000(0.040) -0.040(0.110) -0.010(0.170) 0.010(0.060) -0.030(0.140) -0.030(0.100)
-0.010(0.070) -0.050(0.250) -0.030(0.180) -0.070(0.120) -0.000(0.140) -0.070(0.220)
6 -0.130(0.800) -0.970(1.450) -0.300(1.650) -0.730(0.880) -1.470(1.460) -0.530(0.720)
0.280(0.580) 1.260(2.460) -0.130(2.060) 0.140(0.950) 1.060(1.610) 0.190(0.740)
-0.060(0.940) 0.370(2.120) 1.310(1.750) 0.670(1.080) 0.920(1.130) 1.080(0.740)
0.150(0.560) -0.210(3.050) 0.920(3.270) 0.120(1.400) -0.920(2.810) 0.650(1.010)
-0.050(0.440) 1.560(2.880) -0.750(3.750) 0.330(1.630) 1.560(2.540) 0.150(0.650)
0.710(1.190) -0.640(2.880) 0.250(3.000) 0.350(1.570) 0.280(2.370) 0.280(1.580)
0.040(0.130) 0.080(0.290) 0.020(0.160) 0.060(0.190) 0.040(0.210) 0.050(0.240)
-0.040(0.090) -0.120(0.180) -0.050(0.180) -0.090(0.170) -0.120(0.180) -0.070(0.140)
-0.000(0.120) 0.050(0.300) 0.040(0.180) 0.040(0.230) 0.090(0.180) 0.020(0.220)
Table 4: Bias (MSE in parentheses) of the parameter estimates; n = 400, K = 3, p = 1.

References