Quantifying Interpretability of Arbitrary Machine Learning Models Through Functional Decomposition

04/08/2019
by   Christoph Molnar, et al.
Universität München

To obtain interpretable machine learning models, either interpretable models are constructed from the outset - e.g. shallow decision trees, rule lists, or sparse generalized linear models - or post-hoc interpretation methods - e.g. partial dependence or ALE plots - are employed. Both approaches have disadvantages. While the former can restrict the hypothesis space too conservatively, leading to potentially suboptimal solutions, the latter can produce overly verbose or misleading results if the model is too complex, especially w.r.t. feature interactions. We propose to make the compromise between predictive power and interpretability explicit by quantifying the complexity / interpretability of machine learning models. Based on functional decomposition, we propose measures of the number of features used, the interaction strength and the main effect complexity. We show that post-hoc interpretation of models that minimize these three measures is more reliable and compact. Furthermore, we demonstrate the application of such measures in a multi-objective optimization approach that considers predictive power and interpretability simultaneously.


1 Introduction

Machine learning models are optimized for predictive performance, but it is often necessary to understand models, e.g. to debug them, to gain trust in their predictions, or to satisfy regulatory requirements. Performance therefore often has to be traded off against interpretability. In areas such as the life sciences and social sciences, it is common to restrict model selection to interpretable models such as (generalized) linear regression models and decision trees [23]. This often relies on an intuitive notion of interpretability, leading to an avoidance of "black boxes" such as tree ensembles and neural networks [5]. A restriction to structurally simpler models has the drawback that better performing models are often excluded a priori from model selection. An alternative is to allow any model and apply post-hoc interpretation methods to explain model behavior and predictions. Interpretation methods quantify the effects that features have on predictions, compute feature importances or explain individual predictions; see [24] for an overview. While model-agnostic post-hoc interpretation methods can, in general, be used regardless of model complexity, their reliability and compactness deteriorate when models use a large number of features, have strong feature interactions or have complex feature main effects.

Model-agnostic interpretability measures are needed to make the compromise between interpretability and predictive performance explicit when selecting models [31, 5]. Instead of fixing the trade-off by preselecting an interpretable model class, model-agnostic measures allow informed model selection with the desired balance between interpretability and predictive performance [13]. Interpretability is not well defined [23] and depends on user preferences and the domain [30, 14, 20, 31]. This supports the conclusion in [4] that we cannot summarize interpretability with a single metric.

1.0.1 Contributions.

We propose three model-agnostic measures of machine learning model interpretability. The measures can be used to compare trained models or to explicitly optimize interpretability during hyperparameter tuning and model selection. First we review related work on interpretability measures and the background of functional decomposition, on which our proposed measures are based. For the number of features used by the model, we propose an estimation heuristic. Based on the decomposition of the prediction function, we suggest measures for interaction strength and for the average complexity of the feature main effects. We argue that minimizing these three measures improves the reliability and compactness of post-hoc interpretation methods. Finally, we illustrate the use of our proposed measures in multi-objective optimization and discuss implications of interpretability measures for the field of interpretable machine learning.

2 Related Work and Background

In this section we introduce the notation, review related work and describe the functional decomposition on which we base the proposed complexity measures.

2.0.1 Notation:

We consider machine learning prediction functions $\hat{f}: \mathbb{R}^p \rightarrow \mathbb{R}$, where $x \in \mathbb{R}^p$ is a p-dimensional feature vector and $\hat{f}(x)$ is the prediction (e.g. regression output or a classification score). For the decomposition of this function, we write $\hat{f}_S: \mathbb{R}^{|S|} \rightarrow \mathbb{R}$, $S \subseteq \{1, \ldots, p\}$, to denote a function that maps a vector with a subset $S$ of features to a marginal prediction. If the subset $S$ contains a single feature $j$, we write $\hat{f}_j$. We refer to the training data for the machine learning model with the tuples $\mathcal{D} = \{(x^{(i)}, y^{(i)})\}_{i=1}^{n}$ and refer to the value of the j-th feature of the i-th instance as $x_j^{(i)}$. We write $X_j$ to refer to the j-th feature as a random variable.

2.1 Interpretability Measures

This section provides a non-exhaustive overview of approaches for measuring and optimizing interpretability. Many measures of interpretability are model-specific, i.e. only models of the same class can be compared (e.g. decision trees). Model size is often used as a measure of interpretability (e.g. number of decision rules, tree depth, ...) [20, 31, 4, 35]. Akaike's Information Criterion (AIC) [1] and the Bayesian Information Criterion (BIC) [33] are more widely applicable measures for the trade-off between goodness of fit and degrees of freedom. However, AIC and BIC fix a certain compromise between interpretability and performance, and consider only one dimension of interpretability, the degrees of freedom. In [36] the authors propose an interpretability evaluation model that considers the (model-specific) structural complexity of machine learning models. In additive models (e.g. linear regression), the number of features is often used as a measure of interpretability [32]. In [26] the authors propose model-agnostic measures of model stability, based on the semantic similarity of predictions when the model is re-trained on different subsamples of the training data. Similar to our approach, these measures are based on the predictions, not on the structure of the model.

In [27] the authors propose explanation fidelity and explanation stability metrics for local explanation models [29]. They propose to incorporate the metrics as a regularizer into the loss function of a neural network to simultaneously optimize for predictive performance and for higher quality of local explanations. Their local explainability metrics complement ours, since we consider global model properties.

Further approaches measure interpretability as the usability of an (interpretable) model for supporting a human in a task, usually measured in a survey as response time, correctness of the response and task difficulty [36, 20, 10]. In [15] an interpretability measure based on runtime operation count is proposed and evaluated in user studies.

2.2 Functional Decomposition

Any high-dimensional prediction function can be decomposed into a sum of components of increasing dimensionality: an intercept, first-order feature effects, second-order effects and so on, up to the p-th order effect:

$$\hat{f}(x) = \hat{f}_0 + \sum_{j=1}^{p} \hat{f}_j(x_j) + \sum_{j<k} \hat{f}_{jk}(x_j, x_k) + \ldots + \hat{f}_{1,\ldots,p}(x_1, \ldots, x_p) \qquad (1)$$
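
As a toy illustration (our example, not from the paper), a simple prediction function with one interaction term splits into the components of Equation 1 as follows; uniqueness constraints such as centering would shift constants between components but leave the structure unchanged:

```latex
% Toy decomposition of \hat{f}(x_1, x_2) = 2 + x_1 + x_2^2 + x_1 x_2:
\hat{f}(x) =
    \underbrace{2}_{\hat{f}_0}
  + \underbrace{x_1}_{\hat{f}_1(x_1)}
  + \underbrace{x_2^2}_{\hat{f}_2(x_2)}
  + \underbrace{x_1 x_2}_{\hat{f}_{1,2}(x_1, x_2)}
```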

This decomposition is only unique with additional constraints on the components. Stone [34] suggested orthogonality constraints and approximating the prediction function with weighted integrals. Hooker [18] defined centering, orthogonality and variance decomposition as desirable properties, resulting in unique and hierarchically orthogonal components under the correlation inner product.

Accumulated Local Effects (ALE) were proposed in [3] as a tool for visualizing feature effects (e.g. Figure 1) and as an alternative unique decomposition of the prediction function with components $f_{S,ALE}$. The ALE decomposition is unique under an orthogonality-like property that is further described in [3].

The ALE main effect $f_{j,ALE}$ of a feature $x_j$ for a prediction function $\hat{f}$ is defined as

$$f_{j,ALE}(x_j) = \int_{z_{0,j}}^{x_j} \mathbb{E}\left[\frac{\partial \hat{f}(X)}{\partial X_j} \,\middle|\, X_j = z_j\right] dz_j - c_j \qquad (2)$$

Here, $z_{0,j}$ is a lower bound of $X_j$ (usually the minimum observed value of $x_j$), and the expectation is computed conditional on the value $z_j$ for $X_j$ and over the marginal distribution of all other features. The constant $c_j$ is chosen so that the mean of $f_{j,ALE}(X_j)$ with respect to the marginal distribution of $X_j$ is zero. The ALE main effects are defined via the gradients of $\hat{f}$ with respect to the features, but are estimated with finite differences, i.e. access to the gradients of the model is not required. For the estimation we refer to [3]. We base our proposed measures on the ALE decomposition, because ALE are computationally cheap (worst case $O(n)$ for a feature main effect), the effects can be computed sequentially instead of simultaneously as in [18] and, most importantly, ALE do not require knowledge of the joint data distribution. Additionally, ALE have software implementations [25, 2].

3 Functional Complexity

In this section we motivate complexity measures based on functional decomposition. Based on Equation 1, we decompose the prediction function into a constant $\hat{f}_0$ (estimated as $\frac{1}{n}\sum_{i=1}^{n}\hat{f}(x^{(i)})$), plus the main effects $f_{j,ALE}$ (estimated with ALE), and a remainder term $IA$ containing the interactions (the difference between the full model and the constant plus main effects):

$$\hat{f}(x) = \hat{f}_0 + \sum_{j=1}^{p} f_{j,ALE}(x_j) + IA(x) \qquad (3)$$

This arrangement of components emphasizes a decomposition of the prediction function into a main effect model and an interaction remainder. The main effect model itself can be used as a prediction function, and we can analyze how well it approximates $\hat{f}$, which is the idea behind the interaction strength measure IAS. The average main effect complexity (MEC) captures how many parameters are needed to describe the one-dimensional main effects on average. The number of features used (NF) describes how many features the full prediction function relies on.

3.1 Number of Features (NF)

We propose an approach based on feature permutation to determine how many features are used by a model. We regard a feature as "used" by the model when changing the feature changes the prediction; this may differ from the number of features available during training.

If available, a model-specific method for extracting the number of features used by the model is preferable, e.g. counting the number of non-zero weights in a sparse linear regression model. A model-agnostic heuristic is useful when the prediction function is accessible but not the internal structure of the model (e.g. prediction via API call), or when combining preprocessing steps and models complicates programmatic extraction (e.g. training a decision tree on sparse principal components).

The proposed procedure is formally described in Algorithm 1, and a Python sketch follows below. To estimate whether the j-th feature is used, we sample $M$ instances from the data $\mathcal{D}$, replace their j-th feature values with random values from the distribution of $X_j$ (e.g. by sampling $x_j$ values from other instances in $\mathcal{D}$), and observe whether the predictions change. If the prediction of any sampled instance changes, the feature counts as used.

Input: number of samples $M$, data $\mathcal{D}$
NF = 0
for $j \in \{1, \ldots, p\}$ do
    Draw $M$ instances $x^{(1)}, \ldots, x^{(M)}$ from dataset $\mathcal{D}$
    Create $\tilde{x}^{(1)}, \ldots, \tilde{x}^{(M)}$ as copies of the drawn instances
    for $i \in \{1, \ldots, M\}$ do
        Sample $\tilde{x}_j^{(i)}$ from $\{x_j^{(1)}, \ldots, x_j^{(n)}\}$ with the constraint that $\tilde{x}_j^{(i)} \neq x_j^{(i)}$
        Set the j-th feature of $\tilde{x}^{(i)}$ to $\tilde{x}_j^{(i)}$
    if $\hat{f}(x^{(i)}) \neq \hat{f}(\tilde{x}^{(i)})$ for any $i$ then NF = NF + 1
return NF
Algorithm 1 Number of Features Used (NF)
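
A minimal Python sketch of Algorithm 1 (our code; the inequality constraint on the sampled value is relaxed to a tolerance check on the predictions):

```python
import numpy as np

def number_of_features_used(predict, X, n_samples=100, seed=0):
    """Sketch of Algorithm 1: count features whose permutation changes
    at least one prediction (NF)."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    nf = 0
    for j in range(p):
        rows = rng.choice(n, size=min(n_samples, n), replace=False)
        X_perm = X[rows].copy()
        # Replace feature j with values sampled from other instances;
        # Algorithm 1 additionally requires the sampled value to differ
        # from the original, which we approximate via the tolerance check.
        X_perm[:, j] = X[rng.choice(n, size=len(rows)), j]
        if not np.allclose(predict(X_perm), predict(X[rows])):
            nf += 1
    return nf
```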

The rate of false positives is zero, i.e. the probability that the heuristic counts a feature as used although the model did not use it is zero. The probability of a false negative, i.e. that the heuristic overlooks a feature, depends on the number of samples $M$, the model function and the data distribution. Let $p_{dep}$ be the probability that the prediction of a random instance depends on the value of $x_j$. For an instance whose prediction depends on $x_j$, let $p_{change}$ be the probability that a sample from $X_j$ changes its prediction. Then the probability of overlooking feature j is $(1 - p_{dep} \cdot p_{change})^M$. With the simplifying assumption that $p_{dep}$ and $p_{change}$ are the same for all features, the probability of missing at least one of the p features is $1 - (1 - (1 - p_{dep} \cdot p_{change})^M)^p$. For a linear model without interactions and only numerical features, the false negative rate is 0: $p_{dep} = 1$ and $p_{change} = 1$, so the probability of overlooking a feature is zero. Let us assume a non-linear model where only one percent of instances rely on feature $x_j$ ($p_{dep} = 0.01$) and these instances have a probability of 0.02 that the feature permutation changes the prediction ($p_{change} = 0.02$). If we set $M = 100$, the probability of overlooking the feature is $(1 - 0.0002)^{100} \approx 0.98$. If we increase M to 500, the probability that NF counts too few features drops to $\approx 0.90$.
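
These probabilities follow directly from the stated assumptions, as this small arithmetic check illustrates:

```python
# False-negative probability of the NF heuristic under the
# example assumptions stated in the text.
p_dep, p_change = 0.01, 0.02
for M in (100, 500):
    p_miss = (1 - p_dep * p_change) ** M
    print(M, round(p_miss, 3))  # 100 -> 0.98, 500 -> 0.905
```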

We tested the NF heuristic on the Boston Housing data. We trained decision trees (CART) with maximum depths chosen such that 1, 2 and 4 features were used, and LASSO models with penalties chosen such that 0, 2, 3, 4, 11 and 13 features were used. For each model we estimated NF with increasing sample sizes $M$ and repeated each estimation 100 times. For the LASSO models, NF always matched the true number of features. For the CART models, the mean absolute differences between NF and the true number of features were 0.280, 0.020 and 0.000 for the smallest, intermediate and largest $M$, respectively.

3.2 Interaction Strength (IAS)

Interactions between features mean that the prediction cannot be expressed as a sum of independent feature effects, because the effect of one feature depends on the values of other features [24]. We propose to measure interaction strength as the scaled approximation error between the ALE main effect model and the prediction function $\hat{f}$. Based on the ALE decomposition, the ALE main effect model is defined as the sum of the first-order ALE effects:

$$f_{ALE,1st}(x) = \hat{f}_0 + \sum_{j=1}^{p} f_{j,ALE}(x_j)$$

We define interaction strength as the approximation error measured with loss $L$:

$$IAS = \frac{\mathbb{E}[L(\hat{f}, f_{ALE,1st})]}{\mathbb{E}[L(\hat{f}, \hat{f}_0)]} \geq 0 \qquad (4)$$

Here, $\hat{f}_0$ is the mean of the predictions and can be interpreted as the functional decomposition where all feature effects are set to zero. IAS with the L2 loss equals 1 minus the R-squared measure, where the true targets are replaced with the model predictions $\hat{f}(x^{(i)})$:

$$IAS = \frac{\sum_{i=1}^{n} \left(\hat{f}(x^{(i)}) - f_{ALE,1st}(x^{(i)})\right)^2}{\sum_{i=1}^{n} \left(\hat{f}(x^{(i)}) - \hat{f}_0\right)^2} = 1 - R^2$$

If $IAS = 0$, then $R^2 = 1$, which means that the first-order ALE model perfectly approximates $\hat{f}$ and the model has no interactions. IAS can be slightly larger than zero for additive models for which we would expect $IAS = 0$, as observed in e.g. Table 1. This small deviation can occur when the true ALE main effects (see Equation 2) are not perfectly approximated by finite differences.
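With the L2 loss, IAS reduces to a few lines of code; the following sketch assumes each main effect is available as a callable (e.g. an interpolator built from the `ale_main_effect` sketch above):

```python
import numpy as np

def interaction_strength(predict, X, main_effects):
    """Sketch of IAS (Equation 4) with the L2 loss.

    main_effects: one callable per feature, mapping the j-th feature
    column to its centered ALE main effect f_{j,ALE}.
    """
    f_hat = predict(X)
    f0 = f_hat.mean()  # mean prediction: the "all effects zero" model
    ale_1st = f0 + sum(f_j(X[:, j]) for j, f_j in enumerate(main_effects))
    # Scaled approximation error: 1 minus R-squared of the first-order
    # ALE model, with model predictions in place of the true targets.
    return np.sum((f_hat - ale_1st) ** 2) / np.sum((f_hat - f0) ** 2)
```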

3.3 Main Effect Complexity (MEC)

To determine the average shape complexity of the ALE main effects $f_{j,ALE}$, we propose the main effect complexity (MEC) measure. For a single ALE main effect, we define $MEC_j$ as the number of parameters needed to approximate the curve with linear segments. For the entire model, MEC is the average over all main effects, weighted with their variance. Figure 1 shows an ALE plot (= main effect) and its approximation with two linear segments. In the remainder of the paper, main effect ALE curves are represented by small inline sparkline plots (omitted here); vertical bars in a sparkline, if present, mark the borders between the segments of an approximation.

Figure 1: ALE curve (solid line) approximated by two linear segments (dotted line).

Through the approximation with linear segments, the degrees of freedom required for describing a main effect curve become measurable. We measure the degrees of freedom as the number of non-zero coefficients for the intercepts and slopes of the linear segments. The approximation allows some error; e.g. an almost linear main effect may have $MEC_j = 1$, even if dozens of parameters would be needed to describe it perfectly. The approximation quality is measured with R-squared, i.e. the proportion of variance of $f_{j,ALE}$ that is explained by the approximation with linear segments. An approximation has to reach an R-squared of at least $1 - \epsilon$, where $\epsilon$ is the user-defined maximum approximation error. We also introduce the parameter $max\_seg$, the maximum number of segments. If an approximation cannot reach an R-squared above $1 - \epsilon$ with the given $max\_seg$, $MEC_j$ is computed with the maximum number of segments. The selected maximum approximation error $\epsilon$ should be small, but not too small; we found small values of $\epsilon$ visually meaningful. We apply a post-processing step that greedily sets slopes of the linear segments to zero as long as the R-squared stays at or above $1 - \epsilon$. This post-processing step potentially decreases $MEC_j$, especially for models with constant segments such as decision trees or rule-based models. $MEC_j$ is averaged over all features to obtain the main effect complexity MEC of the model. Each $MEC_j$ is weighted with the variance of the corresponding ALE main effect to give more weight to features that contribute more to the prediction.
For example, consider three main effect curves (shown as sparklines in the original, omitted here) that have the same shape and differ from each other only by a scaling factor in the y-axis: each requires five segments to be approximated, but, weighted with the variance, the flattest curve gets a weight of 0.02, the intermediate curve a weight of 2.03 and the steepest curve a weight of 18.28. Algorithm 2 describes the MEC computation in detail.

Input: prediction function $\hat{f}$, approximation error $\epsilon$, maximum number of segments $max\_seg$, data $\mathcal{D}$
for $j \in \{1, \ldots, p\}$ do
    Estimate the ALE main effect $f_{j,ALE}$
    // Approximate the ALE curve with a linear model
    Fit $g_j$ predicting $f_{j,ALE}(x_j^{(i)})$ from $x_j^{(i)}$; set $K = 1$
    // Increase the number of segments until the approximation is good enough
    while $R^2(g_j, f_{j,ALE}) < 1 - \epsilon$ and $K < max\_seg$ do
        Set $K = K + 1$
        Fit segmented regression $g_j$ with $K$ segments
        // Intervals optimized via generalized simulated annealing,
        // slopes and intercepts per segment estimated with ordinary least squares
        // For a categorical feature, set the slopes to zero
    // Post-processing of slopes
    for each segment $k$ of $g_j$ do
        Set the slope of segment $k$ to zero
        if $R^2(g_j, f_{j,ALE}) \geq 1 - \epsilon$ then keep the zeroed slope else restore the old slope
    // Sum of non-zero coefficients minus the first intercept
    $MEC_j$ = number of non-zero intercepts and slopes of $g_j$ minus 1
    $V_j$ = variance of $f_{j,ALE}$
return $MEC = \sum_{j=1}^{p} V_j \cdot MEC_j \,\big/\, \sum_{j=1}^{p} V_j$
Algorithm 2 Main Effect Complexity (MEC).
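
The segmented fitting step can be sketched in simplified form; here an exhaustive search over a small grid of candidate breakpoints replaces the generalized simulated annealing of Algorithm 2, and the slope post-processing is omitted (our simplification, not the authors' implementation). The model-level MEC is then the variance-weighted average of the returned values:

```python
import numpy as np
from itertools import combinations

def _piecewise_r2(x, y, edges):
    """R-squared of per-segment OLS line fits on intervals given by edges."""
    y_hat = np.full_like(y, y.mean())
    for lo, hi in zip(edges[:-1], edges[1:]):
        m = (x >= lo) & (x <= hi)
        if m.sum() >= 2 and np.ptp(x[m]) > 0:
            slope, intercept = np.polyfit(x[m], y[m], deg=1)
            y_hat[m] = slope * x[m] + intercept
    ss_tot = np.sum((y - y.mean()) ** 2)
    return 1.0 - np.sum((y - y_hat) ** 2) / ss_tot if ss_tot > 0 else 1.0

def mec_single(x, y, epsilon=0.05, max_seg=5):
    """MEC_j sketch: fewest linear segments reaching R^2 >= 1 - epsilon.

    x: grid of feature values; y: ALE main effect evaluated on that grid.
    Returns the parameter count (intercepts + slopes minus the first
    intercept), i.e. 2K - 1 for K segments.
    """
    candidates = np.quantile(x, np.linspace(0.1, 0.9, 9))
    for k in range(1, max_seg + 1):
        best = max(
            _piecewise_r2(x, y, np.concatenate(([x.min()], cuts, [x.max()])))
            for cuts in combinations(candidates, k - 1)
        )
        if best >= 1 - epsilon:
            return 2 * k - 1
    return 2 * max_seg - 1
```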

4 Improving Post-hoc Interpretation

Minimizing the number of features used (NF), the interaction strength (IAS) and the main effect complexity (MEC) improves the reliability and compactness of post-hoc interpretation methods such as partial dependence plots, ALE plots, feature importance, interaction effects and local surrogate models.

4.0.1 The fewer features, the less verbose the interpretations.

Our NF measure improves the readability of post-hoc analysis results. The computational complexity and output size of most interpretation methods, such as feature effect plots [3, 16] or feature importance [12, 8], scale with the number of features used. As shown in Table 2, a model with fewer features has a more compact representation, and if additionally $IAS = 0$, the ALE main effects fully characterize the prediction function. Interpretation methods that analyze 2-way feature interactions scale quadratically with the number of features. A complete functional decomposition [18, 3] would require estimating a number of components that is exponential in the number of features, with a correspondingly prohibitive computational complexity.

4.0.2 The less interaction, the more reliable the feature effects.

Feature effect plots, such as partial dependence plots and ALE plots, visualize the marginal relationship between a feature and the prediction. The estimated effects are averages across instances; for individual instances, the effects can vary greatly and can even point in opposite directions when the model includes feature interactions.

In the following simulation, we trained three models with different capabilities of modeling interactions between features: a linear regression model, a support vector machine (radial basis kernel, C = 0.05) and gradient boosted trees. We simulated 500 data points with 4 uniformly distributed features and a continuous target based on Friedman [17], i.e. a non-linear function of the features with additive Gaussian noise. Figure 2 shows that the interaction strength increases with the interaction-modeling capability of the chosen model class. This means that we should prefer models with less interaction when relying on feature effect plots.

Figure 2: The higher the interaction strength in a model (IAS increases from left to right), the less representative the Partial Dependence Plot (light thick line) becomes for individual instances represented by their Individual Conditional Expectation curves (dark thin lines).
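The qualitative effect in Figure 2 can be reproduced with standard tooling; the following sketch (our choice of library, using a four-feature Friedman-style target since the paper's exact simulation constants are not reproduced here) overlays the partial dependence curve on ICE curves with scikit-learn:

```python
import numpy as np
import matplotlib.pyplot as plt
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.inspection import PartialDependenceDisplay

rng = np.random.default_rng(0)
X = rng.uniform(size=(500, 4))  # four uniform features (placeholder ranges)
# Friedman-style target with an x1-x2 interaction and Gaussian noise.
y = (10 * np.sin(np.pi * X[:, 0] * X[:, 1])
     + 20 * (X[:, 2] - 0.5) ** 2 + 10 * X[:, 3]
     + rng.normal(scale=1.0, size=500))

model = GradientBoostingRegressor().fit(X, y)
# kind="both" overlays the partial dependence curve (thick line) on the
# individual conditional expectation curves (thin lines), as in Figure 2.
PartialDependenceDisplay.from_estimator(model, X, features=[0], kind="both")
plt.show()
```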

4.0.3 The less complex the main effects, the easier they are to summarize.

In linear models, a feature effect can be expressed by a single number, the regression coefficient. If effects are non-linear, the method of choice is visualization [3, 16]. Summarizing non-linear effects with a single number (e.g. average marginal effects [22]) can be misleading; if the effect has a U-shape, for example, the average effect might be zero. As a by-product of MEC, there is a third option: instead of reporting a single number, the coefficients of the segmented linear model can be reported. Minimizing MEC means preferring models with main effects that can be described with fewer numbers, which offers a more compact model description.

5 Optimization of Performance and Interpretability

As one of the main applications of the proposed interpretability measures, we demonstrate model selection for performance and interpretability in a multi-objective optimization approach.

5.0.1 Predicting Wine Quality.

We used the wine quality dataset [9], which contains physicochemical properties such as alcohol and residual sugar of 4870 white wines. The goal was to predict wine quality on a scale of 0 to 10, assessed by the median of three blind ratings.

5.0.2 Motivation.

As [14] emphasizes, it is difficult to know the desired compromise between interpretability and performance before modeling the data, which suggests multi-objective optimization. This stands in contrast to a priori selecting an interpretable model class (e.g. decision rules) and optimizing within this class, or to exclusively optimizing performance and applying post-hoc interpretations. We suggest searching over a wide spectrum of model classes and hyperparameter settings, presenting the set of Pareto optimal models, and letting the practitioner choose a suitable compromise between interpretability and performance. The three measures of interpretability provide a detailed characterization of machine learning models, which enables informed decisions (e.g. how much does performance suffer if we use a model without interactions?).

5.0.3 Optimization Setup.

We used the mlrMBO model-based optimization framework [19] to find the best model based on four objectives: number of features used by the model (NF), main effect complexity (MEC), interaction strength (IAS) and the cross-validated mean absolute error (MAE). We optimized over the space of the following model classes (and hyperparameters): CART (maximum tree depth and pruning parameter cp), support vector machine (cost C and kernel width sigma), elastic net regression (mixing parameter alpha and penalization lambda), gradient boosted trees (maximum depth, number of iterations), gradient boosted generalized additive model (number of iterations) and random forest (mtry).

5.0.4 Model-based Optimization Setup.

We used the ParEGO algorithm [21] for multi-objective optimization. Within the fitness function, the MAE was estimated using 5-fold cross-validation and the other measures (NF, MEC, IAS) were estimated using all data instances. We set the number of iterations of ParEGO to 350. For all other parameters of the model-based, multi-objective optimization, we relied on the sensible defaults provided by [7].
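A minimal sketch of this optimization scaffold in mlrMBO is given below. The fitness function, which would resample the candidate model and compute the four measures, is reduced to a placeholder, and the parameter set covers only the gradient boosted trees for brevity; the function and parameter names are illustrative assumptions.

library(mlrMBO)   # attaches smoof, ParamHelpers and mlr

obj <- smoof::makeMultiObjectiveFunction(
  name = "performance-vs-interpretability",
  fn = function(x) {
    # Placeholder: fit the model configured by x, then return the
    # cross-validated MAE together with NF, MEC and IAS.
    c(MAE = 0.5, NF = 11, MEC = 3.2, IAS = 0.4)
  },
  has.simple.signature = FALSE,   # x arrives as a named list
  par.set = makeParamSet(
    makeIntegerParam("max_depth", lower = 1, upper = 20),
    makeIntegerParam("nrounds", lower = 10, upper = 1000)
  ),
  n.objectives = 4,
  minimize = rep(TRUE, 4)
)

ctrl <- makeMBOControl(n.objectives = 4)
ctrl <- setMBOControlMultiObj(ctrl, method = "parego")   # ParEGO scalarization
ctrl <- setMBOControlTermination(ctrl, iters = 350)

res <- mbo(obj, control = ctrl)
res$pareto.front   # candidate models as in Table 1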

Model (Hyperparameters) MAE MEC IAS NF
1 gbt (max_depth:10,nrounds:780) 0.40 3.50 0.71 11.00
2 gbt (max_depth: 8,nrounds:266) 0.41 4.10 0.64 11.00
3 rf (mtry: 5) 0.43 2.40 0.50 11.00
4 rf (mtry: 2) 0.44 2.40 0.48 11.00
5 rf (mtry: 1) 0.45 2.80 0.47 11.00
6 gbt (max_depth: 3,nrounds:617) 0.48 7.30 0.41 11.00
7 gbt (max_depth: 2,nrounds:931) 0.51 8.10 0.26 11.00
8 gbt (max_depth: 2,nrounds:100) 0.54 3.40 0.10 11.00
9 gbt (max_depth: 1,nrounds:949) 0.55 4.40 0.02 11.00
10 gamb (mstop:265) 0.57 1.70 0.00 11.00
11 svm (C:738.6223,sigma:2e-04) 0.57 1.10 0.05 11.00
12 svm (C:126.3303,sigma:2e-04) 0.58 1.00 0.01 11.00
13 gbt (max_depth: 1,nrounds: 43) 0.58 2.40 0.01 10.00
14 CART (maxdepth: 7,cp:0.0038) 0.58 2.30 0.27 10.00
15 CART (maxdepth:12,cp:0.0057) 0.59 2.00 0.21 5.00
16 elastic net (alpha:0.4723,lambda:0.0526) 0.59 1.00 0.00 8.00
17 elastic net (alpha:0.6471,lambda:0.0856) 0.60 1.00 0.00 6.00
18 CART (maxdepth:20,cp:0.0073) 0.60 2.00 0.20 4.00
19 elastic net (alpha:0.8768,lambda:0.0908) 0.61 1.00 0.00 2.00
20 elastic net (alpha:0.8681,lambda:0.2227) 0.63 1.00 0.00 1.00
21 median 0.67 0.00 0.00 0.00
Table 1: Pareto front of models minimizing mean absolute error (MAE), number of features (NF), main effect complexity (MEC) and interaction strength (IAS).

5.0.5 Results.

Table 1 shows the set of Pareto optimal models along with their MAE, NF, IAS and MEC. If two models from the same class with different hyperparameter values had exactly the same MAE, NF, IAS and MEC, we kept only one and dropped the others. We also dropped constant models (e.g. elastic net regression with strong penalization), with the exception of the median model. For a more informative visualization, we propose to visualize the main effects together with the measures, as in Table 2. The four selected models show different trade-offs between the four measures.

gbt [row 1] svm [row 12] gbt [row 8] CART [row 15]
MAE 0.4 0.58 0.54 0.59
MEC 3.5 1 3.4 2
IAS 0.71 0.01 0.1 0.21
NF 11 11 11 5
[ALE main effect curves for fixed.acidity, volatile.acidity, citric.acid, residual.sugar, chlorides, free.sulfur.dioxide, total.sulfur.dioxide, density, pH, sulphates and alcohol, one row per feature and one column per model; the plotted curve data is not representable in text.]
Table 2: A selection of four models from the Pareto optimal set, along with their ALE main effect curves. From left to right, the columns show the models with 1) lowest MAE, 2) lowest MAE with MEC = 1, 3) lowest MAE with IAS <= 0.1, and 4) lowest MAE with NF = 5. Corresponding hyperparameters can be found in Table 1.

5.0.6 Performance-Interpretability Trade-off.

The complexity measures allow us to study the trade-off between interpretability and performance across different model classes and hyperparameter settings. We mapped each measure m to the interval [0, 1] by scaling it with meaningful upper and lower bounds: scaled(m) = (m - m_min) / (m_max - m_min). For MAE, we set m_min to the lowest observed MAE of all models and m_max to the MAE of the constant median model. For NF, we set m_min = 0 and m_max = 11 (all features). For MEC, we set m_min = 0 and m_max to the highest observed MEC of all models. For IAS, we set m_min = 0 and m_max to the highest observed IAS of all models. To combine the three complexity measures in a single dimension, we mapped interpretability ad hoc as: Interpretability = 3 - (scaled NF + scaled MEC + scaled IAS). This weights all three (scaled) measures equally. The maximum interpretability is 3 for the constant model that always predicts the median wine quality. The theoretical minimum interpretability is 0 for a model that uses all features and has the highest interaction strength and highest main effect complexity measure among all Pareto optimal models. Figure 3 maps each model and hyperparameter configuration from the Pareto set to the performance / interpretability space.
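A small R sketch of the score, assuming the bounds read off the Pareto set in Table 1 (MEC_max = 8.1, IAS_max = 0.71):

scale01 <- function(m, m_min, m_max) (m - m_min) / (m_max - m_min)

interpretability <- function(nf, mec, ias,
                             nf_max = 11, mec_max = 8.1, ias_max = 0.71) {
  3 - (scale01(nf, 0, nf_max) + scale01(mec, 0, mec_max) + scale01(ias, 0, ias_max))
}

interpretability(nf = 11, mec = 2.4, ias = 0.50)  # random forest, row 3: ~1.0
interpretability(nf = 0,  mec = 0,   ias = 0)     # constant median model: 3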

Figure 3: Performance vs. interpretability trade-off for predicting wine quality. Corresponding hyperparameters and measures are shown in Table 1.

6 Discussion

We proposed three model-agnostic measures of machine learning model complexity based on functional decomposition: number of features used, interaction strength and main effect complexity. Due to their model-agnostic nature, the measures allow model comparison across different model classes. We argued that minimizing these measures for a machine learning model improves its post-hoc interpretation. We demonstrated that the measures can be optimized directly with multi-objective optimization to make the trade-off between interpretability and performance explicit. The measures work for regression and for binary classification (based on classification scores / probabilities). We formulated the measures with both continuous and categorical features in mind, but leave an in-depth investigation of categorical features open for future work. The measures can be used for model selection, for model benchmarks and as objectives in automated machine learning frameworks.

6.0.1 Limitations.

The proposed decomposition of the prediction function and the definition of the complexity measures will not be appropriate in every situation. For example, all higher-order effects are combined into a single interaction strength measure that does not distinguish between two-way and higher-order interactions: two models can have the same IAS even if one has a single two-way interaction and the other many different higher-order interactions. However, the framework of ALE decomposition allows estimating higher-order effects and constructing different interaction measures. The main effect complexity measure only considers linear segments, but not e.g. seasonal components or other structures. Furthermore, the complexity measures quantify machine learning models from a functional point of view and ignore the structure of the model (e.g. whether it can be represented by a tree).

6.0.2 The bigger picture.

Interpretability is a high-dimensional concept (sparsity, additivity, fidelity, human simulability, ...), and we need several approaches to make it measurable. In this context, we see our work as complementary to other approaches [15, 10, 27], which together form a basis for a more rigorous definition of interpretability, as demanded by [11, 23]. The availability of different interpretability measures also fits the notion that interpretability depends on the audience and the context [30]: different situations require differently weighted interpretability measures. In some situations we might prefer sparseness and a lack of interactions; in others it might be important that we can represent the model as a decision list. A multi-dimensional view of interpretability addresses the missing definition of interpretability and supports researchers in making quantified, verifiable and clearer statements about interpretability.

6.0.3 Implementation.

The code for this paper is available at https://github.com/compstat-lmu/paper_2019_iml_measures. For the examples and experiments we relied on the mlr package [6] in R [28].

6.0.4 Acknowledgements.

This work is funded by the Bavarian State Ministry of Science and the Arts in the framework of the Centre Digitisation.Bavaria (ZD.B) and supported by the German Federal Ministry of Education and Research (BMBF) under Grant No. 01IS18036A. The authors of this work take full responsibility for its content.

References