State space models for non-stationary intermittently coupled systems

Philip G. Sansom et al., 11 November 2017

Many time series exhibit non-stationary behaviour that can be explained by intermittent coupling between the observed system and one or more unobserved drivers. We develop Bayesian state space methods for modelling changes to the mean level or temporal correlation structure due to intermittent coupling. Improved system diagnostics and prediction are achieved by incorporating expert knowledge for both the observed and driver processes. Time-varying autoregressive residual processes are developed to model changes in the temporal correlation structure. Efficient filtering and smoothing methods are proposed for the resulting class of models. We propose methods for evaluating the evidence for unobserved drivers, for quantifying their overall effect, and for quantifying their effects during individual events. Methods for evaluating the potential for skilful predictions within coupled periods, and in future coupled periods, are also proposed. The proposed methodology is applied to the study of winter variability in the dominant pattern of climate variation in the northern hemisphere, the North Atlantic Oscillation. Over 70% of the inter-annual variance in the winter mean level is found to be attributable to an unobserved driver. Skilful predictions for the remainder of the winter season are possible as soon as 30 days after the beginning of the coupled period.


1 Introduction

Intermittently coupled systems can be found in many areas of both the natural and social sciences. For example, many climate processes are only active during certain times of year, e.g., sea ice and snow cover change the interaction between the surface and the atmosphere (Chapin III et al., 2010; Bourassa et al., 2013). Migrating birds and animals only mix at certain times of year, allowing disease transmission between populations (Olsen et al., 2006; Altizer et al., 2011). Empirical models have been applied to forecasting intermittent demand in production economics and operational research (Croston, 1972; Shenstone and Hyndman, 2005). In many cases, it is impossible or impractical to observe or even to physically identify the intermittent component of the system. However, physical reasoning or prior knowledge may support the existence of such components, and provide information about their behaviour. By incorporating this information through careful statistical modelling we can separate the effect of intermittently coupled components from the underlying behaviour of the observed system.

The methodology developed in this study was motivated by the problem of diagnosing unusual persistence in the dominant mode of climate variability in the northern hemisphere, known as the North Atlantic Oscillation (NAO). Because of its impact on European climate, the ability to forecast the NAO is currently a topic of great interest for the development of new climate prediction services (Siegert et al., 2016). A daily time series of NAO observations is shown in Fig. 1. There is a clear annual cycle in the observations. Figure 1 also indicates that the NAO exhibits greater inter-annual variability in winter than in summer. At the same time, the autocorrelation function indicates increased persistence of day-to-day conditions in winter compared to summer. Increased persistence implies increased predictability. The seasonal contrast in inter-annual variability and autocorrelation visible in Fig. 1 could be caused by a transient shift in the mean, or by a change in autocorrelation structure during the winter season. Climate scientists typically fit separate models to different seasons (e.g., Keeley et al., 2009; Franzke and Woollings, 2011). This approach makes it difficult to diagnose whether the apparent change in autocorrelation is the cause of the increased inter-annual variability, or a symptom of excess inter-annual variability in the winter mean.

Figure 1:

The North Atlantic Oscillation. (top) Daily time series (grey) and 90 day moving average (black) of our daily NAO index, (left) inter-annual standard deviation of the monthly mean NAO index, and (right) the autocorrelation function of the daily NAO index computed for each month of the year. Grey lines are the individual months. A linear trend and annual and semi-annual cycles were estimated by least-squares and removed before computing the autocorrelation functions.

In this study we propose a flexible class of models capable of separating variability due to unobservable intermittent components, from long term variability in the observed process itself, accumulated short-term variability, and observation errors. We develop tools for diagnosing whether the intermittent component acts on the mean or the autocorrelation structure of the observed system. If we can learn the state of the intermittent component quickly enough, then it should be possible to make skilful predictions about the remainder of a particular coupled period. Alternatively, the effect of the intermittent component may be similar between coupling events. In that case, it should be possible to make predictions about subsequent coupled periods.

State space models, also known as structural time series models, provide a flexible class of models for non-stationary time series (Durbin and Koopman, 2012). By modelling the system in terms of physically meaningful components we can incorporate expert knowledge to help separate the effects of intermittent components from long-term variability elsewhere in the system. There is an extensive literature on modelling non-stationarity in the mean by state space methods, particularly where the observed process depends linearly on the state parameters and the observation and innovation processes are both normally distributed (Harvey, 1989; West and Harrison, 1997; Durbin and Koopman, 2012). Time-varying autoregressive (TVAR) models generalise classical autoregressive models to have time-varying coefficients, thus capturing changes in the autocorrelation structure (Subba Rao, 1970; Kitagawa and Gersch, 1985; Prado and West, 1997, 2010). In Sec. 3.1, we propose a class of models containing latent TVAR components that capture changes in short-term temporal dependence while maintaining the interpretability of the mean and unobserved intermittent effects.

Smooth changes in the mean or the temporal dependence structure can be captured by simple random walk priors on their respective state variables. Rapid changes, such as those that might be expected due to intermittent coupling, often require explicit interventions in the model (Box and Tiao, 1975). Intervention methods were extended to state space models by Harvey and Durbin (1986). Standard intervention approaches (e.g., Harvey (1989, Chapter 7.6), West and Harrison (1997, Chapter 11), Durbin and Koopman (2012, Chapter 3.2.5)) require the introduction of separate intervention and effect variables for each event. The effect is usually assumed to be constant throughout a particular event and independent between events. For intermittent coupling events, the underlying cause of each event will usually be the same, although the effect may vary. In Sec. 3.2, we model the effect of intermittent coupling as a dynamic process, intermittently identifiable through a series of interventions that determine the timing and duration of the coupling events.

The construction of the NAO time series shown in Fig. 1 and analysed in Sec. 6 is described in Sec. 2. Following the methodological developments outlined above, we discuss efficient posterior inference for the resulting class of models in Sec. 5. Section 6 contains the results of our study of the NAO. Section 7 concludes with a discussion.

2 The North Atlantic Oscillation

The North Atlantic Oscillation is the name given to the difference in surface pressure between the Azores High and the Icelandic Low (Walker, 1924). The NAO is important because it affects the strength of the prevailing westerly winds and the position of the storm track, strongly influencing the winter climate of the United Kingdom and Europe (Hurrell, 1995). The NAO varies on time scales from a few days to several decades (Hurrell, 1995; Kushnir et al., 2006). Statistical studies have hinted at the potential to predict the NAO on seasonal time scales (Keeley et al., 2009; Franzke and Woollings, 2011). This potential predictability is often attributed to forcing by slowly varying components of the climate system, including sea surface temperatures, the stratosphere and snow cover (Kushnir et al., 2006). Climate models have recently begun to show significant skill in forecasting the winter NAO a season ahead (Scaife et al., 2014). However, the physical mechanisms behind the predictability remain unclear and the size of the predictable signal appears to be underestimated by the models (Scaife et al., 2014; Eade et al., 2014). Careful statistical modelling may lead to additional insights. If a predictable signal can be extracted from the observations, then it may be possible to identify the source of the forcing effect.

Following Mosedale et al. (2006), we construct a simple NAO index as the area-weighted sea level pressure difference between two boxes, one stretching from N, the other from N, both spanning W–E, using pressure data from the ERA-Interim reanalysis (Dee et al., 2011). The resulting daily time series, shown in Fig. 1, spans the period 1 January 1979 to 31 December 2017, a total of observations.

3 Modelling intermittently coupled systems

In complex systems such as the Earth system, it is reasonable to consider that all components of the system (e.g., mean, seasonality, temporal dependence) may evolve slowly over time. We begin by outlining a general model to capture gradual changes in the underlying components of the observed process. We then propose explicit intervention models to represent rapid transient changes due to intermittent coupling.

3.1 Latent TVAR component models

Classical autoregressive models require that we redefine the mean of the observed process if the mean is non-zero. This makes it difficult to specify physically meaningful models for the time evolution of the mean and the effect of intermittently coupled components. Latent autoregressive components remove the need to redefine the mean level of the observed time series (Harvey, 1989, Chapter 2). In order to allow for possible changes in the mean, seasonal and autocorrelation structure of an observed process, we propose the following latent time-varying autoregressive component model:

y_t = μ_t + Σ_{j=1}^{k} s_{j,t} + x_t + ε_t (1a)
μ_t = μ_{t−1} + β_{t−1} + ν_{μ,t} (1b)
β_t = β_{t−1} + ν_{β,t} (1c)
s_{j,t} = s_{j,t−1} cos ω_j + s*_{j,t−1} sin ω_j + ν_{s,j,t} (1d)
s*_{j,t} = −s_{j,t−1} sin ω_j + s*_{j,t−1} cos ω_j + ν*_{s,j,t} (1e)
x_t = Σ_{i=1}^{p} φ_{i,t} x_{t−i} + ν_{x,t} (1f)
φ_{i,t} = φ_{i,t−1} + ν_{φ,i,t} (1g)

where ω_j = 2πj/365.25 (j = 1, …, k) and ε_t ~ N(0, σ_ε²). The observed process y_t is modelled as the sum of mean, seasonal and autoregressive components. The variable μ_t represents the mean level of the observed process. Any local-in-time systematic trend is captured by the variable β_t. The harmonic components s_{j,t} and s*_{j,t} (j = 1, …, k) represent seasonal behaviour. The local trend and seasonal variables are assumed to be time-varying, evolving according to independent normal innovation processes ν_{μ,t}, ν_{β,t}, ν_{s,j,t} and ν*_{s,j,t} (j = 1, …, k). The irregular component x_t represents short-term variability in the observed process and is modelled as a latent time-varying autoregressive process of order p with normal innovation process ν_{x,t}. The autoregressive coefficients φ_{1,t}, …, φ_{p,t} are assumed to be time-varying, evolving according to independent normal innovation processes ν_{φ,i,t} (i = 1, …, p). The independent residual ε_t represents observation or measurement error.
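To make the structure of model (1) concrete, the following is a minimal simulation sketch in Python using the notation above. The harmonic order, TVAR order, initial coefficient values and all variances are illustrative choices of ours, not values estimated or assumed in the paper.

```python
import numpy as np

rng = np.random.default_rng(42)

T, k, p = 1000, 1, 2                  # series length, harmonics, TVAR order (illustrative)
omega = 2 * np.pi * np.arange(1, k + 1) / 365.25

# innovation / error standard deviations (illustrative values only)
sig_eps, sig_mu, sig_beta, sig_s, sig_x, sig_phi = 0.1, 0.01, 0.001, 0.01, 1.0, 0.001

mu, beta = 0.0, 0.0                    # mean level and local trend
s, s_star = np.zeros(k), np.zeros(k)   # harmonic components
x = np.zeros(p)                        # most recent p values of the irregular component
phi = np.array([0.6, 0.2])             # initial TVAR coefficients (hypothetical)

y = np.empty(T)
for t in range(T):
    mu = mu + beta + rng.normal(0, sig_mu)                                      # (1b) mean level
    beta = beta + rng.normal(0, sig_beta)                                       # (1c) local trend
    s_new = np.cos(omega) * s + np.sin(omega) * s_star + rng.normal(0, sig_s, k)    # (1d)
    s_star = -np.sin(omega) * s + np.cos(omega) * s_star + rng.normal(0, sig_s, k)  # (1e)
    s = s_new
    phi = phi + rng.normal(0, sig_phi, p)                                       # (1g) coefficient random walk
    x_t = phi @ x + rng.normal(0, sig_x)                                        # (1f) latent TVAR component
    x = np.concatenate(([x_t], x[:-1]))
    y[t] = mu + s.sum() + x_t + rng.normal(0, sig_eps)                          # (1a) observation
```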

In the case of the NAO, the variance σ_{x,t}² of the innovation process ν_{x,t} is expected to vary systematically with the solar cycle and is modelled as

log σ_{x,t}² = σ_{x,0} + σ_{x,1} cos(2πt/365.25) + σ_{x,2} sin(2πt/365.25). (2)

The other innovation and error variances σ_ε², σ_μ², σ_β², σ_s² and σ_φ² are assumed constant over time. Model (1) is intended to capture gradual changes in the structure of the observed process; therefore, the innovation variances are expected to be small. The innovation and error variances are assumed unknown and must be inferred from the data. Expert judgement about the scale of the innovation variances can be incorporated through appropriate prior probability distributions.

The model defined by (1) is quite general and could be applied to a range of climate, economic and environmental time series. For our NAO index, we set the number of harmonic components to k = 2 (annual and semi-annual cycles) and the order of the latent TVAR process to p = 5, based on exploratory analysis.

3.2 Intervention methods for intermittent coupling

The change in the autocorrelation structure of the NAO index in Fig. 1 appears to involve two distinct states, i.e., coupled or not. We model the change from the uncoupled to the coupled state by introducing an intervention variable I_t, equal to one when t ∈ C and zero otherwise, where C is the set of times at which the observed system is believed to be coupled to the unobserved process (e.g., the days of the winter season). We assume that the timing and duration of the coupling events are constant between events, but not known precisely. We model the intervention by introducing two hyper-parameters representing the start and duration of the coupled period respectively (Fig. 2). In practice, we do not expect an instantaneous change in the behaviour of the system, so we linearly taper the leading and trailing edges of the intervention I_t. In the absence of stronger beliefs, we assume the tapering is symmetric and accounts for a fixed proportion of the duration, as illustrated in Fig. 2.

Figure 2: Example of the form and parametrisation of the intervention I_t. Two parameters represent the start and duration of the coupled period, while a third controls the tapered transition.
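A minimal sketch of the tapered intervention variable of Fig. 2, assuming the tapered proportion is split equally between the leading and trailing edges; the function name, argument names and the wrap-around handling are our illustrative choices.

```python
import numpy as np

def intervention(day_of_year, start, duration, taper=0.2):
    """Tapered intervention I_t in [0, 1]: rises linearly over the leading edge of
    the coupled period, stays at 1, then falls linearly over the trailing edge.
    Assumes the tapered proportion `taper` is split equally between the two edges
    and that the coupled period may wrap around the end of the year."""
    ramp = max(taper * duration / 2.0, 1e-9)      # length of each linear edge
    days = np.asarray(day_of_year, dtype=float)
    elapsed = (days - start) % 365.25             # days since the start of coupling
    I = np.zeros_like(days)
    inside = elapsed < duration
    up = np.clip(elapsed / ramp, 0.0, 1.0)                    # leading edge
    down = np.clip((duration - elapsed) / ramp, 0.0, 1.0)     # trailing edge
    I[inside] = np.minimum(up, down)[inside]
    return I

# example: a coupled period starting around 1 November (day 305) lasting 180 days
I = intervention(np.arange(1, 366), start=305, duration=180, taper=0.2)
```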

We consider two alternative models for the effect of intermittent coupling: first, coupling may lead to a transient change in the mean of the observed process; second, coupling may lead to a transient change in the temporal dependence structure of the observed process. If coupling is believed to induce a change in the mean, then the forecast equation (1a) is modified to include the intervention as follows:

y_t = μ_t + I_t α_t + Σ_{j=1}^{k} s_{j,t} + x_t + ε_t. (3)

The effect α_t is modelled as

α_t = λ_α α_{t−1} + ν_{α,t},  ν_{α,t} ~ N(0, σ_α²). (4)

If coupling is believed to induce a change in the autocorrelation structure, then we instead modify the evolution of the irregular component (1f) as

x_t = Σ_{i=1}^{p} (φ_{i,t} + I_t ψ_{i,t}) x_{t−i} + ν_{x,t}, (5)

and define effects ψ_{1,t}, …, ψ_{p,t}, modelled as

ψ_{i,t} = λ_ψ ψ_{i,t−1} + ν_{ψ,i,t},  ν_{ψ,i,t} ~ N(0, σ_ψ²),  i = 1, …, p, (6)

with common hyper-parameters λ_ψ and σ_ψ².

Most of our prior knowledge about coupling events is likely to be about their timing, and will be expressed through priors on the hyper-parameters controlling the start, duration and tapering of the coupled period. Therefore, it is difficult to justify a complex form for the effects α_t or ψ_{i,t}. However, a variety of behaviour can be captured depending on the values of the coefficients λ_α and λ_ψ and the variances σ_α² and σ_ψ².

As noted in the previous section, the mean, trend, seasonal and autoregressive parameters are expected to vary only slowly. Therefore, we can learn their states outside of the coupled period and identify the coupling effects α_t or ψ_{i,t} (i = 1, …, p) when I_t > 0. The form and parametrisation of the coupling intervention in Fig. 2 reflect our physical intuition about the likely influence of an unobserved process on the NAO. For other applications, different forms might be appropriate, e.g., no tapering, non-symmetric tapering, non-linear tapering, etc. We recommend keeping the intervention bounded between zero and one, in order to make the coupling effect easily interpretable. The only other restriction is that the intervention should be transient, not permanent. Permanent changes can be modelled in the same way, but the effects should be fixed in order to be identifiable, i.e., λ = 1 and σ² = 0.

4 State-space form and prior assessment

The model proposed in Section 3.1 can be written in state space form as

y_t = f(θ_t) + ε_t,   ε_t ~ N(0, σ_ε²),
θ_t = g(θ_{t−1}) + ν_t,   ν_t ~ N(0, W),

for t = 1, …, T, where θ_t = (μ_t, β_t, s_{1,t}, s*_{1,t}, …, s_{k,t}, s*_{k,t}, x_t, …, x_{t−p+1}, φ_{1,t}, …, φ_{p,t})' and ν_t is the corresponding vector of innovations. The forecast function f is given by (1a). The evolution function g is given by (1b–g). The innovation covariance matrix W is diagonal, with the innovation variances of the individual components on the main diagonal. The coupling effect α_t or effects ψ_{i,t} (i = 1, …, p) can be appended to the state vector θ_t. The innovation vector ν_t and covariance matrix W can also be extended to include the coupling effect innovation ν_{α,t} or innovations ν_{ψ,i,t} (i = 1, …, p) and variance σ_α² or σ_ψ² respectively.

The prior distribution p(θ_0) specifies our beliefs about the state variables at time t = 0. We also need to specify priors for the collection of hyper-parameters, which we denote by Ψ.

4.1 Priors for state variables

Independent normal priors were assigned to each component of the state vector at time t = 0. The prior means and variances are listed in Table 1. We were able to use previous studies of the NAO to define informative priors for the mean (Hsu and Wallace, 1976), seasonal components (Chen et al., 2012) and TVAR coefficients (Masala, 2015). The prior on the local trend is based on our judgement that the NAO mean is very unlikely to experience a local change equivalent to more than 1 hPa/yr. The daily NAO in Fig. 1 has a range of approximately 40 hPa. Therefore, the TVAR residuals were assigned independent normal priors with mean 0 hPa and standard deviation 10 hPa, based on a range of four standard deviations.

In Fig. 1, the NAO index has an inter-annual standard deviation of 5–6 hPa during winter. Therefore, in the model with a mean intervention, the coupling effect α was assigned a normal prior with mean 0 hPa and standard deviation 5 hPa. The partial-autocorrelation functions (not shown) for Dec–Mar and Apr–Dec suggest that the change in the autocorrelation structure represented by the coefficients ψ_i is quite small. Therefore, in the model with an intervention on the autocorrelation structure, the coupling effects ψ_i (i = 1, …, p) were assigned normal priors with mean 0 and standard deviation 0.2.

Mean level (μ_0): mean 6 hPa
Local trend (β_0): mean 0 hPa/yr
Annual cycle (s_{1,0}): mean 3.6 hPa
Annual cycle (s*_{1,0}): mean 1.0 hPa
Semi-annual cycle (s_{2,0}): mean 1.3 hPa
Semi-annual cycle (s*_{2,0}): mean 0.7 hPa
Irregular component (x_0): mean 0 hPa
TVAR coefficients (φ_{1,0}, …, φ_{5,0}): means +1.8, −1.3, +0.7, −0.3, +0.1

Table 1: Prior probability distributions for the state variables at time t = 0. All are normally distributed.

4.2 Priors on hyper-parameters

The prior distributions assigned to the observation and innovation variance hyper-parameters are listed in Table 2. In the case of the NAO, the variability in the mean and seasonal components will be driven primarily by solar forcing; therefore we assume equal innovation variances for the mean and seasonal components, i.e., σ_μ² = σ_s². The observation and innovation variances σ_ε², σ_μ², σ_β² and σ_φ² are all expected to be very small, but non-zero. Therefore, boundary-avoiding priors were specified in the form of normal distributions on the log of each variance parameter. Simulation studies of the individual components in (1) were used to assign priors that reflect the range of variability we consider plausible for each component. We expect the annual cycle in the day-to-day variance to peak during the winter season, when changes of several hPa per day may be possible. The priors on the irregular variance hyper-parameters in (2) reflect these beliefs.

Observation variance (σ_ε²)
Mean variance (σ_μ²)
Trend variance (σ_β²)
Irregular variance (σ_{x,0}, σ_{x,1}, σ_{x,2})
Coefficient variance (σ_φ²)

Table 2: Prior densities for hyper-parameters.

Table 3 lists the priors for the intervention timing parameters and the coupling effect parameters. Our beliefs about the timing of the intervention are the same regardless of whether coupling affects the mean or the autocorrelation structure. Vague triangular priors are specified for the beginning and duration of the coupled period. These suggest a coupled period with total length around 180 days, beginning around 1 November. A mildly informative prior is specified for the tapering parameter to reflect our physical reasoning that the influence of the unobserved process is unlikely to be constant throughout the coupled period. The coupling coefficients λ_α and λ_ψ are expected to be positive and close to, but not exceeding, one. The mean coupling effect variance σ_α² is expected to be greater than the mean innovation variance σ_μ², but still small compared to the irregular variance. Similarly, the autocorrelation coupling effect variance σ_ψ² is expected to be greater than the coefficient innovation variance σ_φ².

Coupling start
Coupling length
Tapered proportion
Mean effect variance (σ_α²)
Mean effect coefficient (λ_α)
Autocorrelation effect variance (σ_ψ²)
Autocorrelation effect coefficient (λ_ψ)

Table 3: Prior densities for intervention hyper-parameters.

5 Posterior Inference

We want to evaluate the joint posterior p(θ_{0:T}, Ψ | y_{1:T}) of the model components and the hyper-parameters. If both the forecast function f and the evolution function g were linear, then conditional on Ψ we could sample from the marginal posterior of the state variables p(θ_{0:T} | y_{1:T}, Ψ) using the well known forward-filtering backward-sampling algorithm (Frühwirth-Schnatter, 1994). However, the evolution function defined by (1) and (4) is non-linear due to the combination of φ_{i,t} and x_{t−i} in (1f). Therefore, we propose a linear approximation to the state equation

θ_t ≈ g(m_{t−1}) + G_t (θ_{t−1} − m_{t−1}) + ν_t,

where m_{t−1} = E[θ_{t−1} | y_{1:t−1}] and G_t is the Jacobian of the evolution function g evaluated at m_{t−1}. This linearisation leads to approximate forward-filtering backward-sampling recursions, detailed in Appendix A.
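A sketch of the non-linear TVAR block of the evolution function and its Jacobian, which is what the linearisation above requires; the state ordering (current and lagged irregular values followed by the coefficients) and the function names are our assumptions.

```python
import numpy as np

def tvar_evolution(state, p):
    """Deterministic part of the evolution of the TVAR block of the state vector.
    state = (x_t, ..., x_{t-p+1}, phi_1, ..., phi_p); innovations are omitted."""
    x, phi = state[:p], state[p:]
    x_new = phi @ x                                    # (1f) without the innovation
    return np.concatenate(([x_new], x[:-1], phi))      # shift lags; coefficients persist

def tvar_jacobian(state, p):
    """Jacobian G_t of the evolution above, evaluated at the filtered mean, for use
    in place of the exact (non-linear) evolution in the filtering recursions."""
    x, phi = state[:p], state[p:]
    G = np.zeros((2 * p, 2 * p))
    G[0, :p] = phi                       # d x_new / d x_{t-i}   = phi_i
    G[0, p:] = x                         # d x_new / d phi_i     = x_{t-i}
    G[1:p, 0:p - 1] = np.eye(p - 1)      # lagged x values shift down one place
    G[p:, p:] = np.eye(p)                # coefficients follow a random walk
    return G
```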

The marginal posterior of the hyper-parameters is proportional to

p(Ψ | y_{1:T}) ∝ p(y_{1:T} | Ψ) p(Ψ).

The marginal likelihood can be decomposed as

p(y_{1:T} | Ψ) = Π_{t=1}^{T} p(y_t | y_{1:t−1}, Ψ).

The forward-filtering recursions in Appendix A include an expression for the one-step ahead forecast distribution p(y_t | y_{1:t−1}, Ψ), so the marginal likelihood can be evaluated analytically. Therefore, the joint posterior can be sampled efficiently by combining forward-filtering backward-sampling with a Metropolis-Hastings scheme targeting p(Ψ | y_{1:T}), as follows:

  • Let m denote a sample index; at m = 0:

    • Sample starting values Ψ^(0) from the prior p(Ψ);

    • Sample θ_{0:T}^(0) from p(θ_{0:T} | y_{1:T}, Ψ^(0)) by backward-sampling.

  • For m = 1, …, M:

    • Sample new values Ψ* from the proposal q(Ψ* | Ψ^(m−1));

    • Accept Ψ^(m) = Ψ* with probability min{1, [p(y_{1:T} | Ψ*) p(Ψ*) q(Ψ^(m−1) | Ψ*)] / [p(y_{1:T} | Ψ^(m−1)) p(Ψ^(m−1)) q(Ψ* | Ψ^(m−1))]}; otherwise set Ψ^(m) = Ψ^(m−1);

    • Sample θ_{0:T}^(m) from p(θ_{0:T} | y_{1:T}, Ψ^(m)) by backward-sampling.
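A sketch of the sampling scheme above. The functions ffbs_loglik and ffbs_sample stand for the forward-filtering and backward-sampling recursions of Appendix A and are assumed to be implemented separately; the symmetric random-walk proposal on the hyper-parameters is our illustrative choice, not necessarily the proposal used in the paper.

```python
import numpy as np

def metropolis_ffbs(y, log_prior, ffbs_loglik, ffbs_sample, psi0,
                    n_iter=1000, step=0.1, seed=0):
    """Metropolis-Hastings on the hyper-parameters psi, with state trajectories drawn
    by forward-filtering backward-sampling (FFBS) conditional on each sampled psi.
    ffbs_loglik(y, psi) returns log p(y | psi) accumulated from the one-step forecasts;
    ffbs_sample(y, psi) returns one joint draw of the state trajectories."""
    rng = np.random.default_rng(seed)
    psi = np.asarray(psi0, dtype=float)
    logpost = log_prior(psi) + ffbs_loglik(y, psi)
    psi_draws, state_draws = [], []
    for _ in range(n_iter):
        prop = psi + step * rng.normal(size=psi.shape)        # symmetric random-walk proposal
        logpost_prop = log_prior(prop) + ffbs_loglik(y, prop)
        if np.log(rng.uniform()) < logpost_prop - logpost:    # accept / reject
            psi, logpost = prop, logpost_prop
        psi_draws.append(psi.copy())
        state_draws.append(ffbs_sample(y, psi))               # backward-sample states given psi
    return np.array(psi_draws), state_draws
```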

Straightforward Gibbs sampling is possible for normal state-space models with unknown variance parameters, provided suitable conjugate priors are specified (West and Harrison, 1997, Chapter 15). However, we do not expect the data to be very informative for the observation and evolution variances. As a result, the mixing rate of the full conditional sampler is likely to be very poor due to the large number of observations. Therefore, we prefer the Metropolis-Hastings scheme outlined above, in combination with the boundary-avoiding priors specified in the previous section.

Sequential Monte Carlo methods provide tools for inference in general non-linear and non-normal state-space models. However, sequential Monte Carlo methods are computationally expensive, and perform poorly for long time series such as the NAO index in Sec. 2 (Doucet and Johansen, 2011). The problem of efficient inference for unknown hyper-parameters also remains an important topic for research in sequential Monte Carlo methods (Chopin et al., 2013). In its most general form, the approximation proposed above is known as the Extended Kalman Filter (Anderson and Moore, 1979). Approximations of this type are frequently used as the basis for proposal distributions in sequential Monte Carlo schemes (Doucet et al., 2000). In general, the coefficient innovation variance will be small, the coefficients φ_{i,t} will be only weakly correlated with the other state variables, and our uncertainty about the coefficients will decrease rapidly over time. Since the other components of the evolution function are linear, and the observation errors and joint state innovations are normal, forward-filtering and backward-sampling based on the linear approximation is expected to be very accurate.

5.1 Model selection

For some applications, it will be possible to choose between the mean and autocorrelation intervention models on the basis of posterior predictive diagnostics, i.e., can the model reproduce the observed behaviour? The posterior distributions of the hyper-parameters can also be useful for choosing between models, e.g., is the coupling effect variance negligible? More formally, we can compare the two intervention models by evaluating the Bayes' factor

B = p(y_{1:T} | M_1) / p(y_{1:T} | M_2), (7)

where M_1 is the model including an intervention on the mean, and M_2 is the model including an intervention on the temporal dependence structure. The Bayes' factor is defined as the ratio of the marginal likelihoods of the competing models (Kass and Raftery, 1995). Values of B greater than one indicate support for the mean intervention model M_1, and values of B less than one indicate support for the autocorrelation intervention model M_2. The conditional likelihoods p(y_{1:T} | Ψ, M) can be evaluated using the filtering recursions in Appendix A. The marginal likelihoods p(y_{1:T} | M) can be evaluated from the posterior samples Ψ^(m) (m = 1, …, M) by bridge sampling (Gronau et al., 2017).
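As an illustration of how the quantities in (7) can be computed, the sketch below accumulates the conditional log-likelihood from the Gaussian one-step forecast moments produced by the forward filter, and approximates a log marginal likelihood by a simple Monte Carlo average over prior draws. The prior-sampling estimator is a crude stand-in used only for illustration; the paper uses the bridge sampling estimator of Gronau et al. (2017), and the function and argument names are ours.

```python
import numpy as np
from scipy.stats import norm

def log_likelihood(y, forecast_means, forecast_vars):
    """log p(y_{1:T} | psi) = sum_t log N(y_t; f_t, Q_t), using the one-step
    forecast moments (f_t, Q_t) returned by the forward-filtering recursions."""
    return norm.logpdf(y, loc=forecast_means, scale=np.sqrt(forecast_vars)).sum()

def log_marginal_likelihood(y, filter_fn, prior_draws):
    """Crude Monte Carlo estimate of log p(y | M), averaging the likelihood over
    draws from the prior; filter_fn(y, psi) must return (forecast_means, forecast_vars)."""
    logliks = np.array([log_likelihood(y, *filter_fn(y, psi)) for psi in prior_draws])
    return np.logaddexp.reduce(logliks) - np.log(len(logliks))

# Bayes' factor B = p(y | M1) / p(y | M2), computed on the log scale:
# log_B = (log_marginal_likelihood(y, filter_mean_model, prior_draws_m1)
#          - log_marginal_likelihood(y, filter_autocorr_model, prior_draws_m2))
```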

5.2 What is the effect of the coupling?

Given posterior samples of the state trajectories θ_{0:T}^(m) and hyper-parameters Ψ^(m), we can make inferences about any function of the state variables for any time period of interest. It is useful to define z_t = μ_t + Σ_{j=1}^{k} s_{j,t}, the sum of the mean and seasonal components, which we refer to as the systematic component of the observed process. The relative contributions of the systematic component z_t, the irregular component x_t, the coupling effects α_t or ψ_{i,t} (i = 1, …, p) and observation error ε_t to the variability between coupled periods are of particular interest. The means of the systematic and irregular components during period P_i in the mth sample are

z̄_i^(m) = (1/|P_i|) Σ_{t ∈ P_i} z_t^(m)   and   x̄_i^(m) = (1/|P_i|) Σ_{t ∈ P_i} x_t^(m), (8)

where |P_i| is the number of time steps in P_i. The means of the coupling effects during period P_i in the mean and autocorrelation models respectively are

ᾱ_i^(m) = (1/|P_i|) Σ_{t ∈ P_i} I_t α_t^(m)   and   ψ̄_i^(m) = (1/|P_i|) Σ_{t ∈ P_i} Σ_{j=1}^{p} I_t ψ_{j,t}^(m) x_{t−j}^(m). (9)

The contribution due to observation error is

ε̄_i^(m) = (1/|P_i|) Σ_{t ∈ P_i} ε_t^(m),

where ε_t^(m) = y_t − f(θ_t^(m)). The prior expectations of the irregular component and the coupling effects α_t or ψ_{i,t} (i = 1, …, p) during any period are zero by (1f), (4) and (6), i.e., E[x̄_i] = 0 and E[ᾱ_i] = E[ψ̄_i] = 0 for all i. In general E[z̄_i] ≠ 0, so for the systematic component it is more useful to consider the anomalies

z̃_i^(m) = z̄_i^(m) − (1/n) Σ_{i′=1}^{n} z̄_{i′}^(m)

over all n similar periods, e.g., all winters in the study period. The sample means

(1/M) Σ_{m=1}^{M} z̃_i^(m),   (1/M) Σ_{m=1}^{M} x̄_i^(m),   (1/M) Σ_{m=1}^{M} ᾱ_i^(m)   and   (1/M) Σ_{m=1}^{M} ε̄_i^(m) (10)

provide a summary of the posterior expected contribution of each component during the period P_i. Quantiles can also be computed over the samples to form credible intervals for the contribution of each component.

5.3 Analysis of variance

In a stationary model, elements of the marginal posterior of the hyper-parameters would summarise the relative contributions of each model component to the observed variability in the index. However, since our model is non-stationary, we require an alternative summary of the variance components. In particular, we are interested in the proportion of the inter-annual variance of the winter (Dec-Jan-Feb) mean of the NAO index explained by each component. Let P_i be the ith winter period. We propose performing an analysis of variance of the observed winter means for each sample m, using the component means z̃_i^(m), x̄_i^(m) and ᾱ_i^(m) defined in (8) and (9) as explanatory variables. The analysis of variance leads to four sums-of-squares for each sample m, corresponding to the sums of squared deviations explained by the systematic and irregular components, the coupling effects α_t or ψ_{i,t}, and the observation errors in each sampled trajectory. Posterior summaries over the samples then quantify the overall contributions of each component to the variability between coupled periods, as illustrated in the sketch below.
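A sketch of the analysis of variance described above, assuming the per-winter component means (8)–(9) have already been computed for each posterior sample. The covariance-based allocation of sums-of-squares used here is one natural way to split variance between components that sum exactly to the observed means; the paper's exact ANOVA construction may differ, and all names are ours.

```python
import numpy as np

def variance_decomposition(component_means, observed_means):
    """Split the inter-annual variance of the observed winter means between additive
    components (systematic, irregular, coupling, error) for one posterior sample.
    component_means: dict mapping component name -> array of per-winter means.
    Uses Var(y) = sum_c Cov(c, y), valid when the components sum to the observations."""
    y = observed_means - observed_means.mean()
    total = np.sum(y ** 2)
    return {name: np.sum((c - c.mean()) * y) / total
            for name, c in component_means.items()}

def posterior_anova(samples, observed_means):
    """Apply the decomposition to every posterior sample and summarise.
    samples: list of dicts of per-winter component means, one dict per sample."""
    fractions = {name: [] for name in samples[0]}
    for comp in samples:
        for name, frac in variance_decomposition(comp, observed_means).items():
            fractions[name].append(frac)
    return {name: (np.mean(v), np.percentile(v, [5, 95]))   # posterior mean and 90% interval
            for name, v in fractions.items()}
```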

5.4 Can we make predictions using unobserved components?

Knowledge of the unobserved component through the intervention effect should provide useful predictability within coupled periods. The model proposed in Sec. 3.2 also allows for dependence between successive coupled periods, so knowledge of the unobserved component during one coupled period may also be useful for predicting the next. The h-step ahead forecast distribution p(y_{t+1:t+h} | y_{1:t}) given data up to time t can be sampled exactly using the recursions in Appendix A. The correlation between the observations and the forecast means provides a simple measure of forecast performance.
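A minimal sketch of the skill measure, assuming arrays of forecast means and verifying observed means over the remainder of the coupled period, one entry per year; names are ours.

```python
import numpy as np

def forecast_skill(forecast_means, observed_means):
    """Correlation across years between forecast means and verifying observations,
    used as a simple measure of forecast performance for a given start date."""
    return np.corrcoef(forecast_means, observed_means)[0, 1]

# skill as a function of initialisation date (one array of forecast means per start date):
# skill_by_start = [forecast_skill(f, observed_means) for f in forecast_means_by_start]
```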

6 Results

The Metropolis-Hastings sampler outlined in Sec. 5 was used to draw 1000 samples from the joint posterior of the state variables and hyper-parameters under each of the two intervention models. Full details of the sampling design, proposal distributions, diagnostic trace plots and posterior density plots are given in the supplementary material. Both models converge to stable distributions and mix efficiently; however, the burn-in period can be very long depending on the initial values of the hyper-parameters.

Despite deliberately vague prior distributions, the posterior distributions of the intervention timing parameters are quite sharp for both models. Figure 3 visualises the posterior distribution of the intervention for each model. In the mean intervention model, an unobserved component acts strongly on the NAO between December and February and into March. There is almost no evidence of coupling between May and October. In the autocorrelation intervention model the situation is reversed. This inverted intervention structure is unexpected, but still consistent with a marked difference in behaviour between the winter and extended summer seasons.

Figure 3: Posterior of the intervention I_t. (left) The model with an intervention on the mean; (right) the model with an intervention on the autocorrelation structure. Grey lines represent a random sample of 10 realisations of the intervention based on the posterior samples of the start, duration and tapering parameters. The black line is the pointwise posterior mean over all 1000 realisations.

Posterior predictive diagnostics were used to check the performance of each model in capturing the observed structure of the NAO. In particular, we are interested in whether each model can reproduce the seasonal contrast in the inter-annual variance and autocorrelation structures in Fig. 1. For each sample from each model we simulate a new sequence of states and observations over the full observation period. Figure 4 compares the annual cycle in the inter-annual standard deviation and the seasonal autocorrelation functions of the observed data with those of the posterior predictive samples. The mean intervention model is able to reproduce both the inter-annual variability and the seasonal autocorrelation function. There is a clear difference between the autocorrelation functions simulated between April and November and between December and March. However, the autocorrelation intervention model is unable to reproduce the seasonal autocorrelation behaviour, and does not reproduce the inter-annual variability as well as the mean intervention model. There is a small difference between the summer and winter autocorrelation functions, but much less than in the observed autocorrelation. The inverted intervention structure in Fig. 3 is an attempt to exploit the extended summer period to distinguish the small intervention effects ψ_{i,t}. Similar checks (not shown) suggest that both models are able to adequately capture the annual cycle in the NAO, indicating that our choice of harmonics was reasonable.

Figure 4: Posterior predictive checks. (top) The model with an intervention on the mean ; (bottom) the model with an intervention on the autocorrelation structure . Before computing the autocorrelation, the mean, a linear trend, annual and semi-annual cycles were estimated by least-squares and removed. Black lines represent the observed statistics. Dark grey lines indicate the posterior mean. Shading indicates pointwise 90% posterior credible intervals. Dark grey shading in (bottom right) indicates overlap between credible intervals.

The posterior predictive checks strongly favour the mean intervention model over the autocorrelation intervention model: the mean intervention is able to reproduce the observed behaviour, while the autocorrelation intervention cannot. The Bayes' factor also provides extremely strong evidence in favour of the mean intervention model, i.e., the observed data are more than 1000 times more likely to have arisen from the mean intervention model. We conclude that the most likely explanation for the observed behaviour of the NAO index is a transient change in the mean level during the winter season. The remainder of our analysis focuses on interpreting only the mean intervention model.

Surprisingly for such a complex phenomenon, the mean, trend and seasonal components of the NAO index show very limited evidence of non-stationarity. Figure 5 shows a number of posterior trajectories for each component. There is evidence of a fairly constant trend leading to a reduction in the mean level of the NAO of around 0.8 hPa between 1979 and 2017. The posterior distribution of the trend itself suggests that the rate of decrease in the NAO mean peaked around 1993–94 at around 0.03 hPa/yr (−0.07, +0.01), since when the trend has gradually weakened. The amplitudes of the annual and semi-annual cycles are almost constant (likewise the phases). The 0.95 quantile of the posterior distribution of the mean evolution standard deviation is 0.005 hPa, so changes in excess of 0.2 hPa/yr to the mean and seasonal components are not ruled out under the random walk hypothesis. There is no evidence of non-stationarity in the autoregressive coefficients, which are effectively constant throughout the study period. This suggests that the day-to-day variation in the NAO can be adequately represented by an AR process rather than a TVAR process. Nevertheless, this is a useful conclusion given the observed seasonal autocorrelation structure in Fig. 1.

Figure 5: Posterior distributions of model components. (top left) Mean; (top right) trend; (bottom left) amplitudes of the seasonal harmonics; (bottom right) TVAR coefficients. Solid black lines represent the pointwise posterior mean. Dashed black lines represent pointwise 90% credible intervals. Grey lines are a random sample of 10 posterior trajectories.

Quantifying the effect of coupling

The posterior mean estimate of the intervention effect standard deviation is 0.42 (0.32–0.52), indicating a very active process that contributes substantial additional variability during the winter season. Table 4 contains the results of the analysis of variance proposed in Sec. 5.3 for the mean intervention model. The effect of coupling explains around 66% of the observed variation in the winter (Dec-Jan-Feb) means. Accumulated short-term variability captured by the TVAR residuals explains around 33% of the inter-annual variability. Despite the trend visible in Fig. 5, the contribution of the mean and seasonal components is negligible: together they account for a maximum of 6% of the inter-annual variability in winter. In contrast, the mean and seasonal components account for around 15% of inter-annual variability in summer (Jun-Jul-Aug), when coupling has no effect and the day-to-day variability is reduced. The contribution of measurement error is negligible.

Season   Mean               Coupling           Irregular          Error
Winter   0.00 (0.00, 0.04)  0.66 (0.52, 0.77)  0.33 (0.22, 0.46)  0.00 (0.00, 0.00)
Summer   0.15 (0.06, 0.21)  0.00 (0.00, 0.00)  0.85 (0.79, 0.94)  0.00 (0.00, 0.00)

Table 4: Analysis of variance. Bracketed values indicate 90% credible intervals.

Fig. 6 shows the posterior mean contribution of each component to each observed winter (Dec-Jan-Feb) mean level. This is an important and useful advance over existing methods in climate science that only estimate the fraction of total variance explained by each component. The weak negative trend in the mean component is clearly visible. Both the absolute and relative contributions of the irregular component and the coupling effect vary between years, but both components usually have the same sign. This is a product of the limited data available to estimate the components each winter. If the coupling signal cannot be clearly identified during a particular season, then the contribution to the seasonal mean will be split approximately according to the analysis of variance in Tab. 4 and the two components will have the same sign. The fact that the relative contribution of each component varies widely in Fig. 6 indicates that the model is able to separate the coupling effect from the noise of the irregular component.

Figure 6: Contribution of individual model components. Posterior mean estimates of the winter (Dec-Jan-Feb) mean levels of the mean component, the irregular component, the coupling effect, and the observation error.

Forecasting the winter NAO

The posterior mean estimate of the coupling effect coefficient is 0.995 (90% credible interval 0.991–0.997). In terms of inter-annual variability, this is equivalent to a correlation of around 0.19 (0.05–0.37) between Dec-Jan-Feb means, suggesting limited evidence of persistence and therefore predictability between seasons. However, if we can learn about the coupling effect quickly enough during a specific coupled period, then we can use that knowledge to provide more skilful forecasts for the rest of the period. Figure 3 suggests that the system is at least partially coupled from the beginning of November until around the middle of April. Using the forecasting recursions in Appendix A, we obtained forecasts beginning each day from 1 November to 1 February until the end of the fully coupled period on 28 February for every winter between 1987 and 2016. Figure 7 shows the correlation between the forecast and observed means. By the beginning of December, the correlation approaches 0.5 for the 92 day forecast of the mean NAO to 28 February. This correlation approaches that achieved by computationally expensive numerical weather prediction models (Scaife et al., 2014; Siegert et al., 2016). The correlation increases slightly as more observations are assimilated during December. However, as more observations are assimilated, the forecast period decreases and we are essentially predicting weather noise, so the correlation does not increase further.

Figure 7: Predictability of the winter NAO. (left) The correlation between the observations and forecasts initialised on each day between November and February, for the mean level over the remainder of the period to 28 Feb; (right) observations (crosses) and forecasts for the mean NAO between 1 Dec and 28 Feb each year, given data up to 30 Nov.

Figure 7 also compares forecasts of the 92 day Dec-Jan-Feb winter mean, initialised on 1 December each year, with the observed mean NAO index for the same periods. The model predicts the 2010, 2011 and 2012 winter seasons with remarkable accuracy, and captures the general pattern during the 1990s. However, it fails to predict the extreme winter of 2009–10. Figure 8 plots deseasonalised observations for winter 2009–10. Deseasonalising the observations leaves only the contributions from the irregular component x_t and the coupling effect α_t, which represent processes on different time scales. The irregular component captures high frequency fluctuations, while the coupling effect captures any overall departure from the seasonal mean. From the middle of December onwards, the mean of the deseasonalised data is clearly negative, which the model attributes to the coupling effect α_t. Since the seasonal forecasts in Fig. 7 were based on information up to 30 November, it is unsurprising that a fairly normal winter was forecast. In contrast, in winter 2010–11 (Fig. 8), a strong negative signal is visible in November, which the model is able to exploit to skilfully forecast the remainder of the Dec-Jan-Feb season. Winter 2010–11 also illustrates the time-varying nature of the coupling effect, which starts strongly negative early in the season but weakens from mid-January onwards.

Figure 8: Deseasonalised observations for the winters of (left) 2009–10 and (right) 2010–11. Thin grey lines are a random sample of 10 posterior trajectories of the coupling effect. Thick grey and dashed grey lines represent the posterior mean and pointwise 90% credible interval for the coupling effect.

7 Discussion

In this study we have developed Bayesian state space methods for diagnosing predictability in intermittently coupled systems. Coupling is represented by a transient intervention whose timing and duration are inferred from the data. Interventions to either the mean or the temporal dependence structure are considered. The effect of intermittent coupling is modelled as a dynamic process rather than a sequence of constant and independent effects. Latent TVAR components are proposed to capture any inherent non-stationarity in the temporal dependence structure. A linearised approximation is proposed that allows efficient forward-filtering and backward-sampling for models containing latent TVAR components, without requiring complicated and computationally expensive sequential Monte Carlo methods. In addition, we develop tools for posterior inference in intermittently coupled systems, including evaluating the evidence for a coupling effect, attributing historical variation in the system, and demonstrating potential predictability.

We applied the proposed model and inference methods to diagnose excess winter-time variability in the North Atlantic Oscillation. Existing methods in climate science are unable to distinguish between transient changes in the mean or temporal dependence structure. The proposed model strongly points to transient changes in the mean level of the NAO during a period beginning sometime in November and ending around the middle of April. This is an important conclusion given that the excess winter-time variability in the NAO is usually characterised by increased temporal dependence. The mean level of the NAO also appears to change on decadal time scales, in addition to a fairly stable annual cycle and the transient changes in winter-time. The proposed model is also able to separate the coupling effect from accumulated day-to-day variability in individual seasons. For the NAO, the two effects actually oppose each other in some seasons.

Like latent AR components, latent TVAR components improve the interpretability of structural time series models by avoiding the need to redefine the mean level of the observed process. In addition, latent TVAR components permit efficient recursive estimation of the autoregressive parameters and include standard latent AR components as a special case when the coefficient innovation variance is zero. For the NAO, we found little evidence of changes in the autoregressive structure throughout the study period, so a standard latent AR component could be used to represent day-to-day variability. However, the fact that we can confirm that the autoregressive structure is constant on decadal time scales is also a useful conclusion.

The model proposed for intermittently coupled systems differs from standard intervention analysis by modelling the effect of repeated coupling events as a dynamic process, rather than a series of independent events. This allows knowledge gained during one coupled period to inform inferences for the next. By modelling the coupling effect as a dynamic process, the effect is also able to vary within individual coupled periods rather than being assumed constant. Climate scientists usually assume that any coupling effect is constant throughout an arbitrarily defined season. We have shown that the coupling effect on the NAO can vary substantially throughout a single season.

Modelling the effect of coupling as a dynamic process also makes the model robust to minor variations in the coupled period. However, the specification of a fixed coupling period remains a limitation. Hidden Markov and semi-Markov models are widely used in similar seasonal state-switching scenarios to allow for changes in onset and duration (e.g., Carey-Smith et al., 2014). Standard hidden Markov models assume instantaneous switching between states. While such an assumption may be acceptable for some applications, we do not consider it plausible for the NAO. A completely general alternative would be a reversible-jump MCMC scheme (Green, 1995). In such a scheme, coupling events could be estimated with varying onset, duration or other parametrised structural changes. However, unless the timing of coupling events varies dramatically, the additional cost and complexity of a reversible-jump scheme seems unnecessary. The on-line Bayesian changepoint methods proposed by Fearnhead and Liu (2011) might provide a more efficient approach.

In the methodology developed here, we have allowed for non-stationarity in the mean and the temporal dependence structure, but not in the variance. Stochastic volatility models and related ARCH and GARCH models have been widely studied and applied, particularly in economics. Masala (2015) applied a GARCH model to stochastic modelling of the NAO, but found that its performance was poor. Efficient filtering and smoothing is possible for time-varying observation error variance (West and Harrison, 1997, Chapter 10.8). However, fully conjugate models that admit analytic filtering and smoothing for time-varying state innovation variances are not possible, even in the linear normal case. Of particular interest are changes in the residual TVAR innovation variance that drives short-term variability in the system. Sequential Monte Carlo methods or further approximations are required to model time-varying innovation variances.

The authors gratefully acknowledge the support of the Natural Environment Research Council grant NE/M006123/1. We also wish to thank two anonymous reviewers and the associate editor for their helpful comments.

Appendix A Forward-filtering, backward-sampling and forecasting

Forward-filtering

The sequence of posterior distributions p(θ_t | y_{1:t}, Ψ) can be approximated as follows.

Prior to observing y_t, the predictive distributions at time t are

θ_t | y_{1:t−1} ~ N(a_t, R_t)   and   y_t | y_{1:t−1} ~ N(f_t, Q_t),

with

a_t = g(m_{t−1}),   R_t = G_t C_{t−1} G_t' + W,   f_t = F_t' a_t,   Q_t = F_t' R_t F_t + σ_ε²,

where G_t is the Jacobian of the evolution function g evaluated at m_{t−1} and F_t is the observation vector implied by the forecast function.

After observing y_t, the posterior distribution of the state vector at time t is

θ_t | y_{1:t} ~ N(m_t, C_t),

with

m_t = a_t + A_t e_t,   C_t = R_t − A_t A_t' Q_t,

where A_t = R_t F_t / Q_t and e_t = y_t − f_t.
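A sketch of a single forward-filtering step under the linearisation of Sec. 5, with a known observation vector F and an evolution function g with Jacobian jac_g; the notation (m, C, a, R, f, Q, A) follows the recursions above and the function signature is our assumption.

```python
import numpy as np

def filter_step(y_t, m, C, g, jac_g, W, F, v_obs):
    """One step of the approximate (extended) Kalman filter recursions.
    m, C     : filtered mean and covariance of the state at time t-1
    g, jac_g : evolution function and its Jacobian, evaluated at m
    W        : state innovation covariance;  F : observation vector;  v_obs : obs. variance
    Returns the predictive moments, the one-step forecast moments and the filtered moments."""
    G = jac_g(m)
    a = g(m)                           # predictive state mean       a_t = g(m_{t-1})
    R = G @ C @ G.T + W                # predictive state covariance R_t
    f = F @ a                          # one-step forecast mean      f_t
    Q = F @ R @ F + v_obs              # one-step forecast variance  Q_t
    A = R @ F / Q                      # adaptive (Kalman) gain      A_t
    m_new = a + A * (y_t - f)          # filtered mean               m_t
    C_new = R - np.outer(A, A) * Q     # filtered covariance         C_t
    return a, R, f, Q, m_new, C_new
```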

Backward-sampling

The joint posterior p(θ_{0:T} | y_{1:T}, Ψ) can be sampled recursively as follows:

  • Sample θ_T from N(m_T, C_T);

  • for t = T − 1, …, 0:

    • Sample θ_t from N(h_t, H_t),

where

h_t = m_t + B_t (θ_{t+1} − a_{t+1}),   H_t = C_t − B_t R_{t+1} B_t',

and B_t = C_t G_{t+1}' R_{t+1}^{−1}. The quantities m_t, C_t, a_{t+1}, R_{t+1} and G_{t+1} are obtained from the filtering recursions.

Forecasting

The h-step ahead forecast distribution p(y_{t+1:t+h} | y_{1:t}, Ψ) given data up to time t can be sampled sequentially as follows:

  • Sample θ_t from N(m_t, C_t);

  • for j = 1, …, h:

    • Sample θ_{t+j} from N(g(θ_{t+j−1}), W);

    • Sample y_{t+j} from N(F_{t+j}' θ_{t+j}, σ_ε²).

References

  • Altizer et al. (2011) Altizer, S., Bartel, R. and Han, B. A. (2011) Animal migration and infectious disease risk. Science, 331, 296–302.
  • Anderson and Moore (1979) Anderson, B. D. O. and Moore, J. B. (1979) Optimal Filtering. Prentice-Hall.
  • Bourassa et al. (2013) Bourassa, M. A., Gille, S. T., Bitz, C., Carlson, D., Cerovecki, I., Clayson, C. A., Cronin, M. F., Drennan, W. M., Fairall, C. W., Hoffman, R. N., Magnusdottir, G., Pinker, R. T., Renfrew, I. A., Serreze, M. C., Speer, K., Talley, L. D. and Wick, G. A. (2013) High-latitude ocean and sea ice surface fluxes: Challenges for climate research. Bulletin of the American Meteorological Society, 94, 403–423.
  • Box and Tiao (1975) Box, G. E. P. and Tiao, G. C. (1975) Intervention Analysis with Applications to Economic and Environmental Problems. Journal of the American Statistical Association, 70, 70–79.
  • Carey-Smith et al. (2014) Carey-Smith, T., Sansom, J. and Thomson, P. (2014) A hidden seasonal switching model for multisite daily rainfall. Water Resources Research, 50, 257–272.
  • Chapin III et al. (2010) Chapin III, F. S., Sturm, M., Serreze, M. C., McFadden, J. P., Key, J. R., Lloyd, A. H., McGuire, A. D., Rupp, T. S., Lynch, A. H., Schimel, J. P., Beringer, J., Chapman, W. L., Epstein, H. E., Euskirchen, E. S., Hinzman, L. D., Jia, G., Ping, C.-L., Tape, K. D., Thompson, C. D. C., Walker, D. A. and Welker, J. M. (2010) Role of Land-Surface Changes in Arctic Summer Warming. Science, 657, 9–13.
  • Chen et al. (2012) Chen, G., Qian, C. and Zhang, C. (2012) New Insights into Annual and Semiannual Cycles of Sea Level Pressure. Monthly Weather Review, 140, 1347–1355.
  • Chopin et al. (2013) Chopin, N., Jacob, P. E. and Papaspiliopoulos, O. (2013) SMC²: an efficient algorithm for sequential analysis of state space models. Journal of the Royal Statistical Society: Series B (Statistical Methodology), 75, 397–426.
  • Croston (1972) Croston, J. D. (1972) Forecasting and Stock Control for Intermittent Demands. Operational Research Quarterly, 23, 289–303.
  • Dee et al. (2011) Dee, D. P., Uppala, S. M., Simmons, A. J., Berrisford, P., Poli, P., Kobayashi, S., Andrae, U., Balmaseda, M. A., Balsamo, G., Bauer, P., Bechtold, P., Beljaars, A. C. M., van de Berg, L., Bidlot, J., Bormann, N., Delsol, C., Dragani, R., Fuentes, M., Geer, A. J., Haimberger, L., Healy, S. B., Hersbach, H., Hólm, E. V., Isaksen, L., Kållberg, P., Köhler, M., Matricardi, M., McNally, A. P., Monge-Sanz, B. M., Morcrette, J. J., Park, B. K., Peubey, C., de Rosnay, P., Tavolato, C., Thépaut, J. N. and Vitart, F. (2011) The ERA-Interim reanalysis: Configuration and performance of the data assimilation system. Quarterly Journal of the Royal Meteorological Society, 137, 553–597.
  • Doucet et al. (2000) Doucet, A., Godsill, S. J. and Andrieu, C. (2000) On sequential Monte Carlo sampling methods for Bayesian filtering. Statistics and Computing, 10, 197–208.
  • Doucet and Johansen (2011) Doucet, A. and Johansen, A. M. (2011) A Tutorial on Particle Filtering and Smoothing: Fifteen years later. In The Oxford Handbook of Nonlinear Filtering (eds. D. Crisan and B. Rozovskii), chap. 8.2, 656–705. Oxford University Press.
  • Durbin and Koopman (2012) Durbin, J. and Koopman, S. J. (2012) Time Series Analysis by State Space Methods. Oxford University Press, 2nd edn.
  • Eade et al. (2014) Eade, R., Smith, D., Scaife, A., Wallace, E., Dunstone, N., Hermanson, L. and Robinson, N. (2014) Do seasonal-to-decadal climate predictions underestimate the predictability of the real world? Geophysical Research Letters, 41, 5620–5628.
  • Fearnhead and Liu (2011) Fearnhead, P. and Liu, Z. (2011) Efficient Bayesian analysis of multiple changepoint models with dependence across segments. Statistics and Computing, 21, 217–229.
  • Franzke and Woollings (2011) Franzke, C. and Woollings, T. (2011) On the persistence and predictability properties of north atlantic climate variability. Journal of Climate, 24, 466–472.
  • Frühwirth-Schnatter (1994) Frühwirth-Schnatter, S. (1994) Data Augmentation and Dynamic Linear Models. Journal of Time Series Analysis, 15, 183–202.
  • Green (1995) Green, P. J. (1995) Reversible jump Markov chain Monte Carlo computation and Bayesian model determination. Biometrika, 82, 711–732.
  • Gronau et al. (2017) Gronau, Q. F., Sarafoglou, A., Matzke, D., Ly, A., Boehm, U., Marsman, M., Leslie, D. S., Forster, J. J., Wagenmakers, E. J. and Steingroever, H. (2017) A tutorial on bridge sampling. Journal of Mathematical Psychology, 81, 80–97.
  • Harvey (1989) Harvey, A. C. (1989) Forecasting, structural time series models and the Kalman filter. Cambridge University Press.
  • Harvey and Durbin (1986) Harvey, A. C. and Durbin, J. (1986) The Effects of Seat Belt Legislation on British Road Casualties : A Case Study in Structural Time Series Modelling. Journal of the Royal Statistical Society: Series A (General), 149, 187–227.
  • Hsu and Wallace (1976) Hsu, C.-P. F. and Wallace, J. M. (1976) The Global Distribution of the Annual and Semiannual Cycles in Sea Level Pressure. Monthly Weather Review, 104, 1597–1601.
  • Hurrell (1995) Hurrell, J. W. (1995) Decadal Trends in the North Atlantic Oscillation: Regional Temperatures and Precipitation. Science, 269, 676–679.
  • Kass and Raftery (1995) Kass, R. E. and Raftery, A. E. (1995) Bayes Factors. Journal of the American Statistical Association, 90, 773–795.
  • Keeley et al. (2009) Keeley, S. P. E., Sutton, R. T. and Shaffrey, L. C. (2009) Does the North Atlantic Oscillation show unusual persistence on intraseasonal timescales? Geophysical Research Letters, 36, L22706.
  • Kitagawa and Gersch (1985) Kitagawa, G. and Gersch, W. (1985) A Smoothness Priors Time-Varying AR Coefficient Modeling of Nonstationary Covariance Time Series. IEEE Transactions on Automatic Control, 30, 48–56.
  • Kushnir et al. (2006) Kushnir, Y., Robinson, W. A., Chang, P. and Robertson, A. W. (2006) The Physical Basis for Predicting Atlantic Sector Seasonal-to-Interannual Climate Variability. Journal of Climate, 19, 5949–5970.
  • Masala (2015) Masala, G. (2015) North Atlantic Oscillation index stochastic modelling. International Journal of Climatology, 35, 3624–3632.
  • Mosedale et al. (2006) Mosedale, T. J., Stephenson, D. B., Collins, M. and Mills, T. C. (2006) Granger causality of coupled climate processes: Ocean feedback on the North Atlantic Oscillation. Journal of Climate, 19, 1182–1194.
  • Olsen et al. (2006) Olsen, B., Munster, V. J., Wallensten, A., Waldenström, J., Osterhaus, A. D. M. E. and Fouchier, R. A. M. (2006) Global patterns of Influenza A Virus in wild birds. Science, 312, 384–388.
  • Prado and West (1997) Prado, R. and West, M. (1997) Exploratory Modelling of Multiple Non-Stationary Time Series: Latent Process Structure and Decompositions. In Modelling Longitudinal and Spatially Correlated Data (eds. T. G. Gregoire, D. R. Brillinger, P. J. Diggle, E. Russek-Cohen, W. G. Warren and R. D. Wolfinger), 349–361. New York, NY: Springer New York.
  • Prado and West (2010) — (2010) Time Series: Modelling, Computation and Inference. Chapman and Hall/CRC.
  • Scaife et al. (2014) Scaife, A. A., Arribas, A., Blockey, E., Brookshaw, A., Clark, R. T., Dunstone, N. J., Eade, R., Fereday, D., Folland, C. K., Gordon, M., Hermanson, L., Knight, J. R., Lea, D. J., MacLachlan, C., Maidens, A., Martin, M., Peterson, K. A., Smith, D. M., Vellinga, M., Wallace, E., Waters, J. and Williams, A. (2014) Skillful long range prediction of European and North American winters. Geophysical Research Letters, 5, 2514–2519.
  • Shenstone and Hyndman (2005) Shenstone, L. and Hyndman, R. J. (2005) Stochastic models underlying Croston’s method for intermittent demand forecasting. Journal of Forecasting, 24, 389–402.
  • Siegert et al. (2016) Siegert, S., Stephenson, D. B., Sansom, P. G., Scaife, A. A., Eade, R. and Arribas, A. (2016) A Bayesian framework for verification and recalibration of ensemble forecasts: How uncertain is NAO predictability? Journal of Climate, 29, 995–1012.
  • Subba Rao (1970) Subba Rao, T. (1970) The Fitting of Non-Stationary Time-Series Models with Time-Dependent Parameters. Journal of the Royal Statistical Society: Series B (Statistical Methodology), 32, 312–322.
  • Walker (1924) Walker, G. T. (1924) Correlation in seasonal variation of weather, IX, a preliminary study of world weather. Memoirs of the India Meteorological Department, 24, 275–332.
  • West and Harrison (1997) West, M. and Harrison, P. J. (1997) Bayesian Forecasting and Dynamic Models. Springer-Verlag New York, 2nd edn.