Detecting causality in multivariate time series via non-uniform embedding

03/14/2019 ∙ by Ziyu Jia, et al.

Causal analysis based on non-uniform embedding schemes is an important way to detect the underlying interactions between dynamic systems. However, traditional non-uniform embedding schemes still face obstacles in estimating high-dimensional conditional mutual information and in forming an optimal mixed embedding vector. In this study, we present a new non-uniform embedding method framed in information theory to detect causality for multivariate time series, named LM-PMIME, which integrates a low-dimensional approximation of conditional mutual information with a mixed search strategy for the construction of the mixed embedding vector. We apply the proposed method to simulations of linear stochastic, nonlinear stochastic, and chaotic systems, demonstrating its superiority over the partial conditional mutual information from mixed embedding (PMIME) method. Moreover, the proposed method works well for multivariate time series with weak coupling strengths, especially for chaotic systems. As a practical application, we demonstrate its applicability to multichannel electrocorticographic recordings from epilepsy patients.
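
To make the non-uniform (mixed) embedding idea concrete, below is a minimal sketch of the greedy construction such schemes build on: lagged candidates from all observed series are added one at a time, each chosen to maximize the conditional mutual information (CMI) with the target's future given the terms already selected. Everything in the sketch is an illustrative assumption rather than the authors' implementation: the names gaussian_cmi and mixed_embedding, the one-step-ahead future, the fixed stopping threshold, and the Gaussian (linear) CMI estimator. The paper's LM-PMIME instead relies on a low-dimensional approximation of the CMI, a mixed search strategy, and a different estimator and termination criterion.

    import numpy as np

    def gaussian_cmi(x, y, z=None):
        # Conditional mutual information I(x; y | z) under a Gaussian (linear)
        # assumption, computed from covariance determinants. x, y are 1-D samples;
        # z is an (n, d) conditioning matrix, or None/empty for plain MI.
        x = x.reshape(-1, 1)
        y = y.reshape(-1, 1)
        if z is None or z.size == 0:
            c = np.cov(np.hstack([x, y]), rowvar=False)
            return 0.5 * np.log(c[0, 0] * c[1, 1] / np.linalg.det(c))
        det = lambda m: np.linalg.det(np.atleast_2d(m))
        c_xz = np.cov(np.hstack([x, z]), rowvar=False)
        c_yz = np.cov(np.hstack([y, z]), rowvar=False)
        c_z = np.cov(z, rowvar=False)
        c_xyz = np.cov(np.hstack([x, y, z]), rowvar=False)
        return 0.5 * np.log(det(c_xz) * det(c_yz) / (det(c_z) * det(c_xyz)))

    def mixed_embedding(series, target, max_lag=5, threshold=0.02):
        # Greedy non-uniform embedding: repeatedly add the lagged candidate that
        # maximizes the CMI with the target's one-step-ahead future, conditioned
        # on the terms already selected, until the best gain falls below
        # `threshold` (a stand-in for the significance-based stopping rules used
        # in PMIME-style methods).
        # series: (n_samples, n_vars) array; target: column index of the driven variable.
        n, k = series.shape
        future = series[max_lag:, target]
        candidates = {(j, lag): series[max_lag - lag:n - lag, j]
                      for j in range(k) for lag in range(1, max_lag + 1)}
        selected, embedding = [], np.empty((n - max_lag, 0))
        while candidates:
            gains = {key: gaussian_cmi(future, cand, embedding)
                     for key, cand in candidates.items()}
            best = max(gains, key=gains.get)
            if gains[best] < threshold:
                break
            selected.append(best)
            embedding = np.column_stack([embedding, candidates.pop(best)])
        return selected  # (variable index, lag) pairs forming the mixed embedding vector

    # Example: X0 drives X1 with lag 1 in a bivariate autoregressive system.
    rng = np.random.default_rng(0)
    x = rng.standard_normal((2000, 2))
    for t in range(1, 2000):
        x[t, 0] += 0.6 * x[t - 1, 0]
        x[t, 1] += 0.6 * x[t - 1, 1] + 0.4 * x[t - 1, 0]
    print(mixed_embedding(x, target=1))  # expect lags of both variables, e.g. [(1, 1), (0, 1)]

In this kind of scheme, a driver variable is deemed to influence the target if at least one of its lags enters the selected embedding; PMIME-style measures additionally quantify the coupling strength by the fraction of the information about the target's future that those lags explain.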
