Using Machine Learning to Assess Short Term Causal Dependence and Infer Network Links

12/05/2019
by Amitava Banerjee et al.

We introduce and test a general machine-learning-based technique for inferring short-term causal dependence between the state variables of an unknown dynamical system from time-series measurements of those variables. The technique leverages a machine-learning process trained for short-time prediction: the trained predictor is used to estimate the elements of the Jacobian matrix of the dynamical flow along an orbit. The type of machine learning we employ is reservoir computing. We present numerical tests on link inference for a network of interacting dynamical nodes. We find that dynamical noise can greatly enhance the effectiveness of our technique, while observational noise degrades it. We believe that the competition between these two opposing types of noise will be a key factor determining the success of causal inference in many of the most important applications.
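To make the idea concrete, here is a minimal, self-contained Python/NumPy sketch of the workflow the abstract describes: train a reservoir computer (echo state network) for one-step prediction of a multivariate time series, estimate the Jacobian of the learned one-step map at a point along the orbit, and read off candidate links from the off-diagonal Jacobian magnitudes. This is not the authors' implementation; the toy network of coupled logistic maps, the reservoir size, the ridge parameter, and the finite-difference Jacobian estimator are all illustrative assumptions.

```python
# Minimal sketch (not the authors' code): reservoir-computing prediction
# followed by Jacobian estimation for link inference.  Large entries
# |d x_j(t+1) / d x_i(t)| suggest a short-term causal dependence of node j
# on node i.  All sizes and hyperparameters are illustrative.
import numpy as np

rng = np.random.default_rng(0)

# Toy data: three logistic maps with one-way coupling 0 -> 1 -> 2,
# plus a little dynamical noise (which, per the abstract, can aid inference).
T = 3000
x = np.zeros((T, 3))
x[0] = rng.random(3)
for t in range(T - 1):
    x[t + 1, 0] = 3.9 * x[t, 0] * (1 - x[t, 0])
    x[t + 1, 1] = 3.9 * x[t, 1] * (1 - x[t, 1]) + 0.1 * (x[t, 0] - x[t, 1])
    x[t + 1, 2] = 3.9 * x[t, 2] * (1 - x[t, 2]) + 0.1 * (x[t, 1] - x[t, 2])
    x[t + 1] = np.clip(x[t + 1] + 1e-3 * rng.standard_normal(3), 0.0, 1.0)

# Reservoir (echo state network): fixed random input and recurrent weights.
N, D = 300, 3
W_in = 0.5 * rng.uniform(-1, 1, (N, D))
W = rng.uniform(-1, 1, (N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # spectral radius ~ 0.9

def run_reservoir(inputs):
    """Drive the reservoir with an input sequence; return all reservoir states."""
    r = np.zeros(N)
    states = np.empty((len(inputs), N))
    for t, u in enumerate(inputs):
        r = np.tanh(W @ r + W_in @ u)
        states[t] = r
    return states

# Train a linear readout by ridge regression for one-step-ahead prediction.
R = run_reservoir(x[:-1])
washout = 100                      # discard initial transient states
A, Y = R[washout:], x[1 + washout:]
beta = 1e-6                        # ridge regularization
W_out = np.linalg.solve(A.T @ A + beta * np.eye(N), A.T @ Y).T

def predict_next(xt, r_prev):
    """One step of the trained predictor: (estimate of x(t+1), new reservoir state)."""
    r = np.tanh(W @ r_prev + W_in @ xt)
    return W_out @ r, r

# Estimate the Jacobian d x_hat(t+1) / d x(t) of the learned map at the last
# orbit point by finite differences (an illustrative choice of estimator).
eps = 1e-4
r_prev = R[-1]                     # reservoir state just before the last input
base, _ = predict_next(x[-1], r_prev)
J = np.zeros((D, D))
for i in range(D):
    xp = x[-1].copy()
    xp[i] += eps
    J[:, i] = (predict_next(xp, r_prev)[0] - base) / eps

print(np.round(np.abs(J), 3))      # large off-diagonal entries suggest links
```

In this toy example the only true couplings are node 0 to node 1 and node 1 to node 2, so the corresponding off-diagonal entries of |J| should stand out; averaging |J| over many points along the orbit, as the phrase "along an orbit" suggests, gives a more robust link score than a single-time estimate.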
