Efficiently Learning Nonstationary Gaussian Processes
Most real-world phenomena, such as the distribution of sunlight under a forest canopy, mineral concentrations, or stock valuations, exhibit nonstationary dynamics: the variation of the phenomenon changes with locality. Nonstationary dynamics pose both theoretical and practical challenges to statistical machine learning algorithms that aim to accurately capture the complexities governing the evolution of such processes. Typically, nonstationary dynamics are modeled with nonstationary Gaussian process models (NGPs) that use a locally parameterized latent process to represent the nonstationary observed dynamics. Recently, an approach based on a most-likely induced latent dynamics representation attracted the research community's attention. That approach, however, cannot be employed for large-scale real-world applications, because learning a most-likely latent representation requires maximizing the marginal likelihood of the observed dynamics, which becomes intractable as the number of induced latent points grows with problem size. We establish a direct relationship between the informativeness of the induced latent dynamics and the marginal likelihood of the observed dynamics. This opens up the possibility of maximizing the marginal likelihood of the observed dynamics indirectly, by near-optimally maximizing entropy or mutual information gain on the induced latent dynamics with greedy algorithms. For efficient yet accurate inference, we therefore propose to build the induced latent dynamics representation with a novel algorithm, LISAL, which iteratively and adaptively maximizes entropy or mutual information on the induced latent dynamics and the marginal likelihood of the observed dynamics. The relevance of LISAL is validated on real-world sensing datasets.
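The sketch below is not the authors' LISAL implementation; it is a minimal illustration, under simplifying assumptions, of the two ingredients the abstract combines: greedy, near-optimal selection of induced latent points by an entropy criterion (for a Gaussian, entropy is monotone in predictive variance), alternated with marginal-likelihood maximization of the hyperparameters. The RBF kernel, the candidate set, the synthetic nonstationary-looking signal, and the crude grid search over length-scales are all illustrative choices, not details from the paper.

```python
# Illustrative sketch only: greedy entropy-based selection of induced points
# alternated with marginal-likelihood refitting, under a plain RBF-kernel GP.
import numpy as np

def rbf_kernel(A, B, length_scale, signal_var):
    """Squared-exponential kernel matrix between row sets A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return signal_var * np.exp(-0.5 * d2 / length_scale ** 2)

def greedy_entropy_selection(X_cand, m, length_scale, signal_var, noise_var):
    """Greedily pick m induced points: at each step take the candidate with the
    largest predictive variance (largest conditional entropy) given the points
    already selected."""
    selected = []
    for _ in range(m):
        best_i, best_var = None, -np.inf
        for i in range(len(X_cand)):
            if i in selected:
                continue
            x = X_cand[i:i + 1]
            var = signal_var + noise_var
            if selected:
                S = X_cand[selected]
                K_ss = rbf_kernel(S, S, length_scale, signal_var) + noise_var * np.eye(len(S))
                k_s = rbf_kernel(S, x, length_scale, signal_var)
                var -= float(k_s.T @ np.linalg.solve(K_ss, k_s))
            if var > best_var:
                best_i, best_var = i, var
        selected.append(best_i)
    return selected

def log_marginal_likelihood(X, y, length_scale, signal_var, noise_var):
    """Exact GP log marginal likelihood on the selected subset."""
    K = rbf_kernel(X, X, length_scale, signal_var) + noise_var * np.eye(len(X))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return -0.5 * y @ alpha - np.log(np.diag(L)).sum() - 0.5 * len(y) * np.log(2 * np.pi)

# Alternate the two steps: select informative points, then refit hyperparameters.
rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(200, 1))
y = np.sin(X[:, 0] ** 2 / 10.0) + 0.1 * rng.standard_normal(200)  # signal whose variation changes with locality

length_scale, signal_var, noise_var = 1.0, 1.0, 0.01
for it in range(3):
    idx = greedy_entropy_selection(X, m=20, length_scale=length_scale,
                                   signal_var=signal_var, noise_var=noise_var)
    # Crude grid search over the length-scale as a stand-in for gradient-based
    # marginal-likelihood maximization.
    grid = [0.25, 0.5, 1.0, 2.0, 4.0]
    length_scale = max(grid, key=lambda l: log_marginal_likelihood(X[idx], y[idx], l, signal_var, noise_var))
    print(f"round {it}: length_scale={length_scale}, "
          f"lml={log_marginal_likelihood(X[idx], y[idx], length_scale, signal_var, noise_var):.2f}")
```

Greedy maximization is attractive here because entropy and mutual information on a set of Gaussian variables are (approximately) submodular, so a greedy selection comes with near-optimality guarantees while avoiding the combinatorial search over subsets of induced points.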