Posterior Regularized Bayesian Neural Network Incorporating Soft and Hard Knowledge Constraints

10/16/2022
by Jiayu Huang, et al.

Neural Networks (NNs) have been widely used in supervised learning due to their ability to model complex nonlinear patterns, which often arise in high-dimensional data such as images and text. However, traditional NNs typically lack the ability to quantify uncertainty. Bayesian NNs (BNNs) can measure this uncertainty by placing distributions over the NN model parameters. Moreover, domain knowledge is commonly available and can improve the performance of BNNs if it is appropriately incorporated. In this work, we propose a novel Posterior-Regularized Bayesian Neural Network (PR-BNN) model that incorporates different types of knowledge constraints, such as soft and hard constraints, as a posterior regularization term. Furthermore, we propose to combine the augmented Lagrangian method with existing BNN solvers for efficient inference. Experiments on simulated data and two case studies, aviation landing prediction and solar energy output prediction, demonstrate the effectiveness of the knowledge constraints and the performance improvement of the proposed model over traditional BNNs without the constraints.
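
To illustrate the general idea, below is a minimal, hypothetical PyTorch sketch of attaching a knowledge-constraint penalty with an augmented-Lagrangian multiplier update to a mean-field variational BNN (Bayes-by-Backprop style). The constraint used here (nonnegative predictions, as one might require for solar energy output), the network architecture, and all hyperparameters are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)

# Toy data: inputs x and targets y that are known a priori to be >= 0
# (e.g., solar energy output).
x = torch.linspace(0.0, 1.0, 128).unsqueeze(1)
y = torch.relu(torch.sin(6.0 * x)) + 0.05 * torch.randn_like(x)


class BayesianLinear(nn.Module):
    """Mean-field Gaussian variational layer (Bayes-by-Backprop style)."""

    def __init__(self, d_in, d_out):
        super().__init__()
        self.w_mu = nn.Parameter(0.1 * torch.randn(d_out, d_in))
        self.w_rho = nn.Parameter(torch.full((d_out, d_in), -3.0))
        self.b_mu = nn.Parameter(torch.zeros(d_out))
        self.b_rho = nn.Parameter(torch.full((d_out,), -3.0))

    def forward(self, inp):
        w_sigma = F.softplus(self.w_rho)
        b_sigma = F.softplus(self.b_rho)
        # Reparameterized weight and bias samples.
        w = self.w_mu + w_sigma * torch.randn_like(w_sigma)
        b = self.b_mu + b_sigma * torch.randn_like(b_sigma)
        # Closed-form KL divergence to a standard-normal prior.
        kl = (-torch.log(w_sigma) + 0.5 * (w_sigma ** 2 + self.w_mu ** 2) - 0.5).sum() \
           + (-torch.log(b_sigma) + 0.5 * (b_sigma ** 2 + self.b_mu ** 2) - 0.5).sum()
        return F.linear(inp, w, b), kl


layers = nn.ModuleList([BayesianLinear(1, 32), BayesianLinear(32, 1)])


def forward(inp):
    h, kl1 = layers[0](inp)
    out, kl2 = layers[1](torch.tanh(h))
    return out, kl1 + kl2


lam, rho = 0.0, 10.0  # Lagrange multiplier and quadratic penalty weight
opt = torch.optim.Adam(layers.parameters(), lr=1e-2)

for step in range(2000):
    opt.zero_grad()
    pred, kl = forward(x)
    nll = 0.5 * ((pred - y) ** 2 / 0.01).sum()   # Gaussian negative log-likelihood (fixed noise)
    elbo_loss = nll + kl                          # negative ELBO from the variational BNN solver
    # Knowledge constraint g <= 0, here g = amount by which predictions go negative.
    violation = torch.relu(-pred).mean()
    # Augmented-Lagrangian objective: negative ELBO + multiplier and penalty terms.
    loss = elbo_loss + lam * violation + 0.5 * rho * violation ** 2
    loss.backward()
    opt.step()
    if step % 100 == 0:
        lam = lam + rho * float(violation)        # dual (multiplier) ascent update

with torch.no_grad():
    pred, _ = forward(x)
    print("max constraint violation after training:", float(torch.relu(-pred).max()))
```

In this sketch a soft constraint would simply keep a fixed penalty weight, while the multiplier update drives the violation toward zero, approximating a hard constraint; the exact formulation in the paper may differ.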
