Free energy of Bayesian Convolutional Neural Network with Skip Connection

07/04/2023
by Shuya Nagayasu, et al.

Since the success of the Residual Network (ResNet), many Convolutional Neural Network (CNN) architectures have adopted skip connections. While the generalization performance of CNNs with skip connections has been explained within the framework of ensemble learning, its dependency on the number of parameters has not been clarified. In this paper, we derive the Bayesian free energy of Convolutional Neural Networks both with and without skip connections. The upper bound of the free energy of a Bayesian CNN with skip connections does not depend on overparametrization, and the generalization error of the Bayesian CNN has a similar property.
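For context, the Bayesian free energy discussed in the abstract is the standard quantity from singular learning theory. The following is a sketch of its definition and asymptotic form in Watanabe's framework, not the paper's specific bound; the symbols (prior \(\varphi\), empirical entropy \(S_n\), RLCT \(\lambda\)) follow that standard convention:

```latex
% Marginal likelihood (partition function) of a model p(y|x,w) with prior \varphi(w),
% and the corresponding Bayesian free energy:
Z_n = \int \varphi(w) \prod_{i=1}^{n} p(Y_i \mid X_i, w) \, dw ,
\qquad F_n = -\log Z_n .

% Asymptotically, the expected free energy behaves as
\mathbb{E}[F_n] = n S_n + \lambda \log n + o(\log n),

% where S_n is the empirical entropy of the true distribution and \lambda is the
% real log canonical threshold (RLCT). The expected Bayes generalization error
% then satisfies
\mathbb{E}[G_n] = \frac{\lambda}{n} + o\!\left(\frac{1}{n}\right).
```

In this setting, an upper bound on the free energy that does not grow with the number of redundant parameters corresponds to an RLCT, and hence a generalization error, that is likewise insensitive to overparametrization.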
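To make the skip-connection mechanism concrete, here is a minimal NumPy sketch of an identity shortcut (a hypothetical illustration of the mechanism only, not the paper's model or its proof): when the residual branch's weights are zero, the block reduces to the identity map, so redundant parameters in the branch can be switched off without changing the function the network computes.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(z):
    return np.maximum(z, 0.0)

def residual_block(x, w):
    """Identity skip connection: output = relu(w @ x) + x."""
    return relu(w @ x) + x

d = 4
x = rng.standard_normal(d)

# With all-zero weights the residual branch vanishes and the block
# reduces to the identity map on x.
y_zero = residual_block(x, np.zeros((d, d)))

# With nonzero weights the block is a perturbation of the identity.
y = residual_block(x, 0.1 * rng.standard_normal((d, d)))
```

This "identity at zero weights" property is the usual intuition for why residual branches tolerate extra parameters, which is the phenomenon the paper analyzes rigorously via the Bayesian free energy.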

