Canonical dual solutions to nonconvex radial basis neural network optimization problem

02/18/2013
by   Vittorio Latorre, et al.

Radial Basis Function Neural Networks (RBFNNs) are widely used tools for regression problems. One of their principal drawbacks is that the training formulation in which both the centers and the weights are optimized is a highly nonconvex optimization problem, which poses fundamental difficulties for traditional optimization theory and methods. This paper presents a generalized canonical duality theory for solving this challenging problem. We demonstrate that, by sequential canonical dual transformations, the nonconvex optimization problem of the RBFNN can be reformulated as a canonical dual problem (with no duality gap). Both the global optimal solution and the local extrema can be classified. Several applications to one of the most commonly used radial basis functions, the Gaussian function, are illustrated. Our results show that even in the one-dimensional case, the global minimizer of the nonconvex problem may not be the best solution to the RBFNN, and that canonical duality theory is a promising tool for solving general neural network training problems.
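To make the problem concrete, the sketch below (not the paper's method; all names and data are illustrative) sets up a one-dimensional Gaussian RBFNN and its sum-of-squares training error as a function of both the centers and the weights. For fixed centers the error is convex in the weights, but once the centers are free variables the joint problem becomes nonconvex, which is the difficulty the canonical duality theory addresses.

```python
import numpy as np

def rbfnn(x, centers, weights, sigma=1.0):
    """Evaluate sum_j w_j * exp(-(x - c_j)^2 / (2 sigma^2)) at points x."""
    # phi has shape (n_samples, n_centers) via broadcasting
    phi = np.exp(-((x[:, None] - centers[None, :]) ** 2) / (2.0 * sigma ** 2))
    return phi @ weights

def training_error(params, x, y, n_centers, sigma=1.0):
    """Sum-of-squares regression error, optimized jointly over
    centers (first n_centers entries) and weights (the rest).
    Convex in the weights alone, nonconvex in the centers."""
    centers = params[:n_centers]
    weights = params[n_centers:]
    residual = rbfnn(x, centers, weights, sigma) - y
    return 0.5 * np.dot(residual, residual)

# Toy regression data (illustrative only)
x = np.linspace(-2.0, 2.0, 50)
y = np.sin(np.pi * x)

rng = np.random.default_rng(0)
params = rng.standard_normal(4)  # 2 centers followed by 2 weights
err = training_error(params, x, y, n_centers=2)
```

A standard local solver (e.g. gradient descent on `params`) applied to this objective can stall at a local minimum depending on the initial centers, which motivates the dual reformulation studied in the paper.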


