
Centralized and Decentralized Global Outer-synchronization of Asymmetric Recurrent Time-varying Neural Network by Data-sampling

by Wenlian Lu, et al.
Fudan University

In this paper, we study the outer-synchronization of asymmetrically connected recurrent time-varying neural networks. Using both centralized and decentralized data-sampling discretization principles, we derive several sufficient conditions, based on diverse vector norms, which guarantee that any two trajectories of the identical neural network system, starting from different initial values, converge together. The lower bounds of the inter-sample time intervals under both the centralized and decentralized principles are proved to be positive, which excludes Zeno behavior. A numerical example is provided to illustrate the effectiveness of the theoretical results.
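The synchronization mechanism described above can be illustrated with a minimal numerical sketch: two trajectories of the same recurrent network are coupled through a control term that only uses states held at the last sampling instant (a zero-order hold), rather than the continuous state. The weight matrix `W`, gain `k`, and sampling interval `h` below are illustrative choices, not values from the paper.

```python
import numpy as np

# Sketch: sampled-data synchronization of two trajectories of the same
# recurrent network  dx/dt = -x + W*tanh(x) + I,  where the coupling uses
# states sampled at a common interval h (zero-order hold between samples).
# All parameter values here are illustrative assumptions.

np.random.seed(0)
n = 3
W = 0.4 * np.random.randn(n, n)      # asymmetric connection weights
I = np.array([0.5, -0.2, 0.1])       # constant external input
k = 5.0                              # coupling gain
h = 0.05                             # common sampling interval
dt = 0.001                           # Euler integration step
T = 10.0                             # simulation horizon

def f(x):
    """Vector field of the recurrent network."""
    return -x + W @ np.tanh(x) + I

x = np.array([1.0, -1.0, 0.5])       # trajectory 1 (controlled)
y = np.array([-2.0, 0.3, 1.5])       # trajectory 2 (reference)
xs, ys = x.copy(), y.copy()          # sampled (held) states

steps = int(T / dt)
sample_every = int(h / dt)
for i in range(steps):
    if i % sample_every == 0:        # refresh held states at sampling instants
        xs, ys = x.copy(), y.copy()
    u = -k * (xs - ys)               # control computed from sampled data only
    x = x + dt * (f(x) + u)
    y = y + dt * f(y)

err = np.linalg.norm(x - y)
print(f"final synchronization error: {err:.2e}")
```

With a sufficiently large gain relative to the Lipschitz constant of the vector field, and a sampling interval small enough, the error between the two trajectories decays toward zero, in line with the sufficient conditions the abstract describes.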


