C^*-algebra Net: A New Approach Generalizing Neural Network Parameters to C^*-algebra

06/20/2022
by Yuka Hashimoto, et al.

We propose a new framework that generalizes the parameters of neural network models to C^*-algebra-valued ones. A C^*-algebra is a generalization of the space of complex numbers; a typical example is the space of continuous functions on a compact space. This generalization lets us combine multiple models continuously and apply tools for functions, such as regression and integration, to the parameters. As a result, we can learn features of data efficiently and adapt models to problems continuously. We apply our framework to practical problems such as density estimation and few-shot learning and show that it learns features of data even from a limited number of samples. Our new framework highlights the potential of applying the theory of C^*-algebras to general neural network models.
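The key idea can be illustrated with a minimal sketch (not the authors' implementation; all names here are hypothetical): take the C^*-algebra C([0, 1]) of continuous functions on [0, 1], represent each function-valued weight by polynomial coefficients in a parameter t, and note that evaluating all weights at a fixed t yields an ordinary real-valued network. A single function-valued parameter set thus encodes a continuum of standard models.

```python
import numpy as np

rng = np.random.default_rng(0)

# Each weight is a degree-2 polynomial in t: w(t) = c0 + c1*t + c2*t^2,
# a simple finite-dimensional stand-in for an element of C([0, 1]).
# Coefficient tensor shape: (out_dim, in_dim, num_coeffs).
in_dim, out_dim, num_coeffs = 4, 2, 3
W_coeffs = rng.normal(size=(out_dim, in_dim, num_coeffs))

def evaluate_weights(W_coeffs, t):
    """Evaluate the function-valued weight matrix at a point t in [0, 1]."""
    powers = np.array([t ** k for k in range(W_coeffs.shape[-1])])
    return W_coeffs @ powers  # ordinary real matrix, shape (out_dim, in_dim)

def forward(x, t):
    """Run the concrete network obtained by evaluating all weights at t."""
    W = evaluate_weights(W_coeffs, t)
    return np.tanh(W @ x)

x = rng.normal(size=in_dim)
y0 = forward(x, 0.0)  # one concrete model
y1 = forward(x, 1.0)  # a different model from the same function-valued parameters
```

Because the parameters vary continuously in t, the models at nearby values of t are themselves close, which is what makes it possible to combine or interpolate between models continuously as the abstract describes.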
