An Analog Neural Network Computing Engine using CMOS-Compatible Charge-Trap-Transistor (CTT)

09/19/2017
by Yuan Du, et al.

An analog neural network computing engine based on CMOS-compatible charge-trap transistors (CTTs) is proposed in this paper. CTT devices are used as analog multipliers; compared to digital multipliers, the CTT-based analog multiplier achieves significant area and power reduction. The proposed computing engine is composed of a scalable CTT multiplier array and energy-efficient analog-digital interfaces. By implementing a sequential analog fabric (SAF), the engine's mixed-signal interfaces are simplified and the hardware overhead remains constant regardless of the array size. A proof-of-concept 784-by-784 CTT computing engine is implemented in TSMC 28 nm CMOS technology and occupies 0.68 mm^2. The simulated performance achieves 76.8 TOPS (8-bit) at a 500 MHz clock frequency while consuming 14.8 mW. As an example, we use this computing engine to address a classic pattern recognition problem, classifying handwritten digits from the MNIST database, and obtain accuracy comparable to state-of-the-art fully connected neural networks using 8-bit fixed-point resolution.
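To make the arithmetic concrete, the sketch below models in software the 8-bit fixed-point matrix-vector multiplication that the 784-by-784 CTT array carries out in the analog domain. This is only an illustrative approximation under assumed scales and randomly generated weights and inputs (none of these values come from the paper); the actual engine realizes the multiplications with CTT devices and digitizes the accumulated result through its mixed-signal interfaces.

import numpy as np

# Illustrative sketch, not the paper's implementation: an 8-bit quantized
# matrix-vector multiply standing in for the analog CTT multiplier array.
def quantize_int8(x, scale):
    """Map real values to signed 8-bit integers with the given scale (assumed scheme)."""
    return np.clip(np.round(x / scale), -128, 127).astype(np.int8)

rng = np.random.default_rng(0)
weights = rng.standard_normal((784, 784)) * 0.05   # hypothetical trained weights
pixels = rng.random(784)                            # hypothetical flattened MNIST image

w_scale, x_scale = 0.01, 1.0 / 127                  # assumed quantization scales
w_q = quantize_int8(weights, w_scale)
x_q = quantize_int8(pixels, x_scale)

# Accumulate in a wider integer type, analogous to the array's summed analog
# outputs being digitized by the ADC interface before further processing.
acc = w_q.astype(np.int32) @ x_q.astype(np.int32)
output = acc * (w_scale * x_scale)                  # rescale back to real values
print(output.shape)                                 # (784,)

A fully connected MNIST classifier would chain such quantized layers and apply a nonlinearity between them; the paper's result is that this 8-bit fixed-point resolution is sufficient to match state-of-the-art fully connected network accuracy.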

