X-TIME: An in-memory engine for accelerating machine learning on tabular data with CAMs

04/03/2023
by Giacomo Pedretti, et al.

Structured, or tabular, data is the most common format in data science. While deep learning models have proven formidable in learning from unstructured data such as images or speech, they are less accurate than simpler approaches when learning from tabular data. In contrast, modern tree-based Machine Learning (ML) models shine in extracting relevant information from structured data. An essential requirement in data science is to reduce model inference latency in cases where, for example, models are used in a closed loop with simulation to accelerate scientific discovery. However, the hardware acceleration community has mostly focused on deep neural networks and largely ignored other forms of machine learning. Previous work has described the use of an analog content addressable memory (CAM) component for efficiently mapping random forests. In this work, we focus on an overall analog-digital architecture implementing a novel increased-precision analog CAM and a programmable network-on-chip that allows inference of state-of-the-art tree-based ML models, such as XGBoost and CatBoost. Results evaluated for a single chip in 16nm technology show 119x lower latency and 9740x higher throughput compared with a state-of-the-art GPU, at a 19W peak power consumption.
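The CAM-based mapping of tree models referenced above works by flattening each root-to-leaf path into a CAM row of per-feature match ranges, so that a parallel lookup of an input vector returns the matching leaf in one step. The sketch below is a hypothetical software emulation of that idea, not the paper's implementation; the toy tree, function names, and the half-open range convention (`low < x <= high`) are all assumptions for illustration.

```python
import math

# A toy decision tree over two features, expressed as nested tuples:
# (feature_index, threshold, left_subtree, right_subtree), or a leaf label.
tree = (0, 0.5,
        (1, 0.3, "A", "B"),
        (1, 0.7, "C", "D"))

def tree_to_cam_rows(node, n_features=2, bounds=None):
    """Flatten every root-to-leaf path into a CAM row:
    a list of per-feature (low, high] match ranges plus the leaf value."""
    if bounds is None:
        bounds = [(-math.inf, math.inf)] * n_features
    if not isinstance(node, tuple):            # leaf reached: emit one row
        return [(list(bounds), node)]
    feat, thr, left, right = node
    lo, hi = bounds[feat]
    left_b = list(bounds)
    left_b[feat] = (lo, min(hi, thr))          # left branch: x[feat] <= thr
    right_b = list(bounds)
    right_b[feat] = (max(lo, thr), hi)         # right branch: x[feat] > thr
    return (tree_to_cam_rows(left, n_features, left_b) +
            tree_to_cam_rows(right, n_features, right_b))

def cam_search(rows, x):
    """Emulate the parallel CAM match: return the leaf whose row
    ranges all contain the input vector x."""
    for ranges, leaf in rows:
        if all(lo < xi <= hi for (lo, hi), xi in zip(ranges, x)):
            return leaf
    return None

rows = tree_to_cam_rows(tree)
print(cam_search(rows, [0.2, 0.9]))  # -> "B" (x0 <= 0.5, x1 > 0.3)
```

In hardware, the per-feature range checks happen simultaneously across all rows, which is the source of the latency advantage the abstract reports; this loop only mimics that behavior sequentially.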

