A Novel Multi-Step Finite-State Automaton for Arbitrarily Deterministic Tsetlin Machine Learning

07/04/2020
by K. Darshana Abeyrathna, et al.

Due to the high energy consumption and scalability challenges of deep learning, there is a critical need to shift research focus towards coping with energy consumption constraints. Tsetlin Machines (TMs) are a recent approach to machine learning that has demonstrated significantly reduced energy usage compared to neural networks, while performing competitively in terms of accuracy on several benchmarks. However, TMs rely heavily on energy-costly random number generation to stochastically guide a team of Tsetlin Automata to a Nash equilibrium of the TM game. In this paper, we propose a novel finite-state learning automaton that can replace the Tsetlin Automata in TM learning, for increased determinism. The new automaton uses multi-step deterministic state jumps to reinforce sub-patterns. Simultaneously, flipping a coin to skip every d'th state update ensures diversification by randomization. The d-parameter thus allows the degree of randomization to be finely controlled. For example, d=1 makes every update random, while d=∞ makes the automaton completely deterministic. Our empirical results show that, overall, only substantial degrees of determinism reduce accuracy. Energy-wise, random number generation contributes to the switching energy consumption of the TM, and high d values save up to 11 mW of power for the larger datasets. We can thus use the new d-parameter to trade off accuracy against energy consumption, facilitating low-energy machine learning.
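As a concrete illustration of the update scheme described above, the following minimal Python sketch shows one way such an automaton could be realized. It is an illustrative assumption, not the paper's reference implementation: the class name, the default n_states and step_size values, and the state bookkeeping are hypothetical; only the d-gated coin flip mirrors the scheme from the abstract (d=1 randomizes every update, d=∞ makes updating fully deterministic).

import random


class MultiStepAutomaton:
    # Hypothetical sketch of a two-action finite-state learning automaton with
    # multi-step deterministic state jumps and d-gated randomization (not the
    # authors' implementation). States 1..n_states select action 0 (exclude a
    # literal); states n_states+1..2*n_states select action 1 (include).
    def __init__(self, n_states=100, step_size=3, d=16):
        self.n_states = n_states    # states per action side (assumed value)
        self.step_size = step_size  # length of each deterministic jump (assumed value)
        self.d = d                  # d=1: every update random; d=float('inf'): deterministic
        self.state = n_states       # start at the weakest "exclude" state
        self.update_count = 0

    def action(self):
        # Current decision of the automaton: include (1) or exclude (0).
        return 1 if self.state > self.n_states else 0

    def reward(self):
        # Reinforce the current action by jumping deeper into its half of the state space.
        self._update(self.step_size if self.action() == 1 else -self.step_size)

    def penalty(self):
        # Weaken the current action by jumping towards (and possibly across) the boundary.
        self._update(-self.step_size if self.action() == 1 else self.step_size)

    def _update(self, delta):
        self.update_count += 1
        # Every d'th state update is gated by a coin flip; all other updates are deterministic.
        if self.d != float('inf') and self.update_count % self.d == 0:
            if random.random() < 0.5:
                return  # the coin flip skips this update
        self.state = min(max(self.state + delta, 1), 2 * self.n_states)

With d=1, every call to reward() or penalty() passes through the coin flip, reproducing fully stochastic updating; with d=float('inf'), the branch is never taken and the automaton needs no random number generator at all.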
