Power Consumption Variation over Activation Functions

06/12/2020
by Leon Derczynski, et al.

The power that machine learning models consume when making predictions can be affected by a model's architecture. This paper presents various estimates of power consumption for a range of different activation functions, a core factor in neural network model architecture design. Substantial differences in hardware performance exist between activation functions. This difference informs how power consumption in machine learning models can be reduced.
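The paper's core observation is that different activation functions have measurably different hardware cost. As an illustrative sketch only (not the paper's measurement methodology, which reports power consumption directly), the compute-cost gap between activations can be made visible with a simple wall-clock microbenchmark; all function names and parameters below are assumptions for illustration:

```python
# Illustrative sketch: compare the wall-clock cost of common activation
# functions as a rough proxy for the compute differences that, per the
# paper, show up as differences in power consumption.
import time
import numpy as np


def time_activation(fn, x, repeats=50):
    """Return the mean seconds per call of fn(x) over `repeats` runs."""
    start = time.perf_counter()
    for _ in range(repeats):
        fn(x)
    return (time.perf_counter() - start) / repeats


# A large input vector so per-call cost dominates Python overhead.
x = np.random.randn(1_000_000).astype(np.float32)

activations = {
    "relu": lambda v: np.maximum(v, 0.0),
    "sigmoid": lambda v: 1.0 / (1.0 + np.exp(-v)),
    "tanh": np.tanh,
    # tanh-based GELU approximation
    "gelu_approx": lambda v: 0.5 * v * (1.0 + np.tanh(
        np.sqrt(2.0 / np.pi) * (v + 0.044715 * v ** 3))),
}

timings = {name: time_activation(fn, x) for name, fn in activations.items()}
for name, t in sorted(timings.items(), key=lambda kv: kv[1]):
    print(f"{name:12s} {t * 1e3:7.3f} ms/call")
```

Timing is only a proxy: actual power draw also depends on the instruction mix (e.g. transcendentals vs. simple max operations) and the target hardware, which is why direct power measurement, as in the paper, gives a more faithful picture.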


Related research

05/16/2023
Hardware Realization of Nonlinear Activation Functions for NN-based Optical Equalizers
To reduce the complexity of the hardware implementation of neural networ...

10/22/2018
CSI Neural Network: Using Side-channels to Recover Your Artificial Neural Network Information
Machine learning has become mainstream across industries. Numerous examp...

07/06/2022
Enhancing Adversarial Attacks on Single-Layer NVM Crossbar-Based Neural Networks with Power Consumption Information
Adversarial attacks on state-of-the-art machine learning models pose a s...

06/12/2020
CANOA: CAN Origin Authentication Through Power Side-Channel Monitoring
The lack of any sender authentication mechanism in place makes CAN (Cont...

01/13/2023
ML Approach for Power Consumption Prediction in Virtualized Base Stations
The flexibility introduced with the Open Radio Access Network (O-RAN) ar...

05/30/2022
A Transistor Operations Model for Deep Learning Energy Consumption Scaling Law
Deep Learning (DL) has transformed the automation of a wide range of ind...

03/12/2019
Supervised Machine Learning Techniques for Trojan Detection with Ring Oscillator Network
With the globalization of the semiconductor manufacturing process, elect...
