AI Benchmark: All About Deep Learning on Smartphones in 2019

10/15/2019
by Andrey Ignatov, et al.

The performance of mobile AI accelerators has been evolving rapidly over the past two years, nearly doubling with each new generation of SoCs. The current 4th generation of mobile NPUs is already approaching the results of recent CUDA-compatible Nvidia graphics cards, which, together with the increased capabilities of mobile deep learning frameworks, makes it possible to run complex and deep AI models on mobile devices. In this paper, we evaluate the performance and compare the results of all chipsets from Qualcomm, HiSilicon, Samsung, MediaTek and Unisoc that provide hardware acceleration for AI inference. We also discuss recent changes in the Android ML pipeline and provide an overview of the deployment of deep learning models on mobile devices. All numerical results provided in this paper can be found, and are regularly updated, on the official project website: http://ai-benchmark.com.
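The chipset comparisons described above rest on inference latency measurements. As a minimal, hypothetical sketch (not the actual AI Benchmark methodology), the core timing loop such a benchmark runs for each model might look like the following, with a stand-in function in place of a real NPU-accelerated model call:

```python
import time
import statistics

def run_inference(x):
    # Stand-in for a real on-device model invocation (e.g., a TFLite
    # interpreter call dispatched to an NPU); here it just does
    # trivial CPU work so the sketch is self-contained.
    return [v * 2 for v in x]

def benchmark(fn, sample, warmup=5, iters=20):
    """Time `fn` over `iters` runs after `warmup` untimed runs,
    returning the median latency in milliseconds. Warm-up runs are
    needed because the first inferences typically include model
    loading and accelerator initialization overhead."""
    for _ in range(warmup):
        fn(sample)
    times = []
    for _ in range(iters):
        start = time.perf_counter()
        fn(sample)
        times.append((time.perf_counter() - start) * 1000.0)
    return statistics.median(times)

latency_ms = benchmark(run_inference, list(range(1024)))
print(f"median latency: {latency_ms:.3f} ms")
```

The median (rather than the mean) is used here so that occasional scheduling hiccups on a busy mobile OS do not skew the reported result.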


