
Proposal for a High Precision Tensor Processing Unit

06/10/2017
by Eric B. Olsen et al.

This whitepaper proposes the design and adoption of a new generation of Tensor Processing Unit (TPU) that matches the performance of Google's TPU yet performs operations on wide-precision data. The new-generation TPU is made possible by arithmetic circuits that compute using a new general-purpose fractional arithmetic based on the residue number system.
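The residue number system (RNS) mentioned in the abstract represents an integer by its residues modulo a set of pairwise-coprime moduli, so addition and multiplication become independent, carry-free operations on each residue channel. A minimal sketch of that idea (the moduli here are an illustrative choice, not the ones used in the proposed TPU, and this covers only integer RNS, not the paper's fractional extension):

```python
from math import prod

MODULI = (13, 11, 7)  # pairwise-coprime moduli; illustrative choice only

def to_rns(x):
    """Encode an integer as a tuple of residues, one per modulus."""
    return tuple(x % m for m in MODULI)

def rns_add(a, b):
    """Carry-free addition: each residue channel is independent."""
    return tuple((x + y) % m for x, y, m in zip(a, b, MODULI))

def rns_mul(a, b):
    """Carry-free multiplication, channel by channel."""
    return tuple((x * y) % m for x, y, m in zip(a, b, MODULI))

def from_rns(r):
    """Recover the integer via the Chinese Remainder Theorem.

    Valid while the true result stays below prod(MODULI) = 1001 here.
    """
    M = prod(MODULI)
    x = 0
    for ri, mi in zip(r, MODULI):
        Mi = M // mi
        x += ri * Mi * pow(Mi, -1, mi)  # pow(..., -1, m): modular inverse
    return x % M

a, b = 25, 17
assert from_rns(rns_mul(to_rns(a), to_rns(b))) == a * b  # 425 < 1001
assert from_rns(rns_add(to_rns(a), to_rns(b))) == a + b
```

Because each channel works modulo a small number, wide-precision multipliers can be built from narrow, parallel units, which is the property the whitepaper leverages for high-precision tensor arithmetic.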


Related research:

- MCTensor: A High-Precision Deep Learning Library with Multi-Component Floating-Point (07/18/2022). In this paper, we introduce MCTensor, a library based on PyTorch for pro...
- Introduction of the Residue Number Arithmetic Logic Unit With Brief Computational Complexity Analysis (12/03/2015). Digital System Research has pioneered the mathematics and design for a n...
- lrsarith: a small fixed/hybrid arithmetic C library (01/29/2021). We describe lrsarith which is a small fixed precision and hybrid arithme...
- New satellites of figure-eight orbit computed with high precision (03/05/2022). In this paper we use a Modified Newton's method based on the Continuous ...
- Accelerated Polynomial Evaluation and Differentiation at Power Series in Multiple Double Precision (01/22/2021). The problem is to evaluate a polynomial in several variables and its gra...
- Iterative Krylov Methods for Acoustic Problems on Graphics Processing Unit (12/22/2021). This paper deals with linear algebra operations on Graphics Processing U...