Use of Metamorphic Relations as Knowledge Carriers to Train Deep Neural Networks

04/10/2021
by Tsong Yueh Chen, et al.

Training multiple-layered deep neural networks (DNNs) is difficult. The standard practice of training on a large number of samples often fails to improve a DNN's performance to a satisfactory level, so a systematic training approach is needed. To address this need, we introduce an innovative approach that uses metamorphic relations (MRs) as "knowledge carriers" to train DNNs. Building on the concept of metamorphic testing, in which MRs play the role of a test oracle in software testing, we use metamorphic groups of inputs as concrete instances of MRs (which are abstractions of knowledge) to train a DNN in a systematic and effective manner. To verify the viability of this training approach, we conducted a preliminary experiment comparing the performance of two DNNs: one trained with MRs and the other trained without. The DNN trained with MRs delivered better performance, confirming that our approach of using MRs as knowledge carriers to train DNNs is promising. More work and studies are needed, however, to solidify and leverage this approach so that it can have a widespread impact on effective DNN training.
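To make the idea concrete, below is a minimal sketch, assuming PyTorch, of how a metamorphic group of inputs and an MR-consistency term might be wired into an ordinary training step. The model `SmallDNN`, the additive-offset MR, and the `mr_weight` parameter are illustrative assumptions for this sketch, not the authors' actual setup.

```python
# Minimal sketch of MR-guided DNN training, assuming PyTorch.
# Illustrative MR (an assumption, not the paper's): adding a small
# constant offset to every input feature should leave the predicted
# class distribution (approximately) unchanged.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SmallDNN(nn.Module):
    """Toy fully connected classifier standing in for the DNN under training."""
    def __init__(self, in_dim=16, n_classes=4):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, 64), nn.ReLU(),
            nn.Linear(64, 64), nn.ReLU(),
            nn.Linear(64, n_classes),
        )

    def forward(self, x):
        return self.net(x)

def metamorphic_group(x, n_followups=2):
    """Concrete instance of an MR: the source input plus follow-up
    inputs produced by the MR's input transformation (here, a small
    random constant offset)."""
    return [x] + [x + 0.1 * torch.randn(1) for _ in range(n_followups)]

def mr_consistency_loss(model, group):
    """Penalize divergence between the model's output distribution on
    the source input and on each follow-up input; the MR says these
    distributions should agree."""
    src_p = F.softmax(model(group[0]), dim=-1).detach()  # source output as target
    loss = torch.zeros(())
    for followup in group[1:]:
        fu_logp = F.log_softmax(model(followup), dim=-1)
        loss = loss + F.kl_div(fu_logp, src_p, reduction="batchmean")
    return loss / (len(group) - 1)

def train_step(model, opt, x, y, mr_weight=0.5):
    """One training step: standard cross-entropy plus the MR term,
    so the MR acts as an extra 'knowledge carrier' during training."""
    opt.zero_grad()
    ce = F.cross_entropy(model(x), y)
    mr = mr_consistency_loss(model, metamorphic_group(x))
    loss = ce + mr_weight * mr
    loss.backward()
    opt.step()
    return loss.item()

# Illustrative usage on synthetic data:
model = SmallDNN()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
x, y = torch.randn(32, 16), torch.randint(0, 4, (32,))
for step in range(5):
    print(f"step {step}: loss = {train_step(model, opt, x, y):.4f}")
```

In this formulation, each MR contributes a consistency term alongside the usual supervised loss; the choice of MRs and their weighting would be design decisions in the actual approach.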

