Amplifying Sine Unit: An Oscillatory Activation Function for Deep Neural Networks to Recover Nonlinear Oscillations Efficiently

04/18/2023
by Jamshaid Ul Rahman et al.

Many industrial and real-life problems exhibit highly nonlinear periodic behavior, and conventional methods may fall short of finding their analytical or closed-form solutions. Such problems demand cutting-edge computational tools with increased functionality and reduced cost. Deep neural networks have recently gained massive research interest owing to their ability to handle large data and their universality in learning complex functions. In this work, we put forward a methodology based on deep neural networks with a responsive layer structure to deal with nonlinear oscillations in microelectromechanical systems (MEMS). We incorporate several oscillatory and non-oscillatory activation functions, namely the growing cosine unit (GCU), Sine, Mish, and Tanh, into the designed network for a comprehensive analysis of their performance on highly nonlinear and vibrational problems. Integrating oscillatory activation functions with deep neural networks markedly improves the prediction of periodic patterns in the underlying systems. To support oscillatory actuation in nonlinear systems, we propose a novel oscillatory activation function, the Amplifying Sine Unit (ASU), which is more efficient than GCU for complex vibratory systems such as MEMS. Experimental results show that the designed network with the proposed ASU activation is more reliable and robust in handling the challenges posed by nonlinearity and oscillations. To validate the proposed methodology, the outputs of our networks are compared with results from the Livermore solver for ordinary differential equations (LSODA). Graphical illustrations of the incurred errors are also presented.
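The growing cosine unit mentioned above is commonly defined as GCU(z) = z·cos(z); the abstract does not state a formula for the proposed ASU, so the sketch below only illustrates how such an oscillatory activation would be plugged into a small dense network alongside Tanh (the exact ASU definition would come from the full paper). A minimal NumPy sketch, with hypothetical layer sizes:

```python
import numpy as np

def gcu(z):
    # Growing Cosine Unit: z * cos(z), an oscillatory activation
    return z * np.cos(z)

def mlp_forward(x, w1, b1, w2, b2, act):
    # Tiny two-layer network with a pluggable activation,
    # sketching how oscillatory units replace Tanh in a dense layer.
    h = act(x @ w1 + b1)   # hidden layer with chosen activation
    return h @ w2 + b2     # linear output, e.g. displacement u(t)

rng = np.random.default_rng(0)
w1, b1 = rng.normal(size=(1, 16)), np.zeros(16)   # hypothetical widths
w2, b2 = rng.normal(size=(16, 1)), np.zeros(1)

t = np.linspace(0.0, 2.0 * np.pi, 50).reshape(-1, 1)
u_gcu  = mlp_forward(t, w1, b1, w2, b2, gcu)      # oscillatory network
u_tanh = mlp_forward(t, w1, b1, w2, b2, np.tanh)  # baseline network
```

Note the design difference the abstract relies on: tanh saturates as |z| grows, whereas z·cos(z) keeps oscillating with growing amplitude, which is what lets such units track periodic responses.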


