AIS: A nonlinear activation function for industrial safety engineering

11/27/2021
by Zhenhua Wang, et al.

In Chinese named entity recognition based on deep learning, the activation function plays an irreplaceable role: it introduces nonlinearity into the neural network so that the fitted model can be applied to a wide range of tasks. However, industrial safety analysis text has high information density, and the strong correlation and similarity among its pieces of information tend to produce high bias and high standard deviation in the model. No activation function has been designed specifically for this setting in previous studies, and traditional activation functions suffer from gradient vanishing and the negative-region problem, which prevents the recognition accuracy of the model from improving further. To address these problems, this paper proposes a novel activation function, AIS. AIS is an activation function for industrial safety engineering composed of two piecewise nonlinear functions. In the positive region, a structure combining an exponential function and a quadratic function alleviates the bias and standard deviation problems, and a linear term is added as a correction, which makes the whole activation function smoother and overcomes gradient vanishing. In the negative region, a cubic structure addresses the negative-region problem and accelerates model convergence. The performance of AIS is evaluated on a BERT-BiLSTM-CRF deep learning model. The results show that, compared with other activation functions, AIS overcomes gradient vanishing and the negative-region problem, reduces model bias, speeds up model fitting, and improves the model's ability to extract industrial entities.
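The abstract gives only the qualitative structure of AIS (an exponential-plus-quadratic form with a linear correction on the positive side and a cubic form on the negative side), not its exact definition. The PyTorch sketch below shows one way such a piecewise activation could be wired into a model; the class name AISLikeActivation and the coefficients a, b, c, d are illustrative assumptions, not the published formula.

# Illustrative sketch of a piecewise activation in the spirit of AIS.
# Only the structure is taken from the abstract; the exact form and the
# coefficients a, b, c, d are hypothetical placeholders, not the paper's values.
import torch
import torch.nn as nn


class AISLikeActivation(nn.Module):
    def __init__(self, a=0.1, b=0.1, c=1.0, d=0.1):
        super().__init__()
        # Hypothetical coefficients controlling each term.
        self.a, self.b, self.c, self.d = a, b, c, d

    def forward(self, x):
        # Positive region: exponential and quadratic terms plus a linear
        # correction, intended to keep the function smooth and the gradient
        # from vanishing.
        pos = self.a * (torch.exp(x) - 1.0) + self.b * x ** 2 + self.c * x
        # Negative region: cubic term, giving non-zero output and gradient
        # for x < 0 instead of cutting the region off.
        neg = self.d * x ** 3
        return torch.where(x >= 0, pos, neg)


if __name__ == "__main__":
    act = AISLikeActivation()
    x = torch.linspace(-3, 3, 7)
    print(act(x))  # outputs on both sides of zero, continuous at x = 0

Using torch.where over two fully computed branches keeps both pieces differentiable, so autograd handles the piecewise gradient without any manual case analysis; both branches meet at zero, so the function is continuous at the boundary.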

Related research

11/27/2021 · Why KDAC? A general activation function for knowledge discovery
Named entity recognition based on deep learning (DNER) can effectively m...

05/08/2023 · TaLU: A Hybrid Activation Function Combining Tanh and Rectified Linear Unit to Enhance Neural Networks
The application of the deep learning model in classification plays an im...

05/17/2021 · Activation function design for deep networks: linearity and effective initialisation
The activation function deployed in a deep neural network has great infl...

01/15/2023 · Empirical study of the modulus as activation function in computer vision applications
In this work we propose a new non-monotonic activation function: the mod...

04/18/2023 · Amplifying Sine Unit: An Oscillatory Activation Function for Deep Neural Networks to Recover Nonlinear Oscillations Efficiently
Many industrial and real life problems exhibit highly nonlinear periodic...

01/31/2019 · Network Parameter Learning Using Nonlinear Transforms, Local Representation Goals and Local Propagation Constraints
In this paper, we introduce a novel concept for learning of the paramete...

09/21/2020 · Reservoir Computing and its Sensitivity to Symmetry in the Activation Function
Reservoir computing has repeatedly been shown to be extremely successful...
