Optimization of weights and activation functions of neural networks applied to time series forecasting

07/29/2021
by Gecynalda Gomes, et al.

Neural networks have been applied to time series prediction with good experimental results, indicating their high capacity to approximate functions with good precision. Most neural models used in these applications employ activation functions with fixed parameters; however, it is known that the choice of activation function strongly influences the complexity and performance of a neural network, and that only a limited number of activation functions have been used in practice. In this work, we propose the use of a family of asymmetric activation functions with a free parameter for neural networks and show that this family satisfies the requirements of the universal approximation theorem. We use a methodology for the global optimization of this family of activation functions and of the weights of the connections between the processing units of the neural network. The central idea of the proposed methodology is to simultaneously optimize the weights and the activation function used in a multilayer perceptron (MLP) network, through an approach that combines the advantages of simulated annealing, tabu search, and a local learning algorithm, with the purpose of improving performance in the fitting and forecasting of time series. We chose two learning algorithms: backpropagation with momentum (BPM) and Levenberg-Marquardt (LM).
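Since the abstract does not reproduce the paper's activation family or the full hybrid procedure, the following is only a minimal sketch of the central idea: jointly perturbing the connection weights and a free activation parameter inside a simulated-annealing loop. The functional form of `activation`, the perturbation scale, and the cooling schedule are illustrative assumptions; the tabu-search memory and the local learning step (BPM or LM) that the paper combines with simulated annealing are omitted for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical asymmetric activation with one free parameter `a`
# (a stand-in; the paper's exact free-parameter family is not
# given in this abstract).
def activation(x, a):
    return np.where(x >= 0.0, 1.0 - np.exp(-a * x), a * x)

def forward(params, X):
    W1, b1, W2, b2, a = params
    h = activation(X @ W1 + b1, a)   # single hidden layer of an MLP
    return h @ W2 + b2

def mse(params, X, y):
    e = forward(params, X).ravel() - y
    return float(np.mean(e ** 2))

def perturb(params, scale):
    # Jointly perturb the weights AND the activation parameter `a`.
    W1, b1, W2, b2, a = params
    new = [p + scale * rng.standard_normal(p.shape) for p in (W1, b1, W2, b2)]
    new.append(max(1e-3, a + scale * rng.standard_normal()))  # keep a > 0
    return tuple(new)

def sa_train(X, y, hidden=8, T0=1.0, cooling=0.995, steps=5000):
    n_in = X.shape[1]
    params = (0.1 * rng.standard_normal((n_in, hidden)),
              np.zeros(hidden),
              0.1 * rng.standard_normal((hidden, 1)),
              np.zeros(1),
              1.0)  # initial value of the free activation parameter
    best, best_cost = params, mse(params, X, y)
    cost, T = best_cost, T0
    for _ in range(steps):
        cand = perturb(params, scale=0.05)
        c = mse(cand, X, y)
        # Metropolis acceptance: always take improvements, sometimes
        # accept worse candidates so the search can escape local minima.
        if c < cost or rng.random() < np.exp((cost - c) / T):
            params, cost = cand, c
            if c < best_cost:
                best, best_cost = cand, c
        T *= cooling
    return best, best_cost

# One-step-ahead forecasting on a toy series via two lagged inputs.
series = np.sin(0.3 * np.arange(300)) + 0.05 * rng.standard_normal(300)
X = np.column_stack([series[:-2], series[1:-1]])
y = series[2:]
params, err = sa_train(X, y)
print("training MSE:", err)
```

In the paper's scheme, each accepted annealing move would additionally be refined by a local learning algorithm (BPM or LM) and filtered against a tabu list; the sketch above shows only the joint weight-and-activation search that those components wrap around.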

Related research

01/16/2023 | Data-aware customization of activation functions reduces neural network error
Activation functions play critical roles in neural networks, yet current...

05/22/2019 | Effect of shapes of activation functions on predictability in the echo state network
We investigate prediction accuracy for time series of Echo state network...

05/25/2017 | Neural Decomposition of Time-Series Data for Effective Generalization
We present a neural network technique for the analysis and extrapolation...

01/04/2022 | An unfeasability view of neural network learning
We define the notion of a continuously differentiable perfect learning a...

03/01/2016 | Noisy Activation Functions
Common nonlinear activation functions used in neural networks can cause ...

12/07/2020 | Generalised Perceptron Learning
We present a generalisation of Rosenblatt's traditional perceptron learn...

09/11/2018 | Deep Asymmetric Networks with a Set of Node-wise Variant Activation Functions
This work presents deep asymmetric networks with a set of node-wise vari...
