Effect of Choosing Loss Function when Using T-batching for Representation Learning on Dynamic Networks

08/13/2023
by Erfan Loghmani, et al.

Representation learning methods have revolutionized machine learning on networks by converting discrete network structures into continuous domains. However, dynamic networks that evolve over time pose new challenges. To address this, dynamic representation learning methods have gained attention, offering benefits like reduced learning time and improved accuracy by utilizing temporal information. T-batching is a valuable technique for training dynamic network models that reduces training time while preserving vital conditions for accurate modeling. However, we have identified a limitation in the training loss function used with t-batching. Through mathematical analysis, we propose two alternative loss functions that overcome this issue, resulting in enhanced training performance. We extensively evaluate the proposed loss functions on synthetic and real-world dynamic networks. The results consistently demonstrate superior performance compared to the original loss function. Notably, on a real-world network characterized by diverse user interaction histories, the proposed loss functions achieved improvements of more than 26.9% and more than 11.8% on the evaluation metrics relative to the original loss function, underscoring the efficacy of the proposed loss functions in dynamic network modeling.
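
For readers unfamiliar with t-batching: in the dynamic-network literature this abstract builds on, time-ordered interactions are typically grouped into batches in which no user or item appears more than once, so interactions within a batch can be processed in parallel without violating temporal dependencies. The sketch below illustrates that batching rule only; the function name `build_t_batches` and the `(user, item, timestamp)` triple format are illustrative assumptions, not the paper's implementation, and the paper's proposed loss functions are not reproduced here.

```python
from collections import defaultdict

def build_t_batches(interactions):
    """Group time-ordered (user, item, timestamp) triples into t-batches.

    Rule: each interaction goes into the earliest batch that is strictly later
    than every batch already containing its user or its item, so no user or
    item appears twice in the same batch and temporal order is preserved
    across batches.
    """
    # Index of the last batch each user / item was assigned to (-1 = none yet).
    last_user_batch = defaultdict(lambda: -1)
    last_item_batch = defaultdict(lambda: -1)
    batches = []

    for user, item, ts in sorted(interactions, key=lambda x: x[2]):
        # Earliest admissible batch index for this interaction.
        idx = max(last_user_batch[user], last_item_batch[item]) + 1
        if idx == len(batches):
            batches.append([])
        batches[idx].append((user, item, ts))
        last_user_batch[user] = idx
        last_item_batch[item] = idx

    return batches

# Example: interactions sharing a user or item land in different batches.
events = [(0, 10, 1.0), (1, 11, 2.0), (0, 11, 3.0), (2, 10, 4.0)]
for i, batch in enumerate(build_t_batches(events)):
    print(f"t-batch {i}: {batch}")
```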
