Deep multi-task mining Calabi-Yau four-folds

08/04/2021
by Harold Erbin, et al.

We continue earlier efforts in computing the dimensions of tangent space cohomologies of Calabi-Yau manifolds using deep learning. In this paper, we consider the dataset of all Calabi-Yau four-folds constructed as complete intersections in products of projective spaces. Employing neural networks inspired by state-of-the-art computer vision architectures, we improve earlier benchmarks and demonstrate that all four non-trivial Hodge numbers can be learned at the same time using a multi-task architecture. With a 30% (80%) training ratio, we reach an accuracy of 100% for h^(1,1) and 97% for h^(2,1) (100% for both), 81% (96%) for h^(3,1), and 49% (83%) for h^(2,2). Assuming that the Euler number is known, as it is easy to compute, and taking into account the linear constraint arising from index computations, we get 100% total accuracy.
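
The "multi-task architecture" mentioned above is a single network with a shared trunk and one output head per Hodge number, trained on all four targets simultaneously. Below is a minimal, illustrative PyTorch sketch of that idea, not the authors' actual model: the padded input shape, layer sizes, head names, and uniform loss weights are assumptions made for this example.

import torch
import torch.nn as nn

class MultiTaskHodgeNet(nn.Module):
    """Shared convolutional trunk with one regression head per Hodge number.

    The input is a CICY configuration matrix, zero-padded to a fixed
    (max_rows, max_cols) shape and treated as a one-channel image.
    """
    def __init__(self, max_rows=16, max_cols=20):
        super().__init__()
        self.trunk = nn.Sequential(
            nn.Conv2d(1, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.Flatten(),
            nn.Linear(64 * max_rows * max_cols, 256), nn.ReLU(),
        )
        # One head per target: h^(1,1), h^(2,1), h^(3,1), h^(2,2).
        self.heads = nn.ModuleDict(
            {name: nn.Linear(256, 1) for name in ("h11", "h21", "h31", "h22")}
        )

    def forward(self, x):
        features = self.trunk(x)  # x: (batch, 1, max_rows, max_cols)
        return {name: head(features).squeeze(-1) for name, head in self.heads.items()}

def multi_task_loss(predictions, targets):
    # Uniform sum of per-task regression losses; a real model may weight tasks differently.
    mse = nn.MSELoss()
    return sum(mse(predictions[k], targets[k].float()) for k in predictions)

The "linear constraint arising from index computations" refers to the standard Calabi-Yau four-fold relation h^(2,2) = 2(22 + 2 h^(1,1) + 2 h^(3,1) - h^(2,1)), which, together with the Euler number chi = 4 + 2 h^(1,1) - 4 h^(2,1) + 2 h^(3,1) + h^(2,2), can be used to cross-check and complete the predicted quadruple of Hodge numbers.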


