
Component-based regularisation of multivariate generalised linear mixed models
We address the component-based regularisation of a multivariate Generali...

Cadre Modeling: Simultaneously Discovering Subpopulations and Predictive Models
We consider the problem in regression analysis of identifying subpopulat...

Extension to mixed models of the Supervised Component-based Generalised Linear Regression
We address the component-based regularisation of a multivariate Generali...

Linear Regression with Limited Observation
We consider the most common variants of linear regression, including Rid...

Tübingen-Oslo system: Linear regression works the best at Predicting Current and Future Psychological Health from Childhood Essays in the CLPsych 2018 Shared Task
This paper describes our efforts in predicting current and future psycho...

Ensemble learning reveals dissimilarity between rare-earth transition metal binary alloys with respect to the Curie temperature
We propose a data-driven method to extract dissimilarity between materia...
Learning to extrapolate using continued fractions: Predicting the critical temperature of superconductor materials
In Artificial Intelligence we often seek to identify an unknown target function of many variables y = f(𝐱) given a limited set of instances S = {(𝐱^(i), y^(i))} with 𝐱^(i) ∈ D, where D is a domain of interest. We refer to S as the training set, and the final quest is to identify the mathematical model that approximates this target function for new 𝐱 in a set T = {𝐱^(j)} ⊂ D with T ≠ S (i.e. testing the model's generalisation). However, for some applications, the main interest is approximating the unknown function well on a larger domain D' that contains D. In cases involving the design of new structures, for instance, we may be interested in maximising f; the model derived from S alone should then also generalise well in D' for samples with values of y larger than the largest observed in S. In that sense, the AI system would provide important information that could guide the design process, e.g., using the learned model as a surrogate function to design new lab experiments. We introduce a method for multivariate regression based on iterative fitting of a continued fraction that incorporates additive spline models. We compared it with established methods such as AdaBoost, Kernel Ridge, Linear Regression, Lasso Lars, Linear Support Vector Regression, Multi-Layer Perceptrons, Random Forests, Stochastic Gradient Descent and XGBoost. We tested the performance on the important problem of predicting the critical temperature of superconductors based on physical-chemical characteristics.
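The extrapolation setting the abstract describes, training on a domain D but evaluating on a larger D' that contains it, can be illustrated with a minimal sketch. The data, domain bounds, and baseline models below are hypothetical (scikit-learn baselines from the paper's comparison list, not the continued-fraction method itself); the point is only to show why models that interpolate well can fail outside D:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.ensemble import RandomForestRegressor

# Hypothetical 1-D target: train on D = [0, 1], test extrapolation on D' \ D = [1, 2].
rng = np.random.default_rng(0)
X_train = rng.uniform(0.0, 1.0, size=(200, 1))
y_train = 2.0 * X_train[:, 0] + 1.0  # noiseless linear target, for clarity

X_extra = np.linspace(1.0, 2.0, 50).reshape(-1, 1)  # entirely outside the training domain
y_extra = 2.0 * X_extra[:, 0] + 1.0

lin = LinearRegression().fit(X_train, y_train)
rf = RandomForestRegressor(n_estimators=50, random_state=0).fit(X_train, y_train)

# Mean squared error on the extrapolation region only.
lin_err = np.mean((lin.predict(X_extra) - y_extra) ** 2)
rf_err = np.mean((rf.predict(X_extra) - y_extra) ** 2)

# The tree ensemble predicts an (almost) constant value outside D, so its
# extrapolation error is far larger than the linear model's on this target.
```

On targets whose values in D' exceed everything observed in S, piecewise-constant learners such as random forests plateau at the training maximum, which is exactly the failure mode the proposed continued-fraction approach aims to address.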