
On Random Subsampling of Gaussian Process Regression: A Graphon-Based Analysis
In this paper, we study random subsampling of Gaussian process regressio...

Einconv: Exploring Unexplored Tensor Decompositions for Convolutional Neural Networks
Tensor decomposition methods are one of the primary approaches for model...

Data Interpolating Prediction: Alternative Interpretation of Mixup
Data augmentation by mixing samples, such as Mixup, has widely been used...

An Optimality Proof for the PairDiff operator for Representing Relations between Words
Representing the semantic relations that exist between two given words (...

On Tensor Train Rank Minimization: Statistical Efficiency and Scalable Algorithm
Tensor train (TT) decomposition provides a space-efficient representatio...

Minimizing Quadratic Functions in Constant Time
A sampling-based optimization method for quadratic functions is proposed...

Making Tree Ensembles Interpretable: A Bayesian Model Selection Approach
Tree ensembles, such as random forests and boosted trees, are renowned f...

Making Tree Ensembles Interpretable
Tree ensembles, such as random forests and boosted trees, are renowned fo...

A Tractable Fully Bayesian Method for the Stochastic Block Model
The stochastic block model (SBM) is a generative model revealing macrosc...

Bayesian Masking: Sparse Bayesian Estimation with Weaker Shrinkage Bias
A common strategy for sparse linear regression is to introduce regulariz...

Doubly Decomposing Nonparametric Tensor Regression
A nonparametric extension of tensor regression is proposed. Nonlinearity i...

Rebuilding Factorized Information Criterion: Asymptotically Accurate Marginal Likelihood
The factorized information criterion (FIC) is a recently developed approxima...

Factorized Asymptotic Bayesian Hidden Markov Models
This paper addresses the issue of model selection for hidden Markov mode...

Estimation of low-rank tensors via convex optimization
In this paper, we propose three approaches for the estimation of the Tuc...

Think Globally, Embed Locally - Locally Linear Meta-embedding of Words
Distributed word embeddings have shown superior performances in numerous...
Kohei Hayashi
Machine Learning researcher at AI Research Center, National Institute of Advanced Industrial Science and Technology (AIST)