
Theory of Deep Convolutional Neural Networks III: Approximating Radial Functions
We consider a family of deep neural networks consisting of two groups of...

Universal Consistency of Deep Convolutional Neural Networks
Compared with avid research activities of deep convolutional neural netw...

Robust Kernel-based Distribution Regression
Regularization schemes for regression have been widely studied in learni...

Theory of Deep Convolutional Neural Networks II: Spherical Analysis
Deep learning based on deep neural networks of various structures and ar...

Depth Selection for Deep ReLU Nets in Feature Extraction and Generalization
Deep learning is recognized to be capable of discovering deep features f...

Distributed Kernel Ridge Regression with Communications
This paper focuses on generalization performance analysis for distribute...

Realization of spatial sparseness by deep ReLU nets with massive data
The great success of deep learning poses urgent challenges for understan...

Towards Understanding the Spectral Bias of Deep Learning
An intriguing phenomenon observed during training neural networks is the...

Fast Polynomial Kernel Classification for Massive Data
In the era of big data, it is highly desired to develop efficient machin...

Distributed filtered hyperinterpolation for noisy data on the sphere
Problems in astrophysics, space weather research and geophysics usually ...

Deep Neural Networks for Rotation-Invariance Approximation and Learning
Based on the tree architecture, the objective of this paper is to design...

Universality of Deep Convolutional Neural Networks
Deep learning has been widely applied and brought breakthroughs in speec...

Construction of neural networks for realization of localized deep learning
The subject of deep learning has recently attracted users of machine lea...

Convergence of Online Mirror Descent Algorithms
In this paper we consider online mirror descent (OMD) algorithms, a clas...

Total stability of kernel methods
Regularized empirical risk minimization using kernels and their correspo...

Distributed learning with regularized least squares
We study distributed learning with the least squares regularization sche...

On the Robustness of Regularized Pairwise Learning Methods Based on Kernels
Regularized empirical risk minimization including support vector machine...

Iterative Regularization for Learning with Convex Loss Functions
We consider the problem of supervised learning with convex loss function...

Minimax Optimal Rates of Estimation in High Dimensional Additive Models: Universal Phase Transition
We establish minimax optimal rates of convergence for estimation in a hi...

Unregularized Online Learning Algorithms with General Loss Functions
In this paper, we consider unregularized online learning algorithms in a...

Online Pairwise Learning Algorithms with Kernels
Pairwise learning usually refers to a learning task which involves a los...

Consistency Analysis of an Empirical Minimum Error Entropy Algorithm
In this paper we study the consistency of an empirical minimum error ent...

Learning rates for the risk of kernel based quantile regression estimators in additive models
Additive models play an important role in semiparametric statistics. Thi...

Learning Theory Approach to Minimum Error Entropy Criterion
We consider the minimum error entropy (MEE) criterion and an empirical r...
Ding-Xuan Zhou
Chair Professor at City University of Hong Kong, where he joined as Research Assistant Professor in 1996. Recipient of the Joint Research Fund for Hong Kong and Macau Young Scholars under the National Science Fund for Distinguished Young Scholars in 2005, and of a Humboldt Research Fellowship in 1993.