Learning Deep Representation Without Parameter Inference for Nonlinear Dimensionality Reduction

08/22/2013
by Xiao-Lei Zhang, et al.

Unsupervised deep learning is one of the most powerful representation learning techniques. Restricted Boltzmann machines, sparse coding, regularized auto-encoders, and convolutional neural networks are pioneering building blocks of deep learning. In this paper, we propose a new building block -- distributed random models. The proposed method is a special full implementation of the product of experts: (i) each expert owns multiple hidden units, and different experts have different numbers of hidden units; (ii) the model of each expert is a k-centers clustering, whose k centers are merely uniformly sampled examples and whose output (i.e., the hidden units) is a sparse code in which only the similarity values from a few nearest neighbors are retained. The relationships between the pioneering building blocks, several notable research branches, and the proposed method are analyzed. Experimental results show that the proposed deep model learns better representations than deep belief networks and, at the same time, can train a much larger network in much less time than deep belief networks.
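To make the building block concrete, below is a minimal sketch of one such expert and of a layer built from several experts, written in Python/NumPy. It is an illustration under stated assumptions, not the authors' reference implementation: the names random_expert and random_layer are hypothetical, and an RBF similarity is assumed only because the abstract does not fix the similarity measure.

```python
import numpy as np

def random_expert(X, k, a, rng=None):
    """One expert (hedged sketch): its k centers are k examples sampled
    uniformly at random from X, and its output for each example is a sparse
    code that keeps only the similarities to the a nearest centers."""
    rng = np.random.default_rng(rng)
    n = X.shape[0]
    centers = X[rng.choice(n, size=k, replace=False)]  # uniformly sampled examples

    # Similarity of every example to every center; an RBF kernel is assumed
    # here purely for illustration.
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    sim = np.exp(-d2 / d2.mean())

    # Sparsify: keep the a largest similarities per example, zero the rest.
    code = np.zeros_like(sim)
    top = np.argpartition(-sim, a - 1, axis=1)[:, :a]
    rows = np.arange(n)[:, None]
    code[rows, top] = sim[rows, top]
    return code

def random_layer(X, ks, a, rng=None):
    """A layer is the concatenation of many experts, each with its own
    (possibly different) number of centers; no parameters are inferred."""
    rng = np.random.default_rng(rng)
    return np.hstack([random_expert(X, k, a, rng) for k in ks])

if __name__ == "__main__":
    X = np.random.randn(200, 30)
    H1 = random_layer(X, ks=[20, 40, 60], a=3)   # first hidden layer
    H2 = random_layer(H1, ks=[20, 40, 60], a=3)  # stacking repeats the procedure
    print(H1.shape, H2.shape)                    # (200, 120) (200, 120)
```

Stacking such layers yields a deep representation without any parameter inference, which is what allows much larger networks to be built in much less time than training a deep belief network.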
