
Learning to Combine Top-Down and Bottom-Up Signals in Recurrent Neural Networks with Attention over Modules
Robust perception relies on both bottom-up and top-down signals. Bottom...

Object Files and Schemata: Factorizing Declarative and Procedural Knowledge in Dynamical Systems
Modeling a structured, dynamic environment like a video game requires ke...

Jigsaw-VAE: Towards Balancing Features in Variational Autoencoders
The latent variables learned by VAEs have seen considerable interest as ...

KaoKore: A Pre-modern Japanese Art Facial Expression Dataset
From classifying handwritten digits to generating strings of text, the d...

SketchTransfer: A Challenging New Task for Exploring Detail-Invariance and the Abstractions Learned by Deep Networks
Deep networks have achieved excellent results in perceptual tasks, yet t...

KuroNet: Pre-Modern Japanese Kuzushiji Character Recognition with Deep Learning
Kuzushiji, a cursive writing style, has been used in Japan for over a th...

GraphMix: Regularized Training of Graph Neural Networks for Semi-Supervised Learning
We present GraphMix, a regularization technique for Graph Neural Network...

Recurrent Independent Mechanisms
Learning modular structures which reflect the dynamics of the environmen...

Interpolated Adversarial Training: Achieving Robust Neural Networks without Sacrificing Accuracy
Adversarial robustness has become a central goal in deep learning, both ...

State-Reification Networks: Improving Generalization by Modeling the Distribution of Hidden Representations
Machine learning promises methods that generalize well from finite label...

Interpolation Consistency Training for Semi-Supervised Learning
We introduce Interpolation Consistency Training (ICT), a simple and comp...

Adversarial Mixup Resynthesizers
In this paper, we explore new approaches to combining information encode...

Deep Learning for Classical Japanese Literature
Much of machine learning research focuses on producing models which perf...

Manifold Mixup: Encouraging Meaningful On-Manifold Interpolation as a Regularizer
Deep networks often perform well on the data manifold on which they are ...

Fortified Networks: Improving the Robustness of Deep Networks by Modeling the Manifold of Hidden Representations
Deep networks have achieved impressive results across a variety of impor...

GibbsNet: Iterative Adversarial Inference for Deep Graphical Models
Directed latent variable models that formulate the joint distribution as...

ACtuAL: Actor-Critic Under Adversarial Learning
Generative Adversarial Networks (GANs) are a powerful framework for deep...

Professor Forcing: A New Algorithm for Training Recurrent Networks
The Teacher Forcing algorithm trains recurrent networks by supplying obs...

Adversarially Learned Inference
We introduce the adversarially learned inference (ALI) model, which join...

Theano: A Python framework for fast computation of mathematical expressions
Theano is a Python library that allows one to define, optimize, and evaluate...

Discriminative Regularization for Generative Models
We explore the question of whether the representations learned by classi...
read it

Variance Reduction in SGD by Distributed Importance Sampling
Humans are able to accelerate their learning by selecting training mater...
Alex Lamb