
-
Word meaning in minds and machines
Machines show an increasingly broad set of linguistic competencies, than...
-
Self-supervised learning through the eyes of a child
Within months of birth, children have meaningful expectations about the ...
-
Learning Task-General Representations with Generative Neuro-Symbolic Modeling
A hallmark of human intelligence is the ability to interact directly wit...
-
Generating new concepts with hybrid neuro-symbolic models
Human conceptual knowledge supports the ability to generate novel yet hi...
-
Learning word-referent mappings and concepts from raw inputs
How do children learn correspondences between the language and the world...
-
Learning Compositional Rules via Neural Program Synthesis
Many aspects of human reasoning, including language, require learning ru...
-
A Benchmark for Systematic Generalization in Grounded Language Understanding
Human language users easily interpret expressions that describe unfamili...
-
Investigating Simple Object Representations in Model-Free Deep Reinforcement Learning
We explore the benefits of augmenting state-of-the-art model-free deep r...
-
Modeling question asking using neural program generation
People ask questions that are far richer, more informative, and more cre...
-
Mutual exclusivity as a challenge for neural networks
Strong inductive biases allow children to learn in fast and adaptable wa...
-
Improving the robustness of ImageNet classifiers using elements of human visual cognition
We investigate the robustness properties of image recognition models equ...
-
Compositional generalization through meta sequence-to-sequence learning
People can learn a new concept and use it compositionally, understanding...
-
People infer recursive visual concepts from just a few examples
Machine learning has made major advances in categorizing objects in imag...
-
Learning a smooth kernel regularizer for convolutional neural networks
Modern deep neural networks require a tremendous amount of data to train...
-
The Omniglot Challenge: A 3-Year Progress Report
Three years ago, we released the Omniglot dataset for developing more hu...
-
Human few-shot learning of compositional instructions
People learn in fast and flexible ways that have not been emulated by ma...
-
Rearranging the Familiar: Testing Compositional Generalization in Recurrent Networks
Systematic compositionality is the ability to recombine meaningful units...
-
Learning Inductive Biases with Simple Neural Networks
People use rich prior knowledge about the world in order to efficiently ...
-
Question Asking as Program Generation
A hallmark of human intelligence is the ability to ask rich, creative, a...
-
Still not systematic after all these years: On the compositional skills of sequence-to-sequence recurrent networks
Humans can understand and produce new utterances effortlessly, thanks to...
-
Building Machines That Learn and Think Like People
Recent progress in artificial intelligence (AI) has renewed interest in ...