Training Frankenstein's Creature to Stack: HyperTree Architecture Search

by Andrew Hundt, et al.

We propose HyperTrees for the low-cost automatic design of multiple-input neural network models. Much as Dr. Frankenstein's creature was assembled from pieces before coming to life in the eponymous book, HyperTrees combine parts of other architectures to optimize for a new problem domain. We compare HyperTrees to rENAS, our extension of Efficient Neural Architecture Search (ENAS). To evaluate these architectures we introduce the CoSTAR Block Stacking Dataset for the benchmarking of neural network models. We utilize 5.1 cm colored blocks and introduce complexity with a stacking task, a bin providing wall obstacles, dramatic lighting variation, and object ambiguity in the depth space. We demonstrate HyperTrees and rENAS on this dataset by predicting full 3D poses semantically for the purpose of grasping and placing specific objects. Inputs to the network include RGB images, the current gripper pose, and the action to take. Predictions with our best model are accurate to within 3.3 cm and 12.6 degrees. The dataset contains more than 10,000 stacking attempts and 1 million frames of real data. Code and dataset instructions are available at .
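The core idea in the abstract can be illustrated with a toy sketch: a search space of reusable "parts," a sampler that assembles them into a small tree with one branch per input (here just image features and gripper pose), and a cheap random-search loop that keeps the best candidate. All names, dimensions, and the objective below are illustrative stand-ins, not the paper's actual search space or models.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy search space: each "part" is a simple vector transform standing in
# for a network block borrowed from some existing architecture.
PARTS = {
    "identity": lambda v: v,
    "relu": lambda v: np.maximum(v, 0.0),
    "square": lambda v: v * v,
    "negate": lambda v: -v,
}

def sample_tree():
    """Sample one part per input branch plus one for the fused trunk."""
    names = list(PARTS)
    return {
        "image": rng.choice(names),   # branch for image features
        "pose": rng.choice(names),    # branch for gripper pose
        "trunk": rng.choice(names),   # applied after fusion
    }

def forward(tree, image_feat, pose_feat):
    """Run each branch, fuse by concatenation, then apply the trunk."""
    a = PARTS[tree["image"]](image_feat)
    b = PARTS[tree["pose"]](pose_feat)
    return PARTS[tree["trunk"]](np.concatenate([a, b]))

# Cheap evaluation loop: score sampled candidates and keep the best.
image_feat = rng.standard_normal(8)   # pretend CNN features of an RGB image
pose_feat = rng.standard_normal(4)    # pretend gripper pose encoding
target = np.concatenate([image_feat, pose_feat])  # toy regression target

candidates = [sample_tree() for _ in range(50)]
best = min(
    candidates,
    key=lambda t: np.linalg.norm(forward(t, image_feat, pose_feat) - target),
)
print(forward(best, image_feat, pose_feat).shape)  # (12,)
```

In the paper the candidates are real multi-input networks scored by validation error on the stacking dataset; the sketch only shows the combine-sample-evaluate structure of such a search.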



