Learning Perceptual Locomotion on Uneven Terrains using Sparse Visual Observations

09/28/2021
by   Fernando Acero, et al.

Legged robots have achieved remarkable performance in blind walking using either model-based control or data-driven deep reinforcement learning. To proactively navigate and traverse various terrains, however, active use of visual perception becomes indispensable. This work exploits sparse visual observations to achieve perceptual locomotion over a range of commonly seen bumps, ramps, and stairs in human-centred environments. We first formulate the selection of a minimal visual input that can represent the uneven surfaces of interest, and propose a learning framework that integrates such exteroceptive data with proprioceptive data. We specifically select state observations and design a training curriculum to learn feedback control policies more effectively over a range of different terrains. Using an extensive benchmark, we validate the learned policy in tasks that require omnidirectional walking over flat ground and forward locomotion over terrains with obstacles, showing a high success rate of traversal. In particular, the robot performs autonomous perceptual locomotion with minimal visual perception using depth measurements, which are easily available from a Lidar or RGB-D sensor, and successfully demonstrates robust ascent and descent over high stairs of 20 cm step height.
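The observation design described above, combining a small number of depth measurements with proprioceptive state before feeding a feedback policy, can be illustrated with a minimal sketch. All dimensions and names here (number of depth probes, joint counts, the toy two-layer policy) are illustrative assumptions, not the paper's actual architecture:

```python
import numpy as np

rng = np.random.default_rng(0)

def build_observation(depth_samples, joint_pos, joint_vel, base_ang_vel, gravity_vec):
    """Concatenate sparse exteroceptive depth samples with proprioception
    into a single flat observation vector for the policy."""
    return np.concatenate([depth_samples, joint_pos, joint_vel,
                           base_ang_vel, gravity_vec])

# Hypothetical dimensions: a 12-joint quadruped and 8 sparse depth probes
# (e.g. ray distances to the terrain from a Lidar or RGB-D sensor).
N_DEPTH = 8
obs = build_observation(
    depth_samples=rng.uniform(0.0, 0.5, N_DEPTH),  # metres to terrain
    joint_pos=np.zeros(12),
    joint_vel=np.zeros(12),
    base_ang_vel=np.zeros(3),
    gravity_vec=np.array([0.0, 0.0, -1.0]),       # gravity in body frame
)

# A toy feedforward policy mapping the observation to 12 joint commands;
# in practice this network would be trained with deep RL over a terrain
# curriculum rather than initialised randomly.
W1 = rng.standard_normal((64, obs.size)) * 0.1
W2 = rng.standard_normal((12, 64)) * 0.1
action = W2 @ np.tanh(W1 @ obs)
print(obs.size, action.shape)
```

The point of the sketch is that the exteroceptive input stays deliberately small (8 scalars here, versus thousands of pixels for a full depth image), which keeps the policy input compact and the sim-to-real gap of the visual channel manageable.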


Related research

03/26/2021 - Reinforcement Learning for Robust Parameterized Locomotion Control of Bipedal Robots
Developing robust walking controllers for bipedal robots is a challengin...

09/09/2021 - Learning Vision-Guided Dynamic Locomotion Over Challenging Terrains
Legged robots are becoming increasingly powerful and popular in recent y...

03/13/2018 - Coregionalised Locomotion Envelopes - A Qualitative Approach
'Sharing of statistical strength' is a phrase often employed in machine ...

05/25/2018 - Heuristic Planning for Rough Terrain Locomotion in Presence of External Disturbances and Variable Perception Quality
The quality of the visual feedback can vary significantly on a legged ro...

04/27/2020 - Learning for Microrobot Exploration: Model-based Locomotion, Sparse-robust Navigation, and Low-power Deep Classification
Building intelligent autonomous systems at any scale is challenging. The...

06/10/2019 - Data Efficient and Safe Learning for Locomotion via Simplified Model
In this letter, we formulate a novel Markov Decision Process (MDP) for d...

01/20/2022 - Learning robust perceptive locomotion for quadrupedal robots in the wild
Legged robots that can operate autonomously in remote and hazardous envi...
