Multi-Modal Legged Locomotion Framework with Automated Residual Reinforcement Learning

02/24/2022
by Chen Yu, et al.

While quadruped robots usually have good stability and load capacity, bipedal robots offer greater flexibility and adaptability to different tasks and environments. A multi-modal legged robot can combine the best of both worlds. In this paper, we propose a multi-modal locomotion framework that is composed of a hand-crafted transition motion and a learning-based bipedal controller trained with a novel algorithm called Automated Residual Reinforcement Learning. This framework aims to endow arbitrary quadruped robots with the ability to walk bipedally. In particular, we 1) design an additional supporting structure for a quadruped robot and a sequential multi-modal transition strategy; 2) propose a novel class of Reinforcement Learning algorithms for bipedal control and evaluate their performance in both simulation and the real world. Experimental results show that our proposed algorithms achieve the best performance in simulation and maintain good performance on a real-world robot. Overall, our multi-modal robot can successfully switch between bipedal and quadrupedal modes and walk in both. Experiment videos and code are available at https://chenaah.github.io/multimodal/.
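In residual reinforcement learning, the policy does not output joint commands directly; it learns a correction added on top of a fixed, hand-crafted base controller. The sketch below illustrates this general idea only; it is not the paper's Automated Residual RL algorithm, and names such as `ResidualWrapper`, `base_controller`, and `residual_scale` are hypothetical.

```python
import numpy as np

class ResidualWrapper:
    """Illustrative residual-RL wrapper: the learned policy only supplies a
    bounded residual on top of a hand-crafted base controller (e.g. a PD
    standing/walking controller), so learning starts from a reasonable gait."""

    def __init__(self, env, base_controller, residual_scale=0.1):
        self.env = env                          # gym-style environment (assumed interface)
        self.base_controller = base_controller  # callable: observation -> base action
        self.residual_scale = residual_scale    # limits how far the policy can deviate
        self.observation_space = env.observation_space
        self.action_space = env.action_space
        self._last_obs = None

    def reset(self):
        self._last_obs = self.env.reset()
        return self._last_obs

    def step(self, residual_action):
        # Final command = hand-crafted action + scaled learned residual.
        base_action = self.base_controller(self._last_obs)
        action = base_action + self.residual_scale * np.asarray(residual_action)
        obs, reward, done, info = self.env.step(action)
        self._last_obs = obs
        return obs, reward, done, info
```

Any off-the-shelf RL algorithm can then be trained on the wrapped environment, treating the residual as its action; the base controller keeps early exploration from destabilizing the robot.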


