SurfelGAN: Synthesizing Realistic Sensor Data for Autonomous Driving

by Zhenpei Yang, et al.

Autonomous driving system development is critically dependent on the ability to replay complex and diverse traffic scenarios in simulation. In such scenarios, the ability to accurately simulate the vehicle sensors such as cameras, lidar or radar is essential. However, current sensor simulators leverage gaming engines such as Unreal or Unity, requiring manual creation of environments, objects and material properties. Such approaches have limited scalability and fail to produce realistic approximations of camera, lidar, and radar data without significant additional work. In this paper, we present a simple yet effective approach to generate realistic scenario sensor data, based only on a limited amount of lidar and camera data collected by an autonomous vehicle. Our approach uses texture-mapped surfels to efficiently reconstruct the scene from an initial vehicle pass or set of passes, preserving rich information about object 3D geometry and appearance, as well as the scene conditions. We then leverage a SurfelGAN network to reconstruct realistic camera images for novel positions and orientations of the self-driving vehicle and moving objects in the scene. We demonstrate our approach on the Waymo Open Dataset and show that it can synthesize realistic camera data for simulated scenarios. We also create a novel dataset that contains cases in which two self-driving vehicles observe the same scene at the same time. We use this dataset to provide additional evaluation and demonstrate the usefulness of our SurfelGAN model.
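The texture-mapped surfel representation at the heart of this pipeline can be pictured as a cloud of small colored disks splatted into a camera image. The following is a minimal, illustrative z-buffered surfel splatter in Python, not the paper's actual renderer; all names (`Surfel`, `render_surfels`, `focal`) and the simple pinhole/flat-disk assumptions are hypothetical simplifications of the multi-resolution, texture-mapped surfels the authors describe:

```python
# Illustrative sketch only: splat colored surfel disks into an image with a
# z-buffer, assuming a pinhole camera at the origin looking down +z.
import math
from dataclasses import dataclass

@dataclass
class Surfel:
    x: float; y: float; z: float   # disk center in camera coordinates (m)
    radius: float                  # disk radius (m)
    color: tuple                   # RGB texture sample, 0-255

def render_surfels(surfels, width=64, height=64, focal=64.0):
    """Project each surfel into the image; nearer surfels occlude farther ones."""
    image = [[(0, 0, 0)] * width for _ in range(height)]
    zbuf = [[math.inf] * width for _ in range(height)]
    cx, cy = width / 2, height / 2
    for s in surfels:
        if s.z <= 0:                                 # behind the camera
            continue
        u = int(focal * s.x / s.z + cx)              # pinhole projection
        v = int(focal * s.y / s.z + cy)
        r = max(1, int(focal * s.radius / s.z))      # projected radius (px)
        for dv in range(-r, r + 1):
            for du in range(-r, r + 1):
                if du * du + dv * dv > r * r:        # stay inside the disk
                    continue
                px, py = u + du, v + dv
                if 0 <= px < width and 0 <= py < height and s.z < zbuf[py][px]:
                    zbuf[py][px] = s.z               # depth test passed
                    image[py][px] = s.color
    return image

# A near red surfel should occlude a far green one at the image center.
img = render_surfels([Surfel(0, 0, 4.0, 0.2, (0, 255, 0)),
                      Surfel(0, 0, 2.0, 0.2, (255, 0, 0))])
```

In the paper's pipeline, a coarse rendering like this (produced for a novel vehicle pose) is the *input* to the SurfelGAN network, which translates it into a photorealistic camera image.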



Customized Co-Simulation Environment for Autonomous Driving Algorithm Development and Evaluation

Increasing the implemented SAE level of autonomy in road vehicles requir...

GeoSim: Realistic Video Simulation via Geometry-Aware Composition for Self-Driving

Scalable sensor simulation is an important yet challenging open problem ...

UniSim: A Neural Closed-Loop Sensor Simulator

Rigorously testing autonomy systems is essential for making safe self-dr...

GINA-3D: Learning to Generate Implicit Neural Assets in the Wild

Modeling the 3D world from sensor data for simulation is a scalable way ...

Enhancing an eco-driving gamification platform through wearable and vehicle sensor data integration

As road transportation has been identified as a major contributor to env...

Quantity over Quality: Training an AV Motion Planner with Large Scale Commodity Vision Data

With the Autonomous Vehicle (AV) industry shifting towards Autonomy 2.0,...

LiDARsim: Realistic LiDAR Simulation by Leveraging the Real World

We tackle the problem of producing realistic simulations of LiDAR point ...
