DELTAR: Depth Estimation from a Light-weight ToF Sensor and RGB Image

09/27/2022
by Yijin Li, et al.

Light-weight time-of-flight (ToF) depth sensors are small, cheap, and low-energy, and have been deployed at scale on mobile devices for purposes such as autofocus and obstacle detection. However, because of their specific measurement model (a depth distribution over a region rather than a depth value at each pixel) and their extremely low resolution, they are insufficient for applications requiring high-fidelity depth, such as 3D reconstruction. In this paper, we propose DELTAR, a novel method that empowers light-weight ToF sensors to measure high-resolution, accurate depth by cooperating with a color image. At the core of DELTAR, a feature extractor customized for depth distributions and an attention-based neural architecture are proposed to efficiently fuse information from the color and ToF domains. To evaluate our system in real-world scenarios, we design a data collection device and propose a new approach to calibrate the RGB camera and ToF sensor. Experiments show that our method produces more accurate depth than existing frameworks designed for depth completion and depth super-resolution, and achieves on-par performance with a commodity-level RGB-D sensor. Code and data are available at https://zju3dv.github.io/deltar/.
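The abstract describes fusing per-zone ToF depth-distribution features with per-pixel color features through an attention mechanism. As a rough illustration only (a minimal sketch, not the authors' architecture), cross-attention lets each image-feature query aggregate information from the small set of ToF zone features; all names and shapes below are hypothetical:

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of floats."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def cross_attention(queries, keys, values):
    """Scaled dot-product cross-attention (single head, no projections).

    queries: image-domain feature vectors, one per pixel/patch
    keys, values: ToF-domain feature vectors, one per sensor zone
    Returns one fused vector per query, a convex combination of the
    zone values weighted by query-key similarity.
    """
    d = len(queries[0])
    fused_all = []
    for q in queries:
        # Similarity of this image feature to every ToF zone feature.
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d)
                  for k in keys]
        weights = softmax(scores)
        # Weighted sum of zone value vectors.
        fused = [sum(w * v[j] for w, v in zip(weights, values))
                 for j in range(d)]
        fused_all.append(fused)
    return fused_all
```

In a real network the queries, keys, and values would be learned linear projections of CNN feature maps, with multiple heads and layers; this sketch only shows why attention suits the setting, since each pixel can softly select which of the few low-resolution ToF zones is relevant to it.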


Related research:

08/01/2017  Depth Super-Resolution Meets Uncalibrated Photometric Stereo
A novel depth super-resolution approach for RGB-D sensors is presented. ...

07/09/2019  3D pavement surface reconstruction using an RGB-D sensor
A core procedure of pavement management systems is data collection. The ...

08/28/2023  Multi-Modal Neural Radiance Field for Monocular Dense SLAM with a Light-Weight ToF Sensor
Light-weight time-of-flight (ToF) depth sensors are compact and cost-eff...

03/30/2021  Mask-ToF: Learning Microlens Masks for Flying Pixel Correction in Time-of-Flight Imaging
We introduce Mask-ToF, a method to reduce flying pixels (FP) in time-of-...

11/16/2022  Consistent Direct Time-of-Flight Video Depth Super-Resolution
Direct time-of-flight (dToF) sensors are promising for next-generation o...

12/07/2021  Wild ToFu: Improving Range and Quality of Indirect Time-of-Flight Depth with RGB Fusion in Challenging Environments
Indirect Time-of-Flight (I-ToF) imaging is a widespread way of depth est...

08/07/2023  High-Throughput and Accurate 3D Scanning of Cattle Using Time-of-Flight Sensors and Deep Learning
We introduce a high throughput 3D scanning solution specifically designe...
