Rotation-constrained optical see-through headset calibration with bare-hand alignment

08/24/2021
by Xue Hu, et al.

The inaccessibility of the user's perceived reality remains an open issue in the accurate calibration of optical see-through (OST) head-mounted displays (HMDs). Manual user alignment is usually required to collect a set of virtual-to-real correspondences, so that a default or offline display calibration can be updated to account for the user's eye position(s). Current alignment-based calibration procedures typically require point-wise alignments between rendered image point(s) and the associated physical landmark(s) of a target calibration tool. Because each alignment provides only one or a few correspondences, repeated alignments are needed to ensure calibration quality. This work presents an accurate and tool-less online OST calibration method that updates an offline-calibrated eye-display model. The user's bare hand is tracked, without markers, by a commercial RGBD camera anchored to the OST headset to generate a user-specific cursor for correspondence collection. The required alignment is object-wise and can provide thousands of unordered corresponding points in tracked space. The collected correspondences are registered by the proposed rotation-constrained iterative closest point (rcICP) method to optimise the viewpoint-related calibration parameters. We implemented the method on the Microsoft HoloLens 1. The resilience of the procedure to noisy data was evaluated through simulated tests and real experiments performed with an eye-replacement camera. The simulation tests show that the rcICP registration is robust against possible user-induced rotational misalignment. With a single alignment, the method achieves 8.81 arcmin (1.37 mm) positional error and 1.76 degree rotational error in camera-based tests at arm-reach distance, and 10.79 arcmin (7.71 pixels) reprojection error in user tests.
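
The registration step named above is the rotation-constrained ICP (rcICP). The abstract does not spell out the constraint, so the sketch below is only one plausible reading: a standard point-to-point ICP in which each per-iteration rotation increment is clamped to a maximum angle, so that user-induced rotational misalignment cannot dominate the estimated transform. The function names (rc_icp, kabsch, clamp_rotation), the max_angle_deg parameter, and the NumPy/SciPy implementation are illustrative assumptions, not the paper's actual formulation.

```python
# Minimal rotation-constrained ICP sketch (assumed formulation, not the paper's).
import numpy as np
from scipy.spatial import cKDTree


def kabsch(src, dst):
    """Best-fit rigid transform (R, t) mapping src onto dst (both Nx3, row-ordered)."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))          # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    return R, dst_c - R @ src_c


def clamp_rotation(R, max_angle):
    """If R rotates by more than max_angle (radians), rebuild it about the same
    axis with the angle clamped to max_angle (Rodrigues' formula)."""
    angle = np.arccos(np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0))
    if angle <= max_angle or angle < 1e-9:
        return R
    axis = np.array([R[2, 1] - R[1, 2], R[0, 2] - R[2, 0], R[1, 0] - R[0, 1]])
    axis /= np.linalg.norm(axis)
    K = np.array([[0.0, -axis[2], axis[1]],
                  [axis[2], 0.0, -axis[0]],
                  [-axis[1], axis[0], 0.0]])
    return np.eye(3) + np.sin(max_angle) * K + (1.0 - np.cos(max_angle)) * (K @ K)


def rc_icp(source, target, max_angle_deg=5.0, iters=50, tol=1e-6):
    """Register unordered Nx3 source points onto Mx3 target points while keeping
    each rotation increment within max_angle_deg; returns the accumulated (R, t)."""
    max_angle = np.deg2rad(max_angle_deg)
    tree = cKDTree(target)                       # closest-point lookups
    cur = source.copy()
    R_tot, t_tot = np.eye(3), np.zeros(3)
    prev_err = np.inf
    for _ in range(iters):
        dist, idx = tree.query(cur)              # unordered correspondences
        matched = target[idx]
        R, _ = kabsch(cur, matched)              # unconstrained increment
        R = clamp_rotation(R, max_angle)         # enforce the rotation bound
        t = matched.mean(axis=0) - R @ cur.mean(axis=0)
        cur = cur @ R.T + t                      # apply the constrained increment
        R_tot, t_tot = R @ R_tot, R @ t_tot + t  # accumulate the total transform
        err = dist.mean()
        if abs(prev_err - err) < tol:            # converged: error stopped improving
            break
        prev_err = err
    return R_tot, t_tot
```

In this reading, the angular bound would be chosen from the rotational misalignment a user is expected to introduce during a bare-hand alignment; the paper's rcICP may instead constrain the accumulated rotation or build the constraint directly into the registration cost.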
