EgoCOL: Egocentric Camera pose estimation for Open-world 3D object Localization @Ego4D challenge 2023

06/29/2023
by Cristhian Forigua, et al.

We present EgoCOL, an egocentric camera pose estimation method for open-world 3D object localization. Our method leverages sparse camera pose reconstructions in a two-fold manner, per video and per scan independently, to estimate the camera pose of egocentric frames in 3D renders with high recall and precision. We extensively evaluate our method on the Visual Query (VQ) 3D object localization Ego4D benchmark. EgoCOL can estimate 62% and 59% more camera poses than the Ego4D baseline in the Ego4D Visual Queries 3D Localization challenge at CVPR 2023 on the val and test sets, respectively. Our code is publicly available at https://github.com/BCV-Uniandes/EgoCOL.
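The abstract only summarizes the idea of combining per-video and per-scan sparse reconstructions to register egocentric frames in the scan's coordinate frame. The sketch below is not the EgoCOL implementation; it only illustrates one plausible building block under assumed conventions: aligning a per-video reconstruction to the scan frame with a 7-DoF similarity transform (Umeyama) estimated from corresponding 3D points, then transferring world-to-camera poses into the scan frame. All function names, the correspondence source, and the pose convention (world-to-camera extrinsics) are assumptions for illustration.

```python
# Minimal sketch (assumed, not the official EgoCOL code): align a per-video
# sparse reconstruction to a scan's coordinate frame and transfer camera poses.
import numpy as np


def umeyama_alignment(src, dst):
    """Estimate (scale, R, t) such that dst_i ~= scale * R @ src_i + t.

    src, dst: (N, 3) arrays of corresponding 3D points, e.g. points
    triangulated in both the video and the scan reconstructions (assumed input).
    """
    mu_src, mu_dst = src.mean(axis=0), dst.mean(axis=0)
    src_c, dst_c = src - mu_src, dst - mu_dst
    cov = dst_c.T @ src_c / src.shape[0]           # 3x3 cross-covariance
    U, D, Vt = np.linalg.svd(cov)
    S = np.eye(3)
    if np.linalg.det(U) * np.linalg.det(Vt) < 0:   # guard against reflections
        S[2, 2] = -1.0
    R = U @ S @ Vt
    var_src = src_c.var(axis=0).sum()              # mean squared deviation of src
    scale = np.trace(np.diag(D) @ S) / var_src
    t = mu_dst - scale * R @ mu_src
    return scale, R, t


def transfer_camera_pose(scale, R, t, R_cam, t_cam):
    """Map a world-to-camera pose (R_cam, t_cam) from the video reconstruction
    frame into the scan frame after the similarity alignment."""
    C_video = -R_cam.T @ t_cam            # camera center in video coordinates
    C_scan = scale * R @ C_video + t      # camera center in scan coordinates
    R_new = R_cam @ R.T                   # rotation expressed in the scan frame
    t_new = -R_new @ C_scan
    return R_new, t_new


if __name__ == "__main__":
    # Synthetic sanity check: recover a known similarity transform.
    rng = np.random.default_rng(0)
    src = rng.normal(size=(100, 3))
    angle = np.pi / 6
    R_true = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                       [np.sin(angle),  np.cos(angle), 0.0],
                       [0.0, 0.0, 1.0]])
    t_true = np.array([0.5, -1.0, 3.0])
    dst = 2.0 * src @ R_true.T + t_true
    s, R, t = umeyama_alignment(src, dst)
    print(np.isclose(s, 2.0), np.allclose(R, R_true), np.allclose(t, t_true))
```

Once such an alignment is available, every camera registered in the per-video reconstruction can be expressed in the scan frame, which is one way a two-fold strategy like the one described above could recover poses for frames that direct frame-to-scan registration misses; the exact procedure used by EgoCOL is in the paper and repository, not in this sketch.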
