Accurate Bounding-box Regression with Distance-IoU Loss for Visual Tracking
Most existing tracking methods rely on a classifier combined with multi-scale estimation to estimate the target state. Consequently, and as expected, trackers have become more robust while tracking accuracy has stagnated. The ATOM <cit.> tracker mitigates this problem by adopting a maximum-overlap method based on an intersection-over-union (IoU) loss, but the IoU loss itself is flawed: when one bounding box is completely contained within another, the objective function can no longer be optimized, which makes it very challenging to accurately estimate the target state. Accordingly, in this paper, we address this problem by proposing a novel tracking method based on a distance-IoU (DIoU) loss. The proposed tracker consists of a target estimation component and a target classification component. The target estimation component is trained to predict the DIoU score between the ground-truth bounding box and the estimated bounding box. The DIoU loss retains the advantages of the IoU loss while additionally minimizing the distance between the center points of the two bounding boxes, thereby making target estimation more accurate. Moreover, we introduce a classification component that is trained online to guarantee real-time tracking speed. Comprehensive experimental results demonstrate that our DIoUTrack achieves competitive tracking accuracy compared with state-of-the-art trackers while running at over 50 fps.
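For reference, a minimal sketch of the DIoU loss as commonly defined in the detection literature is given below, assuming axis-aligned boxes in (x1, y1, x2, y2) format; this is an illustration of the loss itself, not the authors' implementation.

def diou_loss(box_a, box_b, eps=1e-9):
    """DIoU loss: L = 1 - IoU + rho^2 / c^2, where rho is the distance
    between box centers and c is the diagonal of the smallest box
    enclosing both. Boxes are (x1, y1, x2, y2) tuples."""
    # Intersection area
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)

    # Union area and IoU
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    iou = inter / (area_a + area_b - inter + eps)

    # Squared distance between box centers (rho^2)
    cax, cay = (box_a[0] + box_a[2]) / 2, (box_a[1] + box_a[3]) / 2
    cbx, cby = (box_b[0] + box_b[2]) / 2, (box_b[1] + box_b[3]) / 2
    rho2 = (cax - cbx) ** 2 + (cay - cby) ** 2

    # Squared diagonal of the smallest enclosing box (c^2)
    ex1, ey1 = min(box_a[0], box_b[0]), min(box_a[1], box_b[1])
    ex2, ey2 = max(box_a[2], box_b[2]), max(box_a[3], box_b[3])
    c2 = (ex2 - ex1) ** 2 + (ey2 - ey1) ** 2

    return 1.0 - iou + rho2 / (c2 + eps)

This sketch makes the failure mode discussed above concrete: when one box is fully contained in the other, the IoU term stays constant as the inner box shifts, but the center-distance penalty rho^2 / c^2 still changes, so the loss continues to provide a useful optimization signal.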