Learned Sorted Table Search and Static Indexes in Small Model Space

07/19/2021
by Domenico Amato et al.

Machine Learning techniques, properly combined with Data Structures, have given rise to Learned Static Indexes: innovative and powerful tools that speed up Binary Search at the cost of additional space, beyond the table being searched, devoted to the ML model. Although still in their infancy, they are methodologically and practically important, due to the pervasiveness of Sorted Table Search procedures. In modern applications, model space is a key factor, and a major open question in this area is to what extent one can enjoy the speed-up of Learned Indexes while using constant or nearly constant space for the model. We address it here by (a) introducing two new models, denoted KO-BFS and SY-RMI, respectively, and (b) systematically exploring, for the first time, the time-space trade-offs of a hierarchy of existing models, i.e., the ones in SOSD, together with the new ones. We document a novel and rather complex time-space trade-off picture, which is quite informative for users. We show experimentally that KO-BFS can speed up Interpolation Search and Uniform Binary Search in constant space. For other versions of Binary Search, our second model, together with the bi-criteria PGM index, can achieve a speed-up with a model space only 0.05% larger than the table itself, being competitive in terms of time-space trade-off with existing proposals. The SY-RMI and the bi-criteria PGM complement each other quite well across the various levels of the internal memory hierarchy. Finally, our findings are of interest to designers, since they highlight the need for further studies of the time-space relation in Learned Indexes.
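To make the idea behind Learned Static Indexes concrete, here is a minimal C++ sketch of the general scheme the abstract refers to: a small model trained on the sorted table predicts an approximate position for a query key, and a Binary Search bounded by the model's maximum training error finishes the lookup. The single linear model, the training routine, and all names below are illustrative assumptions for exposition only; they are not the KO-BFS or SY-RMI models introduced in the paper, nor the SOSD code.

```cpp
// Minimal learned-index sketch (illustrative, not the paper's models):
// a linear model maps key -> approximate position in a sorted table,
// and a binary search restricted to the model's error window completes the query.
#include <algorithm>
#include <cstdint>
#include <iostream>
#include <vector>

struct LinearModel {
    double slope = 0.0, intercept = 0.0;
    std::size_t max_err = 0;  // largest |predicted - true| position over the table

    // Least-squares fit of position ~ slope * key + intercept on the sorted keys.
    void train(const std::vector<std::uint64_t>& keys) {
        const double n = static_cast<double>(keys.size());
        double sx = 0, sy = 0, sxx = 0, sxy = 0;
        for (std::size_t i = 0; i < keys.size(); ++i) {
            const double x = static_cast<double>(keys[i]);
            const double y = static_cast<double>(i);
            sx += x; sy += y; sxx += x * x; sxy += x * y;
        }
        const double denom = n * sxx - sx * sx;
        slope = (denom != 0.0) ? (n * sxy - sx * sy) / denom : 0.0;
        intercept = (sy - slope * sx) / n;
        for (std::size_t i = 0; i < keys.size(); ++i) {
            const std::size_t p = predict(keys[i], keys.size());
            max_err = std::max(max_err, p > i ? p - i : i - p);
        }
    }

    // Predicted position, clamped to the valid index range.
    std::size_t predict(std::uint64_t key, std::size_t n) const {
        const double p = slope * static_cast<double>(key) + intercept;
        if (p < 0.0) return 0;
        if (p >= static_cast<double>(n)) return n - 1;
        return static_cast<std::size_t>(p);
    }
};

// Model narrows the range; binary search decides membership inside it.
bool learned_lookup(const std::vector<std::uint64_t>& keys,
                    const LinearModel& m, std::uint64_t key) {
    const std::size_t guess = m.predict(key, keys.size());
    const std::size_t lo = guess > m.max_err ? guess - m.max_err : 0;
    const std::size_t hi = std::min(keys.size(), guess + m.max_err + 1);
    return std::binary_search(keys.begin() + lo, keys.begin() + hi, key);
}

int main() {
    std::vector<std::uint64_t> table(1'000'000);
    for (std::size_t i = 0; i < table.size(); ++i) table[i] = 3 * i + 7;  // sorted keys
    LinearModel model;
    model.train(table);
    std::cout << "max error window: " << model.max_err << "\n";
    std::cout << "found 1500007: " << learned_lookup(table, model, 1500007) << "\n";  // present
    std::cout << "found 42: " << learned_lookup(table, model, 42) << "\n";            // absent
}
```

The time-space trade-off studied in the paper shows up directly in this sketch: a more accurate (and usually larger) model shrinks the error window and hence the final search range, while a constant-space model such as this single line keeps the extra space negligible at the price of a wider window on less regular key distributions.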
