Spatial Locality and Granularity Change in Caching
Caches exploit temporal and spatial locality to allow a small memory to provide fast access to data stored in a large, slow memory. The temporal aspect of locality is extremely well studied and understood, but the spatial aspect much less so. We seek to gain a better understanding of spatial locality by defining and studying the Granularity-Change Caching Problem. This problem modifies the traditional caching setup by grouping data items into blocks, such that a cache can load any subset of a block for the same cost as loading any individual item in the block. We show that modeling spatial locality in this way significantly changes the caching problem. This begins with a proof that Granularity-Change Caching is NP-complete in the offline setting, even when all items have unit size and all blocks have unit load cost. In the online setting, we show a lower bound on the competitive ratio of deterministic policies that is significantly worse than for traditional caching. We then present a deterministic replacement policy called Item-Block Layered Partitioning and show that it obtains a competitive ratio close to that lower bound. Our bounds also reveal a new issue arising in the Granularity-Change Caching Problem, where the choice of offline cache size affects the competitiveness of different online algorithms relative to one another. To address this issue, we extend a prior (temporal) locality model to account for spatial locality, and provide a general lower bound in addition to an upper bound for Item-Block Layered Partitioning.
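To make the cost model concrete, the following is a minimal Python sketch of the granularity-change setup described above: items are grouped into blocks, and on a miss the cache may load any subset of the requested item's block for a single block load cost. The names (Block, serve) and the greedy subset choice are illustrative assumptions, not the paper's policy; eviction is omitted.

```python
class Block:
    """A block groups items; loading any subset of it incurs one load cost."""
    def __init__(self, block_id, items, load_cost=1):
        self.block_id = block_id
        self.items = set(items)
        self.load_cost = load_cost


def serve(request, cache, blocks, capacity):
    """Serve one item request under the granularity-change cost model.

    On a hit the cost is zero. On a miss, any subset of the requested
    item's block containing that item may be loaded, all for the same
    block load cost (eviction policy omitted for brevity).
    """
    if request in cache:
        return 0  # hit: no cost incurred

    block = next(b for b in blocks if request in b.items)

    # Illustrative choice: greedily load as much of the block as fits.
    # Any other subset containing the request would cost the same.
    subset = {request}
    for item in block.items:
        if item not in cache and len(cache) + len(subset) < capacity:
            subset.add(item)

    cache.update(subset)        # assumes space was already freed by some policy
    return block.load_cost      # cost is independent of the subset's size
```

A usage example: with blocks {a, b, c} and {d, e}, a request sequence a, b, d could be served with two block loads if the first load brings in both a and b, whereas item-at-a-time caching would pay three.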