A New Upper Bound on Cache Hit Probability for Non-anticipative Caching Policies

06/11/2021
by Nitish K. Panigrahy, et al.

Caching systems have long been crucial for improving the performance of a wide variety of network and web-based online applications. In such systems, end-to-end application performance heavily depends on the fraction of objects transferred from the cache, known as the cache hit probability. Many caching policies have been proposed and implemented to improve the hit probability. In this work, we propose a new method to compute an upper bound on the hit probability for all non-anticipative caching policies, i.e., policies that have no knowledge of future requests. Our key insight is to order the objects by the ratio of their Hazard Rate (HR) function values to their sizes and to place the objects with the largest ratios in the cache until the cache capacity is exhausted. Under some statistical assumptions, we prove that our proposed HR-to-size-ratio ordering achieves the maximum attainable hit probability and therefore serves as an upper bound for all non-anticipative caching policies. We derive closed-form expressions for the upper bound under some specific object request arrival processes. We also provide simulation results to validate its correctness and to compare it with state-of-the-art upper bounds. We find our bound to be tighter for a variety of object request arrival processes.
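To make the ordering rule concrete, below is a minimal, illustrative sketch (not code from the paper) of how such a bound can be estimated on a request trace. It assumes Poisson request arrivals, so each object's hazard rate is constant and equal to its request rate; for general renewal arrival processes the hazard rate would depend on the time elapsed since the object's last request, and the ranking would have to be recomputed at every request. The names `CacheObject` and `hr_upper_bound`, and the Zipf-like example trace, are our own illustrative choices.

```python
import random
from dataclasses import dataclass

# Illustrative sketch of the HR-to-size-ratio ordering described in the
# abstract: at each request, rank objects by hazard rate / size, fill the
# cache greedily with the largest ratios, and count a hit if the requested
# object made the cut. The Poisson-traffic assumption and all names below
# are ours, not the paper's.

@dataclass
class CacheObject:
    rate: float   # request rate; for Poisson arrivals the hazard rate is constant = rate
    size: float   # object size


def hr_upper_bound(objects, capacity, trace):
    """Estimate a hit-probability upper bound over a request trace.

    objects  : dict mapping object id -> CacheObject
    capacity : cache capacity (same units as object sizes)
    trace    : iterable of requested object ids
    """
    hits = total = 0
    for requested in trace:
        # Rank objects by hazard rate to size ratio, largest first.
        # With exponential inter-request times this ranking is static and
        # could be computed once; it is kept inside the loop to mirror the
        # per-request structure needed for general arrival processes.
        ranked = sorted(objects, key=lambda o: objects[o].rate / objects[o].size,
                        reverse=True)
        cached, used = set(), 0.0
        for o in ranked:
            if used + objects[o].size <= capacity:
                cached.add(o)
                used += objects[o].size
        total += 1
        hits += requested in cached
    return hits / total if total else 0.0


if __name__ == "__main__":
    # Zipf-like popularities over 100 unit-size objects; cache holds 10.
    random.seed(0)
    objs = {i: CacheObject(rate=1.0 / (i + 1), size=1.0) for i in range(100)}
    weights = [objs[i].rate for i in range(100)]
    trace = random.choices(range(100), weights=weights, k=10_000)
    print(f"HR-based upper bound estimate: {hr_upper_bound(objs, 10, trace):.3f}")
```

Under these Poisson, unit-size assumptions the ranking reduces to popularity order, so the estimate coincides with always caching the most popular objects; the HR formulation becomes essential with variable object sizes and non-exponential inter-request times, where the ratio of each object changes between requests.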


Related research

12/28/2017 · Caching under Content Freshness Constraints
Several real-time delay-sensitive applications pose varying degrees of f...

11/10/2017 · Practical Bounds on Optimal Caching with Variable Object Sizes
Many recent caching systems aim to improve hit ratios, but there is no g...

01/27/2022 · On the Impact of Network Delays on Time-to-Live Caching
We consider Time-to-Live (TTL) caches that tag every object in cache wit...

08/05/2023 · An Overview of Analysis Methods and Evaluation Results for Caching Strategies
We survey analytical methods and evaluation results for the performance ...

01/27/2023 · A Learned Cache Eviction Framework with Minimal Overhead
Recent work shows the effectiveness of Machine Learning (ML) to reduce c...

08/04/2017 · On Resource Pooling and Separation for LRU Caching
Caching systems using the Least Recently Used (LRU) principle have now b...

09/19/2020 · DEAP Cache: Deep Eviction Admission and Prefetching for Cache
Recent approaches for learning policies to improve caching, target just ...
