Limited Associativity Makes Concurrent Software Caches a Breeze

07/19/2021
by   Dolev Adas, et al.

Software caches optimize the performance of diverse storage systems, databases and other software systems. Existing works on software caches typically resort to fully associative cache designs. Our work shows that limited associativity caches are a promising direction for concurrent software caches. Specifically, we demonstrate that limited associativity enables simple yet efficient realizations of multiple cache management schemes that can be trivially parallelized. We show that the obtained hit ratio is usually similar to that of fully associative caches under the same management policy, while throughput improves by up to 5x compared to production-grade caching libraries, especially in multi-threaded executions.
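
To illustrate why limited associativity simplifies concurrency, here is a minimal sketch, not taken from the paper's implementation: class and parameter names such as SetAssociativeCache, numSets, and associativity are hypothetical. Each key is hashed to one small set, each set has its own lock and its own LRU order, and eviction only considers the entries of that set, so a victim search scans at most `associativity` items and threads touching different sets never contend.

    import java.util.LinkedHashMap;
    import java.util.concurrent.locks.ReentrantLock;

    // Minimal sketch of a limited-associativity (set-associative) software cache.
    // Keys are hashed to one of `numSets` small sets; each set holds at most
    // `associativity` entries, keeps its own LRU order, and is guarded by its
    // own lock, so operations on different sets proceed in parallel.
    public final class SetAssociativeCache<K, V> {
        private final int associativity;
        private final LinkedHashMap<K, V>[] sets;
        private final ReentrantLock[] locks;

        @SuppressWarnings("unchecked")
        public SetAssociativeCache(int numSets, int associativity) {
            this.associativity = associativity;
            this.sets = new LinkedHashMap[numSets];
            this.locks = new ReentrantLock[numSets];
            for (int i = 0; i < numSets; i++) {
                // accessOrder = true turns the map into a per-set LRU list.
                sets[i] = new LinkedHashMap<>(associativity, 0.75f, true);
                locks[i] = new ReentrantLock();
            }
        }

        private int setIndex(K key) {
            // Spread the hash bits and map the key to one set.
            int h = key.hashCode();
            h ^= (h >>> 16);
            return Math.floorMod(h, sets.length);
        }

        public V get(K key) {
            int idx = setIndex(key);
            locks[idx].lock();
            try {
                return sets[idx].get(key); // also refreshes the LRU position
            } finally {
                locks[idx].unlock();
            }
        }

        public void put(K key, V value) {
            int idx = setIndex(key);
            locks[idx].lock();
            try {
                LinkedHashMap<K, V> set = sets[idx];
                set.put(key, value);
                if (set.size() > associativity) {
                    // Evict the least-recently-used entry of this set only;
                    // the eviction decision never looks beyond one set.
                    K victim = set.keySet().iterator().next();
                    set.remove(victim);
                }
            } finally {
                locks[idx].unlock();
            }
        }
    }

As a usage sketch, new SetAssociativeCache<String, byte[]>(1024, 8) would create 1,024 sets of 8 entries each; swapping the per-set LinkedHashMap for a FIFO queue or a random-replacement array is an equally small, localized change, which is the kind of simple, parallelizable realization of different management schemes the abstract describes.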

research (03/09/2022): Limited Associativity Caching in the Data Plane
In-network caching promises to improve the performance of networked and ...

research (03/12/2018): FDRC: Flow-Driven Rule Caching Optimization in Software Defined Networking
With the sharp growth of cloud services and their possible combinations,...

research (12/05/2018): LBICA: A Load Balancer for I/O Cache Architectures
In recent years, enterprise Solid-State Drives (SSDs) are used in the ca...

research (03/21/2022): LQoCo: Learning to Optimize Cache Capacity Overloading in Storage Systems
Cache plays an important role to maintain high and stable performance (i...

research (05/03/2023): NVMM cache design: Logging vs. Paging
Modern NVMM is closing the gap between DRAM and persistent storage, both...

research (06/03/2019): Cache Contention on Multicore Systems: An Ontology-based Approach
Multicore processors have proved to be the right choice for both desktop...

research (08/02/2018): Go-HEP: writing concurrent software with ease and Go
High Energy and Nuclear Physics (HENP) libraries are now required to be ...
