Cache Performance Insight is a framework for evaluating the performance of different cache replacement policies under varying workloads and configurations.
```shell
mkdir build && cd build
cmake .. && make
```
Run the simulator as follows, where the optional trailing params are policy-specific:

```shell
./src/main <cache_policy> <buffer_size> <trace_file> [params]
```

The `-h` flag prints the usage:

```shell
$ ./src/main -h
usage: ./src/main <cache_policy> <buffer_size> <trace_file> <param_0>
<cache_policy> -- cache policy in {LRU, LFU, ARC, ARC_2, ARC_3, OPT}
<buffer_size> -- buffer size
<trace_file> -- path of trace_file
```

For example, simulating LRU with a buffer size of 65536 on the P1 trace:

```shell
$ ./src/main LRU 65536 ../traces/P1.lis
LRU_CACHE_MANAGER: buffer_size:65536 hit_count:11011495 miss_count:21043978 hit_rate:34.3514%
```
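The reported hit rate is simply `hit_count / (hit_count + miss_count)`; a quick check reproduces the 34.3514% from the run above (the helper name here is hypothetical, not part of the tool):

```cpp
#include <cassert>
#include <cmath>

// Hit rate as a percentage: hits / (hits + misses) * 100.
// The function name is illustrative, not the framework's API.
double hit_rate_percent(long long hits, long long misses) {
    return 100.0 * static_cast<double>(hits) /
           static_cast<double>(hits + misses);
}
```

Plugging in the counts from the example run, `hit_rate_percent(11011495, 21043978)` evaluates to roughly 34.3514.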
Run the unit tests and generate a coverage report:

```shell
ctest && make coverage
```
Install the Python dependencies, then run the test scripts:

```shell
pip install -r requirement
./scripts/test_runner.py
```
The generated plot is saved to `local/plot.png`, comparing the cache policies:
- LRU
- LFU
- ARC
- OPT
- ARC-2
- ARC-3
- MRF
- hybrid LRU-LFU
- ...
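As a sketch of the simplest of these policies, here is a minimal LRU cache in C++: a doubly linked list keeps keys in recency order and a hash map gives O(1) lookup. The class and method names are illustrative, not the framework's actual API.

```cpp
#include <cassert>
#include <cstddef>
#include <list>
#include <unordered_map>

// Minimal LRU sketch (assumes capacity > 0): front of `order_` is the most
// recently used key; `index_` maps a key to its node in the list.
class LruCache {
public:
    explicit LruCache(std::size_t capacity) : capacity_(capacity) {}

    // Returns true on a hit and refreshes the key's recency.
    bool access(int key) {
        auto it = index_.find(key);
        if (it != index_.end()) {
            // Hit: move the key's node to the front in O(1).
            order_.splice(order_.begin(), order_, it->second);
            return true;
        }
        if (order_.size() == capacity_) {
            // Miss at capacity: evict the least recently used key (back).
            index_.erase(order_.back());
            order_.pop_back();
        }
        order_.push_front(key);
        index_[key] = order_.begin();
        return false;
    }

private:
    std::size_t capacity_;
    std::list<int> order_;
    std::unordered_map<int, std::list<int>::iterator> index_;
};
```

LFU replaces the recency order with per-key frequency counts, and ARC adaptively balances a recency list against a frequency list; both fit behind the same hit/miss interface.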
Framework:
- CMake build
- unittest & code coverage
- test scripts
- result visualization
- log
- more parameters (e.g. scales, policies, unique keys)
- ...
Evaluation metrics:
- cache hit rate
- runtime
- actual memory allocation
- concurrency
- real-time cache distribution
- ...
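The first metric, cache hit rate, amounts to a replay loop: feed every trace key to a policy and count hits. A minimal sketch, with a stand-in infinite-capacity policy so it runs standalone; none of these names come from the framework, which only needs each policy to answer hit-or-miss per access:

```cpp
#include <cassert>
#include <cstddef>
#include <unordered_set>
#include <vector>

// Stand-in policy that never evicts: an access hits iff the key was seen
// before. Real policies (LRU, LFU, ARC, ...) expose the same access() shape.
struct InfiniteCache {
    std::unordered_set<int> seen;
    bool access(int key) { return !seen.insert(key).second; }
};

// Replay a trace through a policy and return the hit rate in percent.
template <typename Policy>
double hit_rate(Policy& policy, const std::vector<int>& trace) {
    std::size_t hits = 0;
    for (int key : trace)
        if (policy.access(key)) ++hits;
    return trace.empty() ? 0.0 : 100.0 * hits / trace.size();
}
```

Runtime and memory can be measured around the same loop, which is presumably how per-policy comparisons in the plot are produced.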