A customizable hardware prefetching framework using online reinforcement learning as described in the MICRO 2021 paper by Bera et al. (https://arxiv.org/pdf/2109.12021.pdf).
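As a rough illustration of the idea behind such a framework, the sketch below shows a tabular Q-learning agent that maps a simple (PC, last address delta) state to a prefetch offset and is rewarded when the prefetched block is demanded next. The state features, action set, and reward scheme are simplified assumptions for illustration, not the paper's or the repository's actual design.

```python
# Illustrative RL-driven prefetcher sketch (NOT Pythia's implementation):
# tabular Q-learning over a (PC, last delta) state choosing a prefetch offset.
import random
from collections import defaultdict

ACTIONS = [0, 1, 2, 4, 8]              # candidate prefetch offsets; 0 means "no prefetch" (assumed set)
ALPHA, GAMMA, EPSILON = 0.1, 0.5, 0.1  # learning rate, discount, exploration rate

q_table = defaultdict(lambda: [0.0] * len(ACTIONS))

def select_action(state):
    """Epsilon-greedy choice over the candidate offsets."""
    if random.random() < EPSILON:
        return random.randrange(len(ACTIONS))
    values = q_table[state]
    return values.index(max(values))

def update(state, action, reward, next_state):
    """One-step Q-learning update."""
    best_next = max(q_table[next_state])
    q_table[state][action] += ALPHA * (reward + GAMMA * best_next - q_table[state][action])

def train_on_trace(trace):
    """Replay a list of (pc, block_address) accesses, scoring each prefetch on the next access."""
    last_addr, last_delta = 0, 0
    pending = None                     # (state, action, prefetched address) awaiting its reward
    for pc, addr in trace:
        state = (pc, last_delta)
        if pending is not None:
            p_state, p_action, p_addr = pending
            reward = 0.0 if p_addr is None else (1.0 if p_addr == addr else -1.0)
            update(p_state, p_action, reward, state)
        action = select_action(state)
        offset = ACTIONS[action]
        pending = (state, action, addr + offset if offset else None)
        last_delta, last_addr = addr - last_addr, addr
```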
Using Belady's algorithm for improved cache replacement
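For reference, a minimal sketch of Belady's optimal (MIN) policy: on a miss in a full cache, evict the block whose next use lies furthest in the future. It requires the complete future trace, so it serves as an offline upper bound rather than a deployable policy; the function below is illustrative and not taken from this repository.

```python
def belady_hits(trace, capacity):
    """Return the number of hits MIN achieves on a list of block identifiers."""
    # Precompute, for each position, the next position at which the same block recurs.
    next_use = [float("inf")] * len(trace)
    last_seen = {}
    for i in range(len(trace) - 1, -1, -1):
        next_use[i] = last_seen.get(trace[i], float("inf"))
        last_seen[trace[i]] = i

    cache = {}            # resident block -> index of its next use
    hits = 0
    for i, block in enumerate(trace):
        if block in cache:
            hits += 1
        elif len(cache) >= capacity:
            victim = max(cache, key=cache.get)   # reused furthest in the future
            del cache[victim]
        cache[block] = next_use[i]
    return hits
```

For example, `belady_hits(list("abcabdacd"), capacity=2)` returns the hit count MIN achieves on that toy trace.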
The PyTorch codebase for DEAP Cache: Deep Eviction Admission and Prefetching for Cache.
A project for Advanced Operating Systems (CS604) that implements the ARC cache replacement policy.
Trace-driven cache memory simulator with LRU, MRU, RR and Belady replacement policies.
High-performance caching package for Node.js/JavaScript.
Implementation of a modern last-level cache (LLC) based on "Perceptron Learning for Reuse Prediction": a neural-network-inspired predictor trained on a series of features using a smaller, independent sampler cache.
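A minimal sketch of the perceptron-style reuse-prediction idea: a few small weight tables are indexed by hashed features, the selected weights are summed, and the sum against a threshold predicts whether a block will be reused before eviction; observed outcomes train the weights. The features, table sizes, and threshold below are illustrative assumptions, not this repository's configuration.

```python
TABLE_SIZE = 256
THRESHOLD = 3
WEIGHT_MIN, WEIGHT_MAX = -32, 31

tables = [[0] * TABLE_SIZE for _ in range(3)]   # one weight table per feature

def features(pc, addr):
    """Hash a few simple features into table indices (illustrative choices)."""
    return [pc % TABLE_SIZE,
            (addr >> 6) % TABLE_SIZE,
            (pc ^ (addr >> 12)) % TABLE_SIZE]

def predict_reuse(pc, addr):
    """Predict 'will be reused' when the summed weights clear the threshold."""
    total = sum(t[i] for t, i in zip(tables, features(pc, addr)))
    return total >= THRESHOLD

def train(pc, addr, was_reused):
    """Nudge each selected weight toward the observed outcome, with saturation."""
    step = 1 if was_reused else -1
    for t, i in zip(tables, features(pc, addr)):
        t[i] = max(WEIGHT_MIN, min(WEIGHT_MAX, t[i] + step))
```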
Source code for the cache replacement policy published in [Faldu et al., PACT'17] and [Faldu et al., CRC2'17].
Programs implemented during lab sessions of the course CSC403 (Computer Organization & Architecture).
A transformer-based cache replacement model for CDNs, trained by imitation learning against Belady's optimal policy.
A console application that simulates caches and cache replacement policies: LRU, LFU, and MRU.
Implementation of cache replacement algorithms (FIFO and LRU) and calculation of the corresponding cache hit ratio.
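As a sketch of what such an exercise computes, the functions below replay a flat trace of block identifiers under FIFO and LRU and report the hit ratio; the trace format and function names are assumptions for illustration.

```python
from collections import OrderedDict, deque

def fifo_hit_ratio(trace, capacity):
    cache, order, hits = set(), deque(), 0
    for block in trace:
        if block in cache:
            hits += 1
        else:
            if len(cache) >= capacity:
                cache.discard(order.popleft())   # evict the oldest insertion
            cache.add(block)
            order.append(block)
    return hits / len(trace)

def lru_hit_ratio(trace, capacity):
    cache, hits = OrderedDict(), 0
    for block in trace:
        if block in cache:
            hits += 1
            cache.move_to_end(block)             # mark as most recently used
        else:
            if len(cache) >= capacity:
                cache.popitem(last=False)        # evict the least recently used
            cache[block] = True
    return hits / len(trace)
```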
🧠 Implementation of two cache replacement and two memory scheduling algorithms.
A comprehensive collection of cache eviction policies implemented in Python, providing practical examples for Least Recently Used (LRU), Least Frequently Used (LFU), and other strategies to optimize data caching in your applications.
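To complement the LRU/FIFO sketch above, here is a minimal LFU (Least Frequently Used) eviction policy: evict the resident key with the lowest access count, breaking ties by insertion order. The class and method names are illustrative, not drawn from the collection itself.

```python
import itertools

class LFUCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.store = {}                 # key -> value
        self.freq = {}                  # key -> (access count, insertion order)
        self._order = itertools.count()

    def get(self, key):
        if key not in self.store:
            return None
        count, order = self.freq[key]
        self.freq[key] = (count + 1, order)
        return self.store[key]

    def put(self, key, value):
        if key not in self.store and len(self.store) >= self.capacity:
            victim = min(self.freq, key=self.freq.get)   # least frequent, oldest first
            del self.store[victim]
            del self.freq[victim]
        if key in self.store:
            count, order = self.freq[key]
            self.freq[key] = (count + 1, order)
        else:
            self.freq[key] = (1, next(self._order))
        self.store[key] = value
```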
Cache replacement algorithms, including Least Recently Used (LRU), First In First Out (FIFO), and Enhanced Not Recently Used (ENRU).
ARC Cache Implementation (in Haskell!)
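For readers unfamiliar with ARC (Adaptive Replacement Cache), the sketch below shows its core structure in Python rather than Haskell: two resident lists T1 (blocks seen once recently) and T2 (blocks seen at least twice), two ghost lists B1/B2 of recently evicted keys, and an adaptive target p for the size of T1. It follows the outline of Megiddo and Modha's algorithm with simplified integer adaptation steps, and is an illustration rather than this repository's code.

```python
from collections import OrderedDict

class ARC:
    def __init__(self, capacity):
        self.c = capacity
        self.p = 0                                       # target size of T1
        self.t1, self.t2 = OrderedDict(), OrderedDict()  # resident entries
        self.b1, self.b2 = OrderedDict(), OrderedDict()  # ghost entries (keys only)

    def _replace(self, key):
        """Evict the LRU of T1 or T2 into the corresponding ghost list."""
        if self.t1 and (len(self.t1) > self.p or (key in self.b2 and len(self.t1) == self.p)):
            old, _ = self.t1.popitem(last=False)
            self.b1[old] = None
        else:
            old, _ = self.t2.popitem(last=False)
            self.b2[old] = None

    def access(self, key):
        """Record an access; return True on a cache hit."""
        if key in self.t1 or key in self.t2:
            self.t1.pop(key, None)
            self.t2.pop(key, None)
            self.t2[key] = None                          # promote to MRU of T2
            return True
        if key in self.b1:                               # ghost hit: favour recency
            self.p = min(self.c, self.p + max(len(self.b2) // len(self.b1), 1))
            self._replace(key)
            del self.b1[key]
            self.t2[key] = None
            return False
        if key in self.b2:                               # ghost hit: favour frequency
            self.p = max(0, self.p - max(len(self.b1) // len(self.b2), 1))
            self._replace(key)
            del self.b2[key]
            self.t2[key] = None
            return False
        # Complete miss: keep the directory bounded, then insert into T1.
        total = len(self.t1) + len(self.t2) + len(self.b1) + len(self.b2)
        if len(self.t1) + len(self.b1) == self.c:
            if len(self.t1) < self.c:
                self.b1.popitem(last=False)
                self._replace(key)
            else:
                self.t1.popitem(last=False)
        elif total >= self.c:
            if total >= 2 * self.c:
                self.b2.popitem(last=False)
            self._replace(key)
        self.t1[key] = None
        return False
```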
Manageable LRU in-memory cache instance for fast lookups and configurable eviction policies