Selfish-LRU: Preemption-Aware Caching for Predictability and Performance
덕쑤 2014. 4. 25. 15:51
Jan Reineke, Sebastian Altmeyer, Daniel Grund, Sebastian Hahn, and Claire Maiza
Saarland University, Saarbrücken, Germany
Abstract. We introduce Selfish-LRU, a variant of the LRU (least recently used) cache replacement policy that improves performance and predictability in preemptive scheduling scenarios. In multitasking systems with conventional caches, a single memory access by a preempting task can trigger a chain reaction leading to a large number of additional cache misses in the preempted task. Selfish-LRU prevents such chain reactions by first evicting cache blocks that do not belong to the currently active task. Simulations confirm that Selfish-LRU reduces the CRPD (cache-related preemption delay) as well as the overall number of cache misses. At the same time, it simplifies CRPD analysis and results in smaller CRPD bounds.
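The eviction rule described in the abstract can be sketched for a single cache set as follows. This is a minimal illustrative model, not the paper's implementation: the class name, the task-ID tagging scheme, and the use of an ordered map for recency are all assumptions made for the sketch. The key point is the victim choice on a miss: prefer the least recently used block owned by a *different* task, and fall back to plain LRU only when every block belongs to the running task.

```python
from collections import OrderedDict

class SelfishLRUSet:
    """One cache set under Selfish-LRU (illustrative sketch, not the
    authors' implementation).

    Each cached block is tagged with the ID of the task that last
    accessed it. On a miss with a full set, the victim is the least
    recently used block that does NOT belong to the currently running
    task; only if every block belongs to that task do we fall back to
    plain LRU eviction.
    """

    def __init__(self, associativity):
        self.assoc = associativity
        # Ordered from least to most recently used: tag -> owner task ID.
        self.lines = OrderedDict()

    def access(self, tag, task):
        """Access block `tag` on behalf of `task`; return True on a hit."""
        if tag in self.lines:
            self.lines.move_to_end(tag)  # refresh recency
            self.lines[tag] = task       # block now belongs to `task`
            return True
        if len(self.lines) >= self.assoc:
            # Selfish-LRU victim choice: LRU block of a *different* task,
            # else the plain LRU block.
            victim = next((t for t, owner in self.lines.items()
                           if owner != task),
                          next(iter(self.lines)))
            del self.lines[victim]
        self.lines[tag] = task
        return False
```

A short scenario shows why this suppresses the chain reaction: task A fills a 4-way set with blocks a, b, c, d, a preempting task B fetches x (evicting A's LRU block a), and when A resumes, its re-fetch of a evicts B's block x rather than A's own block b, so b, c, and d still hit.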