
Monday, January 18, 2010

Least Recently Used Page Replacement Algorithm - LRU

The least recently used (LRU) algorithm applies to a cache memory with a number of independently replaceable pages. When the cache is full and a memory reference is not found in any page in the cache, the page to be replaced is the one that has not been accessed for the longest time. An ordered history of page numbers, called an LRU stack, must be maintained across successive memory references.
Although LRU is theoretically realizable, it is not cheap. To fully implement LRU, it is necessary to maintain a linked list of all pages in memory, with the most recently used page at the front and the least recently used page at the rear. The difficulty is that the list must be updated on every memory reference. Finding a page in the list, deleting it, and then moving it to the front is a time-consuming operation, even in hardware.
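As a rough sketch (not from the original post), the linked-list scheme can be modeled in Python with `collections.OrderedDict`, which is backed by a doubly linked list: the least recently used page sits at the front and the most recently used page at the back. The class name and method names here are my own choices for illustration.

```python
from collections import OrderedDict

class LRUCache:
    """Minimal LRU cache sketch: the OrderedDict plays the role of the
    linked list, front = least recently used, back = most recently used."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.pages = OrderedDict()  # page number -> page contents

    def access(self, page, contents=None):
        if page in self.pages:
            # On a hit, move the referenced page to the back (most recent).
            self.pages.move_to_end(page)
        else:
            if len(self.pages) >= self.capacity:
                # On a fault with a full cache, evict the front page,
                # i.e. the one unreferenced for the longest time.
                self.pages.popitem(last=False)
            self.pages[page] = contents
        return self.pages[page]
```

For example, with capacity 3, accessing pages 1, 2, 3, then 1 again, then 4 evicts page 2: re-referencing page 1 moved it to the back, leaving page 2 as the least recently used. Note that every `access` call touches the list, which is exactly the per-reference overhead the paragraph above describes.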

However, LRU can be implemented with special hardware support. Suppose the hardware has a 64-bit counter that is incremented after every instruction. Whenever a page is referenced, the current counter value is copied into a field in that page's entry. Whenever a page needs to be replaced, the operating system selects the page with the lowest stored counter value and swaps it out. In practice this scheme is rarely feasible, because most machines do not provide such a counter or the wide per-page field it requires.
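The counter scheme can be simulated in software; this is a hedged sketch with names of my own invention, where a running integer stands in for the 64-bit hardware counter and a per-page timestamp stands in for the page-entry field.

```python
import itertools

class CounterLRU:
    """Software simulation of the hardware-counter LRU scheme: a global
    counter is copied into a per-page timestamp on every reference, and
    the victim is the page with the smallest timestamp."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.clock = itertools.count()  # stands in for the 64-bit counter
        self.timestamps = {}            # page number -> last-reference time

    def reference(self, page):
        """Record a reference to `page`; return True if it caused a fault."""
        fault = page not in self.timestamps
        if fault and len(self.timestamps) >= self.capacity:
            # Evict the page with the lowest counter value.
            victim = min(self.timestamps, key=self.timestamps.get)
            del self.timestamps[victim]
        self.timestamps[page] = next(self.clock)
        return fault
```

The linear `min` scan over all pages at eviction time is another illustration of why a naive LRU implementation is expensive; real systems approximate LRU (for example with reference bits) rather than pay this cost.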
One important advantage of the LRU algorithm is that it is amenable to full statistical analysis. It has been proven, for example, that LRU can never result in more than N times more page faults than the OPT algorithm, where N is proportional to the number of pages in the managed pool.
On the other hand, LRU's weakness is that its performance degenerates under many quite common reference patterns. The classic case is a sequential scan repeated in a loop: if a program cycles through even one more page than the cache can hold, LRU evicts each page just before it is needed again, and every single reference faults.
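The looping-scan worst case is easy to demonstrate with a small fault counter (a sketch of my own, using `OrderedDict` for the recency order): cycling over 5 pages with a 4-page cache faults on every reference, even though an optimal policy would fault far less often.

```python
from collections import OrderedDict

def lru_faults(trace, capacity):
    """Count page faults for a reference trace under LRU replacement."""
    pages = OrderedDict()  # front = least recently used
    faults = 0
    for page in trace:
        if page in pages:
            pages.move_to_end(page)  # hit: mark as most recently used
            continue
        faults += 1
        if len(pages) >= capacity:
            pages.popitem(last=False)  # evict the least recently used page
        pages[page] = True
    return faults

# Cycling over 5 pages with room for only 4: LRU faults on all 50 references.
print(lru_faults([0, 1, 2, 3, 4] * 10, 4))
```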
