Understanding Sequential Flooding

LRU is a popular choice for an eviction policy but isn’t always the best. Consider a Buffer Manager that uses an LRU eviction policy to replace pages in a memory buffer of size N pages, and look at its performance when we do repeated sequential scans of N+1 pages of data. Assume, for simplicity, that the buffer is empty at first. Pages 1 to N get added to the buffer as they are requested. Now the buffer is full, and requesting page N+1 evicts page 1, the least recently used. From then on, every sequential scan of pages 1 to N+1 misses on every request: requesting page 1 evicts page 2, requesting page 2 then evicts page 3, and so on. This scenario is known as “sequential flooding” of the buffer memory.
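A minimal sketch of this behavior in Python (the `LRUBuffer` class and its counters are illustrative, not any particular database's implementation): with a 4-page buffer and repeated scans over 5 pages, every request after the initial warm-up is still a miss.

```python
from collections import OrderedDict

class LRUBuffer:
    """Toy buffer pool with LRU eviction, tracking hits and misses."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.pages = OrderedDict()  # insertion order tracks recency of use
        self.hits = 0
        self.misses = 0

    def request(self, page):
        if page in self.pages:
            self.hits += 1
            self.pages.move_to_end(page)  # mark as most recently used
        else:
            self.misses += 1
            if len(self.pages) >= self.capacity:
                self.pages.popitem(last=False)  # evict least recently used
            self.pages[page] = None

N = 4
buf = LRUBuffer(N)
# Scan pages 1..N+1 sequentially, three times in a row.
for scan in range(3):
    for page in range(1, N + 2):
        buf.request(page)

print(buf.hits, buf.misses)  # prints "0 15": zero hits across all scans
```

Note that the scan order and the eviction order line up exactly: the page we are about to need next is always the one LRU just threw away.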

Nested loop joins are prime candidates for sequential flooding when an LRU eviction policy is used: the inner relation is scanned sequentially from start to finish once per outer tuple.
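A small sketch of the page access pattern a nested loop join produces (the relation sizes here are hypothetical): the inner relation's pages are requested sequentially over and over, which is exactly the flooding-prone pattern described above.

```python
outer_tuples = ["a", "b", "c"]        # tuples of the outer relation
inner_pages = list(range(1, 6))       # a hypothetical 5-page inner relation

access_pattern = []
for t in outer_tuples:
    # Each outer tuple triggers a full sequential scan of the inner relation.
    for page in inner_pages:
        access_pattern.append(page)

print(access_pattern)  # [1, 2, 3, 4, 5, 1, 2, 3, 4, 5, 1, 2, 3, 4, 5]
```

If the buffer holds fewer pages than the inner relation, an LRU policy misses on every one of these requests after the first scan.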
