Pronounced "cash." A cache is used to speed up data transfer and may be either temporary or permanent. Every computer contains memory and disk caches that speed up instruction execution as well as data retrieval and updating. These temporary caches serve as staging areas, and their contents are constantly changing.
Browser caches and Internet caches store copies of Web pages retrieved by the user for some period of time in order to speed up retrieval the next time the same page is requested (see Web cache and browser cache). See also router cache.
Following are descriptions of the traditional memory and disk caches that are common in all computers.
A memory cache, or "CPU cache," is a memory bank that bridges main memory and the CPU. It is faster than main memory and allows instructions to be executed and data to be read and written at higher speed. Instructions and data are transferred from main memory to the cache in fixed blocks, known as cache "lines," using some kind of look-ahead algorithm. See cache line.
Temporal and Spatial (Time and Space)
Caches take advantage of "temporal locality," which means the same data item is often reused many times. They also benefit from "spatial locality," wherein the next instruction to be executed or the next data item to be processed is likely to be the next in line. The more often the same data item is processed or the more sequential the instructions or data, the greater the chance for a "cache hit." If the next item is not in the cache, a "cache miss" occurs, and the CPU has to go to main memory to retrieve it.
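Both kinds of locality can be illustrated with a toy direct-mapped cache simulator. This is only a sketch: real CPU caches are set-associative and managed entirely in hardware, and the line size and line count below are illustrative assumptions.

```python
# Toy direct-mapped cache simulator illustrating spatial and temporal
# locality. Addresses are byte addresses; the cache holds NUM_LINES
# lines of LINE_SIZE bytes each. (Sketch only; real CPU caches are
# associative and operate in hardware.)

LINE_SIZE = 64    # bytes per cache line
NUM_LINES = 8     # total lines in this tiny cache

class ToyCache:
    def __init__(self):
        self.lines = [None] * NUM_LINES   # tag stored per line slot
        self.hits = 0
        self.misses = 0

    def access(self, address):
        line_number = address // LINE_SIZE     # which line holds this byte
        index = line_number % NUM_LINES        # direct-mapped slot
        tag = line_number // NUM_LINES
        if self.lines[index] == tag:
            self.hits += 1                     # cache hit
        else:
            self.misses += 1                   # cache miss: fetch whole line
            self.lines[index] = tag

# Spatial locality: reading 4 KB sequentially misses once per 64-byte
# line (64 misses) and hits on the other 63 bytes of each line.
seq = ToyCache()
for addr in range(4096):
    seq.access(addr)
print(seq.hits, seq.misses)    # 4032 64

# Temporal locality: re-reading the same byte misses only the first time.
tmp = ToyCache()
for _ in range(1000):
    tmp.access(0)
print(tmp.hits, tmp.misses)    # 999 1
```

Note how the miss rate depends on access pattern, not cache size alone: the sequential sweep touches far more data than the eight-line cache can hold, yet still hits 98% of the time because of the 64-byte line fetch.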
Level 1 and Level 2
A level 1 (L1) cache is a memory bank built into the CPU chip. A level 2 cache (L2) is a secondary staging area that feeds the L1 cache. Increasing the size of the L2 cache may speed up some applications but have no effect on others. L2 may be built into the CPU chip, reside on a separate chip in a multichip package (see MCP) or be a separate bank of chips on the motherboard. Caches are typically static RAM (SRAM), while main memory is generally some variety of dynamic RAM (DRAM). See SRAM and DRAM.
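The L1/L2 relationship can be sketched as a two-level lookup: the CPU checks the small, fast L1 first, then the larger L2, and falls back to main memory only when both miss. The sizes, dictionary-based lookup and FIFO eviction below are illustrative assumptions, not any real processor's design.

```python
# Simplified two-level cache lookup: check small fast L1 first, then the
# larger L2, and fall back to main memory on a double miss. Sizes and
# FIFO eviction are illustrative assumptions.
from collections import OrderedDict

class TwoLevelCache:
    def __init__(self, l1_size=4, l2_size=16):
        self.l1 = OrderedDict()   # small, fastest
        self.l2 = OrderedDict()   # larger, slower
        self.l1_size, self.l2_size = l1_size, l2_size

    def _fill(self, cache, size, key, value):
        cache[key] = value
        if len(cache) > size:
            cache.popitem(last=False)   # evict oldest entry (FIFO)

    def read(self, address, memory):
        if address in self.l1:
            return self.l1[address], "L1 hit"
        if address in self.l2:
            value = self.l2[address]
            self._fill(self.l1, self.l1_size, address, value)  # promote to L1
            return value, "L2 hit"
        value = memory[address]          # slowest path: main memory
        self._fill(self.l2, self.l2_size, address, value)
        self._fill(self.l1, self.l1_size, address, value)
        return value, "miss"

memory = {a: a * 10 for a in range(100)}
cache = TwoLevelCache()
print(cache.read(5, memory))   # (50, 'miss')
print(cache.read(5, memory))   # (50, 'L1 hit')
```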
A disk cache is a section of main memory or memory on the disk controller board that bridges the disk and the CPU. When the disk is read, a larger block of data is copied into the cache than is immediately required. If subsequent reads find the data already stored in the cache, there is no need to retrieve it from the disk, which is slower to access.
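The read path described above can be sketched as follows. The 4 KB block size and the in-memory stand-in for the disk are assumptions for illustration only.

```python
# Sketch of a disk read cache: reading any byte pulls a whole block into
# memory, so nearby reads are served without touching the slow "disk"
# again. Block size is an illustrative assumption.
BLOCK_SIZE = 4096

class CachedDisk:
    def __init__(self, data: bytes):
        self.data = data          # stands in for the physical disk
        self.cache = {}           # block number -> cached block
        self.disk_reads = 0       # how often the "disk" was touched

    def read(self, offset, length):
        result = bytearray()
        for pos in range(offset, offset + length):
            block_no = pos // BLOCK_SIZE
            if block_no not in self.cache:         # miss: copy whole block
                start = block_no * BLOCK_SIZE
                self.cache[block_no] = self.data[start:start + BLOCK_SIZE]
                self.disk_reads += 1
            block = self.cache[block_no]
            result.append(block[pos % BLOCK_SIZE])
        return bytes(result)

disk = CachedDisk(bytes(range(256)) * 64)   # 16 KB of sample data
disk.read(0, 100)       # first read: one 4 KB block fetched from "disk"
disk.read(100, 100)     # nearby read: served entirely from the cache
print(disk.disk_reads)  # 1
```

Two hundred bytes were read, but the disk was touched only once, because both reads fell inside the same cached block.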
If the cache is used for writing, data are queued up at high speed and then written to disk during idle machine cycles by the caching program. If the cache is built into the hardware, the disk controller figures out when to do it. See cache coherency, write back cache, write through cache, pipeline burst cache, lookaside cache, inline cache, backside cache and NV cache.
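A write cache of this kind can be sketched as a write-back buffer: writes land in memory at full speed and are committed to disk later in a batch. The class names and the explicit flush() call are illustrative; a real caching program or disk controller flushes during idle cycles under policies such as write-back or write-through.

```python
# Sketch of a write-back cache: writes are queued in an in-memory
# "dirty" buffer and committed to the slow "disk" later, in one batch.
# The explicit flush() stands in for the idle-cycle flushing a real
# caching program or disk controller would perform.

class WriteBackCache:
    def __init__(self):
        self.disk = {}            # stands in for the physical disk
        self.dirty = {}           # pending writes not yet on disk

    def write(self, block, data):
        self.dirty[block] = data  # fast: memory only

    def read(self, block):
        # The dirty buffer must be checked first, or a read could return
        # stale data from disk (the cache-coherency problem).
        if block in self.dirty:
            return self.dirty[block]
        return self.disk.get(block)

    def flush(self):
        self.disk.update(self.dirty)   # slow: commit to disk in a batch
        self.dirty.clear()

cache = WriteBackCache()
cache.write(7, b"hello")
print(cache.read(7))        # b'hello'  (served from the dirty buffer)
cache.flush()
print(cache.disk[7])        # b'hello'  (now durable on the "disk")
```

The read() method's check of the dirty buffer before the disk is the essence of the cache-coherency issue mentioned in the cross-references above.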