The performance gain attained through read caching depends on the read hit rate achieved. Data requests that can be served directly from the cache are handled much faster than those requiring a disk access.
The hit rate in turn depends on access locality, the size of the available cache area and the prefetching factor. Ultimately, the interplay of these three parameters determines read cache efficiency. This is illustrated by the examples given below:
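The effect of the hit rate on mean access time can be sketched as follows. The latencies used here are illustrative assumptions, not measured values from any particular system:

```python
# Sketch (assumed figures): how the hit rate determines the mean read time.
# t_cache_ms and t_disk_ms are illustrative latencies, not measured values.

def effective_access_time(hit_rate, t_cache_ms=0.05, t_disk_ms=8.0):
    """Mean time per read: hits are served from the cache, misses from disk."""
    return hit_rate * t_cache_ms + (1.0 - hit_rate) * t_disk_ms

for h in (0.0, 0.5, 0.9, 0.99):
    print(f"hit rate {h:4.0%}: {effective_access_time(h):6.3f} ms")
```

Even with these rough figures, raising the hit rate from 90 % to 99 % cuts the mean access time by nearly an order of magnitude, which is why access locality and prefetching matter so much.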
Sequential file processing
In many applications, files are processed by a series of sequential data accesses, i.e. the processing sequence matches the sequence of the data on the disk. In this case, favorable hit rates are obtained automatically: prefetching reads large contiguous sets of data ahead of time and holds them in the cache.
Frequent read accesses to selected data areas
In many applications, certain data areas are accessed particularly frequently (e.g. catalogs, index areas and directories). In these cases it is advisable to buffer this data in fast storage to speed up access (DAB takes this principle into account). This also considerably reduces the total number of disk accesses, with positive effects on overall system performance.
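Both effects above can be illustrated with a minimal sketch (not DAB itself): an LRU read cache over fixed-size blocks with a configurable prefetching factor. All names and the block-based backend are assumptions for illustration. On a miss, the requested block and the next `prefetch - 1` blocks are fetched, so a sequential reader hits the cache on subsequent requests, while frequently used blocks stay resident through the LRU policy:

```python
from collections import OrderedDict

# Minimal sketch (illustrative, not DAB): an LRU read cache over fixed-size
# blocks with a configurable prefetching factor.

class ReadCache:
    def __init__(self, backend_read, capacity, prefetch=4):
        self.backend_read = backend_read   # function: block number -> data
        self.capacity = capacity           # max blocks held in the cache
        self.prefetch = prefetch           # blocks fetched per miss
        self.cache = OrderedDict()         # block number -> data (LRU order)
        self.hits = self.misses = 0

    def read(self, block):
        if block in self.cache:
            self.hits += 1
            self.cache.move_to_end(block)  # mark as most recently used
            return self.cache[block]
        self.misses += 1
        # Prefetch: fetch the requested block plus the following blocks.
        for b in range(block, block + self.prefetch):
            self._insert(b, self.backend_read(b))
        return self.cache[block]

    def _insert(self, block, data):
        self.cache[block] = data
        while len(self.cache) > self.capacity:
            self.cache.popitem(last=False)  # evict least recently used

# Sequential scan: only every 4th access misses; the rest are prefetch hits.
cache = ReadCache(backend_read=lambda b: f"block-{b}", capacity=64, prefetch=4)
for b in range(100):
    cache.read(b)
print(f"hits={cache.hits} misses={cache.misses}")  # prints hits=75 misses=25
```

With a prefetching factor of 4, the sequential scan achieves a 75 % hit rate; repeated accesses to the same hot blocks (catalogs, directories) would similarly be served from the cache until evicted.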