Reading from the Cache
There are four types of application reads; a counter-sampling sketch follows the list:
- In copy reads, data mapped into the cache is copied into the application's buffer so it can be read. An application's first read from a file is usually a copy read.
- Subsequent reads are usually fast reads, in which an application or other process reads directly from the cache rather than going through the file system.
- With pin reads, data is mapped into the cache just so it can be changed and then written back to disk. The data is pinned in the cache; that is, it is held at the same address and is not pageable, which prevents page faults.
- With read aheads, the Virtual Memory Manager recognizes that the application is reading a file sequentially and, predicting its read pattern, begins to map larger blocks of data into the cache. Read aheads are usually efficient and are a sign that data references are localized. However, some application read patterns might fool the prediction logic of the Virtual Memory Manager, causing it to read ahead when smaller reads would be more efficient. Only the application designer knows for sure!
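Each of these read types appears as a rate counter in the Cache object. The following is a minimal sketch, assuming the Performance Data Helper (PDH) API is available and that the English counter paths (such as \Cache\Copy Reads/sec) match your system; it takes two samples one second apart and prints the four rates.

```c
/*
 * Sketch: sample the four Cache read counters with the PDH API.
 * The counter paths below are assumptions based on the counter
 * names shown in Performance Monitor; adjust them if they differ.
 * Build (assumption): cl cachereads.c pdh.lib
 */
#include <windows.h>
#include <pdh.h>
#include <stdio.h>

int main(void)
{
    PDH_HQUERY query;
    PDH_HCOUNTER counters[4];
    const char *paths[4] = {
        "\\Cache\\Copy Reads/sec",
        "\\Cache\\Fast Reads/sec",
        "\\Cache\\Pin Reads/sec",
        "\\Cache\\Read Aheads/sec"
    };
    PDH_FMT_COUNTERVALUE value;
    int i;

    if (PdhOpenQuery(NULL, 0, &query) != ERROR_SUCCESS)
        return 1;
    for (i = 0; i < 4; i++)
        PdhAddCounterA(query, paths[i], 0, &counters[i]);

    /* Rate counters need two samples: collect, wait, collect again. */
    PdhCollectQueryData(query);
    Sleep(1000);
    PdhCollectQueryData(query);

    for (i = 0; i < 4; i++) {
        if (PdhGetFormattedCounterValue(counters[i], PDH_FMT_DOUBLE,
                                        NULL, &value) == ERROR_SUCCESS)
            printf("%-28s %10.1f\n", paths[i], value.doubleValue);
    }

    PdhCloseQuery(query);
    return 0;
}
```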
The following graph shows the frequency of the different kinds of cache reads during the run of a compiler. Because the intersecting curves are difficult to interpret, a report from a second copy of Performance Monitor, set to the same Time Window as the graph, is included below it.
In this example, copy reads are more frequent than fast reads. This pattern of many first reads and fewer repeat reads indicates that the application is probably reading from many small files. The rate of read aheads is also low, another indication that the application is skipping from file to file. When fast reads outnumber copy reads, the application is reading several times from the same files, and the rate of read aheads should rise as well.
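That interpretation can be reduced to a rough heuristic. The sketch below is only an illustration of the reasoning in this paragraph, not a documented rule; it compares sampled per-second rates and describes the likely read pattern.

```c
/* Heuristic interpretation aid based on the discussion above:
 * compare the sampled Cache read rates and describe the pattern
 * they suggest.  The threshold used for "low" read aheads is an
 * arbitrary illustrative choice. */
#include <stdio.h>

static void describe_read_pattern(double copyReads, double fastReads,
                                  double readAheads)
{
    if (copyReads > fastReads && readAheads < copyReads * 0.1)
        printf("Many first reads, few repeats: probably many small files.\n");
    else if (fastReads > copyReads)
        printf("Repeat reads dominate: probably rereading the same files "
               "(expect read aheads to rise too).\n");
    else
        printf("Mixed pattern: no strong conclusion from these rates alone.\n");
}

int main(void)
{
    /* Example figures only; substitute rates taken from Performance Monitor. */
    describe_read_pattern(120.0, 30.0, 5.0);
    describe_read_pattern(40.0, 160.0, 35.0);
    return 0;
}
```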