SANtricity 11.7

What is controller cache?

The controller cache is a physical memory space that streamlines two types of I/O (input/output) operations: between the controllers and hosts, and between the controllers and disks.

For read and write data transfers, the hosts and controllers communicate over high-speed connections. However, communication from the back end of the controller to the disks is slower, because disks are relatively slow devices.

When the controller cache receives data, the controller acknowledges to the host application that it is now holding the data. This way, the host application does not need to wait for the I/O to be written to disk; instead, it can continue operations. The cached data is also readily accessible to host applications, which eliminates the need for extra disk reads to access the data.
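
The following sketch is purely illustrative (Python, not SANtricity code; the class and method names are hypothetical). It models the write-back behavior described above: the controller acknowledges a host write as soon as the data is in cache, serves later reads from cache when possible, and destages dirty data to the slower disks in the background.

    class WriteBackCache:
        def __init__(self):
            self.cache = {}      # block address -> data held in controller memory
            self.dirty = set()   # blocks written to cache but not yet to disk
            self.disk = {}       # stand-in for the (slow) back-end disks

        def write(self, block, data):
            """Host write: store in cache and acknowledge immediately."""
            self.cache[block] = data
            self.dirty.add(block)
            return "ACK"         # host continues without waiting for the disk

        def read(self, block):
            """Host read: served from cache if present, else from disk."""
            if block in self.cache:
                return self.cache[block]      # cache hit, no disk access needed
            data = self.disk.get(block)       # cache miss, slower back-end read
            self.cache[block] = data
            return data

        def flush(self):
            """Background destage: write dirty cache blocks to disk."""
            for block in list(self.dirty):
                self.disk[block] = self.cache[block]
                self.dirty.discard(block)

    cache = WriteBackCache()
    print(cache.write(100, b"app data"))  # ACK returned before any disk I/O
    print(cache.read(100))                # served from cache, no disk read
    cache.flush()                         # data reaches disk asynchronously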

The controller cache affects the overall performance of the storage array in several ways:

  • The cache acts as a buffer, so that host and disk data transfers do not need to be synchronized.

  • The data for a read or write operation from the host might be in cache from a previous operation, which eliminates the need to access the disk.

  • If write caching is used, the host can send subsequent write commands before the data from a previous write operation is written to disk.

  • If cache prefetch is enabled, sequential read access is optimized. Cache prefetch makes a read operation more likely to find its data in the cache, instead of reading the data from disk.
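
To illustrate the last point, the following sketch (again Python and purely conceptual, not SANtricity code; the PREFETCH_DEPTH value and function name are hypothetical) shows how detecting consecutive block addresses lets a controller read ahead, so that subsequent sequential reads find their data in cache instead of going to disk.

    PREFETCH_DEPTH = 4   # illustrative read-ahead window, not a real setting

    def read_with_prefetch(block, cache, disk, last_block):
        """Return data for `block`, reading ahead on sequential access."""
        if block not in cache:
            cache[block] = disk.get(block)            # cache miss: go to disk
        if last_block is not None and block == last_block + 1:
            # Sequential pattern detected: pull the next blocks into cache.
            for ahead in range(block + 1, block + 1 + PREFETCH_DEPTH):
                cache.setdefault(ahead, disk.get(ahead))
        return cache[block], block

    disk = {n: "data-%d" % n for n in range(64)}
    cache, last = {}, None
    for n in (10, 11, 12):                            # sequential host reads
        data, last = read_with_prefetch(n, cache, disk, last)
    print(sorted(cache))  # blocks 10-16 cached; most were loaded by prefetch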

Caution

Possible loss of data — If you enable the Write caching without batteries option and do not have controller batteries or an uninterruptible power supply (UPS) to protect the cache, you could lose data.