Use StorageGRID load balancers

Learn about the role of a StorageGRID Gateway Node load balancer.

General guidance for implementing NetApp® StorageGRID® Gateway Nodes.

StorageGRID Gateway Node load balancer versus third-party load balancer

StorageGRID is unique among S3-compatible object storage vendors in that it provides a native load balancer available as a purpose-built appliance, VM, or container. The StorageGRID-provided load balancer is also referred to as a Gateway Node.

For customers who do not already own a load balancer (such as F5, Citrix, and so on), implementing a third-party load balancer can be complex. The StorageGRID load balancer greatly simplifies load-balancing operations.

The Gateway Node is an enterprise-grade, highly available, high-performance load balancer. Customers can choose to implement the Gateway Node, a third-party load balancer, or both in the same grid. The Gateway Node is a local traffic manager, not a global server load balancer (GSLB).

The StorageGRID load balancer provides the following advantages:

  • Simplicity. Automatic configuration of resource pools, health checks, patching, and maintenance, all managed by StorageGRID.

  • Performance. The StorageGRID load balancer is dedicated to StorageGRID, can provide high-performance caching, and does not compete with other applications for bandwidth.

  • Cost. The virtual machine (VM) and container versions are provided at no additional cost.

  • Traffic classification. The Advanced Traffic Classification feature provides StorageGRID-specific QoS rules along with workload analytics.

  • Future StorageGRID-specific features. StorageGRID will continue to optimize the load balancer and add innovative features to it in upcoming releases.

As an integrated node of StorageGRID, the local traffic manager can use advanced health checking to distribute requests based on Storage Node health, load, and resource availability. It can also distribute load across multiple sites when the StorageGRID link costs between the sites are set to 0. If the Storage Nodes in a site become unavailable but the Gateway Node in that site is still available, the load is automatically directed to another site in the grid.
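From the client's perspective, this distribution is transparent: applications simply point their S3 endpoint at the Gateway Node (typically the FQDN or virtual IP of a load balancer endpoint). The following is a minimal sketch using Python and boto3; the endpoint hostname, port, and credentials are hypothetical placeholders, not values defined by StorageGRID or this document.

  import boto3
  from botocore.config import Config

  # Hypothetical Gateway Node load balancer endpoint and S3 tenant credentials.
  # Replace with your own endpoint FQDN or virtual IP, port, and access keys.
  GATEWAY_ENDPOINT = "https://sg-gateway.example.com:10443"

  s3 = boto3.client(
      "s3",
      endpoint_url=GATEWAY_ENDPOINT,               # requests go to the Gateway Node, which
      aws_access_key_id="EXAMPLE_ACCESS_KEY",      # spreads them across healthy Storage Nodes
      aws_secret_access_key="EXAMPLE_SECRET_KEY",
      config=Config(s3={"addressing_style": "path"}, retries={"max_attempts": 3}),
  )

  # Ordinary S3 calls; the client needs no awareness of individual Storage Nodes.
  print([b["Name"] for b in s3.list_buckets()["Buckets"]])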

The load balancer caching feature of the Gateway Node is intended to provide a substantial performance improvement for certain workloads (such as AI training) that re-read a data set multiple times as part of processing that data.
Caching Gateway Nodes can also be deployed physically distant from the rest of the grid, enabling better performance and lower WAN utilization for some workloads. The cache operates in a read-back mode: writes are not cached and do not modify the state of the cache. Each caching Gateway Node operates independently of any other caching Gateway Node.
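To illustrate the kind of access pattern the read cache targets, the sketch below re-reads the same set of objects through the Gateway Node endpoint, as a training loop might do once per epoch. It reuses the hypothetical s3 client from the previous example; the bucket and object names are placeholders. Only the repeated GETs can benefit from the cache; writes bypass it.

  # Hypothetical re-read workload: the same objects are fetched once per epoch.
  # After the first pass, a caching Gateway Node can serve repeat reads from its
  # local cache instead of fetching the data from Storage Nodes again.
  BUCKET = "training-data"                           # placeholder bucket name
  keys = [f"shard-{i:04d}.bin" for i in range(8)]    # placeholder object keys

  for epoch in range(3):
      for key in keys:
          body = s3.get_object(Bucket=BUCKET, Key=key)["Body"].read()
          # ... feed `body` into the training pipeline ...

  # Writes are not cached and do not modify the state of the cache.
  s3.put_object(Bucket=BUCKET, Key="checkpoint-epoch-3.bin", Body=b"\x00" * 1024)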

For details about deploying the StorageGRID Gateway Node, see the StorageGRID documentation.