Cable the hardware for your ASA r2 storage system
After you install the rack hardware for your ASA r2 storage system, install the network cables for the controllers, and connect the cables between the controllers and storage shelves.
Contact your network administrator for information about connecting the storage system to your network switches.
- These procedures show common configurations; the specific cabling depends on the components ordered for your storage system. If you do not see your configuration in the cabling procedures, see NetApp Hardware Universe for comprehensive configuration and slot priority information.
- If you have an ASA A1K, ASA A70, or ASA A90 storage system, the I/O slots are numbered 1 through 11.
- The cabling graphics have arrow icons showing the proper orientation (up or down) of the cable connector pull-tab when inserting a connector into a port. As you insert the connector, you should feel it click into place; if you do not feel it click, remove it, turn it over, and try again.
- If you are cabling to an optical switch, insert the optical transceiver into the controller port before cabling to the switch port.
Step 1: Cable the cluster/HA connections
Cable the controllers to your ONTAP cluster. This procedure differs depending on your storage system model and I/O module configuration.
Note: The cluster interconnect traffic and the HA traffic share the same physical ports.
Create the ONTAP cluster connections. For switchless clusters, connect the controllers to each other. For switched clusters, connect the controllers to the cluster network switches.
Switchless cluster cabling
Use the Cluster/HA interconnect cables to connect ports e1a to e1a and ports e7a to e7a:
- Connect port e1a on Controller A to port e1a on Controller B.
- Connect port e7a on Controller A to port e7a on Controller B.
Cluster/HA interconnect cables
Switched cluster cabling
Use 100 GbE cables to connect ports e1a and e7a on each controller to the cluster network switches.
Note: Switched cluster configurations are supported in ONTAP 9.16.1 and later.
- Connect port e1a on Controller A and port e1a on Controller B to cluster network switch A.
- Connect port e7a on Controller A and port e7a on Controller B to cluster network switch B.
100 GbE cable
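If you track your cabling in a worksheet, you can sanity-check the plan before pulling cables. The following Python sketch is illustrative only and is not part of the NetApp procedure; it records the switchless port map from the example above and confirms that no controller port is used twice.

    # Illustrative only: record the switchless cluster/HA port map shown above
    # (ports e1a and e7a on each controller) and check that each port is cabled once.
    plan = [
        (("A", "e1a"), ("B", "e1a")),
        (("A", "e7a"), ("B", "e7a")),
    ]

    used = set()
    for end1, end2 in plan:
        for end in (end1, end2):
            assert end not in used, f"port {end} is cabled twice"
            used.add(end)
    print(f"{len(plan)} cables verified; {len(used)} controller ports, each used once")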
Create the ONTAP cluster connections. For switchless clusters, connect the controllers to each other. For switched clusters, connect the controllers to the cluster network switches.
Switchless cluster cabling
Connect the controllers to each other to create the ONTAP cluster connections.
ASA A30 and ASA A50 with two 2-port 40/100 GbE I/O modules
Cable the Cluster/HA interconnect connections. The cluster interconnect traffic and the HA traffic share the same physical ports, on the I/O modules in slots 2 and 4. The ports are 40/100 GbE.
- Connect controller A port e2a to controller B port e2a.
- Connect controller A port e4a to controller B port e4a.
Note: I/O module ports e2b and e4b are unused and available for host network connectivity.
100 GbE Cluster/HA interconnect cables
ASA A30 and ASA A50 with one 2-port 40/100 GbE I/O module
Cable the Cluster/HA interconnect connections. The cluster interconnect traffic and the HA traffic share the same physical ports, on the I/O module in slot 4. The ports are 40/100 GbE.
- Connect controller A port e4a to controller B port e4a.
- Connect controller A port e4b to controller B port e4b.
100 GbE Cluster/HA interconnect cables
ASA A20 with one 2-port 10/25 GbE I/O module
Cable the Cluster/HA interconnect connections. The cluster interconnect traffic and the HA traffic share the same physical ports, on the I/O module in slot 4. The ports are 10/25 GbE.
- Connect controller A port e4a to controller B port e4a.
- Connect controller A port e4b to controller B port e4b.
25 GbE Cluster/HA interconnect cables
Switched cluster cabling
Connect the controllers to the cluster network switches to create the ONTAP cluster connections.
ASA A30 or ASA A50 with two 2-port 40/100 GbE I/O modules
Cable the Cluster/HA interconnect connections. The cluster interconnect traffic and the HA traffic share the same physical ports, on the I/O modules in slots 2 and 4. The ports are 40/100 GbE.
- Connect controller A port e4a to cluster network switch A.
- Connect controller A port e2a to cluster network switch B.
- Connect controller B port e4a to cluster network switch A.
- Connect controller B port e2a to cluster network switch B.
Note: I/O module ports e2b and e4b are unused and available for host network connectivity.
40/100 GbE Cluster/HA interconnect cables
ASA A30 or ASA A50 with one 2-port 40/100 GbE I/O module
Cable the controllers to the cluster network switches. The cluster interconnect traffic and the HA traffic share the same physical ports, on the I/O module in slot 4. The ports are 40/100 GbE.
- Connect controller A port e4a to cluster network switch A.
- Connect controller A port e4b to cluster network switch B.
- Connect controller B port e4a to cluster network switch A.
- Connect controller B port e4b to cluster network switch B.
40/100 GbE Cluster/HA interconnect cables
ASA A20 with one 2-port 10/25 GbE I/O module
Cable the controllers to the cluster network switches. The cluster interconnect traffic and the HA traffic share the same physical ports, on the I/O module in slot 4. The ports are 10/25 GbE.
- Connect controller A port e4a to cluster network switch A.
- Connect controller A port e4b to cluster network switch B.
- Connect controller B port e4a to cluster network switch A.
- Connect controller B port e4b to cluster network switch B.
10/25 GbE Cluster/HA interconnect cables
Create the ONTAP cluster connections. For switchless clusters, connect the controllers to each other. For switched clusters, connect the controllers to the cluster network switches.
Switchless cluster cabling
Connect the controllers to each other to create the ONTAP cluster connections.
ASA C30 with two 2-port 40/100 GbE I/O modules
Cable the Cluster/HA interconnect connections. The cluster interconnect traffic and the HA traffic share the same physical ports, on the I/O modules in slots 2 and 4. The ports are 40/100 GbE.
- Connect controller A port e2a to controller B port e2a.
- Connect controller A port e4a to controller B port e4a.
Note: I/O module ports e2b and e4b are unused and available for host network connectivity.
100 GbE Cluster/HA interconnect cables
ASA C30 with one 2-port 40/100 GbE I/O module
Cable the Cluster/HA interconnect connections. The cluster interconnect traffic and the HA traffic share the same physical ports, on the I/O module in slot 4. The ports are 40/100 GbE.
- Connect controller A port e4a to controller B port e4a.
- Connect controller A port e4b to controller B port e4b.
100 GbE Cluster/HA interconnect cables
Switched cluster cabling
Connect the controllers to the cluster network switches to create the ONTAP cluster connections.
ASA C30 with two 2-port 40/100 GbE I/O modules
Cable the Cluster/HA interconnect connections. The cluster interconnect traffic and the HA traffic share the same physical ports, on the I/O modules in slots 2 and 4. The ports are 40/100 GbE.
- Connect controller A port e4a to cluster network switch A.
- Connect controller A port e2a to cluster network switch B.
- Connect controller B port e4a to cluster network switch A.
- Connect controller B port e2a to cluster network switch B.
Note: I/O module ports e2b and e4b are unused and available for host network connectivity.
40/100 GbE Cluster/HA interconnect cables
ASA C30 with one 2-port 40/100 GbE I/O module
Connect the controllers to the cluster network switches. The cluster interconnect traffic and the HA traffic share the same physical ports, on the I/O module in slot 4. The ports are 40/100 GbE.
- Connect controller A port e4a to cluster network switch A.
- Connect controller A port e4b to cluster network switch B.
- Connect controller B port e4a to cluster network switch A.
- Connect controller B port e4b to cluster network switch B.
40/100 GbE Cluster/HA interconnect cables
Step 2: Cable the host network connections
Connect the controllers to your host network.
This procedure differs depending on your storage system model and I/O module configuration.
Connect the Ethernet module ports to your host network.
The following are some typical host network cabling examples. See NetApp Hardware Universe for your specific system configuration.
- Connect ports e9a and e9b to your Ethernet data network switch.
  Note: For maximum system performance for cluster and HA traffic, do not use ports e1b and e7b for host network connections. Use a separate host card to maximize performance.
  100 GbE cables
- Connect the ports on the 4-port host card to your 10/25 GbE host network switches.
  10/25 GbE cables
Connect the Ethernet module ports or the Fibre Channel (FC) module ports to your host network.
Ethernet host cabling
ASA A30 and ASA A50 with two 2-port 40/100 GbE I/O modules
On each controller, connect ports e2b and e4b to the Ethernet host network switches.
Note: The ports on the I/O modules in slots 2 and 4 support 40/100 GbE host connectivity.
40/100 GbE cables

ASA A20, A30, and A50 with one 4-port 10/25 GbE I/O module
On each controller, connect ports e2a, e2b, e2c, and e2d to the Ethernet host network switches.
10/25 GbE cables

FC host cabling
ASA A20, A30, and A50 with one 4-port 64 Gb/s FC I/O module
On each controller, connect ports 1a, 1b, 1c, and 1d to the FC host network switches.
64 Gb/s FC cables
Connect the Ethernet module ports or the Fibre Channel (FC) module ports to your host network.
Ethernet host cabling
ASA C30 with two 2-port 40/100 GbE I/O modules
On each controller, cable ports e2b and e4b to the Ethernet host network switches.
Note: The ports on the I/O modules in slots 2 and 4 support 40/100 GbE host connectivity.
40/100 GbE cables

ASA C30 with one 4-port 10/25 GbE I/O module
On each controller, cable ports e2a, e2b, e2c, and e2d to the Ethernet host network switches.
10/25 GbE cables

FC host cabling
ASA C30 with one 4-port 64 Gb/s FC I/O module
On each controller, cable ports 1a, 1b, 1c, and 1d to the FC host network switches.
64 Gb/s FC cables
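Because the host-facing ports differ by model and I/O module configuration, it can help to keep the port lists from the examples above in one place when you prepare a cabling worksheet. The following Python sketch is a hypothetical illustration, not a NetApp tool; the configuration labels are invented, and the port lists simply restate the examples in this step.

    # Hypothetical cabling-worksheet helper; the configuration labels are
    # invented for illustration, and the port lists restate the examples above.
    HOST_PORTS = {
        "two 2-port 40/100 GbE modules": ["e2b", "e4b"],
        "one 4-port 10/25 GbE module": ["e2a", "e2b", "e2c", "e2d"],
        "one 4-port 64 Gb/s FC module": ["1a", "1b", "1c", "1d"],
    }

    def checklist(config: str) -> list[str]:
        """One line item per controller and host port for the given configuration."""
        return [
            f"Controller {ctlr}: connect port {port} to the host network switch"
            for ctlr in ("A", "B")
            for port in HOST_PORTS[config]
        ]

    for item in checklist("one 4-port 10/25 GbE module"):
        print(item)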
Step 3: Cable the management network connections
Connect the controllers to your management network.
Contact your network administrator for information about connecting your storage system to the management network switches.
Use the 1000BASE-T RJ-45 cables to connect the management (wrench) ports on each controller to the management network switches.
1000BASE-T RJ-45 cables
Note: Do not plug in the power cords yet.
Step 4: Cable the shelf connections
The following cabling procedures show how to connect your controllers to a storage shelf.
For the maximum number of shelves supported for your storage system and for all of your cabling options, such as optical and switch-attached, see NetApp Hardware Universe.
The ASA A1K storage systems support NS224 shelves with either the NSM100 or NSM100B module. The major differences between the modules are:
- NSM100 shelf modules use built-in ports e0a and e0b.
- NSM100B shelf modules use ports e1a and e1b in slot 1.
The following cabling examples show NSM100 modules in the NS224 shelves when referring to shelf module ports.
Choose one of the following cabling options that matches your setup.
Option 1: One NS224 storage shelf
Connect each controller to the NSM modules on the NS224 shelf. The graphics show cabling from each of the controllers: Controller A cabling is shown in blue and Controller B cabling is shown in yellow.
- On controller A, connect the following ports:
  - Connect port e11a to NSM A port e0a.
  - Connect port e11b to NSM B port e0b.
- On controller B, connect the following ports:
  - Connect port e11a to NSM B port e0a.
  - Connect port e11b to NSM A port e0b.
Option 2: Two NS224 storage shelves
Connect each controller to the NSM modules on both NS224 shelves. The graphics show cabling from each of the controllers: Controller A cabling is shown in blue and Controller B cabling is shown in yellow.
- On controller A, connect the following ports:
  - Connect port e11a to shelf 1 NSM A port e0a.
  - Connect port e11b to shelf 2 NSM B port e0b.
  - Connect port e10a to shelf 2 NSM A port e0a.
  - Connect port e10b to shelf 1 NSM B port e0b.
- On controller B, connect the following ports:
  - Connect port e11a to shelf 1 NSM B port e0a.
  - Connect port e11b to shelf 2 NSM A port e0b.
  - Connect port e10a to shelf 2 NSM B port e0a.
  - Connect port e10b to shelf 1 NSM A port e0b.
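The crossed pattern above is what gives each shelf module a path from both controllers. If you keep the plan in a worksheet, a small check like the following Python sketch (illustrative only, using the two-shelf port map above) can confirm that every NSM module is reached by both controllers and that no port is cabled twice.

    # Illustrative only: verify the two-shelf plan above gives every NSM module
    # one path from each controller, with each controller port used exactly once.
    from collections import defaultdict

    plan = {
        ("A", "e11a"): ("shelf 1", "NSM A", "e0a"),
        ("A", "e11b"): ("shelf 2", "NSM B", "e0b"),
        ("A", "e10a"): ("shelf 2", "NSM A", "e0a"),
        ("A", "e10b"): ("shelf 1", "NSM B", "e0b"),
        ("B", "e11a"): ("shelf 1", "NSM B", "e0a"),
        ("B", "e11b"): ("shelf 2", "NSM A", "e0b"),
        ("B", "e10a"): ("shelf 2", "NSM B", "e0a"),
        ("B", "e10b"): ("shelf 1", "NSM A", "e0b"),
    }

    # The dict keys guarantee each controller port appears once; check the shelf side.
    reached = defaultdict(set)
    shelf_ports = set()
    for (ctlr, _), (shelf, nsm, port) in plan.items():
        reached[(shelf, nsm)].add(ctlr)
        assert (shelf, nsm, port) not in shelf_ports, "shelf port cabled twice"
        shelf_ports.add((shelf, nsm, port))

    for module, ctlrs in sorted(reached.items()):
        assert ctlrs == {"A", "B"}, f"{module} is not reachable from both controllers"
    print("Multipath OK: every NSM module has one path from each controller")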
The ASA A70 and ASA A90 storage systems support NS224 shelves with either the NSM100 or NSM100B module. The major differences between the modules are:
- NSM100 shelf modules use built-in ports e0a and e0b.
- NSM100B shelf modules use ports e1a and e1b in slot 1.
The following cabling examples show NSM100 modules in the NS224 shelves when referring to shelf module ports.
Choose one of the following cabling options that matches your setup.
Option 1: One NS224 storage shelf
Connect each controller to the NSM modules on the NS224 shelf. The graphics show cabling from each of the controllers: Controller A cabling is shown in blue and Controller B cabling is shown in yellow.
100 GbE QSFP28 copper cables
- Connect controller A port e11a to NSM A port e0a.
- Connect controller A port e11b to NSM B port e0b.
- Connect controller B port e11a to NSM B port e0a.
- Connect controller B port e11b to NSM A port e0b.
Option 2: Two NS224 storage shelves
Connect each controller to the NSM modules on both NS224 shelves. The graphics show cabling from each of the controllers: Controller A cabling is shown in blue and Controller B cabling is shown in yellow.
100 GbE QSFP28 copper cables
- On controller A, connect the following ports:
  - Connect port e11a to shelf 1, NSM A port e0a.
  - Connect port e11b to shelf 2, NSM B port e0b.
  - Connect port e8a to shelf 2, NSM A port e0a.
  - Connect port e8b to shelf 1, NSM B port e0b.
- On controller B, connect the following ports:
  - Connect port e11a to shelf 1, NSM B port e0a.
  - Connect port e11b to shelf 2, NSM A port e0b.
  - Connect port e8a to shelf 2, NSM B port e0a.
  - Connect port e8b to shelf 1, NSM A port e0b.
The following NS224 shelf cabling procedure shows NSM100B modules instead of NSM100 modules. The cabling is the same regardless of the type of NSM modules used; only the port names are different:
- NSM100B modules use ports e1a and e1b on an I/O module in slot 1.
- NSM100 modules use built-in (onboard) ports e0a and e0b.
Cable each controller to each NSM module on the NS224 shelf using the storage cables that came with your storage system, which could be the following cable type:
100 GbE QSFP28 copper cables
The graphics show controller A cabling in blue and controller B cabling in yellow.
- Connect controller A to the shelf:
  - Connect controller A port e3a to NSM A port e1a.
  - Connect controller A port e3b to NSM B port e1b.
- Connect controller B to the shelf:
  - Connect controller B port e3a to NSM B port e1a.
  - Connect controller B port e3b to NSM A port e1b.
After you've connected the controllers to your networks and to your storage shelves, you're ready to power on the ASA r2 storage system.