Cable the hardware for your ASA r2 storage system

After you install the rack hardware for your ASA r2 storage system, install the network cables for the controllers, and connect the cables between the controllers and storage shelves.

Before you begin

Contact your network administrator for information about connecting the storage system to your network switches.

About this task
  • These procedures show common configurations. The specific cabling depends on the components ordered for your storage system. If you do not see your configuration here, see NetApp Hardware Universe for comprehensive configuration and slot priority information.

  • If you have an ASA A1K, ASA A70, or ASA A90 storage system, the I/O slots are numbered 1 through 11. Port names indicate the slot; for example, port e11a is port a on the I/O module in slot 11.

    [Diagram: Slot numbering on an ASA A1K, ASA A70, or ASA A90 controller]
  • The cabling graphics have arrow icons showing the proper orientation (up or down) of the cable connector pull-tab when inserting a connector into a port.

    As you insert the connector, you should feel it click into place; if you do not feel it click, remove it, turn it over and try again.

    [Image: Cable connector pull-tab orientation]

  • If cabling to an optical switch, insert the optical transceiver into the controller port before cabling to the switch port.

Step 1: Cable the cluster/HA connections

Cable the controllers to your ONTAP cluster. This procedure differs depending on your storage system model and I/O module configuration.

Note The cluster interconnect traffic and the HA traffic share the same physical ports.
A1K

Create the ONTAP cluster connections. For switchless clusters, connect the controllers to each other. For switched clusters, connect the controllers to the cluster network switches.

Switchless cluster cabling

Use the Cluster/HA interconnect cable to connect ports e1a to e1a and ports e7a to e7a.

Steps
  1. Connect port e1a on Controller A to port e1a on Controller B.

  2. Connect port e7a on Controller A to port e7a on Controller B.

    [Image: Cluster/HA interconnect cable]
    [Diagram: Two-node switchless cluster cabling]
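Later in the setup process, after both controllers are powered on and the cluster has been created, you can spot-check switchless cluster cabling from the ONTAP CLI. The sketch below is a suggested check, not part of this procedure; in a correctly cabled two-node switchless cluster, each cluster port discovers the partner controller as its neighbor.

```
# Suggested post-setup check (run from either node): list the device
# discovered on each cabled port. In this A1K example, ports e1a and
# e7a on each controller should report the partner controller.
::> network device-discovery show
```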
Switched cluster cabling

Use the 100 GbE cable to connect ports e1a to e1a and ports e7a to e7a.

Note Switched cluster configurations are supported in ONTAP 9.16.1 and later.
Steps
  1. Connect port e1a on Controller A and port e1a on Controller B to cluster network switch A.

  2. Connect port e7a on Controller A and port e7a on Controller B to cluster network switch B.

    [Image: 100 GbE cable]
    [Diagram: Cluster connections cabled to the cluster network switches]
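For a switched cluster, the same post-setup neighbor check should show the cluster network switches instead of the partner controller. A hedged sketch; the switch health command family name varies by ONTAP release:

```
# Each cluster port (e1a and e7a in this example) should discover a
# cluster network switch rather than the partner controller.
::> network device-discovery show

# Confirm that switch health monitoring recognizes both cluster
# switches ("system cluster-switch show" in older releases).
::> system switch ethernet show
```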
A70 and A90

Create the ONTAP cluster connections. For switchless clusters, connect the controllers to each other. For switched clusters, connect the controllers to the cluster network switches.

Switchless cluster cabling

Use the Cluster/HA interconnect cable to connect ports e1a to e1a and ports e7a to e7a.

Steps
  1. Connect port e1a on Controller A to port e1a on Controller B.

  2. Connect port e7a on Controller A to port e7a on Controller B.

    [Image: Cluster/HA interconnect cable]
    [Diagram: Two-node switchless cluster cabling]
Switched cluster cabling

Use the 100 GbE cable to connect ports e1a to e1a and ports e7a to e7a.

Note Switched cluster configurations are supported in ONTAP 9.16.1 and later.
Steps
  1. Connect port e1a on Controller A and port e1a on Controller B to cluster network switch A.

  2. Connect port e7a on Controller A and port e7a on Controller B to cluster network switch B.

    [Image: 100 GbE cable]
    [Diagram: Cluster connections cabled to the cluster network switches]
A20, A30, and A50

Create the ONTAP cluster connections. For switchless clusters, connect the controllers to each other. For switched clusters, connect the controllers to the cluster network switches.

Switchless cluster cabling

Connect the controllers to each other to create the ONTAP cluster connections.

ASA A30 and ASA A50 with two 2-port 40/100 GbE I/O modules
Steps
  1. Connect the Cluster/HA interconnect connections:

    Note The cluster interconnect traffic and the HA traffic share the same physical ports (on the I/O modules in slots 2 and 4). The ports are 40/100 GbE.
    1. Connect controller A port e2a to controller B port e2a.

    2. Connect controller A port e4a to controller B port e4a.

      Note I/O module ports e2b and e4b are unused and available for host network connectivity.

      [Image: 100 GbE Cluster/HA interconnect cable]
      [Diagram: ASA A30/A50 switchless cluster cabling using two 100 GbE I/O modules]
ASA A30 and ASA A50 with one 2-port 40/100 GbE I/O module
Steps
  1. Connect the Cluster/HA interconnect connections:

    Note The cluster interconnect traffic and the HA traffic share the same physical ports (on the I/O module in slot 4). The ports are 40/100 GbE.
    1. Connect controller A port e4a to controller B port e4a.

    2. Connect controller A port e4b to controller B port e4b.

      [Image: 100 GbE Cluster/HA interconnect cable]
      [Diagram: ASA A30/A50 switchless cluster cabling using one 100 GbE I/O module]
ASA A20 with one 2-port 10/25 GbE I/O module
Steps
  1. Connect the Cluster/HA interconnect connections:

    Note The cluster interconnect traffic and the HA traffic share the same physical ports (on the I/O module in slot 4). The ports are 10/25 GbE.
    1. Connect controller A port e4a to controller B port e4a.

    2. Connect controller A port e4b to controller B port e4b.

      [Image: 25 GbE Cluster/HA interconnect cable (SFP copper connector)]
      [Diagram: ASA A20 switchless cluster cabling using one 25 GbE I/O module]

Switched cluster cabling

Connect the controllers to the cluster network switches to create the ONTAP cluster connections.

ASA A30 or ASA A50 with two 2-port 40/100 GbE I/O modules
Steps
  1. Cable the Cluster/HA interconnect connections:

    Note The cluster interconnect traffic and the HA traffic share the same physical ports (on the I/O modules in slots 2 and 4). The ports are 40/100 GbE.
    1. Connect controller A port e4a to cluster network switch A.

    2. Connect controller A port e2a to cluster network switch B.

    3. Connect controller B port e4a to cluster network switch A.

    4. Connect controller B port e2a to cluster network switch B.

      Note I/O module ports e2b and e4b are unused and available for host network connectivity.

      [Image: 40/100 GbE Cluster/HA interconnect cable]
      [Diagram: ASA A30/A50 switched cluster cabling using two 100 GbE I/O modules]
ASA A30 or ASA A50 with one 2-port 40/100 GbE I/O module
Steps
  1. Cable the controllers to the cluster network switches:

    Note The cluster interconnect traffic and the HA traffic share the same physical ports (on the I/O module in slot 4). The ports are 40/100 GbE.
    1. Connect controller A port e4a to cluster network switch A.

    2. Connect controller A port e4b to cluster network switch B.

    3. Connect controller B port e4a to cluster network switch A.

    4. Connect controller B port e4b to cluster network switch B.

      [Image: 40/100 GbE Cluster/HA interconnect cable]
      [Diagram: Cluster connections cabled to the cluster network switches]
ASA A20 with one 2-port 10/25 GbE I/O module
Steps
  1. Cable the controllers to the cluster network switches:

    Note The cluster interconnect traffic and the HA traffic share the same physical ports (on the I/O module in slot 4). The ports are 10/25 GbE.
    1. Connect controller A port e4a to cluster network switch A.

    2. Connect controller A port e4b to cluster network switch B.

    3. Connect controller B port e4a to cluster network switch A.

    4. Connect controller B port e4b to cluster network switch B.

      [Image: 10/25 GbE Cluster/HA interconnect cable (SFP copper connector)]
      [Diagram: ASA A20 switched cluster cabling using one 25 GbE I/O module]
C30

Create the ONTAP cluster connections. For switchless clusters, connect the controllers to each other. For switched clusters, connect the controllers to the cluster network switches.

Switchless cluster cabling

Connect the controllers to each other to create the ONTAP cluster connections.

ASA C30 with two 2-port 40/100 GbE I/O modules
Steps
  1. Cable the Cluster/HA interconnect connections:

    Note The cluster interconnect traffic and the HA traffic share the same physical ports (on the I/O modules in slots 2 and 4). The ports are 40/100 GbE.
    1. Connect controller A port e2a to controller B port e2a.

    2. Connect controller A port e4a to controller B port e4a.

      Note I/O module ports e2b and e4b are unused and available for host network connectivity.

      [Image: 100 GbE Cluster/HA interconnect cable]
      [Diagram: Switchless cluster cabling using two 100 GbE I/O modules]
ASA C30 with one 2-port 40/100 GbE I/O module
Steps
  1. Cable the Cluster/HA interconnect connections:

    Note The cluster interconnect traffic and the HA traffic share the same physical ports (on the I/O module in slot 4). The ports are 40/100 GbE.
    1. Connect controller A port e4a to controller B port e4a.

    2. Connect controller A port e4b to controller B port e4b.

      [Image: 100 GbE Cluster/HA interconnect cable]
      [Diagram: ASA C30 switchless cluster cabling using one 100 GbE I/O module]

Switched cluster cabling

Connect the controllers to the cluster network switches to create the ONTAP cluster connections.

ASA C30 with two 2-port 40/100 GbE I/O modules
Steps
  1. Cable the Cluster/HA interconnect connections:

    Note The cluster interconnect traffic and the HA traffic share the same physical ports (on the I/O modules in slots 2 and 4). The ports are 40/100 GbE.
    1. Connect controller A port e4a to cluster network switch A.

    2. Connect controller A port e2a to cluster network switch B.

    3. Connect controller B port e4a to cluster network switch A.

    4. Connect controller B port e2a to cluster network switch B.

      Note I/O module ports e2b and e4b are unused and available for host network connectivity.

      [Image: 40/100 GbE Cluster/HA interconnect cable]
      [Diagram: ASA C30 switched cluster cabling using two 100 GbE I/O modules]
ASA C30 with one 2-port 40/100 GbE I/O module
Steps
  1. Connect the controllers to the cluster network switches:

    Note The cluster interconnect traffic and the HA traffic share the same physical ports (on the I/O module in slot 4). The ports are 40/100 GbE.
    1. Connect controller A port e4a to cluster network switch A.

    2. Connect controller A port e4b to cluster network switch B.

    3. Connect controller B port e4a to cluster network switch A.

    4. Connect controller B port e4b to cluster network switch B.

      [Image: 40/100 GbE Cluster/HA interconnect cable]
      [Diagram: Cluster connections cabled to the cluster network switches]
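Whichever model and cabling option you use, the cluster/HA links can be confirmed from the ONTAP CLI once the system is powered on and the cluster has been created. A minimal sketch of the standard health checks:

```
# All ports in the Cluster IPspace should be up at the expected speed.
::> network port show -ipspace Cluster

# Both nodes should be healthy, eligible cluster members.
::> cluster show

# Because the cluster ports also carry HA traffic, verify that
# storage failover reports takeover as possible on both nodes.
::> storage failover show
```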

Step 2: Cable the host network connections

Connect the controllers to your host network.

This procedure differs depending on your storage system model and I/O module configuration.

A1K

Connect the Ethernet module ports to your host network.

The following are some typical host network cabling examples. See NetApp Hardware Universe for your specific system configuration.

Steps
  1. Connect ports e9a and e9b to your Ethernet data network switch.

    Note For maximum system performance for cluster and HA traffic, do not use ports e1b and e7b for host network connections. Use a separate host card to maximize performance.

    [Image: 100 GbE cable]
    [Diagram: Cabling to a 100 GbE Ethernet host network]
  2. Connect the 10/25 GbE module ports to your host network switches.

    [Image: 10/25 GbE cable]
    [Diagram: Cabling to a 10/25 GbE Ethernet host network]
A70 and A90

Connect the Ethernet module ports to your host network.

The following are some typical host network cabling examples. See NetApp Hardware Universe for your specific system configuration.

Steps
  1. Connect ports e9a and e9b to your Ethernet data network switch.

    Note For maximum system performance for cluster and HA traffic, do not use ports e1b and e7b for host network connections. Use a separate host card to maximize performance.

    [Image: 100 GbE cable]
    [Diagram: Cabling to a 100 GbE Ethernet host network]
  2. Connect the 10/25 GbE module ports to your host network switches.

    [Image: 10/25 GbE cable (4-port module)]
    [Diagram: Cabling to a 10/25 GbE Ethernet host network]
A20, A30, and A50

Connect the Ethernet module ports or the Fibre Channel (FC) module ports to your host network.

Ethernet host cabling

ASA A30 and ASA A50 with two 2-port 40/100 GbE I/O modules

On each controller, connect ports e2b and e4b to the Ethernet host network switches.

Note The ports on the I/O modules in slots 2 and 4 are 40/100 GbE; host connections can run at either 40 GbE or 100 GbE.

[Image: 40/100 GbE cable]
[Diagram: Cabling to 40/100 GbE Ethernet host network switches]
ASA A20, A30, and A50 with one 4-port 10/25 GbE I/O module

On each controller, connect ports e2a, e2b, e2c and e2d to the Ethernet host network switches.

[Image: 10/25 GbE cable (SFP copper connector)]
[Diagram: Cabling to 10/25 GbE Ethernet host network switches]

FC host cabling

ASA A20, A30, and A50 with one 4-port 64 Gb/s FC I/O module

On each controller, connect ports 1a, 1b, 1c and 1d to the FC host network switches.

[Image: 64 Gb/s FC cable]
[Diagram: Cabling to 64 Gb/s FC host network switches]
C30

Connect the Ethernet module ports or the Fibre Channel (FC) module ports to your host network.

Ethernet host cabling

ASA C30 with two 2-port 40/100 GbE I/O modules
Steps
  1. On each controller, cable ports e2b and e4b to the Ethernet host network switches.

    Note The ports on the I/O modules in slots 2 and 4 are 40/100 GbE; host connections can run at either 40 GbE or 100 GbE.

    [Image: 40/100 GbE cable]
    [Diagram: Cabling to 40/100 GbE Ethernet host network switches]
ASA C30 with one 4-port 10/25 GbE I/O module
Steps
  1. On each controller, cable ports e2a, e2b, e2c and e2d to the Ethernet host network switches.

    [Image: 10/25 GbE cable (SFP copper connector)]
    [Diagram: Cabling to 10/25 GbE Ethernet host network switches]
ASA C30 with one 4-port 64 Gb/s FC I/O module
Steps
  1. On each controller, cable ports 1a, 1b, 1c and 1d to the FC host network switches.

    [Image: 64 Gb/s FC cable]
    [Diagram: Cabling to 64 Gb/s FC host network switches]
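Host-side cabling for any of these models can be sanity-checked from the ONTAP CLI after setup. A minimal sketch; which command applies depends on whether you cabled Ethernet or FC I/O modules:

```
# Ethernet host ports: confirm the link is up and the negotiated
# speed matches the switch port configuration.
::> network port show

# FC host ports (FC configurations only): confirm the target
# adapters are online.
::> network fcp adapter show
```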

Step 3: Cable the management network connections

Connect the controllers to your management network.

Contact your network administrator for information about connecting your storage system to the management network switches.

A1K

Use the 1000BASE-T RJ-45 cables to connect the management (wrench) ports on each controller to the management network switches.

[Image: 1000BASE-T RJ-45 cable]
[Diagram: Management (wrench) port connections to the management network]
Important Do not plug in the power cords yet.
A70 and A90

Use the 1000BASE-T RJ-45 cables to connect the management (wrench) ports on each controller to the management network switches.

[Image: 1000BASE-T RJ-45 cable]
[Diagram: Management (wrench) port connections to the management network]
Important Do not plug in the power cords yet.
A20, A30, and A50

Connect the management (wrench) ports on each controller to the management network switches.

[Image: 1000BASE-T RJ-45 cable]
[Diagram: Management (wrench) port connections to the management network]
Important Do not plug in the power cords yet.
C30

Connect the management (wrench) ports on each controller to the management network switches.

[Image: 1000BASE-T RJ-45 cable]
[Diagram: Management (wrench) port connections to the management network]
Important Do not plug in the power cords yet.
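Management addresses are assigned later, during cluster setup; on these systems the wrench port is typically shared by the BMC and the e0M node-management interface. As a hedged sketch of post-setup checks:

```
# Confirm the BMC/SP network configuration on the wrench port.
::> system service-processor network show

# Confirm the node and cluster management LIFs are up.
::> network interface show
```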

Step 4: Cable the shelf connections

The following cabling procedures show how to connect your controllers to a storage shelf.

For the maximum number of shelves supported for your storage system and for all of your cabling options, such as optical and switch-attached, see NetApp Hardware Universe.

A1K

The ASA A1K storage systems support NS224 shelves with either the NSM100 or NSM100B module. The major differences between the modules are:

  • NSM100 shelf modules use built-in ports e0a and e0b.

  • NSM100B shelf modules use ports e1a and e1b in slot 1.

The following cabling examples show NS224 shelves with NSM100 modules; the shelf module port names refer to NSM100 ports.

Choose one of the following cabling options that matches your setup.

Option 1: One NS224 storage shelf

Connect each controller to the NSM modules on the NS224 shelf. The graphics show cabling from each of the controllers: Controller A cabling is shown in blue and Controller B cabling is shown in yellow.

Steps
  1. On controller A, connect the following ports:

    1. Connect port e11a to NSM A port e0a.

    2. Connect port e11b to NSM B port e0b.

      [Diagram: Controller A ports e11a and e11b cabled to a single NS224 shelf]

  2. On controller B, connect the following ports:

    1. Connect port e11a to NSM B port e0a.

    2. Connect port e11b to NSM A port e0b.

      [Diagram: Controller B ports e11a and e11b cabled to a single NS224 shelf]

Option 2: Two NS224 storage shelves

Connect each controller to the NSM modules on both NS224 shelves. The graphics show cabling from each of the controllers: Controller A cabling is shown in blue and Controller B cabling is shown in yellow.

Steps
  1. On controller A, connect the following ports:

    1. Connect port e11a to shelf 1 NSM A port e0a.

    2. Connect port e11b to shelf 2 NSM B port e0b.

    3. Connect port e10a to shelf 2 NSM A port e0a.

    4. Connect port e10b to shelf 1 NSM B port e0b.

      [Diagram: Controller A connections to two NS224 shelves]

  2. On controller B, connect the following ports:

    1. Connect port e11a to shelf 1 NSM B port e0a.

    2. Connect port e11b to shelf 2 NSM A port e0b.

    3. Connect port e10a to shelf 2 NSM B port e0a.

    4. Connect port e10b to shelf 1 NSM A port e0b.

      [Diagram: Controller B connections to two NS224 shelves]

A70 and A90

The ASA A70 and ASA A90 storage systems support NS224 shelves with either the NSM100 or NSM100B module. The major differences between the modules are:

  • NSM100 shelf modules use built-in ports e0a and e0b.

  • NSM100B shelf modules use ports e1a and e1b in slot 1.

The following cabling examples show NS224 shelves with NSM100 modules; the shelf module port names refer to NSM100 ports.

Choose one of the following cabling options that matches your setup.

Option 1: One NS224 storage shelf

Connect each controller to the NSM modules on the NS224 shelf. The graphics show cabling from each of the controllers: Controller A cabling is shown in blue and Controller B cabling is shown in yellow.

[Image: 100 GbE QSFP28 copper cable]
Steps
  1. Connect controller A port e11a to NSM A port e0a.

  2. Connect controller A port e11b to NSM B port e0b.

    [Diagram: Controller A ports e11a and e11b cabled to a single NS224 shelf]

  3. Connect controller B port e11a to NSM B port e0a.

  4. Connect controller B port e11b to NSM A port e0b.

    [Diagram: Controller B ports e11a and e11b cabled to a single NS224 shelf]

Option 2: Two NS224 storage shelves

Connect each controller to the NSM modules on both NS224 shelves. The graphics show cabling from each of the controllers: Controller A cabling is shown in blue and Controller B cabling is shown in yellow.

[Image: 100 GbE QSFP28 copper cable]
Steps
  1. On controller A, connect the following ports:

    1. Connect port e11a to shelf 1, NSM A port e0a.

    2. Connect port e11b to shelf 2, NSM B port e0b.

    3. Connect port e8a to shelf 2, NSM A port e0a.

    4. Connect port e8b to shelf 1, NSM B port e0b.

      [Diagram: Controller A connections to two NS224 shelves]

  2. On controller B, connect the following ports:

    1. Connect port e11a to shelf 1, NSM B port e0a.

    2. Connect port e11b to shelf 2, NSM A port e0b.

    3. Connect port e8a to shelf 2, NSM B port e0a.

    4. Connect port e8b to shelf 1, NSM A port e0b.

      [Diagram: Controller B connections to two NS224 shelves]

A20, A30, and A50

The NS224 shelf cabling procedure shows NSM100B modules instead of NSM100 modules. The cabling is the same regardless of the type of NSM modules used; only the port names differ:

  • NSM100B modules use ports e1a and e1b on an I/O module in slot 1.

  • NSM100 modules use built-in (onboard) ports e0a and e0b.

You cable each controller to each NSM module on the NS224 shelf using the storage cables that came with your storage system, which could be the following cable type:

[Image: 100 GbE QSFP28 copper cable]

The graphics show controller A cabling in blue and controller B cabling in yellow.

Steps
  1. Connect controller A to the shelf:

    1. Connect controller A port e3a to NSM A port e1a.

    2. Connect controller A port e3b to NSM B port e1b.

      [Diagram: Controller A ports e3a and e3b cabled to one NS224 shelf]

  2. Connect controller B to the shelf:

    1. Connect controller B port e3a to NSM B port e1a.

    2. Connect controller B port e3b to NSM A port e1b.

      [Diagram: Controller B ports e3a and e3b cabled to one NS224 shelf]

C30

The NS224 shelf cabling procedure shows NSM100B modules instead of NSM100 modules. The cabling is the same regardless of the type of NSM modules used; only the port names differ:

  • NSM100B modules use ports e1a and e1b on an I/O module in slot 1.

  • NSM100 modules use built-in (onboard) ports e0a and e0b.

You cable each controller to each NSM module on the NS224 shelf using the storage cables that came with your storage system, which could be the following cable type:

[Image: 100 GbE QSFP28 copper cable]

The graphics show controller A cabling in blue and controller B cabling in yellow.

Steps
  1. Connect controller A to the shelf:

    1. Connect controller A port e3a to NSM A port e1a.

    2. Connect controller A port e3b to NSM B port e1b.

      [Diagram: Controller A ports e3a and e3b cabled to one NS224 shelf]

  2. Connect controller B to the shelf:

    1. Connect controller B port e3a to NSM B port e1a.

    2. Connect controller B port e3b to NSM A port e1b.

      [Diagram: Controller B ports e3a and e3b cabled to one NS224 shelf]
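Shelf cabling can be verified after the system is powered on in the next step. A minimal sketch of the checks; NetApp's Active IQ Config Advisor tool can also validate cabling end to end:

```
# Each NS224 shelf should be listed with a healthy state.
::> storage shelf show

# Every drive in the shelves should be visible to the cluster.
::> storage disk show
```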

What's next?

After you've connected the controllers to your network and to your storage shelves, power on the ASA r2 storage system.