Cable the hardware - AFF A20, AFF A30, and AFF A50

After you install your AFF A20, AFF A30, or AFF A50 storage system hardware, cable the controllers to the network and shelves.

Before you begin

Contact your network administrator for information about connecting the storage system to your network switches.

About this task
  • The cluster/HA and host network cabling procedures show common configurations. Keep in mind that the specific cabling depends on the components ordered for your storage system. For comprehensive configuration and slot priority details, see NetApp Hardware Universe.

  • The cabling graphics have arrow icons showing the proper orientation (up or down) of the cable connector pull-tab when inserting a connector into a port.

    As you insert the connector, you should feel it click into place; if you do not feel it click, remove it, turn it over and try again.

  • If cabling to an optical switch, insert the optical transceiver into the controller port before cabling to the switch port.

Step 1: Cable the cluster/HA connections

Cable the controllers to your ONTAP cluster. This procedure differs depending on your storage system model and I/O module configuration.

Switchless cluster cabling
AFF A30 or AFF A50 with two 2-port 40/100 GbE I/O modules

Cable the controllers to each other to create the ONTAP cluster connections.

Steps
  1. Cable the Cluster/HA interconnect connections:

    Note The cluster interconnect traffic and the HA traffic share the same physical ports (on the I/O modules in slots 2 and 4). The ports are 40/100 GbE.
    1. Cable controller A port e2a to controller B port e2a.

    2. Cable controller A port e4a to controller B port e4a.

      Note I/O module ports e2b and e4b are unused and available for host network connectivity.

      100 GbE Cluster/HA interconnect cables

      Diagram: AFF A30 or AFF A50 switchless cluster cabling using two 40/100 GbE I/O modules
AFF A30 or AFF A50 with one 2-port 40/100 GbE I/O module

Cable the controllers to each other to create the ONTAP cluster connections.

Steps
  1. Cable the Cluster/HA interconnect connections:

    Note The cluster interconnect traffic and the HA traffic share the same physical ports (on the I/O module in slot 4). The ports are 40/100 GbE.
    1. Cable controller A port e4a to controller B port e4a.

    2. Cable controller A port e4b to controller B port e4b.

      100 GbE Cluster/HA interconnect cables

      Diagram: AFF A30 or AFF A50 switchless cluster cabling using one 40/100 GbE I/O module
AFF A20 with one 2-port 10/25 GbE I/O module

Cable the controllers to each other to create the ONTAP cluster connections.

Steps
  1. Cable the Cluster/HA interconnect connections:

    Note The cluster interconnect traffic and the HA traffic share the same physical ports (on the I/O module in slot 4). The ports are 10/25 GbE.
    1. Cable controller A port e4a to controller B port e4a.

    2. Cable controller A port e4b to controller B port e4b.

      25 GbE Cluster/HA interconnect cables

      Diagram: AFF A20 switchless cluster cabling using one 10/25 GbE I/O module
Switched cluster cabling
AFF A30 or AFF A50 with two 2-port 40/100 GbE I/O modules

Cable the controllers to the cluster network switches to create the ONTAP cluster connections.

Steps
  1. Cable the Cluster/HA interconnect connections:

    Note The cluster interconnect traffic and the HA traffic share the same physical ports (on the I/O modules in slots 2 and 4). The ports are 40/100 GbE.
    1. Cable controller A port e4a to cluster network switch A.

    2. Cable controller A port e2a to cluster network switch B.

    3. Cable controller B port e4a to cluster network switch A.

    4. Cable controller B port e2a to cluster network switch B.

      Note I/O module ports e2b and e4b are unused and available for host network connectivity.

      40/100 GbE Cluster/HA interconnect cables

      Diagram: AFF A30 or AFF A50 switched cluster cabling using two 40/100 GbE I/O modules
AFF A30 or AFF A50 with one 2-port 40/100 GbE I/O module

Cable the controllers to the cluster network switches to create the ONTAP cluster connections.

Steps
  1. Cable the controllers to the cluster network switches:

    Note The cluster interconnect traffic and the HA traffic share the same physical ports (on the I/O module in slot 4). The ports are 40/100 GbE.
    1. Cable controller A port e4a to cluster network switch A.

    2. Cable controller A port e4b to cluster network switch B.

    3. Cable controller B port e4a to cluster network switch A.

    4. Cable controller B port e4b to cluster network switch B.

      40/100 GbE Cluster/HA interconnect cables

      Diagram: AFF A30 or AFF A50 switched cluster cabling using one 40/100 GbE I/O module
AFF A20 with one 2-port 10/25 GbE I/O module

Cable the controllers to the cluster network switches to create the ONTAP cluster connections.

  1. Cable the controllers to the cluster network switches:

    Note The cluster interconnect traffic and the HA traffic share the same physical ports (on the I/O module in slot 4). The ports are 10/25 GbE.
    1. Cable controller A port e4a to cluster network switch A.

    2. Cable controller A port e4b to cluster network switch B.

    3. Cable controller B port e4a to cluster network switch A.

    4. Cable controller B port e4b to cluster network switch B.

      10/25 GbE Cluster/HA interconnect cables

      Diagram: AFF A20 switched cluster cabling using one 10/25 GbE I/O module
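
After the storage system is powered on and the ONTAP cluster is set up (covered in later procedures), you can optionally confirm the Cluster/HA interconnect cabling from the ONTAP CLI. The following commands are a general example only and are not part of this cabling procedure; output varies by model and configuration.

  • Show the device and port that each controller port detects on its link (useful for confirming port-to-port or port-to-switch cabling):

    network device-discovery show

  • Confirm that the HA interconnect between the two controllers is up:

    storage failover show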

Step 2: Cable the host network connections

Cable the controllers to your host network.

This procedure differs depending on your storage system model and I/O module configuration.

AFF A30 or AFF A50 with two 2-port 40/100 GbE I/O modules
Steps
  1. Cable the host network connections.

    The following substeps are examples of optional host network cabling. If needed, see NetApp Hardware Universe for your specific storage system configuration.

    1. Optional: Cable controllers to the host network switches.

      On each controller, cable ports e2b and e4b to the Ethernet host network switches.

      Note The ports on the I/O modules in slots 2 and 4 are 40/100 GbE, so host connectivity is 40/100 GbE.

      40/100 GbE cables

      Diagram: AFF A30 or AFF A50 cabled to the 40/100 GbE Ethernet host network switches
    2. Optional: Cable controllers to FC host network switches.

      On each controller, cable ports 1a, 1b, 1c and 1d to the FC host network switches.

      64 Gb/s FC cables

      Diagram: AFF A30 or AFF A50 cabled to the 64 Gb FC host network switches using two I/O modules
AFF A30 or AFF A50 with one 2-port 40/100 GbE I/O module
Steps
  1. Cable the host network connections.

    The following substeps are examples of optional host network cabling. If needed, see NetApp Hardware Universe for your specific storage system configuration.

    1. Optional: Cable controllers to the host network switches.

      On each controller, cable ports e2a, e2b, e2c and e2d to the Ethernet host network switches.

      10/25 GbE cables

      Diagram: AFF A30 or AFF A50 cabled to the 10/25 GbE Ethernet host network switches
    2. Optional: Cable controllers to FC host network switches.

      On each controller, cable ports 1a, 1b, 1c and 1d to the FC host network switches.

      64 Gb/s FC cables

      Diagram: AFF A30 or AFF A50 cabled to the 64 Gb FC host network switches
AFF A20 with one 2-port 10/25 GbE I/O module
Steps
  1. Cable the host network connections.

    The following substeps are examples of optional host network cabling. If needed, see NetApp Hardware Universe for your specific storage system configuration.

    1. Optional: Cable controllers to host network switches.

      On each controller, cable ports e2a, e2b, e2c and e2d to the Ethernet host network switches.

      10/25 GbE cables

      Diagram: AFF A20 cabled to the 10/25 GbE Ethernet host network switches
    2. Optional: Cable controllers to FC host network switches.

      On each controller, cable ports 1a, 1b, 1c and 1d to the FC host network switches.

      64 Gb/s FC cables

      Diagram: AFF A20 cabled to the 64 Gb FC host network switches
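
After the storage system is powered on and configured, you can optionally confirm that the host-facing ports came up as expected. The following commands are a general example from the ONTAP CLI and are not part of this cabling procedure.

  • List the Ethernet ports with their link status and speed:

    network port show

  • List the FC adapters with their link status (FC configurations only):

    network fcp adapter show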

Step 3: Cable the management network connections

Cable the controllers to your management network.

  1. Cable the management (wrench) ports on each controller to the management network switches.

    1000BASE-T RJ-45 cables

    Diagram: Management (wrench) ports cabled to the management network switches
Important Do not plug in the power cords yet.
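
Later, after the storage system is powered on and set up, you can optionally confirm management connectivity from the ONTAP CLI. As a general example, the following command lists all LIFs, including the node and cluster management LIFs, with their current ports and operational status:

    network interface show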

Step 4: Cable the shelf connections

This procedure shows you how to cable the controllers to one NS224 shelf.

About this task
  • For the maximum number of shelves supported for your storage system and for all of your cabling options, such as optical and switch-attached, see NetApp Hardware Universe.

  • You cable each controller to each NSM100B module on the NS224 shelf using the storage cables that came with your storage system, which could be the following cable type:

    100 GbE QSFP28 copper cables

  • The graphics show controller A cabling in blue and controller B cabling in yellow.

Steps
  1. Cable controller A to the shelf:

    1. Cable controller A port e3a to NSM A port e1a.

    2. Cable controller A port e3b to NSM B port e1b.

      Controller A ports e3a and e3b cabled to one NS224 shelf

  2. Cable controller B to the shelf:

    1. Cable controller B port e3a to NSM B port e1a.

    2. Cable controller B port e3b to NSM A port e1b.

      Controller B ports e3a and e3b cabled to one NS224 shelf
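
After the storage system is powered on, you can optionally confirm that both controllers see the shelf over both paths. The following commands are a general example from the ONTAP CLI; exact output depends on your ONTAP version and configuration.

  • Verify that the shelf is visible to both controllers:

    storage shelf show

  • Verify the state of the controller ports used for the shelf connections:

    storage port show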

What's next?

After you've cabled the hardware for your storage system, power on the storage system.