Conclusion
The NetApp and Lenovo solution validated here is a flexible scale-out architecture that is ideal for entry-level and mid-level enterprise AI deployments.
NetApp storage delivers performance that is the same as or better than local SSD storage and offers the following benefits to data scientists, data engineers, and IT decision makers:
- Effortless sharing of data between AI systems, analytics, and other critical business systems. This data sharing reduces infrastructure overhead, improves performance, and streamlines data management across the enterprise.
- Independently scalable compute and storage to minimize costs and improve resource utilization.
- Streamlined development and deployment workflows using integrated snapshots and clones for instantaneous and space-efficient user workspaces, integrated version control, and automated deployment (see the sketch following this list).
- Enterprise-grade data protection for disaster recovery and business continuity.
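As an illustration of the snapshot-and-clone workflow, the following Python sketch drives the ONTAP REST API to take a Snapshot copy of a dataset volume and then clone it into a writable, space-efficient user workspace. The cluster address, credentials, SVM name, and volume names are hypothetical placeholders, and the sketch assumes an ONTAP 9.x REST endpoint; it is an illustrative outline rather than a validated part of this solution.

# Sketch: snapshot a dataset volume and clone it into a user workspace
# via the ONTAP REST API. All names and credentials below are placeholders.
import requests
from requests.auth import HTTPBasicAuth

ONTAP = "https://cluster-mgmt.example.com/api"   # assumed cluster management LIF
AUTH = HTTPBasicAuth("admin", "password")        # assumed credentials
VERIFY = False                                   # point to a CA bundle in production

def volume_uuid(name):
    """Look up the UUID of a volume by name."""
    r = requests.get(f"{ONTAP}/storage/volumes", params={"name": name},
                     auth=AUTH, verify=VERIFY)
    r.raise_for_status()
    return r.json()["records"][0]["uuid"]

def create_snapshot(volume, snapshot):
    """Create a Snapshot copy of the source dataset volume."""
    uuid = volume_uuid(volume)
    r = requests.post(f"{ONTAP}/storage/volumes/{uuid}/snapshots",
                      json={"name": snapshot}, auth=AUTH, verify=VERIFY)
    r.raise_for_status()

def clone_workspace(svm, source, snapshot, clone):
    """Create a writable, space-efficient FlexClone workspace from the snapshot."""
    body = {
        "name": clone,
        "svm": {"name": svm},
        "clone": {
            "is_flexclone": True,
            "parent_volume": {"name": source},
            "parent_snapshot": {"name": snapshot},
        },
    }
    r = requests.post(f"{ONTAP}/storage/volumes", json=body,
                      auth=AUTH, verify=VERIFY)
    r.raise_for_status()

if __name__ == "__main__":
    # Placeholder volume, snapshot, and SVM names for illustration only.
    create_snapshot("imagenet_dataset", "baseline_v1")
    clone_workspace("ai_svm", "imagenet_dataset", "baseline_v1", "user1_workspace")

Because the clone shares blocks with the parent Snapshot copy, each data scientist can receive a private, writable copy of the dataset in seconds without duplicating capacity.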
Acknowledgments
- Karthikeyan Nagalingam, Technical Marketing Engineer, NetApp
- Jarrett Upton, Admin, AI Lab Systems, Lenovo
Where to find additional information
To learn more about the information described in this document, refer to the following documents and/or websites:
- NetApp All Flash Arrays product page
- NetApp AFF A400 page
- NetApp ONTAP data management software product page
- MLPerf
- TensorFlow benchmark
- NVIDIA SMI (nvidia-smi)