Enterprise-class AI: NVIDIA, VMware, and NetApp


Mike Oglesby

Managing AI infrastructure often feels like a daunting task. For IT architects and admins, much of the tooling is new and unfamiliar. The learning curve is steep, and there are often important questions around enterprise-readiness. If these concerns sound familiar to you, then NetApp, NVIDIA, and VMware have the perfect answer.

NVIDIA AI Enterprise

NVIDIA recently announced the global availability of NVIDIA AI Enterprise. This end-to-end, cloud-native suite of AI and data analytics software is optimized, certified, and supported by NVIDIA to run on VMware vSphere with NVIDIA-Certified Systems. This software facilitates the simple and rapid deployment, management, and scaling of AI workloads in the modern hybrid cloud environment.

NetApp and VMware: Better together

Where does NetApp come into play, you ask? Enterprise workloads, with all of their accompanying support and availability expectations, require enterprise-class storage. NetApp has a long and storied history of providing enterprise-class storage for VMware workloads. For over 18 years, NetApp and VMware have worked together to define the virtualized data center industry, unlocking the full speed, efficiency, and cost savings of virtualized applications. NetApp and VMware are trusted by public and private sector entities all over the world to power their mission-critical workloads. NetApp supports NVIDIA AI Enterprise environments just as we support any other VMware environment.

NetApp and NVIDIA: Leading enterprise-class AI

But wait. There’s more! NetApp is closely aligned with our partner NVIDIA to enable enterprise-class AI. NetApp and NVIDIA have been collaborating for over 3 years to develop and release joint solutions that help customers fully realize the promise of AI. Some of the largest organizations in the world are using solutions from NetApp and NVIDIA to fuel their own AI revolution. NVIDIA AI Enterprise extends these efforts to VMware-based environments.

“Demanding AI workloads require a full-stack solution of optimized infrastructure and software. NVIDIA, VMware, and NetApp provide customers with an ideal solution for running AI on their hybrid cloud infrastructure,” said Jeff Weiss, director, MLOps Solutions Architects at NVIDIA.

On-demand GPU-enabled AI environments

At NetApp, we practice what we preach. We combine NetApp, NVIDIA, and VMware solutions to run our Lab on Demand environment for AI workloads. Our lab lets NetApp employees and partners get familiar with the most modern AI data pipeline workflows. If you request a lab, you get your own fully contained, GPU-accelerated Kubernetes environment with the NetApp AI Control Plane software stack already installed. Within your lab environment, you can use NVIDIA NeMo, an open-source toolkit for developing state-of-the-art conversational AI models. With this toolkit, you can train a Natural Language Processing (NLP) model to tell whether sentences, phrases, and words have a positive or negative connotation.
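To give a flavor of the sentiment-classification task described above: the lab itself uses NVIDIA NeMo's purpose-built models, but the same train-then-predict workflow can be sketched with a much simpler stand-in. The snippet below is a minimal illustration using scikit-learn (an assumption for illustration only, not the lab's actual stack or NeMo's API), with a tiny hand-labeled toy dataset.

```python
# Illustrative sketch of a sentiment classifier: TF-IDF features plus
# logistic regression, standing in for the far more capable NeMo models
# used in the actual lab environment.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy labeled data (hypothetical examples): 1 = positive, 0 = negative
texts = [
    "I love this product, it works great",
    "Fantastic experience, highly recommended",
    "Absolutely wonderful and easy to use",
    "This is terrible and disappointing",
    "I hate how slow and buggy it is",
    "Awful experience, would not recommend",
]
labels = [1, 1, 1, 0, 0, 0]

# Train the pipeline end to end on the labeled sentences
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(texts, labels)

# Classify a new sentence: the model returns 1 (positive) or 0 (negative)
prediction = model.predict(["what a wonderful day"])[0]
print(prediction)
```

In the real lab, a GPU-accelerated NeMo model would replace this pipeline and be trained on a full sentiment dataset, but the basic fit/predict loop is the same shape.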

These isolated, GPU-enabled lab environments are provisioned using VMware vSphere, NVIDIA virtual GPU (vGPU) technology (available with the NVIDIA AI Enterprise software suite), and NetApp ONTAP storage. The setup can support 32 concurrent instances of the lab environment, and instances are rapidly provisioned on demand when you request them. It's a good user experience and an efficient use of high-powered infrastructure.

More information and resources

Our Lab on Demand represents one of the many types of use cases that NVIDIA AI Enterprise can support. When NetApp, NVIDIA, and VMware work together, enterprise-class AI is not just within reach. It is a reality. To learn more about all of NetApp’s AI solutions, visit www.netapp.com/ai. To learn more about NetApp and VMware’s long history of working together, visit www.netapp.com/hybrid-cloud/vmware.

Mike Oglesby

Mike is a Technical Marketing Engineer at NetApp focused on MLOps and data pipeline solutions. He architects and validates full-stack AI/ML/DL data and experiment management solutions that span the hybrid cloud. Mike has a DevOps background and a strong knowledge of DevOps processes and tools. Prior to joining NetApp, Mike worked on a line-of-business application development team at a large global financial services company. Outside of work, Mike loves to travel. One of his passions is experiencing other places and cultures through their food.
