
Boost performance with NVIDIA Magnum IO GPUDirect Storage

Mike McNamara

NVIDIA Magnum IO GPUDirect Storage (GDS) enables a direct data path for direct memory access (DMA) transfers between GPU memory and storage, which avoids a bounce buffer through the CPU. The direct path increases system bandwidth and decreases the latency and utilization load on the CPU. With this performance improvement, for example, oil and gas companies can pinpoint drill locations in half the time, and weather services can run climate simulations up to six times faster to identify extreme weather patterns.

GDS provides value in many ways:

  • Bandwidth is two to eight times higher with data transfers directly between storage and GPU.
  • Latency is lower, because data transfers don’t fault and don’t go through a bounce buffer.
  • Access to petabytes of storage can be at higher bandwidth than with local storage or local CPU memory.
  • Use of DMA engines near storage is less invasive to CPU load and doesn’t interfere with GPU load.
  • The GPU becomes the highest-bandwidth computing engine.
  • Bandwidth into GPU memory from CPU memory, local storage, and remote storage can be additively combined to nearly saturate the bandwidth into and out of the GPUs.
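To make the direct path concrete, here is a minimal sketch of what a GDS read looks like in application code using NVIDIA's cuFile API, the programming interface for GPUDirect Storage. The file path and transfer size are placeholders, and error handling is abbreviated; this is an illustrative sketch rather than a production implementation.

// Minimal sketch: read a file directly into GPU memory with the cuFile
// (GPUDirect Storage) API. Path and size are placeholders; error handling
// is trimmed for brevity.
#define _GNU_SOURCE              // for O_DIRECT
#include <cufile.h>
#include <cuda_runtime.h>
#include <fcntl.h>
#include <stdio.h>
#include <string.h>
#include <unistd.h>

int main(void) {
    const size_t size = 1 << 20;                   // 1 MiB transfer (example size)
    const char  *path = "/mnt/beegfs/sample.dat";  // placeholder path on GDS-capable storage

    cuFileDriverOpen();                            // initialize the GDS driver

    int fd = open(path, O_RDONLY | O_DIRECT);      // GDS requires O_DIRECT file access
    if (fd < 0) { perror("open"); return 1; }

    CUfileDescr_t descr;
    memset(&descr, 0, sizeof(descr));
    descr.handle.fd = fd;
    descr.type = CU_FILE_HANDLE_TYPE_OPAQUE_FD;

    CUfileHandle_t handle;
    cuFileHandleRegister(&handle, &descr);         // register the file with cuFile

    void *devPtr = NULL;
    cudaMalloc(&devPtr, size);                     // destination buffer in GPU memory
    cuFileBufRegister(devPtr, size, 0);            // register the GPU buffer for DMA

    // DMA directly from storage into GPU memory -- no CPU bounce buffer.
    ssize_t bytes = cuFileRead(handle, devPtr, size, /*file_offset=*/0, /*devPtr_offset=*/0);
    printf("read %zd bytes directly into GPU memory\n", bytes);

    cuFileBufDeregister(devPtr);
    cudaFree(devPtr);
    cuFileHandleDeregister(handle);
    close(fd);
    cuFileDriverClose();
    return 0;
}

Compiled against the CUDA toolkit and linked with the cuFile library, the read lands in GPU memory without ever staging data in CPU memory, which is where the bandwidth and latency gains listed above come from.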

NetApp AI Solutions for NVIDIA DGX A100 systems

The NVIDIA DGX POD reference architecture combines NVIDIA DGX A100 systems, NVIDIA InfiniBand networking, and storage solutions into fully integrated offerings that are verified and ready to deploy. As a key NVIDIA partner, NetApp offers two solutions for DGX A100 systems. One is based on NetApp® AFF systems, and the other is based on NetApp EF-Series EF600 arrays with BeeGFS.

If your enterprise plans to run many distributed jobs using GPUs, and if you plan to use NFS and the rich data management available in NetApp ONTAP®, AFF solutions are a great fit. If you have fewer jobs using GPUs for long-running training operations and require the extreme performance of a parallel file system, consider NetApp E-Series solutions. Both solutions are accompanied by a reference architecture that includes observed bandwidth, IOPS, and training performance results under specific testing conditions. ONTAP AI is also available as an integrated solution, with your choice of three preconfigured offerings that include installation and support.

Magnum IO GPUDirect Storage enables data to move directly from the NetApp EF600 systems into GPU memory, bypassing the CPU. Direct memory access from storage to GPU relieves the CPU I/O bottleneck, increasing performance.

[Figure: NVIDIA storage chart]

BeeGFS is a parallel file system that provides great flexibility and is key to meeting the needs of diverse and evolving AI workloads. Today, NetApp EF-Series storage systems supercharge BeeGFS storage and metadata services by offloading RAID and other storage tasks, including drive monitoring and wear detection. BeeGFS GDS with EF-Series for both DGX POD and NVIDIA DGX SuperPOD configurations will be generally available in the near future but can be used now for proofs of concept. Support for ONTAP AI will follow later in the year. To learn more, visit www.NetApp.com/ai.

Mike McNamara

Mike McNamara is a senior leader of product and solution marketing at NetApp with 25 years of data management and data storage marketing experience. Before joining NetApp over 10 years ago, Mike worked at Adaptec, EMC and HP. Mike was a key team leader driving the launch of the industry’s first cloud-connected AI/ML solution (NetApp), unified scale-out and hybrid cloud storage system and software (NetApp), iSCSI and SAS storage system and software (Adaptec), and Fibre Channel storage system (EMC CLARiiON). In addition to his past role as marketing chairperson for the Fibre Channel Industry Association, he is a member of the Ethernet Technology Summit Conference Advisory Board, a member of the Ethernet Alliance, a regular contributor to industry journals, and a frequent speaker at events. Mike also published a book through FriesenPress titled "Scale-Out Storage - The Next Frontier in Enterprise Data Management", and was listed as a top 50 B2B product marketer to watch by Kapos.
