GIGABYTE has launched two new liquid-cooled HPC and AI training servers, the G262-ZL0 and G492-ZL2, which push NVIDIA HGX A100 accelerators and AMD EPYC 7003 processors to the limit with enterprise-grade liquid cooling. The G262-ZL0 is a 2U GPU-centric server that supports the NVIDIA HGX A100 4-GPU baseboard, while the G492-ZL2 is a 4U GPU-centric server built around the NVIDIA HGX A100 8-GPU baseboard.
The company partnered with CoolIT Systems to produce a thermal solution that employs direct liquid cooling to balance optimal performance, high availability, and efficient cooling, avoiding overheating and server downtime in a compute-dense data center.
Commenting on the topic, a spokesperson from GIGABYTE stated:
The inclusion of the NVIDIA HGX A100 platform in the new GIGABYTE servers is important, in that new NVIDIA Magnum IO™ GPUDirect technologies favor faster throughput while offloading work from the CPU to achieve notable performance boosts. The HGX platform supports NVIDIA GPUDirect RDMA for direct data exchange between GPUs and third-party devices such as NICs or storage adapters. There is also support for GPUDirect Storage, which provides a direct data path to move data from storage into GPU memory while offloading the CPU, resulting in higher bandwidth and lower latency. For high-speed interconnects, the four-GPU NVIDIA A100 server incorporates NVIDIA NVLink®, while the eight-GPU NVIDIA A100 server uses NVSwitch™ and NVLink to enable 600GB/s GPU peer-to-peer communication.
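To make the GPUDirect Storage claim concrete, below is a minimal, hypothetical sketch of how an application might read data straight into GPU memory with NVIDIA's cuFile API; the file path and buffer size are placeholder assumptions, not details from GIGABYTE or NVIDIA about these servers.

```cuda
// Illustrative GPUDirect Storage read using NVIDIA's cuFile API (a sketch, not a vendor sample).
// Build with: nvcc -o gds_read gds_read.cu -lcufile
#include <fcntl.h>
#include <unistd.h>
#include <cstdio>
#include <cuda_runtime.h>
#include <cufile.h>

int main() {
    const char *path = "/mnt/nvme/sample.bin";   // placeholder dataset path (assumption)
    const size_t size = 1 << 20;                 // 1 MiB read, purely for illustration

    cuFileDriverOpen();                          // initialize the GDS driver

    // O_DIRECT lets the DMA engine move data without staging it in the page cache.
    int fd = open(path, O_RDONLY | O_DIRECT);
    CUfileDescr_t descr = {};
    descr.handle.fd = fd;
    descr.type = CU_FILE_HANDLE_TYPE_OPAQUE_FD;
    CUfileHandle_t handle;
    cuFileHandleRegister(&handle, &descr);

    void *devPtr = nullptr;
    cudaMalloc(&devPtr, size);                   // destination buffer in GPU memory
    cuFileBufRegister(devPtr, size, 0);          // register the buffer for DMA

    // Data moves from storage directly into GPU memory, bypassing a CPU bounce buffer.
    ssize_t n = cuFileRead(handle, devPtr, size, /*file offset*/ 0, /*device offset*/ 0);
    printf("read %zd bytes into GPU memory\n", n);

    cuFileBufDeregister(devPtr);
    cudaFree(devPtr);
    cuFileHandleDeregister(handle);
    close(fd);
    cuFileDriverClose();
    return 0;
}
```

The point of the pattern is visible in the single cuFileRead call: the CPU issues the request, but the payload never lands in host memory, which is where the bandwidth and latency gains GIGABYTE cites come from.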
The new servers isolate the GPU baseboard from the other components so that the accelerators can be cooled by liquid coolant and maintain peak performance. The dual CPU sockets are also liquid-cooled. The servers feature 2.5" U.2 bays with PCIe 4.0 x4 lanes and multiple PCIe slots for faster networking using a SmartNIC such as the NVIDIA ConnectX-7, which offers four ports of connectivity and over 400 Gb/s of throughput.
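As a rough illustration of the GPU peer-to-peer communication the NVLink/NVSwitch fabric enables, the following sketch uses the standard CUDA runtime API to check and enable peer access between devices; it is a generic example under the assumption of a multi-GPU node, not code specific to these GIGABYTE systems.

```cuda
// Sketch: query and enable GPU peer-to-peer access with the CUDA runtime API.
#include <cstdio>
#include <cuda_runtime.h>

int main() {
    int count = 0;
    cudaGetDeviceCount(&count);

    // Report which device pairs can address each other's memory directly.
    for (int src = 0; src < count; ++src) {
        for (int dst = 0; dst < count; ++dst) {
            if (src == dst) continue;
            int canAccess = 0;
            cudaDeviceCanAccessPeer(&canAccess, src, dst);
            printf("GPU %d -> GPU %d peer access: %s\n", src, dst, canAccess ? "yes" : "no");
        }
    }

    // Enabling P2P for a pair lets cudaMemcpyPeer and direct loads/stores between the
    // two devices travel over NVLink/NVSwitch where the hardware provides it.
    if (count >= 2) {
        cudaSetDevice(0);
        cudaDeviceEnablePeerAccess(1, 0);
    }
    return 0;
}
```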
To buy either of the two servers, customers can contact GIGABYTE; for information on integrating the cooling loop into the data center and which additional cooling components are required, contact CoolIT Systems.