Inspur NF5688M6 AI Training GPU Server
The Inspur NF5688M6 is Inspur's solution for hyperscale data centers. It is the industry's first air-cooled NVLink AI server to support 500W A100 GPUs, and it provides up to 12 PCIe expansion slots with support for Inspur's self-developed dual-width N20X, NV DPU, and other intelligent NICs. Combined with AIStation, Inspur's leading AI resource scheduling platform, it fully unleashes AI computing performance of up to 5 petaFLOPS.
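The 5 petaFLOPS figure is consistent with eight A100 GPUs at peak FP16 Tensor Core throughput with structured sparsity. A quick sanity check, assuming NVIDIA's published per-GPU peak numbers:

```python
# Sanity check for the ~5 petaFLOPS system figure.
# Assumption: NVIDIA A100 peak FP16 Tensor Core throughput is
# 312 TFLOPS dense, or 624 TFLOPS with 2:4 structured sparsity.
A100_FP16_SPARSE_TFLOPS = 624
GPU_COUNT = 8

system_tflops = GPU_COUNT * A100_FP16_SPARSE_TFLOPS
system_pflops = system_tflops / 1000
print(f"{system_pflops:.2f} petaFLOPS")  # → 4.99 petaFLOPS
```

Eight GPUs at the sparse peak give 4,992 TFLOPS, which marketing rounds up to 5 petaFLOPS.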
Features

Advanced Technology
2 Intel Ice Lake processors built on a 10nm process
8 NVIDIA A100 GPUs with 600GB/s NVSwitch full interconnect
Supports Multi-Instance GPU (MIG) to dramatically increase GPU resource utilization
Up to 10 × 200Gb/s HDR InfiniBand ports for high-speed interconnect expansion
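MIG partitions each A100 into as many as seven GPU instances, each built from a fixed number of the GPU's seven compute slices. A minimal sketch of budgeting a requested partition against that limit (the slice counts follow NVIDIA's A100 MIG profiles; the helper function itself is hypothetical):

```python
# Hypothetical helper: check whether a set of MIG profiles fits on one A100.
# An A100 exposes 7 compute slices; each profile consumes a fixed share.
MIG_SLICES = {"1g": 1, "2g": 2, "3g": 3, "4g": 4, "7g": 7}

def fits_on_a100(profiles):
    """Return True if the requested MIG profiles fit within 7 slices."""
    return sum(MIG_SLICES[p] for p in profiles) <= 7

print(fits_on_a100(["3g", "3g"]))        # True  (6 of 7 slices used)
print(fits_on_a100(["4g", "2g", "2g"]))  # False (8 slices requested)
```

On this server the same budgeting applies independently to each of the eight A100 GPUs, so a full system can expose up to 56 MIG instances.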
Reliable Quality
Supports hardware/software RAID schemes to ensure data security
N+N redundant power supply to ensure reliable system operation
Optimized heat dissipation design supports stable operation at high ambient temperatures
Intelligent remote management for fast fault localization
Optimized Design
The industry's only air-cooled server supporting 500W A100 GPUs
Balanced high-performance ratio: GPU : compute IB : storage IB = 8 : 8 : 2
Modular design for flexible operation and easy O&M
Leading support for Inspur N20X, NV DPU, and A/T custom intelligent NICs
Excellent Ecosystem
Extensive and Mature x86+CUDA Global Development Ecosystem
Leading deep learning framework support: TensorFlow, PyTorch, PaddlePaddle, etc.
Efficiently supports large-scale CV/NLP/NMT/DLRM model training and inference
Easily connects with Metabrain ecosystem partners to provide rich industry AI solutions
Tech Specs
| Item | Specification |
| --- | --- |
| Model | NF5688M6 |
| Form Factor | 6U |
| GPU Compute Module | 1× HGX A100 8-GPU |
| CPU | 2× 3rd Gen Intel® Xeon® Scalable processors (Ice Lake), up to 270W TDP, 3 UPI links |
| Chipset | Intel® C621A series (Lewisburg-R) |
| Memory | Up to 32× DDR4 RDIMM/LRDIMM, up to 3200MT/s |
| Storage | 8× 2.5" NVMe SSD or 16× 2.5" SATA/SAS SSD |
| M.2 | 2× onboard NVMe/SATA M.2 |
| PCIe Expansion | 10× PCIe 4.0 x16 slots plus 2× PCIe 4.0 x16 slots (x8 rate), or 6× PCIe 4.0 x16 |
| RAID Support | Optional RAID 0/1/10/5/50/6/60, etc.; cache supercapacitor protection, RAID state migration, RAID configuration memory |
| Network | Optional 1× OCP 3.0 NIC (PCIe 4.0 x16), 10G/25G/100G |
| Front I/O | 1× USB 3.0, 1× USB 2.0, 1× VGA, 1× RJ45 management port |
| Rear I/O | 1× USB 3.0, 1× VGA |
| Remote Management | Built-in BMC remote management module, supports Redfish/IPMI/SOL/KVM, etc. |
| Operating System | Red Hat Enterprise Linux 7.8 64-bit, CentOS 7.8, Ubuntu 18.04 or later |
| Cooling | N+1 redundant hot-swappable fans |
| Power Supply | 6× 3000W 80 PLUS Platinum PSUs with 3+3 redundancy |
| Chassis Dimensions | 447mm (W) × 263.9mm (H) × 850mm (D) |
| Operating Temperature | 5℃~35℃ / 41℉~95℉ |
| Weight (full configuration) | ≤88kg |
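The BMC's Redfish support means system state can be read over HTTPS as plain JSON. A minimal sketch of parsing a Redfish ComputerSystem resource; the payload below is an illustrative sample following the DMTF Redfish schema, not output captured from an NF5688M6:

```python
import json

# Illustrative Redfish ComputerSystem payload (property names per the
# DMTF Redfish schema). In practice this JSON would come from an
# authenticated GET against https://<bmc>/redfish/v1/Systems/<id>.
sample = json.loads("""{
    "@odata.type": "#ComputerSystem.v1_13_0.ComputerSystem",
    "Id": "1",
    "PowerState": "On",
    "MemorySummary": {"TotalSystemMemoryGiB": 1024},
    "ProcessorSummary": {"Count": 2}
}""")

print(sample["PowerState"])                                 # On
print(sample["ProcessorSummary"]["Count"])                  # 2
print(sample["MemorySummary"]["TotalSystemMemoryGiB"])      # 1024
```

Because Redfish is a standard REST/JSON interface, the same parsing works across vendors; IPMI/SOL/KVM remain available for legacy tooling and console access.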