In Stock
FP4 Inference: 144 petaFLOPS, System Memory: up to 4TB, Power Usage: 14.3kW max
The Foundation for Your AI Factory. NVIDIA DGX B200 is a next-generation AI server built for training and deploying the most demanding AI workloads. Powered by eight Blackwell GPUs and fifth-generation NVLink interconnect, it delivers up to 3x the training and 15x the inference performance of its predecessor. Ideal for LLMs, recommender systems, and real-time inference applications, DGX B200 is a high-performance system designed for teams scaling production AI infrastructure with confidence.
GPU: 8x NVIDIA B200
GPU Memory: 1,440 GB
NVIDIA® NVSwitch™: 2x
FP8 Training: 72 petaFLOPS
FP4 Inference: 144 petaFLOPS
System Memory: up to 4 TB
Power Usage: 14.3 kW max
Warranty: 24 months
*Estimated prices, not final