NVIDIA DGX B200

$3.50 - $5.00/hr*

In Stock

FP4 Inference: 144 petaFLOPS, System Memory: up to 4TB, Power Usage: 14.3kW max

GPU

8x NVIDIA B200

GPU MEMORY

1,440GB total GPU memory

PERFORMANCE

72 petaFLOPS FP8 training and 144 petaFLOPS FP4 inference

CPU

2x Intel® Xeon® Platinum 8570 processors, 112 cores total, 2.1 GHz (base), 4 GHz (max boost)

SYSTEM MEMORY

Up to 4TB

STORAGE

OS: 2x 1.9TB NVMe M.2; internal storage: 8x 3.84TB NVMe U.2


The Foundation for your AI Factory. NVIDIA DGX B200 is a next-generation AI server built for training and deploying the most demanding AI workloads. Powered by 8 Blackwell GPUs and fifth-gen NVLink interconnect, it delivers up to 3x training and 15x inference performance over its predecessor. Ideal for LLMs, recommender systems, and real-time inference applications, DGX B200 is a high-performance system designed for teams scaling production AI infrastructure with confidence.
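For teams provisioning a node like this, here is a minimal sketch (assuming a CUDA-enabled PyTorch environment on the box; not an official tool) of how the advertised GPU count and total GPU memory might be confirmed after boot:

```python
import torch

# Quick sanity check of visible GPUs and per-GPU memory on a provisioned node.
# On a DGX B200 this should report 8 devices totalling roughly 1,440 GB.
assert torch.cuda.is_available(), "No CUDA devices visible"

total_gb = 0.0
for i in range(torch.cuda.device_count()):
    props = torch.cuda.get_device_properties(i)
    mem_gb = props.total_memory / 1024**3
    total_gb += mem_gb
    print(f"GPU {i}: {props.name}, {mem_gb:.0f} GiB")

print(f"{torch.cuda.device_count()} GPUs, {total_gb:.0f} GiB total GPU memory")
```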

At a glance: 8x B200 GPUs, 1,440 GB GPU memory, 2x NVIDIA® NVSwitch™, 72 petaFLOPS FP8 training, 144 petaFLOPS FP4 inference, up to 4TB system memory, 14.3kW max power usage, 24-month warranty.


Interested in buying this GPU?

NVIDIA DGX B200

$370,000*

Out of Stock

FP4 Inference: 144 petaFLOPS, System Memory: up to 4TB, Power Usage: 14.3kW max


*Estimated prices, not final

Similar GPUs