Dimensions: H 17.2″ (438.8 mm) × W 17.6″ (449 mm) × D 33.2″ (843.3 mm)
The Powerworks Matrix SXI is a 10U multi-GPU AI server purpose-built for training and deploying next-generation models, including large language models (LLMs), multimodal AI, and scientific simulations. Powered by dual Intel Xeon 6 8570P processors and four NVIDIA B200 SXM5 96GB GPUs, it delivers unparalleled compute density and reliability for enterprise-scale AI.
Designed for training the most advanced transformer models used in generative AI (such as GPT-4/5, Gemini, and Claude). High memory bandwidth, massive core counts, and multi-GPU scalability make it ideal for today’s most computationally intensive training workloads.
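As a rough illustration of how the four on-board GPUs might be driven together, below is a minimal data-parallel training sketch using PyTorch DistributedDataParallel. It assumes a CUDA-enabled PyTorch install (not bundled with the hardware), and the linear model, batch shape, and learning rate are placeholders rather than a real workload.

```python
# Minimal sketch: data-parallel training across four local GPUs with PyTorch DDP.
# Launch with: torchrun --standalone --nproc_per_node=4 train_ddp.py
import os

import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP


def main():
    dist.init_process_group(backend="nccl")        # one process per GPU
    local_rank = int(os.environ["LOCAL_RANK"])     # set by torchrun
    torch.cuda.set_device(local_rank)

    # Placeholder model; a real job would build or load a transformer here.
    model = torch.nn.Linear(4096, 4096).cuda(local_rank)
    model = DDP(model, device_ids=[local_rank])
    optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)

    for _ in range(10):                            # toy training loop
        x = torch.randn(32, 4096, device=local_rank)
        loss = model(x).pow(2).mean()
        optimizer.zero_grad()
        loss.backward()                            # gradients all-reduced across GPUs
        optimizer.step()

    dist.destroy_process_group()


if __name__ == "__main__":
    main()
```

Launched via torchrun with one process per GPU, gradient synchronization is handled by NCCL across the four devices.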
Supports large-scale models that integrate natural language, vision, audio, and more, making it well suited to enterprise AI solutions and to research labs developing next-generation interfaces.
Massively parallel compute and fast storage make this system well suited to quantum mechanics, molecular simulation, materials science, and other scientific workloads.
Optimized for maximum throughput and memory bandwidth in data-intensive AI workloads.
High capacity and speed for managing multi-billion-parameter models and large datasets.
Includes dual 1.92TB M.2 boot drives (RAID 1) and 4× 3.84TB U.2 NVMe drives for model weights, datasets, and temporary compute caching.
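As a quick post-provisioning sanity check, a short script like the following can list the NVMe devices the operating system sees, so the RAID 1 boot mirror and the four U.2 data drives can be confirmed. This is a sketch, not shipped tooling; it assumes a Linux host with lsblk available.

```python
# Minimal sketch, assuming a Linux host with lsblk: list NVMe block devices
# so the boot mirror and the four U.2 data drives can be verified.
import json
import subprocess


def list_nvme_devices():
    out = subprocess.run(
        ["lsblk", "--json", "-o", "NAME,MODEL,SIZE,TYPE"],
        capture_output=True, text=True, check=True,
    )
    for dev in json.loads(out.stdout)["blockdevices"]:
        if dev["type"] == "disk" and dev["name"].startswith("nvme"):
            print(f'{dev["name"]}: {dev.get("model")} ({dev["size"]})')


if __name__ == "__main__":
    list_nvme_devices()
```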
Each GPU provides 96GB HBM3e with industry-leading memory bandwidth—ideal for large models and multimodal AI systems.
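For reference, a few lines of PyTorch are enough to confirm the GPU count and per-device memory once the system is up; this assumes a CUDA-enabled PyTorch environment, which is not part of the hardware spec.

```python
# Minimal sketch: report each visible GPU with its total memory and SM count.
import torch

for i in range(torch.cuda.device_count()):
    props = torch.cuda.get_device_properties(i)
    print(f"GPU {i}: {props.name}, {props.total_memory / 1024**3:.0f} GiB, "
          f"{props.multi_processor_count} SMs")
```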
Six redundant 5250W PSUs in a 3+3 configuration provide uninterrupted power with high efficiency and automatic failover.