ACE POWERWORKS MATRIX SXI

A large enterprise server chassis with multiple hot-swappable drive bays and cooling fans visible on the front panel.

H: 17.2″ (438.8 mm) × W: 17.6″ (449 mm) × D: 33.2″ (843.3 mm)


Model: PM-X2SCEG

The Powerworks Matrix SXI is a 10U multi-GPU AI server purpose-built for training and deploying next-generation models, including large language models (LLMs), multimodal AI, and scientific simulations. Powered by dual Intel Xeon 6 8570P processors and four NVIDIA B200 SXM5 96GB GPUs, it delivers unparalleled compute density and reliability for enterprise-scale AI.

Next-Gen LLM Training

Designed for training the most advanced transformer models used in generative AI (like GPT-4/5, Gemini, Claude). High memory bandwidth, massive core counts, and multi-GPU scalability make it ideal for today’s most computationally intensive models.
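As a rough illustration of how a four-GPU node in this class is typically driven, the sketch below uses PyTorch DistributedDataParallel to replicate a toy transformer across the local GPUs and synchronize gradients between them. The model, batch shapes, and torchrun launch line are illustrative assumptions for a single-node setup, not vendor-supplied software.

```python
# Minimal single-node, multi-GPU training sketch using PyTorch DistributedDataParallel.
# Launch with: torchrun --standalone --nproc_per_node=4 train_ddp.py
import os
import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP

def main():
    # torchrun sets LOCAL_RANK for each GPU worker process on the node.
    local_rank = int(os.environ["LOCAL_RANK"])
    torch.cuda.set_device(local_rank)
    dist.init_process_group(backend="nccl")

    # Placeholder model: a small transformer encoder standing in for an LLM.
    model = torch.nn.TransformerEncoder(
        torch.nn.TransformerEncoderLayer(d_model=512, nhead=8, batch_first=True),
        num_layers=6,
    ).cuda(local_rank)
    model = DDP(model, device_ids=[local_rank])

    optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)

    for step in range(10):  # toy loop; real training iterates over a sharded dataset
        # Synthetic batch standing in for tokenized, embedded text.
        batch = torch.randn(8, 128, 512, device=f"cuda:{local_rank}")
        out = model(batch)
        loss = out.pow(2).mean()  # dummy loss for illustration only
        loss.backward()           # DDP all-reduces gradients across the GPUs
        optimizer.step()
        optimizer.zero_grad()
        if dist.get_rank() == 0:
            print(f"step {step}: loss {loss.item():.4f}")

    dist.destroy_process_group()

if __name__ == "__main__":
    main()
```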

Multimodal AI Development

Supports large-scale models that integrate natural language, vision, audio, and more, making it ideal for enterprise AI teams and research labs developing next-gen interfaces.

Scientific Computing & Simulations

Massively parallel compute and fast storage make this system ideal for quantum mechanics, molecular simulations, materials science, and more.

Train GPT-Class Models. At Scale.

B200-Powered AI Infrastructure for Tomorrow’s Workloads

The LLM Server for Multimodal Intelligence

A next-gen AI server platform purpose-built for large language model training, multi-modal workloads, and advanced scientific simulations. The Matrix SXI is a top-tier compute node in modern AI infrastructure, trusted by R&D teams, enterprise AI groups, and hyperscale cloud builders.

Key Features

  • Dual Intel Xeon 6 8570P CPUs

    Optimized for maximum throughput and memory bandwidth in data-intensive AI workloads.

  • 1TB DDR5 ECC Memory

Massive capacity and speed to handle multi-billion-parameter models and large training datasets.

  • Tiered High-Speed NVMe Storage

    Includes dual 1.92TB M.2 boot drives (RAID 1) and 4x 3.84TB U.2 NVMe drives for model weights, datasets, and temporary compute caching.

  • 4x NVIDIA B200 SXM5 GPUs

Each GPU provides 96GB HBM3e with industry-leading memory bandwidth, making it ideal for large models and multimodal AI systems (see the verification sketch after this list).

  • Titanium-Grade Power Redundancy

    Six redundant 5250W PSUs (3+3 configuration) provide uninterrupted power with maximum efficiency and failover.
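As a quick sanity check of the GPU configuration referenced in the list above, the following minimal sketch (assuming PyTorch with CUDA support is installed on the node) queries the visible devices and their memory. On a correctly configured system it should report four GPUs at roughly 96GB each.

```python
# Report the GPUs and memory capacity that PyTorch can see on this node.
import torch

def report_gpus():
    count = torch.cuda.device_count()
    print(f"Visible GPUs: {count}")
    for i in range(count):
        props = torch.cuda.get_device_properties(i)
        mem_gb = props.total_memory / (1024 ** 3)
        print(f"  GPU {i}: {props.name}, {mem_gb:.0f} GB")

if __name__ == "__main__":
    report_gpus()
```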


SPECIFICATIONS

Form Factor: Micro-Tower

Height: 14.5″ (368.3mm)

Width: 7.5″ (190.5mm)

Depth: 15.5″ (394mm)

Processor: Intel Core i5-14600K
– 14 Cores, 20 Threads
– 3.5 GHz up to 5.3 GHz
– 24 MB Cache
Graphics: NVIDIA RTX A2000 6GB
Chipset: Intel B760
 
Front I/O: 
– 1 x USB 2.0 Port
– 2 x USB 3.0 Ports
– 1 x Type-C Port
– 1 x Headphone Jack
– 1 x Microphone Jack
– 1 x Reset Button
– 1 x Power Button
 
Rear I/O:
– 2 x Antenna Mounting Points
– 1 x PS/2 Mouse/Keyboard Port
– 2 x DisplayPort
– 1 x USB 3.2 Gen1 Type-C Port
– 3 x USB 3.2 Gen1 Type-A Ports
– 2 x USB 2.0 Ports
– 1 x RJ-45 LAN Port
– HD Audio Jacks: Line in / Front Speaker / Microphone
 
Connectors:
– 1 x Chassis Intrusion and Speaker Header
– 1 x RGB LED Header
– 3 x Addressable LED Headers
– 1 x CPU Fan Connector (4-pin)
– 1 x CPU/Water Pump Fan Connector (4-pin) (Smart Fan Speed Control)
– 4 x Chassis/Water Pump Fan Connectors (4-pin) (Smart Fan Speed Control)
– 1 x 24 pin ATX Power Connector
– 1 x 8 pin 12V Power Connector (Hi-Density Power Connector)
– 1 x Front Panel Audio Connector
– 1 x Thunderbolt AIC Connector (5-pin) (Supports ASRock Thunderbolt 4 AIC Card)
– 2 x USB 2.0 Headers (Support 4 USB 2.0 ports)
– 1 x USB 3.2 Gen1 Header (Supports 2 USB 3.2 Gen1 ports)
– 1 x Front Panel Type C USB 3.2 Gen1 Header
Storage: 2TB NVMe M.2
Memory: 32GB DDR5
Power Supply: 500W 80+ Gold