ACE POWERWORKS MATRIX A1

A rackmount server with multiple front-facing drive bays, cooling fans, and control panel, housed in a metal enclosure.

H: 6.87″ (174.5 mm) x W: 17.3″ (440 mm) x D: 31.5″ (800 mm)

ACE POWERWORKS MATRIX A1

Model: PM-B4AS6R

The Powerworks Matrix A1 is designed to meet the demands of cutting-edge AI infrastructure. With support for massive memory bandwidth, PCIe Gen5, and up to eight B200 NVL GPUs, it delivers top-tier compute performance for the most advanced enterprise, research, and cloud-scale AI workloads. Its efficient 4U chassis supports dense GPU configurations without compromising airflow or reliability.
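
As an illustrative sketch only (not vendor-supplied software), the snippet below, assuming a CUDA-enabled PyTorch install, confirms that every installed GPU is visible to the framework:

    # Illustrative check that each installed GPU is visible to PyTorch.
    # Assumes a CUDA-enabled PyTorch build; not part of the product software.
    import torch

    count = torch.cuda.device_count()
    print(f"Visible GPUs: {count}")
    for i in range(count):
        props = torch.cuda.get_device_properties(i)
        print(f"  GPU {i}: {props.name}, {props.total_memory / 1024**3:.0f} GiB")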

Next-Generation LLM Training

Train GPT-4/5-class models with maximum compute density and GPU acceleration. Designed for enterprise technology solutions and research-grade AI labs.
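
As a minimal sketch of how a training job might spread work across all eight GPUs (an assumption for illustration, not vendor tooling), the script below uses PyTorch DistributedDataParallel with NCCL and is launched with torchrun; the linear layer stands in for a real model:

    # minimal_ddp.py -- illustrative only; model and data are placeholders.
    # Launch one process per GPU with:
    #   torchrun --standalone --nproc_per_node=8 minimal_ddp.py
    import os
    import torch
    import torch.distributed as dist
    from torch.nn.parallel import DistributedDataParallel as DDP

    def main():
        local_rank = int(os.environ["LOCAL_RANK"])  # set by torchrun for each process
        torch.cuda.set_device(local_rank)
        dist.init_process_group(backend="nccl")

        # Placeholder module standing in for an LLM.
        model = DDP(torch.nn.Linear(4096, 4096).cuda(local_rank), device_ids=[local_rank])
        optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)

        # One synthetic training step; DDP averages gradients across all GPUs.
        loss = model(torch.randn(8, 4096, device=local_rank)).pow(2).mean()
        loss.backward()
        optimizer.step()

        dist.destroy_process_group()

    if __name__ == "__main__":
        main()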

Scientific Computing

Accelerates complex simulations across physics, life sciences, and industrial research requiring precision and scale.

AI-as-a-Service Infrastructure

Empowers cloud providers to offer AI workloads at scale, from inference pipelines to full model training as a service.

High-Density AI Compute in 4U

Build the Backbone of Tomorrow’s AI Infrastructure

LLM Training, Redefined: Dual EPYC + B200 Power

A premium high-density 4U server built for large-scale AI, deep learning, and high-performance computing. Powered by dual AMD EPYC 9654 processors and supporting up to eight NVIDIA B200 NVL GPUs, the Powerworks Matrix A1 delivers exceptional performance for LLM training, multimodal AI, and simulation workloads.

Key Features

  • Dual AMD EPYC 9654 CPUs

    192 cores of parallel processing power designed for compute-intensive AI and HPC tasks.

  • 1TB ECC DDR5 Memory

    Handles massive AI datasets and complex simulations with high-speed, error-correcting memory.

  • Enterprise-Grade NVMe Storage

    Blazing-fast I/O with RAID-configured NVMe for model checkpointing, datasets, and AI pipelines (see the checkpointing sketch after this list).

  • Built for AI Infrastructure

    Ideal for AI research labs, tech enterprises, and cloud AI providers scaling generative AI models.

  • Redundant 3000W Titanium PSUs

    Reliable, power-efficient dual-supply design for mission-critical uptime and load balancing.
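
As a sketch of the checkpointing workflow the NVMe array is aimed at, the snippet below saves and restores training state; the mount path and the use of PyTorch are illustrative assumptions, not product defaults:

    # Illustrative checkpoint save/restore to a local NVMe volume.
    # "/mnt/nvme" is a hypothetical mount point, not a product default.
    import torch

    CHECKPOINT_PATH = "/mnt/nvme/checkpoints/latest.pt"

    def save_checkpoint(model, optimizer, step):
        torch.save(
            {"step": step,
             "model": model.state_dict(),
             "optimizer": optimizer.state_dict()},
            CHECKPOINT_PATH,
        )

    def load_checkpoint(model, optimizer):
        state = torch.load(CHECKPOINT_PATH, map_location="cpu")
        model.load_state_dict(state["model"])
        optimizer.load_state_dict(state["optimizer"])
        return state["step"]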


SPECIFICATIONS

Form Factor

4U Rackmount

Height: 6.87″ (174.5 mm)

Width: 17.3″ (440 mm)

Depth: 31.5″ (800 mm)

Processors:
– 2 x AMD EPYC 9654 (96 cores / 192 threads per CPU; 192 cores total)
– 2.4 GHz base, up to 3.7 GHz boost
– 384 MB L3 cache per CPU
 
GPU Support:
– Up to 8 x NVIDIA B200 NVL GPUs
– PCIe Gen5
 
Memory:
– 1TB ECC DDR5
 
Storage:
– Enterprise-grade NVMe, RAID-configured
 
Power Supply:
– 2 x 3000W redundant Titanium-rated PSUs