Gooxi AI Server Family: Tailored Performance for Every AI Scenario

From lightweight edge inference to large-scale data center model training and flexible cross-scenario deployment, AI workloads demand servers with distinct form factors, scalability, and energy efficiency. Gooxi’s latest AI server lineup, built on the Intel® Eagle Stream platform, meets these varied needs with precision. Powered by 4th/5th Gen Intel® Xeon® Scalable processors (in 1- or 2-socket configurations) and equipped with Intel® E810 or XXV710 25GbE network adapters, the series delivers scenario-specific, ready-to-deploy compute solutions.
The 4U SY4108G-G4 excels in space efficiency and flexible deployment. Supporting direct CPU-GPU connection or PCIe switch expansion, it accommodates up to eight 600W dual-width GPUs. With 32 DDR5 DIMM slots boosting memory bandwidth by 75%, it is well suited to lightweight AI inference tasks. The system offers up to 15 PCIe 5.0 slots plus one OCP 3.0 slot, along with 12- or 24-bay tri-mode storage (SATA/SAS/NVMe) for real-time processing in sectors such as healthcare and financial risk control. Four redundant CRPS power supplies (2000W–3200W) and twelve hot-swappable fans ensure reliable 24/7 edge operation.
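For operators bringing up such a system, one quick way to verify how the installed GPUs are attached (directly to the CPU root complex or behind a PCIe switch) is to read the link matrix reported by the driver. The sketch below is illustrative only: it assumes NVIDIA GPUs and the nvidia-smi utility, and the helper names are not part of any Gooxi tooling.

```python
# Illustrative sketch (assumes NVIDIA GPUs and nvidia-smi on the PATH).
import subprocess

def gpu_inventory() -> list[str]:
    """Return one line per detected GPU: index, model name, PCIe bus ID."""
    result = subprocess.run(
        ["nvidia-smi", "--query-gpu=index,name,pci.bus_id", "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    )
    return result.stdout.strip().splitlines()

def gpu_topology() -> str:
    """Return the GPU/NIC link matrix (PIX/PXB = via PCIe bridges or switches,
    PHB/NODE/SYS = routed through the CPU)."""
    result = subprocess.run(
        ["nvidia-smi", "topo", "-m"],
        capture_output=True, text=True, check=True,
    )
    return result.stdout

if __name__ == "__main__":
    for line in gpu_inventory():
        print(line)
    print(gpu_topology())
```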
With a vertical 6U design, the SY6108G-G4 supports eight 600W GPUs up to 70.1mm wide, including 3-slot and 3.3-slot-width cards, enabling high-concurrency training clusters in a single chassis. Storage remains a 12-bay tri-mode configuration, while cooling is upgraded to a layered airflow design (12×6056 + 4×8038 fans) backed by redundant hot-swappable CRPS power supplies. Optimized for long-duration, high-load workloads, it is the ideal choice for generative AI, digital twins, and other compute-intensive applications.
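To illustrate how eight GPUs in a single chassis would typically be driven for training, here is a minimal data-parallel sketch. It assumes a PyTorch/CUDA software stack, which the announcement does not specify, uses a placeholder model, and would be launched with torchrun --nproc_per_node=8.

```python
# Minimal data-parallel training sketch for an 8-GPU chassis (placeholder model).
# Launch: torchrun --nproc_per_node=8 train_ddp.py
import os
import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP

def main():
    dist.init_process_group("nccl")              # one process per GPU
    local_rank = int(os.environ["LOCAL_RANK"])   # set by torchrun
    torch.cuda.set_device(local_rank)

    model = torch.nn.Linear(1024, 1024).cuda(local_rank)  # placeholder model
    model = DDP(model, device_ids=[local_rank])
    opt = torch.optim.SGD(model.parameters(), lr=1e-3)

    for _ in range(10):                          # dummy training steps
        x = torch.randn(32, 1024, device=local_rank)
        loss = model(x).square().mean()
        opt.zero_grad()
        loss.backward()                          # gradients all-reduced across GPUs
        opt.step()

    dist.destroy_process_group()

if __name__ == "__main__":
    main()
```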
The 8U modular SY8108G-G4 adapts to a wide range of AI compute needs. Available in fan-cooled (eight 4.5-slot-width GPUs) and turbo-cooled (eight 2.5-slot-width GPUs) versions, it offers a choice between high density and quiet operation. With 13 PCIe expansion slots plus support for half-height PCIe modules, it handles AI training, graphics rendering, and HPC with ease. Eight CRPS power modules (1600W–2700W, N+N/N+M redundancy) combine with IPMI 2.0 remote management and TPM 2.0 security for zero-downtime operation in large-scale AI clusters and enterprise cloud deployments.
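Because the platform exposes an IPMI 2.0-compliant BMC, chassis health (fan speed, PSU state, temperatures) can be polled out of band. A minimal sketch, assuming ipmitool is installed and using placeholder BMC address and credentials:

```python
# Minimal out-of-band health check over IPMI (placeholder BMC address/credentials).
import subprocess

BMC_HOST = "192.0.2.10"   # placeholder BMC IP
BMC_USER = "admin"        # placeholder credentials
BMC_PASS = "changeme"

def read_sensor_records() -> str:
    """Dump the BMC's sensor data repository: temperatures, fan RPM, PSU status."""
    cmd = [
        "ipmitool", "-I", "lanplus",
        "-H", BMC_HOST, "-U", BMC_USER, "-P", BMC_PASS,
        "sdr", "list",
    ]
    return subprocess.run(cmd, capture_output=True, text=True, check=True).stdout

if __name__ == "__main__":
    print(read_sensor_records())
```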
From the space-maximizing SY4108G-G4, to the training-optimized SY6108G-G4, and the cross-scenario SY8108G-G4, Gooxi’s AI server family goes beyond hardware specs—addressing real AI industry pain points with targeted solutions. As AI moves from algorithm innovation to industrial adoption, Gooxi will continue to advance in domestic substitution, green computing, and edge intelligence—driving AI infrastructure from usable to truly optimal.