
IPU Server16
IPU SERVERS FOR AI TRAINING AND INFERENCE
Make new breakthroughs in machine intelligence with 8 Graphcore C2 PCIe cards connected with high-speed IPU-Links™ in industry-standard OEM systems from Dell and Inspur
Download White Paper

The Dell DSS8440 IPU Server is a 4U rack-mounted chassis with eight Graphcore C2 PCIe cards, fully connected with high-speed IPU-Links™.
Designed for both training and inference, this IPU Server is ideal for experimentation, pre-production pilots and commercial deployment.
Read DSS8440 product brief

Tackle your most challenging machine learning workloads, for both training and inference, with the new Inspur NF5568M5 IPU Server.
An industry-standard 5U chassis with eight C2 PCIe cards tightly coupled with high-speed IPU-Links™ puts 16 Colossus™ Mk1 GC2 IPUs, delivering 1.6 petaFLOPS of AI compute, at the disposal of AI innovators for experiments and research, pilots and full production deployment. Available for purchase today.
Get started with the Inspur IPU Server
Each IPU delivers a new level of fine-grained parallel processing across thousands of independent processing threads. The whole machine intelligence model is held inside the IPU in In-Processor Memory to maximise memory bandwidth and deliver high throughput for faster time to train and the lowest-latency inference.
See record-breaking time to train with modern, high-accuracy computer vision models like ResNeXt and EfficientNet. Explore new, large Natural Language Processing models that take full advantage of the IPU's native sparsity support.
High-performance training and low-latency inference capability on the same hardware improves utilisation and flexibility in the cloud and on-premises, significantly improving total cost of ownership.
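To make this concrete, here is a minimal PopTorch sketch (assuming Graphcore's Poplar SDK with PopTorch is installed; the model, shapes and hyperparameters are purely illustrative) that compiles the same PyTorch module once for training and once for low-latency inference on the IPU:

```python
import torch
import poptorch


class ClassifierWithLoss(torch.nn.Module):
    """Hypothetical model: PopTorch computes the loss inside forward()."""

    def __init__(self):
        super().__init__()
        self.net = torch.nn.Linear(128, 10)
        self.loss = torch.nn.CrossEntropyLoss()

    def forward(self, x, labels=None):
        out = self.net(x)
        if labels is None:                   # inference path
            return out
        return out, self.loss(out, labels)   # training path


model = ClassifierWithLoss()

# Compile the same module twice: once for training, once for inference.
train_model = poptorch.trainingModel(
    model,
    options=poptorch.Options(),
    optimizer=torch.optim.SGD(model.parameters(), lr=0.01),
)
infer_model = poptorch.inferenceModel(model, options=poptorch.Options())

x = torch.randn(8, 128)
labels = torch.randint(0, 10, (8,))

_, loss = train_model(x, labels)   # one training step on the IPU
predictions = infer_model(x)       # low-latency inference on the IPU
```

Because both executables are built from one module, weights trained on the IPU are reused directly for inference on the same hardware.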
The IPU is designed to scale. Models are getting larger and demand for AI compute is growing exponentially. High-bandwidth IPU-Links™ allow tight integration of 16 IPUs in the server, while InfiniBand support allows IPU Servers to work together in a datacenter.
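As a sketch of how that scale is exposed in software (again assuming PopTorch; the replication factor, batch size and model are illustrative, not a definitive configuration), data-parallel replication can spread work across all 16 IPUs in the server:

```python
import torch
import poptorch

# Illustrative options: replicate the model across the server's 16 IPUs,
# so each host step runs 16 replicas in parallel over IPU-Links.
opts = poptorch.Options()
opts.replicationFactor(16)
opts.deviceIterations(4)  # run 4 device iterations per host round trip

model = torch.nn.Linear(128, 10)
infer_model = poptorch.inferenceModel(model, options=opts)

# poptorch.DataLoader scales the host batch to replicas * device iterations.
dataset = torch.utils.data.TensorDataset(torch.randn(1024, 128))
loader = poptorch.DataLoader(opts, dataset, batch_size=8)

for (batch,) in loader:
    out = infer_model(batch)  # executed across all 16 IPUs
```

The same options object drives training as well; replicas exchange gradients over the IPU-Links, and the replication factor can grow as more IPU Servers are linked over InfiniBand.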
Learn more about how it works