
INTRODUCING IPU-MACHINE AND IPU-POD SYSTEMS

Second generation IPU systems for AI infrastructure at scale

Watch Video

IPU-M2000

The core building block for AI infrastructure. The IPU-M2000 packs 1 petaFLOP of AI compute in a slim 1U blade.

Learn More
IPU-POD16

Pre-configured as a 4 petaFLOPS AI system, the IPU-POD16 is where you experience the power and flexibility of larger IPU systems.

Learn More
IPU-POD64

Delivering 16 petaFLOPS of AI compute for both training and inference workloads, the IPU-POD64 is designed for AI at scale.

Learn More
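The petaFLOPS figures quoted above scale linearly with the number of IPU-M2000 building blocks in a POD. A minimal sketch of that arithmetic, assuming four IPU-M2000 blades per IPU-POD16 and sixteen per IPU-POD64 (blade counts inferred from the product line-up, not stated on this page):

```python
# Sketch: IPU-POD aggregate compute scales linearly with IPU-M2000 blades.
# Blade counts per POD are assumptions, not figures from this page.
M2000_PETAFLOPS = 1  # one 1U IPU-M2000 blade, per the figure above


def pod_petaflops(num_m2000: int) -> int:
    """Aggregate AI compute for a POD built from num_m2000 blades."""
    return num_m2000 * M2000_PETAFLOPS


print(pod_petaflops(4))   # IPU-POD16 -> 4 petaFLOPS
print(pod_petaflops(16))  # IPU-POD64 -> 16 petaFLOPS
```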
Graphcloud

A secure IPU cloud service that adds state-of-the-art AI compute on demand, with no on-premise infrastructure deployment required.

Request Access
BERT-Large: Training

Best for Natural Language Processing

The IPU delivers impressive performance for NLP: the IPU-POD64 trains BERT-Large more than 2.5 times faster than a comparable DGX A100 platform, cutting hours from AI development cycles.

EfficientNet-B0: Inference

Best for Computer Vision

The IPU-M2000 delivers a significant performance advantage over the Nvidia A100 GPU. Running EfficientNet on the IPU is straightforward and does not require the extra INT8 quantisation effort, which can also affect accuracy.

May 05, 2021
Accelerating America with new Graphcore partners
Learn more
Apr 27, 2021
Graphcore Japan hits the ground running with SCSK and HPC Systems
Learn more
Apr 22, 2021
Graphcore dives deep into software architecture layers on Practical AI Podcast
Learn more
Apr 22, 2021
Graphcore advances sparse compute in new EU research project
Learn more

Graphcore enters the System Business claiming Economics vastly better than Nvidia's

Learn more

Graphcore unveils New GC200 Chip and the Expandable M2000 IPU Machine that runs on them

Learn more

Graphcore takes on Nvidia with Latest AI Chip

Learn more