We’re thrilled to announce a partnership with Spell, an exciting leader in operationalizing AI, to deliver next-generation IPU-based infrastructure that lets developers run AI applications quickly and easily at scale, resulting in faster time to value.
Graphcore’s AI compute systems are delivering outstanding performance and efficiency for more and more customers, across a wider range of industries, every day.
Ensuring ease of access and delivering outstanding functionality for AI in production are key to growing our user-base and helping developers maximise their investment in IPUs.
That’s why we're working with Spell.
Spell operationalizes deep learning for natural language processing, computer vision and speech recognition at scale, bringing together the most popular tools and frameworks in a single, powerful software platform.
From initial model training and automated optimisation, to integration with deployment tools and cloud service providers, Spell is unifying the expanding AI ecosystem.
They are the perfect partner to help take Graphcore users from initial exploration to managing scale-up systems.
IPU Test Drive
As part of our partnership, Spell and Graphcore are offering a self-guided free trial so users can get hands-on with the integrated solution. Using browser-based notebook access to IPUs in the cloud via the Spell platform, developers get six hours of free access to road-test IPU systems with a range of popular AI models, including natural language models such as BERT, computer vision models like EfficientNet, and Graph Neural Networks (GNNs) such as TGN.
The notebooks provide an interactive way to help you quickly get familiar with key programming principles on the IPU while also exploring its power with the latest and most commonly used machine learning model examples.
It’s an exciting and entirely new way for anyone who has heard about the IPU’s amazing capabilities to experience them first-hand.
Scaling with Spell
Graphcore’s relationship with Spell extends well beyond the use of Spell Workspaces to broaden access to IPU compute.
IPU hardware has been seamlessly integrated with the Spell orchestrator, letting developers streamline their end-to-end MLOps workflow through Spell’s powerful command-line interface.
Workspaces manage the IPU compute, storage, and everything under the hood, allowing users to focus on experimentation, model development, and fine-tuning for the IPU.
It is now possible to automate pipelines for training, optimisation, testing, and inference on IPUs, while Spell takes care of all infrastructure configuration behind the scenes. Spell also offers a model registry that lets users track versioning of different models as well as monitor model performance.
Spell’s platform promises to be a powerful companion to Graphcore users as they progress on their AI journey and extend their compute capability.
With a technology-agnostic approach, Spell is bringing together leading players in the AI industry – from TensorFlow and PyTorch to Kubernetes, Docker, AWS, Google Cloud, Snowflake, Grafana and many more.
The result is that Spell users are able to ‘unlock answers faster’ - a goal that is closely aligned with Graphcore’s mission to accelerate innovation in AI.
Graphcore customers frequently talk about how the IPU’s enhanced capabilities allow them to complete tasks faster and iterate their model development more quickly. Tools such as Spell’s model optimisation, measurement, and automation can only enhance that process.
We will continue to work with Spell to ensure that the power and flexibility of Graphcore systems and our Poplar software stack are deeply integrated with Spell’s ML tools – both lowering the barriers to entry and raising the performance bar in AI compute.