The latest release of Graphcore’s Poplar SDK, version 2.6, is now available for download from our support portal and Docker Hub.
Supporting our newest Bow Pod systems, as well as previous generations, the Poplar SDK empowers our customers to innovate and develop high-performance applications on IPUs.
Additions to Graphcore Model Garden and public examples
Our aim is to make it as easy as possible for users to deploy a wide range of models optimised for the IPU. We are continually updating our Model Garden and associated GitHub repositories. To that end, a number of new models and public examples have been made available.
- GPT-2-Large – inference (PyTorch)
- GPT-2-Medium – inference (PyTorch)
- CLIP – training (PyTorch)
- Conformer-Medium – training (PyTorch)
Benchmark performance results for many models in our Model Garden, across multiple frameworks and platforms, have been updated for SDK 2.6 and published on the Performance Results page of our website.
Poplar SDK 2.6 highlights and new features
While a summary of the Poplar SDK 2.6 release is provided below, a full list of updates can be found in the Poplar SDK 2.6 release notes.
Support for TensorFlow 2.6 and the addition of TensorFlow Serving (preview)
For this release we have migrated the codebase of our open-source version of TensorFlow 2 to version 2.6 (from 2.5).
Additionally, Poplar SDK 2.6 includes a preview release of TensorFlow Serving for the IPU, as an alternative to the Poplar Triton backend for production serving. This distribution allows users to export a precompiled model to the standard SavedModel format, which can later be deployed for inference using the Graphcore distribution of TensorFlow Serving.
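As an illustrative sketch of the export flow, using only standard TensorFlow APIs (the IPU-specific export helpers for precompiled models are described in the user guides below):

```python
import tensorflow as tf

# A small Keras model standing in for a real IPU model. With the
# Graphcore distribution, the model would be precompiled for the IPU
# before export using the IPU-specific export helpers.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(128, activation="relu", input_shape=(784,)),
    tf.keras.layers.Dense(10),
])

# Export to the standard SavedModel format; "1" is the model version
# directory that TensorFlow Serving expects.
tf.saved_model.save(model, "/tmp/my_model/1")
```

The exported SavedModel can then be served in the usual way, for example with `tensorflow_model_server --model_name=my_model --model_base_path=/tmp/my_model --rest_api_port=8501` (the standard TensorFlow Serving flags), substituting the Graphcore distribution's server binary.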
User guides for TensorFlow Serving 2 and TensorFlow Serving 1 are available on the Graphcore documentation portal.
Open-sourcing of IPU TensorFlow Addons
IPU TensorFlow Addons is a collection of addons created for the Graphcore port of TensorFlow. These include layers and optimizers for Keras, as well as legacy TensorFlow layers and optimizers.
This package was first added in the Poplar SDK 2.4 release and has now been open-sourced to the community.
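For illustration, a minimal sketch of using the addons in a Keras model, assuming the `LSTM` layer and `AdamIpuOptimizer` names from the IPU TensorFlow Addons API reference:

```python
import tensorflow as tf
from tensorflow.python import ipu
from ipu_tensorflow_addons.keras import layers as addons_layers
from ipu_tensorflow_addons.keras import optimizers as addons_optimizers

# Build the model inside an IPUStrategy scope (IPU system configuration
# omitted for brevity) so the IPU-optimised layers run on the IPU.
strategy = ipu.ipu_strategy.IPUStrategy()
with strategy.scope():
    model = tf.keras.Sequential([
        # Drop-in replacement for the stock Keras LSTM layer.
        addons_layers.LSTM(64, input_shape=(16, 32)),
        tf.keras.layers.Dense(10),
    ])
    # IPU-specific variant of the Adam optimizer from the addons package.
    model.compile(
        optimizer=addons_optimizers.AdamIpuOptimizer(learning_rate=1e-3),
        loss="sparse_categorical_crossentropy",
    )
```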
Keras: new separate package
The TensorFlow 2.6 release provides a separate Keras package, installed using pip. The Poplar SDK 2.6 release, which includes the Graphcore distribution of TensorFlow 2.6, provides a new wheel for the Graphcore distribution of Keras incorporating IPU-specific extensions. This new Keras implementation has been open-sourced.
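As a minimal sketch of what using the new package looks like, assuming the `IPUConfig` and `IPUStrategy` APIs from the IPU TensorFlow 2 user guide:

```python
import tensorflow as tf
from tensorflow.python import ipu

# Configure the IPU system to attach to a single IPU.
config = ipu.config.IPUConfig()
config.auto_select_ipus = 1
config.configure_ipu_system()

# Keras models built inside an IPUStrategy scope are compiled for,
# and executed on, the IPU.
strategy = ipu.ipu_strategy.IPUStrategy()
with strategy.scope():
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(128, activation="relu"),
        tf.keras.layers.Dense(10),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  steps_per_execution=50)
    # model.fit(dataset, epochs=2)  # dataset: a tf.data.Dataset
```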
Updated installation instructions for TensorFlow 2 can be found in the Pod system getting started guides and the Graphcloud TensorFlow 2 Quick Start.
Keras gradient aggregation selection and training stability improvement
The Poplar SDK 2.6 release adds the ability to control how gradients are aggregated when using Keras models with gradient accumulation or pipelining. Aggregating with a running mean, rather than a sum, can improve training stability with float16 gradients.
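For example, a running-mean reduction might be selected on a Keras model along these lines (the argument and enum names here follow the IPU TensorFlow 2 user guide; treat them as assumptions and check the guide for your SDK version):

```python
import tensorflow as tf
from tensorflow.python import ipu
from tensorflow.python.ipu import gradient_accumulation as ga

strategy = ipu.ipu_strategy.IPUStrategy()
with strategy.scope():
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(128, activation="relu"),
        tf.keras.layers.Dense(10),
    ])
    model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

    # Accumulate gradients over 16 micro-batches per weight update and
    # combine them with a running mean rather than a sum, which can
    # improve stability when gradients are kept in float16.
    # NOTE: argument names assumed from the user guide.
    model.set_gradient_accumulation_options(
        gradient_accumulation_steps_per_replica=16,
        gradient_accumulation_reduction_method=(
            ga.GradientAccumulationReductionMethod.RUNNING_MEAN),
    )
```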
Documentation related to gradient accumulation and pipelining with Keras models is provided in the IPU TensorFlow 2 user guide. For reference, see our TensorFlow 2 BERT and ResNet50 models on GitHub.
PopXL
PopXL, a framework library allowing explicit optimisation for best use of compute, communication, and memory, has received several new features, bug fixes, and improvements, which are detailed in the Poplar SDK 2.6 release notes.
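To give a flavour of the explicit style PopXL encourages, here is a minimal sketch along the lines of the new basic-concepts tutorial, adding two tensors and streaming the result back to the host (API names per the PopXL user guide):

```python
import popxl
import popxl.ops as ops

# Construct the IR explicitly; PopXL exposes graph construction,
# variables and host I/O directly to the user.
ir = popxl.Ir()
with ir.main_graph:
    a = popxl.variable(3, dtype=popxl.int32, name="a")
    b = popxl.constant(1, dtype=popxl.int32, name="b")
    o = a + b

    # Device-to-host stream to return the result.
    o_d2h = popxl.d2h_stream(o.shape, o.dtype, name="output_stream")
    ops.host_store(o_d2h, o)

# Run on the IPU Model simulator; use "ipu_hw" for real hardware.
session = popxl.Session(ir, "ipu_model")
outputs = session.run()
print(outputs[o_d2h])  # -> 4
```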
The PopXL user guide has been substantially updated, and six new PopXL tutorials (basic concepts, custom optimiser, data parallelism, pipelining, remote variables, phased execution) have been published.
We have also published popxl-addons, a new public GitHub repo extending the PopXL framework through the inclusion of additional operations, transformations, modules and utilities.
New and updated developer resources
In addition to the highlights, features, and additions to the Model Garden and public examples detailed above, a range of developer resources have been created or updated between the releases of Poplar SDK 2.5 and 2.6.
For access to all the latest documentation, tutorials, code examples, webinars, videos, research papers and further resources for IPU programming, check out our developer portal.