Graphcore talks Scaling up AI on Weights and Biases Podcast

May 31, 2021

Written By:

Sally Doherty

Machine intelligence is a unique computational workload, with characteristics distinctly different from those of HPC algorithms or graphics programs. With Moore's Law slowing and model sizes growing rapidly, there is a need for specialised machine learning hardware designed to run AI workloads efficiently.

Phil Brown, Graphcore's Director of Applications, recently spoke to Lukas Biewald, founder of Weights & Biases, about the role of AI processors such as the IPU in driving progress in machine intelligence, from enabling sparsity to accelerating BERT.

Listen to the Weights & Biases Podcast 

Scaling Experiments with Weights & Biases

Pursuing new approaches to machine learning can be a challenge, particularly once AI workloads move from pilot to production. At scale, even a slight drop in performance can be costly. Recognising this, Weights & Biases have created a suite of tools to help developers scale up their projects more easily.

Graphcore engineers have been using Weights & Biases' tools for AI and machine learning to support their work scaling IPU experiments. In a recent Weights & Biases case study, Graphcore's Phil Brown explained how the complexity of tracking experiments across IPU-POD systems and multiple deployment locations led his team to adopt the Weights & Biases platform for their large-scale experiments, including their BERT-Large training implementation on the IPU.
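
For readers who want to try the same kind of experiment tracking, below is a minimal sketch of logging a training run with the Weights & Biases Python client. The project name, configuration values and simulated loss curve are hypothetical placeholders for illustration, not details of Graphcore's actual setup.

```python
import math
import wandb

# Hypothetical example: start a tracked run. The project name and
# config values here are placeholders, not Graphcore's real settings.
run = wandb.init(
    project="bert-large-ipu-demo",  # hypothetical project name
    config={
        "model": "BERT-Large",
        "batch_size": 512,
        "learning_rate": 5e-4,
    },
)

for step in range(1000):
    # Stand-in for a real training step; a genuine run would log the
    # optimiser's actual loss and any other metrics of interest.
    loss = math.exp(-step / 200)
    wandb.log({"loss": loss}, step=step)

run.finish()
```

Each logged metric appears as a live chart in the W&B dashboard, which is what makes it practical to compare runs across many systems and deployment locations rather than collating logs by hand.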

Read the Weights & Biases Case Study