
Model Garden

Framework:
  • PyTorch
  • TensorFlow 1
  • TensorFlow 2
  • Hugging Face
  • PopART
  • PaddlePaddle
Category:
  • Natural Language Processing
  • Computer Vision
  • Speech Processing
  • GNN
  • Probabilistic Modelling
  • Reinforcement Learning
  • Generative
  • AI for Simulation
  • Other
Filter:
  • Benchmarks

Training Models

GPT2-Large Training

GPT2-L training in PyTorch leveraging the Hugging Face Transformers library.

View the code
  • Natural Language Processing
  • PyTorch

GPT2-Medium Training

GPT2-M training in PyTorch leveraging the Hugging Face Transformers library.

View the code
  • Natural Language Processing
  • PyTorch

GPT2-Medium Fine-tuning

HuggingFace Optimum implementation for fine-tuning a GPT2-Medium transformer model.

View the code
  • Natural Language Processing
  • Hugging Face

GPT2-Small Training

GPT2-S training in PyTorch leveraging the Hugging Face Transformers library.

View the code
  • Natural Language Processing
  • PyTorch
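
The PyTorch GPT2 training cards above all follow the standard Hugging Face Transformers causal-language-modelling recipe. Below is a minimal, hardware-agnostic sketch of that recipe, assuming the public gpt2 checkpoint and a toy batch; the actual Graphcore applications add IPU-specific compilation (for example wrapping the model with PopTorch), which is not shown here.

    import torch
    from transformers import GPT2LMHeadModel, GPT2TokenizerFast

    # Illustrative checkpoint; the applications train the S/M/L variants.
    tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
    model = GPT2LMHeadModel.from_pretrained("gpt2")
    model.train()

    optimizer = torch.optim.AdamW(model.parameters(), lr=1e-5)

    # For causal language modelling the labels are the input ids themselves;
    # the model shifts them internally when computing the loss.
    batch = tokenizer(["A short example sentence."], return_tensors="pt")
    outputs = model(input_ids=batch["input_ids"],
                    attention_mask=batch["attention_mask"],
                    labels=batch["input_ids"])

    outputs.loss.backward()
    optimizer.step()
    optimizer.zero_grad()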

GPT2-Small Fine-tuning

HuggingFace Optimum implementation for fine-tuning a GPT2-Small transformer model.

View the code
  • Natural Language Processing
  • Hugging Face

BERT-Large Training

BERT-Large (Bidirectional Encoder Representations from Transformers) using PyTorch for NLP training on IPUs.

View the code
  • Natural Language Processing
  • PyTorch

BERT-Large Training

BERT-Large (Bidirectional Encoder Representations from Transformers) using TensorFlow 1 for NLP training on IPUs.

View the code
  • Natural Language Processing
  • TensorFlow 1

BERT-Large Training

BERT-Large (Bidirectional Encoder Representations from Transformers) using TensorFlow 2 for NLP training on IPUs.

View the code
  • Natural Language Processing
  • TensorFlow 2

BERT-Large Training

BERT-Large (Bidirectional Encoder Representations from Transformers) using PopART for NLP training on IPUs.

View the code
  • Natural Language Processing
  • PopART

BERT-Large Pretraining

HuggingFace Optimum implementation for pre-training a BERT-Large transformer model.

View the code
  • Natural Language Processing
  • Hugging Face

BERT-Base Training

BERT-Base (Bidirectional Encoder Representations from Transformers) using PyTorch for NLP training on IPUs.

View the code
  • Natural Language Processing
  • PyTorch

BERT-Base Training

BERT-Base (Bidirectional Encoder Representations from Transformers) using TensorFlow 2 for NLP training on IPUs.

View the code
  • Natural Language Processing
  • TensorFlow 2

BERT-Base Training

BERT-Base (Bidirectional Encoder Representations from Transformers) using TensorFlow 1 for NLP training on IPUs.

View the code
  • Natural Language Processing
  • TensorFlow 1

BERT-Base Training

BERT-Base (Bidirectional Encoder Representations from Transformers) using PopART for NLP training on IPUs.

View the code
  • Natural Language Processing
  • PopART

BERT-Base Training

BERT-Base pre-training and SQuAD fine-tuning using Baidu's PaddlePaddle framework on IPUs.

View the code
  • Natural Language Processing
  • PaddlePaddle

BERT-Base Pretraining

HuggingFace Optimum implementation for pretraining a BERT-Base transformer model using the bert-base-uncased configuration.

View the code
  • Natural Language Processing
  • Hugging Face

BERT-Base Fine-tuning

HuggingFace Optimum implementation for fine-tuning a BERT-Base transformer model using bert-base-uncased on the squad dataset.

View the code
  • Natural Language Processing
  • Hugging Face

RoBERTa-Large Training

HuggingFace Optimum implementation for training RoBERTa-Large - a transformer model for sequence classification, token classification or question answering.

View the code
  • Natural Language Processing
  • Hugging Face
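
Several cards above use HuggingFace Optimum for IPU training and fine-tuning. The sketch below shows the general wiring, assuming the optimum-graphcore package (its IPUConfig, IPUTrainer and IPUTrainingArguments classes) and an illustrative Graphcore/roberta-large-ipu configuration id on the Hugging Face Hub; the dataset (GLUE SST-2) and hyperparameters are placeholders rather than the settings used by the actual applications.

    from datasets import load_dataset
    from transformers import AutoModelForSequenceClassification, AutoTokenizer
    # Classes assumed from the optimum-graphcore package.
    from optimum.graphcore import IPUConfig, IPUTrainer, IPUTrainingArguments

    model_name = "roberta-large"
    tokenizer = AutoTokenizer.from_pretrained(model_name)
    model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2)

    # Tokenise a small text-classification dataset (SST-2, purely illustrative).
    dataset = load_dataset("glue", "sst2")
    dataset = dataset.map(
        lambda ex: tokenizer(ex["sentence"], truncation=True,
                             padding="max_length", max_length=128),
        batched=True)

    # "Graphcore/roberta-large-ipu" is an assumed Hub config id; substitute the
    # IPU configuration shipped with the application you are running.
    ipu_config = IPUConfig.from_pretrained("Graphcore/roberta-large-ipu")

    args = IPUTrainingArguments(output_dir="./checkpoints",
                                per_device_train_batch_size=2,
                                num_train_epochs=1)
    trainer = IPUTrainer(model=model,
                         ipu_config=ipu_config,
                         args=args,
                         train_dataset=dataset["train"],
                         eval_dataset=dataset["validation"])
    trainer.train()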

RoBERTa-Base Fine-tuning

HuggingFace Optimum implementation for fine-tuning RoBERTa-Base on the squad dataset for question answering tasks.

View the code
  • Natural Language Processing
  • Hugging Face

RoBERTa-Base Fine-tuning

HuggingFace Optimum implementation for fine-tuning RoBERTa-Base on the squad_v2 dataset for question answering tasks.

View the code
  • Natural Language Processing
  • Hugging Face

LXMERT Fine-tuning

HuggingFace Optimum implementation for fine-tuning LXMERT on the gqa-lxmert dataset for learning vision-and-language cross-modality representations.

View the code
  • Natural Language Processing
  • Hugging Face

DeBERTa Training

HuggingFace Optimum implementation for training DeBERTa - a transformer model that improves on BERT and RoBERTa using disentangled attention and an enhanced mask decoder.

View the code
  • Natural Language Processing
  • Hugging Face

LXMERT Fine-tuning

HuggingFace Optimum implementation for fine-tuning LXMERT on the vqa-lxmert dataset for learning vision-and-language cross-modality representations.

View the code
  • Natural Language Processing
  • Hugging Face

HuBERT Training

HuggingFace Optimum implementation for training HuBERT (Hidden-Unit BERT), a self-supervised speech representation learning approach.

View the code
  • Natural Language Processing
  • Hugging Face

BART Training

HuggingFace Optimum implementation for training BART - a transformer model for text generation and comprehension tasks.

View the code
  • Natural Language Processing
  • Hugging Face

T5 Training

HuggingFace Optimum implementation for training T5 - a transformer-based model that uses a text-to-text approach for translation, question answering, and classification.

View the code
  • Natural Language Processing
  • Hugging Face

GroupBERT Training

GroupBERT is an enhanced transformer architecture with efficient grouped structures.

View the code
  • Natural Language Processing
  • TensorFlow 1

PackedBERT Training

New BERT packing algorithm that removes padding for more efficient training in PyTorch.

View the code
  • Natural Language Processing
  • PyTorch

PackedBERT Training

New BERT packing algorithm that removes padding for more efficient training in PopART.

View the code
  • Natural Language Processing
  • PopART

ViT (Vision Transformer) Fine-tuning

ViT (Vision Transformer) fine-tuning in PyTorch using Hugging Face transformers.

View the code
  • Computer Vision
  • PyTorch
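
The ViT fine-tuning card above follows the usual Hugging Face Transformers image-classification pattern. Below is a minimal, hardware-agnostic sketch of that pattern, assuming the public google/vit-base-patch16-224-in21k checkpoint, an arbitrary 10-class head and random images in place of a real dataset; the Graphcore application adds IPU-specific compilation on top of this.

    import numpy as np
    import torch
    from transformers import ViTForImageClassification, ViTImageProcessor

    checkpoint = "google/vit-base-patch16-224-in21k"   # illustrative checkpoint
    processor = ViTImageProcessor.from_pretrained(checkpoint)
    model = ViTForImageClassification.from_pretrained(checkpoint, num_labels=10)
    model.train()
    optimizer = torch.optim.AdamW(model.parameters(), lr=3e-5)

    # Toy batch of random uint8 images standing in for a labelled image dataset.
    images = [np.random.randint(0, 256, (224, 224, 3), dtype=np.uint8) for _ in range(4)]
    inputs = processor(images=images, return_tensors="pt")
    labels = torch.randint(0, 10, (4,))

    outputs = model(pixel_values=inputs["pixel_values"], labels=labels)
    outputs.loss.backward()    # cross-entropy from the classification head
    optimizer.step()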

ViT (Vision Transformer) Pretraining

ViT (Vision Transformer) pretraining in PyTorch using Hugging Face transformers.

View the code
  • Computer Vision
  • PyTorch

ViT (Vision Transformer) Training

HuggingFace Optimum implementation for training a ViT (vision transformer) model.

View the code
  • Computer Vision
  • Hugging Face

DINO Training

Self-supervised Vision Transformer model for training in PyTorch.

View the code
  • Computer Vision
  • PyTorch

YOLOv3 Training

YOLOv3 - You Only Look Once - is a convolutional neural network model that performs object detection tasks on IPUs.

View the code
  • Computer Vision
  • TensorFlow 1

ResNet-50 Training

Image classification training on IPUs using the CNN (Convolutional Neural Network) model ResNet-50 with PyTorch.

View the code
  • Computer Vision
  • PyTorch

ResNet-50 Training

Image classification training on IPUs using the CNN (Convolutional Neural Network) model ResNet-50 with TensorFlow 2.

View the code
  • Computer Vision
  • TensorFlow 2
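
The TensorFlow 2 ResNet-50 card above corresponds to the standard Keras image-classification workflow. A minimal sketch follows, using random data in place of ImageNet; on IPUs the Graphcore TensorFlow port would additionally build and fit the model inside its IPU distribution-strategy scope, which is omitted here.

    import numpy as np
    import tensorflow as tf

    # ResNet-50 from Keras applications with a fresh 10-class head (illustrative).
    model = tf.keras.applications.ResNet50(weights=None, classes=10,
                                           input_shape=(224, 224, 3))
    model.compile(optimizer=tf.keras.optimizers.SGD(learning_rate=0.1, momentum=0.9),
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])

    # Random tensors standing in for a preprocessed ImageNet pipeline.
    images = np.random.rand(8, 224, 224, 3).astype("float32")
    labels = np.random.randint(0, 10, size=(8,))
    model.fit(images, labels, batch_size=4, epochs=1)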

ResNet-50 Training

Image classification training on IPUs using the CNN (Convolutional Neural Network) model ResNet-50 with TensorFlow 1.

View the code
  • Computer Vision
  • TensorFlow 1

EfficientNet-B0 Training

CNN (Convolutional Neural Network) image classification training on EfficientNet with PyTorch for IPU.

View the code
  • Computer Vision
  • PyTorch

EfficientNet-B4 Training

CNN (Convolutional Neural Network) image classification training on EfficientNet with TensorFlow 1 for IPU.

View the code
  • Computer Vision
  • TensorFlow 1

ResNeXt-101 Training

Image classification training on IPUs using the CNN (Convolutional Neural Network) model ResNeXt-101 with TensorFlow 1.

View the code
  • Computer Vision
  • TensorFlow 1

Faster-RCNN Training

IPU implementation of Faster-RCNN detection framework.

View the code
  • Computer Vision
  • PopART

Swin Pretraining

Swin: Hierarchical Vision Transformer model using Shifted Windows for training in PyTorch.

View the code
  • Computer Vision
  • PyTorch

UNet Medical Training

U-Net for biomedical image segmentation using TensorFlow 2 Keras for the IPU.

View the code
  • Computer Vision
  • TensorFlow 2

UNet Industrial Training

How to run a UNet Industrial training example with TensorFlow for image segmentation.

View the code
  • Computer Vision
  • TensorFlow 1

Mini DALL-E Training

Mini DALL-E Text-to-Image Generation training example with PyTorch for the IPU.

View the code
  • Computer Vision
  • PyTorch

TGN Training

TGN: Temporal Graph Networks is a dynamic GNN model for training on the IPU.

View the code
  • GNN
  • TensorFlow 1

MPNN Training

MPNN: Message Passing Neural Network - a popular GNN architecture for training on the IPU.

View the code
  • GNN
  • TensorFlow 2

Cluster-GCN Training

An efficient algorithm for training deep and large Graph Convolutional Networks.

View the code
  • GNN
  • TensorFlow 2

Neural Image Fields Training

Training a neural network model for reconstructing / compressing images in TensorFlow 2.

View the code
  • Other
  • TensorFlow 2

MCMC Training

Markov Chain Monte Carlo (MCMC) training on IPUs using standard TensorFlow Probability.

View the code
  • Probabilistic Modelling
  • TensorFlow 1
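
The MCMC application drives TensorFlow Probability's MCMC kernels on the IPU. A small sketch of that workflow is shown below, written against the current TF2-style TFP API with a toy Gaussian target rather than the model used in the application.

    import tensorflow as tf
    import tensorflow_probability as tfp

    tfd = tfp.distributions

    # Toy target density: a standard 2-D Gaussian standing in for a real posterior.
    target = tfd.MultivariateNormalDiag(loc=[0., 0.], scale_diag=[1., 1.])

    # Hamiltonian Monte Carlo transition kernel.
    kernel = tfp.mcmc.HamiltonianMonteCarlo(
        target_log_prob_fn=target.log_prob,
        step_size=0.1,
        num_leapfrog_steps=3)

    samples, is_accepted = tfp.mcmc.sample_chain(
        num_results=1000,
        num_burnin_steps=200,
        current_state=tf.zeros(2),
        kernel=kernel,
        trace_fn=lambda _, results: results.is_accepted)

    print("acceptance rate:", tf.reduce_mean(tf.cast(is_accepted, tf.float32)).numpy())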

Deep Voice 3 Training

Text-To-Speech training on IPUs using a Convolutional Sequence Learning technique.

View the code
  • Speech Processing
  • PopART

FastSpeech2 Training

FastSpeech2: Fast and High-Quality End-to-End Text to Speech training on IPUs with TensorFlow 2.

View the code
  • Speech Processing
  • TensorFlow 2

FastPitch Training

FastPitch: Parallel Text-to-speech with Pitch Prediction using PyTorch.

View the code
  • Speech Processing
  • PyTorch

Wav2Vec2 Training

HuggingFace Optimum implementation for training Wav2Vec2-Base - a speech recognition transformer model.

View the code
  • Speech Processing
  • Hugging Face

Conformer-Small Training

Convolution-augmented Transformer for Speech Recognition on IPUs using PyTorch.

View the code
  • Speech Processing
  • PyTorch

Conformer-Large Training

Convolution-augmented Transformer for Speech Recognition on IPUs using PyTorch.

View the code
  • Speech Processing
  • PyTorch

Transformer Transducer (RNN-T) Training

IPU implementation of the Speech Recognition Model with Transformer Encoders and RNN-T Loss.

View the code
  • Speech Processing
  • PopART

DIEN Training

DIEN (Deep Interest Evolution Network) training on IPUs with TensorFlow 1 - a recommendation model for click-through rate prediction.

View the code
  • Other
  • TensorFlow 1

DIN Training

DIN (Deep Interest Network) training on IPUs with TensorFlow 1 - a recommendation model for click-through rate prediction.

View the code
  • Other
  • TensorFlow 1

DeepDriveMD Training

Deep-Learning Driven Adaptive Molecular Simulations for Protein Folding using TensorFlow 2.

View the code
  • AI for Simulation
  • TensorFlow 2

CosmoFlow Training

A deep learning model for calculating cosmological parameters. The model primarily consists of 3D convolutions, pooling operations, and dense layers.

View the code
  • AI for Simulation
  • TensorFlow 1

Deep Molecular Dynamics (DeePMD-kit) Training

DeePMD-kit is a deep learning package for many-body potential energy representation and molecular dynamics.

View the code
  • AI for Simulation
  • TensorFlow 1

MobileNetv3 Training

MobileNetv3 - Convolutional neural network training for classification, detection and segmentation.

View the code
  • Computer Vision
  • PyTorch

Autoencoder Training

Custom autoencoder model on the IPU using TensorFlow 1 to train collaborative filtering for recommender systems.

View the code
  • Generative
  • TensorFlow 1

Contrastive Divergence VAE Training

Train a Variational Autoencoder / Markov Chain Monte Carlo hybrid model on IPUs with TensorFlow.

View the code
  • Generative
  • TensorFlow 1

Reinforcement Learning Training

How to train a deep reinforcement learning model on multiple IPUs with synchronous data parallel training.

View the code
  • Reinforcement Learning
  • TensorFlow 1

Sales Forecasting Training

How to train a sales forecasting machine learning model with TensorFlow on Graphcore's IPUs.

View the code
  • Other
  • TensorFlow 1

Inference Models

GPT2-Small Inference

GPT2-S inference in PyTorch leveraging the Hugging Face Transformers library.

View the code
  • Natural Language Processing
  • PyTorch
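
A minimal, hardware-agnostic sketch of the GPT2 text-generation inference the card above refers to, using the public gpt2 checkpoint and greedy decoding; the Graphcore application compiles the model for the IPU, which is not shown.

    import torch
    from transformers import GPT2LMHeadModel, GPT2TokenizerFast

    tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")   # illustrative checkpoint
    model = GPT2LMHeadModel.from_pretrained("gpt2")
    model.eval()

    inputs = tokenizer("Machine learning accelerators are", return_tensors="pt")

    with torch.no_grad():
        generated = model.generate(inputs["input_ids"],
                                   attention_mask=inputs["attention_mask"],
                                   max_new_tokens=20,
                                   do_sample=False)   # greedy decoding

    print(tokenizer.decode(generated[0], skip_special_tokens=True))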

BERT-Large Inference

BERT-Large (Bidirectional Encoder Representations from Transformers) for NLP inference on IPUs with TensorFlow 1.

View the code
  • Natural Language Processing
  • TensorFlow 1

BERT-Large Inference

BERT-Large (Bidirectional Encoder Representations from Transformers) using PopART for NLP inference on IPUs.

View the code
  • Natural Language Processing
  • PopART

BERT-Base Inference

BERT-Base (Bidirectional Encoder Representations from Transformers) using PopART for NLP inference on IPUs.

View the code
  • Natural Language Processing
  • PopART

YOLOv3 Inference

YOLOv3 - You Only Look Once - is a convolutional neural network model that performs object detection tasks on IPUs.

View the code
  • Computer Vision
  • TensorFlow 1

YOLOv4 Inference

YOLOv4 - You Only Look Once - is a convolutional neural network model that performs object detection tasks on IPUs.

View the code
  • Computer Vision
  • PyTorch

ResNet-50 Inference

Image classification inference on IPUs using the CNN (Convolutional Neural Network) model ResNet-50 with PyTorch.

View the code
  • Computer Vision
  • PyTorch
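
A plain-PyTorch sketch of the ResNet-50 classification inference described above, using a torchvision model and a random batch; the IPU application would typically compile the eval-mode model with PopTorch's inference wrapper before running it.

    import torch
    import torchvision

    # Randomly initialised here for brevity; the application loads trained weights.
    model = torchvision.models.resnet50(weights=None)
    model.eval()

    # Random batch standing in for preprocessed 224x224 RGB images.
    images = torch.rand(4, 3, 224, 224)

    with torch.no_grad():
        logits = model(images)
        predictions = logits.argmax(dim=1)

    print(predictions)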

ResNet-50 Inference

Image classification inference on IPUs using the CNN (Convolutional Neural Network) model ResNet-50 with TensorFlow 1.

View the code
  • Computer Vision
  • TensorFlow 1

EfficientNet-B0/B4 Inference

CNN (Convolutional Neural Network) image classification inference on EfficientNet with PyTorch for IPU.

View the code
  • Computer Vision
  • PyTorch

EfficientDet (D0-D4) Inference

Efficient object detection model for inference using TensorFlow 2 on the IPU.

View the code
  • Computer Vision
  • TensorFlow 2

Reference Evapotranspiration (ET0) Inference

Spatial interpolation analysis and prediction calculation for weather forecasting, drought forecasting, and smart irrigation.

View the code
  • AI for Simulation
  • TensorFlow 1

ResNeXt-101 Inference

Image classification inference on IPUs using the CNN (Convolutional Neural Network) model ResNeXt-101 with PyTorch.

View the code
  • Computer Vision
  • PyTorch

ResNeXt-101 Inference

Image classification inference on IPUs using the CNN (Convolutional Neural Network) model ResNeXt-101 with TensorFlow 1.

View the code
  • Computer Vision
  • TensorFlow 1

ResNeXt-101 Inference

Image classification inference on IPUs using the CNN (Convolutional Neural Network) model ResNeXt-101 with PopART.

View the code
  • Computer Vision
  • PopART

UNet Medical Inference

U-Net for biomedical image segmentation using TensorFlow 2 Keras for the IPU.

View the code
  • Computer Vision
  • TensorFlow 2

Neural Image Fields Inference

Running inference on a neural network model for reconstructing / compressing images in TensorFlow 2.

View the code
  • Other
  • TensorFlow 2

FastSpeech2 Inference

FastSpeech2: Fast and High-Quality End-to-End Text to Speech inference on IPUs with TensorFlow 2.

View the code
  • Speech Processing
  • TensorFlow 2

DIEN Inference

DIEN (Deep Interest Evolution Network) inference on IPUs with TensorFlow 1 - a recommendation model for click-through rate prediction.

View the code
  • Other
  • TensorFlow 1

DIN Inference

DIN (Deep Interest Network) inference on IPUs with TensorFlow 1 - a recommendation model for click-through rate prediction.

View the code
  • Other
  • TensorFlow 1

Approximate Bayesian Computation (ABC) COVID-19 Inference

A representative implementation of Approximate Bayesian Computation (ABC) for simulation-based statistical inference on observed COVID-19 infection data.

View the code
  • AI for Simulation
  • TensorFlow 2

MobileNetv2 Inference

MobileNetv2 - Convolutional neural network inference for classification, detection and segmentation.

View the code
  • Computer Vision
  • TensorFlow 1

MobileNetv3 Inference

MobileNetv3 - Convolutional neural network inference for classification, detection and segmentation.

View the code
  • Computer Vision
  • PyTorch

Autoencoder Inference

Custom autoencoder inference model on the IPU using TensorFlow 1 to perform collaborative filtering in recommender systems.

View the code
  • Generative
  • TensorFlow 1