<img height="1" width="1" style="display:none" src="https://www.facebook.com/tr?id=145304570664993&amp;ev=PageView&amp;noscript=1">

Model Garden

The popular latent diffusion model for generative AI with support for text-to-image on IPUs using Hugging Face Optimum.

The popular latent diffusion model for generative AI with support for image-to-image on IPUs using Hugging Face Optimum.

The popular latent diffusion model for generative AI with support for inpainting on IPUs using Hugging Face Optimum.
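
The text-to-image flow for these entries looks roughly like the minimal sketch below. It assumes the optimum-graphcore package exposes an IPUStableDiffusionPipeline; the class name, import path, and checkpoint are assumptions to verify against the installed release.

    import torch
    from optimum.graphcore.diffusers import IPUStableDiffusionPipeline  # assumed import path

    # Load a diffusers-format Stable Diffusion checkpoint for the IPU.
    pipe = IPUStableDiffusionPipeline.from_pretrained(
        "runwayml/stable-diffusion-v1-5",  # assumed checkpoint; any diffusers-format SD model
        torch_dtype=torch.float16,
    )

    # Text-to-image: prompt in, PIL image out.
    image = pipe("a snowy mountain village at dusk").images[0]
    image.save("village.png")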

Text generation inference on IPU using GPT-J 6B in PyTorch.

Text entailment on IPU by fine-tuning GPT-J 6B in PyTorch.

GPT-3 (Generative Pretrained Transformer 3) is a state-of-the-art language processing AI model developed by OpenAI.

GPT2-L training in PyTorch leveraging the Hugging Face Transformers library.

GPT2-L inference in PyTorch leveraging the Hugging Face Transformers library.

GPT2-M training in PyTorch leveraging the Hugging Face Transformers library.

HuggingFace Optimum implementation for fine-tuning a GPT2-Medium transformer model.

GPT2-M inference in PyTorch leveraging the Hugging Face Transformers library.

GPT2-S training in PyTorch leveraging the Hugging Face Transformers library.

HuggingFace Optimum implementation for fine-tuning a GPT2-Small transformer model.

GPT2-S inference in PyTorch leveraging the Hugging Face Transformers library.
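
For reference, plain Hugging Face Transformers generation with GPT-2 looks like the sketch below; the IPU examples wrap a model like this with PopTorch and shard it across IPUs. Checkpoint names gpt2, gpt2-medium, and gpt2-large correspond to the S, M, and L variants.

    import torch
    from transformers import GPT2LMHeadModel, GPT2Tokenizer

    # "gpt2" is the small checkpoint; swap in "gpt2-medium" or "gpt2-large".
    tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
    model = GPT2LMHeadModel.from_pretrained("gpt2").eval()

    inputs = tokenizer("Graph processors are", return_tensors="pt")
    with torch.no_grad():
        out = model.generate(**inputs, max_new_tokens=30, do_sample=True, top_k=50)
    print(tokenizer.decode(out[0], skip_special_tokens=True))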

A hybrid GNN/Transformer for training Molecular Property Prediction using IPUs on the PCQM4Mv2 dataset. Winner of the Open Graph Benchmark Large-Scale Challenge.

A hybrid GNN/Transformer for Molecular Property Prediction inference on IPUs, trained on the PCQM4Mv2 dataset. Winner of the Open Graph Benchmark Large-Scale Challenge.

Knowledge graph embedding (KGE) for link-prediction training on IPUs using Poplar with the WikiKG90Mv2 dataset. Winner of the Open Graph Benchmark Large-Scale Challenge.

Knowledge graph embedding (KGE) for link-prediction inference on IPUs using Poplar with the WikiKG90Mv2 dataset. Winner of the Open Graph Benchmark Large-Scale Challenge.

Knowledge graph embedding (KGE) for link-prediction training on IPUs using PyTorch with the WikiKG90Mv2 dataset. Winner of the Open Graph Benchmark Large-Scale Challenge.
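
These KGE models learn vector embeddings such that plausible (head, relation, tail) triples score higher than corrupted ones. Below is a minimal TransE-style scorer for illustration only; TransE is an assumption here, and the repositories may use a different scoring function (e.g. DistMult or ComplEx).

    import torch
    import torch.nn as nn

    class TransE(nn.Module):
        """Illustrative TransE scorer: for a plausible link, head + relation ≈ tail."""
        def __init__(self, n_entities, n_relations, dim=256):
            super().__init__()
            self.ent = nn.Embedding(n_entities, dim)
            self.rel = nn.Embedding(n_relations, dim)

        def forward(self, head, rel, tail):
            # Higher (less negative) score means a more plausible triple.
            return -torch.norm(self.ent(head) + self.rel(rel) - self.ent(tail), dim=-1)

    model = TransE(n_entities=1000, n_relations=50)
    score = model(torch.tensor([0]), torch.tensor([3]), torch.tensor([7]))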

BERT-Large (Bidirectional Encoder Representations from Transformers) using PyTorch for NLP training on IPUs.

BERT-Large (Bidirectional Encoder Representations from Transformers) using TensorFlow 1 for NLP training on IPUs.

BERT-Large (Bidirectional Encoder Representations from Transformers) for NLP inference on IPUs with TensorFlow 1.

BERT-Large (Bidirectional Encoder Representations from Transformers) using TensorFlow 2 for NLP training on IPUs.

BERT-Large (Bidirectional Encoder Representations from Transformers) using PopART for NLP training on IPUs.

BERT-Large (Bidirectional Encoder Representations from Transformers) using PopART for NLP inference on IPUs.

HuggingFace Optimum implementation for fine-tuning a BERT-Large transformer model.

HuggingFace Optimum implementation for pre-training a BERT-Large transformer model.

DistilBERT is a small, fast, cheap and light Transformer model trained by distilling BERT base using Hugging Face Optimum on IPUs.

BERT-Base (Bidirectional Encoder Representations from Transformers) using PyTorch for NLP training on IPUs.

BERT-Base (Bidirectional Encoder Representations from Transformers) using TensorFlow 2 for NLP training on IPUs.

BERT-Base (Bidirectional Encoder Representations from Transformers) using TensorFlow 1 for NLP training on IPUs.

BERT-Base (Bidirectional Encoder Representations from Transformers) using PopART for NLP training on IPUs.

BERT-Base (Bidirectional Encoder Representations from Transformers) using PopART for NLP inference on IPUs.

BERT-Base pre-training and SQuAD fine-tuning using Baidu's PaddlePaddle framework on IPUs.

HuggingFace Optimum implementation for pre-training a BERT-Base transformer model using bert-base-uncased.

HuggingFace Optimum implementation for fine-tuning a BERT-Base transformer model using bert-base-uncased on the squad dataset.
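
The Optimum entries above and below all follow the same IPUTrainer pattern, a drop-in analogue of the transformers Trainer. Here is a minimal sketch on a sequence-classification task (chosen to keep preprocessing short; the SQuAD recipe differs mainly in dataset preparation and model head). The Graphcore/bert-base-ipu config name is an assumption to check on the Hugging Face Hub, and exact arguments vary by optimum-graphcore release.

    from datasets import load_dataset
    from transformers import AutoTokenizer, AutoModelForSequenceClassification
    from optimum.graphcore import IPUConfig, IPUTrainer, IPUTrainingArguments

    tok = AutoTokenizer.from_pretrained("bert-base-uncased")
    ds = load_dataset("glue", "sst2", split="train[:512]")
    ds = ds.map(lambda e: tok(e["sentence"], truncation=True,
                              padding="max_length", max_length=128), batched=True)

    model = AutoModelForSequenceClassification.from_pretrained(
        "bert-base-uncased", num_labels=2)
    ipu_config = IPUConfig.from_pretrained("Graphcore/bert-base-ipu")  # assumed Hub config

    args = IPUTrainingArguments(output_dir="out", num_train_epochs=1,
                                per_device_train_batch_size=2)
    trainer = IPUTrainer(model=model, ipu_config=ipu_config, args=args,
                         train_dataset=ds, tokenizer=tok)
    trainer.train()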

HuggingFace Optimum implementation for training RoBERTa-Large - a transformer model for sequence classification, token classification or question answering.

HuggingFace Optimum implementation for fine-tuning RoBERTa-Base on the squad dataset for text generation and comprehension tasks.

HuggingFace Optimum implementation for fine-tuning RoBERTa-Base on the squad_v2 dataset for text generation and comprehension tasks.

HuggingFace Optimum implementation for fine-tuning LXMERT on the gqa-lxmert dataset for learning vision-and-language cross-modality representations.

HuggingFace Optimum implementation for training DeBERTa - a transformer model that improves on BERT and RoBERTa using disentangled attention and an enhanced mask decoder.

HuggingFace Optimum implementation for fine-tuning LXMERT on the vqa-lxmert dataset for learning vision-and-language cross-modality representations.

HuggingFace Optimum implementation for training HuBERT (Hidden-Unit BERT), a self-supervised speech representation learning approach.

HuggingFace Optimum implementation for training BART - a transformer model for text generation and comprehension tasks.

HuggingFace Optimum implementation for training T5 - a transformer-based model that uses a text-to-text approach for translation, question answering, and classification.

GroupBERT - an enhanced transformer architecture with efficient grouped structures in TensorFlow 1.

New BERT packing algorithm that removes padding for more efficient training in PyTorch.

New BERT packing algorithm that removes padding for more efficient training in PopART.

A variant of the Conformer model based on WeNet (not ESPnet) in PyTorch, using a hybrid CTC/attention architecture with a transformer or conformer encoder.

CLIP (Contrastive Language-Image Pre-Training) - a neural network trained on a variety of (image, text) pairs using PyTorch.
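
A minimal zero-shot similarity sketch with the stock transformers CLIP classes follows (the image path is a placeholder); the IPU port runs the same model compiled through PopTorch.

    from PIL import Image
    from transformers import CLIPModel, CLIPProcessor

    model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
    processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

    # Score an image (placeholder path) against candidate captions.
    inputs = processor(text=["a photo of a cat", "a photo of a dog"],
                       images=Image.open("photo.jpg"), return_tensors="pt", padding=True)
    probs = model(**inputs).logits_per_image.softmax(dim=1)
    print(probs)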

ViT (Vision Transformer) fine-tuning in PyTorch using Hugging Face transformers.

ViT (Vision Transformer) pretraining in PyTorch using Hugging Face transformers.

HuggingFace Optimum implementation for fine-tuning a ViT (vision transformer) model.

Self-supervised Vision Transformer model for training in PyTorch.

YOLOv3 - You Only Look Once - a convolutional neural network model that performs object detection tasks on IPUs using TensorFlow 1.

YOLOv4 - You Only Look Once - a convolutional neural network model that performs object detection tasks on IPUs using PyTorch.

Image classification training on IPUs using the CNN (Convolutional Neural Network) model ResNet-50 with PyTorch.

Image classification inference on IPUs using the CNN (Convolutional Neural Network) model ResNet-50 with PyTorch.

Image classification training on IPUs using the CNN (Convolutional Neural Network) model ResNet-50 with TensorFlow 2.

Image classification training on IPUs using the CNN (Convolutional Neural Network) model ResNet-50 with TensorFlow 1.

Image classification inference on IPUs using the CNN (Convolutional Neural Network) model ResNet-50 with TensorFlow 1.
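
In plain PyTorch the ResNet-50 inference path looks like this sketch; the IPU examples compile such a model with PopTorch, shown as commented-out lines since it needs the poptorch wheel from the Poplar SDK and IPU hardware.

    import torch
    from torchvision import models
    from PIL import Image

    weights = models.ResNet50_Weights.IMAGENET1K_V1
    model = models.resnet50(weights=weights).eval()
    preprocess = weights.transforms()

    img = preprocess(Image.open("cat.jpg")).unsqueeze(0)  # "cat.jpg" is a placeholder path
    with torch.no_grad():
        probs = model(img).softmax(dim=-1)
    print(weights.meta["categories"][probs.argmax().item()])

    # On IPU (assumes poptorch is installed):
    # import poptorch
    # ipu_model = poptorch.inferenceModel(model)
    # probs = ipu_model(img).softmax(dim=-1)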

CNN (Convolutional Neural Network) image classification training on EfficientNet with PyTorch for IPU.

CNN (Convolutional Neural Network) image classification inference on EfficientNet with PyTorch for IPU.

Efficient object detection model for inference using TensorFlow 2 on the IPU.

CNN (Convolutional Neural Network) image classification training on EfficientNet with TensorFlow 1 for IPU.

Spatial interpolation analysis and prediction using TensorFlow 1 for weather forecasting, drought forecasting, and smart irrigation.

Image classification training on IPUs using the CNN (Convolutional Neural Network) model ResNeXt-101 with TensorFlow 1.

Image classification inference on IPUs using the CNN (Convolutional Neural Network) model ResNeXt-101 with PyTorch.

Image classification inference on IPUs using the CNN (Convolutional Neural Network) model ResNeXt-101 with TensorFlow 1.

Image classification inference on IPUs using the CNN (Convolutional Neural Network) model ResNeXt-101 with PopART.

IPU implementation of Faster-RCNN detection framework using PopART.

Swin: Hierarchical Vision Transformer model using Shifted Windows for pretraining in PyTorch.

Implementation of MAE computer vision model in PyTorch for the IPU based on the paper "Masked Autoencoders Are Scalable Vision Learners".

Implementation of Frozen in Time, a joint video and image encoder for end-to-end retrieval, on the IPU in PyTorch.

Swin: Hierarchical Vision Transformer model using Shifted Windows for fine-tuning in PyTorch.

U-Net for biomedical image segmentation using TensorFlow 2 Keras for the IPU.

How to run a UNet Industrial training example with TensorFlow 1 for image segmentation.

Mini DALL-E Text-to-Image Generation training example with PyTorch for the IPU.

TGN: Temporal Graph Networks is a dynamic GNN model for training on the IPU using TensorFlow 1.

TGN: Temporal Graph Networks is a dynamic GNN model for training on the IPU using PyTorch.

GIN (Graph Isomorphism Network) - a simple graph neural network whose discriminative/representational power is shown to equal that of the Weisfeiler-Lehman (WL) graph isomorphism test.

An efficient algorithm for training deep and large Graph Convolutional Networks using TensorFlow 2.

GNN-based model in PyTorch Geometric developed for modelling quantum interactions between atoms in a molecule.
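
For a flavour of these graph models, here is a minimal GIN graph classifier in PyTorch Geometric; the dimensions, two-layer MLP, and toy graph are illustrative rather than any repository's configuration.

    import torch
    from torch import nn
    from torch_geometric.nn import GINConv, global_add_pool

    class GIN(nn.Module):
        def __init__(self, in_dim=16, hidden=64, n_classes=2):
            super().__init__()
            # GINConv wraps an MLP; sum aggregation gives WL-level expressive power.
            mlp = nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU(),
                                nn.Linear(hidden, hidden))
            self.conv = GINConv(mlp)
            self.head = nn.Linear(hidden, n_classes)

        def forward(self, x, edge_index, batch):
            h = self.conv(x, edge_index).relu()
            return self.head(global_add_pool(h, batch))  # graph-level logits

    # Toy input: 3 nodes, 2 undirected edges, all in one graph (batch id 0).
    x = torch.randn(3, 16)
    edge_index = torch.tensor([[0, 1, 1, 2], [1, 0, 2, 1]])
    batch = torch.zeros(3, dtype=torch.long)
    logits = GIN()(x, edge_index, batch)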

Training a neural network model for reconstructing / compressing images in TensorFlow 2.

Running inference on a neural network model for reconstructing / compressing images in TensorFlow 2.

Markov Chain Monte Carlo (MCMC) training on IPUs using standard TensorFlow Probability.
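
A minimal Hamiltonian Monte Carlo run with stock TensorFlow Probability is sketched below; the toy target distribution and kernel settings are illustrative, not the example's configuration.

    import tensorflow as tf
    import tensorflow_probability as tfp

    # Toy target: a standard normal log-density.
    target = tfp.distributions.Normal(loc=0., scale=1.)

    kernel = tfp.mcmc.HamiltonianMonteCarlo(
        target_log_prob_fn=target.log_prob,
        step_size=0.1,
        num_leapfrog_steps=3)

    samples = tfp.mcmc.sample_chain(
        num_results=1000,
        num_burnin_steps=200,
        current_state=tf.zeros([]),
        kernel=kernel,
        trace_fn=None)
    print(tf.reduce_mean(samples), tf.math.reduce_std(samples))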

Text-To-Speech training on IPUs with PopART using a Convolutional Sequence Learning technique.

FastSpeech2: Fast and High-Quality End-to-End Text to Speech training on IPUs with TensorFlow 2.

FastSpeech2: Fast and High-Quality End-to-End Text to Speech inference on IPUs with TensorFlow 2.

FastPitch: Parallel Text-to-speech with Pitch Prediction using PyTorch.

HuggingFace Optimum implementation for training Wav2Vec2-Base - a speech recognition transformer model.

HuggingFace Optimum implementation for Wav2Vec2-Base inference - a speech recognition transformer model.

IPU implementation of the Speech Recognition Model with Transformer Encoders and RNN-T Loss in PopART.

DIEN (Deep Interest Evolution Network) training on IPUs with TensorFlow 1 - a recommendation model for click-through rate prediction.

DIEN (Deep Interest Evolution Network) inference on IPUs with TensorFlow 1 - a recommendation model for click-through rate prediction.

DIN (Deep Interest Network) training on IPUs with TensorFlow 1 - a recommendation model for click-through rate prediction.

DIN (Deep Interest Network) inference on IPUs with TensorFlow 1 - a recommendation model for click-through rate prediction.

Deep-Learning Driven Adaptive Molecular Simulations for Protein Folding using TensorFlow 2.

A deep learning model for calculating cosmological parameters in TensorFlow 1. The model primarily consists of 3D convolutions, pooling operations, and dense layers.

A representative implementation of Approximate Bayesian Computation (ABC) for simulation-based inference, applied to observed COVID-19 infection data, using TensorFlow 2.

DeePMD-kit - a deep learning package for many-body potential energy representation and molecular dynamics using TensorFlow 1.

Monte Carlo ray tracing application built in Poplar for neural rendering on the IPU.

MobileNetv3 - Convolutional neural network training for classification, detection and segmentation using PyTorch.

MobileNetv2 - Convolutional neural network inference for classification, detection and segmentation using TensorFlow 1.

MobileNetv3 - Convolutional neural network inference for classification, detection and segmentation using PyTorch.

Custom autoencoder model on the IPU using TensorFlow 1 to train collaborative filtering in recommender systems.

Custom autoencoder inference model on the IPU using TensorFlow 1 to perform collaborative filtering in recommender systems.

Train a Variational Autoencoder / Markov Chain Monte Carlo hybrid model on IPUs with TensorFlow 1.

How to train a deep reinforcement learning model in TensorFlow 1 on multiple IPUs with synchronous data parallel training.

How to train a sales forecasting machine learning model with TensorFlow 1 on Graphcore's IPUs.
