🚀 Accelerate training and inference of 🤗 Transformers and 🤗 Diffusers with easy-to-use hardware optimization tools
⚡️ An easy-to-use, fast deep learning model deployment toolkit for ☁️ cloud, 📱 mobile, and 📹 edge, covering 20+ mainstream scenarios across image, video, text, and audio, with 150+ SOTA models, end-to-end optimization, and multi-platform, multi-framework support.
An implementation of the Search by Triplet track reconstruction algorithm on the Graphcore IPU.
A PyTorch library for knowledge graph embedding on Graphcore IPUs, implementing the BESS distribution framework
TessellateIPU: low level Poplar tile programming from Python
Poplar implementation of FlashAttention for IPU
JAX for Graphcore IPU (experimental)
Blazing fast training of 🤗 Transformers on Graphcore IPUs
Example code and applications for machine learning on Graphcore IPUs
Code for the CoNLL BabyLM workshop paper "Mini Minds: Exploring Bebeshka and Zlata Baby Models"
TensorFlow for the IPU
Poplar Advanced Runtime for the IPU
PyTorch interface for the IPU
Track reconstruction on the Graphcore IPU.
This repository contains basic installation steps for the Poplar SDK on a Graphcore IPU. In the future, I plan to implement and add basic code for parallel computing algorithms