    Repositories list

    • The Triton backend for the ONNX Runtime.
      C++
      BSD 3-Clause "New" or "Revised" License
      Updated Nov 15, 2024
    • server (Public)
      The Triton Inference Server provides an optimized cloud and edge inferencing solution.
      Python
      BSD 3-Clause "New" or "Revised" License
      Updated Nov 15, 2024
    • C++
      BSD 3-Clause "New" or "Revised" License
      Updated Nov 15, 2024
    • Triton CLI is an open source command line interface that enables users to create, deploy, and profile models served by the Triton Inference Server.
      Python
      Updated Nov 15, 2024
    • tutorials (Public)
      This repository contains tutorials and examples for Triton Inference Server
      Python
      BSD 3-Clause "New" or "Revised" License
      Updated Nov 15, 2024
    • pytriton (Public)
      PyTriton is a Flask/FastAPI-like interface that simplifies Triton's deployment in Python environments (see the PyTriton sketch after this list).
      Python
      Apache License 2.0
      Updated Nov 15, 2024
    • The Triton backend for the PyTorch TorchScript models.
      C++
      BSD 3-Clause "New" or "Revised" License
      Updated Nov 15, 2024
    • The Triton TensorRT-LLM Backend
      Python
      Apache License 2.0
      Updated Nov 14, 2024
    • client (Public)
      Triton Python, C++, and Java client libraries, plus gRPC-generated client examples for Go, Java, and Scala (see the Python client sketch after this list).
      Python
      BSD 3-Clause "New" or "Revised" License
      Updated Nov 14, 2024
    • core (Public)
      The core library and APIs implementing the Triton Inference Server.
      C++
      BSD 3-Clause "New" or "Revised" License
      Updated Nov 14, 2024
    • Triton Model Analyzer is a CLI tool that helps you understand the compute and memory requirements of models served by the Triton Inference Server.
      Python
      Apache License 2.0
      Updated Nov 13, 2024
    • The Triton backend for TensorRT.
      C++
      BSD 3-Clause "New" or "Revised" License
      Updated Nov 13, 2024
    • Third-party source packages that are modified for use in Triton.
      C
      BSD 3-Clause "New" or "Revised" License
      Updated Nov 12, 2024
    • backend (Public)
      Common source, scripts and utilities for creating Triton backends.
      C++
      BSD 3-Clause "New" or "Revised" License
      Updated Nov 12, 2024
    • common (Public)
      Common source, scripts and utilities shared across all Triton repositories.
      C++
      BSD 3-Clause "New" or "Revised" License
      Updated Nov 12, 2024
    • C++
      Updated Nov 12, 2024
    • Triton backend that enables pre-processing, post-processing, and other logic to be implemented in Python (see the model.py skeleton after this list).
      C++
      BSD 3-Clause "New" or "Revised" License
      Updated Nov 12, 2024
    • TRITONCACHE implementation of a Redis cache
      C++
      BSD 3-Clause "New" or "Revised" License
      Updated Nov 12, 2024
    • An example Triton backend that demonstrates sending zero, one, or multiple responses for each request.
      C++
      BSD 3-Clause "New" or "Revised" License
      Updated Nov 12, 2024
    • The Triton backend for TensorFlow.
      C++
      BSD 3-Clause "New" or "Revised" License
      Updated Nov 12, 2024
    • The Triton repository agent that verifies model checksums.
      C++
      BSD 3-Clause "New" or "Revised" License
      Updated Nov 11, 2024
    • Simple Triton backend used for testing.
      C++
      BSD 3-Clause "New" or "Revised" License
      Updated Nov 8, 2024
    • OpenVINO backend for Triton.
      C++
      BSD 3-Clause "New" or "Revised" License
      Updated Nov 8, 2024
    • Implementation of a local in-memory cache for Triton Inference Server's TRITONCACHE API
      C++
      BSD 3-Clause "New" or "Revised" License
      Updated Nov 8, 2024
    • Example Triton backend that demonstrates most of the Triton Backend API.
      C++
      BSD 3-Clause "New" or "Revised" License
      Updated Nov 8, 2024
    • Python
      BSD 3-Clause "New" or "Revised" License
      Updated Nov 8, 2024
    • FIL backend for the Triton Inference Server
      Jupyter Notebook
      Apache License 2.0
      Updated Nov 6, 2024
    • The Triton backend that allows running GPU-accelerated data pre-processing pipelines implemented in DALI's Python API.
      C++
      MIT License
      Updated Nov 5, 2024
    • Triton Model Navigator is an inference toolkit designed for optimizing and deploying Deep Learning models with a focus on NVIDIA GPUs.
      Python
      Apache License 2.0
      Updated Sep 10, 2024
    • contrib (Public)
      Community contributions to Triton that are not officially supported or maintained by the Triton project.
      Python
      BSD 3-Clause "New" or "Revised" License
      Updated Jun 5, 2024
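
For the pytriton repository above, a minimal sketch of binding a Python inference function to Triton through the Triton/ModelConfig/Tensor interface; the model name "doubler", the tensor names, and the doubling logic are illustrative placeholders, not taken from the repository:

    import numpy as np

    from pytriton.decorators import batch
    from pytriton.model_config import ModelConfig, Tensor
    from pytriton.triton import Triton


    # Illustrative inference callable: doubles every value in the batch.
    @batch
    def infer_fn(data):
        return {"result": data * 2}


    with Triton() as triton:
        # Expose infer_fn as a Triton model named "doubler" (placeholder name).
        triton.bind(
            model_name="doubler",
            infer_func=infer_fn,
            inputs=[Tensor(name="data", dtype=np.float32, shape=(-1,))],
            outputs=[Tensor(name="result", dtype=np.float32, shape=(-1,))],
            config=ModelConfig(max_batch_size=8),
        )
        triton.serve()  # blocks, serving HTTP/gRPC endpoints until interrupted

With @batch, the inference callable receives inputs as batched NumPy arrays keyed by tensor name and returns a dict mapping output names to arrays.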
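For the client repository, a sketch of invoking a deployed model with the tritonclient HTTP API; the model name "my_model", the tensor names, shape, and dtype are placeholders that must match the target model's configuration:

    import numpy as np
    import tritonclient.http as httpclient

    # Connect to a Triton server on the default HTTP port.
    client = httpclient.InferenceServerClient(url="localhost:8000")

    # Build one FP32 input tensor (names and shape are placeholders).
    input_data = np.random.rand(1, 16).astype(np.float32)
    infer_input = httpclient.InferInput("INPUT0", list(input_data.shape), "FP32")
    infer_input.set_data_from_numpy(input_data)

    # Request a specific output tensor and run inference.
    result = client.infer(
        model_name="my_model",
        inputs=[infer_input],
        outputs=[httpclient.InferRequestedOutput("OUTPUT0")],
    )
    print(result.as_numpy("OUTPUT0"))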
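For the Python backend entry, a bare-bones model.py skeleton following the TritonPythonModel convention; the tensor names "INPUT0"/"OUTPUT0" and the pass-through logic are illustrative:

    import triton_python_backend_utils as pb_utils


    class TritonPythonModel:
        """Each Python-backend model ships a model.py implementing this class."""

        def initialize(self, args):
            # args carries the model config and paths; load resources here.
            pass

        def execute(self, requests):
            # Return one response per request; this skeleton echoes INPUT0 as OUTPUT0.
            responses = []
            for request in requests:
                in_tensor = pb_utils.get_input_tensor_by_name(request, "INPUT0")
                out_tensor = pb_utils.Tensor("OUTPUT0", in_tensor.as_numpy())
                responses.append(
                    pb_utils.InferenceResponse(output_tensors=[out_tensor])
                )
            return responses

        def finalize(self):
            # Called once when the model is unloaded.
            pass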