
tranformers

Here are 18 public repositories matching this topic...

This project demonstrates text generation with Transformers using the T5 model. It includes the code needed to train the model on a custom dataset and generate new text.

  • Updated Nov 5, 2024
  • Jupyter Notebook
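
As a rough illustration of what the T5 project above involves, here is a minimal text-generation sketch using the Hugging Face transformers library. The checkpoint name and prompt are placeholders, not details taken from the repository.

    # Minimal T5 text-generation sketch with Hugging Face transformers.
    # "t5-small" and the prompt are illustrative; the project may use a
    # custom fine-tuned checkpoint instead.
    from transformers import T5ForConditionalGeneration, T5Tokenizer

    model_name = "t5-small"  # assumed checkpoint
    tokenizer = T5Tokenizer.from_pretrained(model_name)
    model = T5ForConditionalGeneration.from_pretrained(model_name)

    prompt = "summarize: Transformers are attention-based neural networks used across NLP."
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=50)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))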

Explore the rich flavors of Indian desserts with TunedLlavaDelights. Utilizing Llava fine-tuning, our project unveils detailed nutritional profiles, taste notes, and optimal consumption times for beloved sweets. Dive into a fusion of AI innovation and culinary tradition.

  • Updated Mar 17, 2024
  • Python
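
For context, the sketch below shows how a LLaVA-style vision-language model can be queried through the Hugging Face transformers library. The model id, image file, and prompt are illustrative assumptions, not details of the TunedLlavaDelights project.

    # Illustrative query against a LLaVA-style vision-language model.
    # The public base checkpoint, image path, and prompt are assumptions;
    # the project above fine-tunes its own model.
    from PIL import Image
    from transformers import AutoProcessor, LlavaForConditionalGeneration

    model_id = "llava-hf/llava-1.5-7b-hf"  # assumed base checkpoint
    processor = AutoProcessor.from_pretrained(model_id)
    model = LlavaForConditionalGeneration.from_pretrained(model_id)

    image = Image.open("gulab_jamun.jpg")  # hypothetical local image of a dessert
    prompt = "USER: <image>\nDescribe the nutritional profile and taste notes of this sweet. ASSISTANT:"

    inputs = processor(images=image, text=prompt, return_tensors="pt")
    output = model.generate(**inputs, max_new_tokens=120)
    print(processor.decode(output[0], skip_special_tokens=True))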

A RAG (Retrieval-Augmented Generation) application combines large language models (LLMs) with a retrieval system to enhance the generation of responses by accessing relevant external knowledge. In this case, the application is developed using the GROQ API and OpenAI embeddings, and is built on the Gemma model.

  • Updated Sep 1, 2024
  • Python
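
For orientation, here is a minimal RAG sketch under stated assumptions: OpenAI embeddings for retrieval and a Gemma model served through the Groq API for generation, mirroring the components named above. The model ids, documents, and question are placeholders, not taken from the repository.

    # Minimal RAG sketch: embed documents, retrieve the closest one to the
    # question, and answer with a Gemma model via the Groq API.
    # Model ids, documents, and the question are illustrative placeholders.
    import numpy as np
    from openai import OpenAI
    from groq import Groq

    openai_client = OpenAI()   # expects OPENAI_API_KEY in the environment
    groq_client = Groq()       # expects GROQ_API_KEY in the environment

    documents = [
        "Retrieval-augmented generation grounds LLM answers in external documents.",
        "Embedding models map text to vectors so similar passages can be found.",
    ]

    def embed(texts):
        # Embed a batch of texts with an (assumed) OpenAI embedding model.
        resp = openai_client.embeddings.create(model="text-embedding-3-small", input=texts)
        return np.array([d.embedding for d in resp.data])

    doc_vectors = embed(documents)
    question = "How does RAG reduce hallucinations?"
    q_vec = embed([question])[0]

    # Cosine-similarity retrieval: pick the single most relevant document.
    scores = doc_vectors @ q_vec / (np.linalg.norm(doc_vectors, axis=1) * np.linalg.norm(q_vec))
    context = documents[int(np.argmax(scores))]

    # Generate an answer grounded in the retrieved context via the Groq API.
    completion = groq_client.chat.completions.create(
        model="gemma2-9b-it",  # assumed Gemma model id on Groq
        messages=[
            {"role": "system", "content": f"Answer using this context:\n{context}"},
            {"role": "user", "content": question},
        ],
    )
    print(completion.choices[0].message.content)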
