Training transformer models (e.g. RoBERTa, GPT2 and GPT-J) from scratch.
Build, customize and control your own LLMs. From data pre-processing to fine-tuning, xTuring provides an easy way to personalize open-source LLMs. Join our Discord community: https://discord.gg/TgHXuSJEk6
The Truth Is In There: Improving Reasoning in Language Models with Layer-Selective Rank Reduction
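The core idea behind layer-selective rank reduction is replacing a weight matrix with a low-rank approximation. A minimal NumPy sketch of truncated-SVD rank reduction (an illustration of the general technique, not this paper's exact layer-selection procedure; all shapes are hypothetical):

```python
import numpy as np

def rank_reduce(W: np.ndarray, k: int) -> np.ndarray:
    """Return the best rank-k approximation of W via truncated SVD."""
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    return U[:, :k] * s[:k] @ Vt[:k, :]

rng = np.random.default_rng(0)
W = rng.standard_normal((64, 64))   # stand-in for a layer's weight matrix
W_low = rank_reduce(W, k=8)
print(np.linalg.matrix_rank(W_low))  # 8
```

In the paper's setting, such replacements are applied selectively to particular layers of the transformer rather than uniformly.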
telegram bot for self-hosted local inference of stable diffusion, text-to-speech and large language models, such as llama3
Notebook for running GPT-J/GPT-J-6B – the cost-effective alternative to ChatGPT, GPT-3 & GPT-4 for many NLP tasks. Available on IPUs as a Paperspace notebook.
Super easy to use library for doing LLaMA/GPT-J stuff! - Mirror of: https://gitlab.com/niansa/libjustlm
[ACL 2023] Solving Math Word Problems via Cooperative Reasoning induced Language Models
langchain-chat is an AI-driven Q&A system that leverages OpenAI's GPT-4 model and FAISS for efficient document indexing. It loads and splits documents from websites or PDFs, remembers conversations, and provides accurate, context-aware answers based on the indexed data. Easy to set up and extend.
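The load/split/index/retrieve pipeline described above can be sketched in a few lines. This toy version uses naive fixed-size chunking and a bag-of-words embedding in place of OpenAI embeddings and FAISS (all names and sizes here are illustrative, not langchain-chat's code):

```python
import numpy as np

def split_chunks(text: str, size: int = 40) -> list[str]:
    """Naive fixed-size chunking; a stand-in for a real document splitter."""
    return [text[i:i + size] for i in range(0, len(text), size)]

def embed(texts: list[str], vocab: list[str]) -> np.ndarray:
    """Toy bag-of-words embedding; real systems use learned dense embeddings."""
    return np.array([[t.lower().count(w) for w in vocab] for t in texts], dtype=float)

doc = ("FAISS builds an index over dense vectors. "
       "Queries are embedded and matched to the nearest chunks.")
vocab = ["faiss", "index", "vectors", "queries", "chunks"]
chunks = split_chunks(doc)
index = embed(chunks, vocab)        # rows = chunk vectors

query_vec = embed(["How are queries matched?"], vocab)[0]
scores = index @ query_vec          # dot-product similarity
best = chunks[int(np.argmax(scores))]
print(best)
```

A production setup would swap in a proper splitter, an embedding model, and a FAISS index, but the retrieval logic is the same: embed the query, score it against stored chunk vectors, return the closest chunk as context.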
Natural language model AI via HTTP
Production HTTP server for our fine-tuned GPT-J model gptj-title-teaser-10k.
Curated list of open source and openly accessible large language models
UI tool for fine-tuning and testing your own LoRA models based on LLaMA, GPT-J and more. One-click run on Google Colab. + A Gradio ChatGPT-like chat UI to demonstrate your language models.
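LoRA, the technique these tools fine-tune with, freezes the pretrained weight and learns only a low-rank update. A minimal NumPy sketch of the forward pass (hypothetical dimensions; real implementations such as PEFT wrap this into framework layers):

```python
import numpy as np

rng = np.random.default_rng(42)
d, r = 512, 8                            # model dim and LoRA rank (hypothetical)
W = rng.standard_normal((d, d))          # frozen pretrained weight
A = rng.standard_normal((r, d)) * 0.01   # trainable down-projection
B = np.zeros((d, r))                     # trainable up-projection, zero-init

def lora_forward(x: np.ndarray, alpha: float = 16.0) -> np.ndarray:
    """y = x W^T + (alpha/r) * x A^T B^T -- only A and B are trained."""
    return x @ W.T + (alpha / r) * (x @ A.T) @ B.T

x = rng.standard_normal((2, d))
y = lora_forward(x)
# With B zero-initialized, the adapter starts as an exact no-op:
assert np.allclose(y, x @ W.T)
```

Because only A and B (2·d·r parameters instead of d²) receive gradients, fine-tuning fits on modest GPUs, which is what makes one-click Colab runs feasible.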
Title and teaser generation for journalistic texts.
GPT-J 6B inference on TensorRT with INT-8 precision
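INT8 inference rests on mapping float weights to 8-bit integers with a scale factor. A simplified symmetric per-tensor quantization sketch in NumPy (TensorRT's actual calibration and per-channel scaling are more involved):

```python
import numpy as np

def quantize_int8(w: np.ndarray):
    """Symmetric per-tensor quantization to int8; returns (q, scale)."""
    scale = np.abs(w).max() / 127.0
    q = np.clip(np.round(w / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    return q.astype(np.float32) * scale

rng = np.random.default_rng(1)
w = rng.standard_normal(1000).astype(np.float32)  # stand-in for a weight tensor
q, scale = quantize_int8(w)
err = np.abs(dequantize(q, scale) - w).max()
print(f"max abs error: {err:.4f}")  # rounding error is at most scale/2
```

The payoff is 4x smaller weights than FP32 and faster integer matrix math on supported hardware, at the cost of the bounded rounding error shown above.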