Fine-tuning 6-Billion GPT-J (& other models) with LoRA and 8-bit compression
Topics: python, nlp, text-generation, transformer, colab, lora, 8bit, fine-tuning, colaboratory, colab-notebook, gpt-neo, gpt-j, gpt-j-6b, gptj, french-gpt-j
Updated Oct 5, 2022 - Jupyter Notebook
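For orientation, here is a minimal sketch of the general recipe the title describes: loading GPT-J-6B with 8-bit quantized weights and attaching LoRA adapters so that only a small set of low-rank matrices is trained. This is not code from the repository itself; it assumes the Hugging Face `transformers`, `peft`, and `bitsandbytes` libraries, and the checkpoint name, LoRA hyperparameters, and target module names are illustrative choices.

```python
# Sketch only: LoRA + 8-bit fine-tuning setup for GPT-J-6B.
# Assumes transformers, peft, accelerate, and bitsandbytes are installed.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, get_peft_model, prepare_model_for_kbit_training

model_name = "EleutherAI/gpt-j-6B"  # assumed checkpoint; other causal LMs work similarly
tokenizer = AutoTokenizer.from_pretrained(model_name)

# Load the 6B-parameter model with 8-bit weights so it fits on a single Colab-class GPU.
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    load_in_8bit=True,
    device_map="auto",
)
# Prepare the quantized model for training (casts layer norms, enables gradient flow).
model = prepare_model_for_kbit_training(model)

# Attach small trainable LoRA matrices; the 8-bit base weights stay frozen.
lora_config = LoraConfig(
    r=8,                                  # rank of the low-rank update (assumed value)
    lora_alpha=32,
    target_modules=["q_proj", "v_proj"],  # GPT-J attention projections
    lora_dropout=0.05,
    bias="none",
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # only the LoRA parameters are updated during fine-tuning
```

The resulting `model` can then be passed to a standard `transformers.Trainer` loop; the point of the setup is that the frozen 8-bit weights keep memory low while the LoRA adapters carry all of the gradient updates.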