
flux-fine-tuner

This is a Cog training model that creates LoRA-based fine-tunes for the FLUX.1 family of image generation models.

It's live at replicate.com/ostris/flux-dev-lora-trainer.

It also includes code for running inference with a fine-tuned model.
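To start a training programmatically, the Replicate Python client can be used. The snippet below is a minimal sketch, not verified against the trainer's current schema: the version id is a placeholder, and input field names such as input_images, trigger_word, and steps are assumptions; check the trainer's page on Replicate for the real inputs.

```python
import replicate

# Minimal sketch of launching a fine-tune via the Replicate API.
# The version id is a placeholder and the input field names are assumptions;
# consult the trainer's schema on Replicate before running this.
training = replicate.trainings.create(
    version="ostris/flux-dev-lora-trainer:<version-id>",
    input={
        "input_images": "https://example.com/training-images.zip",  # zip of training images
        "trigger_word": "TOK",  # token that will invoke the concept at inference time
        "steps": 1000,          # number of training steps
    },
    # The destination model must already exist on your Replicate account.
    destination="your-username/your-flux-fine-tune",
)
print(training.status)
```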

Features

  • Automatic image captioning during training
  • Image generation using the fine-tuned LoRA (inference; see the sketch after this list)
  • Optionally uploads fine-tuned weights to Hugging Face after training
  • Automated test suite with cog-safe-push for continuous deployment
  • Weights & Biases integration
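Once a training has finished, the resulting fine-tune can be run like any other Replicate model. The sketch below assumes the weights were pushed to a model on your own account and that it exposes a prompt input; the model name and version id are placeholders.

```python
import replicate

# Minimal inference sketch: generate an image with the fine-tuned model.
# The model name and version id are placeholders for your own fine-tune.
output = replicate.run(
    "your-username/your-flux-fine-tune:<version-id>",
    input={"prompt": "a photo of TOK riding a bicycle"},
)
for image in output:
    print(image)  # each item is a URL (or file object) for a generated image
```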

Getting Started

If you're looking to create your own fine-tuned model on Replicate, you don't need to do anything with this codebase.

Check out these guides to get started:

👉 Fine-tune Flux to create images of yourself

👉 Fine-tune Flux with an API

Contributing

If you're here to help improve the trainer that Replicate uses to fine-tune Flux models, you've come to the right place.

Check out the contributing guide to get started.

Credits

This project is based on the ai-toolkit project, which was created by @ostris. ❤️

License

The code in this repository is licensed under the Apache-2.0 License.

The ai-toolkit project is licensed under the MIT License.

Flux Dev falls under the FLUX.1 [dev] Non-Commercial License.

FLUX.1 [dev] fine-tuned weights and their outputs are non-commercial by default, but can be used commercially when run on Replicate.

Flux Schnell falls under the Apache-2.0 License.
