The simplest way to train and run adapters on top of foundation models
Manifesto | Docs | Guides | Discussions | Discord
- Added ELLA for better prompt handling (contributed by @ily-R)
- Added the Box Segmenter all-in-one solution (model, HF Space)
- Added MVANet for high resolution segmentation
- Added IC-Light to manipulate the illumination of images
- Added Multi Upscaler for high-resolution image generation, inspired by Clarity Upscaler (HF Space)
- Added HQ-SAM for high quality mask prediction with Segment Anything
- ...see past releases
The current recommended way to install Refiners is from source using Rye:
```bash
git clone "git@github.com:finegrain-ai/refiners.git"
cd refiners
rye sync --all-features
```
Refiners comes with a MkDocs-based documentation website available at https://refine.rs. There you will find a quick start guide, a description of the key concepts, and in-depth foundation model adaptation guides.
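To give a flavor of what "adapters on top of foundation models" means, here is a minimal, dependency-free sketch of the idea behind LoRA-style adaptation: the pretrained weight stays frozen and a small low-rank residual is learned on top of it. The function names (`lora_update`, `matmul`) and the tiny matrices are illustrative assumptions, not the Refiners API; see the documentation for the real thing.

```python
def matmul(a, b):
    """Multiply two matrices given as lists of lists."""
    return [[sum(a[i][k] * b[k][j] for k in range(len(b)))
             for j in range(len(b[0]))] for i in range(len(a))]

def lora_update(W, A, B, scale=1.0):
    """Return W + scale * (B @ A): a low-rank additive update.

    W: (d_out x d_in) frozen pretrained weight
    B: (d_out x r) and A: (r x d_in) are the small trainable factors.
    """
    delta = matmul(B, A)
    return [[W[i][j] + scale * delta[i][j] for j in range(len(W[0]))]
            for i in range(len(W))]

# Tiny example: d_out = d_in = 2, rank r = 1
W = [[1.0, 0.0], [0.0, 1.0]]  # frozen pretrained weight
B = [[1.0], [2.0]]            # d_out x r
A = [[3.0, 4.0]]              # r x d_in
print(lora_update(W, A, B, scale=0.5))  # [[2.5, 2.0], [3.0, 5.0]]
```

Because the update is purely additive, it can be merged into the base weight for inference or kept separate and swapped at runtime, which is what makes training and shipping adapters cheap.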
- Finegrain Editor: use state-of-the-art visual AI skills to edit product photos
- Visoid: AI-powered architectural visualization
- brycedrennan/imaginAIry: Pythonic AI generation of images and videos
- chloedia/layerdiffuse: an implementation of LayerDiffuse (foreground generation only)
- Pinokio: a browser for running AI apps locally (see Clarity Refiners UI and announcement)
If you're interested in understanding the diversity of use cases for foundation model adaptation (potentially beyond the specific adapters supported by Refiners), we suggest you take a look at these outstanding papers:
- ControlNet
- T2I-Adapter
- IP-Adapter
- Medical SAM Adapter
- 3DSAM-adapter
- SAM-adapter
- Cross Modality Attention Adapter
- UniAdapter
We took inspiration from these great projects:
- tinygrad - For something between PyTorch and karpathy/micrograd
- Composer - A PyTorch Library for Efficient Neural Network Training
- Keras - Deep Learning for humans
```bibtex
@misc{the-finegrain-team-2023-refiners,
  author = {Benjamin Trom and Pierre Chapuis and Cédric Deltheil},
  title = {Refiners: The simplest way to train and run adapters on top of foundation models},
  year = {2023},
  publisher = {GitHub},
  journal = {GitHub repository},
  howpublished = {\url{https://github.com/finegrain-ai/refiners}}
}
```