πŸ¦› Chonkie ✨


The no-nonsense RAG chunking library that's lightweight, lightning-fast, and ready to CHONK your texts

Installation β€’ Usage β€’ Supported Methods β€’ Benchmarks β€’ Documentation β€’ Citation

so i found myself making another RAG bot (for the 2342148th time) while explaining to my juniors why we should use chunking in our RAG bots, only to realise that i would have to write the chunking code all over again unless i used the bloated software library X or the extremely feature-less library Y. WHY CAN I NOT HAVE SOMETHING JUST RIGHT, UGH?

Can't i just install, import and run chunking and not have to worry about dependencies, bloat, speed or other factors?

Well, with chonkie you can! (chonkie boi is a gud boi)

πŸš€ Feature-rich: All the CHONKs you'd ever need
✨ Easy to use: Install, Import, CHONK
⚑ Fast: CHONK at the speed of light! zooooom
🌐 Wide support: Supports all your favorite tokenizer CHONKS
πŸͺΆ Light-weight: No bloat, just CHONK
πŸ¦› Cute CHONK mascot: psst it's a pygmy hippo btw
❀️ Moto Moto's favorite python library

What're you waiting for, just CHONK it!

Installation

To install chonkie, simply run:

pip install chonkie

Chonkie keeps the default install minimal. Read the DOCS to find the install needed for your chosen chunker, or simply install everything if you don't want to think about it (not recommended):

pip install chonkie[all]
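For example, if you only want the semantic chunkers, an extras install along these lines should pull in just their embedding dependencies (the semantic extra name is taken from the DOCS at the time of writing, so double-check it for your version):

pip install chonkie[semantic]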

Usage

Here's a basic example to get you started:

# First import the chunker you want from Chonkie 
from chonkie import TokenChunker

# Import your favorite tokenizer library
# Also supports AutoTokenizers, TikToken and AutoTikTokenizer
from tokenizers import Tokenizer 
tokenizer = Tokenizer.from_pretrained("gpt2")

# Initialize the chunker
chunker = TokenChunker(tokenizer)

# Chunk some text
chunks = chunker("Woah! Chonkie, the chunking library is so cool! I love the tiny hippo hehe.")

# Access chunks
for chunk in chunks:
    print(f"Chunk: {chunk.text}")
    print(f"Tokens: {chunk.token_count}")

More usage examples are given in the DOCS.

Supported Methods

Chonkie provides several chunkers to help you split your text efficiently for RAG applications. Here's a quick overview of the available chunkers:

  • TokenChunker: Splits text into fixed-size token chunks.
  • WordChunker: Splits text into chunks based on words.
  • SentenceChunker: Splits text into chunks based on sentences.
  • SemanticChunker: Splits text into chunks based on semantic similarity.
  • SDPMChunker: Splits text using a Semantic Double-Pass Merge approach.
  • LateChunker (experimental): Embeds text and then splits it to have better chunk embeddings.

More on these methods and the approaches behind them can be found in the DOCS; a quick sketch of swapping chunkers is given below.
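
As an illustration, here is how the SentenceChunker might be used in place of the TokenChunker from the Usage example. The chunk_size and chunk_overlap parameters are assumptions about the constructor, so consult the DOCS for the exact signature of each chunker.

# Swap in a sentence-based chunker (parameter names assumed, see the DOCS)
from chonkie import SentenceChunker

sentence_chunker = SentenceChunker(
    tokenizer,          # same tokenizer as in the Usage example above
    chunk_size=512,     # rough token budget per chunk (assumed parameter)
    chunk_overlap=128,  # tokens shared between neighbouring chunks (assumed parameter)
)

chunks = sentence_chunker("Chonkie chunks by sentence here. Each sentence stays intact. The hippo approves.")
for chunk in chunks:
    print(f"Chunk: {chunk.text} ({chunk.token_count} tokens)")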

Benchmarks πŸƒβ€β™‚οΈ

"I may be smol hippo, but I pack a punch!" πŸ¦›

Here's a quick peek at how Chonkie performs:

SizeπŸ“¦

  • Default Install: 9.7MB (vs 80-171MB for alternatives)
  • With Semantic: Still lighter than the competition!

Speed⚑

  • Token Chunking: 33x faster than the slowest alternative
  • Sentence Chunking: Almost 2x faster than competitors
  • Semantic Chunking: Up to 2.5x faster than others

Check out our detailed benchmarks to see how Chonkie races past the competition! πŸƒβ€β™‚οΈπŸ’¨

Acknowledgements

Chonkie would like to CHONK its way through a special thanks to all the users and contributors who have helped make this library what it is today! Your feedback, issue reports, and improvements have helped make Chonkie the CHONKIEST it can be.

And of course, special thanks to Moto Moto for endorsing Chonkie with his famous quote:

"I like them big, I like them chonkie." ~ Moto Moto

Citation

If you use Chonkie in your research, please cite it as follows:

@misc{chonkie2024,
  author = {Minhas, Bhavnick},
  title = {Chonkie: A Fast Feature-full Chunking Library for RAG Bots},
  year = {2024},
  publisher = {GitHub},
  journal = {GitHub repository},
  howpublished = {\url{https://github.com/bhavnick/chonkie}},
}