-
Hi, I'm new to TabbyML. After reading the docs, I have a question about the "repository" context in Tabby. I assume this process is similar to RAG, and that the code AST could be large, taking up a substantial share of the LLM's input token window. If I use a relatively small LLM with a limited context window (e.g. 1024 or 2048 tokens), does Tabby only provide a partial view of the repository context to the LLM during inference?
-
Yes - only relevant context (ranked by a combined score of BM25 and semantic relevance) is inserted into the code completion context. See #2674 for a more complete view.
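To illustrate the idea, here is a minimal, self-contained sketch of hybrid retrieval scoring: lexical BM25 blended with an embedding-based semantic similarity, then used to rank candidate code chunks. This is an assumption-laden toy (the toy corpus, the `alpha` blending weight, and the fake embeddings are all made up for illustration), not Tabby's actual implementation.

```python
import math

# Toy sketch of hybrid retrieval: rank code chunks by a blend of BM25 and
# semantic (cosine) relevance. Weights and data here are illustrative only;
# Tabby's real scoring may differ.

def bm25_score(query_terms, doc_terms, corpus, k1=1.5, b=0.75):
    """Score one tokenized document against a query over a small corpus."""
    avg_len = sum(len(d) for d in corpus) / len(corpus)
    n_docs = len(corpus)
    score = 0.0
    for term in query_terms:
        df = sum(1 for d in corpus if term in d)  # document frequency
        idf = math.log((n_docs - df + 0.5) / (df + 0.5) + 1)
        tf = doc_terms.count(term)
        denom = tf + k1 * (1 - b + b * len(doc_terms) / avg_len)
        score += idf * (tf * (k1 + 1)) / denom
    return score

def cosine(u, v):
    """Cosine similarity as a stand-in for embedding-based relevance."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

def combined_score(bm25, semantic, alpha=0.5):
    # alpha is a hypothetical blending weight, not a Tabby parameter.
    return alpha * bm25 + (1 - alpha) * semantic

# Toy corpus of tokenized "code chunks" with fake 2-d embeddings.
corpus = [
    ["def", "parse", "json", "config"],
    ["class", "http", "client", "request"],
    ["def", "load", "config", "file"],
]
embeddings = [[0.9, 0.1], [0.1, 0.9], [0.8, 0.3]]
query = ["load", "config"]
query_emb = [0.85, 0.2]

ranked = sorted(
    range(len(corpus)),
    key=lambda i: combined_score(
        bm25_score(query, corpus[i], corpus),
        cosine(query_emb, embeddings[i]),
    ),
    reverse=True,
)
print(ranked)  # chunk indices, most relevant first
```

Only the top-ranked chunks (up to the token budget) would then be packed into the completion prompt, which is why a small context window simply receives fewer, but still the most relevant, chunks.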