How can we create RAG dynamically? #429
Comments
Hi, are you using it with OpenAI? What is the indexing cost for this?
I have the same requirement. I want to use this for coding, where every time I make a small change I have to reindex the whole codebase, which is very slow. If you find anything, please let me know.
Sorry, I use Ollama with llama3-8b. But some YouTubers said the OpenAI API will cost a lot.
If you just add a new file to the …
If anyone has it running on Ollama, please let me know how to set it up.
If you know how to dynamically add text without regenerating the full index, please let me know. Thank you.
If you add new content to the input and do not change your indexing parameters, you should get the benefit of the cache on many of the requests. Changing parameters changes the cache key, though, which forces a complete reindex. Some steps in the pipeline will still re-run whenever you add content, because their inputs change.
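To illustrate the idea, here is a minimal sketch (not the project's actual cache implementation) of a cache key derived from the indexing parameters plus each chunk's content: previously seen chunks are served from disk and only new chunks trigger fresh embedding calls, while any parameter change invalidates every key. The `CACHE_DIR` location and the `embed_fn` callback are assumptions for the example.

```python
import hashlib
import json
import os

CACHE_DIR = ".rag_cache"  # hypothetical on-disk cache location

def cache_key(chunk_text: str, index_params: dict) -> str:
    """Key combines the chunk content and the indexing parameters.

    Changing any parameter changes every key, which is why a parameter
    change effectively forces a full reindex."""
    payload = json.dumps({"params": index_params, "text": chunk_text},
                         sort_keys=True).encode("utf-8")
    return hashlib.sha256(payload).hexdigest()

def embed_chunk_cached(chunk_text: str, index_params: dict, embed_fn):
    """Return a cached embedding if available, otherwise compute and store it.

    embed_fn is any callable that maps text to a JSON-serializable vector."""
    os.makedirs(CACHE_DIR, exist_ok=True)
    path = os.path.join(CACHE_DIR, cache_key(chunk_text, index_params) + ".json")
    if os.path.exists(path):                      # cache hit: no model call
        with open(path) as f:
            return json.load(f)
    vector = embed_fn(chunk_text)                 # cache miss: call the model
    with open(path, "w") as f:
        json.dump(vector, f)
    return vector
```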
Has anyone achieved this, or can anyone suggest an approach? We want to continuously index the updated code base into the HippoRAG index, query the updated index, make code changes, and repeat. Note that we want to do this offline with Ollama-type models only; we don't want to use OpenAI or Claude. If anyone can suggest how to do this, please let me know.
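One possible starting point, sketched below under stated assumptions: track a content hash per file and re-embed only new or modified files through a local Ollama server, leaving unchanged files alone. This is not HippoRAG's or this project's own update API; the embedding model name, the manifest file, and the in-memory `vectors` dict (standing in for whatever vector store you use) are all assumptions for the example. It calls Ollama's `/api/embeddings` REST endpoint.

```python
import hashlib
import json
import pathlib
import requests  # assumes a local Ollama server on the default port

OLLAMA_URL = "http://localhost:11434/api/embeddings"
EMBED_MODEL = "nomic-embed-text"                 # assumption: any Ollama embedding model
MANIFEST = pathlib.Path("index_manifest.json")   # hypothetical bookkeeping file

def embed(text: str) -> list[float]:
    """Get an embedding for one piece of text from the local Ollama server."""
    resp = requests.post(OLLAMA_URL, json={"model": EMBED_MODEL, "prompt": text})
    resp.raise_for_status()
    return resp.json()["embedding"]

def update_index(code_root: str, vectors: dict) -> None:
    """Re-embed only files whose content hash changed since the last run."""
    manifest = json.loads(MANIFEST.read_text()) if MANIFEST.exists() else {}
    for path in pathlib.Path(code_root).rglob("*.py"):
        text = path.read_text(errors="ignore")
        digest = hashlib.sha256(text.encode("utf-8")).hexdigest()
        if manifest.get(str(path)) == digest:
            continue                      # unchanged file: skip the model call
        vectors[str(path)] = embed(text)  # new or modified file: re-embed
        manifest[str(path)] = digest
    MANIFEST.write_text(json.dumps(manifest, indent=2))
```

The same hashing idea extends to chunk level if whole-file granularity is too coarse.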
Consolidating index update requests with #741.
I am currently facing a challenge with adding new text to RAG. My current approach involves reinitializing the entire RAG, which is slow.
Please provide guidance or suggestions on efficiently adding new text to RAG without requiring a complete reinitialization.