This project is an intelligent routing system for user queries, built with the LangChain framework and Groq's Gemma2-9b-It model. It directs each query to the most relevant data source: arxiv_search (AI research papers), Wikipedia (people and other human-related topics), or the LLM itself (general queries).
- Context-Aware Routing: Routes user queries based on the context to either arxiv, Wikipedia, or an LLM.
- Flexible Query Handling: Supports different types of user queries related to AI research, human information, or general knowledge.
- Powered by LLM: Utilizes Groq's Gemma2-9b-It model for intelligent query understanding and routing.
- LangChain: Used for building the agent that handles the query routing process.
- Groq's Gemma2-9b-It Model: The large language model used to interpret user queries and generate structured output.
- arxiv_search: Searches for AI-related research papers.
- Wikipedia Search: Handles queries about people and other human-related topics.
- LLM Search: A fallback option for general queries that don’t fit into the other categories.
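The routing decision itself can be modeled as structured output that the model must return. Below is a minimal sketch of such a schema using Pydantic; `RouteQuery` and the `llm_search` label are illustrative names and may not match the identifiers used in `graph.py`:

```python
from typing import Literal

from pydantic import BaseModel, Field


class RouteQuery(BaseModel):
    """Structured routing decision the LLM is asked to emit."""

    datasource: Literal["wiki_search", "arxiv_search", "llm_search"] = Field(
        description=(
            "wiki_search for people and human-related topics, "
            "arxiv_search for AI research papers, "
            "llm_search for everything else."
        )
    )


# A valid decision parses cleanly; any label outside the three is rejected.
decision = RouteQuery(datasource="arxiv_search")
print(decision.datasource)  # arxiv_search
```

With LangChain, a schema like this would typically be bound to the model via `llm.with_structured_output(RouteQuery)`, so Gemma2-9b-It returns one of the three labels instead of free text.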
- Clone the repository:

  ```bash
  git clone https://github.com/MadhanMohanReddy2301/SmartChainAgents.git
  ```

- Navigate to the project directory:

  ```bash
  cd SmartChainAgents
  ```

- Install the required dependencies:

  ```bash
  pip install -r requirements.txt
  ```

- Set up your environment variables by adding your Groq API key:

  ```bash
  export GROQ_API_KEY=your_groq_api_key
  ```

- Initialize the LLM router system by running the script:

  ```bash
  python graph.py
  ```
- Test the routing functionality with some example queries:

  ```python
  question_router.invoke({"question": "Who is Shahrukh Khan?"})
  question_router.invoke({"question": "What are the types of agent memory?"})
  ```
The system will route the query to the appropriate source and return the result accordingly.
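The dispatch step described above can be sketched as a plain function that maps the router's `datasource` label to the tool that should run. The function name and the `llm_search` fallback label are assumptions for illustration, not necessarily what `graph.py` uses:

```python
def route_question(datasource: str) -> str:
    """Map the router's structured decision to the tool that should run."""
    routes = {
        "wiki_search": "wiki_search",    # people / human-related topics
        "arxiv_search": "arxiv_search",  # AI research papers
    }
    # Anything the router did not assign to a tool falls back to the LLM.
    return routes.get(datasource, "llm_search")


print(route_question("wiki_search"))  # wiki_search
print(route_question("chitchat"))     # llm_search
```

In a LangGraph-based setup, this is the kind of function you would pass to `add_conditional_edges` to connect the router node to the three destinations.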
```python
# Routing output for a Wikipedia-type query ("Who is Shahrukh Khan?")
{
    "datasource": "wiki_search"
}

# Routing output for an arxiv-type query ("What are the types of agent memory?")
{
    "datasource": "arxiv_search"
}
```