An open-source project leveraging the capabilities of Large Language Models (LLMs) and Retrieval Augmented Generation (RAG) to answer queries based on the PDF data at hand.
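The RAG step works by splitting the PDF's extracted text into chunks, retrieving the chunks most relevant to the query, and passing them to the LLM as context. A minimal sketch of that retrieval step, with toy chunking and a simple word-overlap score (illustrative only, not this project's actual implementation):

```python
def chunk_text(text, size=200):
    """Split extracted PDF text into fixed-size word chunks."""
    words = text.split()
    return [" ".join(words[i:i + size]) for i in range(0, len(words), size)]

def retrieve(chunks, query, top_k=2):
    """Rank chunks by word overlap with the query (a toy stand-in for
    the embedding-based similarity a real RAG pipeline would use)."""
    query_words = set(query.lower().split())
    scored = sorted(chunks,
                    key=lambda c: len(query_words & set(c.lower().split())),
                    reverse=True)
    return scored[:top_k]

# The top chunks would then be placed into the LLM prompt as context.
```

In practice the project would use embeddings and a vector store rather than word overlap, but the flow — chunk, retrieve, prompt — is the same.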
These instructions will get you a copy of the project up and running on your local machine for development and testing purposes.
What things you need to install the software and how to install them:
- Python 3.x
- For running with open-source models, Ollama has to be installed and running on your machine. For more details, visit: Ollama
- For running with OpenAI models, your open_ai_key has to be configured in config.json
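A plausible shape for that config file, assuming open_ai_key (the name given above) is a top-level field — check config.json in the repository for the exact schema:

```json
{
  "open_ai_key": "YOUR_OPENAI_API_KEY"
}
```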
- Clone the repository:
git clone https://github.com/ragesh2000/Chat-with-your-pdf.git
- Navigate into the project directory:
cd Chat-with-your-pdf
- Install the required dependencies:
pip install -r requirements.txt
Run the run.py file with your preferred model as an argument (currently supports openai or ollama models):
For running with the openai model:
python run.py -model openai
Or for running with the ollama model:
python run.py -model ollama
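The -model flag above can be handled with argparse. A minimal sketch, assuming the choices and default shown here (the flag name comes from the usage above; everything else is an assumption, so check run.py for the actual behavior):

```python
import argparse

def build_parser():
    """Parser for the -model flag shown in the usage above."""
    parser = argparse.ArgumentParser(description="Chat with your PDF")
    parser.add_argument(
        "-model",
        choices=["openai", "ollama"],  # the two backends the README names
        default="ollama",              # assumed default; verify in run.py
        help="which model backend to use",
    )
    return parser

if __name__ == "__main__":
    args = build_parser().parse_args()
    print(f"Selected backend: {args.model}")
```

argparse accepts single-dash long options like -model, which matches the commands shown above.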