Chat-with-your-pdf

An open-source project leveraging the capabilities of LLMs (Large Language Models) and RAG (Retrieval-Augmented Generation) to answer queries over the PDF data at hand.

Getting Started

These instructions will get you a copy of the project up and running on your local machine for development and testing purposes.

Prerequisites

What you need to install and how to install it:

  • Python 3.x
  • To run with open-source models, Ollama has to be installed and running on your machine. For more details, visit Ollama.
  • To run with OpenAI, your open_ai_key has to be configured in config.json (see the sketch after this list).
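
This README does not spell out the structure of config.json. As a minimal sketch, assuming open_ai_key is a top-level field in that file, the key could be loaded like this (illustrative only, not the project's actual code):

```python
# Illustrative sketch only: assumes config.json stores the key under a
# top-level "open_ai_key" field, e.g. {"open_ai_key": "sk-..."}.
# Only the field name comes from this README; the rest is an assumption.
import json

with open("config.json") as f:
    config = json.load(f)

openai_api_key = config["open_ai_key"]
if not openai_api_key:
    raise ValueError("open_ai_key is missing or empty in config.json")
```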

Installation

  1. Clone the repository:
git clone https://github.com/ragesh2000/Chat-with-your-pdf.git
  2. Navigate into the project directory:
cd Chat-with-your-pdf
  3. Install the required dependencies:
pip install -r requirements.txt

Usage

Run the run.py file with your preferred model as an argument (currently supports openai or ollama). To run with the OpenAI model:

python run.py -model openai

or, to run with an Ollama model:

python run.py -model ollama
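
The internals of run.py are not shown here. As a rough sketch of what the -model switch implies, the script presumably parses the flag and dispatches to an OpenAI-backed or an Ollama-backed answering function; the function names below are hypothetical placeholders, not taken from the repository:

```python
# Rough, illustrative sketch of how run.py's -model flag could be handled.
# Only the flag name (-model) and the two accepted values (openai, ollama)
# come from this README; everything else is a hypothetical stand-in.
import argparse

def answer_with_openai(question: str) -> str:
    raise NotImplementedError("hypothetical placeholder for the OpenAI backend")

def answer_with_ollama(question: str) -> str:
    raise NotImplementedError("hypothetical placeholder for the Ollama backend")

def main() -> None:
    parser = argparse.ArgumentParser(description="Chat with your PDF")
    parser.add_argument("-model", choices=["openai", "ollama"], required=True)
    args = parser.parse_args()

    backend = answer_with_openai if args.model == "openai" else answer_with_ollama
    question = input("Ask a question about your PDF: ")
    print(backend(question))

if __name__ == "__main__":
    main()
```

Whichever backend is chosen, the answer would be grounded in the PDF content via the RAG setup described at the top of this README.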