A chat interface with local LLMs using Ollama.
OllamaChat is a chat interface designed to work with local large language models (LLMs) using Ollama. This project aims to provide a robust chat application backed by local LLMs, keeping conversations private and fast.
- Local LLM support using Ollama
- Easy setup and deployment with Django
- Secure and private communication
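Under the hood, a chat app like this talks to Ollama's local HTTP API, which listens on `http://localhost:11434` by default. The project's actual client code may differ; as a rough sketch, a minimal Python helper for Ollama's `/api/chat` endpoint could look like this (the function names here are illustrative, not taken from the repository):

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/chat"  # Ollama's default local endpoint


def build_chat_payload(model, messages):
    """Build the JSON body for Ollama's /api/chat endpoint.

    `messages` is a list of {"role": ..., "content": ...} dicts,
    mirroring the message format Ollama expects.
    """
    return {"model": model, "messages": messages, "stream": False}


def chat(model, messages, url=OLLAMA_URL, timeout=60):
    """Send a chat request to a locally running Ollama server."""
    body = json.dumps(build_chat_payload(model, messages)).encode("utf-8")
    req = urllib.request.Request(
        url, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req, timeout=timeout) as resp:
        return json.loads(resp.read())["message"]["content"]
```

For example, with `ollama serve` running and a model pulled, `chat("llama3", [{"role": "user", "content": "Hello!"}])` would return the model's reply as a string.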
Before you begin, ensure you have met the following requirements:
- Python 3.x
- Django
- Ollama
1. Clone the Repository

   ```bash
   git clone https://github.com/traromal/OllamaChat.git
   cd OllamaChat
   ```

2. Set Up a Virtual Environment

   ```bash
   python3 -m venv venv
   source venv/bin/activate
   ```

3. Install Django and Other Dependencies

   ```bash
   pip install django
   ```

4. Configure Django

   Navigate to the OllamaChat directory and apply the initial migrations:

   ```bash
   python manage.py migrate
   ```

5. Run the Development Server

   ```bash
   python manage.py runserver
   ```
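Once the Django server is up, the app still needs a running Ollama instance to answer chats. As a hedged sketch (the helper name is illustrative), you can check from Python whether Ollama is reachable on its default port, 11434; a running Ollama server answers a plain GET on its root URL:

```python
import urllib.error
import urllib.request


def ollama_is_running(base_url="http://localhost:11434", timeout=2):
    """Return True if an Ollama server responds at base_url.

    A running Ollama server answers a bare GET on its root URL
    with an HTTP 200 and a plain-text status banner.
    """
    try:
        with urllib.request.urlopen(base_url, timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False
```

If this returns `False`, start the server with `ollama serve` and pull a model (e.g. `ollama pull llama3`) before opening the chat interface.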