Super Simple ChatUI is a React/TypeScript project that provides a simple, intuitive frontend UI for interacting with a local LLM (Large Language Model) through Ollama. It enables users to chat with their own LLMs locally, ensuring privacy and control over their data.
This project was set up with Vite, which allows for rapid development thanks to features like Hot Module Replacement, TypeScript support, CSS modules, and more.
- Node.js (v18.17.0 or later)
- Ollama
- Clone the repository
> git clone https://github.com/longevity-genie/just-chatui.git
> cd just-chatui
- Install dependencies
> npm install
- Run development server
> npm run dev
- Access the application by visiting the link printed in your terminal (Vite's default is http://localhost:5173)
- Ensure that Ollama is running on your machine and exposes its API at:
http://localhost:11434
- Interact with the LLM: use the super-simple-chatui interface to send queries to Ollama and receive responses.
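Under the hood, sending a query means POSTing to Ollama's `/api/chat` endpoint. The sketch below, assuming a locally running Ollama and using `llama3` purely as a placeholder model name, shows roughly how a message could be sent and a non-streaming reply read back:

```typescript
// Minimal sketch of calling Ollama's chat endpoint (non-streaming).
// Assumes Ollama is running locally; "llama3" is a placeholder model name.
const OLLAMA_URL = "http://localhost:11434";

interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

// Pure helper: build the JSON body for a chat request.
export function buildChatRequest(model: string, messages: ChatMessage[]) {
  return { model, messages, stream: false };
}

export async function sendChat(
  model: string,
  messages: ChatMessage[]
): Promise<string> {
  const res = await fetch(`${OLLAMA_URL}/api/chat`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(buildChatRequest(model, messages)),
  });
  if (!res.ok) throw new Error(`Ollama request failed: ${res.status}`);
  const data = await res.json();
  // With stream: false, the full reply arrives as a single message object.
  return data.message.content;
}
```

Setting `stream: false` keeps the example simple; a real chat UI would typically stream tokens as they arrive.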
- Add support for IndexedDB via Dexie (longer-term storage for conversations, system prompts, various settings, etc.)
- Add support for picking from the models available through Ollama
- Add support for chatting with models via the AI Horde
- Add support for OpenAI's ChatGPT API via API key
- Write tests! Always with the tests.
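For the model-picker item above, Ollama already exposes the locally installed models via its `/api/tags` endpoint. A rough sketch of how that could back a picker (an illustration, not final UI code):

```typescript
// Sketch for the model-picker roadmap item: list locally installed models
// via Ollama's /api/tags endpoint.
const OLLAMA_URL = "http://localhost:11434";

// Only the field we need; the real response carries more metadata per model.
interface TagsResponse {
  models: { name: string }[];
}

// Pure helper: pull just the model names out of a /api/tags response.
export function extractModelNames(tags: TagsResponse): string[] {
  return tags.models.map((m) => m.name);
}

// Fetch the list from a running Ollama instance, e.g. to fill a <select>.
export async function listModels(): Promise<string[]> {
  const res = await fetch(`${OLLAMA_URL}/api/tags`);
  if (!res.ok) throw new Error(`Failed to list models: ${res.status}`);
  return extractModelNames(await res.json());
}
```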
Contributions are welcome! Please follow these steps to contribute:
- Fork the repository.
- Create a new branch (git checkout -b feature-branch).
- Make your changes.
- Commit your changes (git commit -m 'Add new feature').
- Push to the branch (git push origin feature-branch).
- Open a pull request.
This project is licensed under the MIT License. See the LICENSE file for details.
- Chat logo / favicon via Flaticon: Bubble Chat free icon
- Ollama
- Vite