Locally run your own LLM - easy, simple, lightweight

chanulee/coreOllama


coreOllama

The easiest-to-use, simplest, and most lightweight interface for running your local LLM, for everyone.

Ollama Web Interface

Features

  • Generate
  • Model management
    • View and delete models
    • Pull new models
  • Local server status
  • Dark mode
  • Include context: full or selection
  • Clear chat history
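The model-management and status features above map onto the Ollama server's REST API at localhost:11434. A minimal sketch in curl, assuming the public Ollama API endpoints (`/api/tags`, `/api/pull`, `/api/delete`); the model name in the comment is only an example:

```shell
# Hedged sketch of the API calls a web interface like this would issue.
# Endpoint paths are from the public Ollama REST API.
OLLAMA=http://localhost:11434

list_models()  { curl -s "$OLLAMA/api/tags"; }                               # view installed models
pull_model()   { curl -s "$OLLAMA/api/pull" -d "{\"name\":\"$1\"}"; }        # pull a new model, e.g. "llama3.2"
delete_model() { curl -s -X DELETE "$OLLAMA/api/delete" -d "{\"name\":\"$1\"}"; }
check_status() { curl -s "$OLLAMA/"; }                                       # responds "Ollama is running"

# Only talk to the server when it is actually reachable:
if curl -fs "$OLLAMA/" >/dev/null 2>&1; then
  list_models
else
  echo "Ollama server not reachable"
fi
```

The GUI issues the same requests from the browser with `fetch`, which is why the local server has to be running before the page is opened.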

Versions

  • 0-basic: basic proof of concept of ollama-gui
  • chat: the main project

Advanced Apps

Beginner's guide

  1. Ollama setup: install the Ollama app for Mac. (You can download a model now, or just proceed and use the GUI.)
  2. Quit the app (check your status bar).
  3. Open a terminal and run ollama serve. Keep that terminal window open.
  4. Open http://localhost:11434/; it should say "Ollama is running".
  5. Download the repo and open web/chat/index.html in your browser.
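Steps 3 to 5 above can be sketched as a single terminal session (macOS `open` command; this assumes the Ollama app is already installed and quit):

```shell
# Sketch of steps 3-5 from the guide above; assumes Ollama is installed.
if command -v ollama >/dev/null 2>&1; then
  ollama serve &                     # step 3: start the local server in the background
  sleep 2                            # give it a moment to come up
  curl -s http://localhost:11434/    # step 4: should print "Ollama is running"
  open web/chat/index.html           # step 5: open the interface (macOS)
else
  echo "ollama is not installed"
fi
```

Keeping the `ollama serve` process alive is the important part: the page itself is static and only works while the server is answering on port 11434.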
