- Request to the Ollama server
- Display the response chunk by chunk as it streams (see the streaming sketch after this list)
- Inject clipboard contents into the prompt
- Copy the model response to the clipboard
- Exchange follow-up messages for feedback
- Chat interface
- Copy any of the messages
- Default prompts
- Display available prompts
- Display installed Ollama models (see the model-listing sketch after this list)
- Pick a model
- Go back to the prompt/model selection screen
- Calling the app with a keyboard shortcut
- Picking a prompt by pressing its letter key
- Display the trigger keys next to the prompts
- Tray icon
- Closing to tray
- Summoning the window from the tray menu
- Dismiss the app with a shortcut
- Adding prompts
- Removing prompts
- Editing an existing prompt
- Toggling prompts on/off
- Storing prompt library
- Clearing current chat
- Storing previous responses/chats
- Loading a previous chat
- Deleting responses/chats
- Setting closing behavior
- Setting the storage location
- Bypass the delete confirmation dialog
- UI for each setting
- Storing settings
- Customizing the shortcut
- Customizing prompt triggers
- Customizing the default model
- Overriding the default model per prompt
- Custom right-click menu
- Tooltips
- Custom window menu (?)
- More options in system tray
- Toast when copying a message
- Toasts on errors
- Better app icon
- Better font
- Pretty Markdown rendering
- Syntax-highlighted code
- Changing light/dark theme
- Documentation for the backend
- Documentation for the frontend
- Code signing
- Proper README
- Better installer
- Translations
- Linux
- macOS
- Reduced installer size
- In-app updater
- Allow launching at startup/login
- Exporting chats
- Importing chats
- Exporting prompt library
- Importing prompt library
- Tool calling
- Custom window top bar
- More LLM providers
- Changing theme
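
The first two items above boil down to one call: POST the conversation to Ollama and render the reply as it arrives. Below is a minimal sketch of that loop, assuming a local Ollama instance on its default port (11434) and the documented `/api/chat` streaming endpoint; `streamChat` and `onChunk` are illustrative names, not the app's actual code.

```ts
// Sketch only: assumes Ollama's documented /api/chat endpoint on localhost:11434.
type ChatMessage = { role: "system" | "user" | "assistant"; content: string };

async function streamChat(
  model: string,
  messages: ChatMessage[],
  onChunk: (text: string) => void, // e.g. append to the chat view
): Promise<void> {
  const res = await fetch("http://localhost:11434/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model, messages, stream: true }),
  });
  if (!res.ok || !res.body) {
    throw new Error(`Ollama request failed: ${res.status}`);
  }

  const reader = res.body.getReader();
  const decoder = new TextDecoder();
  let buffer = "";

  // With stream: true, Ollama sends newline-delimited JSON objects,
  // each carrying one piece of the assistant message.
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    buffer += decoder.decode(value, { stream: true });
    const lines = buffer.split("\n");
    buffer = lines.pop() ?? ""; // keep any partial line for the next read
    for (const line of lines) {
      if (!line.trim()) continue;
      const chunk = JSON.parse(line);
      if (chunk.message?.content) onChunk(chunk.message.content);
      if (chunk.done) return;
    }
  }
}
```

Because each parsed line carries only a fragment of the response, the UI can append text chunk by chunk instead of waiting for the full answer.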
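
For the model list and picker, a similarly hedged sketch using Ollama's `/api/tags` endpoint, which reports locally installed models; `listModels` is an illustrative name.

```ts
// Sketch only: /api/tags returns { models: [{ name, size, modified_at, ... }] }.
async function listModels(): Promise<string[]> {
  const res = await fetch("http://localhost:11434/api/tags");
  if (!res.ok) throw new Error(`Could not reach Ollama: ${res.status}`);
  const data = await res.json();
  return data.models.map((m: { name: string }) => m.name);
}
```

The returned names can populate the model-selection screen, with the configured default model preselected.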