Replies: 2 comments
-
Added differently. Will be on main shortly.
-
This is published in v0.3.0-rc.0. Each chat model now implements a serialization function. The purpose is to allow a simple, portable config to be used to define a chat model and be stored and recovered later.
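A minimal sketch of how that might be used; the exact function names (`serialize_config/1`, `restore_from_map/1`) are assumptions based on this description, so check the v0.3.0-rc.0 docs for the released API:

```elixir
alias LangChain.ChatModels.{ChatModel, ChatOpenAI}

# Build a chat model with some config.
model = ChatOpenAI.new!(%{model: "gpt-4", temperature: 0.7})

# Serialize to a plain, portable map (assumed function name) that can be
# stored, e.g. as JSON on a conversation record.
config = ChatModel.serialize_config(model)

# Later, rebuild the chat model struct from the stored map (assumed name).
{:ok, restored} = ChatModel.restore_from_map(config)
```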
-
I'd like a simple way of indicating which LLM service and model a conversation should use. This also makes it easy to change the model when testing or when working with multiple models.
For instance, when a system works with multiple model types like ChatGPT, Anthropic, and Bumblebee-hosted models, I would like to convey the selected service, the model, and possibly some config, all with a single, simple setting.
In an application, I may have a "conversation". The conversation could have a "model" field that is a string. I imagine the UI might be a select input where the possible values could be:
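For example (these particular strings are just illustrative, not values defined by the library):

```
openai:gpt-4
openai:gpt-3.5-turbo
anthropic:claude-3-haiku
bumblebee:llama-2-7b
```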
We could potentially allow additional settings like this:
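For instance, a hypothetical value with a JSON config appended as a third segment:

```
anthropic:claude-3-haiku:{"temperature":0.7,"max_tokens":512}
```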
Using simple string splitting or binary pattern matching, getting the data out is easy. The 3rd position could be a JSON string of model config options that may be specific to that model.
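A quick sketch of that parsing in Elixir (the `ModelSetting.parse/1` helper and the use of the Jason library are my own illustrations, not part of the library):

```elixir
defmodule ModelSetting do
  # Parse a "service:model" or "service:model:json_config" string.
  # `parts: 3` caps the split at three pieces, so colons inside the
  # JSON segment are left intact.
  def parse(setting) when is_binary(setting) do
    case String.split(setting, ":", parts: 3) do
      [service, model] ->
        {:ok, service, model, %{}}

      [service, model, json_config] ->
        with {:ok, config} <- Jason.decode(json_config) do
          {:ok, service, model, config}
        end
    end
  end
end

# ModelSetting.parse("openai:gpt-4")
# #=> {:ok, "openai", "gpt-4", %{}}
# ModelSetting.parse(~s|anthropic:claude-3-haiku:{"temperature":0.7}|)
# #=> {:ok, "anthropic", "claude-3-haiku", %{"temperature" => 0.7}}
```

With something like this, the conversation's model selection stays a single string column in the database while still carrying per-model config.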
Thoughts? Opinions?