
839 optionally return chat model in start controller and level controller #858

Conversation

pmarsh-scottlogic
Contributor

@pmarsh-scottlogic pmarsh-scottlogic commented Mar 7, 2024

⚠ NOTE: I've put the wrong ticket number in the branch name.

Description

The final PR for #824!
This one stops the getModel calls. We now return the current configured model in getStart or getLevel (but only for sandbox).
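The controller change described here can be sketched roughly as follows. This is a hypothetical sketch, not the PR's actual code: the function name, the `SANDBOX` constant, and the response shape are all my assumptions based on the description.

```typescript
// Hypothetical sketch: instead of a separate getModel endpoint, the level
// response carries the session's configured chatModel, but only for sandbox.
const SANDBOX = 'SANDBOX';

interface ChatModel {
	id: string;
}

function buildLoadLevelResponse(level: string, sessionChatModel: ChatModel) {
	return {
		level,
		// chatModel is only meaningful on sandbox; undefined on all other levels
		chatModel: level === SANDBOX ? sessionChatModel : undefined,
	};
}
```

This keeps the frontend to the network call it was already making on level change, with the model piggybacking on that response.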

Screenshots

Before

  • Refreshing on sandbox (preview mode, so only single render): [screenshot]

  • Changing level to sandbox (preview mode, so only single render): [screenshot]

After

  • Refreshing on sandbox (preview mode, so only single render): [screenshot]

  • Changing level to sandbox (preview mode, so only single render): [screenshot]

Notes

Here's the whole situation because it's a bit convoluted:

  • In the backend session we store chatModel. This records the id of the model that we want to chat to, as well as the configurable settings (temperature, top p, etc.). But we use the default chatModel rather than this configured version for all levels apart from sandbox. So really this value represents the chatModel used in sandbox (despite the name, and despite it not being part of the levelState). I think this is confusing, but I tried to change it and it snowballed into a bit of a mess, so I'm calling it a problem for another day.
  • Anyhoo, we need to get the configured chatModel whenever we load up or switch to the sandbox. So I've attached a chatModel attribute to the StartResponse and LoadLevelResponse. But because this is only relevant for sandbox, on all other levels this value is undefined.
  • Meanwhile in the frontend, in MainComponent we consume these response objects. We set the chatModel useState value, and then drill that down all the way to the ModelSelection and ModelConfiguration components, which show up in the control panel in sandbox.
  • I've opted to sync the frontend's copy of the chatModel with the backend's one, i.e. it will only store the configured chatModel used in the sandbox.
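The data flow in the notes above suggests response and state shapes along these lines. All names here are assumptions inferred from the description, not verbatim from the PR's code:

```typescript
// Hypothetical sketch of the response shapes: chatModel is optional because
// it is only populated when the current level is sandbox.
interface ChatModel {
	id: string;
	configuration: {
		temperature: number;
		topP: number;
	};
}

interface StartResponse {
	availableModels: string[];
	chatModel?: ChatModel;
}

interface LoadLevelResponse {
	chatModel?: ChatModel;
}

// The frontend reads chatModel off either response and stores it in state;
// on non-sandbox levels this simply stores undefined, keeping the frontend
// copy in sync with the backend's.
function chatModelFrom(
	response: StartResponse | LoadLevelResponse
): ChatModel | undefined {
	return response.chatModel;
}
```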

Concerns

  • I'm sure there are nicer, less confusing ways to organise all the data, but I don't think it's this ticket's job to do that. (read: I tried to reorganise it, but it grew arms and legs, so I'm choosing to revert and then ignore the problem)
  • I wonder if I should rename chatModel across the front and back end to something like sandboxChatModel just to make it really clear what it's there for.

Checklist

Have you done the following?

  • Linked the relevant Issue
  • Added tests
  • Ensured the workflow steps are passing

@pmarsh-scottlogic pmarsh-scottlogic linked an issue Mar 7, 2024 that may be closed by this pull request
@pmarsh-scottlogic pmarsh-scottlogic marked this pull request as ready for review March 7, 2024 10:53
Member

@chriswilty chriswilty left a comment


A few thoughts at this stage, but looks sound 👍

backend/src/models/api/LevelGetRequest.ts Outdated Show resolved Hide resolved
frontend/src/components/MainComponent/MainComponent.tsx Outdated Show resolved Hide resolved
function setChatModelId(modelId: string) {
	if (!chatModel) {
		console.error(
			'You are trying to change the id of the chatModel but it has not been loaded yet'
		);
		return;
	}
	// ...
}
Member

@chriswilty chriswilty Mar 11, 2024


Not convinced this error message will help a future developer work out what's going on here. Can you come up with something indicating we only expect chat model to be changed on Sandbox level?

Or if this genuinely can happen, then I don't think we want to be rendering the dropdown model selector until we have the model response, but I was under the impression that chatModel and "available models" now arrive in the same response.

Contributor Author


"available models" arrives under the startResponse, along with chatModel. So if the game is loaded up or refreshed on sandbox, then the available models and chatModel arrive at the same time. However if the user switches from any level to sandbox, then there will be some amount of time (usually negligible) where the chat model dropdown and choose button are rendered, but the chatModel is not yet loaded.

Contributor Author


Okay, I'm now conditionally rendering the model selection bits depending on chatModel being loaded. Wondering whether, if we're doing this, we should also not render the model configuration sliders...
[screenshot]
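The guard being discussed here can be sketched as below. This is a simplified, non-JSX sketch: the helper name is my own, and the returned component names are taken from the components mentioned in the PR description.

```typescript
// Hypothetical sketch: render neither the model selector nor the
// configuration sliders until chatModel has arrived from the backend.
interface ChatModel {
	id: string;
}

function controlPanelComponents(chatModel: ChatModel | undefined): string[] {
	// When switching from another level into sandbox, chatModel is briefly
	// undefined until the LoadLevelResponse arrives.
	if (!chatModel) return [];
	return ['ModelSelection', 'ModelConfiguration'];
}
```

In the real component this would presumably be the usual `{chatModel && <ModelSelection ... />}` conditional-render pattern.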

Member


> "available models" arrives under the startResponse, along with chatModel. So if the game is loaded up or refreshed on sandbox, then the available models and chatModel arrive at the same time. However if the user switches from any level to sandbox, then there will be some amount of time (usually negligible) where the chat model dropdown and choose button are rendered, but the chatModel is not yet loaded.

Right, that makes sense now 😅

I'd say if we're going down this route, then we should conditionally render the sliders as you suggest. Can you see how that looks - in the code I mean, check it's not too gnarly?

Member


As discussed in the standup, we'll leave this for a future issue, as it has some (re)design implications.

frontend/src/components/ModelBox/ModelConfiguration.tsx Outdated Show resolved Hide resolved
frontend/src/components/ModelBox/ModelConfiguration.tsx Outdated Show resolved Hide resolved
frontend/src/components/ModelBox/ModelConfiguration.tsx Outdated Show resolved Hide resolved
Member

@chriswilty chriswilty left a comment


Lovely 🏆

@pmarsh-scottlogic pmarsh-scottlogic merged commit 5d7c903 into dev Mar 14, 2024
4 checks passed
@pmarsh-scottlogic pmarsh-scottlogic deleted the 839-optionally-return-chatModel-in-startController-and-levelController branch March 14, 2024 16:30
chriswilty pushed a commit that referenced this pull request Apr 8, 2024
…ller (#858)

* return chatModel in level and start controllers

* removes getModel endpoint

* frontend start and level services now get the chatModel from the response

* save chatModel in MainComponent

* chatModel is drilled down to the components that need it

* now sets the chatModelId so that the message updates

* change a null to an undefined for brevity

* remove old comment

* remove getModel method from chatService

* changes chatModel in LevelGetRequest to be optional

* conditionally render the chat model selection

* move default configs out of function

* wraps comment line

* improve config update method
Successfully merging this pull request may close these issues.

Streamline frontend change level network calls 🛠🛠🛠