828 streamline chat model configuration info message network call #875

Merged
Commits (25)
0ebbc2e
convert model_config_ids away from enum
pmarsh-scottlogic Mar 22, 2024
978bd40
rejig the chat models folder and use type instead of interface
pmarsh-scottlogic Mar 22, 2024
104f8dc
convert Model Config ID away from enum in frontend
pmarsh-scottlogic Mar 22, 2024
bc49402
rename to MODEL_CONFIG_ID
pmarsh-scottlogic Mar 22, 2024
f9382ce
adds detailed backend validation on handleConfigureModel
pmarsh-scottlogic Mar 22, 2024
644cd38
adds first modelController unit test
pmarsh-scottlogic Mar 22, 2024
b6ed9be
clarifies error response message
pmarsh-scottlogic Mar 22, 2024
f5975b5
fills in unhappy path tests
pmarsh-scottlogic Mar 22, 2024
47097a0
deletes repeated unit test
pmarsh-scottlogic Mar 22, 2024
309efe0
now returns resulting info message
pmarsh-scottlogic Mar 22, 2024
b4f9885
update modelController test according to its new function
pmarsh-scottlogic Mar 22, 2024
b466448
chatService now extracts the resulting chat info message and returns it
pmarsh-scottlogic Mar 22, 2024
2fa7519
makes frontend use the resultingChatInfoMessage
pmarsh-scottlogic Mar 22, 2024
2ac42df
renames resultingChatMessage to chatInfoMessage
pmarsh-scottlogic Mar 25, 2024
0de0076
move maxValue calculation below validation for configId
pmarsh-scottlogic Mar 25, 2024
7839a32
defines chatModelConfigurations according to existing type
pmarsh-scottlogic Mar 25, 2024
e907458
check response status before trying to convert json
pmarsh-scottlogic Mar 25, 2024
b782656
merge dev
pmarsh-scottlogic Mar 25, 2024
b245977
removes redundant else statement
pmarsh-scottlogic Mar 26, 2024
1a4431c
merge dev
pmarsh-scottlogic Mar 26, 2024
a70205b
combine two identical response types to become chatInfoMessageResponse
pmarsh-scottlogic Mar 26, 2024
c51a48d
fixes type export
pmarsh-scottlogic Mar 27, 2024
ebac4a0
Merge branch 'dev' into 828-streamline-chat-model-configuration-info-…
pmarsh-scottlogic Mar 27, 2024
3961cd5
merge dev
pmarsh-scottlogic Mar 28, 2024
cff8d7d
fix import
pmarsh-scottlogic Mar 28, 2024
10 changes: 5 additions & 5 deletions backend/src/controller/modelController.ts
@@ -27,8 +27,6 @@ function handleConfigureModel(req: OpenAiConfigureModelRequest, res: Response) {
const configId = req.body.configId as MODEL_CONFIG_ID | undefined;
const value = req.body.value;

const maxValue = configId === 'topP' ? 1 : 2;

if (!configId) {
sendErrorResponse(res, 400, 'Missing configId');
return;
@@ -44,6 +42,8 @@ function handleConfigureModel(req: OpenAiConfigureModelRequest, res: Response) {
return;
}

const maxValue = configId === 'topP' ? 1 : 2;

if (value < 0 || value > maxValue) {
sendErrorResponse(
res,
@@ -55,17 +55,17 @@ function handleConfigureModel(req: OpenAiConfigureModelRequest, res: Response) {

req.session.chatModel.configuration[configId] = value;

const resultingChatInfoMessage = {
const chatInfoMessage = {
infoMessage: `changed ${configId} to ${value}`,
chatMessageType: 'GENERIC_INFO',
} as ChatInfoMessage;
req.session.levelState[LEVEL_NAMES.SANDBOX].chatHistory =
pushMessageToHistory(
req.session.levelState[LEVEL_NAMES.SANDBOX].chatHistory,
resultingChatInfoMessage
chatInfoMessage
);

res.status(200).send({ resultingChatInfoMessage });
res.status(200).send({ chatInfoMessage });
}

export { handleSetModel, handleConfigureModel };
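The hunks above reorder handleConfigureModel so that `maxValue` is computed only after `configId` has been validated, since the bound depends on which setting is being changed. A minimal standalone sketch of that validation order (the helper name and error strings are illustrative, not taken from the PR):

```typescript
type MODEL_CONFIG_ID =
	| 'temperature'
	| 'topP'
	| 'frequencyPenalty'
	| 'presencePenalty';

// Returns an error message, or null when the configId/value pair is valid.
function validateModelConfig(
	configId: MODEL_CONFIG_ID | undefined,
	value: number | undefined
): string | null {
	if (!configId) return 'Missing configId';
	if (value === undefined) return 'Missing value';
	// Only now is configId known to be present, so the bound can be derived:
	// topP is capped at 1, the remaining settings at 2.
	const maxValue = configId === 'topP' ? 1 : 2;
	if (value < 0 || value > maxValue) {
		return `Value for ${configId} must be between 0 and ${maxValue}`;
	}
	return null;
}
```

Computing `maxValue` before the missing-configId check (as in the pre-PR code) was harmless but misleading; moving it below the guards keeps each derivation next to the first point where its inputs are known to be valid.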
2 changes: 1 addition & 1 deletion backend/src/models/api/OpenAiConfigureModelRequest.ts
@@ -4,7 +4,7 @@ import { ChatMessage } from '@src/models/chatMessage';

export type OpenAiConfigureModelRequest = Request<
never,
null | { resultingChatInfoMessage: ChatMessage },
null | { chatInfoMessage: ChatMessage },
{
configId?: string;
value?: number;
16 changes: 6 additions & 10 deletions backend/src/models/chat.ts
@@ -19,17 +19,9 @@ enum CHAT_MODELS {

type ChatModel = {
id: CHAT_MODELS;
configuration: ChatModelConfiguration;
configuration: ChatModelConfigurations;
};

type ChatModelConfiguration = {
temperature: number;
topP: number;
frequencyPenalty: number;
presencePenalty: number;
};

// these must match the above type because they are used for indexing
const modelConfigIds = [
'temperature',
'topP',
@@ -39,6 +31,10 @@ const modelConfigIds = [

type MODEL_CONFIG_ID = (typeof modelConfigIds)[number];

type ChatModelConfigurations = {
[key in MODEL_CONFIG_ID]: number;
};

@chriswilty (Member) commented on Mar 26, 2024:
There must be a reason you pluralised this, right? Is there a clash with another type, or did you feel it made more sense as a plural? Or did you do so to match the frontend code?

@pmarsh-scottlogic (Contributor, Author) replied:
I prefer it that way. If each of top p, frequency penalty etc. is one configuration, then the set of all of them should be called configurations.

@chriswilty (Member) replied:
That's fine, but then export it as that - currently you're exporting it as ChatModelConfiguration.

interface DefenceReport {
blockedReason: string | null;
isBlocked: boolean;
@@ -125,7 +121,7 @@ export type {
ChatGptReply,
ChatMalicious,
ChatModel,
ChatModelConfiguration,
ChatModelConfigurations as ChatModelConfiguration,
ChatResponse,
LevelHandlerResponse,
ChatHttpResponse,
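The chat.ts change above replaces the hand-written ChatModelConfiguration type with one derived from the `modelConfigIds` array, so the ID union and the configuration shape can never drift apart. A self-contained sketch of the pattern (the array must be declared `as const` for the indexed-access type to yield string literals; the example values are illustrative, not the PR's real defaults):

```typescript
const modelConfigIds = [
	'temperature',
	'topP',
	'frequencyPenalty',
	'presencePenalty',
] as const;

// Union of the array's literal members:
// 'temperature' | 'topP' | 'frequencyPenalty' | 'presencePenalty'
type MODEL_CONFIG_ID = (typeof modelConfigIds)[number];

// Mapped type: exactly one numeric field per config ID.
type ChatModelConfigurations = {
	[key in MODEL_CONFIG_ID]: number;
};

// Adding an ID to the array extends both the union and this shape;
// omitting a field here would then be a compile error.
const exampleConfig: ChatModelConfigurations = {
	temperature: 1,
	topP: 1,
	frequencyPenalty: 0,
	presencePenalty: 0,
};
```

This also removes the old "these must match the above type because they are used for indexing" comment, because the compiler now enforces the match instead of a comment asking for it.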
2 changes: 1 addition & 1 deletion backend/test/unit/controller/modelController.test.ts
@@ -48,7 +48,7 @@ describe('handleConfigureModel', () => {
req.session.levelState[LEVEL_NAMES.SANDBOX].chatHistory.at(-1)
).toEqual(expectedInfoMessage);
expect(res.send).toHaveBeenCalledWith({
resultingChatInfoMessage: expectedInfoMessage,
chatInfoMessage: expectedInfoMessage,
});
});

50 changes: 14 additions & 36 deletions frontend/src/components/ControlPanel/ControlPanel.tsx
@@ -2,6 +2,7 @@ import { DEFENCES_HIDDEN_LEVEL3_IDS, MODEL_DEFENCES } from '@src/Defences';
import DefenceBox from '@src/components/DefenceBox/DefenceBox';
import DocumentViewButton from '@src/components/DocumentViewer/DocumentViewButton';
import ModelBox from '@src/components/ModelBox/ModelBox';
import DetailElement from '@src/components/ThemedButtons/DetailElement';
import { ChatMessage, ChatModel } from '@src/models/chat';
import {
DEFENCE_ID,
@@ -70,42 +71,18 @@ function ControlPanel({
{(currentLevel === LEVEL_NAMES.LEVEL_3 ||
currentLevel === LEVEL_NAMES.SANDBOX) && (
<>
<h2 className="visually-hidden">ScottBrew System Access</h2>
<details className="control-collapsible-section">
<summary className="control-collapsible-section-header">
Defence Configuration
</summary>
<DefenceBox
currentLevel={currentLevel}
defences={nonModelDefences}
showConfigurations={showConfigurations}
resetDefenceConfiguration={resetDefenceConfiguration}
toggleDefence={toggleDefence}
setDefenceConfiguration={setDefenceConfiguration}
/>
</details>

<details className="control-collapsible-section">
<summary className="control-collapsible-section-header">
Model Configuration
</summary>
<DefenceBox
currentLevel={currentLevel}
defences={modelDefences}
showConfigurations={showConfigurations}
toggleDefence={toggleDefence}
resetDefenceConfiguration={resetDefenceConfiguration}
setDefenceConfiguration={setDefenceConfiguration}
/>

{/* only show model box in sandbox mode */}
{showConfigurations && (
<ModelBox
chatModel={chatModel}
setChatModelId={setChatModelId}
chatModelOptions={chatModelOptions}
addInfoMessage={addInfoMessage}
addChatMessage={addChatMessage}
<h2 className="visually-hidden">
ScottBrewBot Security Configuration
</h2>
<div className="config-container">
<DetailElement useIcon={true} buttonText={'Defence Configuration'}>
<DefenceBox
currentLevel={currentLevel}
defences={nonModelDefences}
showConfigurations={showConfigurations}
resetDefenceConfiguration={resetDefenceConfiguration}
toggleDefence={toggleDefence}
setDefenceConfiguration={setDefenceConfiguration}
/>
</DetailElement>
</div>
@@ -125,6 +102,7 @@
setChatModelId={setChatModelId}
chatModelOptions={chatModelOptions}
addInfoMessage={addInfoMessage}
addChatMessage={addChatMessage}
/>
)}
</DetailElement>
16 changes: 7 additions & 9 deletions frontend/src/components/ModelBox/ModelConfiguration.tsx
@@ -78,15 +78,13 @@ function ModelConfiguration({

setCustomChatModelByID(id, newValue);

void chatService
.configureGptModel(id, newValue)
.then((resultingChatInfoMessage) => {
if (resultingChatInfoMessage) {
addChatMessage(resultingChatInfoMessage);
} else {
setCustomChatModelByID(id, prevValue);
}
});
void chatService.configureGptModel(id, newValue).then((chatInfoMessage) => {
if (chatInfoMessage) {
addChatMessage(chatInfoMessage);
} else {
setCustomChatModelByID(id, prevValue);
}
});
}

useEffect(() => {
@@ -1,6 +1,7 @@
import { Slider } from '@mui/material';
import { useEffect, useState } from 'react';

import DetailElement from '@src/components/ThemedButtons/DetailElement';
import {
CustomChatModelConfiguration,
MODEL_CONFIG_ID,
12 changes: 4 additions & 8 deletions frontend/src/models/chat.ts
@@ -20,14 +20,6 @@ type ChatModel = {
configuration: ChatModelConfigurations;
};

type ChatModelConfigurations = {
temperature: number;
topP: number;
frequencyPenalty: number;
presencePenalty: number;
};

// these must match the above type because they are used for indexing
const modelConfigIds = [
'temperature',
'topP',
@@ -37,6 +29,10 @@ const modelConfigIds = [

type MODEL_CONFIG_ID = (typeof modelConfigIds)[number];

type ChatModelConfigurations = {
[key in MODEL_CONFIG_ID]: number;
};

interface CustomChatModelConfiguration {
id: MODEL_CONFIG_ID;
name: string;
2 changes: 1 addition & 1 deletion frontend/src/models/combined.ts
@@ -20,7 +20,7 @@ type LoadLevelResponse = {
};

type ConfigureGptModelResponse = {
resultingChatInfoMessage: ChatMessageDTO;
chatInfoMessage: ChatMessageDTO;
};

export type { StartReponse, LoadLevelResponse, ConfigureGptModelResponse };
12 changes: 7 additions & 5 deletions frontend/src/service/chatService.ts
@@ -84,11 +84,13 @@ async function configureGptModel(
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify({ configId, value }),
});
const { resultingChatInfoMessage } =
(await response.json()) as ConfigureGptModelResponse;
return response.status === 200
? makeChatMessageFromDTO(resultingChatInfoMessage)
: null;

if (response.status !== 200) return null;
else {
const { chatInfoMessage } =
(await response.json()) as ConfigureGptModelResponse;
return makeChatMessageFromDTO(chatInfoMessage);
}
}

async function addInfoMessageToChatHistory(
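The chatService.ts hunk above stops calling `response.json()` on non-200 responses, since an error body may be empty or non-JSON. A standalone sketch of that guard, using a minimal response shape rather than the real `fetch` Response (the function name and DTO fields here are illustrative):

```typescript
type ChatMessageDTO = { infoMessage: string; chatMessageType: string };
type MinimalResponse = { status: number; json(): Promise<unknown> };

async function readChatInfoMessage(
	response: MinimalResponse
): Promise<ChatMessageDTO | null> {
	// Check the status first: a 4xx/5xx body is never parsed as JSON.
	if (response.status !== 200) return null;
	const { chatInfoMessage } = (await response.json()) as {
		chatInfoMessage: ChatMessageDTO;
	};
	return chatInfoMessage;
}
```

Compared with the pre-PR version, which always awaited `response.json()` and only afterwards inspected the status, this fails fast and cannot throw while parsing an HTML or empty error body.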