Separate fixie-hello-world into fixie-hello-world and @fixieai/sdk (#251)

- Split out the fastify server from the hello-world sample
- Use new conversation contracts with Fixie service
- Add JWT parsing to the fastify shim to extract agent ID
- Change the AI.JSX `<ConversationHistory>` component to be a simple
  extensibility point into which the Fixie request wrapper injects the
  `<FixieConversation>` component

Subsequent changes will:
- Plumb the auth token so that Fixie corpus queries can use it
petersalas authored Aug 31, 2023
1 parent ad801e8 commit 348294e
Showing 21 changed files with 554 additions and 208 deletions.
2 changes: 1 addition & 1 deletion packages/ai-jsx/package.json
@@ -4,7 +4,7 @@
"repository": "fixie-ai/ai-jsx",
"bugs": "https://github.com/fixie-ai/ai-jsx/issues",
"homepage": "https://ai-jsx.com",
"version": "0.11.0",
"version": "0.12.0",
"volta": {
"extends": "../../package.json"
},
42 changes: 21 additions & 21 deletions packages/ai-jsx/src/core/conversation.tsx
@@ -1,4 +1,3 @@
import { type OpenAI as OpenAIClient } from 'openai';
import * as AI from '../index.js';
import { Node } from '../index.js';
import { AIJSXError, ErrorCode } from '../core/errors.js';
@@ -59,25 +58,26 @@ export function AssistantMessage({ children }: { children: Node }) {
return children;
}

export function ConversationHistory({
messages,
}: {
messages: OpenAIClient.Chat.CreateChatCompletionRequestMessage[];
}) {
return messages.map((message) => {
switch (message.role) {
case 'system':
return <SystemMessage>{message.content}</SystemMessage>;
case 'user':
return <UserMessage>{message.content}</UserMessage>;
case 'assistant':
return <AssistantMessage>{message.content}</AssistantMessage>;
case 'function':
return (
<FunctionCall name={message.function_call!.name!} args={JSON.parse(message.function_call!.arguments!)} />
);
}
});
/**
* Sets the node that the <ConversationHistory /> component will resolve to.
*/
export const ConversationHistoryContext = AI.createContext<AI.Node>(undefined);

/**
* Renders to the conversation history provided through ConversationHistoryContext.
*/
export function ConversationHistory(_: {}, { getContext }: AI.ComponentContext) {
const fromContext = getContext(ConversationHistoryContext);

if (fromContext === undefined) {
throw new AIJSXError(
'No conversation history was present on the context. Use the ConversationHistoryContext.Provider component to set the conversation history.',
ErrorCode.ConversationHistoryComponentRequiresContext,
'user'
);
}

return fromContext;
}

/**
@@ -263,7 +263,7 @@ export async function renderToConversation(
*
* return null;
* }>
* <ConversationHistory messages={jsonMessages} />
* <ConversationHistory />
* </ChatCompletion>
*
* ==> 'Hello there!'
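The rewritten component above resolves its content from context rather than taking OpenAI messages as props. A minimal self-contained sketch of that provider/consumer pattern follows — the one-field `createContext` stand-in and the string-valued node are simplifications for illustration, not AI.JSX's actual API:

```typescript
// Simplified stand-in for AI.JSX's context mechanism (illustration only):
// a provider sets a value, and the history component resolves to it.
type ConversationNode = string | undefined;

function createContext<T>(defaultValue: T): { value: T } {
  return { value: defaultValue };
}

const ConversationHistoryContext = createContext<ConversationNode>(undefined);

// Mirrors the new component: resolve from context, or fail loudly.
function conversationHistory(): ConversationNode {
  const fromContext = ConversationHistoryContext.value;
  if (fromContext === undefined) {
    throw new Error('No conversation history was present on the context.');
  }
  return fromContext;
}

// The Fixie request wrapper plays the provider role and injects the
// conversation (represented here as a plain string).
ConversationHistoryContext.value = '<FixieConversation />';
console.log(conversationHistory()); // prints "<FixieConversation />"
```

In the real component, a missing provider raises an `AIJSXError` with the new `ConversationHistoryComponentRequiresContext` code rather than a plain `Error`.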
2 changes: 2 additions & 0 deletions packages/ai-jsx/src/core/errors.ts
@@ -49,6 +49,8 @@ export enum ErrorCode {

ModelOutputCouldNotBeParsedForTool = 2005,
ModelHallucinatedTool = 2006,

ConversationHistoryComponentRequiresContext = 2007,
}

export type ErrorBlame =
5 changes: 3 additions & 2 deletions packages/ai-jsx/src/lib/openai.tsx
@@ -20,19 +20,20 @@ import { Node } from '../index.js';
import { ChatOrCompletionModelOrBoth } from './model.js';
import { getEnvVar, patchedUntruncateJson } from './util.js';
import { OpenAI as OpenAIClient } from 'openai';
export { OpenAI as OpenAIClient } from 'openai';
import { debugRepresentation } from '../core/debug.js';
import { getEncoding } from 'js-tiktoken';
import _ from 'lodash';

// https://platform.openai.com/docs/models/model-endpoint-compatibility
type ValidCompletionModel =
export type ValidCompletionModel =
| 'text-davinci-003'
| 'text-davinci-002'
| 'text-curie-001'
| 'text-babbage-001'
| 'text-ada-001';

type ValidChatModel =
export type ValidChatModel =
| 'gpt-4'
| 'gpt-4-0314' // discontinue on 06/13/2024
| 'gpt-4-0613'
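With `ValidChatModel` and `ValidCompletionModel` now exported, downstream packages can type model parameters against them. A hedged sketch — the union is reproduced locally as a subset rather than imported, so the snippet stands alone:

```typescript
// Reproduced locally (a subset of the union that openai.tsx now exports)
// so this snippet runs without the ai-jsx package.
type ValidChatModel = 'gpt-4' | 'gpt-4-0314' | 'gpt-4-0613';

function chatModelLabel(model: ValidChatModel): string {
  return `model=${model}`;
}

console.log(chatModelLabel('gpt-4')); // prints "model=gpt-4"
// chatModelLabel('not-a-model'); // compile-time error: not in the union
```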
8 changes: 7 additions & 1 deletion packages/docs/docs/changelog.md
@@ -1,6 +1,12 @@
# Changelog

## 0.11.0
## 0.12.0

- Change the `<ConversationHistory>` component to render to a node from a `ConversationHistoryContext` provider, rather
than from OpenAI message types.
- Replace usage of `openai-edge` with that of the `openai` v4 package.

## [0.11.0](https://github.com/fixie-ai/ai-jsx/commit/44e7702a449861c1f5435215000b6fc3e1a95171)

- Updated the `<FixieCorpus>` component to use the new Fixie Corpus REST API.
This is currently only available to users on `beta.fixie.ai` but will be brought
48 changes: 8 additions & 40 deletions packages/fixie-hello-world/package.json
@@ -2,55 +2,23 @@
"name": "fixie-hello-world",
"version": "1.0.0",
"devDependencies": {
"@fixieai/sdk": "*",
"@tsconfig/node18": "^2.0.1",
"@types/lodash": "^4.14.195",
"@types/node": "^20.3.1",
"@types/prompt-sync": "^4.2.0",
"@types/uuid": "^9.0.2",
"@types/yargs": "^17.0.24",
"@typescript-eslint/eslint-plugin": "^5.60.0",
"@typescript-eslint/parser": "^5.60.0",
"eslint": "^8.40.0",
"eslint-config-nth": "^2.0.1",
"typescript": "^5.1.3"
},
"private": true,
"type": "module",
"scripts": {
"dev": "yarn build && npx ts-node dist/serve-bin.js --port 8080 --package-path fixie",
"lint": "eslint . --max-warnings 0",
"lint:fix": "eslint . --fix",
"dev": "yarn build && npx --package=@fixieai/sdk fixie-serve-bin --packagePath ./dist/index.js --port 8080",
"typecheck": "tsc -p tsconfig.json",
"build": "yarn run typecheck",
"bin:dev": "npx ts-node dist/serve-bin.js --port 8080 --package-path fixie"
"prepack": "yarn build"
},
"dependencies": {
"@opentelemetry/api": "^1.4.1",
"@opentelemetry/api-logs": "^0.41.1",
"@opentelemetry/exporter-logs-otlp-grpc": "^0.41.1",
"@opentelemetry/exporter-trace-otlp-grpc": "^0.41.1",
"@opentelemetry/instrumentation-fastify": "^0.32.0",
"@opentelemetry/instrumentation-fetch": "^0.41.1",
"@opentelemetry/instrumentation-http": "^0.41.2",
"@opentelemetry/sdk-logs": "^0.41.1",
"@opentelemetry/sdk-node": "^0.41.1",
"ai-jsx": "*",
"dotenv": "^16.3.1",
"fastify": "^4.20.0",
"yargs": "^17.7.2"
"ai-jsx": "*"
},
"bin": {
"serve-bin": "dist/serve-bin.js"
},
"exports": {
".": {
"import": {
"types": "./dist/index.d.ts",
"default": "./dist/index.js"
},
"require": {
"default": "./dist/index.cjs"
}
}
}
"files": [
"dist"
],
"main": "dist/index.js"
}
8 changes: 4 additions & 4 deletions packages/fixie-hello-world/src/index.tsx
@@ -1,10 +1,10 @@
import { ChatCompletion, SystemMessage, UserMessage } from 'ai-jsx/core/completion';
import { ChatCompletion, SystemMessage, ConversationHistory } from 'ai-jsx/core/completion';

export default function HelloWorld({ message }: { message: string }) {
export default function HelloWorld() {
return (
<ChatCompletion temperature={1}>
<SystemMessage>Respond to the user using some variant of the phrase "Hello World!". Be creative!</SystemMessage>
<UserMessage>{message}</UserMessage>
<SystemMessage>You are Clippy from Microsoft Office. Respond to the user accordingly.</SystemMessage>
<ConversationHistory />
</ChatCompletion>
);
}
114 changes: 0 additions & 114 deletions packages/fixie-hello-world/src/serve-bin.ts

This file was deleted.

2 changes: 1 addition & 1 deletion packages/fixie-hello-world/tsconfig.json
@@ -1,6 +1,6 @@
{
"extends": "@tsconfig/node18/tsconfig.json",
"include": ["src/**/*", ".eslintrc.cjs"],
"include": ["src/**/*"],
"compilerOptions": {
"jsx": "react-jsx",
"jsxImportSource": "ai-jsx",
File renamed without changes.
@@ -30,6 +30,8 @@ module.exports = {
'no-trailing-spaces': 'off',
'no-else-return': ['warn', { allowElseIf: false }],

'spaced-comment': ['error', 'always', { markers: ['/'] }],

// Disable style rules to let prettier own it
'object-curly-spacing': 'off',
'comma-dangle': 'off',
55 changes: 55 additions & 0 deletions packages/fixie-sdk/package.json
@@ -0,0 +1,55 @@
{
"name": "@fixieai/sdk",
"version": "1.0.0",
"license": "MIT",
"repository": "fixie-ai/ai-jsx",
"bugs": "https://github.com/fixie-ai/ai-jsx/issues",
"homepage": "https://fixie.ai",
"type": "module",
"scripts": {
"lint": "eslint . --max-warnings 0",
"lint:fix": "eslint . --fix",
"typecheck": "tsc -p tsconfig.json",
"build": "yarn run typecheck",
"prepack": "yarn build"
},
"files": [
"dist"
],
"main": "dist/index.js",
"bin": {
"fixie-serve-bin": "dist/fixie-serve-bin.js"
},
"peerDependencies": {
"ai-jsx": ">=0.12.0 <1.0.0"
},
"dependencies": {
"@opentelemetry/api": "^1.4.1",
"@opentelemetry/api-logs": "^0.41.1",
"@opentelemetry/exporter-logs-otlp-grpc": "^0.41.1",
"@opentelemetry/exporter-trace-otlp-grpc": "^0.41.1",
"@opentelemetry/instrumentation-fastify": "^0.32.0",
"@opentelemetry/instrumentation-fetch": "^0.41.1",
"@opentelemetry/instrumentation-http": "^0.41.2",
"@opentelemetry/sdk-logs": "^0.41.1",
"@opentelemetry/sdk-node": "^0.41.1",
"dotenv": "^16.3.1",
"fastify": "^4.20.0",
"jose": "^4.14.4",
"yargs": "^17.7.2"
},
"devDependencies": {
"@tsconfig/node18": "^2.0.1",
"@types/lodash": "^4.14.195",
"@types/node": "^20.3.1",
"@types/prompt-sync": "^4.2.0",
"@types/uuid": "^9.0.2",
"@types/yargs": "^17.0.24",
"@typescript-eslint/eslint-plugin": "^5.60.0",
"@typescript-eslint/parser": "^5.60.0",
"ai-jsx": ">=0.12.0 <1.0.0",
"eslint": "^8.40.0",
"eslint-config-nth": "^2.0.1",
"typescript": "^5.1.3"
}
}
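The commit message mentions adding JWT parsing to the fastify shim to extract the agent ID, and the new package depends on `jose`. The sketch below only base64url-decodes the payload — no signature verification — and the `sub` claim name is an assumption for illustration; the real shim should verify the token with `jose` before trusting any claim:

```typescript
// Decode a JWT payload (without verifying the signature) to pull out an
// agent ID. The "sub" claim is assumed here purely for illustration.
function extractAgentId(jwt: string): string {
  const parts = jwt.split('.');
  if (parts.length !== 3) {
    throw new Error('Malformed JWT');
  }
  const payload = JSON.parse(
    Buffer.from(parts[1], 'base64url').toString('utf8')
  );
  if (typeof payload.sub !== 'string') {
    throw new Error('JWT is missing the expected agent ID claim');
  }
  return payload.sub;
}

// Build an unsigned example token to exercise the helper.
const header = Buffer.from(JSON.stringify({ alg: 'none' })).toString('base64url');
const body = Buffer.from(JSON.stringify({ sub: 'my-agent' })).toString('base64url');
console.log(extractAgentId(`${header}.${body}.`)); // prints "my-agent"
```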