Commit

feat: 🔧 add LLMProviderFeatures component and update docs
pelikhan committed Dec 11, 2024
1 parent 41dc99c commit 4ecf3e5
Showing 4 changed files with 100 additions and 29 deletions.
22 changes: 4 additions & 18 deletions docs/public/schemas/llms.json
@@ -13,24 +13,6 @@
"properties": {
"id": {
"type": "string",
"enum": [
"openai",
"github",
"azure",
"azure_serverless",
"azure_serverless_models",
"anthropic",
"googe",
"huggingface",
"transformers",
"ollama",
"mistal",
"lmstudio",
"jan",
"alibaba",
"llamafile",
"litellm"
],
"description": "Identifier for the LLM provider"
},
"detail": {
@@ -60,6 +42,10 @@
"tools": {
"type": "boolean",
"description": "Indicates if tools are supported"
},
"openaiCompatibility": {
"type": "string",
"description": "Uses OpenAI API compatibility layer documentation URL"
}
},
"additionalProperties": false,
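For reference, a provider entry that satisfies the updated schema could look like the sketch below. The TypeScript typing is my own reading of the schema above, not code from this commit; the sample values mirror the `ollama` entry updated in `packages/core/src/llms.json` further down.

```ts
// Sketch of the provider shape described by the updated schema.
// Feature flags (seed, tools, logprobs, topLogprobs, logitBias, ...) are
// assumed supported unless explicitly set to false.
interface LLMProviderInfo {
    [key: string]: string | boolean | undefined
    id: string // identifier for the LLM provider (no longer restricted to an enum)
    detail?: string
    openaiCompatibility?: string // URL of the provider's OpenAI compatibility layer docs
}

const ollama: LLMProviderInfo = {
    id: "ollama",
    detail: "Ollama local model",
    logitBias: false,
    openaiCompatibility:
        "https://github.com/ollama/ollama/blob/main/docs/openai.md",
}
```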
56 changes: 56 additions & 0 deletions docs/src/components/LLMProviderFeatures.astro
@@ -0,0 +1,56 @@
---
import LLMS from "../../../packages/core/src/llms.json"
interface Props {
    provider: string
}
const { provider } = Astro.props
const info: Record<string, boolean> & { openaiCompatibility?: string } =
    LLMS.providers.find(({ id }) => id === provider) as any
if (!info) {
    throw new Error(`Provider ${provider} not found`)
}
const features: Record<string, { name?: string; url?: string }> = {
    seed: {
        name: "Seed ignored",
    },
    topP: {
        name: "top_p ignored",
    },
    logprobs: {
        name: "logprobs (and top logprobs) ignored",
    },
    topLogprobs: {
        name: "Top logprobs ignored",
    },
    tools: {
        name: "Tools are automatically implemented as fallback tools.",
    },
}
const oai = info.openaiCompatibility
const unsupported = Object.keys(info)
    .sort()
    .map((id) => ({ id, supported: info[id] }))
    .filter(({ supported }) => supported === false)
---

{
    oai || unsupported?.length > 0 ? (
        <>
            <h3>Limitations</h3>
            <ul>
                {!!oai && (
                    <li>
                        Uses <a href={oai}>OpenAI compatibility layer</a>
                    </li>
                )}
                {Object.keys(features)
                    .map((id) => ({ id, supported: info[id] }))
                    .filter(({ supported }) => supported === false)
                    .map(({ id }) => (
                        <li>{features[id]?.name || id}</li>
                    ))}
            </ul>
        </>
    ) : null
}
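In short, the component renders a "Limitations" section only when the provider record links to an OpenAI compatibility page or has feature flags explicitly set to `false`. A rough standalone sketch of that filtering, using a hypothetical `info` record shaped like the `google` entry in `llms.json`:

```ts
// Feature flags that, when explicitly false, become human-readable limitation notes.
const featureLabels: Record<string, string> = {
    seed: "Seed ignored",
    topP: "top_p ignored",
    logprobs: "logprobs (and top logprobs) ignored",
    topLogprobs: "Top logprobs ignored",
    tools: "Tools are automatically implemented as fallback tools.",
}

// Hypothetical provider record, shaped like an entry in llms.json.
const info: Record<string, string | boolean> = {
    id: "google",
    seed: false,
    tools: false,
    openaiCompatibility: "https://ai.google.dev/gemini-api/docs/openai",
}

// Keep only flags that are explicitly false, then map them to labels.
const limitations = Object.keys(featureLabels)
    .filter((id) => info[id] === false)
    .map((id) => featureLabels[id])

console.log(limitations)
// -> ["Seed ignored", "Tools are automatically implemented as fallback tools."]
```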
38 changes: 31 additions & 7 deletions docs/src/content/docs/getting-started/configuration.mdx
@@ -10,6 +10,7 @@ import { FileTree } from "@astrojs/starlight/components"
import { Steps } from "@astrojs/starlight/components"
import { Tabs, TabItem } from "@astrojs/starlight/components"
import { Image } from "astro:assets"
import LLMProviderFeatures from "../../../components/LLMProviderFeatures.astro"

import lmSrc from "../../../assets/vscode-language-models.png"
import lmAlt from "../../../assets/vscode-language-models.png.txt?raw"
@@ -245,6 +246,8 @@ GENAISCRIPT_MODEL_SMALL=openai:gpt-4o-mini
:::
<LLMProviderFeatures provider="openai" />
## GitHub Models <a id="github" href=""></a>
The [GitHub Models](https://github.com/marketplace/models) provider, `github`, allows running models through the GitHub Marketplace.
@@ -330,6 +333,8 @@ script({
})
```
<LLMProviderFeatures provider="github" />
## Azure OpenAI <a id="azure" href=""></a>
The [Azure OpenAI](https://learn.microsoft.com/en-us/azure/ai-services/openai/reference#chat-completions) provider, `azure`, uses the `AZURE_OPENAI_...` environment variables.
@@ -477,6 +482,8 @@ The rest of the steps are the same: Find the deployment name and use it in your

</Steps>

<LLMProviderFeatures provider="azure" />

## Azure AI Serverless Deployments <a id="azure_serverless" href=""></a>

You can deploy "serverless" models through [Azure AI Studio](https://ai.azure.com/) and pay as you go per token.
@@ -587,6 +594,8 @@ AZURE_SERVERLESS_OPENAI_API_KEY=...

</Steps>

<LLMProviderFeatures provider="azure_serverless" />

### Azure AI Models <a href="" id="azure_serverless_models" />

The `azure_serverless_models` provider supports non-OpenAI models deployed through the Azure AI Studio serverless deployments.
@@ -680,7 +689,7 @@ AZURE_SERVERLESS_MODELS_API_KEY=...

<li>

Find the deployment name and use it in your script, `model: "azure_serverless:deployment-id"`.
Find the deployment name and use it in your script, `model: "azure_serverless_models:deployment-id"`.

</li>

@@ -700,6 +709,8 @@ model3=key3
"
```

<LLMProviderFeatures provider="azure_serverless_models" />

## Google AI <a href="" id="google" />

The `google` provider allows you to use Google AI models. It gives you access
@@ -760,6 +771,8 @@ script({ model: "google:gemini-1.5-pro-002" })

</Steps>

<LLMProviderFeatures provider="google" />

## GitHub Copilot Chat Models <a id="github-copilot" href=""></a>

If you have access to **GitHub Copilot Chat in Visual Studio Code**,
@@ -855,6 +868,8 @@ script({

</Steps>

<LLMProviderFeatures provider="anthropic" />

## Hugging Face <a href="" id="huggingface" />

The `huggingface` provider allows you to use [Hugging Face Models](https://huggingface.co/models?other=text-generation-inference) using [Text Generation Inference](https://huggingface.co/docs/text-generation-inference/index).
@@ -915,6 +930,8 @@ Some models may require a Pro account.

:::

<LLMProviderFeatures provider="huggingface" />

## Mistral AI <a href="" id="mistral" />

The `mistral` provider allows you to use [Mistral AI Models](https://mistral.ai/technology/#models)
@@ -962,6 +979,9 @@ script({

</Steps>

<LLMProviderFeatures provider="mistal" />


## Alibaba Cloud <a href="" id="alibaba" />

The `alibaba` provider accesses the [Alibaba Cloud](https://www.alibabacloud.com/) models.
@@ -1015,12 +1035,7 @@ script({

</Steps>

:::note

GenAIScript uses the [OpenAI compatibility](https://www.alibabacloud.com/help/en/model-studio/developer-reference/compatibility-of-openai-with-dashscope) layer
to access Alibaba.

:::
<LLMProviderFeatures provider="alibaba" />

## Ollama

@@ -1117,6 +1132,8 @@ docker run -d -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama
docker stop ollama && docker rm ollama
```

<LLMProviderFeatures provider="ollama" />

## LMStudio

The `lmstudio` provider connects to the [LMStudio](https://lmstudio.ai/) headless server.
@@ -1184,6 +1201,8 @@ script({

Follow [this guide](https://huggingface.co/blog/yagilb/lms-hf) to load Hugging Face models into LMStudio.

<LLMProviderFeatures provider="lmstudio" />

## Jan

The `jan` provider connects to the [Jan](https://jan.ai/) local server.
@@ -1224,6 +1243,8 @@ If you change the default server URL, you can set the `JAN_API_BASE` environment
JAN_API_BASE=http://localhost:1234/v1
```

<LLMProviderFeatures provider="jan" />

## LocalAI

[LocalAI](https://localai.io/) acts as a drop-in replacement REST API that’s compatible
@@ -1348,6 +1369,9 @@ This provider is experimental and may not work with all models.

:::


<LLMProviderFeatures provider="transformers" />

## Model specific environment variables

You can provide different environment variables
13 changes: 9 additions & 4 deletions packages/core/src/llms.json
@@ -25,13 +25,16 @@
},
{
"id": "anthropic",
"detail": "Anthropic models"
"detail": "Anthropic models",
"logprobs": false,
"topLogprobs": false
},
{
"id": "googe",
"id": "google",
"detail": "Google AI",
"seed": false,
"tools": false
"tools": false,
"openaiCompatibility": "https://ai.google.dev/gemini-api/docs/openai"
},
{
"id": "huggingface",
@@ -48,7 +51,8 @@
{
"id": "ollama",
"detail": "Ollama local model",
"logitBias": false
"logitBias": false,
"openaiCompatibility": "https://github.com/ollama/ollama/blob/main/docs/openai.md"
},
{
"id": "lmstudio",
@@ -61,6 +65,7 @@
{
"id": "alibaba",
"detail": "Alibaba models",
"openaiCompatibility": "https://www.alibabacloud.com/help/en/model-studio/developer-reference/compatibility-of-openai-with-dashscope",
"tools": false
},
{
