
feat: 🚀 Integrate Amazon Bedrock Support2 #733

Open · wants to merge 12 commits into main
Conversation

Sunwood-ai-labs

Overview

This PR adds support for Amazon Bedrock models to our LLM integration, enhancing our AI capabilities with Claude 3 models (Opus, Sonnet, Haiku).

Key Changes

  • Implement AWS credentials retrieval for Bedrock
  • Add Bedrock model initialization and handling
  • Include Claude 3 model options for Bedrock
  • Adjust token limits specifically for Bedrock models
  • Update chat action to support dynamic model selection
  • Add necessary dependencies for Amazon Bedrock integration

Detailed Changes

  1. app/lib/.server/llm/api-key.ts:

    • Add getAWSCredentials function to fetch AWS access keys and region
  2. app/lib/.server/llm/constants.ts:

    • Define MAX_TOKENS_BEDROCK constant (4096) for Bedrock models
  3. app/lib/.server/llm/model.ts:

    • Implement getBedrockModel function for Bedrock model initialization
    • Update getModel function to handle Bedrock provider
  4. app/lib/.server/llm/stream-text.ts:

    • Use Bedrock-specific token limit (MAX_TOKENS_BEDROCK)
  5. app/routes/api.chat.ts:

    • Update chat action to support model selection in the request body
    • Set proper Content-Type header for the response
  6. app/utils/constants.ts:

    • Add Bedrock model options (Claude 3 Opus, Sonnet, Haiku)
  7. package.json:

    • Add @ai-sdk/amazon-bedrock dependency (version 0.0.30)
  8. pnpm-lock.yaml:

    • Update with new dependencies including AWS SDK packages
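As a rough illustration of the first change, a minimal sketch of the `getAWSCredentials` helper might look like the following. The environment-variable names (`AWS_ACCESS_KEY_ID`, `AWS_SECRET_ACCESS_KEY`, `AWS_REGION`) match the ones listed later in this PR; the exact signature and the default-region fallback are assumptions, not taken from the diff itself.

```typescript
// Hypothetical sketch of getAWSCredentials (app/lib/.server/llm/api-key.ts).
// Reads AWS credentials and region from an env-like record.
interface AWSCredentials {
  accessKeyId: string;
  secretAccessKey: string;
  region: string;
}

export function getAWSCredentials(env: Record<string, string | undefined>): AWSCredentials {
  const accessKeyId = env.AWS_ACCESS_KEY_ID;
  const secretAccessKey = env.AWS_SECRET_ACCESS_KEY;

  if (!accessKeyId || !secretAccessKey) {
    throw new Error('Missing AWS credentials for Bedrock');
  }

  return {
    accessKeyId,
    secretAccessKey,
    region: env.AWS_REGION ?? 'us-east-1', // assumed default-region fallback
  };
}
```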

coleam00 and others added 4 commits October 13, 2024 13:53
Add support for Amazon Bedrock models, including:

- Implement AWS credentials retrieval for Bedrock
- Add Bedrock model initialization and handling
- Include Claude 3 models (Opus, Sonnet, Haiku) for Bedrock
- Adjust token limits for Bedrock models
- Update chat action to support model selection
- Add @ai-sdk/amazon-bedrock dependency

Key changes:
- app/lib/.server/llm/api-key.ts: Add getAWSCredentials function
- app/lib/.server/llm/constants.ts: Define MAX_TOKENS_BEDROCK
- app/lib/.server/llm/model.ts: Implement getBedrockModel function
- app/lib/.server/llm/stream-text.ts: Use Bedrock-specific token limit
- app/routes/api.chat.ts: Update to support model selection
- app/utils/constants.ts: Add Bedrock model options
- package.json: Add @ai-sdk/amazon-bedrock dependency
- pnpm-lock.yaml: Update with new dependencies
- Translate comments to English for consistency
- Add explanatory comment for AWS credentials function
- Refactor default region assignment with inline comment
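The Bedrock-specific token limit mentioned above could be applied with a small provider check along these lines. The constant names mirror the PR (`MAX_TOKENS_BEDROCK` is 4096 per `constants.ts`); the generic `MAX_TOKENS` value of 8192 is an assumption for illustration, not taken from this diff.

```typescript
// Sketch of provider-aware token-limit selection (app/lib/.server/llm/stream-text.ts).
const MAX_TOKENS = 8192; // assumed generic limit
const MAX_TOKENS_BEDROCK = 4096; // Bedrock limit defined in constants.ts

function maxTokensFor(provider: string): number {
  // Bedrock models get the stricter limit; everything else keeps the default.
  return provider === 'Bedrock' ? MAX_TOKENS_BEDROCK : MAX_TOKENS;
}
```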
@Sunwood-ai-labs Sunwood-ai-labs changed the title 🚀 feat: Integrate Amazon Bedrock Support2 feat: 🚀 Integrate Amazon Bedrock Support2 Oct 19, 2024
@Sunwood-ai-labs (Author)

I changed the branch name, but the content is the same as #689!

- Deleted references to fork by Cole Medin
- Removed information about choosing LLM models
- Maintained focus on original Bolt.new project description
README.md Outdated

- **Batch simple instructions**: Save time by combining simple instructions into one message. For example, you can ask Bolt to change the color scheme, add mobile responsiveness, and restart the dev server, all in one go saving you time and reducing API credit consumption significantly.
```bash
git clone https://github.com/coleam00/bolt.new-any-llm.git
```
The repository URL is pointing to coleam00's GitHub.

Author
I responded with e141171

```diff
-    return anthropic('claude-3-5-sonnet-20240620');
+    return anthropic(model);
 }
```

@rrfaria Oct 20, 2024

Using OpenRouter you can use Gemini, but if you want to use Gemini directly you need to add a Gemini provider:

```ts
import { createGoogleGenerativeAI } from '@ai-sdk/google';

export function getGeminiModel(apiKey: string, model: string = 'gemini-1.5-pro-latest') {
  const gemini = createGoogleGenerativeAI({
    apiKey,
  });

  return gemini(model);
}
```

Author
I responded with e5d16df

```ts
case 'Anthropic':
  return getAnthropicModel(apiKey, model);
case 'OpenAI':
  return getOpenAIModel(apiKey, model);
```

Also add the Gemini provider here.

Author

I responded with e5d16df

```ts
switch (provider) {
  case 'Anthropic':
    return env.ANTHROPIC_API_KEY || cloudflareEnv.ANTHROPIC_API_KEY;
  case 'OpenAI':
```

Add the API key for the Gemini provider here as well.

Author

I responded with 7e0287f

- Include Gemini API key in the getAPIKey function
- Allow retrieval of Gemini API key from environment variables
- Import Google Generative AI SDK
- Add getGeminiModel function to create Gemini model instances
- Update getModel function to support Gemini provider
- Include Gemini 1.5 Pro and Flash models in the available model options
- Add latest and stable versions for both Gemini 1.5 Pro and Flash
- Include @ai-sdk/google package version 0.0.52 for Gemini integration
- Add @ai-sdk/google package and its dependencies to the lock file
- Ensure consistent package versions across the project
- Replace old URL (https://github.com/coleam00/bolt.new-any-llm.git) with new URL (https://github.com/stackblitz/bolt.new)
- Improve documentation accuracy for users setting up the project
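Putting the reviewer's suggestions together, the key lookup with the Gemini case added might look like the following sketch. The `env`/`cloudflareEnv` names follow the snippet in the review thread and `GEMINI_API_KEY` matches the variable added in this PR; the function signature and the `'Google'` case label are assumptions for illustration.

```typescript
// Hypothetical sketch of a provider -> API-key lookup with Gemini included.
type EnvLike = Record<string, string | undefined>;

function getAPIKey(provider: string, env: EnvLike, cloudflareEnv: EnvLike): string | undefined {
  switch (provider) {
    case 'Anthropic':
      return env.ANTHROPIC_API_KEY || cloudflareEnv.ANTHROPIC_API_KEY;
    case 'OpenAI':
      return env.OPENAI_API_KEY || cloudflareEnv.OPENAI_API_KEY;
    case 'Google':
      return env.GEMINI_API_KEY || cloudflareEnv.GEMINI_API_KEY;
    default:
      return undefined;
  }
}
```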
```diff
@@ -1,3 +1,6 @@
 interface Env {
   ANTHROPIC_API_KEY: string;
   OPENAI_API_KEY: string;
   GROQ_API_KEY: string;
```

One last thing before approving:
add GEMINI_API_KEY here and also in .env.example

Author

Thank you!
I responded with 9e5e73a

- Added AWS credential environment variables:
  - AWS_ACCESS_KEY_ID
  - AWS_SECRET_ACCESS_KEY
  - AWS_REGION
- Enables AWS service integration capabilities