feat: add multiple providers in addition to anthropic #531
base: main
Conversation
Incredible, excellent contribution!
Great contribution! Maybe we can merge your code with the @coleam00 fork? He added the ability to select the LLM provider and model from the landing page and during the chat step. Repo: https://github.com/coleam00/bolt.new-any-llm — you can see it in his YouTube video.
@@ -60,6 +60,10 @@ pnpm install

```
ANTHROPIC_API_KEY=XXX
PROVIDER= gemini | antrophic | openai | ollama
Typo — did you mean to spell "anthropic"? @rrfaria
I implemented Bedrock based on https://github.com/coleam00/bolt.new-any-llm! (#689)
Looks good, thanks for adding this. I will keep using it in my local instance.
Hi, I would like to share a contribution.
This MR allows the project to work with multiple providers.
Now you can use Gemini, OpenAI, Ollama, and Anthropic, and if you would like to add more providers you can follow these docs:
https://sdk.vercel.ai/providers/ai-sdk-providers
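As a rough sketch of the idea behind the `PROVIDER` env variable, the selection step could map the configured provider name to a default model id before handing it to the Vercel AI SDK. Everything below is illustrative — the function name, model ids, and error handling are assumptions, not the PR's actual code:

```typescript
// Hypothetical provider-selection helper. The provider names mirror the
// PROVIDER values from the .env example; the default model ids are
// placeholders, not values taken from this PR.
type Provider = 'anthropic' | 'openai' | 'gemini' | 'ollama';

const DEFAULT_MODELS: Record<Provider, string> = {
  anthropic: 'claude-3-5-sonnet-20240620',
  openai: 'gpt-4o',
  gemini: 'gemini-1.5-pro',
  ollama: 'llama3',
};

// Normalize the env value and resolve it to a model id,
// failing loudly on an unsupported provider name.
function resolveModel(provider: string): string {
  const key = provider.trim().toLowerCase() as Provider;
  if (!(key in DEFAULT_MODELS)) {
    throw new Error(`Unsupported PROVIDER: ${provider}`);
  }
  return DEFAULT_MODELS[key];
}

console.log(resolveModel(process.env.PROVIDER ?? 'anthropic'));
```

In the real integration each provider would be backed by the matching `@ai-sdk/*` package (e.g. `@ai-sdk/openai`, `@ai-sdk/google`) as described in the Vercel AI SDK providers docs linked above.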