Add variables like OPENAI_BASE_URL and OPENAI_MODEL to use other AI model providers #445

Open
1 task done
gmag11 opened this issue Nov 24, 2024 · 0 comments
Labels
type: feature-request New feature or request

Comments


gmag11 commented Nov 24, 2024

🔖 Feature description

There are many other AI providers that offer GPT-4o-class models behind an OpenAI-compatible API. Configurable variables such as OPENAI_BASE_URL and OPENAI_MODEL would make it possible to use other models, including local ones.

🎤 Why is this feature needed?

It allows using alternate providers and/or models by pointing the service at any OpenAI-compatible endpoint. Some examples:

  • OpenRouter
  • LiteLLM
  • OpenWebUI
  • Ollama

✌️ How do you aim to achieve this?

Adding the proper parameters to const openai in libraries/nestjs-libraries/src/openai/openai.service.ts would allow this, roughly as sketched below.
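A minimal sketch of the idea, assuming the service constructs its client with the official openai Node SDK; the fallback values and the complete() helper are illustrative, not the project's actual code:

```ts
import OpenAI from 'openai';

// Read the proposed variables, falling back to the current behaviour when they are unset.
const openai = new OpenAI({
  apiKey: process.env.OPENAI_API_KEY,
  // Any OpenAI-compatible endpoint (OpenRouter, LiteLLM, OpenWebUI, Ollama, ...) can go here.
  baseURL: process.env.OPENAI_BASE_URL || 'https://api.openai.com/v1',
});

// Hypothetical call site: the model name comes from OPENAI_MODEL instead of being
// hard-coded, defaulting to gpt-4o.
export async function complete(prompt: string): Promise<string | null> {
  const completion = await openai.chat.completions.create({
    model: process.env.OPENAI_MODEL || 'gpt-4o',
    messages: [{ role: 'user', content: prompt }],
  });
  return completion.choices[0].message.content;
}
```

With something like this in place, pointing the backend at, for example, a local Ollama instance would only require setting OPENAI_BASE_URL=http://localhost:11434/v1 and OPENAI_MODEL to the desired model name.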

🔄️ Additional Information

I am trying to implement this locally and will open a PR if it works fine.

👀 Have you spent some time to check if this feature request has been raised before?

  • I checked and didn't find a similar issue

Are you willing to submit PR?

None

@gmag11 gmag11 added the type: feature-request New feature or request label Nov 24, 2024