[Question]: Configure OpenAI + vLLM (OpenAI endpoint) at the same time #1432
**What is your question?**
How should I configure things if I want access to both the OpenAI endpoints and models that I host myself with vLLM through its OpenAI-compatible endpoint, at the same time?

**More details**
This isn't specific to vLLM; it applies to any self-hosted OpenAI-compatible endpoint.

**What is the main subject of your question?**
Endpoints
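For context, here is a minimal sketch of the two backends the question refers to, using the `openai` Python client. The vLLM host, port, and model name are assumptions; adjust them to match your own deployment.

```python
from openai import OpenAI

# Client for the official OpenAI API.
openai_client = OpenAI(api_key="sk-...")

# Client for a self-hosted vLLM server exposing its OpenAI-compatible
# endpoint (e.g. started with `python -m vllm.entrypoints.openai.api_server`).
vllm_client = OpenAI(
    base_url="http://localhost:8000/v1",  # assumed host/port
    api_key="not-needed",  # vLLM ignores the key unless one is configured
)

# The same request shape works against either backend; only the client differs.
resp = vllm_client.chat.completions.create(
    model="mistralai/Mistral-7B-Instruct-v0.2",  # assumed hosted model
    messages=[{"role": "user", "content": "Hello!"}],
)
print(resp.choices[0].message.content)
```

The question is essentially asking how to expose both of these clients' backends side by side in the same application.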
Answered by danny-avila on Dec 25, 2023
The only way to use both simultaneously will be a dedicated endpoint for reverse proxies: #1344. It's not yet implemented; I'll work on it soon.
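Since #1344 was not implemented at the time of this answer, the following is only a hypothetical sketch of what such a dedicated custom-endpoint configuration could look like; every field name here is an assumption, not a final schema.

```yaml
# Hypothetical sketch only; field names and structure are assumptions.
endpoints:
  custom:
    - name: "vLLM"                         # label shown in the UI (assumed)
      baseURL: "http://localhost:8000/v1"  # your self-hosted OpenAI-compatible server
      apiKey: "not-needed"                 # vLLM ignores the key unless configured
      models:
        default: ["mistralai/Mistral-7B-Instruct-v0.2"]  # assumed model list
```

With something along these lines, the built-in OpenAI endpoint would keep its own credentials while each self-hosted OpenAI-compatible server gets its own entry.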
Answer selected by Tostino