Add url prefix path support for /v1 #357
taoari
announced in
Announcements
Replies: 2 comments
-
Hey @taoari, this is already possible via an arg / env variable. FYI, you can find it in the `infinity_emb v2 --help` output.
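For readers landing here later, a minimal sketch of what the reply describes. The `--url-prefix` flag name and the model id are assumptions here; check `infinity_emb v2 --help` for the exact option on your installed version.

```shell
# Assumed flag name -- verify against `infinity_emb v2 --help`.
# Serves the API under /v1, e.g. localhost:<port>/v1/embeddings.
infinity_emb v2 --model-id BAAI/bge-small-en-v1.5 --url-prefix "/v1"
```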
-
@taoari You might want to look into why the OpenAI endpoints are prefixed with v1. Good keywords to throw into ChatGPT: API versioning, cascading, and canary deployments.
Feature request
Currently, the model is hosted at localhost:/embeddings. It would be good to add an option like --mount-path "/v1" to serve the model at localhost:/v1/embeddings.
Motivation
The OpenAI API serves models under /v1, and so does vLLM. It would be great to make the APIs consistent without having to write code to handle the "/v1" difference.
Your contribution
N/A