test: Add python backend tests for the new histogram metric #7540
Conversation
… yinggeh-DLIS-7061-add-vllm-metrics
LGTM for cherry-pick purposes, but I've left some comments for improving the tests later on.
…hub.com:triton-inference-server/server into yinggeh-DLIS-7113-support-histogram-metric-type
Please trigger a pipeline encapsulating all the latest changes so we can feel confident in the CI impact when looking at the cherry-picks.
Merged #7525 to the wrong branch. Need re-approval to merge to main.
What does the PR do?
Tests histogram metric in custom_metrics.
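For context on what the new test exercises: Triton metrics follow Prometheus conventions, so a histogram metric records per-bucket observation counts plus a running sum and total count. The sketch below illustrates those semantics only; the class and method names (`Histogram`, `observe`, `cumulative`) are illustrative, not the actual `pb_utils` API added in the linked PRs.

```python
# Minimal sketch of Prometheus-style histogram semantics (illustrative,
# not the Triton python backend API): each observation increments every
# cumulative bucket whose upper bound is >= the value, plus sum and count.

class Histogram:
    def __init__(self, buckets):
        self.buckets = sorted(buckets)               # upper bounds, excluding +Inf
        self.counts = [0] * (len(self.buckets) + 1)  # last slot is the +Inf bucket
        self.sum = 0.0
        self.count = 0

    def observe(self, value):
        self.sum += value
        self.count += 1
        for i, bound in enumerate(self.buckets):
            if value <= bound:
                self.counts[i] += 1
        self.counts[-1] += 1                         # +Inf bucket counts everything

    def cumulative(self):
        # Prometheus exports buckets cumulatively: bucket[i] includes
        # all observations <= its upper bound.
        return list(self.counts)

h = Histogram(buckets=[0.1, 1.0, 10.0])
for v in (0.05, 0.5, 5.0, 50.0):
    h.observe(v)
# cumulative counts: <=0.1 -> 1, <=1.0 -> 2, <=10.0 -> 3, +Inf -> 4
```

A test like the one in `custom_metrics` would typically assert on exactly these exported values (bucket counts, sum, count) after driving known observations through the metric.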
Checklist
PR title format: `<commit_type>: <Title>`
Commit Type: check the conventional commit type box here and add the label to the GitHub PR.
Related PRs:
triton-inference-server/vllm_backend#56
triton-inference-server/python_backend#374
triton-inference-server/core#386
Where should the reviewer start?
n/a
Test plan:
n/a
17487728
Caveats:
n/a
Background
Customer requested histogram metrics from vLLM.
Related Issues: (use one of the action keywords Closes / Fixes / Resolves / Relates to)
n/a