
Commit

Update src/content/docs/ai-gateway/observability/evaluations/set-up-evaluations.mdx


Caps

Co-authored-by: Jun Lee <junlee@cloudflare.com>
daisyfaithauma and Oxyjun authored Sep 26, 2024
1 parent 13a4e49 commit 15e5f53
Showing 1 changed file with 1 addition and 1 deletion.
@@ -31,7 +31,7 @@ After creating a dataset, choose the evaluation parameters:
- Cost: Calculates the average cost of inference requests within the dataset (only for requests with [cost data](/ai-gateway/observability/costs/)).
- Speed: Calculates the average duration of inference requests within the dataset.
- Performance:
-  - Human feedback: Measures performance based on human feedback, calculated by the % of thumbs up on the logs, annotated from the Logs tab.
+  - Human feedback: measures performance based on human feedback, calculated by the % of thumbs up on the logs, annotated from the Logs tab.

:::note[Note]

Expand Down
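The "Human feedback" metric described in the changed line can be sketched as a simple percentage over annotated logs. This is an illustrative sketch only: the log structure and the `feedback` field (1 for thumbs up, -1 for thumbs down) are assumptions for the example, not AI Gateway's actual log schema.

```python
def human_feedback_score(logs):
    """Return the % of thumbs-up annotations among annotated logs.

    Hypothetical sketch of the 'Human feedback' evaluation parameter:
    each log dict may carry a 'feedback' field of 1 (thumbs up) or
    -1 (thumbs down); logs without an annotation are ignored.
    """
    annotated = [log for log in logs if log.get("feedback") in (1, -1)]
    if not annotated:
        return 0.0
    thumbs_up = sum(1 for log in annotated if log["feedback"] == 1)
    return 100.0 * thumbs_up / len(annotated)
```

For example, two thumbs up and one thumbs down among three annotated logs would score roughly 66.7%, regardless of how many unannotated logs the dataset also contains.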
