Actions: HabanaAI/vllm-fork

mypy

985 workflow runs

Workaround for OOM during loading llama-405 (#396)
mypy #718: Commit 07c98a5 pushed by afierka-intel
October 18, 2024 07:13 · 46s · habana_main
Oct 16 rebase
mypy #717: Pull request #401 synchronize by kzawora-intel
October 17, 2024 17:12 · 40s · private/kzawora/oct_16_rebase
Add notes on the use of Slack (#9442)
mypy #712: Commit dbfa8d3 pushed by kzawora-intel
October 17, 2024 13:56 · 50s · main
Oct 16 rebase
mypy #710: Pull request #401 synchronize by kzawora-intel
October 17, 2024 13:32 · 42s · private/kzawora/oct_16_rebase
Create run-lm-eval-mmlu.sh
mypy #708: Pull request #399 synchronize by michalkuligowski
October 17, 2024 13:16 · 47s · michalkuligowski-mmlu-test
initial works on enabling automatic prefix caching
mypy #703: Pull request #162 synchronize by huijjj
October 17, 2024 10:22 · Action required · SqueezeBits:enable-prefix-caching
[New Feature][Habana-Main] speculative_decoding HPU support
mypy #696: Pull request #375 synchronize by xuechendi
October 16, 2024 23:32 · Action required · xuechendi:habana_main_spec_decode
[New Feature][Habana-Main] speculative_decoding HPU support
mypy #695: Pull request #375 synchronize by xuechendi
October 16, 2024 23:19 · Action required · xuechendi:habana_main_spec_decode
ProTip! You can narrow down the results and go further back in time using created:<2024-10-16 or the other available filters.
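
The same run list can also be queried outside the web UI. A minimal sketch, assuming Python with the requests package, a token in the GITHUB_TOKEN environment variable, and the GitHub REST API "List workflow runs for a repository" endpoint; the created date filter mirrors the ProTip above, and filtering on the run name "mypy" is an assumption since the workflow file name is not shown here.

    # Sketch: list older mypy runs for HabanaAI/vllm-fork via the GitHub REST API.
    import os
    import requests

    resp = requests.get(
        "https://api.github.com/repos/HabanaAI/vllm-fork/actions/runs",
        headers={"Authorization": f"Bearer {os.environ['GITHUB_TOKEN']}"},
        params={
            "branch": "habana_main",
            "created": "<2024-10-16",  # same date filter as the ProTip
            "per_page": 20,
        },
        timeout=30,
    )
    resp.raise_for_status()

    # Keep only runs of the "mypy" workflow and print a compact summary line per run.
    for run in resp.json()["workflow_runs"]:
        if run["name"] == "mypy":
            print(run["run_number"], run["head_branch"], run["created_at"], run["conclusion"])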