
Actions: ggerganov/llama.cpp

flake8 Lint

15,420 workflow runs

readme : add option, update default value, fix formatting
flake8 Lint #16683: Pull request #10271 synchronize by pothitos
November 25, 2024 21:23 Action required pothitos:master
ci : build docker images only once daily (#10503)
flake8 Lint #16682: Commit 50d5cec pushed by slaren
November 25, 2024 21:05 47m 32s master
ci : build docker images only once daily
flake8 Lint #16681: Pull request #10503 opened by slaren
November 25, 2024 20:32 33m 21s sl/nightly-docker-images
ci : build docker images only once daily
flake8 Lint #16680: Commit 8e39937 pushed by slaren
November 25, 2024 20:31 1h 19m 51s sl/nightly-docker-images
server : add more information about error (#10455)
flake8 Lint #16679: Commit 9fd8c26 pushed by ggerganov
November 25, 2024 20:29 1h 5m 57s master
cmake : enable warnings in llama
flake8 Lint #16678: Pull request #10474 synchronize by ggerganov
November 25, 2024 20:11 1h 15m 52s gg/cmake-warnings
cmake : enable warnings in llama
flake8 Lint #16677: Commit e908ace pushed by ggerganov
November 25, 2024 20:11 1h 14m 23s gg/cmake-warnings
ci : add ubuntu cuda build, build with one arch on windows
flake8 Lint #16676: Pull request #10456 synchronize by slaren
November 25, 2024 20:07 1m 26s sl/cuda-ci-ninja
ci : add ubuntu cuda build, build with one arch on windows
flake8 Lint #16675: Pull request #10456 synchronize by slaren
November 25, 2024 19:56 42s sl/cuda-ci-ninja
ci : use ninja to build windows-cuda
flake8 Lint #16674: Commit 1f92e55 pushed by slaren
November 25, 2024 19:56 1m 16s sl/cuda-ci-ninja
server : enable cache_prompt by default (#10501)
flake8 Lint #16673: Commit 47f931c pushed by ggerganov
November 25, 2024 19:50 1h 22m 23s master
metal : enable mat-vec kernels for bs <= 4 (#10491)
flake8 Lint #16672: Commit 106964e pushed by ggerganov
November 25, 2024 19:49 53m 20s master
fix: ggml: fix vulkan-shaders-gen build
flake8 Lint #16671: Pull request #10448 synchronize by sparkleholic
November 25, 2024 18:47 23s sparkleholic:master_fix
Rename Olmo1124 to Olmo2 (#10500)
flake8 Lint #16670: Commit 80acb7b pushed by slaren
November 25, 2024 18:36 1h 16m 3s master
llama : accept a list of devices to use to offload a model (#10497)
flake8 Lint #16669: Commit 10bce04 pushed by slaren
November 25, 2024 18:30 1h 3m 47s master
server : enable cache_prompt by default
flake8 Lint #16668: Pull request #10501 opened by ggerganov
November 25, 2024 18:29 51m 48s gg/server-enable-cache-prompt
server : enable cache_prompt by default
flake8 Lint #16667: Commit fe48dbd pushed by ggerganov
November 25, 2024 18:29 48m 42s gg/server-enable-cache-prompt
llama : accept a list of devices to use to offload a model
flake8 Lint #16666: Pull request #10497 synchronize by slaren
November 25, 2024 18:29 18s sl/llama-dev-selection
rename env parameter to LLAMA_ARG_DEVICE for consistency
flake8 Lint #16665: Commit 42f61c8 pushed by slaren
November 25, 2024 18:29 48m 32s sl/llama-dev-selection
Introduce llama-run
flake8 Lint #16664: Pull request #10291 synchronize by ericcurtin
November 25, 2024 17:24 1h 44m 46s ericcurtin:simple-chat-smart
Introduce llama-run
flake8 Lint #16663: Pull request #10291 synchronize by ericcurtin
November 25, 2024 17:24 21s ericcurtin:simple-chat-smart
llama : initial Mamba-2 support
flake8 Lint #16662: Pull request #9126 synchronize by compilade
November 25, 2024 17:09 1h 37m 28s compilade/mamba2
Merge branch 'master' into compilade/mamba2
flake8 Lint #16661: Commit 1ee6c48 pushed by compilade
November 25, 2024 17:09 1h 29m 46s compilade/mamba2
llama : accept a list of devices to use to offload a model
flake8 Lint #16660: Pull request #10497 synchronize by slaren
November 25, 2024 17:08 1h 20m 39s sl/llama-dev-selection
fix other examples
flake8 Lint #16659: Commit acf43cc pushed by slaren
November 25, 2024 17:08 1h 27m 2s sl/llama-dev-selection