- LocalAI supports vllm as a backend, and vllm supports HF models (safetensors format) seamlessly. That means LocalAI supports safetensors. You can find vllm-compatible models here: https://github.com/vllm-project/vllm
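For reference, a minimal LocalAI model definition that routes a safetensors model through the vllm backend might look like the sketch below. The model identifier is illustrative (any HF safetensors model that vllm supports should work the same way), and the exact fields can vary between LocalAI versions, so check the LocalAI docs for your release:

```yaml
# Sketch of a LocalAI model config using the vllm backend.
# "tiiuae/falcon-40b" is an illustrative HF model id; LocalAI will
# hand loading of the safetensors weights off to vllm.
name: falcon-40b
backend: vllm
parameters:
  model: tiiuae/falcon-40b
```

Dropping a file like this into LocalAI's models directory should make the model available under the `name` given above.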
- Hi,
  there are some models I'd like to use, such as falcon-40b, but they are only available as safetensors files. Is it possible to use these models as well, or to convert them to a supported file format?
  Thanks in advance,
  Frank