Update LLaMa.cpp #74
So easy to update, and I'll just leave the details here so I can find them again if I need to. I took the latest release from https://github.com/ggerganov/llama.cpp/releases/tag/b3358 and used the arm64 build (not entirely sure it's the best choice over the x64 version for the latest Mac architectures, but it works well on my MacBook Air M3). Then I extracted llama-server, copied it into the FreeChat project, and renamed it freechat-server. Gemma 2 (https://huggingface.co/bartowski/gemma-2-9b-it-GGUF/blob/main/gemma-2-9b-it-Q6_K.gguf) started working immediately after swapping in the latest server. Will try Deepseek Coder later (https://huggingface.co/TheBloke/deepseek-coder-6.7B-instruct-GGUF)
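The steps above can be sketched as a small script. The asset filename and the path of the server binary inside the zip are assumptions based on how llama.cpp's CI names its macOS arm64 release artifacts; check the release page for the actual name. The download and copy steps are left commented so nothing is fetched by accident.

```shell
#!/bin/sh
# Sketch of the manual llama.cpp update for FreeChat.
# ASSET and the binary path inside the archive are assumptions; verify
# against the release page before running.
set -eu

TAG="b3358"
ASSET="llama-${TAG}-bin-macos-arm64.zip"
URL="https://github.com/ggerganov/llama.cpp/releases/download/${TAG}/${ASSET}"

echo "Download: $URL"
# curl -LO "$URL"
# unzip "$ASSET" -d "llama-cpp-${TAG}"
# llama.cpp ships the binary as llama-server; FreeChat expects it
# renamed to freechat-server inside the project:
# cp "llama-cpp-${TAG}"/*/llama-server /path/to/FreeChat/freechat-server
```

Bumping TAG to a newer release tag should be the only change needed next time.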
Glad you figured this out, sorry I missed it! I try to do this manually about monthly, but it would be so clutch to have a bot do it.
Will leave this open until the next time I update (maybe this weekend!)
Is it straightforward / easy to update the version of LLaMa.cpp being used? Is it something we could enable the end user to do via the UI?